Reliability Estimation of Aero-engine Based on Mixed Weibull Distribution Model
NASA Astrophysics Data System (ADS)
Yuan, Zhongda; Deng, Junxiang; Wang, Dawei
2018-02-01
The aero-engine is a complex mechanical-electronic system, and the Weibull distribution model plays an irreplaceable role in the reliability analysis of such systems. To date, only the two-parameter and three-parameter Weibull distribution models have been widely used. Because of the diversity of engine failure modes, a single Weibull distribution model produces large errors. By contrast, a mixed Weibull distribution model can take a variety of engine failure modes into account, making it a better statistical analysis model. In addition to the concept of a dynamic weight coefficient, a three-parameter correlation coefficient optimization method is applied to enhance the Weibull distribution model and make the reliability estimate more accurate, which greatly improves the precision of the mixed-distribution reliability model. All of these features favor the adoption of the Weibull distribution model in engineering applications.
Sequential Testing of Hypotheses Concerning the Reliability of a System Modeled by a Two-Parameter Weibull Distribution
Lussier, Philippe A.
1981-12-01
Thesis (AFIT/GOR/MA/81D-8) presented to the Faculty of the School of Engineering of the Air Force Institute of Technology, concerning sequential testing of hypotheses about the reliability of a system modeled by a two-parameter Weibull distribution; repetitions are used for the test procedures.
Two-sided Topp-Leone Weibull distribution
NASA Astrophysics Data System (ADS)
Podeang, Krittaya; Bodhisuwan, Winai
2017-11-01
In this paper, we introduce a general class of lifetime distributions, called the two-sided Topp-Leone generated family of distributions. A special case of this new family is the two-sided Topp-Leone Weibull distribution, which uses the two-sided Topp-Leone distribution as a generator for the Weibull distribution. The two-sided Topp-Leone Weibull distribution can take several shapes, such as decreasing, unimodal, and bimodal, which makes it more flexible than the Weibull distribution. Its quantile function is presented, and parameter estimation by maximum likelihood is discussed. The proposed distribution is applied to a strength data set, a data set of remission times of bladder cancer patients, and a data set of times to failure of a turbocharger, and is compared with the Topp-Leone generated Weibull distribution. In conclusion, the two-sided Topp-Leone Weibull distribution performs similarly to the Topp-Leone generated Weibull distribution on the first and second data sets, but fits the third data set better.
Program for Weibull Analysis of Fatigue Data
NASA Technical Reports Server (NTRS)
Krantz, Timothy L.
2005-01-01
A Fortran computer program has been written for performing statistical analyses of fatigue-test data that are assumed to be adequately represented by a two-parameter Weibull distribution. This program calculates the following: (1) Maximum-likelihood estimates of the Weibull-distribution parameters; (2) Data for contour plots of relative likelihood for two parameters; (3) Data for contour plots of joint confidence regions; (4) Data for the profile likelihood of the Weibull-distribution parameters; (5) Data for the profile likelihood of any percentile of the distribution; and (6) Likelihood-based confidence intervals for parameters and/or percentiles of the distribution. The program can account for tests that are suspended without failure (the statistical term for such suspension of tests is "censoring"). The analytical approach followed in this software is valid for type-I censoring, which is the removal of unfailed units at pre-specified times. Confidence regions and intervals are calculated by use of the likelihood-ratio method.
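As a rough illustration of the kind of calculation described above (not the Fortran program itself), the following Python sketch fits a two-parameter Weibull distribution to fatigue lives with type-I censoring by maximizing the censored log-likelihood; the life values, suspension flags, and starting guesses are invented for the example.

```python
import numpy as np
from scipy.optimize import minimize

def censored_weibull_nll(params, t, failed):
    """Negative log-likelihood of a two-parameter Weibull with type-I censoring.
    t: observed lives (failures or suspensions); failed: 1 = failure, 0 = suspended."""
    shape, scale = params
    if shape <= 0.0 or scale <= 0.0:
        return np.inf
    z = t / scale
    log_pdf = np.log(shape / scale) + (shape - 1.0) * np.log(z) - z**shape
    log_surv = -z**shape  # log of the survival (reliability) function
    return -np.sum(failed * log_pdf + (1 - failed) * log_surv)

# hypothetical gear fatigue lives in cycles; the last two tests were suspended
t = np.array([1.2e6, 2.3e6, 3.1e6, 4.0e6, 5.5e6, 6.0e6, 6.0e6])
failed = np.array([1, 1, 1, 1, 1, 0, 0])

fit = minimize(censored_weibull_nll, x0=[1.5, t.mean()], args=(t, failed),
               method="Nelder-Mead")
shape_hat, scale_hat = fit.x
print(f"Weibull shape ~ {shape_hat:.2f}, scale ~ {scale_hat:.3g} cycles")
```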
NASA Technical Reports Server (NTRS)
Shantaram, S. Pai; Gyekenyesi, John P.
1989-01-01
The calculation of shape and scale parameters of the two-parameter Weibull distribution is described using the least-squares analysis and maximum likelihood methods for volume- and surface-flaw-induced fracture in ceramics with complete and censored samples. Detailed procedures are given for evaluating 90 percent confidence intervals for maximum likelihood estimates of shape and scale parameters, the unbiased estimates of the shape parameters, and the Weibull mean values and corresponding standard deviations. Furthermore, the necessary steps are described for detecting outliers and for calculating the Kolmogorov-Smirnov and the Anderson-Darling goodness-of-fit statistics and 90 percent confidence bands about the Weibull distribution. It also shows how to calculate the Batdorf flaw-density constants by using the Weibull distribution statistical parameters. The techniques described were verified with several example problems from the open literature, and were coded in the Structural Ceramics Analysis and Reliability Evaluation (SCARE) design program.
Steve P. Verrill; James W. Evans; David E. Kretschmann; Cherilyn A. Hatfield
2014-01-01
Two important wood properties are the modulus of elasticity (MOE) and the modulus of rupture (MOR). In the past, the statistical distribution of the MOE has often been modeled as Gaussian, and that of the MOR as lognormal or as a two- or three-parameter Weibull distribution. It is well known that MOE and MOR are positively correlated. To model the simultaneous behavior...
An EOQ Model with Two-Parameter Weibull Distribution Deterioration and Price-Dependent Demand
ERIC Educational Resources Information Center
Mukhopadhyay, Sushanta; Mukherjee, R. N.; Chaudhuri, K. S.
2005-01-01
An inventory replenishment policy is developed for a deteriorating item and price-dependent demand. The rate of deterioration is taken to be time-proportional and the time to deterioration is assumed to follow a two-parameter Weibull distribution. A power law form of the price dependence of demand is considered. The model is solved analytically…
NASA Technical Reports Server (NTRS)
Pai, Shantaram S.; Gyekenyesi, John P.
1988-01-01
The calculation of shape and scale parameters of the two-parameter Weibull distribution is described using the least-squares analysis and maximum likelihood methods for volume- and surface-flaw-induced fracture in ceramics with complete and censored samples. Detailed procedures are given for evaluating 90 percent confidence intervals for maximum likelihood estimates of shape and scale parameters, the unbiased estimates of the shape parameters, and the Weibull mean values and corresponding standard deviations. Furthermore, the necessary steps are described for detecting outliers and for calculating the Kolmogorov-Smirnov and the Anderson-Darling goodness-of-fit statistics and 90 percent confidence bands about the Weibull distribution. It also shows how to calculate the Batdorf flaw-density constants by using the Weibull distribution statistical parameters. The techniques described were verified with several example problems from the open literature, and were coded in the Structural Ceramics Analysis and Reliability Evaluation (SCARE) design program.
The Effect of Roughness Model on Scattering Properties of Ice Crystals.
NASA Technical Reports Server (NTRS)
Geogdzhayev, Igor V.; Van Diedenhoven, Bastiaan
2016-01-01
We compare stochastic models of microscale surface roughness assuming uniform and Weibull distributions of crystal facet tilt angles to calculate scattering by roughened hexagonal ice crystals using the geometric optics (GO) approximation. Both distributions are determined by similar roughness parameters, while the Weibull model depends on an additional shape parameter. Calculations were performed for two visible wavelengths (864 nm and 410 nm) for roughness values between 0.2 and 0.7 and Weibull shape parameters between 0 and 1.0 for crystals with aspect ratios of 0.21, 1 and 4.8. For this range of parameters we find that, for a given roughness level, varying the Weibull shape parameter can change the asymmetry parameter by up to about 0.05. The largest effect of the shape parameter variation on the phase function is found in the backscattering region, while the degree of linear polarization is most affected at the side-scattering angles. For high roughness, scattering properties calculated using the uniform and Weibull models are in relatively close agreement for a given roughness parameter, especially when a Weibull shape parameter of 0.75 is used. For smaller roughness values, a shape parameter close to unity provides a better agreement. Notable differences are observed in the phase function over the scattering angle range from 5° to 20°, where the uniform roughness model produces a plateau while the Weibull model does not.
Quang V. Cao; Shanna M. McCarty
2006-01-01
Diameter distributions in a forest stand have been successfully characterized by use of the Weibull function. Of special interest are cases where parameters of a Weibull distribution that models a future stand are predicted, either directly or indirectly, from current stand density and dominant height. This study evaluated four methods of predicting the Weibull...
Roos, Malgorzata; Stawarczyk, Bogna
2012-07-01
This study evaluated and compared Weibull parameters of resin bond strength values using six different general-purpose statistical software packages for the two-parameter Weibull distribution. Two-hundred human teeth were randomly divided into 4 groups (n=50), prepared and bonded on dentin according to the manufacturers' instructions using the following resin cements: (i) Variolink (VAN, conventional resin cement), (ii) Panavia21 (PAN, conventional resin cement), (iii) RelyX Unicem (RXU, self-adhesive resin cement) and (iv) G-Cem (GCM, self-adhesive resin cement). Subsequently, all specimens were stored in water for 24h at 37°C. Shear bond strength was measured and the data were analyzed using Anderson-Darling goodness-of-fit (MINITAB 16) and two-parameter Weibull statistics with the following statistical software packages: Excel 2011, SPSS 19, MINITAB 16, R 2.12.1, SAS 9.1.3. and STATA 11.2 (p≤0.05). Additionally, the three-parameter Weibull was fitted using MINITAB 16. Two-parameter Weibull estimates calculated with MINITAB and STATA can be compared using an omnibus test and 95% CI. In SAS, only 95% CI were directly obtained from the output. R provided no estimates of 95% CI. In both SAS and R, the global comparison of the characteristic bond strength among groups is provided by means of Weibull regression. EXCEL and SPSS provided no default information about 95% CI and no significance test for the comparison of Weibull parameters among the groups. In summary, the conventional resin cement VAN showed the highest Weibull modulus and characteristic bond strength. There are discrepancies in the Weibull statistics depending on the software package and the estimation method. The information content in the default output provided by the software packages differs to a very high extent. Copyright © 2012 Academy of Dental Materials. Published by Elsevier Ltd. All rights reserved.
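As a point of reference for what these packages compute, a minimal Python sketch of a two-parameter Weibull fit to one group of bond-strength values is shown below; the data are invented, and scipy's weibull_min is used with the location fixed at zero so that only the Weibull modulus and the characteristic bond strength are estimated.

```python
import numpy as np
from scipy.stats import weibull_min

# hypothetical shear bond strengths (MPa) for one resin cement group
strength = np.array([8.1, 9.4, 10.2, 11.0, 11.8, 12.5, 13.3, 14.1, 15.0, 16.4])

# two-parameter Weibull MLE: location fixed at zero, so the fit returns the
# Weibull modulus m (shape) and the characteristic strength sigma_0 (scale)
m, loc, sigma0 = weibull_min.fit(strength, floc=0)
print(f"Weibull modulus m ~ {m:.2f}, characteristic bond strength ~ {sigma0:.1f} MPa")
```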
NASA Technical Reports Server (NTRS)
Krantz, Timothy L.
2002-01-01
The Weibull distribution has been widely adopted for the statistical description and inference of fatigue data. This document provides user instructions, examples, and verification for software to analyze gear fatigue test data. The software was developed presuming the data are adequately modeled using a two-parameter Weibull distribution. The calculations are based on likelihood methods, and the approach taken is valid for data that include type 1 censoring. The software was verified by reproducing results published by others.
Use of the Weibull function to predict future diameter distributions from current plot data
Quang V. Cao
2012-01-01
The Weibull function has been widely used to characterize diameter distributions in forest stands. The future diameter distribution of a forest stand can be predicted by use of a Weibull probability density function from current inventory data for that stand. The parameter recovery approach has been used to "recover" the Weibull parameters from diameter moments or...
A practical and systematic review of Weibull statistics for reporting strengths of dental materials
Quinn, George D.; Quinn, Janet B.
2011-01-01
Objectives: To review the history, theory and current applications of Weibull analyses sufficient to make informed decisions regarding practical use of the analysis in dental material strength testing. Data: References are made to examples in the engineering and dental literature, but this paper also includes illustrative analyses of Weibull plots, fractographic interpretations, and Weibull distribution parameters obtained for a dense alumina, two feldspathic porcelains, and a zirconia. Sources: Informational sources include Weibull's original articles, later articles specific to applications and theoretical foundations of Weibull analysis, texts on statistics and fracture mechanics and the international standards literature. Study Selection: The chosen Weibull analyses are used to illustrate technique, the importance of flaw size distributions, physical meaning of Weibull parameters and concepts of “equivalent volumes” to compare measured strengths obtained from different test configurations. Conclusions: Weibull analysis has a strong theoretical basis and can be of particular value in dental applications, primarily because of test specimen size limitations and the use of different test configurations. Also endemic to dental materials, however, is increased difficulty in satisfying application requirements, such as confirming fracture origin type and diligence in obtaining quality strength data. PMID:19945745
Steve P. Verrill; Frank C. Owens; David E. Kretschmann; Rubin Shmulsky
2017-01-01
It is common practice to assume that a two-parameter Weibull probability distribution is suitable for modeling lumber properties. Verrill and co-workers demonstrated theoretically and empirically that the modulus of rupture (MOR) distribution of visually graded or machine stress rated (MSR) lumber is not distributed as a Weibull. Instead, the tails of the MOR...
NASA Astrophysics Data System (ADS)
Sanford, W. E.
2015-12-01
Age distributions of base flow to streams are important to estimate for predicting the timing of water-quality responses to changes in distributed inputs of nutrients or pollutants at the land surface. Simple models of shallow aquifers will predict exponential age distributions, but more realistic 3-D stream-aquifer geometries will cause deviations from an exponential curve. In addition, in fractured rock terrains the dual nature of the effective and total porosity of the system complicates the age distribution further. In this study shallow groundwater flow and advective transport were simulated in two regions in the Eastern United States—the Delmarva Peninsula and the upper Potomac River basin. The former is underlain by layers of unconsolidated sediment, while the latter consists of folded and fractured sedimentary rocks. Transport of groundwater to streams was simulated using the USGS code MODPATH within 175 and 275 watersheds, respectively. For the fractured rock terrain, calculations were also performed along flow pathlines to account for exchange between mobile and immobile flow zones. Porosities at both sites were calibrated using environmental tracer data (3H, 3He, CFCs and SF6) in wells and springs, and with a 30-year tritium record from the Potomac River. Carbonate and siliciclastic rocks were calibrated to have mobile porosity values of one and six percent, and immobile porosity values of 18 and 12 percent, respectively. The age distributions were fitted to Weibull functions. Whereas an exponential function has one parameter that controls the median age of the distribution, a Weibull function has an extra parameter that controls the slope of the curve. A weighted Weibull function was also developed that potentially allows for four parameters, two that control the median age and two that control the slope, one of each weighted toward early or late arrival times. For both systems the two-parameter Weibull function nearly always produced a substantially better fit to the data than the one-parameter exponential function. For the single porosity system it was found that the use of three parameters was often optimal for accurately describing the base-flow age distribution, whereas for the dual porosity system the fourth parameter was often required to fit the more complicated response curves.
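A minimal sketch of the curve-fitting step described above, with invented numbers: it fits both a one-parameter exponential and a two-parameter Weibull cumulative distribution to a base-flow age distribution and compares the residuals (the full study fits simulated MODPATH age distributions, which are not reproduced here).

```python
import numpy as np
from scipy.optimize import curve_fit

def weibull_cdf(t, k, lam):
    """Two-parameter Weibull CDF: k controls the slope, lam the age scale."""
    return 1.0 - np.exp(-(t / lam) ** k)

def exponential_cdf(t, tau):
    return 1.0 - np.exp(-t / tau)

# hypothetical cumulative base-flow age distribution (age in years, fraction younger)
age = np.array([1.0, 2.0, 5.0, 10.0, 20.0, 40.0, 80.0])
frac = np.array([0.08, 0.15, 0.31, 0.50, 0.70, 0.86, 0.96])

(k, lam), _ = curve_fit(weibull_cdf, age, frac, p0=[1.0, 15.0])
(tau,), _ = curve_fit(exponential_cdf, age, frac, p0=[15.0])

rss_w = np.sum((frac - weibull_cdf(age, k, lam)) ** 2)
rss_e = np.sum((frac - exponential_cdf(age, tau)) ** 2)
print(f"Weibull: k ~ {k:.2f}, lambda ~ {lam:.1f} yr, RSS = {rss_w:.4f}")
print(f"Exponential: tau ~ {tau:.1f} yr, RSS = {rss_e:.4f}")
```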
Reliability analysis of structural ceramic components using a three-parameter Weibull distribution
NASA Technical Reports Server (NTRS)
Duffy, Stephen F.; Powers, Lynn M.; Starlinger, Alois
1992-01-01
Described here are nonlinear regression estimators for the three-parameter Weibull distribution. Issues relating to the bias and invariance associated with these estimators are examined numerically using Monte Carlo simulation methods. The estimators were used to extract parameters from sintered silicon nitride failure data. A reliability analysis was performed on a turbopump blade utilizing the three-parameter Weibull distribution and the estimates from the sintered silicon nitride data.
The distribution of first-passage times and durations in FOREX and future markets
NASA Astrophysics Data System (ADS)
Sazuka, Naoya; Inoue, Jun-ichi; Scalas, Enrico
2009-07-01
Possible distributions are discussed for intertrade durations and first-passage processes in financial markets. The viewpoint of renewal theory is assumed. In order to represent market data with relatively long durations, two types of distributions are used, namely a distribution derived from the Mittag-Leffler survival function and the Weibull distribution. For the Mittag-Leffler type distribution, the average waiting time (residual life time) is strongly dependent on the choice of a cut-off parameter t_max, whereas the results based on the Weibull distribution do not depend on such a cut-off. Therefore, a Weibull distribution is more convenient than a Mittag-Leffler type if one wishes to evaluate relevant statistics such as average waiting time in financial markets with long durations. On the other hand, we find that the Gini index is rather independent of the cut-off parameter. Based on the above considerations, we propose a good candidate for describing the distribution of first-passage time in a market: the Weibull distribution with a power-law tail. This distribution compensates the gap between theoretical and empirical results more efficiently than a simple Weibull distribution. It should be stressed that a Weibull distribution with a power-law tail is more flexible than the Mittag-Leffler distribution, which itself can be approximated by a Weibull distribution and a power-law. Indeed, the key point is that in the former case there is freedom of choice for the exponent of the power-law attached to the Weibull distribution, which can exceed 1 in order to reproduce decays faster than possible with a Mittag-Leffler distribution. We also give a useful formula to determine an optimal crossover point minimizing the difference between the empirical average waiting time and the one predicted from renewal theory. Moreover, we discuss the limitation of our distributions by applying our distribution to the analysis of the BTP future and calculating the average waiting time. We find that our distribution is applicable as long as durations follow a Weibull law for short times and do not have too heavy a tail.
Maximum likelihood estimates, from censored data, for mixed-Weibull distributions
NASA Astrophysics Data System (ADS)
Jiang, Siyuan; Kececioglu, Dimitri
1992-06-01
A new algorithm for estimating the parameters of mixed-Weibull distributions from censored data is presented. The algorithm follows the principle of maximum likelihood estimate (MLE) through the expectation and maximization (EM) algorithm, and it is derived for both postmortem and nonpostmortem time-to-failure data. It is concluded that the concept of the EM algorithm is easy to understand and apply (only elementary statistics and calculus are required). The log-likelihood function cannot decrease after an EM sequence; this important feature was observed in all of the numerical calculations. The MLEs of the nonpostmortem data were obtained successfully for mixed-Weibull distributions with up to 14 parameters in a 5-subpopulation, mixed-Weibull distribution. Numerical examples indicate that some of the log-likelihood functions of the mixed-Weibull distributions have multiple local maxima; therefore, the algorithm should start at several initial guesses of the parameter set.
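The following Python sketch illustrates the expectation-maximization idea for a two-component mixed-Weibull fit on complete (postmortem, uncensored) data; it is not the authors' algorithm, the censored-data extension is omitted, and the data are simulated for the example.

```python
import numpy as np
from scipy.stats import weibull_min
from scipy.optimize import minimize

def weighted_weibull_mle(x, w, start):
    """M-step: weighted two-parameter Weibull MLE (location fixed at zero)."""
    def nll(p):
        shape, scale = p
        if shape <= 0.0 or scale <= 0.0:
            return np.inf
        return -np.sum(w * weibull_min.logpdf(x, shape, scale=scale))
    return minimize(nll, start, method="Nelder-Mead").x

def em_two_weibull(x, n_iter=100):
    """EM for a two-subpopulation mixed-Weibull distribution (5 parameters)."""
    pi = 0.5                                   # mixing weight of component 1
    comps = [np.array([1.5, np.median(x) / 2]),
             np.array([1.5, np.median(x) * 2])]
    for _ in range(n_iter):
        # E-step: posterior probability that each observation came from component 1
        f1 = pi * weibull_min.pdf(x, comps[0][0], scale=comps[0][1])
        f2 = (1.0 - pi) * weibull_min.pdf(x, comps[1][0], scale=comps[1][1])
        r = f1 / (f1 + f2)
        # M-step: update the mixing weight and each component's parameters
        pi = r.mean()
        comps[0] = weighted_weibull_mle(x, r, comps[0])
        comps[1] = weighted_weibull_mle(x, 1.0 - r, comps[1])
    return pi, comps

# simulated times-to-failure drawn from two failure modes
rng = np.random.default_rng(0)
x = np.concatenate([weibull_min.rvs(0.9, scale=100.0, size=150, random_state=rng),
                    weibull_min.rvs(3.0, scale=900.0, size=150, random_state=rng)])
pi, comps = em_two_weibull(x)
print(f"weight ~ {pi:.2f}, component 1 (shape, scale) ~ {comps[0]}, component 2 ~ {comps[1]}")
```

As the abstract notes, the log-likelihood of a mixed-Weibull distribution can have multiple local maxima, so in practice several starting guesses should be tried.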
Steve P. Verrill; James W. Evans; David E. Kretschmann; Cherilyn A. Hatfield
2012-01-01
Two important wood properties are stiffness (modulus of elasticity or MOE) and bending strength (modulus of rupture or MOR). In the past, MOE has often been modeled as a Gaussian and MOR as a lognormal or a two or three parameter Weibull. It is well known that MOE and MOR are positively correlated. To model the simultaneous behavior of MOE and MOR for the purposes of...
Rafal Podlaski; Francis A. Roesch
2013-01-01
This study assessed the usefulness of various methods for choosing the initial values for the numerical procedures used to estimate the parameters of mixture distributions, and analysed a variety of mixture models to approximate empirical diameter at breast height (dbh) distributions. Two-component mixtures of either the Weibull distribution or the gamma distribution were...
Large-Scale Weibull Analysis of H-451 Nuclear- Grade Graphite Specimen Rupture Data
NASA Technical Reports Server (NTRS)
Nemeth, Noel N.; Walker, Andrew; Baker, Eric H.; Murthy, Pappu L.; Bratton, Robert L.
2012-01-01
A Weibull analysis was performed of the strength distribution and size effects for 2000 specimens of H-451 nuclear-grade graphite. The data, generated elsewhere, measured the tensile and four-point-flexure room-temperature rupture strength of specimens excised from a single extruded graphite log. Strength variation was compared with specimen location, size, and orientation relative to the parent body. In our study, data were progressively and extensively pooled into larger data sets to discriminate overall trends from local variations and to investigate the strength distribution. The CARES/Life and WeibPar codes were used to investigate issues regarding the size effect, Weibull parameter consistency, and nonlinear stress-strain response. Overall, the Weibull distribution described the behavior of the pooled data very well. However, the issue regarding the smaller-than-expected size effect remained. This exercise illustrated that a conservative approach using a two-parameter Weibull distribution is best for designing graphite components with low probability of failure for the in-core structures in the proposed Generation IV (Gen IV) high-temperature gas-cooled nuclear reactors. This exercise also demonstrated the continuing need to better understand the mechanisms driving stochastic strength response. Extensive appendixes are provided with this report to show all aspects of the rupture data and analytical results.
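The size effect discussed above follows from the weakest-link form of the two-parameter Weibull distribution; a small illustrative calculation (with assumed numbers, not the report's data) is:

```python
# Weibull weakest-link size effect: at equal failure probability, the strengths of two
# stressed volumes are related by sigma_2 / sigma_1 = (V_1 / V_2)**(1 / m)
m = 10.0              # assumed Weibull modulus
V1, V2 = 1.0, 50.0    # relative stressed volumes: small specimen vs. larger component
ratio = (V1 / V2) ** (1.0 / m)
print(f"expected strength of the larger body ~ {ratio:.2f} x the specimen strength")
```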
NASA Astrophysics Data System (ADS)
Seuba, Jordi; Deville, Sylvain; Guizard, Christian; Stevenson, Adam J.
2016-01-01
Macroporous ceramics exhibit an intrinsic strength variability caused by the random distribution of defects in their structure. However, the precise role of microstructural features, other than pore volume, on reliability is still unknown. Here, we analyze the applicability of the Weibull analysis to unidirectional macroporous yttria-stabilized-zirconia (YSZ) prepared by ice-templating. First, we performed crush tests on samples with controlled microstructural features with the loading direction parallel to the porosity. The compressive strength data were fitted using two different fitting techniques, ordinary least squares and Bayesian Markov Chain Monte Carlo, to evaluate whether Weibull statistics are an adequate descriptor of the strength distribution. The statistical descriptors indicated that the strength data are well described by the Weibull statistical approach, for both fitting methods used. Furthermore, we assess the effect of different microstructural features (volume, size, densification of the walls, and morphology) on Weibull modulus and strength. We found that the key microstructural parameter controlling reliability is wall thickness. In contrast, pore volume is the main parameter controlling the strength. The highest Weibull modulus and mean strength (198.2 MPa) were obtained for the samples with the smallest and narrowest wall thickness distribution (3.1 μm) and lower pore volume (54.5%).
An evaluation of percentile and maximum likelihood estimators of Weibull parameters
Stanley J. Zarnoch; Tommy R. Dell
1985-01-01
Two methods of estimating the three-parameter Weibull distribution were evaluated by computer simulation and field data comparison. Maximum likelihood estimators (MLE) with bias correction were calculated with the computer routine FITTER (Bailey 1974); percentile estimators (PCT) were those proposed by Zanakis (1979). The MLE estimators had smaller bias and...
Statistical distribution of mechanical properties for three graphite-epoxy material systems
NASA Technical Reports Server (NTRS)
Reese, C.; Sorem, J., Jr.
1981-01-01
Graphite-epoxy composites are playing an increasing role as viable alternative materials in structural applications, necessitating thorough investigation into the predictability and reproducibility of their material strength properties. This investigation was concerned with tension, compression, and short beam shear coupon testing of large samples from three different material suppliers to determine their statistical strength behavior. Statistical results indicate that a two-parameter Weibull distribution model provides better overall characterization of material behavior for the graphite-epoxy systems tested than does the standard Normal distribution model that is employed for most design work. While either a Weibull or Normal distribution model provides adequate predictions for average strength values, the Weibull model provides better characterization in the lower tail region where the predictions are of maximum design interest. The two sets of the same material were found to have essentially the same material properties, indicating that repeatability can be achieved.
Analysis of Flexural Fatigue Strength of Self Compacting Fibre Reinforced Concrete Beams
NASA Astrophysics Data System (ADS)
Murali, G.; Sudar Celestina, J. P. Arul; Subhashini, N.; Vigneshwari, M.
2017-07-01
This study presents an extensive statistical investigation of variations in the flexural fatigue life of self-compacting Fibrous Concrete (FC) beams. For this purpose, the experimental data of earlier researchers were examined using the two-parameter Weibull distribution. Two methods, namely the graphical method and the method of moments, were used to analyse the variations in the experimental data, and the results have been presented in the form of probability of survival. The Weibull parameter values obtained from the graphical method and the method of moments are precise. At the 0.7 stress level, the fatigue life is 59,861 cycles for a reliability of 90%.
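A minimal Python sketch of the graphical (Weibull probability plot) method mentioned above, using median ranks and a least-squares line; the fatigue lives are invented for illustration.

```python
import numpy as np

# hypothetical fatigue lives (cycles) at one stress level
life = np.sort(np.array([12000., 18500., 26000., 31000., 42000., 55000., 70000., 91000.]))
n = len(life)
rank = np.arange(1, n + 1)

# Bernard's approximation to the median ranks (empirical failure probabilities)
F = (rank - 0.3) / (n + 0.4)

# Weibull plot: ln(-ln(1 - F)) versus ln(life) is a straight line with slope = shape
x = np.log(life)
y = np.log(-np.log(1.0 - F))
slope, intercept = np.polyfit(x, y, 1)
shape = slope
scale = np.exp(-intercept / slope)

# probability of survival at a chosen number of cycles, from the fitted parameters
N = 30000.0
survival = np.exp(-(N / scale) ** shape)
print(f"graphical estimates: shape ~ {shape:.2f}, scale ~ {scale:.0f} cycles")
print(f"P(survive {N:.0f} cycles) ~ {survival:.2f}")
```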
NASA Technical Reports Server (NTRS)
Szatmary, Steven A.; Gyekenyesi, John P.; Nemeth, Noel N.
1990-01-01
This manual describes the operation and theory of the PC-CARES (Personal Computer-Ceramic Analysis and Reliability Evaluation of Structures) computer program for the IBM PC and compatibles running the PC-DOS/MS-DOS or IBM/MS OS/2 (version 1.1 or higher) operating systems. The primary purpose of this code is to estimate Weibull material strength parameters, the Batdorf crack density coefficient, and other related statistical quantities. Included in the manual is the description of the calculation of shape and scale parameters of the two-parameter Weibull distribution using the least-squares analysis and maximum likelihood methods for volume- and surface-flaw-induced fracture in ceramics with complete and censored samples. The methods for detecting outliers and for calculating the Kolmogorov-Smirnov and the Anderson-Darling goodness-of-fit statistics and 90 percent confidence bands about the Weibull line, as well as the techniques for calculating the Batdorf flaw-density constants, are also described.
NASA Astrophysics Data System (ADS)
Shioiri, Tetsu; Asari, Naoki; Sato, Junichi; Sasage, Kosuke; Yokokura, Kunio; Homma, Mitsutaka; Suzuki, Katsumi
To investigate the reliability of vacuum insulation equipment, a study was carried out to clarify breakdown probability distributions in a vacuum gap. Further, a double-break vacuum circuit breaker was investigated for its breakdown probability distribution. The test results show that the breakdown probability distribution of the vacuum gap can be represented by a Weibull distribution using a location parameter, which shows the voltage that permits a zero breakdown probability. The location parameter obtained from the Weibull plot depends on electrode area. The shape parameter obtained from the Weibull plot of the vacuum gap was 10∼14, and is constant irrespective of the non-uniform field factor. The breakdown probability distribution after no-load switching can also be represented by a Weibull distribution using a location parameter. The shape parameter after no-load switching was 6∼8.5, and is constant irrespective of gap length. This indicates that the scatter of breakdown voltage was increased by no-load switching. If the vacuum circuit breaker uses a double break, the breakdown probability at low voltage becomes lower than the single-break probability. Although potential distribution is a concern in the double-break vacuum circuit breaker, its insulation reliability is better than that of the single-break vacuum interrupter even if the bias of the vacuum interrupter's sharing voltage is taken into account.
On alternative q-Weibull and q-extreme value distributions: Properties and applications
NASA Astrophysics Data System (ADS)
Zhang, Fode; Ng, Hon Keung Tony; Shi, Yimin
2018-01-01
Tsallis statistics and Tsallis distributions have been attracting a significant amount of research work in recent years. Importantly, Tsallis statistics and q-distributions have been applied in different disciplines. Yet, a relationship between some existing q-Weibull distributions and q-extreme value distributions that is parallel to the well-established relationship between the conventional Weibull and extreme value distributions through a logarithmic transformation has not been established. In this paper, we propose an alternative q-Weibull distribution that leads to a q-extreme value distribution via the q-logarithm transformation. Some important properties of the proposed q-Weibull and q-extreme value distributions are studied. Maximum likelihood and least squares estimation methods are used to estimate the parameters of the q-Weibull distribution and their performances are investigated through a Monte Carlo simulation study. The methodologies and the usefulness of the proposed distributions are illustrated by fitting the 2014 traffic fatalities data from The National Highway Traffic Safety Administration.
Evaluation of the Three Parameter Weibull Distribution Function for Predicting Fracture Probability in Composite Materials
1978-03-01
AFIT thesis (AFIT/GAE series) on the evaluation of the three-parameter Weibull distribution function for predicting fracture probability in composite materials; it derives an expression for the risk of rupture of a unidirectionally laminated composite subjected to pure bending, which can be simplified further.
Steve P. Verrill; David E. Kretschmann; James W. Evans
2016-01-01
Two important wood properties are stiffness (modulus of elasticity, MOE) and bending strength (modulus of rupture, MOR). In the past, MOE has often been modeled as a Gaussian and MOR as a lognormal or a two- or three-parameter Weibull. It is well known that MOE and MOR are positively correlated. To model the simultaneous behavior of MOE and MOR for the purposes of wood...
Mixture distributions of wind speed in the UAE
NASA Astrophysics Data System (ADS)
Shin, J.; Ouarda, T.; Lee, T. S.
2013-12-01
Wind speed probability distribution is commonly used to estimate potential wind energy. The 2-parameter Weibull distribution has been most widely used to characterize the distribution of wind speed. However, it is unable to properly model wind speed regimes when wind speed distribution presents bimodal and kurtotic shapes. Several studies have concluded that the Weibull distribution should not be used for frequency analysis of wind speed without investigation of wind speed distribution. Due to these mixture distributional characteristics of wind speed data, the application of mixture distributions should be further investigated in the frequency analysis of wind speed. A number of studies have investigated the potential wind energy in different parts of the Arabian Peninsula. Mixture distributional characteristics of wind speed were detected from some of these studies. Nevertheless, mixture distributions have not been employed for wind speed modeling in the Arabian Peninsula. In order to improve our understanding of wind energy potential in Arabian Peninsula, mixture distributions should be tested for the frequency analysis of wind speed. The aim of the current study is to assess the suitability of mixture distributions for the frequency analysis of wind speed in the UAE. Hourly mean wind speed data at 10-m height from 7 stations were used in the current study. The Weibull and Kappa distributions were employed as representatives of the conventional non-mixture distributions. 10 mixture distributions are used and constructed by mixing four probability distributions such as Normal, Gamma, Weibull and Extreme value type-one (EV-1) distributions. Three parameter estimation methods such as Expectation Maximization algorithm, Least Squares method and Meta-Heuristic Maximum Likelihood (MHML) method were employed to estimate the parameters of the mixture distributions. In order to compare the goodness-of-fit of tested distributions and parameter estimation methods for sample wind data, the adjusted coefficient of determination, Bayesian Information Criterion (BIC) and Chi-squared statistics were computed. Results indicate that MHML presents the best performance of parameter estimation for the used mixture distributions. In most of the employed 7 stations, mixture distributions give the best fit. When the wind speed regime shows mixture distributional characteristics, most of these regimes present the kurtotic statistical characteristic. Particularly, applications of mixture distributions for these stations show a significant improvement in explaining the whole wind speed regime. In addition, the Weibull-Weibull mixture distribution presents the best fit for the wind speed data in the UAE.
ZERODUR: deterministic approach for strength design
NASA Astrophysics Data System (ADS)
Hartmann, Peter
2012-12-01
There is increasing demand for zero-expansion glass ceramic ZERODUR substrates capable of enduring higher operational static loads or accelerations. The integrity of structures such as optical or mechanical elements for satellites surviving rocket launches, filigree lightweight mirrors, wobbling mirrors, and reticle and wafer stages in microlithography must be guaranteed with low failure probability. Their design requires statistically relevant strength data. The traditional approach using the statistical two-parameter Weibull distribution suffered from two problems: the data sets were too small to obtain distribution parameters with sufficient accuracy and also too small to decide on the validity of the model. This holds especially for the low failure probability levels that are required for reliable applications. Extrapolation to 0.1% failure probability and below led to design strengths so low that higher load applications seemed not to be feasible. New data have been collected with numbers per set large enough to enable tests of the applicability of the three-parameter Weibull distribution. This distribution proved to provide a much better fit to the data. Moreover, it delivers a lower threshold value, i.e. a minimum value for breakage stress, allowing statistical uncertainty to be removed by introducing a deterministic method to calculate design strength. Considerations from the theory of fracture mechanics, which have proven reliable in proof-test qualifications of delicate structures made from brittle materials, allow fatigue due to stress corrosion to be included in a straightforward way. With the formulae derived, either the lifetime can be calculated from a given stress or the allowable stress from a minimum required lifetime. The data, distributions, and design strength calculations for several practically relevant surface conditions of ZERODUR are given. The values obtained are significantly higher than those resulting from the two-parameter Weibull distribution approach and are no longer subject to statistical uncertainty.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yilmaz, Şeyda, E-mail: seydayilmaz@ktu.edu.tr; Bayrak, Erdem, E-mail: erdmbyrk@gmail.com; Bayrak, Yusuf, E-mail: bayrak@ktu.edu.tr
In this study we examined and compared three different probabilistic distribution methods for determining the most suitable model in probabilistic assessment of earthquake hazards. We analyzed a reliable homogeneous earthquake catalogue for the period 1900-2015 for magnitude M ≥ 6.0 and estimated the probabilistic seismic hazard in the North Anatolian Fault zone (39°-41° N, 30°-40° E) using three distribution methods, namely the Weibull distribution, the Frechet distribution and the three-parameter Weibull distribution. The suitability of the distribution parameters was evaluated with the Kolmogorov-Smirnov (K-S) goodness-of-fit test. We also compared the estimated cumulative probability and the conditional probabilities of occurrence of earthquakes for different elapsed times using these three distribution methods. We used Easyfit and Matlab software to calculate these distribution parameters and plotted the conditional probability curves. We concluded that the Weibull distribution method was more suitable than the other distribution methods in this region.
NASA Technical Reports Server (NTRS)
Starlinger, Alois; Duffy, Stephen F.; Palko, Joseph L.
1993-01-01
New methods are presented that utilize the optimization of goodness-of-fit statistics in order to estimate Weibull parameters from failure data. It is assumed that the underlying population is characterized by a three-parameter Weibull distribution. Goodness-of-fit tests are based on the empirical distribution function (EDF). The EDF is a step function, calculated using failure data, and represents an approximation of the cumulative distribution function for the underlying population. Statistics (such as the Kolmogorov-Smirnov statistic and the Anderson-Darling statistic) measure the discrepancy between the EDF and the cumulative distribution function (CDF). These statistics are minimized with respect to the three Weibull parameters. Due to nonlinearities encountered in the minimization process, Powell's numerical optimization procedure is applied to obtain the optimum value of the EDF. Numerical examples show the applicability of these new estimation methods. The results are compared to the estimates obtained with Cooper's nonlinear regression algorithm.
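A rough Python sketch of the idea of estimating three-parameter Weibull parameters by minimizing an EDF statistic; here the Anderson-Darling statistic is minimized with a derivative-free optimizer (Nelder-Mead is used in place of the Powell procedure named above), and the failure stresses are invented.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import weibull_min

def anderson_darling(params, x):
    """Anderson-Darling statistic between the EDF of x and a 3-parameter Weibull CDF."""
    shape, loc, scale = params
    if shape <= 0.0 or scale <= 0.0 or loc >= x.min():
        return np.inf
    n = len(x)
    F = weibull_min.cdf(np.sort(x), shape, loc=loc, scale=scale)
    F = np.clip(F, 1e-12, 1.0 - 1e-12)
    i = np.arange(1, n + 1)
    return -n - np.mean((2 * i - 1) * (np.log(F) + np.log(1.0 - F[::-1])))

# hypothetical failure stresses (MPa)
x = np.array([412.0, 448.0, 465.0, 489.0, 501.0, 523.0, 540.0, 562.0, 588.0, 615.0])

fit = minimize(anderson_darling, x0=[5.0, 0.0, x.mean()], args=(x,), method="Nelder-Mead")
shape, loc, scale = fit.x
print(f"shape ~ {shape:.2f}, threshold ~ {loc:.0f} MPa, scale ~ {scale:.0f} MPa, A^2 ~ {fit.fun:.3f}")
```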
Gomez-Lazaro, Emilio; Bueso, Maria C.; Kessler, Mathieu; ...
2016-02-02
Here, the Weibull probability distribution has been widely applied to characterize wind speeds for wind energy resources. Wind power generation modeling is different, however, due in particular to power curve limitations, wind turbine control methods, and transmission system operation requirements. These differences are even greater for aggregated wind power generation in power systems with high wind penetration. Consequently, models based on a single Weibull component can provide poor characterizations for aggregated wind power generation. With this aim, the present paper focuses on discussing Weibull mixtures to characterize the probability density function (PDF) for aggregated wind power generation. PDFs of wind power data are first classified according to hourly and seasonal patterns. The selection of the number of components in the mixture is analyzed through two well-known different criteria: the Akaike information criterion (AIC) and the Bayesian information criterion (BIC). Finally, the optimal number of Weibull components for maximum likelihood is explored for the defined patterns, including the estimated weight, scale, and shape parameters. Results show that multi-Weibull models are more suitable to characterize aggregated wind power data due to the impact of distributed generation, variety of wind speed values and wind power curtailment.
Study on constant-step stress accelerated life tests in white organic light-emitting diodes.
Zhang, J P; Liu, C; Chen, X; Cheng, G L; Zhou, A X
2014-11-01
In order to obtain reliability information for a white organic light-emitting diode (OLED), two constant and one step stress tests were conducted with its working current increased. The Weibull function was applied to describe the OLED life distribution, and the maximum likelihood estimation (MLE) and its iterative flow chart were used to calculate shape and scale parameters. Furthermore, the accelerated life equation was determined using the least squares method, a Kolmogorov-Smirnov test was performed to assess if the white OLED life follows a Weibull distribution, and self-developed software was used to predict the average and the median lifetimes of the OLED. The numerical results indicate that white OLED life conforms to a Weibull distribution, and that the accelerated life equation completely satisfies the inverse power law. The estimated life of a white OLED may provide significant guidelines for its manufacturers and customers. Copyright © 2014 John Wiley & Sons, Ltd.
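A small sketch of how an inverse power law life-stress relation can be fitted by least squares, as the abstract describes; the current levels and characteristic lives below are assumed, not the paper's data.

```python
import numpy as np

# assumed Weibull characteristic (scale) lives estimated at two constant-current stresses
current = np.array([40.0, 60.0])         # working current levels, mA (assumed)
char_life = np.array([5200.0, 2100.0])   # characteristic lives, hours (assumed)

# inverse power law: life = A * I**(-n)  =>  ln(life) = ln(A) - n * ln(I)
slope, ln_A = np.polyfit(np.log(current), np.log(char_life), 1)
n_exp, A = -slope, np.exp(ln_A)

# extrapolate the characteristic life down to a normal operating current
I_use = 20.0  # mA
life_use = A * I_use ** (-n_exp)
print(f"power-law exponent n ~ {n_exp:.2f}, predicted characteristic life at {I_use} mA ~ {life_use:.0f} h")
```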
Wang, Mingyu; Han, Lijuan; Liu, Shasha; Zhao, Xuebing; Yang, Jinghua; Loh, Soh Kheang; Sun, Xiaomin; Zhang, Chenxi; Fang, Xu
2015-09-01
Renewable energy from lignocellulosic biomass has been deemed an alternative to depleting fossil fuels. In order to improve this technology, we aim to develop robust mathematical models for the enzymatic lignocellulose degradation process. By analyzing 96 groups of previously published and newly obtained lignocellulose saccharification results and fitting them to Weibull distribution, we discovered Weibull statistics can accurately predict lignocellulose saccharification data, regardless of the type of substrates, enzymes and saccharification conditions. A mathematical model for enzymatic lignocellulose degradation was subsequently constructed based on Weibull statistics. Further analysis of the mathematical structure of the model and experimental saccharification data showed the significance of the two parameters in this model. In particular, the λ value, defined the characteristic time, represents the overall performance of the saccharification system. This suggestion was further supported by statistical analysis of experimental saccharification data and analysis of the glucose production levels when λ and n values change. In conclusion, the constructed Weibull statistics-based model can accurately predict lignocellulose hydrolysis behavior and we can use the λ parameter to assess the overall performance of enzymatic lignocellulose degradation. Advantages and potential applications of the model and the λ value in saccharification performance assessment were discussed. Copyright © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
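A minimal Python sketch of fitting a Weibull-type saccharification curve of the kind described above; the functional form with a plateau yield, characteristic time lambda, and shape n is an assumption for illustration, and the time-course data are invented.

```python
import numpy as np
from scipy.optimize import curve_fit

def weibull_yield(t, y_max, lam, n):
    """Weibull-type hydrolysis curve: yield rises toward y_max with characteristic time lam."""
    return y_max * (1.0 - np.exp(-(t / lam) ** n))

# hypothetical glucose yields over an enzymatic hydrolysis time course
t_h = np.array([2.0, 4.0, 8.0, 12.0, 24.0, 48.0, 72.0])   # hours
y = np.array([3.1, 5.6, 9.0, 11.2, 14.8, 17.0, 17.6])     # g/L

(y_max, lam, n), _ = curve_fit(weibull_yield, t_h, y, p0=[18.0, 20.0, 1.0])
print(f"plateau yield ~ {y_max:.1f} g/L, characteristic time lambda ~ {lam:.1f} h, shape n ~ {n:.2f}")
```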
Prediction of Mean and Design Fatigue Lives of Self Compacting Concrete Beams in Flexure
NASA Astrophysics Data System (ADS)
Goel, S.; Singh, S. P.; Singh, P.; Kaushik, S. K.
2012-02-01
In this paper, the results of an investigation conducted to study the flexural fatigue characteristics of self-compacting concrete (SCC) beams are presented. An experimental programme was planned in which approximately 60 SCC beam specimens of size 100 × 100 × 500 mm were tested under flexural fatigue loading. Approximately 45 static flexural tests were also conducted to facilitate fatigue testing. The flexural fatigue and static flexural strength tests were conducted on a 100 kN servo-controlled actuator. The fatigue life data thus obtained have been used to establish the probability distributions of fatigue life of SCC using the two-parameter Weibull distribution. The parameters of the Weibull distribution have been obtained by different methods of analysis. Using the distribution parameters, the mean and design fatigue lives of SCC have been estimated and compared with those of normally vibrated concrete (NVC), the data for which have been taken from the literature. It has been observed that SCC exhibits higher mean and design fatigue lives compared to NVC.
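For reference, once the two-parameter Weibull distribution has been fitted, the mean and design fatigue lives follow from closed-form expressions; the parameter values in this short sketch are assumed, not those of the study.

```python
import math

# assumed two-parameter Weibull fatigue-life parameters at one stress level
shape = 1.8        # Weibull shape parameter (dimensionless)
scale = 2.0e5      # Weibull scale parameter (characteristic life), cycles

# mean fatigue life: E[N] = scale * Gamma(1 + 1/shape)
mean_life = scale * math.gamma(1.0 + 1.0 / shape)

# design fatigue life for a required survival probability (reliability) R:
# N_R = scale * (-ln R)**(1/shape)
R = 0.90
design_life = scale * (-math.log(R)) ** (1.0 / shape)

print(f"mean life ~ {mean_life:.3g} cycles, design life at R = {R} ~ {design_life:.3g} cycles")
```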
pT spectra in pp and AA collisions at RHIC and LHC energies using the Tsallis-Weibull approach
NASA Astrophysics Data System (ADS)
Dash, Sadhana; Mahapatra, D. P.
2018-04-01
The Tsallis q-statistics have been incorporated in the Weibull model of particle production, in the form of a q-Weibull distribution, to describe the transverse momentum (pT) distribution of charged hadrons at mid-rapidity, measured at RHIC and LHC energies. The q-Weibull distribution is found to describe the observed pT distributions over all ranges of measured pT. Below 2.2 GeV/c, while going from peripheral to central collisions, the parameter q is found to decrease systematically towards unity, indicating an evolution from a non-equilibrated system in peripheral collisions towards a more thermalized system in central collisions. However, the trend is reversed in the all-inclusive pT regime. This can be attributed to an increase in the relative contribution of hard pQCD processes in central collisions. The λ parameter is found to be associated with the mean pT or the collective expansion velocity of the produced hadrons, which shows an expected increase with centrality of collisions. The k parameter is observed to increase with the onset of hard QCD scatterings, initial fluctuations, and other processes leading to non-equilibrium conditions.
Time-dependent reliability analysis of ceramic engine components
NASA Technical Reports Server (NTRS)
Nemeth, Noel N.
1993-01-01
The computer program CARES/LIFE calculates the time-dependent reliability of monolithic ceramic components subjected to thermomechanical and/or proof test loading. This program is an extension of the CARES (Ceramics Analysis and Reliability Evaluation of Structures) computer program. CARES/LIFE accounts for the phenomenon of subcritical crack growth (SCG) by utilizing either the power or Paris law relations. The two-parameter Weibull cumulative distribution function is used to characterize the variation in component strength. The effects of multiaxial stresses are modeled using either the principle of independent action (PIA), the Weibull normal stress averaging method (NSA), or the Batdorf theory. Inert strength and fatigue parameters are estimated from rupture strength data of naturally flawed specimens loaded in static, dynamic, or cyclic fatigue. Two example problems demonstrating proof testing and fatigue parameter estimation are given.
Weibull mixture regression for marginal inference in zero-heavy continuous outcomes.
Gebregziabher, Mulugeta; Voronca, Delia; Teklehaimanot, Abeba; Santa Ana, Elizabeth J
2017-06-01
Continuous outcomes with a preponderance of zero values are ubiquitous in data that arise from biomedical studies, for example studies of addictive disorders. This is known to lead to violation of standard assumptions in parametric inference and enhances the risk of misleading conclusions unless managed properly. Two-part models are commonly used to deal with this problem. However, standard two-part models have limitations with respect to obtaining parameter estimates that have a marginal interpretation of covariate effects, which are important in many biomedical applications. Recently, marginalized two-part models have been proposed, but their development is limited to log-normal and log-skew-normal distributions. Thus, in this paper, we propose a finite mixture approach, with Weibull mixture regression as a special case, to deal with the problem. We use an extensive simulation study to assess the performance of the proposed model in finite samples and to make comparisons with other families of models via statistical information and mean squared error criteria. We demonstrate its application on real data from a randomized controlled trial of addictive disorders. Our results show that a two-component Weibull mixture model is preferred for modeling zero-heavy continuous data when the non-zero part is simulated from Weibull or similar distributions such as the Gamma or truncated Gaussian.
Design of ceramic components with the NASA/CARES computer program
NASA Technical Reports Server (NTRS)
Nemeth, Noel N.; Manderscheid, Jane M.; Gyekenyesi, John P.
1990-01-01
The ceramics analysis and reliability evaluation of structures (CARES) computer program is described. The primary function of the code is to calculate the fast-fracture reliability or failure probability of macroscopically isotropic ceramic components. These components may be subjected to complex thermomechanical loadings, such as those found in heat engine applications. CARES uses results from MSC/NASTRAN or ANSYS finite-element analysis programs to evaluate how inherent surface and/or volume type flaws affect component reliability. CARES utilizes the Batdorf model and the two-parameter Weibull cumulative distribution function to describe the effects of multiaxial stress states on material strength. The principle of independent action (PIA) and the Weibull normal stress averaging models are also included. Weibull material strength parameters, the Batdorf crack density coefficient, and other related statistical quantities are estimated from four-point bend bar or uniform uniaxial tensile specimen fracture strength data. Parameter estimation can be performed for a single or multiple failure modes by using a least-squares analysis or a maximum likelihood method. Kolmogorov-Smirnov and Anderson-Darling goodness-of-fit tests, 90 percent confidence intervals on the Weibull parameters, and Kanofsky-Srinivasan 90 percent confidence band values are also provided. Examples are provided to illustrate the various features of CARES.
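As a rough sketch of the kind of fast-fracture calculation such codes perform (not the CARES implementation, which handles multiaxial stress fields from finite-element results), the failure probability of a uniformly stressed volume under a two-parameter Weibull strength model is shown below with assumed parameter values.

```python
import numpy as np

# assumed two-parameter Weibull strength statistics for the material
m = 12.0            # Weibull modulus (shape)
sigma_0 = 400.0     # Weibull scale parameter, MPa * mm^(3/m), per unit volume
V = 250.0           # uniformly stressed volume, mm^3

# fast-fracture failure probability for a uniform uniaxial stress sigma:
# P_f = 1 - exp(-V * (sigma / sigma_0)**m)
for sigma in (150.0, 200.0, 250.0, 300.0):
    p_f = 1.0 - np.exp(-V * (sigma / sigma_0) ** m)
    print(f"sigma = {sigma:5.1f} MPa -> P_f = {p_f:.3e}")
```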
Global sensitivity analysis in wind energy assessment
NASA Astrophysics Data System (ADS)
Tsvetkova, O.; Ouarda, T. B.
2012-12-01
Wind energy is one of the most promising renewable energy sources. Nevertheless, it is not yet a common source of energy, although there is enough wind potential to supply the world's energy demand. One of the most prominent obstacles on the way to employing wind energy is the uncertainty associated with wind energy assessment. Global sensitivity analysis (SA) studies how the variation of input parameters in an abstract model affects the variation of the variable of interest, or output variable. It also provides ways to calculate explicit measures of importance of input variables (first order and total effect sensitivity indices) with regard to their influence on the variation of the output variable. Two methods of determining the above-mentioned indices were applied and compared: the brute force method and the best practice estimation procedure. In this study, a methodology for conducting global SA of wind energy assessment at a planning stage is proposed. Three sampling strategies which are a part of the SA procedure were compared: sampling based on Sobol' sequences (SBSS), Latin hypercube sampling (LHS) and pseudo-random sampling (PRS). A case study of Masdar City, a showcase of sustainable living in the UAE, is used to exemplify application of the proposed methodology. Sources of uncertainty in wind energy assessment are very diverse. In the case study the following were identified as uncertain input parameters: the Weibull shape parameter, the Weibull scale parameter, availability of a wind turbine, lifetime of a turbine, air density, electrical losses, blade losses, and ineffective time losses. Ineffective time losses are defined as losses during the time when the actual wind speed is lower than the cut-in speed or higher than the cut-out speed. The output variable in the case study is the lifetime energy production. The most influential factors for lifetime energy production are identified with the ranking of the total effect sensitivity indices. The results of the present research show that the brute force method is best for wind assessment purposes, and SBSS outperforms the other sampling strategies in the majority of cases. The results indicate that the Weibull scale parameter, turbine lifetime and Weibull shape parameter are the three most influential variables in the case study setting. The following conclusions can be drawn from these results: 1) SBSS should be recommended for use in Monte Carlo experiments, 2) The brute force method should be recommended for conducting sensitivity analysis in wind resource assessment, and 3) Little variation in the Weibull scale causes significant variation in energy production. The presence of the two distribution parameters in the top three influential variables (the Weibull shape and scale) emphasizes the importance of accuracy in (a) choosing the distribution to model the wind regime at a site and (b) estimating the probability distribution parameters. This can be labeled the most important conclusion of this research because it opens a field for further research, which the authors see could change the wind energy field tremendously.
ZERODUR strength modeling with Weibull statistical distributions
NASA Astrophysics Data System (ADS)
Hartmann, Peter
2016-07-01
The decisive influence on breakage strength of brittle materials such as the low expansion glass ceramic ZERODUR is the surface condition. For polished or etched surfaces it is essential if micro cracks are present and how deep they are. Ground surfaces have many micro cracks caused by the generation process. Here only the depths of the micro cracks are relevant. In any case presence and depths of micro cracks are statistical by nature. The Weibull distribution is the model used traditionally for the representation of such data sets. It is based on the weakest link ansatz. The use of the two or three parameter Weibull distribution for data representation and reliability prediction depends on the underlying crack generation mechanisms. Before choosing the model for a specific evaluation, some checks should be done. Is there only one mechanism present or is it to be expected that an additional mechanism might contribute deviating results? For ground surfaces the main mechanism is the diamond grains' action on the surface. However, grains breaking from their bonding might be moved by the tool across the surface introducing a slightly deeper crack. It is not to be expected that these scratches follow the same statistical distribution as the grinding process. Hence, their description with the same distribution parameters is not adequate. Before including them a dedicated discussion should be performed. If there is additional information available influencing the selection of the model, for example the existence of a maximum crack depth, this should be taken into account also. Micro cracks introduced by small diamond grains on tools working with limited forces cannot be arbitrarily deep. For data obtained with such surfaces the existence of a threshold breakage stress should be part of the hypothesis. This leads to the use of the three parameter Weibull distribution. A differentiation based on the data set alone without preexisting information is possible but requires a large data set. With only 20 specimens per sample such differentiation is not possible. This requires 100 specimens per set, the more the better. The validity of the statistical evaluation methods is discussed with several examples. These considerations are of special importance because of their consequences on the prognosis methods and results. Especially the use of the two parameter Weibull distribution for high strength surfaces has led to non-realistic results. Extrapolation down to low acceptable probability of failure covers a wide range without data points existing and is mainly influenced by the slope determined by the high strength specimens. In the past this misconception has prevented the use of brittle materials for stress loads, which they could have endured easily.
Reliability Analysis of Uniaxially Ground Brittle Materials
NASA Technical Reports Server (NTRS)
Salem, Jonathan A.; Nemeth, Noel N.; Powers, Lynn M.; Choi, Sung R.
1995-01-01
The fast fracture strength distribution of uniaxially ground, alpha silicon carbide was investigated as a function of grinding angle relative to the principal stress direction in flexure. Both as-ground and ground/annealed surfaces were investigated. The resulting flexural strength distributions were used to verify reliability models and predict the strength distribution of larger plate specimens tested in biaxial flexure. Complete fractography was done on the specimens. Failures occurred from agglomerates, machining cracks, or hybrid flaws that consisted of a machining crack located at a processing agglomerate. Annealing eliminated failures due to machining damage. Reliability analyses were performed using two and three parameter Weibull and Batdorf methodologies. The Weibull size effect was demonstrated for machining flaws. Mixed mode reliability models reasonably predicted the strength distributions of uniaxial flexure and biaxial plate specimens.
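A minimal sketch of the Weibull size effect invoked above, under the usual two-parameter area-scaling argument; the numbers are invented for illustration and are not the paper's data.

```python
# Sketch of the Weibull size effect for area (surface) flaws: equal failure probability
# implies A1*(s1/s0)^m = A2*(s2/s0)^m, hence s2 = s1*(A1/A2)**(1/m).
# The numbers below are invented for illustration.
def scaled_strength(s1, A1, A2, m):
    """Characteristic strength for effective area A2, given strength s1 on area A1
    and Weibull modulus m."""
    return s1 * (A1 / A2) ** (1.0 / m)

# e.g. flexure bars with 20 mm^2 effective area and 400 MPa characteristic strength,
# extrapolated to a 500 mm^2 biaxial plate with Weibull modulus m = 10
print(scaled_strength(400.0, 20.0, 500.0, 10.0))   # about 290 MPa
```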
Transmission overhaul and replacement predictions using Weibull and renewal theory
NASA Technical Reports Server (NTRS)
Savage, M.; Lewicki, D. G.
1989-01-01
A method to estimate the frequency of transmission overhauls is presented. This method is based on the two-parameter Weibull statistical distribution for component life. A second method is presented to estimate the number of replacement components needed to support the transmission overhaul pattern. The second method is based on renewal theory. Confidence statistics are applied with both methods to improve the statistical estimate of sample behavior. A transmission example is also presented to illustrate the use of the methods. Transmission overhaul frequency and component replacement calculations are included in the example.
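A rough sketch of the renewal-theory step, assuming a two-parameter Weibull component life; the shape, scale and overhaul horizon below are illustrative values, and the expected number of replacements is estimated by Monte Carlo rather than by solving the renewal integral equation.

```python
# Sketch of the renewal-theory step: Monte Carlo estimate of the expected number of
# component replacements by an overhaul horizon, for a two-parameter Weibull life.
# Shape, scale and horizon are assumed values, not the report's data.
import numpy as np

def expected_replacements(shape, scale, horizon, n_sim=20000, seed=0):
    rng = np.random.default_rng(seed)
    counts = np.empty(n_sim)
    for i in range(n_sim):
        t, n = 0.0, 0
        while True:
            t += scale * rng.weibull(shape)   # life of the next installed component
            if t > horizon:
                break
            n += 1
        counts[i] = n
    half_width = 1.96 * counts.std(ddof=1) / np.sqrt(n_sim)  # simple 95% bound on the mean
    return counts.mean(), half_width

mean, hw = expected_replacements(shape=2.5, scale=3000.0, horizon=10000.0)  # hours
print(f"expected replacements in 10000 h: {mean:.2f} +/- {hw:.2f}")
```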
Investigation of Weibull statistics in fracture analysis of cast aluminum
NASA Technical Reports Server (NTRS)
Holland, Frederic A., Jr.; Zaretsky, Erwin V.
1989-01-01
The fracture strengths of two large batches of A357-T6 cast aluminum coupon specimens were compared by using two-parameter Weibull analysis. The minimum number of these specimens necessary to find the fracture strength of the material was determined. The applicability of three-parameter Weibull analysis was also investigated. A design methodology based on the combination of elementary stress analysis and Weibull statistical analysis is advanced and applied to the design of a spherical pressure vessel shell. The results from this design methodology are compared with results from the applicable ASME pressure vessel code.
NASA Astrophysics Data System (ADS)
Janković, Bojan
2009-10-01
The decomposition process of sodium bicarbonate (NaHCO3) has been studied by thermogravimetry under isothermal conditions at four different operating temperatures (380 K, 400 K, 420 K, and 440 K). It was found that the experimental integral and differential conversion curves at the different operating temperatures can be successfully described by the isothermal Weibull distribution function with a unique value of the shape parameter (β = 1.07). It was also established that the Weibull distribution parameters (β and η) are independent of the operating temperature. Using the integral and differential (Friedman) isoconversional methods, in the conversion (α) range of 0.20 ≤ α ≤ 0.80, the apparent activation energy (E_a) value was approximately constant (E_a,int = 95.2 kJ mol^-1 and E_a,diff = 96.6 kJ mol^-1, respectively). The values of E_a calculated by both isoconversional methods are in good agreement with the value of E_a evaluated from the Arrhenius equation (94.3 kJ mol^-1), which was expressed through the scale distribution parameter (η). The Málek isothermal procedure was used for estimation of the kinetic model for the investigated decomposition process. It was found that the two-parameter Šesták-Berggren (SB) autocatalytic model best describes the NaHCO3 decomposition process, with the conversion function f(α) = α^0.18 (1-α)^1.19. It was also concluded that the calculated density distribution functions of the apparent activation energies (ddf of E_a's) do not depend on the operating temperature and exhibit highly symmetrical behavior (shape factor = 1.00). The obtained isothermal decomposition results were compared with the corresponding results of the nonisothermal decomposition process of NaHCO3.
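The Weibull-plus-Arrhenius workflow described above can be sketched as follows; the conversion "data" are synthetic stand-ins generated with an assumed activation energy, so this only illustrates the fitting procedure, not the paper's results.

```python
# Sketch of the fitting workflow only: the isothermal Weibull conversion function
# a(t) = 1 - exp(-(t/eta)**beta) is fitted at each temperature, and an apparent
# activation energy is recovered from ln(1/eta) versus 1/T. The "data" are synthetic.
import numpy as np
from scipy.optimize import curve_fit

R = 8.314  # J mol^-1 K^-1

def weibull_conversion(t, beta, eta):
    return 1.0 - np.exp(-(t / eta) ** beta)

temps = np.array([380.0, 400.0, 420.0, 440.0])           # K
true_eta = 1e-9 * np.exp(95000.0 / (R * temps))          # synthetic eta(T), roughly Arrhenius
t = np.linspace(1.0, 5000.0, 200)                        # s
rng = np.random.default_rng(1)

etas = []
for eta_true in true_eta:
    alpha = weibull_conversion(t, 1.07, eta_true) + rng.normal(0.0, 0.005, t.size)
    (_, eta_fit), _ = curve_fit(weibull_conversion, t, alpha, p0=[1.0, np.median(t)])
    etas.append(eta_fit)

# Arrhenius form: 1/eta = A*exp(-Ea/(R*T)), so ln(1/eta) is linear in 1/T with slope -Ea/R
slope, _ = np.polyfit(1.0 / temps, np.log(1.0 / np.array(etas)), 1)
print(f"apparent activation energy: {-slope * R / 1000.0:.1f} kJ/mol")
```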
Lifetime Reliability Evaluation of Structural Ceramic Parts with the CARES/LIFE Computer Program
NASA Technical Reports Server (NTRS)
Nemeth, Noel N.; Powers, Lynn M.; Janosik, Lesley A.; Gyekenyesi, John P.
1993-01-01
The computer program CARES/LIFE calculates the time-dependent reliability of monolithic ceramic components subjected to thermomechanical and/or proof test loading. This program is an extension of the CARES (Ceramics Analysis and Reliability Evaluation of Structures) computer program. CARES/LIFE accounts for the phenomenon of subcritical crack growth (SCG) by utilizing the power law, Paris law, or Walker equation. The two-parameter Weibull cumulative distribution function is used to characterize the variation in component strength. The effects of multiaxial stresses are modeled using either the principle of independent action (PIA), Weibull's normal stress averaging method (NSA), or Batdorf's theory. Inert strength and fatigue parameters are estimated from rupture strength data of naturally flawed specimens loaded in static, dynamic, or cyclic fatigue. Two example problems demonstrating cyclic fatigue parameter estimation and component reliability analysis with proof testing are included.
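A minimal sketch of the principle of independent action with a two-parameter Weibull strength model (this is not the CARES/LIFE implementation; element stresses, volumes and Weibull parameters are invented for illustration):

```python
# Sketch of the principle of independent action (PIA) with a two-parameter Weibull
# strength model: each tensile principal stress contributes independently to the risk
# of rupture. Element stresses, volumes, m and sigma0 are invented inputs.
import numpy as np

def pia_failure_probability(principal_stresses, volumes, m, sigma0):
    """principal_stresses: (n_elements, 3) principal stresses (MPa); volumes: element
    volumes (mm^3); m: Weibull modulus; sigma0: scale parameter referred to unit volume."""
    s = np.clip(principal_stresses, 0.0, None)             # only tensile stresses contribute
    risk = np.sum(volumes[:, None] * (s / sigma0) ** m)    # discretized risk-of-rupture integral
    return 1.0 - np.exp(-risk)

stresses = np.array([[250.0, 120.0, -40.0],
                     [300.0,  80.0,  10.0],
                     [180.0,  60.0,   0.0]])
volumes = np.array([2.0, 1.5, 3.0])
print(pia_failure_probability(stresses, volumes, m=10.0, sigma0=500.0))
```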
NASA Technical Reports Server (NTRS)
Gross, Bernard
1996-01-01
Material characterization parameters obtained from naturally flawed specimens are necessary for reliability evaluation of non-deterministic advanced ceramic structural components. The least squares best fit method is applied to the three parameter uniaxial Weibull model to obtain the material parameters from experimental tests on volume or surface flawed specimens subjected to pure tension, pure bending, four point or three point loading. Several illustrative example problems are provided.
Decision Models for Determining the Optimal Life Test Sampling Plans
NASA Astrophysics Data System (ADS)
Nechval, Nicholas A.; Nechval, Konstantin N.; Purgailis, Maris; Berzins, Gundars; Strelchonok, Vladimir F.
2010-11-01
A life test sampling plan is a technique which consists of sampling, inspection, and decision making in determining the acceptance or rejection of a batch of products by experiments examining the continuous usage time of the products. In life testing studies, the lifetime is usually assumed to be distributed as either a one-parameter exponential distribution, or a two-parameter Weibull distribution with the assumption that the shape parameter is known. Such oversimplified assumptions can facilitate the follow-up analyses, but may overlook the fact that the lifetime distribution can significantly affect the estimation of the failure rate of a product. Moreover, sampling costs, inspection costs, warranty costs, and rejection costs are all essential, and ought to be considered in choosing an appropriate sampling plan. The choice of an appropriate life test sampling plan is a crucial decision problem because a good plan not only can help producers save testing time and reduce testing cost, but can also positively affect the image of the product, and thus attract more consumers to buy it. This paper develops frequentist (non-Bayesian) decision models for determining the optimal life test sampling plans with the aim of cost minimization by identifying the appropriate number of product failures in a sample that should be used as a threshold in judging the rejection of a batch. The two-parameter exponential and Weibull distributions with two unknown parameters are assumed to be appropriate for modelling the lifetime of a product. A practical numerical application is employed to demonstrate the proposed approach.
Fracture mechanics concepts in reliability analysis of monolithic ceramics
NASA Technical Reports Server (NTRS)
Manderscheid, Jane M.; Gyekenyesi, John P.
1987-01-01
Basic design concepts for high-performance, monolithic ceramic structural components are addressed. The design of brittle ceramics differs from that of ductile metals because of the inability of ceramic materials to redistribute high local stresses caused by inherent flaws. Random flaw size and orientation requires that a probabilistic analysis be performed in order to determine component reliability. The current trend in probabilistic analysis is to combine linear elastic fracture mechanics concepts with the two parameter Weibull distribution function to predict component reliability under multiaxial stress states. Nondestructive evaluation supports this analytical effort by supplying data during verification testing. It can also help to determine statistical parameters which describe the material strength variation, in particular the material threshold strength (the third Weibull parameter), which in the past was often taken as zero for simplicity.
Moody, John A.
2017-01-01
A superslug was deposited in a basin in the Colorado Front Range Mountains as a consequence of an extreme flood following a wildfire disturbance in 1996. The subsequent evolution of this superslug was measured by repeat topographic surveys (31 surveys from 1996 through 2014) of 18 cross sections approximately uniformly spaced over 1500 m immediately above the basin outlet. These surveys allowed the identification within the superslug of chronostratigraphic units deposited and eroded by different geomorphic processes in response to different flow regimes. Over the time period of the study, the superslug went through aggradation, incision, and stabilization phases that were controlled by a shift in geomorphic processes from generally short-duration, episodic, large-magnitude floods that deposited new chronostratigraphic units to long-duration processes that eroded units. These phases were not contemporaneous at each channel cross section, which resulted in a complex response that preserved different chronostratigraphic units at each channel cross section having, in general, two dominant types of alluvial architecture: laminar and fragmented. Age and transit-time distributions for these two alluvial architectures evolved with time since the extreme flood. Because of the complex shape of the distributions, they were best modeled by two-parameter Weibull functions. The Weibull scale parameter approximated the median age of the distributions, and the Weibull shape parameter generally had a linear relation that increased with time since the extreme flood. Additional results indicated that deposition of new chronostratigraphic units can be represented by a power-law frequency distribution, and that the erosion of units decreases with depth of burial to a limiting depth. These relations can be used to model other situations with different flow regimes where vertical aggradation and incision are dominant processes, to predict the residence time of possibly contaminated sediment stored in channels or on floodplains, and to provide insight into the interpretation of recent or ancient fluvial deposits.
Ceramics Analysis and Reliability Evaluation of Structures (CARES). Users and programmers manual
NASA Technical Reports Server (NTRS)
Nemeth, Noel N.; Manderscheid, Jane M.; Gyekenyesi, John P.
1990-01-01
This manual describes how to use the Ceramics Analysis and Reliability Evaluation of Structures (CARES) computer program. The primary function of the code is to calculate the fast fracture reliability or failure probability of macroscopically isotropic ceramic components. These components may be subjected to complex thermomechanical loadings, such as those found in heat engine applications. The program uses results from MSC/NASTRAN or ANSYS finite element analysis programs to evaluate component reliability due to inherent surface and/or volume type flaws. CARES utilizes the Batdorf model and the two-parameter Weibull cumulative distribution function to describe the effect of multiaxial stress states on material strength. The principle of independent action (PIA) and the Weibull normal stress averaging models are also included. Weibull material strength parameters, the Batdorf crack density coefficient, and other related statistical quantities are estimated from four-point bend bar or uniform uniaxial tensile specimen fracture strength data. Parameter estimation can be performed for single or multiple failure modes by using least-squares analysis or the maximum likelihood method. Kolmogorov-Smirnov and Anderson-Darling goodness-of-fit tests, ninety percent confidence intervals on the Weibull parameters, and Kanofsky-Srinivasan ninety percent confidence band values are also provided. The probabilistic fast-fracture theories used in CARES, along with the input and output for CARES, are described. Example problems to demonstrate various features of the program are also included. This manual describes the MSC/NASTRAN version of the CARES program.
ZERODUR - bending strength: review of achievements
NASA Astrophysics Data System (ADS)
Hartmann, Peter
2017-08-01
Increased demand for using the glass ceramic ZERODUR® with high mechanical loads called for strength data based on larger statistical samples. Design calculations for a failure probability target value below 1:100 000 cannot be made reliable with parameters derived from 20-specimen samples. The data now available for a variety of surface conditions, ground with different grain sizes and acid etched for full micro crack removal, allow stresses four to ten times higher than before. The large sample revealed that breakage stresses of ground surfaces follow the three-parameter Weibull distribution instead of the two-parameter version. This is more reasonable considering that the micro cracks of such surfaces have a maximum depth, which is reflected in the existence of a threshold breakage stress below which the breakage probability is zero. This minimum strength allows calculating minimum lifetimes. Fatigue under load can be taken into account by using the stress corrosion coefficient for the actual environmental humidity. For fully etched surfaces Weibull statistics fails. The precondition of the Weibull distribution, the existence of one unique failure mechanism, is no longer given. ZERODUR® with fully etched surfaces, free from damages introduced after etching, easily endures 100 MPa tensile stress. The possibility to use ZERODUR® for combined high-precision and high-stress applications was confirmed by the successful launch and continuing operation of LISA Pathfinder, the precursor experiment for the gravitational wave antenna satellite array eLISA.
NASA Astrophysics Data System (ADS)
Fetisova, Yu. A.; Ermolenko, B. V.; Ermolenko, G. V.; Kiseleva, S. V.
2017-04-01
We studied the information basis for the assessment of wind power potential on the territory of Russia. We described the methodology to determine the parameters of the Weibull function, which reflects the density of distribution of probabilities of wind flow speeds at a defined basic height above the surface of the earth using the available data on the average speed at this height and its repetition by gradations. The application of the least square method for determining these parameters, unlike the use of graphical methods, allows performing a statistical assessment of the results of approximation of empirical histograms by the Weibull formula. On the basis of the computer-aided analysis of the statistical data, it was shown that, at a fixed point where the wind speed changes at different heights, the range of parameter variation of the Weibull distribution curve is relatively small, the sensitivity of the function to parameter changes is quite low, and the influence of changes on the shape of speed distribution curves is negligible. Taking this into consideration, we proposed and mathematically verified the methodology of determining the speed parameters of the Weibull function at other heights using the parameter computations for this function at a basic height, which is known or defined by the average speed of wind flow, or the roughness coefficient of the geological substrate. We gave examples of practical application of the suggested methodology in the development of the Atlas of Renewable Energy Resources in Russia in conditions of deficiency of source meteorological data. The proposed methodology, to some extent, may solve the problem related to the lack of information on the vertical profile of repeatability of the wind flow speeds in the presence of a wide assortment of wind turbines with different ranges of wind-wheel axis heights and various performance characteristics in the global market; as a result, this methodology can become a powerful tool for effective selection of equipment in the process of designing a power supply system in a certain location.
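The least-squares determination of the Weibull shape and scale from speed repetition by gradations can be sketched as below; the bin edges and relative frequencies are invented for illustration, and the fit minimizes the squared difference between the observed bin frequencies and the Weibull bin probabilities.

```python
# Sketch: least-squares estimation of the Weibull shape (k) and scale (c) from a
# wind-speed histogram. Bin edges and relative frequencies are invented for illustration.
import numpy as np
from math import gamma
from scipy.optimize import curve_fit

edges = np.arange(0.0, 16.0, 1.0)   # m/s bin edges (15 bins)
freq = np.array([0.050, 0.090, 0.120, 0.130, 0.130, 0.120, 0.100, 0.080,
                 0.060, 0.045, 0.030, 0.020, 0.012, 0.008, 0.005])
centers = 0.5 * (edges[:-1] + edges[1:])

def weibull_cdf(v, k, c):
    return 1.0 - np.exp(-(v / c) ** k)

def bin_prob(_centers, k, c):
    # probabilities of the histogram bins under a Weibull(k, c) wind-speed distribution
    return weibull_cdf(edges[1:], k, c) - weibull_cdf(edges[:-1], k, c)

(k_fit, c_fit), _ = curve_fit(bin_prob, centers, freq, p0=[2.0, 6.0],
                              bounds=([0.5, 1.0], [10.0, 20.0]))
print(f"shape k = {k_fit:.2f}, scale c = {c_fit:.2f} m/s")
print(f"implied mean speed = {c_fit * gamma(1.0 + 1.0 / k_fit):.2f} m/s")
```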
Fisher information for two gamma frailty bivariate Weibull models.
Bjarnason, H; Hougaard, P
2000-03-01
The asymptotic properties of frailty models for multivariate survival data are not well understood. To study this aspect, the Fisher information is derived in the standard bivariate gamma frailty model, where the survival distribution is of Weibull form conditional on the frailty. For comparison, the Fisher information is also derived in the bivariate gamma frailty model, where the marginal distribution is of Weibull form.
NASA Astrophysics Data System (ADS)
Pasari, S.; Kundu, D.; Dikshit, O.
2012-12-01
Earthquake recurrence interval is one of the important ingredients in probabilistic seismic hazard assessment (PSHA) for any location. Exponential, gamma, Weibull and lognormal distributions are well-established probability models for this recurrence interval estimation. However, they have certain shortcomings too. Thus, it is imperative to search for some alternative sophisticated distributions. In this paper, we introduce a three-parameter (location, scale and shape) exponentiated exponential distribution and investigate the scope of this distribution as an alternative to the afore-mentioned distributions in earthquake recurrence studies. This distribution is a particular member of the exponentiated Weibull family. Despite its complicated form, it is widely accepted in medical and biological applications. Furthermore, it shares many physical properties with the gamma and Weibull families. Unlike the gamma distribution, the hazard function of the generalized exponential distribution can be easily computed even if the shape parameter is not an integer. To contemplate the plausibility of this model, a complete and homogeneous earthquake catalogue of 20 events (M ≥ 7.0) spanning the period 1846 to 1995 from the North-East Himalayan region (20-32 deg N and 87-100 deg E) has been used. The model parameters are estimated using the maximum likelihood estimator (MLE) and the method of moments estimator (MOME). No geological or geophysical evidence has been considered in this calculation. The estimated conditional probability becomes quite high after about a decade for an elapsed time of 17 years (i.e., 2012). Moreover, this study shows that the generalized exponential distribution fits the above data events more closely than the conventional models, and hence it is tentatively concluded that the generalized exponential distribution can be effectively considered in earthquake recurrence studies.
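A sketch of the three-parameter exponentiated (generalized) exponential model described above, fitted by maximum likelihood to synthetic recurrence intervals (the catalogue events are not reproduced here); the closed-form hazard illustrates the computational advantage over the gamma distribution noted in the abstract.

```python
# Sketch of the three-parameter exponentiated (generalized) exponential distribution,
# CDF F(x) = (1 - exp(-lam*(x - mu)))**alpha for x > mu, fitted by maximum likelihood.
# The recurrence intervals below are synthetic, not the catalogue events.
import numpy as np
from scipy.optimize import minimize

def neg_log_lik(theta, x):
    alpha, lam, mu = theta
    if alpha <= 0.0 or lam <= 0.0 or np.any(x <= mu):
        return np.inf
    z = lam * (x - mu)
    logpdf = np.log(alpha) + np.log(lam) + (alpha - 1.0) * np.log1p(-np.exp(-z)) - z
    return -np.sum(logpdf)

rng = np.random.default_rng(2)
intervals = 2.0 + 6.0 * rng.gamma(2.0, 1.0, size=19)   # synthetic inter-event times (years)

res = minimize(neg_log_lik, x0=[1.5, 0.2, 0.5 * intervals.min()],
               args=(intervals,), method="Nelder-Mead")
alpha, lam, mu = res.x
print("alpha, lambda, mu =", np.round(res.x, 3))

def hazard(x):
    # closed-form hazard, easy to evaluate for any (non-integer) shape, unlike the gamma
    z = lam * (x - mu)
    F = (1.0 - np.exp(-z)) ** alpha
    f = alpha * lam * (1.0 - np.exp(-z)) ** (alpha - 1.0) * np.exp(-z)
    return f / (1.0 - F)

print("hazard at 10, 20, 30 years:", [round(hazard(v), 3) for v in (10.0, 20.0, 30.0)])
```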
Weibull analysis of fracture test data on bovine cortical bone: influence of orientation.
Khandaker, Morshed; Ekwaro-Osire, Stephen
2013-01-01
The fracture toughness, K_IC, of a cortical bone has been experimentally determined by several researchers. The variation in K_IC values arises from the variation of specimen orientation, shape, and size during the experiment. The fracture toughness of a cortical bone is governed by the severest flaw and, hence, may be analyzed using Weibull statistics. To the best of the authors' knowledge, however, no studies of this aspect have been published. The motivation of the study is the evaluation of Weibull parameters in the circumferential-longitudinal (CL) and longitudinal-circumferential (LC) directions. We hypothesized that the Weibull parameters vary depending on the bone microstructure. In the present work, a two-parameter Weibull statistical model was applied to calculate the plane-strain fracture toughness of bovine femoral cortical bone obtained using specimens extracted from the CL and LC directions of the bone. It was found that the Weibull modulus of fracture toughness was larger for CL specimens compared to LC specimens, but the opposite trend was seen for the characteristic fracture toughness. The reason for these trends is the microstructural and extrinsic toughening mechanism differences between the CL and LC directions of bone. The Weibull parameters found in this study can be applied to develop a damage-mechanics model for bone.
The Effect of Sr Modifier Additions on Double Oxide Film Defects in 2L99 Alloy Castings
NASA Astrophysics Data System (ADS)
Chen, Qi; Griffiths, W. D.
2017-11-01
In this paper, Sr modifier (300 ppm) was added to 2L99 alloy sand castings to investigate its effect on bifilm defects in the castings. Two different sand molds were used in this study, with good and bad running system designs, to introduce different amounts of bifilm defects into the castings. The mechanical properties of the modified 2L99 castings were compared to the properties of unmodified castings and showed that with high bifilm defect contents (H) the Sr addition reduced the Weibull modulus of the UTS by 67 pct and the Position Parameter by 5 pct, and introduced a bimodal distribution into the Weibull plot of the pct Elongation. However, for castings with low bifilm defect content (L), the Weibull moduli of both the UTS and pct Elongation were significantly improved (by 78 and 73 pct, respectively) with the addition of Sr. The Position Parameter of the pct Elongation was improved by 135 pct. The results suggested that a desirable modification effect can only be achieved when the bifilm defect content in a casting is low.
Earthquakes: Recurrence and Interoccurrence Times
NASA Astrophysics Data System (ADS)
Abaimov, S. G.; Turcotte, D. L.; Shcherbakov, R.; Rundle, J. B.; Yakovlev, G.; Goltz, C.; Newman, W. I.
2008-04-01
The purpose of this paper is to discuss the statistical distributions of recurrence times of earthquakes. Recurrence times are the time intervals between successive earthquakes at a specified location on a specified fault. Although a number of statistical distributions have been proposed for recurrence times, we argue in favor of the Weibull distribution. The Weibull distribution is the only distribution that has a scale-invariant hazard function. We consider three sets of characteristic earthquakes on the San Andreas fault: (1) The Parkfield earthquakes, (2) the sequence of earthquakes identified by paleoseismic studies at the Wrightwood site, and (3) an example of a sequence of micro-repeating earthquakes at a site near San Juan Bautista. In each case we make a comparison with the applicable Weibull distribution. The number of earthquakes in each of these sequences is too small to make definitive conclusions. To overcome this difficulty we consider a sequence of earthquakes obtained from a one million year “Virtual California” simulation of San Andreas earthquakes. Very good agreement with a Weibull distribution is found. We also obtain recurrence statistics for two other model studies. The first is a modified forest-fire model and the second is a slider-block model. In both cases good agreements with Weibull distributions are obtained. Our conclusion is that the Weibull distribution is the preferred distribution for estimating the risk of future earthquakes on the San Andreas fault and elsewhere.
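The scale-invariance argument can be checked numerically: the Weibull hazard h(t) = (β/η)(t/η)^(β-1) is a pure power law, so the ratio h(ct)/h(t) does not depend on t, unlike, for example, a lognormal hazard. A small sketch with arbitrary parameter values:

```python
# Sketch: the Weibull hazard h(t) = (beta/eta)*(t/eta)**(beta - 1) is a pure power law,
# so h(2t)/h(t) = 2**(beta - 1) for every t (scale-invariant), unlike a lognormal hazard.
import numpy as np
from scipy import stats

def hazard(dist, t):
    return dist.pdf(t) / dist.sf(t)

t = np.array([10.0, 50.0, 250.0])
weib = stats.weibull_min(c=1.5, scale=100.0)   # arbitrary parameters
logn = stats.lognorm(s=0.8, scale=100.0)

print("Weibull   h(2t)/h(t):", np.round(hazard(weib, 2 * t) / hazard(weib, t), 4))  # constant
print("Lognormal h(2t)/h(t):", np.round(hazard(logn, 2 * t) / hazard(logn, t), 4))  # varies with t
```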
NASA Astrophysics Data System (ADS)
Santi, D. N.; Purnaba, I. G. P.; Mangku, I. W.
2016-01-01
A Bonus-Malus system is said to be optimal if it is financially balanced for the insurance company and fair for policyholders. Previous research on Bonus-Malus systems has been concerned with determining the risk premium applied to the entire severity guaranteed by the insurance company. In fact, not all of the severity claimed by a policyholder may be covered by the insurance company. When the insurance company sets a maximum bound on the severity incurred, it is necessary to modify the severity distribution into a bounded severity distribution. In this paper, an optimal Bonus-Malus system whose claim frequency component has a geometric distribution and whose severity component has a truncated Weibull distribution is discussed. The number of claims is considered to follow a Poisson distribution, and the expected number λ is exponentially distributed, so the number of claims has a geometric distribution. The severity with a given parameter θ is considered to have a truncated exponential distribution, and θ is modelled using the Levy distribution, so the severity has a truncated Weibull distribution.
Morphological study on the prediction of the site of surface slides
Hiromasa Hiura
1991-01-01
The annual continual occurrence of surface slides in the basin was estimated by modifying the estimation formula of Yoshimatsu. The Weibull distribution function proved to be useful for representing the state and the transition of surface slides in the basin. The three parameters of the Weibull function were found to be linear functions of the area ratio a/A. The...
A Monte Carlo study of Weibull reliability analysis for space shuttle main engine components
NASA Technical Reports Server (NTRS)
Abernethy, K.
1986-01-01
The incorporation of a number of additional capabilities into an existing Weibull analysis computer program and the results of a Monte Carlo computer simulation study to evaluate the usefulness of the Weibull methods using samples with a very small number of failures and extensive censoring are discussed. Since the censoring mechanism inherent in the Space Shuttle Main Engine (SSME) data is hard to analyze, it was decided to use a random censoring model, generating censoring times from a uniform probability distribution. Some of the statistical techniques and computer programs that are used in the SSME Weibull analysis are described. The previously documented methods were supplemented by adding computer calculations of approximate (using iterative methods) confidence intervals for several parameters of interest. These calculations are based on a likelihood ratio statistic which is asymptotically a chi-squared statistic with one degree of freedom. The assumptions built into the computer simulations are described. The simulation program and the techniques used in it are also described. Simulation results are tabulated for various combinations of Weibull shape parameters and the numbers of failures in the samples.
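The random-censoring setup can be sketched as follows (a minimal illustration, not the SSME analysis program itself): lifetimes are drawn from an assumed Weibull, censoring times from a uniform distribution, and the shape and scale are recovered by maximizing the censored likelihood.

```python
# Sketch of the random-censoring model: Weibull lifetimes, uniform censoring times,
# and maximum-likelihood recovery of the shape and scale from the censored sample.
# True parameters, sample size and censoring window are assumptions for illustration.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(3)
shape_true, scale_true, n = 1.8, 1000.0, 30
life = scale_true * rng.weibull(shape_true, n)   # underlying lifetimes
cens = rng.uniform(0.0, 1500.0, n)               # random censoring times
t = np.minimum(life, cens)                       # observed times
failed = life <= cens                            # True = failure observed, False = censored

def neg_log_lik(theta):
    beta, eta = theta
    if beta <= 0.0 or eta <= 0.0:
        return np.inf
    z = (t / eta) ** beta
    # failures contribute the log-density, censored units the log-survivor function
    ll = np.sum(failed * (np.log(beta / eta) + (beta - 1.0) * np.log(t / eta) - z)
                + (~failed) * (-z))
    return -ll

res = minimize(neg_log_lik, x0=[1.0, np.median(t)], method="Nelder-Mead")
print("failures observed:", int(failed.sum()), "of", n)
print("MLE shape, scale:", np.round(res.x, 2))
```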
Probability distribution functions for unit hydrographs with optimization using genetic algorithm
NASA Astrophysics Data System (ADS)
Ghorbani, Mohammad Ali; Singh, Vijay P.; Sivakumar, Bellie; H. Kashani, Mahsa; Atre, Atul Arvind; Asadi, Hakimeh
2017-05-01
A unit hydrograph (UH) of a watershed may be viewed as the unit pulse response function of a linear system. In recent years, the use of probability distribution functions (pdfs) for determining a UH has received much attention. In this study, a nonlinear optimization model is developed to transmute a UH into a pdf. The potential of six popular pdfs, namely the two-parameter gamma, two-parameter Gumbel, two-parameter log-normal, two-parameter normal, three-parameter Pearson, and two-parameter Weibull distributions, is tested on data from the Lighvan catchment in Iran. The probability distribution parameters are determined using the nonlinear least squares optimization method in two ways: (1) optimization by programming in Mathematica; and (2) optimization by applying a genetic algorithm. The results are compared with those obtained by the traditional linear least squares method. The results show comparable capability and performance of the two nonlinear methods. The gamma and Pearson distributions are the most successful models in preserving the rising and recession limbs of the unit hydrographs. The log-normal distribution has a high ability in predicting both the peak flow and time to peak of the unit hydrograph. The nonlinear optimization method does not outperform the linear least squares method in determining the UH (especially for excess rainfall of one pulse), but is comparable.
C-Sphere Strength-Size Scaling in a Bearing-Grade Silicon Nitride
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wereszczak, Andrew A; Jadaan, Osama M.; Kirkland, Timothy Philip
2008-01-01
A C-sphere specimen geometry was used to determine the failure strength distributions of a commercially available bearing-grade silicon nitride (Si3N4) having ball diameters of 12.7 and 25.4 mm. Strengths for both diameters were determined using the combination of failure load, C-sphere geometry, and finite element analysis and fitted using two-parameter Weibull distributions. Effective areas of both diameters were estimated as a function of Weibull modulus and used to explore whether the strength distributions predictably strength-scaled between each size. They did not. That statistical observation suggested that the same flaw type did not limit the strength of both ball diameters, indicating a lack of material homogeneity between the two sizes. Optical fractography confirmed that. It showed there were two distinct strength-limiting flaw types in both ball diameters, that one flaw type was always associated with lower-strength specimens, and that a significantly higher fraction of the 25.4-mm-diameter C-sphere specimens failed from it. Predictable strength-size-scaling would therefore not result as a consequence of this because these flaw types were not homogeneously distributed and sampled in both C-sphere geometries.
Effect of bending on the room-temperature tensile strengths of structural ceramics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jenkins, M.G.
1992-01-01
Results for nearly fifty room-temperature tensile tests conducted on two advanced, monolithic silicon nitride ceramics are evaluated for the effects of bending and the application of various Weibull statistical analyses. Two specimen gripping systems (straight collet and tapered collet) were evaluated for both success in producing gage section failures and tendency to minimize bending at failure. Specimen fabrication and grinding technique considerations are briefly reviewed and related to their effects on successful tensile tests. Ultimate tensile strengths are related to the bending measured at specimen failure and the effects of the gripping system on bending are discussed. Finally, comparisons are made between the use of censored and uncensored data sample sets for determining the maximum likelihood estimates of the Weibull parameters from the tensile strength distributions.
NASA Technical Reports Server (NTRS)
Gyekenyesi, John P.; Nemeth, Noel N.
1987-01-01
The SCARE (Structural Ceramics Analysis and Reliability Evaluation) computer program on statistical fast fracture reliability analysis with quadratic elements for volume distributed imperfections is enhanced to include the use of linear finite elements and the capability of designing against concurrent surface flaw induced ceramic component failure. The SCARE code is presently coupled as a postprocessor to the MSC/NASTRAN general purpose, finite element analysis program. The improved version now includes the Weibull and Batdorf statistical failure theories for both surface and volume flaw based reliability analysis. The program uses the two-parameter Weibull fracture strength cumulative failure probability distribution model with the principle of independent action for poly-axial stress states, and Batdorf's shear-sensitive as well as shear-insensitive statistical theories. The shear-sensitive surface crack configurations include the Griffith crack and Griffith notch geometries, using the total critical coplanar strain energy release rate criterion to predict mixed-mode fracture. Weibull material parameters based on both surface and volume flaw induced fracture can also be calculated from modulus of rupture bar tests, using the least squares method with known specimen geometry and grouped fracture data. The statistical fast fracture theories for surface flaw induced failure, along with selected input and output formats and options, are summarized. An example problem to demonstrate various features of the program is included.
Distributions of Autocorrelated First-Order Kinetic Outcomes: Illness Severity
Englehardt, James D.
2015-01-01
Many complex systems produce outcomes having recurring, power law-like distributions over wide ranges. However, the form necessarily breaks down at extremes, whereas the Weibull distribution has been demonstrated over the full observed range. Here the Weibull distribution is derived as the asymptotic distribution of generalized first-order kinetic processes, with convergence driven by autocorrelation, and entropy maximization subject to finite positive mean, of the incremental compounding rates. Process increments represent multiplicative causes. In particular, illness severities are modeled as such, occurring in proportion to products of, e.g., chronic toxicant fractions passed by organs along a pathway, or rates of interacting oncogenic mutations. The Weibull form is also argued theoretically and by simulation to be robust to the onset of saturation kinetics. The Weibull exponential parameter is shown to indicate the number and widths of the first-order compounding increments, the extent of rate autocorrelation, and the degree to which process increments are distributed exponential. In contrast with the Gaussian result in linear independent systems, the form is driven not by independence and multiplicity of process increments, but by increment autocorrelation and entropy. In some physical systems the form may be attracting, due to multiplicative evolution of outcome magnitudes towards extreme values potentially much larger and smaller than control mechanisms can contain. The Weibull distribution is demonstrated in preference to the lognormal and Pareto I for illness severities versus (a) toxicokinetic models, (b) biologically-based network models, (c) scholastic and psychological test score data for children with prenatal mercury exposure, and (d) time-to-tumor data of the ED01 study. PMID:26061263
Durability evaluation of ceramic components using CARES/LIFE
NASA Technical Reports Server (NTRS)
Nemeth, Noel N.; Powers, Lynn M.; Janosik, Lesley A.; Gyekenyesi, John P.
1994-01-01
The computer program CARES/LIFE calculates the time-dependent reliability of monolithic ceramic components subjected to thermomechanical and/or proof test loading. This program is an extension of the CARES (Ceramics Analysis and Reliability Evaluation of Structures) computer program. CARES/LIFE accounts for the phenomenon of subcritical crack growth (SCG) by utilizing the power law, Paris law, or Walker equation. The two-parameter Weibull cumulative distribution function is used to characterize the variation in component strength. The effects of multiaxial stresses are modeled using either the principle of independent action (PIA), the Weibull normal stress averaging method (NSA), or the Batdorf theory. Inert strength and fatigue parameters are estimated from rupture strength data of naturally flawed specimens loaded in static, dynamic, or cyclic fatigue. Application of this design methodology is demonstrated using experimental data from alumina bar and disk flexure specimens which exhibit SCG when exposed to water.
Reproducibility of structural strength and stiffness for graphite-epoxy aircraft spoilers
NASA Technical Reports Server (NTRS)
Howell, W. E.; Reese, C. D.
1978-01-01
Structural strength reproducibility of graphite epoxy composite spoilers for the Boeing 737 aircraft was evaluated by statically loading fifteen spoilers to failure at conditions simulating aerodynamic loads. Spoiler strength and stiffness data were statistically modeled using a two parameter Weibull distribution function. Shape parameter values calculated for the composite spoiler strength and stiffness were within the range of corresponding shape parameter values calculated for material property data of composite laminates. This agreement showed that reproducibility of full scale component structural properties was within the reproducibility range of data from material property tests.
NASA Technical Reports Server (NTRS)
Gyekenyesi, J. P.
1985-01-01
A computer program was developed for calculating the statistical fast fracture reliability and failure probability of ceramic components. The program includes the two-parameter Weibull material fracture strength distribution model, using the principle of independent action for polyaxial stress states and Batdorf's shear-sensitive as well as shear-insensitive crack theories, all for volume distributed flaws in macroscopically isotropic solids. Both penny-shaped cracks and Griffith cracks are included in the Batdorf shear-sensitive crack response calculations, using Griffith's maximum tensile stress or critical coplanar strain energy release rate criteria to predict mixed mode fracture. Weibull material parameters can also be calculated from modulus of rupture bar tests, using the least squares method with known specimen geometry and fracture data. The reliability prediction analysis uses MSC/NASTRAN stress, temperature and volume output, obtained from the use of three-dimensional, quadratic, isoparametric, or axisymmetric finite elements. The statistical fast fracture theories employed, along with selected input and output formats and options, are summarized. An example problem to demonstrate various features of the program is included.
NASA Astrophysics Data System (ADS)
Zuo, Weiguang; Liu, Ming; Fan, Tianhui; Wang, Pengtao
2018-06-01
This paper presents the probability distribution of the slamming pressure from an experimental study of regular wave slamming on an elastically supported horizontal deck. The time series of the slamming pressure during the wave impact were first obtained through statistical analyses on experimental data. The exceeding probability distribution of the maximum slamming pressure peak and distribution parameters were analyzed, and the results show that the exceeding probability distribution of the maximum slamming pressure peak accords with the three-parameter Weibull distribution. Furthermore, the range and relationships of the distribution parameters were studied. The sum of the location parameter D and the scale parameter L was approximately equal to 1.0, and the exceeding probability was more than 36.79% when the random peak was equal to the sample average during the wave impact. The variation of the distribution parameters and slamming pressure under different model conditions were comprehensively presented, and the parameter values of the Weibull distribution of wave-slamming pressure peaks were different due to different test models. The parameter values were found to decrease due to the increased stiffness of the elastic support. The damage criterion of the structure model caused by the wave impact was initially discussed, and the structure model was destroyed when the average slamming time was greater than a certain value during the duration of the wave impact. The conclusions of the experimental study were then described.
A Model Based on Environmental Factors for Diameter Distribution in Black Wattle in Brazil
Sanquetta, Carlos Roberto; Behling, Alexandre; Dalla Corte, Ana Paula; Péllico Netto, Sylvio; Rodrigues, Aurelio Lourenço; Simon, Augusto Arlindo
2014-01-01
This article discusses the dynamics of a diameter distribution in stands of black wattle throughout its growth cycle using the Weibull probability density function. Moreover, the parameters of this distribution were related to environmental variables from meteorological data and surface soil horizon with the aim of finding a model for diameter distribution which their coefficients were related to the environmental variables. We found that the diameter distribution of the stand changes only slightly over time and that the estimators of the Weibull function are correlated with various environmental variables, with accumulated rainfall foremost among them. Thus, a model was obtained in which the estimators of the Weibull function are dependent on rainfall. Such a function can have important applications, such as in simulating growth potential in regions where historical growth data is lacking, as well as the behavior of the stand under different environmental conditions. The model can also be used to project growth in diameter, based on the rainfall affecting the forest over a certain time period. PMID:24932909
Weibull crack density coefficient for polydimensional stress states
NASA Technical Reports Server (NTRS)
Gross, Bernard; Gyekenyesi, John P.
1989-01-01
A structural ceramic analysis and reliability evaluation code has recently been developed encompassing volume and surface flaw induced fracture, modeled by the two-parameter Weibull probability density function. A segment of the software involves computing the Weibull polydimensional stress state crack density coefficient from uniaxial stress experimental fracture data. The relationship of the polydimensional stress coefficient to the uniaxial stress coefficient is derived for a shear-insensitive material with a random surface flaw population.
Interpreting the Weibull fitting parameters for diffusion-controlled release data
NASA Astrophysics Data System (ADS)
Ignacio, Maxime; Chubynsky, Mykyta V.; Slater, Gary W.
2017-11-01
We examine the diffusion-controlled release of molecules from passive delivery systems using both analytical solutions of the diffusion equation and numerically exact Lattice Monte Carlo data. For very short times, the release process follows a √t power law, typical of diffusion processes, while the long-time asymptotic behavior is exponential. The crossover time between these two regimes is determined by the boundary conditions and initial loading of the system. We show that while the widely used Weibull function provides a reasonable fit (in terms of statistical error), it has two major drawbacks: (i) it does not capture the correct limits and (ii) there is no direct connection between the fitting parameters and the properties of the system. Using a physically motivated interpolating fitting function that correctly includes both time regimes, we are able to predict the values of the Weibull parameters which allows us to propose a physical interpretation.
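A small sketch of the fitting issue discussed above: release curves are generated from the classical series solution for a plane sheet (with assumed D and L), fitted with the empirical Weibull function, and compared at short times, where diffusion follows √t while the fitted Weibull follows t^b.

```python
# Sketch: diffusion-controlled release from a plane sheet (Crank series solution, with
# assumed D and L) fitted by the empirical Weibull function f(t) = 1 - exp(-(t/tau)**b).
import numpy as np
from scipy.optimize import curve_fit

def release_exact(t, D, L, n_terms=200):
    n = np.arange(n_terms)[:, None]
    terms = (8.0 / ((2 * n + 1) ** 2 * np.pi ** 2)
             * np.exp(-((2 * n + 1) ** 2) * np.pi ** 2 * D * t / L ** 2))
    return 1.0 - terms.sum(axis=0)

def weibull_release(t, b, tau):
    return 1.0 - np.exp(-(t / tau) ** b)

D, L = 1e-10, 1e-4                       # m^2/s and sheet thickness in m (assumed)
t = np.linspace(0.5, 60.0, 200)          # s
frac = release_exact(t, D, L)

(b, tau), _ = curve_fit(weibull_release, t, frac, p0=[0.8, 8.0])
print(f"Weibull fit: b = {b:.2f}, tau = {tau:.1f} s")

# Short-time check: exact release grows like sqrt(t), the fitted Weibull like t**b
print("exact f(1 s)   =", round(float(release_exact(np.array([1.0]), D, L)[0]), 3))
print("Weibull f(1 s) =", round(float(weibull_release(1.0, b, tau)), 3))
```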
Four Theorems on the Psychometric Function
May, Keith A.; Solomon, Joshua A.
2013-01-01
In a 2-alternative forced-choice (2AFC) discrimination task, observers choose which of two stimuli has the higher value. The psychometric function for this task gives the probability of a correct response for a given stimulus difference, Δx. This paper proves four theorems about the psychometric function. Assuming the observer applies a transducer and adds noise, Theorem 1 derives a convenient general expression for the psychometric function. Discrimination data are often fitted with a Weibull function. Theorem 2 proves that the Weibull “slope” parameter, β, can be approximated by β_Noise × β_transducer, where β_Noise is the β of the Weibull function that fits best to the cumulative noise distribution, and β_transducer depends on the transducer. We derive general expressions for β_Noise and β_transducer, from which we derive expressions for specific cases. One case that follows naturally from our general analysis is Pelli's finding for the case of a power-function transducer. We also consider two limiting cases. Theorem 3 proves that, as sensitivity improves, 2AFC performance will usually approach that for a linear transducer, whatever the actual transducer; we show that this does not apply at signal levels where the transducer gradient is zero, which explains why it does not apply to contrast detection. Theorem 4 proves that, when the exponent of a power-function transducer approaches zero, 2AFC performance approaches that of a logarithmic transducer. We show that the power-function exponents of 0.4–0.5 fitted to suprathreshold contrast discrimination data are close enough to zero for the fitted psychometric function to be practically indistinguishable from that of a log transducer. Finally, the Weibull β reflects the shape of the noise distribution, and we used our results to assess the recent claim that internal noise has higher kurtosis than a Gaussian. Our analysis of β for contrast discrimination suggests that, if internal noise is stimulus-independent, it has lower kurtosis than a Gaussian. PMID:24124456
NASA Astrophysics Data System (ADS)
Witt, Annette; Ehlers, Frithjof; Luther, Stefan
2017-09-01
We have analyzed symbol sequences of heart beat annotations obtained from 24-h electrocardiogram recordings of 184 post-infarction patients (from the Cardiac Arrhythmia Suppression Trial database, CAST). In the symbol sequences, each heart beat was coded as an arrhythmic or as a normal beat. The symbol sequences were analyzed with a model-based approach which relies on a two-parameter peaks-over-threshold (POT) model, interpreting each premature ventricular contraction (PVC) as an extreme event. For the POT model, we explored (i) the Shannon entropy, which was estimated in terms of the Lempel-Ziv complexity, (ii) the shape parameter of the Weibull distribution that best fits the PVC return times, and (iii) the strength of long-range correlations quantified by detrended fluctuation analysis (DFA), over the two-dimensional parameter space. We have found that in the frame of our model the Lempel-Ziv complexity is functionally related to the shape parameter of the Weibull distribution. Thus, two complementary measures (entropy and strength of long-range correlations) are sufficient to characterize realizations of the two-parameter model. For the CAST data, we have found evidence for an intermediate strength of long-range correlations in the PVC timings, which are correlated with the age of the patient: younger post-infarction patients have a higher strength of long-range correlations than older patients. The normalized Shannon entropy has values in the range 0.5
On the non-Poissonian repetition pattern of FRB121102
NASA Astrophysics Data System (ADS)
Oppermann, Niels; Yu, Hao-Ran; Pen, Ue-Li
2018-04-01
The Fast Radio Burst FRB121102 has been observed to repeat in an irregular fashion. Using published timing data of the observed bursts, we show that Poissonian statistics are not a good description of this random process. As an alternative, we suggest to describe the intervals between bursts with a Weibull distribution with a shape parameter smaller than one, which allows for the clustered nature of the bursts. We quantify the amount of clustering using the parameters of the Weibull distribution and discuss the consequences that it has for the detection probabilities of future observations and for the optimization of observing strategies. Allowing for this generalization, we find a mean repetition rate of r = 5.7 (+3.0, -2.0) per day and index k = 0.34 (+0.06, -0.05) for a correlation function ξ(t) = (t/t0)^(k-1).
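The consequence for detection probabilities can be illustrated by simulation (a sketch using the quoted best-fit values, not the paper's calculation): intervals drawn from a Weibull with shape k < 1 cluster the bursts, lowering the chance that a short observing window catches at least one burst relative to a Poisson process with the same mean rate.

```python
# Sketch using the quoted best-fit values: intervals from a Weibull with shape k < 1
# cluster the bursts, reducing the fraction of short observing windows that catch a burst
# compared with a Poisson process of the same mean rate.
import numpy as np
from math import gamma

rng = np.random.default_rng(4)
rate, k = 5.7, 0.34                              # bursts per day and Weibull shape (from the abstract)
mean_interval = 1.0 / rate
scale = mean_interval / gamma(1.0 + 1.0 / k)     # Weibull scale giving the same mean interval

def detection_fraction(gaps, window, span):
    """Fraction of consecutive windows of length 'window' (days) containing >= 1 burst."""
    times = np.cumsum(gaps)
    times = times[times < span]
    counts, _ = np.histogram(times, bins=np.arange(0.0, span + window, window))
    return np.mean(counts > 0)

span, window = 500.0, 1.0 / 24.0                 # 500 simulated days, 1-hour windows
weibull_gaps = scale * rng.weibull(k, size=50000)
poisson_gaps = rng.exponential(mean_interval, size=50000)

print("P(>=1 burst in 1 h), Weibull intervals:", round(detection_fraction(weibull_gaps, window, span), 3))
print("P(>=1 burst in 1 h), Poisson same rate:", round(detection_fraction(poisson_gaps, window, span), 3))
```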
Juckett, D A; Rosenberg, B
1992-04-21
The distributions for human disease-specific mortality exhibit two striking characteristics: survivorship curves that intersect near the longevity limit; and the clustering of best-fitting Weibull shape parameter values into groups centered on integers. Correspondingly, we have hypothesized that the distribution intersections result from either competitive processes or population partitioning and that the integral clustering in the shape parameter results from the occurrence of a small number of rare, rate-limiting events in disease progression. In this report we initiate a theoretical examination of these questions by exploring serial chain model dynamics and parametric competing risks theory. The links in our chain models are composed of more than one bond, where the number of bonds in a link is denoted the link size and is the number of events necessary to break the link and, hence, the chain. We explored chains with all links of the same size or with segments of the chain composed of different size links (competition). Simulations showed that chain breakage dynamics depended on the weakest-link principle and followed kinetics of extreme values which were very similar to human mortality kinetics. In particular, failure distributions for simple chains were Weibull-type extreme-value distributions with shape parameter values that were identifiable with the integral link size in the limit of infinite chain length. Furthermore, for chains composed of several segments of differing link size, the survival distributions for the various segments converged at a point in the S(t) tails indistinguishable from human data. This was also predicted by parametric competing risks theory using Weibull underlying distributions. In both the competitive chain simulations and the parametric competing risks theory, however, the shape values for the intersecting distributions deviated from the integer values typical of human data. We conclude that rare events can be the source of integral shapes in human mortality, that convergence is a salient feature of multiple endpoints, but that pure competition may not be the best explanation for the exact type of convergence observable in human mortality. Finally, while the chain models were not motivated by any specific biological structures, interesting biological correlates to them may be useful in gerontological research.
Quadratic RK shooting solution for an environmental parameter prediction boundary value problem
NASA Astrophysics Data System (ADS)
Famelis, Ioannis Th.; Tsitouras, Ch.
2014-10-01
Using tools of Information Geometry, the minimum distance between two elements of a statistical manifold is defined by the corresponding geodesic, i.e. the minimum-length curve that connects them. Such a curve, where the probability distribution functions in the case of our meteorological data are two-parameter Weibull distributions, satisfies a 2nd order Boundary Value (BV) system. We study the numerical treatment of the resulting special quadratic-form system using the shooting method. We compare the solutions of the problem when we employ a classical Singly Diagonally Implicit Runge-Kutta (SDIRK) 4(3) pair of methods and a quadratic SDIRK 5(3) pair. Both pairs have the same computational costs, whereas the second one attains higher order as it is specially constructed for quadratic problems.
A Weibull distribution accrual failure detector for cloud computing.
Liu, Jiaxi; Wu, Zhibo; Wu, Jin; Dong, Jian; Zhao, Yao; Wen, Dongxin
2017-01-01
Failure detectors are used to build high availability distributed systems as the fundamental component. To meet the requirement of a complicated large-scale distributed system, accrual failure detectors that can adapt to multiple applications have been studied extensively. However, several implementations of accrual failure detectors do not adapt well to the cloud service environment. To solve this problem, a new accrual failure detector based on Weibull Distribution, called the Weibull Distribution Failure Detector, has been proposed specifically for cloud computing. It can adapt to the dynamic and unexpected network conditions in cloud computing. The performance of the Weibull Distribution Failure Detector is evaluated and compared based on public classical experiment data and cloud computing experiment data. The results show that the Weibull Distribution Failure Detector has better performance in terms of speed and accuracy in unstable scenarios, especially in cloud computing.
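A sketch of how a Weibull-based accrual detector might look (an illustration in the spirit of the abstract, not the paper's algorithm): heartbeat inter-arrival times are fitted with a two-parameter Weibull, and the suspicion level is a phi-style value, -log10 of the probability that a heartbeat arrives later than the currently elapsed waiting time.

```python
# Sketch in the spirit of the abstract (not the paper's algorithm): a phi-style accrual
# detector whose heartbeat inter-arrival model is a two-parameter Weibull distribution.
import numpy as np
from scipy import stats

class WeibullAccrualDetector:
    def __init__(self, window=200):
        self.window = window      # number of recent inter-arrival samples to keep
        self.samples = []

    def heartbeat(self, inter_arrival):
        """Record the interval (seconds) between the last two heartbeats."""
        self.samples.append(inter_arrival)
        self.samples = self.samples[-self.window:]

    def suspicion(self, elapsed):
        """phi = -log10 P(next heartbeat arrives later than 'elapsed'); larger = more suspect."""
        shape, _, scale = stats.weibull_min.fit(self.samples, floc=0.0)
        p_later = stats.weibull_min.sf(elapsed, shape, 0.0, scale)
        return float(-np.log10(max(p_later, 1e-300)))

rng = np.random.default_rng(5)
det = WeibullAccrualDetector()
for gap in 0.1 * rng.weibull(1.5, size=200):     # simulated heartbeat gaps (~0.1 s scale)
    det.heartbeat(gap)

for elapsed in (0.05, 0.2, 0.5, 1.0):            # seconds since the last heartbeat
    print(f"elapsed {elapsed:.2f} s -> phi = {det.suspicion(elapsed):.2f}")
```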
A log-Weibull spatial scan statistic for time to event data.
Usman, Iram; Rosychuk, Rhonda J
2018-06-13
Spatial scan statistics have been used for the identification of geographic clusters of elevated numbers of cases of a condition such as disease outbreaks. These statistics, accompanied by the appropriate distribution, can also identify geographic areas with either longer or shorter times to events. Other authors have proposed spatial scan statistics based on the exponential and Weibull distributions. We propose the log-Weibull as an alternative distribution for the spatial scan statistic for time-to-event data and compare and contrast the log-Weibull and Weibull distributions through simulation studies. The effects of type I differential censoring and the power of the test have been investigated through simulated data. Methods are also illustrated on time to specialist visit data for discharged patients presenting to emergency departments for atrial fibrillation and flutter in Alberta during 2010-2011. We found that northern regions of Alberta had longer times to specialist visit than other areas. We proposed the spatial scan statistic for the log-Weibull distribution as a new approach for detecting spatial clusters in time-to-event data. The simulation studies suggest that the test performs well for log-Weibull data.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pallocchia, G.; Laurenza, M.; Consolini, G.
2017-03-10
Some interplanetary shocks are associated with short-term and sharp particle flux enhancements near the shock front. Such intensity enhancements, known as shock-spike events (SSEs), represent a class of relatively energetic phenomena as they may extend to energies of some tens of MeV or even beyond. Here we present an SSE case study in order to shed light on the nature of the particle acceleration involved in this kind of event. Our observations refer to an SSE registered on 2011 October 3 at 22:23 UT, by STEREO B instrumentation when, at a heliocentric distance of 1.08 au, the spacecraft was swept by a perpendicular shock moving away from the Sun. The main finding from the data analysis is that a Weibull distribution represents a good fitting function to the measured particle spectrum over the energy range from 0.1 to 30 MeV. To interpret such an observational result, we provide a theoretical derivation of the Weibull spectrum in the framework of the acceleration by “killed” stochastic processes exhibiting power-law growth in time of the velocity expectation, such as the classical Fermi process. We find an overall coherence between the experimental values of the Weibull spectrum parameters and their physical meaning within the above scenario. Hence, our approach based on the Weibull distribution proves to be useful for understanding SSEs. With regard to the present event, we also provide an alternative explanation of the Weibull spectrum in terms of shock-surfing acceleration.
Bivariate extreme value distributions
NASA Technical Reports Server (NTRS)
Elshamy, M.
1992-01-01
In certain engineering applications, such as those occurring in the analyses of ascent structural loads for the Space Transportation System (STS), some of the load variables have a lower bound of zero. Thus, the need for practical models of bivariate extreme value probability distribution functions with lower limits was identified. We discuss the Gumbel models and present practical forms of bivariate extreme probability distributions of Weibull and Frechet types with two parameters. Bivariate extreme value probability distribution functions can be expressed in terms of the marginal extremal distributions and a 'dependence' function subject to certain analytical conditions. Properties of such bivariate extreme distributions, sums and differences of paired extremals, as well as the corresponding forms of conditional distributions, are discussed. Practical estimation techniques are also given.
An EOQ model for weibull distribution deterioration with time-dependent cubic demand and backlogging
NASA Astrophysics Data System (ADS)
Santhi, G.; Karthikeyan, K.
2017-11-01
In this article we introduce an economic order quantity model with Weibull deterioration and a time-dependent cubic demand rate, where the holding cost is a linear function of time. Shortages are allowed in the inventory system and are partially or fully backlogged. The objective of this model is to minimize the total inventory cost by using the optimal order quantity and cycle length. The proposed model is illustrated by numerical examples, and a sensitivity analysis is performed to study the effect of changes in parameters on the optimum solutions.
Bayesian inference based on dual generalized order statistics from the exponentiated Weibull model
NASA Astrophysics Data System (ADS)
Al Sobhi, Mashail M.
2015-02-01
Bayesian estimates of the two parameters and the reliability function of the exponentiated Weibull model are obtained based on dual generalized order statistics (DGOS). Also, Bayesian prediction bounds for future DGOS from the exponentiated Weibull model are obtained. Both symmetric and asymmetric loss functions are considered for the Bayesian computations. Markov chain Monte Carlo (MCMC) methods are used for computing the Bayes estimates and prediction bounds. The results have been specialized to the lower record values. Comparisons are made between Bayesian and maximum likelihood estimators via Monte Carlo simulation.
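The DGOS-based likelihood for the exponentiated Weibull model is not reproduced here; as a minimal illustration of the MCMC machinery mentioned above, the sketch below runs a random-walk Metropolis sampler on the posterior of an ordinary two-parameter Weibull from a complete sample with vague priors. All choices (priors, proposal scale, data) are assumptions for the example.

    # Minimal random-walk Metropolis sketch for the posterior of a two-parameter Weibull
    # (shape k, scale lam) given a complete sample, with vague log-normal priors.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(1)
    data = rng.weibull(2.0, 100) * 3.0            # synthetic observations

    def log_post(log_k, log_lam):
        k, lam = np.exp(log_k), np.exp(log_lam)
        loglik = stats.weibull_min.logpdf(data, k, scale=lam).sum()
        logprior = stats.norm.logpdf(log_k, 0, 10) + stats.norm.logpdf(log_lam, 0, 10)
        return loglik + logprior

    theta = np.array([0.0, 0.0])                  # start at k = lam = 1
    current = log_post(*theta)
    samples = []
    for _ in range(20000):
        prop = theta + rng.normal(0, 0.05, 2)     # random-walk proposal on the log scale
        cand = log_post(*prop)
        if np.log(rng.uniform()) < cand - current:
            theta, current = prop, cand
        samples.append(np.exp(theta))
    samples = np.array(samples[5000:])            # drop burn-in
    print("posterior means (k, lam):", samples.mean(axis=0))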
NASA Astrophysics Data System (ADS)
Kundu, Pradeep; Nath, Tameshwer; Palani, I. A.; Lad, Bhupesh K.
2018-06-01
The present paper tackles an important but unmapped problem: the reliability estimation of smart materials. First, an experimental setup is developed for accelerated life testing of shape memory alloy (SMA) springs. A novel approach based on the generalized log-linear Weibull (GLL-Weibull) distribution is then developed for SMA spring life estimation. Applied stimulus (voltage), elongation and cycles of operation are used as inputs for the life prediction model. The values of the parameter coefficients of the model provide better interpretability compared to artificial intelligence based life prediction approaches. In addition, the model also considers the effect of operating conditions, making it generic over a range of operating conditions. Moreover, a Bayesian framework is used to continuously update the prediction with the actual degradation value of the springs, thereby reducing the uncertainty in the data and improving the prediction accuracy. Finally, the deterioration of the material with the number of cycles is also investigated using thermogravimetric analysis and scanning electron microscopy.
Assessing a Tornado Climatology from Global Tornado Intensity Distributions.
NASA Astrophysics Data System (ADS)
Feuerstein, Bernold; Dotzek, Nikolai; Grieser, Jürgen
2005-02-01
Recent work demonstrated that the shape of tornado intensity distributions from various regions worldwide is well described by Weibull functions. This statistical modeling revealed a strong correlation between the fit parameters c for shape and b for scale regardless of the data source. In the present work it is shown that the quality of the Weibull fits is optimized if only tornado reports of F1 and higher intensity are used and that the c-b correlation does indeed reflect a universal feature of the observed tornado intensity distributions. For regions with likely supercell tornado dominance, this feature is the number ratio of F4 to F3 tornado reports R(F4/F3) = 0.238. The c-b diagram for the Weibull shape and scale parameters is used as a climatological chart, which allows different types of tornado climatology to be distinguished, presumably arising from supercell versus nonsupercell tornadogenesis. Assuming temporal invariance of the climatology and using a detection efficiency function for tornado observations, a stationary climatological probability distribution from large tornado records (U.S. decadal data 1950-99) is extracted. This can be used for risk assessment, comparative studies on tornado intensity distributions worldwide, and estimates of the degree of underreporting for areas with poor databases. For the 1990s U.S. data, a likely tornado underreporting of the weak events (F0, F1) by a factor of 2 can be diagnosed, as well as asymptotic climatological c,b values of c = 1.79 and b = 2.13, to which a convergence in the 1950-99 U.S. decadal data is verified.
Coercivity mechanisms and thermal stability of thin film magnetic recording media
NASA Astrophysics Data System (ADS)
Yang, Cheng
1999-09-01
Coercivity mechanisms and thermal stability of magnetic recording media were studied. It was found that magnetization reversal mainly occurs by a nucleation mechanism. The correlation was established between the c/a ratio of the Co HCP structure and other process parameters that are thought to be the dominant factors in determining the anisotropy and therefore the coercivity of Co based thin film magnetic recording media. Time decay and switching of the magnetization in thin film magnetic recording media depend on the grain size distribution and easy-axis orientation distribution according to the proposed two-energy-level model. Relaxation time is the most fundamental parameter that determines the time decay performance of the magnetic recording media. An algorithm was proposed to calculate its distribution directly from the experimental data without any presumption. It was found for the first time that the distribution of relaxation time takes the form of a Weibull distribution.
A New Lifetime Distribution with Bathtub and Unimodal Hazard Function
NASA Astrophysics Data System (ADS)
Barriga, Gladys D. C.; Louzada-Neto, Francisco; Cancho, Vicente G.
2008-11-01
In this paper we propose a new lifetime distribution which accommodates bathtub-shaped, unimodal, increasing and decreasing hazard functions. Some particular cases are derived, including the standard Weibull distribution. Maximum likelihood estimation is considered for estimating the three parameters present in the model. The methodology is illustrated with a real data set on industrial devices under a life test.
NASA Technical Reports Server (NTRS)
Choi, Sung R.; Salem, Jonathan A.; Holland, Frederic A.
1997-01-01
The two estimation methods, individual data and arithmetic mean methods, were used to determine the slow crack growth (SCG) parameters (n and D) of advanced ceramics and glass from a large number of room- and elevated-temperature constant stress-rate ('dynamic fatigue') test data. For ceramic materials with Weibull modulus greater than 10, the difference in the SCG parameters between the two estimation methods was negligible; whereas, for glass specimens exhibiting Weibull modulus of about 3, the difference was amplified, resulting in a maximum difference of 16 and 13 %, respectively, in n and D. Of the two SCG parameters, the parameter n was more sensitive to the estimation method than the other. The coefficient of variation in n was found to be somewhat greater in the individual data method than in the arithmetic mean method.
Scaling in the distribution of intertrade durations of Chinese stocks
NASA Astrophysics Data System (ADS)
Jiang, Zhi-Qiang; Chen, Wei; Zhou, Wei-Xing
2008-10-01
The distribution of intertrade durations, defined as the waiting times between two consecutive transactions, is investigated based upon the limit order book data of 23 liquid Chinese stocks listed on the Shenzhen Stock Exchange in the whole year 2003. A scaling pattern is observed in the distributions of intertrade durations, where the empirical density functions of the normalized intertrade durations of all 23 stocks collapse onto a single curve. The scaling pattern is also observed in the intertrade duration distributions for filled and partially filled trades and in the conditional distributions. The ensemble distributions for all stocks are modeled by the Weibull and the Tsallis q-exponential distributions. Maximum likelihood estimation shows that the Weibull distribution outperforms the q-exponential for not-too-large intertrade durations which account for more than 98.5% of the data. Alternatively, nonlinear least-squares estimation selects the q-exponential as a better model, in which the optimization is conducted on the distance between empirical and theoretical values of the logarithmic probability densities. The distribution of intertrade durations is Weibull followed by a power-law tail with an asymptotic tail exponent close to 3.
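As a simple illustration of the maximum likelihood comparison described above (the q-exponential alternative is omitted), the sketch below fits a Weibull and an exponential to the same sample of normalized durations and compares log-likelihoods; the data are synthetic, not the Shenzhen order book data.

    # Compare Weibull and exponential MLE fits to (synthetic) normalized intertrade durations.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(2)
    durations = rng.weibull(0.7, 5000)               # synthetic normalized durations

    kw, _, lamw = stats.weibull_min.fit(durations, floc=0)   # Weibull MLE (location fixed at 0)
    lam_exp = durations.mean()                               # MLE of the exponential scale

    ll_weibull = stats.weibull_min.logpdf(durations, kw, scale=lamw).sum()
    ll_expon = stats.expon.logpdf(durations, scale=lam_exp).sum()
    print(f"Weibull shape = {kw:.3f}; log-likelihood Weibull {ll_weibull:.1f} vs exponential {ll_expon:.1f}")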
Four theorems on the psychometric function.
May, Keith A; Solomon, Joshua A
2013-01-01
In a 2-alternative forced-choice (2AFC) discrimination task, observers choose which of two stimuli has the higher value. The psychometric function for this task gives the probability of a correct response for a given stimulus difference, Δx. This paper proves four theorems about the psychometric function. Assuming the observer applies a transducer and adds noise, Theorem 1 derives a convenient general expression for the psychometric function. Discrimination data are often fitted with a Weibull function. Theorem 2 proves that the Weibull "slope" parameter, β, can be approximated by β(Noise) × β(Transducer), where β(Noise) is the β of the Weibull function that fits best to the cumulative noise distribution, and β(Transducer) depends on the transducer. We derive general expressions for β(Noise) and β(Transducer), from which we derive expressions for specific cases. One case that follows naturally from our general analysis is Pelli's finding that, when d' ∝ (Δx)^b, β ≈ β(Noise) × b. We also consider two limiting cases. Theorem 3 proves that, as sensitivity improves, 2AFC performance will usually approach that for a linear transducer, whatever the actual transducer; we show that this does not apply at signal levels where the transducer gradient is zero, which explains why it does not apply to contrast detection. Theorem 4 proves that, when the exponent of a power-function transducer approaches zero, 2AFC performance approaches that of a logarithmic transducer. We show that the power-function exponents of 0.4-0.5 fitted to suprathreshold contrast discrimination data are close enough to zero for the fitted psychometric function to be practically indistinguishable from that of a log transducer. Finally, Weibull β reflects the shape of the noise distribution, and we used our results to assess the recent claim that internal noise has higher kurtosis than a Gaussian. Our analysis of β for contrast discrimination suggests that, if internal noise is stimulus-independent, it has lower kurtosis than a Gaussian.
NASA Astrophysics Data System (ADS)
Bhushan, Awani; Panda, S. K.
2018-05-01
The influence of bimodularity (different stress-strain behaviour in tension and compression) on the fracture behaviour of graphite specimens has been studied with fracture toughness (KIc), critical J-integral (JIc) and critical strain energy release rate (GIc) as the characterizing parameters. The bimodularity index (ratio of tensile Young's modulus to compression Young's modulus) of graphite specimens has been obtained from the normalized test data of tensile and compression experimentation. Single edge notch bend (SENB) testing of pre-cracked specimens from the same lot has been carried out as per ASTM standard D7779-11 to determine the peak load and critical fracture parameters KIc, GIc and JIc using digital image correlation measurement of crack opening displacements. Weibull weakest link theory has been used to evaluate the mean peak load, Weibull modulus and goodness of fit employing the two-parameter least squares method (LIN2) and the biased (MLE2-B) and unbiased (MLE2-U) maximum likelihood estimators. The stress dependent elasticity problem of three-dimensional crack progression behaviour for the bimodular graphite components has been solved as an iterative finite element procedure. The crack characterizing parameters, critical stress intensity factor and critical strain energy release rate, have been estimated with the help of the Weibull distribution plot of peak load versus cumulative probability of failure. Experimental and computational fracture parameters have been compared qualitatively to describe the significance of bimodularity. The bimodular influence on the fracture behaviour of SENB graphite is reflected in the experimental evaluation of GIc values only, which have been found to be different from the calculated JIc values. Numerical evaluation of the bimodular 3D J-integral value is found to be close to the GIc value whereas the unimodular 3D J-value is nearer to the JIc value. The significant difference between the unimodular JIc and bimodular GIc indicates that GIc should be considered as the standard fracture parameter for bimodular brittle specimens.
Creep test observation of viscoelastic failure of edible fats
NASA Astrophysics Data System (ADS)
Vithanage, C. R.; Grimson, M. J.; Smith, B. G.; Wills, P. R.
2011-03-01
A rheological creep test was used to investigate the viscoelastic failure of five edible fats. Butter, spreadable blend and spread were selected as edible fats because they belong to three different groups according to the Codex Alimentarius. Creep curves were analysed according to the Burger model. Results were fitted to a Weibull distribution representing the strain-dependent lifetime of putative fibres in the material. The Weibull shape and scale (lifetime) parameters were estimated for each substance. A comparison of the rheometric measurements of edible fats demonstrated a clear difference between the three different groups. Taken together the results indicate that butter has a lower threshold for mechanical failure than spreadable blend and spread. The observed behaviour of edible fats can be interpreted using a model in which there are two types of bonds between fat crystals; primary bonds that are strong and break irreversibly, and secondary bonds, which are weaker but break and reform reversibly.
Migration kinetics of four photo-initiators from paper food packaging to solid food simulants.
Cai, Huimei; Ji, Shuilin; Zhang, Juzhou; Tao, Gushuai; Peng, Chuanyi; Hou, Ruyan; Zhang, Liang; Sun, Yue; Wan, Xiaochun
2017-09-01
The migration behaviour of four photo-initiators (BP, EHA, MBP and Irgacure 907) was studied by 'printing' onto four different food-packaging materials (Kraft paper, white cardboard, Polyethylene (PE)-coated paper and composite paper) and tracking movement into the food simulant: Tenax-TA (porous polymer 2,6-diphenyl furan resin). The results indicated that the migration of the photo-initiators was related to the molecular weight and log K o/w of each photo-initiator. At different temperatures, the migration rates of the photo-initiators were different in papers with different thicknesses. The amount of each photo-initiator found in the food was closely related to the food matrix. The Weibull model was used to predict the migration load into the food simulants by calculating the parameters τ and β and determining the relationship of the two parameters with temperature and paper thickness. The established Weibull model was then used to predict the migration of each photo-initiator with respect to different foods. A two-parameter Weibull model fitted the actual situation, with some deviation from the actual migration amount.
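The authors' exact parameterization is not given in the abstract; assuming the common Weibull migration form M(t) = M_inf [1 - exp(-(t/tau)^beta)], the sketch below estimates tau, beta and M_inf by least squares from made-up migration data.

    # Fit an assumed Weibull migration kinetics model M(t) = M_inf*(1 - exp(-(t/tau)**beta))
    # to illustrative migration data; tau, beta and M_inf are the fitted parameters.
    import numpy as np
    from scipy.optimize import curve_fit

    def weibull_migration(t, m_inf, tau, beta):
        return m_inf * (1.0 - np.exp(-(t / tau) ** beta))

    t_days = np.array([0.5, 1, 2, 4, 7, 10, 14, 21])
    migrated = np.array([2.1, 4.0, 7.2, 11.5, 15.0, 16.8, 17.9, 18.4])   # illustrative amounts

    (m_inf, tau, beta), _ = curve_fit(weibull_migration, t_days, migrated, p0=[20.0, 5.0, 1.0])
    print(f"M_inf = {m_inf:.1f}, tau = {tau:.2f} d, beta = {beta:.2f}")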
Time Variations in Forecasts and Occurrences of Large Solar Energetic Particle Events
NASA Astrophysics Data System (ADS)
Kahler, S. W.
2015-12-01
The onsets and development of large solar energetic (E > 10 MeV) particle (SEP) events have been characterized in many studies. The statistics of SEP event onset delay times from associated solar flares and coronal mass ejections (CMEs), which depend on solar source longitudes, can be used to provide better predictions of whether a SEP event will occur following a large flare or fast CME. In addition, size distributions of peak SEP event intensities provide a means for a probabilistic forecast of peak intensities attained in observed SEP increases. SEP event peak intensities have been compared with their rise and decay times for insight into the acceleration and transport processes. These two time scales are generally treated as independent parameters describing the development of a SEP event, but we can invoke an alternative two-parameter description based on the assumption that decay times exceed rise times for all events. These two parameters, from the well known Weibull distribution, provide an event description in terms of its basic shape and duration. We apply this distribution to several large SEP events and ask what the characteristic parameters and their dependence on source longitudes can tell us about the origins of these important events.
Bonded-cell model for particle fracture.
Nguyen, Duc-Hanh; Azéma, Emilien; Sornay, Philippe; Radjai, Farhang
2015-02-01
Particle degradation and fracture play an important role in natural granular flows and in many applications of granular materials. We analyze the fracture properties of two-dimensional disklike particles modeled as aggregates of rigid cells bonded along their sides by a cohesive Mohr-Coulomb law and simulated by the contact dynamics method. We show that the compressive strength scales with tensile strength between cells but depends also on the friction coefficient and a parameter describing cell shape distribution. The statistical scatter of compressive strength is well described by the Weibull distribution function with a shape parameter varying from 6 to 10 depending on cell shape distribution. We show that this distribution may be understood in terms of percolating critical intercellular contacts. We propose a random-walk model of critical contacts that leads to particle size dependence of the compressive strength in good agreement with our simulation data.
Monolithic ceramic analysis using the SCARE program
NASA Technical Reports Server (NTRS)
Manderscheid, Jane M.
1988-01-01
The Structural Ceramics Analysis and Reliability Evaluation (SCARE) computer program calculates the fast fracture reliability of monolithic ceramic components. The code is a post-processor to the MSC/NASTRAN general purpose finite element program. The SCARE program automatically accepts the MSC/NASTRAN output necessary to compute reliability. This includes element stresses, temperatures, volumes, and areas. The SCARE program computes two-parameter Weibull strength distributions from input fracture data for both volume and surface flaws. The distributions can then be used to calculate the reliability of geometrically complex components subjected to multiaxial stress states. Several fracture criteria and flaw types are available for selection by the user, including out-of-plane crack extension theories. The theoretical basis for the reliability calculations was proposed by Batdorf. These models combine linear elastic fracture mechanics (LEFM) with Weibull statistics to provide a mechanistic failure criterion. Other fracture theories included in SCARE are the normal stress averaging technique and the principle of independent action. The objective of this presentation is to summarize these theories, including their limitations and advantages, and to provide a general description of the SCARE program, along with example problems.
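As a toy illustration of the kind of calculation SCARE automates (not the program's actual implementation), the sketch below combines per-element volumes and tensile principal stresses into a principle-of-independent-action failure probability under a two-parameter Weibull volume-flaw strength model; all element data and Weibull parameters are invented.

    # Toy principle-of-independent-action (PIA) fast-fracture reliability calculation from
    # per-element principal stresses and volumes, assuming a two-parameter Weibull volume-flaw
    # strength distribution (modulus m, scale sigma_0). Values are illustrative only.
    import numpy as np

    m = 10.0            # Weibull modulus
    sigma_0 = 400.0     # Weibull scale parameter (units folded in for this toy example)

    # Each row: element volume (mm^3) and its three principal stresses (MPa).
    volumes = np.array([1.2, 0.8, 1.5, 2.0])
    principal = np.array([[180.0,  60.0, -20.0],
                          [220.0, 110.0,  10.0],
                          [150.0,  40.0, -55.0],
                          [ 90.0,  30.0,   5.0]])

    tensile = np.clip(principal, 0.0, None)        # PIA: only tensile principal stresses contribute
    risk = (volumes[:, None] * (tensile / sigma_0) ** m).sum()
    p_failure = 1.0 - np.exp(-risk)
    print(f"component fast-fracture failure probability: {p_failure:.3e}")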
NASA Astrophysics Data System (ADS)
Linstrom, Elizabeth Jane
A new approach to the nondestructive evaluation of polymer matrix/graphite fiber composites is presented. This technique permits the determination of the top ply bond strength of a laminate based on the results of ultrasonic testing. This technique is designed to be used for the real-time, nondestructive evaluation of composites during tape laying. By separately bonding the top ply of thermoset and thermoplastic polymer composite laminates, a poor ply bond was achieved solely at the interface of the top ply and the rest of the laminate. Using angled incidence, a 5 MHz, 4 μs ultrasonic pulse was induced into the composite samples. This created waves traveling along the surface of the composite samples that were picked up by a receiving transducer. The received signal was cross-correlated with an artificially constructed replica of the input signal. The maximum amplitude of the cross-correlated signal was recorded. The cross-correlated signal was then converted to the frequency spectrum using a fast Fourier transform. The maximum amplitude of the frequency spectrum was then recorded. These measurements were repeated at 18 to 30 different locations on each composite sample. The resulting collections of maximum amplitudes of cross-correlated signals and frequency spectra were fit to two-parameter Weibull distributions. The composite samples were destructively evaluated using a flat-wise tensile test. The B-basis values of the ultrasonic data Weibull distributions were compared to the B-basis values of the Weibull distribution of the strength data. A good correlation was found.
ZERODUR: bending strength data for etched surfaces
NASA Astrophysics Data System (ADS)
Hartmann, Peter; Leys, Antoine; Carré, Antoine; Kerz, Franca; Westerhoff, Thomas
2014-07-01
In a continuous effort since 2007, a considerable amount of new data and information has been gathered on the bending strength of the extremely low thermal expansion glass ceramic ZERODUR®. By fitting a three-parameter Weibull distribution to the data, it could be shown that for homogeneously ground surfaces minimum breakage stresses exist that lie much higher than the previously applied design limits. In order to achieve even higher allowable stress values, diamond-grain ground surfaces have been acid etched, a procedure widely accepted as a strength-increasing measure. If surfaces are etched so that the removed layer thickness is comparable to the maximum micro-crack depth of the preceding grinding process, they also show statistical distributions compatible with a three-parameter Weibull distribution. SCHOTT has performed additional measurement series with etch solutions of variable composition, testing the applicability of this distribution and the possibility of achieving a further increase of the minimum breakage stress. For long-term loading applications, strength change with time and environmental media are important. The parameter needed for prediction calculations, which combines these influences, is the stress corrosion constant. Results from the past differ significantly from each other. On the basis of new investigations, better information will be provided for choosing the best value for the given application conditions.
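A three-parameter Weibull fit of the kind described above can be sketched with scipy's weibull_min, whose location parameter plays the role of the threshold (minimum breakage stress) implied by the model; the strength values below are synthetic stand-ins, not ZERODUR measurements.

    # Three-parameter Weibull fit (shape, threshold/location, scale) to bending-strength data.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(3)
    strengths = 60.0 + rng.weibull(2.5, 200) * 40.0     # MPa, with a built-in 60 MPa threshold

    shape, threshold, scale = stats.weibull_min.fit(strengths)
    print(f"shape m = {shape:.2f}, threshold = {threshold:.1f} MPa, scale = {scale:.1f} MPa")

    # The fitted threshold acts as the minimum breakage stress implied by the model.
    p_below_70 = stats.weibull_min.cdf(70.0, shape, loc=threshold, scale=scale)
    print(f"estimated probability of failure below 70 MPa: {p_below_70:.3f}")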
Pal, Suvra; Balakrishnan, N
2017-10-01
In this paper, we consider a competing cause scenario and assume the number of competing causes to follow a Conway-Maxwell Poisson distribution which can capture both over and under dispersion that is usually encountered in discrete data. Assuming the population of interest having a component cure and the form of the data to be interval censored, as opposed to the usually considered right-censored data, the main contribution is in developing the steps of the expectation maximization algorithm for the determination of the maximum likelihood estimates of the model parameters of the flexible Conway-Maxwell Poisson cure rate model with Weibull lifetimes. An extensive Monte Carlo simulation study is carried out to demonstrate the performance of the proposed estimation method. Model discrimination within the Conway-Maxwell Poisson distribution is addressed using the likelihood ratio test and information-based criteria to select a suitable competing cause distribution that provides the best fit to the data. A simulation study is also carried out to demonstrate the loss in efficiency when selecting an improper competing cause distribution which justifies the use of a flexible family of distributions for the number of competing causes. Finally, the proposed methodology and the flexibility of the Conway-Maxwell Poisson distribution are illustrated with two known data sets from the literature: smoking cessation data and breast cosmesis data.
Tensile Strength and Microstructural Characterization of Uncoated and Coated HPZ Ceramic Fibers
NASA Technical Reports Server (NTRS)
Bansal, Narottam P.; Wheeler, Donald R.; Dickerson, Robert M.
1996-01-01
Tensile strengths of as-received HPZ fiber and those surface coated with BN, BN/SiC, and BN/Si3N4 have been determined at room temperature using a two-parameter Weibull distribution. Nominally approx. 0.4 micron BN and 0.2 micron SiC or Si3N4 coatings were deposited on the fibers by chemical vapor deposition using a continuous reactor. The average tensile strength of uncoated HPZ fiber was 2.0 +/- 0.56 GPa (290 +/- 81 ksi) with a Weibull modulus of 4.1. For the BN coated fibers, the average strength and the Weibull modulus increased to 2.39 +/- 0.44 GPa (346 +/- 64 ksi) and 6.5, respectively. The HPZ/BN/SiC fibers showed an average strength of 2.0 +/- 0.32 GPa (290 +/- 47 ksi) and Weibull modulus of 7.3. Average strength of the fibers having a dual BN/Si3N4 surface coating degraded to 1.15 +/- 0.26 GPa (166 +/- 38 ksi) with a Weibull modulus of 5.3. The chemical composition and thickness of the fiber coatings were determined using scanning Auger analysis. Microstructural analysis of the fibers and the coatings was carried out by scanning electron microscopy and transmission electron microscopy. A microporous silica-rich layer approx. 200 nm thick is present on the as-received HPZ fiber surface. The BN coatings on the fibers are amorphous to partly turbostratic and contaminated with carbon and oxygen. Silicon carbide coating was crystalline whereas the silicon nitride coating was amorphous. The silicon carbide and silicon nitride coatings are non-stoichiometric, non-uniform, and granular. Within a fiber tow, the fibers on the outside had thicker and more granular coatings than those on the inside.
Statistical wind analysis for near-space applications
NASA Astrophysics Data System (ADS)
Roney, Jason A.
2007-09-01
Statistical wind models were developed based on the existing observational wind data for near-space altitudes between 60 000 and 100 000 ft (18-30 km) above ground level (AGL) at two locations, Akron, OH, USA, and White Sands, NM, USA. These two sites are envisioned as playing a crucial role in the first flights of high-altitude airships. The analysis shown in this paper has not been previously applied to this region of the stratosphere for such an application. Standard statistics were compiled for these data such as mean, median, maximum wind speed, and standard deviation, and the data were modeled with Weibull distributions. These statistics indicated that, on a yearly average, there is a lull or a “knee” in the wind between 65 000 and 72 000 ft AGL (20-22 km). From the standard statistics, trends at both locations indicated substantial seasonal variation in the mean wind speed at these heights. The yearly and monthly statistical modeling indicated that Weibull distributions were a reasonable model for the data. Forecasts and hindcasts were done by using a Weibull model based on 2004 data and comparing the model with the 2003 and 2005 data. The 2004 distribution was also a reasonable model for these years. Lastly, the Weibull distribution and cumulative function were used to predict the 50%, 95%, and 99% winds, which are directly related to the expected power requirements of a near-space station-keeping airship. These values indicated that using only the standard deviation of the mean may underestimate the operational conditions.
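The 50%, 95% and 99% design winds mentioned above follow directly from the fitted Weibull quantile function; the short sketch below shows the calculation on a synthetic wind-speed sample (the shape and scale are illustrative, not values from the study).

    # Fit a two-parameter Weibull to (synthetic) wind speeds and read off design quantiles.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(4)
    wind = rng.weibull(1.8, 2000) * 12.0              # m/s, illustrative monthly sample

    k, _, c = stats.weibull_min.fit(wind, floc=0)     # shape k, scale c
    for q in (0.50, 0.95, 0.99):
        print(f"{int(q * 100)}% wind speed: {stats.weibull_min.ppf(q, k, scale=c):.1f} m/s")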
Set statistics in conductive bridge random access memory device with Cu/HfO2/Pt structure
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, Meiyun; Long, Shibing, E-mail: longshibing@ime.ac.cn; Wang, Guoming
2014-11-10
The switching parameter variation of resistive switching memory is one of the most important challenges in its application. In this letter, we have studied the set statistics of conductive bridge random access memory with a Cu/HfO2/Pt structure. The experimental distributions of the set parameters in several off resistance ranges are shown to nicely fit a Weibull model. The Weibull slopes of the set voltage and current increase and decrease logarithmically with off resistance, respectively. This experimental behavior is perfectly captured by a Monte Carlo simulator based on the cell-based set voltage statistics model and the Quantum Point Contact electron transport model. Our work provides indications for the improvement of the switching uniformity.
Balakrishnan, Narayanaswamy; Pal, Suvra
2016-08-01
Recently, a flexible cure rate survival model has been developed by assuming the number of competing causes of the event of interest to follow the Conway-Maxwell-Poisson distribution. This model includes some of the well-known cure rate models discussed in the literature as special cases. Data obtained from cancer clinical trials are often right censored and expectation maximization algorithm can be used in this case to efficiently estimate the model parameters based on right censored data. In this paper, we consider the competing cause scenario and assuming the time-to-event to follow the Weibull distribution, we derive the necessary steps of the expectation maximization algorithm for estimating the parameters of different cure rate survival models. The standard errors of the maximum likelihood estimates are obtained by inverting the observed information matrix. The method of inference developed here is examined by means of an extensive Monte Carlo simulation study. Finally, we illustrate the proposed methodology with a real data on cancer recurrence. © The Author(s) 2013.
A New Goodness-of-Fit Test for the Weibull Distribution Based on Spacings
1993-03-01
[Fragment of the thesis list of tables: values of the Z* test statistic and the power of the test for sample sizes N = 20 and N = 30, Weibull shape parameters K = 0.5, 1.0, and 1.5, and significance levels from 0.20 through 0.01.]
Rafal Podlaski; Francis Roesch
2014-01-01
In recent years finite-mixture models have been employed to approximate and model empirical diameter at breast height (DBH) distributions. We used two-component mixtures of either the Weibull distribution or the gamma distribution for describing the DBH distributions of mixed-species, two-cohort forest stands, to analyse the relationships between the DBH components,...
Hazard function analysis for flood planning under nonstationarity
NASA Astrophysics Data System (ADS)
Read, Laura K.; Vogel, Richard M.
2016-05-01
The field of hazard function analysis (HFA) involves a probabilistic assessment of the "time to failure" or "return period," T, of an event of interest. HFA is used in epidemiology, manufacturing, medicine, actuarial statistics, reliability engineering, economics, and elsewhere. For a stationary process, the probability distribution function (pdf) of the return period always follows an exponential distribution; the same is not true for nonstationary processes. When the process of interest, X, exhibits nonstationary behavior, HFA can provide a complementary approach to risk analysis with analytical tools particularly useful for hydrological applications. After a general introduction to HFA, we describe a new mathematical linkage between the magnitude of the flood event, X, and its return period, T, for nonstationary processes. We derive the probabilistic properties of T for a nonstationary one-parameter exponential model of X, and then use both Monte-Carlo simulation and HFA to generalize the behavior of T when X arises from a nonstationary two-parameter lognormal distribution. For this case, our findings suggest that a two-parameter Weibull distribution provides a reasonable approximation for the pdf of T. We document how HFA can provide an alternative approach to characterize the probabilistic properties of both nonstationary flood series and the resulting pdf of T.
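A minimal Monte Carlo version of the experiment described above: simulate annual maxima from a nonstationary lognormal whose log-mean drifts upward, record the first year the design flood is exceeded, and fit a two-parameter Weibull to the resulting return periods T. All parameter values are arbitrary choices, not those of the paper.

    # Monte Carlo sketch of the distribution of the "time to failure" T under nonstationarity,
    # followed by a two-parameter Weibull fit to the simulated T values (illustrative numbers).
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(5)
    design_flood = 1500.0                     # m^3/s
    mu0, trend, sigma = 7.0, 0.005, 0.3       # lognormal parameters; mean of log drifts upward

    def first_exceedance(max_years=500):
        for year in range(1, max_years + 1):
            x = rng.lognormal(mu0 + trend * year, sigma)
            if x > design_flood:
                return year
        return max_years

    T = np.array([first_exceedance() for _ in range(5000)])
    shape, _, scale = stats.weibull_min.fit(T, floc=0)
    print(f"Weibull approximation of T: shape = {shape:.2f}, scale = {scale:.1f} years")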
CARES/LIFE Ceramics Analysis and Reliability Evaluation of Structures Life Prediction Program
NASA Technical Reports Server (NTRS)
Nemeth, Noel N.; Powers, Lynn M.; Janosik, Lesley A.; Gyekenyesi, John P.
2003-01-01
This manual describes the Ceramics Analysis and Reliability Evaluation of Structures Life Prediction (CARES/LIFE) computer program. The program calculates the time-dependent reliability of monolithic ceramic components subjected to thermomechanical and/or proof test loading. CARES/LIFE is an extension of the CARES (Ceramic Analysis and Reliability Evaluation of Structures) computer program. The program uses results from MSC/NASTRAN, ABAQUS, and ANSYS finite element analysis programs to evaluate component reliability due to inherent surface and/or volume type flaws. CARES/LIFE accounts for the phenomenon of subcritical crack growth (SCG) by utilizing the power law, Paris law, or Walker law. The two-parameter Weibull cumulative distribution function is used to characterize the variation in component strength. The effects of multiaxial stresses are modeled by using either the principle of independent action (PIA), the Weibull normal stress averaging method (NSA), or the Batdorf theory. Inert strength and fatigue parameters are estimated from rupture strength data of naturally flawed specimens loaded in static, dynamic, or cyclic fatigue. The probabilistic time-dependent theories used in CARES/LIFE, along with the input and output for CARES/LIFE, are described. Example problems to demonstrate various features of the program are also included.
Weibull models of fracture strengths and fatigue behavior of dental resins in flexure and shear.
Baran, G R; McCool, J I; Paul, D; Boberick, K; Wunder, S
1998-01-01
In estimating lifetimes of dental restorative materials, it is useful to have available data on the fatigue behavior of these materials. Current efforts at estimation include several untested assumptions related to the equivalence of flaw distributions sampled by shear, tensile, and compressive stresses. Environmental influences on material properties are not accounted for, and it is unclear if fatigue limits exist. In this study, the shear and flexural strengths of three resins used as matrices in dental restorative composite materials were characterized by Weibull parameters. It was found that shear strengths were lower than flexural strengths, liquid sorption had a profound effect on characteristic strengths, and the Weibull shape parameter obtained from shear data differed for some materials from that obtained in flexure. In shear and flexural fatigue, a power law relationship applied for up to 250,000 cycles; no fatigue limits were found, and the data thus imply only one flaw population is responsible for failure. Again, liquid sorption adversely affected strength levels in most materials (decreasing shear strengths and flexural strengths by factors of 2-3) and to a greater extent than did the degree of cure or material chemistry.
On the robustness of a Bayes estimate. [in reliability theory
NASA Technical Reports Server (NTRS)
Canavos, G. C.
1974-01-01
This paper examines the robustness of a Bayes estimator with respect to the assigned prior distribution. A Bayesian analysis for a stochastic scale parameter of a Weibull failure model is summarized in which the natural conjugate is assigned as the prior distribution of the random parameter. The sensitivity analysis is carried out by the Monte Carlo method in which, although an inverted gamma is the assigned prior, realizations are generated using distribution functions of varying shape. For several distributional forms and even for some fixed values of the parameter, simulated mean squared errors of Bayes and minimum variance unbiased estimators are determined and compared. Results indicate that the Bayes estimator remains squared-error superior and appears to be largely robust to the form of the assigned prior distribution.
Historical floods in flood frequency analysis: Is this game worth the candle?
NASA Astrophysics Data System (ADS)
Strupczewski, Witold G.; Kochanek, Krzysztof; Bogdanowicz, Ewa
2017-11-01
In flood frequency analysis (FFA) the profit from inclusion of historical information on the largest historical pre-instrumental floods depends primarily on the reliability of the information, i.e. the accuracy of magnitude and return period of floods. This study is focused on the possible theoretical maximum gain in accuracy of estimates of upper quantiles that can be obtained by incorporating the largest historical floods of known return periods into the FFA. We assumed a simple case: N years of systematic records of annual maximum flows and either one largest (XM1) or two largest (XM1 and XM2) flood peak flows in a historical M-year long period. The problem is explored by Monte Carlo simulations with the maximum likelihood (ML) method. Both correct and false distributional assumptions are considered. In the first case the two-parameter extreme value models (Gumbel, log-Gumbel, Weibull) with various coefficients of variation serve as parent distributions. In the case of unknown parent distribution, the Weibull distribution was assumed as the estimating model and the truncated Gumbel as the parent distribution. The return periods of XM1 and XM2 are determined from the parent distribution. The results are then compared with the case when the return periods of XM1 and XM2 are defined by their plotting positions. The results are presented in terms of bias, root mean square error and the probability of overestimation of the quantile with 100-year return period. The results of the research indicate that the maximal profit of inclusion of pre-instrumental floods in the FFA may prove smaller than the cost of reconstruction of historical hydrological information.
NASA Astrophysics Data System (ADS)
Sazuka, Naoya
2007-03-01
We analyze waiting times for price changes in a foreign currency exchange rate. Recent empirical studies of high-frequency financial data support that trades in financial markets do not follow a Poisson process and the waiting times between trades are not exponentially distributed. Here we show that our data are well approximated by a Weibull distribution rather than an exponential distribution in the non-asymptotic regime. Moreover, we quantitatively evaluate how far the empirical data are from an exponential distribution using a Weibull fit. Finally, we discuss a transition between a Weibull law and a power law in the long-time asymptotic regime.
Rafal Podlaski; Francis A. Roesch
2014-01-01
Two-component mixtures of either the Weibull distribution or the gamma distribution and the kernel density estimator were used for describing the diameter at breast height (dbh) empirical distributions of two-cohort stands. The data consisted of study plots from the Świętokrzyski National Park (central Poland) and areas close to and including the North Carolina section...
Universal behaviour in the stock market: Time dynamics of the electronic orderbook
NASA Astrophysics Data System (ADS)
Kızılersü, Ayşe; Kreer, Markus; Thomas, Anthony W.; Feindt, Michael
2016-07-01
A consequence of the digital revolution is that share trading at the stock exchange takes place via electronic order books which are accessed by traders and investors via the internet. Our empirical findings of the London Stock Exchange demonstrate that once ultra-high frequency manipulation on time scales less than around ten milliseconds is excluded, all relevant changes in the order book happen with time differences that are randomly distributed and well described by a left-truncated Weibull distribution with universal shape parameter (independent of time and same for all stocks). The universal shape parameter corresponds to maximum entropy of the distribution.
Langenbucher, Frieder
2003-01-01
MS Excel is a useful tool to handle in vitro/in vivo correlation (IVIVC) distribution functions, with emphasis on the Weibull and the biexponential distribution, which are most useful for the presentation of cumulative profiles, e.g. release in vitro or urinary excretion in vivo, and differential profiles such as the plasma response in vivo. The discussion includes moments (AUC and mean) as summarizing statistics, and data-fitting algorithms for parameter estimation.
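The same Weibull handling can be sketched outside Excel; assuming the common cumulative release form F(t) = Fmax [1 - exp(-(t/td)^beta)], the snippet below estimates the parameters by least squares and computes the mean of the fitted distribution as a summarizing moment (the data are made up).

    # Least-squares fit of a Weibull cumulative release profile and its first moment (mean time).
    import numpy as np
    from scipy.optimize import curve_fit
    from scipy.special import gamma

    def weibull_release(t, f_max, td, beta):
        return f_max * (1.0 - np.exp(-(t / td) ** beta))

    t_h = np.array([0.25, 0.5, 1, 2, 4, 6, 8, 12])
    released = np.array([8, 15, 28, 48, 72, 84, 90, 96])       # % released, illustrative

    (f_max, td, beta), _ = curve_fit(weibull_release, t_h, released, p0=[100, 3, 1])
    mrt = td * gamma(1.0 + 1.0 / beta)                         # mean of the fitted Weibull profile
    print(f"Fmax = {f_max:.1f}%, td = {td:.2f} h, beta = {beta:.2f}, mean time = {mrt:.2f} h")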
Experimental investigation of mode I fracture for brittle tube-shaped particles
NASA Astrophysics Data System (ADS)
Stasiak, Marta; Combe, Gaël; Desrues, Jacques; Richefeu, Vincent; Villard, Pascal; Armand, Gilles; Zghondi, Jad
2017-06-01
We focus herein on the mechanical behavior of highly crushable grains. The object of our interest, named a shell, is a hollow cylindrical grain with a ring cross-section, made of baked clay. The objective is to model the fragmentation of such shells by means of a discrete element (DE) approach. To this end, fracture modes I (opening fracture) and II (in-plane shear fracture) have to be investigated experimentally. This paper is essentially dedicated to mode I fracture. Therefore, a campaign of Brazilian-like compression tests, which result in crack opening, has been performed. The distribution of tensile strength is shown to obey a Weibull distribution for the studied shells, and the Weibull modulus was quantified. Finally, an estimate of the numerical/physical parameters required in a DE model (local strength) is proposed on the basis of the energy required to fracture through a given surface in mode I or II.
Idealized models of the joint probability distribution of wind speeds
NASA Astrophysics Data System (ADS)
Monahan, Adam H.
2018-05-01
The joint probability distribution of wind speeds at two separate locations in space or points in time completely characterizes the statistical dependence of these two quantities, providing more information than linear measures such as correlation. In this study, we consider two models of the joint distribution of wind speeds obtained from idealized models of the dependence structure of the horizontal wind velocity components. The bivariate Rice distribution follows from assuming that the wind components have Gaussian and isotropic fluctuations. The bivariate Weibull distribution arises from power law transformations of wind speeds corresponding to vector components with Gaussian, isotropic, mean-zero variability. Maximum likelihood estimates of these distributions are compared using wind speed data from the mid-troposphere, from different altitudes at the Cabauw tower in the Netherlands, and from scatterometer observations over the sea surface. While the bivariate Rice distribution is more flexible and can represent a broader class of dependence structures, the bivariate Weibull distribution is mathematically simpler and may be more convenient in many applications. The complexity of the mathematical expressions obtained for the joint distributions suggests that the development of explicit functional forms for multivariate speed distributions from distributions of the components will not be practical for more complicated dependence structure or more than two speed variables.
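The construction behind the bivariate Weibull model described above can be illustrated by sampling: correlated, isotropic, mean-zero Gaussian components give Rayleigh speeds at the two locations, and a power transform maps them to Weibull speeds with a chosen shape. The correlation, shape and scale values below are arbitrary illustrations, not fitted values from the study.

    # Sample dependent Weibull wind speeds at two locations from correlated Gaussian components.
    import numpy as np

    rng = np.random.default_rng(6)
    n, rho, shape, scale = 100000, 0.6, 1.9, 8.0

    # Correlated standard normal pairs for the u components and, independently, the v components.
    cov = np.array([[1.0, rho], [rho, 1.0]])
    u = rng.multivariate_normal([0.0, 0.0], cov, size=n)
    v = rng.multivariate_normal([0.0, 0.0], cov, size=n)

    speed = np.hypot(u, v)                                     # Rayleigh speeds at locations 1 and 2
    weibull_speed = scale * (speed / np.sqrt(2.0)) ** (2.0 / shape)   # power transform to the chosen shape

    r = np.corrcoef(weibull_speed[:, 0], weibull_speed[:, 1])[0, 1]
    print(f"sample correlation of the two Weibull speeds: {r:.2f}")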
On the Distribution of Earthquake Interevent Times and the Impact of Spatial Scale
NASA Astrophysics Data System (ADS)
Hristopulos, Dionissios
2013-04-01
The distribution of earthquake interevent times is a subject that has attracted much attention in the statistical physics literature [1-3]. A recent paper proposes that the distribution of earthquake interevent times follows from the interplay of the crustal strength distribution and the loading function (stress versus time) of the Earth's crust locally [4]. It was also shown that the Weibull distribution describes earthquake interevent times provided that the crustal strength also follows the Weibull distribution and that the loading function follows a power-law during the loading cycle. I will discuss the implications of this work and will present supporting evidence based on the analysis of data from seismic catalogs. I will also discuss the theoretical evidence in support of the Weibull distribution based on models of statistical physics [5]. Since other-than-Weibull interevent time distributions are not excluded in [4], I will illustrate the use of the Kolmogorov-Smirnov test in order to determine which probability distributions are not rejected by the data. Finally, we propose a modification of the Weibull distribution if the size of the system under investigation (i.e., the area over which the earthquake activity occurs) is finite with respect to a critical link size. Keywords: hypothesis testing, modified Weibull, hazard rate, finite size. References: [1] Corral, A., 2004. Long-term clustering, scaling, and universality in the temporal occurrence of earthquakes, Phys. Rev. Lett., 92(10), art. no. 108501. [2] Saichev, A., Sornette, D., 2007. Theory of earthquake recurrence times, J. Geophys. Res., Ser. B 112, B04313/1-26. [3] Touati, S., Naylor, M., Main, I.G., 2009. Origin and nonuniversality of the earthquake interevent time distribution, Phys. Rev. Lett., 102(16), art. no. 168501. [4] Hristopulos, D.T., 2003. Spartan Gibbs random field models for geostatistical applications, SIAM Jour. Sci. Comput., 24, 2125-2162. [5] I. Eliazar and J. Klafter, 2006. Growth-collapse and decay-surge evolutions, and geometric Langevin equations, Physica A, 367, 106-128.
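As a small illustration of the Kolmogorov-Smirnov check mentioned in the abstract, the sketch below fits a Weibull to a synthetic sample of interevent times and tests the fitted model; because the parameters are estimated from the same data, the plain KS p-value is only approximate and a parametric bootstrap would be more rigorous.

    # KS check of a fitted Weibull model for (synthetic) earthquake interevent times.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(7)
    interevent = rng.weibull(0.8, 1000) * 30.0        # days between events, illustrative

    shape, _, scale = stats.weibull_min.fit(interevent, floc=0)
    stat, pval = stats.kstest(interevent, stats.weibull_min(shape, scale=scale).cdf)
    print(f"fitted shape = {shape:.2f}; KS statistic = {stat:.3f}, approximate p = {pval:.3f}")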
Application of Weibull analysis to SSME hardware
NASA Technical Reports Server (NTRS)
Gray, L. A. B.
1986-01-01
Generally, it has been documented that the wearing of engine parts forms a failure distribution which can be approximated by a function developed by Weibull. The purpose here is to examine to what extent the Weibull distribution approximates failure data for designated engine parts of the Space Shuttle Main Engine (SSME). The current testing certification requirements will be examined in order to establish confidence levels. An examination of the failure history of SSME parts/assemblies (turbine blades, main combustion chamber, or high pressure fuel pump first stage impellers) which are limited in usage by time or starts will be done by using updated Weibull techniques. Efforts will be made by the investigator to predict failure trends by using Weibull techniques for SSME parts (turbine temperature sensors, chamber pressure transducers, actuators, and controllers) which are not severely limited by time or starts.
Rafal Podlaski; Francis .A. Roesch
2013-01-01
The goals of this study are (1) to analyse the accuracy of the approximation of empirical distributions of diameter at breast height (dbh) using two-component mixtures of either the Weibull distribution or the gamma distribution in two-cohort stands, and (2) to discuss the procedure of choosing goodness-of-fit tests. The study plots were...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Muñoz-Jaramillo, Andrés; Windmueller, John C.; Amouzou, Ernest C.
2015-02-10
In this work, we take advantage of 11 different sunspot group, sunspot, and active region databases to characterize the area and flux distributions of photospheric magnetic structures. We find that, when taken separately, different databases are better fitted by different distributions (as has been reported previously in the literature). However, we find that all our databases can be reconciled by the simple application of a proportionality constant, and that, in reality, different databases are sampling different parts of a composite distribution. This composite distribution is made up by a linear combination of Weibull and log-normal distributions, where a pure Weibull (log-normal) characterizes the distribution of structures with fluxes below (above) 10^21 Mx (10^22 Mx). Additionally, we demonstrate that the Weibull distribution shows the expected linear behavior of a power-law distribution (when extended to smaller fluxes), making our results compatible with the results of Parnell et al. We propose that this is evidence of two separate mechanisms giving rise to visible structures on the photosphere: one directly connected to the global component of the dynamo (and the generation of bipolar active regions), and the other with the small-scale component of the dynamo (and the fragmentation of magnetic structures due to their interaction with turbulent convection).
Effect of Individual Component Life Distribution on Engine Life Prediction
NASA Technical Reports Server (NTRS)
Zaretsky, Erwin V.; Hendricks, Robert C.; Soditus, Sherry M.
2003-01-01
The effect of individual engine component life distributions on engine life prediction was determined. A Weibull-based life and reliability analysis of the NASA Energy Efficient Engine was conducted. The engine's life at a 95 and 99.9 percent probability of survival was determined based upon the engine manufacturer's original life calculations and assumed values of each of the components' cumulative life distributions as represented by a Weibull slope. The lives of the high-pressure turbine (HPT) disks and blades were also evaluated individually and as a system in a similar manner. Knowing the statistical cumulative distribution of each engine component with reasonable engineering certainty is a condition precedent to predicting the life and reliability of an entire engine. The life of a system at a given reliability will be less than that of the lowest-lived component in the system at the same reliability (probability of survival). Where the Weibull slopes of all the engine components are equal, the Weibull slope had a minimal effect on engine L0.1 life prediction. However, at a probability of survival of 95 percent (L5 life), life decreased with increasing Weibull slope.
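The system-level calculation described above can be sketched as follows: with each component's life modeled by a Weibull with slope (shape) beta and characteristic life eta, the system survival probability is the product of the component survivals, and the L5 life is the time at which that product falls to 95%. The component values below are illustrative, not the NASA Energy Efficient Engine data.

    # System L5 life from assumed component Weibull life distributions (illustrative values).
    import numpy as np
    from scipy.optimize import brentq

    components = [          # (Weibull slope beta, characteristic life eta [h])
        (2.0, 20000.0),     # HPT blades
        (3.0, 30000.0),     # HPT disks
        (1.5, 50000.0),     # remaining hot-section parts
    ]

    def system_reliability(t):
        # Product of component survival probabilities at time t.
        return np.prod([np.exp(-(t / eta) ** beta) for beta, eta in components])

    l5 = brentq(lambda t: system_reliability(t) - 0.95, 1.0, 1e6)
    component_l5 = ", ".join(f"{eta * (-np.log(0.95)) ** (1 / beta):.0f} h" for beta, eta in components)
    print(f"system L5 life: {l5:.0f} h (component L5 lives: {component_l5})")

As expected from the abstract's observation, the system L5 life computed this way is shorter than the L5 life of any individual component.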
Modeling Mass and Thermal Transport in Thin Porous Media of PEM Fuel Cells
NASA Astrophysics Data System (ADS)
Konduru, Vinaykumar
Water transport in the Porous Transport Layer (PTL) plays an important role in the efficient operation of polymer electrolyte membrane fuel cells (PEMFC). Excessive water content as well as dry operating conditions are unfavorable for efficient and reliable operation of the fuel cell. The effect of thermal conductivity and porosity on water management are investigated by simulating two-phase flow in the PTL of the fuel cell using a network model. In the model, the PTL consists of a pore-phase and a solid-phase. Different models of the PTLs are generated using independent Weibull distributions for the pore-phase and the solid-phase. The specific arrangement of the pores and solid elements is varied to obtain different PTL realizations for the same Weibull parameters. The properties of PTL are varied by changing the porosity and thermal conductivity. The parameters affecting operating conditions include the temperature, relative humidity in the flow channel and voltage and current density. In addition, a novel high-speed capable Surface Plasmon Resonance (SPR) microscope was built based on Kretschmann's configuration utilizing a collimated Kohler illumination. The SPR allows thin film characterization in a thickness of approximately 0-200nm by measuring the changes in the refractive index. Various independent experiments were run to measure film thickness during droplet coalescence during condensation.
Influence of the bracket on bonding and physical behavior of orthodontic resin cements.
Bolaños-Carmona, Victoria; Zein, Bilal; Menéndez-Núñez, Mario; Sánchez-Sánchez, Purificación; Ceballos-García, Laura; González-López, Santiago
2015-01-01
The aim of the study is to determine the influence of the type of bracket on bond strength, microhardness, and conversion degree (CD) of four orthodontic resin cements. The micro-tensile bond strength (µTBS) test between the bracket base and the cement was carried out on hourglass-shaped specimens (n=20). Vickers Hardness Number (VHN) and micro-Raman spectra were recorded in situ under the bracket base. Weibull distribution, ANOVA, and non-parametric tests were applied for data analysis (p<0.05). The highest values of η as well as the β Weibull parameter were obtained for metallic brackets, with Transbond™ plastic brackets and the self-curing cement showing the worst performance. The CD ranged from 80% to 62.5%.
NASA Astrophysics Data System (ADS)
Hasan, Md. Fahad; Wang, James; Berndt, Christopher
2015-06-01
The microhardness and elastic modulus of plasma-sprayed hydroxyapatite coatings were evaluated using Knoop indentation on the cross section and on the top surface. The effects of indentation angle, testing direction, measurement location and applied load on the microhardness and elastic modulus were investigated. The variability and distribution of the microhardness and elastic modulus data were statistically analysed using the Weibull modulus distribution. The results indicate that the dependence of microhardness and elastic modulus on the indentation angle exhibits a parabolic shape. Dependence of the microhardness values on the indentation angle follows Pythagoras's theorem. The microhardness, Weibull modulus of microhardness and Weibull modulus of elastic modulus reach their maximum at the central position (175 µm) on the cross section of the coatings. The Weibull modulus of microhardness revealed similar values throughout the thickness, and the Weibull modulus of elastic modulus shows higher values on the top surface compared to the cross section.
Liu, Jinping; Tang, Zhaohui; Xu, Pengfei; Liu, Wenzhong; Zhang, Jin; Zhu, Jianyong
2016-06-29
The topic of online product quality inspection (OPQI) with smart visual sensors is attracting increasing interest in both the academic and industrial communities on account of the natural connection between the visual appearance of products and their underlying qualities. Visual images captured from granulated products (GPs), e.g., cereal products or fabric textiles, are composed of a large number of independent particles or stochastically stacking, locally homogeneous fragments, whose analysis and understanding remain challenging. A method of image statistical modeling-based OPQI for GP quality grading and monitoring by a Weibull distribution (WD) model with a semi-supervised learning classifier is presented. WD-model parameters (WD-MPs) of GP images' spatial structures, obtained with omnidirectional Gaussian derivative filtering (OGDF), which were demonstrated theoretically to obey a specific WD model of integral form, were extracted as the visual features. Then, a co-training-style semi-supervised classifier algorithm, named COSC-Boosting, was exploited for semi-supervised GP quality grading, by integrating two independent classifiers with complementary nature in the face of scarce labeled samples. The effectiveness of the proposed OPQI method was verified and compared with commonly used methods in the field of automated rice quality grading, where it showed superior performance, which lays a foundation for the quality control of GPs on assembly lines.
NASA Astrophysics Data System (ADS)
Nadarajah, Saralees; Kotz, Samuel
2007-04-01
Various q-type distributions have appeared in the physics literature in recent years, see e.g. L.C. Malacarne, R.S. Mendes, E.K. Lenzi, q-exponential distribution in urban agglomeration, Phys. Rev. E 65, (2002) 017106; S.M.D. Queiros, On a possible dynamical scenario leading to a generalised Gamma distribution, in xxx.lanl.gov-physics/0411111; U.M.S. Costa, V.N. Freire, L.C. Malacarne, R.S. Mendes, S. Picoli Jr., E.A. de Vasconcelos, E.F. da Silva Jr., An improved description of the dielectric breakdown in oxides based on a generalized Weibull distribution, Physica A 361, (2006) 215; S. Picoli, Jr., R.S. Mendes, L.C. Malacarne, q-exponential, Weibull, and q-Weibull distributions: an empirical analysis, Physica A 324 (2003) 678-688; A.M.C. de Souza, C. Tsallis, Student's t- and r-distributions: unified derivation from an entropic variational principle, Physica A 236 (1997) 52-57. It is pointed out in the paper that many of these are the same as, or particular cases of, what has been known in the statistics literature. Several of these statistical distributions are discussed and references provided. We feel that this paper could be of assistance for modeling problems of the type considered in the works cited above and others.
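For readers unfamiliar with the q-type generalizations mentioned above, the following sketch evaluates one common form of the q-Weibull density and checks numerically that it approaches the ordinary Weibull density as q approaches 1. The normalization convention follows one of several used in the literature and is an assumption here.

```python
import numpy as np

def q_exp(x, q):
    # q-exponential: e_q(x) = [1 + (1-q)x]^(1/(1-q)), reducing to exp(x) as q -> 1
    if abs(q - 1.0) < 1e-12:
        return np.exp(x)
    base = 1.0 + (1.0 - q) * x
    return np.where(base > 0, base ** (1.0 / (1.0 - q)), 0.0)

def q_weibull_pdf(x, q, beta, eta):
    u = (x / eta) ** beta
    return (2.0 - q) * (beta / eta) * (x / eta) ** (beta - 1.0) * q_exp(-u, q)

def weibull_pdf(x, beta, eta):
    u = (x / eta) ** beta
    return (beta / eta) * (x / eta) ** (beta - 1.0) * np.exp(-u)

x = np.linspace(0.01, 5.0, 200)
for q in (1.3, 1.1, 1.01, 1.001):
    diff = np.max(np.abs(q_weibull_pdf(x, q, beta=1.5, eta=1.0)
                         - weibull_pdf(x, beta=1.5, eta=1.0)))
    print(f"q = {q}: max |q-Weibull - Weibull| = {diff:.4f}")
```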
Probabilistic Analysis for Comparing Fatigue Data Based on Johnson-Weibull Parameters
NASA Technical Reports Server (NTRS)
Vlcek, Brian L.; Hendricks, Robert C.; Zaretsky, Erwin V.
2013-01-01
Leonard Johnson published a methodology for establishing the confidence that two populations of data are different. Johnson's methodology is dependent on limited combinations of test parameters (Weibull slope, mean life ratio, and degrees of freedom) and a set of complex mathematical equations. In this report, a simplified algebraic equation for confidence numbers is derived based on the original work of Johnson. The confidence numbers calculated with this equation are compared to those obtained graphically by Johnson. Using the ratios of mean life, the resultant values of confidence numbers at the 99 percent level deviate less than 1 percent from those of Johnson. At a 90 percent confidence level, the calculated values differ between +2 and 4 percent. The simplified equation is used to rank the experimental lives of three aluminum alloys (AL 2024, AL 6061, and AL 7075), each tested at three stress levels in rotating beam fatigue, analyzed using the Johnson-Weibull method, and compared to the ASTM Standard (E739-91) method of comparison. The ASTM Standard did not statistically distinguish between AL 6061 and AL 7075. However, it is possible to rank the fatigue lives of different materials with a reasonable degree of statistical certainty based on combined confidence numbers using the Johnson-Weibull analysis. AL 2024 was found to have the longest fatigue life, followed by AL 7075, and then AL 6061. The ASTM Standard and the Johnson-Weibull analysis result in the same stress-life exponent p for each of the three aluminum alloys at the median, or L(sub 50), lives.
The impacts of precipitation amount simulation on hydrological modeling in Nordic watersheds
NASA Astrophysics Data System (ADS)
Li, Zhi; Brissette, Fancois; Chen, Jie
2013-04-01
Stochastic modeling of daily precipitation is very important for hydrological modeling, especially when no observed data are available. Precipitation is usually modeled with a two-component model: occurrence generation and amount simulation. For occurrence simulation, the most common method is the first-order two-state Markov chain, owing to its simplicity and good performance. However, various probability distributions have been reported for simulating precipitation amount, and spatiotemporal differences exist in the applicability of different distribution models. Therefore, assessing the applicability of different distribution models is necessary in order to provide more accurate precipitation information. Six precipitation probability distributions (exponential, Gamma, Weibull, skewed normal, mixed exponential, and hybrid exponential/Pareto distributions) are directly and indirectly evaluated on their ability to reproduce the original observed time series of precipitation amount. Data from 24 weather stations and two watersheds (Chute-du-Diable and Yamaska watersheds) in the province of Quebec (Canada) are used for this assessment. Various indices or statistics, such as the mean, variance, frequency distribution, and extreme values, are used to quantify the performance in simulating the precipitation and discharge. Performance in reproducing key statistics of the precipitation time series is well correlated to the number of parameters of the distribution function, and the three-parameter precipitation models outperform the other models, with the mixed exponential distribution being the best at simulating daily precipitation. The advantage of using more complex precipitation distributions is not as clear-cut when the simulated time series are used to drive a hydrological model. While the advantage of using functions with more parameters is not nearly as obvious, the mixed exponential distribution appears nonetheless as the best candidate for hydrological modeling. The implications of choosing a distribution function with respect to hydrological modeling and climate change impact studies are also discussed.
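A toy version of the two-component generator described above, assuming invented transition probabilities and mixed-exponential parameters (the mixed exponential being the distribution the study found best for daily amounts):

```python
import numpy as np

rng = np.random.default_rng(42)

p01 = 0.30   # P(wet today | dry yesterday)  -- hypothetical
p11 = 0.65   # P(wet today | wet yesterday)  -- hypothetical
alpha, mu1, mu2 = 0.7, 2.0, 12.0   # mixed exponential: weight and the two means (mm)

def simulate(n_days):
    wet = False
    amounts = np.zeros(n_days)
    for d in range(n_days):
        # first-order two-state Markov chain for occurrence
        p_wet = p11 if wet else p01
        wet = rng.random() < p_wet
        if wet:
            # mixed-exponential amount on wet days
            mean = mu1 if rng.random() < alpha else mu2
            amounts[d] = rng.exponential(mean)
    return amounts

precip = simulate(365 * 30)
wet_days = precip[precip > 0]
print(f"wet-day frequency: {wet_days.size / precip.size:.2f}")
print(f"mean wet-day amount: {wet_days.mean():.1f} mm, "
      f"99th percentile: {np.percentile(wet_days, 99):.1f} mm")
```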
Accuracy of Time Phasing Aircraft Development using the Continuous Distribution Function
2015-03-26
[Recovered fragments of the analysis: Breusch-Pagan tests for constant variance report p-values of 0.5264, 0.6911, and 0.5176 (the last for the Weibull scale parameter β), all failing to reject the null hypothesis of constant variance; a Shapiro-Wilk normality test gives Prob. < W of 0.9849; influential-data diagnostics are reported for the Beta shape parameter α.]
Establishment of a center of excellence for applied mathematical and statistical research
NASA Technical Reports Server (NTRS)
Woodward, W. A.; Gray, H. L.
1983-01-01
The state of the art was assessed with regard to efforts in support of the crop production estimation problem, and alternative generic proportion estimation techniques were investigated. Topics covered include modeling the greenness profile (Badhwar's model), parameter estimation using mixture models such as CLASSY, and minimum distance estimation as an alternative to maximum likelihood estimation. Approaches to the problem of obtaining proportion estimates when the underlying distributions are asymmetric are examined, including the properties of the Weibull distribution.
Estimating Tree Height-Diameter Models with the Bayesian Method
Duan, Aiguo; Zhang, Jianguo; Xiang, Congwei
2014-01-01
Six candidate height-diameter models were used to analyze the height-diameter relationships. The common methods for estimating height-diameter models have taken the classical (frequentist) approach based on the frequency interpretation of probability, for example, the nonlinear least squares method (NLS) and the maximum likelihood method (ML). The Bayesian method has a distinct advantage over the classical method in that the parameters to be estimated are regarded as random variables. In this study, the classical and Bayesian methods were used to estimate the six height-diameter models. Both the classical method and the Bayesian method showed that the Weibull model was the "best" model using data1. In addition, based on the Weibull model, data2 was used to compare the Bayesian method with informative priors against uninformative priors and the classical method. The results showed that the improvement in prediction accuracy with the Bayesian method led to narrower confidence bands of the predicted values in comparison to the classical method, and the credible bands of parameters with informative priors were also narrower than those with uninformative priors and the classical method. The estimated posterior distributions for the parameters can be set as new priors in estimating the parameters using data2. PMID:24711733
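One functional form commonly called the "Weibull" height-diameter model is H = 1.3 + a(1 - exp(-b D^c)); whether this is exactly the form used in the study is an assumption. The sketch below fits it to synthetic data by the classical NLS approach mentioned in the abstract.

```python
import numpy as np
from scipy.optimize import curve_fit

def weibull_hd(d, a, b, c):
    # height (m) as a Weibull-type function of diameter (cm); 1.3 m is breast height
    return 1.3 + a * (1.0 - np.exp(-b * d ** c))

rng = np.random.default_rng(1)
d = rng.uniform(5, 50, 200)                                        # diameters, cm (synthetic)
h = weibull_hd(d, 28.0, 0.02, 1.3) + rng.normal(0, 1.0, d.size)    # heights, m (synthetic)

popt, pcov = curve_fit(weibull_hd, d, h, p0=[25.0, 0.05, 1.0])
a, b, c = popt
print(f"a = {a:.2f} m, b = {b:.4f}, c = {c:.2f}")
```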
CARES/PC - CERAMICS ANALYSIS AND RELIABILITY EVALUATION OF STRUCTURES
NASA Technical Reports Server (NTRS)
Szatmary, S. A.
1994-01-01
The beneficial properties of structural ceramics include their high-temperature strength, light weight, hardness, and corrosion and oxidation resistance. For advanced heat engines, ceramics have demonstrated functional abilities at temperatures well beyond the operational limits of metals. This is offset by the fact that ceramic materials tend to be brittle. When a load is applied, their lack of significant plastic deformation causes the material to crack at microscopic flaws, destroying the component. CARES/PC performs statistical analysis of data obtained from the fracture of simple, uniaxial tensile or flexural specimens and estimates the Weibull and Batdorf material parameters from this data. CARES/PC is a subset of the program CARES (COSMIC program number LEW-15168) which calculates the fast-fracture reliability or failure probability of ceramic components utilizing the Batdorf and Weibull models to describe the effects of multi-axial stress states on material strength. CARES additionally requires that the ceramic structure be modeled by a finite element program such as MSC/NASTRAN or ANSYS. The more limited CARES/PC does not perform fast-fracture reliability estimation of components. CARES/PC estimates ceramic material properties from uniaxial tensile or from three- and four-point bend bar data. In general, the parameters are obtained from the fracture stresses of many specimens (30 or more are recommended) whose geometry and loading configurations are held constant. Parameter estimation can be performed for single or multiple failure modes by using the least-squares analysis or the maximum likelihood method. Kolmogorov-Smirnov and Anderson-Darling goodness-of-fit tests measure the accuracy of the hypothesis that the fracture data comes from a population with a distribution specified by the estimated Weibull parameters. Ninety-percent confidence intervals on the Weibull parameters and the unbiased value of the shape parameter for complete samples are provided when the maximum likelihood technique is used. CARES/PC is written and compiled with the Microsoft FORTRAN v5.0 compiler using the VAX FORTRAN extensions and dynamic array allocation supported by this compiler for the IBM/MS-DOS or OS/2 operating systems. The dynamic array allocation routines allow the user to match the number of fracture sets and test specimens to the memory available. Machine requirements include IBM PC compatibles with optional math coprocessor. Program output is designed to fit 80-column format printers. Executables for both DOS and OS/2 are provided. CARES/PC is distributed on one 5.25 inch 360K MS-DOS format diskette in compressed format. The expansion tool PKUNZIP.EXE is supplied on the diskette. CARES/PC was developed in 1990. IBM PC and OS/2 are trademarks of International Business Machines. MS-DOS and MS OS/2 are trademarks of Microsoft Corporation. VAX is a trademark of Digital Equipment Corporation.
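A hedged, minimal analogue of the parameter-estimation step (not the CARES/PC code itself): maximum-likelihood fitting of a two-parameter Weibull to a set of fracture stresses, followed by a Kolmogorov-Smirnov check against the fitted distribution. The data are synthetic.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
# 30 synthetic fracture stresses (MPa), e.g. from four-point bend bars
stresses = 400.0 * rng.weibull(10.0, 30)

# floc=0 fixes the threshold at zero, i.e. a two-parameter Weibull
shape, loc, scale = stats.weibull_min.fit(stresses, floc=0)

# goodness of fit (note: the p-value is only approximate because the
# parameters were estimated from the same data)
ks = stats.kstest(stresses, "weibull_min", args=(shape, loc, scale))

print(f"Weibull modulus (shape): {shape:.1f}")
print(f"characteristic strength (scale): {scale:.0f} MPa")
print(f"Kolmogorov-Smirnov p-value: {ks.pvalue:.2f}")
```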
NASA Astrophysics Data System (ADS)
Li, T.; Griffiths, W. D.; Chen, J.
2017-11-01
The Maximum Likelihood method and the Linear Least Squares (LLS) method have been widely used to estimate Weibull parameters for the reliability of brittle and metal materials. In the last 30 years, many researchers have focused on the bias of the Weibull modulus estimate, and some improvements have been achieved, especially in the case of the LLS method. However, there is a shortcoming in these methods for a specific type of data, where the lower tail deviates dramatically from the well-known linear fit in a classic LLS Weibull analysis. This deviation is commonly found in the measured properties of materials, and previous applications of the LLS method to this kind of dataset produce an unreliable linear regression. This deviation was previously thought to be due to physical flaws (i.e., defects) contained in materials. However, this paper demonstrates that this deviation can also be caused by the linear transformation of the Weibull function, which occurs in the traditional LLS method. Accordingly, it may not be appropriate to carry out a Weibull analysis according to the linearized Weibull function, and the Non-linear Least Squares method (Non-LS) is instead recommended for the Weibull modulus estimation of casting properties.
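The contrast drawn above can be sketched as follows: the classic linearized least-squares (LLS) fit regresses ln(-ln(1-F)) on ln(x), while the non-linear alternative fits the untransformed Weibull CDF directly. The plotting positions and data below are illustrative choices, not the authors' procedure.

```python
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import linregress

rng = np.random.default_rng(3)
x = np.sort(300.0 * rng.weibull(8.0, 40))            # e.g. strengths, MPa (synthetic)
n = x.size
F = (np.arange(1, n + 1) - 0.3) / (n + 0.4)          # median-rank plotting positions

# --- linearized LLS: ln(-ln(1-F)) = m*ln(x) - m*ln(eta)
y = np.log(-np.log(1.0 - F))
fit = linregress(np.log(x), y)
m_lls = fit.slope
eta_lls = np.exp(-fit.intercept / fit.slope)

# --- non-linear least squares on the CDF itself
def weibull_cdf(x, m, eta):
    return 1.0 - np.exp(-(x / eta) ** m)

(m_nls, eta_nls), _ = curve_fit(weibull_cdf, x, F, p0=[m_lls, eta_lls])

print(f"LLS:    m = {m_lls:.2f}, eta = {eta_lls:.0f} MPa")
print(f"Non-LS: m = {m_nls:.2f}, eta = {eta_nls:.0f} MPa")
```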
Bayesian Weibull tree models for survival analysis of clinico-genomic data
Clarke, Jennifer; West, Mike
2008-01-01
An important goal of research involving gene expression data for outcome prediction is to establish the ability of genomic data to define clinically relevant risk factors. Recent studies have demonstrated that microarray data can successfully cluster patients into low- and high-risk categories. However, the need exists for models which examine how genomic predictors interact with existing clinical factors and provide personalized outcome predictions. We have developed clinico-genomic tree models for survival outcomes which use recursive partitioning to subdivide the current data set into homogeneous subgroups of patients, each with a specific Weibull survival distribution. These trees can provide personalized predictive distributions of the probability of survival for individuals of interest. Our strategy is to fit multiple models; within each model we adopt a prior on the Weibull scale parameter and update this prior via Empirical Bayes whenever the sample is split at a given node. The decision to split is based on a Bayes factor criterion. The resulting trees are weighted according to their relative likelihood values and predictions are made by averaging over models. In a pilot study of survival in advanced stage ovarian cancer we demonstrate that clinical and genomic data are complementary sources of information relevant to survival, and we use the exploratory nature of the trees to identify potential genomic biomarkers worthy of further study. PMID:18618012
Stawarczyk, Bogna; Ozcan, Mutlu; Hämmerle, Christoph H F; Roos, Malgorzata
2012-05-01
The aim of this study was to compare the fracture load of veneered anterior zirconia crowns using normal and Weibull distributions of complete and censored data. Standardized zirconia frameworks for maxillary canines were milled using a CAD/CAM system and randomly divided into 3 groups (N=90, n=30 per group). They were veneered with three veneering ceramics, namely GC Initial ZR, Vita VM9, and IPS e.max Ceram, using the layering technique. The crowns were cemented with glass ionomer cement on metal abutments. The specimens were then loaded to fracture (1 mm/min) in a Universal Testing Machine. The data were analyzed using the classical method (normal data distribution (μ, σ); Levene test and one-way ANOVA) and according to Weibull statistics (s, m). In addition, fracture load results were analyzed depending on complete and censored failure types (only chipping vs. total fracture together with chipping). When computed with complete data, significantly higher mean fracture loads (N) were observed for GC Initial ZR (μ=978, σ=157; s=1043, m=7.2) and VITA VM9 (μ=1074, σ=179; s=1139, m=7.8) than for IPS e.max Ceram (μ=798, σ=174; s=859, m=5.8) (p<0.05) by classical and Weibull statistics, respectively. When the data were censored for only total fracture, IPS e.max Ceram presented the lowest fracture load for chipping with both the classical distribution (μ=790, σ=160) and Weibull statistics (s=836, m=6.5). When total fracture with chipping (classical distribution) was considered as failure, IPS e.max Ceram did not show a significantly different fracture load for total fracture (μ=1054, σ=110) compared to the other groups (GC Initial ZR: μ=1039, σ=152; VITA VM9: μ=1170, σ=166). According to the Weibull-distributed data, VITA VM9 showed a significantly higher fracture load (s=1228, m=9.4) than the other groups. Both the classical distribution and Weibull statistics for complete data yielded similar outcomes. Censored data analysis of all ceramic systems based on failure types is essential and brings additional information regarding susceptibility to chipping or total fracture. Copyright © 2011 Academy of Dental Materials. Published by Elsevier Ltd. All rights reserved.
Failure probability under parameter uncertainty.
Gerrard, R; Tsanakas, A
2011-05-01
In many problems of risk analysis, failure is equivalent to the event of a random risk factor exceeding a given threshold. Failure probabilities can be controlled if a decision-maker is able to set the threshold at an appropriate level. This abstract situation applies, for example, to environmental risks with infrastructure controls; to supply chain risks with inventory controls; and to insurance solvency risks with capital controls. However, uncertainty around the distribution of the risk factor implies that parameter error will be present and the measures taken to control failure probabilities may not be effective. We show that parameter uncertainty increases the probability (understood as expected frequency) of failures. For a large class of loss distributions, arising from increasing transformations of location-scale families (including the log-normal, Weibull, and Pareto distributions), the article shows that failure probabilities can be exactly calculated, as they are independent of the true (but unknown) parameters. Hence it is possible to obtain an explicit measure of the effect of parameter uncertainty on failure probability. Failure probability can be controlled in two different ways: (1) by reducing the nominal required failure probability, depending on the size of the available data set, and (2) by modifying the distribution itself that is used to calculate the risk control. Approach (1) corresponds to a frequentist/regulatory view of probability, while approach (2) is consistent with a Bayesian/personalistic view. We furthermore show that the two approaches are consistent in achieving the required failure probability. Finally, we briefly discuss the effects of data pooling and its systemic risk implications. © 2010 Society for Risk Analysis.
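A small simulation consistent with the result described above (for the log-normal case): when the threshold is a plug-in estimate of a high quantile, the realized failure probability exceeds the nominal level and, by the location-scale argument, does not depend on the true parameters. The sample size, nominal level, and parameter pairs are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(11)
nominal, n, reps = 0.01, 30, 20000

def realized_failure_prob(mu, sigma):
    fails = 0
    for _ in range(reps):
        sample = rng.lognormal(mu, sigma, n)
        # plug-in threshold: estimated (1 - nominal) quantile from the sample
        mhat = np.mean(np.log(sample))
        shat = np.std(np.log(sample), ddof=1)
        threshold = np.exp(mhat + shat * 2.3263)      # z_{0.99} ~= 2.3263
        fails += rng.lognormal(mu, sigma) > threshold
    return fails / reps

for mu, sigma in [(0.0, 1.0), (5.0, 0.3), (-2.0, 2.0)]:
    print(f"mu = {mu:+.1f}, sigma = {sigma}: realized prob ~ "
          f"{realized_failure_prob(mu, sigma):.4f} (nominal {nominal})")
```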
Functional models for colloid retention in porous media at the triple line.
Dathe, Annette; Zevi, Yuniati; Richards, Brian K; Gao, Bin; Parlange, J-Yves; Steenhuis, Tammo S
2014-01-01
Spectral confocal microscope visualizations of microsphere movement in unsaturated porous media showed that attachment at the Air-Water-Solid (AWS) interface was an important retention mechanism. These visualizations can aid in resolving the functional form of the retention rates of colloids at the AWS interface. In this study, soil adsorption isotherm equations were adapted by replacing the chemical concentration in the water as the independent variable with the cumulative number of colloids passing by. In order of increasing number of fitted parameters, the functions tested were the Langmuir adsorption isotherm, the Logistic distribution, and the Weibull distribution. The functions were fitted against colloid concentrations obtained from time series of images acquired with a spectral confocal microscope for three experiments in which either plain or carboxylated polystyrene latex microspheres were pulsed in a small flow chamber filled with cleaned quartz sand. Both moving and retained colloids were quantified over time. In fitting the models to the data, the agreement improved with increasing number of model parameters. The Weibull distribution gave the best fit overall. The Logistic distribution did not fit the initial retention of microspheres well, but otherwise the fit was good. The Langmuir isotherm only fitted the longest time series well. The results can be explained as follows: initially, when colloids are first introduced, the rate of retention is low. Once colloids are at the AWS interface they act as anchor points for other colloids to attach, thereby increasing the retention rate as clusters form. Once the available attachment sites diminish, the retention rate decreases.
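A sketch of the curve-fitting step, using plausible (assumed) parameterizations of the three candidate functions and synthetic retained-versus-passed data; the authors' exact functional forms may differ.

```python
import numpy as np
from scipy.optimize import curve_fit

langmuir = lambda c, smax, k: smax * c / (k + c)
logistic = lambda c, smax, k, c0: smax / (1.0 + np.exp(-k * (c - c0)))
weibull  = lambda c, smax, lam, beta: smax * (1.0 - np.exp(-(c / lam) ** beta))

# synthetic "retained vs cumulative colloids passed" data with an S-shaped onset
c = np.linspace(1, 500, 60)
rng = np.random.default_rng(5)
retained = weibull(c, 120.0, 150.0, 2.5) + rng.normal(0, 3.0, c.size)

for name, f, p0 in [("Langmuir", langmuir, [120, 100]),
                    ("Logistic", logistic, [120, 0.05, 150]),
                    ("Weibull",  weibull,  [120, 150, 2.0])]:
    popt, _ = curve_fit(f, c, retained, p0=p0, bounds=(0, np.inf), maxfev=10000)
    rss = np.sum((retained - f(c, *popt)) ** 2)   # residual sum of squares
    print(f"{name:8s}: params = {np.round(popt, 3)}, RSS = {rss:.0f}")
```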
Modern methodology of designing target reliability into rotating mechanical components
NASA Technical Reports Server (NTRS)
Kececioglu, D. B.; Chester, L. B.
1973-01-01
Experimentally determined distributional cycles-to-failure versus maximum alternating nominal strength (S-N) diagrams, and distributional mean nominal strength versus maximum alternating nominal strength (Goodman) diagrams, are presented. These distributional S-N and Goodman diagrams are for AISI 4340 steel, Rc 35/40 hardness, round, cylindrical specimens 0.735 in. in diameter and 6 in. long with a circumferential groove of 0.145 in. radius for a theoretical stress concentration of 1.42 and of 0.034 in. radius for a stress concentration of 2.34. The specimens are subjected to reversed bending and steady torque in three specially built complex-fatigue research machines. Based on these results, the effects of superimposing steady torque on reversed bending on the distributional S-N and Goodman diagrams and on service life are established, as well as the effect of various stress concentrations. In addition, a computer program for determining the three-parameter Weibull distribution representing the cycles-to-failure data, and two methods for calculating the reliability of components subjected to cumulative fatigue loads, are given.
Analysis of Weibull Grading Test for Solid Tantalum Capacitors
NASA Technical Reports Server (NTRS)
Teverovsky, Alexander
2010-01-01
The Weibull grading test is a powerful technique that allows selection and reliability rating of solid tantalum capacitors for military and space applications. However, inaccuracies in the existing method and inadequate acceleration factors can result in significant errors, up to three orders of magnitude, in the calculated failure rate of capacitors. This paper analyzes deficiencies of the existing technique and recommends a more accurate method of calculation. A physical model presenting failures of tantalum capacitors as time-dependent dielectric breakdown is used to determine voltage and temperature acceleration factors and to select adequate Weibull grading test conditions. This model is verified by highly accelerated life testing (HALT) at different temperature and voltage conditions for three types of solid chip tantalum capacitors. It is shown that the parameters of the model and acceleration factors can be calculated using a general log-linear relationship for the characteristic life with two stress levels.
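Sketch of the general log-linear idea mentioned in the last sentence, assuming an Arrhenius term in temperature and an inverse-power term in voltage for the characteristic life; the HALT points below are invented and only the fitting mechanics are shown.

```python
import numpy as np

# (temperature K, voltage V, observed characteristic life h) -- hypothetical HALT results
halt = np.array([[358.0, 16.0, 1200.0],
                 [398.0, 16.0,  150.0],
                 [358.0, 20.0,  300.0],
                 [398.0, 20.0,   40.0]])

T, V, eta = halt[:, 0], halt[:, 1], halt[:, 2]

# ln(eta) = a0 + a1*(1/T) + a2*ln(V): Arrhenius in temperature, inverse power in voltage
X = np.column_stack([np.ones_like(T), 1.0 / T, np.log(V)])
coef, *_ = np.linalg.lstsq(X, np.log(eta), rcond=None)
a0, a1, a2 = coef

def acceleration_factor(T_use, V_use, T_test, V_test):
    # ratio of characteristic life at use conditions to that at test conditions
    return np.exp(a1 * (1.0 / T_use - 1.0 / T_test)
                  + a2 * (np.log(V_use) - np.log(V_test)))

print(f"a1 (temperature term): {a1:.0f} K, a2 (voltage exponent): {a2:.1f}")
print(f"AF from a 398 K / 20 V test to 328 K / 10 V use: "
      f"{acceleration_factor(328, 10, 398, 20):,.0f}")
```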
1981-12-01
[Fragmented excerpt from the thesis: the recovered text discusses preventing the generation of negative location estimators, the invariance property of the EDF statistics under the transformation used, the care required when applying the parameter estimation method developed by Harter and Moore so that the location estimators do not become negative, tables of A² critical values (level 0.01, n = 30), and an appendix of computer programs, including one to calculate the Cramér-von Mises critical values.]
NASA Astrophysics Data System (ADS)
Sun, Huarui; Bajo, Miguel Montes; Uren, Michael J.; Kuball, Martin
2015-01-01
Gate leakage degradation of AlGaN/GaN high electron mobility transistors under OFF-state stress is investigated using a combination of electrical, optical, and surface morphology characterizations. The generation of leakage "hot spots" at the edge of the gate is found to be strongly temperature accelerated. The time for the formation of each failure site follows a Weibull distribution with a shape parameter in the range of 0.7-0.9 from room temperature up to 120 °C. The average leakage per failure site is only weakly temperature dependent. The stress-induced structural degradation at the leakage sites exhibits a temperature dependence in the surface morphology, which is consistent with a surface defect generation process involving temperature-associated changes in the breakdown sites.
A Weibull characterization for tensile fracture of multicomponent brittle fibers
NASA Technical Reports Server (NTRS)
Barrows, R. G.
1977-01-01
A statistical characterization for multicomponent brittle fibers is presented. The method, which is an extension of the usual Weibull distribution procedures, statistically considers the components making up a fiber (e.g., substrate, sheath, and surface) both as separate entities and, taken together, as a fiber. Tensile data for silicon carbide fiber and for an experimental carbon-boron alloy fiber are evaluated in terms of the proposed multicomponent Weibull characterization.
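The weakest-link flavor of the multicomponent characterization can be sketched as below: each component (substrate, sheath, surface) gets its own Weibull strength distribution, and the fiber survival probability at a given stress is taken as the product of the component survival probabilities. All parameter values are invented for illustration.

```python
import numpy as np

# component: (Weibull modulus m, characteristic strength sigma0 in MPa) -- hypothetical
components = {"substrate": (4.0, 3200.0),
              "sheath":    (7.0, 2600.0),
              "surface":   (3.0, 4000.0)}

def fiber_survival(stress):
    # the fiber survives only if every component survives (weakest-link assumption)
    r = 1.0
    for m, sigma0 in components.values():
        r *= np.exp(-(stress / sigma0) ** m)
    return r

for s in (1000.0, 2000.0, 3000.0):
    print(f"P(survive {s:.0f} MPa) = {fiber_survival(s):.3f}")
```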
Effect of BN coating on the strength of a mullite type fiber
NASA Astrophysics Data System (ADS)
Chawla, K. K.; Xu, Z. R.; Ha, J.-S.; Schmücker, M.; Schneider, H.
1997-09-01
Nextel 480 is a polycrystalline, essentially mullite, fiber (70 wt.% Al2O3 + 28 wt.% SiO2 + 2 wt.% B2O3). Different thicknesses of BN were applied as coatings on this fiber. Optical, scanning electron, and transmission electron microscopy were used to characterize the microstructure of the coatings and fibers. The effects of coating and high-temperature exposure on the fiber strength were investigated using a two-parameter Weibull distribution. TEM examination showed that the BN coating has a turbostratic structure, with the basal planes lying predominantly parallel to the fiber surface. Such an orientation of the coating is desirable for easy crack deflection and subsequent fiber pullout in a composite. For the BN-coated Nextel 480 fiber, the Weibull mean strength first increased and then decreased with increasing coating thickness. This was due to the surface-flaw healing effect of the coating (up to 0.3 μm), while in the case of a thick BN coating (1 μm), the soft nature of the coating material had a more dominant effect and resulted in a decrease of the fiber strength. High-temperature exposure of Nextel 480 resulted in grain growth, which led to a strength loss.
Jahid, Iqbal Kabir; Ha, Sang-Do
2014-05-01
The present article focuses on the inactivation kinetics of various disinfectants, including ethanol, sodium hypochlorite, hydrogen peroxide, peracetic acid, and benzalkonium chloride, against Aeromonas hydrophila biofilms and planktonic cells. Efficacy was determined by viable plate count and compared using a modified Weibull model. The removal of the biofilm matrix was determined by the crystal violet assay and was confirmed by field-emission scanning electron microscopy. The results revealed that all the experimental data and the calculated Weibull α (scale) and β (shape) parameters had a good fit, as the R(2) values were between 0.88 and 0.99. Biofilms are more resistant to disinfectants than planktonic cells. Ethanol (70%) was the most effective in killing cells in the biofilms and significantly reduced (p<0.05) the biofilm matrix. The Weibull parameter b-value correlated (R(2)=0.6835) with biofilm matrix removal. The present findings indicate that the Weibull model is suitable for determining biofilm matrix reduction as well as the effectiveness of chemical disinfectants on biofilms. The study showed that the Weibull model could successfully be used on food and food contact surfaces to determine the exact contact time for killing biofilm-forming foodborne pathogens.
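A sketch of the Weibull inactivation model in its common Mafart form, log10(N/N0) = -(t/delta)^p, fitted to synthetic log-reduction data; mapping the abstract's α (scale) and β (shape) onto delta and p here is an assumption on my part.

```python
import numpy as np
from scipy.optimize import curve_fit

def weibull_log_survival(t, delta, p):
    # log10 of the surviving fraction after contact time t
    return -(t / delta) ** p

t = np.array([0.5, 1, 2, 3, 5, 7, 10])                               # contact time, min
log_reduction = np.array([-0.5, -0.9, -1.5, -2.1, -3.1, -4.1, -5.4])  # synthetic data

(delta, p), _ = curve_fit(weibull_log_survival, t, log_reduction, p0=[1.0, 1.0])
print(f"scale delta = {delta:.2f} min, shape p = {p:.2f}")
# time to reach a 5-log reduction follows directly from the fitted model
print(f"predicted time for a 5-log reduction: {delta * 5 ** (1 / p):.1f} min")
```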
Zero expansion glass ceramic ZERODUR® roadmap for advanced lithography
NASA Astrophysics Data System (ADS)
Westerhoff, Thomas; Jedamzik, Ralf; Hartmann, Peter
2013-04-01
The zero expansion glass ceramic ZERODUR® is a well-established material in microlithography for critical components such as wafer and reticle stages, mirrors, and frames in the stepper positioning and alignment system. Its very low coefficient of thermal expansion (CTE) and extremely high CTE homogeneity are key properties for achieving the tight overlay requirements of advanced lithography processes. SCHOTT is continuously improving critical material properties of ZERODUR® essential for microlithography applications according to a roadmap driven by the ever tighter material specifications broken down from the customer roadmaps. This paper presents the SCHOTT roadmap for ZERODUR® material property development. In recent years SCHOTT established a physical model based on structural relaxation to describe the temperature dependence of the coefficient of thermal expansion. The model is successfully applied for the new expansion grade ZERODUR® TAILORED, introduced to the market in 2012. ZERODUR® TAILORED delivers the lowest thermal expansion of ZERODUR® products at microlithography tool application temperature, allowing for higher thermal stability and tighter overlay control in IC production. Data are reported demonstrating the unique CTE homogeneity of ZERODUR® and its very high reproducibility, a necessary precondition for serial production of microlithography equipment components. New data on the bending strength of ZERODUR® prove its capability to withstand much higher mechanical loads than previously reported. Utilizing a three-parameter Weibull distribution it is possible to derive minimum strength values for a given ZERODUR® surface treatment. Consequently, the statistical uncertainties of the earlier approach based on a two-parameter Weibull distribution have been eliminated. Mechanical fatigue due to stress corrosion was included in a straightforward way. The derived formulae allow calculation of the lifetime of ZERODUR® components for a given stress load, or of the allowable maximum stress for a minimum required lifetime.
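The minimum-strength point can be illustrated with the three-parameter Weibull: below the threshold stress the failure probability is exactly zero. The values below are illustrative, not ZERODUR data.

```python
import numpy as np

m, eta, sigma_u = 5.0, 80.0, 50.0    # shape, scale (MPa), threshold (MPa) -- hypothetical

def failure_probability(stress):
    # three-parameter Weibull CDF: zero below the threshold sigma_u
    stress = np.asarray(stress, dtype=float)
    z = np.clip(stress - sigma_u, 0.0, None) / eta
    return 1.0 - np.exp(-z ** m)

for s in (45.0, 50.0, 60.0, 120.0):
    print(f"P(failure at {s:.0f} MPa) = {failure_probability(s):.4f}")
```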
Transmission overhaul estimates for partial and full replacement at repair
NASA Technical Reports Server (NTRS)
Savage, M.; Lewicki, D. G.
1991-01-01
Timely transmission overhauls increase in-flight service reliability beyond the calculated design reliabilities of the individual aircraft transmission components. Although necessary for aircraft safety, transmission overhauls contribute significantly to aircraft expense. Predictions of a transmission's maintenance needs at the design stage should enable the development of more cost-effective and reliable transmissions in the future. The frequency of overhaul is estimated, along with the number of transmissions or components needed to support the overhaul schedule. Two methods based on the two-parameter Weibull statistical distribution for component life are used to estimate the time between transmission overhauls. These methods predict transmission lives for maintenance schedules that repair the transmission either with a complete system replacement or by replacing only the failed components of the transmission. An example illustrates the methods.
Time-dependent strength degradation of a siliconized silicon carbide determined by dynamic fatigue
DOE Office of Scientific and Technical Information (OSTI.GOV)
Breder, K.
1995-10-01
Both fast-fracture strength and strength as a function of stressing rate at room temperature, 1,100, and 1,400 C were measured for a siliconized SiC. The fast-fracture strength increased slightly from 386 MPa at room temperature to 424 MPa at 1,100 C and then dropped to 308 MPa at 1,400 C. The Weibull moduli at room temperature and 1,100 C were 10.8 and 7.8, respectively, whereas, at 1,400 C, the Weibull modulus was 2.8. The very low Weibull modulus at 1,400 C was due to the existence of two exclusive flaw populations with very different characteristic strengths. The data were reanalyzed using two exclusive flaw populations. The ceramic showed no slow crack growth (SCG), as measured by dynamic fatigue at 1,100 C, but, at 1,400 C, an SCG parameter, n, of 15.5 was measured. Fractography showed SCG zones consisting of cracks grown out from silicon-rich areas. Time-to-failure predictions at given levels of failure probability were performed.
Lysyk, T J; Danyk, T
2007-09-01
The effect of temperature on survival, oviposition, gonotrophic development, and a life history factor of vectorial capacity were examined in adult Culicoides sonorensis (Wirth & Jones) (Diptera: Ceratopogonidae) that originated from two geographic locations. Flies originating from the United States (Colorado) had slightly reduced survival after a bloodmeal compared with wild flies collected in southern Alberta (AB), Canada. Survival of AB flies declined in a curvilinear manner with temperature, whereas survival of U.S. flies showed a linear response to temperature. The survivorship curve of the AB flies more closely followed a Weibull distribution than an exponential, indicating survival was age-dependent. Survivorship of the U.S. flies followed an exponential distribution. Females from both sources laid similar numbers of eggs throughout their life. The first eggs were laid by females from both sources at 31.9 degree-days (DD9.3). Dissections of blood-fed flies reared at various temperatures indicated that flies from both sources were 90% gravid at 32 DD9.3. Relationships among temperature and life history components of vectorial capacity were similar among flies from the two sources and indicated that vectorial capacity would be approximately 1.8-2.6-fold greater in a southern U.S. climate compared with southwestern Canada, due solely to the effects of temperature on the life history of C. sonorensis. Using life history estimates derived from the Weibull model had little effect on estimating vectorial capacity, whereas using estimates derived from the exponential model slightly overestimated vectorial capacity.
Fatigue stipulation of bulk-fill composites: An in vitro appraisal.
Vidhawan, Shruti A; Yap, Adrian U; Ornaghi, Barbara P; Banas, Agnieszka; Banas, Krzysztof; Neo, Jennifer C; Pfeifer, Carmem S; Rosa, Vinicius
2015-09-01
The aim of this study was to determine the Weibull and slow crack growth (SCG) parameters of bulk-fill resin-based composites. The strength degradation of the materials over time was also assessed by strength-probability-time (SPT) analysis. Three bulk-fill composites [Tetric EvoCeram Bulk Fill (TBF); X-tra fil (XTR); Filtek Bulk-fill flowable (BFL)] and a conventional one [Filtek Z250 (Z250)] were studied. Seventy-five disk-shaped specimens (12 mm in diameter and 1 mm thick) were prepared by inserting the uncured composites in a stainless steel split mold followed by photoactivation (1200 mW/cm(2)/20 s) and storage in distilled water (37°C/24 h). Degree of conversion was evaluated in five specimens by analysis of FT-IR spectra obtained in the mid-IR region. The SCG parameters n (stress corrosion susceptibility coefficient) and σf0 (scaling parameter) were obtained by testing ten specimens at each of five stress rates: 10(-2), 10(-1), 10(0), 10(1) and 10(2) MPa/s, using a piston-on-three-balls device. The Weibull parameters m (Weibull modulus) and σ0 (characteristic strength) were obtained by testing an additional 20 specimens at 1 MPa/s. Strength-probability-time (SPT) diagrams were constructed by merging the SCG and Weibull parameters. BFL and TBF presented the higher n values (40.1 and 25.5, respectively). Z250 showed the highest (157.02 MPa) and TBF the lowest (110.90 MPa) σf0 value. Weibull analysis showed m (Weibull modulus) values of 9.7, 8.6, 9.7 and 8.9 for TBF, BFL, XTR and Z250, respectively. The SPT diagram for 5% probability of failure showed strength decreases of 18% for BFL, 25% for TBF, 32% for XTR and 36% for Z250 after 5 years as compared to 1 year. The reliability and degradation of strength over time for the bulk-fill resin composites studied are, at least, comparable to conventional composites. BFL shows the highest fatigue resistance under all simulations, followed by TBF, while XTR was on par with Z250. Copyright © 2015 Academy of Dental Materials. Published by Elsevier Ltd. All rights reserved.
Probabilistic Analysis for Comparing Fatigue Data Based on Johnson-Weibull Parameters
NASA Technical Reports Server (NTRS)
Hendricks, Robert C.; Zaretsky, Erwin V.; Vlcek, Brian L.
2007-01-01
Probabilistic failure analysis is essential when analysis of stress-life (S-N) curves is inconclusive in determining the relative ranking of two or more materials. In 1964, L. Johnson published a methodology for establishing the confidence that two populations of data are different. Simplified algebraic equations for confidence numbers were derived based on the original work of L. Johnson. Using the ratios of mean life, the resultant values of confidence numbers deviated less than one percent from those of Johnson. It is possible to rank the fatigue lives of different materials with a reasonable degree of statistical certainty based on combined confidence numbers. These equations were applied to rotating beam fatigue tests that were conducted on three aluminum alloys at three stress levels each. These alloys were AL 2024, AL 6061, and AL 7075. The results were analyzed and compared using ASTM Standard E739-91 and the Johnson-Weibull analysis. The ASTM method did not statistically distinguish between AL 6061 and AL 7075. Based on Johnson-Weibull analysis confidence numbers greater than 99 percent, AL 2024 was found to have the longest fatigue life, followed by AL 7075, and then AL 6061. The ASTM Standard and the Johnson-Weibull analysis result in the same stress-life exponent p for each of the three aluminum alloys at the median, or L(sub 50), lives.
Weissman-Miller, Deborah
2013-11-02
Point estimation is particularly important in predicting weight loss in individuals or small groups. In this analysis, a new health response function is based on a model of human response over time to estimate long-term health outcomes from a change point in short-term linear regression. This estimation capability is addressed for small groups and single-subject designs in pilot studies for clinical trials and in medical and therapeutic clinical practice. These estimations are based on a change point given by parameters derived from short-term participant data in ordinary least squares (OLS) regression. The development of the change point in the initial OLS data and the point estimations are given in a new semiparametric ratio estimator (SPRE) model. The new response function is taken as a ratio of two-parameter Weibull distributions times a prior outcome value that steps estimated outcomes forward in time, where the shape and scale parameters are estimated at the change point. The Weibull distributions used in this ratio are derived from a Kelvin model in mechanics, taken here to represent human beings. A distinct feature of the SPRE model in this article is that the initial treatment response of a small group or a single subject is reflected in the long-term response to treatment. This model is applied to weight loss in obesity in a secondary analysis of data from a classic weight loss study, which was selected due to the dramatic increase in obesity in the United States over the past 20 years. A very small relative error between estimated and test data is shown for obesity treatment with the weight loss medication phentermine or placebo for the test dataset. An application of SPRE in clinical medicine or occupational therapy is to estimate long-term weight loss for a single subject or a small group near the beginning of treatment.
Probabilistic structural analysis of a truss typical for space station
NASA Technical Reports Server (NTRS)
Pai, Shantaram S.
1990-01-01
A three-bay, space, cantilever truss is probabilistically evaluated using the computer code NESSUS (Numerical Evaluation of Stochastic Structures Under Stress) to identify and quantify the uncertainties, and respective sensitivities, associated with corresponding uncertainties in the primitive variables (structural, material, and loads parameters) that define the truss. The distribution of each of these primitive variables is described in terms of one of several available distributions, such as the Weibull, exponential, normal, log-normal, etc. The cumulative distribution functions (CDFs) for the response functions considered, and the sensitivities associated with the primitive variables for a given response, are investigated. These sensitivities help in determining the dominating primitive variables for that response.
Lianjun Zhang; Jeffrey H. Gove; Chuangmin Liu; William B. Leak
2001-01-01
The rotated-sigmoid form is a characteristic of old-growth, uneven-aged forest stands caused by past disturbances such as cutting, fire, disease, and insect attacks. The diameter frequency distribution of the rotated-sigmoid form is bimodal with the second rounded peak in the midsized classes, rather than a smooth, steeply descending, monotonic curve. In this study a...
NASA Astrophysics Data System (ADS)
Iskandar, Ismed; Satria Gondokaryono, Yudi
2016-02-01
In reliability theory, the most important problem is to determine the reliability of a complex system from the reliability of its components. The weakness of most reliability theories is that the systems are described and explained as simply functioning or failed. In many real situations, the failures may be from many causes, depending upon the age and the environment of the system and its components. Another problem in reliability theory is that of estimating the parameters of the assumed failure models. The estimation may be based on data collected over censored or uncensored life tests. In many reliability problems, the failure data are simply quantitatively inadequate, especially in engineering design and maintenance systems. Bayesian analyses are more beneficial than classical ones in such cases. Bayesian estimation analyses allow us to combine past knowledge or experience, in the form of an a priori distribution, with life test data to make inferences about the parameter of interest. In this paper, we have investigated the application of Bayesian estimation analyses to competing-risk systems. The cases are limited to models with independent causes of failure, using the Weibull distribution as our model. A simulation is conducted for this distribution with the objectives of verifying the models and the estimators and investigating the performance of the estimators for varying sample size. The simulation data are analyzed by using Bayesian and maximum likelihood analyses. The simulation results show that a change in the true value of one parameter relative to another changes the value of the standard deviation in the opposite direction. Given perfect information on the prior distribution, the estimation methods of the Bayesian analyses are better than those of the maximum likelihood. The sensitivity analyses show some amount of sensitivity to shifts of the prior locations. They also show the robustness of the Bayesian analysis within the range between the true value and the maximum likelihood estimated value lines.
Richard A. Johnson; James W. Evans; David W. Green
2003-01-01
Ratios of strength properties of lumber are commonly used to calculate property values for standards. Although originally proposed in terms of means, ratios are being applied without regard to position in the distribution. It is now known that lumber strength properties are generally not normally distributed. Therefore, nonparametric methods are often used to derive...
The Topp-Leone generalized Rayleigh cure rate model and its application
NASA Astrophysics Data System (ADS)
Nanthaprut, Pimwarat; Bodhisuwan, Winai; Patummasut, Mena
2017-11-01
The cure rate model is a survival analysis model that accounts for a proportion of censored subjects who are considered cured. In clinical trials, data representing the time to recurrence of an event or death of patients are used to improve the efficiency of treatments. Each dataset can be separated into two groups: censored and uncensored data. In this work, a new mixture cure rate model is introduced based on the Topp-Leone generalized Rayleigh distribution. The Bayesian approach is employed to estimate its parameters. In addition, a breast cancer dataset is analyzed for model illustration purposes. According to the deviance information criterion, the Topp-Leone generalized Rayleigh cure rate model shows a better result than the Weibull and exponential cure rate models.
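The mixture cure rate structure can be sketched as S_pop(t) = π + (1 - π) S_u(t); here a Weibull latency is used for the uncured fraction (one of the comparison models in the abstract), with invented parameter values, rather than the Topp-Leone generalized Rayleigh form itself.

```python
import numpy as np

pi_cure = 0.35                 # assumed proportion of long-term survivors (cured)
shape, scale = 1.4, 24.0       # assumed Weibull latency for uncured patients (months)

def population_survival(t):
    # mixture cure rate model: cured patients never experience the event
    s_uncured = np.exp(-(t / scale) ** shape)
    return pi_cure + (1.0 - pi_cure) * s_uncured

for t in (12, 36, 60, 120):
    print(f"S_pop({t:3d} months) = {population_survival(t):.3f}")
# as t grows, S_pop(t) flattens out at the cure fraction pi_cure
```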
Wan, Xiaomin; Peng, Liubao; Li, Yuanjian
2015-01-01
In general, the individual patient-level data (IPD) collected in clinical trials are not available to independent researchers to conduct economic evaluations; researchers only have access to published survival curves and summary statistics. Thus, methods that use published survival curves and summary statistics to reproduce statistics for economic evaluations are essential. Four methods have been identified: two traditional methods 1) least squares method, 2) graphical method; and two recently proposed methods by 3) Hoyle and Henley, 4) Guyot et al. The four methods were first individually reviewed and subsequently assessed regarding their abilities to estimate mean survival through a simulation study. A number of different scenarios were developed that comprised combinations of various sample sizes, censoring rates and parametric survival distributions. One thousand simulated survival datasets were generated for each scenario, and all methods were applied to actual IPD. The uncertainty in the estimate of mean survival time was also captured. All methods provided accurate estimates of the mean survival time when the sample size was 500 and a Weibull distribution was used. When the sample size was 100 and the Weibull distribution was used, the Guyot et al. method was almost as accurate as the Hoyle and Henley method; however, more biases were identified in the traditional methods. When a lognormal distribution was used, the Guyot et al. method generated noticeably less bias and a more accurate uncertainty compared with the Hoyle and Henley method. The traditional methods should not be preferred because of their remarkable overestimation. When the Weibull distribution was used for a fitted model, the Guyot et al. method was almost as accurate as the Hoyle and Henley method. However, if the lognormal distribution was used, the Guyot et al. method was less biased compared with the Hoyle and Henley method.
Aerospace Applications of Weibull and Monte Carlo Simulation with Importance Sampling
NASA Technical Reports Server (NTRS)
Bavuso, Salvatore J.
1998-01-01
Recent developments in reliability modeling and computer technology have made it practical to use the Weibull time-to-failure distribution to model the system reliability of complex fault-tolerant computer-based systems. These system models are becoming increasingly popular in space systems applications as a result of mounting data that support the decreasing Weibull failure distribution and the expectation of increased system reliability. This presentation introduces the new reliability modeling developments and demonstrates their use in a novel space system application. The application is a proposed guidance, navigation, and control (GN&C) system for use in a long-duration manned spacecraft for a possible Mars mission. Comparisons to the constant failure rate model are presented and the ramifications of doing so are discussed.
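A small numerical sketch of the comparison mentioned in the last sentence: a decreasing-hazard Weibull (shape < 1) versus a constant-failure-rate exponential model matched to the same mean time to failure. The numbers are illustrative, not mission data.

```python
import numpy as np
from math import gamma

beta, eta = 0.7, 50000.0                     # Weibull shape < 1 gives a decreasing hazard
mttf = eta * gamma(1.0 + 1.0 / beta)         # mean of the Weibull distribution
lam = 1.0 / mttf                             # exponential model with the same MTTF

weibull_R = lambda t: np.exp(-(t / eta) ** beta)
exponential_R = lambda t: np.exp(-lam * t)

for t in (1000.0, 10000.0, 100000.0):
    print(f"t = {t:>7,.0f} h: R_weibull = {weibull_R(t):.3f}, "
          f"R_exponential = {exponential_R(t):.3f}")
```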
Effect of ion exchange on strength and slow crack growth of a dental porcelain.
Rosa, Vinicius; Yoshimura, Humberto N; Pinto, Marcelo M; Fredericci, Catia; Cesar, Paulo F
2009-06-01
To determine the effect of ion exchange on slow crack growth (SCG) parameters (n, stress corrosion susceptibility coefficient, and σf0, scaling parameter) and Weibull parameters (m, Weibull modulus, and σ0, characteristic strength) of a dental porcelain. 160 porcelain discs were fabricated according to the manufacturer's instructions, polished through 1 μm and divided into two groups: GC (control) and GI (submitted to an ion exchange procedure using a KNO3 paste at 470 °C for 15 min). SCG parameters were determined by biaxial flexural strength testing in artificial saliva at 37 °C using five constant stress rates (n=10). 20 specimens of each group were tested at 1 MPa/s to determine Weibull parameters. The SPT diagram was constructed using the least-squares fit of the strength data versus probability of failure. Mean values of m and σ0 (95% confidence interval), n and σf0 (standard deviation) were, respectively: 13.8 (10.1-18.8) and 60.4 (58.5-62.2), 24.1 (2.5) and 58.1 (0.01) for GC and 7.4 (5.3-10.0) and 136.8 (129.1-144.7), 36.7 (7.3) and 127.9 (0.01) for GI. Fracture stresses (MPa) calculated using the SPT diagram for lifetimes of 1 day, 1 year and 10 years (at a 5% failure probability) were, respectively, 31.8, 24.9 and 22.7 for GC and 71.2, 60.6 and 56.9 for GI. For the porcelain tested, the ion exchange process improved strength and resistance to SCG; however, the material's reliability decreased. The predicted fracture stress at 5% failure probability for a lifetime of 10 years was also higher for the ion-treated group.
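A least-squares Weibull fit of the kind used for the SPT diagram above can be sketched as follows; the strength values are hypothetical, and only the procedure (median-rank failure probabilities, then a straight-line fit of ln(-ln(1-P)) against ln σ) mirrors the description.

```python
# Sketch: least-squares estimate of Weibull modulus m and characteristic
# strength sigma_0 from a set of fracture strengths (hypothetical values, MPa).
import numpy as np

strengths = np.sort(np.array([48.0, 52.3, 55.1, 57.9, 60.2, 61.8, 63.4, 66.0, 68.7, 72.1]))
n = strengths.size
prob_fail = (np.arange(1, n + 1) - 0.5) / n          # median-rank style failure probabilities

# Linearised Weibull CDF: ln(-ln(1-P)) = m*ln(sigma) - m*ln(sigma_0)
x = np.log(strengths)
y = np.log(-np.log(1.0 - prob_fail))
m, intercept = np.polyfit(x, y, 1)
sigma_0 = np.exp(-intercept / m)
print(f"Weibull modulus m = {m:.1f}, characteristic strength sigma_0 = {sigma_0:.1f} MPa")
```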
A spatial scan statistic for survival data based on Weibull distribution.
Bhatt, Vijaya; Tiwari, Neeraj
2014-05-20
The spatial scan statistic has been developed as a geographical cluster detection analysis tool for different types of data sets such as Bernoulli, Poisson, ordinal, normal and exponential. We propose a scan statistic for survival data based on Weibull distribution. It may also be used for other survival distributions, such as exponential, gamma, and log normal. The proposed method is applied on the survival data of tuberculosis patients for the years 2004-2005 in Nainital district of Uttarakhand, India. Simulation studies reveal that the proposed method performs well for different survival distribution functions. Copyright © 2013 John Wiley & Sons, Ltd.
Reliability demonstration test for load-sharing systems with exponential and Weibull components
Xu, Jianyu; Hu, Qingpei; Yu, Dan; Xie, Min
2017-01-01
Conducting a Reliability Demonstration Test (RDT) is a crucial step in production. Products are tested under certain schemes to demonstrate whether their reliability indices reach pre-specified thresholds. Test schemes for RDT have been studied in different situations, e.g., lifetime testing, degradation testing and accelerated testing. Systems designed with several structures are also investigated in many RDT plans. Despite the availability of a range of test plans for different systems, RDT planning for load-sharing systems hasn’t yet received the attention it deserves. In this paper, we propose a demonstration method for two specific types of load-sharing systems with components subject to two distributions: exponential and Weibull. Based on the assumptions and interpretations made in several previous works on such load-sharing systems, we set the mean time to failure (MTTF) of the total system as the demonstration target. We represent the MTTF as a summation of mean time between successive component failures. Next, we introduce generalized test statistics for both the underlying distributions. Finally, RDT plans for the two types of systems are established on the basis of these test statistics. PMID:29284030
The effect of mis-specification on mean and selection between the Weibull and lognormal models
NASA Astrophysics Data System (ADS)
Jia, Xiang; Nadarajah, Saralees; Guo, Bo
2018-02-01
The lognormal and Weibull models are commonly used to analyse data. Although selection procedures have been extensively studied, it is possible that the lognormal model could be selected when the true model is Weibull or vice versa. As the mean is important in applications, we focus on the effect of mis-specification on mean. The effect on lognormal mean is first considered if the lognormal sample is wrongly fitted by a Weibull model. The maximum likelihood estimate (MLE) and quasi-MLE (QMLE) of lognormal mean are obtained based on lognormal and Weibull models. Then, the impact is evaluated by computing ratio of biases and ratio of mean squared errors (MSEs) between MLE and QMLE. For completeness, the theoretical results are demonstrated by simulation studies. Next, the effect of the reverse mis-specification on Weibull mean is discussed. It is found that the ratio of biases and the ratio of MSEs are independent of the location and scale parameters of the lognormal and Weibull models. The influence could be ignored if some special conditions hold. Finally, a model selection method is proposed by comparing ratios concerning biases and MSEs. We also present a published data to illustrate the study in this paper.
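The kind of comparison described above can be sketched with a small simulation under assumed parameter values: lognormal samples are fitted both by the correct lognormal model (MLE of the mean) and by a mis-specified Weibull model (QMLE of the mean), and the resulting biases and mean squared errors are compared.

```python
# Small simulation under assumed parameters: lognormal data fitted by the correct
# lognormal model (MLE of the mean) and by a mis-specified Weibull model (QMLE).
import numpy as np
from math import gamma
from scipy.stats import weibull_min

rng = np.random.default_rng(0)
mu, sigma, n, reps = 1.0, 0.6, 100, 500
true_mean = np.exp(mu + sigma**2 / 2)

err_mle, err_qmle = [], []
for _ in range(reps):
    x = rng.lognormal(mu, sigma, n)
    m_hat, s_hat = np.log(x).mean(), np.log(x).std()       # lognormal MLEs of mu, sigma
    est_mle = np.exp(m_hat + s_hat**2 / 2)                  # MLE of the lognormal mean
    shape, _, scale = weibull_min.fit(x, floc=0)            # wrongly fitted Weibull (loc fixed at 0)
    est_qmle = scale * gamma(1.0 + 1.0 / shape)             # QMLE of the mean under the Weibull
    err_mle.append(est_mle - true_mean)
    err_qmle.append(est_qmle - true_mean)

err_mle, err_qmle = np.array(err_mle), np.array(err_qmle)
print("bias:  MLE %.4f   QMLE %.4f" % (err_mle.mean(), err_qmle.mean()))
print("MSE:   MLE %.4f   QMLE %.4f" % ((err_mle**2).mean(), (err_qmle**2).mean()))
```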
Reliability, failure probability, and strength of resin-based materials for CAD/CAM restorations
Lim, Kiatlin; Yap, Adrian U-Jin; Agarwalla, Shruti Vidhawan; Tan, Keson Beng-Choon; Rosa, Vinicius
2016-01-01
Objective: This study investigated the Weibull parameters and 5% fracture probability of direct, indirect, and CAD/CAM composites. Material and Methods: Disc-shaped (12 mm diameter x 1 mm thick) specimens were prepared for a direct composite [Z100 (ZO), 3M-ESPE], an indirect laboratory composite [Ceramage (CM), Shofu], and two CAD/CAM composites [Lava Ultimate (LU), 3M ESPE; Vita Enamic (VE), Vita Zahnfabrik] restorations (n=30 for each group). The specimens were polished and stored in distilled water for 24 hours at 37°C. Weibull parameters (m = Weibull modulus, σ0 = characteristic strength) and flexural strength for 5% fracture probability (σ5%) were determined using a piston-on-three-balls device at 1 MPa/s in distilled water. Statistical analyses of biaxial flexural strength were performed either by one-way ANOVA with Tukey's post hoc test (α=0.05) or by Pearson's correlation test. Results: Ranking of m was: VE (19.5), LU (14.5), CM (11.7), and ZO (9.6). Ranking of σ0 (MPa) was: LU (218.1), ZO (210.4), CM (209.0), and VE (126.5). σ5% (MPa) was 177.9 for LU, 163.2 for CM, 154.7 for ZO, and 108.7 for VE. There was no significant difference in m for ZO, CM, and LU. VE presented the highest m value, significantly higher than ZO. For σ0 and σ5%, ZO, CM, and LU were similar but higher than VE. Conclusion: The strength characteristics of CAD/CAM composites vary according to their composition and microstructure. VE presented the lowest strength and highest Weibull modulus among the materials. PMID:27812614
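The σ5% values follow directly from the two-parameter Weibull CDF; the snippet below evaluates σF = σ0 (-ln(1-F))^(1/m) at F = 0.05 for the LU parameters reported above as a quick consistency check.

```python
# Minimal check: strength at 5% fracture probability from the two-parameter
# Weibull CDF, sigma_F = sigma_0 * (-ln(1 - F))**(1/m), using the LU values above.
from math import log

def strength_at_probability(m, sigma_0, F=0.05):
    return sigma_0 * (-log(1.0 - F)) ** (1.0 / m)

print(strength_at_probability(m=14.5, sigma_0=218.1))   # ~178 MPa, close to the reported 177.9
```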
Kaur, A; Takhar, P S; Smith, D M; Mann, J E; Brashears, M M
2008-10-01
A fractional differential equations (FDEs)-based theory involving 1- and 2-term equations was developed to predict the nonlinear survival and growth curves of foodborne pathogens. It is interesting to note that the solution of 1-term FDE leads to the Weibull model. Nonlinear regression (Gauss-Newton method) was performed to calculate the parameters of the 1-term and 2-term FDEs. The experimental inactivation data of Salmonella cocktail in ground turkey breast, ground turkey thigh, and pork shoulder; and cocktail of Salmonella, E. coli, and Listeria monocytogenes in ground beef exposed at isothermal cooking conditions of 50 to 66 degrees C were used for validation. To evaluate the performance of 2-term FDE in predicting the growth curves-growth of Salmonella typhimurium, Salmonella Enteritidis, and background flora in ground pork and boneless pork chops; and E. coli O157:H7 in ground beef in the temperature range of 22.2 to 4.4 degrees C were chosen. A program was written in Matlab to predict the model parameters and survival and growth curves. Two-term FDE was more successful in describing the complex shapes of microbial survival and growth curves as compared to the linear and Weibull models. Predicted curves of 2-term FDE had higher magnitudes of R(2) (0.89 to 0.99) and lower magnitudes of root mean square error (0.0182 to 0.5461) for all experimental cases in comparison to the linear and Weibull models. This model was capable of predicting the tails in survival curves, which was not possible using Weibull and linear models. The developed model can be used for other foodborne pathogens in a variety of food products to study the destruction and growth behavior.
HPC simulations of grain-scale spallation to improve thermal spallation drilling
NASA Astrophysics Data System (ADS)
Walsh, S. D.; Lomov, I.; Wideman, T. W.; Potter, J.
2012-12-01
Thermal spallation drilling and related hard-rock hole opening techniques are transformative technologies with the potential to dramatically reduce the costs associated with EGS well drilling and improve the productivity of new and existing wells. In contrast to conventional drilling methods that employ mechanical means to penetrate rock, thermal spallation methods fragment rock into small pieces ("spalls") without contact via the rapid transmission of heat to the rock surface. State-of-the-art constitutive models of thermal spallation employ Weibull statistical failure theory to represent the relationship between rock heterogeneity and its propensity to produce spalls when heat is applied to the rock surface. These models have been successfully used to predict such factors as penetration rate, spall-size distribution and borehole radius from drilling jet velocity and applied heat flux. A properly calibrated Weibull model would permit design optimization of thermal spallation drilling under geothermal field conditions. However, although useful for predicting system response in a given context, Weibull models are by their nature empirically derived. In the past, the parameters used in these models were carefully determined from laboratory tests, and thus model applicability was limited by experimental scope. This becomes problematic, for example, if simulating spall production at depths relevant for geothermal energy production, or modeling thermal spallation drilling in new rock types. Nevertheless, with sufficient computational resources, Weibull models could be validated in the absence of experimental data by explicit small-scale simulations that fully resolve rock grains. This presentation will discuss how high-fidelity simulations can be used to inform Weibull models of thermal spallation, and what these simulations reveal about the processes driving spallation at the grain-scale - in particular, the role that inter-grain boundaries and micro-pores play in the onset and extent of spallation. This work performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344.
Hierarchical statistical modeling of xylem vulnerability to cavitation.
Ogle, Kiona; Barber, Jarrett J; Willson, Cynthia; Thompson, Brenda
2009-01-01
Cavitation of xylem elements diminishes the water transport capacity of plants, and quantifying xylem vulnerability to cavitation is important to understanding plant function. Current approaches to analyzing hydraulic conductivity (K) data to infer vulnerability to cavitation suffer from problems such as the use of potentially unrealistic vulnerability curves, difficulty interpreting parameters in these curves, a statistical framework that ignores sampling design, and an overly simplistic view of uncertainty. This study illustrates how two common curves (exponential-sigmoid and Weibull) can be reparameterized in terms of meaningful parameters: maximum conductivity (k(sat)), water potential (-P) at which percentage loss of conductivity (PLC) =X% (P(X)), and the slope of the PLC curve at P(X) (S(X)), a 'sensitivity' index. We provide a hierarchical Bayesian method for fitting the reparameterized curves to K(H) data. We illustrate the method using data for roots and stems of two populations of Juniperus scopulorum and test for differences in k(sat), P(X), and S(X) between different groups. Two important results emerge from this study. First, the Weibull model is preferred because it produces biologically realistic estimates of PLC near P = 0 MPa. Second, stochastic embolisms contribute an important source of uncertainty that should be included in such analyses.
Development of Testing Methodologies for the Mechanical Properties of MEMS
NASA Technical Reports Server (NTRS)
Ekwaro-Osire, Stephen
2003-01-01
This effort is to investigate and design testing strategies to determine the mechanical properties of MicroElectroMechanical Systems (MEMS) as well as investigate the development of a MEMS Probabilistic Design Methodology (PDM). One item of potential interest is the design of a test for the Weibull size effect in pressure membranes. The Weibull size effect is a consequence of a stochastic strength response predicted from the Weibull distribution. Confirming that MEMS strength is controlled by the Weibull distribution will enable the development of a probabilistic design methodology for MEMS - similar to the GRC developed CARES/Life program for bulk ceramics. However, the primary area of investigation will most likely be analysis and modeling of material interfaces for strength as well as developing a strategy to handle stress singularities at sharp corners, filets, and material interfaces. This will be a continuation of the previous years work. The ultimate objective of this effort is to further develop and verify the ability of the Ceramics Analysis and Reliability Evaluation of Structures Life (CARES/Life) code to predict the time-dependent reliability of MEMS structures subjected to multiple transient loads.
Certification Testing Methodology for Composite Structure. Volume 2. Methodology Development
1986-10-01
parameter, sample size and fatigue test duration. The required inputs are: 1. Residual strength Weibull shape parameter (ALPR); 2. Fatigue life Weibull shape parameter (ALPL); 3. Sample size (N); 4. Test duration (T). (The remainder of this record is a fragment of the FORTRAN input routine that prompts for these values.)
Rice, Stephen B; Chan, Christopher; Brown, Scott C; Eschbach, Peter; Han, Li; Ensor, David S; Stefaniak, Aleksandr B; Bonevich, John; Vladár, András E; Hight Walker, Angela R; Zheng, Jiwen; Starnes, Catherine; Stromberg, Arnold; Ye, Jia; Grulke, Eric A
2015-01-01
This paper reports an interlaboratory comparison that evaluated a protocol for measuring and analysing the particle size distribution of discrete, metallic, spheroidal nanoparticles using transmission electron microscopy (TEM). The study was focused on automated image capture and automated particle analysis. NIST RM8012 gold nanoparticles (30 nm nominal diameter) were measured for area-equivalent diameter distributions by eight laboratories. Statistical analysis was used to (1) assess the data quality without using size distribution reference models, (2) determine reference model parameters for different size distribution reference models and non-linear regression fitting methods and (3) assess the measurement uncertainty of a size distribution parameter by using its coefficient of variation. The interlaboratory area-equivalent diameter mean, 27.6 nm ± 2.4 nm (computed based on a normal distribution), was quite similar to the area-equivalent diameter, 27.6 nm, assigned to NIST RM8012. The lognormal reference model was the preferred choice for these particle size distributions as, for all laboratories, its parameters had lower relative standard errors (RSEs) than the other size distribution reference models tested (normal, Weibull and Rosin–Rammler–Bennett). The RSEs for the fitted standard deviations were two orders of magnitude higher than those for the fitted means, suggesting that most of the parameter estimate errors were associated with estimating the breadth of the distributions. The coefficients of variation for the interlaboratory statistics also confirmed the lognormal reference model as the preferred choice. From quasi-linear plots, the typical range for good fits between the model and cumulative number-based distributions was 1.9 fitted standard deviations less than the mean to 2.3 fitted standard deviations above the mean. Automated image capture, automated particle analysis and statistical evaluation of the data and fitting coefficients provide a framework for assessing nanoparticle size distributions using TEM for image acquisition. PMID:26361398
A Monte Carlo Risk Analysis of Life Cycle Cost Prediction.
1975-09-01
process which occurs with each FLU failure. With this in mind there is no alternative other than the binomial distribution. With all of ... Weibull distribution of failures as selected by user. For each failure of the ith FLU, the model then samples from the binomial distribution to determine ... which is sampled from the binomial. Neither of the two conditions for normality is met, i.e., that RTS be close to 0.5 and the number of samples close
NASA Astrophysics Data System (ADS)
Rikitake, T.
1999-03-01
In light of newly-acquired geophysical information about earthquake generation in the Tokai area, Central Japan, where occurrence of a great earthquake of magnitude 8 or so has recently been feared, probabilities of earthquake occurrence in the near future are reevaluated. Much of the data used for evaluation here relies on recently-developed paleoseismology, tsunami study and GPS geodesy. The new Weibull distribution analysis of recurrence tendency of great earthquakes in the Tokai-Nankai zone indicates that the mean return period of great earthquakes there is estimated as 109 yr with a standard deviation amounting to 33 yr. These values do not differ much from those of previous studies (Rikitake, 1976, 1986; Utsu, 1984). Taking the newly-determined velocities of the motion of Philippine Sea plate at various portions of the Tokai-Nankai zone into account, the ultimate displacements to rupture at the plate boundary are obtained. A Weibull distribution analysis results in the mean ultimate displacement amounting to 4.70 m with a standard deviation estimated as 0.86 m. A return period amounting to 117 yr is obtained at the Suruga Bay portion by dividing the mean ultimate displacement by the relative plate velocity. With the aid of the fault models as determined from the tsunami studies, the increases in the cumulative seismic slips associated with the great earthquakes are examined at various portions of the zone. It appears that a slip-predictable model can better be applied to the occurrence mode of great earthquakes in the zone than a time-predictable model. The crustal strain accumulating over the Tokai area as estimated from the newly-developed geodetic work including the GPS observations is compared to the ultimate strain presumed by the above two models. The probabilities for a great earthquake to recur in the Tokai district are then estimated with the aid of the Weibull analysis parameters obtained for the four cases discussed in the above. All the probabilities evaluated for the four cases take on values ranging 35-45 percent for a ten-year period following the year 2000.
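A sketch of the conditional-probability calculation behind such estimates is given below: the reported mean return period (109 yr) and standard deviation (33 yr) are converted numerically to a Weibull shape and scale, and the probability of an event within the next ten years is computed given an assumed elapsed time since the last great earthquake. The elapsed time of 146 yr (counting from the 1854 event to 2000) is an illustrative assumption, not a value quoted in the abstract.

```python
# Sketch: conditional probability of recurrence within the next 10 years given an
# elapsed time, for a Weibull recurrence model with mean 109 yr and SD 33 yr.
import numpy as np
from math import gamma
from scipy.optimize import brentq

mean_rp, sd_rp = 109.0, 33.0

def cv_of_shape(k):                      # coefficient of variation of a Weibull with shape k
    return np.sqrt(gamma(1 + 2/k) / gamma(1 + 1/k)**2 - 1.0)

shape = brentq(lambda k: cv_of_shape(k) - sd_rp / mean_rp, 0.5, 20.0)
scale = mean_rp / gamma(1 + 1/shape)

def survival(t):
    return np.exp(-(t / scale) ** shape)

elapsed, window = 146.0, 10.0            # assumed: years elapsed by 2000, forecast window
p = 1.0 - survival(elapsed + window) / survival(elapsed)
print(f"shape={shape:.2f}, scale={scale:.1f} yr, P(event in next {window:.0f} yr) = {p:.2f}")
```

With these inputs the conditional probability comes out near 0.4, consistent with the 35-45 percent range reported above.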
Does artificial aging affect mechanical properties of CAD/CAM composite materials?
Egilmez, Ferhan; Ergun, Gulfem; Cekic-Nagas, Isil; Vallittu, Pekka K; Lassila, Lippo V J
2018-01-01
The purpose of this study was to determine the flexural strength and Weibull characteristics of different CAD/CAM materials after different in vitro aging conditions. The specimens were randomly assigned to one of six in vitro aging conditions: (1) water storage (37°C, 3 weeks), (2) boiling water (24 h), (3) hydrochloric acid exposure (pH 1.2, 24 h), (4) autoclave treatment (134°C, 200 kPa, 12 h), (5) thermal cycling (5000 times, 5-55°C), (6) cyclic loading (100 N, 50,000 cycles). No treatment was applied to the specimens in the control group. A three-point bending test was used for the calculation of flexural strength. The reliability of the strength was assessed by Weibull distribution. Surface roughness and topography were examined by coherence scanning interferometry. Evaluated parameters were compared using the Kruskal-Wallis or Mann-Whitney U test. Water storage, autoclave treatment and thermal cycling significantly decreased the flexural strength of all materials (p<0.05), whereas HCl exposure or cyclic loading did not affect the properties (p>0.05). Weibull moduli of Cerasmart™ and Lava™ Ultimate were similar to the control. Vita Enamic® exhibited similar Weibull moduli in all aging groups except the HCl-treated group (p>0.05). Ra values of Cerasmart™ and Lava™ Ultimate were in the range of 0.053-0.088 μm in the aged groups. However, Ra results of Vita Enamic® were larger than 0.2 μm. Flexural strength of the newly developed restorative CAD/CAM materials was significantly decreased by artificial aging. Cyclic loading or HCl exposure does not affect the flexural strength and structural reliability of Cerasmart™ and Lava™ Ultimate. Copyright © 2017 Japan Prosthodontic Society. Published by Elsevier Ltd. All rights reserved.
Design prediction for long term stress rupture service of composite pressure vessels
NASA Technical Reports Server (NTRS)
Robinson, Ernest Y.
1992-01-01
Extensive stress rupture studies on glass composites and Kevlar composites were conducted by the Lawrence Radiation Laboratory beginning in the late 1960's and extending to about 8 years in some cases. Some of the data from these studies published over the years were incomplete or were tainted by spurious failures, such as grip slippage. Updated data sets were defined for both fiberglass and Kevlar composite strand test specimens. These updated data are analyzed in this report by a convenient form of the bivariate Weibull distribution, to establish a consistent set of design prediction charts that may be used as a conservative basis for predicting the stress rupture life of composite pressure vessels. The updated glass composite data exhibit an invariant Weibull modulus with lifetime. The data are analyzed in terms of homologous service load (referenced to the observed median strength). The equations relating life, homologous load, and probability are given, and corresponding design prediction charts are presented. A similar approach is taken for Kevlar composites, where the updated strand data do show a turndown tendency at long life accompanied by a corresponding change (increase) of the Weibull modulus. The turndown characteristic is not present in stress rupture test data of Kevlar pressure vessels. A modification of the stress rupture equations is presented to incorporate a latent, but limited, strength drop, and design prediction charts are presented that incorporate such behavior. The methods presented utilize Cartesian plots of the probability distributions (which are a more natural display for the design engineer), based on median normalized data that are independent of statistical parameters and are readily defined for any set of test data.
Determination of Turboprop Reduction Gearbox System Fatigue Life and Reliability
NASA Technical Reports Server (NTRS)
Zaretsky, Erwin V.; Lewicki, David G.; Savage, Michael; Vlcek, Brian L.
2007-01-01
Two computational models to determine the fatigue life and reliability of a commercial turboprop gearbox are compared with each other and with field data. These models are (1) Monte Carlo simulation of randomly selected lives of individual bearings and gears comprising the system and (2) two-parameter Weibull distribution function for bearings and gears comprising the system using strict-series system reliability to combine the calculated individual component lives in the gearbox. The Monte Carlo simulation included the virtual testing of 744,450 gearboxes. Two sets of field data were obtained from 64 gearboxes that were first-run to removal for cause, were refurbished and placed back in service, and then were second-run until removal for cause. A series of equations were empirically developed from the Monte Carlo simulation to determine the statistical variation in predicted life and Weibull slope as a function of the number of gearboxes failed. The resultant L(sub 10) life from the field data was 5,627 hr. From strict-series system reliability, the predicted L(sub 10) life was 774 hr. From the Monte Carlo simulation, the median value for the L(sub 10) gearbox lives equaled 757 hr. Half of the gearbox L(sub 10) lives will be less than this value and the other half more. The resultant L(sub 10) life of the second-run (refurbished) gearboxes was 1,334 hr. The apparent load-life exponent p for the roller bearings is 5.2. Were the bearing lives to be recalculated with a load-life exponent p equal to 5.2, the predicted L(sub 10) life of the gearbox would be equal to the actual life obtained in the field. The component failure distribution of the gearbox from the Monte Carlo simulation was nearly identical to that using the strict-series system reliability analysis, proving the compatibility of these methods.
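The strict-series combination used above can be sketched as follows: each component's two-parameter Weibull life is converted from its L10 and Weibull slope to a characteristic life, component reliabilities are multiplied, and the system L10 is found numerically. The component values are hypothetical, not the gearbox data from the study.

```python
# Sketch: strict-series combination of component two-parameter Weibull lives and
# numerical solution for the system L10 life (hypothetical component values).
import numpy as np
from scipy.optimize import brentq

components = [          # (L10 life in hours, Weibull slope beta)
    (3000.0, 1.5),      # bearing A
    (4500.0, 1.2),      # bearing B
    (9000.0, 2.5),      # gear mesh
]

def system_reliability(t):
    r = 1.0
    for l10, beta in components:
        eta = l10 / (-np.log(0.9)) ** (1.0 / beta)   # characteristic life from the L10 life
        r *= np.exp(-(t / eta) ** beta)              # strict series: reliabilities multiply
    return r

system_l10 = brentq(lambda t: system_reliability(t) - 0.9, 1.0, 20000.0)
print(f"system L10 ~ {system_l10:.0f} h (always below the shortest-lived component's L10)")
```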
Cheung, Li C; Pan, Qing; Hyun, Noorie; Schiffman, Mark; Fetterman, Barbara; Castle, Philip E; Lorey, Thomas; Katki, Hormuzd A
2017-09-30
For cost-effectiveness and efficiency, many large-scale general-purpose cohort studies are being assembled within large health-care providers who use electronic health records. Two key features of such data are that incident disease is interval-censored between irregular visits and there can be pre-existing (prevalent) disease. Because prevalent disease is not always immediately diagnosed, some disease diagnosed at later visits is actually undiagnosed prevalent disease. We consider prevalent disease as a point mass at time zero for clinical applications where there is no interest in time of prevalent disease onset. We demonstrate that the naive Kaplan-Meier cumulative risk estimator underestimates risks at early time points and overestimates later risks. We propose a general family of mixture models for undiagnosed prevalent disease and interval-censored incident disease that we call prevalence-incidence models. Parameters for parametric prevalence-incidence models, such as the logistic regression and Weibull survival (logistic-Weibull) model, are estimated by direct likelihood maximization or by EM algorithm. Non-parametric methods are proposed to calculate cumulative risks for cases without covariates. We compare naive Kaplan-Meier, logistic-Weibull, and non-parametric estimates of cumulative risk in the cervical cancer screening program at Kaiser Permanente Northern California. Kaplan-Meier provided poor estimates while the logistic-Weibull model was a close fit to the non-parametric. Our findings support our use of logistic-Weibull models to develop the risk estimates that underlie current US risk-based cervical cancer screening guidelines. Published 2017. This article has been contributed to by US Government employees and their work is in the public domain in the USA.
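The cumulative-risk curve implied by such a prevalence-incidence mixture has a simple closed form when the incident part is Weibull; the sketch below uses hypothetical parameter values, not the Kaiser Permanente estimates.

```python
# Sketch of the cumulative-risk curve implied by a prevalence-incidence mixture:
# a point mass p of (initially undiagnosed) prevalent disease at time zero plus
# Weibull-distributed incident disease. Parameter values are hypothetical.
import numpy as np

def cumulative_risk(t, prevalence, shape, scale):
    incident = 1.0 - np.exp(-(np.asarray(t) / scale) ** shape)   # Weibull CDF for incident disease
    return prevalence + (1.0 - prevalence) * incident

years = np.array([1, 3, 5, 10])
print(cumulative_risk(years, prevalence=0.02, shape=1.3, scale=25.0))
```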
Automatic Threshold Detector Techniques
1976-07-15
"Averaging CFAR in Non-Stationary Weibull Clutter," L. Novak (1974 IEEE Symposium on Information Theory). 8. "The Weibull Distribution Applied to the ..." (The remainder of this record is a garbled FORTRAN listing fragment.)
Structural reliability analysis of laminated CMC components
NASA Technical Reports Server (NTRS)
Duffy, Stephen F.; Palko, Joseph L.; Gyekenyesi, John P.
1991-01-01
For laminated ceramic matrix composite (CMC) materials to realize their full potential in aerospace applications, design methods and protocols are a necessity. This work focuses on the time-independent failure response of these materials and presents a reliability analysis associated with the initiation of matrix cracking. A public-domain computer algorithm is highlighted that was coupled with the laminate analysis of a finite element code and that serves as a design aid to analyze structural components made from laminated CMC materials. Issues relevant to the effect of component size are discussed, and a parameter estimation procedure is presented. The estimation procedure allows three parameters to be calculated from a failure population that has an underlying Weibull distribution.
Mechanical Testing of Silicon Carbide on MISSE-7
2012-07-15
Silicon carbide (SiC) mechanical test specimens were included on the second Optical and Reflector Materials Experiment (ORMatE II) ... Vendor 2 EFS Weibull results were normalized to the Extra Disks Weibull parameters. (Aerospace Report No. ATR-2012(8921)-5; David B. Witkin, Space Materials Laboratory.)
Kosmidis, Kosmas; Argyrakis, Panos; Macheras, Panos
2003-07-01
To verify the Higuchi law and study the drug release from cylindrical and spherical matrices by means of Monte Carlo computer simulation. A one-dimensional matrix, based on the theoretical assumptions of the derivation of the Higuchi law, was simulated and its time evolution was monitored. Cylindrical and spherical three-dimensional lattices were simulated with sites at the boundary of the lattice having been denoted as leak sites. Particles were allowed to move inside it using the random walk model. Excluded volume interactions between the particles was assumed. We have monitored the system time evolution for different lattice sizes and different initial particle concentrations. The Higuchi law was verified using the Monte Carlo technique in a one-dimensional lattice. It was found that Fickian drug release from cylindrical matrices can be approximated nicely with the Weibull function. A simple linear relation between the Weibull function parameters and the specific surface of the system was found. Drug release from a matrix, as a result of a diffusion process assuming excluded volume interactions between the drug molecules, can be described using a Weibull function. This model, although approximate and semiempirical, has the benefit of providing a simple physical connection between the model parameters and the system geometry, which was something missing from other semiempirical models.
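A minimal illustration of the semiempirical description above is to fit the Weibull release function M(t)/M∞ = 1 - exp(-a t^b) to fractional-release data; the data points below are made up for the example.

```python
# Minimal illustration: fit the Weibull release function
#   M(t)/M_inf = 1 - exp(-a * t**b)
# to hypothetical fractional-release data from a matrix system.
import numpy as np
from scipy.optimize import curve_fit

def weibull_release(t, a, b):
    return 1.0 - np.exp(-a * t**b)

t = np.array([0.5, 1, 2, 4, 8, 16, 24], dtype=float)              # hours
released = np.array([0.10, 0.17, 0.30, 0.47, 0.68, 0.88, 0.94])   # fraction released (made up)

(a, b), _ = curve_fit(weibull_release, t, released, p0=[0.1, 1.0])
print(f"a = {a:.3f}, b = {b:.2f}")   # b below 1 is often reported for diffusion-controlled release
```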
The Age Specific Incidence Anomaly Suggests that Cancers Originate During Development
NASA Astrophysics Data System (ADS)
Brody, James P.
The accumulation of genetic alterations causes cancers. Since this accumulation takes time, the incidence of most cancers is thought to increase exponentially with age. However, careful measurements of the age-specific incidence show that the specific incidence for many forms of cancer rises with age to a maximum, and then decreases. This decrease in the age-specific incidence with age is an anomaly. Understanding this anomaly should lead to a better understanding of how tumors develop and grow. Here we derive the shape of the age-specific incidence, showing that it should follow the shape of a Weibull distribution. Measurements indicate that the age-specific incidence for colon cancer does indeed follow a Weibull distribution. This analysis leads to the interpretation that for colon cancer two subpopulations exist in the general population: a susceptible population and an immune population. Colon tumors will only occur in the susceptible population. This analysis is consistent with the developmental origins of disease hypothesis and generalizable to many other common forms of cancer.
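A rough sketch of this interpretation: if only a susceptible fraction p of the population can develop the tumor and onset times among susceptibles follow a Weibull distribution, the age-specific incidence approximately tracks p times the Weibull density, rising to a maximum and then falling. The parameter values below are illustrative assumptions, not fitted colon-cancer values.

```python
# Rough sketch (illustrative parameters): a susceptible fraction p with Weibull
# onset times gives an age-specific incidence that rises and then falls with age.
import numpy as np
from scipy.stats import weibull_min

p, shape, scale = 0.05, 5.0, 70.0                 # susceptible fraction, Weibull shape, scale (years)
ages = np.arange(30, 101, 10)
incidence = p * weibull_min.pdf(ages, shape, scale=scale)   # approximate population incidence rate
for age, rate in zip(ages, incidence):
    print(age, round(rate * 1e5, 1), "per 100,000 person-years (approx.)")
```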
NASA Technical Reports Server (NTRS)
Holland, Frederic A., Jr.
2004-01-01
Modern engineering design practices are tending more toward the treatment of design parameters as random variables as opposed to fixed, or deterministic, values. The probabilistic design approach attempts to account for the uncertainty in design parameters by representing them as a distribution of values rather than as a single value. The motivations for this effort include preventing excessive overdesign as well as assessing and assuring reliability, both of which are important for aerospace applications. However, the determination of the probability distribution is a fundamental problem in reliability analysis. A random variable is often defined by the parameters of the theoretical distribution function that gives the best fit to experimental data. In many cases the distribution must be assumed from very limited information or data. Often the types of information that are available or reasonably estimated are the minimum, maximum, and most likely values of the design parameter. For these situations the beta distribution model is very convenient because the parameters that define the distribution can be easily determined from these three pieces of information. Widely used in the field of operations research, the beta model is very flexible and is also useful for estimating the mean and standard deviation of a random variable given only the aforementioned three values. However, an assumption is required to determine the four parameters of the beta distribution from only these three pieces of information (some of the more common distributions, like the normal, lognormal, gamma, and Weibull distributions, have two or three parameters). The conventional method assumes that the standard deviation is a certain fraction of the range. The beta parameters are then determined by solving a set of equations simultaneously. A new method developed in-house at the NASA Glenn Research Center assumes a value for one of the beta shape parameters based on an analogy with the normal distribution (ref.1). This new approach allows for a very simple and direct algebraic solution without restricting the standard deviation. The beta parameters obtained by the new method are comparable to the conventional method (and identical when the distribution is symmetrical). However, the proposed method generally produces a less peaked distribution with a slightly larger standard deviation (up to 7 percent) than the conventional method in cases where the distribution is asymmetric or skewed. The beta distribution model has now been implemented into the Fast Probability Integration (FPI) module used in the NESSUS computer code for probabilistic analyses of structures (ref. 2).
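A sketch of the conventional three-point fit described above follows; the PERT-style mean formula (min + 4·mode + max)/6 and the choice of one-sixth of the range for the standard deviation are stated assumptions, and the shape parameters then follow from the method of moments on the standardized interval.

```python
# Sketch of the conventional three-point beta fit described above (assumptions:
# mean = (min + 4*mode + max)/6 and standard deviation = range/6, PERT-style).
def beta_from_three_points(lo, mode, hi):
    mean = (lo + 4.0 * mode + hi) / 6.0           # assumed three-point mean
    sd = (hi - lo) / 6.0                          # assumed: sigma is one-sixth of the range
    m = (mean - lo) / (hi - lo)                   # standardised mean on [0, 1]
    v = (sd / (hi - lo)) ** 2                     # standardised variance
    common = m * (1.0 - m) / v - 1.0              # method-of-moments factor
    alpha, beta = m * common, (1.0 - m) * common  # beta shape parameters
    return alpha, beta, mean, sd

print(beta_from_three_points(lo=10.0, mode=14.0, hi=22.0))
```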
Arreyndip, Nkongho Ayuketang; Joseph, Ebobenow; David, Afungchui
2016-11-01
For the future installation of a wind farm in Cameroon, the wind energy potentials of three of Cameroon's coastal cities (Kribi, Douala and Limbe) are assessed using NASA average monthly wind data for 31 years (1983-2013) and compared through Weibull statistics. The Weibull parameters are estimated by the method of maximum likelihood; the mean power densities, the maximum-energy-carrying wind speeds and the most probable wind speeds are also calculated and compared across the three cities. Finally, the cumulative wind speed distributions over the wet and dry seasons are also analyzed. The results show that the shape and scale parameters for Kribi, Douala and Limbe are 2.9 and 2.8, 3.9 and 1.8, and 3.08 and 2.58, respectively. The mean power densities through Weibull analysis for Kribi, Douala and Limbe are 33.7 W/m2, 8.0 W/m2 and 25.42 W/m2, respectively. Kribi's most probable wind speed and maximum-energy-carrying wind speed were found to be 2.42 m/s and 3.35 m/s, versus 2.27 m/s and 3.03 m/s for Limbe and 1.67 m/s and 2.0 m/s for Douala, respectively. Analysis of the wind speed and hence power distribution over the wet and dry seasons shows that in the wet season, August is the windiest month for Douala and Limbe while September is the windiest month for Kribi, whereas in the dry season, March is the windiest month for Douala and Limbe while February is the windiest month for Kribi. In terms of mean power density, most probable wind speed and wind speed carrying maximum energy, Kribi proves to be the best site for the installation of a wind farm. Generally, the wind speeds at all three locations are quite low; the average wind speeds at all three studied locations fall below 4.0 m/s, which is far below the cut-in wind speed of many modern wind turbines. We therefore recommend the use of low cut-in-speed wind turbines, such as the Savonius, for stand-alone low-energy needs.
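The statistics quoted above (Weibull fit, mean power density, most probable wind speed, maximum-energy wind speed) can be sketched as follows on a stand-in sample; the air density and the synthetic wind-speed data are assumptions, not the NASA record used in the study.

```python
# Sketch of the wind-resource statistics described above, applied to a synthetic
# sample: maximum-likelihood Weibull fit, mean power density, most probable wind
# speed and the wind speed carrying maximum energy.
import numpy as np
from math import gamma
from scipy.stats import weibull_min

rho = 1.225                                     # air density, kg/m^3 (assumed)
rng = np.random.default_rng(1)
v = rng.weibull(2.9, 5000) * 2.8                # stand-in sample, not the NASA data

k, _, c = weibull_min.fit(v, floc=0)            # shape k and scale c by maximum likelihood
power_density = 0.5 * rho * c**3 * gamma(1 + 3/k)           # mean power density, W/m^2
v_most_probable = c * ((k - 1) / k) ** (1 / k)              # mode of the Weibull pdf (k > 1)
v_max_energy = c * ((k + 2) / k) ** (1 / k)                 # speed carrying maximum energy
print(f"k={k:.2f}, c={c:.2f} m/s, P={power_density:.1f} W/m2, "
      f"v_mp={v_most_probable:.2f} m/s, v_maxE={v_max_energy:.2f} m/s")
```

With shape near 2.9 and scale near 2.8 m/s, the mode and maximum-energy speeds evaluate to roughly 2.4 m/s and 3.4 m/s, close to the values reported for Kribi.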
CARES - CERAMICS ANALYSIS AND RELIABILITY EVALUATION OF STRUCTURES
NASA Technical Reports Server (NTRS)
Nemeth, N. N.
1994-01-01
The beneficial properties of structural ceramics include their high-temperature strength, light weight, hardness, and corrosion and oxidation resistance. For advanced heat engines, ceramics have demonstrated functional abilities at temperatures well beyond the operational limits of metals. This is offset by the fact that ceramic materials tend to be brittle. When a load is applied, their lack of significant plastic deformation causes the material to crack at microscopic flaws, destroying the component. CARES calculates the fast-fracture reliability or failure probability of macroscopically isotropic ceramic components. These components may be subjected to complex thermomechanical loadings. The program uses results from a commercial structural analysis program (MSC/NASTRAN or ANSYS) to evaluate component reliability due to inherent surface and/or volume type flaws. A multiple material capability allows the finite element model reliability to be a function of many different ceramic material statistical characterizations. The reliability analysis uses element stress, temperature, area, and volume output, which are obtained from two dimensional shell and three dimensional solid isoparametric or axisymmetric finite elements. CARES utilizes the Batdorf model and the two-parameter Weibull cumulative distribution function to describe the effects of multi-axial stress states on material strength. The shear-sensitive Batdorf model requires a user-selected flaw geometry and a mixed-mode fracture criterion. Flaws intersecting the surface and imperfections embedded in the volume can be modeled. The total strain energy release rate theory is used as a mixed mode fracture criterion for co-planar crack extension. Out-of-plane crack extension criteria are approximated by a simple equation with a semi-empirical constant that can model the maximum tangential stress theory, the minimum strain energy density criterion, the maximum strain energy release rate theory, or experimental results. For comparison, Griffith's maximum tensile stress theory, the principle of independent action, and the Weibull normal stress averaging models are also included. Weibull material strength parameters, the Batdorf crack density coefficient, and other related statistical quantities are estimated from four-point bend bar or uniform uniaxial tensile specimen fracture strength data. Parameter estimation can be performed for single or multiple failure modes by using the least-squares analysis or the maximum likelihood method. A more limited program, CARES/PC (COSMIC number LEW-15248) runs on a personal computer and estimates ceramic material properties from three-point bend bar data. CARES/PC does not perform fast fracture reliability estimation. CARES is written in FORTRAN 77 and has been implemented on DEC VAX series computers under VMS and on IBM 370 series computers under VM/CMS. On a VAX, CARES requires 10Mb of main memory. Five MSC/NASTRAN example problems and two ANSYS example problems are provided. There are two versions of CARES supplied on the distribution tape, CARES1 and CARES2. CARES2 contains sub-elements and CARES1 does not. CARES is available on a 9-track 1600 BPI VAX FILES-11 format magnetic tape (standard media) or in VAX BACKUP format on a TK50 tape cartridge. The program requires a FORTRAN 77 compiler and about 12Mb memory. CARES was developed in 1990. DEC, VAX and VMS are trademarks of Digital Equipment Corporation. IBM 370 is a trademark of International Business Machines. 
MSC/NASTRAN is a trademark of MacNeal-Schwendler Corporation. ANSYS is a trademark of Swanson Analysis Systems, Inc.
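As a much-reduced sketch of the underlying fast-fracture statistics, the relation below gives the two-parameter Weibull failure probability of a uniformly stressed volume in uniaxial tension; CARES itself applies the Batdorf multiaxial treatment and finite-element stress fields, which this snippet does not attempt to reproduce. The material values are hypothetical.

```python
# Minimal sketch of the two-parameter Weibull fast-fracture relation for a
# uniformly stressed volume (uniaxial case); not the Batdorf multiaxial
# treatment used by CARES. Material values are hypothetical.
from math import exp

def failure_probability(stress, volume, m, sigma_0v):
    """P_f = 1 - exp(-V * (sigma / sigma_0V)**m) for volume-distributed flaws."""
    return 1.0 - exp(-volume * (stress / sigma_0v) ** m)

# Hypothetical ceramic: m = 10, sigma_0V = 400 (MPa, volume-scaled), V = 50 mm^3
for s in (150.0, 200.0, 250.0):
    print(s, "MPa ->", round(failure_probability(s, 50.0, 10.0, 400.0), 4))
```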
NASA Astrophysics Data System (ADS)
Abas, Norzaida; Daud, Zalina M.; Yusof, Fadhilah
2014-11-01
A stochastic rainfall model is presented for the generation of hourly rainfall data in an urban area in Malaysia. In view of the high temporal and spatial variability of rainfall within the tropical rain belt, the Spatial-Temporal Neyman-Scott Rectangular Pulse model was used. The model, which is governed by the Neyman-Scott process, employs a reasonable number of parameters to represent the physical attributes of rainfall. A common approach is to attach each attribute to a mathematical distribution. With respect to rain cell intensity, this study proposes the use of a mixed exponential distribution. The performance of the proposed model was compared to a model that employs the Weibull distribution. Hourly and daily rainfall data from four stations in the Damansara River basin in Malaysia were used as input to the models, and simulations of hourly series were performed for an independent site within the basin. The performance of the models was assessed based on how closely the statistical characteristics of the simulated series resembled the statistics of the observed series. The findings obtained based on graphical representation revealed that the statistical characteristics of the simulated series for both models compared reasonably well with the observed series. However, a further assessment using the AIC, BIC and RMSE showed that the proposed model yields better results. The results of this study indicate that for tropical climates, the proposed model, using a mixed exponential distribution, is the best choice for generation of synthetic data for ungauged sites or for sites with insufficient data within the limit of the fitted region.
Normal and Extreme Wind Conditions for Power at Coastal Locations in China
Gao, Meng; Ning, Jicai; Wu, Xiaoqing
2015-01-01
In this paper, the normal and extreme wind conditions for power at 12 coastal locations along China’s coastline were investigated. For this purpose, the daily meteorological data measured at the standard 10-m height above ground for periods of 40–62 years are statistically analyzed. The East Asian Monsoon that affects almost China’s entire coastal region is considered as the leading factor determining wind energy resources. For most stations, the mean wind speed is higher in winter and lower in summer. Meanwhile, the wind direction analysis indicates that the prevalent winds in summer are southerly, while those in winter are northerly. The air densities at different coastal locations differ significantly, resulting in the difference in wind power density. The Weibull and lognormal distributions are applied to fit the yearly wind speeds. The lognormal distribution performs better than the Weibull distribution at 8 coastal stations according to two judgement criteria, the Kolmogorov–Smirnov test and absolute error (AE). Regarding the annual maximum extreme wind speed, the generalized extreme value (GEV) distribution performs better than the commonly-used Gumbel distribution. At these southeastern coastal locations, strong winds usually occur in typhoon season. These 4 coastal provinces, that is, Guangdong, Fujian, Hainan, and Zhejiang, which have abundant wind resources, are also prone to typhoon disasters. PMID:26313256
Statistical properties of world investment networks
NASA Astrophysics Data System (ADS)
Song, Dong-Ming; Jiang, Zhi-Qiang; Zhou, Wei-Xing
2009-06-01
We have performed a detailed investigation on the world investment networks constructed from the Coordinated Portfolio Investment Survey (CPIS) data of the International Monetary Fund, ranging from 2001 to 2006. The distributions of degrees and node strengths are scale-free. The weight distributions can be well modeled by the Weibull distribution. The maximum flow spanning trees of the world investment networks possess two universal allometric scaling relations, independent of time and the investment type. The topological scaling exponent is 1.17±0.02 and the flow scaling exponent is 1.03±0.01.
Chakraborty, Snehasis; Rao, Pavuluri Srinivasa; Mishra, Hari Niwas
2015-10-15
High pressure inactivation of natural microbiota viz. aerobic mesophiles (AM), psychrotrophs (PC), yeasts and molds (YM), total coliforms (TC) and lactic acid bacteria (LAB) in pineapple puree was studied within the experimental domain of 0.1-600 MPa and 30-50 °C with a treatment time up to 20 min. A complete destruction of yeasts and molds was obtained at 500 MPa/50 °C/15 min; whereas no counts were detected for TC and LAB at 300 MPa/30 °C/15 min. A maximum of two log cycle reductions was obtained for YM during pulse pressurization at the severe process intensity of 600 MPa/50 °C/20 min. The Weibull model clearly described the non-linearity of the survival curves during the isobaric period. The tailing effect, as confirmed by the shape parameter (β) of the survival curve, was obtained in case of YM (β<1); whereas a shouldering effect (β>1) was observed for the other microbial groups. Analogous to thermal death kinetics, the activation energy (Ea, kJ·mol(-1)) and the activation volume (Va, mL·mol(-1)) values were computed further to describe the temperature and pressure dependencies of the scale parameter (δ, min), respectively. A higher δ value was obtained for each microbe at a lower temperature and it decreased with an increase in pressure. A secondary kinetic model was developed describing the inactivation rate (k, min(-1)) as a function of pressure (P, MPa) and temperature (T, K) including the dependencies of Ea and Va on P and T, respectively. Copyright © 2015 Elsevier B.V. All rights reserved.
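The Weibull primary model referred to above, log10(N/N0) = -(t/δ)^β, can be fitted to survival data as sketched below; the data points are invented for the example, and the time to a 5-log reduction follows as δ·5^(1/β).

```python
# Sketch: fit the Weibull primary inactivation model log10(N/N0) = -(t/delta)**beta
# to made-up survival data; beta > 1 indicates shouldering, beta < 1 tailing.
import numpy as np
from scipy.optimize import curve_fit

def weibull_log_survival(t, delta, beta):
    return -(t / delta) ** beta

t = np.array([2, 4, 6, 8, 10, 12, 15, 20], dtype=float)                 # minutes
log_s = np.array([-0.1, -0.4, -0.9, -1.6, -2.4, -3.3, -4.6, -6.0])      # log10(N/N0), invented

(delta, beta), _ = curve_fit(weibull_log_survival, t, log_s, p0=[5.0, 1.0])
t_5d = delta * 5.0 ** (1.0 / beta)          # time to a 5 log10 reduction under this fit
print(f"delta = {delta:.2f} min, beta = {beta:.2f}, t(5D) = {t_5d:.2f} min")
```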
Analysis and Modeling of Realistic Compound Channels in Transparent Relay Transmissions
Kanjirathumkal, Cibile K.; Mohammed, Sameer S.
2014-01-01
Analytical approaches for the characterisation of the compound channels in transparent multihop relay transmissions over independent fading channels are considered in this paper. Compound channels with homogeneous links are considered first. Using Mellin transform technique, exact expressions are derived for the moments of cascaded Weibull distributions. Subsequently, two performance metrics, namely, coefficient of variation and amount of fade, are derived using the computed moments. These metrics quantify the possible variations in the channel gain and signal to noise ratio from their respective average values and can be used to characterise the achievable receiver performance. This approach is suitable for analysing more realistic compound channel models for scattering density variations of the environment, experienced in multihop relay transmissions. The performance metrics for such heterogeneous compound channels having distinct distribution in each hop are computed and compared with those having identical constituent component distributions. The moments and the coefficient of variation computed are then used to develop computationally efficient estimators for the distribution parameters and the optimal hop count. The metrics and estimators proposed are complemented with numerical and simulation results to demonstrate the impact of the accuracy of the approaches. PMID:24701175
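For independent hops, the Mellin-transform route reduces to multiplying per-hop raw moments, since E[(∏Xi)^n] = ∏E[Xi^n] and each Weibull factor contributes λ^n Γ(1 + n/k); the sketch below computes the first two moments of a cascaded gain and its coefficient of variation for illustrative per-hop parameters.

```python
# Sketch: moments and coefficient of variation of a cascaded (product) channel
# gain built from independent Weibull hops. Per-hop parameters are illustrative.
from math import gamma, sqrt

hops = [(2.0, 1.0), (2.5, 1.2), (1.8, 0.9)]     # (shape k, scale lambda) for each hop

def cascaded_moment(n, hops):
    m = 1.0
    for k, lam in hops:
        m *= lam**n * gamma(1.0 + n / k)        # per-hop Weibull raw moment E[X^n]
    return m

mean = cascaded_moment(1, hops)
second = cascaded_moment(2, hops)
cv = sqrt(second / mean**2 - 1.0)               # coefficient of variation of the compound gain
print(f"E[X] = {mean:.3f}, E[X^2] = {second:.3f}, CV = {cv:.3f}")
```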
2012-01-01
Background The goals of our study are to determine the most appropriate model for alcohol consumption as an exposure for burden of disease, to analyze the effect of the chosen alcohol consumption distribution on the estimation of the alcohol Population-Attributable Fractions (PAFs), and to characterize the chosen alcohol consumption distribution by exploring if there is a global relationship within the distribution. Methods To identify the best model, the Log-Normal, Gamma, and Weibull prevalence distributions were examined using data from 41 surveys from Gender, Alcohol and Culture: An International Study (GENACIS) and from the European Comparative Alcohol Study. To assess the effect of these distributions on the estimated alcohol PAFs, we calculated the alcohol PAF for diabetes, breast cancer, and pancreatitis using the three above-named distributions and using the more traditional approach based on categories. The relationship between the mean and the standard deviation from the Gamma distribution was estimated using data from 851 datasets for 66 countries from GENACIS and from the STEPwise approach to Surveillance from the World Health Organization. Results The Log-Normal distribution provided a poor fit for the survey data, with Gamma and Weibull distributions providing better fits. Additionally, our analyses showed that there were no marked differences for the alcohol PAF estimates based on the Gamma or Weibull distributions compared to PAFs based on categorical alcohol consumption estimates. The standard deviation of the alcohol distribution was highly dependent on the mean, with a unit increase in mean alcohol consumption associated with an increase in the standard deviation of 1.258 (95% CI: 1.223 to 1.293) (R2 = 0.9207) for women and 1.171 (95% CI: 1.144 to 1.197) (R2 = 0.9474) for men. Conclusions Although the Gamma distribution and the Weibull distribution provided similar results, the Gamma distribution is recommended to model alcohol consumption from population surveys due to its fit, flexibility, and the ease with which it can be modified. The results showed that a large degree of variance of the standard deviation of the alcohol consumption Gamma distribution was explained by the mean alcohol consumption, allowing for alcohol consumption to be modeled through a Gamma distribution using only average consumption. PMID:22490226
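A sketch of the practical consequence noted above: with the standard deviation taken as an approximately fixed multiple of the mean (1.258 for women, 1.171 for men, ignoring any regression intercept, which is an assumption made here), a consumption Gamma distribution can be parameterized from the mean alone by the method of moments.

```python
# Sketch: parameterise a per-capita consumption Gamma distribution from the mean
# alone, using the reported mean-to-standard-deviation relations as approximate
# proportionality factors (an assumption that ignores any intercept).
def gamma_from_mean(mean, sex):
    factor = 1.258 if sex == "female" else 1.171
    sd = factor * mean
    shape = (mean / sd) ** 2          # method-of-moments Gamma shape
    scale = sd ** 2 / mean            # method-of-moments Gamma scale
    return shape, scale

print(gamma_from_mean(12.0, "female"))   # mean consumption in g/day (illustrative)
print(gamma_from_mean(25.0, "male"))
```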
NASA Astrophysics Data System (ADS)
Sardet, Laure; Patilea, Valentin
When pricing a specific insurance premium, the actuary needs to evaluate the claims cost distribution for the warranty. Traditional actuarial methods use parametric specifications to model claims distribution, like lognormal, Weibull and Pareto laws. Mixtures of such distributions allow to improve the flexibility of the parametric approach and seem to be quite well-adapted to capture the skewness, the long tails as well as the unobserved heterogeneity among the claims. In this paper, instead of looking for a finely tuned mixture with many components, we choose a parsimonious mixture modeling, typically a two or three-component mixture. Next, we use the mixture cumulative distribution function (CDF) to transform data into the unit interval where we apply a beta-kernel smoothing procedure. A bandwidth rule adapted to our methodology is proposed. Finally, the beta-kernel density estimate is back-transformed to recover an estimate of the original claims density. The beta-kernel smoothing provides an automatic fine-tuning of the parsimonious mixture and thus avoids inference in more complex mixture models with many parameters. We investigate the empirical performance of the new method in the estimation of the quantiles with simulated nonnegative data and the quantiles of the individual claims distribution in a non-life insurance application.
Accounting for inherent variability of growth in microbial risk assessment.
Marks, H M; Coleman, M E
2005-04-15
Risk assessments of pathogens need to account for the growth of small numbers of cells under varying conditions. In order to determine the possible risks that occur when there are small numbers of cells, stochastic models of growth are needed that would capture the distribution of the number of cells over replicate trials of the same scenario or environmental conditions. This paper provides a simple stochastic growth model, accounting only for inherent cell-growth variability, assuming constant growth kinetic parameters, for an initial small number of cells assumed to be transforming from a stationary to an exponential phase. Two basic sets of microbial assumptions are considered: serial, where it is assumed that cells transform through a lag phase before entering the exponential phase of growth; and parallel, where it is assumed that lag and exponential phases develop in parallel. The model is based on first determining the distribution of the time when growth commences, and then modelling the conditional distribution of the number of cells. For the latter distribution, it is found that a Weibull distribution provides a simple approximation to the conditional distribution of the relative growth, so that the model developed in this paper can be easily implemented in risk assessments using commercial software packages.
Development of Probabilistic Life Prediction Methodologies and Testing Strategies for MEMS and CMC's
NASA Technical Reports Server (NTRS)
Jadaan, Osama
2003-01-01
This effort is to investigate probabilistic life prediction methodologies for ceramic matrix composites and MicroElectroMechanical Systems (MEMS) and to analyze designs that determine stochastic properties of MEMS. For CMC's this includes a brief literature survey regarding lifing methodologies. Also of interest for MEMS is the design of a proper test for the Weibull size effect in thin film (bulge test) specimens. The Weibull size effect is a consequence of a stochastic strength response predicted from the Weibull distribution. Confirming that MEMS strength is controlled by the Weibull distribution will enable the development of a probabilistic design methodology for MEMS - similar to the GRC developed CARES/Life program for bulk ceramics. A main objective of this effort is to further develop and verify the ability of the Ceramics Analysis and Reliability Evaluation of Structures/Life (CARES/Life) code to predict the time-dependent reliability of MEMS structures subjected to multiple transient loads. A second set of objectives is to determine the applicability/suitability of the CARES/Life methodology for CMC analysis, what changes would be needed to the methodology and software, and if feasible, run a demonstration problem. Also important is an evaluation of CARES/Life coupled to the ANSYS Probabilistic Design System (PDS) and the potential of coupling transient reliability analysis to the ANSYS PDS.
Saucedo-Reyes, Daniela; Carrillo-Salazar, José A; Román-Padilla, Lizbeth; Saucedo-Veloz, Crescenciano; Reyes-Santamaría, María I; Ramírez-Gilly, Mariana; Tecante, Alberto
2018-03-01
High hydrostatic pressure inactivation kinetics of Escherichia coli ATCC 25922 and Salmonella enterica subsp. enterica serovar Typhimurium ATCC 14028 (S. Typhimurium) in a low-acid mamey pulp at four pressure levels (300, 350, 400, and 450 MPa), different exposure times (0-8 min), and a temperature of 25 ± 2 °C were obtained. Survival curves showed deviations from linearity in the form of a tail (upward concavity). The primary models tested were the Weibull model, the modified Gompertz equation, and the biphasic model. The Weibull model gave the best goodness of fit (adjusted R^2 > 0.956, root mean square error < 0.290) and the lowest Akaike information criterion value. Exponential-logistic and exponential decay models, and a Bigelow-type and an empirical model for the b'(P) and n(P) parameters, respectively, were tested as alternative secondary models. The process validation considered two- and one-step nonlinear regressions for predicting the survival fraction; both regression types provided an adequate goodness of fit, and the one-step nonlinear regression clearly reduced fitting errors. The best candidate model according to Akaike information theory, with better accuracy and more reliable predictions, was the Weibull model integrated with the exponential-logistic and exponential decay secondary models as functions of time and pressure (two-step procedure) or incorporated as one equation (one-step procedure). Both mathematical expressions were used to determine the t_d parameter, the time to a d-log10 reduction; the desired 5-log10 reductions (5D, i.e. d = 5, t_5) in both microorganisms are attainable at 400 MPa in 5.487 ± 0.488 or 5.950 ± 0.329 min for the one- or two-step nonlinear procedure, respectively.
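A minimal sketch of the Weibull primary model in the Mafart-style power form used for such tailing survival curves, with purely illustrative data (not the paper's measurements), could look as follows in Python; the secondary pressure dependence of b and n is omitted.

import numpy as np
from scipy.optimize import curve_fit

# Illustrative survival data at one pressure level: time (min) and log10(N/N0)
t = np.array([0.0, 0.5, 1.0, 2.0, 4.0, 6.0, 8.0])
logS = np.array([0.0, -1.1, -1.8, -2.6, -3.6, -4.2, -4.6])

def weibull_log_survival(t, b, n):
    # Weibull primary model: log10(N/N0) = -b * t**n; n < 1 gives the upward concavity (tailing)
    return -b * np.power(t, n)

(b_hat, n_hat), _ = curve_fit(weibull_log_survival, t, logS, p0=(1.0, 0.5), bounds=(0, np.inf))

# Time to a 5-log10 reduction (the t_5 criterion): solve b * t**n = 5
t5 = (5.0 / b_hat) ** (1.0 / n_hat)
print(b_hat, n_hat, t5)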
NASA Astrophysics Data System (ADS)
Shah, Nita H.; Soni, Hardik N.; Gupta, Jyoti
2014-08-01
In a recent paper, Begum et al. (2012, International Journal of Systems Science, 43, 903-910) established a pricing and replenishment policy for an inventory system with a price-sensitive demand rate, a time-proportional deterioration rate following a three-parameter Weibull distribution, and no shortages. In their model formulation, it is observed that the retailer's stock level reaches zero before deterioration occurs. Consequently, the model reduces to a traditional inventory model with a price-sensitive demand rate and no shortages. Hence, the main purpose of this note is to modify and present a complete model formulation for Begum et al. (2012). The proposed model is validated by a numerical example, and a sensitivity analysis of the parameters is carried out.
NASA Technical Reports Server (NTRS)
Salem, Jonathan A.
2002-01-01
A generalized reliability model was developed for use in the design of structural components made from brittle, homogeneous anisotropic materials such as single crystals. The model is based on the Weibull distribution and incorporates a variable strength distribution and any equivalent stress failure criteria. In addition to the reliability model, an energy based failure criterion for elastically anisotropic materials was formulated. The model is different from typical Weibull-based models in that it accounts for strength anisotropy arising from fracture toughness anisotropy and thereby allows for strength and reliability predictions of brittle, anisotropic single crystals subjected to multiaxial stresses. The model is also applicable to elastically isotropic materials exhibiting strength anisotropy due to an anisotropic distribution of flaws. In order to develop and experimentally verify the model, the uniaxial and biaxial strengths of a single crystal nickel aluminide were measured. The uniaxial strengths of the <100> and <110> crystal directions were measured in three and four-point flexure. The biaxial strength was measured by subjecting <100> plates to a uniform pressure in a test apparatus that was developed and experimentally verified. The biaxial strengths of the single crystal plates were estimated by extending and verifying the displacement solution for a circular, anisotropic plate to the case of a variable radius and thickness. The best correlation between the experimental strength data and the model predictions occurred when an anisotropic stress analysis was combined with the normal stress criterion and the strength parameters associated with the <110> crystal direction.
Stochastic Analysis of Wind Energy for Wind Pump Irrigation in Coastal Andhra Pradesh, India
NASA Astrophysics Data System (ADS)
Raju, M. M.; Kumar, A.; Bisht, D.; Rao, D. B.
2014-09-01
The rapid escalation in the prices of oil and gas, as well as the increasing demand for energy, has attracted the attention of scientists and researchers to the possibility of generating and utilizing alternative and renewable wind energy along the long coastal belt of India, which has considerable wind energy resources. A detailed analysis of wind potential is a prerequisite to harvesting wind energy resources efficiently. Keeping this in view, the present study was undertaken to analyze the wind energy potential and assess the feasibility of a wind-pump operated irrigation system in the coastal region of Andhra Pradesh, India, where high ground water table conditions are available. The wind speed data were analyzed stochastically to fit a probability distribution that describes the wind energy potential in the region. The normal and Weibull probability distributions were tested, and on the basis of the chi-square test the Weibull distribution gave better results. Hence, it was concluded that the Weibull probability distribution may be used to stochastically describe the annual wind speed data of coastal Andhra Pradesh with better accuracy. The size as well as the complete irrigation system was determined with mass curve analysis to satisfy various daily irrigation demands at different risk levels.
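The distribution-selection step described here can be sketched in a few lines of Python; the wind record below is synthetic, and the chi-square test uses equiprobable bins of the fitted Weibull, which is one of several possible binning choices.

import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
wind = rng.weibull(2.0, size=365) * 6.0      # illustrative daily mean wind speeds (m/s)

# Two-parameter Weibull fit (location fixed at zero) and, for comparison, a normal fit
k, _, c = stats.weibull_min.fit(wind, floc=0)     # shape k, scale c
mu, sd = stats.norm.fit(wind)

# Chi-square goodness of fit on equiprobable bins of the fitted Weibull
nbins = 10
edges = stats.weibull_min.ppf(np.linspace(0.0, 1.0, nbins + 1), k, scale=c)
observed = np.array([np.sum((wind >= lo) & (wind < hi))
                     for lo, hi in zip(edges[:-1], edges[1:])])
expected = np.full(nbins, wind.size / nbins)
chi2 = ((observed - expected) ** 2 / expected).sum()
p_value = stats.chi2.sf(chi2, df=nbins - 1 - 2)   # bins minus 1 minus fitted parameters
print(k, c, chi2, p_value)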
Effects of Planetary Gear Ratio on Mean Service Life
NASA Technical Reports Server (NTRS)
Savage, M.; Rubadeux, K. L.; Coe, H. H.
1996-01-01
Planetary gear transmissions are compact, high-power speed reductions which use parallel load paths. The range of possible reduction ratios is bounded from below and above by limits on the relative size of the planet gears. For a single-plane transmission, the planet gear has no size at a ratio of two. As the ratio increases, so does the size of the planets relative to the sizes of the sun and ring. Which ratio is best for a planetary reduction can be resolved by studying a series of optimal designs. In this series, each design is obtained by maximizing the service life for a planetary with fixed size, gear ratio, input speed, power and materials. The planetary gear reduction service life is modeled as a function of the two-parameter Weibull-distributed service lives of the bearings and gears in the reduction. Planet bearing life strongly influences the optimal reduction lives, which point to an optimal planetary reduction ratio in the neighborhood of four to five.
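The way component Weibull lives roll up into a system life can be sketched as below (Python); the component L10 lives and slopes are invented for illustration, and the reduction is treated as a strict series system so that system survival is the product of the component survivals.

import numpy as np
from scipy.optimize import brentq

components = [          # (L10 life in hours, Weibull slope) -- illustrative values only
    (8000.0, 1.2),      # sun-planet gear mesh
    (9000.0, 1.2),      # planet-ring gear mesh
    (5000.0, 1.1),      # planet bearing (often the life-limiting element)
    (12000.0, 1.1),     # sun bearing
]

def system_survival(L):
    # strict-series system: S_i(L) = 0.9 ** ((L / L10_i) ** beta_i)
    return np.prod([0.9 ** ((L / L10) ** beta) for L10, beta in components])

# System L10 is the life at which system survival drops to 90 %
L10_system = brentq(lambda L: system_survival(L) - 0.9, 1.0, 1.0e6)
print(L10_system)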
Performance of mixed RF/FSO systems in exponentiated Weibull distributed channels
NASA Astrophysics Data System (ADS)
Zhao, Jing; Zhao, Shang-Hong; Zhao, Wei-Hu; Liu, Yun; Li, Xuan
2017-12-01
This paper presents the performance of an asymmetric mixed radio frequency (RF)/free-space optical (FSO) system with an amplify-and-forward relaying scheme. The RF link undergoes Nakagami-m fading, and the exponentiated Weibull distribution is adopted for the FSO component. Mathematical formulas for the cumulative distribution function (CDF), probability density function (PDF) and moment generating function (MGF) of the equivalent signal-to-noise ratio (SNR) are derived. From the end-to-end statistical characteristics, new analytical expressions for the outage probability are obtained. Under various modulation techniques, we derive the average bit-error rate (BER) based on the Meijer G-function. Evaluations and simulations of the system performance are provided, and the aperture averaging effect is discussed as well.
An application of synthetic seismicity in earthquake statistics - The Middle America Trench
NASA Technical Reports Server (NTRS)
Ward, Steven N.
1992-01-01
It is shown how seismicity calculations based on the concept of fault segmentation, which incorporate the physics of faulting through static dislocation theory, can improve earthquake recurrence statistics and hone the probabilities of hazard. For the Middle America Trench, the spread parameters of the best-fitting lognormal or Weibull distributions (about 0.75) are much larger than the 0.21 intrinsic spread proposed in the Nishenko-Buland (1987) hypothesis. Stress interaction between fault segments disrupts time or slip predictability and causes earthquake recurrence to be far more aperiodic than has been suggested.
NASA Astrophysics Data System (ADS)
Caputo, Riccardo
2010-09-01
It is a commonplace field observation that extension fractures are more abundant than shear fractures. The questions of how much more abundant, and why, are posed in this paper, and qualitative estimates of their ratio within a rock volume are made on the basis of field observations and mechanical considerations. A conceptual model is also proposed to explain the common range of ratios between extension and shear fractures, here called the j/f ratio. The model considers three major genetic stress components originating from overburden, pore-fluid pressure and tectonics, and assumes that some of the remote genetic stress components vary with time (i.e. stress rates are included). Other important assumptions of the numerical model are that: i) the strength of the sub-volumes is randomly attributed following a Weibull probabilistic distribution, ii) all fractures heal after a given time, thus simulating the cementation process, and therefore iii) both extensional jointing and shear fracturing can be recurrent events within the same sub-volume. As a direct consequence of these assumptions, the stress tensor at any point varies continuously in time, and these variations are caused both by remote stresses and by local stress drops associated with in-situ and neighbouring fracturing events. The conceptual model is implemented in a computer program to simulate layered carbonate rock bodies undergoing brittle deformation. The numerical results are obtained by varying the principal parameters, such as depth (viz. confining pressure), tensile strength, pore-fluid pressure and the shape of the Weibull distribution function, over a wide range of values, therefore simulating a broad spectrum of possible mechanical and lithological conditions. The quantitative estimates of the j/f ratio confirm the general predominance of extensional failure events during brittle deformation in shallow crustal rocks and provide useful insights for better understanding the role played by the different parameters. For example, as a general trend, the j/f ratio is inversely proportional to depth (viz. confining pressure) and directly proportional to pore-fluid pressure, while the stronger the rock, the wider the range of depths showing a finite value of the j/f ratio and, in general, the deeper the conditions where extension fractures can form. Moreover, the wider the strength variability of the rocks (i.e. the lower the m parameter of the Weibull probabilistic distribution function), the wider the depth range where both fractures can form, providing a finite value of the j/f ratio. Natural case studies from different geological and tectonic settings are also used to test the conceptual model and the numerical results, showing good agreement between measured and predicted j/f ratios.
Upadhyay, S K; Mukherjee, Bhaswati; Gupta, Ashutosh
2009-09-01
Several models for studies related to the tensile strength of materials have been proposed in the literature, where the size or length component is taken to be an important factor in studying the specimens' failure behaviour. An important model, developed on the basis of a cumulative damage approach, is the three-parameter extension of the Birnbaum-Saunders fatigue model that incorporates the size of the specimen as an additional variable. This model is a strong competitor of the commonly used Weibull model and performs better than the traditional models, which do not incorporate the size effect. The paper considers two such cumulative damage models, checks their compatibility with a real dataset, compares them with some recent toolkits, and finally recommends the model that appears most appropriate. Throughout, the study is Bayesian, based on Markov chain Monte Carlo simulation.
Hirose, H
1997-01-01
This paper proposes a new treatment of electrical insulation degradation. Some types of insulation that have been used under various circumstances are considered to degrade at various rates in accordance with their stress circumstances. The cross-linked polyethylene (XLPE) insulated cables inspected by major Japanese electric companies clearly indicate such phenomena. By assuming that an inspected specimen is sampled from one of the clustered groups, a mixed degradation model can be constructed. Since the degradation of insulation under common circumstances is considered to follow a Weibull distribution, a mixture model and a Weibull power law can be combined; this is called the mixture Weibull power law model. By applying maximum likelihood estimation for the newly proposed model to Japanese 22 and 33 kV insulation class cables, the cables are clustered into a certain number of groups using the AIC and the generalized likelihood ratio test method. The reliability of the cables at specified years is assessed.
de Oliveira, Thales Leandro Coutinho; Soares, Rodrigo de Araújo; Piccoli, Roberta Hilsdorf
2013-03-01
The antimicrobial effect of oregano (Origanum vulgare L.) and lemongrass (Cymbopogon citratus (DC.) Stapf.) essential oils (EOs) against Salmonella enterica serotype Enteritidis was evaluated in in vitro experiments and in inoculated ground bovine meat during refrigerated storage (4±2 °C) for 6 days. The Weibull model was tested to fit the survival/inactivation bacterial curves (estimating the p and δ parameters). The minimum inhibitory concentration (MIC) value for both EOs against S. Enteritidis was 3.90 μl/ml. The EO concentrations applied in the ground beef were 3.90, 7.80 and 15.60 μl/g, based on the MIC level and on possible activity reduction by food constituents. Both EOs, at all tested levels, showed antimicrobial effects, with microbial populations decreasing (p≤0.05) over storage time. Based on the fit-quality parameters (RSS and RSE), Weibull models are able to describe the inactivation curves of the EOs against S. Enteritidis. The application of EOs in processed meats can be used to control pathogens during refrigerated shelf-life. Copyright © 2012 Elsevier Ltd. All rights reserved.
Reliability of High-Voltage Tantalum Capacitors (Parts 3 and 4)
NASA Technical Reports Server (NTRS)
Teverovsky, Alexander
2010-01-01
The Weibull grading test is a powerful technique that allows selection and reliability rating of solid tantalum capacitors for military and space applications. However, inaccuracies in the existing method and inadequate acceleration factors can result in significant errors, up to three orders of magnitude, in the calculated failure rate of capacitors. This paper analyzes deficiencies of the existing technique and recommends a more accurate method of calculation. A physical model presenting failures of tantalum capacitors as time-dependent dielectric breakdown is used to determine voltage and temperature acceleration factors and to select adequate Weibull grading test conditions. This model is verified by highly accelerated life testing (HALT) at different temperature and voltage conditions for three types of solid chip tantalum capacitors. It is shown that the parameters of the model and the acceleration factors can be calculated using a general log-linear relationship for the characteristic life with two stress levels.
What are the Shapes of Response Time Distributions in Visual Search?
Palmer, Evan M.; Horowitz, Todd S.; Torralba, Antonio; Wolfe, Jeremy M.
2011-01-01
Many visual search experiments measure reaction time (RT) as their primary dependent variable. Analyses typically focus on mean (or median) RT. However, given enough data, the RT distribution can be a rich source of information. For this paper, we collected about 500 trials per cell per observer for both target-present and target-absent displays in each of three classic search tasks: feature search, with the target defined by color; conjunction search, with the target defined by both color and orientation; and spatial configuration search for a 2 among distractor 5s. This large data set allows us to characterize the RT distributions in detail. We present the raw RT distributions and fit several psychologically motivated functions (ex-Gaussian, ex-Wald, Gamma, and Weibull) to the data. We analyze and interpret parameter trends from these four functions within the context of theories of visual search. PMID:21090905
Tojinbara, Kageaki; Sugiura, K; Yamada, A; Kakitani, I; Kwan, N C L; Sugiura, K
2016-01-01
Data of 98 rabies cases in dogs and cats from the 1948-1954 rabies epidemic in Tokyo were used to estimate the probability distribution of the incubation period. Lognormal, gamma and Weibull distributions were used to model the incubation period. The maximum likelihood estimates of the mean incubation period ranged from 27.30 to 28.56 days according to different distributions. The mean incubation period was shortest with the lognormal distribution (27.30 days), and longest with the Weibull distribution (28.56 days). The best distribution in terms of AIC value was the lognormal distribution with mean value of 27.30 (95% CI: 23.46-31.55) days and standard deviation of 20.20 (15.27-26.31) days. There were no significant differences between the incubation periods for dogs and cats, or between those for male and female dogs. Copyright © 2015 Elsevier B.V. All rights reserved.
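The fit-and-compare step reads naturally as a short Python loop; the synthetic incubation periods and the two-parameter (location-zero) fits below are illustrative assumptions, not the Tokyo data.

import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
incubation = rng.lognormal(mean=3.1, sigma=0.65, size=98)   # illustrative incubation periods (days)

candidates = {"lognormal": stats.lognorm, "gamma": stats.gamma, "weibull": stats.weibull_min}

for name, dist in candidates.items():
    params = dist.fit(incubation, floc=0)                # location fixed at zero
    loglik = dist.logpdf(incubation, *params).sum()
    aic = 2 * 2 - 2 * loglik                             # two free parameters (shape, scale)
    print(f"{name}: AIC = {aic:.1f}, mean = {dist.mean(*params):.2f} days")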
NASA Astrophysics Data System (ADS)
Cianciara, Aleksander
2016-09-01
The paper presents the results of research aimed at verifying the hypothesis that the Weibull distribution is an appropriate statistical model of microseismic emission characteristics, namely the energy of phenomena and the inter-event time. The emission under consideration is understood to be induced by natural rock mass fracturing. Because the recorded emission contains noise, it is subjected to appropriate filtering. The study has been conducted using statistical verification of the null hypothesis that the Weibull distribution fits the empirical cumulative distribution function. As the model describing the cumulative distribution function is given in analytical form, its verification may be performed using the Kolmogorov-Smirnov goodness-of-fit test. Interpretations by means of probabilistic methods require specifying a correct model of the statistical distribution of the data, because such methods do not use the measurement data directly but rather their statistical distributions, e.g., in the method based on hazard analysis or in the one that uses maximum value statistics.
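In practice the verification step amounts to a Kolmogorov-Smirnov test against the fitted Weibull; a minimal Python sketch with synthetic (already filtered) event energies follows. Note that when the parameters are estimated from the same sample, the nominal KS p-value is only approximate.

import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
energy = rng.weibull(0.8, size=400) * 1.0e3      # illustrative event energies after filtering

shape, loc, scale = stats.weibull_min.fit(energy, floc=0)
D, p_value = stats.kstest(energy, "weibull_min", args=(shape, loc, scale))
print(D, p_value)    # p-value is optimistic because the parameters were fitted to the same data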
Vector wind profile gust model
NASA Technical Reports Server (NTRS)
Adelfang, S. I.
1981-01-01
To enable development of a vector wind gust model suitable for orbital flight test operations and trade studies, hypotheses concerning the distributions of gust component variables were verified. Methods for verifying the hypotheses that observed gust variables, including gust component magnitude, gust length, u range, and L range, are gamma distributed are presented. The observed gust modulus is drawn from a bivariate gamma distribution that can be approximated with a Weibull distribution. Zonal and meridional gust components are bivariate gamma distributed. An analytical method for testing for bivariate gamma distributed variables is presented. Two distributions for gust modulus are described and the results of extensive hypothesis testing of one of the distributions are presented. The validity of the gamma distribution for representation of gust component variables is established.
Mortality profiles of Rhodnius prolixus (Heteroptera: Reduviidae), vector of Chagas disease.
Chaves, Luis Fernando; Hernandez, Maria-Josefina; Revilla, Tomás A; Rodríguez, Diego J; Rabinovich, Jorge E
2004-10-01
Life table data for Rhodnius prolixus (Heteroptera: Reduviidae) kept under laboratory conditions were analysed in search of mortality patterns. The Gompertz and Weibull mortality models seem adequate to explain the sigmoid shape of the survivorship curve. A significant fit was obtained with both models for females (R(2) = 0.70, P < 0.0005 for the Gompertz model; R(2) = 0.78, P < 0.0005 for the Weibull model) and for males (R(2) = 0.39, P < 0.0005 for the Gompertz model; R(2) = 0.48, P < 0.0005 for the Weibull model). The mortality parameter (b) is higher for females in both the Gompertz and Weibull models, using smoothed and non-smoothed data (P < 0.05), revealing a significant sex mortality differential. Given the particular life history of this insect, the non-linear relationship between the force of mortality and age may have an important impact on the vectorial capacity of R. prolixus as a Chagas disease vector, and its consideration should be included as an important factor in the transmission of Trypanosoma cruzi by triatomines.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mahfuz, H.; Maniruzzaman, M.; Vaidya, U.
1997-04-01
Monotonic tensile and fatigue response of continuous silicon carbide fiber reinforced silicon nitride (SiC_f/Si_3N_4) composites has been investigated. The monotonic tensile tests were performed at room and elevated temperatures. Fatigue tests were conducted at room temperature (RT), at a stress ratio R = 0.1 and a frequency of 5 Hz. It is observed during the monotonic tests that the composite retains only 30% of its room-temperature strength at 1,600 °C, suggesting substantial chemical degradation of the matrix at that temperature. The softening of the matrix at elevated temperature also causes a reduction in tensile modulus, and the total reduction in modulus is around 45%. Fatigue data have been generated at three load levels and the fatigue strength of the composite has been found to be considerably high, about 75% of its ultimate room-temperature strength. Extensive statistical analysis has been performed to understand the degree of scatter in the fatigue as well as in the static test data. Weibull shape factors and characteristic values have been determined for each set of tests and their relationship with the response of the composites has been discussed. A statistical fatigue life prediction method developed from the Weibull distribution is also presented. A maximum likelihood estimator with censoring techniques and data pooling schemes has been employed to determine the distribution parameters for the statistical analysis. These parameters have been used to generate the S-N diagram with the desired level of reliability. Details of the statistical analysis and a discussion of the static and fatigue behavior of the composites are presented in this paper.
ZERODUR®: new stress corrosion data improve strength fatigue prediction
NASA Astrophysics Data System (ADS)
Hartmann, Peter; Kleer, Günter; Rist, Tobias
2015-09-01
The extremely low thermal expansion glass ceramic ZERODUR® finds more and more applications as sophisticated lightweight structures with thin ribs or as thin shells. Quite often they will be subject to higher mechanical loads, such as rocket launches or modulating wobbling vibrations. Designing such structures requires calculation methods and data taking into account their long-term fatigue. With brittle materials, fatigue is not only given by the material itself but to a high extent also by its surface condition and the environmental media, especially humidity. This work extends the latest data and information gathered on the bending strength of ZERODUR® with new results concerning its long-term behavior under tensile stress. The parameter needed for prediction calculations, which combines the influences of time and environmental media, is the stress corrosion constant n. Results of the past differ significantly from each other. In order to obtain consistent data, the stress corrosion constant has been measured with the method comparing the breakage statistical distributions at different stress increase rates. For better significance, the stress increase rate was varied over four orders of magnitude, from 0.004 MPa/s to 40 MPa/s. Experiments were performed under normal humidity for long-term earth-bound applications and under nitrogen atmosphere as equivalent to the dry environments occurring, for example, with telescopes in deserts and also equivalent to vacuum for space applications. As shown earlier, the bending strength of diamond-ground surfaces of ZERODUR® can be represented with a three-parameter Weibull distribution. Predictions of the long-term strength change of ZERODUR® structures under tensile stress are possible with reduced uncertainty if Weibull threshold strength values are considered and more reliable stress corrosion constant data are applied.
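The stress corrosion constant extracted from such dynamic (constant stress rate) tests follows from the standard relation sigma_f**(n+1) ∝ stress rate; a Python sketch with invented characteristic strengths (not ZERODUR® data) is:

import numpy as np

stress_rate = np.array([0.004, 0.04, 0.4, 4.0, 40.0])    # MPa/s
sigma_f     = np.array([52.0, 56.5, 61.0, 66.0, 71.5])   # characteristic strengths (MPa), illustrative

# log(sigma_f) = log(rate) / (n + 1) + const, so the slope of the log-log fit gives n
slope, intercept = np.polyfit(np.log(stress_rate), np.log(sigma_f), 1)
n = 1.0 / slope - 1.0
print(f"stress corrosion constant n ≈ {n:.1f}")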
CRACK GROWTH ANALYSIS OF SOLID OXIDE FUEL CELL ELECTROLYTES
DOE Office of Scientific and Technical Information (OSTI.GOV)
S. Bandopadhyay; N. Nagabhushana
2003-10-01
Defects and flaws control the structural and functional properties of ceramics. In determining the reliability and lifetime of ceramic structures it is very important to quantify the crack growth behavior of the ceramics. In addition, because of the high variability of the strength and the relatively low toughness of ceramics, a statistical design approach is necessary. The statistical nature of the strength of ceramics is currently well recognized, and is usually accounted for by utilizing Weibull or similar statistical distributions. Design tools such as CARES, using a combination of strength measurements, stress analysis, and statistics, are available and reasonably well developed. These design codes also incorporate material data such as elastic constants as well as flaw distributions and time-dependent properties. The fast fracture reliability of ceramics is often different from their time-dependent reliability. Further confounding the design complexity, the time-dependent reliability varies with the environment/temperature/stress combination. Therefore, it becomes important to be able to accurately determine the behavior of ceramics under simulated application conditions to provide a better prediction of the lifetime and reliability of a given component. In the present study, yttria-stabilized zirconia (YSZ) of 9.6 mol% yttria composition was procured in the form of tubes of length 100 mm. The composition is of interest for tubular electrolytes for solid oxide fuel cells. Rings cut from the tubes were characterized for microstructure, phase stability, mechanical strength (Weibull modulus) and fracture mechanisms. The strength at the operating condition of SOFCs (1000 °C) decreased to 95 MPa as compared to the room temperature strength of 230 MPa. However, the Weibull modulus remains relatively unchanged. The slow crack growth (SCG) parameter, n = 17, evaluated at room temperature in air was representative of well-studied brittle materials. Based on the results, further work was planned to evaluate the strength degradation, modulus and failure in a more representative environment of the SOFCs.
NASA Astrophysics Data System (ADS)
Wang, Xiaohua
The coupling resulting from the mutual influence of material thermal and mechanical parameters is examined in the thermal stress analysis of a multilayered isotropic composite cylinder subjected to sudden axisymmetric external and internal temperature. The method of complex frequency response functions together with the Fourier transform technique is utilized. Because the coupling parameters for some composite materials, such as carbon-carbon, are very small, the effect of coupling is neglected in the orthotropic thermal stress analysis. The stress distributions in multilayered orthotropic cylinders subjected to sudden axisymmetric temperature loading combined with dynamic pressure, as well as asymmetric temperature loading, are also obtained. The method of Fourier series together with the Laplace transform is utilized in solving the heat conduction equation and the thermal stress analysis. For brittle materials, like carbon-carbon composites, the strength variability is represented by two- or three-parameter Weibull distributions. The 'weakest link' principle is applied to the carbon-carbon composite cylinders. The complex frequency response analysis is performed on a multilayered orthotropic cylinder under asymmetrical thermal load. Both deterministic and random thermal stress and reliability analyses can be based on the results of this frequency response analysis. The stress and displacement distributions and the reliability of rocket motors under static or dynamic line loads are analyzed by an elasticity approach. Rocket motors are modeled as long hollow multilayered cylinders with an air core, a thick isotropic propellant inner layer and a thin orthotropic Kevlar-epoxy case. The case is treated as a single orthotropic layer or a ten-layered orthotropic structure. Five material properties and the load are treated as random variables with normal distributions when the reliability of the rocket motor is analyzed by the first-order, second-moment method (FOSM).
Wang, Ping; Zhang, Lu; Guo, Lixin; Huang, Feng; Shang, Tao; Wang, Ranran; Yang, Yintang
2014-08-25
The average bit error rate (BER) for binary phase-shift keying (BPSK) modulation in free-space optical (FSO) links over an atmospheric turbulence channel modeled by the exponentiated Weibull (EW) distribution is investigated in detail. The effects of aperture averaging on the average BER for BPSK modulation under weak-to-strong turbulence conditions are studied. The average BERs of the EW distribution are compared with the Lognormal (LN) and Gamma-Gamma (GG) distributions in weak and strong turbulence, respectively. The outage probability is also obtained for different turbulence strengths and receiver aperture sizes. The analytical results, deduced using the generalized Gauss-Laguerre quadrature rule, are verified by Monte Carlo simulation. This work is helpful for the design of receivers for FSO communication systems.
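The Monte Carlo check mentioned at the end of the abstract can be reproduced in outline as below; the EW parameters are illustrative, and the mapping of irradiance to instantaneous SNR (proportional to the squared normalized irradiance) is a common IM/DD convention assumed here rather than taken from the paper.

import numpy as np
from scipy.special import erfc

rng = np.random.default_rng(4)
alpha, beta, eta = 2.5, 1.8, 1.0          # illustrative exponentiated Weibull parameters
n_samples = 2_000_000

# Inverse-transform sampling from the EW CDF  F(I) = (1 - exp(-(I/eta)**beta))**alpha
u = rng.random(n_samples)
I = eta * (-np.log1p(-u ** (1.0 / alpha))) ** (1.0 / beta)

def average_ber_bpsk(mean_snr_db):
    gamma_bar = 10.0 ** (mean_snr_db / 10.0)
    gamma = gamma_bar * (I / I.mean()) ** 2          # assumed SNR-irradiance mapping
    return 0.5 * erfc(np.sqrt(gamma)).mean()         # conditional BPSK BER averaged over fading

for snr_db in (5, 10, 15, 20):
    print(snr_db, average_ber_bpsk(snr_db))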
Weibull-Based Design Methodology for Rotating Aircraft Engine Structures
NASA Technical Reports Server (NTRS)
Zaretsky, Erwin; Hendricks, Robert C.; Soditus, Sherry
2002-01-01
The NASA Energy Efficient Engine (E3 Engine) is used as the basis of a Weibull-based life and reliability analysis. Each component's life, and thus the engine's life, is defined by high-cycle fatigue (HCF) or low-cycle fatigue (LCF). Knowing the cumulative life distribution of each of the components making up the engine, as represented by a Weibull slope, is a prerequisite to predicting the life and reliability of the entire engine. As the engine Weibull slope increases, the predicted lives decrease. The predicted engine lives L5 (95% probability of survival) of approximately 17,000 and 32,000 hr do correlate with current engine maintenance practices without and with refurbishment, respectively. The individual high pressure turbine (HPT) blade lives necessary to obtain a blade system life L0.1 (99.9% probability of survival) of 9000 hr for Weibull slopes of 3, 6 and 9 are 47,391, 20,652 and 15,658 hr, respectively. For a design life of the HPT disks having probable points of failure equal to or greater than 36,000 hr at a probability of survival of 99.9%, the predicted disk system life L0.1 can vary from 9,408 to 24,911 hr.
Siarampi, Eleni; Kontonasaki, Eleana; Andrikopoulos, Konstantinos S; Kantiranis, Nikolaos; Voyiatzis, George A; Zorba, Triantafillia; Paraskevopoulos, Konstantinos M; Koidis, Petros
2014-12-01
Dental zirconia restorations should present long-term clinical survival and be in service within the oral environment for many years. However, low temperature degradation could affect their mechanical properties and survival. The aim of this study was to investigate the effect of in vitro aging on the flexural strength of yttria-stabilized (Y-TZP) zirconia ceramics for ceramic restorations. One hundred twenty bar-shaped specimens were prepared from two ceramics (ZENO Zr (WI) and IPS e.max(®) ZirCAD (IV)) and loaded until fracture according to ISO 6872. The specimens from each ceramic (n = 60) were divided into three groups (control, aged for 5 h, aged for 10 h). One-way ANOVA was used to assess statistically significant differences among flexural strength values (P<0.05). The variability of the flexural strength values was analyzed using the two-parameter Weibull distribution function, which was applied for the estimation of the Weibull modulus (m) and characteristic strength (σ0). The crystalline phase polymorphs of the materials (tetragonal, t, and monoclinic, m, zirconia) were investigated by X-ray diffraction (XRD) analysis, Raman spectroscopy and Fourier transform infrared (FTIR) spectroscopy. A slight increase of the flexural strength after 5 h and a decrease after 10 h of aging were recorded for both ceramics; however, the change was statistically significant only for the WI group (P<0.05). Both ceramics presented a t→m phase transformation, with the m-phase increasing from 4-5% at 5 h to around 15% after 10 h. The significant reduction of the flexural strength after 10 h of in vitro aging suggests a high fracture probability for one of the zirconia ceramics tested. Copyright © 2014 Academy of Dental Materials. Published by Elsevier Ltd. All rights reserved.
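Estimating the Weibull modulus m and characteristic strength sigma_0 from one group of flexural strengths is a routine calculation; the Python sketch below (with invented strengths, not the study's data) shows both the maximum-likelihood fit and the classical median-rank regression.

import numpy as np
from scipy import stats

# Illustrative flexural strengths (MPa) for one group of n = 20 bars
strength = np.array([812, 845, 870, 902, 915, 933, 948, 960, 972, 985,
                     996, 1008, 1021, 1035, 1049, 1063, 1080, 1102, 1131, 1178.0])

# Maximum-likelihood fit of the two-parameter Weibull (location fixed at zero)
m_ml, _, sigma0_ml = stats.weibull_min.fit(strength, floc=0)

# Median-rank linear regression: ln(-ln(1 - F)) = m*ln(sigma) - m*ln(sigma_0)
x = np.sort(strength)
F = (np.arange(1, x.size + 1) - 0.3) / (x.size + 0.4)
m_lr, intercept = np.polyfit(np.log(x), np.log(-np.log(1.0 - F)), 1)
sigma0_lr = np.exp(-intercept / m_lr)
print(m_ml, sigma0_ml, m_lr, sigma0_lr)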
End-of-life flows of multiple cycle consumer products.
Tsiliyannis, C A
2011-11-01
Explicit expressions for the end-of-life (EOL) flows of single and multiple cycle products (MCPs) are presented, including deterministic and stochastic EOL exit. The expressions are given in terms of the physical parameters (maximum lifetime, T, annual cycling frequency, f, number of cycles, N, and early discard or usage loss). EOL flows are also obtained for hi-tech products, which are rapidly renewed and thus may not attain steady state (e.g., electronic products, passenger cars). A ten-step recursive procedure for obtaining the dynamic EOL flow evolution is proposed. Applications of the EOL expressions and the ten-step procedure are given for electric household appliances, industrial machinery, tyres, vehicles and buildings, both for deterministic and stochastic EOL exit (normal, Weibull and uniform exit distributions). The effect of the physical parameters and the stochastic characteristics on the EOL flow is investigated in the examples: it is shown that the EOL flow profile is determined primarily by the early discard dynamics; it also depends strongly on longevity and cycling frequency, with higher lifetime or early discard/loss implying lower dynamic and steady state EOL flows. The stochastic exit shapes the overall EOL dynamic profile: under a symmetric EOL exit distribution, as the variance of the distribution increases (uniform to normal to deterministic), the initial EOL flow rise becomes steeper but the steady state or maximum EOL flow level is lower. The steepest EOL flow profile, featuring the highest steady state or maximum level as well, corresponds to a skew, earlier-shifted EOL exit (e.g., Weibull). Since the EOL flow of returned products constitutes the sink of the reuse/remanufacturing cycle (sink to recycle), the results may be used in closed-loop product lifecycle management operations for scheduling and sizing reverse manufacturing and for planning recycling logistics. Decoupling and quantification of both the full-age EOL and the early discard flows is useful, the latter being the target of enacted legislation aiming at increasing reuse. Copyright © 2011 Elsevier Ltd. All rights reserved.
Test Population Selection from Weibull-Based, Monte Carlo Simulations of Fatigue Life
NASA Technical Reports Server (NTRS)
Vlcek, Brian L.; Zaretsky, Erwin V.; Hendricks, Robert C.
2008-01-01
Fatigue life is probabilistic and not deterministic. Experimentally establishing the fatigue life of materials, components, and systems is both time consuming and costly. As a result, conclusions regarding fatigue life are often inferred from a statistically insufficient number of physical tests. A proposed methodology for comparing life results as a function of variability due to Weibull parameters, variability between successive trials, and variability due to size of the experimental population is presented. Using Monte Carlo simulation of randomly selected lives from a large Weibull distribution, the variation in the L10 fatigue life of aluminum alloy AL6061 rotating rod fatigue tests was determined as a function of population size. These results were compared to the L10 fatigue lives of small (10 each) populations from AL2024, AL7075 and AL6061. For aluminum alloy AL6061, a simple algebraic relationship was established for the upper and lower L10 fatigue life limits as a function of the number of specimens failed. For most engineering applications where less than 30 percent variability can be tolerated in the maximum and minimum values, at least 30 to 35 test samples are necessary. The variability of test results based on small sample sizes can be greater than actual differences, if any, that exists between materials and can result in erroneous conclusions. The fatigue life of AL2024 is statistically longer than AL6061 and AL7075. However, there is no statistical difference between the fatigue lives of AL6061 and AL7075 even though AL7075 had a fatigue life 30 percent greater than AL6061.
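The Monte Carlo exercise behind these conclusions is easy to sketch; the parent Weibull parameters below are made up rather than the AL6061 values, and the L10 of each simulated test population is taken as its empirical 10th percentile.

import numpy as np

rng = np.random.default_rng(5)
beta, eta = 1.5, 2.0e6                                   # illustrative Weibull slope and characteristic life (cycles)
true_l10 = eta * (-np.log(0.9)) ** (1.0 / beta)
n_trials = 2000

for n_specimens in (10, 20, 30, 35, 50):
    # draw many finite test populations and estimate L10 from each
    l10 = np.array([np.quantile(eta * rng.weibull(beta, n_specimens), 0.10)
                    for _ in range(n_trials)])
    lo, hi = np.percentile(l10, [5, 95])
    print(f"n = {n_specimens:2d}: estimated L10 spans {lo / true_l10:.2f} to "
          f"{hi / true_l10:.2f} of the true L10 (90 % of trials)")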
Failure rate and reliability of the KOMATSU hydraulic excavator in surface limestone mine
NASA Astrophysics Data System (ADS)
Harish Kumar N., S.; Choudhary, R. P.; Murthy, Ch. S. N.
2018-04-01
A model with a bathtub-shaped failure rate function is helpful in the reliability analysis of any system, and particularly in reliability-related preventive maintenance. The usual Weibull distribution is, however, not capable of modeling the complete lifecycle of a system with a bathtub-shaped failure rate function. In this paper, a failure rate and reliability analysis of the KOMATSU hydraulic excavator/shovel in a surface mine is presented, with the aim of improving the reliability and decreasing the failure rate of each subsystem of the shovel based on preventive maintenance. The bathtub-shaped model for the shovel can also be seen as a simplification of the Weibull distribution.
Estimating child mortality and modelling its age pattern for India.
Roy, S G
1989-06-01
"Using data [for India] on proportions of children dead...estimates of infant and child mortality are...obtained by Sullivan and Trussell modifications of [the] Brass basic method. The estimate of child survivorship function derived after logit smoothing appears to be more reliable than that obtained by the Census Actuary. The age pattern of childhood mortality is suitably modelled by [a] Weibull function defining the probability of surviving from birth to a specified age and involving two parameters of level and shape. A recently developed linearization procedure based on [a] graphical approach is adopted for estimating the parameters of the function." excerpt
NASA Astrophysics Data System (ADS)
Huang, D.; Liu, Y.
2014-12-01
The effects of subgrid cloud variability on grid-average microphysical rates and radiative fluxes are examined using long-term retrieval products at the Tropical West Pacific (TWP), Southern Great Plains (SGP), and North Slope of Alaska (NSA) sites of the Department of Energy's Atmospheric Radiation Measurement (ARM) Program. Four commonly used distribution functions, the truncated Gaussian, Gamma, lognormal, and Weibull distributions, are constrained to have the same mean and standard deviation as the observed cloud liquid water content. The PDFs are then used to upscale the relevant physical processes to obtain grid-average process rates. It is found that the truncated Gaussian representation results in up to 30% mean bias in the autoconversion rate, whereas the mean bias for the lognormal representation is about 10%. The Gamma and Weibull distribution functions perform best for the grid-average autoconversion rate, with a mean relative bias of less than 5%. For radiative fluxes, the lognormal and truncated Gaussian representations perform better than the Gamma and Weibull representations. The results show that the optimal choice of subgrid cloud distribution function depends on the nonlinearity of the process of interest, and thus there is no single distribution function that works best for all parameterizations. Examination of the scale (window size) dependence of the mean bias indicates that the bias in grid-average process rates monotonically increases with increasing window size, suggesting the increasing importance of subgrid variability with increasing grid size.
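The upscaling itself reduces to integrating the process rate over the assumed subgrid PDF; for a Weibull PDF matched to the observed mean and standard deviation, the grid average of a power-law autoconversion rate has a closed form, as in this illustrative Python sketch (the liquid water statistics and the exponent are assumptions, not ARM retrievals).

import numpy as np
from scipy.special import gamma
from scipy.optimize import brentq

q_mean, q_std = 0.25, 0.15        # illustrative grid-mean and std of liquid water content (g m^-3)
p = 2.47                          # nonlinearity of a Khairoutdinov-Kogan-type rate, A ∝ q**p

# Match a Weibull(k, lam) to the observed mean and std via the coefficient of variation
def cv_weibull(k):
    return np.sqrt(gamma(1 + 2 / k) / gamma(1 + 1 / k) ** 2 - 1.0)

k = brentq(lambda kk: cv_weibull(kk) - q_std / q_mean, 0.1, 50.0)
lam = q_mean / gamma(1 + 1 / k)

# Grid average of q**p over the subgrid PDF (closed form) versus the mean-only estimate
q_p_avg = lam ** p * gamma(1 + p / k)
bias = q_mean ** p / q_p_avg - 1.0
print(k, lam, q_p_avg, f"mean-only relative bias: {bias:+.0%}")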
Failure rate analysis of Goddard Space Flight Center spacecraft performance during orbital life
NASA Technical Reports Server (NTRS)
Norris, H. P.; Timmins, A. R.
1976-01-01
Space life performance data on 57 Goddard Space Flight Center spacecraft are analyzed from the standpoint of determining an appropriate reliability model and the associated reliability parameters. Data from published NASA reports, which cover the space performance of GSFC spacecraft launched in the 1960-1970 decade, form the basis of the analyses. The results of the analyses show that the time distribution of 449 malfunctions, of which 248 were classified as failures (not necessarily catastrophic), follow a reliability growth pattern that can be described with either the Duane model or a Weibull distribution. The advantages of both mathematical models are used in order to: identify space failure rates, observe chronological trends, and compare failure rates with those experienced during the prelaunch environmental tests of the flight model spacecraft.
Improved silicon nitride for advanced heat engines
NASA Technical Reports Server (NTRS)
Yeh, H. C.; Wimmer, J. M.; Huang, H. H.; Rorabaugh, M. E.; Schienle, J.; Styhr, K. H.
1985-01-01
The AiResearch Casting Company baseline silicon nitride (92 percent GTE SN-502 Si3N4 plus 6 percent Y2O3 plus 2 percent Al2O3) was characterized with methods that included chemical analysis, oxygen content determination, electrophoresis, particle size distribution analysis, surface area determination, and analysis of the degree of agglomeration and maximum particle size of elutriated powder. Test bars were injection molded and processed through sintering at 0.68 MPa (100 psi) of nitrogen. The as-sintered test bars were evaluated by X-ray phase analysis, room and elevated temperature modulus of rupture strength, Weibull modulus, stress rupture, strength after oxidation, fracture origins, microstructure, and density from quantities of samples sufficiently large to generate statistically valid results. A series of small test matrices were conducted to study the effects and interactions of processing parameters which included raw materials, binder systems, binder removal cycles, injection molding temperatures, particle size distribution, sintering additives, and sintering cycle parameters.
Correction to the Dynamic Tensile Strength of Ice and Ice-Silicate Mixtures (Lange & Ahrens 1983)
NASA Astrophysics Data System (ADS)
Stewart, S. T.; Ahrens, T. J.
1999-03-01
We present a correction to the Weibull parameters for ice and ice-silicate mixtures (Lange & Ahrens 1983). These parameters relate the dynamic tensile strength to the strain rate. These data are useful for continuum fracture models of ice.
Failure probability of three designs of zirconia crowns
Ramos, G. Freitas; Monteiro, E. Barbosa Carmona; Bottino, M.A.; Zhang, Y.; de Melo, R. Marques
2015-01-01
Objectives: This study utilized a 2-parameter Weibull analysis for evaluation of the lifetime of fully or partially porcelain-/glaze-veneered zirconia crowns after fatigue testing. Methods: Sixty first molars were selected and prepared for full-coverage crowns with three different designs (n = 20): Traditional - crowns with a zirconia framework covered with feldspathic porcelain; Modified - crowns partially covered with veneering porcelain; and Monolithic - full-contour zirconia crowns. All specimens were treated with a glaze layer. Specimens were subjected to mechanical cycling (100 N, 3 Hz) with a piston with a hemispherical tip (Ø = 6 mm) until the specimens failed or up to 2×10^6 cycles. At intervals of 500,000 cycles, the fatigue tests were interrupted and stereomicroscopy (10X) was used to inspect the specimens for damage. We performed Weibull analysis of interval data to calculate the number of failures in each interval. Results: The types and numbers of failures according to the groups were: cracking (Traditional-13, Modified-6) and chipping (Traditional-4) of the feldspathic porcelain, followed by delamination (Traditional-1) at the veneer/core interface and debonding (Monolithic-2) at the cementation interface. Weibull parameters (beta, shape; and eta, scale), with a two-sided confidence interval of 95%, were: Traditional - 1.25 and 0.9×10^6 cycles; Modified - 0.58 and 11.7×10^6 cycles; and Monolithic - 1.05 and 16.5×10^6 cycles. Traditional crowns showed greater susceptibility to fatigue, the Modified group presented a higher propensity to early failures, and the Monolithic group showed no susceptibility to fatigue. The Modified and Monolithic groups presented the highest number of crowns with no failures after the fatigue test. Conclusions: The three crown designs presented significantly different behaviors under fatigue. The Modified and Monolithic groups presented a lower probability of failure after 2×10^6 cycles. PMID:26509988
Empirical Tests of Acceptance Sampling Plans
NASA Technical Reports Server (NTRS)
White, K. Preston, Jr.; Johnson, Kenneth L.
2012-01-01
Acceptance sampling is a quality control procedure applied as an alternative to 100% inspection. A random sample of items is drawn from a lot to determine the fraction of items which have a required quality characteristic. Both the number of items to be inspected and the criterion for determining conformance of the lot to the requirement are given by an appropriate sampling plan with specified risks of Type I and Type II sampling errors. In this paper, we present the results of empirical tests of the accuracy of selected sampling plans reported in the literature. These plans are for measurable quality characteristics which are known to have binomial, exponential, normal, gamma, Weibull, inverse Gaussian, or Poisson distributions. In the main, the results support the accepted wisdom that variables acceptance plans are superior to attributes (binomial) acceptance plans, in the sense that they provide comparable protection against risks at reduced sampling cost. For the Gaussian and Weibull plans, however, there are ranges of the shape parameters for which the required sample sizes are in fact larger than those of the corresponding attributes plans, dramatically so for instances of large skew. Tests further confirm that the published inverse-Gaussian (IG) plan is flawed, as reported by White and Johnson (2011).
Metocean design parameter estimation for fixed platform based on copula functions
NASA Astrophysics Data System (ADS)
Zhai, Jinjin; Yin, Qilin; Dong, Sheng
2017-08-01
Considering the dependence among wave height, wind speed, and current velocity, we construct novel trivariate joint probability distributions via Archimedean copula functions. Thirty years of wave height, wind speed, and current velocity data in the Bohai Sea are hindcast and sampled for the case study. Four distributions, namely the Gumbel, lognormal, Weibull, and Pearson Type III distributions, are candidate models for the marginal distributions of wave height, wind speed, and current velocity; the Pearson Type III distribution is selected as the optimal model. Bivariate and trivariate probability distributions of these environmental conditions are established based on four bivariate and trivariate Archimedean copulas, namely the Clayton, Frank, Gumbel-Hougaard, and Ali-Mikhail-Haq copulas. These joint probability models can maximize the marginal information and the dependence among the three variables. The design return values of the three variables can be obtained by three methods: univariate probability, conditional probability, and joint probability. The joint return periods of different load combinations are estimated by the proposed models. Platform responses (including base shear, overturning moment, and deck displacement) are further calculated. For the same return period, the design values of wave height, wind speed, and current velocity obtained by the conditional and joint probability models are much smaller than those obtained by univariate probability. Considering the dependence among variables, the multivariate probability distributions provide design parameters closer to the actual sea state for ocean platform design.
Motion of kinesin in a viscoelastic medium
NASA Astrophysics Data System (ADS)
Knoops, Gert; Vanderzande, Carlo
2018-05-01
Kinesin is a molecular motor that transports cargo along microtubules. The results of many in vitro experiments on kinesin-1 are described by kinetic models in which one transition corresponds to the forward motion and subsequent binding of the tethered motor head. We argue that in a viscoelastic medium like the cytosol of a cell this step is not Markov and has to be described by a nonexponential waiting time distribution. We introduce a semi-Markov kinetic model for kinesin that takes this effect into account. We calculate, for arbitrary waiting time distributions, the moment generating function of the number of steps made, and determine from this the average velocity and the diffusion constant of the motor. We illustrate our results for the case of a waiting time distribution that is Weibull. We find that for realistic parameter values, viscoelasticity decreases the velocity and the diffusion constant, but increases the randomness (or Fano factor).
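A bare-bones renewal simulation shows how a Weibull dwell-time distribution feeds into the velocity, diffusion constant and randomness; the step size, mean dwell time and shape parameter in this Python sketch are illustrative assumptions, not the paper's values.

import numpy as np
from scipy.special import gamma

rng = np.random.default_rng(6)
step_nm = 8.0                 # kinesin step size
k_shape = 0.6                 # Weibull shape < 1 mimics broad dwell times in a viscoelastic medium
mean_dwell = 0.010            # seconds
scale = mean_dwell / gamma(1 + 1 / k_shape)

def count_steps(t_max):
    # number of forward steps completed by time t_max for one motor
    t, steps = 0.0, 0
    while True:
        t += scale * rng.weibull(k_shape)
        if t > t_max:
            return steps
        steps += 1

t_max, n_runs = 10.0, 1000
steps = np.array([count_steps(t_max) for _ in range(n_runs)])
velocity = step_nm * steps.mean() / t_max              # nm/s
diffusion = step_nm ** 2 * steps.var() / (2 * t_max)   # nm^2/s
randomness = 2 * diffusion / (step_nm * velocity)      # dimensionless randomness parameter
print(velocity, diffusion, randomness)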
In silico study on the effects of matrix structure in controlled drug release
NASA Astrophysics Data System (ADS)
Villalobos, Rafael; Cordero, Salomón; Maria Vidales, Ana; Domínguez, Armando
2006-07-01
Purpose: To study the effects of drug concentration and spatial distribution of the medicament, in porous solid dosage forms, on the kinetics and total yield of drug release. Methods: Cubic networks are used as models of drug release systems. They were constructed by means of the dual site-bond model framework, which allows a substrate to have an adequate geometrical and topological distribution of its pore elements. Drug particles can move inside the networks by following a random walk model with excluded volume interactions between the particles. The time evolution of drug release for different drug concentrations and different initial drug spatial distributions has been monitored. Results: The numerical results show that in all the studied cases drug release presents an anomalous behavior, and the consequences of the matrix structural properties, i.e., drug spatial distribution and drug concentration, on the drug release profile have been quantified. Conclusions: The Weibull function provides a simple connection between the model parameters and the microstructure of the drug release device. A critical modeling of drug release from matrix-type delivery systems is important in order to understand the transport mechanisms that are implicated, and to predict the effect of the device design parameters on the release rate.
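The Weibull release law referred to in the conclusions is the stretched-exponential profile M(t)/M_inf = 1 - exp(-(t/tau)**b); fitting it to a release curve (invented here for illustration) takes a few lines of Python.

import numpy as np
from scipy.optimize import curve_fit

# Illustrative fractional-release data (time in Monte Carlo steps)
t = np.array([0, 50, 100, 200, 400, 800, 1600, 3200])
released = np.array([0.0, 0.18, 0.29, 0.44, 0.61, 0.77, 0.89, 0.96])

def weibull_release(t, tau, b):
    # b < 1 is typical of anomalous (sub-diffusive) release from disordered matrices
    return 1.0 - np.exp(-(t / tau) ** b)

(tau_hat, b_hat), _ = curve_fit(weibull_release, t, released, p0=(500.0, 0.7), bounds=(0, np.inf))
print(tau_hat, b_hat)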
Application of the weibull distribution function to the molecular weight distribution of cellulose
A. Broido; Hsiukang Yow
1977-01-01
The molecular weight distribution of a linear homologous polymer is usually obtained empirically for any particular sample. Sample-to-sample comparisons are made in terms of the weight- or number-average molecular weights and graphic displays of the distribution curves. Such treatment generally precludes data interpretations in which a distribution can be described in...
Relationship between Defect Size and Fatigue Life Distributions in Al-7 Pct Si-Mg Alloy Castings
NASA Astrophysics Data System (ADS)
Tiryakioğlu, Murat
2009-07-01
A new method for predicting the variability in fatigue life of castings was developed by combining the size distribution for the fatigue-initiating defects and a fatigue life model based on the Paris-Erdoğan law for crack propagation. Two datasets for the fatigue-initiating defects in Al-7 pct Si-Mg alloy castings, reported previously in the literature, were used to demonstrate that (1) the size of fatigue-initiating defects follow the Gumbel distribution; (2) the crack propagation model developed previously provides respectable fits to experimental data; and (3) the method developed in the present study expresses the variability in both datasets, almost as well as the lognormal distribution and better than the Weibull distribution.
Modeling the survival of Salmonella spp. in chorizos.
Hajmeer, M; Basheer, I; Hew, C; Cliver, D O
2006-03-01
The survival of Salmonella spp. in chorizos has been studied under the effect of storage conditions; namely temperature (T=6, 25, 30 degrees C), air inflow velocity (F=0, 28.4 m/min), and initial water activity (a(w0)=0.85, 0.90, 0.93, 0.95, 0.97). The pH was held at 5.0. A total of 20 survival curves were experimentally obtained at various combinations of operating conditions. The chorizos were stored under four conditions: in the refrigerator (Ref: T=6 degrees C, F=0 m/min), at room temperature (RT: T=25 degrees C, F=0 m/min), in the hood (Hd: T=25 degrees C, F=28.4 m/min), and in the incubator (Inc: T=30 degrees C, F=0 m/min). Semi-logarithmic plots of counts vs. time revealed nonlinear trends for all the survival curves, indicating that the first-order kinetics model (exponential distribution function) was not suitable. The Weibull cumulative distribution function, for which the exponential function is only a special case, was selected and used to model the survival curves. The Weibull model was fitted to the 20 curves and the model parameters (alpha and beta) were determined. The fitted survival curves agreed with the experimental data with R(2)=0.951, 0.969, 0.908, and 0.871 for the Ref, RT, Hd, and Inc curves, respectively. Regression models relating alpha and beta to T, F, and a(w0) resulted in R(2) values of 0.975 for alpha and 0.988 for beta. The alpha and beta models can be used to generate a survival curve for Salmonella in chorizos for a given set of operating conditions. Additionally, alpha and beta can be used to determine the times needed to reduce the count by 1 or 2 logs t(1D) and t(2D). It is concluded that the Weibull cumulative distribution function offers a powerful model for describing microbial survival data. A comparison with the pathogen modeling program (PMP) revealed that the survival kinetics of Salmonella spp. in chorizos could not be adequately predicted using PMP which underestimated the t(1D) and t(2D). The mean of the Weibull probability density function correlated strongly with t(1D) and t(2D), and can serve as an alternative to the D-values normally used with first-order kinetic models. Parametric studies were conducted and sensitivity of survival to operating conditions was evaluated and discussed in the paper. The models derived herein provide a means for the development of a reliable risk assessment system for controlling Salmonella spp. in chorizos.
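Once alpha and beta are known for a storage condition, the reduction times and the distribution mean follow directly; the Python sketch below writes the model in the Mafart-style form log10(N/N0) = -(t/alpha)**beta (the paper's exact parameterization may differ), and the parameter values are illustrative only.

import numpy as np
from scipy.special import gamma

alpha, beta = 12.0, 0.8     # illustrative Weibull parameters (alpha in days)

def t_dD(d):
    # time to a d-log10 reduction under log10(N/N0) = -(t/alpha)**beta
    return alpha * d ** (1.0 / beta)

mean_of_pdf = alpha * gamma(1.0 + 1.0 / beta)   # mean of the Weibull density, an alternative to D-values
print(t_dD(1), t_dD(2), mean_of_pdf)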
Polynomial probability distribution estimation using the method of moments.
Munkhammar, Joakim; Mattsson, Lars; Rydén, Jesper
2017-01-01
We suggest a procedure for estimating Nth degree polynomial approximations to unknown (or known) probability density functions (PDFs) based on N statistical moments from each distribution. The procedure is based on the method of moments and is set up algorithmically to aid applicability and to ensure rigor in use. In order to show applicability, polynomial PDF approximations are obtained for the distribution families Normal, Log-Normal, Weibull as well as for a bimodal Weibull distribution and a data set of anonymized household electricity use. The results are compared with results for traditional PDF series expansion methods of Gram-Charlier type. It is concluded that this is a comparatively simple procedure that could be used when traditional distribution families are not applicable or when polynomial expansions of probability distributions might be considered useful approximations. In particular this approach is practical for calculating convolutions of distributions, since such operations become integrals of polynomial expressions. Finally, in order to show an advanced applicability of the method, it is shown to be useful for approximating solutions to the Smoluchowski equation.
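A bare-bones sketch in the spirit of the abstract: the coefficients of an Nth degree polynomial supported on [a, b] are chosen so that its first N+1 raw moments match the sample moments. The support, degree, and test distribution are assumptions, and the published procedure's algorithmic safeguards (for example against negative densities) are not reproduced.

# Sketch: method-of-moments polynomial PDF approximation on a finite support [a, b].
import numpy as np

def polynomial_pdf_moments(samples, degree, a, b):
    mu = np.array([np.mean(samples ** m) for m in range(degree + 1)])
    mu[0] = 1.0                                   # normalisation constraint
    # A[m, k] = integral_a^b x**(m + k) dx, so that A @ c reproduces the moments of p.
    A = np.array([[(b**(m + k + 1) - a**(m + k + 1)) / (m + k + 1)
                   for k in range(degree + 1)] for m in range(degree + 1)])
    return np.linalg.solve(A, mu)                 # polynomial coefficients c_0 .. c_N

rng = np.random.default_rng(0)
x = rng.weibull(2.0, size=50_000) * 3.0           # Weibull(shape 2, scale 3) test sample
c = polynomial_pdf_moments(x, degree=6, a=0.0, b=float(x.max()))

grid = np.linspace(0.0, x.max(), 200)
p = np.polyval(c[::-1], grid)                     # evaluate sum_k c_k * grid**k
dx = grid[1] - grid[0]
print("approx. area under fitted polynomial:", float(np.sum(p) * dx))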
Pal, Suvra; Balakrishnan, Narayanaswamy
2018-05-01
In this paper, we develop likelihood inference based on the expectation maximization algorithm for the Box-Cox transformation cure rate model, assuming the lifetimes to follow a Weibull distribution. A simulation study is carried out to demonstrate the performance of the proposed estimation method. Through Monte Carlo simulations, we also study the effect of model misspecification on the estimate of the cure rate. Finally, we analyze a well-known dataset on melanoma with the model and the inferential method developed here.
Subcritical crack growth in SiNx thin-film barriers studied by electro-mechanical two-point bending
NASA Astrophysics Data System (ADS)
Guan, Qingling; Laven, Jozua; Bouten, Piet C. P.; de With, Gijsbertus
2013-06-01
Mechanical failure resulting from subcritical crack growth in the SiNx inorganic barrier layer applied on a flexible multilayer structure was studied by an electro-mechanical two-point bending method. A 10 nm conducting tin-doped indium oxide layer was sputtered as an electrical probe to monitor the subcritical crack growth in the 150 nm dielectric SiNx layer carried by a polyethylene naphthalate substrate. In the electro-mechanical two-point bending test, dynamic and static loads were applied to investigate the crack propagation in the barrier layer. As a consequence of using two loading modes, the characteristic failure strain and failure time could be determined. The failure probability distribution of strain and lifetime under each loading condition was described by Weibull statistics. In this study, results from the tests in dynamic and static loading modes were linked by a power law description to determine the critical failure over a range of conditions. The fatigue parameter n from the power law reduces greatly from 70 to 31 upon correcting for internal strain. The testing method and analysis tool as described in the paper can be used to understand the limit of thin-film barriers in terms of their mechanical properties.
NASA Astrophysics Data System (ADS)
Premono, B. S.; Tjahjana, D. D. D. P.; Hadi, S.
2017-01-01
The aims of this paper are to investigate the characteristics of the wind speed and the wind energy potential in the northern coastal region of Semarang, Central Java, Indonesia. The wind data were obtained from the Meteorological Station of Semarang as ten-minute average time series for a one-year period at a height of 10 m. The Weibull distribution has been used to determine the wind power density and wind energy density of the site. It was shown that the values of the two parameters, shape parameter k and scale parameter c, were 3.37 and 5.61 m/s, respectively. The annual mean wind speed and the wind speed carrying the maximum energy were 5.32 m/s and 6.45 m/s, respectively. Further, the annual energy density at the site was found to be 103.87 W/m2, and based on the Pacific Northwest Laboratory (PNL) wind power classification, at the height of 10 m, this value is classified into class 2. A commercial wind turbine was chosen to simulate the wind energy potential of the site; the POLARIS P25-100 is the most suitable for the site, with a capacity factor of 29.79% and an annual energy production of 261 MWh/year.
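For reference, the standard Weibull wind-statistics relations that underlie results like those quoted above can be evaluated directly from the reported shape and scale. The air density is an assumption, and the annual mean speed in the abstract may come from the raw data rather than the fitted Weibull, so exact agreement is not expected.

# Sketch: standard Weibull wind relations evaluated with the reported k and c.
import math

k, c = 3.37, 5.61          # Weibull shape (-) and scale (m/s) at 10 m, from the abstract
rho = 1.225                # air density, kg/m^3 (assumed)

v_mean = c * math.gamma(1.0 + 1.0 / k)                          # mean wind speed
v_max_energy = c * (1.0 + 2.0 / k) ** (1.0 / k)                 # speed carrying maximum energy
power_density = 0.5 * rho * c**3 * math.gamma(1.0 + 3.0 / k)    # mean power density, W/m^2

print(f"mean speed          {v_mean:.2f} m/s")
print(f"max-energy speed    {v_max_energy:.2f} m/s")
print(f"mean power density  {power_density:.1f} W/m^2")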
Li, Longbiao
2016-01-01
In this paper, the fatigue life of fiber-reinforced ceramic-matrix composites (CMCs) with different fiber preforms, i.e., unidirectional, cross-ply, 2D (two dimensional), 2.5D and 3D CMCs at room and elevated temperatures in air and oxidative environments, has been predicted using the micromechanics approach. An effective coefficient of the fiber volume fraction along the loading direction (ECFL) was introduced to describe the fiber architecture of preforms. The statistical matrix multicracking model and fracture mechanics interface debonding criterion were used to determine the matrix crack spacing and interface debonded length. Under cyclic fatigue loading, the fiber broken fraction was determined by combining the interface wear model and fiber statistical failure model at room temperature, and the interface/fiber oxidation model, interface wear model and fiber statistical failure model at elevated temperatures, based on the assumption that the fiber strength follows a two-parameter Weibull distribution and the load carried by broken and intact fibers satisfies the Global Load Sharing (GLS) criterion. When the broken fiber fraction approaches the critical value, the composite fractures in fatigue. PMID:28773332
DOE Office of Scientific and Technical Information (OSTI.GOV)
Singh, Gyanender P.; Gonczy, Steve T.; Deck, Christian P.
An interlaboratory round robin study was conducted on the tensile strength of SiC–SiC ceramic matrix composite (CMC) tubular test specimens at room temperature with the objective of expanding the database of mechanical properties of nuclear grade SiC–SiC and establishing the precision and bias statement for standard test method ASTM C1773. The mechanical properties statistics from the round robin study and the precision statistics and precision statement are presented herein. The data show reasonable consistency across the laboratories, indicating that the current C1773–13 ASTM standard is adequate for testing ceramic fiber reinforced ceramic matrix composite tubular test specimens. Furthermore, it was found that the distribution of ultimate tensile strength data was best described with a two-parameter Weibull distribution, while a lognormal distribution provided a good description of the distribution of proportional limit stress data.
Landy, Rebecca; Cheung, Li C; Schiffman, Mark; Gage, Julia C; Hyun, Noorie; Wentzensen, Nicolas; Kinney, Walter K; Castle, Philip E; Fetterman, Barbara; Poitras, Nancy E; Lorey, Thomas; Sasieni, Peter D; Katki, Hormuzd A
2018-06-01
Electronic health-records (EHR) are increasingly used by epidemiologists studying disease following surveillance testing to provide evidence for screening intervals and referral guidelines. Although cost-effective, undiagnosed prevalent disease and interval censoring (in which asymptomatic disease is only observed at the time of testing) raise substantial analytic issues when estimating risk that cannot be addressed using Kaplan-Meier methods. Based on our experience analysing EHR from cervical cancer screening, we previously proposed the logistic-Weibull model to address these issues. Here we demonstrate how the choice of statistical method can impact risk estimates. We use observed data on 41,067 women in the cervical cancer screening program at Kaiser Permanente Northern California, 2003-2013, as well as simulations to evaluate the ability of different methods (Kaplan-Meier, Turnbull, Weibull and logistic-Weibull) to accurately estimate risk within a screening program. Cumulative risk estimates from the statistical methods varied considerably, with the largest differences occurring for prevalent disease risk when baseline disease ascertainment was random but incomplete. Kaplan-Meier underestimated risk at earlier times and overestimated risk at later times in the presence of interval censoring or undiagnosed prevalent disease. Turnbull performed well, though was inefficient and not smooth. The logistic-Weibull model performed well, except when event times didn't follow a Weibull distribution. We have demonstrated that methods for right-censored data, such as Kaplan-Meier, result in biased estimates of disease risks when applied to interval-censored data, such as screening programs using EHR data. The logistic-Weibull model is attractive, but the model fit must be checked against Turnbull non-parametric risk estimates. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.
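As a minimal illustration of the interval-censoring issue raised above, the sketch below fits a Weibull time-to-event distribution to simulated screening data by maximizing an interval-censored likelihood. It is not the authors' logistic-Weibull model (which additionally handles undiagnosed prevalent disease), and the screening interval and true parameters are assumptions.

# Sketch: Weibull MLE for interval-censored screening data (simulated).
import numpy as np
from scipy.optimize import minimize
from scipy.stats import weibull_min

rng = np.random.default_rng(2)

# Simulate true event times and observe them only between successive screens.
true_t = weibull_min.rvs(c=1.5, scale=8.0, size=2000, random_state=rng)
visits = np.arange(0.0, 15.1, 3.0)                       # a screen every 3 years (assumed)
left = np.array([visits[visits < t].max() for t in true_t])
right = np.array([visits[visits >= t].min() if (visits >= t).any() else np.inf
                  for t in true_t])                      # inf marks right-censoring

def neg_loglik(params):
    shape, scale = np.exp(params)                        # log-parameterisation keeps both positive
    F = lambda x: weibull_min.cdf(x, c=shape, scale=scale)
    prob = np.where(np.isinf(right), 1.0 - F(left), F(right) - F(left))
    return -np.sum(np.log(np.clip(prob, 1e-300, None)))

res = minimize(neg_loglik, x0=np.log([1.0, 5.0]), method="Nelder-Mead")
print("fitted shape, scale:", np.exp(res.x))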
Size Effect on Specific Energy Distribution in Particle Comminution
NASA Astrophysics Data System (ADS)
Xu, Yongfu; Wang, Yidong
A theoretical study is made to derive an energy distribution equation for the size reduction process from the fractal model for particle comminution. The fractal model is employed as a valid measure of the self-similar size distribution of comminution daughter products. The tensile strength of particles varies with particle size in the manner of a power function law. The energy consumption for comminuting a single particle is found to be proportional to the 5(D-3)/3 power of the particle size, D being the fractal dimension of the comminution daughter products. Weibull statistics are applied to describe the relationship between the breakage probability and the specific energy of particle comminution. A simple equation is derived for the breakage probability of particles in view of the dependence of fracture energy on particle size. The calculated exponents and Weibull coefficients are generally in conformity with published data for fracture of particles.
Paes, P N G; Bastian, F L; Jardim, P M
2017-09-01
This study considered the efficacy of selective glass infiltration etching (SIE) treatment as a procedure to modify the zirconia surface, resulting in higher interfacial fracture toughness. Y-TZP was subjected to 5 different surface treatment conditions: no treatment (G1), SIE followed by hydrofluoric acid treatment (G2), heat treatment at 750°C (G3), hydrofluoric acid treatment (G4), and airborne-particle abrasion with alumina particles (G5). The effect of surface treatment on roughness was evaluated by atomic force microscopy, providing three different parameters: Ra, Rsk, and surface area variation. The ceramic/resin cement interface was analyzed by a fracture mechanics KI test, with the failure mode determined by fractographic analysis. Weibull analysis was also performed to evaluate the structural integrity of the adhesion zone. G2 and G4 specimens showed very similar and high Ra values but different surface area variation (33% for G2 and 13% for G4), and they presented the highest fracture toughness (KIC). Weibull analysis showed a tendency for G2 (SIE) to exhibit higher KIC values than the other groups, but with more data scatter and a higher early-failure probability than the G4 specimens. Selective glass infiltration etching surface treatment was effective in modifying the zirconia surface roughness, increasing the bonding area and hence the mechanical imbrication at the zirconia/resin cement interface, resulting in higher fracture toughness (KIC) values, with higher KIC values obtained when a failure probability above 20% was expected (Weibull distribution) among all the experimental groups. Copyright © 2017 The Academy of Dental Materials. Published by Elsevier Ltd. All rights reserved.
Koseki, Shigenobu; Nakamura, Nobutaka; Shiina, Takeo
2015-01-01
Bacterial pathogens such as Listeria monocytogenes, Escherichia coli O157:H7, Salmonella enterica, and Cronobacter sakazakii have demonstrated long-term survival in/on dry or low-water activity (aw) foods. However, there have been few comparative studies on the desiccation tolerance of these bacterial pathogens in the same food matrix. In the present study, the survival kinetics of the four bacterial pathogens, separately inoculated onto powdered infant formula as a model low-aw food, were compared during storage at 5, 22, and 35°C. No significant differences in the survival kinetics between E. coli O157:H7 and L. monocytogenes were observed. Salmonella showed significantly higher desiccation tolerance than these pathogens, and C. sakazakii demonstrated significantly higher desiccation tolerance than all three other bacteria studied. Thus, the desiccation tolerance was ranked as C. sakazakii > Salmonella > E. coli O157:H7 = L. monocytogenes. The survival kinetics of each bacterium was mathematically analyzed, and the observed kinetics was successfully described using the Weibull model. To evaluate the variability of the inactivation kinetics of the tested bacterial pathogens, a Monte Carlo simulation was performed using assumed probability distributions of the estimated fitted parameters. The simulation results showed that the storage temperature significantly influenced the survival of each bacterium in the dry environment, with bacterial inactivation becoming faster with increasing storage temperature. Furthermore, the fitted rate and shape parameters of the Weibull model were successfully modelled as a function of temperature. The numerical simulation of bacterial inactivation was then realized using these parameter functions under arbitrarily fluctuating temperature conditions.
We compared two regression models, which are based on the Weibull and probit functions, for the analysis of pesticide toxicity data from laboratory studies on Illinois crop and native plant species. Both mathematical models are continuous, differentiable, strictly positive, and...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Waters, A M
2001-05-01
In an effort to increase automobile fuel efficiency and decrease the output of harmful greenhouse gases, the automotive industry has recently shown increased interest in cast light metals, such as magnesium alloys, for weight savings. Currently several magnesium alloys such as AZ91 and AM60B are being used in structural applications for automobiles. However, these magnesium alloys are not as well characterized as other commonly used structural metals such as aluminum. This dissertation presents a methodology to nondestructively quantify damage accumulation due to void behavior in three dimensions in die-cast magnesium AM60B tensile bars as a function of mechanical load. Computed tomography data were acquired after tensile bars were loaded up to and including failure, and analyzed to characterize void behavior as it relates to damage accumulation. Signal and image processing techniques were used along with a cluster labeling routine to nondestructively quantify damage parameters in three dimensions. Void analyses were performed, including void volume distribution characterization, nearest neighbor distance calculations, shape parameters, and volumetric renderings of voids in the alloy. The processed CT data were used to generate input files for finite element simulations, both two- and three-dimensional. The void analyses revealed that the overwhelming source of failure in each tensile bar was a ring of porosity within each bar, possibly due to a solidification front inherent to the casting process. The measured damage parameters related to void nucleation, growth, and coalescence were shown to contribute significantly to total damage accumulation. Void volume distributions were characterized using a Weibull function, and the spatial distributions of voids were shown to be clustered. Two-dimensional finite element analyses of the tensile bars were used to fine-tune material damage models, and a three-dimensional mesh of an extracted portion of one tensile bar, including voids, was generated from CT data and used as input to a finite element analysis.
Generalized extreme gust wind speeds distributions
Cheng, E.; Yeung, C.
2002-01-01
Since summer 1996, US wind engineers have used the extreme gust (or 3-s gust) as the basic wind speed to quantify the destruction of extreme winds. In order to better understand these destructive wind forces, it is important to know the appropriate representations of these extreme gust wind speeds. Therefore, the purpose of this study is to determine the most suitable extreme value distributions for the annual extreme gust wind speeds recorded in large selected areas. To achieve this objective, we use the generalized Pareto distribution as the diagnostic tool for determining the types of extreme gust wind speed distributions. The three-parameter generalized extreme value distribution function is thus reduced to either the Type I Gumbel, Type II Frechet or Type III reverse Weibull distribution function for the annual extreme gust wind speeds recorded at a specific site. With consideration of the quality and homogeneity of gust wind data collected at more than 750 weather stations throughout the United States, annual extreme gust wind speeds at 143 selected stations in the contiguous United States were used in the study. © 2002 Elsevier Science Ltd. All rights reserved.
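A short sketch of the site-by-site classification step described above: fit the three-parameter GEV to annual extreme gusts and read the tail type from the sign of the shape parameter. The synthetic wind speeds and the 0.05 tolerance for "approximately Gumbel" are assumptions; note that SciPy's sign convention for the GEV shape is opposite to the usual xi convention.

# Sketch: GEV fit to annual extreme gusts and tail-type classification (synthetic data).
import numpy as np
from scipy.stats import genextreme, gumbel_r

rng = np.random.default_rng(5)
annual_gusts = gumbel_r.rvs(loc=30.0, scale=4.0, size=60, random_state=rng)  # m/s

c, loc, scale = genextreme.fit(annual_gusts)
# SciPy convention: c > 0 is the bounded reverse Weibull (Type III) case,
# c = 0 the Gumbel (Type I) case, and c < 0 the heavy-tailed Frechet (Type II) case.
if abs(c) < 0.05:
    family = "approximately Gumbel (Type I)"
elif c < 0:
    family = "Frechet (Type II)"
else:
    family = "reverse Weibull (Type III)"

print(f"shape={c:.3f}, loc={loc:.1f}, scale={scale:.2f} -> {family}")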
A fuzzy set approach for reliability calculation of valve controlling electric actuators
NASA Astrophysics Data System (ADS)
Karmachev, D. P.; Yefremov, A. A.; Luneva, E. E.
2017-02-01
The oil and gas equipment and electric actuators in particular frequently perform in various operational modes and under dynamic environmental conditions. These factors affect equipment reliability measures in a vague, uncertain way. To eliminate the ambiguity, reliability model parameters could be defined as fuzzy numbers. We suggest a technique that allows constructing fundamental fuzzy-valued performance reliability measures based on an analysis of electric actuators failure data in accordance with the amount of work, completed before the failure, instead of failure time. Also, this paper provides a computation example of fuzzy-valued reliability and hazard rate functions, assuming Kumaraswamy complementary Weibull geometric distribution as a lifetime (reliability) model for electric actuators.
Recurrence and interoccurrence behavior of self-organized complex phenomena
NASA Astrophysics Data System (ADS)
Abaimov, S. G.; Turcotte, D. L.; Shcherbakov, R.; Rundle, J. B.
2007-08-01
The sandpile, forest-fire and slider-block models are said to exhibit self-organized criticality. Associated natural phenomena include landslides, wildfires, and earthquakes. In all cases the frequency-size distributions are well approximated by power laws (fractals). Another important aspect of both the models and natural phenomena is the statistics of interval times. These statistics are particularly important for earthquakes. For earthquakes it is important to make a distinction between interoccurrence and recurrence times. Interoccurrence times are the interval times between earthquakes on all faults in a region whereas recurrence times are interval times between earthquakes on a single fault or fault segment. In many, but not all cases, interoccurrence time statistics are exponential (Poissonian) and the events occur randomly. However, the distribution of recurrence times are often Weibull to a good approximation. In this paper we study the interval statistics of slip events using a slider-block model. The behavior of this model is sensitive to the stiffness α of the system, α=kC/kL where kC is the spring constant of the connector springs and kL is the spring constant of the loader plate springs. For a soft system (small α) there are no system-wide events and interoccurrence time statistics of the larger events are Poissonian. For a stiff system (large α), system-wide events dominate the energy dissipation and the statistics of the recurrence times between these system-wide events satisfy the Weibull distribution to a good approximation. We argue that this applicability of the Weibull distribution is due to the power-law (scale invariant) behavior of the hazard function, i.e. the probability that the next event will occur at a time t0 after the last event has a power-law dependence on t0. The Weibull distribution is the only distribution that has a scale invariant hazard function. We further show that the onset of system-wide events is a well defined critical point. We find that the number of system-wide events NSWE satisfies the scaling relation NSWE ∝(α-αC)δ where αC is the critical value of the stiffness. The system-wide events represent a new phase for the slider-block system.
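For reference, the scale-invariance argument can be stated compactly (generic notation, not the authors'): for a Weibull distribution with shape β and scale τ, the hazard function is

h(t) = f(t) / [1 - F(t)] = (β/τ) (t/τ)^(β-1),

a pure power law in t. β = 1 recovers the constant hazard of the exponential (Poissonian) case, while β > 1 gives the increasing hazard associated with quasi-periodic recurrence of system-wide events.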
Lifetime Reliability Prediction of Ceramic Structures Under Transient Thermomechanical Loads
NASA Technical Reports Server (NTRS)
Nemeth, Noel N.; Jadaan, Osama J.; Gyekenyesi, John P.
2005-01-01
An analytical methodology is developed to predict the probability of survival (reliability) of ceramic components subjected to harsh thermomechanical loads that can vary with time (transient reliability analysis). This capability enables more accurate prediction of ceramic component integrity against fracture in situations such as turbine startup and shutdown, operational vibrations, atmospheric reentry, or other rapid heating or cooling situations (thermal shock). The transient reliability analysis methodology developed herein incorporates the following features: fast-fracture transient analysis (reliability analysis without slow crack growth, SCG); transient analysis with SCG (reliability analysis with time-dependent damage due to SCG); a computationally efficient algorithm to compute the reliability for components subjected to repeated transient loading (block loading); cyclic fatigue modeling using a combined SCG and Walker fatigue law; proof testing for transient loads; and Weibull and fatigue parameters that are allowed to vary with temperature or time. Component-to-component variation in strength (stochastic strength response) is accounted for with the Weibull distribution, and either the principle of independent action or the Batdorf theory is used to predict the effect of multiaxial stresses on reliability. The reliability analysis can be performed either as a function of the component surface (for surface-distributed flaws) or component volume (for volume-distributed flaws). The transient reliability analysis capability has been added to the NASA CARES/ Life (Ceramic Analysis and Reliability Evaluation of Structures/Life) code. CARES/Life was also updated to interface with commercially available finite element analysis software, such as ANSYS, when used to model the effects of transient load histories. Examples are provided to demonstrate the features of the methodology as implemented in the CARES/Life program.
NASA Astrophysics Data System (ADS)
Okabe, Shigemitsu; Tsuboi, Toshihiro; Takami, Jun
The power-frequency withstand voltage tests on electric power equipment are regulated in JEC by evaluating the lifetime reliability with a Weibull distribution function. The evaluation method is still controversial in terms of the consideration of a plural number of faults, and some alternative methods have been proposed on this subject. The present paper first discusses the physical meanings of the various kinds of evaluation methods and secondly examines their effects on the power-frequency withstand voltage tests. Further, an appropriate method is investigated for an oil-filled transformer and a gas insulated switchgear, taking notice of the dielectric breakdown or partial discharge mechanism under various insulating material and structure conditions; the tentative conclusion is that the conventional method would be most pertinent under the present conditions.
Application of a Probalistic Sizing Methodology for Ceramic Structures
NASA Astrophysics Data System (ADS)
Rancurel, Michael; Behar-Lafenetre, Stephanie; Cornillon, Laurence; Leroy, Francois-Henri; Coe, Graham; Laine, Benoit
2012-07-01
Ceramics are increasingly used in the space industry to take advantage of their stability and high specific stiffness. Their brittle behaviour often leads to sizing them by increasing the safety factors applied to the maximum stresses, which results in oversized structures. This is inconsistent with the major driver in space architecture, the mass criterion. This paper presents a methodology to size ceramic structures based on their failure probability. From failure tests on samples, the Weibull law which characterizes the strength distribution of the material is obtained. The A-value (Q0.0195%) and B-value (Q0.195%) are then assessed to take into account the limited number of samples. A knocked-down Weibull law that interpolates the A- & B-values is also obtained. From these two laws, a most-likely and a knocked-down prediction of failure probability are computed for complex ceramic structures. The application of this methodology and its validation by test are reported in the paper.
Statistical analysis of lithium iron sulfide status cell cycle life and failure mode
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gay, E.C.; Battles, J.E.; Miller, W.E.
1983-08-01
A statistical model was developed for electrochemical cell life-cycle testing and verified experimentally. The Weibull distribution was selected to predict the end of life for a cell, based on a 20 percent loss of initial stabilized capacity or a decrease to less than 95 percent coulombic efficiency. Groups of 12 or more Li-alloy/FeS cells were cycled to determine the mean time to failure (MTTF) and also to identify the failure modes. The cells were all full-size electric vehicle batteries with 150-350 A-hr capacity. The Weibull shape factors were determined and verified by predicting the number of cell failures in two 10-cell modules. The short-circuit failures in the cells with BN-felt and MgO powder separators were found to be caused by the formation of Li-Al protrusions that penetrated the BN-felt separators and by the extrusion of active material at the edge of the electrodes.
Improved silicon carbide for advanced heat engines
NASA Technical Reports Server (NTRS)
Whalen, Thomas J.
1989-01-01
The development of high strength, high reliability silicon carbide parts with complex shapes suitable for use in advanced heat engines is studied. Injection molding was the forming method selected for the program because it is capable of forming complex parts adaptable for mass production on an economically sound basis. The goals were to reach a Weibull characteristic strength of 550 MPa (80 ksi) and a Weibull modulus of 16 for bars tested in four-point loading. Statistically designed experiments were performed throughout the program and a fluid mixing process employing an attritor mixer was developed. Compositional improvements in the amounts and sources of boron and carbon used and a pressureless sintering cycle were developed which provided samples of about 99 percent of theoretical density. Strengths were found to improve significantly by annealing in air. Strengths in excess of 550 MPa (80 ksi) with Weibull modulus of about 9 were obtained. Further improvements in Weibull modulus to about 16 were realized by proof testing. This is an increase of 86 percent in strength and 100 percent in Weibull modulus over the baseline data generated at the beginning of the program. Molding yields were improved and flaw distributions were observed to follow a Poisson process. Magic angle spinning nuclear magnetic resonance spectra were found to be useful in characterizing the SiC powder and the sintered samples. Turbocharger rotors were molded and examined as an indication of the moldability of the mixes which were developed in this program.
NASA Astrophysics Data System (ADS)
Costa, A.; Pioli, L.; Bonadonna, C.
2017-05-01
The authors found a mistake in the formulation of the distribution named the Bi-Weibull distribution, reported in equation (A.2) of Appendix A. The error affects equation (4) (which is the same as eq. (A.2)) and Table 4 in the original manuscript.
NASA Astrophysics Data System (ADS)
Sembiring, N.; Ginting, E.; Darnello, T.
2017-12-01
In a company that produces refined sugar, the production floor has not reached the required level of critical machine availability because the machines often suffer damage (breakdowns). This results in sudden losses of production time and production opportunities. The problem can be addressed with reliability engineering methods, in which a statistical analysis of historical failure data is performed to identify the pattern of the distribution. The method can provide the reliability, failure rate, and availability level of a machine over the scheduled maintenance interval. Distribution tests on the time-between-failures (MTTF) data show that the flexible hose component follows a lognormal distribution, while the teflon cone lifting component follows a Weibull distribution. For the mean time to repair (MTTR), the flexible hose component follows an exponential distribution and the teflon cone lifting component a Weibull distribution. With a replacement schedule of every 720 hours, the flexible hose component has a reliability of 0.2451 and an availability of 0.9960, while with a replacement schedule of every 1944 hours, the critical teflon cone lifting component has a reliability of 0.4083 and an availability of 0.9927.
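For context, reliability and availability figures of this kind are typically computed as in the sketch below; the Weibull parameters and repair time used here are hypothetical placeholders, not values estimated in the study.

# Sketch: reliability at a replacement interval and steady-state availability.
import math

def weibull_reliability(t, beta, eta):
    """R(t) = exp(-(t/eta)**beta) for a Weibull time-to-failure distribution."""
    return math.exp(-((t / eta) ** beta))

beta, eta = 1.4, 2500.0        # hypothetical Weibull shape and scale (hours)
interval = 1944.0              # replacement interval taken from the study (hours)

mttf = eta * math.gamma(1.0 + 1.0 / beta)   # mean time to failure for the Weibull model
mttr = 14.0                                  # hypothetical mean time to repair (hours)

print("reliability at interval:", round(weibull_reliability(interval, beta, eta), 4))
print("inherent availability  :", round(mttf / (mttf + mttr), 4))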
Availability Estimation for Facilities in Extreme Geographical Locations
NASA Technical Reports Server (NTRS)
Fischer, Gerd M.; Omotoso, Oluseun; Chen, Guangming; Evans, John W.
2012-01-01
A value-added analysis of the Reliability, Availability and Maintainability of McMurdo Ground Station was developed, which will be a useful tool for system managers in sparing, maintenance planning and determining vital performance metrics needed for readiness assessment of the upgrades to the McMurdo System. Output of this study can also be used as inputs and recommendations for the application of Reliability Centered Maintenance (RCM) for the system. ReliaSoft's BlockSim, a commercial reliability analysis software package, has been used to model the availability of the system upgrade to the National Aeronautics and Space Administration (NASA) Near Earth Network (NEN) Ground Station at McMurdo Station in Antarctica. The logistics challenges due to the closure of access to McMurdo Station during the Antarctic winter were modeled using a weighted composite of four Weibull distributions, one of the possible choices for statistical distributions available in the software program and one usually used to account for failure rates of components supplied by different manufacturers. The inaccessibility of the antenna site on a hill outside McMurdo Station throughout the year due to severe weather was modeled with a Weibull distribution for the repair crew availability. This Weibull distribution is based on an analysis of the available weather data for the antenna site for 2007 in combination with the rules for travel restrictions due to severe weather imposed by the administering agency, the National Science Foundation (NSF). The simulations resulted in an upper bound for the system availability and allowed for identification of components that would improve availability based on a higher on-site spare count than initially planned.
Wang, Ping; Liu, Xiaoxia; Cao, Tian; Fu, Huihua; Wang, Ranran; Guo, Lixin
2016-09-20
The impact of nonzero boresight pointing errors on the system performance of decode-and-forward protocol-based multihop parallel optical wireless communication systems is studied. For the aggregated fading channel, the atmospheric turbulence is simulated by an exponentiated Weibull model, and pointing errors are described by one recently proposed statistical model including both boresight and jitter. The binary phase-shift keying subcarrier intensity modulation-based analytical average bit error rate (ABER) and outage probability expressions are achieved for a nonidentically and independently distributed system. The ABER and outage probability are then analyzed with different turbulence strengths, receiving aperture sizes, structure parameters (P and Q), jitter variances, and boresight displacements. The results show that aperture averaging offers almost the same system performance improvement with boresight included or not, despite the values of P and Q. The performance enhancement owing to the increase of cooperative path (P) is more evident with nonzero boresight than that with zero boresight (jitter only), whereas the performance deterioration because of the increasing hops (Q) with nonzero boresight is almost the same as that with zero boresight. Monte Carlo simulation is offered to verify the validity of ABER and outage probability expressions.
Asquith, William H.; Kiang, Julie E.; Cohn, Timothy A.
2017-07-17
The U.S. Geological Survey (USGS), in cooperation with the U.S. Nuclear Regulatory Commission, has investigated statistical methods for probabilistic flood hazard assessment to provide guidance on very low annual exceedance probability (AEP) estimation of peak-streamflow frequency and the quantification of corresponding uncertainties using streamgage-specific data. The term “very low AEP” implies exceptionally rare events defined as those having AEPs less than about 0.001 (or 1 × 10–3 in scientific notation or for brevity 10–3). Such low AEPs are of great interest to those involved with peak-streamflow frequency analyses for critical infrastructure, such as nuclear power plants. Flood frequency analyses at streamgages are most commonly based on annual instantaneous peak streamflow data and a probability distribution fit to these data. The fitted distribution provides a means to extrapolate to very low AEPs. Within the United States, the Pearson type III probability distribution, when fit to the base-10 logarithms of streamflow, is widely used, but other distribution choices exist. The USGS-PeakFQ software, implementing the Pearson type III within the Federal agency guidelines of Bulletin 17B (method of moments) and updates to the expected moments algorithm (EMA), was specially adapted for an “Extended Output” user option to provide estimates at selected AEPs from 10–3 to 10–6. Parameter estimation methods, in addition to product moments and EMA, include L-moments, maximum likelihood, and maximum product of spacings (maximum spacing estimation). This study comprehensively investigates multiple distributions and parameter estimation methods for two USGS streamgages (01400500 Raritan River at Manville, New Jersey, and 01638500 Potomac River at Point of Rocks, Maryland). The results of this study specifically involve the four methods for parameter estimation and up to nine probability distributions, including the generalized extreme value, generalized log-normal, generalized Pareto, and Weibull. Uncertainties in streamflow estimates for corresponding AEP are depicted and quantified as two primary forms: quantile (aleatoric [random sampling] uncertainty) and distribution-choice (epistemic [model] uncertainty). Sampling uncertainties of a given distribution are relatively straightforward to compute from analytical or Monte Carlo-based approaches. Distribution-choice uncertainty stems from choices of potentially applicable probability distributions for which divergence among the choices increases as AEP decreases. Conventional goodness-of-fit statistics, such as Cramér-von Mises, and L-moment ratio diagrams are demonstrated in order to hone distribution choice. The results generally show that distribution choice uncertainty is larger than sampling uncertainty for very low AEP values.
Recurrence time statistics of landslide events simulated by a cellular automaton model
NASA Astrophysics Data System (ADS)
Piegari, Ester; Di Maio, Rosa; Avella, Adolfo
2014-05-01
The recurrence time statistics of a cellular automaton modelling landslide events are analyzed by performing a numerical analysis in the parameter space and estimating Fano factor behaviors. The model is an extended version of the OFC model, which is a paradigm for SOC in non-conserved systems, but it works differently from the original OFC model as a finite value of the driving rate is applied. By driving the system to instability with different rates, the model exhibits a smooth transition from a correlated to an uncorrelated regime, reflecting a change in the predominant mechanism for propagating instability. If the rate at which instability is approached is small, chain processes dominate the landslide dynamics, and power laws govern the probability distributions. However, the power-law regime typical of SOC-like systems is found in a range of return intervals that becomes shorter and shorter as the driving rate increases. Indeed, if the rates at which instability is approached are large, domino processes are no longer active in propagating instability, and large events simply occur because a large number of cells simultaneously reach instability. Such a gradual loss of the effectiveness of the chain propagation mechanism causes the system to gradually enter an uncorrelated regime where recurrence time distributions are characterized by Weibull behaviors. Simulation results are qualitatively compared with those from a recent analysis performed by Witt et al. (Earth Surf. Process. Landforms, 35, 1138, 2010) for the first complete databases of landslide occurrences over a period as large as fifty years. From the comparison with the extensive landslide data set, the numerical analysis suggests that the statistics of such landslide data seem to be described by a crossover region between a correlated regime and an uncorrelated regime, where recurrence time distributions are characterized by power-law and Weibull behaviors for short and long return times, respectively. Finally, in such a region of the parameter space, clear indications of temporal correlations and clustering from the Fano factor behaviors support, at least in part, the analysis performed by Witt et al. (2010).
A Reliability Model for Ni-BaTiO3-Based (BME) Ceramic Capacitors
NASA Technical Reports Server (NTRS)
Liu, Donhang
2014-01-01
The evaluation of multilayer ceramic capacitors (MLCCs) with base-metal electrodes (BMEs) for potential NASA space project applications requires an in-depth understanding of their reliability. The reliability of an MLCC is defined as the ability of the dielectric material to retain its insulating properties under stated environmental and operational conditions for a specified period of time t. In this presentation, a general mathematical expression of a reliability model for a BME MLCC is developed and discussed. The reliability model consists of three parts: (1) a statistical distribution that describes the individual variation of properties in a test group of samples (Weibull, log normal, normal, etc.), (2) an acceleration function that describes how a capacitor's reliability responds to external stresses such as applied voltage and temperature (all units in the test group should follow the same acceleration function if they share the same failure mode, independent of individual units), and (3) the effect and contribution of the structural and constructional characteristics of a multilayer capacitor device, such as the number of dielectric layers N, dielectric thickness d, average grain size r, and capacitor chip size S. In general, a two-parameter Weibull statistical distribution model is used in the description of a BME capacitor's reliability as a function of time. The acceleration function that relates a capacitor's reliability to external stresses is dependent on the failure mode. Two failure modes have been identified in BME MLCCs: catastrophic and slow degradation. A catastrophic failure is characterized by a time-accelerating increase in leakage current that is mainly due to existing processing defects (voids, cracks, delamination, etc.), or extrinsic defects. A slow degradation failure is characterized by a near-linear increase in leakage current with stress time; this is caused by the electromigration of oxygen vacancies (intrinsic defects). The two identified failure modes follow different acceleration functions. Catastrophic failures follow the traditional power-law relationship with the applied voltage. Slow degradation failures fit well to an exponential-law relationship with the applied electrical field. Finally, the impact of capacitor structure on the reliability of BME capacitors is discussed with respect to the number of dielectric layers in an MLCC unit, the number of BaTiO3 grains per dielectric layer, and the chip size of the capacitor device.
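A hedged sketch of how the two acceleration laws named above can be combined with a two-parameter Weibull life distribution. The functional forms follow the abstract (power law in voltage for the catastrophic mode, exponential in electric field for slow degradation), but every numerical constant is an illustrative assumption rather than a value from this work.

# Sketch: Weibull reliability with mode-dependent acceleration of the characteristic life.
import math

def weibull_reliability(t, eta, beta):
    return math.exp(-((t / eta) ** beta))

# Catastrophic mode: characteristic life scales as a power law of voltage, eta ~ V**(-n).
def eta_catastrophic(V, eta_ref=1e4, V_ref=50.0, n=3.0):
    return eta_ref * (V_ref / V) ** n

# Slow-degradation mode: characteristic life scales exponentially with field E = V/d.
def eta_degradation(V, d=5e-6, eta_ref=1e5, E_ref=1e7, b=2e-7):
    E = V / d
    return eta_ref * math.exp(-b * (E - E_ref))

t, V = 1000.0, 100.0                      # hours and volts (illustrative operating point)
for name, eta in (("catastrophic", eta_catastrophic(V)),
                  ("slow degradation", eta_degradation(V))):
    print(f"{name:>16}: eta = {eta:.3g} h, R(t) = {weibull_reliability(t, eta, 2.0):.3f}")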
NASA Astrophysics Data System (ADS)
Phillips, R. C.; Samadi, S. Z.; Meadows, M. E.
2018-07-01
This paper examines the frequency, distribution tails, and peak-over-threshold (POT) of extreme floods through analysis that centers on the October 2015 flooding in North Carolina (NC) and South Carolina (SC), United States (US). The most striking features of the October 2015 flooding were a short time to peak (Tp) and a multi-hour continuous flood peak which caused intense and widespread damage to human lives, properties, and infrastructure. The 2015 flooding was produced by a sequence of intense rainfall events which originated from category 4 hurricane Joaquin over a period of four days. Here, the probability distribution and distribution parameters (i.e., location, scale, and shape) of floods were investigated by comparing the upper part of the empirical distributions of the annual maximum flood (AMF) and POT with light- to heavy-tailed theoretical distributions: Fréchet, Pareto, Gumbel, Weibull, Beta, and Exponential. Specifically, four sets of U.S. Geological Survey (USGS) gauging data from the central Carolinas with record lengths from approximately 65-125 years were used. Analysis suggests that heavier-tailed distributions are in better agreement with the POT data, and to a lesser extent the AMF data, than the more commonly used exponential (light-tailed) probability distributions. Further, the threshold selection and record length affect the heaviness of the tail and fluctuations of the parent distributions. The shape parameter and its evolution in the period of record play a critical and poorly understood role in determining the scaling of flood response to intense rainfall.
NASA Astrophysics Data System (ADS)
Mohamed, Refaat; Ismail, Mahmoud H.; Newagy, Fatma; Mourad, Heba M.
2013-03-01
Stemming from the fact that the α-μ fading distribution is one of the very general fading models used in the literature to describe the small scale fading phenomenon, in this paper, closed-form expressions for the Shannon capacity of the α-μ fading channel operating under four main adaptive transmission strategies are derived assuming integer values for μ. These expressions are derived for the case of no diversity as well as for selection combining diversity with independent and identically distributed branches. The obtained expressions reduce to those previously derived in the literature for the Weibull as well as the Rayleigh fading cases, which are both special cases of the α-μ channel. Numerical results are presented for the capacity under the four adaptive transmission strategies and the effect of the fading parameter as well as the number of diversity branches is studied.
Response kinetics of tethered bacteria to stepwise changes in nutrient concentration.
Chernova, Anna A; Armitage, Judith P; Packer, Helen L; Maini, Philip K
2003-09-01
We examined the changes in swimming behaviour of the bacterium Rhodobacter sphaeroides in response to stepwise changes in a nutrient (propionate), following the pre-stimulus motion, the initial response and the adaptation to the sustained concentration of the chemical. This was carried out by tethering motile cells by their flagella to glass slides and following the rotational behaviour of their cell bodies in response to the nutrient change. Computerised motion analysis was used to analyse the behaviour. Distributions of run and stop times were obtained from rotation data for tethered cells. Exponential and Weibull fits for these distributions, and variability in individual responses are discussed. In terms of parameters derived from the run and stop time distributions, we compare the responses to stepwise changes in the nutrient concentration and the long-term behaviour of 84 cells under 12 propionate concentration levels from 1 nM to 25 mM. We discuss traditional assumptions for the random walk approximation to bacterial swimming and compare them with the observed R. sphaeroides motile behaviour.
The topology of large Open Connectome networks for the human brain.
Gastner, Michael T; Ódor, Géza
2016-06-07
The structural human connectome (i.e. the network of fiber connections in the brain) can be analyzed at ever finer spatial resolution thanks to advances in neuroimaging. Here we analyze several large data sets for the human brain network made available by the Open Connectome Project. We apply statistical model selection to characterize the degree distributions of graphs containing up to nodes and edges. A three-parameter generalized Weibull (also known as a stretched exponential) distribution is a good fit to most of the observed degree distributions. For almost all networks, simple power laws cannot fit the data, but in some cases there is statistical support for power laws with an exponential cutoff. We also calculate the topological (graph) dimension D and the small-world coefficient σ of these networks. While σ suggests a small-world topology, we found that D < 4 showing that long-distance connections provide only a small correction to the topology of the embedding three-dimensional space.
Microstructure and Mechanical Properties of Reaction-Formed Silicon Carbide (RFSC) Ceramics
NASA Technical Reports Server (NTRS)
Singh, M.; Behrendt, D. R.
1994-01-01
The microstructure and mechanical properties of reaction-formed silicon carbide (RFSC) ceramics fabricated by silicon infiltration of porous carbon preforms are discussed. The morphological characterization of the carbon preforms indicates a very narrow pore size distribution. Measurements of the preform density by physical methods and by mercury porosimetry agree very well and indicate that virtually all of the porosity in the preforms is open to infiltrating liquids. The average room temperature flexural strength of the RFSC material with approximately 8 at.% free silicon is 369 +/- 28 MPa (53.5 +/- 4 ksi). The Weibull strength distribution data give a characteristic strength value of 381 MPa (55 ksi) and a Weibull modulus of 14.3. The residual silicon content is lower and the strengths are superior to those of most commercially available reaction-bonded silicon carbide materials.
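For illustration, characteristic strength and Weibull modulus values like those above are commonly estimated by median-rank regression. The sketch below does this on synthetic strengths drawn from a Weibull distribution with the reported parameters, so the recovered values should land in the vicinity of 381 MPa and 14.3 (with scatter due to the small sample size).

# Sketch: estimating Weibull modulus and characteristic strength by median-rank regression.
import numpy as np

rng = np.random.default_rng(3)
strengths = 381.0 * rng.weibull(14.3, size=30)      # synthetic flexural strengths, MPa

s = np.sort(strengths)
n = s.size
F = (np.arange(1, n + 1) - 0.3) / (n + 0.4)         # median-rank plotting positions

x = np.log(s)
y = np.log(-np.log(1.0 - F))                        # linearised Weibull CDF
m, intercept = np.polyfit(x, y, 1)                  # slope = Weibull modulus
sigma0 = np.exp(-intercept / m)                     # characteristic strength

print(f"Weibull modulus m = {m:.1f}, characteristic strength = {sigma0:.0f} MPa")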
Modelling Limit Order Execution Times from Market Data
NASA Astrophysics Data System (ADS)
Kim, Adlar; Farmer, Doyne; Lo, Andrew
2007-03-01
Although the term ``liquidity'' is widely used in the finance literature, its meaning is very loosely defined and there is no quantitative measure for it. Generally, ``liquidity'' means an ability to quickly trade stocks without causing a significant impact on the stock price. From this definition, we identified two facets of liquidity: (1) the execution time of limit orders, and (2) the price impact of market orders. A limit order is an order to transact a prespecified number of shares at a prespecified price, which will not cause an immediate execution. On the other hand, a market order is an order to transact a prespecified number of shares at the market price, which will cause an immediate execution but is subject to price impact. Therefore, when a stock is liquid, market participants will experience quick limit order executions and small market order impacts. As a first step to understand market liquidity, we studied the facet of liquidity related to limit order executions -- execution times. In this talk, we propose a novel approach of modeling limit order execution times and show how they are affected by the size and price of orders. We used the q-Weibull distribution, which is a generalized form of the Weibull distribution that can control the fatness of the tail, to model limit order execution times.
Equations for estimating loblolly pine branch and foliage weight and surface area distributions
V. Clark Baldwin; Kelly D. Peterson; Harold E. Burkhatt; Ralph L. Amateis; Phillip M. Dougherty
1996-01-01
Equations to predict foliage weight and surface area, and their vertical and horizontal distributions, within the crowns of unthinned loblolly pine (Pinus taeda L.) trees are presented. A right-truncated Weibull function was used for describing vertical foliage distributions. This function ensures that all of the foliage located between the tree tip and the foliage base...
Comparison of Methods for Estimating Low Flow Characteristics of Streams
Tasker, Gary D.
1987-01-01
Four methods for estimating the 7-day, 10-year and 7-day, 20-year low flows for streams are compared by the bootstrap method. The bootstrap method is a Monte Carlo technique in which random samples are drawn from an unspecified sampling distribution defined from observed data. The nonparametric nature of the bootstrap makes it suitable for comparing methods based on a flow series for which the true distribution is unknown. Results show that the two methods based on hypothetical distributions (Log-Pearson III and Weibull) had lower mean square errors than did the G. E. P. Box-D. R. Cox transformation method or the Log-W. C. Boughton method, which is based on a fit of plotting positions.
Komada, Fusao
2018-01-01
The aim of this study was to investigate the time-to-onset of drug-induced interstitial lung disease (DILD) following the administration of small molecule molecularly-targeted drugs via the use of the spontaneous adverse reaction reporting system of the Japanese Adverse Drug Event Report database. DILD datasets for afatinib, alectinib, bortezomib, crizotinib, dasatinib, erlotinib, everolimus, gefitinib, imatinib, lapatinib, nilotinib, osimertinib, sorafenib, sunitinib, temsirolimus, and tofacitinib were used to calculate the median onset times of DILD and the Weibull distribution parameters, and to perform the hierarchical cluster analysis. The median onset times of DILD for afatinib, bortezomib, crizotinib, erlotinib, gefitinib, and nilotinib were within one month. The median onset times of DILD for dasatinib, everolimus, lapatinib, osimertinib, and temsirolimus ranged from 1 to 2 months. The median onset times of DILD for alectinib, imatinib, and tofacitinib ranged from 2 to 3 months. The median onset times of DILD for sunitinib and sorafenib ranged from 8 to 9 months. Hierarchical cluster analysis of the Weibull distributions for these drugs showed that there were 4 clusters. Cluster 1 described a subgroup with early to later onset DILD and early failure type profiles or a random failure type profile. Cluster 2 exhibited early failure type profiles or a random failure type profile with early onset DILD. Cluster 3 exhibited a random failure type profile or wear-out failure type profiles with later onset DILD. Cluster 4 exhibited an early failure type profile or a random failure type profile with the latest onset DILD.
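A small sketch of the Weibull time-to-onset analysis described above: fit a Weibull distribution to onset times and interpret the shape parameter as a failure-type profile (shape < 1 early failure, about 1 random failure, > 1 wear-out). The onset times below are synthetic stand-ins, not data from the Japanese Adverse Drug Event Report database.

# Sketch: Weibull fit of times-to-onset and failure-type interpretation of the shape.
import numpy as np
from scipy.stats import weibull_min

onset_days = np.array([5, 9, 12, 14, 20, 22, 25, 31, 38, 45, 60, 75, 90, 120, 150],
                      dtype=float)

shape, loc, scale = weibull_min.fit(onset_days, floc=0)   # fix the location at zero
median = scale * np.log(2.0) ** (1.0 / shape)             # median onset time of the fit

if shape < 1.0:
    profile = "early failure type"
elif shape > 1.0:
    profile = "wear-out failure type"
else:
    profile = "random failure type"

print(f"shape={shape:.2f}, scale={scale:.1f} d, median onset={median:.1f} d -> {profile}")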
Strength of Zerodur® for mirror applications
NASA Astrophysics Data System (ADS)
Béhar-Lafenêtre, S.; Cornillon, Laurence; Ait-Zaid, Sonia
2015-09-01
Zerodur® is a well-known glass-ceramic used for optical components because of its unequalled dimensional stability under thermal environment. In particular it has been used for decades in Thales Alenia Space's optical payloads for space telescopes, especially for mirrors. The drawback of Zerodur® is, however, its rather low strength, but the relatively small size of mirrors in the past had made it unnecessary to further investigate this aspect, although elementary tests have always shown higher failure strength. As the performance of space telescopes increases, the size of mirrors increases accordingly, and an optimization of the design is necessary, mainly for mass saving. Therefore the question of the effective strength of Zerodur® has become a real issue. Thales Alenia Space investigated in 2014, under CNES funding, the application of the Weibull law and associated size effects to Zerodur®, through a thorough test campaign with a high number of samples (300) of various types. The purpose was to accurately determine the parameters of the Weibull law for Zerodur® when machined in the same conditions as mirrors. The proposed paper will discuss the obtained results in the light of the Weibull theory. The applicability of the 2-parameter and 3-parameter (with threshold strength) laws will be compared. The expected size effect has not been evidenced; therefore, investigations are being conducted to determine the reasons for this result, from the quality of the test implementation to the data post-processing methodology. However, this test campaign has already provided enough data to safely increase the allowable value for mirror sizing.
Improved silicon carbide for advanced heat engines
NASA Technical Reports Server (NTRS)
Whalen, Thomas J.
1988-01-01
This is the third annual technical report for the program entitled Improved Silicon Carbide for Advanced Heat Engines, covering the period February 16, 1987 to February 15, 1988. The objective of the original program was the development of high-strength, high-reliability silicon carbide parts with complex shapes suitable for use in advanced heat engines. Injection molding is the forming method selected for the program because it is capable of forming complex parts adaptable for mass production on an economically sound basis. The goals of the revised program are to reach a Weibull characteristic strength of 550 MPa (80 ksi) and a Weibull modulus of 16 for bars tested in 4-point loading. Two tasks are discussed: Task 1, which involves materials and process improvements, and Task 2, which is an MOR bar matrix to improve strength and reliability. Many statistically designed experiments were completed under Task 1, improving the batch compositions, the mixing of the powders, and the sintering and annealing cycles. The best results were obtained with an attritor mixing process, which yielded strengths in excess of 550 MPa (80 ksi) and an individual Weibull modulus of 16.8 for a 9-sample group. Strengths measured at 1200 and 1400 °C were equal to the room-temperature strength. Annealing of machined test bars significantly improved the strength. Molding yields were measured, and flaw distributions were observed to follow a Poisson process. The second iteration of the Task 2 matrix experiment is described.
Garcés-Vega, Francisco; Marks, Bradley P
2014-08-01
In the last 20 years, the use of microbial reduction models has expanded significantly, including inactivation (linear and nonlinear), survival, and transfer models. However, a major constraint for model development is the impossibility of directly quantifying the number of viable microorganisms below the limit of detection (LOD) for a given study. Different approaches have been used to manage this challenge, including ignoring negative plate counts, using statistical estimations, or applying data transformations. Our objective was to illustrate and quantify the effect of negative plate count data management approaches on parameter estimation for microbial reduction models. Because it is impossible to obtain accurate plate counts below the LOD, we performed simulated experiments to generate synthetic data for both log-linear and Weibull-type microbial reductions. We then applied five different, previously reported data management practices and fit log-linear and Weibull models to the resulting data. The results indicated a significant effect (α = 0.05) of the data management practices on the estimated model parameters and performance indicators. For example, when the negative plate counts were replaced by the LOD for log-linear data sets, the slope of the subsequent log-linear model was, on average, 22% smaller than for the original data, the resulting model underpredicted lethality by up to 2.0 log, and the Weibull model was erroneously selected as the most likely correct model for those data. The results demonstrate that it is important to explicitly report LODs and related data management protocols, which can significantly affect model results, interpretation, and utility. Ultimately, we recommend using only the positive plate counts to estimate model parameters for microbial reduction curves and avoiding any data value substitutions or transformations when managing negative plate counts to yield the most accurate model parameters.
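A small synthetic illustration of the effect described above: a log-linear reduction is simulated, negative plates are handled either by substituting the LOD or by using positive counts only, and the fitted slopes are compared. All numbers are invented for illustration and are not from the paper's simulation design.

```python
import numpy as np

rng = np.random.default_rng(1)

true_slope = -0.5          # log10 CFU reduction per minute (log-linear model)
n0, lod = 7.0, 1.0         # initial log10 count and log10 limit of detection
t = np.arange(0, 21, 1.0)  # treatment times (min)

# Synthetic observed log10 counts with measurement noise.
obs = n0 + true_slope * t + rng.normal(0, 0.2, size=t.size)
below = obs < lod          # plates that would read "negative"

# Practice A: replace negative plates with the LOD value.
obs_sub = np.where(below, lod, obs)
slope_sub = np.polyfit(t, obs_sub, 1)[0]

# Practice B: use only the positive plate counts.
slope_pos = np.polyfit(t[~below], obs[~below], 1)[0]

print(f"true slope        {true_slope:+.3f}")
print(f"LOD substitution  {slope_sub:+.3f}  (biased toward zero)")
print(f"positives only    {slope_pos:+.3f}")
```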
Experimental Study on Fatigue Performance of Foamed Lightweight Soil
NASA Astrophysics Data System (ADS)
Qiu, Youqiang; Yang, Ping; Li, Yongliang; Zhang, Liujun
2017-12-01
To study the fatigue performance of foamed lightweight soil and forecast its fatigue life in the supporting project, beam fatigue tests on foamed lightweight soil were conducted with a UTM-100 test system on the basis of preliminary tests. Fatigue equations of foamed lightweight soil were obtained by mathematical statistics based on the Weibull and lognormal distributions. In addition, according to the traffic load on the real road surface of the supporting project, the fatigue life of foamed lightweight soil was analyzed and compared with the cumulative equivalent axle loads during the design period of the pavement. The results show that, although the fatigue life of foamed lightweight soil is scattered, the linear relationship between logarithmic fatigue life and stress ratio still holds well. In particular, at a 50% guarantee ratio the fatigue life from the Weibull distribution is close to that derived from the lognormal distribution. The results also demonstrate that foamed lightweight soil used as subgrade filler has good anti-fatigue performance, which can be further adopted by other projects in similar research domains.
Application of the Weibull extrapolation to 137Cs geochronology in Tokyo Bay and Ise Bay, Japan.
Lu, Xueqiang
2004-01-01
Considerable doubt surrounds the nature of the processes by which 137Cs is deposited in marine sediments, leading to a situation where 137Cs geochronology cannot always be applied reliably. Based on extrapolation with the Weibull distribution, the maximum concentration of 137Cs derived from asymptotic values of the cumulative specific inventory was used to re-establish the 137Cs geochronology, instead of the original 137Cs profiles. The corresponding dating results for cores in Tokyo Bay and Ise Bay, Japan, obtained by means of this new method are in much closer agreement with those calculated from the 210Pb method than are the results of the previous approach.
Strain-controlled fatigue of acrylic bone cement.
Carter, D R; Gates, E I; Harris, W H
1982-09-01
Monotonic tensile tests and tension-compression fatigue tests were conducted on wet acrylic bone cement specimens at 37 °C. All testing was conducted in strain control at a strain rate of 0.02/s. Weibull analysis of the tensile tests indicated that monotonic fracture was governed more strongly by strain than by stress. The number of cycles to fatigue failure was also more strongly controlled by strain amplitude than by stress amplitude. The specimen porosity distribution played a major role in determining the tensile and fatigue strengths. The degree of data scatter suggests that Weibull analysis of fatigue data may be useful in developing design criteria for the surgical use of bone cement.
Renewal models and coseismic stress transfer in the Corinth Gulf, Greece, fault system
NASA Astrophysics Data System (ADS)
Console, Rodolfo; Falcone, Giuseppe; Karakostas, Vassilis; Murru, Maura; Papadimitriou, Eleftheria; Rhoades, David
2013-07-01
We model interevent times and Coulomb static stress transfer on the rupture segments along the Corinth Gulf extension zone, a region with a wealth of observations on strong-earthquake recurrence behavior. From the available information on past seismic activity, we have identified eight segments without significant overlapping that are aligned along the southern boundary of the Corinth rift. We aim to test if strong earthquakes on these segments are characterized by some kind of time-predictable behavior, rather than by complete randomness. The rationale for time-predictable behavior is based on the characteristic earthquake hypothesis, the necessary ingredients of which are a known faulting geometry and slip rate. The tectonic loading rate is characterized by slip of 6 mm/yr on the westernmost fault segment, diminishing to 4 mm/yr on the easternmost segment, based on the most reliable geodetic data. In this study, we employ statistical and physical modeling to account for stress transfer among these fault segments. The statistical modeling is based on the definition of a probability density distribution of the interevent times for each segment. Both the Brownian Passage-Time (BPT) and Weibull distributions are tested. The time-dependent hazard rate thus obtained is then modified by the inclusion of a permanent physical effect due to the Coulomb static stress change caused by failure of neighboring faults since the latest characteristic earthquake on the fault of interest. The validity of the renewal model is assessed retrospectively, using the data of the last 300 years, by comparison with a plain time-independent Poisson model, by means of statistical tools including the Relative Operating Characteristic diagram, the R-score, the probability gain and the log-likelihood ratio. We treat the uncertainties in the parameters of each examined fault source, such as linear dimensions, depth of the fault center, focal mechanism, recurrence time, coseismic slip, and aperiodicity of the statistical distribution, by a Monte Carlo technique. The Monte Carlo samples for all these parameters are drawn from a uniform distribution within their uncertainty limits. We find that the BPT and the Weibull renewal models yield comparable results, and both of them perform significantly better than the Poisson hypothesis. No clear performance enhancement is achieved by the introduction of the Coulomb static stress change into the renewal model.
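A minimal sketch of the renewal-versus-Poisson comparison described above: a Weibull renewal model with a chosen mean recurrence and aperiodicity gives a time-dependent conditional probability, contrasted with the time-independent Poisson value. The recurrence parameters and forecast window below are illustrative, not the Corinth Gulf values.

```python
import numpy as np
from scipy import stats
from scipy.optimize import brentq
from scipy.special import gamma as G

mean_rec = 100.0     # illustrative mean recurrence time (years)
aperiodicity = 0.5   # target coefficient of variation of recurrence times

# Weibull renewal model matching the chosen mean and CV, using
# CV(k) = sqrt(G(1 + 2/k) / G(1 + 1/k)**2 - 1) to solve for the shape k.
cv = lambda k: np.sqrt(G(1 + 2 / k) / G(1 + 1 / k) ** 2 - 1)
k = brentq(lambda x: cv(x) - aperiodicity, 1.01, 10.0)
scale = mean_rec / G(1 + 1 / k)
weib = stats.weibull_min(k, scale=scale)

dt = 30.0  # forecast window (years)
for elapsed in (10.0, 50.0, 100.0, 150.0):
    # Conditional probability of an event in (elapsed, elapsed + dt] given quiescence so far.
    p_renewal = (weib.cdf(elapsed + dt) - weib.cdf(elapsed)) / weib.sf(elapsed)
    p_poisson = 1.0 - np.exp(-dt / mean_rec)  # time-independent reference
    print(f"elapsed {elapsed:5.0f} yr: Weibull renewal {p_renewal:.3f}  Poisson {p_poisson:.3f}")
```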
NASA Astrophysics Data System (ADS)
Rotondi, Renata; Varini, Elisa
2016-04-01
The long-term recurrence of strong earthquakes is often modelled by the stationary Poisson process for the sake of simplicity, although renewal and self-correcting point processes (with non-decreasing hazard functions) are more appropriate. Short-term models mainly fit earthquake clusters due to the tendency of an earthquake to trigger other earthquakes; in this case, self-exciting point processes with non-increasing hazard are especially suitable. In order to provide a unified framework for analyzing earthquake catalogs, Schoenberg and Bolt proposed the SELC (Short-term Exciting Long-term Correcting) model (BSSA, 2000) and Varini employed a state-space model for estimating the different phases of a seismic cycle (PhD Thesis, 2005). Both attempts are combinations of long- and short-term models, but the results are not completely satisfactory, due to the different scales at which these models appear to operate. In this study, we split a seismic sequence into two groups: the leader events, whose magnitude exceeds a threshold magnitude, and the remaining ones, considered as subordinate events. The leader events are assumed to follow a well-known self-correcting point process named the stress release model (Vere-Jones, J. Phys. Earth, 1978; Bebbington & Harte, GJI, 2003; Varini & Rotondi, Env. Ecol. Stat., 2015). In the interval between two subsequent leader events, subordinate events are expected to cluster at the beginning (aftershocks) and at the end (foreshocks) of that interval; hence, they are modeled by a failure process that allows a bathtub-shaped hazard function. In particular, we have examined the generalized Weibull distributions, a large family that contains distributions with different bathtub-shaped hazards as well as the standard Weibull distribution (Lai, Springer, 2014). The model is fitted to a dataset of Italian historical earthquakes and the results of Bayesian inference are shown.
Spatial variation of statistical properties of extreme water levels along the eastern Baltic Sea
NASA Astrophysics Data System (ADS)
Pindsoo, Katri; Soomere, Tarmo; Rocha, Eugénio
2016-04-01
Most existing projections of future extreme water levels rely on the use of classic generalised extreme value distributions. The choice to use a particular distribution is often made based on the value of the shape parameter of the Generalised Extreme Value distribution. If this parameter is small, the Gumbel distribution is most appropriate, while in the opposite case the Weibull or Frechet distribution could be used. We demonstrate that the alongshore variation in the statistical properties of numerically simulated high water levels along the eastern coast of the Baltic Sea is so large that the use of a single distribution for projections of extreme water levels is highly questionable. The analysis is based on two simulated data sets produced at the Swedish Meteorological and Hydrological Institute. The output of the Rossby Centre Ocean model is sampled with a resolution of 6 h and the output of the circulation model NEMO with a resolution of 1 h. As the maxima of water levels of subsequent years may be correlated in the Baltic Sea, we also employ maxima for stormy seasons. We provide a detailed analysis of spatial variation of the parameters of the family of extreme value distributions along an approximately 600 km long coastal section from the north-western shore of Latvia in the Baltic Proper to the eastern Gulf of Finland. The parameters are evaluated using the maximum likelihood method and the method of moments. The analysis also covers the entire Gulf of Riga. The core parameter of this family of distributions, the shape parameter of the Generalised Extreme Value distribution, exhibits extensive variation in the study area. Its values, evaluated using the Hydrognomon software and the maximum likelihood method, vary from about -0.1 near the north-western coast of Latvia in the Baltic Proper up to about 0.05 in the eastern Gulf of Finland. This parameter is very close to zero near Tallinn in the western Gulf of Finland. Thus, it is natural that the Gumbel distribution gives adequate projections of extreme water levels for the vicinity of Tallinn. More importantly, this feature indicates that the use of a single distribution for the projections of extreme water levels and their return periods for the entire Baltic Sea coast is inappropriate. The physical reason is the interplay of the complex shape of large subbasins (such as the Gulf of Riga and Gulf of Finland) of the sea and the highly anisotropic wind regime. The 'impact' of this anisotropy on the statistics of water level is amplified by the overall anisotropy of the distributions of the frequency of occurrence of high and low water levels. The most important conjecture is that long-term behaviour of water level extremes in different coastal sections of the Baltic Sea may be fundamentally different.
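A minimal sketch of the shape-parameter check described above, using synthetic annual maxima rather than the Rossby Centre Ocean or NEMO model output; note that scipy's sign convention for the GEV shape parameter is opposite to the climatological one, as flagged in the comments.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)

# Synthetic annual maxima of water level (cm); purely illustrative.
annual_max = stats.gumbel_r(loc=120, scale=25).rvs(size=60, random_state=rng)

# scipy's genextreme shape c has the opposite sign to the climatological xi
# (xi = -c): xi > 0 -> Frechet-type tail, xi < 0 -> (reversed) Weibull-type tail.
c, loc, scale = stats.genextreme.fit(annual_max)
xi = -c
family = "Gumbel" if abs(xi) < 0.05 else ("Frechet-type" if xi > 0 else "Weibull-type")
print(f"xi = {xi:+.3f} -> approximately {family}")

# 100-year return level from the fitted GEV.
print("100-yr return level:", stats.genextreme.ppf(1 - 1 / 100, c, loc=loc, scale=scale))
```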
Statistical Models of Fracture Relevant to Nuclear-Grade Graphite: Review and Recommendations
NASA Technical Reports Server (NTRS)
Nemeth, Noel N.; Bratton, Robert L.
2011-01-01
The nuclear-grade (low-impurity) graphite needed for the fuel element and moderator material for next-generation (Gen IV) reactors displays large scatter in strength and a nonlinear stress-strain response from damage accumulation. This response can be characterized as quasi-brittle. In this expanded review, relevant statistical failure models for various brittle and quasi-brittle material systems are discussed with regard to strength distribution, size effect, multiaxial strength, and damage accumulation. This includes descriptions of the Weibull, Batdorf, and Burchell models as well as models that describe the strength response of composite materials, which involves distributed damage. Results from lattice simulations are included for a physics-based description of material breakdown. Consideration is given to the predicted transition between brittle and quasi-brittle damage behavior versus the density of damage (level of disorder) within the material system. The literature indicates that weakest-link-based failure modeling approaches appear to be reasonably robust in that they can be applied to materials that display distributed damage, provided that the level of disorder in the material is not too large. The Weibull distribution is argued to be the most appropriate statistical distribution to model the stochastic-strength response of graphite.
Geographic location, network patterns and population distribution of rural settlements in Greece
NASA Astrophysics Data System (ADS)
Asimakopoulos, Avraam; Mogios, Emmanuel; Xenikos, Dimitrios G.
2016-10-01
Our work addresses the problem of how social networks are embedded in space, by studying the spread of human population over complex geomorphological terrain. We focus on villages or small cities of up to a few thousand inhabitants located in mountainous areas in Greece. This terrain presents a familiar tree-like structure of valleys and land plateaus. Cities are found more often at lower altitudes and exhibit a preference for southern orientation. Furthermore, the population generally avoids flat land plateaus and river beds, preferring locations slightly uphill, away from the plateau edge. Despite the location diversity regarding geomorphological parameters, we find certain quantitative norms when we examine location and population distributions relative to the (man-made) transportation network. In particular, settlements at radial distance ℓ away from road network junctions have the same mean altitude, practically independent of ℓ ranging from a few meters to 10 km. Similarly, the distribution of the settlement population at any given ℓ is the same for all ℓ. Finally, the cumulative distribution of the number of rural cities n(ℓ) is fitted to the Weibull distribution, suggesting that human decisions for creating settlements could be paralleled with mechanisms typically attributed to this particular statistical distribution.
Life and reliability models for helicopter transmissions
NASA Technical Reports Server (NTRS)
Savage, M.; Knorr, R. J.; Coy, J. J.
1982-01-01
Computer models of life and reliability are presented for planetary gear trains with a fixed ring gear, input applied to the sun gear, and output taken from the planet arm. For this transmission the input and output shafts are coaxial, and the input and output torques are assumed to be coaxial with these shafts. Thrust and side loading are neglected. The reliability model is based on the Weibull distributions of the individual reliabilities of the transmission components. The system model is also a Weibull distribution. The load versus life model for the system is a power relationship, as are the models for the individual components. The load-life exponent and basic dynamic capacity are developed as functions of the component capacities. The models are used to compare three- and four-planet, 150 kW (200 hp), 5:1 reduction transmissions with 1500 rpm input speed to illustrate their use.
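The system-level construction referred to above, components in series each with a two-parameter Weibull life distribution whose survival probabilities multiply, can be sketched as follows; the component names and parameters are invented for illustration, not taken from the transmission models in the report.

```python
import math

# Illustrative Weibull life parameters (shape b, characteristic life theta in hours).
components = {
    "sun-planet mesh":  (1.3, 8.0e3),
    "planet bearing":   (1.1, 5.0e3),
    "ring-planet mesh": (1.3, 9.0e3),
}

def system_reliability(t_hours):
    # R_sys(t) = prod_i exp[-(t/theta_i)**b_i] for a weakest-link (series) system.
    r = 1.0
    for b, theta in components.values():
        r *= math.exp(-((t_hours / theta) ** b))
    return r

for t in (1000.0, 2000.0, 4000.0):
    print(f"R({t:.0f} h) = {system_reliability(t):.3f}")
```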
Survival Analysis of Patients with End Stage Renal Disease
NASA Astrophysics Data System (ADS)
Urrutia, J. D.; Gayo, W. S.; Bautista, L. A.; Baccay, E. B.
2015-06-01
This paper provides a survival analysis of End Stage Renal Disease (ESRD) using Kaplan-Meier estimates and the Weibull distribution. The data were obtained from the records of V. L. Makabali Memorial Hospital with respect to time t (patient's age), covariates such as developed secondary disease (Pulmonary Congestion and Cardiovascular Disease), gender, and the event of interest: the death of ESRD patients. Survival and hazard rates were estimated using NCSS for the Weibull distribution and SPSS for the Kaplan-Meier estimates. Both lead to the same conclusion: the hazard rate increases and the survival rate decreases with time for ESRD patients diagnosed with Pulmonary Congestion, Cardiovascular Disease, or both. The analysis also shows that female patients have a greater risk of death than males. The probability of risk was given by the equation R = 1 - e^(-H(t)), where e^(-H(t)) is the survival function and H(t) is the cumulative hazard function, obtained using Cox regression.
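A short worked illustration of the risk relation quoted above, R = 1 - e^(-H(t)), using a Weibull cumulative hazard; the scale and shape values are illustrative, not estimates from the hospital data.

```python
import math

# For a Weibull hazard h(t) = (b/a) * (t/a)**(b - 1), the cumulative hazard is
# H(t) = (t/a)**b, so S(t) = exp(-H(t)) and the risk is R = 1 - S(t).
a, b = 70.0, 2.5   # illustrative scale (years) and shape

def risk_by_age(t_years):
    H = (t_years / a) ** b
    return 1.0 - math.exp(-H)

for age in (40, 55, 70):
    print(f"age {age}: cumulative risk = {risk_by_age(age):.2f}")
```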
Observation of radiation damage induced by single-ion hits at the heavy ion microbeam system
NASA Astrophysics Data System (ADS)
Kamiya, Tomihiro; Sakai, Takuro; Hirao, Toshio; Oikawa, Masakazu
2001-07-01
A single-ion hit system combined with the JAERI heavy ion microbeam system can be applied to observe individual phenomena induced by interactions between high-energy ions and a semiconductor device, using a technique to measure the pulse height of transient current (TC) signals. The reduction of the TC pulse height for a Si PIN photodiode was measured under irradiation of 15 MeV Ni ions onto various micron-sized areas in the diode. The data containing the damage effect of these irradiations were analyzed by least-squares fitting using a Weibull distribution function. Changes of the scale and shape parameters as functions of the width of the irradiation areas led us to assume that charge collection in the diode has a micron-level lateral extent, larger than the ~1 μm spatial resolution of the microbeam. Numerical simulations for these measurements were made with a simplified two-dimensional model based on this assumption using a Monte Carlo method. Calculated data reproducing the pulse-height reductions by single-ion irradiations were analyzed using the same function as that for the measurements. The result of this analysis, which shows the same tendency in the parameter changes as the measurements, seems to support our assumption.
NASA Astrophysics Data System (ADS)
Longbiao, Li
2017-08-01
In this paper, the synergistic effects of temperature, oxidation and multiple cracking modes on damage evolution and life prediction in 2D woven ceramic-matrix composites (CMCs) have been investigated. The damage parameter of fatigue hysteresis dissipated energy and the interface shear stress were used to monitor the damage evolution inside the CMCs. Under cyclic fatigue loading, the broken fiber fraction was determined by combining the interface/fiber oxidation model, the interface wear model and the statistical fiber failure model at elevated temperature, based on the assumptions that the fiber strength follows a two-parameter Weibull distribution and that the loads carried by broken and intact fibers satisfy the Global Load Sharing (GLS) criterion. When the broken fiber fraction approaches the critical value, the composite fractures in fatigue. The evolution of fatigue hysteresis dissipated energy, interface shear stress and broken fiber fraction versus cycle number, and the fatigue life S-N curves of SiC/SiC at 1000, 1200 and 1300 °C in air and steam conditions have been predicted. The synergistic effects of temperature, oxidation, fatigue peak stress, and multiple cracking modes on the evolution of the interface shear stress and fatigue hysteresis dissipated energy versus cycle number curves have been analyzed.
NASA Technical Reports Server (NTRS)
Wheeler, J. T.
1990-01-01
The Weibull process, identified as the inhomogeneous Poisson process with the Weibull intensity function, is used to model the reliability growth assessment of the space shuttle main engine test and flight failure data. Additional tables of percentage-point probabilities for several different values of the confidence coefficient have been generated for setting (1-alpha)100-percent two-sided confidence interval estimates on the mean time between failures. The tabled data pertain to two cases: (1) time-terminated testing, and (2) failure-terminated testing. The critical values of the three test statistics, namely Cramer-von Mises, Kolmogorov-Smirnov, and chi-square, were calculated and tabled for use in the goodness-of-fit tests for the engine reliability data. Numerical results are presented for five different groupings of the engine data that reflect the actual response to the failures.
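For reference, the standard maximum-likelihood estimates for the Weibull (power-law) intensity in time-terminated testing can be sketched as below; the failure times are invented and are not the engine data set, and the report's confidence-interval tables are not reproduced here.

```python
import math

# Illustrative cumulative failure times (hours) in a time-terminated test of length T.
failure_times = [50.0, 180.0, 320.0, 610.0, 900.0, 1400.0]
T = 2000.0
n = len(failure_times)

# MLEs for the power-law intensity lambda * beta * t**(beta - 1).
beta_hat = n / sum(math.log(T / t) for t in failure_times)
lam_hat = n / T ** beta_hat

# Instantaneous MTBF at the end of test (reliability growth corresponds to beta < 1).
mtbf_T = 1.0 / (lam_hat * beta_hat * T ** (beta_hat - 1.0))
print(f"beta = {beta_hat:.2f}, lambda = {lam_hat:.4g}, MTBF(T) = {mtbf_T:.0f} h")
```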
Strength of a Ceramic Sectored Flexure Specimen
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wereszczak, Andrew A; Duffy, Stephen F; Baker, E. H.
2008-01-01
A new test specimen, defined here as the "sectored flexure strength specimen", was developed to measure the strength of ceramic tubes specifically for circumstances when flaws located at the tube's outer diameter are the strength-limiter and subjected to axial tension. The understanding of such strength-limitation is relevant for when ceramic tubes are subjected to bending or when the internal temperature is hotter than the tube's exterior (e.g., heat exchangers). The specimen is both economically and statistically attractive because eight specimens (eight in the case of this project - but the user is not necessarily limited to eight) were extracted out of each length of tube. An analytic expression for maximum or failure stress, and relationships portraying effective area and effective volume as a function of Weibull modulus, were developed. Lastly, it was proven from the testing of two ceramics that the sectored flexure specimen was very effective at producing failures caused by strength-limiting flaws located on the tube's original outer diameter. Keywords: ceramics, strength, sectored flexure specimen, effective area, effective volume, finite-element analysis, Weibull distribution, and fractography.
NASA Technical Reports Server (NTRS)
Chao, Luen-Yuan; Shetty, Dinesh K.
1992-01-01
Statistical analysis and correlation between pore-size distribution and fracture strength distribution using the theory of extreme-value statistics is presented for a sintered silicon nitride. The pore-size distribution on a polished surface of this material was characterized using an automatic optical image analyzer. The distribution measured on the two-dimensional plane surface was transformed to a population (volume) distribution using the Schwartz-Saltykov diameter method. The population pore-size distribution and the distribution of the pore size at the fracture origin were correlated by extreme-value statistics. The fracture strength distribution was then predicted from the extreme-value pore-size distribution, using a linear elastic fracture mechanics model of an annular crack around a pore and the fracture toughness of the ceramic. The predicted strength distribution was in good agreement with strength measurements in bending. In particular, the extreme-value statistics analysis explained the nonlinear trend in the linearized Weibull plot of measured strengths without postulating a lower-bound strength.
Evolution of Self-Organization in Adiabatic Shear Bands
NASA Astrophysics Data System (ADS)
Meyers, Marc A.; Xue, Qing; Nesterenko, Vitali F.
2001-06-01
The evolution of multiple adiabatic shear bands was investigated in stainless steel, an Fe-15%Cr-15%Ni alloy, titanium, and Ti-6%Al-4%V alloy through the radial collapse of a thick-walled cylinder under high-strain-rate deformation (~10^4 s^-1). The shear-band initiation, propagation, and spatial distribution were examined under different global strains (varied from 0 to 0.9). The shear-band spacing is compared with one-dimensional theoretical predictions based on perturbation (Ockendon-Wright and Molinari) and momentum diffusion (Grady-Kipp) analyses. The experimentally observed spacing reveals the two-dimensional character of the self-organization. These aspects are incorporated into a novel analytical description, in which a distribution of embryos (potential initiation sites) is activated as a function of strain (above a threshold) according to a Weibull-type distribution. The model incorporates embryo deactivation by stress shielding as well as selective growth of shear bands. The imposed strain rate, embryo distribution, and rates of initiation and propagation determine the evolving shear-band configurations. The microstructural parameter investigated for stainless steel was the grain size, which was varied from 30 to 500 μm. The influence of grain size was found to be minor and exerted through the flow stress. Titanium and Ti-6%Al-4%V displayed drastically different patterns of shear bands, which are explained in terms of the proposed model. Research supported by the US Army Research Office MURI Program (Contract DAAH 04-96-1-0376).
Reliability Estimation of Parameters of Helical Wind Turbine with Vertical Axis
Dumitrascu, Adela-Eliza; Lepadatescu, Badea; Dumitrascu, Dorin-Ion; Nedelcu, Anisor; Ciobanu, Doina Valentina
2015-01-01
Due to their prolonged use, wind turbines must be characterized by high reliability. This can be achieved through rigorous design, appropriate simulation and testing, and proper construction. The reliability prediction and analysis of these systems will lead to identifying the critical components, increasing the operating time, minimizing the failure rate, and minimizing maintenance costs. To estimate the energy produced by the wind turbine, an evaluation approach based on the Monte Carlo simulation model is developed, which enables us to estimate the probability of minimum and maximum parameters. In our simulation process we used triangular distributions. The analysis of simulation results has been focused on the interpretation of the relative frequency histograms and the cumulative distribution curve (ogive diagram), which indicates the probability of obtaining the daily or annual energy output depending on wind speed. The experimental research consists of estimating the reliability and unreliability functions and the hazard rate of the helical vertical-axis wind turbine designed and patented for the climatic conditions of Romanian regions. Also, the variation of power produced for different wind speeds, the Weibull distribution of wind probability, and the power generated were determined. The analysis of experimental results indicates that this type of wind turbine is efficient at low wind speed. PMID:26167524
NASA Astrophysics Data System (ADS)
Taravat, A.; Del Frate, F.
2013-09-01
As a major aspect of marine pollution, oil release into the sea has serious biological and environmental impacts. Among remote sensing systems, which offer non-destructive investigation methods, synthetic aperture radar (SAR) can provide valuable synoptic information about the position and size of an oil spill owing to its wide area coverage and its day/night and all-weather capabilities. In this paper we present a new automated method for oil-spill monitoring. The approach is based on the combination of the Weibull Multiplicative Model and machine learning techniques to differentiate between dark spots and the background. First, the filter created from the Weibull Multiplicative Model is applied to each sub-image. Second, the sub-image is segmented by two different neural network techniques (Pulse-Coupled Neural Networks, PCNN, and Multilayer Perceptron Neural Networks, MLP). As the last step, a very simple filtering process is used to eliminate the false targets. The proposed approaches were tested on 20 ENVISAT and ERS2 images containing dark spots. The same parameters were used in all tests. For the overall dataset, average accuracies of 94.05% and 95.20% were obtained for the PCNN and MLP methods, respectively. The average computational time for dark-spot detection on a 256 × 256 image is about 4 s for PCNN segmentation using IDL software, which is currently the fastest in this field. Our experimental results demonstrate that the proposed approach is fast, robust and effective, and that it can be applied to future spaceborne SAR images.
Hyperchaotic Dynamics for Light Polarization in a Laser Diode
NASA Astrophysics Data System (ADS)
Bonatto, Cristian
2018-04-01
It is shown that a highly random-like behavior of light polarization states in the output of a free-running laser diode, covering the whole Poincaré sphere, arises from a fully deterministic nonlinear process, which is characterized by a hyperchaotic dynamics of two polarization modes nonlinearly coupled with a semiconductor medium inside the optical cavity. A number of statistical distributions were found to describe the deterministic data of the low-dimensional nonlinear flow, such as a lognormal distribution for the light intensity, Gaussian distributions for the electric field components and electron densities, Rice and Rayleigh distributions, and Weibull and negative exponential distributions for the modulus and intensity of the orthogonal linear components of the electric field, respectively. The presented results could be relevant for the generation of single units of compact light-source devices to be used in low-dimensional optical hyperchaos-based applications.
A Novel Solution-Technique Applied to a Novel WAAS Architecture
NASA Technical Reports Server (NTRS)
Bavuso, J.
1998-01-01
The Federal Aviation Administration has embarked on an historic task of modernizing and significantly improving the national air transportation system. One system that uses the Global Positioning System (GPS) to determine aircraft navigational information is called the Wide Area Augmentation System (WAAS). This paper describes a reliability assessment of one candidate system architecture for the WAAS. A unique aspect of this study is the modeling and solution of a candidate system that allows a novel cold sparing scheme. The cold spare is a WAAS communications satellite that is fabricated and launched after a predetermined number of orbiting satellite failures have occurred and after some stochastic fabrication time transpires. Because these satellites are complex systems with redundant components, they exhibit an increasing failure rate with a Weibull time-to-failure distribution. Moreover, the cold spare satellite build time is Weibull distributed, and upon launch the spare is considered to be a good-as-new system, again with an increasing failure rate and a Weibull time-to-failure distribution. The reliability model for this system is non-Markovian because three distinct system clocks are required: the time to failure of the orbiting satellites, the build time for the cold spare, and the time to failure for the launched spare satellite. A powerful dynamic fault tree modeling notation and a Monte Carlo simulation technique with importance sampling are used to arrive at a reliability prediction for a 10-year mission.
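A much-simplified Monte Carlo sketch of the cold-spare idea discussed above: Weibull satellite lifetimes, a Weibull build time that starts only at the triggering failure, and a good-as-new spare after launch. All parameters are invented, the constellation is reduced to one satellite plus one spare, and no importance sampling is used, so this is only an illustration of the three-clock structure, not the WAAS model itself.

```python
import numpy as np

rng = np.random.default_rng(3)

def weibull_draw(shape, scale):
    # One Weibull(shape, scale) lifetime; shape > 1 gives an increasing failure rate.
    return scale * rng.weibull(shape)

def mission_ok(mission_yrs=10.0, sat_life=(2.0, 12.0), build_time=(1.5, 2.0)):
    """One satellite plus one cold spare. Fabrication of the spare starts only at
    the first failure; once launched the spare is good-as-new. The coverage gap
    during fabrication is tolerated in this simplified sketch."""
    t_fail = weibull_draw(*sat_life)
    if t_fail >= mission_yrs:
        return True
    t_spare_fail = t_fail + weibull_draw(*build_time) + weibull_draw(*sat_life)
    return t_spare_fail >= mission_yrs

trials = 50_000
rel = np.mean([mission_ok() for _ in range(trials)])
print(f"estimated 10-year mission reliability ~ {rel:.3f}")
```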
Riobó, P; Paz, B; Franco, J M; Vázquez, J A; Murado, M A; Cacho, E
2008-08-01
Nowadays, a variety of protocols are applied to quantitate palytoxin. However, there is no satisfactory agreement among them, the confidence intervals of the basic toxicological parameters are too wide, and the formal descriptions lack the generality necessary to establish comparisons. Currently, the mouse bioassay is the most accepted method for categorizing marine toxins, and it must constitute the reference for other methods. In the present work, the mouse bioassay for palytoxin is analyzed in depth and carefully described, showing the initial symptoms of injected mice, which are presented here for the first time. These symptoms clearly differ from those of the more common marine toxins described to date. Regarding the toxicological aspects, two considerations are taken into account: (i) the empirical models based on dose-death time relationships cause serious ambiguities, and (ii) the traditional moving average method contains elements of inaccuracy in its regular use. It is demonstrated here that the logistic equation and the cumulative function of the Weibull distribution (with the modifications proposed) generate satisfactory toxicological descriptions in all respects.
Study on Excitation-triggered Damage Mechanism in Perilous Rock
NASA Astrophysics Data System (ADS)
Chen, Hongkai; Wang, Shengjuan
2017-12-01
Chain collapse occurs readily in perilous rock aggregates located on steep, high slopes, and one of the key scientific problems is the damage mechanism of perilous rock under the excitation generated when a perilous rock block ruptures. This paper studies the excitation-triggered damage mechanism in perilous rock using wave mechanics and draws three conclusions. First, when only the normal-incidence attenuation of the excitation wave is considered and the energy loss of the wave propagating through the aggregate is ignored, a method is established to calculate the peak velocity when the excitation wave passes through the boundary between any two blocks in the perilous rock aggregate. Second, following the Swedish and Canmet criteria, a wave-velocity criterion for excitation-triggered damage in the aggregate is provided. Third, assuming that the two parameters of the volume strain of cracks or fissures in the rock follow the Weibull distribution, a method is established to estimate the micro-fissuring in the excitation-triggered damage zone of the perilous rock aggregate. These studies solve the problem of mechanically describing excitation-triggered damage in perilous rock, which is valuable for in-depth studies of the rupture mechanism.
NASA Astrophysics Data System (ADS)
Wang, Yue; Wang, Ping; Liu, Xiaoxia; Cao, Tian
2018-03-01
The performance of a decode-and-forward dual-hop mixed radio frequency/free-space optical (RF/FSO) system in an urban area is studied. The RF link is modeled by the Nakagami-m distribution and the FSO link is described by composite exponentiated Weibull (EW) fading channels with nonzero boresight pointing errors (NBPE). For comparison, the ABER results without pointing errors (PE) and those with zero boresight pointing errors (ZBPE) are also provided. The closed-form expression for the average bit error rate (ABER) in the RF link is derived with the help of the hypergeometric function, and that in the FSO link is obtained using Meijer's G-function and generalized Gauss-Laguerre quadrature. The end-to-end ABERs with binary phase shift keying modulation are then obtained from the computed ABER results of the RF and FSO links. The end-to-end ABER performance is further analyzed for different Nakagami-m parameters, turbulence strengths, receiver aperture sizes and boresight displacements. The results show that, with ZBPE and NBPE considered, the FSO link suffers severe ABER degradation and becomes the dominant limitation of the mixed RF/FSO system in urban areas. However, aperture averaging can bring significant ABER improvement to this system. Monte Carlo simulation is provided to confirm the validity of the analytical ABER expressions.
Suspension thermal spraying of hydroxyapatite: microstructure and in vitro behaviour.
Bolelli, Giovanni; Bellucci, Devis; Cannillo, Valeria; Lusvarghi, Luca; Sola, Antonella; Stiegler, Nico; Müller, Philipp; Killinger, Andreas; Gadow, Rainer; Altomare, Lina; De Nardo, Luigi
2014-01-01
In cementless fixation of metallic prostheses, bony ingrowth onto the implant surface is often promoted by osteoconductive plasma-sprayed hydroxyapatite coatings. The present work explores the use of the innovative High Velocity Suspension Flame Spraying (HVSFS) process to coat Ti substrates with thin homogeneous hydroxyapatite coatings. The HVSFS hydroxyapatite coatings studied were dense, 27-37 μm thick, with some transverse microcracks. Lamellae were sintered together and nearly unidentifiable, unlike conventional plasma-sprayed hydroxyapatite. Crystallinities of 10%-70% were obtained, depending on the deposition parameters and the use of a TiO2 bond coat. The average hardness of layers with low (<24%) and high (70%) crystallinity was ≈3.5 GPa and ≈4.5 GPa, respectively. The distributions of hardness values, all characterised by Weibull moduli in the 5-7 range, were narrower than that of conventional plasma-sprayed hydroxyapatite, with a Weibull modulus of ≈3.3. During soaking in simulated body fluid, glassy coatings were progressively resorbed and replaced by a new, precipitated hydroxyapatite layer, whereas coatings with 70% crystallinity were stable up to 14 days of immersion. The interpretation of the precipitation behaviour was also assisted by surface charge assessments, performed through Z-potential measurements. During in vitro tests, HA coatings showed no cytotoxicity towards the SAOS-2 osteoblast cell line, and surface cell proliferation was comparable with proliferation on reference polystyrene culture plates.
The Application of a Novel Ceramic Liner Improves Bonding between Zirconia and Veneering Porcelain
Lee, Hee-Sung
2017-01-01
The adhesion of porcelain to zirconia is a key factor in the success of bilayered restorations. In this study, the efficacy of a novel experimental liner (EL) containing zirconia for improved bonding between zirconia and veneering porcelain was tested. Four ELs containing various concentrations (0, 3.0, 6.0, and 9.0 wt %) of zirconia were prepared. Testing determined the most effective EL (EL3 containing 3.0 wt % zirconia) in terms of shear bond strength value (n = 15). Three different bar-shaped zirconia/porcelain bilayer specimens were prepared for a three-point flexural strength (TPFS) test (n = 15): no-liner (NL), commercial liner (CL), and EL3. Specimens were tested for TPFS with the porcelain under tension and the maximum load was measured at the first sign of fracture. The strength data were analyzed using one-way ANOVA and Tukey’s test (α = 0.05) as well as Weibull distribution. When compared to NL, the CL application had no effect, while the EL3 application had a significant positive effect (p < 0.001) on the flexural strength. Weibull analysis also revealed the highest shape and scale parameters for group EL3. Within the limitations of this study, the novel ceramic liner containing 3.0 wt % zirconia (EL3) significantly enhanced the zirconia/porcelain interfacial bonding. PMID:28869512
Caraviello, D Z; Weigel, K A; Gianola, D
2004-05-01
Predicted transmitting abilities (PTA) of US Jersey sires for daughter longevity were calculated using a Weibull proportional hazards sire model and compared with predictions from a conventional linear animal model. Culling data from 268,008 Jersey cows with first calving from 1981 to 2000 were used. The proportional hazards model included time-dependent effects of herd-year-season contemporary group and parity by stage of lactation interaction, as well as time-independent effects of sire and age at first calving. Sire variances and parameters of the Weibull distribution were estimated, providing heritability estimates of 4.7% on the log scale and 18.0% on the original scale. The PTA of each sire was expressed as the expected risk of culling relative to daughters of an average sire. Risk ratios (RR) ranged from 0.7 to 1.3, indicating that the risk of culling for daughters of the best sires was 30% lower than for daughters of average sires and nearly 50% lower than for daughters of the poorest sires. Sire PTA from the proportional hazards model were compared with PTA from a linear model similar to that used for routine national genetic evaluation of length of productive life (PL) using cross-validation in independent samples of herds. Models were compared using logistic regression of daughters' stayability to second, third, fourth, or fifth lactation on their sires' PTA values, with alternative approaches for weighting the contribution of each sire. Models were also compared using logistic regression of daughters' stayability to 36, 48, 60, 72, and 84 mo of life. The proportional hazards model generally yielded more accurate predictions according to these criteria, but differences in predictive ability between methods were smaller when using a Kullback-Leibler distance than with other approaches. Results of this study suggest that survival analysis methodology may provide more accurate predictions of genetic merit for longevity than conventional linear models.
1989-08-01
Random variables from the conditional exponential distribution are generated using the inverse transform method: (1) generate U ~ U(0,1); (2) set s = -λ ln U. Random variables from the conditional Weibull distribution are likewise generated by the inverse transform method, by inverting the conditional survival function R(s | x) = exp{-[((x + s - γ)/η)^β - ((x - γ)/η)^β]} for a unit of age x. Normal random variables are generated using a standard normal transformation together with the inverse transform method. (Appendix B-3: distributions supported by the model.)
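A runnable sketch of the inverse-transform idea reconstructed above, for a unit that has already survived to age x; the location, scale and shape values (γ, η, β) are illustrative, not parameters from the report.

```python
import numpy as np

rng = np.random.default_rng(4)

def conditional_weibull(x, gamma, eta, beta, size=1):
    """Additional life s of a unit of age x, given survival to x, obtained by inverting
    R(s | x) = exp(-[((x + s - gamma)/eta)**beta - ((x - gamma)/eta)**beta])."""
    u = rng.uniform(size=size)
    return gamma + eta * (((x - gamma) / eta) ** beta - np.log(u)) ** (1.0 / beta) - x

def conditional_exponential(theta, size=1):
    # Memoryless case: the additional life does not depend on the current age.
    return -theta * np.log(rng.uniform(size=size))

s = conditional_weibull(x=500.0, gamma=0.0, eta=1000.0, beta=2.0, size=100_000)
print("mean additional life given age 500:", s.mean())
```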
Yoganandan, Narayan; Arun, Mike W.J.; Pintar, Frank A.; Szabo, Aniko
2015-01-01
Objective: Derive optimum injury probability curves to describe human tolerance of the lower leg using parametric survival analysis. Methods: The study re-examined lower leg PMHS data from a large group of specimens. Briefly, axial loading experiments were conducted by impacting the plantar surface of the foot. Both injury and non-injury tests were included in the testing process. They were identified by pre- and posttest radiographic images and detailed dissection following the impact test. Fractures included injuries to the calcaneus and distal tibia-fibula complex (including pylon), representing severities at the Abbreviated Injury Scale (AIS) level 2+. For the statistical analysis, peak force was chosen as the main explanatory variable and age was chosen as the covariable. Censoring statuses depended on experimental outcomes. Parameters from the parametric survival analysis were estimated using the maximum likelihood approach, and the dfbetas statistic was used to identify overly influential samples. The best fit among the Weibull, log-normal and log-logistic distributions was selected using the Akaike Information Criterion. Plus and minus 95% confidence intervals were obtained for the optimum injury probability distribution, and the relative sizes of the intervals were determined at predetermined risk levels. Quality indices were described at each of the selected probability levels. Results: The mean age, stature, and weight were 58.2 ± 15.1 years, 1.74 ± 0.08 m, and 74.9 ± 13.8 kg, respectively. Excluding all overly influential tests resulted in the tightest confidence intervals. The Weibull distribution was the optimum function of the three distributions considered. A majority of quality indices were in the good category for this optimum distribution when results were extracted for the 25-, 45-, and 65-year-old age groups at 5, 25, and 50% risk levels for lower leg fracture. For 25, 45, and 65 years, peak forces were 8.1, 6.5, and 5.1 kN at 5% risk; 9.6, 7.7, and 6.1 kN at 25% risk; and 10.4, 8.3, and 6.6 kN at 50% risk, respectively. Conclusions: This study derived axial loading-induced injury risk curves based on survival analysis using peak force and specimen age; adopting different censoring schemes; considering overly influential samples in the analysis; and assessing the quality of the distribution at discrete probability levels. Because the procedures used in the present survival analysis are accepted by international automotive communities, the current optimum human injury probability distributions can be used at all risk levels with more confidence in future crashworthiness applications for automotive and other disciplines. PMID:25307381
NASA Astrophysics Data System (ADS)
Le, Jia-Liang; Bažant, Zdeněk P.; Bazant, Martin Z.
2011-07-01
Engineering structures must be designed for an extremely low failure probability such as 10^-6, which is beyond the means of direct verification by histogram testing. This is not a problem for brittle or ductile materials because the type of probability distribution of structural strength is fixed and known, making it possible to predict the tail probabilities from the mean and variance. It is a problem, though, for quasibrittle materials for which the type of strength distribution transitions from Gaussian to Weibullian as the structure size increases. These are heterogeneous materials with brittle constituents, characterized by material inhomogeneities that are not negligible compared to the structure size. Examples include concrete, fiber composites, coarse-grained or toughened ceramics, rocks, sea ice, rigid foams and bone, as well as many materials used in nano- and microscale devices. This study presents a unified theory of strength and lifetime for such materials, based on activation energy controlled random jumps of the nano-crack front, and on the nano-macro multiscale transition of tail probabilities. Part I of this study deals with the case of monotonic and sustained (or creep) loading, and Part II with fatigue (or cyclic) loading. On the scale of the representative volume element of material, the probability distribution of strength has a Gaussian core onto which a remote Weibull tail is grafted at failure probability of the order of 10^-3. With increasing structure size, the Weibull tail penetrates into the Gaussian core. The probability distribution of static (creep) lifetime is related to the strength distribution by the power law for the static crack growth rate, for which a physical justification is given. The present theory yields a simple relation between the exponent of this law and the Weibull moduli for strength and lifetime. The benefit is that the lifetime distribution can be predicted from short-time tests of the mean size effect on strength and tests of the power law for the crack growth rate. The theory is shown to match closely numerous test data on strength and static lifetime of ceramics and concrete, and explains why their histograms deviate systematically from the straight line in Weibull scale. Although the present unified theory is built on several previous advances, new contributions are here made to address: (i) a crack in a disordered nano-structure (such as that of hydrated Portland cement), (ii) tail probability of a fiber bundle (or parallel coupling) model with softening elements, (iii) convergence of this model to the Gaussian distribution, (iv) the stress-life curve under constant load, and (v) a detailed random walk analysis of crack front jumps in an atomic lattice. The nonlocal behavior is captured in the present theory through the finiteness of the number of links in the weakest-link model, which explains why the mean size effect coincides with that of the previously formulated nonlocal Weibull theory. Brittle structures correspond to the large-size limit of the present theory. An important practical conclusion is that the safety factors for strength and tolerable minimum lifetime for large quasibrittle structures (e.g., concrete structures and composite airframes or ship hulls, as well as various micro-devices) should be calculated as a function of structure size and geometry.
NASA Astrophysics Data System (ADS)
Kaoga, Dieudonné Kidmo; Bogno, Bachirou; Aillerie, Michel; Raidandi, Danwe; Yamigno, Serge Doka; Hamandjoda, Oumarou; Tibi, Beda
2016-07-01
In this work, 28 years of wind data, measured at 10 m above ground level (AGL) at the Maroua meteorological station, are utilized to assess the potential of wind energy at the exposed ridge tops of mountains surrounding the city of Maroua. The aim of this study is to estimate the cost of wind-generated electricity using six types of wind turbines (50 to 2000 kW). The Weibull distribution function is employed to estimate the Weibull shape and scale parameters using the energy pattern factor method. The wind shear model considered to extrapolate the Weibull parameters and wind profiles is the empirical power-law correlation. The results show that hilltops in the range of 150-350 m AGL, in increments of 50 m, fall under Class 3 or greater of the international system of wind classification and are deemed suitable to outstanding for wind turbine applications. The performance of the selected wind turbines is examined, as well as the costs of wind-generated electricity at the considered hilltops. The results establish that the lowest costs per kWh are obtained using the YDF-1500-87 (1500 kW) turbine, while the highest costs are delivered by the P-25-100 (90 kW). The lowest costs (US$) per kWh of electricity generated are found to vary between a minimum of 0.0294 at hilltops 350 m AGL and a maximum of 0.0366 at hilltops 150 m AGL, with corresponding energy outputs of 6,125 and 4,932 MWh, respectively. Additionally, the matching capacity factor values are 38.05% at hilltops 150 m AGL and 47.26% at hilltops 350 m AGL. Furthermore, the YDF-1500-87 followed by the Enercon E82-2000 (2000 kW) wind turbines provide the lowest cost of wind-generated electricity and are recommended for use for large communities. The medium wind turbine P-15-50 (50 kW), despite showing the best capacity factors (39.29% and 48.85% at hilltops 150 and 350 m AGL, in that order), generates electricity at higher average costs per kWh of US$0.0547 and US$0.0440 at hilltops 150 and 350 m AGL, respectively. The P-15-50 is deemed a more advantageous option for off-grid electrification of small and remote communities.
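The energy pattern factor method mentioned above reduces to two closed-form expressions. A minimal sketch with illustrative hourly wind speeds follows; the relation k = 1 + 3.69/Epf^2 is the commonly used empirical approximation, and the 1/7 power-law exponent for hub-height extrapolation is a generic assumption, not the value calibrated in this paper.

```python
import numpy as np
from scipy.special import gamma

rng = np.random.default_rng(5)
v = rng.weibull(2.0, size=8760) * 6.0        # illustrative hourly wind speeds (m/s)

epf = np.mean(v**3) / np.mean(v) ** 3        # energy pattern factor
k = 1.0 + 3.69 / epf**2                      # Weibull shape (empirical relation)
c = np.mean(v) / gamma(1.0 + 1.0 / k)        # Weibull scale (m/s)
print(f"Epf = {epf:.2f}, k = {k:.2f}, c = {c:.2f} m/s")

# Extrapolation of the mean speed to hub height with a power-law wind shear.
z0, z, alpha = 10.0, 150.0, 1.0 / 7.0
print("mean speed at 150 m:", np.mean(v) * (z / z0) ** alpha)
```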
Statistical theory on the analytical form of cloud particle size distributions
NASA Astrophysics Data System (ADS)
Wu, Wei; McFarquhar, Greg
2017-11-01
Several analytical forms of cloud particle size distributions (PSDs) have been used in numerical modeling and remote sensing retrieval studies of clouds and precipitation, including exponential, gamma, lognormal, and Weibull distributions. However, there is no satisfying physical explanation as to why certain distribution forms preferentially occur instead of others. Theoretically, the analytical form of a PSD can be derived by directly solving the general dynamic equation, but no analytical solutions have been found yet. Instead of using a process-level approach, the use of the principle of maximum entropy (MaxEnt) for determining the analytical form of PSDs from a system perspective is examined here. The issue of variability under coordinate transformations that arises when using the Gibbs/Shannon definition of entropy is identified, and the use of the concept of relative entropy to avoid these problems is discussed. Focusing on cloud physics, the four-parameter generalized gamma distribution is proposed as the analytical form of a PSD using the principle of maximum (relative) entropy with assumptions on power-law relations between state variables, scale invariance, and a further constraint on the expectation of one state variable (e.g. bulk water mass). DOE ASR.
Karlinger, M.R.; Troutman, B.M.
1985-01-01
An instantaneous unit hydrograph (iuh) based on the theory of topologically random networks (topological iuh) is evaluated in terms of sets of basin characteristics and hydraulic parameters. Hydrographs were computed using two linear routing methods for each of two drainage basins in the southeastern United States and are the basis of comparison for the topological iuh's. Elements in the sets of basin characteristics for the topological iuh's are the number of first-order streams only (N), or the number of sources together with the number of channel links in the topological diameter (N, D); the hydraulic parameters are values of the celerity and diffusivity constant. Sensitivity analyses indicate that the mean celerity of the internal links in the network is the critical hydraulic parameter for determining the shape of the topological iuh, while the diffusivity constant has minimal effect on the topological iuh. Asymptotic results (source-only) indicate the number of sources need not be large to approximate the topological iuh with the Weibull probability density function.
NASA Technical Reports Server (NTRS)
Ricks, Trenton M.; Lacy, Thomas E., Jr.; Bednarcyk, Brett A.; Arnold, Steven M.; Hutchins, John W.
2014-01-01
A multiscale modeling methodology was developed for continuous fiber composites that incorporates a statistical distribution of fiber strengths into coupled multiscale micromechanics/finite element (FE) analyses. A modified two-parameter Weibull cumulative distribution function, which accounts for the effect of fiber length on the probability of failure, was used to characterize the statistical distribution of fiber strengths. A parametric study using the NASA Micromechanics Analysis Code with the Generalized Method of Cells (MAC/GMC) was performed to assess the effect of variable fiber strengths on local composite failure within a repeating unit cell (RUC) and subsequent global failure. The NASA code FEAMAC and the ABAQUS finite element solver were used to analyze the progressive failure of a unidirectional SCS-6/TIMETAL 21S metal matrix composite tensile dogbone specimen at 650 °C. Multiscale progressive failure analyses were performed to quantify the effect of spatially varying fiber strengths on the RUC-averaged and global stress-strain responses and failure. The ultimate composite strengths and distribution of failure locations (predominantly within the gage section) reasonably matched the experimentally observed failure behavior. The predicted composite failure behavior suggests that the use of macroscale models that exploit global geometric symmetries is inappropriate for cases where the actual distribution of local fiber strengths displays no such symmetries. This issue has not received much attention in the literature. Moreover, the model discretization at a specific length scale can have a profound effect on the computational costs associated with multiscale simulations; discretizations should therefore be chosen to yield accurate yet tractable results.
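The length-adjusted two-parameter Weibull form referred to above is commonly written P_f(σ; L) = 1 - exp[-(L/L0)(σ/σ0)^m]; a small sketch for sampling spatially varying fiber strengths under that form follows, with parameter values that are illustrative rather than the SCS-6 calibration used in the study.

```python
import numpy as np

rng = np.random.default_rng(6)

def sample_fiber_strengths(n_fibers, length, l0=25.0, sigma0=4000.0, m=10.0):
    """Draw strengths (MPa) from P_f = 1 - exp[-(length/l0) * (sigma/sigma0)**m],
    i.e. a Weibull whose scale shrinks with gauge length as (l0/length)**(1/m)."""
    scale = sigma0 * (l0 / length) ** (1.0 / m)
    return scale * rng.weibull(m, size=n_fibers)

short = sample_fiber_strengths(10_000, length=25.0)
long_ = sample_fiber_strengths(10_000, length=250.0)
print(f"mean strength, 25 mm gauge:  {short.mean():.0f} MPa")
print(f"mean strength, 250 mm gauge: {long_.mean():.0f} MPa  (size effect)")
```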
Relationships between Perron-Frobenius eigenvalue and measurements of loops in networks
NASA Astrophysics Data System (ADS)
Chen, Lei; Kou, Yingxin; Li, Zhanwu; Xu, An; Chang, Yizhe
2018-07-01
The Perron-Frobenius eigenvalue (PFE) is widely used as a measurement of the number of loops in networks, but the exact relationship between the PFE and the number of loops has not been investigated: is it strictly monotonically increasing? And how does the PFE relate to other measurements of loops in networks, such as the average loop degree of nodes and the distribution of loop ranks? We investigate these questions using samples of ER random networks, NW small-world networks and BA scale-free networks. The results confirm that both the number of loops and the average loop degree of nodes increase with the PFE in general trend, but neither is strictly monotonically increasing, so the PFE can be used as a rough estimative measurement of the number of loops in a network and of the average loop degree of nodes. Furthermore, we find that a majority of the loop ranks of all samples obey a Weibull distribution, whose scale parameter A and shape parameter B have approximate power-law relationships with the PFE of the samples.
Two-state Markov-chain Poisson nature of individual cellphone call statistics
NASA Astrophysics Data System (ADS)
Jiang, Zhi-Qiang; Xie, Wen-Jie; Li, Ming-Xia; Zhou, Wei-Xing; Sornette, Didier
2016-07-01
Unfolding the burst patterns in human activities and social interactions is a very important issue, especially for understanding the spreading of disease and information and the formation of groups and organizations. Here, we conduct an in-depth study of the temporal patterns of cellphone conversation activities of 73 339 anonymous cellphone users, whose inter-call durations are Weibull distributed. We find that the individual call events exhibit a pattern of bursts, in which high-activity periods alternate with low-activity periods. In both periods, the number of calls is exponentially distributed for individuals, but power-law distributed for the population. Together with the exponential distributions of inter-call durations within bursts and of the intervals between consecutive bursts, we demonstrate that the individual call activities are driven by two independent Poisson processes, which can be combined within a minimal model in terms of a two-state first-order Markov chain, giving significant fits for nearly half of the individuals. By measuring directly the distributions of call rates across the population, which exhibit power-law tails, we rationalize the observed power-law distributions via the ‘superposition of distributions’ mechanism. Our findings shed light on the origins of bursty patterns in other human activities.
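A minimal sketch of the two-state Markov-modulated Poisson picture described above; the call rates and transition probabilities are illustrative assumptions, not the fitted values from the study:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters: call rates per hour in each activity state and the
# first-order Markov transition matrix between states.
rates = {"high": 2.0, "low": 0.1}
P = {"high": {"high": 0.8, "low": 0.2},
     "low":  {"high": 0.1, "low": 0.9}}

def simulate_calls(hours=24 * 30, state="low"):
    """Hour-by-hour simulation: Poisson call counts modulated by a 2-state chain."""
    counts = []
    for _ in range(hours):
        counts.append(rng.poisson(rates[state]))
        state = rng.choice(["high", "low"],
                           p=[P[state]["high"], P[state]["low"]])
    return np.array(counts)

calls = simulate_calls()
print("mean calls/hour:", calls.mean())
```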
Ulusoy, Nuran
2017-01-01
The aim of this study was to evaluate the effects of two endocrown designs and computer-aided design/manufacturing (CAD/CAM) materials on the stress distribution and failure probability of restorations applied to a severely damaged endodontically treated maxillary first premolar tooth (MFP). Two types of designs without and with 3 mm intraradicular extensions, endocrown (E) and modified endocrown (ME), were modeled on a 3D finite element (FE) model of the MFP. Vitablocks Mark II (VMII), Vita Enamic (VE), and Lava Ultimate (LU) CAD/CAM materials were used for each type of design. von Mises and maximum principal stress values were evaluated, and the Weibull function was incorporated with FE analysis to calculate the long-term failure probability. Regarding the stresses that occurred in enamel, for each group of material, the ME restoration design transmitted less stress than the endocrown. During normal occlusal function, the overall failure probability was minimum for ME with VMII. The ME restoration design with VE was the best restorative option for premolar teeth with extensive loss of coronal structure under high occlusal loads. Therefore, the ME design could be a favorable treatment option for MFPs with missing palatal cusp. Among the CAD/CAM materials tested, VMII and VE were found to be more tooth-friendly than LU. PMID:29119108
NASA Astrophysics Data System (ADS)
Starn, J. J.; Belitz, K.; Carlson, C.
2017-12-01
Groundwater residence-time distributions (RTDs) are critical for assessing the susceptibility of water resources to contamination. In this novel approach for estimating regional RTDs, groundwater flow was first simulated using existing regional digital data sets in 13 intermediate-size watersheds (each an average of 7,000 square kilometers) that are representative of a wide range of glacial systems. RTDs were simulated with particle tracking. We refer to these models as "general models" because they are based on regional, as opposed to site-specific, digital data. Parametric RTDs were created from particle RTDs by fitting 1- and 2-component Weibull, gamma, and inverse Gaussian distributions, thus reducing a large number of particle travel times to 3 to 7 parameters (shape, location, and scale for each component plus a mixing fraction) for each modeled area. The scale parameter of these distributions is related to the mean exponential age; the shape parameter controls departure from the ideal exponential distribution and is partly a function of interaction with bedrock and with drainage density. Given the flexible shape and mathematical similarity of these distributions, any of them is potentially a good fit to particle RTDs. The 1-component gamma distribution provided a good fit to basin-wide particle RTDs. RTDs at monitoring wells and streams often have more complicated shapes than basin-wide RTDs, caused in part by heterogeneity in the model, and generally require 2-component distributions. A machine learning model was trained on the RTD parameters using features derived from regionally available watershed characteristics such as recharge rate, material thickness, and stream density. RTDs appeared to vary systematically across the landscape in relation to watershed features. This relation was used to produce maps of useful metrics with respect to risk-based thresholds, such as the time to first exceedance, time to maximum concentration, time above the threshold (exposure time), and the time until last exceedance; thus, the parameters of groundwater residence time are measures of the intrinsic susceptibility of groundwater to contamination.
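A minimal sketch of the parametric-RTD fitting step, assuming synthetic particle travel times in place of the study's particle-tracking output:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Stand-in for particle travel times (years) from a tracking simulation.
ages = rng.gamma(shape=0.8, scale=40.0, size=5000)

# Fit candidate 1-component parametric RTDs by maximum likelihood
# (location parameter fixed at zero).
g_shape, _, g_scale = stats.gamma.fit(ages, floc=0)
w_shape, _, w_scale = stats.weibull_min.fit(ages, floc=0)

print("gamma   shape=%.2f scale=%.1f" % (g_shape, g_scale))
print("weibull shape=%.2f scale=%.1f" % (w_shape, w_scale))
# The scale parameter is related to the mean age; a shape parameter below 1
# indicates a heavier early-time contribution than the ideal exponential RTD.
```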
SEEK: A FORTRAN optimization program using a feasible directions gradient search
NASA Technical Reports Server (NTRS)
Savage, M.
1995-01-01
This report describes the use of computer program 'SEEK' which works in conjunction with two user-written subroutines and an input data file to perform an optimization procedure on a user's problem. The optimization method uses a modified feasible directions gradient technique. SEEK is written in ANSI standard Fortran 77, has an object size of about 46K bytes, and can be used on a personal computer running DOS. This report describes the use of the program and discusses the optimizing method. The program use is illustrated with four example problems: a bushing design, a helical coil spring design, a gear mesh design, and a two-parameter Weibull life-reliability curve fit.
Derived distribution of floods based on the concept of partial area coverage with a climatic appeal
NASA Astrophysics Data System (ADS)
Iacobellis, Vito; Fiorentino, Mauro
2000-02-01
A new rationale for deriving the probability distribution of floods and helping to understand the physical processes underlying the distribution itself is presented. On this basis, a model that introduces a number of new assumptions is developed. The basic ideas are as follows: (1) The peak direct streamflow Q can always be expressed as the product of two random variates, namely, the average runoff per unit area ua and the peak contributing area a; (2) the distribution of ua conditional on a can be related to that of the rainfall depth occurring in a duration equal to a characteristic response time τa of the contributing part of the basin; and (3) τa is assumed to vary with a according to a power law. Consequently, the probability density function of Q can be found as the integral, over the total basin area A, of the density function of a times the density function of ua given a. It is suggested that ua can be expressed as a fraction of the excess rainfall and that the annual flood distribution can be related to that of Q by the hypothesis that the flood occurrence process is Poissonian. In the proposed model it is assumed, as an exploratory attempt, that a and ua are gamma and Weibull distributed, respectively. The model was applied to the annual flood series of eight gauged basins in Basilicata (southern Italy) with catchment areas ranging from 40 to 1600 km2. The results showed strong physical consistency, as the parameters tended to assume values in good agreement with well-consolidated geomorphologic knowledge and suggested a new key to understanding the climatic control of the probability distribution of floods.
Jeffrey H. Gove
2003-01-01
Many of the most popular sampling schemes used in forestry are probability proportional to size methods. These methods are also referred to as size biased because sampling is actually from a weighted form of the underlying population distribution. Length- and area-biased sampling are special cases of size-biased sampling where the probability weighting comes from a...
Pocket Handbook on Reliability
1975-09-01
Topics covered include exponential distributions, the Weibull distribution, estimating reliability, confidence intervals, reliability growth, OC curves, and Bayesian analysis. The handbook is an introduction for those not familiar with reliability and a good refresher for those who are currently working in the area. Its scope includes one or both of the following objectives: a) prediction of the current system reliability, b) projection of the system reliability for some future...
Height extrapolation of wind data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mikhail, A.S.
1982-11-01
Hourly average data for a period of 1 year from three tall meteorological towers - the Erie tower in Colorado, the Goodnoe Hills tower in Washington and the WKY-TV tower in Oklahoma - were used to analyze the wind shear exponent variability with various parameters such as thermal stability, anemometer level wind speed, projection height and surface roughness. Different proposed models for prediction of the height variability of short-term average wind speeds were discussed. Other models that predict the height dependence of Weibull distribution parameters were tested. The observed power law exponent for all three towers showed strong dependence on the anemometer level wind speed and stability (nighttime and daytime). It also exhibited a high degree of dependence on extrapolation height with respect to anemometer height. These dependences became less severe as the anemometer level wind speeds were increased due to the turbulent mixing of the atmospheric boundary layer. The three models used for Weibull distribution parameter extrapolation were the velocity-dependent power law model (Justus), the velocity, surface roughness, and height-dependent model (Mikhail) and the velocity and surface roughness-dependent model (NASA). The models projected the scale parameter C fairly accurately for the Goodnoe Hills and WKY-TV towers and were less accurate for the Erie tower. However, all models overestimated the C value. The maximum error for the Mikhail model was less than 2% for Goodnoe Hills, 6% for WKY-TV and 28% for Erie. The error associated with the prediction of the shape factor (K) was similar for the NASA, Mikhail and Justus models. It ranged from 20 to 25%. The effect of the misestimation of hub-height distribution parameters (C and K) on average power output is briefly discussed.
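A minimal sketch of this kind of Weibull-parameter height extrapolation, using the commonly quoted Justus-style empirical relations; the report's exact model variants and constants may differ, so treat the expressions below as illustrative:

```python
import math

def extrapolate_weibull(c_a, k_a, z_a, z):
    """Height-extrapolate Weibull scale (c, m/s) and shape (k) parameters.

    Commonly quoted Justus-Mikhail power-law form (c in m/s, heights in m,
    10 m reference height); constants are the textbook values, which may
    not match the specific models evaluated in the report.
    """
    n = (0.37 - 0.088 * math.log(c_a)) / (1.0 - 0.088 * math.log(z_a / 10.0))
    c = c_a * (z / z_a) ** n
    k = k_a * (1.0 - 0.088 * math.log(z_a / 10.0)) / (1.0 - 0.088 * math.log(z / 10.0))
    return c, k

# Example: anemometer-level fit at 10 m extrapolated to a 60 m hub height.
print(extrapolate_weibull(c_a=6.0, k_a=2.0, z_a=10.0, z=60.0))
```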
The Dynamics of Conditioning and Extinction
Killeen, Peter R.; Sanabria, Federico; Dolgov, Igor
2009-01-01
Pigeons responded to intermittently reinforced classical conditioning trials with erratic bouts of responding to the CS. Responding depended on whether the prior trial contained a peck, food, or both. A linear-persistence/learning model moved animals into and out of a response state, and a Weibull distribution for number of within-trial responses governed in-state pecking. Variations of trial and inter-trial durations caused correlated changes in rate and probability of responding, and model parameters. A novel prediction—in the protracted absence of food, response rates can plateau above zero—was validated. The model predicted smooth acquisition functions when instantiated with the probability of food, but a more accurate jagged learning curve when instantiated with trial-to-trial records of reinforcement. The Skinnerian parameter was dominant only when food could be accelerated or delayed by pecking. These experiments provide a framework for trial-by-trial accounts of conditioning and extinction that increases the information available from the data, permitting them to comment more definitively on complex contemporary models of momentum and conditioning. PMID:19839699
An Australian stocks and flows model for asbestos.
Donovan, Sally; Pickin, Joe
2016-10-01
All available data on asbestos consumption in Australia were collated in order to determine the most common asbestos-containing materials remaining in the built environment. The proportion of asbestos contained within each material and the types of products these materials are most commonly found in was also determined. The lifetime of these asbestos-containing products was estimated in order to develop a model that projects stocks and flows of asbestos products in Australia through to the year 2100. The model is based on a Weibull distribution and was built in an Excel spreadsheet to make it user-friendly and accessible. The nature of the products under consideration means both their asbestos content and lifetime parameters are highly variable, and so for each of these a high and low estimate is presented along with the estimate used in the model. The user is able to vary the parameters in the model as better data become available. © The Author(s) 2016.
Design Protocols and Analytical Strategies that Incorporate Structural Reliability Models
NASA Technical Reports Server (NTRS)
Duffy, Stephen F.
1995-01-01
The general goal of this project is to establish design protocols that enable the engineer to analyze and predict certain types of behavior in ceramic composites. Sections of the final report address the following: Description of the Problem that Motivated the Technology Development, Description of the New Technology that was Developed, Unique and Novel Features of the Technology and Results/Benefits of Application (year by year accomplishments), and Utilization of New Technology in Non-Aerospace Applications. Activities for this reporting period included the development of a design analysis as part of a cooperative agreement with General Electric Aircraft Engines. The effort focused on modifying the Toughened Ceramics Analysis and Reliability Evaluation of Structures (TCARES) algorithm for use in the design of engine components fabricated from NiAl. Other activities related to the development of an ASTM standard practice for estimating Weibull parameters. The standard focuses on the evaluation and reporting of uniaxial strength data, and the estimation of probability distribution parameters for ceramics which fail in a brittle fashion.
Tensile bond strength of filled and unfilled adhesives to dentin.
Braga, R R; Cesar, P F; Gonzaga, C C
2000-04-01
To determine the tensile bond strength of three filled and two unfilled adhesives applied to bovine dentin. Fragments of the labial dentin of bovine incisors were embedded in PVC cylinders with self-cure acrylic resin, and ground flat using 200 grit and 600 grit sandpaper. The following adhesive systems were tested (n=10): Prime & Bond NT, Prime & Bond NT dual cure, Prime & Bond 2.1, OptiBond Solo and Single Bond. A 3 mm-diameter bonding surface was delimited using a perforated adhesive tape. After etching with 37% phosphoric acid and adhesive application, a resin-based composite truncated cone (TPH, shade A3) was built. The tensile test was performed after 24 hrs of storage in distilled water at 37 degrees C. Failure mode was assessed using a ×10 magnification stereomicroscope. Weibull statistical analysis revealed significant differences in the characteristic strength between Single Bond and Prime & Bond NT dual cure, and between Single Bond and Prime & Bond 2.1. The Weibull parameter (m) was statistically similar among the five groups. Single Bond and Prime & Bond NT showed areas of dentin cohesive failure in most of the specimens. For OptiBond Solo, Prime & Bond NT dual cure and Prime & Bond 2.1, failure was predominantly adhesive.
Failure statistics for commercial lithium ion batteries: A study of 24 pouch cells
NASA Astrophysics Data System (ADS)
Harris, Stephen J.; Harris, David J.; Li, Chen
2017-02-01
There are relatively few publications that assess capacity decline in enough commercial cells to quantify cell-to-cell variation, but those that do show a surprisingly wide variability. Capacity curves cross each other often, a challenge for efforts to measure the state of health and predict the remaining useful life (RUL) of individual cells. We analyze capacity fade statistics for 24 commercial pouch cells, providing an estimate for the time to 5% failure. Our data indicate that RUL predictions based on remaining capacity or internal resistance are accurate only once the cells have already sorted themselves into "better" and "worse" ones. Analysis of our failure data, using maximum likelihood techniques, provides uniformly good fits for a variety of definitions of failure with normal and with 2- and 3-parameter Weibull probability density functions, but we argue against using a 3-parameter Weibull function for our data. The pdf fitting parameters appear to converge after about 15 failures, although business objectives should ultimately determine whether data from a given number of batteries provides sufficient confidence to end lifecycle testing. Increased efforts to make batteries with more consistent lifetimes should lead to improvements in battery cost and safety.
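A minimal sketch of a maximum likelihood fit of a 2-parameter Weibull to failure data of this kind, with right-censoring of cells that had not failed by the end of the test; the cycle counts below are illustrative, not the paper's data:

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import weibull_min

# Illustrative cycles to a capacity-based failure threshold; cells still
# running at the end of the test are right-censored.
failed   = np.array([410., 455., 470., 500., 515., 530., 560., 600., 640., 700.])
censored = np.array([720., 720., 720., 720.])

def neg_log_lik(params):
    k, lam = params               # shape and scale of a 2-parameter Weibull
    if k <= 0 or lam <= 0:
        return np.inf
    ll = weibull_min.logpdf(failed, k, scale=lam).sum()
    ll += weibull_min.logsf(censored, k, scale=lam).sum()  # survivors
    return -ll

res = minimize(neg_log_lik, x0=[2.0, 600.0], method="Nelder-Mead")
k_hat, lam_hat = res.x
print("shape=%.2f scale=%.0f cycles" % (k_hat, lam_hat))
# Cycle count by which about 5% of cells are expected to fail:
print("t(5%%) = %.0f cycles" % weibull_min.ppf(0.05, k_hat, scale=lam_hat))
```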
Development of a subway operation incident delay model using accelerated failure time approaches.
Weng, Jinxian; Zheng, Yang; Yan, Xuedong; Meng, Qiang
2014-12-01
This study aims to develop a subway operational incident delay model using the parametric accelerated failure time (AFT) approach. Six parametric AFT models, including log-logistic, lognormal and Weibull models with fixed and random parameters, are built based on Hong Kong subway operation incident data from 2005 to 2012. In addition, the Weibull model with gamma heterogeneity is also considered to compare the model performance. The goodness-of-fit test results show that the log-logistic AFT model with random parameters is most suitable for estimating the subway incident delay. The results show that a longer subway operation incident delay is highly correlated with the following factors: power cable failure, signal cable failure, turnout communication disruption and crashes involving a casualty. Vehicle failure has the least impact on the increase of subway operation incident delay. According to these results, several possible measures, such as the use of short-distance and wireless communication technology (e.g., WiFi and ZigBee), are suggested to shorten the delay caused by subway operation incidents. Finally, the temporal transferability test results show that the developed log-logistic AFT model with random parameters is stable over time. Copyright © 2014 Elsevier Ltd. All rights reserved.
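A minimal sketch of fitting Weibull and log-logistic AFT models to incident delays, assuming the lifelines package (WeibullAFTFitter, LogLogisticAFTFitter, AIC_) is available; the data set and the covariate name are synthetic stand-ins, not the Hong Kong records:

```python
import numpy as np
import pandas as pd
from lifelines import WeibullAFTFitter, LogLogisticAFTFitter

rng = np.random.default_rng(42)
n = 200
power_cable = rng.integers(0, 2, n)
# Toy delays (minutes): cable failures roughly double the typical delay.
delay = rng.weibull(1.5, n) * 20.0 * np.exp(0.7 * power_cable)
df = pd.DataFrame({"delay_min": delay, "observed": 1, "power_cable": power_cable})

# Compare two parametric AFT families by AIC, as in a goodness-of-fit screen.
for Fitter in (WeibullAFTFitter, LogLogisticAFTFitter):
    aft = Fitter().fit(df, duration_col="delay_min", event_col="observed")
    print(type(aft).__name__, "AIC =", round(aft.AIC_, 1))
```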
Statistical modeling of optical attenuation measurements in continental fog conditions
NASA Astrophysics Data System (ADS)
Khan, Muhammad Saeed; Amin, Muhammad; Awan, Muhammad Saleem; Minhas, Abid Ali; Saleem, Jawad; Khan, Rahimdad
2017-03-01
Free-space optics is an innovative technology that uses the atmosphere as a propagation medium to provide higher data rates. These links are heavily affected by the atmospheric channel, mainly by fog and clouds that scatter and even block the modulated beam of light from reaching the receiver, hence imposing severe attenuation. A comprehensive statistical study of the fog effects and a deep physical understanding of the fog phenomena are very important for suggesting improvements (reliability and efficiency) in such communication systems. In this regard, 6 months of real-time measured fog attenuation data are considered and statistically investigated. A detailed statistical analysis related to each fog event for that period is presented; the best probability density functions are selected on the basis of the Akaike information criterion, while the estimates of unknown parameters are computed by the maximum likelihood estimation technique. The results show that most fog attenuation events follow a normal mixture distribution and some follow the Weibull distribution.
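A minimal sketch of AIC-based selection among candidate attenuation distributions of this kind; the candidate set and the synthetic data are illustrative, and a normal mixture fit would additionally require an EM routine (e.g., scikit-learn's GaussianMixture):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
# Stand-in for measured fog attenuation values (dB/km) during one event.
atten = rng.weibull(2.2, 500) * 40.0

candidates = {
    "weibull": stats.weibull_min,
    "lognorm": stats.lognorm,
    "gamma":   stats.gamma,
}

for name, dist in candidates.items():
    params = dist.fit(atten, floc=0)          # MLE with location fixed at 0
    k_free = len(params) - 1                  # free parameters (loc is fixed)
    loglik = dist.logpdf(atten, *params).sum()
    aic = 2 * k_free - 2 * loglik
    print(f"{name:8s} AIC = {aic:.1f}")
# The distribution with the smallest AIC is retained for that fog event.
```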
Janković, Bojan
2011-10-01
The non-isothermal pyrolysis kinetics of Acetocell (the organosolv) and Lignoboost® (kraft) lignins, in an inert atmosphere, have been studied by thermogravimetric analysis. Using isoconversional analysis, it was concluded that the apparent activation energy for all lignins strongly depends on conversion, showing that the pyrolysis of lignins is not a single chemical process. It was identified that the pyrolysis process of Acetocell and Lignoboost® lignin takes place over three reaction steps, which was confirmed by the appearance of the corresponding isokinetic relationships (IKR). It was found that the major pyrolysis stage of both lignins is characterized by stilbene pyrolysis reactions, which were subsequently followed by decomposition reactions of products derived from the stilbene pyrolytic process. It was concluded that non-isothermal pyrolysis of Acetocell and Lignoboost® lignins can be best described by n-th (n>1) reaction order kinetics, using the Weibull mixture model (as a distributed reactivity model) with alternating shape parameters. Copyright © 2011 Elsevier Ltd. All rights reserved.
Leung, Brian T W; Tsoi, James K H; Matinlinna, Jukka P; Pow, Edmond H N
2015-09-01
Fluorophlogopite glass ceramic (FGC) is a biocompatible, etchable, and millable ceramic with fluoride releasing property. However, its mechanical properties and reliability compared with other machinable ceramics remain undetermined. The purpose of this in vitro study was to compare the mechanical properties of 3 commercially available millable ceramic materials, IPS e.max CAD, Vitablocs Mark II, and Vita Enamic, with an experimental FGC. Each type of ceramic block was sectioned into beams (n=15) of standard dimensions of 2×2×15 mm. Before mechanical testing, specimens of the IPS e.max CAD group were further fired for final crystallization. Flexural strength was determined by the 3-point bend test with a universal loading machine at a cross head speed of 1 mm/min. Hardness was determined with a hardness tester with 5 Vickers hardness indentations (n=5) using a 1.96 N load and a dwell time of 15 seconds. Selected surfaces were examined by scanning electron microscopy and energy-dispersive x-ray spectroscopy. Data were analyzed by the 1-way ANOVA test and Weibull analysis (α=.05). Weibull parameters, including the Weibull modulus (m) as well as the characteristic strength at 63.2% (η) and 10.0% (B10), were obtained. A significant difference in flexural strength (P<.001) was found among groups, with IPS e.max CAD (341.88 ±40.25 MPa)>Vita Enamic (145.95 ±12.65 MPa)>Vitablocs Mark II (106.67 ±18.50 MPa), and FGC (117.61 ±7.62 MPa). The Weibull modulus ranged from 6.93 to 18.34, with FGC showing the highest Weibull modulus among the 4 materials. The Weibull plot revealed that IPS e.max CAD>Vita Enamic>FGC>Vitablocs Mark II for the characteristic strength at both 63.2% (η) and 10.0% (B10). Significant difference in Vickers hardness among groups (P<.001) was found with IPS e.max CAD (731.63 ±30.64 H(V))>Vitablocs Mark II (594.74 ±25.22 H(V))>Vita Enamic (372.29 ±51.23 H(V))>FGC (153.74 ±23.62 H(V)). The flexural strength and Vickers hardness of IPS e.max CAD were significantly higher than those of the 3 materials tested. The FGC's flexural strength was comparable with Vitablocs Mark II. The FGC's Weibull modulus was the highest, while its Vickers hardness was the lowest among the materials tested. Copyright © 2015 Editorial Council for the Journal of Prosthetic Dentistry. Published by Elsevier Inc. All rights reserved.
Quantifying the impact of sub-grid surface wind variability on sea salt and dust emissions in CAM5
NASA Astrophysics Data System (ADS)
Zhang, Kai; Zhao, Chun; Wan, Hui; Qian, Yun; Easter, Richard C.; Ghan, Steven J.; Sakaguchi, Koichi; Liu, Xiaohong
2016-02-01
This paper evaluates the impact of sub-grid variability of surface wind on sea salt and dust emissions in the Community Atmosphere Model version 5 (CAM5). The basic strategy is to calculate emission fluxes multiple times, using different wind speed samples of a Weibull probability distribution derived from model-predicted grid-box mean quantities. In order to derive the Weibull distribution, the sub-grid standard deviation of surface wind speed is estimated by taking into account four mechanisms: turbulence under neutral and stable conditions, dry convective eddies, moist convective eddies over the ocean, and air motions induced by mesoscale systems and fine-scale topography over land. The contributions of turbulence and dry convective eddy are parameterized using schemes from the literature. Wind variabilities caused by moist convective eddies and fine-scale topography are estimated using empirical relationships derived from an operational weather analysis data set at 15 km resolution. The estimated sub-grid standard deviations of surface wind speed agree well with reference results derived from 1 year of global weather analysis at 15 km resolution and from two regional model simulations with 3 km grid spacing. The wind-distribution-based emission calculations are implemented in CAM5. In terms of computational cost, the increase in total simulation time turns out to be less than 3 %. Simulations at 2° resolution indicate that sub-grid wind variability has relatively small impacts (about 7 % increase) on the global annual mean emission of sea salt aerosols, but considerable influence on the emission of dust. Among the considered mechanisms, dry convective eddies and mesoscale flows associated with topography are major causes of dust emission enhancement. With all the four mechanisms included and without additional adjustment of uncertain parameters in the model, the simulated global and annual mean dust emission increase by about 50 % compared to the default model. By tuning the globally constant dust emission scale factor, the global annual mean dust emission, aerosol optical depth, and top-of-atmosphere radiative fluxes can be adjusted to the level of the default model, but the frequency distribution of dust emission changes, with more contribution from weaker wind events and less contribution from stronger wind events. In Africa and Asia, the overall frequencies of occurrence of dust emissions increase, and the seasonal variations are enhanced, while the geographical patterns of the emission frequency show little change.
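A minimal sketch of the wind-distribution-based flux idea: match a Weibull distribution to a grid-box mean wind and sub-grid standard deviation, sample it, and average a nonlinear, thresholded emission law. The cubic wind dependence and threshold below are illustrative stand-ins, not the CAM5 sea salt or dust parameterizations:

```python
import numpy as np
from scipy.special import gamma as G
from scipy.optimize import brentq

def weibull_from_moments(u_mean, u_std):
    """Weibull shape k and scale c matching a grid-box mean and sub-grid std."""
    cv = u_std / u_mean
    f = lambda k: np.sqrt(G(1 + 2 / k) / G(1 + 1 / k) ** 2 - 1) - cv
    k = brentq(f, 0.5, 100.0)
    c = u_mean / G(1 + 1 / k)
    return k, c

def mean_emission(u_mean, u_std, u_thresh=6.0, n=10000, seed=0):
    """Average an illustrative thresholded emission law over sub-grid samples."""
    k, c = weibull_from_moments(u_mean, u_std)
    u = c * np.random.default_rng(seed).weibull(k, n)
    return np.mean(np.where(u > u_thresh, (u - u_thresh) * u ** 2, 0.0))

# Sub-grid variability raises emissions when the mean wind is near or below
# the threshold, which is the qualitative effect described in the abstract.
print(mean_emission(5.0, 3.0), mean_emission(5.0, 1.0))
```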
NASA Astrophysics Data System (ADS)
Jiang, Zhi-Qiang; Zhou, Wei-Xing; Tan, Qun-Zhao
2009-11-01
Massive multiplayer online role-playing games (MMORPGs) are very popular in China, which provides a potential platform for scientific research. We study the online-offline activities of avatars in an MMORPG to understand their game-playing behavior. The statistical analysis unveils that the active avatars can be classified into three types. The avatars of the first type are owned by game cheaters who go online and offline in preset time intervals with the online duration distributions dominated by pulses. The second type of avatars is characterized by a Weibull distribution in the online durations, which is confirmed by statistical tests. The distributions of online durations of the remaining individual avatars differ from the above two types and cannot be described by a simple form. These findings have potential applications in the game industry.
Effect of the microstructure on the lifetime of dental ceramics.
Borba, Márcia; de Araújo, Maico D; Fukushima, Karen A; Yoshimura, Humberto N; Cesar, Paulo F; Griggs, Jason A; Della Bona, Alvaro
2011-07-01
To evaluate the effect of the microstructure on the Weibull and slow crack growth (SCG) parameters and on the lifetime of three ceramics used as framework materials for fixed partial dentures (FPDs) (YZ - Vita In-Ceram YZ; IZ - Vita In-Ceram Zirconia; AL - Vita In-Ceram AL) and of two veneering porcelains (VM7 and VM9). Bar-shaped specimens were fabricated according to the manufacturer's instructions. Specimens were tested in three-point flexure in 37°C artificial saliva. Weibull analysis (n=30) and a constant stress-rate test (n=10) were used to determine the Weibull modulus (m) and SCG coefficient (n), respectively. Microstructural and fractographic analyses were performed using SEM. ANOVA and Tukey's test (α=0.05) were used to statistically analyze data obtained with both microstructural and fractographic analyses. YZ and AL presented high crystalline content and low porosity (0.1-0.2%). YZ had the highest characteristic strength (σ(0)) value (911MPa) followed by AL (488MPa) and IZ (423MPa). Lower σ(0) values were observed for the porcelains (68-75MPa). Except for IZ and VM7, m values were similar among the ceramic materials. Higher n values were found for YZ (76) and AL (72), followed by IZ (54) and the veneering materials (36-44). Lifetime predictions showed that YZ was the material with the best mechanical performance. The size of the critical flaw was similar among the framework materials (34-48μm) and among the porcelains (75-86μm). The microstructure influenced the mechanical and SCG behavior of the studied materials and, consequently, the lifetime predictions. Copyright © 2011 Academy of Dental Materials. Published by Elsevier Ltd. All rights reserved.
Random local temporal structure of category fluency responses.
Meyer, David J; Messer, Jason; Singh, Tanya; Thomas, Peter J; Woyczynski, Wojbor A; Kaye, Jeffrey; Lerner, Alan J
2012-04-01
The Category Fluency Test (CFT) provides a sensitive measurement of cognitive capabilities in humans related to retrieval from semantic memory. In particular, it is widely used to assess progress of cognitive impairment in patients with dementia. Previous research shows that, in the first approximation, the intensity of tested individuals' responses within a standard 60-s test period decays exponentially with time, with faster decay rates for more cognitively impaired patients. Such a decay rate can then be viewed as a global (macro) diagnostic parameter of each test. In the present paper we focus on the statistical properties of the properly de-trended time intervals between consecutive responses (inter-call times) in the Category Fluency Test. In a sense, those properties reflect the local (micro) structure of the response generation process. We find that a good approximation for the distribution of the de-trended inter-call times is provided by the Weibull Distribution, a probability distribution that appears naturally in this context as a distribution of a minimum of independent random quantities and is the standard tool in industrial reliability theory. This insight leads us to a new interpretation of the concept of "navigating a semantic space" via patient responses.
NASA Technical Reports Server (NTRS)
Leybold, H. A.
1971-01-01
Random numbers were generated with the aid of a digital computer and transformed such that the probability density function of a discrete random load history composed of these random numbers had one of the following non-Gaussian distributions: Poisson, binomial, log-normal, Weibull, and exponential. The resulting random load histories were analyzed to determine their peak statistics and were compared with cumulative peak maneuver-load distributions for fighter and transport aircraft in flight.
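A minimal sketch of the inverse-CDF transformation described above, applied to one stream of uniform pseudo-random numbers; only continuous targets are shown and the distribution parameters are illustrative:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
u = rng.random(100_000)                 # uniform pseudo-random numbers

# Transform the same uniform stream to different target distributions via
# the inverse CDF (percent-point function).
loads = {
    "lognormal":   stats.lognorm.ppf(u, s=0.5, scale=1.0),
    "weibull":     stats.weibull_min.ppf(u, c=1.4, scale=1.0),
    "exponential": stats.expon.ppf(u, scale=1.0),
}

# Peak statistics of the resulting load histories differ markedly even though
# they derive from the same underlying random number sequence.
for name, x in loads.items():
    print(f"{name:11s} mean={x.mean():.3f}  99th pct={np.quantile(x, 0.99):.3f}")
```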
NASA Astrophysics Data System (ADS)
Mathieu, Jean-Philippe; Inal, Karim; Berveiller, Sophie; Diard, Olivier
2010-11-01
The local approach to brittle fracture for low-alloyed steels is discussed in this paper. A bibliographical introduction highlights general trends and consensual points of the topic and evokes debatable aspects. French RPV steel 16MND5 (equivalent to ASTM A508 Cl.3) is then used as a model material to study the influence of temperature on brittle fracture. A micromechanical model of brittle fracture at the elementary volume scale, already used in previous work, is then recalled. It involves a multiscale modelling of microstructural plasticity that has been tuned on experimental measurements of inter-phase and inter-granular stress heterogeneities. The fracture probability of the elementary volume can then be computed using a randomly attributed defect size distribution based on a realistic carbide repartition. This defect distribution is then deterministically correlated to stress heterogeneities simulated within the microstructure using a weakest-link hypothesis on the elementary volume, which results in a deterministic stress to fracture. Repeating the process allows Weibull parameters to be computed for the elementary volume. This tool is then used to investigate the physical mechanisms that could explain the already experimentally observed temperature dependence of Beremin's parameter for 16MND5 steel. It is shown that, assuming the hypotheses made in this work about cleavage micro-mechanisms are correct, the effective equivalent surface energy (i.e., surface energy plus plastically dissipated energy when blunting the crack tip) for propagating a crack has to be temperature dependent to explain the temperature evolution of Beremin's parameters.
The Wind Energy Potential of Kurdistan, Iran
Arefi, Farzad; Moshtagh, Jamal; Moradi, Mohammad
2014-01-01
In the current work, using statistical methods and available software, the wind energy potential of regions suitable for the installation of wind turbines in Qorveh has been investigated. Information was obtained from the weather stations of Baneh, Bijar, Zarina, Saqez, Sanandaj, Qorveh, and Marivan. The monthly average and maximum wind speeds were investigated for the years 2000–2010 and the related curves were drawn. The Golobad curve (direction and percentage of dominant wind and calm wind as monthly rate) for the years 1997–2000 was analyzed and drawn with plot software. The ten-minute wind speed (at 10, 30, and 60 m height) and direction (at 37.5 and 10 m height) data were collected from weather stations of the Iranian new energy organization. The wind speed distribution during one year was evaluated using the two-parameter Weibull probability density function, and the Weibull curve histograms were drawn with MATLAB software. According to the average wind speed of the stations and the technical specifications of the available turbine types, a suitable wind turbine was selected for each station. Finally, the Divandareh and Qorveh sites with favorable potential were considered for installation of wind turbines and construction of wind farms. PMID:27355042
NASA Astrophysics Data System (ADS)
Wan, Fubin; Tan, Yuanyuan; Jiang, Zhenhua; Chen, Xun; Wu, Yinong; Zhao, Peng
2017-12-01
Lifetime and reliability are the two performance parameters of premium importance for modern space Stirling-type pulse tube refrigerators (SPTRs), which are required to operate in excess of 10 years. Demonstration of these parameters provides a significant challenge. This paper proposes a lifetime prediction and reliability estimation method that utilizes accelerated degradation testing (ADT) for SPTRs related to gaseous contamination failure. The method was experimentally validated via three groups of gaseous contamination ADT. First, the performance degradation model based on the mechanism of contamination failure and the material outgassing characteristics of SPTRs was established. Next, a preliminary test was performed to determine whether the mechanism of contamination failure of the SPTRs during ADT is consistent with normal life testing. Subsequently, the experimental program of ADT was designed for SPTRs. Then, three groups of gaseous contamination ADT were performed at elevated ambient temperatures of 40 °C, 50 °C, and 60 °C, respectively, and the estimated lifetimes of the SPTRs under normal conditions were obtained through an acceleration model (the Arrhenius model). The results show good fitting of the degradation model with the experimental data. Finally, we obtained the reliability estimation of SPTRs by using the Weibull distribution. The proposed novel methodology enables the reliability of SPTRs designed for more than 10 years of operation to be estimated in less than one year of testing.
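A minimal sketch of the Arrhenius acceleration step used to map degradation observed at the elevated ADT temperatures back to the normal operating condition; the activation energy below is an assumed placeholder, not the value estimated in the paper:

```python
import math

K_B = 8.617e-5  # Boltzmann constant, eV/K

def acceleration_factor(t_use_c, t_stress_c, ea_ev=0.5):
    """Arrhenius acceleration factor between a use and a stress temperature.

    ea_ev is an assumed activation energy for the contamination/outgassing
    mechanism; the study estimates its own value from the three ADT groups.
    """
    t_use = t_use_c + 273.15
    t_stress = t_stress_c + 273.15
    return math.exp(ea_ev / K_B * (1.0 / t_use - 1.0 / t_stress))

# Degradation accumulated over one year at 60 °C corresponds to roughly this
# many years of degradation at a 25 °C use condition:
print(round(acceleration_factor(25.0, 60.0), 2))
```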
Modelling the Ozone-Based Treatments for Inactivation of Microorganisms.
Brodowska, Agnieszka Joanna; Nowak, Agnieszka; Kondratiuk-Janyska, Alina; Piątkowski, Marcin; Śmigielski, Krzysztof
2017-10-09
The paper presents the development of a model for ozone treatment, in a dynamic bed, of different microorganisms (Bacillus subtilis, B. cereus, B. pumilus, Escherichia coli, Pseudomonas fluorescens, Aspergillus niger, Eupenicillium cinnamopurpureum) on a heterogeneous matrix (juniper berries, cardamom seeds) initially treated with a range of ozone doses and contact times. Taking into account the varying susceptibility of microorganisms to ozone, it was of great importance to develop a sufficiently effective ozone dose for preserving food products, based on the microbial model for different strains. For this purpose, we have chosen the Weibull model to describe the survival curves of different microorganisms. Based on the results of microorganism survival modelling after ozone treatment and considering the strains least susceptible to ozone, we selected the critical ones. Among the tested strains, those from the genus Bacillus were recognized as the most critical. In particular, B. subtilis and B. pumilus possessed the highest resistance to ozone treatment because the time needed to reach the lowest survival level was the longest (up to 17.04 min and 16.89 min for B. pumilus reduction on the juniper berry and cardamom seed matrices, respectively). Ozone treatment allowed microorganisms to be inactivated to lower survival rates at an ozone dose of 20.0 g O₃/m³ O₂ (flow rate of 0.4 L/min) and contact times of up to 20 min. The results demonstrated a linear correlation between the parameters p and k of the Weibull distribution, providing an opportunity to calculate a fitted equation for the process.
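A minimal sketch of fitting a Weibull-type survival curve to log-reduction data of this kind; the parameterization (-k·t^p) is chosen only to match the abstract's parameter names, and the data points are illustrative, not the measured survival ratios:

```python
import numpy as np
from scipy.optimize import curve_fit

def weibull_log_survival(t, k, p):
    """log10 survival ratio under a Weibull inactivation model: -k * t**p.

    The exact parameterization used by the authors may differ (e.g. the
    Mafart form -(t/delta)**p); this form is an assumption for illustration.
    """
    return -k * t ** p

# Illustrative ozone-exposure data: contact time (min) vs log10(N/N0).
t_obs = np.array([1.0, 2.0, 5.0, 10.0, 15.0, 20.0])
logS  = np.array([-0.2, -0.4, -0.9, -1.6, -2.1, -2.5])

(k_hat, p_hat), _ = curve_fit(weibull_log_survival, t_obs, logS, p0=[0.2, 1.0])
print("k=%.3f  p=%.3f" % (k_hat, p_hat))
# Contact time needed to reach a 3-log reduction under the fitted model:
print("t(3-log) = %.1f min" % ((3.0 / k_hat) ** (1.0 / p_hat)))
```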
A Modeling Approach to Fiber Fracture in Melt Impregnation
NASA Astrophysics Data System (ADS)
Ren, Feng; Zhang, Cong; Yu, Yang; Xin, Chunling; Tang, Ke; He, Yadong
2017-02-01
The effect of process variables such as roving pulling speed, melt temperature and number of pins on fiber fracture during the processing of thermoplastic-based composites was investigated in this study. Melt impregnation was used in this process for continuous glass fiber reinforced thermoplastic composites. Previous investigators have suggested a variety of models for melt impregnation, while comparatively little effort has been spent on modeling the fiber fracture caused by the viscous resin. Herein, a mathematical model was developed for the impregnation process to predict the fiber fracture rate and describe the experimental results with the Weibull intensity distribution function. The optimal parameters of this process were obtained by an orthogonal experiment. The results suggest that fiber fracture is caused by viscous shear stress on the fiber bundle in the melt impregnation mold as the bundle is pulled.
Structural Design Parameters for Germanium
NASA Technical Reports Server (NTRS)
Salem, Jon; Rogers, Richard; Baker, Eric
2017-01-01
The fracture toughness and slow crack growth parameters of germanium supplied as single crystal beams and coarse grain disks were measured. Although germanium is anisotropic (A* ≈ 1.7), it is not as anisotropic as SiC, NiAl, or Cu. Thus the fracture toughness was similar on the {100}, {110}, and {111} planes; however, measurements associated with randomly oriented grinding cracks were 6 to 30% higher. Crack extension in ring-loaded disks occurred on the {111} planes due to both the lower fracture energy and the higher stresses on the stiff {111} planes. Germanium exhibits a Weibull scale effect, but does not exhibit significant slow crack growth in distilled water (n ≈ 100), implying that design for quasi-static loading can be performed with scaled strength statistics. Practical values for engineering design are a fracture toughness of 0.69 ± 0.02 MPa√m (megapascals per square root meter) and a Weibull modulus of m = 6 ± 2. For well-ground and reasonably handled coupons, average fracture strength should be greater than 40 megapascals. Aggregate polycrystalline elastic constants are E_poly ≈ 131 gigapascals and ν_poly ≈ 0.22.
NASA Astrophysics Data System (ADS)
Pisano, Luca; Vessia, Giovanna; Vennari, Carmela; Parise, Mario
2015-04-01
Empirical rainfall thresholds are a well-established method to draw information about the Duration (D) and Cumulated (E) values of the rainfall events likely to initiate shallow landslides. To this end, rain-gauge records of rainfall heights are commonly used. Several procedures can be applied to address the calculation of the Duration-Cumulated height and, eventually, the Intensity values related to the rainfall events responsible for shallow landslide onset. A large number of procedures are drawn from particular geological settings and climate conditions, based on an expert identification of the rainfall event. A few researchers recently devised automated procedures to reconstruct the rainfall events responsible for landslide onset. In this study, 300 D, E pairs, related to shallow landslides that occurred over the ten-year span 2002-2012 on Italian territory, have been drawn by means of two procedures: the expert method (Brunetti et al., 2010) and the automated method (Vessia et al., 2014). The two procedures start from the same sources of information on shallow landslides that occurred during or soon after a rainfall. Although they share the method for selecting the date (up to the hour) of the landslide occurrence, the site of the landslide, and the representative rain gauge for the rainfall, they differ in how the Duration and Cumulated height of the rainfall event are calculated. Moreover, the expert procedure identifies only one D, E pair for each landslide, whereas the automated procedure draws 6 possible D, E pairs for the same landslide event. The 300 D, E pairs calculated by the automated procedure reproduce about 80% of the E values and about 60% of the D values calculated by the expert procedure. Unfortunately, no standard methods are available for checking the forecasting ability of both the expert and the automated reconstruction of the true D, E pairs that result in shallow landslides. Nonetheless, a statistical analysis of the marginal distributions of the seven samples of 300 D and E values is performed in this study. The main objective of this statistical analysis is to highlight similarities and differences in the two sets of samples of Duration and Cumulated values collected by the two procedures. First, the sample distributions were investigated: the seven E samples are lognormally distributed, whereas the D samples are all Weibull distributed. For the E samples, owing to their lognormal distribution, statistical tests can be applied to check two null hypotheses: equal mean values through the Student t-test, and equal standard deviations through the Fisher test. These two hypotheses are accepted for the seven E samples, meaning that they come from the same population, at a confidence level of 95%. Conversely, the preceding tests cannot be applied to the seven D samples, which are Weibull distributed with shape parameters k ranging from 0.9 to 1.2. Nonetheless, the two procedures calculate the rainfall event through the selection of the E values; after that, the D values are drawn. Thus, the results of this statistical analysis preliminarily confirm the similarity of the two sets of D, E values drawn from the two different procedures. References: Brunetti, M.T., Peruccacci, S., Rossi, M., Luciani, S., Valigi, D., and Guzzetti, F.: Rainfall thresholds for the possible occurrence of landslides in Italy, Nat. Hazards Earth Syst. Sci., 10, 447-458, doi:10.5194/nhess-10-447-2010, 2010.
Vessia G., Parise M., Brunetti M.T., Peruccacci S., Rossi M., Vennari C., and Guzzetti F.: Automated reconstruction of rainfall events responsible for shallow landslides, Nat. Hazards Earth Syst. Sci., 14, 2399-2408, doi: 10.5194/nhess-14-2399-2014, 2014.
Viana, Duarte S; Santamaría, Luis; Figuerola, Jordi
2016-02-01
Propagule retention time is a key factor in determining propagule dispersal distance and the shape of "seed shadows". Propagules dispersed by animal vectors are either ingested and retained in the gut until defecation or attached externally to the body until detachment. Retention time is a continuous variable, but it is commonly measured at discrete time points, according to pre-established sampling time-intervals. Although parametric continuous distributions have been widely fitted to these interval-censored data, the performance of different fitting methods has not been evaluated. To investigate the performance of five different fitting methods, we fitted parametric probability distributions to typical discretized retention-time data with known distribution using as data-points either the lower, mid or upper bounds of sampling intervals, as well as the cumulative distribution of observed values (using either maximum likelihood or non-linear least squares for parameter estimation); then compared the estimated and original distributions to assess the accuracy of each method. We also assessed the robustness of these methods to variations in the sampling procedure (sample size and length of sampling time-intervals). Fittings to the cumulative distribution performed better for all types of parametric distributions (lognormal, gamma and Weibull distributions) and were more robust to variations in sample size and sampling time-intervals. These estimated distributions had negligible deviations of up to 0.045 in cumulative probability of retention times (according to the Kolmogorov-Smirnov statistic) in relation to original distributions from which propagule retention time was simulated, supporting the overall accuracy of this fitting method. In contrast, fitting the sampling-interval bounds resulted in greater deviations that ranged from 0.058 to 0.273 in cumulative probability of retention times, which may introduce considerable biases in parameter estimates. We recommend the use of cumulative probability to fit parametric probability distributions to propagule retention time, specifically using maximum likelihood for parameter estimation. Furthermore, the experimental design for an optimal characterization of unimodal propagule retention time should contemplate at least 500 recovered propagules and sampling time-intervals not larger than the time peak of propagule retrieval, except in the tail of the distribution where broader sampling time-intervals may also produce accurate fits.
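A minimal sketch of fitting a Weibull distribution to interval-censored retention-time data through the cumulative distribution, here using the non-linear least squares variant of the approach described above; the interval boundaries and counts are illustrative, not the study's data:

```python
import numpy as np
from scipy import stats
from scipy.optimize import curve_fit

# Counts of propagules recovered in consecutive sampling intervals (hours).
interval_ends = np.array([1, 2, 4, 6, 8, 12, 24], dtype=float)
counts        = np.array([5, 30, 80, 60, 40, 25, 10], dtype=float)

# Fit the Weibull CDF to the empirical cumulative proportion at each interval
# end, rather than to the interval bounds themselves.
cum_prop = np.cumsum(counts) / counts.sum()
cdf = lambda t, k, lam: stats.weibull_min.cdf(t, k, scale=lam)
(k_hat, lam_hat), _ = curve_fit(cdf, interval_ends, cum_prop, p0=[1.5, 5.0])

print("shape=%.2f scale=%.2f h" % (k_hat, lam_hat))
print("mean retention = %.2f h" % stats.weibull_min.mean(k_hat, scale=lam_hat))
```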
Cai, Jing; Li, Shan; Zhang, Haixin; Zhang, Shuoxin; Tyree, Melvin T
2014-01-01
Vulnerability curves (VCs) generally can be fitted to the Weibull equation; however, a growing number of VCs appear to be recalcitrant, that is, they deviate from a single Weibull but seem to fit dual Weibull curves. We hypothesize that dual Weibull curves in Hippophae rhamnoides L. are due to different vessel diameter classes, inter-vessel hydraulic connections or vessels versus fibre tracheids. We used dye staining techniques, hydraulic measurements and quantitative anatomy measurements to test these hypotheses. The fibres contribute 1.3% of the total stem conductivity, which eliminates the hypothesis that fibre tracheids account for the second Weibull curve. Nevertheless, the staining pattern of vessels and fibre tracheids suggested that fibres might function as a hydraulic bridge between adjacent vessels. We also argue that fibre bridges are safer than vessel-to-vessel pits and put forward the concept as a new paradigm. Hence, we tentatively propose that the first Weibull curve may be accounted for by vessels connected to each other directly by pit fields, while the second Weibull curve is associated with vessels that are connected almost exclusively by fibre bridges. Further research is needed to test the concept of fibre bridge safety in species that have recalcitrant or normal Weibull curves. © 2013 John Wiley & Sons Ltd.
Lin, Wei-Shao; Ercoli, Carlo; Feng, Changyong; Morton, Dean
2012-07-01
The objective of this study was to compare the effect of veneering porcelain (monolithic or bilayer specimens) and core fabrication technique (heat-pressed or CAD/CAM) on the biaxial flexural strength and Weibull modulus of leucite-reinforced and lithium-disilicate glass ceramics. In addition, the effect of veneering technique (heat-pressed or powder/liquid layering) for zirconia ceramics on the biaxial flexural strength and Weibull modulus was studied. Five ceramic core materials (IPS Empress Esthetic, IPS Empress CAD, IPS e.max Press, IPS e.max CAD, IPS e.max ZirCAD) and three corresponding veneering porcelains (IPS Empress Esthetic Veneer, IPS e.max Ceram, IPS e.max ZirPress) were selected for this study. Each core material group contained three subgroups based on the core material thickness and the presence of corresponding veneering porcelain as follows: 1.5 mm core material only (subgroup 1.5C), 0.8 mm core material only (subgroup 0.8C), and 1.5 mm core/veneer group: 0.8 mm core with 0.7 mm corresponding veneering porcelain with a powder/liquid layering technique (subgroup 0.8C-0.7VL). The ZirCAD group had one additional 1.5 mm core/veneer subgroup with 0.7 mm heat-pressed veneering porcelain (subgroup 0.8C-0.7VP). The biaxial flexural strengths were compared for each subgroup (n = 10) according to ISO standard 6872:2008 with ANOVA and Tukey's post hoc multiple comparison test (p≤ 0.05). The reliability of strength was analyzed with the Weibull distribution. For all core materials, the 1.5 mm core/veneer subgroups (0.8C-0.7VL, 0.8C-0.7VP) had significantly lower mean biaxial flexural strengths (p < 0.0001) than the other two subgroups (subgroups 1.5C and 0.8C). For the ZirCAD group, the 0.8C-0.7VL subgroup had significantly lower flexural strength (p= 0.004) than subgroup 0.8C-0.7VP. Nonetheless, both veneered ZirCAD groups showed greater flexural strength than the monolithic Empress and e.max groups, regardless of core thickness and fabrication techniques. Comparing fabrication techniques, Empress Esthetic/CAD and e.max Press/CAD had similar biaxial flexural strengths (p= 0.28 for the Empress pair; p= 0.87 for the e.max pair); however, e.max CAD/Press groups had significantly higher flexural strength (p < 0.0001) than Empress Esthetic/CAD groups. Monolithic core specimens presented higher Weibull moduli for all selected core materials. For the ZirCAD group, although the bilayer 0.8C-0.7VL subgroup exhibited significantly lower flexural strength, it had a higher Weibull modulus than the 0.8C-0.7VP subgroup. The present study suggests that veneering porcelain onto a ceramic core material diminishes the flexural strength and the reliability of the bilayer specimens. Leucite-reinforced glass-ceramic cores have lower flexural strength than lithium-disilicate ones, while fabrication techniques (heat-pressed or CAD/CAM) and specimen thicknesses do not affect the flexural strength of all glass ceramics. Compared with the heat-pressed veneering technique, the powder/liquid veneering technique exhibited lower flexural strength but increased reliability with a higher Weibull modulus for zirconia bilayer specimens. Zirconia-veneered ceramics exhibited greater flexural strength than monolithic leucite-reinforced and lithium-disilicate ceramics regardless of zirconia veneering techniques (heat-pressed or powder/liquid technique). © 2012 by the American College of Prosthodontists.
Statistical behavior of the tensile property of heated cotton fiber
USDA-ARS?s Scientific Manuscript database
The temperature dependence of the tensile property of single cotton fiber was studied in the range of 160-300°C using the Favimat test, and its statistical behavior was interpreted in terms of structural changes. The tenacity of control cotton fiber was well described by the single Weibull distribution,...
Abutment design for implant-supported indirect composite molar crowns: reliability and fractography.
Bonfante, Estevam Augusto; Suzuki, Marcelo; Lubelski, William; Thompson, Van P; de Carvalho, Ricardo Marins; Witek, Lukasz; Coelho, Paulo G
2012-12-01
To investigate the reliability of titanium abutments veneered with indirect composites for implant-supported crowns and the possibility to trace back the fracture origin by qualitative fractographic analysis. Large base (LB) (6.4-mm diameter base, with a 4-mm high cone in the center for composite retention), small base (SB-4) (5.2-mm base, 4-mm high cone), and small base with cone shortened to 2 mm (SB-2) Ti abutments were used. Each abutment received incremental layers of indirect resin composite until completing the anatomy of a maxillary molar crown. Step-stress accelerated-life fatigue testing (n = 18 each) was performed in water. Weibull curves with use stress of 200 N for 50,000 and 100,000 cycles were calculated. Probability Weibull plots examined the differences between groups. Specimens were inspected in light-polarized and scanning electron microscopes for fractographic analysis. Use level probability Weibull plots showed Beta values of 0.27 for LB, 0.32 for SB-4, and 0.26 for SB-2, indicating that failures were not influenced by fatigue and damage accumulation. The data replotted as Weibull distribution showed no significant difference in the characteristic strengths between LB (794 N) and SB-4 abutments (836 N), which were both significantly higher than SB-2 (601 N). Failure mode was cohesive within the composite for all groups. Fractographic markings showed that failures initiated at the indentation area and propagated toward the margins of cohesively failed composite. Reliability was not influenced by abutment design. Qualitative fractographic analysis of the failed indirect composite was feasible. © 2012 by the American College of Prosthodontists.
Long-term statistics of extreme tsunami height at Crescent City
NASA Astrophysics Data System (ADS)
Dong, Sheng; Zhai, Jinjin; Tao, Shanshan
2017-06-01
Historically, Crescent City is one of the most vulnerable communities impacted by tsunamis along the west coast of the United States, largely owing to its offshore geography. Trans-ocean tsunamis usually produce large wave runup at Crescent Harbor, resulting in catastrophic damage, property loss and loss of life. Determining the return values of tsunami height from relatively short-term observation data is of great significance for assessing tsunami hazards and improving engineering design along the coast of Crescent City. In the present study, the extreme tsunami heights observed along the coast of Crescent City from 1938 to 2015 are fitted using six different probabilistic distributions, namely, the Gumbel distribution, the Weibull distribution, the maximum entropy distribution, the lognormal distribution, the generalized extreme value distribution and the generalized Pareto distribution. The maximum likelihood method is applied to estimate the parameters of all the above distributions. Both the Kolmogorov-Smirnov test and the root mean square error method are utilized for goodness-of-fit testing, and the better fitting distribution is selected. Assuming that the occurrence frequency of tsunami in each year follows the Poisson distribution, the Poisson compound extreme value distribution can be used to fit the annual maximum tsunami amplitude, and then the point and interval estimations of return tsunami heights are calculated for structural design. The results show that the Poisson compound extreme value distribution fits tsunami heights very well and is suitable for determining the return tsunami heights for coastal disaster prevention.
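As a minimal illustration of the fitting step described above (not the authors' code), the sketch below fits a few candidate distributions to a hypothetical sample of annual maximum tsunami heights by maximum likelihood with scipy, screens them with a Kolmogorov-Smirnov test and reads off a 50-year return level. The data values and the 50-year horizon are invented for demonstration only.

# Hypothetical illustration: ML fitting and goodness-of-fit screening of
# candidate distributions for annual maximum tsunami heights (metres).
import numpy as np
from scipy import stats

heights = np.array([0.3, 0.5, 0.4, 1.1, 0.2, 0.8, 2.2, 0.6, 0.9, 0.4,
                    1.5, 0.7, 0.3, 1.0, 0.5])  # invented sample

candidates = {
    "weibull":    stats.weibull_min,
    "gumbel":     stats.gumbel_r,
    "lognormal":  stats.lognorm,
    "gen. extreme value": stats.genextreme,
}

for name, dist in candidates.items():
    params = dist.fit(heights)                       # maximum likelihood estimates
    ks_stat, p_value = stats.kstest(heights, dist.cdf, args=params)
    rp50 = dist.ppf(1.0 - 1.0 / 50.0, *params)       # height exceeded once per 50 years on average
    print(f"{name:20s} KS={ks_stat:.3f} p={p_value:.2f} 50-yr height={rp50:.2f} m")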
The Weibull probabilities analysis on the single kenaf fiber
NASA Astrophysics Data System (ADS)
Ibrahim, I.; Sarip, S.; Bani, N. A.; Ibrahim, M. H.; Hassan, M. Z.
2018-05-01
Kenaf fiber has great potential to replace synthetic fibers in composites owing to advantages such as environmental friendliness and outstanding performance. However, the main issue with using this natural fiber in structural composites is the inconsistency of its mechanical properties. Here, the influence of the gage length on the mechanical properties of single kenaf fiber was evaluated. The fiber was tested using a universal testing machine at a loading rate of 1 mm per min following the ASTM D3822 standard. In this study, treated fibers with gage lengths of 20, 30 and 40 mm were tested. Weibull probability analysis was then used to characterize the tensile strength and Young's modulus of the kenaf fiber. The average tensile strength predicted from the fitted parameters is in good agreement with the experimental results.
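A hedged sketch of the kind of Weibull analysis described above: a two-parameter Weibull distribution is fitted to single-fibre tensile strengths at one gauge length, and the mean strength is recovered from the fitted scale and shape as scale * Gamma(1 + 1/shape). The strength values are invented, not the paper's data.

# Hypothetical sketch: two-parameter Weibull fit of single-fibre tensile strengths (MPa).
import numpy as np
from scipy import stats
from scipy.special import gamma

strengths = np.array([310., 412., 276., 389., 455., 330., 298., 370., 405., 342.])

# Fix the location parameter at zero to obtain the usual two-parameter form.
shape, loc, scale = stats.weibull_min.fit(strengths, floc=0)

mean_pred = scale * gamma(1.0 + 1.0 / shape)   # E[X] of a Weibull(shape, scale)
print(f"Weibull modulus (shape) = {shape:.2f}, scale = {scale:.1f} MPa")
print(f"Predicted mean strength = {mean_pred:.1f} MPa vs sample mean {strengths.mean():.1f} MPa")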
Reliability Based Design for a Raked Wing Tip of an Airframe
NASA Technical Reports Server (NTRS)
Patnaik, Surya N.; Pai, Shantaram S.; Coroneos, Rula M.
2011-01-01
A reliability-based optimization methodology has been developed to design the raked wing tip of the Boeing 767-400 extended range airliner made of composite and metallic materials. Design is formulated for an accepted level of risk or reliability. The design variables, weight and the constraints became functions of reliability. Uncertainties in the load, strength and the material properties, as well as the design variables, were modeled as random parameters with specified distributions, like normal, Weibull or Gumbel functions. The objective function and constraint, or a failure mode, became derived functions of the risk-level. Solution to the problem produced the optimum design with weight, variables and constraints as a function of the risk-level. Optimum weight versus reliability traced out an inverted-S shaped graph. The center of the graph corresponded to a 50 percent probability of success, or one failure in two samples. Under some assumptions, this design would be quite close to the deterministic optimum solution. The weight increased when reliability exceeded 50 percent, and decreased when the reliability was compromised. A design could be selected depending on the level of risk acceptable to a situation. The optimization process achieved up to a 20-percent reduction in weight over traditional design.
How extreme is extreme hourly precipitation?
NASA Astrophysics Data System (ADS)
Papalexiou, Simon Michael; Dialynas, Yannis G.; Pappas, Christoforos
2016-04-01
Accurate representation of precipitation at fine time scales (e.g., hourly), directly associated with flash flood events, is crucial for hydrological design and prediction. The upper part of a probability distribution, known as the distribution tail, determines the behavior of extreme events. In general, and loosely speaking, tails can be categorized into two families: the subexponential and the hyperexponential family, with the former generating more intense and more frequent extremes than the latter. In past studies, the focus has been mainly on daily precipitation, with the Gamma distribution being the most popular model. Here, we investigate the behaviour of tails of hourly precipitation by comparing the upper part of empirical distributions of thousands of records with three general types of tails corresponding to the Pareto, Lognormal, and Weibull distributions. Specifically, we use thousands of hourly rainfall records from all over the USA. The analysis indicates that heavier-tailed distributions describe the observed hourly rainfall extremes better than lighter-tailed ones. Traditional representations of the marginal distribution of hourly rainfall may significantly deviate from the observed behaviour of extremes, with direct implications for the modelling of hydroclimatic variables and for engineering design.
Parametric regression model for survival data: Weibull regression model as an example
2016-01-01
The Weibull regression model is one of the most popular forms of parametric regression model because it provides an estimate of the baseline hazard function as well as coefficients for covariates. Because of technical difficulties, the Weibull regression model is seldom used in the medical literature compared with the semi-parametric proportional hazards model. To make clinical investigators familiar with the Weibull regression model, this article introduces some basic knowledge of the model and then illustrates how to fit it with R software. The SurvRegCensCov package is useful for converting estimated coefficients to clinically relevant statistics such as the hazard ratio (HR) and event time ratio (ETR). Model adequacy can be assessed by inspecting Kaplan-Meier curves stratified by categorical variables. The eha package provides an alternative way to fit the Weibull regression model, and its check.dist() function helps to assess the goodness-of-fit of the model. Variable selection is based on the importance of a covariate, which can be tested using the anova() function. Alternatively, backward elimination starting from a full model is an efficient way to develop the model. Visualization of the Weibull regression model after model development provides another way to report the findings. PMID:28149846
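The article demonstrates the workflow in R; as a rough Python analogue, and purely as an assumption rather than the article's code, the sketch below fits a Weibull accelerated failure time (AFT) regression with the lifelines package on its bundled Rossi recidivism data set.

# Hedged sketch: Weibull AFT regression in Python with lifelines, playing a role
# comparable to the R workflow (survreg / SurvRegCensCov / eha) described above.
from lifelines import WeibullAFTFitter
from lifelines.datasets import load_rossi

df = load_rossi()                        # 'week' = duration, 'arrest' = event indicator
aft = WeibullAFTFitter()
aft.fit(df, duration_col="week", event_col="arrest")
aft.print_summary()                      # covariate coefficients plus shape (rho_) and scale (lambda_)

# For a Weibull model, an AFT coefficient can be converted to an approximate
# hazard ratio as HR = exp(-coef * shape); the SurvRegCensCov package mentioned
# in the abstract performs the analogous conversion (and reports ETRs) in R.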
Multiscale modeling of porous ceramics using movable cellular automaton method
NASA Astrophysics Data System (ADS)
Smolin, Alexey Yu.; Smolin, Igor Yu.; Smolina, Irina Yu.
2017-10-01
The paper presents a multiscale model for porous ceramics based on the movable cellular automaton method, which is a particle method in novel computational mechanics of solids. The initial scale of the proposed approach corresponds to the characteristic size of the smallest pores in the ceramics. At this scale, we model uniaxial compression of several representative samples with an explicit account of pores of the same size but with unique positions in space. As a result, we obtain the average values of Young's modulus and strength, as well as the parameters of the Weibull distribution of these properties at the current scale level. These data allow us to describe the material behavior at the next scale level, where only the larger pores are considered explicitly, while the influence of small pores is included via the effective properties determined earlier. If the pore size distribution function of the material has N maxima, we need to perform computations for N-1 levels in order to obtain the properties step by step from the lowest scale up to the macroscale. The proposed approach was applied to modeling zirconia ceramics with a bimodal pore size distribution. The obtained results show correct behavior of the model sample at the macroscale.
A statistical comparison of two carbon fiber/epoxy fabrication techniques
NASA Technical Reports Server (NTRS)
Hodge, A. J.
1991-01-01
A statistical comparison of the compression strengths of specimens that were fabricated by either a platen press or an autoclave was performed on IM6/3501-6 carbon/epoxy composites of 16-ply (0,+45,90,-45)S2 lay-up configuration. The samples were cured with the same parameters and processing materials. It was found that the autoclaved panels were thicker than the platen press cured samples. Two hundred samples of each type of cure process were compression tested. The autoclaved samples had an average strength of 450 MPa (65.5 ksi), while the press cured samples had an average strength of 370 MPa (54.0 ksi). A Weibull analysis of the data showed that there is only a 30 percent probability that the two types of cure systems yield specimens that can be considered from the same family.
Categorical Data Analysis Using a Skewed Weibull Regression Model
NASA Astrophysics Data System (ADS)
Caron, Renault; Sinha, Debajyoti; Dey, Dipak; Polpo, Adriano
2018-03-01
In this paper, we present a Weibull link (skewed) model for categorical response data arising from binomial as well as multinomial models. We show that, for such types of categorical data, the most commonly used models (logit, probit and complementary log-log) can be obtained as limiting cases. We further compare the proposed model with some other asymmetrical models. The Bayesian as well as frequentist estimation procedures for binomial and multinomial responses are presented in detail. An analysis of two data sets is performed to show the efficiency of the proposed model.
Assessment of PM10 pollution level and required source emission reduction in Belgrade area.
Todorović, Marija N; Perišić, Mirjana D; Kuzmanoski, Maja M; Stojić, Andreja M; Sostarić, Andrej I; Mijić, Zoran R; Rajšić, Slavica F
2015-01-01
The aim of this study was to assess the PM10 pollution level and estimate the required source emission reduction in the Belgrade area, the second largest urban center in the Balkans. Daily mass concentrations and trace metal content (As, Cd, Cr, Mn, Ni, Pb) of PM10 were evaluated for three air quality monitoring sites of different types: urban-traffic (Slavija), suburban (Lazarevac) and rural (Grabovac) under industrial influence, during the period 2012-13. Noncompliance with current Air Quality Standards (AQS) was noticeable: annual means were higher than the AQS at Slavija and Lazarevac, and the daily frequency threshold was exceeded at all three locations. Annual means of As at Lazarevac were about four times higher than the target concentration, which could be attributed to the proximity of coal-fired power plants and dust resuspension from the coal basin and nearby ash landfills. Additionally, levels of Ni and Cr were significantly higher than in other European cities. The carcinogenic health risk of inhabitants' exposure to trace metals was assessed as well. Cumulative cancer risk exceeded the upper limit of the acceptable US EPA range at two sites, with Cr and As as the major contributors. To estimate the source emission reduction required to meet the AQS, lognormal, Weibull and Pearson 5 probability distribution functions (PDF) were used to fit daily PM10 concentrations. Based on the rollback equation and the best fitting PDF, the estimated reduction was within the range of 28-98%. Finally, the required reduction obtained using the two-parameter exponential distribution suggested that risks associated with accidental releases of pollutants should be of greater concern.
Anderson, Carl A; McRae, Allan F; Visscher, Peter M
2006-07-01
Standard quantitative trait loci (QTL) mapping techniques commonly assume that the trait is both fully observed and normally distributed. When considering survival or age-at-onset traits these assumptions are often incorrect. Methods have been developed to map QTL for survival traits; however, they are both computationally intensive and not available in standard genome analysis software packages. We propose a grouped linear regression method for the analysis of continuous survival data. Using simulation we compare this method to both the Cox and Weibull proportional hazards models and a standard linear regression method that ignores censoring. The grouped linear regression method is of equivalent power to both the Cox and Weibull proportional hazards methods and is significantly better than the standard linear regression method when censored observations are present. The method is also robust to the proportion of censored individuals and the underlying distribution of the trait. On the basis of linear regression methodology, the grouped linear regression model is computationally simple and fast and can be implemented readily in freely available statistical software.
Yoganandan, Narayan; Arun, Mike W J; Pintar, Frank A; Szabo, Aniko
2014-01-01
The objective was to derive optimum injury probability curves describing human tolerance of the lower leg using parametric survival analysis. The study reexamined lower leg postmortem human subjects (PMHS) data from a large group of specimens. Briefly, axial loading experiments were conducted by impacting the plantar surface of the foot. Both injury and noninjury tests were included in the testing process. They were identified by pre- and posttest radiographic images and detailed dissection following the impact test. Fractures included injuries to the calcaneus and distal tibia-fibula complex (including pylon), representing severities at the Abbreviated Injury Score (AIS) level 2+. For the statistical analysis, peak force was chosen as the main explanatory variable and age was chosen as the covariable. Censoring statuses depended on experimental outcomes. Parameters from the parametric survival analysis were estimated using the maximum likelihood approach, and the dfbetas statistic was used to identify overly influential samples. The best fit from the Weibull, log-normal, and log-logistic distributions was based on the Akaike information criterion. Plus and minus 95% confidence intervals were obtained for the optimum injury probability distribution. The relative sizes of the intervals were determined at predetermined risk levels. Quality indices were described at each of the selected probability levels. The mean age, stature, and weight were 58.2±15.1 years, 1.74±0.08 m, and 74.9±13.8 kg, respectively. Excluding all overly influential tests resulted in the tightest confidence intervals. The Weibull distribution was the optimum function compared with the other two distributions. A majority of quality indices were in the good category for this optimum distribution when results were extracted for the 25-, 45- and 65-year-old age groups at 5, 25, and 50% risk levels for lower leg fracture. For 25, 45, and 65 years, peak forces were 8.1, 6.5, and 5.1 kN at 5% risk; 9.6, 7.7, and 6.1 kN at 25% risk; and 10.4, 8.3, and 6.6 kN at 50% risk, respectively. This study derived axial loading-induced injury risk curves based on survival analysis using peak force and specimen age; adopting different censoring schemes; considering overly influential samples in the analysis; and assessing the quality of the distribution at discrete probability levels. Because procedures used in the present survival analysis are accepted by international automotive communities, current optimum human injury probability distributions can be used at all risk levels with more confidence in future crashworthiness applications for automotive and other disciplines.
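A hedged sketch of the model-selection step described above: candidate survival distributions are fitted by maximum likelihood to censored peak-force data and compared by AIC. The force values and the censoring labels below are invented, and the assumed censoring convention (injury = left-censored tolerance, no injury = right-censored) is an illustration, not necessarily the study's exact scheme.

# Hypothetical censored parametric survival fits compared by AIC.
import numpy as np
from scipy import stats, optimize

force_kN = np.array([5.2, 6.8, 7.5, 8.1, 9.0, 6.1, 7.9, 8.8, 5.9, 10.2])
status = np.array(['left', 'right', 'left', 'right', 'left',
                   'right', 'left', 'left', 'right', 'left'])
# 'left'  = injury observed (tolerance at or below the applied peak force)
# 'right' = no injury (tolerance above the applied peak force)

def neg_loglik(params, dist):
    shape, scale = params
    if shape <= 0 or scale <= 0:
        return np.inf
    ll = 0.0
    for x, s in zip(force_kN, status):
        p = dist.cdf(x, shape, loc=0, scale=scale)
        p = min(max(p, 1e-12), 1 - 1e-12)
        ll += np.log(p) if s == 'left' else np.log(1.0 - p)
    return -ll

models = {"Weibull": (stats.weibull_min, [2.0, 8.0]),
          "lognormal": (stats.lognorm, [0.3, 8.0])}
for name, (dist, start) in models.items():
    res = optimize.minimize(neg_loglik, start, args=(dist,), method="Nelder-Mead")
    aic = 2 * len(start) + 2 * res.fun          # AIC = 2k - 2 log L
    print(f"{name}: AIC = {aic:.1f}, params = {np.round(res.x, 3)}")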
EU-Norsewind Using Envisat ASAR And Other Data For Offshore Wind Atlas
NASA Astrophysics Data System (ADS)
Hasager, Charlotte B.; Mouche, Alexis; Badger, Merete
2010-04-01
The EU project NORSEWIND - short for Northern Seas Wind Index Database - www.norsewind.eu aims to produce a state-of-the-art wind atlas for the Baltic, Irish and North Seas using ground-based lidar, meteorological masts, satellite data and mesoscale modelling. So far, CLS and Risø DTU have collected Envisat ASAR images for the area of interest, and the first results (maps of wind statistics, Weibull scale and shape parameters, mean wind speed and energy density) are presented. The results will be compared with a distributed network of high-quality in-situ observations and mesoscale model results during 2009-2011 as the in-situ data and model results become available. Wind energy is proportional to wind speed to the third power, so even small improvements in wind speed mapping are important in this project. One challenge is to arrive at hub-height winds ~100 m above sea level.
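To make the link between the Weibull atlas parameters and the quantities mapped above concrete, the following sketch computes mean wind speed and mean wind power density from an assumed scale c and shape k using the standard Weibull moment relation E[v^n] = c^n * Gamma(1 + n/k); the parameter values and air density are assumptions, not NORSEWIND results.

# Illustrative use of Weibull scale (c) and shape (k) from one atlas grid cell.
from math import gamma

c, k = 9.5, 2.1          # example scale (m/s) and shape at ~100 m hub height
rho = 1.225              # air density, kg/m^3

mean_speed = c * gamma(1 + 1 / k)                      # E[v]
power_density = 0.5 * rho * c**3 * gamma(1 + 3 / k)    # 0.5 * rho * E[v^3], W/m^2

print(f"mean wind speed = {mean_speed:.1f} m/s")
print(f"power density   = {power_density:.0f} W/m^2")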
Statistical damage constitutive model for rocks subjected to cyclic stress and cyclic temperature
NASA Astrophysics Data System (ADS)
Zhou, Shu-Wei; Xia, Cai-Chu; Zhao, Hai-Bin; Mei, Song-Hua; Zhou, Yu
2017-10-01
A constitutive model of rocks subjected to cyclic stress-temperature was proposed. Based on statistical damage theory, the damage constitutive model with Weibull distribution was extended. Influence of model parameters on the stress-strain curve for rock reloading after stress-temperature cycling was then discussed. The proposed model was initially validated by rock tests for cyclic stress-temperature and only cyclic stress. Finally, the total damage evolution induced by stress-temperature cycling and reloading after cycling was explored and discussed. The proposed constitutive model is reasonable and applicable, describing well the stress-strain relationship during stress-temperature cycles and providing a good fit to the test results. Elastic modulus in the reference state and the damage induced by cycling affect the shape of reloading stress-strain curve. Total damage induced by cycling and reloading after cycling exhibits three stages: initial slow increase, mid-term accelerated increase, and final slow increase.
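For orientation, one common Weibull-based statistical damage formulation (shown only to make the modelling idea concrete; the paper's extended cyclic model adds further terms) takes the damage variable as D = 1 - exp(-(strain/e0)^m) and the effective stress as sigma = E * strain * (1 - D). The parameter values below are arbitrary assumptions.

# Sketch of a Weibull-based statistical damage stress-strain curve.
import numpy as np

E  = 20e3    # elastic modulus, MPa
m  = 3.0     # Weibull shape (homogeneity index)
e0 = 0.004   # Weibull scale strain

strain = np.linspace(0, 0.01, 200)
damage = 1.0 - np.exp(-(strain / e0) ** m)   # Weibull damage variable D
stress = E * strain * (1.0 - damage)         # effective-stress relation

print(f"peak stress = {stress.max():.1f} MPa at strain = {strain[stress.argmax()]:.4f}")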
Modelling volatility recurrence intervals in the Chinese commodity futures market
NASA Astrophysics Data System (ADS)
Zhou, Weijie; Wang, Zhengxin; Guo, Haiming
2016-09-01
The law governing the occurrence of extreme events attracts much research attention. The volatility recurrence intervals of Chinese commodity futures market prices are studied: the results show that the probability distributions of the scaled volatility recurrence intervals have a uniform scaling curve for different thresholds q, so we can deduce the probability distribution of extreme events from that of normal events. The tail of the scaling curve can be well fitted by a Weibull form, which is significance-tested by Kolmogorov-Smirnov measures. Both short-term and long-term memories are present in the recurrence intervals for different thresholds q, which indicates that the recurrence intervals can be predicted. In addition, similar to volatility, volatility recurrence intervals also have clustering features. Through Monte Carlo simulation, we artificially synthesise ARMA and GARCH-class sequences similar to the original data and identify the reason behind the clustering: the larger the parameter d of the FIGARCH model, the stronger the clustering effect. Finally, we use the Fractionally Integrated Autoregressive Conditional Duration (FIACD) model to analyse the recurrence interval characteristics. The results indicate that the FIACD model may provide a method to analyse volatility recurrence intervals.
NASA Astrophysics Data System (ADS)
Kalkanis, G.; Rosso, E.
1989-09-01
Results of an accelerated test on the lifetime of a mylar-polyurethane laminated dc high voltage insulating structure are reported. This structure consists of mylar ribbons placed side by side in a number of layers, staggered and glued together with a polyurethane adhesive. The lifetime until breakdown as a function of extremely high values of voltage stress is measured and represented by a mathematical model, the inverse power law model with a 2-parameter Weibull lifetime distribution. The statistical treatment of the data — either by graphical or by analytical methods — allowed us to estimate the lifetime distribution and confidence bounds for any required normal voltage stress. The laminated structure under consideration is, according to the analysis, a very reliable dc hv insulating material, with a very good life performance according to the inverse power law model, and with an exponent of voltage stress equal to 6. A large insulator of cylindrical shape with this kind of laminated structure can be constructed by winding helically a mylar ribbon in a number of layers.
Efficient Redundancy Techniques in Cloud and Desktop Grid Systems using MAP/G/c-type Queues
NASA Astrophysics Data System (ADS)
Chakravarthy, Srinivas R.; Rumyantsev, Alexander
2018-03-01
Cloud computing continues to prove its flexibility and versatility in providing needed computing capacity to industries and businesses as well as academia. As an important alternative to cloud computing, desktop grids allow the idle computer resources of an enterprise or community to be utilized by means of a distributed computing system, providing a more secure and controllable environment with lower operational expenses. Further, both cloud computing and desktop grids are meant to optimize limited resources and, at the same time, decrease the expected latency for users. The crucial parameter for optimization in both cloud computing and desktop grids is the level of redundancy (replication) for service requests/workunits. In this paper we study optimal replication policies by considering three variations of Fork-Join systems in the context of a multi-server queueing system with a versatile point process for the arrivals. For services we consider phase-type distributions as well as shifted exponential and Weibull distributions. We use both analytical and simulation approaches in our analysis and report some interesting qualitative results.
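A hedged Monte Carlo sketch of the redundancy trade-off discussed above: with r replicas of a workunit and Weibull-distributed service times, the observed latency is the minimum over the replicas while the consumed capacity grows with r. The parameters are arbitrary assumptions, not the paper's settings, and the simulation ignores queueing effects.

# Replication trade-off with Weibull service times (latency vs. spent work).
import numpy as np

rng = np.random.default_rng(0)
shape, scale = 0.7, 1.0          # heavy-ish tailed service times
n_requests = 100_000

for r in (1, 2, 3, 4):
    samples = scale * rng.weibull(shape, size=(n_requests, r))
    latency = samples.min(axis=1)        # first replica to finish wins
    work = samples.sum(axis=1)           # total service capacity consumed
    print(f"r={r}: mean latency={latency.mean():.3f}, mean work={work.mean():.3f}")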
Statistical study of air pollutant concentrations via generalized gamma distribution
DOE Office of Scientific and Technical Information (OSTI.GOV)
Marani, A.; Lavagnini, I.; Buttazzoni, C.
1986-11-01
This paper deals with modeling observed frequency distributions of air quality data measured in the area of Venice, Italy. The paper discusses the application of the generalized gamma distribution (ggd), which has not been commonly applied to air quality data notwithstanding the fact that it embodies most distribution models used for air quality analyses. The approach yields important simplifications for statistical analyses. A comparison among the ggd and other relevant models (standard gamma, Weibull, lognormal), carried out on daily sulfur dioxide concentrations in the area of Venice, underlines the efficiency of ggd models in portraying experimental data.
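A minimal sketch of the idea that the generalized gamma family embeds the usual air-quality models: scipy's gengamma is fitted alongside gamma, Weibull and lognormal to a synthetic positive-valued sample (not the Venice SO2 data) and the log-likelihoods are compared.

# Hypothetical comparison of the generalized gamma with its nested/related models.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
conc = rng.lognormal(mean=3.0, sigma=0.5, size=365)   # synthetic daily concentrations

for name, dist in [("gen. gamma", stats.gengamma), ("gamma", stats.gamma),
                   ("Weibull", stats.weibull_min), ("lognormal", stats.lognorm)]:
    params = dist.fit(conc, floc=0)
    loglik = np.sum(dist.logpdf(conc, *params))
    print(f"{name:10s} log-likelihood = {loglik:.1f}")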
Chang, Wen-Ruey; Matz, Simon; Chang, Chien-Chi
2014-05-01
The maximum coefficient of friction that can be supported at the shoe and floor interface without a slip is usually called the available coefficient of friction (ACOF) for human locomotion. The probability of a slip could be estimated using a statistical model by comparing the ACOF with the required coefficient of friction (RCOF), assuming that both coefficients have stochastic distributions. An investigation of the stochastic distributions of the ACOF of five different floor surfaces under dry, water and glycerol conditions is presented in this paper. One hundred friction measurements were performed on each floor surface under each surface condition. The Kolmogorov-Smirnov goodness-of-fit test was used to determine if the distribution of the ACOF was a good fit with the normal, log-normal and Weibull distributions. The results indicated that the ACOF distributions had a slightly better match with the normal and log-normal distributions than with the Weibull in only three out of 15 cases with a statistical significance. The results are far more complex than what had heretofore been published and different scenarios could emerge. Since the ACOF is compared with the RCOF for the estimate of slip probability, the distribution of the ACOF in seven cases could be considered a constant for this purpose when the ACOF is much lower or higher than the RCOF. A few cases could be represented by a normal distribution for practical reasons based on their skewness and kurtosis values without a statistical significance. No representation could be found in three cases out of 15. Copyright © 2013 Elsevier Ltd and The Ergonomics Society. All rights reserved.
Modelling the Ozone-Based Treatments for Inactivation of Microorganisms
Brodowska, Agnieszka Joanna; Nowak, Agnieszka; Kondratiuk-Janyska, Alina; Piątkowski, Marcin; Śmigielski, Krzysztof
2017-01-01
The paper presents the development of a model for ozone treatment, in a dynamic bed, of different microorganisms (Bacillus subtilis, B. cereus, B. pumilus, Escherichia coli, Pseudomonas fluorescens, Aspergillus niger, Eupenicillium cinnamopurpureum) on a heterogeneous matrix (juniper berries, cardamom seeds) treated with numerous ozone doses during various contact times. Because microorganisms differ in their susceptibility to ozone, it was of great importance to develop a sufficiently effective ozone dose for preserving food products, based on a microbial model covering different strains. For this purpose, we chose the Weibull model to describe the survival curves of the different microorganisms. Based on the results of microorganism survival modelling after ozone treatment, and considering the strains least susceptible to ozone, we selected the critical ones. Among the tested strains, those from the genus Bacillus were recognized as the most critical. In particular, B. subtilis and B. pumilus possessed the highest resistance to ozone treatment because the time needed to achieve the lowest level of survival was the longest (up to 17.04 min and 16.89 min for B. pumilus reduction on the juniper berry and cardamom seed matrices, respectively). Ozone treatment inactivated the microorganisms to lower survival rates at the applied ozone dose (20.0 g O3/m3 O2, with a flow rate of 0.4 L/min) and contact times (up to 20 min). The results demonstrated a linear correlation between the parameters p and k of the Weibull model, providing an opportunity to calculate a fitted equation for the process. PMID:28991199
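For context, a Weibull survival curve of the kind used for microbial inactivation is commonly written as log10(N/N0) = -(t/delta)^p, from which the time to reach a given log reduction follows directly. The sketch below uses illustrative delta and p values, not the fitted parameters from the study.

# Weibull-type inactivation curve and time to a target log reduction.
import numpy as np

delta, p = 4.0, 1.3                         # scale (min) and shape of the survival curve

t = np.linspace(0, 20, 5)
log_survival = -(t / delta) ** p            # log10 of the surviving fraction
time_5log = delta * 5 ** (1 / p)            # time to a 5-log reduction
print("log10 survival at", t, "min:", np.round(log_survival, 2))
print(f"time to 5-log reduction = {time_5log:.1f} min")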
Kose, Eiji; Uno, Kana; Hayashi, Hiroyuki
2017-01-01
Typical antipsychotics readily produce adverse events such as extrapyramidal symptoms (EPS). On the other hand, the incidence of adverse events due to atypical antipsychotics is low; therefore, atypical antipsychotics are currently widely used to treat schizophrenia. However, it has been reported that there is no difference in the frequency of EPS between atypical and typical antipsychotics. This study aimed to evaluate the expression profile of EPS in treatment with atypical and typical antipsychotics using the Japanese Adverse Drug Event Report (JADER) database. We analyzed reports of EPS in the JADER database and calculated the reporting odds ratio (ROR) of antipsychotics potentially associated with EPS. We also applied the Weibull shape parameter to time-to-event data in the JADER database. There was little information to distinguish between the RORs of atypical and typical antipsychotics, and no significant difference in the time of onset of EPS was recognized between the two classes. However, when individual drugs were compared, Paliperidone, Perospirone, Blonanserin, and Aripiprazole tended to produce EPS at a relatively early stage, whereas Risperidone, Clozapine, Olanzapine, and Quetiapine produced EPS not only at an early stage but also after long-term use. This finding was supported by the cumulative incidence of EPS for each drug and by the time-to-onset analysis using the Weibull distribution. These findings may contribute to future clinical practice because we revealed the expression profile of EPS in treatment with atypical and typical antipsychotics.
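A hedged sketch of a time-to-onset analysis of the kind described above: a Weibull distribution is fitted to (invented) days-to-EPS-onset, and the shape parameter beta is read as a hazard pattern indicator (beta < 1: onsets concentrated early after the start of treatment; beta around 1: roughly constant risk; beta > 1: risk increasing with continued use). The onset times are illustrative, not JADER data.

# Weibull shape parameter as a time-to-onset pattern indicator.
import numpy as np
from scipy import stats

onset_days = np.array([3, 5, 7, 8, 12, 14, 21, 30, 45, 60, 90, 120])  # invented

beta, loc, alpha = stats.weibull_min.fit(onset_days, floc=0)
print(f"shape beta = {beta:.2f}, scale alpha = {alpha:.1f} days")
if beta < 1:
    print("early-failure type: adverse events concentrated soon after treatment start")
elif beta > 1:
    print("wear-out type: hazard increases with longer treatment")
else:
    print("random-failure type: roughly constant hazard over time")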
A bivariate model for analyzing recurrent multi-type automobile failures
NASA Astrophysics Data System (ADS)
Sunethra, A. A.; Sooriyarachchi, M. R.
2017-09-01
The failure mechanism in an automobile can be defined as a system of multi-type recurrent failures where failures can occur due to various multi-type failure modes and these failures are repetitive such that more than one failure can occur from each failure mode. In analysing such automobile failures, both the time and type of the failure serve as response variables. However, these two response variables are highly correlated with each other since the timing of failures has an association with the mode of the failure. When there is more than one correlated response variable, fitting a multivariate model is preferable to fitting separate univariate models. Therefore, a bivariate model of time and type of failure becomes appealing for such automobile failure data. When there are multiple failure observations pertaining to a single automobile, such data cannot be treated as independent because failure instances of a single automobile are correlated with each other, while failures among different automobiles can be treated as independent. Therefore, this study proposes a bivariate model consisting of time and type of failure as responses, adjusted for correlated data. The proposed model was formulated following the approaches of shared parameter models and random effects models for joining the responses and for representing the correlated data, respectively. The proposed model is applied to a sample of automobile failures with three types of failure modes and up to five failure recurrences. The parametric distributions that were suitable for the two responses of time to failure and type of failure were the Weibull and multinomial distributions, respectively. The proposed bivariate model was implemented in the SAS procedure PROC NLMIXED by programming appropriate likelihood functions. The performance of the bivariate model was compared with separate univariate models fitted for the two responses, and the bivariate model was found to perform better. The proposed model can be used to determine the time and type of failure that would occur in the automobiles considered here.
Nguyen, Harrison H; Fong, Hanson; Paranjpe, Avina; Flake, Natasha M; Johnson, James D; Peters, Ove A
2014-08-01
The purpose of this study was to compare the fracture resistance to cyclic fatigue of ProTaper Next (PTN; Dentsply Tulsa Dental Specialties, Tulsa, OK), ProTaper Universal (PTU, Dentsply Tulsa Dental Specialties), and Vortex Blue (VB, Dentsply Tulsa Dental Specialties) rotary instruments. Twenty instruments each of PTN X1-X5, PTU S1-F5, and VB 20/04-50/04 were rotated until fracture in a simulated canal of 90° and a 5-mm radius using a custom-made testing platform. The number of cycles to fracture (NCF) was calculated. Weibull analysis was used to predict the maximum number of cycles when 99% of the instrument samples survive. VB 20/04-30/04 had significantly higher NCF than PTU S1-F5 and PTN X1-X5. VB 35/04-45/04 had significantly higher NCF than PTU S2-F5 and PTN X2-X5. PTN X1 had higher NCF than PTU S1-F5. PTN X2 had higher NCF than PTU F2-F5. The Weibull distribution predicted the highest number of cycles at which 99% of instruments survive to be 766 cycles for VB 25/04 and the lowest to be 50 cycles for PTU F2. Under the limitations of this study, VB 20/04-45/04 were more resistant to cyclic fatigue than PTN X2-X5 and PTU S2-F5. PTN X1 and X2 were more resistant to cyclic fatigue than PTU F2-F5. The Weibull distribution appears to be a feasible and potentially clinically relevant model to predict resistance to cyclic fatigue. Copyright © 2014 American Association of Endodontists. All rights reserved.
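A hedged sketch of the prediction quoted above: given a two-parameter Weibull fit to the number of cycles to fracture, the cycle count at which 99% of instruments are expected to survive is the 1st percentile of the fitted distribution, N = eta * (-ln 0.99)^(1/beta). The eta and beta values below are invented, not the study's estimates.

# Cycles at 99% survival from a fitted Weibull distribution of NCF.
import numpy as np

eta, beta = 1500.0, 3.2                                  # Weibull scale and shape for NCF
n_99_survival = eta * (-np.log(0.99)) ** (1.0 / beta)
print(f"cycles at 99% survival = {n_99_survival:.0f}")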
Micro-mechanics of hydro-mechanical coupled processes during hydraulic fracturing in sandstone
NASA Astrophysics Data System (ADS)
Caulk, R.; Tomac, I.
2017-12-01
This contribution presents a micro-mechanical study of hydraulic fracture initiation and propagation in sandstone. The Discrete Element Method (DEM) Yade software is used as a tool to model the fully coupled hydro-mechanical behavior of saturated sandstone under pressures typical of deep geo-reservoirs. Heterogeneity of the sandstone tensile and shear strength parameters is introduced using a statistical representation of cathodoluminescence (CL) images of the sandstone. A Weibull distribution of the statistical parameter values was determined as the best match to the CL scans of sandstone grains and the cement between grains. Results of hydraulic fracturing stimulation from the well bore indicate a significant difference between models with bond strengths informed by the CL scans and models with a uniform homogeneous representation of the sandstone parameters. Micro-mechanical insight reveals a hydraulic fracture typical of mode I, or tensile, cracking in both cases. However, shear micro-cracks are abundant in the CL-informed model while they are absent in the standard model with a uniform strength distribution. Most of the mode II cracks, or shear micro-cracks, are not part of the main hydraulic fracture and occur in the near-tip and near-fracture areas. The position and occurrence of the shear micro-cracks is characterized as a secondary effect which dissipates the hydraulic fracturing energy. Additionally, the shear micro-crack locations qualitatively resemble the acoustic emission cloud of shear cracks frequently observed in hydraulic fracturing, which is sometimes interpreted as re-activation of existing fractures. Clearly, our model does not contain pre-existing cracks and has a continuous nature prior to fracturing; this observation is novel and interesting and is quantified in the paper. The shear particle contact force field reveals significant relaxation compared to the model with a uniform strength distribution.
Investigations of subcritical crack propagation of the Empress 2 all-ceramic system.
Mitov, Gergo; Lohbauer, Ulrich; Rabbo, Mohammad Abed; Petschelt, Anselm; Pospiech, Peter
2008-02-01
The mechanical properties and slow crack propagation of the all-porcelain system Empress 2 (Ivoclar Vivadent, Schaan, Liechtenstein) with its framework compound Empress 2 and the veneering compounds Empress 2 and Eris were examined. For all materials, the fracture strength, Weibull parameter and elastic moduli were experimentally determined in a four-point-bending test. For the components of the Empress 2 system, the fracture toughness K(IC) was determined, and the crack propagation parameters n and A were determined in a dynamic fatigue method. Using these data, life data analysis was performed and lifetime diagrams were produced. The development of strength under static fatigue conditions was calculated for a period of 5 years. The newly developed veneering ceramic Eris showed a higher fracture strength (sigma(0)=66.1 MPa) at a failure probability of P(F)=63.2%, and crack growth parameters (n=12.9) compared to the veneering ceramic Empress 2 (sigma(0)=60.3 MPa). For Empress 2 veneer the crack propagation parameter n could only be estimated (n=9.5). This is reflected in the prognosis of long-term resistance presented in the SPT diagrams. For all materials investigated, the Weibull parameter m values (Empress 2 framework m=4.6; Empress 2 veneer m=7.9; Eris m=6.9) were much lower than the minimum demanded by the literature (m=15). The initial fracture strength value alone is not sufficient to characterize the mechanical resistance of ceramic materials, since their stressability is time-dependent. Knowledge about the crack propagation parameters n and A is of great importance when preclinically predicting the clinical suitability of dental ceramic materials. The use of SPT diagrams for lifetime calculation of ceramic materials is a valuable method for comparing different ceramics.
Spur, helical, and spiral bevel transmission life modeling
NASA Technical Reports Server (NTRS)
Savage, Michael; Rubadeux, Kelly L.; Coe, Harold H.; Coy, John J.
1994-01-01
A computer program, TLIFE, which estimates the life, dynamic capacity, and reliability of aircraft transmissions, is presented. The program enables comparisons of transmission service life at the design stage for optimization. A variety of transmissions may be analyzed including: spur, helical, and spiral bevel reductions as well as series combinations of these reductions. The basic spur and helical reductions include: single mesh, compound, and parallel path plus revert star and planetary gear trains. A variety of straddle and overhung bearing configurations on the gear shafts are possible as is the use of a ring gear for the output. The spiral bevel reductions include single and dual input drives with arbitrary shaft angles. The program is written in FORTRAN 77 and has been executed both in the personal computer DOS environment and on UNIX workstations. The analysis may be performed in either the SI metric or the English inch system of units. The reliability and life analysis is based on the two-parameter Weibull distribution lives of the component gears and bearings. The program output file describes the overall transmission and each constituent transmission, its components, and their locations, capacities, and loads. Primary output is the dynamic capacity and 90-percent reliability and mean lives of the unit transmissions and the overall system which can be used to estimate service overhaul frequency requirements. Two examples are presented to illustrate the information available for single element and series transmissions.
Calling patterns in human communication dynamics.
Jiang, Zhi-Qiang; Xie, Wen-Jie; Li, Ming-Xia; Podobnik, Boris; Zhou, Wei-Xing; Stanley, H Eugene
2013-01-29
Modern technologies not only provide a variety of communication modes (e.g., texting, cell phone conversation, and online instant messaging), but also detailed electronic traces of these communications between individuals. These electronic traces indicate that the interactions occur in temporal bursts. Here, we study intercall duration of communications of the 100,000 most active cell phone users of a Chinese mobile phone operator. We confirm that the intercall durations follow a power-law distribution with an exponential cutoff at the population level but find differences when focusing on individual users. We apply statistical tests at the individual level and find that the intercall durations follow a power-law distribution for only 3,460 individuals (3.46%). The intercall durations for the majority (73.34%) follow a Weibull distribution. We quantify individual users using three measures: out-degree, percentage of outgoing calls, and communication diversity. We find that the cell phone users with a power-law duration distribution fall into three anomalous clusters: robot-based callers, telecom fraud, and telephone sales. This information is of interest to both academics and practitioners, mobile telecom operators in particular. In contrast, the individual users with a Weibull duration distribution form the fourth cluster of ordinary cell phone users. We also discover more information about the calling patterns of these four clusters (e.g., the probability that a user will call the c(r)-th most contact and the probability distribution of burst sizes). Our findings may enable a more detailed analysis of the huge body of data contained in the logs of massive users.
Modelling of PM10 concentration for industrialized area in Malaysia: A case study in Shah Alam
NASA Astrophysics Data System (ADS)
N, Norazian Mohamed; Abdullah, M. M. A.; Tan, Cheng-yau; Ramli, N. A.; Yahaya, A. S.; Fitri, N. F. M. Y.
In Malaysia, the predominant air pollutants are suspended particulate matter (SPM) and nitrogen dioxide (NO2). This research focuses on PM10, as it may harm human health as well as the environment. Six distributions, namely the Weibull, log-normal, gamma, Rayleigh, Gumbel and Frechet distributions, were chosen to model the PM10 observations at the chosen industrial area, i.e. Shah Alam. One-year periods of hourly average data for 2006 and 2007 were used for this research. For parameter estimation, the method of maximum likelihood estimation (MLE) was selected. Four performance indicators, namely the mean absolute error (MAE), root mean squared error (RMSE), coefficient of determination (R2) and prediction accuracy (PA), were applied to determine the goodness of fit of the distributions. The distribution that best fits the PM10 observations in Shah Alam was found to be the log-normal distribution. The probabilities of exceedance concentrations were calculated, and the return period for the coming year was predicted from the cumulative distribution function (cdf) obtained from the best-fit distributions. For the 2006 data, Shah Alam was predicted to exceed 150 μg/m3 for 5.9 days in 2007, with a return period of one occurrence per 62 days. For 2007, the studied area was not predicted to exceed the MAAQG of 150 μg/m3.
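A hedged sketch of the exceedance calculation described above: with a fitted distribution F for the daily PM10 concentration, the expected number of exceedance days per year and the return period of the 150 μg/m3 guideline follow directly from 1 - F(150). The lognormal parameters below are invented, not the paper's fitted values.

# Exceedance days and return period from a fitted distribution.
from scipy import stats

dist = stats.lognorm(s=0.45, scale=55.0)     # assumed fit, not the study's estimates
limit = 150.0

p_exceed = dist.sf(limit)                    # P(PM10 > 150)
days_per_year = 365 * p_exceed
return_period_days = 1.0 / p_exceed
print(f"expected exceedance days/year = {days_per_year:.1f}")
print(f"return period = one exceedance per {return_period_days:.0f} days")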
NASA Astrophysics Data System (ADS)
Yu, Jonas C. P.; Lin, Yu-Siang; Wang, Kung-Jeng
2013-09-01
This study develops a model for inventory management consisting of a two-echelon supply chain (SC) with profit sharing and deteriorating items. The retailer and the supplier act as leader and follower, respectively, where the supplier faces a large setup cost and follows an economic order quantity ordering strategy. The market demand is affected by the sale price of the product, and the inventory has a deterioration rate following a Weibull distribution. The retailer executes three profit-sharing mechanisms to motivate the supplier to participate in SC optimisation and to extend the life cycle of the product. A search algorithm is developed to determine the solutions under the profit-sharing mechanisms. The outcomes of numerical experiments demonstrate the profitability of the proposed model.
A probabilistic approach to photovoltaic generator performance prediction
NASA Astrophysics Data System (ADS)
Khallat, M. A.; Rahman, S.
1986-09-01
A method for predicting the performance of a photovoltaic (PV) generator based on long term climatological data and expected cell performance is described. The equations for cell model formulation are provided. Use of the statistical model for characterizing the insolation level is discussed. The insolation data is fitted to appropriate probability distribution functions (Weibull, beta, normal). The probability distribution functions are utilized to evaluate the capacity factors of PV panels or arrays. An example is presented revealing the applicability of the procedure.
A nonparametric stochastic method for generating daily climate-adjusted streamflows
NASA Astrophysics Data System (ADS)
Stagge, J. H.; Moglen, G. E.
2013-10-01
A daily stochastic streamflow generation model is presented, which successfully replicates statistics of the historical streamflow record and can produce climate-adjusted daily time series. A monthly climate model relates general circulation model (GCM)-scale climate indicators to discrete climate-streamflow states, which in turn control parameters in a daily streamflow generation model. Daily flow is generated by a two-state (increasing/decreasing) Markov chain, with rising limb increments randomly sampled from a Weibull distribution and the falling limb modeled as exponential recession. When applied to the Potomac River, a 38,000 km2 basin in the Mid-Atlantic United States, the model reproduces the daily, monthly, and annual distribution and dynamics of the historical streamflow record, including extreme low flows. This method can be used as part of water resources planning, vulnerability, and adaptation studies and offers the advantage of a parsimonious model, requiring only a sufficiently long historical streamflow record and large-scale climate data. Simulation of Potomac streamflows subject to the Special Report on Emissions Scenarios (SRES) A1b, A2, and B1 emission scenarios predict a slight increase in mean annual flows over the next century, with the majority of this increase occurring during the winter and early spring. Conversely, mean summer flows are projected to decrease due to climate change, caused by a shift to shorter, more sporadic rain events. Date of the minimum annual flow is projected to shift 2-5 days earlier by the 2070-2099 period.
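A hedged sketch of the daily generator structure described above: a two-state (rising/falling) Markov chain, Weibull-distributed increments on rising days and exponential recession on falling days. All parameter values are invented placeholders, not the calibrated Potomac values, and the climate-state conditioning is omitted.

# Minimal two-state daily streamflow generator.
import numpy as np

rng = np.random.default_rng(42)

p_rise_given_rise = 0.55             # transition probabilities of the two-state chain
p_rise_given_fall = 0.25
weib_shape, weib_scale = 0.9, 30.0   # rising-limb increment distribution (m3/s)
recession_k = 0.08                   # falling-limb exponential recession rate (1/day)

n_days = 365
flow = np.empty(n_days)
flow[0], rising = 100.0, False
for t in range(1, n_days):
    p_rise = p_rise_given_rise if rising else p_rise_given_fall
    rising = rng.random() < p_rise
    if rising:
        flow[t] = flow[t - 1] + weib_scale * rng.weibull(weib_shape)
    else:
        flow[t] = flow[t - 1] * np.exp(-recession_k)

print(f"mean {flow.mean():.1f}, min {flow.min():.1f}, max {flow.max():.1f} m3/s")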
Distribution analysis of airborne nicotine concentrations in hospitality facilities.
Schorp, Matthias K; Leyden, Donald E
2002-02-01
A number of publications report statistical summaries for environmental tobacco smoke (ETS) concentrations. Despite compelling evidence for the data not being normally distributed, these publications typically report the arithmetic mean and standard deviation of the data, thereby losing important information related to the distribution of values contained in the original data. We were interested in the frequency distributions of reported nicotine concentrations in hospitality environments and subjected available data to distribution analyses. The distribution of experimental indoor airborne nicotine concentration data taken from hospitality facilities worldwide was fit to lognormal, Weibull, exponential, Pearson (Type V), logistic, and loglogistic distribution models. Comparison of goodness of fit (GOF) parameters and indications from the literature verified the selection of a lognormal distribution as the overall best model. When individual data were not reported in the literature, statistical summaries of results were used to model sets of lognormally distributed data that are intended to mimic the original data distribution. Grouping the data into various categories led to 31 frequency distributions that were further interpreted. The median values in nonsmoking environments are about half of the median values in smoking sections. When different continents are compared, Asian, European, and North American median values in restaurants are about a factor of three below levels encountered in other hospitality facilities. On a comparison of nicotine concentrations in North American smoking sections and nonsmoking sections, median values are about one-third of the European levels. The results obtained may be used to address issues related to exposure to ETS in the hospitality sector.
Thelen, Kirstin; Coboeken, Katrin; Willmann, Stefan; Dressman, Jennifer B; Lippert, Jörg
2012-03-01
The physiological absorption model presented in part I of this work is now extended to account for dosage-form-dependent gastrointestinal (GI) transit as well as disintegration and dissolution processes of various immediate-release and modified-release dosage forms. Empirical functions of the Weibull type were fitted to experimental in vitro dissolution profiles of solid dosage forms for eight test compounds (aciclovir, caffeine, cimetidine, diclofenac, furosemide, paracetamol, phenobarbital, and theophylline). The Weibull functions were then implemented into the model to predict mean plasma concentration-time profiles of the various dosage forms. On the basis of these dissolution functions, pharmacokinetics (PK) of six model drugs was predicted well. In the case of diclofenac, deviations between predicted and observed plasma concentrations were attributable to the large variability in gastric emptying time of the enteric-coated tablets. Likewise, oral PK of furosemide was found to be predominantly governed by the gastric emptying patterns. It is concluded that the revised model for GI transit and absorption was successfully integrated with dissolution functions of the Weibull type, enabling prediction of in vivo PK profiles from in vitro dissolution data. It facilitates a comparative analysis of the parameters contributing to oral drug absorption and is thus a powerful tool for formulation design. Copyright © 2011 Wiley Periodicals, Inc.
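A hedged sketch of a Weibull-type dissolution function of the kind fitted to the in vitro profiles described above, here in the form F(t) = Fmax * (1 - exp(-((t - Tlag)^b) / a)); the parameter values are invented and not tied to any of the eight test compounds.

# Weibull dissolution function evaluated at a few time points.
import numpy as np

Fmax, a, b, Tlag = 1.0, 25.0, 1.4, 5.0       # assumed parameters; t in minutes

def weibull_dissolution(t):
    t_eff = np.clip(t - Tlag, 0.0, None)     # nothing dissolves before the lag time
    return Fmax * (1.0 - np.exp(-(t_eff ** b) / a))

for t in (10, 20, 30, 60, 120):
    print(f"t = {t:3d} min: fraction dissolved = {weibull_dissolution(t):.2f}")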
The effects of surface finish and grain size on the strength of sintered silicon carbide
NASA Technical Reports Server (NTRS)
You, Y. H.; Kim, Y. W.; Lee, J. G.; Kim, C. H.
1985-01-01
The effects of surface treatment and microstructure, especially abnormal grain growth, on the strength of sintered SiC were studied. The surfaces of sintered SiC were treated with 400, 800 and 1200 grit diamond wheels. Grain growth was induced by increasing the sintering times at 2050 °C. The beta to alpha transformation occurred during the sintering of beta-phase starting materials and was often accompanied by abnormal grain growth. The overall strength distributions were established using Weibull statistics. The strength of the sintered SiC is limited by extrinsic surface flaws in normal-sintered specimens. The finer the surface finish and grain size, the higher the strength. However, the strength of abnormally sintered specimens is limited by the abnormally grown large tabular grains. The Weibull modulus increases with decreasing grain size and decreasing grit size for grinding.
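A hedged sketch of the Weibull-statistics step: the Weibull modulus m is estimated from a set of fracture strengths using median-rank plotting positions and a linear fit of ln(ln(1/(1-F))) against ln(strength). The strength values are invented, not the sintered SiC data.

# Weibull modulus and characteristic strength by median-rank regression.
import numpy as np

strengths = np.sort(np.array([412., 398., 455., 367., 430., 389., 402., 441.,
                              376., 420.]))       # MPa, invented sample
n = strengths.size
F = (np.arange(1, n + 1) - 0.3) / (n + 0.4)       # median-rank (Benard) estimator

x = np.log(strengths)
y = np.log(np.log(1.0 / (1.0 - F)))
m, intercept = np.polyfit(x, y, 1)                # slope is the Weibull modulus
sigma0 = np.exp(-intercept / m)                   # characteristic strength
print(f"Weibull modulus m = {m:.1f}, characteristic strength = {sigma0:.0f} MPa")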
Testing homogeneity in Weibull-regression models.
Bolfarine, Heleno; Valença, Dione M
2005-10-01
In survival studies with families or geographical units, it may be of interest to test whether such groups are homogeneous for given explanatory variables. In this paper we consider score-type tests for group homogeneity based on a mixing model in which the group effect is modelled as a random variable. As opposed to hazard-based frailty models, this model yields survival times that, conditioned on the random effect, have an accelerated failure time representation. The test statistic requires only estimation of the conventional regression model without the random effect and does not require specifying the distribution of the random effect. The tests are derived for a Weibull regression model, and in the uncensored situation a closed form is obtained for the test statistic. A simulation study is used to compare the power of the tests. The proposed tests are applied to real data sets with censored data.
Performance analysis for mixed FSO/RF Nakagami-m and Exponentiated Weibull dual-hop airborne systems
NASA Astrophysics Data System (ADS)
Jing, Zhao; Shang-hong, Zhao; Wei-hu, Zhao; Ke-fan, Chen
2017-06-01
In this paper, the performance of mixed free-space optical (FSO)/radio frequency (RF) systems based on decode-and-forward relaying is presented. The Exponentiated Weibull fading channel with pointing error effects is adopted for the atmospheric fluctuation of the FSO channel, and the RF link undergoes Nakagami-m fading. We derive the analytical expression for the cumulative distribution function (CDF) of the equivalent signal-to-noise ratio (SNR). Novel mathematical expressions for the outage probability and average bit error rate (BER) are developed based on Meijer's G function. The analytical results show an accurate match with the Monte Carlo simulation results. The outage and BER performance of the mixed system with decode-and-forward relaying is investigated considering atmospheric turbulence and pointing error conditions. The effect of aperture averaging is evaluated under all atmospheric turbulence conditions as well.
Wang, Tao; Wu, Jinhui; Qi, Jiancheng; Hao, Limei; Yi, Ying; Zhang, Zongxing
2016-05-15
Bacillus subtilis subsp. niger spore and Staphylococcus albus are typical biological indicators for the inactivation of airborne pathogens. The present study characterized and compared the behaviors of B. subtilis subsp. niger spores and S. albus in regard to inactivation by chlorine dioxide (ClO2) gas under different gas concentrations and relative humidity (RH) conditions. The inactivation kinetics under different ClO2 gas concentrations (1 to 5 mg/liter) were determined by first-order and Weibull models. A new model (the Weibull-H model) was established to reveal the inactivation tendency and kinetics for ClO2 gas under different RH conditions (30 to 90%). The results showed that both the gas concentration and RH were significantly (P < 0.05) and positively correlated with the inactivation of the two chosen indicators. There was a rapid improvement in the inactivation efficiency under high RH (>70%). Compared with the first-order model, the Weibull and Weibull-H models demonstrated a better fit for the experimental data, indicating nonlinear inactivation behaviors of the vegetative bacteria and spores following exposure to ClO2 gas. The times to achieve a six-log reduction of B. subtilis subsp. niger spore and S. albus were calculated based on the established models. Clarifying the kinetics of inactivation of B. subtilis subsp. niger spores and S. albus by ClO2 gas will allow the development of ClO2 gas treatments that provide an effective disinfection method. Chlorine dioxide (ClO2) gas is a novel and effective fumigation agent with strong oxidization ability and a broad biocidal spectrum. The antimicrobial efficacy of ClO2 gas has been evaluated in many previous studies. However, there are presently no published models that can be used to describe the kinetics of inactivation of airborne pathogens by ClO2 gas under different gas concentrations and RH conditions. The first-order and Weibull (Weibull-H) models established in this study can characterize and compare the behaviors of Bacillus subtilis subsp. niger spores and Staphylococcus albus in regard to inactivation by ClO2 gas, determine the kinetics of inactivation of two chosen strains under different conditions of gas concentration and RH, and provide the calculated time to achieve a six-log reduction. These results will be useful to determine effective conditions for ClO2 gas to inactivate airborne pathogens in contaminated air and other environments and thus prevent outbreaks of airborne illness. Copyright © 2016, American Society for Microbiology. All Rights Reserved.
Shoari, Niloofar; Dubé, Jean-Sébastien; Chenouri, Shoja'eddin
2015-11-01
In environmental studies, concentration measurements frequently fall below detection limits of measuring instruments, resulting in left-censored data. Some studies employ parametric methods such as the maximum likelihood estimator (MLE), robust regression on order statistic (rROS), and gamma regression on order statistic (GROS), while others suggest a non-parametric approach, the Kaplan-Meier method (KM). Using examples of real data from a soil characterization study in Montreal, we highlight the need for additional investigations that aim at unifying the existing literature. A number of studies have examined this issue; however, those considering data skewness and model misspecification are rare. These aspects are investigated in this paper through simulations. Among other findings, results show that for low skewed data, the performance of different statistical methods is comparable, regardless of the censoring percentage and sample size. For highly skewed data, the performance of the MLE method under lognormal and Weibull distributions is questionable; particularly, when the sample size is small or censoring percentage is high. In such conditions, MLE under gamma distribution, rROS, GROS, and KM are less sensitive to skewness. Related to model misspecification, MLE based on lognormal and Weibull distributions provides poor estimates when the true distribution of data is misspecified. However, the methods of rROS, GROS, and MLE under gamma distribution are generally robust to model misspecifications regardless of skewness, sample size, and censoring percentage. Since the characteristics of environmental data (e.g., type of distribution and skewness) are unknown a priori, we suggest using MLE based on gamma distribution, rROS and GROS. Copyright © 2015 Elsevier Ltd. All rights reserved.
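As a concrete illustration of the MLE approach for left-censored concentrations, the sketch below maximizes a Weibull likelihood in which non-detects contribute the CDF evaluated at their detection limits. The data and detection limits are invented for demonstration and do not come from the Montreal study.

```python
import numpy as np
from scipy.stats import weibull_min
from scipy.optimize import minimize

# Hypothetical concentrations: detected values plus left-censored detection limits.
detects = np.array([0.8, 1.2, 2.5, 3.1, 4.4, 6.0, 9.7])
detection_limits = np.array([0.5, 0.5, 1.0])  # non-detects: true value < limit

def neg_log_lik(params):
    shape, scale = params
    if shape <= 0 or scale <= 0:
        return np.inf
    ll = weibull_min.logpdf(detects, shape, scale=scale).sum()
    ll += np.log(weibull_min.cdf(detection_limits, shape, scale=scale)).sum()
    return -ll

res = minimize(neg_log_lik, x0=[1.0, np.mean(detects)], method="Nelder-Mead")
shape_hat, scale_hat = res.x
mean_hat = weibull_min.mean(shape_hat, scale=scale_hat)
print(f"shape = {shape_hat:.2f}, scale = {scale_hat:.2f}, mean = {mean_hat:.2f}")
```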
Probabilistic thermal-shock strength testing using infrared imaging
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wereszczak, A.A.; Scheidt, R.A.; Ferber, M.K.
1999-12-01
A thermal-shock strength-testing technique has been developed that uses a high-resolution, high-temperature infrared camera to capture a specimen's surface temperature distribution at fracture. Aluminum nitride (AlN) substrates are thermally shocked to fracture to demonstrate the technique. The surface temperature distribution for each test and AlN's thermal expansion are used as input in a finite-element model to determine the thermal-shock strength of each specimen. An uncensored thermal-shock strength Weibull distribution is then determined. The test and analysis algorithm shows promise as a means to characterize the thermal-shock strength of ceramic materials.
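For an uncensored strength sample like the one described, a two-parameter Weibull fit is obtained directly by maximum likelihood; the snippet below uses made-up strength values, not the AlN measurements.

```python
import numpy as np
from scipy.stats import weibull_min

# Hypothetical thermal-shock strength data (MPa); not the AlN data of the study.
strength = np.array([310, 345, 362, 388, 401, 415, 430, 455, 470, 502])

# Maximum-likelihood fit of a two-parameter Weibull (location fixed at zero).
m, loc, sigma0 = weibull_min.fit(strength, floc=0)
print(f"Weibull modulus m = {m:.1f}, characteristic strength = {sigma0:.0f} MPa")
```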
Optimal design and use of retry in fault tolerant real-time computer systems
NASA Technical Reports Server (NTRS)
Lee, Y. H.; Shin, K. G.
1983-01-01
A new method to determine an optimal retry policy and to use retry for fault characterization is presented. An optimal retry policy for a given fault characteristic, which determines the maximum allowable retry durations that minimize the total task completion time, was derived. The combined fault characterization and retry decision, in which the characteristics of the fault are estimated simultaneously with the determination of the optimal retry policy, was carried out. Two solution approaches were developed, one based on point estimation and the other on the Bayes sequential decision. Maximum likelihood estimators are used for the first approach, and backward induction for testing hypotheses in the second approach. Numerical examples are presented in which all the durations associated with faults have monotone hazard functions, e.g., exponential, Weibull and gamma distributions. These are standard distributions commonly used for the modeling and analysis of faults.
Cai, Jing; Tyree, Melvin T
2010-07-01
The objective of this study was to quantify the relationship between vulnerability to cavitation and vessel diameter within a species. We measured vulnerability curves (VCs: percentage loss of hydraulic conductivity versus tension) in aspen stems and measured vessel-size distributions. Measurements were done on seed-grown, 4-month-old aspen (Populus tremuloides Michx) grown in a greenhouse. VCs of stem segments were measured using a centrifuge technique and by a staining technique that allowed a VC to be constructed based on vessel diameter size-classes (D). Vessel-based VCs were also fitted to Weibull cumulative distribution functions (CDF), which provided best-fit values of the Weibull CDF constants (c and b) and P50 = the tension causing 50% loss of hydraulic conductivity. We show that P50 = 6.166D^(-0.3134) (R^2 = 0.995) and that b and 1/c are both linear functions of D with R^2 > 0.95. The results are discussed in terms of models of VCs based on vessel D size-classes and in terms of concepts such as the 'pit area hypothesis' and vessel pathway redundancy.
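Given the Weibull CDF parameterization implied above, PLC(T) = 100[1 - exp(-(T/b)^c)], the tension at 50% loss follows in closed form as P50 = b(ln 2)^(1/c). The sketch below illustrates this; the b and c values are arbitrary rather than fitted constants from the paper.

```python
import numpy as np

def plc(tension, b, c):
    """Percentage loss of hydraulic conductivity for a Weibull vulnerability curve."""
    return 100.0 * (1.0 - np.exp(-(tension / b) ** c))

def p50(b, c):
    """Tension causing 50% loss of hydraulic conductivity."""
    return b * np.log(2.0) ** (1.0 / c)

# Arbitrary illustrative Weibull constants (tension in MPa), not values from the study.
b, c = 2.5, 3.0
print(f"P50 = {p50(b, c):.2f} MPa, PLC at P50 = {plc(p50(b, c), b, c):.1f}%")
```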
Drug release from slabs and the effects of surface roughness.
Kalosakas, George; Martini, Dimitra
2015-12-30
We discuss diffusion-controlled drug release from slabs or thin films. Analytical and numerical results are presented for slabs with flat surfaces, having a uniform thickness. Then, considering slabs with rough surfaces, the influence of a non-uniform slab thickness on release kinetics is numerically investigated. The numerical release profiles are obtained using Monte Carlo simulations. Release kinetics is quantified through the stretched exponential (or Weibull) function, and the resulting dependence of its two parameters on the thickness of the slab, for flat surfaces, and on the amplitude of surface fluctuations (or the degree of thickness variability), in the case of roughness, is presented. We find that a higher surface roughness leads to a faster drug release. Copyright © 2015 Elsevier B.V. All rights reserved.
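The stretched-exponential (Weibull) release function referred to above has the form M_t/M_inf = 1 - exp(-(t/tau)^b). The sketch below fits its two parameters to an illustrative release profile; the data points are invented, not the simulated profiles of the paper.

```python
import numpy as np
from scipy.optimize import curve_fit

def weibull_release(t, tau, b):
    """Fractional drug release M_t / M_inf, stretched-exponential (Weibull) form."""
    return 1.0 - np.exp(-(t / tau) ** b)

# Illustrative release profile (time in hours, released fraction); not from the study.
t = np.array([0.25, 0.5, 1, 2, 4, 8, 16, 24])
released = np.array([0.10, 0.18, 0.31, 0.50, 0.70, 0.88, 0.97, 0.99])

(tau, b), _ = curve_fit(weibull_release, t, released,
                        p0=(2.0, 1.0), bounds=(0, np.inf))
print(f"time scale tau = {tau:.2f} h, exponent b = {b:.2f}")
```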
NASA Astrophysics Data System (ADS)
Shin, Sang Yong; Woo, Kuk Je; Hwang, Byoungchul; Kim, Sangho; Lee, Sunghak
2009-04-01
The fracture toughness in the transition-temperature region of three American Petroleum Institute (API) X70 and X80 pipeline steels was analyzed in accordance with the American Society for Testing and Materials (ASTM) E1921-05 standard test method. The elastic-plastic cleavage fracture toughness (K_Jc) was determined by three-point bend tests, using precracked Charpy V-notch (PCVN) specimens; the measured K_Jc values were then interpreted by the three-parameter Weibull distribution. The fracture-toughness test results indicated that the master curve and the 98 pct confidence curves explained the variation in the measured fracture toughness well. The reference temperatures obtained from the fracture-toughness test and index temperatures obtained from the Charpy impact test were lowest in the X70 steel rolled in the two-phase region, because this steel had smaller effective grains and the lowest volume fraction of hard phases. In this steel, few hard phases led to a higher resistance to cleavage crack initiation, and the smaller effective grain size led to a higher possibility of crack arrest, thereby resulting in the best overall fracture properties. Measured reference temperatures were then comparatively analyzed with the index temperatures obtained from the Charpy impact test, and the effects of microstructures on these temperatures were discussed.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sokolov, Mikhail A
2010-01-01
A force-displacement trace of a Charpy impact test of a reactor pressure vessel (RPV) steel in the transition range has a characteristic point, the so-called force at the end of unstable crack propagation, Fa. A two-parameter Weibull probability function is used to model the distribution of Fa in Charpy tests performed at ORNL on different RPV steels in the unirradiated and irradiated conditions. These data have a good replication at a given test temperature; thus, the statistical analysis was applicable. It is shown that when temperature is normalized to TNDT (T-TNDT) or to T100a (T-T100a), the median Fa values of different RPV steels have a tendency to form the same shape of temperature dependence. Depending on the normalization temperature, TNDT or T100a, this suggests a universal shape of the temperature dependence of Fa for different RPV steels. The best fits for these temperature dependencies are presented. These dependencies are suggested for use in estimation of TNDT or T100a from randomly generated Charpy impact tests. The maximum likelihood methods are used to derive equations to estimate TNDT and T100a from randomly generated Charpy impact tests.
Bazant, Zdenĕk P; Pang, Sze-Dai
2006-06-20
In mechanical design as well as protection from various natural hazards, one must ensure an extremely low failure probability such as 10^-6. How to achieve that goal is adequately understood only for the limiting cases of brittle or ductile structures. Here we present a theory to do that for the transitional class of quasibrittle structures, having brittle constituents and characterized by nonnegligible size of material inhomogeneities. We show that the probability distribution of strength of the representative volume element of material is governed by the Maxwell-Boltzmann distribution of atomic energies and the stress dependence of activation energy barriers; that it is statistically modeled by a hierarchy of series and parallel couplings; and that it consists of a broad Gaussian core having a grafted far-left power-law tail with zero threshold and amplitude depending on temperature and load duration. With increasing structure size, the Gaussian core shrinks and the Weibull tail expands according to the weakest-link model for a finite chain of representative volume elements. The model captures experimentally observed deviations of the strength distribution from the Weibull distribution and of the mean strength scaling law from a power law. These deviations can be exploited for verification and calibration. The proposed theory will increase the safety of concrete structures, composite parts of aircraft or ships, microelectronic components, microelectromechanical systems, prosthetic devices, etc. It also will improve protection against hazards such as landslides, avalanches, ice breaks, and rock or soil failures.
NASA Technical Reports Server (NTRS)
Phoenix, S. Leigh; Kezirian, Michael T.; Murthy, Pappu L. N.
2009-01-01
Composite Overwrapped Pressure Vessels (COPVs) that have survived a long service time under pressure generally must be recertified before service is extended. Sometimes lifetime testing is performed on an actual COPV in service in an effort to validate the reliability model that is the basis for certifying the continued flight worthiness of its sisters. Currently, testing of such a Kevlar 49®/epoxy COPV is nearing completion. The present paper focuses on a Bayesian statistical approach to analyze the possible failure time results of this test and to assess the implications in choosing between possible model parameter values that in the past have had significant uncertainty. The key uncertain parameters in this case are the actual fiber stress ratio at operating pressure and the Weibull shape parameter for lifetime; the former has been uncertain due to ambiguities in interpreting the original and a duplicate burst test. The latter has been uncertain due to major differences between COPVs in the database and the actual COPVs in service. Any information obtained that clarifies and eliminates uncertainty in these parameters will have a major effect on the predicted reliability of the service COPVs going forward. The key result is that the longer the vessel survives, the more likely the more optimistic stress ratio is correct. At the time of writing, the resulting effect on predicted future reliability is dramatic, increasing it by about one 'nine', that is, reducing the probability of failure by an order of magnitude. However, testing one vessel does not change the uncertainty on the Weibull shape parameter for lifetime, since testing several would be necessary.
Multiscale Simulation of Porous Ceramics Based on Movable Cellular Automaton Method
NASA Astrophysics Data System (ADS)
Smolin, A.; Smolin, I.; Eremina, G.; Smolina, I.
2017-10-01
The paper presents a model for simulating the mechanical behaviour of multiscale porous ceramics based on the movable cellular automaton method, which is a novel particle method in the computational mechanics of solids. The initial scale of the proposed approach corresponds to the characteristic size of the smallest pores in the ceramics. At this scale, we model uniaxial compression of several representative samples with an explicit account of pores of the same size but with random unique positions in space. As a result, we get the average values of Young's modulus and strength, as well as the parameters of the Weibull distribution of these properties at the current scale level. These data allow us to describe the material behaviour at the next scale level, where only the larger pores are considered explicitly, while the influence of small pores is included via the effective properties determined at the previous scale level. If the pore size distribution function of the material has N maxima, we need to perform computations for N - 1 levels in order to get the properties from the lowest scale up to the macroscale step by step. The proposed approach was applied to modelling zirconia ceramics with a bimodal pore size distribution. The obtained results show correct behaviour of the model sample at the macroscale.
Detailed Analysis of the Interoccurrence Time Statistics in Seismic Activity
NASA Astrophysics Data System (ADS)
Tanaka, Hiroki; Aizawa, Yoji
2017-02-01
The interoccurrence time statistics of seismicity is studied theoretically as well as numerically by taking into account the conditional probability and the correlations among many earthquakes in different magnitude levels. It is known so far that the interoccurrence time statistics is well approximated by the Weibull distribution, but more detailed information about the interoccurrence times can be obtained from the analysis of the conditional probability. Firstly, we propose the Embedding Equation Theory (EET), where the conditional probability is described by two kinds of correlation coefficients; one is the magnitude correlation and the other is the inter-event time correlation. Furthermore, the scaling law of each correlation coefficient is clearly determined from the numerical data analysis carried out with the Preliminary Determination of Epicenter (PDE) Catalog and the Japan Meteorological Agency (JMA) Catalog. Secondly, the EET is examined to derive the magnitude dependence of the interoccurrence time statistics, and the multi-fractal relation is successfully formulated. Theoretically we cannot prove the universality of the multi-fractal relation in seismic activity; nevertheless, the theoretical results well reproduce all numerical data in our analysis, where several common features or invariant aspects are clearly observed. Especially in the case of stationary ensembles the multi-fractal relation seems to obey an invariant curve; furthermore, in the case of non-stationary (moving time) ensembles for the aftershock regime, the multi-fractal relation seems to satisfy a certain invariant curve at any moving time. It is emphasized that the multi-fractal relation plays an important role in unifying the statistical laws of seismicity: actually the Gutenberg-Richter law and the Weibull distribution are unified in the multi-fractal relation, and some universality conjectures regarding seismicity are briefly discussed.
Trudeau, Michaela P; Verma, Harsha; Sampedro, Fernando; Urriola, Pedro E; Shurson, Gerald C; McKelvey, Jessica; Pillai, Suresh D; Goyal, Sagar M
2016-01-01
Infection with porcine epidemic diarrhea virus (PEDV) causes diarrhea, vomiting, and high mortality in suckling pigs. Contaminated feed has been suggested as a vehicle of transmission for PEDV. The objective of this study was to compare the effects of thermal and electron beam processing, and of the inclusion of feed additives, on the inactivation of PEDV in feed. Feed samples were spiked with PEDV and then heated to 120-145°C for up to 30 min or irradiated at 0-50 kGy. Another set of feed samples spiked with PEDV and mixed with Ultracid P (Nutriad), Activate DA (Novus International), KEM-GEST (Kemin Agrifood), Acid Booster (Agri-Nutrition), sugar or salt was incubated at room temperature (~25°C) for up to 21 days. At the end of incubation, the virus titers were determined by inoculation of Vero-81 cells and the virus inactivation kinetics were modeled using the Weibull distribution model. The Weibull kinetic parameter delta represented the time or eBeam dose required to reduce virus concentration by 1 log. For thermal processing, delta values ranged from 16.52 min at 120°C to 1.30 min at 145°C. For eBeam processing, a target dose of 50 kGy reduced PEDV concentration by 3 log. All additives tested were effective in reducing the survival of PEDV when compared with the control sample (delta = 17.23 days). Activate DA (delta = 0.81 days) and KEM-GEST (delta = 3.28 days) produced the fastest inactivation. In conclusion, heating swine feed at temperatures over 130°C or eBeam processing of feed with a dose over 50 kGy are effective processing steps to reduce PEDV survival. Additionally, the inclusion of selected additives can decrease PEDV survivability.
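Because the Weibull parameter delta reported above is the time (or dose) for a one-log reduction, the time for any X-log reduction is delta·X^(1/p) once the shape parameter p is known. The snippet below shows the arithmetic; the shape parameter is an assumption for illustration, and only the delta values come from the abstract.

```python
def time_to_x_log(delta, p, x_log):
    """Time (or dose) for an x-log reduction under log10(N/N0) = -(t/delta)**p."""
    return delta * x_log ** (1.0 / p)

# delta values quoted in the abstract; shape parameter p assumed = 1.0 for illustration.
for label, delta in [("120C heat (min)", 16.52), ("145C heat (min)", 1.30)]:
    print(label, "4-log time:", round(time_to_x_log(delta, p=1.0, x_log=4), 2))
```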
NASA Astrophysics Data System (ADS)
Neumeister, Jonas M.
1993-08-01
The tensile behavior of a brittle matrix composite is studied for post matrix crack saturation conditions. Scatter of fiber strength following the Weibull distribution as well as the influence of the major microstructural variables is considered. The stress in a fiber is assumed to recover linearly around a failure due to a fiber-matrix interface behavior mainly ruled by friction. The constitutive behavior for such a composite is analysed. Results are given for a simplified and a refined approximate description and compared with an analysis resulting from the exact analytical theory of fiber fragmentation. It is shown that the stress-strain relation for the refined model excellently follows the exact solution and gives the location of the maximum to within 1% in both stress and strain; for most materials the agreement is even better. Also it is shown that all relations can be normalized to depend on only two variables: a stress reference and the Weibull exponent. For systems with low scatter in fiber strength the simplified model is sufficient to determine the stress maximum but not the postcritical behavior. In addition, the simplified model gives explicit analytical expressions for the maximum stress and corresponding strain. None of the models contain any volume dependence or statistical scatter, but the maximum stress given by the stress-strain relation constitutes an upper bound for the ultimate tensile strength of the composite.
Comparative Fatigue Lives of Rubber and PVC Wiper Cylindrical Coatings
NASA Technical Reports Server (NTRS)
Vlcek, Brian L.; Hendricks, Robert C.; Zaretsky, Erwin V.; Savage, Michael
2002-01-01
Three coating materials for rotating cylindrical-coated wiping rollers were fatigue tested in two intaglio printing presses. The coatings were a hard, cross-linked, plasticized PVC thermoset (P-series); a plasticized PVC (A-series); and a hard, nitrile rubber (R-series). Both 2- and 3-parameter Weibull analyses as well as a cost-benefit analysis were performed. The mean value of life for the R-series coating is 24 and 9 times longer than that of the P- and A-series coatings, respectively. Both the cost and the replacement rate for the R-series coating were significantly lower than those for the P- and A-series coatings. At a very high probability of survival, the R-series coating lasts approximately 2 and 6 times longer than the P- and A-series coatings, respectively, before the first failure occurs. Where all coatings are run to failure, using the mean (life) time between removal (MTBR) for each coating to calculate the number of replacements and costs provides qualitatively similar results to those using a Weibull analysis.
Rajan, S; Ahn, J; Balasubramaniam, V M; Yousef, A E
2006-04-01
Bacillus amyloliquefaciens is a potential surrogate for Clostridium botulinum in validation studies involving bacterial spore inactivation by pressure-assisted thermal processing. Spores of B. amyloliquefaciens Fad 82 were inoculated into egg patty mince (approximately 1.4 x 10(8) spores per g), and the product was treated with combinations of pressure (0.1 to 700 MPa) and heat (95 to 121 degrees C) in a custom-made high-pressure kinetic tester. The values for the inactivation kinetic parameter (D), temperature coefficient (zT), and pressure coefficient (zP) were determined with a linear model. Inactivation parameters from the nonlinear Weibull model also were estimated. An increase in process pressure decreased the D-value at 95, 105, and 110 degrees C; however, at 121 degrees C the contribution of pressure to spore lethality was less pronounced. The zP-value increased from 170 MPa at 95 degrees C to 332 MPa at 121 degrees C, suggesting that B. amyloliquefaciens spores became less sensitive to pressure changes at higher temperatures. Similarly, the zT-value increased from 8.2 degrees C at 0.1 MPa to 26.8 degrees C at 700 MPa, indicating that at elevated pressures, the spores were less sensitive to changes in temperature. The nonlinear Weibull model parameter b increased with increasing pressure or temperature and was inversely related to the D-value. Pressure-assisted thermal processing is a potential alternative to thermal processing for producing shelf-stable egg products.
NASA Technical Reports Server (NTRS)
Bast, Callie C.; Jurena, Mark T.; Godines, Cody R.; Chamis, Christos C. (Technical Monitor)
2001-01-01
This project included both research and education objectives. The goal of this project was to advance innovative research and education objectives in theoretical and computational probabilistic structural analysis, reliability, and life prediction for improved reliability and safety of structural components of aerospace and aircraft propulsion systems. Research and education partners included Glenn Research Center (GRC) and Southwest Research Institute (SwRI) along with the University of Texas at San Antonio (UTSA). SwRI enhanced the NESSUS (Numerical Evaluation of Stochastic Structures Under Stress) code and provided consulting support for NESSUS-related activities at UTSA. NASA funding supported three undergraduate students, two graduate students, a summer course instructor and the Principal Investigator. Matching funds from UTSA provided for the purchase of additional equipment for the enhancement of the Advanced Interactive Computational SGI Lab established during the first year of this Partnership Award to conduct the probabilistic finite element summer courses. The research portion of this report presents the culmination of work performed through the use of the probabilistic finite element program NESSUS (Numerical Evaluation of Stochastic Structures Under Stress) and an embedded Material Strength Degradation (MSD) model. Probabilistic structural analysis provided for quantification of uncertainties associated with the design, thus enabling increased system performance and reliability. The structure examined was a Space Shuttle Main Engine (SSME) fuel turbopump blade. The blade material analyzed was Inconel 718, since the MSD model was previously calibrated for this material. Reliability analysis encompassing the effects of high temperature and high-cycle fatigue yielded a reliability value of 0.99978 using a fully correlated random field for the blade thickness. The reliability did not change significantly for a change in distribution type except for a change in distribution from Gaussian to Weibull for the centrifugal load. The sensitivity factors determined to be most dominant were the centrifugal loading and the initial strength of the material. These two sensitivity factors were influenced most by a change in distribution type from Gaussian to Weibull. The education portion of this report describes short-term and long-term educational objectives. Such objectives serve to integrate the research and education components of this project, resulting in opportunities for ethnic minority students, principally Hispanic. The primary vehicle to facilitate such integration was the teaching of two probabilistic finite element method courses to undergraduate engineering students in the summers of 1998 and 1999.
Kokolis, John; Chakmakchi, Makdad; Theocharopoulos, Antonios; Prombonas, Anthony
2015-01-01
PURPOSE The mechanical and interfacial characterization of a laser-welded Co-Cr alloy with two different joint designs. MATERIALS AND METHODS Dumbbell cast specimens (n=30) were divided into 3 groups (R, I, K, n=10). Group R consisted of intact specimens, group I of specimens sectioned with a straight cut, and group K of specimens with a 45° bevel made at one welding edge. The microstructure and the elemental distributions of alloy and welding regions were examined by SEM/EDX analysis, and the specimens were then loaded in tension up to fracture. The tensile strength (TS) and elongation (ε) were determined and statistically compared among groups employing 1-way ANOVA, the SNK multiple comparison test (α=.05) and Weibull analysis, where the Weibull modulus m and characteristic strength σ0 were identified. Fractured surfaces were imaged by SEM. RESULTS SEM/EDX analysis showed that the cast alloy consists of two phases with differences in mean atomic number contrast, while no mean atomic number contrast was identified for the welded regions. EDX analysis revealed an increased Cr and Mo content at the alloy-joint interface. All mechanical properties of group I (TS, ε, m and σ0) were found inferior to R, while group K showed intermediate values without significant differences from R and I, apart from elongation compared with group R. The fractured surfaces of all groups showed an extensive dendritic pattern, although with a finer structure in the case of the welded groups. CONCLUSION The K-shape joint configuration should be preferred over the I, as it demonstrates improved mechanical strength and survival probability. PMID:25722836
A Compatible Hardware/Software Reliability Prediction Model.
1981-07-22
machines. In particular, he was interested in the following problem: assume that one has a collection of connected elements computing and transmitting...software reliability prediction model is desirable, the findings about the Weibull distribution are intriguing. After collecting failure data from several...capacitor, some of the added charge carriers are collected by the capacitor. If the added charge is sufficiently large, the information stored is changed
A. Broido; Hsiukang Yow
1977-01-01
Even before weight loss in the low-temperature pyrolysis of cellulose becomes significant, the average degree of polymerization of the partially pyrolyzed samples drops sharply. The gel permeation chromatograms of nitrated derivatives of the samples can be described in terms of a small number of mixed size populations, each component fitted within reasonable limits by a...
Wear-Out Sensitivity Analysis Project Abstract
NASA Technical Reports Server (NTRS)
Harris, Adam
2015-01-01
During the course of the Summer 2015 internship session, I worked in the Reliability and Maintainability group of the ISS Safety and Mission Assurance department. My project was a statistical analysis of how sensitive ORUs (Orbital Replacement Units) are to a reliability parameter called the wear-out characteristic. The intended goal of this was to determine a worst-case scenario of how many spares would be needed if multiple systems started exhibiting wear-out characteristics simultaneously. The goal was also to determine which parts would be most likely to do so. In order to do this, my duties were to take historical data of operational times and failure times of these ORUs and use them to build predictive models of failure using probability distribution functions, mainly the Weibull distribution. Then, I ran Monte Carlo simulations to see how an entire population of these components would perform. From here, my final duty was to vary the wear-out characteristic from the intrinsic value to extremely high wear-out values and to determine how much the probability of sufficiency of the population would shift. This was done for around 30 different ORU populations on board the ISS.
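A minimal version of the kind of Monte Carlo study described above can be sketched as a Weibull renewal simulation: each unit in a population fails and is replaced until the mission horizon, and the probability that a fixed spares stock suffices is tallied while the shape (wear-out) parameter is varied. All numbers below are illustrative assumptions, not ISS data.

```python
import numpy as np

rng = np.random.default_rng(1)

def prob_of_sufficiency(n_units, spares, horizon, scale, shape, n_trials=5000):
    """P(total failures across the population <= spares) over the mission horizon,
    with unit lifetimes drawn from a Weibull(shape, scale) renewal process."""
    sufficient = 0
    for _ in range(n_trials):
        failures = 0
        for _ in range(n_units):
            t = 0.0
            while True:
                t += scale * rng.weibull(shape)  # next time-to-failure
                if t > horizon:
                    break
                failures += 1
        if failures <= spares:
            sufficient += 1
    return sufficient / n_trials

# Illustrative population: 10 units, 3 spares, 5-year horizon, 8-year characteristic life.
for shape in (1.0, 2.0, 4.0):  # increasing wear-out
    p = prob_of_sufficiency(n_units=10, spares=3, horizon=5.0, scale=8.0, shape=shape)
    print(f"shape = {shape:.1f}: probability of sufficiency = {p:.3f}")
```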
Modelling explicit fracture of nuclear fuel pellets using peridynamics
NASA Astrophysics Data System (ADS)
Mella, R.; Wenman, M. R.
2015-12-01
Three dimensional models of explicit cracking of nuclear fuel pellets for a variety of power ratings have been explored with peridynamics, a non-local, mesh free, fracture mechanics method. These models were implemented in the explicitly integrated molecular dynamics code LAMMPS, which was modified to include thermal strains in solid bodies. The models of fuel fracture, during initial power transients, are shown to correlate with the mean number of cracks observed on the inner and outer edges of the pellet, by experimental post irradiation examination of fuel, for power ratings of 10 and 15 W g^-1 UO2. The models of the pellet show the ability to predict expected features such as the mid-height pellet crack, the correct number of radial cracks and initiation and coalescence of radial cracks. This work presents a modelling alternative to empirical fracture data found in many fuel performance codes and requires just one parameter of fracture strain. Weibull distributions of crack numbers were fitted to both numerical and experimental data using maximum likelihood estimation so that statistical comparison could be made. The findings show P-values of less than 0.5% suggesting an excellent agreement between model and experimental distributions.
Scaling and Multifractality in Road Accidental Distances
NASA Astrophysics Data System (ADS)
Qiu, Tian; Wan, Chi; Zou, Xiang-Xiang; Wang, Xiao-Fan
Accidental distance dynamics is investigated based on the road accident data of Great Britain. The distance distribution of all the districts as an ensemble presents a power-law tail, which is different from that of the individual districts. A universal distribution is found for different districts by rescaling the distribution functions of individual districts, which can be well fitted by the Weibull distribution. Male and female drivers behave similarly in the distance distribution. The multifractal characteristic is further studied for the individual districts and for all the districts as an ensemble, and different behaviors are also revealed between them. The accidental distances of the individual districts show weak multifractality, whereas all the districts taken as an ensemble present strong multifractality.
NASA Technical Reports Server (NTRS)
Nisbet, John S.; Barnard, Theresa A.; Forbes, Gregory S.; Krider, E. Philip; Lhermitte, Roger
1990-01-01
The data obtained at the time of the Thunderstorm Research International Project storm at the Kennedy Space Center on July 11, 1978 are analyzed in a model-independent manner. The data base included data from three Doppler radars, a lightning detection and ranging system and a network of 25 electric field mills, and rain gages. Electric field measurements were used to analyze the charge moments transferred by lightning flashes, and the data were fitted to Weibull distributions; these were used to estimate statistical parameters of the lightning for both intracloud and cloud-to-ground flashes and to estimate the fraction of the flashes which were below the observation threshold. The displacement and the conduction current densities were calculated from electric field measurements between flashes. These values were used to derive the magnitudes and the locations of dipole and monopole generators by least squares fitting the measured Maxwell current densities to the displacement-dominated equations.
ETARA PC version 3.3 user's guide: Reliability, availability, maintainability simulation model
NASA Technical Reports Server (NTRS)
Hoffman, David J.; Viterna, Larry A.
1991-01-01
A user's manual describing an interactive, menu-driven, personal computer based Monte Carlo reliability, availability, and maintainability simulation program called event time availability reliability (ETARA) is discussed. Given a reliability block diagram representation of a system, ETARA simulates the behavior of the system over a specified period of time using Monte Carlo methods to generate block failure and repair intervals as a function of exponential and/or Weibull distributions. Availability parameters such as equivalent availability, state availability (percentage of time as a particular output state capability), continuous state duration and number of state occurrences can be calculated. Initial spares allotment and spares replenishment on a resupply cycle can be simulated. The number of block failures are tabulated both individually and by block type, as well as total downtime, repair time, and time waiting for spares. Also, maintenance man-hours per year and system reliability, with or without repair, at or above a particular output capability can be calculated over a cumulative period of time or at specific points in time.
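A stripped-down sketch of the kind of simulation ETARA performs, for a single block alternating Weibull failure intervals with exponential repair intervals, is shown below; the distribution parameters are arbitrary assumptions, and the real program additionally handles full reliability block diagrams, spares, and state tabulation.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_availability(mission_time, weib_shape, weib_scale, mttr, n_runs=2000):
    """Mean availability of one repairable block: Weibull times-to-failure,
    exponential repair times, simulated over the mission by Monte Carlo."""
    availabilities = []
    for _ in range(n_runs):
        t, uptime = 0.0, 0.0
        while t < mission_time:
            ttf = weib_scale * rng.weibull(weib_shape)
            uptime += min(ttf, mission_time - t)
            t += ttf
            if t >= mission_time:
                break
            t += rng.exponential(mttr)  # downtime while under repair
        availabilities.append(uptime / mission_time)
    return float(np.mean(availabilities))

# Illustrative parameters: one-year mission (hours), 2000 h characteristic life, 24 h MTTR.
print(simulate_availability(mission_time=8760.0, weib_shape=1.5,
                            weib_scale=2000.0, mttr=24.0))
```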
The Effect of Structural Quality on Fatigue Life in 319 Aluminum Alloy Castings
NASA Astrophysics Data System (ADS)
Özdeş, Hüseyin; Tiryakioğlu, Murat
2017-02-01
Tensile and fatigue life data for 319 aluminum alloy from seventeen datasets reported in four independent studies from the literature have been reanalyzed. Analysis of fatigue life data involved mean stress correction for the different R ratios used in fatigue testing, inclusion of survival (runout) data along with failure data, as well as volumetric correction of the Weibull distributions for the different specimen sizes used in these studies. Tensile data have been transformed into the structural quality index, Q_T, which is used as a measure of the structural quality of castings. A distinct relationship has been observed between the expected fatigue life and mean quality index. Moreover, fatigue strengths at 10^4 and 10^6 cycles have been found to increase with quality index, providing further evidence about the relationship observed between structural quality and fatigue performance. Empirical equations between Basquin parameters and structural quality index have been developed. The use of the comprehensive methodology to estimate fatigue life is demonstrated with an example.
Lima, Robson B DE; Bufalino, Lina; Alves, Francisco T; Silva, José A A DA; Ferreira, Rinaldo L C
2017-01-01
Currently, there is a lack of studies on the correct utilization of continuous distributions for dry tropical forests. Therefore, this work aims to investigate the diameter structure of a Brazilian tropical dry forest and to select suitable continuous distributions by means of statistical tools for the stand and the main species. Two subsets were randomly selected from 40 plots. Diameter at base height was obtained. The following functions were tested: log-normal; gamma; Weibull 2P and Burr. The best fits were selected by Akaike's information criterion. Overall, the diameter distribution of the dry tropical forest was better described by negative exponential curves and positive skewness. The forest studied showed diameter distributions with decreasing probability for larger trees. This behavior was observed for both the main species and the stand. The generalization of the function fitted for the main species shows that the development of individual models is needed. The Burr function showed good flexibility to describe the diameter structure of the stand and the behavior of the Mimosa ophthalmocentra and Bauhinia cheilantha species. For Poincianella bracteosa, Aspidosperma pyrifolium and Myracrodum urundeuva, better fitting was obtained with the log-normal function.
Effect of environment on fracture toughness of 96 wt pct alumina
NASA Technical Reports Server (NTRS)
Choi, Sung R.; Tikare, Veena; Salem, Jonathan A.
1993-01-01
An effort is made to deepen understanding of environmental effects on the fracture toughness of an alumina composition that contains a residual glassy phase, by ascertaining the fracture toughness under atmospheric conditions in such varied environments as air, distilled water, silicone oil, and liquid nitrogen. Fracture toughness was determined via the single-edge-precracked beam technique. Weibull strength parameters are compared for polished specimens tested in both air and silicone-oil environments.
Regional And Seasonal Aspects Of Within-The-Hour Tec Statistics
NASA Astrophysics Data System (ADS)
Koroglu, Ozan; Arikan, Feza; Koroglu, Meltem
2015-04-01
The ionosphere is one of the atmospheric layers with a plasma structure. Several mechanisms originating from both space and the Earth itself, such as solar radiation and geomagnetic effects, govern this plasma layer. The ionosphere plays an important role in HF and satellite communication and in space-based positioning systems. Therefore, determining the statistical behavior of the ionosphere is of utmost importance. The variability of the ionosphere has complex spatio-temporal characteristics, which depend on solar, geomagnetic, gravitational and seismic activities. Total Electron Content (TEC) is one of the major observables for investigating and determining this variability. In this study, the spatio-temporal within-the-hour statistical behavior of TEC is determined for Turkey, which is located at mid-latitude, using TEC estimates from the Turkish National Permanent GPS Network (TNPGN)-Active between the years 2009 and 2012. TEC estimates are obtained as IONOLAB-TEC, which is developed by the IONOLAB group (www.ionolab.org) at Hacettepe University. IONOLAB-TEC for each station in TNPGN-Active is organized in a database and grouped with respect to years, ionospheric seasons, hours and regions of 2 degrees by 3 degrees in latitude and longitude, respectively. The data sets are used to calculate within-the-hour parametric Probability Density Functions (PDF). For every year, every region and every hour, a representative PDF is determined. It is observed that TEC values have a strong hourly, seasonal and positional dependence in the east-west direction, and the growing trend shifts according to sunrise and sunset times. It is observed that the data are distributed predominantly as Lognormal and Weibull. The averages and standard deviations of the chosen distributions follow the trends in 24-hour diurnal and 11-year solar cycle periods. The regional and seasonal behavior of the PDFs is investigated using a representative GPS station within each region. Within-the-hour PDF estimates are grouped into ionospheric seasons as Winter, Summer, March equinox and September equinox. In the winter and summer seasons, the Lognormal distribution is observed. During equinox seasons, the Weibull distribution is observed more frequently. Furthermore, all hourly TEC values in the same region are combined in order to improve the reliability and accuracy of the probability density function estimates. It is observed that, being in the mid-latitude region, the ionosphere over Turkey has robust characteristics that are distributed as Lognormal and Weibull. Statistical observations on PDF estimates of TEC of the ionosphere over Turkey will contribute to developing a regional and seasonal random field model, which will further contribute to HF channel characterization. This study is supported by a joint grant of TUBITAK 112E568 and RFBR 13-02-91370-CT_a.
Universal Recurrence Time Statistics of Characteristic Earthquakes
NASA Astrophysics Data System (ADS)
Goltz, C.; Turcotte, D. L.; Abaimov, S.; Nadeau, R. M.
2006-12-01
Characteristic earthquakes are defined to occur quasi-periodically on major faults. Do recurrence time statistics of such earthquakes follow a particular statistical distribution? If so, which one? The answer is fundamental and has important implications for hazard assessment. The problem cannot be solved by comparing the goodness of statistical fits as the available sequences are too short. The Parkfield sequence of M ≍ 6 earthquakes, one of the most extensive reliable data sets available, has grown to merely seven events with the last earthquake in 2004, for example. Recently, however, advances in seismological monitoring and improved processing methods have unveiled so-called micro-repeaters, micro-earthquakes which recur exactly in the same location on a fault. It seems plausible to regard these earthquakes as a miniature version of the classic characteristic earthquakes. Micro-repeaters are much more frequent than major earthquakes, leading to longer sequences for analysis. Due to their recent discovery, however, available sequences contain less than 20 events at present. In this paper we present results for the analysis of recurrence times for several micro-repeater sequences from Parkfield and adjacent regions. To improve the statistical significance of our findings, we combine several sequences into one by rescaling the individual sets by their respective mean recurrence intervals and Weibull exponents. This novel approach of rescaled combination yields the most extensive data set possible. We find that the resulting statistics can be fitted well by an exponential distribution, confirming the universal applicability of the Weibull distribution to characteristic earthquakes. A similar result is obtained from rescaled combination, however, with regard to the lognormal distribution.
Virlogeux, Victor; Li, Ming; Tsang, Tim K; Feng, Luzhao; Fang, Vicky J; Jiang, Hui; Wu, Peng; Zheng, Jiandong; Lau, Eric H Y; Cao, Yu; Qin, Ying; Liao, Qiaohong; Yu, Hongjie; Cowling, Benjamin J
2015-10-15
A novel avian influenza virus, influenza A(H7N9), emerged in China in early 2013 and caused severe disease in humans, with infections occurring most frequently after recent exposure to live poultry. The distribution of A(H7N9) incubation periods is of interest to epidemiologists and public health officials, but estimation of the distribution is complicated by interval censoring of exposures. Imputation of the midpoint of intervals was used in some early studies, resulting in estimated mean incubation times of approximately 5 days. In this study, we estimated the incubation period distribution of human influenza A(H7N9) infections using exposure data available for 229 patients with laboratory-confirmed A(H7N9) infection from mainland China. A nonparametric model (Turnbull) and several parametric models accounting for the interval censoring in some exposures were fitted to the data. For the best-fitting parametric model (Weibull), the mean incubation period was 3.4 days (95% confidence interval: 3.0, 3.7) and the variance was 2.9 days; results were very similar for the nonparametric Turnbull estimate. Under the Weibull model, the 95th percentile of the incubation period distribution was 6.5 days (95% confidence interval: 5.9, 7.1). The midpoint approximation for interval-censored exposures led to overestimation of the mean incubation period. Public health observation of potentially exposed persons for 7 days after exposure would be appropriate. © The Author 2015. Published by Oxford University Press on behalf of the Johns Hopkins Bloomberg School of Public Health. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
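The interval-censored likelihood used for such incubation-period estimates assigns each case the probability mass F(b_i) - F(a_i) between its earliest and latest possible incubation times. The sketch below maximizes that Weibull likelihood for a handful of invented exposure windows; it is not the A(H7N9) dataset.

```python
import numpy as np
from scipy.stats import weibull_min
from scipy.optimize import minimize
from scipy.special import gamma

# Invented incubation intervals (lower, upper) in days, for illustration only.
intervals = np.array([(1, 3), (2, 4), (0.5, 2), (3, 6), (2, 5), (1, 4), (4, 7)])

def neg_log_lik(params):
    shape, scale = params
    if shape <= 0 or scale <= 0:
        return np.inf
    lo, hi = intervals[:, 0], intervals[:, 1]
    prob = weibull_min.cdf(hi, shape, scale=scale) - weibull_min.cdf(lo, shape, scale=scale)
    return -np.sum(np.log(np.clip(prob, 1e-12, None)))

res = minimize(neg_log_lik, x0=[1.5, 3.0], method="Nelder-Mead")
shape_hat, scale_hat = res.x
mean_incubation = scale_hat * gamma(1.0 + 1.0 / shape_hat)  # Weibull mean
p95 = weibull_min.ppf(0.95, shape_hat, scale=scale_hat)     # 95th percentile
print(f"mean = {mean_incubation:.2f} d, 95th percentile = {p95:.2f} d")
```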
Preparation and Mechanical Behavior of Glass-Ceramics from Feldspathic Frits
NASA Astrophysics Data System (ADS)
da Silva, Fernanda A. N. G.; Barbato, Carla N.; França, Silvia C. A.; Silva, Ana Lúcia N.; de Andrade, Mônica C.
2017-10-01
Glass-ceramics were produced from frits with feldspar (79.09% wt/wt), alumina, sodium carbonate, potassium carbonate, borax and cerium dioxide. Feldspathic frits obtained at 1200 °C were shaped and sintered at various temperatures. Flexural strength results were analyzed by using the Weibull statistical distribution. These materials were also characterized by x-ray diffraction and scanning electron microscopy (SEM). At 600 °C, an initial leucite formation occurred as a crystalline phase, but the amorphous phase still prevailed, with low flexural strength. On the other hand, when the temperature increased to 800 °C, flexural strength also increased, to approximately 70 MPa with a Weibull modulus m = 4.4. This behavior was explained by the formation of leucite crystals dispersed within the glassy matrix, which, at a certain concentration, hinder the propagation of cracks. However, for the sintering temperature of 1000 °C, flexural strength decreased, which may be associated with higher levels of leucite crystals, in spite of the higher reliability (m = 6.6).
Pugno, Nicola M
2007-01-01
In this paper we present a statistical analogy between the collapse of solids and living organisms; in particular, we deduce a statistical law governing their probability of death. We have derived such a law by coupling the widely used Weibull statistics, developed for describing the distribution of the strength of solids, with a general model for ontogenetic growth recently proposed in the literature. The main idea presented in this paper is that cracks can propagate in solids and cause their failure just as sick cells in living organisms can cause their death. Making a rough analogy, living organisms are found to behave as "growing" mechanical components under cyclic, i.e., fatigue, loadings, composed of a dynamic evolutionary material that, as an ineluctable fate, deteriorates. The implications for biological scaling laws are discussed. As an example, we apply such Dynamic Weibull Statistics to large data collections on human deaths due to cancer of various types recorded in Italy: a significant agreement is observed.
Rolling Bearing Life Prediction-Past, Present, and Future
NASA Technical Reports Server (NTRS)
Zaretsky, E V; Poplawski, J. V.; Miller, C. R.
2000-01-01
Comparisons were made between the life prediction formulas of Lundberg and Palmgren, Ioannides and Harris, and Zaretsky and full-scale ball and roller bearing life data. The effect of Weibull slope on bearing life prediction was determined. Life factors are proposed to adjust the respective life formulas to the normalized statistical life distribution of each bearing type. The Lundberg-Palmgren method resulted in the most conservative life predictions compared to the Ioannides-Harris and Zaretsky methods, which produced statistically similar results. Roller profile can have significant effects on bearing life prediction results. Roller edge loading can reduce life by as much as 98 percent. The resultant predicted life depends not only on the life equation used but also on the Weibull slope assumed, the least variation occurring with the Zaretsky equation. The load-life exponent p of 10/3 used in the American National Standards Institute (ANSI)/American Bearing Manufacturers Association (ABMA)/International Organization for Standardization (ISO) standards is inconsistent with the majority of roller bearings designed and used today.
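The sensitivity to the assumed Weibull slope noted above follows from the two-parameter Weibull life model, in which the life at survival probability S is L_S = eta·[ln(1/S)]^(1/e). The snippet below shows how the ratio of median life L50 to L10 life changes with the assumed slope e; the slope values are chosen for illustration (1.11 is close to a classical rolling-bearing assumption).

```python
import numpy as np

def life_at_survival(eta, slope, survival):
    """Life at a given survival probability for a two-parameter Weibull life model."""
    return eta * np.log(1.0 / survival) ** (1.0 / slope)

# Ratio of median life L50 to L10 life for several assumed Weibull slopes.
for e in (1.11, 1.5, 2.0):
    ratio = life_at_survival(1.0, e, 0.5) / life_at_survival(1.0, e, 0.9)
    print(f"Weibull slope e = {e}: L50 / L10 = {ratio:.1f}")
```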
Meteorite fractures and the behavior of meteoroids in the atmosphere
NASA Astrophysics Data System (ADS)
Bryson, K.; Ostrowski, D. R.; Sears, D. W. G.
2015-12-01
Arguably the major difficulty faced in modeling the atmospheric behavior of objects entering the atmosphere is that we know very little about the internal structure of these objects and their methods of fragmentation during fall. In a study of over a thousand meteorite fragments (mostly hand-sized, some 40 or 50 cm across) in the collections of the Natural History Museums in Vienna and London, we identified six kinds of fracturing behavior. (1) Chondrites usually showed random fractures with no particular sensitivity to meteorite texture. (2) Coarse irons fractured along kamacite grain boundaries, while (3) fine irons fragmented randomly, cf. chondrites. (4) Fine irons with large crystal boundaries (e.g. Arispe) fragmented along the crystal boundaries. (5) A few chondrites, three in the present study, have a distinct and strong network of fractures making a brickwork or chicken-wire structure. The Chelyabinsk meteorite has the chicken-wire structure of fractures, which explains the very large number of centimeter-sized fragments that showered the Earth. Finally, (6) previous work on Sutter's Mill showed that water-rich meteorites fracture around clasts. To scale the meteorite fractures to the fragmentation behavior of near-Earth asteroids, it has been suggested that the fracturing behavior follows a statistical prediction made in the 1930s, the Weibull distribution, where fractures are assumed to be randomly distributed through the target and the likelihood of encountering a fracture increases with distance. This results in the relationship σ_l = σ_s (n_s/n_l)^α, where σ_s and σ_l refer to the stress in the small and large object and n_s and n_l refer to the number of cracks per unit volume of the small and large object. The value for α, the Weibull coefficient, is unclear. The Ames meteorite laboratory is working to measure the density and length of fractures observed in these six types of fracture to determine values of the Weibull coefficient for each type of object.
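The Weibull scaling relation quoted above, σ_l = σ_s (n_s/n_l)^α, gives the expected strength drop from a hand-sized specimen to a much larger body once α is chosen; the snippet below simply evaluates that relation for illustrative inputs, since the appropriate α is exactly what the laboratory work aims to determine.

```python
def scaled_strength(sigma_small, n_small, n_large, alpha):
    """Weibull strength scaling: sigma_l = sigma_s * (n_s / n_l)**alpha."""
    return sigma_small * (n_small / n_large) ** alpha

# Illustrative values only: 30 MPa specimen strength, ratio n_s/n_l = 1e-6, alpha = 0.25.
print(f"{scaled_strength(30.0, 1.0, 1.0e6, 0.25):.2f} MPa")
```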
NASA Astrophysics Data System (ADS)
Zhao, Pei; Shao, Ming-an; Horton, Robert
2011-02-01
Soil particle-size distributions (PSD) have been used to estimate soil hydraulic properties. Various parametric PSD models have been proposed to describe the soil PSD from sparse experimental data. It is important to determine which PSD model best represents specific soils. Fourteen PSD models were examined in order to determine the best model for representing the deposited soils adjacent to dams in the China Loess Plateau; these were: Skaggs (S-1, S-2, and S-3), fractal (FR), Jaky (J), Lima and Silva (LS), Morgan (M), Gompertz (G), logarithm (L), exponential (E), log-exponential (LE), Weibull (W), van Genuchten type (VG) as well as Fredlund (F) models. Four hundred and eighty samples were obtained from soils deposited in the Liudaogou catchment. The coefficient of determination (R^2), Akaike's information criterion (AIC), and the modified AIC (mAIC) were used. Based upon R^2 and AIC, the three- and four-parameter models were both good at describing the PSDs of deposited soils, and the LE, FR, and E models were the poorest. However, the mAIC, in conjunction with the R^2 and AIC results, indicated that the W model was optimum for describing the PSD of the deposited soils when the effect of the number of parameters was emphasized. This analysis was also helpful for identifying the best model. Our results are applicable to the China Loess Plateau.
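As an illustration of how one of the candidate models can be ranked, the sketch below fits the Weibull cumulative PSD form P(d) = 1 - exp(-(d/a)^b) by least squares and computes the least-squares form of AIC from the residual sum of squares. The particle-size data are invented, not the Liudaogou samples.

```python
import numpy as np
from scipy.optimize import curve_fit

def weibull_psd(d, a, b):
    """Cumulative mass fraction finer than diameter d (Weibull form)."""
    return 1.0 - np.exp(-(d / a) ** b)

# Invented sieve data: particle diameter (mm) and cumulative fraction finer.
d = np.array([0.002, 0.01, 0.05, 0.1, 0.25, 0.5, 1.0, 2.0])
frac = np.array([0.05, 0.15, 0.38, 0.55, 0.78, 0.90, 0.97, 1.00])

params, _ = curve_fit(weibull_psd, d, frac, p0=(0.1, 1.0), bounds=(0, np.inf))
rss = np.sum((frac - weibull_psd(d, *params)) ** 2)
n, k = len(d), len(params)
aic = n * np.log(rss / n) + 2 * k  # least-squares form of AIC
print(f"a = {params[0]:.3f}, b = {params[1]:.2f}, AIC = {aic:.1f}")
```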
A New Femtosecond Laser-Based Three-Dimensional Tomography Technique
NASA Astrophysics Data System (ADS)
Echlin, McLean P.
2011-12-01
Tomographic imaging has dramatically changed science, most notably in the fields of medicine and biology, by producing 3D views of structures which are too complex to understand in any other way. Current tomographic techniques require extensive time both for post-processing and for data collection. Femtosecond laser-based tomographic techniques have been developed both in standard atmosphere (the femtosecond laser-based serial sectioning technique, FSLSS) and in vacuum (the Tri-Beam system) for the fast collection (10^5 μm^3/s) of mm^3-sized 3D datasets. Both techniques use femtosecond laser pulses to selectively remove layer-by-layer areas of material with low collateral damage and a negligible heat-affected zone. To the author's knowledge, femtosecond lasers had never before been used for serial sectioning, and these techniques have been entirely and uniquely developed by the author and his collaborators at the University of Michigan and the University of California Santa Barbara. The FSLSS was applied to measure the 3D distribution of titanium nitride (TiN) particles in a 4330 steel. Single-pulse ablation morphologies and rates were measured and collected from the literature. Simultaneous two-phase ablation of TiN and the steel matrix was shown to occur at fluences of 0.9-2 J/cm^2. Laser scanning protocols were developed that minimize surface roughness to 0.1-0.4 μm for laser-based sectioning. The FSLSS technique was used to section and 3D reconstruct TiN-containing 4330 steel. Statistical analyses of 3D TiN particle sizes, distribution parameters, and particle density were performed. A methodology was developed to use the 3D datasets to produce statistical volume elements (SVEs) for toughness modeling. Six FSLSS TiN datasets were sub-sampled into 48 SVEs for statistical analysis and toughness modeling using the Rice-Tracey and Garrison-Moody models. A two-parameter Weibull analysis was performed, and the variability in the toughness data agreed well with the bulk toughness measurements of Ruggieri et al. The Tri-Beam system combines the benefits of laser-based material removal (speed, low damage, automation) with detectors that collect chemical, structural, and topological information. Multi-modal sectioning information was collected after many laser scanning passes, demonstrating the capability of the Tri-Beam system.
Stanley J. Zarnoch; Donald P. Feduccia; V. Clark Baldwin; Tommy R. Dell
1991-01-01
A growth and yield model has been developed for slash pine plantations on problem-free cutover sites in the west gulf region. The model was based on the moment-percentile method using the Weibull distribution for tree diameters. This technique was applied to unthinned and thinned stand projections and, subsequently, to the prediction of residual stands immediately...
Fluctuations in time intervals of financial data from the view point of the Gini index
NASA Astrophysics Data System (ADS)
Sazuka, Naoya; Inoue, Jun-ichi
2007-09-01
We propose an approach to explain fluctuations in time intervals of financial market data from the viewpoint of the Gini index. We show the explicit form of the Gini index for a Weibull distribution, a good candidate to describe the first-passage time of foreign exchange rates. The analytical expression of the Gini index compares well with the value obtained from empirical data.
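For a Weibull distribution with shape parameter k, the Gini index has the closed form G = 1 - 2^(-1/k); this is a standard result rather than an expression quoted from the paper. The snippet below checks it against a sample-based Gini estimate, with the shape value chosen arbitrarily.

```python
import numpy as np

rng = np.random.default_rng(42)

def gini_from_sample(x):
    """Sample Gini index computed from the sorted values (mean-difference form)."""
    x = np.sort(x)
    n = len(x)
    return np.sum((2 * np.arange(1, n + 1) - n - 1) * x) / (n * np.sum(x))

k = 1.5                              # arbitrary Weibull shape parameter
sample = rng.weibull(k, size=200_000)
print("closed form :", 1.0 - 2.0 ** (-1.0 / k))
print("Monte Carlo :", gini_from_sample(sample))
```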
Rescaled earthquake recurrence time statistics: application to microrepeaters
NASA Astrophysics Data System (ADS)
Goltz, Christian; Turcotte, Donald L.; Abaimov, Sergey G.; Nadeau, Robert M.; Uchida, Naoki; Matsuzawa, Toru
2009-01-01
Slip on major faults primarily occurs during 'characteristic' earthquakes. The recurrence statistics of characteristic earthquakes play an important role in seismic hazard assessment. A major problem in determining applicable statistics is the shortness of the sequences of characteristic earthquakes available worldwide. In this paper, we introduce a rescaling technique in which sequences can be superimposed to establish larger numbers of data points. We consider the Weibull and log-normal distributions; in both cases we rescale the data using means and standard deviations. We test our approach utilizing sequences of microrepeaters, micro-earthquakes which recur at the same location on a fault. It seems plausible to regard these earthquakes as a miniature version of the classic characteristic earthquakes. Microrepeaters are much more frequent than major earthquakes, leading to longer sequences for analysis. In this paper, we present results for the analysis of recurrence times for several microrepeater sequences from Parkfield, CA as well as NE Japan. We find that, once the respective sequence can be considered sufficiently stationary, the statistics can be well fitted by either a Weibull or a log-normal distribution. We clearly demonstrate this by our technique of rescaled combination. We conclude that the recurrence statistics of the microrepeater sequences we consider are similar to the recurrence statistics of characteristic earthquakes on major faults.
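The rescaled-combination idea can be illustrated with a short sketch: several synthetic recurrence-interval sequences are each normalized and then pooled before a single Weibull fit. The paper rescales using means and standard deviations; the simpler mean-normalization used below, and the synthetic sequences, are assumptions for illustration only.

```python
# Sketch: pool several short recurrence-interval sequences after rescaling each
# by its own mean, then fit a Weibull distribution to the combined sample.
# (The paper rescales with means and standard deviations; mean-normalization is
# used here only as a simple illustration, and the sequences are synthetic.)
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
true_shape = 1.8
sequences = [20.0 * rng.weibull(true_shape, size=n) for n in (12, 18, 25, 30)]

pooled = np.concatenate([seq / seq.mean() for seq in sequences])

shape, loc, scale = stats.weibull_min.fit(pooled, floc=0.0)
print(f"pooled n = {pooled.size}, fitted Weibull shape = {shape:.2f}, scale = {scale:.2f}")
```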
Rainfall continuous time stochastic simulation for a wet climate in the Cantabric Coast
NASA Astrophysics Data System (ADS)
Rebole, Juan P.; Lopez, Jose J.; Garcia-Guzman, Adela
2010-05-01
Rain is the result of a series of complex atmospheric processes influenced by numerous factors. This complexity makes its simulation from a physical basis practically unfeasible and motivates the use of stochastic schemes. These schemes, which are based on observed characteristics (Todorovic and Woolhiser, 1975), allow the introduction of alternating renewal processes that account for the occurrence of rainfall over different time lapses (Markov chains are a particular case, in which the lapses can be described by exponential distributions). Thus, a sequential rainfall process can be defined as a temporal series in which rainfall events (periods in which rainfall is recorded) alternate with non-rain events (periods in which no rainfall is recorded). The variables of a temporal rain sequence (duration of the rainfall event, duration of the non-rainfall event, average intensity of the rain in the rain event, and the temporal distribution of the amount of rain within the rain event) have been characterized in a wet climate such as that of the coastal area of Guipúzcoa. The study was performed using two series recorded at the meteorological stations of Igueldo-San Sebastián and Fuenterrabia / Airport (data every ten minutes and their hourly aggregation). As a result of this work, the variables were satisfactorily fitted by the following distribution functions: the duration of the rain event by an exponential function; the duration of the dry event by a truncated mixed exponential distribution; the average intensity by a Weibull distribution; and the distribution of the rain fallen by the Beta distribution. The characterization was made for an hourly aggregation of the recorded ten-minute interval. The parameters of the fitted functions were better estimated by the maximum likelihood method than by the method of moments. The parameters obtained from the characterization were used to develop a stochastic rainfall simulation model by means of a three-state Markov chain (Hutchinson, 1990), implemented on an hourly basis by García-Guzmán (1993) and Castro et al. (1997, 2005). Simulation results were valid in the hourly case for all four described variables, with a slightly better response in Fuenterrabia than in Igueldo. In summary, all the variables were better simulated in Fuenterrabia than in Igueldo. The Fuenterrabia data series is shorter and has longer sequences without missing data than the Igueldo series, which shows a higher number of missing-data events, although their mean duration is longer in Fuenterrabia.
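A minimal sketch of the distribution-fitting step, assuming synthetic event-average intensities in place of the Guipúzcoa records: the Weibull parameters are estimated both by maximum likelihood and by the method of moments, mirroring the comparison mentioned above.

```python
# Sketch: fitting a Weibull distribution to event-average rainfall intensities
# by maximum likelihood and by the method of moments (illustrative data only).
import numpy as np
from scipy import stats
from scipy.optimize import brentq
from scipy.special import gamma

rng = np.random.default_rng(2)
intensity = 3.0 * rng.weibull(1.3, size=400)          # mm/h, synthetic "observations"

# Maximum likelihood (location fixed at zero)
k_mle, _, lam_mle = stats.weibull_min.fit(intensity, floc=0.0)

# Method of moments: solve CV^2 = Gamma(1+2/k)/Gamma(1+1/k)^2 - 1 for the shape k
cv2 = intensity.var() / intensity.mean() ** 2
f = lambda k: gamma(1 + 2.0 / k) / gamma(1 + 1.0 / k) ** 2 - 1.0 - cv2
k_mom = brentq(f, 0.2, 20.0)
lam_mom = intensity.mean() / gamma(1 + 1.0 / k_mom)

print(f"MLE:     shape={k_mle:.3f}  scale={lam_mle:.3f}")
print(f"Moments: shape={k_mom:.3f}  scale={lam_mom:.3f}")
```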
Two-machine flow shop scheduling integrated with preventive maintenance planning
NASA Astrophysics Data System (ADS)
Wang, Shijin; Liu, Ming
2016-02-01
This paper investigates an integrated optimisation problem of production scheduling and preventive maintenance (PM) in a two-machine flow shop in which the time to failure of each machine follows a Weibull probability distribution. The objective is to find the optimal job sequence and the optimal PM decisions before each job such that the expected makespan is minimised. To investigate the value of the integrated scheduling solution, computational experiments on small-scale problems with different configurations are conducted with a total enumeration method, and the results are compared with those of scheduling without maintenance but with machine degradation, and of individual job scheduling combined with independent PM planning. Then, for large-scale problems, four genetic algorithm (GA) based heuristics are proposed. The numerical results for several large problem sizes and different configurations indicate the potential benefits of the integrated scheduling solution and also show that the proposed GA-based heuristics are efficient for the integrated problem.
Choice of time-scale in Cox's model analysis of epidemiologic cohort data: a simulation study.
Thiébaut, Anne C M; Bénichou, Jacques
2004-12-30
Cox's regression model is widely used for assessing associations between potential risk factors and disease occurrence in epidemiologic cohort studies. Although age is often a strong determinant of disease risk, authors have frequently used time-on-study instead of age as the time-scale, as in clinical trials. Unless the baseline hazard is an exponential function of age, this approach can yield estimates of relative hazards that differ from those obtained with age as the time-scale, even when age is adjusted for. We performed a simulation study to investigate the existence and magnitude of bias for different degrees of association between age and the covariate of interest. Age to disease onset was generated from exponential, Weibull or piecewise Weibull distributions, and both fixed and time-dependent dichotomous covariates were considered. We observed no bias when using age as the time-scale. When using time-on-study, we verified the absence of bias for exponentially distributed age to disease onset. For non-exponential distributions, we found that bias could occur even when the covariate of interest was independent of age. It could be severe in cases of substantial association with age, especially with time-dependent covariates. These findings were illustrated on data from a cohort of 84,329 French women followed prospectively for breast cancer occurrence. In view of our results, we strongly recommend not using time-on-study as the time-scale for analysing epidemiologic cohort data. 2004 John Wiley & Sons, Ltd.
Development of a Fault Monitoring Technique for Wind Turbines Using a Hidden Markov Model.
Shin, Sung-Hwan; Kim, SangRyul; Seo, Yun-Ho
2018-06-02
Regular inspection for the maintenance of wind turbines is difficult because of their remote locations. For this reason, condition monitoring systems (CMSs) are typically installed to monitor their health condition. The purpose of this study is to propose a fault detection algorithm for the mechanical parts of the wind turbine. To this end, long-term vibration data were collected over two years by a CMS installed on a 3 MW wind turbine. The vibration distribution at a specific rotating speed of the main shaft is approximated by the Weibull distribution, and its cumulative distribution function is used to determine the threshold levels that indicate impending failure of mechanical parts. A hidden Markov model (HMM) is employed to build a statistical fault detection algorithm in the time domain, and a method for extracting the input sequence for the HMM is also introduced, based on the threshold levels and the correlation between the signals. Finally, it was demonstrated that the proposed HMM algorithm achieved a greater than 95% detection success rate on the long-term signals.
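A hedged sketch of the thresholding idea: a Weibull distribution is fitted to vibration amplitudes recorded under healthy operation, threshold levels are taken from its cumulative distribution function, and readings are discretized into symbols that could serve as HMM observations. The probabilities (0.95, 0.99), the data, and the three-symbol mapping are illustrative assumptions, not the paper's settings.

```python
# Sketch: Weibull-CDF-based alarm thresholds for a vibration feature and a simple
# discretization that could feed an HMM. The probabilities (0.95, 0.99) and the
# data are illustrative assumptions, not the settings used in the paper.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
healthy_vib = 0.8 * rng.weibull(2.2, size=5000)       # vibration RMS at one shaft speed

shape, loc, scale = stats.weibull_min.fit(healthy_vib, floc=0.0)
warn_level = stats.weibull_min.ppf(0.95, shape, loc, scale)
alarm_level = stats.weibull_min.ppf(0.99, shape, loc, scale)

def symbol(x):
    """Map a vibration reading to a discrete observation symbol for an HMM."""
    return 0 if x < warn_level else (1 if x < alarm_level else 2)

new_readings = np.array([0.4, 1.1, 1.6, 2.3])
print("thresholds:", round(warn_level, 3), round(alarm_level, 3))
print("symbols:", [symbol(x) for x in new_readings])
```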
Mechanical properties of silicate glasses exposed to a low-Earth orbit
NASA Technical Reports Server (NTRS)
Wiedlocher, David E.; Tucker, Dennis S.; Nichols, Ron; Kinser, Donald L.
1992-01-01
The effects of a 5.8-year exposure to the low-Earth-orbit environment on the mechanical properties of commercial optical fused silica, low-iron soda-lime-silica, Pyrex 7740, Vycor 7913, BK-7, and the glass ceramic Zerodur were examined. Mechanical testing employed the ASTM F-394 piston-on-3-ball method in a liquid nitrogen environment. Samples were exposed on the Long Duration Exposure Facility (LDEF) in two locations. Impacts were observed on all specimens except Vycor. A Weibull analysis as well as a standard statistical evaluation were conducted. The Weibull analysis revealed no differences between the control samples and the two exposed sample sets. We thus concluded that the radiation components of the Earth-orbital environment did not degrade the mechanical strength of the samples examined, within the limits of experimental error. The upper bound of strength degradation for meteorite-impacted samples, based upon statistical analysis and observation, was 50 percent.
Characterization of intermittency in renewal processes: Application to earthquakes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Akimoto, Takuma; Hasumi, Tomohiro; Aizawa, Yoji
2010-03-15
We construct a one-dimensional piecewise linear intermittent map from the interevent time distribution for a given renewal process. Then, we characterize intermittency by the asymptotic behavior near the indifferent fixed point in the piecewise linear intermittent map. Thus, we provide a framework for a unified characterization of intermittency and also present the Lyapunov exponent for renewal processes. This method is applied to the occurrence of earthquakes using the Japan Meteorological Agency and the National Earthquake Information Center catalogs. By analyzing the return map of interevent times, we find that interevent times are not independent and identically distributed random variables, but that the conditional probability distribution functions in the tail obey the Weibull distribution.
Bridge element deterioration rates.
DOT National Transportation Integrated Search
2008-10-01
This report describes the development of bridge element deterioration rates from the NYSDOT bridge inspection database using Markov chains and Weibull-based approaches. It is observed that the Weibull-based approach is more reliable for developing b...
Transformation and Self-Similarity Properties of Gamma and Weibull Fragment Size Distributions
2015-12-01
[Only fragments of this record's text are recoverable: Gamma size distributions are noted to describe, for example, the "spray formed when a fast gas stream blows over a liquid volume" (cf. P.-K. Wu, G. A. Ruff, and G. M. Faeth, Primary Breakup in Liquid-Gas Mixing Layers, Atomization and Sprays 1, 421-440); the remainder of the extracted text consists of reference fragments and a unit-conversion table.]
Joshi, Gaurav V; Duan, Yuanyuan; Della Bona, Alvaro; Hill, Thomas J; St John, Kenneth; Griggs, Jason A
2014-08-01
The objective of this study was to test the following hypotheses: (1) both cyclic degradation and stress-corrosion mechanisms result in subcritical crack growth (SCG) in a fluorapatite glass-ceramic (IPS e.max ZirPress, Ivoclar-Vivadent) and (2) there is an interactive effect of stress corrosion and cyclic fatigue that accelerates subcritical crack growth. Rectangular beam specimens were fabricated using the lost-wax process. Two groups of specimens (N=30/group) with polished (15 μm) or air-abraded surfaces were tested under rapid monotonic loading. Additional polished specimens were subjected to cyclic loading at two frequencies, 2 Hz (N=44) and 10 Hz (N=36), and at various stress amplitudes. All tests were performed using a fully articulated four-point flexure fixture in deionized water at 37°C. The SCG parameters were determined using the ratio of the inert strength Weibull modulus to the lifetime Weibull modulus. A general log-linear model was fit to the fatigue lifetime data, including time to failure, frequency, peak stress, and the product of frequency and logarithm of stress, in ALTA PRO software. The SCG parameters determined were n=21.7 and A=4.99×10⁻⁵ for 2 Hz, and n=19.1 and A=7.39×10⁻⁶ for 10 Hz. After fitting the general log-linear model to the cyclic fatigue data, the coefficients of the frequency term (α1), the stress term (α2), and the interaction term (α3) had estimates and 95% confidence intervals of α1=-3.16 (-15.1, 6.30), α2=-21.2 (-34.9, -9.73), and α3=0.820 (-1.59, 4.02). Only α2 was significantly different from zero. In conclusion, (1) cyclic fatigue does not have a significant effect on SCG in the fluorapatite glass-ceramic evaluated, and (2) there was no interactive effect between cyclic degradation and stress corrosion for this material. Copyright © 2014 Academy of Dental Materials. Published by Elsevier Ltd. All rights reserved.
Cordell, Jacqueline M; Vogl, Michelle L; Wagoner Johnson, Amy J
2009-10-01
While recognized as a promising bone substitute material, hydroxyapatite (HA) has had limited use in clinical settings because of its inherent brittle behavior. It is well established that macropores (approximately 100 μm) in a HA implant, or scaffold, are required for bone ingrowth, but recent research has shown that ingrowth is enhanced when scaffolds also contain microporosity. HA is sensitive to synthesis and processing parameters and therefore characterization for specific applications is necessary for transition to the clinic. To that end, the mechanical behavior of bulk microporous HA and HA scaffolds with multi-scale porosity (macropores between rods in the range of 250-350 μm and micropores within the rods with average size of either 5.96 μm or 16.2 μm) was investigated in order to determine how strength and reliability were affected by micropore size (5.96 μm versus 16.2 μm). For the bulk microporous HA, strength increased with decreasing micropore size in both bending (19 MPa to 22 MPa) and compression (71 MPa to 110 MPa). To determine strength reliability, the Weibull moduli for the bulk microporous HA were determined. The Weibull moduli for bending increased (became more reliable) with decreasing pore size (7 to 10) while the Weibull moduli for compression decreased (became less reliable) with decreasing pore size (9 to 6). Furthermore, the elastic properties of the bulk microporous HA (elastic modulus of 30 GPa) and the compressive strengths of the HA scaffolds with multi-scale porosity (8 MPa) did not vary with pore size. The mechanisms responsible for the trends observed were discussed.
Improved silicon nitride for advanced heat engines
NASA Technical Reports Server (NTRS)
Yeh, Hun C.; Fang, Ho T.
1987-01-01
The technology base required to fabricate silicon nitride components with the strength, reliability, and reproducibility necessary for actual heat engine applications is presented. Task 2 was set up to develop test bars with a high Weibull slope and greater high-temperature strength, and to conduct an initial net-shape component fabrication evaluation. Screening experiments were performed in Task 7 on advanced materials and processing for input to Task 2. The technical efforts performed in the second year of a 5-yr program are covered. The first iteration of Task 2 was completed as planned. Two half-replicated, fractional factorial (2⁵), statistically designed matrix experiments were conducted. These experiments identified Denka 9FW Si3N4 as an alternate raw material to GTE SN502 Si3N4 for subsequent process evaluation. A detailed statistical analysis was conducted to correlate processing conditions with as-processed test bar properties. One processing condition produced a material with a 97 ksi average room-temperature MOR (100 percent of goal) and a Weibull slope of 13.2 (83 percent of goal); another condition produced 86 ksi (6 percent over baseline) room-temperature strength with a Weibull slope of 20 (125 percent of goal).
SPOTting model parameters using a ready-made Python package
NASA Astrophysics Data System (ADS)
Houska, Tobias; Kraft, Philipp; Breuer, Lutz
2015-04-01
The selection and parameterization of reliable process descriptions in ecological modelling is subject to several uncertainties. The procedure is highly dependent on various criteria, such as the algorithm used, the likelihood function selected and the definition of the prior parameter distributions. A wide variety of tools have been developed in the past decades to optimize parameters. Some of these tools are closed source. Because of this, the choice of a specific parameter estimation method is sometimes driven more by its availability than by its performance. A toolbox with a large set of methods can support users in deciding on the most suitable method, and it also enables testing and comparing different methods. We developed SPOT (Statistical Parameter Optimization Tool), an open-source Python package containing a comprehensive set of modules to analyze and optimize parameters of (environmental) models. SPOT comes with a selected set of algorithms for parameter optimization and uncertainty analysis (Monte Carlo, MC; Latin Hypercube Sampling, LHS; Maximum Likelihood Estimation, MLE; Markov Chain Monte Carlo, MCMC; Shuffled Complex Evolution, SCE-UA; Differential Evolution Markov Chain, DE-MCZ), together with several likelihood functions (bias, (log-) Nash-Sutcliffe model efficiency, correlation coefficient, coefficient of determination, covariance, (decomposed, relative, root) mean squared error, mean absolute error, agreement index) and prior distributions (Binomial, Chi-Square, Dirichlet, Exponential, Laplace, (log-, multivariate-) Normal, Pareto, Poisson, Cauchy, Uniform, Weibull) to sample from. Its model-independent structure makes it suitable for analyzing a wide range of applications. We apply all algorithms of the SPOT package in three case studies. First, we investigate the response of the Rosenbrock function, where the MLE algorithm shows its strengths. Second, we study the Griewank function, which has a challenging response surface for optimization methods. Here we see simple algorithms like MCMC struggling to find the global optimum of the function, while algorithms like SCE-UA and DE-MCZ show their strengths. Third, we perform an uncertainty analysis of a one-dimensional physically based hydrological model built with the Catchment Modelling Framework (CMF). The model is driven by meteorological and groundwater data from a Free Air Carbon Enrichment (FACE) experiment in Linden (Hesse, Germany). Simulation results are evaluated against measured soil moisture data. We search for optimal parameter sets of the van Genuchten-Mualem function and find different, equally optimal solutions with some of the algorithms. The case studies show that the implemented SPOT methods work sufficiently well and demonstrate the benefit of having a single platform-independent package that includes a number of parameter search methods, likelihood functions and a priori parameter distributions.
A nonlinear model of gold production in Malaysia
NASA Astrophysics Data System (ADS)
Ramli, Norashikin; Muda, Nora; Umor, Mohd Rozi
2014-06-01
Malaysia is a country rich in natural resources, and gold is one of them. Gold has become an important national commodity. This study was conducted to determine a model that fits the gold production in Malaysia from 1995 to 2010 well. Five nonlinear models are considered: the Logistic, Gompertz, Richards, Weibull and Chapman-Richards models. These models are used to fit the cumulative gold production in Malaysia, and the best model is then selected based on model performance. The performance of the fitted models is measured by the sum of squared errors, root mean squared error, coefficient of determination, mean relative error, mean absolute error and mean absolute percentage error. This study found that the Weibull model significantly outperforms the other models. To confirm that the Weibull model is the best, the latest data were fitted to the model; once again, the Weibull model gave the lowest values for all error measures. We conclude that future gold production in Malaysia can be predicted using the Weibull model, and this could be an important finding for Malaysia in planning its economic activities.
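A sketch of how such a Weibull growth curve might be fitted with a general-purpose optimizer; the yearly cumulative figures are synthetic placeholders, and the parameterization y(t) = A(1 − exp(−(t/b)^c)) is one common form that may differ from the paper's.

```python
# Sketch: fitting a Weibull-type growth curve to cumulative production data with
# scipy. The yearly figures below are synthetic placeholders, and the exact
# parameterization used in the paper may differ from the one assumed here.
import numpy as np
from scipy.optimize import curve_fit

years = np.arange(1995, 2011)
t = years - years[0] + 1
cum_prod = np.array([ 3.,  7., 12., 18., 25., 31., 38., 44.,
                     50., 55., 60., 64., 67., 70., 72., 74.])   # arbitrary units

def weibull_growth(t, A, b, c):
    return A * (1.0 - np.exp(-(t / b) ** c))

popt, _ = curve_fit(weibull_growth, t, cum_prod, p0=(80.0, 8.0, 1.5), maxfev=10000)
pred = weibull_growth(t, *popt)
rmse = np.sqrt(np.mean((cum_prod - pred) ** 2))
print("A, b, c =", np.round(popt, 3), " RMSE =", round(rmse, 3))
# Extrapolation to a later year, e.g. 2015 (t = 21):
print("predicted cumulative production in 2015:", round(weibull_growth(21, *popt), 2))
```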
Wei, Shaoceng; Kryscio, Richard J.
2015-01-01
Continuous-time multi-state stochastic processes are useful for modeling the flow of subjects from intact cognition to dementia, with mild cognitive impairment and global impairment as intervening transient cognitive states and death as a competing risk. Each subject's cognition is assessed periodically, resulting in interval censoring for the cognitive states, while death without dementia is not interval censored. Since back transitions among the transient states are possible, Markov chains are often applied to this type of panel data. In this manuscript we apply a semi-Markov process in which we assume that the waiting times are Weibull distributed, except for transitions from the baseline state, which are exponentially distributed, and in which we assume that no additional changes in cognition occur between two assessments. We implement a quasi-Monte Carlo (QMC) method to calculate the higher-order integration needed for likelihood estimation. We apply our model to a real dataset, the Nun Study, a cohort of 461 participants. PMID:24821001
Wei, Shaoceng; Kryscio, Richard J
2016-12-01
Continuous-time multi-state stochastic processes are useful for modeling the flow of subjects from intact cognition to dementia with mild cognitive impairment and global impairment as intervening transient cognitive states and death as a competing risk. Each subject's cognition is assessed periodically resulting in interval censoring for the cognitive states while death without dementia is not interval censored. Since back transitions among the transient states are possible, Markov chains are often applied to this type of panel data. In this manuscript, we apply a semi-Markov process in which we assume that the waiting times are Weibull distributed except for transitions from the baseline state, which are exponentially distributed and in which we assume no additional changes in cognition occur between two assessments. We implement a quasi-Monte Carlo (QMC) method to calculate the higher order integration needed for likelihood estimation. We apply our model to a real dataset, the Nun Study, a cohort of 461 participants. © The Author(s) 2014.
Bozkurt, Hayriye; D'Souza, Doris H; Davidson, P Michael
2014-09-01
Human noroviruses and hepatitis A virus (HAV) are considered epidemiologically significant causes of foodborne disease. Therefore, studies are needed to bridge existing data gaps and determine appropriate parameters for thermal inactivation of human noroviruses and HAV. The objectives of this research were to compare the thermal inactivation kinetics of human norovirus surrogates (murine norovirus MNV-1 and feline calicivirus FCV-F9) and HAV in buffered medium (2-ml vials), to compare first-order and Weibull models for describing the data, to calculate the Arrhenius activation energy for each model, and to evaluate model efficiency using selected statistical criteria. The D-values calculated from the first-order model (50-72 °C) ranged from 0.21-19.75 min for FCV-F9, 0.25-36.28 min for MNV-1, and 0.88-56.22 min for HAV. Using the Weibull model, the tD = 1 (time to destroy 1 log) for FCV-F9, MNV-1 and HAV at the same temperatures ranged from 0.10-13.27, 0.09-26.78, and 1.03-39.91 min, respectively. The z-values for FCV-F9, MNV-1, and HAV were 9.66 °C, 9.16 °C, and 14.50 °C, respectively, using the Weibull model. For the first-order model, the z-values were 9.36 °C, 9.32 °C, and 12.49 °C for FCV-F9, MNV-1, and HAV, respectively. For the Weibull model, the estimated activation energies for FCV-F9, MNV-1, and HAV were 225, 278, and 182 kJ/mol, respectively, while the calculated activation energies for the first-order model were 195, 202, and 171 kJ/mol, respectively. Knowledge of the thermal inactivation kinetics of norovirus surrogates and HAV will allow the development of processes that produce safer food products and improve consumer safety. Copyright © 2014. Published by Elsevier Ltd.
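A sketch of how D-values and Weibull tD = 1 values of this kind are obtained from survivor curves at a single temperature; the survivor data below are synthetic, not the study's measurements.

```python
# Sketch: fitting first-order and Weibull survival models to thermal inactivation
# data at a single temperature. The survivor counts are synthetic; the study's
# actual data and fitted parameters are reported in the abstract above.
import numpy as np
from scipy.optimize import curve_fit

t = np.array([0.5, 1., 2., 3., 4., 6.])                 # heating time (min)
log_surv = np.array([-0.9, -1.5, -2.4, -3.1, -3.7, -4.8])  # log10(N/N0)

first_order = lambda t, D: -t / D                       # log-linear model
weibull = lambda t, delta, p: -(t / delta) ** p         # Weibull (Mafart) model

(D,), _ = curve_fit(first_order, t, log_surv, p0=(1.0,))
(delta, p), _ = curve_fit(weibull, t, log_surv, p0=(1.0, 1.0))

t6_first = 6.0 * D                                      # time for a 6-log reduction
t6_weib = delta * 6.0 ** (1.0 / p)
print(f"first-order: D = {D:.2f} min, 6-log time = {t6_first:.2f} min")
print(f"Weibull: tD=1 (delta) = {delta:.2f} min, p = {p:.2f}, 6-log time = {t6_weib:.2f} min")
```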
Microstructure characterization and SCG of newly engineered dental ceramics.
Ramos, Nathália de Carvalho; Campos, Tiago Moreira Bastos; Paz, Igor Siqueira de La; Machado, João Paulo Barros; Bottino, Marco Antonio; Cesar, Paulo Francisco; Melo, Renata Marques de
2016-07-01
The aim of this study was to characterize the microstructure of four dental CAD-CAM ceramics and evaluate their susceptibility to stress corrosion. SEM and EDS were performed for microstructural characterization. XRD and FTIR were used to evaluate the crystallization pattern and molecular composition of the ceramics, respectively. Elastic modulus, Poisson's ratio, density and fracture toughness were also measured. The specimens were subjected to biaxial flexure under five stress rates (0.006, 0.06, 0.6, 6 and 60 MPa/s) to determine the subcritical crack growth parameters (n and D). Twenty-five specimens were further tested in mineral oil for determination of the Weibull parameters. Two hundred forty ceramic discs (12 mm diameter and 1.2 mm thick) were made from four ceramics: feldspathic ceramic - FEL (Vita Mark II, Vita Zahnfabrik), polymer-infiltrated ceramic - PIC (Vita Enamic, Vita Zahnfabrik), lithium disilicate - LD (IPS e.max CAD, Ivoclar Vivadent) and zirconia-reinforced lithium silicate - ZLS (Vita Suprinity, Vita Zahnfabrik). The PIC discs presented organic and inorganic phases (n=29.1±7.7) and a Weibull modulus (m) of 8.96. The FEL discs showed n=36.6±6.8 and m=8.02. The LD discs showed a structure with needle-like disilicate grains in a glassy matrix and had the lowest value of n (8.4±0.8), with m=6.19. The ZLS discs showed similar rod-like grains, n=11.2±1.4 and m=9.98. The FEL and PIC discs showed the lowest susceptibility to slow crack growth (SCG), whereas the LD and ZLS discs presented the highest. PIC presented the lowest elastic modulus and no crystals in its composition, while ZLS presented tetragonal zirconia. The overall strength and SCG of the new materials did not benefit from the additional phases or microconstituents present in them. Copyright © 2016 The Academy of Dental Materials. Published by Elsevier Ltd. All rights reserved.
Time-dependent fiber bundles with local load sharing. II. General Weibull fibers
NASA Astrophysics Data System (ADS)
Phoenix, S. Leigh; Newman, William I.
2009-12-01
Fiber bundle models (FBMs) are useful tools in understanding failure processes in a variety of material systems. While the fibers and load sharing assumptions are easily described, FBM analysis is typically difficult. Monte Carlo methods are also hampered by the severe computational demands of large bundle sizes, which overwhelm just as behavior relevant to real materials starts to emerge. For large size scales, interest continues in idealized FBMs that assume either equal load sharing (ELS) or local load sharing (LLS) among fibers, rules that reflect features of real load redistribution in elastic lattices. The present work focuses on a one-dimensional bundle of N fibers under LLS where life consumption in a fiber follows a power law in its load, with exponent ρ, and integrated over time. This life consumption function is further embodied in a functional form resulting in a Weibull distribution for lifetime under constant fiber stress and with Weibull exponent, β. Thus the failure rate of a fiber depends on its past load history, except for β=1. We develop asymptotic results validated by Monte Carlo simulation using a computational algorithm developed in our previous work [Phys. Rev. E 63, 021507 (2001)] that greatly increases the size, N, of treatable bundles (e.g., 10⁶ fibers in 10³ realizations). In particular, our algorithm is O(N ln N) in contrast with former algorithms which were O(N²), making this investigation possible. Regimes are found for (β,ρ) pairs that yield contrasting behavior for large N. For ρ>1 and large N, brittle weakest volume behavior emerges in terms of characteristic elements (groupings of fibers) derived from critical cluster formation, and the lifetime eventually goes to zero as N→∞, unlike ELS, which yields a finite limiting mean. For 1/2≤ρ≤1, however, LLS has remarkably similar behavior to ELS (appearing to be virtually identical for ρ=1) with an asymptotic Gaussian lifetime distribution and a finite limiting mean for large N. The coefficient of variation follows a power law in increasing N but, except for ρ=1, the value of the negative exponent is clearly less than 1/2, unlike in ELS bundles where the exponent remains 1/2 for 1/2<ρ≤1. For sufficiently small values 0<ρ≪1, a transition occurs, depending on β, whereby LLS bundle lifetimes become dominated by a few long-lived fibers. Thus the bundle lifetime appears to approximately follow an extreme-value distribution for the longest lived of a parallel group of independent elements, which applies exactly to ρ=0. The lower the value of β, the higher the transition value of ρ, below which such extreme-value behavior occurs. No evidence was found for limiting Gaussian behavior for ρ>1 but with 0<β(ρ+1)<1, as might be conjectured from quasistatic bundle models where β(ρ+1) mimics the Weibull exponent for fiber strength.
NASA Astrophysics Data System (ADS)
Chen, Fan; Huang, Shaoxiong; Ding, Jinjin; Ding, Jinjin; Gao, Bo; Xie, Yuguang; Wang, Xiaoming
2018-01-01
This paper proposes a fast reliability assessment method for a distribution grid with distributed renewable energy generation. First, the Weibull distribution and the Beta distribution are used to describe the probability distributions of wind speed and solar irradiance, respectively, and models of the wind farm, solar park and local load are built for reliability assessment. Then, based on probabilistic production cost simulation with probability discretization and linearized power flow, an optimal power flow problem minimizing the cost of conventional generation is solved. In this way, a reliability assessment of the distribution grid is carried out quickly and accurately. The Loss Of Load Probability (LOLP) and Expected Energy Not Supplied (EENS) are selected as the reliability indices. A MATLAB simulation of the IEEE RBTS BUS6 system indicates that the proposed method calculates the reliability indices much faster than the Monte Carlo method while maintaining accuracy.
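As a point of comparison for such analytical methods, a crude Monte Carlo estimate of LOLP and EENS can be written in a few lines; every rating, distribution parameter, and the load model below are illustrative assumptions, not the paper's system data.

```python
# Sketch: a crude Monte Carlo estimate of LOLP and EENS for a single bus with
# Weibull-distributed wind speed and Beta-distributed solar irradiance.
# All ratings, distribution parameters and the load model are illustrative.
import numpy as np

rng = np.random.default_rng(4)
N = 200_000

v = 8.0 * rng.weibull(2.0, N)                    # wind speed (m/s), Weibull shape 2
s = rng.beta(2.0, 2.5, N)                        # normalized solar irradiance in [0, 1]

def wind_power(v, rated=2.0, v_in=3.0, v_rated=12.0, v_out=25.0):
    # Simple piecewise-linear power curve, output in MW
    p = np.clip((v - v_in) / (v_rated - v_in), 0.0, 1.0) * rated
    p[(v < v_in) | (v > v_out)] = 0.0
    return p

p_wind = wind_power(v)
p_pv = 1.0 * s                                   # 1 MW peak PV plant
p_conv = 3.0                                     # MW of conventional capacity
load = rng.normal(4.5, 0.5, N)                   # MW load

shortfall = np.maximum(load - (p_wind + p_pv + p_conv), 0.0)
lolp = np.mean(shortfall > 0)
eens = shortfall.mean() * 8760.0                 # MWh/year if each draw is one hour
print(f"LOLP = {lolp:.4f}, EENS = {eens:.1f} MWh/yr")
```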
The probability distribution model of air pollution index and its dominants in Kuala Lumpur
NASA Astrophysics Data System (ADS)
AL-Dhurafi, Nasr Ahmed; Razali, Ahmad Mahir; Masseran, Nurulkamal; Zamzuri, Zamira Hasanah
2016-11-01
This paper focuses on the statistical modeling of the distributions of the air pollution index (API) and its sub-index data observed at Kuala Lumpur in Malaysia. Five pollutants or sub-indexes are measured, including carbon monoxide (CO), sulphur dioxide (SO2), nitrogen dioxide (NO2), and particulate matter (PM10). Four probability distributions are considered, namely the log-normal, exponential, Gamma and Weibull distributions, in the search for the best-fitting distribution for the Malaysian air pollutant data. In order to determine the best distribution for describing the air pollutant data, five goodness-of-fit criteria are applied. This helps minimize the uncertainty in pollution resource estimates and improve the assessment phase of planning. The conflict in criterion results when selecting the best distribution was overcome by using the weight-of-ranks method. We found that the Gamma distribution is the best distribution for the majority of the air pollutant data in Kuala Lumpur.
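A simplified sketch of the selection procedure: the four candidate distributions are fitted to a synthetic pollutant series and ranked by two goodness-of-fit criteria whose ranks are then combined. The paper itself uses five criteria and a weighted-rank scheme; the data and the equal-weight rank sum below are assumptions.

```python
# Sketch: fitting the four candidate distributions to an air-pollutant series and
# ranking them by two goodness-of-fit criteria, then combining the ranks.
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
api = rng.gamma(shape=3.0, scale=18.0, size=1000)       # synthetic pollutant index

candidates = {
    "lognormal": stats.lognorm,
    "exponential": stats.expon,
    "gamma": stats.gamma,
    "weibull": stats.weibull_min,
}

results = []
for name, dist in candidates.items():
    params = dist.fit(api, floc=0.0)                    # fix location at zero
    ll = np.sum(dist.logpdf(api, *params))
    aic = 2 * len(params) - 2 * ll
    ks = stats.kstest(api, dist.cdf, args=params).statistic
    results.append((name, aic, ks))

# Combine the two criteria by summing their ranks (smaller is better for both)
aic_rank = {n: r for r, (n, _, _) in enumerate(sorted(results, key=lambda x: x[1]))}
ks_rank = {n: r for r, (n, _, _) in enumerate(sorted(results, key=lambda x: x[2]))}
for name, aic, ks in sorted(results, key=lambda x: aic_rank[x[0]] + ks_rank[x[0]]):
    print(f"{name:12s} AIC={aic:9.1f}  KS={ks:.4f}  rank sum={aic_rank[name] + ks_rank[name]}")
```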
NASA Technical Reports Server (NTRS)
Phoenix, S. Leigh; Kezirian, Michael T.; Murthy, Pappu L. N.
2009-01-01
Composite Overwrapped Pressure Vessels (COPVs) that have survived a long service time under pressure generally must be recertified before service is extended. Flight certification depends on the reliability analysis to quantify the risk of stress-rupture failure in existing flight vessels. Full certification of this reliability model would require a statistically significant number of lifetime tests to be performed and is impractical given the cost and the limited flight hardware available for certification testing. One approach to confirming the reliability model is to perform a stress-rupture test on a flight COPV. Currently, testing of such a Kevlar 49 (DuPont)/epoxy COPV is nearing completion. The present paper focuses on a Bayesian statistical approach to analyze the possible failure-time results of this test and to assess the implications in choosing between possible model parameter values that in the past have had significant uncertainty. The key uncertain parameters in this case are the actual fiber stress ratio at operating pressure and the Weibull shape parameter for lifetime; the former has been uncertain due to ambiguities in interpreting the original and a duplicate burst test, and the latter due to major differences between the COPVs in the database and the actual COPVs in service. Any information obtained that clarifies and eliminates uncertainty in these parameters will have a major effect on the predicted reliability of the service COPVs going forward. The key result is that the longer the vessel survives, the more likely it is that the more optimistic stress-ratio model is correct. At the time of writing, the resulting effect on predicted future reliability is dramatic, increasing it by about one "nine," that is, reducing the predicted probability of failure by an order of magnitude. However, testing one vessel does not change the uncertainty in the Weibull shape parameter for lifetime, since testing several vessels would be necessary.
Modeling infection of spring onion by Puccinia allii in response to temperature and leaf wetness.
Furuya, Hiromitsu; Takanashi, Hiroyuki; Fuji, Shin-Ichi; Nagai, Yoshio; Naito, Hideki
2009-08-01
The influence of temperature and leaf wetness duration on infection of spring onion (Japanese bunching onion) leaves by Puccinia allii was examined in controlled-environment experiments. Leaves of potted spring onion plants (Allium fistulosum cv. Yoshikura) were inoculated with urediniospores and exposed to 6.5, 10, 15, 22, or 27 h of wetness at 5, 10, 15, 20, or 25 degrees C. The lesions that developed increased in density with increasing wetness duration. Relative infection was modeled as a function of both temperature and wetness duration using a modified version of Weibull's cumulative distribution function (R² = 0.9369). Infection occurred between 6.5 and 27 h of leaf wetness duration at 10, 15, 20, and 25 degrees C and between 10 and 27 h at 5 degrees C, and increased rapidly between 6.5 and 15 h of wetness at 10, 15, and 20 degrees C. At 25 degrees C, few uredinia developed regardless of the wetness duration. Parameter H, one of eight parameters used in the equation, which controls the asymmetry of the response curve, varied markedly with temperature, so the model could be improved by representing H as a function of wetness duration (R² = 0.9501).
Rolling Bearing Life Prediction, Theory, and Application
NASA Technical Reports Server (NTRS)
Zaretsky, Erwin V.
2013-01-01
A tutorial is presented outlining the evolution, theory, and application of rolling-element bearing life prediction from that of A. Palmgren, 1924; W. Weibull, 1939; G. Lundberg and A. Palmgren, 1947 and 1952; E. Ioannides and T. Harris, 1985; and E. Zaretsky, 1987. Comparisons are made between these life models. The Ioannides-Harris model without a fatigue limit is identical to the Lundberg-Palmgren model. The Weibull model is similar to that of Zaretsky if the exponents are chosen to be identical. Both the load-life and Hertz stress-life relations of Weibull, Lundberg and Palmgren, and Ioannides and Harris reflect a strong dependence on the Weibull slope. The Zaretsky model decouples the dependence of the critical shear stress-life relation from the Weibull slope. This results in a nominal variation of the Hertz stress-life exponent. For 9th- and 8th-power Hertz stress-life exponents for ball and roller bearings, respectively, the Lundberg-Palmgren model best predicts life. However, for the 12th- and 10th-power relations reflected by modern bearing steels, the Zaretsky model based on the Weibull equation is superior. Under the range of stresses examined, the use of a fatigue limit would suggest that, for most operating conditions under which a rolling-element bearing will operate, the bearing will not fail from classical rolling-element fatigue. Realistically, this is not the case. The use of a fatigue limit will significantly overpredict life over a range of normal operating Hertz stresses. Since the predicted lives of rolling-element bearings are high, the problem can become one of undersizing a bearing for a particular application.
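The Weibull life relations underlying all of these models reduce, for a two-parameter distribution, to L(S) = L_β (ln 1/S)^(1/e), where S is the survival probability, L_β the characteristic life and e the Weibull slope. A short sketch follows with illustrative values of the slope, characteristic life, and load-life exponent; these are common textbook figures, not the paper's.

```python
# Sketch: basic two-parameter Weibull life calculations of the kind underlying
# the bearing-life models compared above. The Weibull slope, characteristic
# life and load-life exponent are illustrative values, not the paper's.
import numpy as np

e = 1.11            # Weibull slope often quoted for rolling-element fatigue data
L_beta = 50.0       # characteristic life (millions of revolutions), at S = 36.8 %

def life_at_survival(S, slope=e, char_life=L_beta):
    """Life at survival probability S for a two-parameter Weibull distribution."""
    return char_life * (np.log(1.0 / S)) ** (1.0 / slope)

L10 = life_at_survival(0.90)     # life at 90 % survival (the usual rating life)
L50 = life_at_survival(0.50)
print(f"L10 = {L10:.1f}, L50 = {L50:.1f}, L50/L10 = {L50 / L10:.2f}")

# Load-life relation L ~ (C/P)^p, with p = 3 commonly used for ball bearings:
p = 3.0
for ratio in (1.0, 1.5, 2.0):
    print(f"C/P = {ratio:.1f} -> relative L10 = {ratio ** p:.2f}")
```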
NASA Astrophysics Data System (ADS)
Zaidman, Paula C.; Morsan, Enrique
2018-05-01
In the development of management measures for sustainable fisheries, estimating the natural mortality rate and recruitment is fundamental. In northern Patagonia, Argentina, the southern geoduck, Panopea abbreviata, a long-lived clam that forms spatially disjunct subpopulations, supports an unregulated fishery. In this study, we estimate natural mortality. We studied the age structure of beds within the northern Patagonian gulfs, San Matías Gulf (SMG) and San José Gulf (SJG), and we estimated a time series of back-reconstructed recruitment to explore spatial coherence in relation to local oceanographic conditions and to elucidate its population dynamics. We constructed a cumulative frequency distribution of the age of collected dead shells and used the exponential and Weibull models to model mortality. Live geoducks were sampled from six populations between 2000 and 2006. Age-frequency distributions and mortality models were used to back-calculate the time series of recruitment for each population. The recruitment time series was analysed using the continuous wavelet transform. The natural mortality estimated by the exponential model was 0.054 year⁻¹, whereas the Weibull model yielded α = 0.00085 year⁻¹ and β = 2.1. For the latter, M values for cohorts were 0.01 at 10 years, 0.02 at 20 years, 0.04 at 30 years and 0.05 at 40 years. The Weibull model was observed to be the best fit to the data. The natural mortality rate of P. abbreviata estimated in this study was lower than that estimated in a previous work for populations from SMG. The back-calculated recruitment time series demonstrated considerable yearly variation, suggesting that local conditions have an important role in recruitment regulation. At a decadal temporal scale, a clear increasing recruitment trend was evident over the last 20 years in all populations. Populations in SMG were settled >60 years ago. In contrast, no individuals older than 30 years were observed in the populations from SJG. P. abbreviata has several characteristics, such as longevity and a low instantaneous natural mortality rate, which require attention in any resource planning. However, this species also has characteristics favourable for fishery development, as historical recruitment trends indicate that populations are expanding and are part of a widely distributed metapopulation, suggesting that sustainable exploitation is possible.
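A sketch of fitting both mortality models to the cumulative age distribution of dead shells; the ages are simulated and the Weibull hazard parameterization assumed here may differ from the one used in the study.

```python
# Sketch: comparing exponential and Weibull survivorship models fitted to the
# cumulative age distribution of dead shells. The ages are synthetic and the
# Weibull parameterization assumed here may differ from the study's.
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(6)
ages = np.sort(20.0 * rng.weibull(2.1, size=300))       # ages at death (years)
ecdf = np.arange(1, ages.size + 1) / ages.size          # cumulative proportion dead

exp_cdf = lambda t, M: 1.0 - np.exp(-M * t)             # exponential mortality
weib_cdf = lambda t, a, b: 1.0 - np.exp(-a * t ** b)    # Weibull mortality

(M,), _ = curve_fit(exp_cdf, ages, ecdf, p0=(0.05,))
(a, b), _ = curve_fit(weib_cdf, ages, ecdf, p0=(0.01, 2.0))

sse_exp = np.sum((ecdf - exp_cdf(ages, M)) ** 2)
sse_weib = np.sum((ecdf - weib_cdf(ages, a, b)) ** 2)
print(f"exponential: M = {M:.3f}/yr, SSE = {sse_exp:.3f}")
print(f"Weibull: alpha = {a:.5f}, beta = {b:.2f}, SSE = {sse_weib:.3f}")
```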
NASA Astrophysics Data System (ADS)
Jiang, Yan; Jiang, Jiuchun; Zhang, Caiping; Zhang, Weige; Gao, Yang; Guo, Qipei
2017-08-01
To assess the economic benefits of battery reuse, the consistency and aging characteristics of a retired LiFePO4 battery pack are studied in this paper. The consistency of the battery modules is analyzed in terms of capacity and internal resistance. Test results indicate that the dispersion of battery module parameters increases as the battery ages; however, better capacity consistency among modules does not ensure better resistance consistency. The aging characteristics of the battery pack are then analyzed, and the main results are as follows: (1) the Weibull and normal distributions are suitable for fitting the capacity and resistance distributions of the battery modules, respectively; (2) SOC imbalance is the dominant factor in the capacity fading process of the battery pack; (3) by employing incremental capacity (IC) and IC peak area analysis, a consistency evaluation method representing the aging mechanism variations of the battery modules is proposed, and an accurate battery screening strategy is then put forward. This study not only provides data support for evaluating the economic benefits of retired batteries but also presents a method to recognize battery aging variations, which is helpful for the rapid evaluation and screening of retired batteries for second use.
Incorporating psychological influences in probabilistic cost analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kujawski, Edouard; Alvaro, Mariana; Edwards, William
2004-01-08
Today's typical probabilistic cost analysis assumes an "ideal" project that is devoid of the human and organizational considerations that heavily influence the success and cost of real-world projects. In the real world, "Money Allocated Is Money Spent" (the MAIMS principle); cost underruns are rarely available to protect against cost overruns, while task overruns are passed on to the total project cost. Realistic cost estimates therefore require a modified probabilistic cost analysis that simultaneously models the cost management strategy, including budget allocation. Psychological influences such as overconfidence in assessing uncertainties and dependencies among cost elements and risks are other important considerations that are generally not addressed. It should then be no surprise that actual project costs often exceed the initial estimates and are delivered late and/or with a reduced scope. This paper presents a practical probabilistic cost analysis model that incorporates recent findings in human behavior and judgment under uncertainty, dependencies among cost elements, the MAIMS principle, and project management practices. Uncertain cost elements are elicited from experts using the direct fractile assessment method and fitted with three-parameter Weibull distributions. The full correlation matrix is specified in terms of two parameters that characterize correlations among cost elements in the same and in different subsystems. The analysis is readily implemented using standard Monte Carlo simulation tools such as @Risk and Crystal Ball®. The analysis of a representative design and engineering project substantiates that today's typical probabilistic cost analysis is likely to severely underestimate project cost for probability-of-success values of importance to contractors and procuring activities. The proposed approach provides a framework for developing a viable cost management strategy for allocating baseline budgets and contingencies. Given the scope and magnitude of the cost-overrun problem, the benefits are likely to be significant.
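A minimal sketch of the Monte Carlo roll-up described above, assuming illustrative three-parameter Weibull cost elements and a single common correlation imposed through a Gaussian copula; the elicited parameters, two-parameter correlation structure, and commercial tools used in the paper are not reproduced here.

```python
# Sketch: Monte Carlo roll-up of project cost with three-parameter Weibull cost
# elements and a Gaussian copula for correlation. Element parameters and the
# correlation value are illustrative, not the paper's elicited values.
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
N = 100_000

# (location, scale, shape) for each cost element, in $M
elements = [(2.0, 1.5, 1.8), (5.0, 3.0, 2.2), (1.0, 0.8, 1.5), (3.0, 2.0, 2.0)]
rho = 0.5                                         # common inter-element correlation
k = len(elements)
corr = np.full((k, k), rho) + (1.0 - rho) * np.eye(k)

# Gaussian copula: correlated normals -> uniforms -> Weibull quantiles
z = rng.multivariate_normal(np.zeros(k), corr, size=N)
u = stats.norm.cdf(z)
costs = np.column_stack([
    loc + scale * (-np.log(1.0 - u[:, j])) ** (1.0 / shape)
    for j, (loc, scale, shape) in enumerate(elements)
])
total = costs.sum(axis=1)

print(f"mean total = {total.mean():.2f} $M")
for q in (0.5, 0.7, 0.9):
    print(f"P{int(q * 100)} total cost = {np.quantile(total, q):.2f} $M")
```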
On the stability treatment in WAsP
NASA Astrophysics Data System (ADS)
Giebel, G.; Gryning, S.-E.
2003-04-01
An assessment of the treatment of atmospheric stability in the standard package for wind resource estimation, WAsP (from Risø National Laboratory), is presented. Emphasis is on the vertical wind profiles in WAsP and the treatment of stability therein, with special consideration of the nighttime situation. The study starts with an introduction to WAsP and the way it treats the vertical extrapolation, with special consideration of stability. The two parameters available for changing the stability treatment in WAsP are identified as the RMS heat flux and the offset heat flux. Four years' worth of data from the meteorological mast at Risø, plus data from Egypt and Bermuda, are used to identify the parameter settings for stable conditions. To this end, the measured heat fluxes from the mast were used to extract three data sets with successively higher stability at four different heights. These data sets were then run through the Observed Wind Climate Wizard (part of the WAsP package), resulting in Weibull fits to the data. Using these observed wind climates, a prediction of the highest-level wind climate from the lowest-level wind climate under the different stable conditions is undertaken and compared with the measured data set. To expand on this study, a systematic variation of the two heat flux parameters in WAsP is performed to find the parameters yielding the lowest overall prediction errors. Parts of this study were financed by the Landesumweltamt Brandenburg.
Independent Orbiter Assessment (IOA): Weibull analysis report
NASA Technical Reports Server (NTRS)
Raffaelli, Gary G.
1987-01-01
The Auxiliary Power Unit (APU) and Hydraulic Power Unit (HPU) Space Shuttle Subsystems were reviewed as candidates for demonstrating the Weibull analysis methodology. Three hardware components were identified as analysis candidates: the turbine wheel, the gearbox, and the gas generator. Detailed review of subsystem level wearout and failure history revealed the lack of actual component failure data. In addition, component wearout data were not readily available or would require a separate data accumulation effort by the vendor. Without adequate component history data being available, the Weibull analysis methodology application to the APU and HPU subsystem group was terminated.
Cure modeling in real-time prediction: How much does it help?
Ying, Gui-Shuang; Zhang, Qiang; Lan, Yu; Li, Yimei; Heitjan, Daniel F
2017-08-01
Various parametric and nonparametric modeling approaches exist for real-time prediction in time-to-event clinical trials. Recently, Chen (2016, BMC Biomedical Research Methodology 16) proposed a prediction method based on parametric cure-mixture modeling, intended to cover situations where a non-negligible fraction of subjects appears to be cured. In this article we apply a Weibull cure-mixture model to create predictions, demonstrating the approach in RTOG 0129, a randomized trial in head-and-neck cancer. We compare the ultimate realized data in RTOG 0129 to interim predictions from a Weibull cure-mixture model, a standard Weibull model without a cure component, and a nonparametric model based on the Bayesian bootstrap. The standard Weibull model predicted that events would occur earlier than did the Weibull cure-mixture model, but the difference was unremarkable until late in the trial, when evidence for a cure became clear. The nonparametric approach often gave undefined predictions or infinite prediction intervals, particularly at early stages of the trial. Simulations suggest that cure modeling can yield better-calibrated prediction intervals when there is a cured component, or the appearance of a cured component, but at a substantial cost in the average width of the intervals. Copyright © 2017 Elsevier Inc. All rights reserved.
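A sketch of the Weibull cure-mixture survival function and a simple interim event-count calculation based on conditional survival; the cure fraction, Weibull parameters, subject count, and follow-up times are illustrative assumptions, not values from RTOG 0129.

```python
# Sketch: the survival function of a Weibull cure-mixture model and a simple
# event-count prediction at a future analysis time. The cure fraction and
# Weibull parameters are illustrative assumptions.
import numpy as np

pi_cure, shape, scale = 0.35, 1.4, 30.0           # cure fraction, Weibull k, lambda (months)

def surv(t):
    """P(T > t) under the cure-mixture model: cured subjects never fail."""
    return pi_cure + (1.0 - pi_cure) * np.exp(-(t / scale) ** shape)

n_at_risk = 250                                   # event-free subjects at the interim look
t_elapsed, t_horizon = 12.0, 24.0                 # months of follow-up (assumed common)

# Expected additional events among current survivors by the horizon,
# using the conditional survival S(t_horizon) / S(t_elapsed):
p_event = 1.0 - surv(t_horizon) / surv(t_elapsed)
print(f"expected additional events by month {t_horizon:.0f}: {n_at_risk * p_event:.1f}")
```

A full real-time prediction would use each subject's individual follow-up time and parameter uncertainty; this block only illustrates the survival-function form the abstract refers to.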
ERIC Educational Resources Information Center
Indrayani, Ervina; Dimara, Lisiard; Paiki, Kalvin; Reba, Felix
2018-01-01
The coastal waters of East Yapen is one of the spawning sites and areas of care for marine biota in Papua. Because of its very open location, it is widely used by human activities such as fishing, residential, industrial and cruise lines. This indirectly affects the balance of coastal waters condition of East Yapen that impact on the existence of…
NASA Astrophysics Data System (ADS)
Okeniyi, Joshua Olusegun; Nwadialo, Christopher Chukwuweike; Olu-Steven, Folusho Emmanuel; Ebinne, Samaru Smart; Coker, Taiwo Ebenezer; Okeniyi, Elizabeth Toyin; Ogbiye, Adebanji Samuel; Durotoye, Taiwo Omowunmi; Badmus, Emmanuel Omotunde Oluwasogo
2017-02-01
This paper investigates the effect of C3H7NO2S (cysteine) on the inhibition of reinforcing-steel corrosion in concrete immersed in 0.5 M H2SO4, simulating an industrial/microbial environment. Different C3H7NO2S concentrations were admixed, in duplicate, in steel-reinforced concrete samples that were partially immersed in the acidic sulphate environment. The electrochemical monitoring techniques of open circuit potential, as per ASTM C876-91 R99, and corrosion rate, by linear polarization resistance, were then employed to study the anticorrosion effect of the organic admixture in the steel-reinforced concrete samples. Analyses of the electrochemical test data followed ASTM G16-95 R04 prescriptions, including probability distribution modeling with significance testing by the Kolmogorov-Smirnov and Student's t-test statistics. Results established that all corrosion-potential datasets followed the Normal, Gumbel and Weibull distributions, but that only the Weibull model described all the corrosion-rate datasets in the study, according to the Kolmogorov-Smirnov test statistics. Results of the Student's t-test showed that differences in corrosion test data between duplicated samples with the same C3H7NO2S concentration were not statistically significant. These results indicated that 0.06878 M C3H7NO2S exhibited an optimal inhibition efficiency of η = 90.52±1.29% on reinforcing-steel corrosion in the concrete samples immersed in 0.5 M H2SO4, simulating an industrial/microbial service environment.
Active structural control of a floating wind turbine with a stroke-limited hybrid mass damper
NASA Astrophysics Data System (ADS)
Hu, Yaqi; He, Erming
2017-12-01
Floating wind turbines are subjected to more severe structural loads than fixed-bottom wind turbines due to the additional degrees of freedom (DOFs) of their floating foundations. Active structural control is a promising way to improve the structural responses of floating wind turbines. This paper investigates an active vibration control strategy for a barge-type floating wind turbine in which a stroke-limited hybrid mass damper (HMD) is installed in the turbine's nacelle. First, a contact nonlinear modeling method for the floating wind turbine with clearance between the HMD and the stroke limiters is presented based on the Euler-Lagrange equations, and an active control model of the whole system is established. The structural parameters are validated for the active control model, and an equivalent load coefficient method is presented for identifying the wind and wave disturbances. Then, a state-feedback linear quadratic regulator (LQR) controller is designed to reduce the vibration and loads of the wind turbine, and two optimization methods are combined to optimize the weighting coefficients with the stroke of the HMD and the active control power consumption treated as constraints. Finally, the designed controllers are implemented in high-fidelity simulations under five typical wind and wave conditions. The results show that the active HMD control strategy is achievable and that the designed controllers further reduce the vibration and loads of the wind turbine under the constraints of stroke limitation and power consumption. The 'V'-shaped distribution of the passive TMD's suppression effect is inconsistent with the Weibull wind distribution encountered in practical offshore floating wind farms, and the active HMD control could overcome this shortcoming of the passive TMD.
NASA Astrophysics Data System (ADS)
Zarola, Amit; Sil, Arjun
2018-04-01
This study presents forecasting of the time and magnitude of the next earthquake in northeast India, using four probability distribution models (Gamma, Lognormal, Weibull and Log-logistic) applied to an updated catalog of earthquakes of magnitude Mw ≥ 6.0 that occurred from 1737 to 2015 in the study area. On the basis of the past seismicity of the region, two types of conditional probabilities have been estimated using the best-fit model and the respective model parameters. The first is the probability that the seismic energy (e × 10²⁰ ergs) expected to be released in the future earthquake exceeds a certain level of seismic energy (E × 10²⁰ ergs). The second is the probability that the seismic energy expected to be released per year (a × 10²⁰ ergs/year) exceeds a certain level of seismic energy per year (A × 10²⁰ ergs/year). The log-likelihood functions (ln L) were also estimated for all four probability distribution models; a higher value of ln L indicates a better model and a lower value a poorer one. The time of the future earthquake is forecast by dividing the total seismic energy expected to be released in the future earthquake by the seismic energy expected to be released per year. The epicentres of the recent 4 January 2016 Manipur earthquake (M 6.7), the 13 April 2016 Myanmar earthquake (M 6.9) and the 24 August 2016 Myanmar earthquake (M 6.8) are located in zones Z.12, Z.16 and Z.15, respectively, which are among the identified seismic source zones in the study area, indicating that the proposed techniques and models yield good forecasting accuracy.
Randomized central limit theorems: A unified theory
NASA Astrophysics Data System (ADS)
Eliazar, Iddo; Klafter, Joseph
2010-08-01
The central limit theorems (CLTs) characterize the macroscopic statistical behavior of large ensembles of independent and identically distributed random variables. The CLTs assert that the universal probability laws governing ensembles’ aggregate statistics are either Gaussian or Lévy, and that the universal probability laws governing ensembles’ extreme statistics are Fréchet, Weibull, or Gumbel. The scaling schemes underlying the CLTs are deterministic—scaling all ensemble components by a common deterministic scale. However, there are “random environment” settings in which the underlying scaling schemes are stochastic—scaling the ensemble components by different random scales. Examples of such settings include Holtsmark’s law for gravitational fields and the Stretched Exponential law for relaxation times. In this paper we establish a unified theory of randomized central limit theorems (RCLTs)—in which the deterministic CLT scaling schemes are replaced with stochastic scaling schemes—and present “randomized counterparts” to the classic CLTs. The RCLT scaling schemes are shown to be governed by Poisson processes with power-law statistics, and the RCLTs are shown to universally yield the Lévy, Fréchet, and Weibull probability laws.
Dynamics of Polydisperse Foam-like Emulsion
NASA Astrophysics Data System (ADS)
Hicock, Harry; Feitosa, Klebert
2011-10-01
Foam is a complex fluid whose relaxation properties are associated with the continuous diffusion of gas from small to large bubbles driven by differences in Laplace pressures. We study the dynamics of bubble rearrangements by tracking droplets of a clear, neutrally buoyant emulsion that coarsens like a foam. The droplets are imaged in three dimensions using confocal microscopy. Analysis of the images allows us to measure their positions and radii, and track their evolution in time. We find that the droplet size distribution fits a Weibull distribution characteristic of foam systems. Additionally, we observe that droplets undergo continuous evolution interspersed by occasional large rearrangements, consistent with the local relaxation behavior typical of foams.
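A minimal sketch of the kind of distributional check described above, fitting a Weibull distribution to a droplet-size sample; the synthetic radii and the SciPy routines are illustrative assumptions, not the study's measurements.

```python
# Minimal sketch (hypothetical radii, in micrometres): fit a Weibull distribution
# to a droplet-size sample and test the fit, the kind of check described for
# coarsening foam-like emulsions.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
radii = rng.weibull(2.2, size=500) * 15.0   # stand-in for measured droplet radii

shape, loc, scale = stats.weibull_min.fit(radii, floc=0)
print(f"Weibull shape k = {shape:.2f}, scale lambda = {scale:.2f} um")

# Kolmogorov-Smirnov check of the fitted distribution against the sample
ks = stats.kstest(radii, "weibull_min", args=(shape, loc, scale))
print(f"KS statistic = {ks.statistic:.3f}, p-value = {ks.pvalue:.3f}")
```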
NASA Astrophysics Data System (ADS)
Maghsoudi, Mastoureh; Bakar, Shaiful Anuar Abu
2017-05-01
In this paper, a recent novel approach is applied to estimate the threshold parameter of a composite model. Several composite models from the Transformed Gamma and Inverse Transformed Gamma families are constructed based on this approach, and their parameters are estimated by the maximum likelihood method. These composite models are fitted to allocated loss adjustment expenses (ALAE). Among all composite models studied, the composite Weibull-Inverse Transformed Gamma model proves to be a strong candidate, as it best fits the loss data. The final part applies backtesting to validate the VaR and CTE risk measures.
Bozkurt, Hayriye; D'Souza, Doris H; Davidson, P Michael
2014-05-01
Hepatitis A virus (HAV) is a food-borne enteric virus responsible for outbreaks of hepatitis associated with shellfish consumption. The objectives of this study were to determine the thermal inactivation behavior of HAV in blue mussels, to compare the first-order and Weibull models for describing the data, to calculate the Arrhenius activation energy for each model, and to evaluate model efficiency using selected statistical criteria. The times required to reduce the population by 1 log cycle (D-values) calculated from the first-order model (50 to 72°C) ranged from 1.07 to 54.17 min for HAV. Using the Weibull model, the times required to destroy 1 log unit (tD=1) of HAV at the same temperatures were 1.57 to 37.91 min. At 72°C, the treatment times required to achieve a 6-log reduction were 7.49 min for the first-order model and 8.47 min for the Weibull model. The z-values (the change in temperature required for a 1-log, i.e. 90%, change in the D-value) calculated for HAV were 15.88 ± 3.97°C (R² = 0.94) with the Weibull model and 12.97 ± 0.59°C (R² = 0.93) with the first-order model. The calculated activation energies for the first-order model and the Weibull model were 165 and 153 kJ/mol, respectively. The results revealed that the Weibull model was more appropriate for representing the thermal inactivation behavior of HAV in blue mussels. A correct understanding of the thermal inactivation behavior of HAV could allow precise determination of the thermal process conditions needed to prevent food-borne viral outbreaks associated with the consumption of contaminated mussels.
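To make the two inactivation models above concrete, the sketch below computes the treatment time for a given log reduction under each; the parameter values are hypothetical placeholders, not the study's fitted D-value or Weibull parameters.

```python
# Hedged sketch of the two inactivation models compared in the abstract.
# Parameter values below are hypothetical placeholders, not the paper's fits.
import numpy as np

def first_order_time(log_reduction, D):
    """Treatment time for a given log reduction, log10(N/N0) = -t/D."""
    return D * log_reduction

def weibull_time(log_reduction, delta, p):
    """Treatment time for a given log reduction, log10(N/N0) = -(t/delta)**p."""
    return delta * log_reduction ** (1.0 / p)

D_72C = 1.25                  # min, hypothetical first-order D-value at 72 C
delta_72C, p_72C = 1.4, 0.8   # hypothetical Weibull scale (min) and shape

for model, t in [("first-order", first_order_time(6, D_72C)),
                 ("Weibull", weibull_time(6, delta_72C, p_72C))]:
    print(f"{model:12s} time for a 6-log reduction: {t:.2f} min")
```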
Evidence of the Most Stretchable Egg Sac Silk Stalk, of the European Spider of the Year Meta menardi
Lepore, Emiliano; Marchioro, Andrea; Isaia, Marco; Buehler, Markus J.; Pugno, Nicola M.
2012-01-01
Spider silks display generally strong mechanical properties, even if differences between species and within the same species can be observed. While many different types of silks have been tested, the mechanical properties of stalks of silk taken from the egg sac of the cave spider Meta menardi have not yet been analyzed. Meta menardi has recently been chosen as the “European spider of the year 2012” by the European Society of Arachnology. Here we report a study in which silk stalks were collected directly from several caves in the north-west of Italy. Field emission scanning electron microscope (FESEM) images showed that the stalks are made up of a large number of threads, each with a diameter of 6.03±0.58 µm. The stalks were strained at a constant rate of 2 mm/min using a tensile testing machine. The observed maximum stress, strain and toughness modulus, defined as the area under the stress-strain curve, are 0.64 GPa, 751% and 130.7 MJ/m³, respectively. To the best of our knowledge, such a huge elongation has never been reported for egg sac silk stalks and suggests a large-scale unrolling mechanism of the macroscopic stalk, which, as a continuation of the protective egg sac, is expected to be composed of fibres very densely and randomly packed. Weibull statistics were used to analyze the results of the mechanical testing, and the average Weibull modulus (m) is deduced to be in the range of 1.5–1.8, with a Weibull scale parameter (σ₀) in the range of 0.33–0.41 GPa and a high coefficient of correlation (R² = 0.97). PMID:22347380
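For reference, the following sketch shows the standard median-rank (Weibull plot) regression used to extract a Weibull modulus m and scale σ₀ from a set of strength measurements; the strength values are hypothetical, not the paper's data.

```python
# Illustrative sketch (hypothetical strengths, GPa): estimate the Weibull modulus m
# and scale sigma_0 from tensile-strength data by median-rank linear regression,
# the standard treatment behind numbers like those quoted in the abstract.
import numpy as np

strengths = np.sort(np.array([0.28, 0.31, 0.34, 0.36, 0.39, 0.41, 0.44, 0.47]))  # hypothetical
n = strengths.size
i = np.arange(1, n + 1)
F = (i - 0.3) / (n + 0.4)                  # Bernard's median-rank estimator

x = np.log(strengths)
y = np.log(-np.log(1.0 - F))               # Weibull plot coordinates
m, intercept = np.polyfit(x, y, 1)         # slope = Weibull modulus m
sigma0 = np.exp(-intercept / m)            # scale parameter sigma_0

r2 = np.corrcoef(x, y)[0, 1] ** 2
print(f"m = {m:.2f}, sigma_0 = {sigma0:.3f} GPa, R^2 = {r2:.3f}")
```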
Weiss, Michael
2017-06-01
Appropriate model selection is important when fitting oral concentration-time data due to the complex character of the absorption process. When IV reference data are available, the problem is the selection of an empirical input function (absorption model). In the present examples, a weighted sum of inverse Gaussian density functions (IG) was found most useful. It is shown that alternative models (gamma and Weibull density) are only valid if the input function is log-concave. Furthermore, it is demonstrated for the first time that the sum-of-IGs model can also be applied to fit oral data directly (without IV data). In the present examples, a weighted sum of two or three IGs was sufficient. From the parameters of this function, the model-independent measures AUC and mean residence time can be calculated. It turned out that a good fit of the data in the terminal phase is essential to avoid biased parameter estimates. The time course of the fractional elimination rate and the concept of log-concavity have proved to be useful tools in model selection.
NASA Astrophysics Data System (ADS)
Ramesham, Rajeshuni
2010-02-01
Ceramic column grid array (CCGA) packages have seen increasing use because of advantages such as high interconnect density, very good thermal and electrical performance, and compatibility with standard surface-mount packaging assembly processes. CCGA packages are used in space applications such as logic and microprocessor functions, telecommunications, flight avionics, and payload electronics. Because these packages tend to provide less solder joint strain relief than leaded packages, the reliability of CCGA packages is very important for short- and long-term space missions. CCGA interconnect electronic packages on polyimide printed wiring boards (PWBs) were assembled, inspected non-destructively, and subsequently subjected to extreme-temperature thermal cycling to assess their reliability for future deep space, short- and long-term, extreme temperature missions. In this investigation, the employed temperature range covers -185°C to +125°C extreme thermal environments. The test hardware consists of two CCGA717 packages, each divided into four daisy-chained sections, for a total of eight daisy chains to be monitored. The CCGA717 package is 33 mm × 33 mm with a 27 × 27 array of 80%/20% Pb/Sn columns on a 1.27 mm pitch. The resistance of the daisy-chained CCGA interconnects was continuously monitored as a function of thermal cycling. Electrical resistance measurements as a function of thermal cycling are reported, and the tests to date have shown a significant change in daisy chain resistance that becomes more noticeable with increasing number of thermal cycles. This paper describes the experimental test results of CCGA testing over wide temperature extremes. Standard Weibull analysis tools were used to extract the Weibull parameters and understand the CCGA failures. Optical inspection results clearly indicate that the solder joints of the columns to the board and to the ceramic package failed as a function of thermal cycling. The first failure was observed at the 137th thermal cycle, and 63.2% of the daisy chains had failed by about 664 thermal cycles. The shape parameter extracted from the Weibull plot was about 1.47, which indicates that the failures occurred during the flat, or useful-life, region of the standard bathtub curve. Based on this experimental test data, one can use CCGAs for the temperature range studied for ~100 thermal cycles (ΔT = 310°C, 5°C/minute, and 15 minutes dwell) with a high degree of confidence for high-reliability space and other applications.
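A hedged sketch of the standard two-parameter Weibull treatment of such cycles-to-failure data is shown below; the cycle counts and the use of SciPy's maximum-likelihood fit are illustrative assumptions rather than the paper's dataset or tooling.

```python
# Hedged sketch of a standard Weibull analysis of cycles-to-failure data like
# that described for the daisy-chained CCGA interconnects. The cycle counts
# below are hypothetical, not the experiment's data.
import numpy as np
from scipy import stats

cycles_to_failure = np.array([137, 290, 385, 455, 520, 600, 655, 705, 760, 840])  # hypothetical

beta, loc, eta = stats.weibull_min.fit(cycles_to_failure, floc=0)
print(f"shape beta = {beta:.2f}  (beta near 1 suggests useful-life failures)")
print(f"characteristic life eta = {eta:.0f} cycles  (63.2% of parts failed)")

# Unreliability at a candidate qualification point, e.g. 100 thermal cycles
F_100 = stats.weibull_min.cdf(100, beta, loc=0, scale=eta)
print(f"expected fraction failed by 100 cycles: {F_100:.3%}")
```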
Pre-atmospheric parameters and fragment distribution: Case study for the Kosice meteoroid
NASA Astrophysics Data System (ADS)
Gritsevich, M.; Vinnikov, V.; Kuznetsova, D.; Kohout, T.; Pupyrev, Y.; Peltoniemi, J.; Tóth, J.; Britt, D.; Turchak, L.; Virtanen, J.
2014-07-01
We present results of our investigation of the Košice meteorite, one of the recent falls with a well-derived trajectory and a large number of recovered fragments. A fireball appeared over central-eastern Slovakia on February 28, 2010. The bolide reached an absolute magnitude of at least -18, enabling radiometers of the European Fireball Network to track the fireball despite the cloudy and rainy weather. The landing area was successfully computed on the basis of data from surveillance cameras operating in Hungary and led to a fast meteorite recovery (Borovička et al. 2013). The first reported fragment of the meteorite was located northwest of the city of Košice in eastern Slovakia (Tóth et al. 2014). In total, 218 fragments of the Košice meteorite, with a total mass of 11.285 kg, have been documented, with almost 7 kg belonging to the collection of the Comenius University in Bratislava and the Astronomical Institute of the Slovak Academy of Sciences (Gritsevich et al. 2014). Based on the statistical investigation of the recovered fragments, bimodal Weibull, bimodal Grady, and bimodal lognormal distributions are found to be the most appropriate distributions for describing the Košice fragmentation process. The most probable scenario suggests that the Košice meteoroid, prior to further extensive fragmentation in the lower atmosphere, initially consisted of two independent pieces with cumulative residual masses of approximately 2 kg and 9 kg, respectively (Gritsevich et al. 2014). About one third of the recovered Košice fragments were thoroughly studied, including magnetic susceptibility, bulk and grain density measurements reported by Kohout et al. (2014). This analysis revealed that the Košice meteorites are H5 ordinary chondrites that originated from a homogeneous parent meteoroid. To estimate the dynamic mass of the main fragment, we studied the first integral of the drag and mass-loss equations and the geometrical relation along the meteor trajectory in the atmosphere. By matching these equations to the trajectory data obtained by Borovička et al. (2013), we determine key dimensionless parameters responsible for the meteoroid drag and ablation rate along its visual path in the atmosphere. These parameters allow us to estimate the pre-atmospheric mass, which is in good agreement with the photometric estimate derived by Borovička et al. (2013). Throughout this study, we permit changes in meteoroid shape along the trajectory. Additionally, we estimate the initial shape of the Košice meteoroid based on a statistical analysis (Vinnikov et al. 2014). We also conclude that two to three larger Košice fragments of 500-1000 g each should exist, but were either not recovered or not reported by illegal meteorite hunters.
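The following sketch illustrates one way a bimodal (two-component) Weibull distribution can be fitted to a fragment-mass sample by maximum likelihood; the synthetic masses, starting values, and SciPy-based optimization are assumptions for illustration, not the actual Košice catalogue or the authors' fitting procedure.

```python
# Illustrative sketch: fit a bimodal (two-component) Weibull mixture to fragment
# masses by maximum likelihood, conceptually similar to the fragment statistics
# described above. The mass sample is synthetic, not the recovered-fragment catalogue.
import numpy as np
from scipy import stats, optimize

rng = np.random.default_rng(1)
masses = np.concatenate([rng.weibull(1.2, 150) * 5.0,     # small-fragment mode (g)
                         rng.weibull(2.5, 60) * 120.0])   # large-fragment mode (g)

def neg_log_like(theta):
    w, k1, lam1, k2, lam2 = theta
    pdf = (w * stats.weibull_min.pdf(masses, k1, scale=lam1)
           + (1.0 - w) * stats.weibull_min.pdf(masses, k2, scale=lam2))
    return -np.sum(np.log(pdf + 1e-300))

res = optimize.minimize(neg_log_like,
                        x0=[0.5, 1.0, 10.0, 2.0, 100.0],
                        bounds=[(0.01, 0.99), (0.1, 10), (0.1, 1e4),
                                (0.1, 10), (0.1, 1e4)],
                        method="L-BFGS-B")
w, k1, lam1, k2, lam2 = res.x
print(f"weight = {w:.2f}; mode 1: k = {k1:.2f}, lam = {lam1:.1f} g; "
      f"mode 2: k = {k2:.2f}, lam = {lam2:.1f} g")
```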
Time-dependent dielectric breakdown of atomic-layer-deposited Al2O3 films on GaN
NASA Astrophysics Data System (ADS)
Hiraiwa, Atsushi; Sasaki, Toshio; Okubo, Satoshi; Horikawa, Kiyotaka; Kawarada, Hiroshi
2018-04-01
Atomic-layer-deposited (ALD) Al2O3 films are the most promising surface passivation and gate insulation layers in non-Si semiconductor devices. Here, we carried out an extensive study on the time-dependent dielectric breakdown characteristics of ALD-Al2O3 films formed on homo-epitaxial GaN substrates using two different oxidants at two different ALD temperatures. The breakdown times were approximated by Weibull distributions with average shape parameters of 8 or larger. These values are reasonably consistent with percolation theory predictions and are sufficiently large to neglect the wear-out lifetime distribution in assessing the long-term reliability of the Al2O3 films. The 63% lifetime of the Al2O3 films increases exponentially with a decreasing field, as observed in thermally grown SiO2 films at low fields. This exponential relationship disproves the correlation between the lifetime and the leakage current. Additionally, the lifetime decreases with measurement temperature with the most remarkable reduction observed in high-temperature (450 °C) O3-grown films. This result agrees with that from a previous study, thereby ruling out high-temperature O3 ALD as a gate insulation process. When compared at 200 °C under an equivalent SiO2 field of 4 MV/cm, which is a design guideline for thermal SiO2 on Si, high-temperature H2O-grown Al2O3 films have the longest lifetimes, uniquely achieving the reliability target of 20 years. However, this target is accomplished by a relatively narrow margin and, therefore, improvements in the lifetime are expected to be made, along with efforts to decrease the density of extrinsic Al2O3 defects, if any, to promote the practical use of ALD Al2O3 films.
NASA Technical Reports Server (NTRS)
Bansal, Narottam P.; Chen, Yuan L.
1998-01-01
Room-temperature tensile strengths of as-received Hi-Nicalon fibers and of fibers with BN/SiC, p-BN/SiC, and p-B(Si)N/SiC surface coatings deposited by chemical vapor deposition were measured, using an average fiber diameter of 13.5 microns. The Weibull statistical parameters were determined for each fiber. The average tensile strength of uncoated Hi-Nicalon was 3.19 +/- 0.73 GPa with a Weibull modulus of 5.41. The strength of fibers coated with BN/SiC did not change. However, fibers coated with p-BN/SiC and p-B(Si)N/SiC surface layers showed strength losses of approximately 10 and 35 percent, respectively, compared with as-received fibers. The elemental compositions of the fibers and the coatings were analyzed using a scanning Auger microprobe and energy dispersive x-ray spectroscopy. The BN coating was contaminated with a large concentration of carbon and some oxygen. In contrast, the p-BN, p-B(Si)N, and SiC coatings did not show any contamination. Microstructural analyses of the fibers and the coatings were done by scanning electron microscopy (SEM), transmission electron microscopy (TEM), and selected area electron diffraction. The Hi-Nicalon fiber consists of β-SiC nanocrystals, ranging in size from 1 to 30 nm, embedded in an amorphous matrix. TEM analysis of the BN coating revealed four distinct layers with turbostratic structure. The p-BN layer was turbostratic and showed considerable preferred orientation. The p-B(Si)N was glassy, and the silicon and boron were uniformly distributed. The silicon carbide coating was polycrystalline with a columnar structure along the growth direction. The p-B(Si)N/SiC coatings were more uniform, less defective and of better quality than the BN/SiC or the p-BN/SiC coatings.
Trudeau, Michaela P.; Verma, Harsha; Sampedro, Fernando; Urriola, Pedro E.; Shurson, Gerald C.; McKelvey, Jessica; Pillai, Suresh D.; Goyal, Sagar M.
2016-01-01
Infection with porcine epidemic diarrhea virus (PEDV) causes diarrhea, vomiting, and high mortality in suckling pigs. Contaminated feed has been suggested as a vehicle of transmission for PEDV. The objective of this study was to compare thermal and electron beam (eBeam) processing, and the inclusion of feed additives, on the inactivation of PEDV in feed. Feed samples were spiked with PEDV and then heated to 120–145°C for up to 30 min or irradiated at 0–50 kGy. Another set of feed samples spiked with PEDV and mixed with Ultracid P (Nutriad), Activate DA (Novus International), KEM-GEST (Kemin Agrifood), Acid Booster (Agri-Nutrition), sugar or salt was incubated at room temperature (~25°C) for up to 21 days. At the end of incubation, virus titers were determined by inoculation of Vero-81 cells, and the virus inactivation kinetics were modeled using the Weibull distribution model. The Weibull kinetic parameter delta represents the time or eBeam dose required to reduce the virus concentration by 1 log. For thermal processing, delta values ranged from 16.52 min at 120°C to 1.30 min at 145°C. For eBeam processing, a target dose of 50 kGy reduced the PEDV concentration by 3 log. All additives tested were effective in reducing the survival of PEDV compared with the control sample (delta = 17.23 days); Activate DA (delta = 0.81 days) and KEM-GEST (delta = 3.28 days) produced the fastest inactivation. In conclusion, heating swine feed at temperatures over 130°C or eBeam processing of feed at a dose over 50 kGy are effective processing steps to reduce PEDV survival. Additionally, the inclusion of selected additives can decrease PEDV survivability. PMID:27341670
The end of trend-estimation for extreme floods under climate change?
NASA Astrophysics Data System (ADS)
Schulz, Karsten; Bernhardt, Matthias
2016-04-01
An increased risk of flood events is one of the major threats under future climate change conditions. Therefore, many recent studies have investigated trends in extreme flood occurrences using historic long-term river discharge data as well as simulations from combined global/regional climate and hydrological models. Severe floods are relatively rare events, and the robust estimation of their probability of occurrence requires long time series of data (6). Following a method outlined by the IPCC research community, trends in extreme floods are calculated based on the difference of discharge values exceeding e.g. a 100-year level (Q100) between two 30-year windows, which represent prevailing conditions in a reference and a future time period, respectively. Following this approach, we analysed multiple, synthetically derived 2,000-year trend-free yearly maximum runoff series generated using three different extreme value distributions (EVDs). The parameters were estimated from long-term runoff data of four large European watersheds (Danube, Elbe, Rhine, Thames). Both the Q100 values estimated from 30-year moving windows and the subsequently derived trends showed enormous variations with time: for example, fitting the extreme value (Gumbel) distribution to the Danube data, trends of Q100 in the synthetic time series range from -4,480 to 4,028 m³/s per 100 years (Q100 = 10,071 m³/s, for reference). Similar results were found when applying other extreme value distributions (Weibull and log-Normal) to all of the watersheds considered. This variability or "background noise" in estimated trends of flood extremes makes it almost impossible to significantly distinguish any real trend in observed as well as modelled data when such an approach is applied. These uncertainties, even though known in principle, are hardly addressed and discussed by the climate change impact community. Any decision making and flood risk management, including the dimensioning of flood protection measures, that is based on such studies might therefore be fundamentally flawed.
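The experiment described above can be illustrated with a small simulation: generate a long trend-free series of annual maxima from an extreme value distribution, estimate Q100 from 30-year windows, and inspect the spread of apparent trends. The Gumbel parameters and window stepping below are hypothetical choices, not the study's fitted values.

```python
# Hedged sketch of the experiment described above: generate a long, trend-free
# series of annual maxima, estimate Q100 from 30-year windows, and look at the
# spread of apparent "trends". Gumbel parameters are hypothetical.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
loc0, scale0 = 6000.0, 1500.0               # hypothetical Gumbel parameters (m^3/s)
annual_max = stats.gumbel_r.rvs(loc=loc0, scale=scale0, size=2000, random_state=rng)

def q100(window):
    """100-year flood estimated by fitting a Gumbel distribution to one window."""
    mu, beta = stats.gumbel_r.fit(window)
    return stats.gumbel_r.ppf(1.0 - 1.0 / 100.0, mu, beta)

starts = np.arange(0, 2000 - 30, 5)                          # windows every 5 years
q100_series = np.array([q100(annual_max[s:s + 30]) for s in starts])

# Apparent "trend": difference between windows whose start years are 100 apart
trends = q100_series[20:] - q100_series[:-20]
print(f"Q100 across windows: {q100_series.min():.0f} to {q100_series.max():.0f} m^3/s")
print(f"Apparent trends per 100 years: {trends.min():.0f} to {trends.max():.0f} m^3/s")
```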
Fabrication Quality Analysis of a Fiber Optic Refractive Index Sensor Created by CO2 Laser Machining
Chen, Chien-Hsing; Yeh, Bo-Kuan; Tang, Jaw-Luen; Wu, Wei-Te
2013-01-01
This study investigates the CO2 laser-stripped partial cladding of silica-based optical fibers with a core diameter of 400 μm, which enables them to sense the refractive index of the surrounding environment. However, inappropriate treatment during the machining process can generate a number of defects in the optical fiber sensors. Therefore, the quality of optical fiber sensors fabricated using CO2 laser machining must be analyzed. The results show that analysis of the fiber core size after machining can provide preliminary defect detection, and qualitative analysis of the optical transmission defects can be used to identify imperfections that are difficult to observe through size analysis. To detect fabrication defects more precisely and quantitatively, we included a tensile test and numerical aperture measurements in this study. After a series of quality inspections, we proposed improvements to the existing CO2 laser machining parameters, namely a vertical scanning pathway, 4 W of power, and a feed rate of 9.45 cm/s. Using these improved parameters, we created optical fiber sensors with a core diameter of approximately 400 μm, no obvious optical transmission defects, a numerical aperture of 0.52 ± 0.019, a Weibull modulus of 0.886, and a Weibull shape parameter of 1.186. Finally, we used the optical fiber sensor fabricated with the improved parameters to measure the refractive indices of various solutions. The results show that a refractive-index resolution of 1.8 × 10−4 RIU (linear fitting R² = 0.954) was achieved for sucrose solutions with refractive indices ranging between 1.333 and 1.383. We also adopted the particle plasmon resonance sensing scheme using the fabricated optical fibers. The results provided additional information, specifically a superior sensor resolution of 5.73 × 10−5 RIU and greater linearity at R² = 0.999. PMID:23535636
Accelerated Thermal Cycling and Failure Mechanisms for BGA and CSP Assemblies
NASA Technical Reports Server (NTRS)
Ghaffarian, Reza
2000-01-01
This paper reviews the accelerated thermal cycling test methods that are currently used by industry to characterize the interconnect reliability of commercial-off-the-shelf (COTS) ball grid array (BGA) and chip scale package (CSP) assemblies. For some CSPs, the acceleration-induced failure mechanisms differed from conventional surface mount (SM) failures. Examples of unrealistic life projections for other CSPs are also presented. The cumulative cycles to failure for ceramic BGA assemblies tested under different conditions, including plots of their two Weibull parameters, are presented. The results are for cycles in the range of -30°C to 100°C, -55°C to 100°C, and -55°C to 125°C. Failure mechanisms as well as cycles to failure for thermal shock and thermal cycling conditions in the range of -55°C to 125°C were compared. Projection to other temperature cycling ranges using a modified Coffin-Manson relationship is also presented.
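As a hedged illustration of the last point, the sketch below applies a simple modified Coffin-Manson scaling between thermal cycling ranges; the fatigue exponent and cycle counts are generic placeholder values, not figures from this paper, and the paper's actual relationship may also include frequency and temperature terms.

```python
# Hedged sketch of a simple modified Coffin-Manson projection between thermal
# cycling ranges: N1 / N2 = (dT2 / dT1) ** n. The fatigue exponent n below is a
# commonly used placeholder for solder joints, not a value from this paper.
def project_cycles(n_fail_test, delta_t_test, delta_t_field, n_exp=2.5):
    """Project cycles-to-failure from a test range to a field range."""
    return n_fail_test * (delta_t_test / delta_t_field) ** n_exp

# Example: 1500 cycles to failure observed over -55 C to 125 C (dT = 180 C),
# projected to a milder 0 C to 100 C (dT = 100 C) field profile.
print(f"projected field life: {project_cycles(1500, 180, 100):.0f} cycles")
```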
NASA Astrophysics Data System (ADS)
Castaño Moraga, C. A.; Suárez Santana, E.; Sabbagh Rodríguez, I.; Nebot Medina, R.; Suárez García, S.; Rodríguez Alvarado, J.; Piernavieja Izquierdo, G.; Ruiz Alzola, J.
2010-09-01
Authorization of wind farms and allocation of power to private investors promoting wind energy projects require planning strategies. This issue is even more important under land restrictions, as is the case in the Canary Islands, where numerous specially protected areas exist for environmental reasons and land is a scarce resource. Aware of this limitation, the Regional Government of the Canary Islands designed the requirements of a public tender to grant licenses to install new wind farms, trying to maximize the energy produced per unit of occupied land. In this paper, we detail the methodology developed by the Canary Islands Institute of Technology (ITC, S.A.) to support the work of the technical staff of the Regional Ministry of Industry, responsible for the evaluation of a competitive tender process for awarding power licenses to private investors. The maximization of wind energy production per unit of area requires an exhaustive wind profile characterization. To that end, wind speed was statistically characterized by means of a Weibull probability density function, which mainly depends on two parameters: the shape parameter K, which determines the slope of the curve, and the average wind speed v, which acts as a scale parameter. These two parameters, as well as the main wind direction, have been evaluated at three different heights (40, 60, 80 m) over the whole Canarian archipelago. These parameters are available from the public data source Wind Energy Map of the Canary Islands [1]. The proposed methodology is based on the calculation of an initially defined Energy Efficiency Basic Index (EEBI), a performance criterion that weighs the annual energy production of a wind farm per unit of area. The calculation of this parameter considers wind conditions, wind turbine characteristics, and the geometry of the wind turbine layout within the wind farm (position within the row and column of machines), and involves four steps: (1) estimation of the energy produced by every wind turbine as if it were isolated from all the other machines of the wind farm, using its power curve and the statistical characterization of the wind profile at the site; (2) estimation of energy losses caused by other wind turbines in the same row and by misalignment with respect to the main wind direction; (3) estimation of energy losses induced by wind turbines located upstream; and (4) calculation of the EEBI as the ratio between the annual energy production and the area occupied by the wind farm, as a function of the wind speed profile and wind turbine characteristics. The computations involved above are modeled under a system-theory characterization.
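A minimal sketch of step (1) above, estimating a single turbine's expected power and annual energy production from the site's Weibull wind-speed distribution and a turbine power curve; the Weibull parameters and the simplified power curve are hypothetical, not values from the Canary Islands wind map.

```python
# Minimal sketch: expected power and annual energy production of one turbine
# from a Weibull wind-speed distribution and a simplified power curve.
# All numerical values are hypothetical placeholders.
import numpy as np
from scipy import stats
from scipy.special import gamma

k, v_mean = 2.0, 7.5                        # shape K and mean wind speed (m/s), hypothetical
lam = v_mean / gamma(1.0 + 1.0 / k)         # Weibull scale parameter from the mean speed

def power_curve(v):
    """Simplified 2 MW turbine power curve in kW (hypothetical)."""
    return np.where((v < 3) | (v > 25), 0.0,
                    np.where(v < 12, 2000.0 * ((v - 3) / 9.0) ** 3, 2000.0))

v = np.linspace(0.0, 30.0, 3001)
dv = v[1] - v[0]
pdf = stats.weibull_min.pdf(v, k, scale=lam)
mean_power_kw = np.sum(power_curve(v) * pdf) * dv       # expected power output
aep_mwh = mean_power_kw * 8760.0 / 1000.0               # annual energy production
print(f"scale = {lam:.2f} m/s, mean power = {mean_power_kw:.0f} kW, AEP = {aep_mwh:.0f} MWh")
```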
The flexural properties of endodontic post materials.
Stewardson, Dominic A; Shortall, Adrian C; Marquis, Peter M; Lumley, Philip J
2010-08-01
To measure the flexural strengths and moduli of endodontic post materials, and to assess the effect on the calculated flexural properties of varying the length/diameter (L/D) ratio of three-point bend test samples. Three-point bend testing of samples of 2 mm diameter metal and fiber-reinforced composite (FRC) rods was carried out and the mechanical properties calculated at support widths of 16 mm, 32 mm and 64 mm. Weibull analysis was performed on the strength data. The flexural strengths of all the FRC post materials exceeded the yield strengths of the gold and stainless steel samples; the flexural strengths of two FRC materials were comparable with the yield strength of titanium. Stainless steel recorded the highest flexural modulus, while titanium and the two carbon fiber materials exhibited similar values just exceeding that of gold. The remaining glass fiber materials were of lower modulus, within the range of 41-57 GPa. Weibull modulus values for the FRC materials ranged from 16.77 to 30.09. Decreasing the L/D ratio produced a marked decrease in flexural modulus for all materials. The flexural strengths of FRC endodontic post materials, when new, generally exceed the yield strengths of the metals from which endodontic posts are made. The high Weibull modulus values suggest good clinical reliability of FRC posts. The flexural modulus values of the tested posts were from 2-6 times (FRC) to 4-10 times (metal) that of dentin. Valid measurement of the flexural properties of endodontic post materials requires that test samples have appropriate L/D ratios.
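For context, the sketch below applies the standard three-point-bend formulas for a cylindrical rod, the reduction that typically underlies flexural strength and modulus values like those quoted above; the load, span, and slope values are hypothetical, not the study's measurements.

```python
# Hedged sketch of the standard three-point-bend formulas for a cylindrical rod.
# Input numbers are hypothetical, not the paper's measurements.
import math

def flexural_strength_rod(F_max, L, d):
    """Max flexural stress (MPa) for a round rod: sigma = 8 F L / (pi d^3)."""
    return 8.0 * F_max * L / (math.pi * d ** 3)

def flexural_modulus_rod(slope, L, d):
    """Flexural modulus (MPa) from the force-deflection slope (N/mm):
    E = 4 L^3 m / (3 pi d^4)."""
    return 4.0 * L ** 3 * slope / (3.0 * math.pi * d ** 4)

F_max, L, d, slope = 120.0, 16.0, 2.0, 350.0   # N, mm, mm, N/mm (hypothetical)
print(f"flexural strength = {flexural_strength_rod(F_max, L, d):.0f} MPa")
print(f"flexural modulus  = {flexural_modulus_rod(slope, L, d) / 1000.0:.1f} GPa")
```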
EXPERIMENTAL DESIGN STRATEGY FOR THE WEIBULL DOSE RESPONSE MODEL (JOURNAL VERSION)
The objective of the research was to determine optimum design point allocation for estimation of relative yield losses from ozone pollution when the true and fitted yield-ozone dose response relationship follows the Weibull. The optimum design is dependent on the values of the We...
[Properties and infiltration arts of machinable infiltration ceramic (MIC)].
Yang, H; Xian, S; Liao, Y; Xue, Y; Chai, F
2000-06-01
The purpose of this study was to explore the infiltration technique for MIC and to study the effects of different packing densities of the Al2O3 matrix on the properties of MIC. Alpha-Al2O3 specimens were fabricated by pouring alpha-Al2O3 slip with different powder/liquid ratios (P/L = 3.5, 7.5, 10.5) into a mold and subsequently pre-firing at 1160 degrees C for 6 hours to form the Al2O3 matrix. The packing densities of the matrices were measured. Infiltration concepts were introduced into this study by infiltrating molten mica micro-crystalline glass into the porous Al2O3 matrix at 1160 degrees C for 6 hours to form a continuous interpenetrating composite. The composite then underwent micro-crystallization by nucleating at 550 degrees C for 1 hour and crystallizing at 900 degrees C for 1 hour, which resulted in the MIC. Mechanical properties including three-point flexural strength, elastic modulus, Vickers hardness, indentation fracture toughness and the Weibull modulus of flexural strength were determined. The machinability parameter (H/KIC)² of MIC was calculated. XRD and SEM were employed to study its microstructure. The resulting matrices reached packing densities of 63%, 76% and 78% with P/L of 3.5, 7.5 and 10.5, respectively. The MIC attained high strength and good machinability after infiltration. Three-point flexural strength and indentation fracture toughness were 342, 431 and 374 MPa and 4.05, 4.14 and 5.02 MPa·m^1/2 for MIC with packing densities of 63%, 76% and 78%, respectively, and the machinability parameters were 5.41, 6.84 and 7.39, respectively. The packing density of the Al2O3 matrix significantly influenced the mechanical properties. Maximum properties were obtained with a matrix packing density of 75% (P/L = 7.5), with a Weibull modulus of flexural strength of 6.8. Machinability decreased with increasing P/L ratio. The micro-crystallizing treatment resulted in the formation of evenly distributed mica crystals in the composite, which contributed to the high strength of this composite material. MIC is a new infiltrated ceramic with favorable strength and machinability that can satisfy prosthodontic requirements as an all-ceramic crown and bridge material, and it shows a promising outlook for future development and clinical use.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ji Hyun, Yoon; Byun, Thak Sang; Strizak, Joe P
2011-01-01
The mechanical properties of NBG-18 nuclear grade graphite have been characterized using small-specimen test techniques and statistical treatment of the test results. New fracture strength and toughness test techniques were developed to use subsize cylindrical specimens with glued heads and to reuse their broken halves. Three sets of subsize cylindrical specimens, with diameters of 4 mm, 8 mm, and 12 mm, were tested to obtain tensile fracture strength. The longer piece of the broken halves was cracked from the side surfaces and tested under three-point bend loading to obtain fracture toughness. Both the strength and fracture toughness data were analyzed using Weibull distribution models, focusing on the size effect. The mean fracture strength decreased from 22.9 MPa to 21.5 MPa as the diameter increased from 4 mm to 12 mm, and the mean strength of the 15.9 mm diameter standard specimen, 20.9 MPa, was on the extended trend line. These fracture strength data indicate that in the given diameter range the size effect is not significant and is much smaller than that predicted by the Weibull statistics-based model. Further, no noticeable size effect existed in the fracture toughness data, whose mean values were in a narrow range of 1.21 to 1.26 MPa√m. The Weibull moduli measured for the fracture strength and fracture toughness datasets were around 10. It is therefore believed that the small or negligible size effect makes it possible to use the subsize specimens, and that the new fracture toughness test method, which reuses the broken specimens, helps minimize irradiation space and radioactive waste.
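For comparison, the sketch below evaluates the classical Weibull size-effect relation against the numbers quoted above; the assumed stressed-volume ratio (diameter-squared scaling) is an illustrative assumption, since the abstract does not give the specimens' effective volumes.

```python
# Hedged sketch of the Weibull size-effect relation the abstract compares against:
# for volume (or effective-volume) scaling, sigma_1 / sigma_2 = (V_2 / V_1)**(1/m).
# The volume ratio below is a hypothetical stand-in based on diameter alone.
def weibull_size_scaled_strength(sigma_ref, v_ref, v_new, m):
    """Predicted mean strength of a specimen with stressed volume v_new."""
    return sigma_ref * (v_ref / v_new) ** (1.0 / m)

sigma_4mm, m = 22.9, 10.0            # MPa and Weibull modulus from the abstract
v_ratio = (12.0 / 4.0) ** 2          # stressed-volume ratio if only diameter changes (hypothetical)
predicted = weibull_size_scaled_strength(sigma_4mm, 1.0, v_ratio, m)
print(f"Weibull-predicted 12 mm strength: {predicted:.1f} MPa (measured: 21.5 MPa)")
# The gap between prediction and measurement illustrates the abstract's point that
# the observed size effect is much smaller than the Weibull model predicts.
```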
Computer routines for probability distributions, random numbers, and related functions
Kirby, W.
1983-01-01
Use of previously coded and tested subroutines simplifies and speeds up program development and testing. This report presents routines that can be used to calculate various probability distributions and other functions of importance in statistical hydrology. The routines are designed as general-purpose Fortran subroutines and functions to be called from user-written main programs. The probability distributions provided include the beta, chi-square, gamma, Gaussian (normal), Pearson Type III (tables and approximation), and Weibull. Also provided are the distributions of the Grubbs-Beck outlier test, Kolmogorov's and Smirnov's D, Student's t, noncentral t (approximate), and Snedecor's F. Other mathematical functions include the Bessel function I0, gamma and log-gamma functions, error functions, and the exponential integral. Auxiliary services include sorting and printer-plotting. Random number generators for uniform and normal numbers are provided and may be used with some of the above routines to generate numbers from other distributions. (USGS)
Computer routines for probability distributions, random numbers, and related functions
Kirby, W.H.
1980-01-01
Use of previously coded and tested subroutines simplifies and speeds up program development and testing. This report presents routines that can be used to calculate various probability distributions and other functions of importance in statistical hydrology. The routines are designed as general-purpose Fortran subroutines and functions to be called from user-written main programs. The probability distributions provided include the beta, chi-square, gamma, Gaussian (normal), Pearson Type III (tables and approximation), and Weibull. Also provided are the distributions of the Grubbs-Beck outlier test, Kolmogorov's and Smirnov's D, Student's t, noncentral t (approximate), and Snedecor's F tests. Other mathematical functions include the Bessel function I0, gamma and log-gamma functions, error functions, and the exponential integral. Auxiliary services include sorting and printer plotting. Random number generators for uniform and normal numbers are provided and may be used with some of the above routines to generate numbers from other distributions. (USGS)
Photon statistics of a two-mode squeezed vacuum
NASA Technical Reports Server (NTRS)
Schrade, Guenter; Akulin, V. M.; Schleich, W. P.; Manko, Vladimir I.
1994-01-01
We investigate the general case of the photon distribution of a two-mode squeezed vacuum and show that the distribution of photons among the two modes depends on four parameters: two squeezing parameters, the relative phase between the two oscillators and their spatial orientation. The distribution of the total number of photons depends only on the two squeezing parameters. We derive analytical expressions and present pictures for both distributions.
NASA Astrophysics Data System (ADS)
Pathak, Savita; Mondal, Seema Sarkar
2010-10-01
A multi-objective inventory model of deteriorating items has been developed with a Weibull rate of decay, time-dependent demand, demand-dependent production, and time-varying holding cost, allowing shortages, in fuzzy environments for non-integrated and integrated businesses. The objective is to maximize the profit from different deteriorating items subject to a space constraint. The impreciseness of the inventory parameters and goals for the non-integrated business has been expressed by linear membership functions. The compromise solutions are obtained by different fuzzy optimization methods. To incorporate the relative importance of the objectives, different crisp/fuzzy cardinal weights have been assigned. The models are illustrated with numerical examples, and the results of the models with crisp and fuzzy weights are compared. The result for the model treated as an integrated business is obtained using the Generalized Reduced Gradient (GRG) method. The fuzzy integrated model with imprecise inventory costs is formulated to optimize the possibility/necessity measure of the fuzzy goal of the objective function, using the credibility measure of a fuzzy event via fuzzy expectation. The results of the crisp/fuzzy integrated model are illustrated with numerical examples and compared.
Probabilistic Analysis of Space Shuttle Body Flap Actuator Ball Bearings
NASA Technical Reports Server (NTRS)
Oswald, Fred B.; Jett, Timothy R.; Predmore, Roamer E.; Zaretsky, Erin V.
2007-01-01
A probabilistic analysis, using the 2-parameter Weibull-Johnson method, was performed on experimental life test data from space shuttle actuator bearings. Experiments were performed on a test rig under simulated conditions to determine the life and failure mechanism of the grease lubricated bearings that support the input shaft of the space shuttle body flap actuators. The failure mechanism was wear that can cause loss of bearing preload. These tests established life and reliability data for both shuttle flight and ground operation. Test data were used to estimate the failure rate and reliability as a function of the number of shuttle missions flown. The Weibull analysis of the test data for a 2-bearing shaft assembly in each body flap actuator established a reliability level of 99.6 percent for a life of 12 missions. A probabilistic system analysis for four shuttles, each of which has four actuators, predicts a single bearing failure in one actuator of one shuttle after 22 missions (a total of 88 missions for a 4-shuttle fleet). This prediction is comparable with actual shuttle flight history in which a single actuator bearing was found to have failed by wear at 20 missions.
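As a rough illustration of the reliability arithmetic such an analysis involves, the sketch below evaluates a two-parameter Weibull life model per bearing assembly and combines assemblies in series across a hypothetical fleet; the Weibull slope and scale are invented placeholders chosen only to reproduce the quoted 99.6% single-assembly reliability at 12 missions, not parameters from the test data.

```python
# Illustrative sketch: Weibull reliability per assembly, combined in series
# across a fleet. The Weibull slope beta is a hypothetical placeholder; the
# scale eta is back-calculated so that R(12 missions) = 0.996.
import numpy as np

beta = 1.3                                        # hypothetical Weibull slope
eta = 12.0 / (-np.log(0.996)) ** (1.0 / beta)     # scale giving R(12) = 0.996

def reliability(missions, n_units=1):
    """Weibull reliability of n_units identical units in series."""
    r_single = np.exp(-(missions / eta) ** beta)
    return r_single ** n_units

for n_missions in (12, 22, 88):
    print(f"{n_missions:3d} missions: one assembly R = {reliability(n_missions):.4f}, "
          f"fleet of 16 assemblies R = {reliability(n_missions, n_units=16):.4f}")
```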