Science.gov

Sample records for Weibull distribution analysis

  1. q-exponential, Weibull, and q-Weibull distributions: an empirical analysis

    NASA Astrophysics Data System (ADS)

    Picoli, S.; Mendes, R. S.; Malacarne, L. C.

    2003-06-01

    In a comparative study, the q-exponential and Weibull distributions are employed to investigate frequency distributions of basketball baskets, cyclone victims, brand-name drugs by retail sales, and highway length. In order to analyze the intermediate cases, a distribution that interpolates between the q-exponential and Weibull ones, the q-Weibull distribution, is introduced. It is verified that the basketball baskets distribution is well described by a q-exponential, whereas the cyclone victims and brand-name drugs by retail sales are better fitted by a Weibull distribution. On the other hand, for highway length the q-exponential and Weibull distributions do not give a satisfactory fit, and it is necessary to employ the q-Weibull distribution. Furthermore, the introduction of this interpolating distribution sheds light on the controversy between stretched-exponential and inverse power-law (q-exponential with q > 1) descriptions.
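
    To make the interpolation concrete, here is a minimal numerical sketch of the q-Weibull density, assuming the parametrization common in the Tsallis-statistics literature, f(x) = (2 - q)(β/η)(x/η)^(β-1) e_q(-(x/η)^β), where e_q is the q-exponential; the parameter values are illustrative, not taken from the paper.

    ```python
    import numpy as np

    def q_exp(x, q):
        """Tsallis q-exponential; reduces to exp(x) as q -> 1."""
        if abs(q - 1.0) < 1e-9:
            return np.exp(x)
        base = np.maximum(1.0 + (1.0 - q) * x, 0.0)
        return base ** (1.0 / (1.0 - q))

    def q_weibull_pdf(x, q, beta, eta):
        """q-Weibull density; recovers the ordinary Weibull as q -> 1."""
        x = np.asarray(x, dtype=float)
        return ((2.0 - q) * (beta / eta) * (x / eta) ** (beta - 1.0)
                * q_exp(-((x / eta) ** beta), q))

    x = np.linspace(0.1, 5.0, 5)
    print(q_weibull_pdf(x, q=1.0, beta=1.5, eta=1.0))  # ordinary Weibull limit
    print(q_weibull_pdf(x, q=1.4, beta=1.5, eta=1.0))  # fatter, power-law-like tail
    ```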

  2. Using Weibull Distribution Analysis to Evaluate ALARA Performance

    SciTech Connect

    Frome, E. L.; Watkins, J. P.; Hagemeyer, D. A.

    2009-10-01

    As Low as Reasonably Achievable (ALARA) is the underlying principle for protecting nuclear workers from potential health outcomes related to occupational radiation exposure. Radiation protection performance is currently evaluated by measures such as collective dose and average measurable dose, which do not indicate ALARA performance. The purpose of this work is to show how statistical modeling of individual doses using the Weibull distribution can provide objective supplemental performance indicators for comparing ALARA implementation among sites and for insights into ALARA practices within a site. Maximum likelihood methods were employed to estimate the Weibull shape and scale parameters used for performance indicators. The shape parameter reflects the effectiveness of maximizing the number of workers receiving lower doses and is represented as the slope of the fitted line on a Weibull probability plot. Additional performance indicators derived from the model parameters include the 99th percentile and the exceedance fraction. When grouping sites by collective total effective dose equivalent (TEDE) and ranking by 99th percentile with confidence intervals, differences in performance among sites can be readily identified. Applying this methodology will enable more efficient and complete evaluation of the effectiveness of ALARA implementation.
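
    As a rough illustration of these indicators, the sketch below fits a two-parameter Weibull to synthetic dose data by maximum likelihood and then computes the 99th percentile and an exceedance fraction; the dose values, the limit of 100 units, and all parameter choices are hypothetical, not taken from the study.

    ```python
    import numpy as np
    from scipy.stats import weibull_min

    rng = np.random.default_rng(1)
    doses = rng.weibull(0.8, 500) * 50.0  # hypothetical individual doses

    # Two-parameter MLE: fix the location at zero, estimate shape and scale
    shape, loc, scale = weibull_min.fit(doses, floc=0)

    p99 = weibull_min.ppf(0.99, shape, scale=scale)         # 99th-percentile dose
    limit = 100.0                                           # hypothetical dose limit
    exceedance = weibull_min.sf(limit, shape, scale=scale)  # fraction above the limit
    print(f"shape = {shape:.3f}, scale = {scale:.1f}, "
          f"p99 = {p99:.1f}, exceedance = {exceedance:.4f}")
    ```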

  3. Reliability analysis of structural ceramic components using a three-parameter Weibull distribution

    NASA Technical Reports Server (NTRS)

    Duffy, Stephen F.; Powers, Lynn M.; Starlinger, Alois

    1992-01-01

    Described here are nonlinear regression estimators for the three-parameter Weibull distribution. Issues relating to the bias and invariance associated with these estimators are examined numerically using Monte Carlo simulation methods. The estimators were used to extract parameters from sintered silicon nitride failure data. A reliability analysis was performed on a turbopump blade utilizing the three-parameter Weibull distribution and the estimates from the sintered silicon nitride data.
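
    For readers who want to experiment, a minimal sketch of a three-parameter Weibull fit on synthetic strength data follows; it uses scipy's maximum-likelihood fit, with the threshold estimated as the location parameter, rather than the nonlinear regression estimators developed in the report, and the numbers are illustrative only.

    ```python
    import numpy as np
    from scipy.stats import weibull_min

    rng = np.random.default_rng(0)
    # Synthetic stand-in for threshold-bearing strength data (not the Si3N4 data)
    strengths = 300.0 + rng.weibull(6.0, 60) * 400.0

    # Three-parameter fit: shape, threshold (location), and scale via MLE
    shape, threshold, scale = weibull_min.fit(strengths)
    print(f"shape = {shape:.2f}, threshold = {threshold:.1f}, scale = {scale:.1f}")
    ```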

  4. Statistical analysis of censored motion sickness latency data using the two-parameter Weibull distribution

    NASA Technical Reports Server (NTRS)

    Park, Won J.; Crampton, George H.

    1988-01-01

    The suitability of the two-parameter Weibull distribution for describing highly censored cat motion sickness latency data was evaluated by estimating the parameters with the maximum likelihood method and testing for goodness of fit with the Kolmogorov-Smirnov statistic. A procedure for determining confidence levels and testing for significance of the difference between Weibull parameters is described. Computer programs for these procedures may be obtained from an archival source.
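
    A minimal sketch of the estimation step, maximum-likelihood fitting of a two-parameter Weibull to right-censored latency data, is given below; the synthetic data, the observation cutoff of 25 units, and the starting values are assumptions for illustration, and the goodness-of-fit test is omitted.

    ```python
    import numpy as np
    from scipy.optimize import minimize
    from scipy.stats import weibull_min

    rng = np.random.default_rng(2)
    t = rng.weibull(1.5, 40) * 20.0  # synthetic latencies
    cutoff = 25.0                    # hypothetical end of the observation window
    failed = t < cutoff              # False marks a right-censored trial
    t = np.minimum(t, cutoff)

    def neg_log_lik(params):
        shape, scale = params
        if shape <= 0.0 or scale <= 0.0:
            return np.inf
        # Failures contribute the density; censored cases the survival function
        ll = weibull_min.logpdf(t[failed], shape, scale=scale).sum()
        ll += weibull_min.logsf(t[~failed], shape, scale=scale).sum()
        return -ll

    res = minimize(neg_log_lik, x0=[1.0, float(t.mean())], method="Nelder-Mead")
    print("MLE (shape, scale):", res.x)
    ```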

  5. Competing risk models in reliability systems, a Weibull distribution model with Bayesian analysis approach

    NASA Astrophysics Data System (ADS)

    Iskandar, Ismed; Satria Gondokaryono, Yudi

    2016-02-01

    In reliability theory, the most important problem is to determine the reliability of a complex system from the reliability of its components. The weakness of most reliability theories is that the systems are described and explained as simply functioning or failed. In many real situations, the failures may arise from many causes, depending upon the age and the environment of the system and its components. Another problem in reliability theory is that of estimating the parameters of the assumed failure models. The estimation may be based on data collected over censored or uncensored life tests. In many reliability problems, the failure data are simply quantitatively inadequate, especially in engineering design and maintenance systems. Bayesian analyses are more beneficial than classical ones in such cases. Bayesian estimation allows us to combine past knowledge or experience, in the form of an a priori distribution, with life test data to make inferences about the parameter of interest. In this paper, we investigate the application of Bayesian estimation to competing risk systems. The cases are limited to models with independent causes of failure, using the Weibull distribution as our model. A simulation is conducted for this distribution with the objectives of verifying the models and the estimators and investigating the performance of the estimators for varying sample sizes. The simulation data are analyzed using both Bayesian and maximum likelihood analyses. The simulation results show that changing the true value of one parameter relative to another changes the standard deviation in the opposite direction. With perfect information on the prior distribution, the Bayesian estimation methods perform better than maximum likelihood. The sensitivity analyses show some sensitivity to shifts of the prior locations; they also show the robustness of the Bayesian analysis within the range studied.

  6. Software for Statistical Analysis of Weibull Distributions with Application to Gear Fatigue Data: User Manual with Verification

    NASA Technical Reports Server (NTRS)

    Krantz, Timothy L.

    2002-01-01

    The Weibull distribution has been widely adopted for the statistical description and inference of fatigue data. This document provides user instructions, examples, and verification for software to analyze gear fatigue test data. The software was developed presuming the data are adequately modeled using a two-parameter Weibull distribution. The calculations are based on likelihood methods, and the approach taken is valid for data that include type I censoring. The software was verified by reproducing results published by others.

  7. Packing fraction of particles with a Weibull size distribution

    NASA Astrophysics Data System (ADS)

    Brouwers, H. J. H.

    2016-07-01

    This paper addresses the void fraction of polydisperse particles with a Weibull (or Rosin-Rammler) size distribution. It is demonstrated that the governing parameters of this distribution can be uniquely related to those of the lognormal distribution. Hence, an existing closed-form expression that predicts the void fraction of particles with a lognormal size distribution can be transformed into an expression for Weibull distributions. Both expressions contain the contraction coefficient β. Like the monosized void fraction φ1, it is a physical parameter which depends only on the particles' shape and their state of compaction. Based on a consideration of the scaled binary void contraction, a linear relation for (1 - φ1)β as a function of φ1 is proposed, with proportionality constant B, depending on the state of compaction only. This is validated using computational and experimental packing data concerning random close and random loose packing arrangements. Finally, using this β, the closed-form analytical expression governing the void fraction of Weibull distributions is thoroughly compared with empirical data reported in the literature, and good agreement is found. Furthermore, the present analysis yields an algebraic equation relating the void fraction of monosized particles at different compaction states. This expression appears to be in good agreement with a broad collection of random close and random loose packing data.
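
    The unique relation between the two families is established in the paper itself; as a simpler stand-in, the sketch below matches a lognormal to a Weibull by equating means and variances (ordinary moment matching, which is not necessarily the paper's mapping).

    ```python
    from math import gamma, log, sqrt

    def lognormal_from_weibull(k, lam):
        """Lognormal (mu, sigma) with the same mean and variance as Weibull(k, lam)."""
        mean = lam * gamma(1.0 + 1.0 / k)
        var = lam**2 * (gamma(1.0 + 2.0 / k) - gamma(1.0 + 1.0 / k) ** 2)
        sigma2 = log(1.0 + var / mean**2)
        mu = log(mean) - 0.5 * sigma2
        return mu, sqrt(sigma2)

    print(lognormal_from_weibull(k=2.0, lam=1.0))
    ```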

  8. Packing fraction of particles with a Weibull size distribution.

    PubMed

    Brouwers, H J H

    2016-07-01

    This paper addresses the void fraction of polydisperse particles with a Weibull (or Rosin-Rammler) size distribution. It is demonstrated that the governing parameters of this distribution can be uniquely related to those of the lognormal distribution. Hence, an existing closed-form expression that predicts the void fraction of particles with a lognormal size distribution can be transformed into an expression for Weibull distributions. Both expressions contain the contraction coefficient β. Like the monosized void fraction φ1, it is a physical parameter which depends only on the particles' shape and their state of compaction. Based on a consideration of the scaled binary void contraction, a linear relation for (1 - φ1)β as a function of φ1 is proposed, with proportionality constant B, depending on the state of compaction only. This is validated using computational and experimental packing data concerning random close and random loose packing arrangements. Finally, using this β, the closed-form analytical expression governing the void fraction of Weibull distributions is thoroughly compared with empirical data reported in the literature, and good agreement is found. Furthermore, the present analysis yields an algebraic equation relating the void fraction of monosized particles at different compaction states. This expression appears to be in good agreement with a broad collection of random close and random loose packing data. PMID:27575204

  9. Independent Orbiter Assessment (IOA): Weibull analysis report

    NASA Technical Reports Server (NTRS)

    Raffaelli, Gary G.

    1987-01-01

    The Auxiliary Power Unit (APU) and Hydraulic Power Unit (HPU) Space Shuttle Subsystems were reviewed as candidates for demonstrating the Weibull analysis methodology. Three hardware components were identified as analysis candidates: the turbine wheel, the gearbox, and the gas generator. Detailed review of subsystem level wearout and failure history revealed the lack of actual component failure data. In addition, component wearout data were not readily available or would require a separate data accumulation effort by the vendor. Without adequate component history data being available, the Weibull analysis methodology application to the APU and HPU subsystem group was terminated.

  10. Weibull model of multiplicity distribution in hadron-hadron collisions

    NASA Astrophysics Data System (ADS)

    Dash, Sadhana; Nandi, Basanta K.; Sett, Priyanka

    2016-06-01

    We introduce the use of the Weibull distribution as a simple parametrization of charged particle multiplicities in hadron-hadron collisions at all available energies, ranging from ISR energies to the most recent LHC energies. In statistics, the Weibull distribution has wide applicability to natural processes that involve fragmentation. This provides a natural connection to the available state-of-the-art models for multiparticle production in hadron-hadron collisions, which involve QCD parton fragmentation and hadronization. The Weibull distribution describes the multiplicity data at the most recent LHC energies better than the single negative binomial distribution.

  11. Program for Weibull Analysis of Fatigue Data

    NASA Technical Reports Server (NTRS)

    Krantz, Timothy L.

    2005-01-01

    A Fortran computer program has been written for performing statistical analyses of fatigue-test data that are assumed to be adequately represented by a two-parameter Weibull distribution. This program calculates the following: (1) Maximum-likelihood estimates of the Weibull distribution parameters; (2) Data for contour plots of relative likelihood for two parameters; (3) Data for contour plots of joint confidence regions; (4) Data for the profile likelihood of the Weibull-distribution parameters; (5) Data for the profile likelihood of any percentile of the distribution; and (6) Likelihood-based confidence intervals for parameters and/or percentiles of the distribution. The program can account for tests that are suspended without failure (the statistical term for such suspension of tests is "censoring"). The analytical approach followed in this software is valid for type-I censoring, which is the removal of unfailed units at pre-specified times. Confidence regions and intervals are calculated by use of the likelihood-ratio method.
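
    Items (4) and (6) can be reproduced in a few lines for complete (uncensored) samples: the sketch below computes the profile likelihood of the shape parameter, concentrating out the scale in closed form, and reads off a likelihood-ratio confidence interval. It is an illustrative stand-in with synthetic data, not the Fortran program itself.

    ```python
    import numpy as np
    from scipy.stats import weibull_min, chi2

    rng = np.random.default_rng(3)
    lives = rng.weibull(2.0, 30) * 100.0  # synthetic uncensored fatigue lives

    def profile_ll(shape):
        # For complete samples the scale MLE at fixed shape is closed form:
        # scale**shape = mean(t**shape)
        scale = np.mean(lives ** shape) ** (1.0 / shape)
        return weibull_min.logpdf(lives, shape, scale=scale).sum()

    shapes = np.linspace(1.0, 4.0, 400)
    prof = np.array([profile_ll(b) for b in shapes])
    cutoff = prof.max() - 0.5 * chi2.ppf(0.95, df=1)  # likelihood-ratio threshold
    inside = shapes[prof >= cutoff]
    print(f"shape MLE ~ {shapes[prof.argmax()]:.2f}; "
          f"95% LR interval ~ [{inside.min():.2f}, {inside.max():.2f}]")
    ```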

  12. Modeling observed animal performance using the Weibull distribution.

    PubMed

    Hagey, Travis J; Puthoff, Jonathan B; Crandell, Kristen E; Autumn, Kellar; Harmon, Luke J

    2016-06-01

    To understand how organisms adapt, researchers must link performance and microhabitat. However, measuring performance, especially maximum performance, can sometimes be difficult. Here, we describe an improvement over previous techniques that only consider the largest observed values as maxima. Instead, we model expected performance observations via the Weibull distribution, a statistical approach that reduces the impact of rare observations. After calculating group-level weighted averages and variances by treating individuals separately to reduce pseudoreplication, our approach resulted in high statistical power despite small sample sizes. We fitted lizard adhesive performance and bite force data to the Weibull distribution and found that it closely estimated maximum performance in both cases, illustrating the generality of our approach. Using the Weibull distribution to estimate observed performance greatly improves upon previous techniques by facilitating power analyses and error estimations around robustly estimated maximum values. PMID:26994180

  13. Development of a Weibull posterior distribution by combining a Weibull prior with an actual failure distribution using Bayesian inference

    NASA Technical Reports Server (NTRS)

    Giuntini, Michael E.; Giuntini, Ronald E.

    1991-01-01

    A Bayesian inference process for system logistical planning is presented which provides a method for incorporating actual failures with prediction data for ongoing and improving reliability estimates. The process uses the Weibull distribution and provides a means for examining and updating logistical and maintenance support needs.

  14. The Weibull distribution applied to post and core failure.

    PubMed

    Huysmans, M C; Van Der Varst, P G; Peters, M C; Plasschaert, A J

    1992-07-01

    In this study, data on initial failure loads of direct post and core-restored premolar teeth were analyzed using the Weibull distribution. Restorations consisted of a prefabricated titanium alloy post and an amalgam, composite, or glass cermet core buildup in human upper premolar teeth. The specimens were subjected to compressive forces until failure at angles of 10, 45, and 90 degrees to their long axis. The two- and three-parameter Weibull distributions were compared for applicability to the failure load data. For estimation of the parameters of the two-parameter distribution, sigma 0 (reference stress) and m (Weibull modulus), linear regression was used. In this distribution, it is assumed that the third parameter, sigma u (cut-off stress), equals 0. The Maximum Likelihood (MLH) method was used to estimate all three parameters. It was found that the choice of distribution has a strong influence on the estimated values and that the three-parameter distribution best fits the failure loads in this study. Comparisons were made between the failure probability curves as found by MLH estimation for the different core materials and loading angles. The results indicated that the influence of loading angle on the failure mechanism was stronger than that of core material. PMID:1291399

  15. Investigation of Weibull statistics in fracture analysis of cast aluminum

    NASA Technical Reports Server (NTRS)

    Holland, Frederic A., Jr.; Zaretsky, Erwin V.

    1989-01-01

    The fracture strengths of two large batches of A357-T6 cast aluminum coupon specimens were compared by using two-parameter Weibull analysis. The minimum number of these specimens necessary to find the fracture strength of the material was determined. The applicability of three-parameter Weibull analysis was also investigated. A design methodology based on the combination of elementary stress analysis and Weibull statistical analysis is advanced and applied to the design of a spherical pressure vessel shell. The results from this design methodology are compared with results from the applicable ASME pressure vessel code.

  16. A new generalization of Weibull distribution with application to a breast cancer data set

    PubMed Central

    Wahed, Abdus S.; Luong, The Minh; Jeong, Jong-Hyeon

    2011-01-01

    In this article, we propose a new generalization of the Weibull distribution, which incorporates the exponentiated Weibull distribution introduced by Mudholkar and Srivastava [1] as a special case. We refer to the new family of distributions as the beta-Weibull distribution. We investigate the potential usefulness of the beta-Weibull distribution for modeling censored survival data from biomedical studies. Several other generalizations of the standard two-parameter Weibull distribution are compared with regard to maximum likelihood inference of the cumulative incidence function, under the setting of competing risks. These Weibull-based parametric models are fit to a breast cancer dataset from the National Surgical Adjuvant Breast and Bowel Project (NSABP). In terms of statistical significance of the treatment effect and model adequacy, all generalized models lead to similar conclusions, suggesting that the beta-Weibull family is a reasonable candidate for modeling survival data. PMID:19424958

  17. Tensile strength of randomly perforated aluminum plates: Weibull distribution parameters

    NASA Astrophysics Data System (ADS)

    Klein, Claude A.

    2008-07-01

    Recently, Yanay and collaborators [J. Appl. Phys. 101, 104911 (2007)] addressed issues regarding the fracture strength of randomly perforated aluminum plates subjected to tensile loads. Based on comprehensive measurements and computational simulations, they formulate statistical predictions for the dependence of tensile strength on hole density but conclude that their data are inadequate for the purpose of deriving the strength distribution function. The primary purpose of this contribution is to demonstrate that, on dividing the totality of applicable data into seven "bins" of comparable population, the strength distribution of perforated plates of similar hole density obeys a conventional two-parameter Weibull model. Furthermore, on examining the fracture stresses as recorded in the vicinity of the percolation threshold, we find that the strength obeys the expression σ = σ0(Pth − P)^β, with Pth ≃ 0.64 and β ≃ 0.4. In this light, and taking advantage of percolation theory, we formulate equations that specify how the two Weibull parameters (characteristic strength and shape factor) depend on the hole density. This enables us to express the failure probability as a function of the tensile stress over the entire range of hole densities, i.e., from P = 0.02 up to the percolation threshold.

  18. Kinetic Analysis of Isothermal Decomposition Process of Sodium Bicarbonate Using the Weibull Probability Function—Estimation of Density Distribution Functions of the Apparent Activation Energies

    NASA Astrophysics Data System (ADS)

    Janković, Bojan

    2009-10-01

    The decomposition process of sodium bicarbonate (NaHCO3) has been studied by thermogravimetry in isothermal conditions at four different operating temperatures (380 K, 400 K, 420 K, and 440 K). It was found that the experimental integral and differential conversion curves at the different operating temperatures can be successfully described by the isothermal Weibull distribution function with a unique value of the shape parameter (β = 1.07). It was also established that the Weibull distribution parameters (β and η) are independent of the operating temperature. Using the integral and differential (Friedman) isoconversional methods, in the conversion (α) range of 0.20 ≤ α ≤ 0.80, the apparent activation energy (Ea) value was approximately constant (Ea,int = 95.2 kJ mol^-1 and Ea,diff = 96.6 kJ mol^-1, respectively). The values of Ea calculated by both isoconversional methods are in good agreement with the value of Ea evaluated from the Arrhenius equation (94.3 kJ mol^-1), which was expressed through the scale distribution parameter (η). The Málek isothermal procedure was used for estimation of the kinetic model for the investigated decomposition process. It was found that the two-parameter Šesták-Berggren (SB) autocatalytic model best describes the NaHCO3 decomposition process, with the conversion function f(α) = α^0.18 (1 − α)^1.19. It was also concluded that the calculated density distribution functions of the apparent activation energies (ddfEa's) do not depend on the operating temperature and exhibit highly symmetrical behavior (shape factor = 1.00). The obtained isothermal decomposition results were compared with corresponding results for the nonisothermal decomposition process of NaHCO3.
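
    The step from the scale parameter to an activation energy is ordinary Arrhenius regression: if β is temperature independent, the rate constant can be identified with 1/η, so ln(1/η) versus 1/T is a straight line of slope -Ea/R. The η values in the sketch below are hypothetical placeholders, not the paper's data.

    ```python
    import numpy as np

    R = 8.314                                      # J mol^-1 K^-1
    T = np.array([380.0, 400.0, 420.0, 440.0])     # operating temperatures, K
    eta = np.array([1200.0, 520.0, 250.0, 130.0])  # hypothetical scale parameters, s

    # Arrhenius: k = 1/eta = A exp(-Ea/(R T))  =>  ln(1/eta) = ln A - Ea/(R T)
    slope, intercept = np.polyfit(1.0 / T, np.log(1.0 / eta), 1)
    print(f"Ea = {-slope * R / 1000.0:.1f} kJ/mol, A = {np.exp(intercept):.3g} s^-1")
    ```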

  1. Lower bound on reliability for Weibull distribution when shape parameter is not estimated accurately

    NASA Technical Reports Server (NTRS)

    Huang, Zhaofeng; Porter, Albert A.

    1991-01-01

    The mathematical relationships between the shape parameter Beta and estimates of reliability and a life limit lower bound for the two parameter Weibull distribution are investigated. It is shown that under rather general conditions, both the reliability lower bound and the allowable life limit lower bound (often called a tolerance limit) have unique global minimums over a range of Beta. Hence lower bound solutions can be obtained without assuming or estimating Beta. The existence and uniqueness of these lower bounds are proven. Some real data examples are given to show how these lower bounds can be easily established and to demonstrate their practicality. The method developed here has proven to be extremely useful when using the Weibull distribution in analysis of no-failure or few-failures data. The results are applicable not only in the aerospace industry but anywhere that system reliabilities are high.

  2. Lower bound on reliability for Weibull distribution when shape parameter is not estimated accurately

    NASA Technical Reports Server (NTRS)

    Huang, Zhaofeng; Porter, Albert A.

    1990-01-01

    The mathematical relationships between the shape parameter Beta and estimates of reliability and a life limit lower bound for the two parameter Weibull distribution are investigated. It is shown that under rather general conditions, both the reliability lower bound and the allowable life limit lower bound (often called a tolerance limit) have unique global minimums over a range of Beta. Hence lower bound solutions can be obtained without assuming or estimating Beta. The existence and uniqueness of these lower bounds are proven. Some real data examples are given to show how these lower bounds can be easily established and to demonstrate their practicality. The method developed here has proven to be extremely useful when using the Weibull distribution in analysis of no-failure or few-failures data. The results are applicable not only in the aerospace industry but anywhere that system reliabilities are high.

  3. A comparison of the generalized gamma and exponentiated Weibull distributions.

    PubMed

    Cox, Christopher; Matheson, Matthew

    2014-09-20

    This paper provides a comparison of the three-parameter exponentiated Weibull (EW) and generalized gamma (GG) distributions. The connection between these two different families is that the hazard functions of both have the four standard shapes (increasing, decreasing, bathtub, and arc shaped), and in fact, the shape of the hazard is the same for identical values of the three parameters. For a given EW distribution, we define a matching GG using simulation and also by matching the 5th, 50th, and 95th percentiles. We compare EW and matching GG distributions graphically and using the Kullback-Leibler distance. We find that the survival functions for the EW and matching GG are graphically indistinguishable, and only the hazard functions can sometimes be seen to be slightly different. The Kullback-Leibler distances are very small and decrease with increasing sample size. We conclude that the similarity between the two distributions is striking, and therefore, the EW represents a convenient alternative to the GG with the identical richness of hazard behavior. More importantly, these results suggest that having the four basic hazard shapes may to some extent be an important structural characteristic of any family of distributions. PMID:24700647
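
    Both families, and the Kullback-Leibler comparison, are available directly in scipy; the sketch below computes KL(EW || GG) by quadrature for one illustrative parameter pair, not the matched pairs constructed in the paper.

    ```python
    import numpy as np
    from scipy import stats
    from scipy.integrate import quad

    # Illustrative parameter choices, not the matched pairs from the paper
    ew = stats.exponweib(a=2.0, c=1.5)  # exponentiated Weibull
    gg = stats.gengamma(a=2.0, c=1.2)   # generalized gamma

    def kl_divergence(p, q):
        """KL(p || q) by numerical quadrature over the positive half-line."""
        def integrand(x):
            fp, fq = p.pdf(x), q.pdf(x)
            return fp * (np.log(fp) - np.log(fq)) if fp > 0.0 and fq > 0.0 else 0.0
        value, _ = quad(integrand, 1e-9, np.inf, limit=200)
        return value

    print(f"KL(EW || GG) = {kl_divergence(ew, gg):.4f}")
    ```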

  4. Weibull analysis applied to the pull adhesion test and fracture of a metal-ceramic interface

    SciTech Connect

    Erck, R.A.; Nichols, F.A.; Schult, D.L.

    1992-11-01

    Various adhesion tests have been developed to measure the mechanical bonding of thin coatings deposited on substrates. In the pull test, pins that have been bonded to the coating under test are pulled with increasing force normal to the coating until the coating is pulled from the substrate. For many systems, large scatter in the data is often observed due to uncontrolled defects in the interface and the brittle nature of the pull test. In this study, the applicability of Weibull statistics to the analysis of adhesion of Ag films to vacuum sputter-cleaned zirconia was examined. Data were obtained for smooth and rough substrates for various levels of adhesion. A good fit of the data to the Weibull distribution was observed. The Weibull modulus was found to depend on the roughness of the substrate, but was insensitive to the adhesion strength.

  5. Large-Scale Weibull Analysis of H-451 Nuclear- Grade Graphite Specimen Rupture Data

    NASA Technical Reports Server (NTRS)

    Nemeth, Noel N.; Walker, Andrew; Baker, Eric H.; Murthy, Pappu L.; Bratton, Robert L.

    2012-01-01

    A Weibull analysis was performed of the strength distribution and size effects for 2000 specimens of H-451 nuclear-grade graphite. The data, generated elsewhere, measured the tensile and four-point-flexure room-temperature rupture strength of specimens excised from a single extruded graphite log. Strength variation was compared with specimen location, size, and orientation relative to the parent body. In our study, data were progressively and extensively pooled into larger data sets to discriminate overall trends from local variations and to investigate the strength distribution. The CARES/Life and WeibPar codes were used to investigate issues regarding the size effect, Weibull parameter consistency, and nonlinear stress-strain response. Overall, the Weibull distribution described the behavior of the pooled data very well. However, the issue regarding the smaller-than-expected size effect remained. This exercise illustrated that a conservative approach using a two-parameter Weibull distribution is best for designing graphite components with low probability of failure for the in-core structures in the proposed Generation IV (Gen IV) high-temperature gas-cooled nuclear reactors. This exercise also demonstrated the continuing need to better understand the mechanisms driving stochastic strength response. Extensive appendixes are provided with this report to show all aspects of the rupture data and analytical results.

  6. Least Squares Best Fit Method for the Three Parameter Weibull Distribution: Analysis of Tensile and Bend Specimens with Volume or Surface Flaw Failure

    NASA Technical Reports Server (NTRS)

    Gross, Bernard

    1996-01-01

    Material characterization parameters obtained from naturally flawed specimens are necessary for reliability evaluation of non-deterministic advanced ceramic structural components. The least squares best fit method is applied to the three parameter uniaxial Weibull model to obtain the material parameters from experimental tests on volume or surface flawed specimens subjected to pure tension, pure bending, four point or three point loading. Several illustrative example problems are provided.

  7. A Weibull distribution with power-law tails that describes the first passage time processes of foreign currency exchanges

    NASA Astrophysics Data System (ADS)

    Sazuka, Naoya; Inoue, Jun-Ichi

    2007-03-01

    A Weibull distribution with power-law tails is confirmed as a good candidate to describe the first passage time process of foreign currency exchange rates. The Lorenz curve and the corresponding Gini coefficient for a Weibull distribution are derived analytically. We show that the coefficient is in good agreement with the same quantity calculated from the empirical data. We also calculate the average waiting time, which is an important measure of how long customers must wait until the next price change after they log in to their computer systems. By assuming that the first passage time distribution changes its shape from the Weibull to the power law at some critical time, we evaluate the average waiting time by means of the renewal-reward theorem. We find that our correction of the tails of the distribution makes the average waiting time much closer to the value obtained from empirical data analysis. We also discuss the deviation from the estimated average waiting time by deriving the waiting time distribution directly. These results lead us to conclude that the first passage process of foreign currency exchange rates is well described by a Weibull distribution with power-law tails.
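
    For the two-parameter Weibull, the Gini coefficient has the well-known closed form G = 1 - 2^(-1/β), independent of the scale; the sketch below checks this numerically against the identity G = (1/μ)∫F(x)(1 - F(x))dx, which holds for nonnegative variables (parameter values are illustrative).

    ```python
    import numpy as np
    from math import gamma
    from scipy.integrate import quad
    from scipy.stats import weibull_min

    k, lam = 1.5, 1.0  # illustrative Weibull shape and scale
    mean = lam * gamma(1.0 + 1.0 / k)

    # For a nonnegative variable, Gini = (1/mean) * integral of F(x)(1 - F(x)) dx
    integrand = lambda x: (weibull_min.cdf(x, k, scale=lam)
                           * weibull_min.sf(x, k, scale=lam))
    gini_numeric = quad(integrand, 0.0, np.inf)[0] / mean

    gini_closed = 1.0 - 2.0 ** (-1.0 / k)  # closed form, independent of the scale
    print(f"numeric = {gini_numeric:.6f}, closed form = {gini_closed:.6f}")
    ```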

  8. Predictive Failure of Cylindrical Coatings Using Weibull Analysis

    NASA Technical Reports Server (NTRS)

    Vlcek, Brian L.; Hendricks, Robert C.; Zaretsky, Erwin V.

    2002-01-01

    Rotating, coated wiping rollers used in a high-speed printing application failed primarily from fatigue. Two coating materials were evaluated: a hard, cross-linked, plasticized polyvinyl chloride (PVC) and a softer, plasticized PVC. A total of 447 tests was conducted with these coatings in a production facility. The data were evaluated using Weibull analysis. The softer coating produced more than twice the life of the harder cross-linked coating and reduced the wiper replacement rate by two-thirds, resulting in minimum production interruption.

  9. Numerical approach for the evaluation of Weibull distribution parameters for hydrologic purposes

    NASA Astrophysics Data System (ADS)

    Pierleoni, A.; Di Francesco, S.; Biscarini, C.; Manciola, P.

    2016-06-01

    In hydrology, the statistical description of low-flow phenomena is very important for evaluating the available water resource, especially in a river, and the related values can obviously be considered random variables; therefore, probability distributions dealing with extreme values (maximum and/or minimum) of the variable play a fundamental role. Computational procedures for the estimation of the parameters featuring these distributions are very useful, especially when embedded into analysis software [1][2] or as standalone applications. In this paper a computational procedure for the evaluation of the Weibull [3] distribution is presented, focusing on the case when the lower limit of the distribution is not known or not set to a specific value a priori. The procedure takes advantage of the Gumbel [4] moment approach to the problem.

  10. Composite Weibull-Inverse Transformed Gamma distribution and its actuarial application

    NASA Astrophysics Data System (ADS)

    Maghsoudi, Mastoureh; Bakar, Shaiful Anuar Abu; Hamzah, Nor Aishah

    2014-07-01

    This paper introduces a new composite model, namely, the composite Weibull-Inverse Transformed Gamma distribution, which assumes a Weibull distribution for the head up to a specified threshold and an inverse transformed gamma distribution beyond it. The closed form of the probability density function (pdf) as well as the estimation of parameters by the maximum likelihood method is presented. The model is compared with several benchmark distributions and their performances are measured. A well-known data set, the Danish fire loss data, is used for this purpose and its Value at Risk (VaR) under the new model is computed. In comparison to several standard models, the composite Weibull-Inverse Transformed Gamma model proved to be a competitive candidate.

  11. Characterizing size dependence of ceramic-fiber strength using modified Weibull distribution

    SciTech Connect

    Zhu, Yuntian; Blumenthal, W.R.

    1995-05-01

    The strengths of ceramic fibers have been observed to increase with decreasing fiber diameter and length. The traditional single-modal Weibull distribution function can only take into account one type of flaw, which makes it inappropriate to characterize the strength dependence of both the diameter and the length since ceramic fibers usually have both volume and surface flaws which affect the strength dependence in different ways. Although the bi-modal Weibull distribution can be used to characterize both volume and surface flaws, the mathematical difficulty in its application makes it undesirable. In this paper, the factors affecting fiber strength are analyzed in terms of fracture mechanics and flaw formation. A modified Weibull distribution function is proposed to characterize both the diameter dependence and the length dependence of ceramic fibers.

  12. Weibull statistical analysis of Krouse type bending fatigue of nuclear materials

    NASA Astrophysics Data System (ADS)

    Haidyrah, Ahmed S.; Newkirk, Joseph W.; Castaño, Carlos H.

    2016-03-01

    A bending fatigue mini-specimen (Krouse-type) was used to study the fatigue properties of nuclear materials. The objective of this paper is to study fatigue of Grade 91 ferritic-martensitic steel using a mini-specimen (Krouse-type) suitable for reactor irradiation studies. These mini-specimens are similar in design (but smaller) to those described in the ASTM B593 standard. The mini-specimen was machined by waterjet and tested as received. The bending fatigue machine was modified to test the mini-specimen with a specially designed adapter. The cyclic bending fatigue behavior of Grade 91 was studied under constant deflection, and the S-N curve was created and the mean fatigue life analyzed. In this study, Weibull failure probabilities were estimated at stress levels ranging from high to low (563, 310, and 265 MPa). The commercial software Minitab 17 was used to calculate the distribution of fatigue life under different stress levels. Both two- and three-parameter Weibull analyses were used to estimate the probability of failure. The plots indicated that the three-parameter Weibull distribution fits the data well.

  13. A new, reliable, and simple-to-use method for the analysis of a population of values of a random variable using the Weibull probability distribution: application to acrylic bone cement fatigue results.

    PubMed

    Janna, Sied; Dwiggins, David P; Lewis, Gladius

    2005-01-01

    In cases where the Weibull probability distribution is being investigated as a possible fit to experimentally obtained results of a random variable (V), there is currently no accurate and reliable but simple-to-use method available for simultaneously (a) establishing whether the fit is of the two- or three-parameter variant of the distribution, and/or (b) estimating the minimum value of the variable (V0), in cases where the three-parameter variant is shown to be applicable. In the present work, the details of such a method, which uses a simple nonlinear regression analysis, are presented, together with results of its use when applied to four sets of number-of-cycles-to-fracture results from fatigue tests, performed in our laboratory, using specimens fabricated from three different acrylic bone cement formulations. The key result of the method is that the two- or three-parameter variant of the probability distribution is applicable if the estimate of V0 obtained is less than or greater than zero, respectively. PMID:16179755

  14. Flexural strength of sapphire: Weibull statistical analysis of stressed area, surface coating, and polishing procedure effects

    NASA Astrophysics Data System (ADS)

    Klein, Claude A.

    2004-09-01

    The results of fracture testing are usually reported in terms of a measured strength, σM = σ̄i ± Δσ̄i, where σ̄i is the average of the recorded peak stresses at failure and Δσ̄i represents the standard deviation. This "strength" does not provide an objective measure of the intrinsic strength, since σM depends on the test method and the size of the volume or surface subjected to tensile stresses. We first clarify issues relating to Weibull's theory of brittle fracture and then make use of the theory to assess the results of equibiaxial flexure testing carried out on a variety of sapphire specimens at three mechanical test facilities. Specifically, we describe the failure probability distribution in terms of a characteristic strength σC, i.e., the effective strength of a uniformly stressed 1-cm2 area, which allows us to predict the average stress at failure of a uniformly loaded "window" if the Weibull modulus m is available. A Weibull statistical analysis of biaxial-flexure strength data thus amounts to obtaining the parameters σC and m, which is best done by directly fitting estimated cumulative failure probabilities to the appropriate expression derived from Weibull's theory. We demonstrate that: (a) measurements performed on sapphire test specimens originating from two suppliers confirm the applicability of the area scaling law; for mechanically polished c- and r-plane sapphire, we obtain σC ≃ 975 MPa, m = 3.40 and σC ≃ 550 MPa, m = 4.10, respectively; (b) strongly adhering compressive coatings can augment the characteristic strength by as much as 60%, in accord with predictions based on fracture-mechanics considerations, but degrade the Weibull modulus, which mitigates the benefit of this approach; and (c) measurements performed at 600°C on chemomechanically polished c-plane test specimens indicate that proper procedures may enhance the characteristic strength by as much as 150%, with no apparent degradation of the Weibull modulus.
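
    The practical content of the characteristic-strength formulation is area scaling: for a uniformly stressed area A (in cm2), Pf = 1 - exp[-A(σ/σC)^m], so the median strength falls off as A^(-1/m). The sketch below evaluates this with the c-plane values quoted above (σC ≃ 975 MPa, m = 3.40); treating the loaded area as uniformly stressed is the simplifying assumption.

    ```python
    import numpy as np

    def failure_probability(stress_mpa, area_cm2, sigma_c=975.0, m=3.40):
        """Weibull failure probability of a uniformly stressed area, referred
        to the 1-cm^2 characteristic strength (c-plane sapphire values above)."""
        return 1.0 - np.exp(-area_cm2 * (stress_mpa / sigma_c) ** m)

    for area in (0.1, 1.0, 10.0):
        s50 = 975.0 * (np.log(2.0) / area) ** (1.0 / 3.40)  # stress at Pf = 50%
        print(f"A = {area:5.1f} cm^2: Pf = 50% at {s50:6.1f} MPa "
              f"(check: Pf = {failure_probability(s50, area):.3f})")
    ```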

  15. An EOQ Model with Two-Parameter Weibull Distribution Deterioration and Price-Dependent Demand

    ERIC Educational Resources Information Center

    Mukhopadhyay, Sushanta; Mukherjee, R. N.; Chaudhuri, K. S.

    2005-01-01

    An inventory replenishment policy is developed for a deteriorating item and price-dependent demand. The rate of deterioration is taken to be time-proportional and the time to deterioration is assumed to follow a two-parameter Weibull distribution. A power law form of the price dependence of demand is considered. The model is solved analytically…

  16. Weibull-distributed dyke thickness reflects probabilistic character of host-rock strength.

    PubMed

    Krumbholz, Michael; Hieronymus, Christoph F; Burchardt, Steffi; Troll, Valentin R; Tanner, David C; Friese, Nadine

    2014-01-01

    Magmatic sheet intrusions (dykes) constitute the main form of magma transport in the Earth's crust. The size distribution of dykes is a crucial parameter that controls volcanic surface deformation and eruption rates and is required to realistically model volcano deformation for eruption forecasting. Here we present statistical analyses of 3,676 dyke thickness measurements from different tectonic settings and show that dyke thickness consistently follows the Weibull distribution. Known from materials science, power law-distributed flaws in brittle materials lead to Weibull-distributed failure stress. We therefore propose a dynamic model in which dyke thickness is determined by variable magma pressure that exploits differently sized host-rock weaknesses. The observed dyke thickness distributions are thus site-specific because rock strength, rather than magma viscosity and composition, exerts the dominant control on dyke emplacement. Fundamentally, the strength of geomaterials is scale-dependent and should be approximated by a probability distribution. PMID:24513695

  17. Weibull-distributed dyke thickness reflects probabilistic character of host-rock strength

    PubMed Central

    Krumbholz, Michael; Hieronymus, Christoph F.; Burchardt, Steffi; Troll, Valentin R.; Tanner, David C.; Friese, Nadine

    2014-01-01

    Magmatic sheet intrusions (dykes) constitute the main form of magma transport in the Earth’s crust. The size distribution of dykes is a crucial parameter that controls volcanic surface deformation and eruption rates and is required to realistically model volcano deformation for eruption forecasting. Here we present statistical analyses of 3,676 dyke thickness measurements from different tectonic settings and show that dyke thickness consistently follows the Weibull distribution. Known from materials science, power law-distributed flaws in brittle materials lead to Weibull-distributed failure stress. We therefore propose a dynamic model in which dyke thickness is determined by variable magma pressure that exploits differently sized host-rock weaknesses. The observed dyke thickness distributions are thus site-specific because rock strength, rather than magma viscosity and composition, exerts the dominant control on dyke emplacement. Fundamentally, the strength of geomaterials is scale-dependent and should be approximated by a probability distribution. PMID:24513695

  18. A modified Weibull model for tensile strength distribution of carbon nanotube fibers with strain rate and size effects

    NASA Astrophysics Data System (ADS)

    Sun, Gengzhi; Pang, John H. L.; Zhou, Jinyuan; Zhang, Yani; Zhan, Zhaoyao; Zheng, Lianxi

    2012-09-01

    Fundamental studies on the effects of strain rate and size on the distribution of tensile strength of carbon nanotube (CNT) fibers are reported in this paper. Experimental data show that the mechanical strength of CNT fibers increases from 0.2 to 0.8 GPa as the strain rate increases from 0.00001 to 0.1 (1/s). In addition, the influence of fiber diameter at low and high strain rate conditions was investigated further with statistical analysis. A modified Weibull distribution model for characterizing the tensile strength distribution of CNT fibers taking into account the effect of strain rate and fiber diameter is proposed.

  19. Standard practice for reporting uniaxial strength data and estimating Weibull distribution parameters for advanced ceramics

    NASA Astrophysics Data System (ADS)

    1994-04-01

    This practice covers the evaluation and subsequent reporting of uniaxial strength data and the estimation of probability distribution parameters for advanced ceramics that fail in a brittle fashion. The failure strength of advanced ceramics is treated as a continuous random variable. Typically, a number of test specimens with well-defined geometry are failed under well-defined isothermal loading conditions. The load at which each specimen fails is recorded. The resulting failure stresses are used to obtain parameter estimates associated with the underlying population distribution. This practice is restricted to the assumption that the distribution underlying the failure strengths is the two parameter Weibull distribution with size scaling. Furthermore, this practice is restricted to test specimens (tensile, flexural, pressurized ring, etc.) that are primarily subjected to uniaxial stress states. Section 8 outlines methods to correct for bias errors in the estimated Weibull parameters and to calculate confidence bounds on those estimates from data sets where all failures originate from a single flaw population (that is, a single failure mode). In samples where failures originate from multiple independent flaw populations (for example, competing failure modes), the methods outlined in Section 8 for bias correction and confidence bounds are not applicable. Measurements of the strength at failure are taken for one of two reasons: either for a comparison of the relative quality of two materials, or the prediction of the probability of failure (or, alternatively, the fracture strength) for a structure of interest. This practice will permit estimates of the distribution parameters that are needed for either.

  20. Bonus-Malus System with the Claim Frequency Distribution is Geometric and the Severity Distribution is Truncated Weibull

    NASA Astrophysics Data System (ADS)

    Santi, D. N.; Purnaba, I. G. P.; Mangku, I. W.

    2016-01-01

    A Bonus-Malus system is said to be optimal if it is financially balanced for insurance companies and fair for policyholders. Previous research on Bonus-Malus systems concerned the determination of the risk premium applied to all of the severity guaranteed by the insurance company. In fact, not all of the severity claimed by the policyholder may be covered by the insurance company. When the insurance company sets a maximum bound on the severity incurred, it is necessary to modify the model of the severity distribution into a bounded severity distribution. In this paper, an optimal Bonus-Malus system whose claim frequency component has a geometric distribution and whose severity component has a truncated Weibull distribution is discussed. The number of claims is considered to follow a Poisson distribution, and the expected number λ is exponentially distributed, so the number of claims has a geometric distribution. The severity, given a parameter θ, is considered to have a truncated exponential distribution, and θ is modelled using the Levy distribution, so the severity has a truncated Weibull distribution.
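
    The frequency side of this construction is easy to verify: mixing a Poisson distribution over an exponentially distributed mean yields a geometric distribution with success probability p = 1/(1 + τ), where τ is the exponential mean. The sketch below checks this by simulation; τ = 2 is an arbitrary choice.

    ```python
    import numpy as np
    from scipy.stats import geom

    rng = np.random.default_rng(4)
    tau = 2.0                            # mean of the exponential mixing distribution
    lam = rng.exponential(tau, 200_000)  # lambda ~ Exp(mean tau)
    n_claims = rng.poisson(lam)          # N | lambda ~ Poisson(lambda)

    # The mixture is geometric with success probability p = 1 / (1 + tau)
    p = 1.0 / (1.0 + tau)
    for k in range(5):
        empirical = np.mean(n_claims == k)
        theoretical = geom.pmf(k + 1, p)  # scipy's geom has support 1, 2, ...
        print(f"P(N={k}): simulated {empirical:.4f}, geometric {theoretical:.4f}")
    ```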

  1. Weibull probability graph paper: a call for standardization

    NASA Astrophysics Data System (ADS)

    Kane, Martin D.

    2001-04-01

    Weibull analysis of tensile strength data is routinely performed to determine the quality of optical fiber. A typical Weibull analysis includes setting up an experiment, testing the samples, plotting and interpreting the data, and performing a statistical analysis. One typical plot that is often included in the analysis is the Weibull probability plot, in which the data are plotted as points on a special type of graph paper known as Weibull probability paper. If the data are drawn from a Weibull distribution, they will fall approximately on a straight line. A search of the literature reveals that many Weibull analyses have been performed on optical fiber, but the associated Weibull probability plots have been drawn incorrectly. In some instances the plots have been shown with the ordinate (Probability) starting from 0% and ending at 100%. This has no physical meaning because the Weibull distribution is continuous and the transformed ordinate is inherently unbounded. This paper will discuss the Weibull probability density function, the proper construction of Weibull probability graph paper, and interpretation of data through analysis of the associated probability plot.
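
    A correctly constructed plot uses the double-log transform of the ordinate, ln(-ln(1 - F)), which is unbounded and therefore never reaches 0% or 100%; the sketch below builds the corresponding coordinates with median-rank plotting positions (Benard's approximation) and synthetic strength data.

    ```python
    import numpy as np

    rng = np.random.default_rng(5)
    strengths = np.sort(rng.weibull(3.0, 25) * 5.0)  # synthetic strengths, GPa
    n = len(strengths)

    # Median-rank plotting positions (Benard's approximation)
    F = (np.arange(1, n + 1) - 0.3) / (n + 0.4)

    # On Weibull paper, y = ln(-ln(1 - F)) versus x = ln(strength) is linear with
    # slope m; note y is unbounded, so the ordinate never reaches 0% or 100%.
    x, y = np.log(strengths), np.log(-np.log(1.0 - F))
    m, c = np.polyfit(x, y, 1)
    print(f"Weibull modulus m = {m:.2f}, scale = {np.exp(-c / m):.2f} GPa")
    ```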

  2. Reliability Evaluation Method with Weibull Distribution for Temporary Overvoltages of Substation Equipment

    NASA Astrophysics Data System (ADS)

    Okabe, Shigemitsu; Tsuboi, Toshihiro; Takami, Jun

    Power-frequency withstand voltage tests on electric power equipment are regulated in JEC by evaluating the lifetime reliability with a Weibull distribution function. The evaluation method is still controversial in terms of how a plural number of faults is considered, and some alternative methods have been proposed on this subject. The present paper first discusses the physical meanings of the various evaluation methods and secondly examines their effects on the power-frequency withstand voltage tests. Further, an appropriate method is investigated for an oil-filled transformer and a gas-insulated switchgear, taking note of the dielectric breakdown or partial discharge mechanisms under various insulating material and structural conditions; the tentative conclusion is that the conventional method would be most pertinent under the present conditions.

  3. Flexural strength of infrared-transmitting window materials: bimodal Weibull statistical analysis

    NASA Astrophysics Data System (ADS)

    Klein, Claude A.

    2011-02-01

    The results of flexural strength testing performed on brittle materials are usually interpreted in light of a "Weibull plot," i.e., by fitting the estimated cumulative failure probability (CFP) to a linearized semiempirical Weibull distribution. This procedure ignores the impact of the testing method on the measured stresses at fracture, specifically the stressed area and the stress profile, thus resulting in inadequate characterization of the material under investigation. In a previous publication, the author reformulated Weibull's statistical theory of fracture in a manner that emphasizes how the stressed area and the stress profile control the failure probability distribution, which led to the concept of a characteristic strength, that is, the effective strength of a 1-cm2 uniformly stressed area. Fitting the CFP of IR-transmitting materials (AlON, fusion-cast CaF2, oxyfluoride glass, fused SiO2, CVD-ZnSe, and CVD-ZnS) was performed by means of nonlinear regressions but produced evidence of slight, systematic deviations. The purpose of this contribution is to demonstrate that upon extending the previously elaborated model to distributions involving two distinct types of defects, i.e., bimodal distributions, the fit agrees with estimated CFPs. Furthermore, the availability of two sets of statistical parameters (characteristic strength and shape parameter) can be taken advantage of to evaluate the failure-probability density, thus providing means of assessing the nature, the critical size, and the size distribution of surface/subsurface flaws.
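
    One common way to write a bimodal (two-flaw-population) cumulative failure probability is to let the two survival terms compete, Pf = 1 - exp[-(σ/σ1)^m1 - (σ/σ2)^m2]; the sketch below evaluates this generic form with illustrative parameters and is not the author's exact area-scaled formulation.

    ```python
    import numpy as np

    def bimodal_weibull_cfp(stress, s1, m1, s2, m2):
        """Cumulative failure probability when two independent flaw populations
        compete: the two survival probabilities multiply. Values illustrative."""
        return 1.0 - np.exp(-((stress / s1) ** m1 + (stress / s2) ** m2))

    stress = np.linspace(50.0, 400.0, 8)
    print(bimodal_weibull_cfp(stress, s1=200.0, m1=4.0, s2=320.0, m2=12.0))
    ```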

  4. Probabilistic Analysis for Comparing Fatigue Data Based on Johnson-Weibull Parameters

    NASA Technical Reports Server (NTRS)

    Hendricks, Robert C.; Zaretsky, Erwin V.; Vlcek, Brian L.

    2007-01-01

    Probabilistic failure analysis is essential when analysis of stress-life (S-N) curves is inconclusive in determining the relative ranking of two or more materials. In 1964, L. Johnson published a methodology for establishing the confidence that two populations of data are different. Simplified algebraic equations for confidence numbers were derived based on the original work of L. Johnson. Using the ratios of mean life, the resultant values of confidence numbers deviated less than one percent from those of Johnson. It is possible to rank the fatigue lives of different materials with a reasonable degree of statistical certainty based on combined confidence numbers. These equations were applied to rotating beam fatigue tests that were conducted on three aluminum alloys at three stress levels each. These alloys were AL 2024, AL 6061, and AL 7075. The results were analyzed and compared using ASTM Standard E739-91 and the Johnson-Weibull analysis. The ASTM method did not statistically distinguish between AL 6061 and AL 7075. Based on the Johnson-Weibull analysis confidence numbers greater than 99 percent, AL 2024 was found to have the longest fatigue life, followed by AL 7075, and then AL 6061. The ASTM Standard and the Johnson-Weibull analysis result in the same stress-life exponent p for each of the three aluminum alloys at the median or L(sub 50) lives.

  5. USE OF WEIBULL FUNCTION FOR NON-LINEAR ANALYSIS OF EFFECTS OF LOW LEVELS OF SIMULATED HERBICIDE DRIFT ON PLANTS

    EPA Science Inventory

    We compared two regression models, which are based on the Weibull and probit functions, for the analysis of pesticide toxicity data from laboratory studies on Illinois crop and native plant species. Both mathematical models are continuous, differentiable, strictly positive, and...

  6. Probabilistic Analysis for Comparing Fatigue Data Based on Johnson-Weibull Parameters

    NASA Technical Reports Server (NTRS)

    Vlcek, Brian L.; Hendricks, Robert C.; Zaretsky, Erwin V.

    2013-01-01

    Leonard Johnson published a methodology for establishing the confidence that two populations of data are different. Johnson's methodology is dependent on limited combinations of test parameters (Weibull slope, mean life ratio, and degrees of freedom) and a set of complex mathematical equations. In this report, a simplified algebraic equation for confidence numbers is derived based on the original work of Johnson. The confidence numbers calculated with this equation are compared to those obtained graphically by Johnson. Using the ratios of mean life, the resultant values of confidence numbers at the 99 percent level deviate less than 1 percent from those of Johnson. At a 90 percent confidence level, the calculated values differ between +2 and 4 percent. The simplified equation is used to rank the experimental lives of three aluminum alloys (AL 2024, AL 6061, and AL 7075), each tested at three stress levels in rotating beam fatigue, analyzed using the Johnson-Weibull method, and compared to the ASTM Standard (E739-91) method of comparison. The ASTM Standard did not statistically distinguish between AL 6061 and AL 7075. However, it is possible to rank the fatigue lives of different materials with a reasonable degree of statistical certainty based on combined confidence numbers using the Johnson-Weibull analysis. AL 2024 was found to have the longest fatigue life, followed by AL 7075, and then AL 6061. The ASTM Standard and the Johnson-Weibull analysis result in the same stress-life exponent p for each of the three aluminum alloys at the median, or L(sub 50), lives.

  7. Average capacity for optical wireless communication systems over exponentiated Weibull distribution non-Kolmogorov turbulent channels.

    PubMed

    Cheng, Mingjian; Zhang, Yixin; Gao, Jie; Wang, Fei; Zhao, Fengsheng

    2014-06-20

    We model the average channel capacity of optical wireless communication systems for cases of weak to strong turbulence channels, using the exponentiated Weibull distribution model. The joint effects of beam wander and spread, pointing errors, atmospheric attenuation, and the spectral index of non-Kolmogorov turbulence on system performance are included. Our results show that the average capacity decreases steeply as the propagation length L changes from 0 to 200 m, and decreases slowly or tends to a stable value as the propagation length L grows beyond 200 m. In the weak turbulence region, increasing the detection aperture improves the average channel capacity, and atmospheric visibility is an important issue affecting the average channel capacity. In the strong turbulence region, an increase in the radius of the detection aperture cannot reduce the effects of atmospheric turbulence on the average channel capacity, and the effect of atmospheric visibility on the channel information capacity can be ignored. The effect of the spectral power exponent on the average channel capacity is greater in the strong turbulence region than in the weak turbulence region. Irrespective of the details determining the turbulent channel, we can say that pointing errors have a significant effect on the average channel capacity of optical wireless communication systems in turbulence channels. PMID:24979434

  8. Accurate bearing remaining useful life prediction based on Weibull distribution and artificial neural network

    NASA Astrophysics Data System (ADS)

    Ben Ali, Jaouher; Chebel-Morello, Brigitte; Saidi, Lotfi; Malinowski, Simon; Fnaiech, Farhat

    2015-05-01

    Accurate remaining useful life (RUL) prediction of critical assets is an important challenge in condition-based maintenance to improve reliability and decrease machine breakdowns and maintenance costs. Bearings are among the most important components in industry; they need to be monitored and their RUL predicted. The challenge of this study is to propose an original feature able to evaluate the health state of bearings and to estimate their RUL by Prognostics and Health Management (PHM) techniques. In this paper, the proposed method is based on the data-driven prognostic approach. The combination of the Simplified Fuzzy Adaptive Resonance Theory Map (SFAM) neural network and the Weibull distribution (WD) is explored. WD is used only in the training phase, to fit the measurements and to avoid areas of fluctuation in the time domain. The SFAM training process takes fitted measurements at the present and previous inspection time points as input, whereas the SFAM testing process uses real measurements at the present and previous inspections. Thanks to its fuzzy learning process, SFAM learns nonlinear time series well. As output, seven classes are defined: a healthy bearing and six states of bearing degradation. In order to find the optimal RUL prediction, a smoothing phase is proposed in this paper. Experimental results show that the proposed method can reliably predict the RUL of rolling element bearings (REBs) based on vibration signals. The proposed prediction approach can be applied to the prognostics of various other mechanical assets.
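
    A minimal sketch of the Weibull smoothing idea, assuming a Weibull-CDF-shaped trend and invented feature values (the authors' exact fitting scheme is not specified in the abstract):

    ```python
    # Minimal sketch (not the authors' code): smooth a noisy bearing health
    # indicator with a Weibull-CDF-shaped trend before classifier training.
    import numpy as np
    from scipy.optimize import curve_fit

    def weibull_trend(t, a, lam, k):
        # a * F(t), with F the two-parameter Weibull CDF: monotone and fluctuation-free
        return a * (1.0 - np.exp(-(t / lam) ** k))

    rng = np.random.default_rng(0)
    t = np.linspace(1, 300, 300)                       # inspection times (hr), assumed
    raw = weibull_trend(t, 3.0, 200.0, 4.0) + rng.normal(0, 0.15, t.size)

    popt, _ = curve_fit(weibull_trend, t, raw, p0=(1.0, 150.0, 2.0))
    smoothed = weibull_trend(t, *popt)                 # input for SFAM-style training
    print("fitted (a, lambda, k):", np.round(popt, 2))
    ```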

  9. Detecting changes in retinal function: Analysis with Non-Stationary Weibull Error Regression and Spatial enhancement (ANSWERS).

    PubMed

    Zhu, Haogang; Russell, Richard A; Saunders, Luke J; Ceccon, Stefano; Garway-Heath, David F; Crabb, David P

    2014-01-01

    Visual fields measured with standard automated perimetry are a benchmark test for determining retinal function in ocular pathologies such as glaucoma. Their monitoring over time is crucial in detecting change in disease course and, therefore, in prompting clinical intervention and defining endpoints in clinical trials of new therapies. However, conventional change detection methods do not take into account non-stationary measurement variability or spatial correlation present in these measures. An inferential statistical model, denoted 'Analysis with Non-Stationary Weibull Error Regression and Spatial enhancement' (ANSWERS), was proposed. In contrast to commonly used ordinary linear regression models, which assume normally distributed errors, ANSWERS incorporates non-stationary variability modelled as a mixture of Weibull distributions. Spatial correlation of measurements was also included into the model using a Bayesian framework. It was evaluated using a large dataset of visual field measurements acquired from electronic health records, and was compared with other widely used methods for detecting deterioration in retinal function. ANSWERS was able to detect deterioration significantly earlier than conventional methods, at matched false positive rates. Statistical sensitivity in detecting deterioration was also significantly better, especially in short time series. Furthermore, the spatial correlation utilised in ANSWERS was shown to improve the ability to detect deterioration, compared to equivalent models without spatial correlation, especially in short follow-up series. ANSWERS is a new efficient method for detecting changes in retinal function. It allows for better detection of change, more efficient endpoints and can potentially shorten the time in clinical trials for new therapies. PMID:24465636

  10. Inferences on the lifetime performance index for Weibull distribution based on censored observations using the max p-value method

    NASA Astrophysics Data System (ADS)

    Lee, Wen-Chuan

    2011-06-01

    In the service (or manufacturing) industries, process capability indices (PCIs) are utilised to assess whether product quality meets the required level, and the lifetime performance index (or larger-the-better PCI) CL is frequently used as a means of measuring product performance, where L is the lower specification limit. This study first uses the max p-value method to select the optimum value of the shape parameter β of the Weibull distribution, which is then treated as known. Second, we construct the maximum likelihood estimator (MLE) of CL based on the type II right-censored sample from the Weibull distribution. The MLE of CL is then utilised to develop a novel hypothesis testing procedure provided that L is known. Finally, we give one practical example to illustrate the use of the testing procedure under a given significance level α.

  11. Improvement in mechanical properties of jute fibres through mild alkali treatment as demonstrated by utilisation of the Weibull distribution model.

    PubMed

    Roy, Aparna; Chakraborty, Sumit; Kundu, Sarada Prasad; Basak, Ratan Kumar; Majumder, Subhasish Basu; Adhikari, Basudam

    2012-03-01

    Chemically modified jute fibres are potentially useful as natural reinforcement in composite materials. Jute fibres were treated with 0.25%-1.0% sodium hydroxide (NaOH) solution for 0.5-48 h. The hydrophilicity, surface morphology, crystallinity index, and thermal and mechanical characteristics of untreated and alkali-treated fibres were studied. The two-parameter Weibull distribution model was applied to deal with the variation in mechanical properties of the natural fibres. Alkali treatment enhanced the tensile strength and elongation at break by 82% and 45%, respectively, but decreased the hydrophilicity by 50.5% and the diameter of the fibres by 37%. PMID:22209134

  12. Detecting Changes in Retinal Function: Analysis with Non-Stationary Weibull Error Regression and Spatial Enhancement (ANSWERS)

    PubMed Central

    Zhu, Haogang; Russell, Richard A.; Saunders, Luke J.; Ceccon, Stefano; Garway-Heath, David F.; Crabb, David P.

    2014-01-01

    Visual fields measured with standard automated perimetry are a benchmark test for determining retinal function in ocular pathologies such as glaucoma. Their monitoring over time is crucial in detecting change in disease course and, therefore, in prompting clinical intervention and defining endpoints in clinical trials of new therapies. However, conventional change detection methods do not take into account non-stationary measurement variability or spatial correlation present in these measures. An inferential statistical model, denoted ‘Analysis with Non-Stationary Weibull Error Regression and Spatial enhancement’ (ANSWERS), was proposed. In contrast to commonly used ordinary linear regression models, which assume normally distributed errors, ANSWERS incorporates non-stationary variability modelled as a mixture of Weibull distributions. Spatial correlation of measurements was also included into the model using a Bayesian framework. It was evaluated using a large dataset of visual field measurements acquired from electronic health records, and was compared with other widely used methods for detecting deterioration in retinal function. ANSWERS was able to detect deterioration significantly earlier than conventional methods, at matched false positive rates. Statistical sensitivity in detecting deterioration was also significantly better, especially in short time series. Furthermore, the spatial correlation utilised in ANSWERS was shown to improve the ability to detect deterioration, compared to equivalent models without spatial correlation, especially in short follow-up series. ANSWERS is a new efficient method for detecting changes in retinal function. It allows for better detection of change, more efficient endpoints and can potentially shorten the time in clinical trials for new therapies. PMID:24465636

  13. An incentive for coordination in a decentralised service chain with a Weibull lifetime distributed facility

    NASA Astrophysics Data System (ADS)

    Lin, Yi-Fang; Yang, Gino K.; Yang, Chyn-Yng; Chu, Tu-Bin

    2013-10-01

    This article deals with a decentralised service chain consisting of a service provider and a facility owner. The revenue allocation and service price are, respectively, determined by the service provider and the facility owner in a non-cooperative manner. To model this decentralised operation, a Stackelberg game between the two parties is formulated. In the mathematical framework, the service system is assumed to be driven by Poisson customer arrivals and exponential service times. The most common log-linear service demand and Weibull facility lifetime are also adopted. Under these analytical conditions, the decentralised decisions in this game are investigated and then a unique optimal equilibrium is derived. Finally, a coordination mechanism is proposed to improve the efficiency of this decentralised system.

  14. Weibull Analysis of Fracture Test Data on Bovine Cortical Bone: Influence of Orientation

    PubMed Central

    Ekwaro-Osire, Stephen

    2013-01-01

    The fracture toughness, KIC, of a cortical bone has been experimentally determined by several researchers. The variation of KIC values arises from the variation of specimen orientation, shape, and size during the experiment. The fracture toughness of a cortical bone is governed by the severest flaw and, hence, may be analyzed using Weibull statistics. To the best of the authors' knowledge, however, no studies of this aspect have been published. The motivation of the study is the evaluation of Weibull parameters at the circumferential-longitudinal (CL) and longitudinal-circumferential (LC) directions. We hypothesized that Weibull parameters vary depending on the bone microstructure. In the present work, a two-parameter Weibull statistical model was applied to calculate the plane-strain fracture toughness of bovine femoral cortical bone obtained using specimens extracted from the CL and LC directions of the bone. It was found that the Weibull modulus of fracture toughness was larger for CL specimens compared to LC specimens, but the opposite trend was seen for the characteristic fracture toughness. The reason for these trends is the difference in microstructure and extrinsic toughening mechanisms between the CL and LC directions of the bone. The Weibull parameters found in this study can be applied to develop a damage-mechanics model for bone. PMID:24385985

  15. Weibull analysis of fracture test data on bovine cortical bone: influence of orientation.

    PubMed

    Khandaker, Morshed; Ekwaro-Osire, Stephen

    2013-01-01

    The fracture toughness, KIC, of a cortical bone has been experimentally determined by several researchers. The variation of KIC values arises from the variation of specimen orientation, shape, and size during the experiment. The fracture toughness of a cortical bone is governed by the severest flaw and, hence, may be analyzed using Weibull statistics. To the best of the authors' knowledge, however, no studies of this aspect have been published. The motivation of the study is the evaluation of Weibull parameters at the circumferential-longitudinal (CL) and longitudinal-circumferential (LC) directions. We hypothesized that Weibull parameters vary depending on the bone microstructure. In the present work, a two-parameter Weibull statistical model was applied to calculate the plane-strain fracture toughness of bovine femoral cortical bone obtained using specimens extracted from the CL and LC directions of the bone. It was found that the Weibull modulus of fracture toughness was larger for CL specimens compared to LC specimens, but the opposite trend was seen for the characteristic fracture toughness. The reason for these trends is the difference in microstructure and extrinsic toughening mechanisms between the CL and LC directions of the bone. The Weibull parameters found in this study can be applied to develop a damage-mechanics model for bone. PMID:24385985

  16. How to do a Weibull statistical analysis of flexural strength data: application to AlON, diamond, zinc selenide, and zinc sulfide

    NASA Astrophysics Data System (ADS)

    Klein, Claude A.; Miller, Richard P.

    2001-09-01

    For the purpose of assessing the strength of engineering ceramics, it is common practice to interpret the measured stresses at fracture in the light of a semi-empirical expression derived from Weibull's theory of brittle fracture, i.e., ln[-ln(1-P)] = -m ln(σ_N) + m ln(σ), where P is the cumulative failure probability, σ is the applied tensile stress, m is the Weibull modulus, and σ_N is the nominal strength. The strength σ_N, however, does not represent a true measure because it depends not only on the test method but also on the size of the volume or the surface subjected to tensile stresses. In this paper we intend to first clarify issues relating to the application of Weibull's theory of fracture and then make use of the theory to assess the results of equibiaxial flexure testing that was carried out on polycrystalline infrared-transmitting materials. These materials are brittle ceramics, which most frequently fail as a consequence of tensile stresses acting on surface flaws. Since equibiaxial flexure testing is the preferred method of measuring the strength of optical ceramics, we propose to formulate the failure-probability equation in terms of a characteristic strength, σ_C, for biaxial loadings, i.e., P = 1 - exp{-π (r_o/cm)^2 [Γ(1+1/m)]^m (σ/σ_C)^m}, where r_o is the radius of the loading ring (in centimeters) and Γ(z) designates the gamma function. A Weibull statistical analysis of equibiaxial strength data thus amounts to obtaining the parameters m and σ_C, which is best done by directly fitting estimated P_i vs. i data to the failure-probability equation; this procedure avoids distorting the distribution through logarithmic linearization and can be implemented by performing a non-linear bivariate regression. Concentric-ring fracture testing performed on five sets of Raytran materials validates the procedure in the sense that the two-parameter model appears to describe the experimental failure
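
    The direct-fitting procedure the authors recommend can be sketched as follows; the ring radius and strength values are invented placeholders:

    ```python
    # Sketch: direct nonlinear fit of estimated failure probabilities to the
    # biaxial failure-probability equation above, avoiding logarithmic
    # linearization. Ring radius and strengths are invented placeholders.
    import numpy as np
    from scipy.optimize import curve_fit
    from scipy.special import gamma

    r0 = 1.0   # loading-ring radius in cm (assumed)

    def failure_prob(sigma, m, sigma_c):
        # P = 1 - exp{-pi r0^2 [Gamma(1 + 1/m)]^m (sigma/sigma_c)^m}
        return 1.0 - np.exp(-np.pi * r0**2 * gamma(1.0 + 1.0 / m) ** m
                            * (sigma / sigma_c) ** m)

    strengths = np.sort(np.array([212., 230., 241., 255., 262., 274., 281.,
                                  293., 304., 318.]))        # MPa, illustrative
    p_est = (np.arange(1, strengths.size + 1) - 0.5) / strengths.size

    (m_hat, sc_hat), _ = curve_fit(failure_prob, strengths, p_est, p0=(10.0, 300.0))
    print(f"m = {m_hat:.1f}, characteristic strength = {sc_hat:.0f} MPa")
    ```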

  17. Reliability growth modeling analysis of the space shuttle main engines based upon the Weibull process

    NASA Technical Reports Server (NTRS)

    Wheeler, J. T.

    1990-01-01

    The Weibull process, identified as the inhomogeneous Poisson process with the Weibull intensity function, is used to model the reliability growth assessment of the space shuttle main engine test and flight failure data. Additional tables of percentage-point probabilities for several different values of the confidence coefficient have been generated for setting (1-α)100-percent two-sided confidence interval estimates on the mean time between failures. The tabled data pertain to two cases: (1) time-terminated testing and (2) failure-terminated testing. The critical values of the three test statistics, namely Cramer-von Mises, Kolmogorov-Smirnov, and chi-square, were calculated and tabled for use in the goodness-of-fit tests for the engine reliability data. Numerical results are presented for five different groupings of the engine data that reflect the actual response to the failures.
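
    For time-terminated testing, the maximum-likelihood estimates of the Weibull intensity parameters have a simple closed form; the sketch below applies them to illustrative failure times (not the engine data):

    ```python
    # Sketch (illustrative failure times, not the engine data): closed-form
    # MLEs for the Weibull (power-law) intensity of a time-terminated test.
    import numpy as np

    t = np.array([42., 95., 180., 310., 490., 760., 1050.])  # cumulative failure times (hr)
    T = 1200.0                                               # total test time

    n = t.size
    beta_hat = n / np.sum(np.log(T / t))     # shape of the Weibull intensity
    lam_hat = n / T ** beta_hat              # scale coefficient
    mtbf_now = 1.0 / (lam_hat * beta_hat * T ** (beta_hat - 1.0))
    print(f"beta = {beta_hat:.2f} (<1 indicates reliability growth); "
          f"current MTBF ~ {mtbf_now:.0f} hr")
    ```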

  18. Prediction of the curing time to achieve maturity of the nano-cement based concrete using the Weibull distribution model: A complementary data set

    PubMed Central

    Jo, Byung Wan; Chakraborty, Sumit; Kim, Heon

    2015-01-01

    This data article provides comparison data for nano-cement based concrete (NCC) and ordinary Portland cement based concrete (OPCC). Concrete samples (OPCC) were fabricated using ten different mix designs, and their characterization data are provided here. Optimization of the curing time using the Weibull distribution model was done by analyzing the rate of change of the compressive strength of the OPCC. Initially, the compressive strength of the OPCC samples was measured after completion of four desired curing times. Thereafter, the curing time required to achieve a particular rate of change of the compressive strength was predicted utilizing the equation derived from the variation of the rate of change of compressive strength with curing time, prior to the optimization of the curing time (at the 99.99% confidence level) using the Weibull distribution model. This data article complements the research article entitled “Prediction of the curing time to achieve maturity of the nano-cement based concrete using the Weibull distribution model” [1]. PMID:26217804

  19. Prediction of the curing time to achieve maturity of the nano-cement based concrete using the Weibull distribution model: A complementary data set.

    PubMed

    Jo, Byung Wan; Chakraborty, Sumit; Kim, Heon

    2015-09-01

    This data article provides comparison data for nano-cement based concrete (NCC) and ordinary Portland cement based concrete (OPCC). Concrete samples (OPCC) were fabricated using ten different mix designs, and their characterization data are provided here. Optimization of the curing time using the Weibull distribution model was done by analyzing the rate of change of the compressive strength of the OPCC. Initially, the compressive strength of the OPCC samples was measured after completion of four desired curing times. Thereafter, the curing time required to achieve a particular rate of change of the compressive strength was predicted utilizing the equation derived from the variation of the rate of change of compressive strength with curing time, prior to the optimization of the curing time (at the 99.99% confidence level) using the Weibull distribution model. This data article complements the research article entitled "Prediction of the curing time to achieve maturity of the nano-cement based concrete using the Weibull distribution model" [1]. PMID:26217804

  20. Quality-Related Monitoring and Grading of Granulated Products by Weibull-Distribution Modeling of Visual Images with Semi-Supervised Learning.

    PubMed

    Liu, Jinping; Tang, Zhaohui; Xu, Pengfei; Liu, Wenzhong; Zhang, Jin; Zhu, Jianyong

    2016-01-01

    The topic of online product quality inspection (OPQI) with smart visual sensors is attracting increasing interest in both the academic and industrial communities on account of the natural connection between the visual appearance of products and their underlying qualities. Visual images captured from granulated products (GPs), e.g., cereal products or fabric textiles, are composed of a large number of independent particles or stochastically stacked, locally homogeneous fragments, whose analysis and understanding remain challenging. A method of image-statistical-modeling-based OPQI for GP quality grading and monitoring by a Weibull distribution (WD) model with a semi-supervised learning classifier is presented. WD-model parameters (WD-MPs) of GP images' spatial structures, obtained with omnidirectional Gaussian derivative filtering (OGDF), which were demonstrated theoretically to obey a specific WD model of integral form, were extracted as the visual features. Then, a co-training-style semi-supervised classifier algorithm, named COSC-Boosting, was exploited for semi-supervised GP quality grading, by integrating two independent classifiers of complementary nature in the face of scarce labeled samples. The effectiveness of the proposed OPQI method was verified in the field of automated rice quality grading, where it was compared with commonly used methods and showed superior performance, laying a foundation for the quality control of GPs on assembly lines. PMID:27367703

  1. Quality-Related Monitoring and Grading of Granulated Products by Weibull-Distribution Modeling of Visual Images with Semi-Supervised Learning

    PubMed Central

    Liu, Jinping; Tang, Zhaohui; Xu, Pengfei; Liu, Wenzhong; Zhang, Jin; Zhu, Jianyong

    2016-01-01

    The topic of online product quality inspection (OPQI) with smart visual sensors is attracting increasing interest in both the academic and industrial communities on account of the natural connection between the visual appearance of products and their underlying qualities. Visual images captured from granulated products (GPs), e.g., cereal products or fabric textiles, are composed of a large number of independent particles or stochastically stacked, locally homogeneous fragments, whose analysis and understanding remain challenging. A method of image-statistical-modeling-based OPQI for GP quality grading and monitoring by a Weibull distribution (WD) model with a semi-supervised learning classifier is presented. WD-model parameters (WD-MPs) of GP images’ spatial structures, obtained with omnidirectional Gaussian derivative filtering (OGDF), which were demonstrated theoretically to obey a specific WD model of integral form, were extracted as the visual features. Then, a co-training-style semi-supervised classifier algorithm, named COSC-Boosting, was exploited for semi-supervised GP quality grading, by integrating two independent classifiers of complementary nature in the face of scarce labeled samples. The effectiveness of the proposed OPQI method was verified in the field of automated rice quality grading, where it was compared with commonly used methods and showed superior performance, laying a foundation for the quality control of GPs on assembly lines. PMID:27367703

  2. SER performance analysis of MPPM FSO system with three decision thresholds over exponentiated Weibull fading channels

    NASA Astrophysics Data System (ADS)

    Wang, Ping; Yang, Bensheng; Guo, Lixin; Shang, Tao

    2015-11-01

    In this work, the symbol error rate (SER) performance of a multiple pulse position modulation (MPPM) based free-space optical (FSO) communication system with three different decision thresholds, namely fixed decision threshold (FDT), optimized decision threshold (ODT) and dynamic decision threshold (DDT), over exponentiated Weibull (EW) fading channels has been investigated in detail. The effects of aperture averaging on each decision threshold under weak-to-strong turbulence conditions are further studied and compared. The closed-form SER expressions for the three thresholds, derived with the help of the generalized Gauss-Laguerre quadrature rule, are verified by Monte Carlo simulations. This work is helpful for the design of receivers for FSO communication systems.

  3. Modeling the reliability and maintenance costs of wind turbines using Weibull analysis

    SciTech Connect

    Vachon, W.A.

    1996-12-31

    A general description is provided of the basic mathematics and use of Weibull statistical models for modeling component failures and maintenance costs as a function of time. The applicability of the model to wind turbine components and subsystems is discussed with illustrative examples of typical component reliabilities drawn from actual field experiences. Example results indicate the dominant role of key subsystems based on a combination of their failure frequency and repair/replacement costs. The value of the model is discussed as a means of defining (1) maintenance practices, (2) areas in which to focus product improvements, (3) spare parts inventory, and (4) long-term trends in maintenance costs as an important element in project cash flow projections used by developers, investors, and lenders. 6 refs., 8 figs., 3 tabs.
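
    A toy version of such a model, with invented parameters: draw Weibull-distributed component lives in a Monte Carlo renewal loop to estimate expected replacements and repair cost over a project horizon.

    ```python
    # Toy sketch (invented parameters): Monte Carlo renewal simulation of a
    # subsystem with Weibull-distributed life, giving expected replacements
    # and repair cost over a project horizon.
    import numpy as np

    rng = np.random.default_rng(1)
    shape, scale_yr = 2.2, 12.0          # assumed Weibull life of one subsystem
    horizon_yr, cost_per_repair = 20.0, 150_000.0

    def replacements(horizon):
        elapsed, count = 0.0, 0
        while True:
            elapsed += scale_yr * rng.weibull(shape)   # draw one component life
            if elapsed > horizon:
                return count
            count += 1

    runs = np.array([replacements(horizon_yr) for _ in range(20_000)])
    print(f"expected replacements: {runs.mean():.2f}; "
          f"expected repair cost: ${runs.mean() * cost_per_repair:,.0f}")
    ```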

  4. Weibull distribution of incipient flaws in basalt material used in high-velocity impact experiments and applications in numerical simulations of small body disruptions

    NASA Astrophysics Data System (ADS)

    Michel, P.; Nakamura, A.

    We measured the Weibull parameters of a specific basalt material, called Yakuno basalt, which has already been used in documented high-velocity impact experiments. The outcomes of these experiments have been widely used to validate numerical codes of fragmentation developed in the context of planetary science. However, the distribution of incipient flaws in the targets, usually characterized by the so-called Weibull parameters, has generally been implemented in the codes with values chosen to match the experimental outcomes, hence the validity of the numerical simulations remains to be assessed with the actual values of these parameters. Here, we follow the original method proposed by Weibull in 1939 to measure these parameters for this Yakuno basalt. We obtain a value of the Weibull modulus (also called the shape parameter) m larger than the one corresponding to simulation fits to the experimental data. The characteristic strength, which corresponds to 63.2% failure of a sample of similar specimens and which defines the second Weibull or scale parameter, is also determined. This parameter appears insensitive to the different loading rates used to make the measurements. A complete database of impact experiments on basalt targets, including both the important initial target parameters and the detailed outcomes of their disruptions, is now at the disposal of numerical codes of fragmentation for validity testing. In the gravity regime, which applies when the small bodies involved are larger than a few hundred meters in size, our numerical simulations have already been successful in reproducing asteroid families, showing that most large fragments from an asteroid disruption consist of gravitational aggregates formed by re-accumulation of smaller fragments during the disruption. Moreover, we found that the outcome depends strongly on the initial internal structure of the bodies involved. Therefore, the knowledge of the actual flaw distribution of the material defining the

  5. Analysis of the fuzzy greatest of CFAR detector in homogeneous and non-homogeneous Weibull clutter

    NASA Astrophysics Data System (ADS)

    Baadeche, Mohamed; Soltani, Faouzi

    2015-12-01

    In this paper, we analyze the distributed FGO-CFAR detector in homogeneous and non-homogeneous Weibull clutter under the assumption of a known shape parameter. The non-homogeneity is modeled by the presence of a clutter edge in the reference window. We derive the membership function that maps the observations to the false alarm space and compute the threshold at the data fusion center. Applying the `Maximum', `Minimum', `Algebraic Sum' and `Algebraic Product' fuzzy rules for the L detectors considered at the data fusion center, we find that the best performance is obtained with the `Algebraic Product' fuzzy rule, followed by the `Minimum' one; in these two cases the probability of detection increases significantly with the number of detectors.
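
    The four fusion rules are the standard fuzzy operators; a minimal sketch applying them to placeholder membership values (not the paper's mapping) for L = 3 detectors:

    ```python
    # Sketch of the four fuzzy fusion rules named above, applied to
    # illustrative false-alarm-space membership values from L = 3 detectors
    # (placeholder numbers, not the paper's mapping).
    import numpy as np

    mu = np.array([0.92, 0.75, 0.88])   # memberships reported by the detectors

    fused = {
        "Maximum": mu.max(),
        "Minimum": mu.min(),
        "Algebraic Sum": 1.0 - np.prod(1.0 - mu),   # a+b-ab, extended to L inputs
        "Algebraic Product": np.prod(mu),
    }
    for rule, value in fused.items():
        print(f"{rule:17s}: {value:.3f}")
    ```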

  6. Estimating Age Distributions of Base Flow in Watersheds Underlain by Single and Dual Porosity Formations Using Groundwater Transport Simulation and Weighted Weibull Functions

    NASA Astrophysics Data System (ADS)

    Sanford, W. E.

    2015-12-01

    Age distributions of base flow to streams are important to estimate for predicting the timing of water-quality responses to changes in distributed inputs of nutrients or pollutants at the land surface. Simple models of shallow aquifers will predict exponential age distributions, but more realistic 3-D stream-aquifer geometries will cause deviations from an exponential curve. In addition, in fractured rock terrains the dual nature of the effective and total porosity of the system complicates the age distribution further. In this study shallow groundwater flow and advective transport were simulated in two regions in the Eastern United States—the Delmarva Peninsula and the upper Potomac River basin. The former is underlain by layers of unconsolidated sediment, while the latter consists of folded and fractured sedimentary rocks. Transport of groundwater to streams was simulated using the USGS code MODPATH within 175 and 275 watersheds, respectively. For the fractured rock terrain, calculations were also performed along flow pathlines to account for exchange between mobile and immobile flow zones. Porosities at both sites were calibrated using environmental tracer data (3H, 3He, CFCs and SF6) in wells and springs, and with a 30-year tritium record from the Potomac River. Carbonate and siliciclastic rocks were calibrated to have mobile porosity values of one and six percent, and immobile porosity values of 18 and 12 percent, respectively. The age distributions were fitted to Weibull functions. Whereas an exponential function has one parameter that controls the median age of the distribution, a Weibull function has an extra parameter that controls the slope of the curve. A weighted Weibull function was also developed that potentially allows for four parameters, two that control the median age and two that control the slope, one of each weighted toward early or late arrival times. For both systems the two-parameter Weibull function nearly always produced a substantially

  7. Weibull-Based Design Methodology for Rotating Aircraft Engine Structures

    NASA Technical Reports Server (NTRS)

    Zaretsky, Erwin; Hendricks, Robert C.; Soditus, Sherry

    2002-01-01

    The NASA Energy Efficient Engine (E(sup 3)-Engine) is used as the basis of a Weibull-based life and reliability analysis. Each component's life, and thus the engine's life, is defined by high-cycle fatigue (HCF) or low-cycle fatigue (LCF). Knowing the cumulative life distribution of each of the components making up the engine, as represented by a Weibull slope, is a prerequisite to predicting the life and reliability of the entire engine. As the engine Weibull slope increases, the predicted lives decrease. The predicted engine lives L(sub 5) (95% probability of survival) of approximately 17,000 and 32,000 hr do correlate with current engine maintenance practices without and with refurbishment, respectively. The individual high pressure turbine (HPT) blade lives necessary to obtain a blade system life L(sub 0.1) (99.9% probability of survival) of 9000 hr for Weibull slopes of 3, 6, and 9 are 47,391, 20,652, and 15,658 hr, respectively. For a design life of the HPT disks having probable points of failure equal to or greater than 36,000 hr at a probability of survival of 99.9%, the predicted disk system life L(sub 0.1) can vary from 9,408 to 24,911 hr.
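
    The series-system arithmetic behind such numbers follows from the two-parameter Weibull reliability function: at matched reliability, lives scale as n^(1/β) for n identical parts in series. In the sketch below the blade count is an assumption, chosen so the output tracks the values quoted above:

    ```python
    # Sketch: series-system Weibull life scaling. At matched reliability,
    # component life = system life * n**(1/beta) for n identical parts in
    # series. n_blades is an assumption chosen for illustration.
    n_blades = 146
    L_sys_01 = 9000.0                     # blade-system L0.1 target (hr)

    for beta in (3, 6, 9):
        blade_L01 = L_sys_01 * n_blades ** (1.0 / beta)
        print(f"Weibull slope {beta}: individual blade L0.1 ~ {blade_L01:,.0f} hr")
    ```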

  8. Weibull Wind-Speed Distribution Parameters Derived from a Combination of Wind-Lidar and Tall-Mast Measurements Over Land, Coastal and Marine Sites

    NASA Astrophysics Data System (ADS)

    Gryning, Sven-Erik; Floors, Rogier; Peña, Alfredo; Batchvarova, Ekaterina; Brümmer, Burghard

    2016-05-01

    Wind-speed observations from tall towers are used in combination with observations up to 600 m in altitude from a Doppler wind lidar to study the long-term conditions over suburban (Hamburg), rural coastal (Høvsøre) and marine (FINO3) sites. The variability in the wind field among the sites is expressed in terms of mean wind speed and Weibull distribution shape-parameter profiles. The consequences of the carrier-to-noise-ratio (CNR) threshold-value choice on the wind-lidar observations are revealed as follows. When the wind-lidar CNR is lower than a prescribed threshold value, the observations are often filtered out, as the uncertainty in the wind-speed measurements increases. For a pulsed heterodyne Doppler lidar, use of the traditional -22 dB CNR threshold value at all measuring levels up to 600 m results in a ≈7% overestimation of the long-term mean wind speed over land, and a ≈12% overestimation in coastal and marine environments. In addition, the height of the profile maximum of the shape parameter of the Weibull distribution (the so-called reversal height) is found to depend on the applied CNR threshold; it is found to be lower at small CNR threshold values. The reversal height is greater in the suburban (high roughness) than in the rural (low roughness) area. In coastal areas the reversal height is lower than that over land and relates to the internal boundary layer that develops downwind from the coastline. Over the sea the shape parameter increases towards the sea surface. A parametrization of the vertical profile of the shape parameter fits well with observations over land, coastal regions and over the sea. An applied model for the dependence of the reversal height on the surface roughness is in good agreement with the observations over land.

  9. Simulation of correlated discrete Weibull variables: A proposal and an implementation in the R environment

    NASA Astrophysics Data System (ADS)

    Barbiero, Alessandro

    2015-12-01

    Researchers in applied sciences are often concerned with multivariate random variables. In particular, multivariate discrete data often arise in many fields (statistical quality control, biostatistics, failure analysis, etc.). Here we consider the discrete Weibull distribution as an alternative to the popular Poisson random variable and propose a procedure for simulating correlated discrete Weibull random variables, with marginal distributions and correlation matrix assigned by the user. The procedure relies upon the Gaussian copula model and an iterative algorithm for recovering the proper correlation matrix for the copula that ensures the desired correlation matrix on the discrete margins. A simulation study is presented, which empirically shows the performance of the procedure.
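
    A minimal sketch of the copula step under stated assumptions (type-I discrete Weibull margins; the paper's iterative correlation correction omitted):

    ```python
    # Sketch of the copula step only (the paper's iterative correction of the
    # copula correlation matrix is omitted; R is used as-is). Margins follow
    # the type-I discrete Weibull with survival function q**(x**beta).
    import numpy as np
    from scipy.stats import norm

    def dweibull_ppf(u, q, beta):
        # Smallest integer x with F(x) = 1 - q**((x+1)**beta) >= u
        x = np.ceil((np.log1p(-u) / np.log(q)) ** (1.0 / beta)) - 1.0
        return np.maximum(x, 0.0).astype(int)

    rng = np.random.default_rng(7)
    R = np.array([[1.0, 0.6], [0.6, 1.0]])             # copula correlation (target)
    z = rng.multivariate_normal(np.zeros(2), R, size=100_000)
    u = norm.cdf(z)                                    # Gaussian-copula uniforms

    x1 = dweibull_ppf(u[:, 0], q=0.8, beta=1.2)        # first margin
    x2 = dweibull_ppf(u[:, 1], q=0.6, beta=0.9)        # second margin
    print("achieved correlation:", np.corrcoef(x1, x2)[0, 1].round(3))
    ```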

  10. Calculation of Weibull strength parameters and Batdorf flaw-density constants for volume- and surface-flaw-induced fracture in ceramics

    NASA Technical Reports Server (NTRS)

    Pai, Shantaram S.; Gyekenyesi, John P.

    1989-01-01

    The calculation of the shape and scale parameters of the two-parameter Weibull distribution is described using the least-squares analysis and maximum likelihood methods for volume- and surface-flaw-induced fracture in ceramics with complete and censored samples. Detailed procedures are given for evaluating 90 percent confidence intervals for maximum likelihood estimates of shape and scale parameters, the unbiased estimates of the shape parameters, and the Weibull mean values and corresponding standard deviations. Furthermore, the necessary steps are described for detecting outliers and for calculating the Kolmogorov-Smirnov and the Anderson-Darling goodness-of-fit statistics and 90 percent confidence bands about the Weibull distribution. It also shows how to calculate the Batdorf flaw-density constants by using the Weibull distribution statistical parameters. The techniques described were verified with several example problems from the open literature and were coded in the Structural Ceramics Analysis and Reliability Evaluation (SCARE) design program.
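
    The two estimation routes named above can be sketched for a complete sample as follows; the simulated strengths, and the use of scipy for the MLE step, are illustrative stand-ins for the SCARE procedures:

    ```python
    # Sketch of the two estimation routes for a complete sample: least squares
    # on the linearized Weibull plot vs. maximum likelihood (scipy, location
    # fixed at zero). The strength data are simulated, not from the report.
    import numpy as np
    from scipy.stats import weibull_min

    rng = np.random.default_rng(3)
    data = np.sort(weibull_min.rvs(8.0, scale=400.0, size=30, random_state=rng))

    # Least squares: ln[-ln(1 - P_i)] = m ln(sigma_i) - m ln(sigma_0)
    p = (np.arange(1, data.size + 1) - 0.5) / data.size    # median-rank estimates
    m_ls, intercept = np.polyfit(np.log(data), np.log(-np.log(1.0 - p)), 1)
    scale_ls = np.exp(-intercept / m_ls)

    m_ml, _, scale_ml = weibull_min.fit(data, floc=0.0)    # maximum likelihood
    print(f"least squares:  m = {m_ls:.2f}, scale = {scale_ls:.0f} MPa")
    print(f"max likelihood: m = {m_ml:.2f}, scale = {scale_ml:.0f} MPa")
    ```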

  11. Robust Fitting of a Weibull Model with Optional Censoring

    PubMed Central

    Yang, Jingjing; Scott, David W.

    2013-01-01

    The Weibull family is widely used to model failure data, or lifetime data, although the classical two-parameter Weibull distribution is limited to positive data and a monotone failure rate. The parameters of the Weibull model are commonly obtained by maximum likelihood estimation; however, it is well known that this estimator is not robust when dealing with contaminated data. A new robust procedure is introduced to fit a Weibull model by using the L2 distance, i.e., the integrated squared distance, of the Weibull probability density function. The Weibull model is augmented with a weight parameter to robustly deal with contaminated data. Results comparing a maximum likelihood estimator with an L2 estimator are given in this article, based on both simulated and real data sets. It is shown that this new L2 parametric estimation method is more robust and does a better job than maximum likelihood in the newly proposed Weibull model when data are contaminated. The same preference for the L2 distance criterion and the new Weibull model also holds for right-censored data with contamination. PMID:23888090
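
    A minimal sketch of the L2 idea under stated assumptions (the exact augmented model is not spelled out in the abstract): minimize w² ∫f² - (2w/n) Σ f(x_i) over the Weibull parameters and the weight w:

    ```python
    # Sketch (details assumed, not the authors' code): minimum-L2-distance fit
    # of a Weibull density augmented with a weight parameter w, minimizing
    #   w^2 * Int f^2 dx  -  (2 w / n) * Sum_i f(x_i).
    import numpy as np
    from scipy.integrate import quad
    from scipy.optimize import minimize
    from scipy.stats import weibull_min

    rng = np.random.default_rng(5)
    clean = weibull_min.rvs(2.0, scale=10.0, size=180, random_state=rng)
    x = np.concatenate([clean, rng.uniform(40.0, 60.0, 20)])   # 10% contamination

    def l2e(params):
        c, scale, w = params
        if c <= 0.5 or scale <= 0 or not 0.0 < w <= 1.0:
            return np.inf                    # keep the squared-density integral finite
        int_f2, _ = quad(lambda t: weibull_min.pdf(t, c, scale=scale) ** 2, 0, np.inf)
        return w * w * int_f2 - 2.0 * w * np.mean(weibull_min.pdf(x, c, scale=scale))

    res = minimize(l2e, x0=(1.5, 8.0, 0.9), method="Nelder-Mead")
    print("L2E estimates (shape, scale, weight):", np.round(res.x, 2))
    ```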

  12. EVALUATION OF SPRING OPERATED RELIEF VALVE MAINTENANCE INTERVALS AND EXTENSION OF MAINTENANCE TIMES USING A WEIBULL ANALYSIS WITH MODIFIED BAYESIAN UPDATING

    SciTech Connect

    Harris, S.; Gross, R.; Mitchell, E.

    2011-01-18

    The Savannah River Site (SRS) spring operated pressure relief valve (SORV) maintenance intervals were evaluated using an approach provided by the American Petroleum Institute (API RP 581) for risk-based inspection technology (RBI). In addition, the impact of extending the inspection schedule was evaluated using Monte Carlo Simulation (MCS). The API RP 581 approach is characterized as a Weibull analysis with modified Bayesian updating provided by SRS SORV proof testing experience. Initial Weibull parameter estimates were updated as per SRS's historical proof test records contained in the Center for Chemical Process Safety (CCPS) Process Equipment Reliability Database (PERD). The API RP 581 methodology was used to estimate the SORV's probability of failing on demand (PFD), and the annual expected risk. The API RP 581 methodology indicates that the current SRS maintenance plan is conservative. Cost savings may be attained in certain mild service applications that present low PFD and overall risk. Current practices are reviewed and recommendations are made for extending inspection intervals. The paper gives an illustration of the inspection costs versus the associated risks by using API RP 581 Risk Based Inspection (RBI) Technology. A cost effective maintenance frequency balancing both financial risk and inspection cost is demonstrated.

  13. A Weibull characterization for tensile fracture of multicomponent brittle fibers

    NASA Technical Reports Server (NTRS)

    Barrows, R. G.

    1977-01-01

    A statistical characterization for multicomponent brittle fibers is presented. The method, which is an extension of the usual Weibull distribution procedures, statistically considers the components making up a fiber (e.g., substrate, sheath, and surface) as separate entities and taken together as in a fiber. Tensile data for silicon carbide fiber and for an experimental carbon-boron alloy fiber are evaluated in terms of the proposed multicomponent Weibull characterization.

  14. Finite-size effects on return interval distributions for weakest-link-scaling systems.

    PubMed

    Hristopulos, Dionissios T; Petrakis, Manolis P; Kaniadakis, Giorgio

    2014-05-01

    The Weibull distribution is a commonly used model for the strength of brittle materials and earthquake return intervals. Deviations from Weibull scaling, however, have been observed in earthquake return intervals and the fracture strength of quasibrittle materials. We investigate weakest-link scaling in finite-size systems and deviations of empirical return interval distributions from the Weibull distribution function. Our analysis employs the ansatz that the survival probability function of a system with complex interactions among its units can be expressed as the product of the survival probability functions for an ensemble of representative volume elements (RVEs). We show that if the system comprises a finite number of RVEs, it obeys the κ-Weibull distribution. The upper tail of the κ-Weibull distribution declines as a power law, in contrast with Weibull scaling. The hazard rate function of the κ-Weibull distribution decreases linearly after a waiting time τ_c ∝ n^(1/m), where m is the Weibull modulus and n is the system size in terms of representative volume elements. We conduct statistical analysis of experimental data and simulations which shows that the κ-Weibull provides competitive fits to the return interval distributions of seismic data and of avalanches in a fiber bundle model. In conclusion, using theoretical and statistical analysis of real and simulated data, we demonstrate that the κ-Weibull distribution is a useful model for extreme-event return intervals in finite-size systems. PMID:25353774
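
    A common construction of the κ-Weibull replaces the exponential in the Weibull survival function with the κ-deformed exponential. The sketch below assumes that form (an assumption drawn from the κ-statistics literature, not stated in the abstract) and shows numerically how the tail flattens relative to the ordinary Weibull:

    ```python
    # Sketch of the kappa-Weibull survival function, built from the
    # kappa-deformed exponential exp_k(u) = (sqrt(1 + k^2 u^2) + k u)**(1/k);
    # as k -> 0 it reduces to the ordinary Weibull. Parameters are illustrative.
    import numpy as np

    def exp_kappa(u, kappa):
        return (np.sqrt(1.0 + kappa**2 * u**2) + kappa * u) ** (1.0 / kappa)

    def kweibull_survival(x, lam, m, kappa):
        return exp_kappa(-(x / lam) ** m, kappa)   # power-law upper tail for kappa > 0

    x = np.logspace(-1, 2, 5)
    print("kappa-Weibull S(x):", kweibull_survival(x, lam=1.0, m=1.5, kappa=0.3).round(4))
    print("Weibull limit  S(x):", np.exp(-x ** 1.5).round(4))
    ```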

  15. Experimental evaluation of the strength distribution of E-glass fibres at high strain rates

    NASA Astrophysics Data System (ADS)

    Wang, Zhen

    1995-07-01

    A bimodal Weibull distribution function was applied to analyse the strength distribution of glass fibre bundles under tensile impact. The simulation was performed using a one-dimensional damage constitutive model. The results show that there were two concurrent flaw populations in the fracture process. The regression analysis using the bimodal Weibull distribution function was in good agreement with experiment.

  16. A Weibull statistics-based lignocellulose saccharification model and a built-in parameter accurately predict lignocellulose hydrolysis performance.

    PubMed

    Wang, Mingyu; Han, Lijuan; Liu, Shasha; Zhao, Xuebing; Yang, Jinghua; Loh, Soh Kheang; Sun, Xiaomin; Zhang, Chenxi; Fang, Xu

    2015-09-01

    Renewable energy from lignocellulosic biomass has been deemed an alternative to depleting fossil fuels. In order to improve this technology, we aim to develop robust mathematical models for the enzymatic lignocellulose degradation process. By analyzing 96 groups of previously published and newly obtained lignocellulose saccharification results and fitting them to the Weibull distribution, we discovered that Weibull statistics can accurately predict lignocellulose saccharification data, regardless of the type of substrate, enzyme, or saccharification conditions. A mathematical model for enzymatic lignocellulose degradation was subsequently constructed based on Weibull statistics. Further analysis of the mathematical structure of the model and of experimental saccharification data showed the significance of the two parameters in this model. In particular, the λ value, defined as the characteristic time, represents the overall performance of the saccharification system. This suggestion was further supported by statistical analysis of experimental saccharification data and analysis of the glucose production levels when λ and n values change. In conclusion, the constructed Weibull statistics-based model can accurately predict lignocellulose hydrolysis behavior, and the λ parameter can be used to assess the overall performance of enzymatic lignocellulose degradation. Advantages and potential applications of the model and the λ value in saccharification performance assessment are discussed. PMID:26121186
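
    A minimal sketch of the fitting step, assuming the common Weibull-type time course y(t) = y_max(1 - exp(-(t/λ)^n)) and invented data points; λ is read off as the characteristic time:

    ```python
    # Sketch: fit a saccharification time course to a Weibull-type curve and
    # report lambda as the characteristic time. Data points are invented.
    import numpy as np
    from scipy.optimize import curve_fit

    def weibull_yield(t, y_max, lam, n):
        return y_max * (1.0 - np.exp(-(t / lam) ** n))

    t = np.array([2., 6., 12., 24., 48., 72.])           # hours
    y = np.array([8., 21., 35., 52., 68., 74.])          # % glucose yield (illustrative)

    (y_max, lam, n), _ = curve_fit(weibull_yield, t, y, p0=(80.0, 24.0, 1.0))
    print(f"characteristic time lambda = {lam:.1f} h (n = {n:.2f}, y_max = {y_max:.0f}%)")
    ```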

  17. Modeling root reinforcement using root-failure Weibull survival function

    NASA Astrophysics Data System (ADS)

    Schwarz, M.; Giadrossich, F.; Cohen, D.

    2013-03-01

    Root networks contribute to slope stability through complicated interactions that include mechanical compression and tension. Due to the spatial heterogeneity of root distribution and the dynamics of root turnover, quantifying root reinforcement on steep slopes is challenging, and consequently so is the calculation of slope stability. Despite considerable advances in root-reinforcement modeling, some important aspects remain neglected. In this study we address in particular the role of root-strength variability in the mechanical behavior of a root bundle. Many factors may contribute to the variability of root mechanical properties, even within a single diameter class. This work presents a new approach for quantifying root reinforcement that considers the variability of the mechanical properties of each root diameter class. Using the data of laboratory tensile tests and field pullout tests, we calibrate the parameters of the Weibull survival function to implement the variability of root strength in a numerical model for the calculation of root reinforcement (RBMw). The results show that, for both laboratory and field datasets, the parameters of the Weibull distribution may be considered constant, with the exponent equal to 2 and the normalized failure displacement equal to 1. Moreover, the results show that the variability of root strength in each root diameter class has a major influence on the behavior of a root bundle, with important implications when considering different approaches to slope stability calculation. Sensitivity analysis shows that the calibration of the root tensile force and elasticity equations, as well as the root distribution, is most important. The new model allows the characterization of root reinforcement in terms of maximum pullout force, stiffness, and energy. Moreover, it simplifies the implementation of root reinforcement in slope stability models. The realistic quantification of root reinforcement for
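
    A heavily simplified sketch of the bundle idea (the functional form and all class data below are assumptions for illustration, not the RBMw implementation): each diameter class contributes a linear load response weighted by a Weibull survival term with exponent 2, echoing the calibration reported above.

    ```python
    # Toy root-bundle sketch (invented class data, not the RBMw model): each
    # diameter class carries load in proportion to displacement, weighted by
    # a Weibull survival term with exponent 2.
    import numpy as np

    def class_force(disp, stiffness, disp_fail):
        survival = np.exp(-(disp / disp_fail) ** 2)   # Weibull survival, exponent 2
        return stiffness * disp * survival

    disp = np.linspace(0.0, 30.0, 301)                # displacement (mm)
    classes = [                                       # (count, N/mm, failure disp mm)
        (40, 2.0, 8.0),
        (15, 6.0, 14.0),
        (5, 15.0, 22.0),
    ]
    bundle = sum(n * class_force(disp, k, df) for n, k, df in classes)
    print(f"peak reinforcement ~ {bundle.max():.0f} N at {disp[bundle.argmax()]:.1f} mm")
    ```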

  18. A Weibull characterization for tensile fracture of multicomponent brittle fibers

    NASA Technical Reports Server (NTRS)

    Barrows, R. G.

    1977-01-01

    Necessary to the development and understanding of brittle fiber reinforced composites is a means to statistically describe fiber strength and strain-to-failure behavior. A statistical characterization for multicomponent brittle fibers is presented. The method, which is an extension of usual Weibull distribution procedures, statistically considers the components making up a fiber (e.g., substrate, sheath, and surface) as separate entities and taken together as in a fiber. Tensile data for silicon carbide fiber and for an experimental carbon-boron alloy fiber are evaluated in terms of the proposed multicomponent Weibull characterization.

  19. Empirical model based on Weibull distribution describing the destruction kinetics of natural microbiota in pineapple (Ananas comosus L.) puree during high-pressure processing.

    PubMed

    Chakraborty, Snehasis; Rao, Pavuluri Srinivasa; Mishra, Hari Niwas

    2015-10-15

    High pressure inactivation of natural microbiota viz. aerobic mesophiles (AM), psychrotrophs (PC), yeasts and molds (YM), total coliforms (TC) and lactic acid bacteria (LAB) in pineapple puree was studied within the experimental domain of 0.1-600 MPa and 30-50 °C with a treatment time up to 20 min. A complete destruction of yeasts and molds was obtained at 500 MPa/50 °C/15 min; whereas no counts were detected for TC and LAB at 300 MPa/30 °C/15 min. A maximum of two log cycle reductions was obtained for YM during pulse pressurization at the severe process intensity of 600 MPa/50 °C/20 min. The Weibull model clearly described the non-linearity of the survival curves during the isobaric period. The tailing effect, as confirmed by the shape parameter (β) of the survival curve, was obtained in case of YM (β<1); whereas a shouldering effect (β>1) was observed for the other microbial groups. Analogous to thermal death kinetics, the activation energy (Ea, kJ·mol(-1)) and the activation volume (Va, mL·mol(-1)) values were computed further to describe the temperature and pressure dependencies of the scale parameter (δ, min), respectively. A higher δ value was obtained for each microbe at a lower temperature and it decreased with an increase in pressure. A secondary kinetic model was developed describing the inactivation rate (k, min(-1)) as a function of pressure (P, MPa) and temperature (T, K) including the dependencies of Ea and Va on P and T, respectively. PMID:26202323
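
    The primary model used above is commonly written log10(N/N0) = -(t/δ)^β (the Weibull survival curve), with β < 1 producing a tail and β > 1 a shoulder; a minimal sketch with invented parameter values:

    ```python
    # Sketch of the Weibull survival curve, log10(N/N0) = -(t/delta)**beta,
    # showing tailing (beta < 1) vs. shouldering (beta > 1). Values invented.
    import numpy as np

    def log10_survivors(t, delta, beta):
        return -(t / delta) ** beta

    t = np.linspace(0, 20, 5)                          # minutes under pressure
    print("tail     (beta=0.7):", log10_survivors(t, delta=6.0, beta=0.7).round(2))
    print("shoulder (beta=1.6):", log10_survivors(t, delta=6.0, beta=1.6).round(2))
    ```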

  20. Shallow Flaws Under Biaxial Loading Conditions, Part II: Application of a Weibull Stress Analysis of the Cruciform Bend Specimen Using a Hydrostatic Stress Criterion

    SciTech Connect

    Bass, B.R.; McAfee, W.J.; Williams, P.T.

    1999-08-01

    Cruciform beam fracture mechanics specimens have been developed in the Heavy Section Steel Technology (HSST) Program at Oak Ridge National Laboratory (ORNL) to introduce a prototypic, far-field, out-of-plane biaxial bending stress component in the test section that approximates the nonlinear biaxial stresses resulting from pressurized-thermal-shock or pressure-temperature loading of a nuclear reactor pressure vessel (RPV). Matrices of cruciform beam tests were developed to investigate and quantify the effects of temperature, biaxial loading, and specimen size on the fracture initiation toughness of two-dimensional (constant depth), shallow, surface flaws. Tests were conducted under biaxial load ratios ranging from uniaxial to equibiaxial. These tests demonstrated that biaxial loading can have a pronounced effect on shallow-flaw fracture toughness in the lower transition temperature region for RPV materials. Two- and three-parameter Weibull models have been calibrated using a new scheme (developed at the University of Illinois) that maps toughness data from test specimens with distinctly different levels of crack-tip constraint to a small-scale-yielding (SSY) Weibull stress space. These models, using the new hydrostatic stress criterion in place of the more commonly used maximum principal stress in the kernel of the Weibull stress integral definition, have been shown to correlate the experimentally observed biaxial effect in cruciform specimens, thereby providing a scaling mechanism between uniaxial and biaxial loading states.

  1. Calculation of Weibull strength parameters and Batdorf flow-density constants for volume- and surface-flaw-induced fracture in ceramics

    NASA Technical Reports Server (NTRS)

    Pai, Shantaram S.; Gyekenyesi, John P.

    1988-01-01

    The calculation of the shape and scale parameters of the two-parameter Weibull distribution is described using the least-squares analysis and maximum likelihood methods for volume- and surface-flaw-induced fracture in ceramics with complete and censored samples. Detailed procedures are given for evaluating 90 percent confidence intervals for maximum likelihood estimates of shape and scale parameters, the unbiased estimates of the shape parameters, and the Weibull mean values and corresponding standard deviations. Furthermore, the necessary steps are described for detecting outliers and for calculating the Kolmogorov-Smirnov and the Anderson-Darling goodness-of-fit statistics and 90 percent confidence bands about the Weibull distribution. It also shows how to calculate the Batdorf flaw-density constants by using the Weibull distribution statistical parameters. The techniques described were verified with several example problems from the open literature and were coded in the Structural Ceramics Analysis and Reliability Evaluation (SCARE) design program.

  2. Calculation of Weibull strength parameters, Batdorf flaw density constants and related statistical quantities using PC-CARES

    NASA Technical Reports Server (NTRS)

    Szatmary, Steven A.; Gyekenyesi, John P.; Nemeth, Noel N.

    1990-01-01

    This manual describes the operation and theory of the PC-CARES (Personal Computer-Ceramic Analysis and Reliability Evaluation of Structures) computer program for the IBM PC and compatibles running the PC-DOS/MS-DOS or IBM/MS-OS/2 (version 1.1 or higher) operating systems. The primary purpose of this code is to estimate Weibull material strength parameters, the Batdorf crack density coefficient, and other related statistical quantities. Included in the manual is the description of the calculation of the shape and scale parameters of the two-parameter Weibull distribution using the least-squares analysis and maximum likelihood methods for volume- and surface-flaw-induced fracture in ceramics with complete and censored samples. The methods for detecting outliers and for calculating the Kolmogorov-Smirnov and the Anderson-Darling goodness-of-fit statistics and 90 percent confidence bands about the Weibull line, as well as the techniques for calculating the Batdorf flaw-density constants, are also described.

  3. A comparison of Weibull and β(sub Ic) analyses of transition range data

    SciTech Connect

    McCabe, D.E.

    1991-01-01

    Specimen size effects on K(sub Jc) data scatter in the transition range of fracture toughness have been explained by extremal (weakest link) statistics. In this investigation, compact specimens of A 533 grade B steel were tested in sizes ranging from 1/2TC(T) to 4TC(T) with sufficient replication to obtain good three-parameter Weibull characterization of the data distributions. The optimum fitting parameters for an assumed Weibull slope of 4 were calculated. Extremal statistics analysis was applied to the 1/2TC(T) data to predict median K(sub Jc) values for 1TC(T), 2TC(T), and 4TC(T) specimens. The distributions from experimentally developed 1TC(T), 2TC(T), and 4TC(T) data tended to confirm the predictions. However, the extremal prediction model does not work well at lower-shelf toughness. At -150°C the extremal model predicts a specimen size effect where in reality there is no size effect.

  4. Bias in the Weibull Strength Estimation of a SiC Fiber for the Small Gauge Length Case

    NASA Astrophysics Data System (ADS)

    Morimoto, Tetsuya; Nakagawa, Satoshi; Ogihara, Shinji

    It is known that the single-modal Weibull model describes well the size effect of brittle fiber tensile strength. However, for some ceramic fibers the single-modal Weibull model has been reported to give biased estimates of the gauge-length dependence. A hypothesis for the bias is that the density of critical defects is very small; thus, the fracture probability of small-gauge-length samples is distributed in a discrete manner, which makes the Weibull parameters depend on the gauge length. Tyranno ZMI Si-Zr-C-O fiber has been selected as an example fiber. The tensile tests have been done at several gauge lengths. The derived Weibull parameters have shown a dependence on the gauge length. Fracture surfaces were observed with SEM. We then classified the fracture surfaces into characteristic fracture patterns. The percentage of each fracture pattern was also found to depend on the gauge length. This may be an important factor in the dependence of the Weibull parameters on the gauge length.

  5. Measuring the Weibull modulus of microscope slides

    NASA Technical Reports Server (NTRS)

    Sorensen, Carl D.

    1992-01-01

    The objectives are that students will understand why a three-point bending test is used for ceramic specimens, learn how Weibull statistics are used to measure the strength of brittle materials, and appreciate the amount of variation in the strength of brittle materials with low Weibull modulus. They will understand how the modulus of rupture is used to represent the strength of specimens in a three-point bend test. In addition, students will learn that a logarithmic transformation can be used to convert an exponent into the slope of a straight line. The experimental procedures are explained.
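
    The logarithmic transformation the exercise refers to turns the Weibull failure probability into a straight line whose slope is the Weibull modulus. A classroom-style sketch, with invented failure loads and slide-like geometry:

    ```python
    # Sketch of the classroom analysis: convert three-point-bend failure loads
    # to modulus-of-rupture values, then get the Weibull modulus as the slope
    # of the linearized plot. Loads and geometry are invented.
    import numpy as np

    F = np.sort(np.array([18., 21., 23., 24., 26., 27., 29., 31., 33., 36.]))  # N
    L_span, b, d = 40.0, 25.0, 1.0                    # mm (slide-like geometry)
    mor = 3.0 * F * L_span / (2.0 * b * d**2)         # modulus of rupture (MPa)

    p = (np.arange(1, F.size + 1) - 0.5) / F.size     # failure-probability estimates
    slope, _ = np.polyfit(np.log(mor), np.log(-np.log(1.0 - p)), 1)
    print(f"Weibull modulus ~ {slope:.1f}")
    ```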

  6. Distributed analysis at LHCb

    NASA Astrophysics Data System (ADS)

    Williams, Mike; Egede, Ulrik; Paterson, Stuart; LHCb Collaboration

    2011-12-01

    The distributed analysis experience to date at LHCb has been positive: job success rates are high and wait times for high-priority jobs are low. LHCb users access the grid using the GANGA job-management package, while the LHCb virtual organization manages its resources using the DIRAC package. This clear division of labor has benefitted LHCb and its users greatly; it is a major reason why distributed analysis at LHCb has been so successful. The newly formed LHCb distributed analysis support team has also proved to be a success.

  7. Transmission overhaul and replacement predictions using Weibull and renewal theory

    NASA Technical Reports Server (NTRS)

    Savage, M.; Lewicki, D. G.

    1989-01-01

    A method to estimate the frequency of transmission overhauls is presented. This method is based on the two-parameter Weibull statistical distribution for component life. A second method is presented to estimate the number of replacement components needed to support the transmission overhaul pattern. The second method is based on renewal theory. Confidence statistics are applied with both methods to improve the statistical estimate of sample behavior. A transmission example is also presented to illustrate the use of the methods. Transmission overhaul frequency and component replacement calculations are included in the example.
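
    A toy version of the two steps, with invented parameters: set the overhaul interval at the component B10 life implied by the Weibull model, then estimate expected replacements per unit between overhauls by simulating the renewal process.

    ```python
    # Toy sketch (invented parameters): overhaul interval at the component B10
    # life, then expected replacements per unit between overhauls estimated by
    # simulating the Weibull renewal process.
    import numpy as np

    beta, eta = 2.5, 4000.0                            # Weibull slope / characteristic life (hr)
    t_overhaul = eta * (-np.log(0.9)) ** (1.0 / beta)  # B10: 10% of units failed

    rng = np.random.default_rng(11)
    def renewals(horizon):
        elapsed, count = 0.0, 0
        while True:
            elapsed += eta * rng.weibull(beta)         # life of each replacement
            if elapsed > horizon:
                return count
            count += 1

    mean_repl = np.mean([renewals(t_overhaul) for _ in range(50_000)])
    print(f"overhaul at {t_overhaul:.0f} hr; expected replacements/unit = {mean_repl:.3f}")
    ```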

  8. Transmission overhaul and replacement predictions using Weibull and renewal theory

    NASA Technical Reports Server (NTRS)

    Savage, M.; Lewicki, D. G.

    1989-01-01

    A method to estimate the frequency of transmission overhauls is presented. This method is based on the two-parameter Weibull statistical distribution for component life. A second method is presented to estimate the number of replacement components needed to support the transmission overhaul pattern. The second method is based on renewal theory. Confidence statistics are applied with both methods to improve the statistical estimate of sample behavior. A transmission example is also presented to illustrate the use of the methods. Transmission overhaul frequency and component replacement calculations are included in the example.

  9. Probability density function characterization for aggregated large-scale wind power based on Weibull mixtures

    DOE PAGES

    Gomez-Lazaro, Emilio; Bueso, Maria C.; Kessler, Mathieu; Martin-Martinez, Sergio; Zhang, Jie; Hodge, Bri -Mathias; Molina-Garcia, Angel

    2016-02-02

    Here, the Weibull probability distribution has been widely applied to characterize wind speeds for wind energy resources. Wind power generation modeling is different, however, due in particular to power curve limitations, wind turbine control methods, and transmission system operation requirements. These differences are even greater for aggregated wind power generation in power systems with high wind penetration. Consequently, models based on a single Weibull component can provide poor characterizations of aggregated wind power generation. With this aim, the present paper focuses on Weibull mixtures to characterize the probability density function (PDF) of aggregated wind power generation. PDFs of wind power data are first classified according to hourly and seasonal patterns. The number of components in the mixture is selected through two well-known criteria: the Akaike information criterion (AIC) and the Bayesian information criterion (BIC). Finally, the optimal number of Weibull components for maximum likelihood is explored for the defined patterns, including the estimated weight, scale, and shape parameters. Results show that multi-Weibull models are more suitable for characterizing aggregated wind power data due to the impact of distributed generation, the variety of wind speed values, and wind power curtailment.
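
    The model-selection step the abstract describes, comparing Weibull mixtures by AIC and BIC, can be sketched as a direct maximum-likelihood fit. The Python sketch below uses synthetic data in place of aggregated wind power and a two-component mixture as the alternative model; all parameter values are assumptions.

      # Compare a single Weibull against a two-component Weibull mixture via AIC/BIC.
      import numpy as np
      from scipy import optimize, stats

      rng = np.random.default_rng(0)
      data = np.concatenate([4.0 * rng.weibull(1.8, 600), 11.0 * rng.weibull(6.0, 400)])

      def nll_mixture(params, x):
          w, k1, s1, k2, s2 = params
          pdf = (w * stats.weibull_min.pdf(x, k1, scale=s1)
                 + (1 - w) * stats.weibull_min.pdf(x, k2, scale=s2))
          return -np.sum(np.log(pdf + 1e-300))   # guard against log(0)

      k, _, s = stats.weibull_min.fit(data, floc=0)              # one-component MLE
      nll1 = -np.sum(stats.weibull_min.logpdf(data, k, scale=s))
      res = optimize.minimize(nll_mixture, x0=[0.5, 2.0, 3.0, 5.0, 10.0], args=(data,),
                              bounds=[(0.01, 0.99)] + [(0.1, 50.0)] * 4)

      for name, nll, p in [("1-Weibull", nll1, 2), ("2-Weibull", res.fun, 5)]:
          print(f"{name}: AIC = {2*p + 2*nll:.1f}, BIC = {p*np.log(len(data)) + 2*nll:.1f}")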

  10. Impact of Three-Parameter Weibull Models in Probabilistic Assessment of Earthquake Hazards

    NASA Astrophysics Data System (ADS)

    Pasari, Sumanta; Dikshit, Onkar

    2014-07-01

    This paper investigates the suitability of a three-parameter (scale, shape, and location) Weibull distribution in probabilistic assessment of earthquake hazards. The performance is also compared with two other popular models from the same Weibull family, namely the two-parameter Weibull model and the inverse Weibull model. A complete and homogeneous earthquake catalog (Yadav et al. in Pure Appl Geophys 167:1331-1342, 2010) of 20 events (M ≥ 7.0), spanning the period 1846 to 1995 in northeast India and its surrounding region (20°-32°N and 87°-100°E), is used for this study. The model parameters are initially estimated from graphical plots and later confirmed by statistical estimation methods such as maximum likelihood estimation (MLE) and the method of moments (MoM). The asymptotic variance-covariance matrix of the MLE-estimated parameters is further calculated on the basis of the Fisher information matrix (FIM). The model suitability is appraised using different statistical goodness-of-fit tests. For the study area, the estimated conditional probability of an earthquake within a decade comes out to be very high (≥0.90) for an elapsed time of 18 years (i.e., 2013). The study also reveals that the location parameter provides more flexibility to the three-parameter Weibull model in comparison to the two-parameter Weibull model. Therefore, it is suggested that the three-parameter Weibull model is of high importance in empirical modeling of earthquake recurrence and seismic hazard assessment.
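
    The core calculation, an MLE fit of the three-parameter Weibull followed by a conditional hazard probability, can be sketched with scipy, whose weibull_min distribution carries shape, location, and scale parameters. The interevent times below are synthetic stand-ins, not the Yadav et al. catalog, and the elapsed-time figures simply echo the abstract.

      # Three-parameter Weibull MLE and conditional earthquake probability.
      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(1)
      times = stats.weibull_min.rvs(1.4, loc=2.0, scale=7.0, size=19, random_state=rng)

      shape, loc, scale = stats.weibull_min.fit(times)   # MLE of all three parameters
      print(f"shape = {shape:.2f}, location = {loc:.2f} yr, scale = {scale:.2f} yr")

      # P(event within the next 10 yr | 18 yr elapsed since the last event)
      S = lambda t: stats.weibull_min.sf(t, shape, loc, scale)
      t_e, dt = 18.0, 10.0
      print(f"conditional probability = {(S(t_e) - S(t_e + dt)) / S(t_e):.2f}")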

  11. Distributed analysis in ATLAS

    NASA Astrophysics Data System (ADS)

    Dewhurst, A.; Legger, F.

    2015-12-01

    The ATLAS experiment accumulated more than 140 PB of data during the first run of the Large Hadron Collider (LHC) at CERN. The analysis of such an amount of data is a challenging task for the distributed physics community. The Distributed Analysis (DA) system of the ATLAS experiment is an established and stable component of the ATLAS distributed computing operations. About half a million user jobs are running daily on DA resources, submitted by more than 1500 ATLAS physicists. The reliability of the DA system during the first run of the LHC and the following shutdown period has been high thanks to the continuous automatic validation of the distributed analysis sites and the user support provided by a dedicated team of expert shifters. During the LHC shutdown, the ATLAS computing model has undergone several changes to improve the analysis workflows, including the re-design of the production system, a new analysis data format and event model, and the development of common reduction and analysis frameworks. We report on the impact such changes have on the DA infrastructure, describe the new DA components, and include recent performance measurements.

  12. Modeling root reinforcement using a root-failure Weibull survival function

    NASA Astrophysics Data System (ADS)

    Schwarz, M.; Giadrossich, F.; Cohen, D.

    2013-11-01

    Root networks contribute to slope stability through complex interactions with soil that include mechanical compression and tension. Due to the spatial heterogeneity of root distribution and the dynamics of root turnover, the quantification of root reinforcement on steep slopes is challenging, and consequently so is the calculation of slope stability. Although considerable progress has been made, some important aspects of root mechanics remain neglected. In this study we address specifically the role of root-strength variability on the mechanical behavior of a root bundle. Many factors contribute to the variability of root mechanical properties even within a single diameter class. This work presents a new approach for quantifying root reinforcement that considers the variability of mechanical properties of each root diameter class. Using the data of laboratory tensile tests and field pullout tests, we calibrate the parameters of the Weibull survival function to implement the variability of root strength in a numerical model for the calculation of root reinforcement (RBMw). The results show that, for both laboratory and field data sets, the parameters of the Weibull distribution may be considered constant, with the exponent equal to 2 and the normalized failure displacement equal to 1. Moreover, the results show that the variability of root strength in each root diameter class has a major influence on the behavior of a root bundle, with important implications for different approaches to slope stability calculation. Sensitivity analysis shows that the calibration of the equations of the tensile force, the elasticity of the roots, and the root distribution are the most important steps. The new model allows the characterization of root reinforcement in terms of maximum pullout force, stiffness, and energy. Moreover, it simplifies the implementation of root reinforcement in slope stability models. The realistic quantification of root reinforcement for tensile
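
    The survival-function idea can be sketched as follows: each diameter class loses roots according to a Weibull survival curve of normalized displacement, with exponent 2 and normalized failure displacement 1 as reported above, and the bundle force is the survival-weighted sum of the intact-root forces. The root counts, stiffnesses, and failure displacements below are invented for illustration, not RBMw calibration values.

      # Root-bundle pullout force with a Weibull survival weighting per diameter class.
      import numpy as np

      diam = np.array([1.0, 2.0, 4.0])     # diameter classes (mm), hypothetical
      counts = np.array([40, 15, 5])       # roots per class, hypothetical
      k = 30.0 * diam                      # stiffness per root (N/mm), hypothetical
      dx_fail = 2.0 * np.sqrt(diam)        # characteristic failure displacement (mm), hypothetical

      def bundle_force(dx):
          survival = np.exp(-(dx / dx_fail) ** 2)    # Weibull survival, exponent 2
          return np.sum(counts * k * dx * survival)

      dxs = np.linspace(0.0, 6.0, 61)
      forces = np.array([bundle_force(dx) for dx in dxs])
      i = int(np.argmax(forces))
      print(f"max root reinforcement ~ {forces[i]:.0f} N at displacement {dxs[i]:.1f} mm")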

  13. Structural characterization of genomes by large scale sequence-structure threading: application of reliability analysis in structural genomics

    PubMed Central

    Cherkasov, Artem; Ho Sui, Shannan J; Brunham, Robert C; Jones, Steven JM

    2004-01-01

    Background We establish that the occurrence of protein folds among genomes can be accurately described with a Weibull function. Systems which exhibit Weibull character can be interpreted with reliability theory commonly used in engineering analysis. For instance, Weibull distributions are widely used in reliability, maintainability and safety work to model time-to-failure of mechanical devices, mechanisms, building constructions and equipment. Results We have found that the Weibull function describes protein fold distribution within and among genomes more accurately than conventional power functions which have been used in a number of structural genomic studies reported to date. It has also been found that the Weibull reliability parameter β for protein fold distributions varies between genomes and may reflect differences in rates of gene duplication in evolutionary history of organisms. Conclusions The results of this work demonstrate that reliability analysis can provide useful insights and testable predictions in the fields of comparative and structural genomics. PMID:15274750

  14. Atlas Distributed Analysis Tools

    NASA Astrophysics Data System (ADS)

    de La Hoz, Santiago Gonzalez; Ruiz, Luis March; Liko, Dietrich

    2008-06-01

    The ATLAS production system has been successfully used to run production of simulation data at an unprecedented scale. Up to 10000 jobs were processed in one day. The experience obtained operating the system on several grid flavours was essential for performing user analysis using grid resources. First tests of the distributed analysis system were then performed. In the preparation phase, data was registered in the LHC File Catalog (LFC) and replicated to external sites. For the main test, few resources were used. All these tests are only a first step towards the validation of the computing model. The ATLAS management computing board decided to integrate the collaboration efforts in distributed analysis into a single project, GANGA. The goal is to test the reconstruction and analysis software in large-scale data production using several grid flavours at multiple sites. GANGA allows trivial switching between running test jobs on a local batch system and running large-scale analyses on the Grid; it provides job splitting and merging, and includes automated job monitoring and output retrieval.

  15. Survival extrapolation using the poly-Weibull model

    PubMed Central

    Demiris, Nikolaos; Lunn, David; Sharples, Linda D

    2015-01-01

    Recent studies of (cost-) effectiveness in cardiothoracic transplantation have required estimation of mean survival over the lifetime of the recipients. In order to calculate mean survival, the complete survivor curve is required but is often not fully observed, so that survival extrapolation is necessary. After transplantation, the hazard function is bathtub-shaped, reflecting latent competing risks which operate additively in overlapping time periods. The poly-Weibull distribution is a flexible parametric model that may be used to extrapolate survival and has a natural competing risks interpretation. In addition, treatment effects and subgroups can be modelled separately for each component of risk. We describe the model and develop inference procedures using freely available software. The methods are applied to two problems from cardiothoracic transplantation. PMID:21937472

  16. Survival extrapolation using the poly-Weibull model.

    PubMed

    Demiris, Nikolaos; Lunn, David; Sharples, Linda D

    2015-04-01

    Recent studies of (cost-) effectiveness in cardiothoracic transplantation have required estimation of mean survival over the lifetime of the recipients. In order to calculate mean survival, the complete survivor curve is required but is often not fully observed, so that survival extrapolation is necessary. After transplantation, the hazard function is bathtub-shaped, reflecting latent competing risks which operate additively in overlapping time periods. The poly-Weibull distribution is a flexible parametric model that may be used to extrapolate survival and has a natural competing risks interpretation. In addition, treatment effects and subgroups can be modelled separately for each component of risk. We describe the model and develop inference procedures using freely available software. The methods are applied to two problems from cardiothoracic transplantation. PMID:21937472
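
    The poly-Weibull construction described in both records can be sketched directly: overall survival is the product of independent Weibull component survivals, which gives a bathtub-shaped hazard when an early-risk and a wear-out component are combined, and mean survival follows by integrating the survival curve. The two components below use invented parameters, not estimates from the transplantation data.

      # Poly-Weibull survival and extrapolated mean survival.
      import numpy as np
      from scipy.integrate import quad

      shapes = np.array([0.6, 3.5])    # <1: early risk; >1: late wear-out (assumed)
      scales = np.array([8.0, 15.0])   # years, assumed

      def survival(t):
          # S(t) = exp(-sum_i (t / scale_i)^shape_i)
          return np.exp(-np.sum((t / scales) ** shapes))

      mean_survival, _ = quad(survival, 0.0, 100.0)   # upper limit chosen where S(t) ~ 0
      print(f"extrapolated mean survival ~ {mean_survival:.1f} years")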

  17. Fracture strength of ultrananocrystalline diamond thin films—identification of Weibull parameters

    NASA Astrophysics Data System (ADS)

    Espinosa, H. D.; Peng, B.; Prorok, B. C.; Moldovan, N.; Auciello, O.; Carlisle, J. A.; Gruen, D. M.; Mancini, D. C.

    2003-11-01

    The fracture strength of ultrananocrystalline diamond (UNCD) has been investigated using tensile testing of freestanding submicron films. Specifically, the fracture strength of UNCD membranes, grown by microwave plasma chemical vapor deposition (MPCVD), was measured using the membrane deflection experiment developed by Espinosa and co-workers. The data show that fracture strength follows a Weibull distribution. Furthermore, we show that the Weibull parameters are highly dependent on the seeding process used in the growth of the films. When seeding was performed with microsized diamond particles, using mechanical polishing, the stress resulting in a probability of failure of 63% was found to be 1.74 GPa, and the Weibull modulus was 5.74. By contrast, when seeding was performed with nanosized diamond particles, using ultrasonic agitation, the stress resulting in a probability of failure of 63%, increased to 4.13 GPa, and the Weibull modulus was 10.76. The tests also provided the elastic modulus of UNCD, which was found to vary from 940 to 970 GPa for both micro- and nanoseeding. The investigation highlights the role of microfabrication defects on material properties and reliability, as a function of seeding technique, when identical MPCVD chemistry is employed. The parameters identified in this study are expected to aid the designer of microelectromechanical systems devices employing UNCD films.

  18. Weibull parameters of Yakuno basalt targets used in documented high-velocity impact experiments

    NASA Astrophysics Data System (ADS)

    Nakamura, Akiko M.; Michel, Patrick; Setoh, Masato

    2007-02-01

    In this paper we describe our measurements of the Weibull parameters of a specific basalt material, called Yakuno basalt, which was used in documented high-velocity impact experiments. The outcomes of these experiments have been widely used to validate numerical codes of fragmentation developed in the context of planetary science. However, the distribution of incipient flaws in the targets, usually characterized by the Weibull parameters, has generally been implemented in the codes with values chosen to match the experimental outcomes; hence the validity of numerical simulations remains to be assessed against the actual values of these parameters from laboratory measurements. Here we follow the original method proposed by Weibull in 1939 to measure these parameters for this Yakuno basalt. We obtain a value of the Weibull modulus (also called the shape parameter) m in the range 15-17, with a typical error of about 1.0 for each trial. This value is larger than the one corresponding to simulation fits to the experimental data, generally around 9.5. The characteristic strength, which corresponds to 63.2% failure of a sample of similar specimens and defines the second Weibull (scale) parameter, is estimated to be 19.3-19.4 MPa with a typical error of about 0.05 MPa. This parameter does not seem to be sensitive to the different loading rates used in the measurements. A complete database of impact experiments on basalt targets, including both the important initial target parameters and the detailed outcomes of their disruptions, is now available for validity tests of numerical fragmentation codes.

  19. Collective Weibull behavior of social atoms: Application of the rank-ordering statistics to historical extreme events

    NASA Astrophysics Data System (ADS)

    Chen, Chien-Chih; Tseng, Chih-Yuan; Telesca, Luciano; Chi, Sung-Ching; Sun, Li-Chung

    2012-02-01

    Analogous to crustal earthquakes in natural fault systems, we here consider the dynasty collapses as extreme events in human society. Duration data of ancient Chinese and Egyptian dynasties provides a good chance of exploring the collective behavior of the so-called social atoms. By means of the rank-ordering statistics, we demonstrate that the duration data of those ancient dynasties could be described with good accuracy by the Weibull distribution. It is thus amazing that the distribution of time to failure of human society, i.e. the disorder of a historical dynasty, follows the widely accepted Weibull process as natural material fails.

  20. Aerospace Applications of Weibull and Monte Carlo Simulation with Importance Sampling

    NASA Technical Reports Server (NTRS)

    Bavuso, Salvatore J.

    1998-01-01

    Recent developments in reliability modeling and computer technology have made it practical to use the Weibull time to failure distribution to model the system reliability of complex fault-tolerant computer-based systems. These system models are becoming increasingly popular in space systems applications as a result of mounting data that support the decreasing Weibull failure distribution and the expectation of increased system reliability. This presentation introduces the new reliability modeling developments and demonstrates their application to a novel space system application. The application is a proposed guidance, navigation, and control (GN&C) system for use in a long duration manned spacecraft for a possible Mars mission. Comparisons to the constant failure rate model are presented and the ramifications of doing so are discussed.

  1. A Weibull brittle material failure model for the ABAQUS computer program

    SciTech Connect

    Bennett, J.

    1991-08-01

    A statistical failure theory for brittle materials that traces its origins to the Weibull distribution function is developed for use in the general purpose ABAQUS finite element computer program. One of the fundamental assumptions for this development is that Mode 1 microfractures perpendicular to the direction of the principal stress contribute independently to the fast fracture. The theory is implemented by a user subroutine for ABAQUS. Example problems illustrating the capability and accuracy of the model are given. 24 refs., 12 figs.
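
    The weakest-link bookkeeping such a model performs can be sketched outside ABAQUS: each element contributes an independent Weibull risk of rupture from its maximum principal stress and volume, and the element risks combine multiplicatively into a fast-fracture probability. The stresses, volumes, and Weibull parameters below are placeholders, not the report's values.

      # Element-wise Weibull risk of rupture and fast-fracture probability.
      import numpy as np

      m, sigma0 = 10.0, 300.0                      # assumed Weibull modulus and scale (MPa)
      sigma1 = np.array([120., 180., 240., 90.])   # max principal stress per element (MPa)
      vol = np.array([2e-9, 1e-9, 5e-10, 3e-9])    # element volumes (m^3)

      tension = sigma1 > 0.0                       # only tensile elements contribute
      B = np.sum(vol[tension] * (sigma1[tension] / sigma0) ** m)   # risk of rupture
      P_f = 1.0 - np.exp(-B)
      print(f"fast-fracture probability P_f = {P_f:.3e}")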

  2. The distribution of first-passage times and durations in FOREX and future markets

    NASA Astrophysics Data System (ADS)

    Sazuka, Naoya; Inoue, Jun-ichi; Scalas, Enrico

    2009-07-01

    Possible distributions are discussed for intertrade durations and first-passage processes in financial markets. The view-point of renewal theory is assumed. In order to represent market data with relatively long durations, two types of distributions are used, namely a distribution derived from the Mittag-Leffler survival function and the Weibull distribution. For the Mittag-Leffler type distribution, the average waiting time (residual life time) is strongly dependent on the choice of a cut-off parameter tmax, whereas the results based on the Weibull distribution do not depend on such a cut-off. Therefore, a Weibull distribution is more convenient than a Mittag-Leffler type if one wishes to evaluate relevant statistics such as average waiting time in financial markets with long durations. On the other hand, we find that the Gini index is rather independent of the cut-off parameter. Based on the above considerations, we propose a good candidate for describing the distribution of first-passage time in a market: The Weibull distribution with a power-law tail. This distribution compensates the gap between theoretical and empirical results more efficiently than a simple Weibull distribution. It should be stressed that a Weibull distribution with a power-law tail is more flexible than the Mittag-Leffler distribution, which itself can be approximated by a Weibull distribution and a power-law. Indeed, the key point is that in the former case there is freedom of choice for the exponent of the power-law attached to the Weibull distribution, which can exceed 1 in order to reproduce decays faster than possible with a Mittag-Leffler distribution. We also give a useful formula to determine an optimal crossover point minimizing the difference between the empirical average waiting time and the one predicted from renewal theory. Moreover, we discuss the limitation of our distributions by applying our distribution to the analysis of the BTP future and calculating the average waiting

  3. Time-dependent fiber bundles with local load sharing. II. General Weibull fibers.

    PubMed

    Phoenix, S Leigh; Newman, William I

    2009-12-01

    Fiber bundle models (FBMs) are useful tools in understanding failure processes in a variety of material systems. While the fibers and load sharing assumptions are easily described, FBM analysis is typically difficult. Monte Carlo methods are also hampered by the severe computational demands of large bundle sizes, which overwhelm just as behavior relevant to real materials starts to emerge. For large size scales, interest continues in idealized FBMs that assume either equal load sharing (ELS) or local load sharing (LLS) among fibers, rules that reflect features of real load redistribution in elastic lattices. The present work focuses on a one-dimensional bundle of N fibers under LLS where life consumption in a fiber follows a power law in its load, with exponent rho, and integrated over time. This life consumption function is further embodied in a functional form resulting in a Weibull distribution for lifetime under constant fiber stress and with Weibull exponent, beta. Thus the failure rate of a fiber depends on its past load history, except for beta=1. We develop asymptotic results validated by Monte Carlo simulation using a computational algorithm developed in our previous work [Phys. Rev. E 63, 021507 (2001)] that greatly increases the size, N, of treatable bundles (e.g., 10^6 fibers in 10^3 realizations). In particular, our algorithm is O(N ln N) in contrast with former algorithms which were O(N^2), making this investigation possible. Regimes are found for (beta, rho) pairs that yield contrasting behavior for large N. For rho>1 and large N, brittle weakest volume behavior emerges in terms of characteristic elements (groupings of fibers) derived from critical cluster formation, and the lifetime eventually goes to zero as N-->infinity, unlike ELS, which yields a finite limiting mean. For 1/2<=rho<=1, however, LLS has remarkably similar behavior to ELS (appearing to be virtually identical for rho=1) with an asymptotic Gaussian lifetime distribution and a

  4. Time-dependent fiber bundles with local load sharing. II. General Weibull fibers

    NASA Astrophysics Data System (ADS)

    Phoenix, S. Leigh; Newman, William I.

    2009-12-01

    Fiber bundle models (FBMs) are useful tools in understanding failure processes in a variety of material systems. While the fibers and load sharing assumptions are easily described, FBM analysis is typically difficult. Monte Carlo methods are also hampered by the severe computational demands of large bundle sizes, which overwhelm just as behavior relevant to real materials starts to emerge. For large size scales, interest continues in idealized FBMs that assume either equal load sharing (ELS) or local load sharing (LLS) among fibers, rules that reflect features of real load redistribution in elastic lattices. The present work focuses on a one-dimensional bundle of N fibers under LLS where life consumption in a fiber follows a power law in its load, with exponent ρ, and integrated over time. This life consumption function is further embodied in a functional form resulting in a Weibull distribution for lifetime under constant fiber stress and with Weibull exponent, β. Thus the failure rate of a fiber depends on its past load history, except for β=1. We develop asymptotic results validated by Monte Carlo simulation using a computational algorithm developed in our previous work [Phys. Rev. E 63, 021507 (2001)] that greatly increases the size, N, of treatable bundles (e.g., 10^6 fibers in 10^3 realizations). In particular, our algorithm is O(N ln N) in contrast with former algorithms which were O(N^2), making this investigation possible. Regimes are found for (β,ρ) pairs that yield contrasting behavior for large N. For ρ>1 and large N, brittle weakest volume behavior emerges in terms of characteristic elements (groupings of fibers) derived from critical cluster formation, and the lifetime eventually goes to zero as N→∞, unlike ELS, which yields a finite limiting mean. For 1/2≤ρ≤1, however, LLS has remarkably similar behavior to ELS (appearing to be virtually identical for ρ=1) with an asymptotic Gaussian lifetime distribution and a

  5. Comparison of Weibull strength parameters from flexure and spin tests of brittle materials

    NASA Technical Reports Server (NTRS)

    Holland, Frederic A., Jr.; Zaretsky, Erwin V.

    1991-01-01

    Fracture data from five series of four-point bend tests of beams and spin tests of flat annular disks were reanalyzed. Silicon nitride and graphite were the test materials. The experimental fracture strengths of the disks were compared with the predicted strengths based on both volume-flaw and surface-flaw analyses of the four-point bend data. Volume-flaw analysis resulted in a better correlation between disks and beams in three of the five test series than did surface-flaw analysis. The Weibull moduli and characteristic gage strengths for the disks and beams were also compared. Differences in the experimental Weibull slopes were not statistically significant. It was shown that results from the beam tests can predict the fracture strength of rotating disks.
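
    The volume-flaw prediction step rests on Weibull size scaling: at equal failure probability, strengths scale with effective volumes as sigma_disk / sigma_beam = (V_beam / V_disk)^(1/m). A minimal Python sketch, with the modulus, beam strength, and effective volumes all assumed for illustration:

      # Predicting disk strength from beam data via Weibull volume scaling.
      m = 9.0               # Weibull modulus from four-point bend tests (assumed)
      sigma_beam = 650.0    # beam characteristic strength, MPa (assumed)
      V_beam = 2.0e-9       # effective stressed volume of beam, m^3 (assumed)
      V_disk = 8.0e-8       # effective stressed volume of spinning disk, m^3 (assumed)

      sigma_disk = sigma_beam * (V_beam / V_disk) ** (1.0 / m)
      print(f"predicted disk characteristic strength: {sigma_disk:.0f} MPa")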

  6. Distributed and Collaborative Software Analysis

    NASA Astrophysics Data System (ADS)

    Ghezzi, Giacomo; Gall, Harald C.

    Throughout the years software engineers have come up with a myriad of specialized tools and techniques that focus on a certain type of software analysis, such as source code analysis, co-change analysis or bug prediction. However, easy and straightforward synergies between these analyses and tools rarely exist because of their stand-alone nature, their platform dependence, their different input and output formats and the variety of data to analyze. As a consequence, distributed and collaborative software analysis scenarios, and in particular interoperability, are severely limited. We describe a distributed and collaborative software analysis platform that allows for a seamless interoperability of software analysis tools across platform, geographical and organizational boundaries. We realize software analysis tools as services that can be accessed and composed over the Internet. These distributed analysis services shall be widely accessible through our incrementally augmented Software Analysis Broker, where organizations and tool providers can register and share their tools. To allow (semi-)automatic use and composition of these tools, they are classified and mapped into a software analysis taxonomy and adhere to specific meta-models and ontologies for their category of analysis.

  7. Effect of Individual Component Life Distribution on Engine Life Prediction

    NASA Technical Reports Server (NTRS)

    Zaretsky, Erwin V.; Hendricks, Robert C.; Soditus, Sherry M.

    2003-01-01

    The effect of individual engine component life distributions on engine life prediction was determined. A Weibull-based life and reliability analysis of the NASA Energy Efficient Engine was conducted. The engine's life at a 95 and 99.9 percent probability of survival was determined based upon the engine manufacturer's original life calculations and assumed values of each of the components' cumulative life distributions as represented by a Weibull slope. The lives of the high-pressure turbine (HPT) disks and blades were also evaluated individually and as a system in a similar manner. Knowing the statistical cumulative distribution of each engine component with reasonable engineering certainty is a condition precedent to predicting the life and reliability of an entire engine. The life of a system at a given reliability will be less than the lowest-lived component in the system at the same reliability (probability of survival). Where Weibull slopes of all the engine components are equal, the Weibull slope had a minimal effect on engine L(sub 0.1) life prediction. However, at a probability of survival of 95 percent (L(sub 5) life), life decreased with increasing Weibull slope.
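
    The system-life statement above can be made concrete: with independent components, system reliability is the product of the component Weibull reliabilities, and the system life at a given reliability falls below the lowest component life at that reliability. The sketch below solves for the L(sub 5) life of a hypothetical three-component system; the slopes and characteristic lives are invented, not the Energy Efficient Engine values.

      # System L5 life from component Weibull life distributions.
      import numpy as np
      from scipy.optimize import brentq

      betas = np.array([1.5, 2.0, 3.0])      # component Weibull slopes (assumed)
      etas = np.array([60e3, 45e3, 80e3])    # characteristic lives, hours (assumed)

      def R_sys(t):
          return np.exp(-np.sum((t / etas) ** betas))   # product of component reliabilities

      L5_sys = brentq(lambda t: R_sys(t) - 0.95, 1.0, 1e5)
      L5_comp = etas * (-np.log(0.95)) ** (1.0 / betas)
      print(f"system L5 = {L5_sys:.0f} h; component L5 lives = {np.round(L5_comp)} h")
      # The system L5 comes out below the lowest component L5, as stated above.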

  8. Characteristic tensile strength and Weibull shape parameter of carbon nanotubes

    NASA Astrophysics Data System (ADS)

    Klein, Claude A.

    2007-06-01

    Recently, it has been argued [N. M. Pugno and R. S. Ruoff, J. Appl. Phys. 99, 024301 (2006)] that available carbon-nanotube (CNT) tensile strength data do not obey the "classical" Weibull statistical model. In this paper we formulate Weibull's theory in a manner suitable for assessing CNT fracture-strength data and demonstrate that, on taking into account the area S subjected to uniform tensile stresses, the data are consistent with Weibull's model. Based on available data, a characteristic strength σC (S = 1 μm²) equal to 17.6 ± 2.5 GPa in conjunction with a shape parameter m equal to 2.77 ± 0.34 provides a good description of the CNT fracture strength. In terms of effective strengths, and on assuming that the relevant area-scaling laws apply, carbon nanotubes and diamond nanofilms exhibit similar features for stressed areas ranging from 1 to 10^4 μm².

  9. Strength analysis of yttria-stabilized tetragonal zirconia polycrystals

    SciTech Connect

    Noguchi, K.; Matsuda, Y.; Oishi, M.; Masaki, T.; Nakayama, S.; Mizushina, M.

    1990-09-01

    This paper reports the tensile strength of Y{sub 2}O{sub 3}-stabilized ZrO{sub 2} polycrystals (Y-TZP) measured by a newly developed tensile testing method with a rectangular bar. The tensile strength of Y-TZP was lower than that of the three-point bend strength, and the shape of the tensile strength distribution was quite different from that of the three-point bend strength distribution. It was difficult to predict the distribution curve of the tensile strength using the data of the three-point bend strength by one-modal Weibull distribution. The distribution of the tensile strength was analyzed by two- or three-modal Weibull distribution coupled with an analysis of fracture origins. The distribution curve of the three-point bend strength which was estimated by multimodal Weibull distribution agreed favorably with that of the measured three-point bend strength values. A two-modal Weibull distribution function was formulated approximately from the distributions of the tensile and three-point bend strengths, and the estimated two-modal Weibull distribution function for the four-point bend strength agreed well with the measured four-point bend strength.

  10. Incorporating finite element analysis into component life and reliability

    NASA Technical Reports Server (NTRS)

    August, Richard; Zaretsky, Erwin V.

    1991-01-01

    A method for calculating a component's design survivability by incorporating finite element analysis and probabilistic material properties was developed. The method evaluates design parameters through direct comparisons of component survivability expressed in terms of Weibull parameters. The analysis was applied to a rotating disk with mounting bolt holes. The highest probability of failure occurred at, or near, the maximum shear stress region of the bolt holes. Distribution of failure as a function of Weibull slope affects the probability of survival. Where Weibull parameters are unknown for a rotating disk, it may be permissible to assume Weibull parameters, as well as the stress-life exponent, in order to determine the disk speed where the probability of survival is highest.

  11. The ATLAS distributed analysis system

    NASA Astrophysics Data System (ADS)

    Legger, F.; Atlas Collaboration

    2014-06-01

    In the LHC operations era, analysis of the multi-petabyte ATLAS data sample by globally distributed physicists is a challenging task. To attain the required scale the ATLAS Computing Model was designed around the concept of Grid computing, realized in the Worldwide LHC Computing Grid (WLCG), the largest distributed computational resource existing in the sciences. The ATLAS experiment currently stores over 140 PB of data and runs about 140,000 concurrent jobs continuously at WLCG sites. During the first run of the LHC, the ATLAS Distributed Analysis (DA) service has operated stably and scaled as planned. More than 1600 users submitted jobs in 2012, with 2 million or more analysis jobs per week, peaking at about a million jobs per day. The system dynamically distributes popular data to expedite processing and maximally utilize resources. The reliability of the DA service is high and steadily improving; Grid sites are continually validated against a set of standard tests, and a dedicated team of expert shifters provides user support and communicates user problems to the sites. Both the user support techniques and the direct feedback of users have been effective in improving the success rate and user experience when utilizing the distributed computing environment. In this contribution a description of the main components, activities and achievements of ATLAS distributed analysis is given. Several future improvements being undertaken will be described.

  12. Application of Weibull Criterion to failure prediction in composites

    SciTech Connect

    Cain, W. D.; Knight, Jr., C. E.

    1981-04-20

    Fiber-reinforced composite materials are being widely used in engineered structures. This report examines how the general form of the Weibull Criterion, including the evaluation of the parameters and the scaling of the parameter values, can be used for the prediction of component failure.

  13. Distributed data analysis in ATLAS

    NASA Astrophysics Data System (ADS)

    Nilsson, Paul; Atlas Collaboration

    2012-12-01

    Data analysis using grid resources is one of the fundamental challenges to be addressed before the start of LHC data taking. The ATLAS detector will produce petabytes of data per year, and roughly one thousand users will need to run physics analyses on this data. Appropriate user interfaces and helper applications have been made available to ensure that the grid resources can be used without requiring expertise in grid technology. These tools enlarge the number of grid users from a few production administrators to potentially all participating physicists. ATLAS makes use of three grid infrastructures for the distributed analysis: the EGEE sites, the Open Science Grid, and NorduGrid. These grids are managed by the gLite workload management system, the PanDA workload management system, and ARC middleware; many sites can be accessed via both the gLite WMS and PanDA. Users can choose between two front-end tools to access the distributed resources. Ganga is a tool co-developed with LHCb to provide a common interface to the multitude of execution backends (local, batch, and grid). The PanDA workload management system provides a set of utilities called PanDA Client; with these tools users can easily submit Athena analysis jobs to the PanDA-managed resources. Distributed data is managed by Don Quijote 2, a system developed by ATLAS; DQ2 is used to replicate datasets according to the data distribution policies and maintains a central catalog of file locations. The operation of the grid resources is continually monitored by the GangaRobot functional testing system, and infrequent site stress tests are performed using the HammerCloud system. In addition, the DAST shift team is a group of power users who take shifts to provide distributed analysis user support; this team has effectively relieved the burden of support from the developers.

  14. Weibull Statistics for Upper Ocean Currents with the Fokker-Planck Equation

    NASA Astrophysics Data System (ADS)

    Chu, P. C.

    2012-12-01

    Upper oceans typically exhibit a surface mixed layer with a thickness of a few to several hundred meters. This mixed layer is a key component in studies of climate, biological productivity and marine pollution. It is the link between the atmosphere and the deep ocean and directly affects the air-sea exchange of heat, momentum and gases. Vertically averaged horizontal currents across the mixed layer are driven by the residual between the Ekman transport and surface wind stress, and damped by the Rayleigh friction. A set of stochastic differential equations are derived for the two components of the current vector (u, v). The joint probability distribution function of (u, v) satisfies the Fokker-Planck equation (Chu, 2008, 2009), with the Weibull distribution as the solution for the current speed. To verify this, the PDF of the upper (0-50 m) tropical Pacific current speeds (w) was calculated from hourly ADCP data (1990-2007) at six stations of the Tropical Atmosphere Ocean project. It satisfies the two-parameter Weibull distribution reasonably well, with different characteristics between El Nino and La Nina events: in the western Pacific, the PDF of w has a larger peakedness during the La Nina events than during the El Nino events, and vice versa in the eastern Pacific. However, the PDF of w for the lower layer (100-200 m) does not fit the Weibull distribution as well as the upper layer. This is due to the different stochastic differential equations between upper and lower layers in the tropical Pacific. For the upper layer, the stochastic differential equations, established on the basis of the Ekman dynamics, have an analytical solution, i.e., the Rayleigh distribution (the simplest form of the Weibull distribution), for constant eddy viscosity K. Knowledge of the PDF of w during the El Nino and La Nina events will improve the ensemble horizontal flux calculation, which contributes to climate studies. Besides, the Weibull distribution is also identified from the

  15. Distributional Cost-Effectiveness Analysis

    PubMed Central

    Asaria, Miqdad; Griffin, Susan; Cookson, Richard

    2015-01-01

    Distributional cost-effectiveness analysis (DCEA) is a framework for incorporating health inequality concerns into the economic evaluation of health sector interventions. In this tutorial, we describe the technical details of how to conduct DCEA, using an illustrative example comparing alternative ways of implementing the National Health Service (NHS) Bowel Cancer Screening Programme (BCSP). The 2 key stages in DCEA are 1) modeling social distributions of health associated with different interventions, and 2) evaluating social distributions of health with respect to the dual objectives of improving total population health and reducing unfair health inequality. As well as describing the technical methods used, we also identify the data requirements and the social value judgments that have to be made. Finally, we demonstrate the use of sensitivity analyses to explore the impacts of alternative modeling assumptions and social value judgments. PMID:25908564

  16. Distribution-free discriminant analysis

    SciTech Connect

    Burr, T.; Doak, J.

    1997-05-01

    This report describes our experience in implementing a non-parametric (distribution-free) discriminant analysis module for use in a wide range of pattern recognition problems. Issues discussed include performance results on both real and simulated data sets, comparisons to other methods, and the computational environment. In some cases, this module performs better than other existing methods. Nearly all cases can benefit from the application of multiple methods.

  17. Spatial and Temporal Patterns of Global Onshore Wind Speed Distribution

    SciTech Connect

    Zhou, Yuyu; Smith, Steven J.

    2013-09-09

    Wind power, a renewable energy source, can play an important role in electrical energy generation. Information regarding wind energy potential is important both for energy related modeling and for decision-making in the policy community. While wind speed datasets with high spatial and temporal resolution are often ultimately used for detailed planning, simpler assumptions are often used in analysis work. An accurate representation of the wind speed frequency distribution is needed in order to properly characterize wind energy potential. Using a power density method, this study estimated global variation in wind parameters as fitted to a Weibull density function using NCEP/CFSR reanalysis data. The estimated Weibull distribution performs well in fitting the time series wind speed data at the global level according to R2, root mean square error, and power density error. The spatial, decadal, and seasonal patterns of wind speed distribution were then evaluated. We also analyzed the potential error in wind power estimation when a commonly assumed Rayleigh distribution (Weibull k = 2) is used. We find that the assumption of the same Weibull parameter across large regions can result in substantial errors. While large-scale wind speed data is often presented in the form of average wind speeds, these results highlight the need to also provide information on the wind speed distribution.
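
    The error from the Rayleigh (Weibull k = 2) assumption can be illustrated through the power density, which for Weibull-distributed wind speeds is proportional to c^3 Gamma(1 + 3/k). The sketch below fits a Weibull to synthetic wind speeds and compares it with a Rayleigh matched to the same mean speed; all numbers are placeholders for the reanalysis data.

      # Weibull fit vs. Rayleigh assumption, compared on wind power density.
      import numpy as np
      from scipy import stats
      from scipy.special import gamma

      rng = np.random.default_rng(7)
      v = 8.0 * rng.weibull(2.6, 20000)                # synthetic wind speeds (m/s)

      k, _, c = stats.weibull_min.fit(v, floc=0)       # fitted shape and scale
      c_ray = v.mean() / gamma(1.5)                    # Rayleigh scale with the same mean

      rho = 1.225                                      # air density (kg/m^3)
      pd = lambda kk, cc: 0.5 * rho * cc**3 * gamma(1.0 + 3.0 / kk)   # W/m^2, from E[v^3]
      p_fit, p_ray = pd(k, c), pd(2.0, c_ray)
      print(f"fit: k = {k:.2f}, c = {c:.2f} m/s -> {p_fit:.0f} W/m^2")
      print(f"Rayleigh assumption error: {100 * (p_ray - p_fit) / p_fit:+.1f} %")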

  18. Weibull models of fracture strengths and fatigue behavior of dental resins in flexure and shear.

    PubMed

    Baran, G R; McCool, J I; Paul, D; Boberick, K; Wunder, S

    1998-01-01

    In estimating lifetimes of dental restorative materials, it is useful to have available data on the fatigue behavior of these materials. Current efforts at estimation include several untested assumptions related to the equivalence of flaw distributions sampled by shear, tensile, and compressive stresses. Environmental influences on material properties are not accounted for, and it is unclear if fatigue limits exist. In this study, the shear and flexural strengths of three resins used as matrices in dental restorative composite materials were characterized by Weibull parameters. It was found that shear strengths were lower than flexural strengths, liquid sorption had a profound effect on characteristic strengths, and the Weibull shape parameter obtained from shear data differed for some materials from that obtained in flexure. In shear and flexural fatigue, a power law relationship applied for up to 250,000 cycles; no fatigue limits were found, and the data thus imply only one flaw population is responsible for failure. Again, liquid sorption adversely affected strength levels in most materials (decreasing shear strengths and flexural strengths by factors of 2-3) and to a greater extent than did the degree of cure or material chemistry. PMID:9730059

  19. A biology-driven receptor model for daily pollen allergy risk in Korea based on Weibull probability density function

    NASA Astrophysics Data System (ADS)

    Kim, Kyu Rang; Kim, Mijin; Choe, Ho-Seong; Han, Mae Ja; Lee, Hye-Rim; Oh, Jae-Won; Kim, Baek-Jo

    2016-07-01

    Pollen is an important cause of respiratory allergic reactions. As individual sanitation has improved, allergy risk has increased, and this trend is expected to continue due to climate change. Atmospheric pollen concentration is highly influenced by weather conditions. Regression analysis and modeling of the relationships between airborne pollen concentrations and weather conditions were performed to analyze and forecast pollen conditions. Traditionally, daily pollen concentration has been estimated using regression models that describe the relationships between observed pollen concentrations and weather conditions. These models were able to forecast daily concentrations at the sites of observation, but lacked broader spatial applicability beyond those sites. To overcome this limitation, an integrated modeling scheme was developed that is designed to represent the underlying processes of pollen production and distribution. A maximum potential for airborne pollen is first determined using the Weibull probability density function. Then, daily pollen concentration is estimated using multiple regression models. Daily risk grade levels are determined based on the risk criteria used in Korea. The mean percentages of agreement between the observed and estimated levels were 81.4-88.2 % and 92.5-98.5 % for oak and Japanese hop pollens, respectively. The new models estimated daily pollen risk more accurately than the original statistical models because of the newly integrated biological response curves. Although they overestimated seasonal mean concentration, they did not simulate all of the peak concentrations. This issue would be resolved by adding more variables that affect the prevalence and internal maturity of pollens.

  20. Comment on "On the tensile strength distribution of multiwalled carbon nanotubes" [Appl. Phys. Lett. 87, 203106 (2005)]

    NASA Astrophysics Data System (ADS)

    Lu, Chunsheng

    2008-05-01

    In a recent letter [Barber, Andrews, Schadler, and Wagner, Appl. Phys. Lett. 87, 203106 (2005)], it was indicated that Weibull-Poisson statistics could accurately model the nanotube tensile strength data, and it was then concluded that the apparent strengthening mechanism in a multiwalled carbon nanotube (MWCNT) grown by chemical vapor deposition (CVD) is most likely caused by an enhanced interaction between the walls of the nanotube. In this comment, we show that their conclusion seems to be inconsistent with the assumption introduced in the data analysis by using a two-parameter Weibull distribution. Further statistical analysis provides a new explanation for the scattered strengths of MWCNTs. The effectiveness of Weibull-Poisson statistics at nanoscales is also discussed.

  1. Investigation on the lifetime of He-Ne lasers by means of Weibull function

    SciTech Connect

    Wang Xishan; Sun Zhendong

    1987-04-01

    The failure mechanism of He-Ne lasers is compared with the physical model underlying the Weibull function. It follows that the lifetime of He-Ne lasers ought to obey a Weibull distribution. An equation for accelerated aging is derived, which can be used to readily determine the lifetime characteristics of He-Ne lasers.

  2. Test Population Selection from Weibull-Based, Monte Carlo Simulations of Fatigue Life

    NASA Technical Reports Server (NTRS)

    Vlcek, Brian L.; Zaretsky, Erwin V.; Hendricks, Robert C.

    2008-01-01

    Fatigue life is probabilistic and not deterministic. Experimentally establishing the fatigue life of materials, components, and systems is both time consuming and costly. As a result, conclusions regarding fatigue life are often inferred from a statistically insufficient number of physical tests. A proposed methodology for comparing life results as a function of variability due to Weibull parameters, variability between successive trials, and variability due to size of the experimental population is presented. Using Monte Carlo simulation of randomly selected lives from a large Weibull distribution, the variation in the L10 fatigue life of aluminum alloy AL6061 rotating rod fatigue tests was determined as a function of population size. These results were compared to the L10 fatigue lives of small (10 each) populations from AL2024, AL7075 and AL6061. For aluminum alloy AL6061, a simple algebraic relationship was established for the upper and lower L10 fatigue life limits as a function of the number of specimens failed. For most engineering applications where less than 30 percent variability can be tolerated in the maximum and minimum values, at least 30 to 35 test samples are necessary. The variability of test results based on small sample sizes can be greater than actual differences, if any, that exist between materials and can result in erroneous conclusions. The fatigue life of AL2024 is statistically longer than AL6061 and AL7075. However, there is no statistical difference between the fatigue lives of AL6061 and AL7075 even though AL7075 had a fatigue life 30 percent greater than AL6061.

  3. Test Population Selection from Weibull-Based, Monte Carlo Simulations of Fatigue Life

    NASA Technical Reports Server (NTRS)

    Vlcek, Brian L.; Zaretsky, Erwin V.; Hendricks, Robert C.

    2012-01-01

    Fatigue life is probabilistic and not deterministic. Experimentally establishing the fatigue life of materials, components, and systems is both time consuming and costly. As a result, conclusions regarding fatigue life are often inferred from a statistically insufficient number of physical tests. A proposed methodology for comparing life results as a function of variability due to Weibull parameters, variability between successive trials, and variability due to size of the experimental population is presented. Using Monte Carlo simulation of randomly selected lives from a large Weibull distribution, the variation in the L10 fatigue life of aluminum alloy AL6061 rotating rod fatigue tests was determined as a function of population size. These results were compared to the L10 fatigue lives of small (10 each) populations from AL2024, AL7075 and AL6061. For aluminum alloy AL6061, a simple algebraic relationship was established for the upper and lower L10 fatigue life limits as a function of the number of specimens failed. For most engineering applications where less than 30 percent variability can be tolerated in the maximum and minimum values, at least 30 to 35 test samples are necessary. The variability of test results based on small sample sizes can be greater than actual differences, if any, that exist between materials and can result in erroneous conclusions. The fatigue life of AL2024 is statistically longer than AL6061 and AL7075. However, there is no statistical difference between the fatigue lives of AL6061 and AL7075 even though AL7075 had a fatigue life 30 percent greater than AL6061.
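
    The Monte Carlo procedure used in both versions of this record can be sketched compactly: draw random lives from a parent Weibull, estimate the L10 life from each simulated test population, and watch the scatter shrink as the population grows. The slope, characteristic life, and trial counts below are illustrative assumptions, not the AL6061 values.

      # Variation of the estimated L10 life with test population size.
      import numpy as np

      rng = np.random.default_rng(3)
      beta, eta = 2.0, 1.0e6      # assumed Weibull slope and characteristic life (cycles)

      def l10_estimate(n):
          """L10 (10th-percentile) life estimated from n sampled specimens."""
          return np.percentile(eta * rng.weibull(beta, size=n), 10)

      for n in (10, 35, 100):
          trials = np.array([l10_estimate(n) for _ in range(2000)])
          spread = (trials.max() - trials.min()) / trials.mean()
          print(f"n = {n:>3}: mean L10 = {trials.mean():.3e} cycles, relative spread = {spread:.2f}")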

  4. Characteristic strength, Weibull modulus, and failure probability of fused silica glass

    NASA Astrophysics Data System (ADS)

    Klein, Claude A.

    2009-11-01

    The development of high-energy lasers has focused attention on the requirement to assess the mechanical strength of optical components made of fused silica or fused quartz (SiO2). The strength of this material is known to be highly dependent on the stressed area and the surface finish, but has not yet been properly characterized in the published literature. Recently, Detrio and collaborators at the University of Dayton Research Institute (UDRI) performed extensive ring-on-ring flexural strength measurements on fused SiO2 specimens ranging in size from 1 to 9 in. in diameter and of widely differing surface qualities. We report on a Weibull statistical analysis of the UDRI data, based on the procedure outlined in Proc. SPIE 4375, 241 (2001). We demonstrate that (1) a two-parameter Weibull model, including the area-scaling principle, applies; (2) the shape parameter (m ~= 10) is essentially independent of the stressed area as well as the surface finish; and (3) the characteristic strength (1-cm² uniformly stressed area) obeys a linear law, σC (in megapascals) ~= 160 - 2.83×PBS (in parts per million per steradian), where PBS characterizes the surface/subsurface "damage" of an appropriate set of test specimens. In this light, we evaluate the cumulative failure probability and the failure probability density of polished and superpolished fused SiO2 windows as a function of the biaxial tensile stress, for stressed areas ranging from 0.3 to 100 cm².
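
    Taking the reported shape parameter and the linear law for the characteristic strength at face value, the failure probability of a window follows from the two-parameter Weibull model with area scaling. The sketch below evaluates it for two surface qualities; the stress, area, and PBS values are placeholders, not the paper's cases.

      # Failure probability of fused silica under the reported Weibull model.
      import numpy as np

      m = 10.0                                    # shape parameter from the abstract
      sigma_c = lambda pbs: 160.0 - 2.83 * pbs    # MPa, for a 1 cm^2 stressed area

      def failure_probability(stress, area_cm2, pbs):
          # Area scaling multiplies the 1 cm^2 risk by the stressed area.
          return 1.0 - np.exp(-area_cm2 * (stress / sigma_c(pbs)) ** m)

      for pbs in (5.0, 20.0):                     # ppm/sr, hypothetical surface qualities
          pf = failure_probability(stress=80.0, area_cm2=10.0, pbs=pbs)
          print(f"PBS = {pbs:>4.0f} ppm/sr -> P_f(80 MPa, 10 cm^2) = {pf:.3f}")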

  5. Distributed computing and nuclear reactor analysis

    SciTech Connect

    Brown, F.B.; Derstine, K.L.; Blomquist, R.N.

    1994-03-01

    Large-scale scientific and engineering calculations for nuclear reactor analysis can now be carried out effectively in a distributed computing environment, at costs far lower than for traditional mainframes. The distributed computing environment must include support for traditional system services, such as a queuing system for batch work, reliable filesystem backups, and parallel processing capabilities for large jobs. All ANL computer codes for reactor analysis have been adapted successfully to a distributed system based on workstations and X-terminals. Distributed parallel processing has been demonstrated to be effective for long-running Monte Carlo calculations.

  6. DASH - Distributed Analysis System Hierarchy

    NASA Astrophysics Data System (ADS)

    Yagi, M.; Mizumoto, Y.; Yoshida, M.; Kosugi, G.; Takata, T.; Ogasawara, R.; Ishihara, Y.; Morita, Y.; Nakamoto, H.; Watanabe, N.

    We developed the Distributed Analysis Software Hierarchy (DASH), an object-oriented data reduction and data analysis system for efficient processing of data from the SUBARU telescope. DASH consists of many objects (data management objects, reduction engines, GUIs, etc.) distributed on CORBA. We have also developed SASH, a stand-alone system which has the same interface as DASH, but which does not use some of the distributed services such as DA/DB; visiting astronomers can detach PROCube out of DASH and continue the analysis with SASH at their home institute. SASH will be used as a quick reduction tool at the summit.

  7. Brain responses strongly correlate with Weibull image statistics when processing natural images.

    PubMed

    Scholte, H Steven; Ghebreab, Sennay; Waldorp, Lourens; Smeulders, Arnold W M; Lamme, Victor A F

    2009-01-01

    The visual appearance of natural scenes is governed by a surprisingly simple hidden structure. The distributions of contrast values in natural images generally follow a Weibull distribution, with beta and gamma as free parameters. Beta and gamma seem to structure the space of natural images in an ecologically meaningful way, in particular with respect to the fragmentation and texture similarity within an image. Since it is often assumed that the brain exploits structural regularities in natural image statistics to efficiently encode and analyze visual input, we here ask ourselves whether the brain approximates the beta and gamma values underlying the contrast distributions of natural images. We present a model that shows that beta and gamma can be easily estimated from the outputs of X-cells and Y-cells. In addition, we covaried the EEG responses of subjects viewing natural images with the beta and gamma values of those images. We show that beta and gamma explain up to 71% of the variance of the early ERP signal, substantially outperforming other tested contrast measurements. This suggests that the brain is strongly tuned to the image's beta and gamma values, potentially providing the visual system with an efficient way to rapidly classify incoming images on the basis of omnipresent low-level natural image statistics. PMID:19757938

  8. Effects of dislocation density and sample-size on plastic yielding at the nanoscale: a Weibull-like framework

    NASA Astrophysics Data System (ADS)

    Rinaldi, Antonio

    2011-11-01

    Micro-compression tests have demonstrated that plastic yielding in nanoscale pillars is the result of the fine interplay between the sample size (chiefly the diameter D) and the density of bulk dislocations ρ. The power-law scaling typical of the nanoscale stems from a source-limited regime, which depends on both these sample parameters. Based on the experimental and theoretical results available in the literature, this paper offers a perspective on the joint effect of D and ρ on the yield stress in any plastic regime, promoting also a schematic graphical map of it. In the sample-size dependent regime, this dependence is cast mathematically into a first-order Weibull-type theory, where the exponent β of the power-law scaling and the modulus m of an approximate (unimodal) Weibull distribution of source strengths can be related by a simple inverse proportionality. As a corollary, the scaling exponent β may not be a universal number, as speculated in the literature. In this context, the discussion opens the alternative possibility of more general (multimodal) source-strength distributions, which could produce more complex and realistic strengthening patterns than the single power law usually assumed. The paper re-examines our own experimental data, as well as results of Bei et al. (2008) on Mo-alloy pillars, especially for the sake of emphasizing the significance of a sudden increase in sample response scatter as a warning signal of an incipient source-limited regime.

  9. Accuracy analysis of distributed simulation systems

    NASA Astrophysics Data System (ADS)

    Lin, Qi; Guo, Jing

    2010-08-01

    Existing simulation work tends to emphasize procedural verification, focusing on the simulation models rather than on the simulation itself. As a result, research on improving simulation accuracy has been limited to individual aspects. Since accuracy is key to simulation credibility assessment and fidelity studies, it is important to give an all-round discussion of the accuracy of distributed simulation systems themselves. First, the major elements of distributed simulation systems are summarized; these serve as the basis for the definition, classification, and description of the accuracy of such systems. In Part 2, a comprehensive framework for the accuracy of distributed simulation systems is presented, which makes it more tractable to analyze and assess their uncertainty. In Part 3, the concept of accuracy is divided into four factors, each analyzed in turn. In Part 4, based on the formalized description of the accuracy-analysis framework, a practical approach is put forward that can be applied to study unexpected or inaccurate simulation results. Finally, a real distributed simulation system based on HLA is taken as an example to verify the usefulness of the proposed approach. The results show that the method works well and is applicable to accuracy analysis of distributed simulation systems.

  10. Analysis of distribution of critical current of bent-damaged Bi2223 composite tape

    NASA Astrophysics Data System (ADS)

    Ochiai, S.; Okuda, H.; Sugano, M.; Hojo, M.; Osamura, K.; Kuroda, T.; Kumakura, H.; Kitaguchi, H.; Itoh, K.; Wada, H.

    2011-10-01

    Distributions of critical current of damaged Bi2223 tape specimens bent by 0.6, 0.8 and 1.0% were investigated analytically with a modelling approach based on correlating damage evolution with the distribution of critical current. It was revealed that the distribution of critical current is described by a three-parameter Weibull distribution function through the distribution of the tensile damage strain of the Bi2223 filaments, which determines the damage front in the bent composite tape. It was also shown that the measured distribution of critical current values can be reproduced successfully by a Monte Carlo simulation using the distributions of tensile damage strain of the filaments and the original critical current.
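
    A hedged sketch of the kind of Monte Carlo described: filament tensile damage strains are drawn from a three-parameter Weibull, and the fraction of filaments surviving a given bending strain is mapped onto a retained critical current. All numbers are illustrative assumptions, not values from the paper, and the strain gradient across the tape thickness is ignored for brevity.

      import numpy as np

      rng = np.random.default_rng(1)
      n_tapes, n_filaments = 500, 1000
      shape, scale, loc = 5.0, 0.004, 0.002      # assumed Weibull of damage strain
      bend_strain = 0.008                        # 0.8% applied bending strain
      ic0 = 100.0                                # assumed undamaged critical current (A)

      damage_strain = loc + scale * rng.weibull(shape, (n_tapes, n_filaments))
      surviving = (damage_strain > bend_strain).mean(axis=1)   # intact fraction per tape
      ic = ic0 * surviving                                     # retained critical current
      print(f"mean Ic = {ic.mean():.1f} A, std = {ic.std():.2f} A")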

  11. Analysis of cascade impactor mass distributions.

    PubMed

    Dunbar, Craig; Mitchell, Jolyon

    2005-01-01

    The purpose of this paper is to review the approaches for analyzing cascade impactor (CI) mass distributions produced by pulmonary drug products and the considerations necessary for selecting the appropriate analysis procedure. There are several methods available for analyzing CI data, yielding a hierarchy of information in terms of nominal, ordinal and continuous variables. Analyzing mass distributions as a nominal function of the stages and auxiliary components is the simplest approach for examining the whole mass emitted by the inhaler. However, the relationship between the mass distribution and aerodynamic diameter is not described by such data. This relationship is a critical attribute of pulmonary drug products because of the association between aerodynamic diameter and the mass of particulates deposited in the respiratory tract. Therefore, the nominal mass distribution can only be utilized to make decisions on the discrete masses collected in the CI. Mass distributions analyzed as an ordinal function of aerodynamic diameter can be obtained by introducing the stage size ranges, which generally vary in magnitude from one stage to another for a given type of CI and differ between CIs of different designs. Furthermore, the mass collected by specific size ranges within the CI is often incorrectly used to estimate in vivo deposition in various regions of the respiratory tract. A CI-generated mass distribution can be directly related to aerodynamic diameter by expressing the mass collected by each size-fractionating stage in terms of either mass frequency or cumulative mass fraction less than the aerodynamic size appropriate to each stage. Analysis of the aerodynamic diameter as a continuous variable allows comparison of mass distributions obtained from different products, obtained by different CI designs, as well as providing input to in vivo particle deposition models. The lack of information about the mass fraction emitted by the inhaler that is not size-analyzed by
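
    The cumulative-undersize transformation described above can be sketched in a few lines; the stage cut-off diameters and masses below are illustrative only, and a log-scale interpolation of the 50% point (the MMAD) is usually preferred over the linear one used here for brevity.

      import numpy as np

      cutoffs = np.array([9.0, 5.8, 4.7, 3.3, 2.1, 1.1, 0.7, 0.4])   # um, descending stage cut-offs
      mass = np.array([5.0, 8.0, 12.0, 20.0, 25.0, 18.0, 8.0, 4.0])  # drug mass per stage (ug)

      frac_finer = 1.0 - np.cumsum(mass) / mass.sum()   # cumulative fraction finer than each cut-off
      mmad = np.interp(0.5, frac_finer[::-1], cutoffs[::-1])
      print(f"MMAD ~ {mmad:.2f} um")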

  12. Distributional Cost-Effectiveness Analysis: A Tutorial.

    PubMed

    Asaria, Miqdad; Griffin, Susan; Cookson, Richard

    2016-01-01

    Distributional cost-effectiveness analysis (DCEA) is a framework for incorporating health inequality concerns into the economic evaluation of health sector interventions. In this tutorial, we describe the technical details of how to conduct DCEA, using an illustrative example comparing alternative ways of implementing the National Health Service (NHS) Bowel Cancer Screening Programme (BCSP). The 2 key stages in DCEA are 1) modeling social distributions of health associated with different interventions, and 2) evaluating social distributions of health with respect to the dual objectives of improving total population health and reducing unfair health inequality. As well as describing the technical methods used, we also identify the data requirements and the social value judgments that have to be made. Finally, we demonstrate the use of sensitivity analyses to explore the impacts of alternative modeling assumptions and social value judgments. PMID:25908564

  13. Comparison of the Weibull characteristics of hydroxyapatite and strontium doped hydroxyapatite.

    PubMed

    Yatongchai, Chokchai; Wren, Anthony W; Curran, Declan J; Hornez, Jean-Christophe; Towler, Mark R

    2013-05-01

    The effects of two strontium (Sr) additions, 5% and 10% of the total calcium (Ca) content, on the phase assemblage and Weibull statistics of hydroxyapatite (HA) are investigated and compared to those of undoped HA. Sintering was carried out in the range of 900-1200 °C in steps of 100 °C in a conventional furnace. Sr content had little effect on the mean particulate size. Decomposition of the HA phase occurred with Sr incorporation, while β-TCP stabilization was shown to occur with 10% Sr additions. Porosity in both sets of doped samples was at a level comparable to that of the undoped HA samples; however, the 5% Sr-HA samples displayed the greatest reduction in porosity with increasing temperature, while the porosity of the 10% Sr-HA samples remained relatively constant over the full sintering temperature range. The undoped HA samples displayed the greatest Weibull strengths, and porosity was determined to be the major controlling factor. However, with the introduction of decomposition phases in the Sr-HA samples, the dependence of strength on porosity is reduced and the phase assemblage becomes the more dominant factor for Weibull strength. The Weibull modulus is relatively independent of the porosity in the undoped HA samples. The 5% Sr-HA samples experience a slight increase in Weibull modulus with porosity, indicating a possible relationship between the parameters. However, the 10% Sr-HA samples show the highest Weibull modulus, with a value of approximately 15 across all sintering temperatures. It is postulated that this is due to the increased amount of surface and lattice diffusion that these samples undergo, which effectively smooths out flaws in the microstructure, owing to a saturation of Sr content affecting grain boundary movement. PMID:23524073
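
    The Weibull strength statistics referred to here are commonly obtained from a probability plot; the sketch below, on synthetic strength data rather than the paper's HA measurements, estimates the Weibull modulus as the slope of a linear fit on ln-ln axes using median-rank probabilities.

      import numpy as np

      rng = np.random.default_rng(2)
      strength = np.sort(rng.weibull(12.0, 30) * 80.0)      # MPa, synthetic fracture strengths
      n = strength.size
      prob_fail = (np.arange(1, n + 1) - 0.3) / (n + 0.4)   # median-rank estimator

      x = np.log(strength)
      y = np.log(-np.log(1.0 - prob_fail))
      m, c = np.polyfit(x, y, 1)        # slope = Weibull modulus
      sigma0 = np.exp(-c / m)           # characteristic strength
      print(f"Weibull modulus m = {m:.1f}, characteristic strength = {sigma0:.1f} MPa")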

  14. Distributed analysis in ATLAS using GANGA

    NASA Astrophysics Data System (ADS)

    Elmsheuser, Johannes; Brochu, Frederic; Cowan, Greig; Egede, Ulrik; Gaidioz, Benjamin; Lee, Hurng-Chun; Maier, Andrew; Móscicki, Jakub; Pajchel, Katarina; Reece, Will; Samset, Bjorn; Slater, Mark; Soroko, Alexander; Vanderster, Daniel; Williams, Michael

    2010-04-01

    Distributed data analysis using Grid resources is one of the fundamental applications in high energy physics to be addressed and realized before the start of LHC data taking. The demands on resource management are very high: in every experiment, up to a thousand physicists will be submitting analysis jobs to the Grid. Appropriate user interfaces and helper applications have to be made available to ensure that all users can use the Grid without expertise in Grid technology. These tools enlarge the number of Grid users from a few production administrators to potentially all participating physicists. The GANGA job management system (http://cern.ch/ganga), developed as a common project between the ATLAS and LHCb experiments, provides and integrates these kinds of tools. GANGA provides a simple and consistent way of preparing, organizing and executing analysis tasks within the experiment analysis framework, implemented through a plug-in system. It allows trivial switching between running test jobs on a local batch system and running large-scale analyses on the Grid, hiding Grid technicalities. We report on the plug-ins and our experiences of distributed data analysis using GANGA within the ATLAS experiment. Support is provided for all Grids presently used by ATLAS, namely LCG/EGEE, NDGF/NorduGrid, and OSG/PanDA. The integration of and interaction with the ATLAS data management system DQ2 is a key functionality of GANGA. Intelligent job brokering is set up using the job-splitting mechanism together with dataset and file location knowledge. The brokering is aided by an automated system that regularly processes test analysis jobs at all ATLAS DQ2-supported sites. Large numbers of analysis jobs can be sent to the locations of the data following the ATLAS computing model. GANGA supports, among other things, user analysis of reconstructed data and small-scale production of Monte Carlo data.

  15. A Novel Conditional Probability Density Distribution Surface for the Analysis of the Drop Life of Solder Joints Under Board Level Drop Impact

    NASA Astrophysics Data System (ADS)

    Gu, Jian; Lei, YongPing; Lin, Jian; Fu, HanGuang; Wu, Zhongwei

    2016-01-01

    The scattering of fatigue life data is a common problem and is usually described using the normal distribution or Weibull distribution. For solder joints under drop impact, due to the complicated stress distribution, the relationship between the stress and the drop life is so far unknown. Furthermore, it is important to establish a function describing the change in standard deviation for solder joints under different drop impact levels. Therefore, in this study, a novel conditional probability density distribution surface (CPDDS) was established for the analysis of the drop life of solder joints. The relationship between the drop impact acceleration and the drop life is proposed, which comprehensively considers the stress distribution. A novel exponential model was adopted for describing the change of the standard deviation with the impact acceleration (0 → +∞). To validate the model, the drop life of Sn-3.0Ag-0.5Cu solder joints was analyzed. The probability density curve of the logarithm of the fatigue life distribution can be easily obtained for a certain acceleration level fixed on the acceleration axis of the CPDDS. The P-A-N curve was also obtained using the functions μ(A) and σ(A), which can reflect the regularity of the life data for an overall reliability P.
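
    A hedged sketch of a CPDDS-style construction: assume the log-life is normally distributed with an acceleration-dependent mean mu(A) and an exponential-type standard deviation sigma(A), then evaluate the conditional probability density slice by slice. The functional forms and values below are assumptions for illustration, not the paper's fitted model.

      import numpy as np
      from scipy import stats

      def mu(A):    return 8.0 - 0.002 * A                    # assumed mean of ln(life) vs acceleration
      def sigma(A): return 0.2 + 0.6 * np.exp(-A / 500.0)     # assumed exponential-type model

      logN = np.linspace(4.0, 9.0, 200)                       # ln(drops to failure)
      for A in (200.0, 600.0, 1000.0, 1500.0):                # impact acceleration levels (g)
          pdf = stats.norm.pdf(logN, mu(A), sigma(A))         # one slice of the pdf surface
          print(f"A = {A:6.0f} g: pdf peaks at lnN = {logN[pdf.argmax()]:.2f}")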

  16. Analysis and Modelling of Extreme Wind Speed Distributions in Complex Mountainous Regions

    NASA Astrophysics Data System (ADS)

    Laib, Mohamed; Kanevski, Mikhail

    2016-04-01

    Modelling of wind speed distributions in complex mountainous regions is an important and challenging problem of interest to scientists in several fields. In the present research, high-frequency (10 min) Swiss wind speed monitoring data (IDAWEB service, Meteosuisse) are analysed and modelled with different parametric distributions (Weibull, GEV, Gamma, etc.) using the maximum likelihood method. In total, 111 stations located in different geomorphological units and at different altitudes (from 203 to 3580 meters) are studied. This information is then used to train machine learning algorithms (Extreme Learning Machines, Support Vector Machines) to predict the distribution at new places, which is potentially useful for aeolian energy generation. An important part of the research deals with the construction and application of a high-dimensional input feature space generated from a digital elevation model. A comprehensive study was carried out using a feature selection approach to obtain the best model for the prediction. The main results are presented as spatial patterns of the distributions' parameters.

  17. Performance optimisations for distributed analysis in ALICE

    NASA Astrophysics Data System (ADS)

    Betev, L.; Gheata, A.; Gheata, M.; Grigoras, C.; Hristov, P.

    2014-06-01

    Performance is a critical issue in a production system accommodating hundreds of analysis users. Compared to a local session, distributed analysis is exposed to services and network latencies, remote data access and a heterogeneous computing infrastructure, creating a more complex performance and efficiency optimization matrix. During the last 2 years, ALICE analysis shifted from a fast development phase to more mature and stable code. At the same time, the frameworks and tools for deployment, monitoring and management of large productions have evolved considerably too. The ALICE Grid production system is currently used by a fair share of organized and individual user analysis, consuming up to 30% of the available resources and ranging from fully I/O-bound analysis code to CPU-intensive correlation or resonance studies. While the intrinsic analysis performance is unlikely to improve by a large factor during the LHC long shutdown (LS1), the overall efficiency of the system still has to be improved by a significant factor to satisfy the analysis needs. We have instrumented all analysis jobs with "sensors" collecting comprehensive monitoring information on the job running conditions and performance in order to identify bottlenecks in the data processing flow. These data are collected by the MonALISA-based ALICE Grid monitoring system and are used to steer and improve the job submission and management policy, to identify operational problems in real time and to perform automatic corrective actions. In parallel with an upgrade of our production system, we are aiming for low-level improvements related to data format, data management and merging of results to allow for better-performing ALICE analysis.

  18. Analysis of Jingdong Mall Logistics Distribution Model

    NASA Astrophysics Data System (ADS)

    Shao, Kang; Cheng, Feng

    In recent years, the development of electronic commerce in China has accelerated, and the role of logistics has been highlighted: more and more e-commerce enterprises are beginning to realize the importance of logistics to the success or failure of the enterprise. In this paper, the authors take Jingdong Mall as an example, perform a SWOT analysis of the current state of its self-built logistics system, identify the problems existing in the current Jingdong Mall logistics distribution, and give appropriate recommendations.

  19. EXPERIMENTAL DESIGN STRATEGY FOR THE WEIBULL DOSE RESPONSE MODEL (JOURNAL VERSION)

    EPA Science Inventory

    The objective of the research was to determine optimum design point allocation for estimation of relative yield losses from ozone pollution when the true and fitted yield-ozone dose response relationship follows the Weibull. The optimum design is dependent on the values of the We...

  20. DASH--distributed analysis system hierarchy

    NASA Astrophysics Data System (ADS)

    Yagi, Masafumi; Yoshihiko, Mizumoto; Ogasawara, Ryusuke; Kosugi, George; Takata, Tadafumi; Ishihara, Yasuhide; Yokono, Yasunori; Morita, Yasuhiro; Nakamoto, Hiroyuki; Watanabe, Noboru; Ukawa, Kentaro

    2002-12-01

    We have developed and are operating an object-oriented data reduction and data analysis system, DASH (Distributed Analysis Software Hierarchy), for efficient data processing for the SUBARU telescope. In DASH, all information for reducing a set of data is packed into an abstracted object, named a "Hierarchy". It contains rules for searching calibration data, the reduction procedure leading to the final result, and the reduction log. With Hierarchy, DASH works as an automated reduction pipeline platform in cooperation with STARS (Subaru Telescope ARchive System). DASH is implemented with CORBA and Java technology. The portability of these technologies enables us to make a subset of the system for a small stand-alone system, SASH. SASH is compatible with DASH, and one can continuously reduce and analyze data between DASH and SASH.

  1. Analysis and control of distributed cooperative systems.

    SciTech Connect

    Feddema, John Todd; Parker, Eric Paul; Wagner, John S.; Schoenwald, David Alan

    2004-09-01

    As part of DARPA Information Processing Technology Office (IPTO) Software for Distributed Robotics (SDR) Program, Sandia National Laboratories has developed analysis and control software for coordinating tens to thousands of autonomous cooperative robotic agents (primarily unmanned ground vehicles) performing military operations such as reconnaissance, surveillance and target acquisition; countermine and explosive ordnance disposal; force protection and physical security; and logistics support. Due to the nature of these applications, the control techniques must be distributed, and they must not rely on high bandwidth communication between agents. At the same time, a single soldier must easily direct these large-scale systems. Finally, the control techniques must be provably convergent so as not to cause undue harm to civilians. In this project, provably convergent, moderate communication bandwidth, distributed control algorithms have been developed that can be regulated by a single soldier. We have simulated in great detail the control of low numbers of vehicles (up to 20) navigating throughout a building, and we have simulated in lesser detail the control of larger numbers of vehicles (up to 1000) trying to locate several targets in a large outdoor facility. Finally, we have experimentally validated the resulting control algorithms on smaller numbers of autonomous vehicles.

  2. Rectangular shape distributed piezoelectric actuator: analytical analysis

    NASA Astrophysics Data System (ADS)

    Sun, Bohua; Qiu, Yan

    2004-04-01

    This paper is focused on the development of distributed piezoelectric actuators (DPAs) with rectangular shapes using PZT materials. Analytical models of rectangular DPAs have been constructed in order to analyse and test the performance of DPA products. First, based on the theory of electromagnetics, DPAs have been treated as a type of capacitor. The charge density distributed on the interdigitated electrodes (IDEs) applied in the actuators and the capacitance of the DPAs have been calculated, and the distribution and intensity of the electric field in the DPA element have been calculated accurately. Second, based on the piezoelectric constitutive relations and compound plate theory, models for the mechanical strain and stress fields of DPAs have been developed, and the performance of rectangular DPAs has been discussed. Finally, on the basis of the models developed in this paper, an improved design of a rectangular DPA has been discussed and summarized. Because a minimum of simplifying hypotheses is used in the calculations, the distinguishing feature of this paper is that the distribution and intensity of the electric field in DPAs are obtained accurately. Such accurate calculations have not appeared in the literature and can be used in DPA design and manufacturing processes in order to improve mechanical performance and reduce the cost of DPA products in further applications. In this paper, all analysis and calculation have been done with MATLAB and MathCAD. The FEM results used for comparison were obtained using the ABAQUS program.

  3. CMS distributed data analysis with CRAB3

    DOE PAGESBeta

    Mascheroni, M.; Balcas, J.; Belforte, S.; Bockelman, B. P.; Hernandez, J. M.; Ciangottini, D.; Konstantinov, P. B.; Silva, J. M. D.; Ali, M. A. B. M.; Melo, A. M.; et al

    2015-12-23

    The CMS Remote Analysis Builder (CRAB) is a distributed workflow management tool which facilitates analysis tasks by isolating users from the technical details of the Grid infrastructure. Throughout LHC Run 1, CRAB has been successfully employed by an average of 350 distinct users each week executing about 200,000 jobs per day. CRAB has been significantly upgraded in order to face the new challenges posed by LHC Run 2. Components of the new system include 1) a lightweight client, 2) a central primary server which communicates with the clients through a REST interface, 3) secondary servers which manage user analysis tasks and submit jobs to the CMS resource provisioning system, and 4) a central service to asynchronously move user data from temporary storage in the execution site to the desired storage location. Furthermore, the new system improves the robustness, scalability and sustainability of the service. Here we provide an overview of the new system, operation, and user support, report on its current status, and identify lessons learned from the commissioning phase and production roll-out.

  4. CMS distributed data analysis with CRAB3

    NASA Astrophysics Data System (ADS)

    Mascheroni, M.; Balcas, J.; Belforte, S.; Bockelman, B. P.; Hernandez, J. M.; Ciangottini, D.; Konstantinov, P. B.; Silva, J. M. D.; Ali, M. A. B. M.; Melo, A. M.; Riahi, H.; Tanasijczuk, A. J.; Yusli, M. N. B.; Wolf, M.; Woodard, A. E.; Vaandering, E.

    2015-12-01

    The CMS Remote Analysis Builder (CRAB) is a distributed workflow management tool which facilitates analysis tasks by isolating users from the technical details of the Grid infrastructure. Throughout LHC Run 1, CRAB has been successfully employed by an average of 350 distinct users each week executing about 200,000 jobs per day. CRAB has been significantly upgraded in order to face the new challenges posed by LHC Run 2. Components of the new system include 1) a lightweight client, 2) a central primary server which communicates with the clients through a REST interface, 3) secondary servers which manage user analysis tasks and submit jobs to the CMS resource provisioning system, and 4) a central service to asynchronously move user data from temporary storage in the execution site to the desired storage location. The new system improves the robustness, scalability and sustainability of the service. Here we provide an overview of the new system, operation, and user support, report on its current status, and identify lessons learned from the commissioning phase and production roll-out.

  6. Ceramics Analysis and Reliability Evaluation of Structures (CARES). Users and programmers manual

    NASA Technical Reports Server (NTRS)

    Nemeth, Noel N.; Manderscheid, Jane M.; Gyekenyesi, John P.

    1990-01-01

    This manual describes how to use the Ceramics Analysis and Reliability Evaluation of Structures (CARES) computer program. The primary function of the code is to calculate the fast fracture reliability or failure probability of macroscopically isotropic ceramic components. These components may be subjected to complex thermomechanical loadings, such as those found in heat engine applications. The program uses results from MSC/NASTRAN or ANSYS finite element analysis programs to evaluate component reliability due to inherent surface and/or volume type flaws. CARES utilizes the Batdorf model and the two-parameter Weibull cumulative distribution function to describe the effect of multiaxial stress states on material strength. The principle of independent action (PIA) and the Weibull normal stress averaging models are also included. Weibull material strength parameters, the Batdorf crack density coefficient, and other related statistical quantities are estimated from four-point bend bar or uniform uniaxial tensile specimen fracture strength data. Parameter estimation can be performed for single or multiple failure modes by using least-squares analysis or the maximum likelihood method. Kolmogorov-Smirnov and Anderson-Darling goodness-of-fit tests, ninety percent confidence intervals on the Weibull parameters, and Kanofsky-Srinivasan ninety percent confidence band values are also provided. The probabilistic fast-fracture theories used in CARES, along with the input and output for CARES, are described. Example problems to demonstrate various features of the program are also included. This manual describes the MSC/NASTRAN version of the CARES program.
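
    The two-parameter Weibull/PIA combination mentioned can be illustrated with a toy calculation (not the CARES code itself): each tensile principal stress contributes an independent Weibull term in the survival probability. In a real analysis this expression is integrated over the component volume or surface; the values below are assumptions.

      import numpy as np

      def pia_failure_probability(principal_stresses, m, sigma0):
          """P_f = 1 - exp(-sum_i (sigma_i / sigma0)^m) over tensile principal stresses."""
          s = np.clip(np.asarray(principal_stresses, dtype=float), 0.0, None)
          return 1.0 - np.exp(-np.sum((s / sigma0) ** m))

      # Illustrative element: principal stresses (300, 150, -50) MPa, m = 10, sigma0 = 400 MPa
      print(f"P_f = {pia_failure_probability([300.0, 150.0, -50.0], 10.0, 400.0):.4f}")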

  7. A Mixed Kijima Model Using the Weibull-Based Generalized Renewal Processes

    PubMed Central

    2015-01-01

    Generalized Renewal Processes are useful for approaching the rejuvenation of dynamical systems resulting from planned or unplanned interventions. We present new perspectives for Generalized Renewal Processes in general and for the Weibull-based Generalized Renewal Processes in particular. In contrast to the existing literature, we present a mixed Generalized Renewal Processes approach involving Kijima Type I and II models, allowing one to infer the impact of distinct interventions on the performance of the system under study. The first and second theoretical moments of this model are introduced, as well as its maximum likelihood estimation and random sampling approaches. In order to illustrate the usefulness of the proposed Weibull-based Generalized Renewal Processes model, some real data sets involving improving, stable, and deteriorating systems are used. PMID:26197222
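
    A minimal sketch of random sampling from a Weibull-based generalized renewal process with Kijima virtual ages, assuming a rejuvenation parameter q: conditional on virtual age v, the next time to failure is drawn from the Weibull left-truncated at v via inverse-transform sampling. Parameter values are illustrative.

      import numpy as np

      def sample_grp(n, beta, theta, q, kijima_type=1, seed=0):
          rng = np.random.default_rng(seed)
          v, times = 0.0, []
          for _ in range(n):
              u = rng.uniform()
              # conditional inverse CDF: x = theta*((v/theta)^beta - ln(1-u))^(1/beta) - v
              x = theta * ((v / theta) ** beta - np.log1p(-u)) ** (1.0 / beta) - v
              times.append(x)
              v = v + q * x if kijima_type == 1 else q * (v + x)   # Kijima Type I vs Type II
          return np.array(times)

      print(sample_grp(5, beta=1.8, theta=100.0, q=0.3).round(1))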

  8. Sensitivity analysis of distributed volcanic source inversion

    NASA Astrophysics Data System (ADS)

    Cannavo', Flavio; Camacho, Antonio G.; González, Pablo J.; Puglisi, Giuseppe; Fernández, José

    2016-04-01

    A recently proposed algorithm (Camacho et al., 2011) claims to rapidly estimate magmatic sources from surface geodetic data without any a priori assumption about source geometry. The algorithm takes advantage of the fast calculation afforded by analytical models and adds the capability to model free-shape distributed sources. Assuming homogeneous elastic conditions, the approach can determine general geometrical configurations of pressurized and/or density sources and/or sliding structures corresponding to prescribed values of anomalous density, pressure and slip. These source bodies are described as aggregations of elemental point sources for pressure, density and slip, and they fit the whole data set (subject to some 3D regularity conditions). Although some examples and applications have already been presented to demonstrate the ability of the algorithm to reconstruct a magma pressure source (e.g., Camacho et al., 2011; Cannavò et al., 2015), a systematic analysis of the sensitivity and reliability of the algorithm is still lacking. In this exploratory work we present results from a large statistical test designed to evaluate the advantages and limitations of the methodology by assessing its sensitivity to the free and constrained parameters involved in the inversions. In particular, besides the source parameters, we focus on the ground deformation network topology and on noise in the measurements. The proposed analysis can be used for a better interpretation of the algorithm results in real-case applications. Camacho, A. G., González, P. J., Fernández, J. & Berrino, G. (2011) Simultaneous inversion of surface deformation and gravity changes by means of extended bodies with a free geometry: Application to deforming calderas. J. Geophys. Res. 116. Cannavò F., Camacho A.G., González P.J., Mattia M., Puglisi G., Fernández J. (2015) Real Time Tracking of Magmatic Intrusions by means of Ground Deformation Modeling during Volcanic Crises, Scientific Reports, 5 (10970) doi:10.1038/srep

  9. Transmission and distribution-loss analysis

    SciTech Connect

    Not Available

    1982-05-01

    A previous study developed a methodology for determining the losses in the various elements of an electric utility transmission and distribution system using only generally published system data. In that study, the losses at the system peak and the average annual losses of the Niagara Mohawk Power Corporation system were calculated to illustrate the methods. Since little or no system loss data were available at that time, the methodology of the loss calculations was not verified. The purpose of this study was to verify the methods proposed in the previous study. The data, estimates, assumptions, and calculation methods of the original study were checked against actual Niagara Mohawk system data. The losses calculated in the original study were compared to the system losses derived from actual system data. Revisions to the original methods were recommended to improve the accuracy of the results. As a result of the analysis done in this study, the methods developed in the original study were revised. The revised methods provide reasonable loss calculation results for the Niagara Mohawk system. These methods, along with discussions of their application, are given, together with a description of the procedures followed to find the system losses from the actual system data. The revised loss calculation methods, using published data and based on the Niagara Mohawk system data, operation, and loadings, gave reasonable results for that system, and the method may be applicable to similar systems.

  10. On the gap between an empirical distribution and an exponential distribution of waiting times for price changes in a financial market

    NASA Astrophysics Data System (ADS)

    Sazuka, Naoya

    2007-03-01

    We analyze waiting times for price changes in a foreign currency exchange rate. Recent empirical studies of high-frequency financial data support the finding that trades in financial markets do not follow a Poisson process and that the waiting times between trades are not exponentially distributed. Here we show that our data are well approximated by a Weibull distribution rather than an exponential distribution in the non-asymptotic regime. Moreover, we quantitatively evaluate how far the empirical data are from an exponential distribution using a Weibull fit. Finally, we discuss a transition between a Weibull law and a power law in the long-time asymptotic regime.
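
    The comparison described can be sketched by fitting both candidate models and comparing log-likelihoods; the data below are synthetic stand-ins for the actual inter-trade waiting times.

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(3)
      waits = rng.weibull(0.6, 5000) * 10.0        # heavy-tailed synthetic waiting times

      c, _, scale_w = stats.weibull_min.fit(waits, floc=0)
      _, scale_e = stats.expon.fit(waits, floc=0)

      ll_weib = stats.weibull_min.logpdf(waits, c, 0, scale_w).sum()
      ll_expo = stats.expon.logpdf(waits, 0, scale_e).sum()
      print(f"Weibull shape = {c:.2f}; logL: Weibull {ll_weib:.0f} vs exponential {ll_expo:.0f}")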

  11. Distributed Design and Analysis of Computer Experiments

    SciTech Connect

    Doak, Justin

    2002-11-11

    DDACE is a C++ object-oriented software library for the design and analysis of computer experiments. DDACE can be used to generate samples from a variety of sampling techniques. These samples may be used as input to an application code. DDACE also contains statistical tools such as response surface models and correlation coefficients to analyze input/output relationships between variables in an application code. DDACE can generate input values for uncertain variables within a user's application. For example, a user might like to vary a temperature variable as well as some material variables in a series of simulations. Through the series of simulations the user might be looking for optimal settings of parameters based on some user criteria, or the user may be interested in the sensitivity to input variability shown by an output variable. In either case, the user may provide information about the suspected ranges and distributions of a set of input variables, along with a sampling scheme, and DDACE will generate input points based on these specifications. The input values generated by DDACE and the one or more outputs computed through the user's application code can be analyzed with a variety of statistical methods. This can lead to a wealth of information about the relationships between the variables in the problem. While statistical and mathematical packages may be employed to carry out the analysis of the input/output relationships, DDACE also contains some tools for analyzing the simulation data. DDACE incorporates a software package called MARS (Multivariate Adaptive Regression Splines), developed by Jerome Friedman. MARS is used for generating a spline surface fit of the data. With MARS, a model simplification may be calculated using the input and corresponding output values for the user's application problem. The MARS grid data may be used for generating 3-dimensional response surface plots of the simulation data. DDACE also contains an implementation of an

  13. Distribution entropy analysis of epileptic EEG signals.

    PubMed

    Li, Peng; Yan, Chang; Karmakar, Chandan; Liu, Changchun

    2015-08-01

    It is an open-ended challenge to accurately detect epileptic seizures through electroencephalogram (EEG) signals. Recently published studies have made elaborate attempts to distinguish between normal and epileptic EEG signals with advanced nonlinear entropy methods, such as approximate entropy, sample entropy, fuzzy entropy, and permutation entropy. Most recently, a novel distribution entropy (DistEn) has been reported to have superior performance compared with the conventional entropy methods, especially for short-length data. We thus aimed, in the present study, to show the potential of DistEn in the analysis of epileptic EEG signals. The publicly accessible Bonn database, which consists of normal, interictal, and ictal EEG signals, was used in this study. Three different measurement protocols were set up to better understand the performance of DistEn: i) calculate the DistEn of a specific EEG signal using the full recording; ii) calculate the DistEn by averaging the results for all its possible non-overlapping 5-second segments; and iii) calculate it by averaging the DistEn values for all possible non-overlapping segments of 1-second length. Results for all three protocols indicated a statistically significantly increased DistEn for the ictal class compared with both the normal and interictal classes. Moreover, the results obtained under the third protocol, which used only very short segments (1 s) of EEG recordings, showed a significantly (p < 0.05) increased DistEn for the interictal class in comparison with the normal class, whereas both analyses using relatively long EEG signals failed to track this difference between them, which may be due to a nonstationarity effect on the entropy algorithm. The capability of discriminating between normal and interictal EEG signals is of great clinical relevance, since it may provide helpful tools for the detection of seizure onset. Therefore, our study suggests that the Dist
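
    A hedged sketch of the DistEn computation as described by Li et al.: embed the series, take all pairwise Chebyshev distances between embedding vectors, histogram those distances, and compute the normalized Shannon entropy of the resulting distribution. The embedding dimension and bin count below are assumptions.

      import numpy as np

      def dist_en(x, m=2, bins=512):
          x = np.asarray(x, dtype=float)
          n = len(x) - m + 1
          emb = np.lib.stride_tricks.sliding_window_view(x, m)           # (n, m) embedding
          d = np.max(np.abs(emb[:, None, :] - emb[None, :, :]), axis=2)  # Chebyshev distances
          d = d[np.triu_indices(n, k=1)]                                 # distinct pairs only
          p, _ = np.histogram(d, bins=bins)
          p = p[p > 0] / p.sum()
          return -np.sum(p * np.log2(p)) / np.log2(bins)

      rng = np.random.default_rng(4)
      print(f"DistEn(white noise) = {dist_en(rng.standard_normal(500)):.3f}")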

  14. Survival Analysis of Patients with End Stage Renal Disease

    NASA Astrophysics Data System (ADS)

    Urrutia, J. D.; Gayo, W. S.; Bautista, L. A.; Baccay, E. B.

    2015-06-01

    This paper provides a survival analysis of End Stage Renal Disease (ESRD) using Kaplan-Meier estimates and the Weibull distribution. The data were obtained from the records of V. L. Makabali Memorial Hospital with respect to time t (patient's age), covariates such as developed secondary disease (pulmonary congestion and cardiovascular disease), gender, and the event of interest: the death of ESRD patients. Survival and hazard rates were estimated using NCSS for the Weibull distribution and SPSS for the Kaplan-Meier estimates. These lead to the same conclusion: the hazard rate increases and the survival rate decreases with time for ESRD patients diagnosed with pulmonary congestion, cardiovascular disease, or both. It also shows that female patients have a greater risk of death compared to males. The probability of risk was given by the equation R = 1 - e^(-H(t)), where e^(-H(t)) is the survival function and H(t) the cumulative hazard function, obtained using Cox regression.
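
    The quoted relations can be made concrete in a few lines: with a cumulative hazard H(t), the survival function is S(t) = e^(-H(t)) and the risk is R(t) = 1 - S(t). A Weibull cumulative hazard with assumed parameters is used below purely for illustration; it is not the fitted hospital model.

      import numpy as np

      def weibull_cum_hazard(t, k, lam):
          return (np.asarray(t, dtype=float) / lam) ** k

      t = np.array([1.0, 5.0, 10.0])      # illustrative follow-up times
      H = weibull_cum_hazard(t, k=1.4, lam=8.0)
      S = np.exp(-H)                      # survival function e^(-H(t))
      R = 1.0 - S                         # probability of risk R = 1 - e^(-H(t))
      for ti, ri in zip(t, R):
          print(f"t = {ti:>4}: risk = {ri:.2f}")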

  15. Distributed energy store railguns experiment and analysis

    SciTech Connect

    Holland, L.D.

    1984-01-01

    Electromagnetic acceleration of projectiles holds the potential for achieving higher velocities than yet achieved by any other means. A railgun is the simplest form of electromagnetic macroparticle accelerator and can generate the highest sustained accelerating force. The practical length of conventional railguns is limited by the impedance of the rails because current must be carried along the entire length of the rails. A railgun and power supply system called the distributed energy store railgun was proposed as a solution to this limitation. The distributed energy store railgun used multiple current sources connected to the rails of a railgun at points distributed along the bore. These current sources (energy stores) are turned on in sequence as the projectile moves down the bore so that current is fed to the railgun from behind the armature. In this system the length of the rails that carry the full armature current is less than the total length of the railgun. If a sufficient number of energy stores is used, this removes the limitation on the length of a railgun. An additional feature of distributed energy store type railguns is that they can be designed to maintain a constant pressure on the projectile being accelerated. A distributed energy store railgun was constructed and successfully operated. In addition to this first demonstration of the distributed energy store railgun principle, a theoretical model of the system was also constructed.

  16. Analysis of Temperature Distributions in Nighttime Inversions

    NASA Astrophysics Data System (ADS)

    Telyak, Oksana; Krasouski, Aliaksandr; Svetashev, Alexander; Turishev, Leonid; Barodka, Siarhei

    2015-04-01

    Adequate prediction of temperature inversion in the atmospheric boundary layer is one of prerequisites for successful forecasting of meteorological parameters and severe weather events. Examples include surface air temperature and precipitation forecasting as well as prediction of fog, frosts and smog with hazardous levels of atmospheric pollution. At the same time, reliable forecasting of temperature inversions remains an unsolved problem. For prediction of nighttime inversions over some specific territory, it is important to study characteristic features of local circulation cells formation and to properly take local factors into account to develop custom modeling techniques for operational use. The present study aims to investigate and analyze vertical temperature distributions in tropospheric inversions (isotherms) over the territory of Belarus. We study several specific cases of formation, evolution and decay of deep nighttime temperature inversions in Belarus by means of mesoscale numerical simulations with WRF model, considering basic mechanisms of isothermal and inverse temperature layers formation in the troposphere and impact of these layers on local circulation cells. Our primary goal is to assess the feasibility of advance prediction of inversions formation with WRF. Modeling results reveal that all cases under consideration have characteristic features of radiative inversions (e.g., their formation times, development phases, inversion intensities, etc). Regions of "blocking" layers formation are extensive and often spread over the entire territory of Belarus. Inversions decay starts from the lowermost (near surface) layer (altitudes of 5 to 50 m). In all cases, one can observe formation of temperature gradients that substantially differ from the basic inversion gradient, i.e. the layer splits into smaller layers, each having a different temperature stratification (isothermal, adiabatic, etc). As opposed to various empirical techniques as well as

  17. Dentin bonding performance using Weibull statistics and evaluation of acid-base resistant zone formation of recently introduced adhesives.

    PubMed

    Guan, Rui; Takagaki, Tomohiro; Matsui, Naoko; Sato, Takaaki; Burrow, Michael F; Palamara, Joseph; Nikaido, Toru; Tagami, Junji

    2016-07-30

    Dentin bonding durability of recently introduced dental adhesives: Clearfil SE Bond 2 (SE2), Optibond XTR (XTR), and Scotchbond Universal (SBU) was investigated using Weibull analysis as well as analysis of the micromorphological features of the acid-base resistant zone (ABRZ) created for the adhesives. The bonding procedures of SBU were divided into three subgroups: self-etch (SBS), phosphoric acid (PA) etching on moist (SBM) or dry dentin (SBD). All groups were thermocycled for 0, 5,000 and 10,000 cycles followed by microtensile bond strength testing. Acid-base challenge was undertaken before SEM and TEM observations of the adhesive interface. The etch-and-rinse method with SBU (SBM and SBD) created inferior interfaces on the dentin surface which resulted in reduced bond durability. ABRZ formation was detected with the self-etch adhesive systems; SE2, XTR and SBS. In the PA etching protocols of SBM and SBD, a thick hybrid layer but no ABRZ was detected, which might affect dentin bond durability. PMID:27335136

  18. Stochastic Analysis of Wind Energy for Wind Pump Irrigation in Coastal Andhra Pradesh, India

    NASA Astrophysics Data System (ADS)

    Raju, M. M.; Kumar, A.; Bisht, D.; Rao, D. B.

    2014-09-01

    The rapid escalation in the prices of oil and gas, as well as increasing demand for energy, has attracted the attention of scientists and researchers to explore the possibility of generating and utilizing alternative and renewable sources of wind energy along the long coastal belt of India, which has considerable wind energy resources. A detailed analysis of wind potential is a prerequisite to harvesting wind energy resources efficiently. Keeping this in view, the present study was undertaken to analyze the wind energy potential and to assess the feasibility of a wind-pump operated irrigation system in the coastal region of Andhra Pradesh, India, where high ground water table conditions are available. The wind speed data were tested stochastically for fit to probability distributions that describe the wind energy potential in the region. The normal and Weibull probability distributions were tested, and on the basis of the chi-square test the Weibull distribution gave better results. Hence, it was concluded that the Weibull probability distribution may be used to stochastically describe the annual wind speed data of coastal Andhra Pradesh with better accuracy. The size of the wind pump as well as the complete irrigation system was determined with mass curve analysis to satisfy various daily irrigation demands at different risk levels.
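
    The chi-square comparison described can be sketched as follows: bin the observed wind speeds and compare observed counts with expected counts under fitted normal and Weibull models (synthetic data; the smaller statistic indicates the better fit).

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(5)
      wind = rng.weibull(2.0, 2000) * 6.0        # m/s, synthetic annual wind speeds

      edges = np.linspace(0.0, wind.max(), 16)
      obs, _ = np.histogram(wind, bins=edges)

      def chi2_stat(dist, params):
          expected = len(wind) * np.diff(dist.cdf(edges, *params))
          mask = expected > 0
          return np.sum((obs[mask] - expected[mask]) ** 2 / expected[mask])

      print("chi2 Weibull:", round(chi2_stat(stats.weibull_min, stats.weibull_min.fit(wind, floc=0)), 1))
      print("chi2 normal :", round(chi2_stat(stats.norm, stats.norm.fit(wind)), 1))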

  19. Fracture mechanics concepts in reliability analysis of monolithic ceramics

    NASA Technical Reports Server (NTRS)

    Manderscheid, Jane M.; Gyekenyesi, John P.

    1987-01-01

    Basic design concepts for high-performance, monolithic ceramic structural components are addressed. The design of brittle ceramics differs from that of ductile metals because of the inability of ceramic materials to redistribute high local stresses caused by inherent flaws. Random flaw size and orientation require that a probabilistic analysis be performed in order to determine component reliability. The current trend in probabilistic analysis is to combine linear elastic fracture mechanics concepts with the two-parameter Weibull distribution function to predict component reliability under multiaxial stress states. Nondestructive evaluation supports this analytical effort by supplying data during verification testing. It can also help to determine statistical parameters which describe the material strength variation, in particular the material threshold strength (the third Weibull parameter), which in the past was often taken as zero for simplicity.
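
    The three-parameter form with a threshold strength can be written down directly; the sketch below assumes illustrative parameter values, with failure probability identically zero below the threshold sigma_u.

      import numpy as np

      def failure_probability(sigma, m, sigma0, sigma_u):
          """Three-parameter Weibull: P_f = 1 - exp(-((sigma - sigma_u)/sigma0)^m) for sigma > sigma_u."""
          z = np.clip(np.asarray(sigma, dtype=float) - sigma_u, 0.0, None) / sigma0
          return 1.0 - np.exp(-z ** m)

      for s in (80.0, 150.0, 300.0):
          print(f"sigma = {s:5.1f} MPa -> P_f = {failure_probability(s, m=8.0, sigma0=200.0, sigma_u=100.0):.3f}")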

  20. Integer sparse distributed memory: analysis and results.

    PubMed

    Snaider, Javier; Franklin, Stan; Strain, Steve; George, E Olusegun

    2013-10-01

    Sparse distributed memory is an auto-associative memory system that stores high dimensional Boolean vectors. Here we present an extension of the original SDM, the Integer SDM that uses modular arithmetic integer vectors rather than binary vectors. This extension preserves many of the desirable properties of the original SDM: auto-associativity, content addressability, distributed storage, and robustness over noisy inputs. In addition, it improves the representation capabilities of the memory and is more robust over normalization. It can also be extended to support forgetting and reliable sequence storage. We performed several simulations that test the noise robustness property and capacity of the memory. Theoretical analyses of the memory's fidelity and capacity are also presented. PMID:23747569

  1. Economic analysis of efficient distribution transformer trends

    SciTech Connect

    Downing, D.J.; McConnell, B.W.; Barnes, P.R.; Hadley, S.W.; Van Dyke, J.W.

    1998-03-01

    This report outlines an approach that will account for uncertainty in the development of evaluation factors used to identify transformer designs with the lowest total owning cost (TOC). The TOC methodology is described and the most highly variable parameters are discussed. The model is developed to account for uncertainties as well as statistical distributions for the important parameters. Sample calculations are presented. The TOC methodology is applied to data provided by two utilities in order to test its validity.
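
    A hedged sketch of the standard total-owning-cost comparison such evaluations build on, TOC = bid price + A x no-load loss + B x load loss, with the A and B evaluation factors ($/W) capitalizing the cost of losses; sampling uncertain A and B yields a TOC distribution rather than a point value. All numbers below are illustrative assumptions.

      import numpy as np

      rng = np.random.default_rng(6)
      A = rng.normal(3.0, 0.5, 10_000)    # $/W of no-load loss (assumed uncertainty)
      B = rng.normal(1.0, 0.2, 10_000)    # $/W of load loss (assumed uncertainty)

      designs = {"low-loss": (12_000, 150.0, 700.0), "standard": (9_000, 220.0, 900.0)}
      for name, (price, no_load_w, load_w) in designs.items():
          toc = price + A * no_load_w + B * load_w
          print(f"{name:9s} TOC = {toc.mean():,.0f} +/- {toc.std():,.0f} $")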

  2. Equity analysis of hospital beds distribution in Shiraz, Iran 2014

    PubMed Central

    Hatam, Nahid; Zakeri, Mohammadreza; Sadeghi, Ahmad; Darzi Ramandi, Sajad; Hayati, Ramin; Siavashi, Elham

    2016-01-01

    Background: One of the important aspects of equity in health is equality in the distribution of resources in this sector. The present study aimed to assess the distribution of hospital beds in Shiraz in 2014. Methods: In this retrospective cross-sectional study, the population density index and the fair distribution of beds were analyzed by the Lorenz curve and Gini coefficient, respectively. Descriptive data were analyzed using Excel software. We used the Distributive Analysis Stata Package (DASP) in STATA software, version 12, for computing the Gini coefficient and drawing the Lorenz curve. Results: The Gini coefficient was 0.68 in the population. In addition, the Gini coefficient of hospital beds' distribution based on population density was 0.70, which represented inequality in the distribution of hospital beds among the nine regions of Shiraz. Conclusion: Although the total number of hospital beds was reasonable in Shiraz, the distribution of these resources was not fair, and inequality was observed in their distribution among the nine regions of Shiraz. PMID:27579284
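
    The Gini computation used in such equity studies reduces to sorting the per-region counts, forming cumulative (Lorenz) shares, and integrating; the region values below are illustrative, not the Shiraz data.

      import numpy as np

      def gini(values):
          v = np.sort(np.asarray(values, dtype=float))
          n = v.size
          cum = np.concatenate(([0.0], np.cumsum(v) / v.sum()))   # Lorenz ordinates at i/n
          area = np.sum((cum[:-1] + cum[1:]) / 2.0) / n           # trapezoidal area under the Lorenz curve
          return 1.0 - 2.0 * area

      beds = [120, 450, 80, 950, 60, 300, 150, 40, 700]   # illustrative nine-region bed counts
      print(f"Gini = {gini(beds):.2f}")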

  3. A Comparison of Distribution Free and Non-Distribution Free Factor Analysis Methods

    ERIC Educational Resources Information Center

    Ritter, Nicola L.

    2012-01-01

    Many researchers recognize that factor analysis can be conducted on both correlation matrices and variance-covariance matrices. Although most researchers extract factors from non-distribution free or parametric methods, researchers can also extract factors from distribution free or non-parametric methods. The nature of the data dictates the method…

  4. Robust two-parameter invariant CFAR detection utilizing order statistics applied to Weibull clutter

    NASA Astrophysics Data System (ADS)

    Nagle, Daniel T.; Saniie, Jafar

    1992-08-01

    Constant False Alarm Rate (CFAR) detectors are designed to perform when the clutter information is partially unknown and/or varying. This is accomplished using local threshold estimates from background observations by which the CFAR level is maintained. However, when local observations contain target or irrelevant information, censoring is warranted to improve detection performance. Order Statistics (OS) processors have been shown to perform robustly (with respect to type II errors, or CFAR loss) for heterogeneous background clutter observations, and their performance has been analyzed for exponential clutter with unknown power. In this paper, several order statistics are used to create an invariant test statistic for Weibull clutter with two varying parameters (i.e., power and skewness). The robustness of a two-parameter invariant CFAR detector is analyzed and compared with an uncensored Weibull-Two Parameter (WTP) CFAR detector and a conventional Cell Averaging (CA) CFAR detector (i.e., designed to be invariant to exponential clutter). The performance trade-offs of these detectors are gauged for different scenarios of volatile clutter environments.
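
    A minimal sketch of an order-statistics CFAR test of the kind discussed: the detection threshold is a scaled k-th order statistic of the reference window around the cell under test. The scale factor alpha would in practice be chosen offline to fix the false-alarm rate; all values here are assumptions.

      import numpy as np

      def os_cfar(x, guard=2, ref=16, k=12, alpha=6.0):
          """Return boolean detections for each cell of the power sequence x."""
          half = guard + ref // 2
          hits = np.zeros(len(x), dtype=bool)
          for i in range(half, len(x) - half):
              window = np.concatenate((x[i - half:i - guard], x[i + guard + 1:i + half + 1]))
              hits[i] = x[i] > alpha * np.sort(window)[k - 1]   # k-th order statistic
          return hits

      rng = np.random.default_rng(7)
      clutter = rng.weibull(1.5, 200) ** 2      # synthetic non-Gaussian clutter power
      clutter[100] += 40.0                      # injected target
      print(np.flatnonzero(os_cfar(clutter)))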

  5. Intensity distribution analysis of cathodoluminescence using the energy loss distribution of electrons.

    PubMed

    Fukuta, Masahiro; Inami, Wataru; Ono, Atsushi; Kawata, Yoshimasa

    2016-01-01

    We present an intensity distribution analysis of cathodoluminescence (CL) excited with a focused electron beam in a luminescent thin film. The energy loss distribution is applied in the developed analysis method in order to determine the arrangement of the dipole locations along the path of the electron traveling in the film. Propagating light emitted from each dipole is analyzed with the finite-difference time-domain (FDTD) method. The CL distribution near the film surface is evaluated as a nanometric light source. It is found that a light source about 30 nm wide is generated in the film by the focused electron beam. We also discuss the accuracy of the developed analysis method by comparison with experimental results. The analysis results are brought into good agreement with the experimental results by introducing the energy loss distribution. PMID:26550930

  6. Statistical wind analysis for near-space applications

    NASA Astrophysics Data System (ADS)

    Roney, Jason A.

    2007-09-01

    Statistical wind models were developed based on the existing observational wind data for near-space altitudes between 60,000 and 100,000 ft (18-30 km) above ground level (AGL) at two locations, Akron, OH, USA, and White Sands, NM, USA. These two sites are envisioned as playing a crucial role in the first flights of high-altitude airships. The analysis shown in this paper has not previously been applied to this region of the stratosphere for such an application. Standard statistics were compiled for these data, such as the mean, median, maximum wind speed, and standard deviation, and the data were modeled with Weibull distributions. These statistics indicated that, on a yearly average, there is a lull or "knee" in the wind between 65,000 and 72,000 ft AGL (20-22 km). From the standard statistics, trends at both locations indicated substantial seasonal variation in the mean wind speed at these heights. The yearly and monthly statistical modeling indicated that Weibull distributions were a reasonable model for the data. Forecasts and hindcasts were done using a Weibull model based on 2004 data and comparing the model with the 2003 and 2005 data; the 2004 distribution was also a reasonable model for these years. Lastly, the Weibull distribution and its cumulative function were used to predict the 50%, 95%, and 99% winds, which are directly related to the expected power requirements of a near-space station-keeping airship. These values indicated that using only the standard deviation of the mean may underestimate the operational conditions.
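
    The percentile winds mentioned follow directly from the Weibull quantile function: for shape k and scale c, v_p = c * (-ln(1 - p))^(1/k). The shape and scale below are assumed values, not the fitted site parameters.

      import numpy as np

      k, c = 1.8, 12.0   # assumed Weibull shape and scale (m/s) at one altitude
      for p in (0.50, 0.95, 0.99):
          v = c * (-np.log(1.0 - p)) ** (1.0 / k)
          print(f"{p:.0%} wind: {v:.1f} m/s")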

  7. Progressive failure site generation in AlGaN/GaN high electron mobility transistors under OFF-state stress: Weibull statistics and temperature dependence

    SciTech Connect

    Sun, Huarui; Bajo, Miguel Montes; Uren, Michael J.; Kuball, Martin

    2015-01-26

    Gate leakage degradation of AlGaN/GaN high electron mobility transistors under OFF-state stress is investigated using a combination of electrical, optical, and surface morphology characterizations. The generation of leakage “hot spots” at the edge of the gate is found to be strongly temperature accelerated. The time for the formation of each failure site follows a Weibull distribution with a shape parameter in the range of 0.7–0.9 from room temperature up to 120 °C. The average leakage per failure site is only weakly temperature dependent. The stress-induced structural degradation at the leakage sites exhibits a temperature dependence in the surface morphology, which is consistent with a surface defect generation process involving temperature-associated changes in the breakdown sites.
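
    A Weibull shape parameter below one, as reported here, corresponds to a failure rate that decreases with time; a sketch of recovering the shape from times-to-failure is given below on synthetic data, not the paper's measurements.

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(8)
      t_fail = rng.weibull(0.8, 200) * 1e4     # synthetic times to failure-site formation (s)

      shape, _, scale = stats.weibull_min.fit(t_fail, floc=0)
      trend = "decreasing" if shape < 1 else "increasing"
      print(f"fitted shape = {shape:.2f} ({trend} hazard)")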

  8. Distributed bearing fault diagnosis based on vibration analysis

    NASA Astrophysics Data System (ADS)

    Dolenc, Boštjan; Boškoski, Pavle; Juričić, Đani

    2016-01-01

    Distributed bearing faults appear under various circumstances, for example due to electroerosion or the progression of localized faults. Bearings with distributed faults tend to generate more complex vibration patterns than those with localized faults. Despite the frequent occurrence of such faults, their diagnosis has attracted limited attention. This paper examines a method for the diagnosis of distributed bearing faults employing vibration analysis. The vibration patterns generated are modeled by incorporating the geometrical imperfections of the bearing components. Comparing envelope spectra of vibration signals shows that one can distinguish between localized and distributed faults. Furthermore, a diagnostic procedure for the detection of distributed faults is proposed. This is evaluated on several bearings with naturally occurring distributed faults, which are compared with fault-free bearings and bearings with localized faults. It is shown experimentally that features extracted from vibrations under fault-free, localized and distributed fault conditions form clearly separable clusters, thus enabling diagnosis.
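
    The envelope-spectrum step central to this kind of diagnosis can be sketched as: vibration signal -> Hilbert envelope -> FFT of the envelope, whose dominant line sits at the fault characteristic frequency. The signal below is synthetic, with an assumed 107 Hz impact rate modulating a 3 kHz resonance.

      import numpy as np
      from scipy.signal import hilbert

      fs = 20_000                                    # sample rate (Hz), assumed
      t = np.arange(0, 1.0, 1.0 / fs)
      carrier = np.sin(2 * np.pi * 3000 * t)         # resonance excited by impacts
      impacts = (1.0 + np.sign(np.sin(2 * np.pi * 107 * t))) / 2   # ~107 Hz fault rate
      signal = impacts * carrier + 0.1 * np.random.default_rng(9).standard_normal(t.size)

      envelope = np.abs(hilbert(signal))
      spectrum = np.abs(np.fft.rfft(envelope - envelope.mean()))
      freqs = np.fft.rfftfreq(envelope.size, 1.0 / fs)
      print(f"dominant envelope line at {freqs[spectrum.argmax()]:.0f} Hz")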

  9. Charge distribution analysis of catalysts under simulated reaction conditions

    SciTech Connect

    Freund, F.

    1992-01-01

    Charge Distribution Analysis (CDA) is a technique for measuring mobile charge carriers in dielectric materials. CDA is based on dielectric polarization in an electric field gradient. The CDA apparatus is now under construction.

  10. Precipitator inlet particulate distribution flow analysis

    SciTech Connect

    LaRose, J.A.; Averill, A.

    1994-12-31

    The B&W Rothemuhle precipitators located at PacifiCorp's Wyodak Generating Station in Gillette, Wyoming have, for the past two years, been experiencing discharge wire breakage. The breakage is due to corrosion of the wires; however, the exact cause of the corrosion is unknown. One aspect thought to contribute to the problem is an imbalance of ash loading among the four precipitators. Plant operation has revealed that the ash loading to precipitator C appears to be the heaviest of the four casings, and precipitator C also appears to have the most severe corrosion. Data from field measurements showed that the gas flows to the four precipitators are fairly uniform, within ±9% of the average. The ash loading data showed a large maldistribution among the precipitators: precipitator C receives 60% more ash than the next most heavily loaded precipitator. A numerical model was created which showed the same results. The model was then utilized to determine design modifications to the existing flue and turning vanes to improve the ash loading distribution. The resulting design was predicted to bring the ash loading of all the precipitators to within ±10% of the average.

  11. Performance analysis of static locking in replicated distributed database systems

    NASA Technical Reports Server (NTRS)

    Kuang, Yinghong; Mukkamala, Ravi

    1991-01-01

    Data replication and transaction deadlocks can severely affect the performance of distributed database systems. Many current evaluation techniques ignore these aspects because they are difficult to evaluate through analysis and time consuming to evaluate through simulation. Here, a technique is discussed that combines simulation and analysis to closely illustrate the impact of deadlock and to evaluate the performance of replicated distributed databases with both shared and exclusive locks.

  12. Performance analysis of static locking in replicated distributed database systems

    NASA Technical Reports Server (NTRS)

    Kuang, Yinghong; Mukkamala, Ravi

    1991-01-01

    Data replication and transaction deadlocks can severely affect the performance of distributed database systems. Many current evaluation techniques ignore these aspects because they are difficult to evaluate through analysis and time consuming to evaluate through simulation. A technique is used that combines simulation and analysis to closely illustrate the impact of deadlock and to evaluate the performance of replicated distributed databases with both shared and exclusive locks.

  13. Surface flaw reliability analysis of ceramic components with the SCARE finite element postprocessor program

    NASA Technical Reports Server (NTRS)

    Gyekenyesi, John P.; Nemeth, Noel N.

    1987-01-01

    The SCARE (Structural Ceramics Analysis and Reliability Evaluation) computer program on statistical fast fracture reliability analysis with quadratic elements for volume distributed imperfections is enhanced to include the use of linear finite elements and the capability of designing against concurrent surface flaw induced ceramic component failure. The SCARE code is presently coupled as a postprocessor to the MSC/NASTRAN general purpose, finite element analysis program. The improved version now includes the Weibull and Batdorf statistical failure theories for both surface and volume flaw based reliability analysis. The program uses the two-parameter Weibull fracture strength cumulative failure probability distribution model with the principle of independent action for poly-axial stress states, and Batdorf's shear-sensitive as well as shear-insensitive statistical theories. The shear-sensitive surface crack configurations include the Griffith crack and Griffith notch geometries, using the total critical coplanar strain energy release rate criterion to predict mixed-mode fracture. Weibull material parameters based on both surface and volume flaw induced fracture can also be calculated from modulus of rupture bar tests, using the least squares method with known specimen geometry and grouped fracture data. The statistical fast fracture theories for surface flaw induced failure, along with selected input and output formats and options, are summarized. An example problem to demonstrate various features of the program is included.
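
    For context, a sketch of the two-parameter Weibull weakest-link strength model such codes build on (standard textbook form; the notation is generic and not taken from the SCARE documentation). The probability of fast-fracture failure of a component with stress field σ(x) over volume V is written

      P_f \;=\; 1 - \exp\!\left[-\int_V \left(\frac{\sigma(\mathbf{x})}{\sigma_0}\right)^{m}\mathrm{d}V\right],

    where m is the Weibull modulus and σ_0 the scale parameter; under the principle of independent action the integrand is summed over the tensile principal stresses.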

  14. Time-dependent reliability analysis of ceramic engine components

    NASA Technical Reports Server (NTRS)

    Nemeth, Noel N.

    1993-01-01

    The computer program CARES/LIFE calculates the time-dependent reliability of monolithic ceramic components subjected to thermomechanical and/or proof test loading. This program is an extension of the CARES (Ceramics Analysis and Reliability Evaluation of Structures) computer program. CARES/LIFE accounts for the phenomenon of subcritical crack growth (SCG) by utilizing either the power or Paris law relations. The two-parameter Weibull cumulative distribution function is used to characterize the variation in component strength. The effects of multiaxial stresses are modeled using either the principle of independent action (PIA), the Weibull normal stress averaging method (NSA), or the Batdorf theory. Inert strength and fatigue parameters are estimated from rupture strength data of naturally flawed specimens loaded in static, dynamic, or cyclic fatigue. Two example problems demonstrating proof testing and fatigue parameter estimation are given.
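
    A sketch of the subcritical crack growth power law referred to above, in its usual form (A and N are empirical material/environment parameters, not values from the program):

      \frac{\mathrm{d}a}{\mathrm{d}t} \;=\; A\,K_{\mathrm{I}}^{\,N},

    where K_I is the mode-I stress intensity factor; the Paris law is the analogous power law for cyclic loading.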

  15. A Distributed, Parallel Visualization and Analysis Tool

    Energy Science and Technology Software Center (ESTSC)

    2007-12-01

    VisIt is an interactive parallel visualization and graphical analysis tool for viewing scientific data on UNIX and PC platforms. Users can quickly generate visualizations from their data, animate them through time, manipulate them, and save the resulting images for presentations. VisIt contains a rich set of visualization features so that you can view your data in a variety of ways. It can be used to visualize scalar and vector fields defined on two- and three-dimensional (2D and 3D) structured and unstructured meshes. VisIt was designed to handle very large data set sizes in the terascale range and yet can also handle small data sets in the kilobyte range.

  16. Performance analysis of static locking in distributed database systems

    SciTech Connect

    Shyu, S.C.; Li, V.O.K. (Dept. of Electrical Engineering)

    1990-06-01

    Numerous performance models have been proposed for locking algorithms in centralized database systems, but few have been developed for distributed ones. Existing results on distributed locking usually ignore the deadlock problem so as to simplify the analysis. In this paper, a new performance model for static locking in distributed database systems is developed. A queuing model is used to approximate static locking in distributed database systems without deadlocks. Then a random graph model is proposed to find the deadlock probability of each transaction. The above two models are integrated, so that given the transaction arrival rate, the response time and the effective throughput can be calculated.

  17. Analysis of the irregular planar distribution of proteins in membranes.

    PubMed

    Hui, S W; Frank, J

    1985-03-01

    Methods to characterize the irregular but non-random planar distribution of proteins in biological membranes were investigated. The distribution of the proteins constituting the intramembranous particles (IMPs) in human erythrocyte membranes was used as an example. The distribution of IMPs was deliberately altered by experimental means. For real-space analyses, the IMP positions in freeze-fracture micrographs were determined by the automatic procedure described. Radial distribution and autocorrelation analyses revealed quantitative differences between experimental groups. These methods are more sensitive than the corresponding optical diffraction or Fourier-Bessel analyses of the same IMP distribution data, due to the inability of the diffraction methods to separate contrast and distribution effects. A method to identify IMPs on a non-uniform background is described. PMID:3999133
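
    A rough sketch of a radial distribution (pair correlation) calculation of the kind described, assuming 2-D particle coordinates and ignoring edge corrections:

      import numpy as np
      from scipy.spatial.distance import pdist

      rng = np.random.default_rng(1)
      pts = rng.uniform(0, 1000.0, size=(500, 2))   # hypothetical IMP positions, nm

      d = pdist(pts)                        # all pairwise distances
      counts, edges = np.histogram(d, bins=np.arange(0, 200, 5.0))

      # Normalize each annulus by its area, the mean density, and the pair count;
      # g(r) ~ 1 for a random pattern and > 1 where particles cluster.
      annulus = np.pi * (edges[1:] ** 2 - edges[:-1] ** 2)
      density = len(pts) / 1000.0 ** 2
      g = counts / (annulus * density * len(pts) / 2.0)
      print(g[:5])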

  18. Distributed transit compartments for arbitrary lifespan distributions in aging populations.

    PubMed

    Koch, Gilbert; Schropp, Johannes

    2015-09-01

    Transit compartment models (TCMs) are often used to describe aging populations in which every individual has its own lifespan. However, in the TCM approach these lifespans are gamma-distributed, which is a serious limitation because the Weibull or more complex distributions are often more realistic. Therefore, we extend the TCM concept to approximately describe any lifespan distribution and call this generalized concept distributed transit compartment models (DTCMs). The validity of DTCMs is obtained by convergence investigations. From the mechanistic perspective, the transit rates are directly controlled by the lifespan distribution. Further, DTCMs can be used to approximate the convolution of a signal with a probability density function. As an example, a stimulatory effect of a drug in an aging population with a Weibull-distributed lifespan is presented, where distribution and model parameters are estimated based on simulated data. PMID:26100181
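
    A minimal sketch of the classical transit compartment chain that the abstract starts from (exit times are Erlang/gamma distributed); the rate and compartment number are invented. DTCMs generalize this by letting the transit rates track an arbitrary lifespan distribution:

      import numpy as np
      from scipy.integrate import solve_ivp

      n, k = 5, 0.5   # hypothetical compartment count and transit rate (1/day)

      def tcm(t, a):
          # da1/dt = -k*a1; dai/dt = k*(a[i-1] - a[i]) for i = 2..n
          da = np.empty_like(a)
          da[0] = -k * a[0]
          da[1:] = k * (a[:-1] - a[1:])
          return da

      a0 = np.zeros(n); a0[0] = 1.0   # unit population enters the first compartment
      sol = solve_ivp(tcm, (0.0, 40.0), a0)
      print(sol.y[:, -1].sum())       # fraction of the population still in transit at t = 40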

  19. Modeling and analysis of solar distributed generation

    NASA Astrophysics Data System (ADS)

    Ortiz Rivera, Eduardo Ivan

    power point tracking algorithms. Finally, several PV power applications will be presented, such as circuit analysis for a load connected to two different PV arrays, speed control for a dc motor connected to a PVM, and a novel single-phase photovoltaic inverter system using the Z-source converter.

  20. Near field light intensity distribution analysis in bimodal polymer waveguide

    NASA Astrophysics Data System (ADS)

    Herzog, T.; Gut, K.

    2015-12-01

    The paper presents an analysis of the light intensity distribution and sensitivity in a differential interferometer based on a bimodal polymer waveguide. A key part is the analysis of the optimal waveguide layer thickness in the SiO2/SU-8/H2O structure for maximum bulk refractive index sensitivity. The paper presents a new approach to detecting the phase difference between modes through registration of only part of the energy propagating in the waveguide. Additionally, an analysis of the changes in the light distribution when the energy in the modes is not equal is performed.

  1. Effect of Porosity on Strength Distribution of Microcrystalline Cellulose.

    PubMed

    Keleṣ, Özgür; Barcenas, Nicholas P; Sprys, Daniel H; Bowman, Keith J

    2015-12-01

    Fracture strength of pharmaceutical compacts varies even for nominally identical samples, which directly affects compaction, comminution, and tablet dosage forms. However, the relationships between porosity and mechanical behavior of compacts are not clear. Here, the effects of porosity on fracture strength and fracture statistics of microcrystalline cellulose compacts were investigated through diametral compression tests. Weibull modulus, a key parameter in Weibull statistics, was observed to decrease with increasing porosity from 17 to 56 vol.%, based on eight sets of compacts at different porosity levels, each set containing ∼ 50 samples, a total of 407 tests. Normal distribution fits better to fracture data for porosity less than 20 vol.%, whereas Weibull distribution is a better fit in the limit of highest porosity. Weibull moduli from 840 unique finite element simulations of isotropic porous materials were compared to experimental Weibull moduli from this research and results on various pharmaceutical materials. Deviations from Weibull statistics are observed. The effect of porosity on fracture strength can be described by a recently proposed micromechanics-based formula. PMID:26022545
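
    A small sketch of the standard median-rank regression used to extract a Weibull modulus from a set of fracture strengths; the strengths below are synthetic stand-ins for diametral compression data:

      import numpy as np

      strengths = np.sort(np.array([1.8, 2.1, 2.3, 2.6, 2.8, 3.0, 3.3, 3.7]))  # MPa, hypothetical
      n = len(strengths)

      # Median-rank estimate of the failure probability of the i-th weakest sample.
      F = (np.arange(1, n + 1) - 0.3) / (n + 0.4)

      # Weibull plot: ln(-ln(1 - F)) vs ln(sigma) is linear with slope m (Weibull modulus).
      x = np.log(strengths)
      y = np.log(-np.log(1.0 - F))
      m, c = np.polyfit(x, y, 1)
      sigma0 = np.exp(-c / m)   # characteristic strength
      print(m, sigma0)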

  2. ANALYSIS OF DISTRIBUTION FEEDER LOSSES DUE TO ADDITION OF DISTRIBUTED PHOTOVOLTAIC GENERATORS

    SciTech Connect

    Tuffner, Francis K.; Singh, Ruchi

    2011-08-09

    Distributed generators (DGs) are small-scale power supplying sources owned by customers or utilities and scattered throughout the power system distribution network. Distributed generation can be both renewable and non-renewable. Distributed generation is added primarily to increase feeder capacity and to provide peak load reduction. However, this addition comes with several impacts on the distribution feeder. Several studies have shown that the addition of DG leads to a reduction of feeder loss. However, most of these studies have considered lumped load and distributed load models to analyze the effects on system losses, where the dynamic variation of load due to seasonal changes is ignored. It is very important for utilities to minimize losses under all scenarios to decrease revenue losses, promote efficient asset utilization, and thereby increase feeder capacity. This paper investigates an IEEE 13-node feeder populated with photovoltaic generators on detailed residential houses with water heaters, heating, ventilation, and air conditioning (HVAC) units, lights, and other plug and convenience loads. An analysis of losses for different power system components, such as transformers, underground and overhead lines, and triplex lines, will be performed. The analysis will utilize different seasons and different solar penetration levels (15%, 30%).

  3. Selection of neutrino burst candidates by pulse spatial distribution analysis

    NASA Astrophysics Data System (ADS)

    Ryasny, V. G.

    1996-02-01

    The method of analysis and the possibilities of identifying neutrino bursts from collapsing stars using the spatial distribution of pulses in multimodular installations, such as the Large Volume Detector at the Gran Sasso Laboratory, the Liquid Scintillation Detector (Mont Blanc), and the Baksan Scintillation Telescope, are discussed. The method is applicable to any position-sensitive detector. Spatial distribution analysis can decrease the burst imitation probability by at least 2 orders of magnitude, without significant loss of sensitivity, for the currently predicted number of neutrino interactions.

  4. First Experiences with LHC Grid Computing and Distributed Analysis

    SciTech Connect

    Fisk, Ian

    2010-12-01

    This presentation covered the experiences of the LHC experiments using grid computing, with a focus on distributed analysis. After many years of development, preparation, exercises, and validation, the LHC (Large Hadron Collider) experiments are in operation. The computing infrastructure has been heavily utilized in the first 6 months of data collection. The general experience of exploiting the grid infrastructure for organized processing and preparation is described, as well as the successes in employing the infrastructure for distributed analysis. At the end, the expected evolution and future plans are outlined.

  5. Entropy Methods For Univariate Distributions in Decision Analysis

    NASA Astrophysics Data System (ADS)

    Abbas, Ali E.

    2003-03-01

    One of the most important steps in decision analysis practice is the elicitation of the decision-maker's belief about an uncertainty of interest in the form of a representative probability distribution. However, the probability elicitation process is a task that involves many cognitive and motivational biases. Alternatively, the decision-maker may provide other information about the distribution of interest, such as its moments, and the maximum entropy method can be used to obtain a full distribution subject to the given moment constraints. In practice, however, decision makers cannot readily provide moments for the distribution, and are much more comfortable providing information about the fractiles of the distribution of interest or bounds on its cumulative probabilities. In this paper we present a graphical method to determine the maximum entropy distribution between upper and lower probability bounds and provide an interpretation for the shape of the maximum entropy distribution subject to fractile constraints (FMED). We also discuss the problems with the FMED, namely that it is discontinuous and flat over each fractile interval. We present a heuristic approximation to a distribution if, in addition to its fractiles, we also know it is continuous, and we work through full examples to illustrate the approach.
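
    A toy sketch of the fractile-constrained maximum entropy idea: between consecutive elicited fractiles the maximum entropy density is flat, which is exactly the discontinuity the abstract discusses. The fractile values are invented for illustration:

      import numpy as np

      # Elicited fractiles: P(X <= x[i]) = p[i], hypothetical values.
      x = np.array([0.0, 2.0, 5.0, 12.0])
      p = np.array([0.0, 0.25, 0.75, 1.0])

      # The maximum entropy density given only these constraints is piecewise
      # uniform: on (x[i], x[i+1]) it equals (p[i+1] - p[i]) / (x[i+1] - x[i]).
      dens = np.diff(p) / np.diff(x)

      def pdf(v):
          if not (x[0] <= v <= x[-1]):
              return 0.0
          i = np.clip(np.searchsorted(x, v, side="right") - 1, 0, len(dens) - 1)
          return dens[i]

      print([pdf(v) for v in (1.0, 3.0, 10.0)])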

  6. Discriminating topology in galaxy distributions using network analysis

    NASA Astrophysics Data System (ADS)

    Hong, Sungryong; Coutinho, Bruno C.; Dey, Arjun; Barabási, Albert-L.; Vogelsberger, Mark; Hernquist, Lars; Gebhardt, Karl

    2016-07-01

    The large-scale distribution of galaxies is generally analysed using the two-point correlation function. However, this statistic does not capture the topology of the distribution, and it is necessary to resort to higher-order correlations to break degeneracies. We demonstrate that an alternate approach using network analysis can discriminate between topologically different distributions that have similar two-point correlations. We investigate two galaxy point distributions, one produced by a cosmological simulation and the other by a Lévy walk. For the cosmological simulation, we adopt the redshift z = 0.58 slice from Illustris and select galaxies with stellar masses greater than 10^8 M⊙. The two-point correlation function of these simulated galaxies follows a single power law, ξ(r) ∼ r^(-1.5). Then, we generate Lévy walks matching the correlation function and abundance of the simulated galaxies. We find that, while the two simulated galaxy point distributions have the same abundance and two-point correlation function, their spatial distributions are very different; most prominently, the filamentary structures present in the simulation are absent in the Lévy fractals. To quantify these missing topologies, we adopt network analysis tools and measure diameter, giant component, and transitivity from networks built by a conventional friends-of-friends recipe with various linking lengths. Unlike the abundance and two-point correlation function, these network quantities reveal a clear separation between the two simulated distributions; therefore, the galaxy distribution simulated by Illustris is quantitatively not a Lévy fractal. We find that the described network quantities offer an efficient tool for discriminating topologies and for comparing observed and theoretical distributions.
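
    A compact sketch of the friends-of-friends network construction and two of the quantities mentioned (giant component, transitivity), using networkx and random points in place of the galaxy catalogues:

      import numpy as np
      import networkx as nx
      from scipy.spatial import cKDTree

      rng = np.random.default_rng(2)
      pts = rng.uniform(0, 100.0, size=(300, 3))   # hypothetical galaxy positions, Mpc
      link = 5.0                                   # linking length, Mpc

      G = nx.Graph()
      G.add_nodes_from(range(len(pts)))
      G.add_edges_from(cKDTree(pts).query_pairs(r=link))   # friends-of-friends edges

      giant = max(nx.connected_components(G), key=len)
      print(len(giant) / len(pts), nx.transitivity(G))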

  7. On the Correct Analysis of the Maxwell Distribution

    NASA Astrophysics Data System (ADS)

    Kalanov, Temur Z.

    2006-04-01

    The critical analysis of the Maxwell distribution is proposed. The main results of the analysis are as follows. (1) As is known, an experimental device for studying the Maxwell distribution consists of the following basic physical subsystems: (a) an ideal molecular gas enclosed in a vessel (the gas is in the equilibrium state); (b) a molecular beam emitted from the small aperture of the vessel (the small aperture is a stochastic source of quantum particles). (2) The energy of a molecule of the beam does not represent a random quantity, since the molecules do not collide with each other. In this case, only the set of the monoenergetic molecules emitted by the stochastic source is a random quantity. This set is called a quantum gas. The probability p_k that the quantum gas has the energy E_n k is given by the Gibbs quantum canonical distribution: p_k = p_0 exp(-E_n k / T), k = 0, 1, …, where k is the number of molecules with energy E_n and T is the temperature of the molecules in the vessel. (3) The average number of molecules with energy E_n represents the Planck distribution function: f = Σ_{k=0}^∞ k p_k ≡ f^(Planck). (4) In the classical case, the expression E_n f^(Planck) represents the Maxwell distribution function: f^(Maxwell) ∼ E_n f^(Planck) ∼ v² exp(-mv²/2T). Consequently, the generally accepted statement that the Maxwell distribution function describes gas enclosed in a vessel is a logical error.

  8. Comparison of different probability distributions for earthquake hazard assessment in the North Anatolian Fault Zone

    NASA Astrophysics Data System (ADS)

    Yilmaz, Şeyda; Bayrak, Erdem; Bayrak, Yusuf

    2016-04-01

    In this study we examined and compared three different probability distributions for determining the most suitable model for the probabilistic assessment of earthquake hazards. We analyzed a reliable homogeneous earthquake catalogue for the period 1900-2015 and magnitude M ≥ 6.0 and estimated the probabilistic seismic hazard in the North Anatolian Fault zone (39°-41° N, 30°-40° E) using three distributions, namely the Weibull distribution, the Frechet distribution, and the three-parameter Weibull distribution. The suitability of the distribution parameters was evaluated with the Kolmogorov-Smirnov (K-S) goodness-of-fit test. We also compared the estimated cumulative probability and the conditional probabilities of earthquake occurrence for different elapsed times using these three distributions. We used Easyfit and Matlab software to calculate the distribution parameters and plotted the conditional probability curves. We concluded that the Weibull distribution was more suitable than the other distributions in this region.
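
    A brief sketch of this kind of distribution comparison with SciPy (weibull_min for the Weibull fits, invweibull as the Frechet form); the inter-event times below are placeholders, and note that K-S p-values are optimistic when parameters are fitted from the same data:

      import numpy as np
      from scipy import stats

      waits = np.array([3.1, 5.4, 8.2, 11.0, 14.7, 21.3, 29.8])  # hypothetical inter-event times, yr

      candidates = [
          ("Weibull (2-p)", stats.weibull_min, {"floc": 0}),
          ("Frechet", stats.invweibull, {"floc": 0}),
          ("Weibull (3-p)", stats.weibull_min, {}),
      ]
      for name, dist, kw in candidates:
          params = dist.fit(waits, **kw)
          d, p = stats.kstest(waits, dist.cdf, args=params)
          print(f"{name}: D={d:.3f}, p={p:.2f}")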

  9. Inverse Analysis of Distributed Load Using Strain Data

    NASA Astrophysics Data System (ADS)

    Nakamura, Toshiya; Igawa, Hirotaka

    Operational stress data are quite useful in managing the structural integrity and airworthiness of an aircraft. Since the aerodynamic load (pressure) is distributed continuously on the structure's surface, identifying the load from a finite number of measured strain data is not easy. Although this is an inverse problem, an empirical correlation between load and strain, obtained through expensive ground tests, is usually used instead. Some analytical studies have been conducted, but simple mathematical expressions were assumed to approximate the pressure distribution. In the present study a more flexible approximation of the continuous load distribution is proposed. The pressure distribution is identified from a finite number of strain data using the conventional finite element method and a pseudo-inverse matrix. An extension is also made by coupling an aerodynamic restriction with the elastic equation. Numerical examples show that this extension improves the precision of the inverse analysis when the number of strain data is very small.
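
    In outline (a hedged sketch with invented matrices, not the paper's actual model): if a linear model eps = S p maps discretized surface pressures p to measured strains eps, the load can be recovered with a pseudo-inverse:

      import numpy as np

      rng = np.random.default_rng(3)
      S = rng.normal(size=(20, 8))      # hypothetical strain sensitivity matrix (20 gauges, 8 pressure DOFs)
      p_true = rng.normal(size=8)       # hypothetical pressure coefficients
      eps = S @ p_true + 1e-3 * rng.normal(size=20)   # simulated noisy strain measurements

      p_est = np.linalg.pinv(S) @ eps   # least-squares estimate of the load
      print(np.max(np.abs(p_est - p_true)))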

  10. Rapid Analysis of Mass Distribution of Radiation Shielding

    NASA Technical Reports Server (NTRS)

    Zapp, Edward

    2007-01-01

    Radiation Shielding Evaluation Toolset (RADSET) is a computer program that rapidly calculates the spatial distribution of mass of an arbitrary structure for use in ray-tracing analysis of the radiation-shielding properties of the structure. RADSET was written to be used in conjunction with unmodified commercial computer-aided design (CAD) software that provides access to data on the structure and generates selected three-dimensional-appearing views of the structure. RADSET obtains raw geometric, material, and mass data on the structure from the CAD software. From these data, RADSET calculates the distribution(s) of the masses of specific materials about any user-specified point(s). The results of these mass-distribution calculations are imported back into the CAD computing environment, wherein the radiation-shielding calculations are performed.

  11. Global NLO Analysis of Nuclear Parton Distribution Functions

    SciTech Connect

    Hirai, M.; Kumano, S.; Nagai, T.-H.

    2008-02-21

    Nuclear parton distribution functions (NPDFs) are determined by a global analysis of experimental measurements of structure-function ratios F2^A/F2^A' and Drell-Yan cross-section ratios σ_DY^A/σ_DY^A', and their uncertainties are estimated by the Hessian method. The NPDFs are obtained in both leading order (LO) and next-to-leading order (NLO) of α_s. As a result, valence-quark distributions are relatively well determined, whereas antiquark distributions at x > 0.2 and gluon distributions in the whole x region have large uncertainties. The NLO uncertainties are slightly smaller than the LO ones; however, such an NLO improvement is not as significant as in the nucleonic case.

  12. Energy loss analysis of an integrated space power distribution system

    NASA Technical Reports Server (NTRS)

    Kankam, M. D.; Ribeiro, P. F.

    1992-01-01

    The results of studies related to conceptual topologies of an integrated utility-like space power system are described. The system topologies are comparatively analyzed by considering their transmission energy losses as functions of, mainly, distribution voltage level and load composition. The analysis is expedited by use of the Distribution System Analysis and Simulation (DSAS) software. This computer program, recently developed by the Electric Power Research Institute (EPRI), uses improved load models to solve the power flow within the system. However, present shortcomings of the software with regard to space applications, and the incompletely defined characteristics of a space power system, make the results applicable only to the fundamental trends of energy losses of the topologies studied. The accounting included for the effects of the various parameters on system performance can constitute part of a planning tool for a space power distribution system.

  13. GIS-based poverty and population distribution analysis in China

    NASA Astrophysics Data System (ADS)

    Cui, Jing; Wang, Yingjie; Yan, Hong

    2009-07-01

    Geographically, poverty status is not only related to social-economic factors but is also strongly affected by the geographical environment. In this paper, a GIS-based poverty and population distribution analysis method is introduced for revealing their regional differences. More than 100000 poor villages and 592 national key poor counties are chosen for the analysis. The results show that poverty distribution tends to concentrate in most of west China and the mountainous rural areas of mid China. Furthermore, the fifth census data are overlaid on those poor areas in order to capture their internal diversity of social-economic characteristics. By overlaying poverty-related social-economic parameters, such as sex ratio, illiteracy, education level, percentage of ethnic minorities, and family composition, the findings show that poverty distribution is strongly correlated with a high illiteracy rate, a high percentage of ethnic minorities, and larger families.

  14. Molecular Isotopic Distribution Analysis (MIDAs) with adjustable mass accuracy.

    PubMed

    Alves, Gelio; Ogurtsov, Aleksey Y; Yu, Yi-Kuo

    2014-01-01

    In this paper, we present Molecular Isotopic Distribution Analysis (MIDAs), a new software tool designed to compute molecular isotopic distributions with adjustable accuracies. MIDAs offers two algorithms, one polynomial-based and one Fourier-transform-based, both of which compute molecular isotopic distributions accurately and efficiently. The polynomial-based algorithm contains few novel aspects, whereas the Fourier-transform-based algorithm consists mainly of improvements to other existing Fourier-transform-based algorithms. We have benchmarked the performance of the two algorithms implemented in MIDAs with that of eight software packages (BRAIN, Emass, Mercury, Mercury5, NeutronCluster, Qmass, JFC, IC) using a consensus set of benchmark molecules. Under the proposed evaluation criteria, MIDAs's algorithms, JFC, and Emass compute with comparable accuracy the coarse-grained (low-resolution) isotopic distributions and are more accurate than the other software packages. For fine-grained isotopic distributions, we compared IC, MIDAs's polynomial algorithm, and MIDAs's Fourier transform algorithm. Among the three, IC and MIDAs's polynomial algorithm compute isotopic distributions that better resemble their corresponding exact fine-grained (high-resolution) isotopic distributions. MIDAs can be accessed freely through a user-friendly web-interface at http://www.ncbi.nlm.nih.gov/CBBresearch/Yu/midas/index.html. PMID:24254576
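
    A compact sketch of the polynomial (convolution) approach to coarse-grained isotopic distributions, using natural abundances for carbon and hydrogen only and the hypothetical formula C2H6; a full tool such as MIDAs handles all elements and fine structure:

      import numpy as np

      # Nominal-mass isotope patterns, abundance indexed by extra neutrons.
      C = np.array([0.9893, 0.0107])        # 12C, 13C
      H = np.array([0.999885, 0.000115])    # 1H, 2H

      def power_conv(pattern, n):
          out = np.array([1.0])
          for _ in range(n):
              out = np.convolve(out, pattern)   # polynomial multiplication
          return out

      # Isotopic distribution of C2H6 as a product of per-atom polynomials.
      dist = np.convolve(power_conv(C, 2), power_conv(H, 6))
      print(dist[:4])   # relative abundances of M, M+1, M+2, M+3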

  15. Analyzing Distributed Functions in an Integrated Hazard Analysis

    NASA Technical Reports Server (NTRS)

    Morris, A. Terry; Massie, Michael J.

    2010-01-01

    Large scale integration of today's aerospace systems is achievable through the use of distributed systems. Validating the safety of distributed systems is significantly more difficult as compared to centralized systems because of the complexity of the interactions between simultaneously active components. Integrated hazard analysis (IHA), a process used to identify unacceptable risks and to provide a means of controlling them, can be applied to either centralized or distributed systems. IHA, though, must be tailored to fit the particular system being analyzed. Distributed systems, for instance, must be analyzed for hazards in terms of the functions that rely on them. This paper will describe systems-oriented IHA techniques (as opposed to traditional failure-event or reliability techniques) that should be employed for distributed systems in aerospace environments. Special considerations will be addressed when dealing with specific distributed systems such as active thermal control, electrical power, command and data handling, and software systems (including the interaction with fault management systems). Because of the significance of second-order effects in large scale distributed systems, the paper will also describe how to analyze secondary functions to secondary functions through the use of channelization.

  16. Molecular Isotopic Distribution Analysis (MIDAs) with Adjustable Mass Accuracy

    NASA Astrophysics Data System (ADS)

    Alves, Gelio; Ogurtsov, Aleksey Y.; Yu, Yi-Kuo

    2014-01-01

    In this paper, we present Molecular Isotopic Distribution Analysis (MIDAs), a new software tool designed to compute molecular isotopic distributions with adjustable accuracies. MIDAs offers two algorithms, one polynomial-based and one Fourier-transform-based, both of which compute molecular isotopic distributions accurately and efficiently. The polynomial-based algorithm contains few novel aspects, whereas the Fourier-transform-based algorithm consists mainly of improvements to other existing Fourier-transform-based algorithms. We have benchmarked the performance of the two algorithms implemented in MIDAs with that of eight software packages (BRAIN, Emass, Mercury, Mercury5, NeutronCluster, Qmass, JFC, IC) using a consensus set of benchmark molecules. Under the proposed evaluation criteria, MIDAs's algorithms, JFC, and Emass compute with comparable accuracy the coarse-grained (low-resolution) isotopic distributions and are more accurate than the other software packages. For fine-grained isotopic distributions, we compared IC, MIDAs's polynomial algorithm, and MIDAs's Fourier transform algorithm. Among the three, IC and MIDAs's polynomial algorithm compute isotopic distributions that better resemble their corresponding exact fine-grained (high-resolution) isotopic distributions. MIDAs can be accessed freely through a user-friendly web-interface at http://www.ncbi.nlm.nih.gov/CBBresearch/Yu/midas/index.html.

  17. WATER DISTRIBUTION SYSTEM ANALYSIS: FIELD STUDIES, MODELING AND MANAGEMENT

    EPA Science Inventory

    The user's guide entitled “Water Distribution System Analysis: Field Studies, Modeling and Management” is a reference guide for water utilities and an extensive summarization of information designed to provide drinking water utility personnel (and related consultants and research...

  18. Can Distributed Volunteers Accomplish Massive Data Analysis Tasks?

    NASA Technical Reports Server (NTRS)

    Kanefsky, B.; Barlow, N. G.; Gulick, V. C.

    2001-01-01

    We argue that many image analysis tasks can be performed by distributed amateurs. Our pilot study, with crater surveying and classification, has produced encouraging results in terms of both quantity (100,000 crater entries in 2 months) and quality. Additional information is contained in the original extended abstract.

  19. Data synthesis and display programs for wave distribution function analysis

    NASA Technical Reports Server (NTRS)

    Storey, L. R. O.; Yeh, K. J.

    1992-01-01

    At the National Space Science Data Center (NSSDC), software was written to synthesize and display artificial data for use in developing the methodology of wave distribution function analysis. The software comprises two separate interactive programs, one for data synthesis and the other for data display.

  20. Bayesian analysis of a disability model for lung cancer survival.

    PubMed

    Armero, C; Cabras, S; Castellanos, M E; Perra, S; Quirós, A; Oruezábal, M J; Sánchez-Rubio, J

    2016-02-01

    Bayesian reasoning, survival analysis and multi-state models are used to assess survival times for Stage IV non-small-cell lung cancer patients and the evolution of the disease over time. Bayesian estimation is done using minimum informative priors for the Weibull regression survival model, leading to an automatic inferential procedure. Markov chain Monte Carlo methods have been used for approximating posterior distributions and the Bayesian information criterion has been considered for covariate selection. In particular, the posterior distribution of the transition probabilities, resulting from the multi-state model, constitutes a very interesting tool which could be useful to help oncologists and patients make efficient and effective decisions. PMID:22767866

  1. Assessing tephra total grain-size distribution: Insights from field data analysis

    NASA Astrophysics Data System (ADS)

    Costa, A.; Pioli, L.; Bonadonna, C.

    2016-06-01

    The Total Grain-Size Distribution (TGSD) of tephra deposits is crucial for hazard assessment and provides fundamental insights into eruption dynamics. It controls both the mass distribution within the eruptive plume and the sedimentation processes, and it can provide essential information on the fragmentation mechanisms. The TGSD is typically calculated by integrating deposit grain-sizes at different locations. The result of such integration is affected not only by the number of sampling sites, but also by their spatial distribution and distance from the vent. In order to evaluate the reliability of TGSDs, we assessed representative sampling distances for pyroclasts of different sizes through dedicated numerical simulations of tephra dispersal. Results reveal that, depending on wind conditions, a representative grain-size distribution of tephra deposits down to ∼100 μm can be obtained by integrating samples collected at distances from less than one tenth up to a few tens of the column height. The statistical properties of TGSDs representative of a range of eruption styles were calculated by fitting the data with a few general distributions: the sum of two log-normal distributions (bi-Gaussian in Φ-units), the sum of two Weibull distributions, and a generalized log-logistic distribution for the cumulative number distributions. The main parameters of the bi-lognormal fitting correlate with the height of the eruptive column and the magma viscosity, allowing general relationships to be used for estimating the TGSD generated in a variety of eruptive styles and for different magma compositions. Fitting results for the cumulative number distribution show two different power-law trends for the coarse and fine fractions of tephra particles, respectively. Our results shed light on the complex processes that control the size of particles being injected into the atmosphere during explosive volcanic eruptions and represent the first attempt to assess TGSD on the basis of pivotal physical
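
    A minimal sketch of fitting the bi-Gaussian (in Φ-units) form mentioned above with scipy.optimize.curve_fit; the grain-size data and initial guesses are invented:

      import numpy as np
      from scipy.optimize import curve_fit

      def bi_gaussian(phi, w, m1, s1, m2, s2):
          # Weighted sum of two normal densities in phi = -log2(diameter in mm).
          g1 = np.exp(-0.5 * ((phi - m1) / s1) ** 2) / (s1 * np.sqrt(2 * np.pi))
          g2 = np.exp(-0.5 * ((phi - m2) / s2) ** 2) / (s2 * np.sqrt(2 * np.pi))
          return w * g1 + (1.0 - w) * g2

      phi = np.linspace(-4, 8, 25)                      # hypothetical size classes
      wt = bi_gaussian(phi, 0.6, -1.0, 1.2, 3.5, 1.0)   # stand-in for measured weight fractions

      p0 = [0.5, 0.0, 1.0, 3.0, 1.0]
      bounds = ([0, -6, 0.1, -6, 0.1], [1, 10, 5, 10, 5])
      params, cov = curve_fit(bi_gaussian, phi, wt, p0=p0, bounds=bounds)
      print(params)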

  2. Spatial Distribution Characteristics of Healthcare Facilities in Nanjing: Network Point Pattern Analysis and Correlation Analysis.

    PubMed

    Ni, Jianhua; Qian, Tianlu; Xi, Changbai; Rui, Yikang; Wang, Jiechen

    2016-01-01

    The spatial distribution of urban service facilities is largely constrained by the road network. In this study, network point pattern analysis and correlation analysis were used to analyze the relationship between the road network and healthcare facility distribution. The weighted network kernel density estimation method proposed in this study identifies significant differences between the outside and inside areas of the Ming city wall. The results of network K-function analysis show that private hospitals are more evenly distributed than public hospitals, and pharmacy stores tend to cluster around hospitals along the road network. After computing the correlation analysis between different categorized hospitals and street centrality, we find that the distribution of these hospitals correlates highly with the street centralities, and that the correlations are higher with private and small hospitals than with public and large hospitals. The comprehensive analysis results could help examine the reasonableness of the existing urban healthcare facility distribution and optimize the location of new healthcare facilities. PMID:27548197

  3. Spatial Distribution Characteristics of Healthcare Facilities in Nanjing: Network Point Pattern Analysis and Correlation Analysis

    PubMed Central

    Ni, Jianhua; Qian, Tianlu; Xi, Changbai; Rui, Yikang; Wang, Jiechen

    2016-01-01

    The spatial distribution of urban service facilities is largely constrained by the road network. In this study, network point pattern analysis and correlation analysis were used to analyze the relationship between the road network and healthcare facility distribution. The weighted network kernel density estimation method proposed in this study identifies significant differences between the outside and inside areas of the Ming city wall. The results of network K-function analysis show that private hospitals are more evenly distributed than public hospitals, and pharmacy stores tend to cluster around hospitals along the road network. After computing the correlation analysis between different categorized hospitals and street centrality, we find that the distribution of these hospitals correlates highly with the street centralities, and that the correlations are higher with private and small hospitals than with public and large hospitals. The comprehensive analysis results could help examine the reasonableness of the existing urban healthcare facility distribution and optimize the location of new healthcare facilities. PMID:27548197

  4. Generalized Exponential Distribution in Flood Frequency Analysis for Polish Rivers

    PubMed Central

    Markiewicz, Iwona; Strupczewski, Witold G.; Bogdanowicz, Ewa; Kochanek, Krzysztof

    2015-01-01

    Many distributions have been used in flood frequency analysis (FFA) for fitting flood extremes data. However, as shown in the paper, the scatter of Polish data plotted on the moment ratio diagram shows that there is still room for a new model. In the paper, we study the usefulness of the generalized exponential (GE) distribution in flood frequency analysis for Polish rivers. We investigate the fit of the GE distribution to the Polish data on maximum flows in comparison with the inverse Gaussian (IG) distribution, which in our previous studies showed the best fit among several models commonly used in FFA. Since the use of a discrimination procedure without knowledge of its performance for the considered probability density functions may lead to erroneous conclusions, we compare the probability of correct selection for the GE and IG distributions along with an analysis of the asymptotic model error with respect to the upper quantile values. As an application, both the GE and IG distributions are alternatively assumed to describe the annual peak flows for several gauging stations on Polish rivers. To find the best-fitting model, four discrimination procedures are used. In turn, they are based on the maximized logarithm of the likelihood function (K procedure), on the density function of the scale transformation maximal invariant (QK procedure), on the Kolmogorov-Smirnov statistics (KS procedure), and on the differences between the ML estimate of the 1% quantile and its value assessed by the method of moments and linear moments, in sequence (R procedure). Due to the uncertainty in choosing the best model, the method of aggregation is applied to the estimation of the maximum flow quantiles. PMID:26657239

  5. Integrating software architectures for distributed simulations and simulation analysis communities.

    SciTech Connect

    Goldsby, Michael E.; Fellig, Daniel; Linebarger, John Michael; Moore, Patrick Curtis; Sa, Timothy J.; Hawley, Marilyn F.

    2005-10-01

    The one-year Software Architecture LDRD (No. 79819) was a cross-site effort between Sandia California and Sandia New Mexico. The purpose of this research was to further develop and demonstrate integrating software architecture frameworks for distributed simulation and distributed collaboration in the homeland security domain. The integrated frameworks were initially developed through the Weapons of Mass Destruction Decision Analysis Center (WMD-DAC), sited at SNL/CA, and the National Infrastructure Simulation & Analysis Center (NISAC), sited at SNL/NM. The primary deliverable was a demonstration of both a federation of distributed simulations and a federation of distributed collaborative simulation analysis communities in the context of the same integrated scenario, which was the release of smallpox in San Diego, California. To our knowledge this was the first time such a combination of federations under a single scenario has ever been demonstrated. A secondary deliverable was the creation of the standalone GroupMeld™ collaboration client, which uses the GroupMeld™ synchronous collaboration framework. In addition, a small pilot experiment that used both integrating frameworks allowed a greater range of crisis management options to be performed and evaluated than would have been possible without the use of the frameworks.

  6. Modelling Framework and the Quantitative Analysis of Distributed Energy Resources in Future Distribution Networks

    NASA Astrophysics Data System (ADS)

    Han, Xue; Sandels, Claes; Zhu, Kun; Nordström, Lars

    2013-08-01

    There has been a large body of statements claiming that the large-scale deployment of Distributed Energy Resources (DERs) could eventually reshape future distribution grid operation in numerous ways. Thus, it is necessary to introduce a framework to measure to what extent power system operation will be changed by the various parameters of DERs. This article proposes a modelling framework for an overview analysis of the correlation between DER deployment and distribution grid operation. Furthermore, to validate the framework, the authors describe reference models for different categories of DERs with their unique characteristics, comprising distributed generation, active demand, and electric vehicles. Subsequently, a quantitative analysis was made on the basis of the current and envisioned DER deployment scenarios proposed for Sweden. Simulations are performed in two typical distribution network models for four seasons. The simulation results show that, in general, DER deployment brings in possibilities to reduce power losses and voltage drops by compensating power from local generation and optimizing local load profiles.

  7. Spatial Distribution Analysis of Scrub Typhus in Korea

    PubMed Central

    Jin, Hong Sung; Chu, Chaeshin; Han, Dong Yeob

    2013-01-01

    Objective: This study analyzes the spatial distribution of scrub typhus in Korea. Methods: The spatial distribution of Orientia tsutsugamushi occurrence is presented using a geographic information system (GIS) and analyzed by means of spatial clustering and correlations. Results: The provinces of Gangwon-do and Gyeongsangbuk-do show a low incidence throughout the year. Some districts have almost identical environmental conditions for scrub typhus incidence. Land use change in a district does not directly affect the incidence rate. Conclusion: GIS analysis shows the spatial characteristics of scrub typhus. This research can be used to construct a spatial-temporal model to understand the scrub typhus epidemic. PMID:24159523

  8. Local structure studies of materials using pair distribution function analysis

    NASA Astrophysics Data System (ADS)

    Peterson, Joseph W.

    A collection of pair distribution function studies on various materials is presented in this dissertation. In each case, local structure information of interest pushes the current limits of what these studies can accomplish. The goal is to provide insight into the individual material behaviors as well as to investigate ways to expand the current limits of PDF analysis. Where possible, I provide a framework for how PDF analysis might be applied to a wider set of material phenomena. Throughout the dissertation, I discuss i) the capabilities of the PDF method to provide information pertaining to a material's structure and properties, ii) current limitations in the conventional approach to PDF analysis, iii) possible solutions to overcome certain limitations in PDF analysis, and iv) suggestions for future work to expand and improve the capabilities of PDF analysis.

  9. Vibrational Energy Distribution Analysis (VEDA): Scopes and limitations

    NASA Astrophysics Data System (ADS)

    Jamróz, Michał H.

    2013-10-01

    The principle of operation of the VEDA program, written by the author for Potential Energy Distribution (PED) analysis of theoretical vibrational spectra, is described. Nowadays, PED analysis is an indispensable tool in the serious analysis of vibrational spectra. To perform PED analysis it is necessary to define 3N-6 linearly independent local mode coordinates. Even for 20-atom molecules this is a difficult task. The VEDA program reads the input data automatically from the Gaussian program output files. Then, VEDA automatically proposes an introductory set of local mode coordinates. Next, more adequate coordinates are proposed by the program and optimized to obtain maximal elements of each column (internal coordinate) of the PED matrix (the EPM parameter). The possibility of an automatic optimization of PED contributions is a unique feature of the VEDA program, absent in any other program performing PED analysis.

  10. Analysis of georadar data to estimate the snow depth distribution

    NASA Astrophysics Data System (ADS)

    Godio, A.; Rege, R. B.

    2016-06-01

    We have performed extensive georadar surveys for mapping the snow depth in the basin of Breuil-Cervinia (Aosta Valley) in the Italian Alps, close to the Matterhorn. More than 9 km of georadar profiles were acquired in April 2008 and 15 km in April 2009, distributed over a hydrological basin of about 12 km2. Radar surveys were carried out partially on the iced area of the Ventina glacier at elevations higher than 3000 m a.s.l. and partially at lower elevations (2500-3000 m) on the gentle slopes of the basin, where the winter snow accumulated directly on the ground surface. The snow distribution in the basin at the end of the season can vary significantly according to the elevation range, exposition, and ground morphology. In small catchments the snow depth reached 6-7 m. At higher elevations, on the glacier, a more homogeneous distribution is usually observed. A descriptive statistical analysis of the dataset is discussed to demonstrate the high spatial variability of the snow depth distribution in the area. The probability distribution of the snow depth fits the gamma distribution with good correlation. We did not, however, find any satisfactory relationship between the snow depth and the main morphological parameters of the terrain (elevation, slope, curvature). This suggests that the snow distribution at the end of the winter season is mainly conditioned by transport phenomena and the re-distribution of wind action. The comparison of the georadar surveys with hand probe measurements points out the low accuracy of snow depth estimates obtained in the area by conventional hand probing alone, encouraging the development of technology for fast and accurate mapping of the snow depth at the basin scale.
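
    A short sketch of the gamma-distribution fit reported above, with synthetic snow depths standing in for the georadar picks:

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(4)
      depth = rng.gamma(shape=4.0, scale=0.8, size=2000)   # hypothetical snow depths, m

      a, loc, scale = stats.gamma.fit(depth, floc=0)       # two-parameter gamma fit
      d, p = stats.kstest(depth, stats.gamma.cdf, args=(a, loc, scale))
      print(f"shape={a:.2f}, scale={scale:.2f}, KS D={d:.3f}")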

  11. HammerCloud: A Stress Testing System for Distributed Analysis

    NASA Astrophysics Data System (ADS)

    van der Ster, Daniel C.; Elmsheuser, Johannes; Úbeda García, Mario; Paladin, Massimo

    2011-12-01

    Distributed analysis of LHC data is an I/O-intensive activity which places large demands on the internal network, storage, and local disks at remote computing facilities. Commissioning and maintaining a site to provide an efficient distributed analysis service is therefore a challenge which can be aided by tools that help evaluate a variety of infrastructure designs and configurations. HammerCloud is one such tool; it is a stress-testing service which is used by central operations teams, regional coordinators, and local site admins to (a) submit an arbitrary number of analysis jobs to a number of sites, (b) maintain at a steady state a predefined number of jobs running at the sites under test, (c) produce web-based reports summarizing the efficiency and performance of the sites under test, and (d) present a web interface for historical test results to both evaluate progress and compare sites. HammerCloud was built around the distributed analysis framework Ganga, exploiting its API for grid job management. HammerCloud has been employed by the ATLAS experiment for continuous testing of many sites worldwide, and also during large-scale computing challenges such as STEP'09 and UAT'09, where the scale of the tests exceeded 10,000 concurrently running jobs and 1,000,000 total jobs over multi-day periods. In addition, HammerCloud is being adopted by the CMS experiment; the plugin structure of HammerCloud allows the execution of CMS jobs using their official tool (CRAB).

  12. New acquisition techniques and statistical analysis of bubble size distributions

    NASA Astrophysics Data System (ADS)

    Proussevitch, A.; Sahagian, D.

    2005-12-01

    Various approaches have been taken to solve the long-standing problem of determining the size distributions of objects embedded in an opaque medium. In the case of vesicles in volcanic rocks, the most reliable technique is 3-D imagery by computed X-ray tomography. However, this method is expensive, requires intensive computational resources, and is thus limited and not always available to an investigator. As a cheaper alternative, 2-D cross-sectional data are commonly available, but they require stereological analysis for 3-D conversion. The stereology technique for spherical bubbles is quite robust, but elongated non-spherical bubbles require complicated conversion approaches and large observed populations. We have revised the computational schemes for applying non-spherical stereology to practical analysis of bubble size distributions. The basic idea of this new approach is to exclude from the conversion those classes (bins) of non-spherical bubbles that provide a larger cross-section probability distribution than a maximum value which depends on the mean aspect ratio. Thus, in contrast to traditional stereological techniques, larger bubbles are "predicted" from the rest of the population. As a proof of principle, we have compared distributions so obtained with direct 3-D imagery (X-ray tomography) for non-spherical bubbles from the same samples of vesicular basalts collected from the Colorado Plateau. The results of the comparison demonstrate that in cases where X-ray tomography is impractical, stereology can be used with reasonable reliability, even for non-spherical vesicles.

  13. Comparing distributions of environmental outcomes for regulatory environmental justice analysis.

    PubMed

    Maguire, Kelly; Sheriff, Glenn

    2011-05-01

    Economists have long been interested in measuring distributional impacts of policy interventions. As environmental justice (EJ) emerged as an ethical issue in the 1970s, the academic literature has provided statistical analyses of the incidence and causes of various environmental outcomes as they relate to race, income, and other demographic variables. In the context of regulatory impacts, however, there is a lack of consensus regarding what information is relevant for EJ analysis, and how best to present it. This paper helps frame the discussion by suggesting a set of questions fundamental to regulatory EJ analysis, reviewing past approaches to quantifying distributional equity, and discussing the potential for adapting existing tools to the regulatory context. PMID:21655146

  14. Modeling and convergence analysis of distributed coevolutionary algorithms.

    PubMed

    Subbu, Raj; Sanderson, Arthur C

    2004-04-01

    A theoretical foundation is presented for modeling and convergence analysis of a class of distributed coevolutionary algorithms applied to optimization problems in which the variables are partitioned among p nodes. An evolutionary algorithm at each of the p nodes performs a local evolutionary search based on its own set of primary variables, and the secondary variable set at each node is clamped during this phase. An infrequent intercommunication between the nodes updates the secondary variables at each node. The local search and intercommunication phases alternate, resulting in a cooperative search by the p nodes. First, we specify a theoretical basis for a class of centralized evolutionary algorithms in terms of construction and evolution of sampling distributions over the feasible space. Next, this foundation is extended to develop a model for a class of distributed coevolutionary algorithms. Convergence and convergence rate analyses are pursued for basic classes of objective functions. Our theoretical investigation reveals that for certain unimodal and multimodal objectives, we can expect these algorithms to converge at a geometrical rate. The distributed coevolutionary algorithms are of most interest from the perspective of their performance advantage compared to centralized algorithms when they execute in a network environment with significant local access and internode communication delays. The relative performance of these algorithms is therefore evaluated in a distributed environment with realistic parameters of network behavior. PMID:15376831

  15. Distribution System Reliability Analysis for Smart Grid Applications

    NASA Astrophysics Data System (ADS)

    Aljohani, Tawfiq Masad

    Reliability of power systems is a key aspect in modern power system planning, design, and operation. The ascendance of the smart grid concept has provided high hopes of developing an intelligent network that is capable of being a self-healing grid, offering the ability to overcome the interruption problems that face the utility and cost it tens of millions in repairs and losses. To address these reliability concerns, the power utilities and interested parties have spent an extensive amount of time and effort analyzing and studying the reliability of the generation and transmission sectors of the power grid. Only recently has attention shifted to improving the reliability of the distribution network, the connection joint between the power providers and the consumers, where most electricity problems occur. In this work, we examine the effect of smart grid applications on improving the reliability of power distribution networks. The test system used in this thesis is the IEEE 34-node test feeder, released in 2003 by the Distribution System Analysis Subcommittee of the IEEE Power Engineering Society. The objective is to analyze the feeder for the optimal placement of automatic switching devices and quantify their proper installation based on the performance of the distribution system. The measures are the changes in the system reliability indices, including SAIDI, SAIFI, and EUE. A further goal is to design and simulate the effect of the installation of Distributed Generators (DGs) on the utility's distribution system and measure the potential improvement in its reliability. The software used in this work is DISREL, an intelligent power distribution package developed by General Reliability Co.
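
    For reference, a small sketch of how the customer-based reliability indices named above are computed from outage records (standard IEEE 1366-style definitions; the outage list is invented):

      # Each outage event: (customers_interrupted, duration_hours)
      outages = [(120, 1.5), (40, 0.5), (300, 3.0)]   # hypothetical events over one year
      customers_served = 5000

      saifi = sum(n for n, _ in outages) / customers_served      # interruptions per customer
      saidi = sum(n * h for n, h in outages) / customers_served  # interruption hours per customer
      print(f"SAIFI={saifi:.3f}, SAIDI={saidi:.3f} h")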

  16. Electrical Power Distribution and Control Modeling and Analysis

    NASA Technical Reports Server (NTRS)

    Fu, Johnny S.; Liffring, Mark; Mehdi, Ishaque S.

    2001-01-01

    This slide presentation reviews the use of Electrical Power Distribution and Control (EPD&C) Modeling and how modeling can support analysis. The presentation discusses using the EASY5 model to simulate and analyze the Space Shuttle Electric Auxiliary Power Unit. Diagrams of the model schematics are included, as well as graphs of the battery cell impedance, hydraulic load dynamics, and EPD&C response to hydraulic load variations.

  17. A Study of ATLAS Grid Performance for Distributed Analysis

    NASA Astrophysics Data System (ADS)

    Panitkin, Sergey; Fine, Valery; Wenaus, Torre

    2012-12-01

    In the past two years the ATLAS Collaboration at the LHC has collected a large volume of data and published a number of ground-breaking papers. The Grid-based ATLAS distributed computing infrastructure played a crucial role in enabling timely analysis of the data. We present a study of the performance and usage of the ATLAS Grid as a platform for physics analysis in 2011. This includes studies of general properties as well as timing properties of user jobs (wait time, run time, etc.). These studies are based on mining data archived by the PanDA workload management system.

  18. Distributed and interactive visual analysis of omics data.

    PubMed

    Farag, Yehia; Berven, Frode S; Jonassen, Inge; Petersen, Kjell; Barsnes, Harald

    2015-11-01

    The amount of publicly shared proteomics data has grown exponentially over the last decade as the solutions for sharing and storing the data have improved. However, the use of the data is often limited by the manner in which it is made available. There are two main approaches: download and inspect the proteomics data locally, or interact with the data via one or more web pages. The first is limited by having to download the data and thus requires local computational skills and resources, while the latter is most often limited in terms of interactivity and the analysis options available. A solution is to develop web-based systems supporting distributed and fully interactive visual analysis of proteomics data. The use of a distributed architecture makes it possible to perform the computational analysis at the server, while the results of the analysis can be displayed via a web browser without the need to download the whole dataset. Here the challenges related to developing such systems for omics data are discussed, especially how this allows for multiple connected interactive visual displays of omics datasets in a web-based setting, and the benefits this provides for computational analysis of proteomics data. This article is part of a Special Issue entitled: Computational Proteomics. PMID:26047716

  19. Human leptospirosis distribution pattern analysis in Hulu Langat, Selangor

    NASA Astrophysics Data System (ADS)

    Zulkifli, Zuhafiza; Shariff, Abdul Rashid Mohamed; Tarmidi, Zakri M.

    2016-06-01

    This paper discusses the distribution pattern of human leptospirosis in the Hulu Langat District, Selangor, Malaysia. The data used in this study are leptospirosis case reports and spatial boundaries. Case data were collected from the Health Office of Hulu Langat, and spatial boundaries, including lot and district boundaries, were collected from the Department of Mapping and Surveying Malaysia (JUPEM). A total of 599 leptospirosis cases were reported in 2013, and these were mapped based on the addresses provided in the case reports. This study uses three statistical methods to analyze the distribution pattern: Moran's I, average nearest neighborhood (ANN), and kernel density estimation. The analysis was used to determine the spatial distribution and the average distance between leptospirosis cases and to locate the hotspots. The Moran's I analysis yielded a value of -0.202816, indicating negative spatial autocorrelation among the leptospirosis cases, consistent with a random-to-dispersed pattern. The ANN analysis indicated a clustered pattern, with a reported nearest-neighbor test statistic of -21.80 (a strongly negative z-score, consistent with clustering). Hotspots were identified and mapped across the Hulu Langat District.
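
    For reference, the global Moran's I statistic used above can be computed in a few lines; the case counts and the binary neighbour matrix below are hypothetical, and the formula is the standard one.

    ```python
    import numpy as np

    def morans_i(x, w):
        """Global Moran's I for values x and spatial weight matrix w
        (w[i, j] > 0 when locations i and j are neighbours, diagonal 0)."""
        x = np.asarray(x, dtype=float)
        z = x - x.mean()
        return len(x) / w.sum() * (w * np.outer(z, z)).sum() / (z @ z)

    # Hypothetical case counts at four locations, rook-style neighbours.
    counts = [10, 2, 9, 1]
    w = np.array([[0, 1, 0, 1],
                  [1, 0, 1, 0],
                  [0, 1, 0, 1],
                  [1, 0, 1, 0]], dtype=float)
    print(morans_i(counts, w))   # negative: high values adjoin low values
    ```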

  20. Automatic analysis of attack data from distributed honeypot network

    NASA Astrophysics Data System (ADS)

    Safarik, Jakub; Voznak, MIroslav; Rezac, Filip; Partila, Pavol; Tomala, Karel

    2013-05-01

    There are many ways of getting real data about malicious activity in a network. One of them relies on masquerading monitoring servers as production ones. These servers are called honeypots, and data about attacks on them bring us valuable information about actual attacks and the techniques used by hackers. The article describes a distributed topology of honeypots, which was developed with a strong orientation toward monitoring of IP telephony traffic. IP telephony servers can be easily exposed to various types of attacks, and without protection, this situation can lead to loss of money and other unpleasant consequences. Using a distributed topology with honeypots placed in different geographical locations and networks provides more valuable and independent results. With an automatic system for gathering information from all honeypots, it is possible to work with all the information at one centralized point. Communication between the honeypots and the centralized data store uses secure SSH tunnels, and the server communicates only with authorized honeypots. The centralized server also automatically analyzes the data from each honeypot. Results of this analysis, along with other statistical data about malicious activity, are easily accessible through a built-in web server. All statistical and analysis reports serve as the information basis for an algorithm that classifies the different types of VoIP attacks used. The web interface then provides a tool for quick comparison and evaluation of actual attacks in all monitored networks. The article describes both the honeypot nodes in the distributed architecture, which monitor suspicious activity, and the methods and algorithms used on the server side to analyze the gathered data.

  1. GPS FOM Chimney Analysis using Generalized Extreme Value Distribution

    NASA Technical Reports Server (NTRS)

    Ott, Rick; Frisbee, Joe; Saha, Kanan

    2004-01-01

    A frequent objective of a statistical analysis is to estimate a limit value, such as a 3-sigma 95% confidence upper limit, from a data sample. The generalized extreme value (GEV) distribution can be profitably employed for such an estimate in many situations. It is well known that, according to the central limit theorem, the mean value of a large data set is normally distributed irrespective of the distribution of the data from which the mean is derived. In a somewhat similar fashion, the extreme value of a data set is often observed to follow a distribution that can be formulated as a GEV distribution. In space shuttle entry with 3-string GPS navigation, the Figure Of Merit (FOM) value gives a measure of GPS navigated-state accuracy. A GPS navigated state with a FOM of 6 or higher is deemed unacceptable and is said to form a FOM chimney: a period of time during which the FOM value stays above 5. A longer period with a FOM of 6 or higher causes the navigated state to accumulate more error for lack of state updates. For an acceptable landing it is imperative that the state error remain low; hence, at low altitude during entry, GPS data with a FOM greater than 5 must not last more than 138 seconds. To test GPS performance, many entry test cases were simulated at the Avionics Development Laboratory. Only high-value FOM chimneys are consequential, so the extreme value statistical technique is applied to analyze them. The maximum likelihood method is used to determine the parameters that characterize the GEV distribution, and the limit value statistics are then estimated.
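
    A hedged sketch of this fitting step using SciPy's generalized extreme value distribution follows; the synthetic block maxima below merely stand in for the simulated chimney durations, and all parameter values are illustrative.

    ```python
    import numpy as np
    from scipy import stats

    # Synthetic stand-in for per-run maximum chimney durations (seconds).
    maxima = stats.genextreme.rvs(c=-0.1, loc=60, scale=15, size=200,
                                  random_state=np.random.default_rng(1))

    # Maximum likelihood fit of the GEV parameters (c is SciPy's shape).
    c, loc, scale = stats.genextreme.fit(maxima)

    # Estimated probability that a chimney exceeds the 138 s limit.
    p_exceed = stats.genextreme.sf(138, c, loc=loc, scale=scale)
    print(f"shape={c:.3f}  loc={loc:.1f}  scale={scale:.1f}  "
          f"P(duration > 138 s)={p_exceed:.4f}")
    ```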

  2. Mechanical Network in Titin Immunoglobulin from Force Distribution Analysis

    PubMed Central

    Wilmanns, Matthias; Gräter, Frauke

    2009-01-01

    The role of mechanical force in cellular processes is increasingly revealed by single molecule experiments and simulations of force-induced transitions in proteins. How the applied force propagates within proteins determines their mechanical behavior yet remains largely unknown. We present a new method based on molecular dynamics simulations to disclose the distribution of strain in protein structures, here for the newly determined high-resolution crystal structure of I27, a titin immunoglobulin (IG) domain. We obtain a sparse, spatially connected, and highly anisotropic mechanical network. This allows us to detect load-bearing motifs composed of interstrand hydrogen bonds and hydrophobic core interactions, including parts distal to the site to which force was applied. The role of the force distribution pattern for mechanical stability is tested by in silico unfolding of I27 mutants. We then compare the observed force pattern to the sparse network of coevolved residues found in this family. We find a remarkable overlap, suggesting the force distribution to reflect constraints for the evolutionary design of mechanical resistance in the IG family. The force distribution analysis provides a molecular interpretation of coevolution and opens the road to the study of the mechanism of signal propagation in proteins in general. PMID:19282960

  3. Iterative Monte Carlo analysis of spin-dependent parton distributions

    NASA Astrophysics Data System (ADS)

    Sato, Nobuo; Melnitchouk, W.; Kuhn, S. E.; Ethier, J. J.; Accardi, A.; Jefferson Lab Angular Momentum Collaboration

    2016-04-01

    We present a comprehensive new global QCD analysis of polarized inclusive deep-inelastic scattering, including the latest high-precision data on longitudinal and transverse polarization asymmetries from Jefferson Lab and elsewhere. The analysis is performed using a new iterative Monte Carlo fitting technique which generates stable fits to polarized parton distribution functions (PDFs) with statistically rigorous uncertainties. Inclusion of the Jefferson Lab data leads to a reduction in the PDF errors for the valence and sea quarks, as well as in the gluon polarization uncertainty at x ≳0.1 . The study also provides the first determination of the flavor-separated twist-3 PDFs and the d2 moment of the nucleon within a global PDF analysis.

  4. Componential distribution analysis of food using near infrared ray image

    NASA Astrophysics Data System (ADS)

    Yamauchi, Hiroki; Kato, Kunihito; Yamamoto, Kazuhiko; Ogawa, Noriko; Ohba, Kimie

    2008-11-01

    The components of food related to "deliciousness" are usually evaluated by componential analysis, which determines the content and type of components in the food. However, componential analysis cannot resolve the spatial distribution of components, and the measurement is time consuming. We propose a method to measure the two-dimensional distribution of a component in food using a near infrared (IR) image. The advantage of our method is that it can visualize otherwise invisible components. Many components in food have characteristic absorption and reflection of light in the IR range, so the component content can be measured using subtraction between images taken at two wavelengths of near-IR light. In this paper, we describe a method to measure food components using near-IR image processing, and we show an application visualizing the saccharose distribution in pumpkin.
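
    The two-wavelength subtraction can be sketched in a few lines of NumPy; the images and band values below are synthetic placeholders, not the paper's data.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    # Co-registered near-IR images of one sample: band_abs is taken at a
    # wavelength the component absorbs, band_ref at a nearby reference
    # wavelength that it does not.
    band_ref = 0.8 + 0.02 * rng.standard_normal((64, 64))
    band_abs = band_ref.copy()
    band_abs[20:40, 20:40] -= 0.15    # component lowers reflectance here

    # Difference image: larger values indicate higher component content.
    component_map = band_ref - band_abs
    print("peak response:", round(float(component_map.max()), 3))
    ```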

  5. Performance Analysis of an Actor-Based Distributed Simulation

    NASA Technical Reports Server (NTRS)

    Schoeffler, James D.

    1998-01-01

    Object-oriented design of simulation programs appears to be very attractive because of the natural association of components in the simulated system with objects. There is great potential in distributing the simulation across several computers for the purpose of parallel computation and its consequent handling of larger problems in less elapsed time. One approach to such a design is to use "actors", that is, active objects with their own thread of control. Because these objects execute concurrently, communication is via messages. This is in contrast to an object-oriented design using passive objects where communication between objects is via method calls (direct calls when they are in the same address space and remote procedure calls when they are in different address spaces or different machines). This paper describes a performance analysis program for the evaluation of a design for distributed simulations based upon actors.
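
    The "active object" idea contrasted with passive objects above can be illustrated with a minimal actor: it owns a thread and a mailbox, and all interaction happens by message passing. The class below is a generic sketch, not the paper's analysis program.

    ```python
    import queue
    import threading

    class Actor:
        """Active object: its own thread of control plus a mailbox;
        other actors communicate with it only by sending messages."""
        def __init__(self, name):
            self.name = name
            self.mailbox = queue.Queue()
            self.thread = threading.Thread(target=self._run, daemon=True)
            self.thread.start()

        def send(self, msg):
            self.mailbox.put(msg)       # asynchronous, never blocks sender

        def _run(self):
            while True:
                msg = self.mailbox.get()
                if msg is None:         # poison pill shuts the actor down
                    break
                print(f"{self.name} handled {msg!r}")

    a = Actor("sim-component")
    a.send({"event": "advance", "t": 1.0})
    a.send(None)
    a.thread.join()
    ```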

  6. Principal Component Analysis for Normal-Distribution-Valued Symbolic Data.

    PubMed

    Wang, Huiwen; Chen, Meiling; Shi, Xiaojun; Li, Nan

    2016-02-01

    This paper puts forward a new approach to principal component analysis (PCA) for normal-distribution-valued symbolic data, which has a vast potential of applications in the economics and management fields. We derive a full set of numerical characteristics and the variance-covariance structure for such data, which forms the foundation for our analytical PCA approach. Our approach is able to use all of the variance information in the original data, unlike the prevailing representative-type approaches in the literature, which use only centers, vertices, etc. The paper also provides an accurate approach to constructing the observations in a PC space based on the linear additivity property of the normal distribution. The effectiveness of the proposed method is illustrated by simulated numerical experiments. Finally, our method is applied to explain the puzzle of the risk-return tradeoff in China's stock market. PMID:25095276

  7. Distributed analysis environment for HEP and interdisciplinary applications

    NASA Astrophysics Data System (ADS)

    Mościcki, J. T.

    2003-04-01

    The huge data volumes of Large Hadron Collider experiments require parallel end-user analysis on clusters of hundreds of machines. While the focus of end-user High-Energy Physics analysis is on ntuples, the master-worker model of parallel processing may also be used in other contexts such as detector simulation. The aim of the DIANE R&D project ( http://cern.ch/it-proj-diane), currently hosted by the CERN IT/API group, is to create a generic, component-based framework for distributed, parallel data processing in the master-worker model. Pre-compiled user analysis code is loaded dynamically at runtime in component libraries and called back when appropriate. Such an application-oriented framework must be flexible enough to integrate with emerging GRID technologies as they become available. Therefore, common services such as environment reconstruction, code distribution, load balancing and authentication are designed and implemented as pluggable modules. This approach allows them to be easily replaced with modules implemented with newer technology as necessary. The paper gives an overview of the DIANE architecture and explains the main design choices. Selected examples of diverse applications from a variety of domains applicable to DIANE are presented, along with preliminary benchmarking results.
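
    The master-worker pattern itself is easy to sketch; the stand-in below uses Python's multiprocessing and a trivial "analysis" function, and none of DIANE's component loading, scheduling, or GRID integration is reproduced.

    ```python
    from multiprocessing import Pool

    def analyze(chunk):
        """Stand-in for dynamically loaded user analysis code: here it
        just sums a chunk; a DIANE-style master would ship real work
        units to remote workers instead."""
        return sum(chunk)

    if __name__ == "__main__":
        chunks = [list(range(i, i + 1000)) for i in range(0, 10000, 1000)]
        with Pool(processes=4) as pool:          # the "workers"
            partial = pool.map(analyze, chunks)  # master farms out chunks
        print("merged result:", sum(partial))    # master merges results
    ```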

  8. Time series power flow analysis for distribution connected PV generation.

    SciTech Connect

    Broderick, Robert Joseph; Quiroz, Jimmy Edward; Ellis, Abraham; Reno, Matthew J.; Smith, Jeff; Dugan, Roger

    2013-01-01

    Distributed photovoltaic (PV) projects must go through an interconnection study process before connecting to the distribution grid. These studies are intended to identify the likely impacts and mitigation alternatives. In the majority of the cases, system impacts can be ruled out or mitigation can be identified without an involved study, through a screening process or a simple supplemental review study. For some proposed projects, expensive and time-consuming interconnection studies are required. The challenges to performing the studies are twofold. First, every study scenario is potentially unique, as the studies are often highly specific to the amount of PV generation capacity that varies greatly from feeder to feeder and is often unevenly distributed along the same feeder. This can cause location-specific impacts and mitigations. The second challenge is the inherent variability in PV power output which can interact with feeder operation in complex ways, by affecting the operation of voltage regulation and protection devices. The typical simulation tools and methods in use today for distribution system planning are often not adequate to accurately assess these potential impacts. This report demonstrates how quasi-static time series (QSTS) simulation and high time-resolution data can be used to assess the potential impacts in a more comprehensive manner. The QSTS simulations are applied to a set of sample feeders with high PV deployment to illustrate the usefulness of the approach. The report describes methods that can help determine how PV affects distribution system operations. The simulation results are focused on enhancing the understanding of the underlying technical issues. The examples also highlight the steps needed to perform QSTS simulation and describe the data needed to drive the simulations. The goal of this report is to make the methodology of time series power flow analysis readily accessible to utilities and others responsible for evaluating
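
    Independent of the solver used, the core QSTS idea (stepping a steady-state power-flow solution through a time series of load and PV output) can be shown in miniature. The toy below solves a hypothetical two-bus feeder in closed form; real interconnection studies use full unbalanced distribution solvers.

    ```python
    import numpy as np

    v_src = 1.0    # source voltage (per unit)
    r = 0.02       # feeder resistance (pu)
    hours = np.arange(24)
    load = 0.8 + 0.2 * np.sin(2 * np.pi * (hours - 18) / 24)     # pu demand
    pv = np.clip(np.sin(2 * np.pi * (hours - 6) / 24), 0, None)  # pu PV output

    voltages = []
    for p_net in load - pv:                # net power drawn at the load bus
        # Steady-state solve per timestep: v * (v_src - v) / r = p_net,
        # a quadratic in v; take the physical (upper) root.
        voltages.append((v_src + np.sqrt(v_src**2 - 4 * r * p_net)) / 2)

    # Midday PV surplus can push the bus voltage above the source value.
    print(f"min/max bus voltage: {min(voltages):.4f} / {max(voltages):.4f} pu")
    ```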

  9. Multiobjective sensitivity analysis and optimization of distributed hydrologic model MOBIDIC

    NASA Astrophysics Data System (ADS)

    Yang, J.; Castelli, F.; Chen, Y.

    2014-10-01

    Calibration of distributed hydrologic models usually involves dealing with a large number of distributed parameters and with optimization problems whose multiple, often conflicting, objectives arise in a natural fashion. This study presents a multiobjective sensitivity and optimization approach to handle these problems for the MOBIDIC (MOdello di Bilancio Idrologico DIstribuito e Continuo) distributed hydrologic model, which combines two sensitivity analysis techniques (the Morris method and the state-dependent parameter (SDP) method) with the multiobjective optimization (MOO) approach ɛ-NSGAII (Non-dominated Sorting Genetic Algorithm-II). This approach was implemented to calibrate MOBIDIC in an application to the Davidson watershed, North Carolina, with three objective functions: the standardized root mean square error (SRMSE) of logarithm-transformed discharge, the water balance index, and the mean absolute error of the logarithm-transformed flow duration curve. Its results were compared with those of a single-objective optimization (SOO) with the traditional Nelder-Mead simplex algorithm used in MOBIDIC, taking the objective function to be the Euclidean norm of these three objectives. Results show that (1) the two sensitivity analysis techniques are effective and efficient for determining the sensitive processes and insensitive parameters: surface runoff and evaporation are very sensitive processes for all three objective functions, while groundwater recession and soil hydraulic conductivity are not sensitive and were excluded from the optimization. (2) Both MOO and SOO lead to acceptable simulations; e.g., for MOO, the average Nash-Sutcliffe value is 0.75 in the calibration period and 0.70 in the validation period. (3) Evaporation and surface runoff show similar importance for the watershed water balance, while the contribution of baseflow can be ignored. (4) Compared to SOO, which was dependent on the initial starting location, MOO provides more

  10. Numerical analysis of decoy state quantum key distribution protocols

    SciTech Connect

    Harrington, Jim W; Rice, Patrick R

    2008-01-01

    Decoy state protocols are a useful tool for many quantum key distribution systems implemented with weak coherent pulses, allowing significantly better secret bit rates and longer maximum distances. In this paper we present a method to numerically find optimal three-level protocols, and we examine how the secret bit rate and the optimized parameters are dependent on various system properties, such as session length, transmission loss, and visibility. Additionally, we show how to modify the decoy state analysis to handle partially distinguishable decoy states as well as uncertainty in the prepared intensities.

  11. Efficient network meta-analysis: a confidence distribution approach*

    PubMed Central

    Yang, Guang; Liu, Dungang; Liu, Regina Y.; Xie, Minge; Hoaglin, David C.

    2014-01-01

    Network meta-analysis synthesizes several studies of multiple treatment comparisons to simultaneously provide inference for all treatments in the network. It can often strengthen inference on pairwise comparisons by borrowing evidence from other comparisons in the network. Current network meta-analysis approaches are derived from either conventional pairwise meta-analysis or hierarchical Bayesian methods. This paper introduces a new approach for network meta-analysis by combining confidence distributions (CDs). Instead of combining point estimators from individual studies as in the conventional approach, the new approach combines CDs, which contain richer information than point estimators, and thus achieves greater efficiency in its inference. The proposed CD approach can efficiently integrate all studies in the network and provide inference for all treatments even when individual studies contain only comparisons of subsets of the treatments. Through numerical studies with real and simulated data sets, the proposed approach is shown to outperform or at least equal the traditional pairwise meta-analysis and a commonly used Bayesian hierarchical model. Although the Bayesian approach may yield comparable results with a suitably chosen prior, it is highly sensitive to the choice of priors (especially the prior of the between-trial covariance structure), which is often subjective. The CD approach is a general frequentist approach and is prior-free. Moreover, it can always provide a proper inference for all the treatment effects regardless of the between-trial covariance structure. PMID:25067933

  12. Application of Wigner distribution function for analysis of radio occultations

    NASA Astrophysics Data System (ADS)

    Gorbunov, M. E.; Lauritsen, K. B.; Leroy, S. S.

    2010-12-01

    We present the Wigner distribution function (WDF) as an alternative to radio holographic (RH) analysis in the interpretation of radio occultation (RO) observations of the Earth's atmosphere. RH analysis is widely used in RO retrieval to isolate signal from noise and to identify atmospheric multipath. The same task is performed by WDF which also maps a 1-D wave function to 2-D time-frequency phase space and which has maxima located at the ray manifold. Unlike the standard RH technique based on the spectrum analysis in small sliding apertures, WDF is given by a global integral transform, which allows for a higher resolution. We present a tomographic derivation of the WDF and discuss its properties. Examples of analysis of simulations and COSMIC RO data show that WDF allows for a much sharper localization of the details of bending angle profiles as compared to the standard RH analysis in sliding apertures. Both WDF and RH allow for identification of multivalued bending angle profiles arising in the presence of strong horizontal gradients and may introduce a negative bias into bending angle retrieval.
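
    A naive discrete Wigner distribution can be written directly from its definition; this generic pseudo-WDF is for illustration only and is not the authors' tomographic RO processing chain.

    ```python
    import numpy as np

    def wigner(x):
        """Discrete pseudo-Wigner distribution: for each time sample n,
        FFT over the lag k of the instantaneous autocorrelation
        x[n + k] * conj(x[n - k]). Rows: frequency bins, columns: time."""
        x = np.asarray(x, dtype=complex)
        N = len(x)
        W = np.zeros((N, N))
        for n in range(N):
            kmax = min(n, N - 1 - n)
            r = np.zeros(N, dtype=complex)
            for k in range(-kmax, kmax + 1):
                r[k % N] = x[n + k] * np.conj(x[n - k])
            W[:, n] = np.fft.fft(r).real   # Hermitian in k, so FFT is real
        return W

    # A linear chirp concentrates along its instantaneous frequency
    # (at twice the signal frequency, the usual WVD convention).
    t = np.arange(256)
    x = np.exp(1j * 2 * np.pi * (0.05 * t + 0.15 * t**2 / (2 * 256)))
    W = wigner(x)
    print(W.shape, np.unravel_index(np.argmax(W), W.shape))
    ```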

  13. Cost-benefit analysis of potassium iodide distribution programs

    SciTech Connect

    Aldrich, D.C.

    1982-01-01

    An analysis has been performed to provide guidance to policymakers concerning the effectiveness of potassium iodide (KI) as a thyroid blocking agent in potential reactor accident situations, the distance to which (or area within which) it should be distributed and its relative effectiveness compared to other available protective measures. The analysis was performed using the Reactor Safety Study (WASH-1400) consequence model. Four categories of accidents were addressed: gap activity release accident (GAP), GAP without containment isolation, core melt with a melt-through release, and core melt with an atmospheric release. Cost-benefit ratios (US $/thyroid nodule prevented) are given assuming that no other protective measures are taken. Uncertainties due to health effects parameters, accident source terms, accident probabilities and costs are assessed. The effects of other potential protective measures, such as evacuation and sheltering, and the impact on children (critical population) are evaluated.

  14. Circularly symmetric distributed feedback semiconductor laser: An analysis

    SciTech Connect

    Erdogan, T.; Hall, D.G.

    1990-08-15

    We analyze the near-threshold behavior of a circularly symmetric distributed feedback laser by developing a coupled-mode theory analysis for all azimuthal modes. We show that the equations that describe the low-order azimuthal modes are, to a very good approximation, the same as those for the one-dimensional (linear) distributed feedback laser. We examine the behavior of higher-order azimuthal modes by numerically solving the exact coupled-mode equations. We find that while a significant amount of mode discrimination exists among radial (longitudinal) modes, as in the one-dimensional distributed feedback laser, there is a much smaller degree of discrimination among azimuthal modes, indicating the possibility of multimode operation. Despite the multimode behavior, we find that the frequency bandwidth associated with modes that do lase ought to be smaller than the spacing between Fabry-Perot modes of a typical semiconductor laser. This laser is an excellent candidate for a surface-emitting laser: it should have a superb-quality output beam and is well suited for array operation.

  15. Data intensive high energy physics analysis in a distributed cloud

    NASA Astrophysics Data System (ADS)

    Charbonneau, A.; Agarwal, A.; Anderson, M.; Armstrong, P.; Fransham, K.; Gable, I.; Harris, D.; Impey, R.; Leavett-Brown, C.; Paterson, M.; Podaima, W.; Sobie, R. J.; Vliet, M.

    2012-02-01

    We show that distributed Infrastructure-as-a-Service (IaaS) compute clouds can be effectively used for the analysis of high energy physics data. We have designed a distributed cloud system that works with any application using large input data sets requiring a high throughput computing environment. The system uses IaaS-enabled science and commercial clusters in Canada and the United States. We describe the process in which a user prepares an analysis virtual machine (VM) and submits batch jobs to a central scheduler. The system boots the user-specific VM on one of the IaaS clouds, runs the jobs and returns the output to the user. The user application accesses a central database for calibration data during the execution of the application. Similarly, the data is located in a central location and streamed by the running application. The system can easily run one hundred simultaneous jobs in an efficient manner and should scale to many hundreds and possibly thousands of user jobs.

  16. Preliminary analysis of hub and spoke air freight distribution system

    NASA Technical Reports Server (NTRS)

    Whitehead, A. H., Jr.

    1978-01-01

    A brief analysis is made of the hub and spoke air freight distribution system, which would employ fewer than 15 hub centers worldwide, with very large advanced distributed-load freighters providing the line-haul delivery between hubs. This system is compared to a more conventional network using conventionally designed long-haul freighters which travel between numerous major airports. The analysis calculates all of the transportation costs, including handling charges and pickup and delivery costs. The results show that the economics of the hub/spoke system are severely compromised by the extensive use of feeder aircraft to deliver cargo into and from the large freighter terminals. Not only are the higher costs for the smaller feeder airplanes disadvantageous, but their use implies an additional exchange of cargo between modes compared to truck delivery. The conventional system uses far fewer feeder airplanes, and in many cases, none at all. When feeder aircraft are eliminated from the hub/spoke system, however, that system is universally more economical than any conventional system employing smaller line-haul aircraft.

  17. A theoretical analysis of basin-scale groundwater temperature distribution

    NASA Astrophysics Data System (ADS)

    An, Ran; Jiang, Xiao-Wei; Wang, Jun-Zhi; Wan, Li; Wang, Xu-Sheng; Li, Hailong

    2015-03-01

    The theory of regional groundwater flow is critical for explaining heat transport by moving groundwater in basins. Domenico and Palciauskas's (1973) pioneering study on convective heat transport in a simple basin assumed that convection has a small influence on redistributing groundwater temperature. Moreover, there has been no research focused on the temperature distribution around stagnation zones among flow systems. In this paper, the temperature distribution in the simple basin is reexamined and that in a complex basin with nested flow systems is explored. In both basins, compared to the temperature distribution due to conduction, convection leads to a lower temperature in most parts of the basin except for a small part near the discharge area. There is a high-temperature anomaly around the basin-bottom stagnation point where two flow systems converge due to a low degree of convection and a long travel distance, but there is no anomaly around the basin-bottom stagnation point where two flow systems diverge. In the complex basin, there are also high-temperature anomalies around internal stagnation points. Temperature around internal stagnation points could be very high when they are close to the basin bottom, for example, due to the small permeability anisotropy ratio. The temperature distribution revealed in this study could be valuable when using heat as a tracer to identify the pattern of groundwater flow in large-scale basins. Domenico PA, Palciauskas VV (1973) Theoretical analysis of forced convective heat transfer in regional groundwater flow. Geological Society of America Bulletin 84:3803-3814

  18. Evaluation of Distribution Analysis Software for DER Applications

    SciTech Connect

    Staunton, RH

    2003-01-23

    providers will be forced to both fully acknowledge the trend and plan for accommodating DER [3]. With bureaucratic barriers [4], lack of time/resources, tariffs, etc. still seen in certain regions of the country, changes still need to be made. Given continued technical advances in DER, the time is fast approaching when the industry, nation-wide, must not only accept DER freely but also provide or review in-depth technical assessments of how DER should be integrated into and managed throughout the distribution system. Characterization studies are needed to fully understand how both the utility system and DER devices themselves will respond to all reasonable events (e.g., grid disturbances, faults, rapid growth, diverse and multiple DER systems, large reactive loads). Some of this work has already begun as it relates to operation and control of DER [5] and microturbine performance characterization [6,7]. One of the most urgently needed tools that can provide these types of analyses is a distribution network analysis program in combination with models for various DER. Together, they can be used for (1) analyzing DER placement in distribution networks and (2) helping to ensure that adequate transmission reliability is maintained. Surveys of the market show products that represent a partial match to these needs; specifically, software that has been developed to plan electrical distribution systems and analyze reliability (in a near total absence of DER). The first part of this study (Sections 2 and 3 of the report) looks at a number of these software programs and provides both summary descriptions and comparisons. The second part of this study (Section 4 of the report) considers the suitability of these analysis tools for DER studies. It considers steady state modeling and assessment work performed by ORNL using one commercially available tool on feeder data provided by a southern utility. Appendix A provides a technical report on the results of this modeling effort.

  19. Earthquake interevent time distribution in Kachchh, Northwestern India

    NASA Astrophysics Data System (ADS)

    Pasari, Sumanta; Dikshit, Onkar

    2015-08-01

    Statistical properties of earthquake interevent times have long been a topic of interest to seismologists and earthquake professionals, mainly for hazard-related concerns. In this paper, we present a comprehensive study of the temporal statistics of earthquake interoccurrence times in the seismically active Kachchh peninsula (western India), using thirteen probability distributions: exponential, gamma, lognormal, Weibull, Levy, Maxwell, Pareto, Rayleigh, inverse Gaussian (Brownian passage time), inverse Weibull (Frechet), exponentiated exponential, exponentiated Rayleigh (Burr type X), and exponentiated Weibull. Statistical inferences for the scale and shape parameters of these distributions are drawn from maximum likelihood estimation and the Fisher information matrices, the latter serving as a surrogate tool to appraise the parametric uncertainty in the estimation process. The results are based on two goodness-of-fit criteria: the maximum likelihood criterion with its modification to the Akaike information criterion (AIC), and the Kolmogorov-Smirnov (K-S) minimum distance criterion. These results reveal that (i) the exponential model provides the best fit, (ii) the gamma, lognormal, Weibull, inverse Gaussian, exponentiated exponential, exponentiated Rayleigh, and exponentiated Weibull models provide an intermediate fit, and (iii) the rest, namely Levy, Maxwell, Pareto, Rayleigh, and inverse Weibull, fit poorly to the earthquake catalog of Kachchh and its adjacent regions. This study also analyzes the present-day seismicity in terms of the estimated recurrence interval and conditional probability curves (hazard curves). The estimated cumulative probability and the conditional probability of a magnitude 5.0 or higher event reach 0.8-0.9 by 2027-2036 and 2034-2043, respectively. These values have significant implications in a variety of practical applications including earthquake insurance, seismic zonation
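
    The model-comparison step can be illustrated with SciPy: fit candidate distributions by maximum likelihood and rank them by AIC. Synthetic interevent times stand in for the Kachchh catalog below, and only four of the thirteen distributions are shown.

    ```python
    import numpy as np
    from scipy import stats

    # Synthetic interevent times (years) standing in for a real catalog.
    times = stats.expon.rvs(scale=2.5, size=300,
                            random_state=np.random.default_rng(7))

    candidates = {
        "exponential": stats.expon,
        "gamma": stats.gamma,
        "weibull": stats.weibull_min,
        "lognormal": stats.lognorm,
    }
    for name, dist in candidates.items():
        params = dist.fit(times, floc=0)       # MLE, location fixed at 0
        loglik = dist.logpdf(times, *params).sum()
        k = len(params) - 1                    # don't count the fixed loc
        print(f"{name:12s} AIC = {2 * k - 2 * loglik:8.1f}")
    ```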

  1. Phylogenetic analysis reveals a scattered distribution of autumn colours

    PubMed Central

    Archetti, Marco

    2009-01-01

    Background and Aims Leaf colour in autumn is rarely considered informative for taxonomy, but there is now growing interest in the evolution of autumn colours and different hypotheses are debated. Research efforts are hindered by the lack of basic information: the phylogenetic distribution of autumn colours. It is not known when and how autumn colours evolved. Methods Data are reported on the autumn colours of 2368 tree species belonging to 400 genera of the temperate regions of the world, and an analysis is made of their phylogenetic relationships in order to reconstruct the evolutionary origin of red and yellow in autumn leaves. Key Results Red autumn colours are present in at least 290 species (70 genera), and evolved independently at least 25 times. Yellow is present independently from red in at least 378 species (97 genera) and evolved at least 28 times. Conclusions The phylogenetic reconstruction suggests that autumn colours have been acquired and lost many times during evolution. This scattered distribution could be explained by hypotheses involving some kind of coevolutionary interaction or by hypotheses that rely on the need for photoprotection. PMID:19126636

  2. Silk Fiber Mechanics from Multiscale Force Distribution Analysis

    PubMed Central

    Cetinkaya, Murat; Xiao, Senbo; Markert, Bernd; Stacklies, Wolfram; Gräter, Frauke

    2011-01-01

    Here we decipher the molecular determinants for the extreme toughness of spider silk fibers. Our bottom-up computational approach incorporates molecular dynamics and finite element simulations. Therefore, the approach allows the analysis of the internal strain distribution and load-carrying motifs in silk fibers on scales of both molecular and continuum mechanics. We thereby dissect the contributions from the nanoscale building blocks, the soft amorphous and the strong crystalline subunits, to silk fiber mechanics. We identify the amorphous subunits not only to give rise to high elasticity, but to also ensure efficient stress homogenization through the friction between entangled chains, which also allows the crystals to withstand stresses as high as 2 GPa in the context of the amorphous matrix. We show that the maximal toughness of silk is achieved at 10–40% crystallinity depending on the distribution of crystals in the fiber. We also determined a serial arrangement of the crystalline and amorphous subunits in lamellae to outperform a random or a parallel arrangement, putting forward what we believe to be a new structural model for silk and other semicrystalline materials. The multiscale approach, not requiring any empirical parameters, is applicable to other partially ordered polymeric systems. Hence, it is an efficient tool for the design of artificial silk fibers. PMID:21354403

  3. Lacunarity and multifractal analysis of the large DLA mass distribution

    NASA Astrophysics Data System (ADS)

    Rodriguez-Romo, Suemi; Sosa-Herrera, Antonio

    2013-08-01

    We show the methodology used to analyze fractal and mass-multifractal properties of very large Diffusion-Limited Aggregation (DLA) clusters, with a maximum of 10^9 particles for 2D aggregates and 10^8 particles for 3D clusters, to support our main result: the scaling behavior obtained in our experiments corresponds to the expected behavior of monofractal objects. In order to estimate lacunarity measures for large DLA clusters, we develop a variant of the gliding-box algorithm which reduces the computer time needed to obtain experimental results. We show how our mass-multifractal data tend toward monofractal behavior for the mass distribution in the limit of very large clusters. For small cluster mass distributions, lacunarity analysis yields data that might be interpreted as two different fractal dimensions while the cluster grows; however, this effect tends to vanish as the cluster size increases, so that monofractality is achieved. The outcomes of this paper lead us to conclude that the previously reported mass-multifractal behavior (Vicsek et al., 1990 [13]) detected for DLA clusters is a consequence of finite-size effects and floating-point precision limitations, not an intrinsic feature of the phenomenon, since the scaling behavior of our DLA clusters corresponds to monofractal objects, a trend that is especially noticeable in the limit of very large clusters.
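
    For orientation, the standard gliding-box lacunarity estimator (Allain and Cloitre's algorithm), on which optimized variants such as the authors' build, can be sketched as follows; the image here is a synthetic random set, not a DLA cluster.

    ```python
    import numpy as np

    def lacunarity(img, box):
        """Gliding-box lacunarity of a binary image for one box size:
        Lambda(r) = <M^2> / <M>^2 over all positions of an r x r box,
        computed with a summed-area table."""
        s = np.pad(np.cumsum(np.cumsum(img, 0), 1), ((1, 0), (1, 0)))
        m = (s[box:, box:] - s[:-box, box:]
             - s[box:, :-box] + s[:-box, :-box])   # masses, all positions
        return float((m.astype(float) ** 2).mean() / m.mean() ** 2)

    rng = np.random.default_rng(3)
    img = (rng.random((256, 256)) < 0.1).astype(int)   # sparse random set
    for r in (2, 4, 8, 16):
        print(r, round(lacunarity(img, r), 3))
    ```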

  4. A Distributed Flocking Approach for Information Stream Clustering Analysis

    SciTech Connect

    Cui, Xiaohui; Potok, Thomas E

    2006-01-01

    Intelligence analysts are currently overwhelmed with the amount of information streams generated every day, and there is a lack of comprehensive tools that can analyze these streams in real time. Document clustering analysis plays an important role in improving the accuracy of information retrieval. However, most clustering technologies can only be applied to static document collections because they normally require large computational resources and a long time to obtain accurate results. It is very difficult to cluster dynamically changing text information streams on an individual computer. Our early research resulted in a dynamic reactive flock clustering algorithm which can continually refine the clustering result and quickly react to changes in document contents. This characteristic makes the algorithm suitable for clustering dynamically changing document information, such as text information streams. Because of the decentralized character of this algorithm, a distributed approach is a very natural way to increase its clustering speed. In this paper, we present a distributed multi-agent flocking approach for text information stream clustering and discuss the decentralized architectures and communication schemes for load balancing and status information synchronization in this approach.

  5. SATMC: Spectral energy distribution Analysis Through Markov Chains

    NASA Astrophysics Data System (ADS)

    Johnson, S. P.; Wilson, G. W.; Tang, Y.; Scott, K. S.

    2013-12-01

    We present the general purpose spectral energy distribution (SED) fitting tool SED Analysis Through Markov Chains (SATMC). Utilizing Markov chain Monte Carlo (MCMC) algorithms, SATMC fits an observed SED to SED templates or models of the user's choice to infer intrinsic parameters, generate confidence levels and produce the posterior parameter distribution. Here, we describe the key features of SATMC, from the underlying MCMC engine to specific features for handling SED fitting. We detail several test cases of SATMC, comparing results obtained from traditional least-squares methods, which highlight its accuracy, robustness and wide range of possible applications. We also present a sample of submillimetre galaxies (SMGs) that have been fitted using the SED synthesis routine GRASIL as input. In general, these SMGs are shown to occupy a large volume of parameter space, particularly in regards to their star formation rates, which range from ~30 to 3000 M⊙ yr^-1, and stellar masses, which range from ~10^10 to 10^12 M⊙. Taking advantage of the Bayesian formalism inherent to SATMC, we also show how the fitting results may change under different parametrizations (i.e. different initial mass functions) and through additional or improved photometry, the latter being crucial to the study of high-redshift galaxies.
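
    The MCMC engine at the core of such a fitter reduces to a random-walk Metropolis loop; SATMC adds template handling, priors, and convergence diagnostics on top of it. The toy "SED" below (a two-parameter power law) is hypothetical.

    ```python
    import numpy as np

    def metropolis(log_post, x0, steps=5000, prop_sigma=0.1, seed=0):
        """Random-walk Metropolis sampler over the posterior log_post."""
        rng = np.random.default_rng(seed)
        x = np.asarray(x0, dtype=float)
        lp = log_post(x)
        chain = []
        for _ in range(steps):
            cand = x + prop_sigma * rng.standard_normal(x.shape)
            lp_cand = log_post(cand)
            if np.log(rng.random()) < lp_cand - lp:  # accept w.p. min(1, ratio)
                x, lp = cand, lp_cand
            chain.append(x.copy())
        return np.array(chain)

    # Toy "SED": amplitude and slope of y = a * x**b from noisy photometry.
    xs = np.array([1.0, 2.0, 4.0, 8.0])
    ys = 3.0 * xs ** -1.5 + 0.05 * np.random.default_rng(1).standard_normal(4)
    log_post = lambda th: -0.5 * np.sum((ys - th[0] * xs ** th[1])**2 / 0.05**2)
    chain = metropolis(log_post, x0=[2.0, -1.0])
    print(chain[2500:].mean(axis=0))   # posterior means after burn-in
    ```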

  6. One-Dimensional Analysis Techniques for Pulsed Blowing Distribution

    NASA Astrophysics Data System (ADS)

    Chambers, Frank

    2005-11-01

    Pulsed blowing offers reductions in bleed air requirements for aircraft flow control. Efficient pulsed blowing systems require careful design to minimize bleed air use while distributing blowing to multiple locations. Pulsed blowing systems start with a steady flow supply and process it to generate a pulsatile flow. The fluid-acoustic dynamics of the system play an important role in overall effectiveness. One-dimensional analysis techniques that in the past have been applied to ventilation systems and internal combustion engines have been adapted to pulsed blowing. Pressure wave superposition and reflection are used with the governing equations of continuity, momentum and energy to determine particle velocities and pressures through the flow field. Simulations have been performed to find changes in the amplitude and wave shape as pulses are transmitted through a simple pulsed blowing system. A general-purpose code is being developed to simulate wave transmission and allow the determination of blowing system dynamic parameters.

  7. Phylogenetic analysis on the soil bacteria distributed in karst forest

    PubMed Central

    Zhou, JunPei; Huang, Ying; Mo, MingHe

    2009-01-01

    The phylogenetic composition of the bacterial community in soil of a karst forest was analyzed by a culture-independent molecular approach. The bacterial 16S rRNA gene was amplified directly from soil DNA and cloned to generate a library. After screening the clone library by RFLP, the 16S rRNA genes of representative clones were sequenced and the bacterial community was analyzed phylogenetically. The 16S rRNA gene inserts of 190 randomly selected clones were analyzed by RFLP and generated 126 different RFLP types. After sequencing, 126 non-chimeric sequences were obtained, generating 113 phylotypes. Phylogenetic analysis revealed that the bacteria distributed in soil of the karst forest included members assigned to Proteobacteria, Acidobacteria, Planctomycetes, Chloroflexi (green nonsulfur bacteria), Bacteroidetes, Verrucomicrobia, Nitrospirae, Actinobacteria (high-G+C Gram-positive bacteria), Firmicutes (low-G+C Gram-positive bacteria) and candidate divisions (including SPAM and GN08). PMID:24031430

  8. Effects of specimen size on the flexural strength and Weibull modulus of nuclear graphite IG-110, NBG-18, and PCEA

    NASA Astrophysics Data System (ADS)

    Chi, Se-Hwan

    2015-09-01

    Changes in flexural strength and Weibull modulus due to specimen size were investigated for three nuclear graphite grades, IG-110, NBG-18, and PCEA, using four-point 1/3-point (4-1/3) loading with specimens of three different sizes: 3.18 (thickness) × 6.35 (width) × 50.8 (length), 6.50 (T) × 12.0 (W) × 52.0 (L), and 18.0 (T) × 16.0 (W) × 64 (L) mm (210 specimens in total). Results showed that specimen size effects were grade dependent: while NBG-18 showed rather significant specimen size effects (37% difference between the 3 T and 18 T specimens), the differences in IG-110 and PCEA were 7.6-15%. The maximum differences in flexural strength due to specimen size were larger in PCEA and NBG-18, which have larger coke particles (medium grain size: >300 μm), than in IG-110, with its super-fine coke particle size (25 μm). The Weibull modulus showed a data population dependency, in that it decreased with increasing numbers of data used for modulus determination. A good correlation between fracture surface roughness and flexural strength was confirmed.
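
    The Weibull modulus reported above is commonly estimated from a linearized probability plot. The sketch below uses least squares on the empirical CDF with a median-rank-style estimator and hypothetical strength values; standards practice (e.g., ASTM C1239) instead prescribes maximum likelihood estimation with unbiasing factors.

    ```python
    import numpy as np

    def weibull_modulus(strengths):
        """Estimate Weibull modulus m and characteristic strength s0 from
        the linearized two-parameter Weibull CDF:
            ln(-ln(1 - F_i)) = m * ln(s_i) - m * ln(s0),
        with the rank estimator F_i = (i - 0.5) / n."""
        s = np.sort(np.asarray(strengths, dtype=float))
        f = (np.arange(1, len(s) + 1) - 0.5) / len(s)
        m, c = np.polyfit(np.log(s), np.log(-np.log(1.0 - f)), 1)
        return m, np.exp(-c / m)

    # Hypothetical flexure strengths (MPa) for one specimen size.
    data = [31.2, 33.5, 34.1, 35.0, 36.3, 37.2, 38.0, 39.4, 40.1, 42.6]
    m, s0 = weibull_modulus(data)
    print(f"m = {m:.2f}, characteristic strength = {s0:.1f} MPa")
    ```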

  9. Conductance Distributions for Empirical Orthogonal Function Analysis and Optimal Interpolation

    NASA Astrophysics Data System (ADS)

    Knipp, Delores; McGranaghan, Ryan; Matsuo, Tomoko

    2016-04-01

    We show the first characterizations of the primary modes of ionospheric Hall and Pedersen conductance variability as empirical orthogonal functions (EOFs). These are derived from six satellite-years of Defense Meteorological Satellite Program (DMSP) particle data acquired during the rise of solar cycles 22 and 24. The 60 million DMSP spectra were each processed through the Global Airglow Model. This is the first large-scale analysis of ionospheric conductances completely free of assumptions about the incident electron energy spectra. We show that the mean patterns and first four EOFs capture ~50.1% and 52.9% of the total Pedersen and Hall conductance variability, respectively. The mean patterns and first EOFs are consistent with typical diffuse auroral oval structures and quiet-time strengthening/weakening of the mean pattern. The second and third EOFs show major disturbance features of magnetosphere-ionosphere (MI) interactions: geomagnetically induced auroral zone expansion in EOF2 and the auroral substorm current wedge in EOF3. The fourth EOFs suggest diminished conductance associated with the ionospheric substorm recovery mode. These EOFs are then used in a new optimal interpolation (OI) technique to estimate complete high-latitude ionospheric conductance distributions. The technique combines particle-precipitation-based calculations of ionospheric conductances and their errors with a background model and its error covariance (estimated by EOF analysis) to infer complete distributions of the high-latitude ionospheric conductances for a week in late 2011. The OI technique captures smaller-scale ionospheric conductance features associated with discrete precipitation and brings ground- and space-based data into closer agreement. We show quantitatively and qualitatively that this new technique provides better ionospheric conductance specification than past statistical models, especially during heightened geomagnetic activity.
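
    The EOF decomposition itself reduces to a singular value decomposition of the mean-removed data matrix. The sketch below runs on synthetic "conductance maps" and is illustrative only, not the DMSP processing chain.

    ```python
    import numpy as np

    def eof(data):
        """EOF analysis of a (samples x gridpoints) matrix: remove the
        mean map, take the SVD, and return the spatial modes together
        with the fraction of variance each mode explains."""
        anom = data - data.mean(axis=0)
        _, sv, vt = np.linalg.svd(anom, full_matrices=False)
        return vt, sv**2 / np.sum(sv**2)   # rows of vt are EOF patterns

    rng = np.random.default_rng(5)
    # Synthetic maps: one dominant spatial pattern plus noise.
    pattern = np.sin(np.linspace(0, np.pi, 50))
    maps = (np.outer(rng.standard_normal(500), pattern)
            + 0.3 * rng.standard_normal((500, 50)))
    modes, frac = eof(maps)
    print("variance explained by first four EOFs:", np.round(frac[:4], 3))
    ```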

  10. Prediction of the Inert Strength Distribution of Si3N4 Diesel Valves

    SciTech Connect

    Andrews, M.J.; Breder, K.; Wereszczak, A.A.

    1999-01-25

    Censored Weibull strength distributions were generated with NT551 silicon nitride four-point flexure data using ASTM C1161-B and 5.0 mm diameter cylindrical specimens. Utilizing finite element models and AlliedSignal's life prediction codes, the inert (fast fracture) strength failure probability of a ceramic diesel valve was estimated from these data sets. The failure probability predictions derived from each data set were found to be more conservative than the valve strength data. Fractographic analysis of the test specimens and valves showed that the cylindrical specimens failed from a different flaw population than the prismatic flexure bars and the valves. The study emphasizes the prerequisite of having coincident flaw populations homogeneously distributed in both the test specimen and the ceramic component. Lastly, it suggests that unless material homogeneity exists, any meaningful life prediction or reliability analysis of a component may not be possible.