
1

q-exponential, Weibull, and q-Weibull distributions: an empirical analysis

NASA Astrophysics Data System (ADS)

In a comparative study, the q-exponential and Weibull distributions are employed to investigate frequency distributions of basketball baskets, cyclone victims, brand-name drugs by retail sales, and highway length. To analyze the intermediate cases, the q-Weibull distribution, which interpolates between the q-exponential and Weibull distributions, is introduced. It is verified that the basketball baskets distribution is well described by a q-exponential, whereas the cyclone victims and brand-name drugs by retail sales distributions are better fitted by a Weibull distribution. For highway length, however, neither the q-exponential nor the Weibull distribution gives a satisfactory fit, making it necessary to employ the q-Weibull distribution. Furthermore, this interpolating distribution sheds light on the controversy between the stretched exponential and the inverse power law (q-exponential with q > 1).
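The interpolation can be made concrete with a short sketch. The q-exponential function e_q(x) = [1 + (1 − q)x]^(1/(1−q)) reduces to exp(x) as q → 1, so a q-Weibull survival function of the form e_q(−(x/λ)^k) recovers the ordinary Weibull at q = 1 and the q-exponential at k = 1. A minimal Python sketch; the parameter names and test values here are illustrative assumptions, not fitted to any of the datasets above:

```python
import math

def q_exp(x, q):
    """q-exponential: reduces to exp(x) as q -> 1; cut off at zero otherwise."""
    if abs(q - 1.0) < 1e-12:
        return math.exp(x)
    base = 1.0 + (1.0 - q) * x
    return base ** (1.0 / (1.0 - q)) if base > 0 else 0.0

def q_weibull_sf(x, q, lam, k):
    """Survival function of the q-Weibull: q = 1 gives the Weibull,
    k = 1 gives the q-exponential."""
    return q_exp(-((x / lam) ** k), q)

# Limiting case: at q = 1 this must match the plain Weibull survival function
weibull_sf = math.exp(-((2.0 / 1.5) ** 0.8))
assert abs(q_weibull_sf(2.0, 1.0, 1.5, 0.8) - weibull_sf) < 1e-9
```

For q > 1 the tail of `q_weibull_sf` decays as a power law rather than a stretched exponential, which is exactly the distinction the abstract refers to.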

Picoli, S.; Mendes, R. S.; Malacarne, L. C.

2003-06-01

2

Using Weibull Distribution Analysis to Evaluate ALARA Performance

As Low as Reasonably Achievable (ALARA) is the underlying principle for protecting nuclear workers from potential health outcomes related to occupational radiation exposure. Radiation protection performance is currently evaluated by measures such as collective dose and average measurable dose, which do not indicate ALARA performance. The purpose of this work is to show how statistical modeling of individual doses using the Weibull distribution can provide objective supplemental performance indicators for comparing ALARA implementation among sites and for insights into ALARA practices within a site. Maximum likelihood methods were employed to estimate the Weibull shape and scale parameters used for performance indicators. The shape parameter reflects the effectiveness of maximizing the number of workers receiving lower doses and is represented as the slope of the fitted line on a Weibull probability plot. Additional performance indicators derived from the model parameters include the 99th percentile and the exceedance fraction. When grouping sites by collective total effective dose equivalent (TEDE) and ranking by 99th percentile with confidence intervals, differences in performance among sites can be readily identified. Applying this methodology will enable more efficient and complete evaluation of the effectiveness of ALARA implementation.
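Two of the indicators mentioned follow directly from the fitted parameters: the 99th percentile is the Weibull quantile at p = 0.99, and the exceedance fraction is the model's predicted fraction of doses above a limit. A short sketch (the shape and scale values below are hypothetical, not from the study):

```python
import math

def weibull_quantile(p, shape, scale):
    """Inverse CDF of the two-parameter Weibull."""
    return scale * (-math.log(1.0 - p)) ** (1.0 / shape)

def exceedance_fraction(limit, shape, scale):
    """Model-predicted fraction of doses exceeding `limit`."""
    return math.exp(-((limit / scale) ** shape))

# Hypothetical fitted parameters for one site (illustrative units):
shape, scale = 0.7, 50.0
p99 = weibull_quantile(0.99, shape, scale)
# By construction, 1% of doses are expected to exceed p99
assert abs(exceedance_fraction(p99, shape, scale) - 0.01) < 1e-9
```

Ranking sites by `p99` (with confidence intervals from the likelihood fit) is the comparison strategy the abstract describes.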

E. L. Frome, J. P. Watkins, and D. A. Hagemeyer

2009-10-01

3

Reliability analysis of structural ceramic components using a three-parameter Weibull distribution

NASA Technical Reports Server (NTRS)

Described here are nonlinear regression estimators for the three-parameter Weibull distribution. Issues relating to the bias and invariance associated with these estimators are examined numerically using Monte Carlo simulation methods. The estimators were used to extract parameters from sintered silicon nitride failure data. A reliability analysis was performed on a turbopump blade utilizing the three-parameter Weibull distribution and the estimates from the sintered silicon nitride data.

Duffy, Stephen F.; Powers, Lynn M.; Starlinger, Alois

1992-01-01

4

Reliability analysis of structural ceramic components using a three-parameter Weibull distribution

NASA Technical Reports Server (NTRS)

Described here are nonlinear regression estimators for the three-parameter Weibull distribution. Issues relating to the bias and invariance associated with these estimators are examined numerically using Monte Carlo simulation methods. The estimators were used to extract parameters from sintered silicon nitride failure data. A reliability analysis was performed on a turbopump blade utilizing the three-parameter Weibull distribution and the estimates from the sintered silicon nitride data.

Duffy, Stephen F.; Powers, Lynn M.; Starlinger, Alois

1992-01-01

5

Modern estimation of the parameters of the Weibull wind speed distribution for wind energy analysis

Three methods for calculating the parameters of the Weibull wind speed distribution for wind energy analysis are presented: the maximum likelihood method, the proposed modified maximum likelihood method, and the commonly used graphical method. The application of each method is demonstrated using a sample wind speed data set, and a comparison of the accuracy of each method is also performed.
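The maximum likelihood method referred to solves the standard Weibull likelihood equations iteratively; a common fixed-point form is 1/k = Σ v_i^k ln v_i / Σ v_i^k − (1/n) Σ ln v_i, with c = ((1/n) Σ v_i^k)^(1/k) once k has converged. A stdlib-only sketch on synthetic data (not the paper's sample; the damping factor is our addition for robustness):

```python
import math
import random

def weibull_mle(v, iters=200):
    """Damped fixed-point iteration for the Weibull MLE of shape k and scale c."""
    n = len(v)
    ln_v = [math.log(x) for x in v]
    mean_ln = sum(ln_v) / n
    k = 1.0  # initial guess
    for _ in range(iters):
        vk = [x ** k for x in v]
        k_new = 1.0 / (sum(w * l for w, l in zip(vk, ln_v)) / sum(vk) - mean_ln)
        k = 0.5 * k + 0.5 * k_new  # damping stabilizes the iteration
    c = (sum(x ** k for x in v) / n) ** (1.0 / k)
    return k, c

# Synthetic wind speeds from Weibull(k=2, c=6) via inverse-CDF sampling
random.seed(0)
speeds = [6.0 * (-math.log(1.0 - random.random())) ** 0.5 for _ in range(5000)]
k_hat, c_hat = weibull_mle(speeds)
```

With 5000 observations the estimates land close to the true (k, c) = (2, 6), which is the kind of accuracy comparison the paper performs across its three methods.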

J. V. Seguro; T. W. Lambert

2000-01-01

6

NASA Technical Reports Server (NTRS)

The suitability of the two-parameter Weibull distribution for describing highly censored cat motion sickness latency data was evaluated by estimating the parameters with the maximum likelihood method and testing for goodness of fit with the Kolmogorov-Smirnov statistic. A procedure for determining confidence levels and testing for significance of the difference between Weibull parameters is described. Computer programs for these procedures may be obtained from an archival source.
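The Kolmogorov-Smirnov statistic itself is straightforward to compute once parameters are in hand. The sketch below does so for a fully specified Weibull on synthetic, uncensored data; this is a simplification, since the paper's latency data were highly censored and fitting the parameters first also changes the critical values:

```python
import math
import random

def weibull_cdf(x, shape, scale):
    return 1.0 - math.exp(-((x / scale) ** shape))

def ks_statistic(data, cdf):
    """One-sample Kolmogorov-Smirnov statistic D for a fully specified CDF."""
    xs = sorted(data)
    n = len(xs)
    d = 0.0
    for i, x in enumerate(xs, start=1):
        f = cdf(x)
        d = max(d, i / n - f, f - (i - 1) / n)
    return d

random.seed(1)
# Synthetic "latency" data from Weibull(shape=1.5, scale=10); not the cat data
lat = [10.0 * (-math.log(1.0 - random.random())) ** (1.0 / 1.5) for _ in range(1000)]
D = ks_statistic(lat, lambda x: weibull_cdf(x, 1.5, 10.0))
```

For data actually drawn from the hypothesized distribution, D stays small (well below the 5% critical value of roughly 1.36/√n), so the fit is not rejected.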

Park, Won J.; Crampton, George H.

1988-01-01

7

NASA Technical Reports Server (NTRS)

The Weibull distribution has been widely adopted for the statistical description and inference of fatigue data. This document provides user instructions, examples, and verification for software to analyze gear fatigue test data. The software was developed presuming the data are adequately modeled using a two-parameter Weibull distribution. The calculations are based on likelihood methods, and the approach taken is valid for data that include type I censoring. The software was verified by reproducing results published by others.

Krantz, Timothy L.

2002-01-01

8

NASA Technical Reports Server (NTRS)

The Weibull distribution has been widely adopted for the statistical description and inference of fatigue data. This document provides user instructions, examples, and verification for software to analyze gear fatigue test data. The software was developed presuming the data are adequately modeled using a two-parameter Weibull distribution. The calculations are based on likelihood methods, and the approach taken is valid for data that include type I censoring. The software was verified by reproducing results published by others.

Krantz, Timothy L.

2002-01-01

9

Estimation problems associated with the Weibull distribution

Series in descending powers of the sample size are developed for the moments of the coefficient of variation v* for the Weibull distribution F(t) = 1 − exp(−(t/b)^c). A similar series for the moments of the estimator c* of the shape parameter c is derived from these. Comparisons are made with basic asymptotic assessments for the means and variances. From the first four moments, approximations are given to the distributions of v* and c*. In addition, an almost unbiased estimator of c is given when a sample is provided with the value of v*. Comments are given on the validity of the asymptotically normal assessments of the distributions.
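The link between the coefficient of variation v* and the shape parameter c exploited here is that, for the Weibull, the CV depends on c alone: CV²(c) = Γ(1 + 2/c)/Γ(1 + 1/c)² − 1. An estimator of c can therefore be obtained by inverting this monotone map, e.g. by bisection. This sketch shows the relationship only; it is not the paper's series expansions:

```python
import math

def weibull_cv(c):
    """Coefficient of variation of a Weibull with shape c (scale cancels out)."""
    g1 = math.gamma(1.0 + 1.0 / c)
    g2 = math.gamma(1.0 + 2.0 / c)
    return math.sqrt(g2 / g1 ** 2 - 1.0)

def shape_from_cv(cv, lo=0.1, hi=50.0, tol=1e-10):
    """Invert cv -> c by bisection; weibull_cv is strictly decreasing in c."""
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if weibull_cv(mid) > cv:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Sanity check: c = 1 is the exponential distribution, whose CV is exactly 1
assert abs(weibull_cv(1.0) - 1.0) < 1e-9
```

Feeding a sample CV into `shape_from_cv` yields the moment-style estimate of c whose bias and distribution the paper analyzes.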

Bowman, K O; Shenton, L R

1981-09-01

10

Weibull model of Multiplicity Distribution in hadron-hadron collisions

We introduce the Weibull distribution as a simple parametrization of charged particle multiplicities in hadron-hadron collisions at all available energies, ranging from ISR energies to the most recent LHC energies. In statistics, the Weibull distribution has wide applicability in natural processes involving fragmentation. This gives a natural connection to the available state-of-the-art models for multi-particle production in hadron-hadron collisions involving QCD parton fragmentation and hadronization.

Sadhana Dash; Basanta K. Nandi

2014-09-19

11

A wind energy analysis of Grenada: an estimation using the ‘Weibull’ density function

The Weibull density function has been used to estimate the wind energy potential in Grenada, West Indies. Based on historic recordings of mean hourly wind velocity, this analysis shows the importance of incorporating the variation in wind energy potential during diurnal cycles. Wind energy assessments that are based on a Weibull distribution using average daily/seasonal wind speeds fail to acknowledge that

D Weisser

2003-01-01

12

Weibull Analysis of Mechanical Data for Castings II: Weibull Mixtures and Their Interpretation

NASA Astrophysics Data System (ADS)

The interpretation of Weibull probability plots of mechanical testing data from castings was discussed in Part I (M. Tiryakioğlu, J. Campbell: Metall. Mater. Trans. A, 41 (2010) 3121-3129). In Part II, details about the mathematical models of Weibull mixtures are introduced. The links between the occurrence of Weibull mixtures and casting process parameters are discussed. Worked examples are introduced in five case studies in which six datasets from the literature were reanalyzed. Results show that tensile and fatigue life data should be interpreted differently. In tensile data, Weibull mixtures are due to two distinct defect distributions, namely "old" and "young" bifilms, which are a result of prior processing and mold filling, respectively. "Old" bifilms are the predominant defect and result in the lower distribution, whereas "young" bifilms result in the upper distribution. In fatigue life data, Weibull mixtures are due to two failure mechanisms being active: failure due to cracks initiating from surface defects and from interior defects. Surface defects are predominant, and interior defects lead to fatigue failure only when there are no cracks initiated by surface defects. In all cases, only the mutually exclusive Weibull mixture model was found to be applicable.
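A two-component Weibull mixture of the kind discussed has density f(x) = p f₁(x) + (1 − p) f₂(x), where p is the fraction of specimens whose strength is governed by the first defect population. A sketch with hypothetical parameters (not taken from the case studies); the numerical integral checks that the mixture density is properly normalized:

```python
import math

def weibull_pdf(x, shape, scale):
    """Density of the two-parameter Weibull."""
    z = (x / scale) ** shape
    return (shape / scale) * (x / scale) ** (shape - 1.0) * math.exp(-z)

def mixture_pdf(x, p, params1, params2):
    """Mutually exclusive two-component Weibull mixture: a fraction p of
    specimens belongs to population 1, the rest to population 2."""
    return p * weibull_pdf(x, *params1) + (1.0 - p) * weibull_pdf(x, *params2)

# Hypothetical "lower" and "upper" strength populations, mixing fraction 0.6;
# midpoint-rule integration of the density over [0, 10]
area = sum(mixture_pdf(0.01 * i + 0.005, 0.6, (3.0, 1.0), (5.0, 2.0)) * 0.01
           for i in range(1000))
```

On a Weibull probability plot such a mixture produces the characteristic kink that motivates fitting two populations rather than one.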

Tiryakioğlu, Murat

2015-01-01

13

Weibull distribution based on maximum likelihood with interval inspection data

NASA Technical Reports Server (NTRS)

The two Weibull parameters based upon the method of maximum likelihood are determined. The test data used were failures observed at inspection intervals. The application was the reliability analysis of the SSME oxidizer turbine blades.

Rheinfurth, M. H.

1985-01-01

14

An EOQ Model for Items with Weibull Distribution Deterioration

An inventory model is considered for deteriorating items with a variable rate of deterioration, where deterioration means decay, damage or spoilage such that the item cannot be used for its original purpose. Specifically, the Weibull distribution is used to represent the distribution of the time to deterioration. The EOQ formula is derived under conditions of constant demand, instantaneous delivery and

Richard P. Covert; George C. Philip

1973-01-01

15

Investigation of Weibull statistics in fracture analysis of cast aluminum

NASA Technical Reports Server (NTRS)

The fracture strengths of two large batches of A357-T6 cast aluminum coupon specimens were compared by using two-parameter Weibull analysis. The minimum number of these specimens necessary to find the fracture strength of the material was determined. The applicability of three-parameter Weibull analysis was also investigated. A design methodology based on the combination of elementary stress analysis and Weibull statistical analysis is advanced and applied to the design of a spherical pressure vessel shell. The results from this design methodology are compared with results from the applicable ASME pressure vessel code.

Holland, Frederic A., Jr.; Zaretsky, Erwin V.

1989-01-01

16

NASA Technical Reports Server (NTRS)

A Bayesian inference process for system logistical planning is presented which provides a method for incorporating actual failures with prediction data for ongoing and improving reliability estimates. The process uses the Weibull distribution and provides a means for examining and updating logistical and maintenance support needs.

Giuntini, Michael E.; Giuntini, Ronald E.

1991-01-01

17

Lower bound on reliability for Weibull distribution when shape parameter is not estimated accurately

NASA Technical Reports Server (NTRS)

The mathematical relationships between the shape parameter Beta and estimates of reliability and a life limit lower bound for the two parameter Weibull distribution are investigated. It is shown that under rather general conditions, both the reliability lower bound and the allowable life limit lower bound (often called a tolerance limit) have unique global minimums over a range of Beta. Hence lower bound solutions can be obtained without assuming or estimating Beta. The existence and uniqueness of these lower bounds are proven. Some real data examples are given to show how these lower bounds can be easily established and to demonstrate their practicality. The method developed here has proven to be extremely useful when using the Weibull distribution in analysis of no-failure or few-failures data. The results are applicable not only in the aerospace industry but anywhere that system reliabilities are high.
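The idea can be illustrated with a simplified stand-in: for a zero-failure test, a classical one-sided (Weibayes-style) lower confidence bound on reliability at mission time t is R_L(t; β) = exp(−t^β ln(1/(1 − C)) / Σ t_i^β) for an assumed shape β, and a β-free bound is its minimum over a range of β. The bound formula and the numbers below are illustrative assumptions, not the paper's derivation:

```python
import math

def reliability_lower_bound(t, test_times, conf, beta):
    """One-sided lower confidence bound on reliability at mission time t for a
    Weibull with assumed shape beta, given a zero-failure test (Weibayes-style
    bound; a simplified stand-in for the paper's method)."""
    t_beta = sum(ti ** beta for ti in test_times)
    return math.exp(-(t ** beta) * math.log(1.0 / (1.0 - conf)) / t_beta)

def min_over_beta(t, test_times, conf, betas):
    """Global minimum of the bound over a grid of candidate shape values."""
    return min(reliability_lower_bound(t, test_times, conf, b) for b in betas)

# Hypothetical test: 20 units each run failure-free to 1000 h, mission 500 h
betas = [0.5 + 0.1 * i for i in range(51)]   # beta grid over [0.5, 5.5]
r_lo = min_over_beta(500.0, [1000.0] * 20, 0.90, betas)
```

The minimized value is the conservative reliability statement that holds without assuming or estimating β, which is the practical payoff the abstract describes for no-failure data.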

Huang, Zhaofeng; Porter, Albert A.

1990-01-01

18

Evaluation of Methods to Predict Weibull Parameters for Characterizing Diameter Distributions

This study developed and evaluated different methods of predicting the parameters of the Weibull distribution to characterize diameter classes instead of individual trees as in the CDFR approach. These methods were evaluated based on four

Cao, Quang V.

19

Weibull analysis applied to the pull adhesion test and fracture of a metal-ceramic interface

Various adhesion tests have been developed to measure the mechanical bonding of thin coatings deposited on substrates. In the pull test, pins that have been bonded to the coating under test are pulled with increasing force normal to the coating until the coating is pulled from the substrate. For many systems, large scatter in the data is often observed due to uncontrolled defects in the interface and the brittle nature of the pull test. In this study, the applicability of Weibull statistics to the analysis of adhesion of Ag films to vacuum sputter-cleaned zirconia was examined. Data were obtained for smooth and rough substrates for various levels of adhesion. A good fit of the data to the Weibull distribution was observed. The Weibull modulus was found to depend on the roughness of the substrate, but was insensitive to the adhesion strength.

Erck, R.A.; Nichols, F.A. [Argonne National Lab., IL (United States). Materials and Components Technology Div.]; Schult, D.L. [Illinois Univ., Urbana, IL (United States)]

1992-11-01

20

NASA Astrophysics Data System (ADS)

We consider the Yule-type multiplicative growth and division process, and describe the ubiquitous emergence of Weibull and log-normal distributions in a single framework. With the help of the integral transform and series expansion, we show that both distributions serve as asymptotic solutions of the time evolution equation for the branching process. In particular, the maximum likelihood method is employed to discriminate between the emergence of the Weibull distribution and that of the log-normal distribution. Further, the detailed conditions for the distinguished emergence of the Weibull distribution are probed. It is observed that the emergence depends on the manner of the division process for the two different types of distribution. Numerical simulations are also carried out, confirming the results obtained analytically.

Goh, Segun; Kwon, H. W.; Choi, M. Y.

2014-06-01

21

Large-Scale Weibull Analysis of H-451 Nuclear-Grade Graphite Specimen Rupture Data

NASA Technical Reports Server (NTRS)

A Weibull analysis was performed of the strength distribution and size effects for 2000 specimens of H-451 nuclear-grade graphite. The data, generated elsewhere, measured the tensile and four-point-flexure room-temperature rupture strength of specimens excised from a single extruded graphite log. Strength variation was compared with specimen location, size, and orientation relative to the parent body. In our study, data were progressively and extensively pooled into larger data sets to discriminate overall trends from local variations and to investigate the strength distribution. The CARES/Life and WeibPar codes were used to investigate issues regarding the size effect, Weibull parameter consistency, and nonlinear stress-strain response. Overall, the Weibull distribution described the behavior of the pooled data very well. However, the issue regarding the smaller-than-expected size effect remained. This exercise illustrated that a conservative approach using a two-parameter Weibull distribution is best for designing graphite components with low probability of failure for the in-core structures in the proposed Generation IV (Gen IV) high-temperature gas-cooled nuclear reactors. This exercise also demonstrated the continuing need to better understand the mechanisms driving stochastic strength response. Extensive appendixes are provided with this report to show all aspects of the rupture data and analytical results.

Nemeth, Noel N.; Walker, Andrew; Baker, Eric H.; Murthy, Pappu L.; Bratton, Robert L.

2012-01-01

22

Estimating trends in data from the Weibull and a generalized extreme value distribution

NASA Astrophysics Data System (ADS)

Where changes in hydrologic regime occur, whether as a result of change in land use or climate, statistical procedures are needed to test for the existence of trend in hydrological data, particularly those expected to follow extreme value distributions such as annual peak discharges, annual minimum flows, and annual maximum rainfall intensities. Furthermore, where trend is detected, its magnitude must also be estimated. A later paper [Clarke, 2002] will consider the estimation of trends in Gumbel data; the present paper gives results on tests for the significance of trends in annual and minimum discharges, where these can be assumed to follow a Weibull distribution. The statistical procedures, already fully established in the statistical analysis of survival data, convert the problem into one in which a generalized linear model is fitted to a power-transformed variable having Poisson distribution and calculates the trend coefficients (as well as the parameter in the power transform) by maximum likelihood. The methods are used to test for trend in annual minimum flows over a 19-year period in the River Paraguay at Cáceres, Brazil, and in monthly flows at the same site. Extension of the procedure to testing for trend in data following a generalized extreme value distribution is also discussed. Although a test for time trend in Weibull-distributed hydrologic data is the motivation for this paper, the same approach can be applied in the analysis of data sequences that can be regarded as stationary in time, for which the objective is to explore relationships between a Weibull variate and other variables (covariates) that explain its behavior.

Clarke, Robin T.

2002-06-01

23

Estimation of the parameters of the Weibull distribution from multi-censored samples

A thesis by Edgar Eugene Sprinkle, III, submitted to the Graduate College of Texas A&M University in partial fulfillment of the requirements for the degree of Master of Science, May 1969. Major subject: Statistics.

Sprinkle, Edgar Eugene

1969-01-01

24

Predictive Failure of Cylindrical Coatings Using Weibull Analysis

NASA Technical Reports Server (NTRS)

Rotating, coated wiping rollers used in a high-speed printing application failed primarily from fatigue. Two coating materials were evaluated: a hard, cross-linked, plasticized polyvinyl chloride (PVC) and a softer, plasticized PVC. A total of 447 tests was conducted with these coatings in a production facility. The data were evaluated using Weibull analysis. The softer coating produced more than twice the life of the harder cross-linked coating and reduced the wiper replacement rate by two-thirds, resulting in minimum production interruption.
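In a Weibull comparison like this, lives are typically compared at a fixed percentile; the L50 (median) life follows directly from the shape β and scale η as L50 = η (ln 2)^(1/β). A sketch with hypothetical parameters (not the study's fitted values):

```python
import math

def weibull_median_life(shape, scale):
    """L50 (median) life of a two-parameter Weibull."""
    return scale * math.log(2.0) ** (1.0 / shape)

# Hypothetical fitted parameters for the two coatings, in hours;
# equal shapes mean the life ratio is just the scale ratio
hard = weibull_median_life(shape=1.4, scale=120.0)
soft = weibull_median_life(shape=1.4, scale=260.0)
ratio = soft / hard
```

With a common shape, a doubled scale translates directly into a doubled median life, the kind of "more than twice the life" statement the abstract reports.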

Vlcek, Brian L.; Hendricks, Robert C.; Zaretsky, Erwin V.

2002-01-01

25

The suitability of the Normal, Log-normal and three-parameter Weibull probability density functions to model diameter distributions of neem (Azadirachta indica A. Juss.) grown in individual and community plantations in the Tamale Forest District was investigated. The Weibull parameters were estimated by the Maximum likelihood, Moments and Percentile methods. The maximum likelihood estimators (MLE) and moments estimators (ME) were better predictors

David M. Nanang

1998-01-01

26

NASA Technical Reports Server (NTRS)

Material characterization parameters obtained from naturally flawed specimens are necessary for reliability evaluation of non-deterministic advanced ceramic structural components. The least squares best fit method is applied to the three parameter uniaxial Weibull model to obtain the material parameters from experimental tests on volume or surface flawed specimens subjected to pure tension, pure bending, four point or three point loading. Several illustrative example problems are provided.

Gross, Bernard

1996-01-01

27

Composite Weibull-Inverse Transformed Gamma distribution and its actuarial application

NASA Astrophysics Data System (ADS)

This paper introduces a new composite model, namely, the composite Weibull-Inverse Transformed Gamma distribution, which assumes a Weibull distribution for the head up to a specified threshold and an inverse transformed gamma distribution beyond it. The closed form of the probability density function (pdf) as well as the estimation of parameters by the maximum likelihood method is presented. The model is compared with several benchmark distributions and their performances are measured. A well-known data set, the Danish fire loss data, is used for this purpose, and its Value at Risk (VaR) under the new model is computed. In comparison to several standard models, the composite Weibull-Inverse Transformed Gamma model proved to be a competitive candidate.

Maghsoudi, Mastoureh; Bakar, Shaiful Anuar Abu; Hamzah, Nor Aishah

2014-07-01

28

Weibull wind speed distribution: Numerical considerations and use with sodar data

NASA Astrophysics Data System (ADS)

Two analyses have been performed of the use of the Weibull distribution to describe wind speed statistics. The first is a combination of theoretical considerations in a common domain of the c and k parameters concerning some robust indicators of position, spread, skewness, and kurtosis. The second is a calculation of the Weibull parameters using three differing methods based on a three-year sodar database. The modified maximum-likelihood method is direct, the method of weighted probability moments considers order statistics, and the method based on the minimum RMSE is iterative. As a result of the theoretical analyses, we propose some simple relationships involving the Weibull parameters and the range of a fraction of central data, the variation coefficient, and the Yule-Kendall index, which may be applied practically. The calculation of the Weibull parameters revealed a sharp contrast between day, where the fit was highly satisfactory, and night, mainly below 300 m. Moreover, a seasonal pattern was also observed. The comparison between the methods used also proved satisfactory, particularly during the day, whereas a slight disagreement was observed during the night for the method based on the minimum RMSE. Finally, randomly generated samples were used to check the accuracy of the Weibull parameters in the domain analyzed, resulting in small residuals and standard deviations of the values calculated.
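For comparison with the three methods examined, a widely used empirical moment approximation (due to Justus) estimates the shape from the coefficient of variation as k ≈ (σ/v̄)^(−1.086) and then the scale as c = v̄/Γ(1 + 1/k). This method is brought in here for illustration; it is not one of the paper's three:

```python
import math
import random
import statistics

def weibull_params_from_moments(speeds):
    """Justus moment approximation for Weibull shape k and scale c."""
    mean = statistics.fmean(speeds)
    sd = statistics.stdev(speeds)
    k = (sd / mean) ** -1.086
    c = mean / math.gamma(1.0 + 1.0 / k)
    return k, c

# Synthetic wind speeds from Weibull(k=2, c=5) via inverse-CDF sampling
random.seed(2)
speeds = [5.0 * (-math.log(1.0 - random.random())) ** 0.5 for _ in range(4000)]
k_est, c_est = weibull_params_from_moments(speeds)
```

Despite being a closed-form approximation, it recovers (k, c) close to the true (2, 5) on a sample of this size, which is why moment-style estimators are common baselines in wind studies.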

Pérez, Isidro A.; Sánchez, M. Luisa; García, M. Ángeles

2007-10-01

29

Bamboo is a composite material with radial gradient variation, but the vascular bundles in the inner layer are evenly distributed. The objective is to determine the regular size pattern and to perform a Weibull statistical analysis of the vascular bundle tensile strength in the inner layer of Moso bamboo. The size and shape of the vascular bundles in the inner layer are similar, with an average area of about 0.1550 mm². A statistical evaluation of the tensile strength of the vascular bundles was conducted by means of Weibull statistics; the results show that the Weibull modulus m is 6.1121, and an accurate reliability assessment of the vascular bundles is determined. PMID:25016270

Le, Cui; Wanxi, Peng; Zhengjun, Sun; Lili, Shang; Guoning, Chen

2014-07-01

30

Optimal conditional confidence interval for the shape parameter of a Weibull distribution

An optimal two-sided conditional confidence interval for the shape parameter of a Weibull probability distribution is constructed. The construction follows the rejection of a preliminary test of significance for the null hypothesis that the shape parameter equals a fixed value. The bounds are derived according to the method set forth by Meeks and D'Agostino (1983) and subsequently used by

Smail Mahdi

2003-01-01

31

Regression modelling of interval-censored failure time data using the Weibull distribution

A method is described for fitting the Weibull distribution to failure-time data which may be left, right or interval censored. The method generalizes the auxiliary Poisson approach and, as such, means that it can be easily programmed in statistical packages with macro programming capabilities. Examples are given of fitting such models and an implementation in the GLIM package is used

A. J. Scallan

1999-01-01

32

Weibull-distributed dyke thickness reflects probabilistic character of host-rock strength.

Magmatic sheet intrusions (dykes) constitute the main form of magma transport in the Earth's crust. The size distribution of dykes is a crucial parameter that controls volcanic surface deformation and eruption rates and is required to realistically model volcano deformation for eruption forecasting. Here we present statistical analyses of 3,676 dyke thickness measurements from different tectonic settings and show that dyke thickness consistently follows the Weibull distribution. Known from materials science, power law-distributed flaws in brittle materials lead to Weibull-distributed failure stress. We therefore propose a dynamic model in which dyke thickness is determined by variable magma pressure that exploits differently sized host-rock weaknesses. The observed dyke thickness distributions are thus site-specific because rock strength, rather than magma viscosity and composition, exerts the dominant control on dyke emplacement. Fundamentally, the strength of geomaterials is scale-dependent and should be approximated by a probability distribution. PMID:24513695

Krumbholz, Michael; Hieronymus, Christoph F; Burchardt, Steffi; Troll, Valentin R; Tanner, David C; Friese, Nadine

2014-01-01

33

Weibull-distributed dyke thickness reflects probabilistic character of host-rock strength

Magmatic sheet intrusions (dykes) constitute the main form of magma transport in the Earth’s crust. The size distribution of dykes is a crucial parameter that controls volcanic surface deformation and eruption rates and is required to realistically model volcano deformation for eruption forecasting. Here we present statistical analyses of 3,676 dyke thickness measurements from different tectonic settings and show that dyke thickness consistently follows the Weibull distribution. Known from materials science, power law-distributed flaws in brittle materials lead to Weibull-distributed failure stress. We therefore propose a dynamic model in which dyke thickness is determined by variable magma pressure that exploits differently sized host-rock weaknesses. The observed dyke thickness distributions are thus site-specific because rock strength, rather than magma viscosity and composition, exerts the dominant control on dyke emplacement. Fundamentally, the strength of geomaterials is scale-dependent and should be approximated by a probability distribution. PMID:24513695

Krumbholz, Michael; Hieronymus, Christoph F.; Burchardt, Steffi; Troll, Valentin R.; Tanner, David C.; Friese, Nadine

2014-01-01

34

Probabilistic Analysis for Comparing Fatigue Data Based on Johnson-Weibull Parameters

NASA Technical Reports Server (NTRS)

Probabilistic failure analysis is essential when analysis of stress-life (S-N) curves is inconclusive in determining the relative ranking of two or more materials. In 1964, L. Johnson published a methodology for establishing the confidence that two populations of data are different. Simplified algebraic equations for confidence numbers were derived based on the original work of L. Johnson. Using the ratios of mean life, the resultant values of confidence numbers deviated less than one percent from those of Johnson. It is possible to rank the fatigue lives of different materials with a reasonable degree of statistical certainty based on combined confidence numbers. These equations were applied to rotating beam fatigue tests that were conducted on three aluminum alloys at three stress levels each. These alloys were AL 2024, AL 6061, and AL 7075. The results were analyzed and compared using ASTM Standard E739-91 and the Johnson-Weibull analysis. The ASTM method did not statistically distinguish between AL 6061 and AL 7075. Based on Johnson-Weibull analysis confidence numbers greater than 99 percent, AL 2024 was found to have the longest fatigue life, followed by AL 7075, and then AL 6061. The ASTM Standard and the Johnson-Weibull analysis result in the same stress-life exponent p for each of the three aluminum alloys at the median or L(sub 50) lives.

Hendricks, Robert C.; Zaretsky, Erwin V.; Vlcek, Brian L.

2007-01-01

35

We compared two regression models, which are based on the Weibull and probit functions, for the analysis of pesticide toxicity data from laboratory studies on Illinois crop and native plant species. Both mathematical models are continuous, differentiable, strictly positive, and...

36

Probabilistic Analysis for Comparing Fatigue Data Based on Johnson-Weibull Parameters

NASA Technical Reports Server (NTRS)

Leonard Johnson published a methodology for establishing the confidence that two populations of data are different. Johnson's methodology is dependent on limited combinations of test parameters (Weibull slope, mean life ratio, and degrees of freedom) and a set of complex mathematical equations. In this report, a simplified algebraic equation for confidence numbers is derived based on the original work of Johnson. The confidence numbers calculated with this equation are compared to those obtained graphically by Johnson. Using the ratios of mean life, the resultant values of confidence numbers at the 99 percent level deviate less than 1 percent from those of Johnson. At a 90 percent confidence level, the calculated values differ by between 2 and 4 percent. The simplified equation is used to rank the experimental lives of three aluminum alloys (AL 2024, AL 6061, and AL 7075), each tested at three stress levels in rotating beam fatigue, analyzed using the Johnson-Weibull method, and compared to the ASTM Standard (E739-91) method of comparison. The ASTM Standard did not statistically distinguish between AL 6061 and AL 7075. However, it is possible to rank the fatigue lives of different materials with a reasonable degree of statistical certainty based on combined confidence numbers using the Johnson-Weibull analysis. AL 2024 was found to have the longest fatigue life, followed by AL 7075, and then AL 6061. The ASTM Standard and the Johnson-Weibull analysis result in the same stress-life exponent p for each of the three aluminum alloys at the median, or L(sub 50), lives.

Vlcek, Brian L.; Hendricks, Robert C.; Zaretsky, Erwin V.

2013-01-01
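The Johnson-Weibull comparison above rests on fitting a two-parameter Weibull to each fatigue-life population and comparing mean lives. A minimal sketch of that first step, assuming simple Newton iteration for the maximum-likelihood shape parameter; the sample data are illustrative, not the report's aluminum-alloy results, and the confidence-number equation itself is not reproduced here:

```python
# Sketch: fit two-parameter Weibull distributions to two fatigue-life samples
# and compare their mean lives. Illustrative data, not the NASA report's.
import math
import random

def weibull_mle(lives, iters=200):
    """Two-parameter Weibull MLE: Newton iteration on the profile likelihood."""
    n = len(lives)
    mean_log = sum(math.log(t) for t in lives) / n
    beta = 1.0
    for _ in range(iters):
        s0 = sum(t ** beta for t in lives)
        s1 = sum(t ** beta * math.log(t) for t in lives)
        s2 = sum(t ** beta * math.log(t) ** 2 for t in lives)
        f = s1 / s0 - 1.0 / beta - mean_log          # score equation
        fp = (s2 * s0 - s1 * s1) / s0 ** 2 + 1.0 / beta ** 2
        beta -= f / fp
    eta = (sum(t ** beta for t in lives) / n) ** (1.0 / beta)
    return beta, eta

def weibull_mean(beta, eta):
    return eta * math.gamma(1.0 + 1.0 / beta)

random.seed(1)
pop_a = [random.weibullvariate(1000.0, 2.0) for _ in range(50)]
pop_b = [random.weibullvariate(1500.0, 2.0) for _ in range(50)]
ba, ea = weibull_mle(pop_a)
bb, eb = weibull_mle(pop_b)
ratio = weibull_mean(bb, eb) / weibull_mean(ba, ea)
print(f"mean-life ratio B/A = {ratio:.2f}")
```

The mean-life ratio is one of the three test parameters Johnson's method takes as input.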

37

Cure fraction models are usually used to model lifetime data with long-term survivors. In the present article, we introduce a Bayesian analysis of the four-parameter generalized modified Weibull (GMW) distribution in the presence of cure fraction, censored data and covariates. In order to include the proportion of "cured" patients, mixture and non-mixture formulation models are considered. To demonstrate the ability of this model in the analysis of real data, we consider an application to data from patients with gastric adenocarcinoma. Inferences are obtained by using MCMC (Markov Chain Monte Carlo) methods. PMID:24008248

Martinez, Edson Z; Achcar, Jorge A; Jácome, Alexandre A A; Santos, José S

2013-12-01

38

We model the average channel capacity of optical wireless communication systems for cases of weak to strong turbulence channels, using the exponentiated Weibull distribution model. The joint effects of beam wander and spread, pointing errors, atmospheric attenuation, and the spectral index of non-Kolmogorov turbulence on system performance are included. Our results show that the average capacity decreases steeply as the propagation length L changes from 0 to 200 m, and decreases slowly or tends to a stable value once the propagation length L is greater than 200 m. In the weak turbulence region, increasing the detection aperture improves the average channel capacity, and atmospheric visibility is an important issue affecting the average channel capacity. In the strong turbulence region, increasing the radius of the detection aperture cannot reduce the effects of atmospheric turbulence on the average channel capacity, and the effect of atmospheric visibility on the channel information capacity can be ignored. The effect of the spectral power exponent on the average channel capacity is higher in the strong turbulence region than in the weak turbulence region. Irrespective of the details determining the turbulent channel, we can say that pointing errors have a significant effect on the average channel capacity of optical wireless communication systems in turbulence channels. PMID:24979434

Cheng, Mingjian; Zhang, Yixin; Gao, Jie; Wang, Fei; Zhao, Fengsheng

2014-06-20

39

NASA Technical Reports Server (NTRS)

The Weibull process, identified as the inhomogeneous Poisson process with the Weibull intensity function, is used to model the reliability growth assessment of the space shuttle main engine test and flight failure data. Additional tables of percentage-point probabilities for several different values of the confidence coefficient have been generated for setting (1-alpha)100-percent two-sided confidence interval estimates on the mean time between failures. The tabled data pertain to two cases: (1) time-terminated testing, and (2) failure-terminated testing. The critical values of the three test statistics, namely Cramer-von Mises, Kolmogorov-Smirnov, and chi-square, were calculated and tabled for use in the goodness-of-fit tests for the engine reliability data. Numerical results are presented for five different groupings of the engine data that reflect the actual response to the failures.

Wheeler, J. T.

1990-01-01
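The inhomogeneous Poisson process with Weibull intensity u(t) = λβt^(β-1) described above is often called the Crow-AMSAA model. A minimal sketch of its maximum-likelihood fit for the failure-terminated case, assuming testing ends at the last failure; the failure times below are illustrative, not the shuttle main engine data:

```python
# Sketch: MLE for a failure-terminated Weibull (Crow-AMSAA) process.
# beta < 1 indicates reliability growth. Illustrative failure times.
import math

def weibull_process_mle(failure_times):
    """Return (beta, lambda, instantaneous MTBF at end of test)."""
    n = len(failure_times)
    T = failure_times[-1]                              # test ends at last failure
    beta = n / sum(math.log(T / t) for t in failure_times)   # log(T/T) = 0
    lam = n / T ** beta
    mtbf = 1.0 / (lam * beta * T ** (beta - 1.0))      # 1 / intensity at T
    return beta, lam, mtbf

times = [5.0, 40.0, 43.0, 175.0, 389.0, 712.0, 747.0, 795.0, 1299.0, 1478.0]
beta, lam, mtbf = weibull_process_mle(times)
print(f"beta={beta:.3f}  lambda={lam:.5f}  MTBF(T)={mtbf:.1f}")
```

The report's tabled percentage points would then bracket the MTBF estimate with a two-sided confidence interval.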

40

Goodness-of-fit tests for the Weibull and Pareto distributions

We will consider the goodness-of-fit tests for testing a form of the distribution function of the observed random variable. Let a distribution function belongs under hypothesis to a para- metric family. Generally, the limit distributions of statistics, based on the empirical process, depend on the unknown parameters. It was stated in 1955 (see (8)) that this dependence is absent for

Gennadi Martynov

41

NASA Astrophysics Data System (ADS)

A stochastic rainfall model is presented for the generation of hourly rainfall data in an urban area in Malaysia. In view of the high temporal and spatial variability of rainfall within the tropical rain belt, the Spatial-Temporal Neyman-Scott Rectangular Pulse model was used. The model, which is governed by the Neyman-Scott process, employs a reasonable number of parameters to represent the physical attributes of rainfall. A common approach is to attach each attribute to a mathematical distribution. With respect to rain cell intensity, this study proposes the use of a mixed exponential distribution. The performance of the proposed model was compared to a model that employs the Weibull distribution. Hourly and daily rainfall data from four stations in the Damansara River basin in Malaysia were used as input to the models, and simulations of hourly series were performed for an independent site within the basin. The performance of the models was assessed based on how closely the statistical characteristics of the simulated series resembled the statistics of the observed series. The findings obtained based on graphical representation revealed that the statistical characteristics of the simulated series for both models compared reasonably well with the observed series. However, a further assessment using the AIC, BIC and RMSE showed that the proposed model yields better results. The results of this study indicate that for tropical climates, the proposed model, using a mixed exponential distribution, is the best choice for generation of synthetic data for ungauged sites or for sites with insufficient data within the limit of the fitted region.

Abas, Norzaida; Daud, Zalina M.; Yusof, Fadhilah

2014-11-01

42

Weibull-Based Design Methodology for Rotating Aircraft Engine Structures

NASA Technical Reports Server (NTRS)

The NASA Energy Efficient Engine (E(sup 3)-Engine) is used as the basis of a Weibull-based life and reliability analysis. Each component's life and thus the engine's life is defined by high-cycle fatigue (HCF) or low-cycle fatigue (LCF). Knowing the cumulative life distribution of each of the components making up the engine as represented by a Weibull slope is a prerequisite to predicting the life and reliability of the entire engine. As the engine Weibull slope increases, the predicted lives decrease. The predicted engine lives L(sub 5) (95% probability of survival) of approximately 17,000 and 32,000 hr do correlate with current engine maintenance practices without and with refurbishment, respectively. The individual high pressure turbine (HPT) blade lives necessary to obtain a blade system life L(sub 0.1) (99.9% probability of survival) of 9000 hr for Weibull slopes of 3, 6 and 9 are 47,391, 20,652 and 15,658 hr, respectively. For a design life of the HPT disks having probable points of failure equal to or greater than 36,000 hr at a probability of survival of 99.9%, the predicted disk system life L(sub 0.1) can vary from 9,408 to 24,911 hr.

Zaretsky, Erwin; Hendricks, Robert C.; Soditus, Sherry

2002-01-01
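The system-life logic in the abstract above can be read as a series system: the engine survives only if every component survives, so the system survival is the product of component Weibull survivals, and L(sub 0.1) is where that product drops to 0.999. A minimal sketch under that series assumption; the component slopes and characteristic lives are hypothetical, not the E3-Engine values:

```python
# Sketch: system life from component Weibulls in a series system.
# Component (slope, characteristic life) pairs are illustrative.
import math

def system_survival(t, components):
    """components: list of (weibull_slope, characteristic_life_hr) pairs."""
    return math.exp(-sum((t / eta) ** beta for beta, eta in components))

def l_life(components, survival=0.999, lo=1.0, hi=1e6):
    """Bisect for the life t at which system survival falls to `survival`."""
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if system_survival(mid, components) > survival:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

engine = [(3.0, 200_000.0), (6.0, 90_000.0), (9.0, 60_000.0)]  # hypothetical
l01 = l_life(engine)
print(f"system L(sub 0.1) = {l01:.0f} hr")
```

Because survivals multiply, the system life is always shorter than the weakest component's own L(sub 0.1) life.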

43

NASA Technical Reports Server (NTRS)

The calculation of shape and scale parameters of the two-parameter Weibull distribution is described using the least-squares analysis and maximum likelihood methods for volume- and surface-flaw-induced fracture in ceramics with complete and censored samples. Detailed procedures are given for evaluating 90 percent confidence intervals for maximum likelihood estimates of shape and scale parameters, the unbiased estimates of the shape parameters, and the Weibull mean values and corresponding standard deviations. Furthermore, the necessary steps are described for detecting outliers and for calculating the Kolmogorov-Smirnov and the Anderson-Darling goodness-of-fit statistics and 90 percent confidence bands about the Weibull distribution. It also shows how to calculate the Batdorf flaw-density constants by using the Weibull distribution statistical parameters. The techniques described were verified with several example problems, from the open literature, and were coded in the Structural Ceramics Analysis and Reliability Evaluation (SCARE) design program.

Pai, Shantaram S.; Gyekenyesi, John P.

1989-01-01
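The least-squares analysis mentioned above is the classic Weibull probability-plot regression: with median-rank plotting positions, ln(-ln(1 - F)) is linear in ln(strength), with slope equal to the Weibull modulus. A minimal sketch assuming complete (uncensored) samples and the (i + 0.5)/n plotting position; the strength data are synthetic, not a ceramic test set:

```python
# Sketch: least-squares (probability plot) estimate of the two-parameter
# Weibull shape and scale for strength data. Illustrative synthetic sample.
import math
import random

def weibull_lsq(strengths):
    x = sorted(strengths)
    n = len(x)
    xs = [math.log(s) for s in x]
    ys = [math.log(-math.log(1.0 - (i + 0.5) / n)) for i in range(n)]
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((a - mx) * (b - my) for a, b in zip(xs, ys))
             / sum((a - mx) ** 2 for a in xs))
    shape = slope                        # Weibull modulus m
    scale = math.exp(mx - my / slope)    # characteristic strength (y = 0)
    return shape, scale

random.seed(7)
data = [random.weibullvariate(400.0, 10.0) for _ in range(30)]
m, sigma0 = weibull_lsq(data)
print(f"m = {m:.1f}, characteristic strength = {sigma0:.0f} MPa")
```

Maximum likelihood, the report's other method, generally gives a less variable shape estimate; the plot-based fit is still useful for visual outlier screening.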

44

Roasted and ground coffee was stored at constant O2 partial pressure (0.5–21.3 kPa), water activity a_w (0.106–0.408) and temperature (4–35°C). Product acceptability was monitored by use of a modified Weibull Hazard sensory method, where the end of shelf-life was the time at which 50% of consumers found the product unacceptable. The effect of O2, a_w and temperature was studied from a kinetics standpoint. Oxygen increase from

C. Cardelli; T. P. Labuza

2001-01-01

45

Analysis of DC current accelerated life tests of GaN LEDs using a Weibull-based statistical model

Gallium-nitride-based light-emitting diode (LED) accelerated life tests were carried out over devices adopting two different packaging schemes (i.e., with plastic transparent encapsulation or with pure metallic package). Data analyses were done using a Weibull-based statistical description with the aim of estimating the effect of high current on device performance. A consistent statistical model was found with the capability to estimate

S. Levada; M. Meneghini; G. Meneghesso; E. Zanoni

2005-01-01

46

Finite-size effects on return interval distributions for weakest-link-scaling systems.

The Weibull distribution is a commonly used model for the strength of brittle materials and earthquake return intervals. Deviations from Weibull scaling, however, have been observed in earthquake return intervals and the fracture strength of quasibrittle materials. We investigate weakest-link scaling in finite-size systems and deviations of empirical return interval distributions from the Weibull distribution function. Our analysis employs the ansatz that the survival probability function of a system with complex interactions among its units can be expressed as the product of the survival probability functions for an ensemble of representative volume elements (RVEs). We show that if the system comprises a finite number of RVEs, it obeys the κ-Weibull distribution. The upper tail of the κ-Weibull distribution declines as a power law in contrast with Weibull scaling. The hazard rate function of the κ-Weibull distribution decreases linearly after a waiting time τ_c ∝ n^(1/m), where m is the Weibull modulus and n is the system size in terms of representative volume elements. We conduct statistical analysis of experimental data and simulations which show that the κ-Weibull provides competitive fits to the return interval distributions of seismic data and of avalanches in a fiber bundle model. In conclusion, using theoretical and statistical analysis of real and simulated data, we demonstrate that the κ-Weibull distribution is a useful model for extreme-event return intervals in finite-size systems. PMID:25353774

Hristopulos, Dionissios T; Petrakis, Manolis P; Kaniadakis, Giorgio

2014-05-01
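The κ-Weibull survival function named above is built from the Kaniadakis κ-exponential, S(t) = exp_κ(-(t/λ)^m) with exp_κ(x) = (sqrt(1 + κ²x²) + κx)^(1/κ), which reduces to the ordinary Weibull as κ → 0 and has a power-law upper tail. A minimal numerical sketch with illustrative parameter values:

```python
# Sketch: kappa-Weibull survival vs. ordinary Weibull survival, showing the
# heavier power-law upper tail. Parameter values are illustrative.
import math

def exp_kappa(x, kappa):
    """Kaniadakis kappa-exponential; -> exp(x) as kappa -> 0."""
    return (math.sqrt(1.0 + kappa ** 2 * x ** 2) + kappa * x) ** (1.0 / kappa)

def kappa_weibull_survival(t, m, lam, kappa):
    return exp_kappa(-(t / lam) ** m, kappa)

m, lam, kappa = 1.5, 1.0, 0.3
for t in (1.0, 10.0, 100.0):
    s_kw = kappa_weibull_survival(t, m, lam, kappa)
    s_w = math.exp(-(t / lam) ** m)   # ordinary Weibull for comparison
    print(f"t={t:6.1f}  kappa-Weibull S={s_kw:.3e}  Weibull S={s_w:.3e}")
```

For large t the κ-Weibull tail falls off like t^(-m/κ), far slower than the stretched-exponential Weibull decay, which is exactly the finite-size deviation the abstract describes.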

47

BACKGROUND: We establish that the occurrence of protein folds among genomes can be accurately described with a Weibull function. Systems which exhibit Weibull character can be interpreted with reliability theory commonly used in engineering analysis. For instance, Weibull distributions are widely used in reliability, maintainability and safety work to model time-to-failure of mechanical devices, mechanisms, building constructions and equipment. RESULTS:

Artem Cherkasov; Shannan J. Ho Sui; Robert C. Brunham; Steven J. M. Jones

2004-01-01

48

The Savannah River Site (SRS) spring operated pressure relief valve (SORV) maintenance intervals were evaluated using an approach provided by the American Petroleum Institute (API RP 581) for risk-based inspection technology (RBI). In addition, the impact of extending the inspection schedule was evaluated using Monte Carlo Simulation (MCS). The API RP 581 approach is characterized as a Weibull analysis with modified Bayesian updating provided by SRS SORV proof testing experience. Initial Weibull parameter estimates were updated as per SRS's historical proof test records contained in the Center for Chemical Process Safety (CCPS) Process Equipment Reliability Database (PERD). The API RP 581 methodology was used to estimate the SORV's probability of failing on demand (PFD), and the annual expected risk. The API RP 581 methodology indicates that the current SRS maintenance plan is conservative. Cost savings may be attained in certain mild service applications that present low PFD and overall risk. Current practices are reviewed and recommendations are made for extending inspection intervals. The paper gives an illustration of the inspection costs versus the associated risks by using API RP 581 Risk Based Inspection (RBI) Technology. A cost effective maintenance frequency balancing both financial risk and inspection cost is demonstrated.

Harris, S.; Gross, R.; Mitchell, E.

2011-01-18

49

A Graphical Estimation of Mixed Weibull Parameters in Life-Testing of Electron Tubes

It is widely recognized that electron tube failures may be classified into two types: sudden and delayed. A mixture of two Weibull distributions, each representing one type of tube failure, is proposed, and a simple graphical method for estimating the parameters of the mixed Weibull distribution is described.

John H. K. Kao

1959-01-01

50

Estimation of energy output for small-scale wind power generators is the subject of this article. Monthly wind energy production is estimated using the Weibull-representative wind data for a total of 96 months, from 5 different locations in the world. The Weibull parameters are determined based on the wind distribution statistics calculated from the measured data, using the gamma function. The

Ali Naci Celik

2003-01-01
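The Weibull-representative wind calculation above uses the gamma function: with shape k and scale c, the mean wind speed is c·Γ(1 + 1/k) and the mean wind power density is ½ρc³·Γ(1 + 3/k). A minimal sketch, with illustrative values of k, c, and air density:

```python
# Sketch: mean wind speed and mean power density from Weibull parameters.
# k, c, and rho values are illustrative, not the article's site data.
import math

def weibull_wind_stats(k, c, rho=1.225):
    """Return (mean speed in m/s, mean power density in W/m^2)."""
    mean_speed = c * math.gamma(1.0 + 1.0 / k)
    power_density = 0.5 * rho * c ** 3 * math.gamma(1.0 + 3.0 / k)
    return mean_speed, power_density

v, p = weibull_wind_stats(k=2.0, c=7.0)
print(f"mean speed={v:.2f} m/s, power density={p:.0f} W/m^2")
```

Multiplying the power density by rotor area, hours in the month, and a turbine efficiency curve then yields the monthly energy estimate the article describes.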

51

Some Graphical Techniques for Estimating Weibull Confidence Intervals

Many methods for estimating the parameter and percentile statistical confidence intervals for the Weibull and Gumbel (extreme value) distributions have been described in the literature. Most of these methods depend on extensive computer programs, require reference to tables which do not cover all sample sizes of interest and/or are not widely available. This paper describes a semi-empirical technique which permits

G. C. Stone; H. Rosen

1984-01-01

52

NASA Technical Reports Server (NTRS)

The calculation of shape and scale parameters of the two-parameter Weibull distribution is described using the least-squares analysis and maximum likelihood methods for volume- and surface-flaw-induced fracture in ceramics with complete and censored samples. Detailed procedures are given for evaluating 90 percent confidence intervals for maximum likelihood estimates of shape and scale parameters, the unbiased estimates of the shape parameters, and the Weibull mean values and corresponding standard deviations. Furthermore, the necessary steps are described for detecting outliers and for calculating the Kolmogorov-Smirnov and the Anderson-Darling goodness-of-fit statistics and 90 percent confidence bands about the Weibull distribution. It also shows how to calculate the Batdorf flaw-density constants by using the Weibull distribution statistical parameters. The techniques described were verified with several example problems from the open literature, and were coded in the Structural Ceramics Analysis and Reliability Evaluation (SCARE) design program.

Pai, Shantaram S.; Gyekenyesi, John P.

1988-01-01

53

Impact of Three-Parameter Weibull Models in Probabilistic Assessment of Earthquake Hazards

NASA Astrophysics Data System (ADS)

This paper investigates the suitability of a three-parameter (scale, shape, and location) Weibull distribution in probabilistic assessment of earthquake hazards. The performance is also compared with two other popular models from the same Weibull family, namely the two-parameter Weibull model and the inverse Weibull model. A complete and homogeneous earthquake catalog ( Yadav et al. in Pure Appl Geophys 167:1331-1342, 2010) of 20 events (M ≥ 7.0), spanning the period 1846 to 1995 from north-east India and its surrounding region (20°-32°N and 87°-100°E), is used to perform this study. The model parameters are initially estimated from graphical plots and later confirmed from statistical estimations such as maximum likelihood estimation (MLE) and method of moments (MoM). The asymptotic variance-covariance matrix for the MLE estimated parameters is further calculated on the basis of the Fisher information matrix (FIM). The model suitability is appraised using different statistical goodness-of-fit tests. For the study area, the estimated conditional probability for an earthquake within a decade comes out to be very high (≈0.90) for an elapsed time of 18 years (i.e., 2013). The study also reveals that the use of location parameter provides more flexibility to the three-parameter Weibull model in comparison to the two-parameter Weibull model. Therefore, it is suggested that the three-parameter Weibull model has high importance in empirical modeling of earthquake recurrence and seismic hazard assessment.

Pasari, Sumanta; Dikshit, Onkar

2014-07-01
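The conditional probability quoted above follows directly from the three-parameter Weibull survival function: given t years elapsed since the last event, the chance of an event in the next dt years is 1 - S(t + dt)/S(t). A minimal sketch; the shape, scale, and location values below are hypothetical placeholders, not the catalog estimates:

```python
# Sketch: conditional (hazard-based) probability from a three-parameter
# Weibull recurrence model. Parameter values are illustrative only.
import math

def weibull3_survival(t, shape, scale, loc):
    """Survival function; the location parameter shifts the support."""
    return math.exp(-(max(t - loc, 0.0) / scale) ** shape)

def conditional_probability(t_elapsed, dt, shape, scale, loc):
    """P(event in (t, t+dt] | no event up to t)."""
    s_now = weibull3_survival(t_elapsed, shape, scale, loc)
    s_later = weibull3_survival(t_elapsed + dt, shape, scale, loc)
    return 1.0 - s_later / s_now

p = conditional_probability(t_elapsed=18.0, dt=10.0,
                            shape=1.5, scale=8.0, loc=3.0)
print(f"P(event within next decade) = {p:.2f}")
```

The location parameter is what gives the three-parameter model its extra flexibility: it imposes a minimum quiescence period before the hazard becomes nonzero.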

54

NASA Technical Reports Server (NTRS)

This manual describes the operation and theory of the PC-CARES (Personal Computer-Ceramic Analysis and Reliability Evaluation of Structures) computer program for the IBM PC and compatibles running PC-DOS/MS-DOS or IBM/MS-OS/2 (version 1.1 or higher) operating systems. The primary purpose of this code is to estimate Weibull material strength parameters, the Batdorf crack density coefficient, and other related statistical quantities. Included in the manual is the description of the calculation of shape and scale parameters of the two-parameter Weibull distribution using the least-squares analysis and maximum likelihood methods for volume- and surface-flaw-induced fracture in ceramics with complete and censored samples. The methods for detecting outliers and for calculating the Kolmogorov-Smirnov and the Anderson-Darling goodness-of-fit statistics and 90 percent confidence bands about the Weibull line, as well as the techniques for calculating the Batdorf flaw-density constants are also described.

Szatmary, Steven A.; Gyekenyesi, John P.; Nemeth, Noel N.

1990-01-01

55

Reliability analyses and calculations for distribution transformers

This paper discusses methods for estimating reliability parameters of distribution transformers. The data analysis techniques presented have been developed through operating records provided for the period 1970 to 1993 by an Electric Utility Department in the USA. Calculation results show that the failure mode of distribution transformers can be represented by the Weibull distribution. Both useful calculation methods and results of

Xianhe Jin; Changchang Wang; Changyu Chen; T. C. Cheng; Aldo Amancio

1999-01-01

56

The ATLAS experiment accumulated more than 140 PB of data during the first run of the Large Hadron Collider (LHC) at CERN. The analysis of such an amount of data for the distributed physics community is a challenging task. The Distributed Analysis (DA) system of the ATLAS experiment is an established and stable component of the ATLAS distributed computing operations. About half a million user jobs are daily running on DA resources, submitted by more than 1500 ATLAS physicists. The reliability of the DA system during the first run of the LHC and the following shutdown period has been high thanks to the continuous automatic validation of the distributed analysis sites and the user support provided by a dedicated team of expert shifters. During the LHC shutdown, the ATLAS computing model has undergone several changes to improve the analysis workflows, including the re-design of the production system, a new analysis data format and event model, and the development of common reduction and analysis frameworks. We r...

Dewhurst, Alastair; The ATLAS collaboration

2015-01-01

57

Weibull-k Revisited: "Tall" Profiles and Height Variation of Wind Statistics

NASA Astrophysics Data System (ADS)

The Weibull distribution is commonly used to describe climatological wind-speed distributions in the atmospheric boundary layer. While vertical profiles of mean wind speed in the atmospheric boundary layer have received significant attention, the variation of the shape of the wind distribution with height is less understood. Previously we derived a probabilistic model based on similarity theory for calculating the effects of stability and planetary boundary-layer depth upon long-term mean wind profiles. However, some applications (e.g. wind energy estimation) require the Weibull shape parameter (k), as well as mean wind speed. Towards the aim of improving predictions of the Weibull-k profile, we develop expressions for the profile of long-term variance of wind speed, including a method extending our probabilistic wind-profile theory; together these two profiles lead to a profile of Weibull shape parameter. Further, an alternate model for the vertical profile of Weibull shape parameter is made, improving upon a basis set forth by Wieringa (Boundary-Layer Meteorol, 1989, Vol. 47, 85-110), and connecting with a newly-corrected corollary of the perturbed geostrophic-drag theory of Troen and Petersen (European Wind Atlas, 1989, Risø National Laboratory, Roskilde). Comparing the models for Weibull-k profiles, a new interpretation and explanation is given for the vertical variation of the shape of wind-speed distributions. Results of the modelling are shown for a number of sites, with a discussion of the models' efficacy and applicability. The latter includes a comparative evaluation of Wieringa-type empirical models and perturbed-geostrophic forms with regard to surface-layer behaviour, as well as for heights where climatological wind-speed variability is not dominated by surface effects.

Kelly, Mark; Troen, Ib; Jørgensen, Hans E.

2014-07-01

58

Modeling root reinforcement using a root-failure Weibull survival function

NASA Astrophysics Data System (ADS)

Root networks contribute to slope stability through complex interactions with soil that include mechanical compression and tension. Due to the spatial heterogeneity of root distribution and the dynamics of root turnover, the quantification of root reinforcement on steep slopes is challenging, and consequently so is the calculation of slope stability. Although considerable progress has been made, some important aspects of root mechanics remain neglected. In this study we address specifically the role of root-strength variability on the mechanical behavior of a root bundle. Many factors contribute to the variability of root mechanical properties even within a single class of diameter. This work presents a new approach for quantifying root reinforcement that considers the variability of mechanical properties of each root diameter class. Using the data of laboratory tensile tests and field pullout tests, we calibrate the parameters of the Weibull survival function to implement the variability of root strength in a numerical model for the calculation of root reinforcement (RBMw). The results show that, for both laboratory and field data sets, the parameters of the Weibull distribution may be considered constant with the exponent equal to 2 and the normalized failure displacement equal to 1. Moreover, the results show that the variability of root strength in each root diameter class has a major influence on the behavior of a root bundle with important implications when considering different approaches in slope stability calculation. Sensitivity analysis shows that the calibration of the equations of the tensile force, the elasticity of the roots, and the root distribution are the most important steps. The new model allows the characterization of root reinforcement in terms of maximum pullout force, stiffness, and energy. Moreover, it simplifies the implementation of root reinforcement in slope stability models.
The realistic quantification of root reinforcement for tensile, shear and compression behavior allows for the consideration of the stabilization effects of root networks on steep slopes and the influence that this has on the triggering of shallow landslides.

Schwarz, M.; Giadrossich, F.; Cohen, D.

2013-11-01
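The bundle behavior described above can be sketched numerically: each root's tensile force is weighted by a Weibull survival function of normalized displacement, using the reported exponent of 2 and normalized failure displacement of 1. This is a simplified reading of the RBMw idea, not the calibrated model; the root classes and the linear force law are illustrative:

```python
# Sketch: pullout force of a root bundle with Weibull (shape 2) survival
# weighting per root. Root classes and stiffnesses are illustrative.
import math

def bundle_force(displacement, roots):
    """roots: list of (stiffness N/mm, failure_displacement mm) per root."""
    total = 0.0
    for k, dmax in roots:
        x = displacement / dmax                        # normalized displacement
        total += k * displacement * math.exp(-x ** 2)  # Weibull survival, shape 2
    return total

# two hypothetical diameter classes: 10 thin roots and 5 thick roots
roots = [(50.0, 2.0)] * 10 + [(120.0, 4.0)] * 5
peak = max(bundle_force(d / 10.0, roots) for d in range(100))
print(f"peak pullout force = {peak:.0f} N")
```

Because the survival weighting spreads failures over a range of displacements, the bundle peak force is lower than the sum of the individual root peak forces, which is the key effect of root-strength variability on slope stability estimates.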

59

We investigate a large sample approach for obtaining tolerance bounds where the underlying population is a three-parameter Weibull distribution. Accurate tolerance bounds could play an important role in the development of lumber standards. Properties of the maximum likelihood based approach are compared with those of the standard nonparametric tolerance procedure. The asymptotic normal approximation to the tolerance bound was found

Richard A. Johnson; James H. Haskell

1984-01-01

60

NASA Astrophysics Data System (ADS)

A new analytical technique for the statistical analysis of bubble populations in volcanic rocks [Proussevitch, A.A., Sahagian, D.L. and Tsentalovich, E.P., 2007-this issue. Statistical analysis of bubble and crystal size distributions: Formulations and procedures. J. Volc. Geotherm. Res.] has been applied to a collection of Colorado Plateau basalts (96 samples). A variety of mono- and polymodal distributions has been found in the samples, all of which belong to the logarithmic family of statistical functions. Most samples have bimodal log normal distributions, while the others are represented by mono- or bimodal log logistic, and Weibull distributions. We have grouped the observed distributions into 11 groups depending on distribution types, mode location, and intensity. The nature of the curves within these groups can be interpreted as evolution of vesiculation processes. We conclude that within bimodal log normal distributions, the mode of smaller bubbles is the result of a second nucleation and growth event in a lava flow after eruption. In the case of log logistic distributions the larger mode results from coalescence of bubbles. Coalescence processes are reflected in growth of a larger mode and decreasing bubble number density. Another style of population evolution leads to a monomodal Weibull (or exponential) distribution as a result of superposition of multiple log normal distributions in which the modes are comparable in size and intensity. These various population distribution styles can be interpreted with an understanding of vesiculation processes that can be gained through appropriate numerical models of coalescence and population evolution. The applicable vesiculation processes include: a) a single nucleation-growth event, b) continuous multiple nucleation-growth events, c) coalescence, and d) Ostwald ripening.

Proussevitch, Alexander A.; Sahagian, Dork L.; Carlson, William D.

2007-07-01

61

The distribution of first-passage times and durations in FOREX and future markets

NASA Astrophysics Data System (ADS)

Possible distributions are discussed for intertrade durations and first-passage processes in financial markets. The viewpoint of renewal theory is assumed. In order to represent market data with relatively long durations, two types of distributions are used, namely a distribution derived from the Mittag-Leffler survival function and the Weibull distribution. For the Mittag-Leffler type distribution, the average waiting time (residual life time) is strongly dependent on the choice of a cut-off parameter t_max, whereas the results based on the Weibull distribution do not depend on such a cut-off. Therefore, a Weibull distribution is more convenient than a Mittag-Leffler type if one wishes to evaluate relevant statistics such as average waiting time in financial markets with long durations. On the other hand, we find that the Gini index is rather independent of the cut-off parameter. Based on the above considerations, we propose a good candidate for describing the distribution of first-passage time in a market: the Weibull distribution with a power-law tail. This distribution closes the gap between theoretical and empirical results more efficiently than a simple Weibull distribution. It should be stressed that a Weibull distribution with a power-law tail is more flexible than the Mittag-Leffler distribution, which itself can be approximated by a Weibull distribution and a power-law. Indeed, the key point is that in the former case there is freedom of choice for the exponent of the power-law attached to the Weibull distribution, which can exceed 1 in order to reproduce decays faster than possible with a Mittag-Leffler distribution. We also give a useful formula to determine an optimal crossover point minimizing the difference between the empirical average waiting time and the one predicted from renewal theory.
Moreover, we discuss the limitation of our distributions by applying our distribution to the analysis of the BTP future and calculating the average waiting time. We find that our distribution is applicable as long as durations follow a Weibull law for short times and do not have too heavy a tail.

Sazuka, Naoya; Inoue, Jun-ichi; Scalas, Enrico

2009-07-01
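One simple reading of the "Weibull with a power-law tail" proposed above is a piecewise survival function: Weibull below a crossover time t_c, a power law above it, with the two pieces matched at t_c for continuity. A minimal sketch under that assumption; the parameter values and the matching scheme are illustrative, not the paper's calibrated form:

```python
# Sketch: piecewise survival function -- Weibull body, power-law tail,
# matched for continuity at a crossover time t_c. Illustrative parameters.
import math

def survival(t, m, lam, alpha, t_c):
    if t <= t_c:
        return math.exp(-(t / lam) ** m)          # Weibull body
    s_c = math.exp(-(t_c / lam) ** m)             # value at the crossover
    return s_c * (t / t_c) ** (-alpha)            # power-law tail

m, lam, alpha, t_c = 0.6, 1.0, 2.5, 5.0
for t in (0.5, 5.0, 50.0):
    print(f"S({t}) = {survival(t, m, lam, alpha, t_c):.4f}")
```

The paper's crossover formula would then pick t_c so that the implied average waiting time matches the empirical one from renewal theory.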

62

Spatial and temporal patterns of global onshore wind speed distribution

NASA Astrophysics Data System (ADS)

Wind power, a renewable energy source, can play an important role in electrical energy generation. Information regarding wind energy potential is important both for energy related modeling and for decision-making in the policy community. While wind speed datasets with high spatial and temporal resolution are often ultimately used for detailed planning, simpler assumptions are often used in analysis work. An accurate representation of the wind speed frequency distribution is needed in order to properly characterize wind energy potential. Using a power density method, this study estimated global variation in wind parameters as fitted to a Weibull density function using NCEP/climate forecast system reanalysis (CFSR) data over land areas. The Weibull distribution performs well in fitting the time series wind speed data at most locations according to R2, root mean square error, and power density error. The wind speed frequency distribution, as represented by the Weibull k parameter, exhibits a large amount of spatial variation, a regionally varying amount of seasonal variation, and relatively low decadal variation. We also analyzed the potential error in wind power estimation when a commonly assumed Rayleigh distribution (Weibull k = 2) is used. We find that the assumption of the same Weibull parameter across large regions can result in non-negligible errors. While large-scale wind speed data are often presented in the form of mean wind speeds, these results highlight the need to also provide information on the wind speed frequency distribution.

Zhou, Yuyu; Smith, Steven J.

2013-09-01
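The Rayleigh-assumption error analyzed above can be quantified directly: fix the mean wind speed, replace the true Weibull shape k with k = 2 (Rayleigh), and compare the implied mean power densities. A minimal sketch with an illustrative true (k, c) pair, not the study's reanalysis values:

```python
# Sketch: relative error in mean wind power density when a Rayleigh (k = 2)
# shape is assumed while matching the true mean wind speed. Illustrative data.
import math

def power_density(k, c, rho=1.225):
    return 0.5 * rho * c ** 3 * math.gamma(1.0 + 3.0 / k)

k_true, c_true = 1.6, 7.0
mean_speed = c_true * math.gamma(1.0 + 1.0 / k_true)
c_rayleigh = mean_speed / math.gamma(1.5)        # Rayleigh scale, same mean
err = power_density(2.0, c_rayleigh) / power_density(k_true, c_true) - 1.0
print(f"relative power-density error from assuming k=2: {err:+.1%}")
```

For shapes below 2 the Rayleigh assumption underestimates power density, since a broader speed distribution contributes more through the cubic dependence on speed; this is the "non-negligible error" the abstract warns about.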

63

Weibull regression for lifetimes measured with error.

Models are considered in which 'true' lifetimes are generated by a Weibull regression model and measured lifetimes are determined from the true times by certain measurement error models. Adjusted estimators are obtained under one parametric specification. The bias properties of these estimators and standard estimators are compared both theoretically, using small measurement error asymptotics, and by simulation. The standard estimators of regression coefficients, other than the intercept, are bias-robust. The adjusted estimator of the shape parameter removes the bias of the standard estimator. PMID:10214000

Skinner, C J; Humphreys, K

1999-01-01

64

Aerospace Applications of Weibull and Monte Carlo Simulation with Importance Sampling

NASA Technical Reports Server (NTRS)

Recent developments in reliability modeling and computer technology have made it practical to use the Weibull time to failure distribution to model the system reliability of complex fault-tolerant computer-based systems. These system models are becoming increasingly popular in space systems applications as a result of mounting data that support the decreasing Weibull failure distribution and the expectation of increased system reliability. This presentation introduces the new reliability modeling developments and demonstrates their application to a novel space system application. The application is a proposed guidance, navigation, and control (GN&C) system for use in a long duration manned spacecraft for a possible Mars mission. Comparisons to the constant failure rate model are presented and the ramifications of doing so are discussed.

Bavuso, Salvatore J.

1998-01-01

65

A Weibull brittle material failure model for the ABAQUS computer program

A statistical failure theory for brittle materials that traces its origins to the Weibull distribution function is developed for use in the general purpose ABAQUS finite element computer program. One of the fundamental assumptions for this development is that Mode 1 microfractures perpendicular to the direction of the principal stress contribute independently to the fast fracture. The theory is implemented by a user subroutine for ABAQUS. Example problems illustrating the capability and accuracy of the model are given. 24 refs., 12 figs.
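The weakest-link idea behind such Weibull failure models can be illustrated with a toy calculation. The element volumes, stresses, and Weibull parameters below are hypothetical, and this sketch is in no way the ABAQUS user subroutine itself.

```python
# Weakest-link sketch: with independently acting Mode I microcracks, element
# contributions combine as P_f = 1 - exp(-sum_i V_i * (sigma_i / sigma_0)^m).
import math

m, sigma_0 = 10.0, 300.0   # assumed Weibull modulus and characteristic strength (MPa)
# (volume, max principal tensile stress in MPa) per finite element -- invented.
elements = [(0.5, 120.0), (0.3, 210.0), (0.2, 260.0)]

risk = sum(v * (s / sigma_0) ** m for v, s in elements if s > 0)
p_fail = 1.0 - math.exp(-risk)
print(f"fast-fracture failure probability: {p_fail:.4f}")
```

Note how the highest-stressed element dominates the sum for a large Weibull modulus m, even when its volume is small.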

Bennett, J.

1991-08-01

66

The log-exponentiated Weibull regression model for interval-censored data

In interval-censored survival data, the event of interest is not observed exactly but is only known to occur within some time interval. Such data appear very frequently. In this paper, we are concerned only with parametric forms, and so a location-scale regression model based on the exponentiated Weibull distribution is proposed for modeling interval-censored data. We show that the proposed

Elizabeth M. Hashimoto; Edwin M. M. Ortega; Vicente G. Cancho; Gauss M. Cordeiro

2010-01-01

67

A novel approach for the statistical analysis of comet assay data (i.e., tail moment) is proposed, employing public-domain statistical software, the R system. The analytical strategy takes into account that the distribution of comet assay data, such as the tail moment, is usually skewed and does not follow a normal distribution. Probability distributions used to model comet assay data included: the Weibull,

Pablo E. Verde; Laura A. Geracitano; Lílian L. Amado; Carlos E. Rosa; Adalto Bianchini; José M. Monserrat

2006-01-01

68

NASA Astrophysics Data System (ADS)

This paper analyzes the statistical properties of multi-look processed polarimetric SAR data. Based on the assumption that a multi-look polarimetric measurement is the product of a Gamma-distributed texture variable and a Wishart-distributed polarimetric speckle variable, it is shown that the multi-look polarimetric measurement from a nonhomogeneous region obeys a generalized K-distribution. To validate this statistical model, two of its derived forms, the multi-look intensity and amplitude K-distributions, are compared with histograms of observed multi-look SAR data from three terrain types (ocean, forest-like, and city regions) and with four empirical distribution models: Gaussian, log-normal, gamma, and Weibull. A qualitative relation between the degree of nonhomogeneity of a textured scene and the best-fitting statistical model is then empirically established. Finally, a classifier with adaptive distributions, guided by the order parameter of the texture distribution estimated from local statistics, is introduced to perform terrain classification; experimental results with both multi-look fully polarimetric data and multi-look single-channel intensity/amplitude data indicate its effectiveness.

Liu, Guoqing; Huang, ShunJi; Torre, Andrea; Rubertone, Franco S.

1995-11-01

69

NASA Astrophysics Data System (ADS)

Various q-type distributions have appeared in the physics literature in recent years; see, e.g., L.C. Malacarne, R.S. Mendes, E.K. Lenzi, q-exponential distribution in urban agglomeration, Phys. Rev. E 65 (2002) 017106; S.M.D. Queiros, On a possible dynamical scenario leading to a generalised Gamma distribution, xxx.lanl.gov-physics/0411111; U.M.S. Costa, V.N. Freire, L.C. Malacarne, R.S. Mendes, S. Picoli Jr., E.A. de Vasconcelos, E.F. da Silva Jr., An improved description of the dielectric breakdown in oxides based on a generalized Weibull distribution, Physica A 361 (2006) 215; S. Picoli Jr., R.S. Mendes, L.C. Malacarne, q-exponential, Weibull, and q-Weibull distributions: an empirical analysis, Physica A 324 (2003) 678-688; A.M.C. de Souza, C. Tsallis, Student's t- and r-distributions: unified derivation from an entropic variational principle, Physica A 236 (1997) 52-57. It is pointed out in the paper that many of these are the same as, or particular cases of, distributions already known in the statistics literature. Several of these statistical distributions are discussed and references provided. We feel that this paper could be of assistance for modeling problems of the type considered in the works cited above and others.

Nadarajah, Saralees; Kotz, Samuel

2007-04-01

70

Spatial and Temporal Patterns of Global Onshore Wind Speed Distribution

Wind power, a renewable energy source, can play an important role in electrical energy generation. Information regarding wind energy potential is important both for energy related modeling and for decision-making in the policy community. While wind speed datasets with high spatial and temporal resolution are often ultimately used for detailed planning, simpler assumptions are often used in analysis work. An accurate representation of the wind speed frequency distribution is needed in order to properly characterize wind energy potential. Using a power density method, this study estimated global variation in wind parameters as fitted to a Weibull density function using NCEP/CFSR reanalysis data. The estimated Weibull distribution performs well in fitting the time series wind speed data at the global level according to R2, root mean square error, and power density error. The spatial, decadal, and seasonal patterns of wind speed distribution were then evaluated. We also analyzed the potential error in wind power estimation when a commonly assumed Rayleigh distribution (Weibull k = 2) is used. We find that the assumption of the same Weibull parameter across large regions can result in substantial errors. While large-scale wind speed data are often presented in the form of average wind speeds, these results highlight the need to also provide information on the wind speed distribution.

Zhou, Yuyu; Smith, Steven J.

2013-09-09

71

Distributed data analysis in ATLAS

NASA Astrophysics Data System (ADS)

Data analysis using grid resources is one of the fundamental challenges to be addressed before the start of LHC data taking. The ATLAS detector will produce petabytes of data per year, and roughly one thousand users will need to run physics analyses on this data. Appropriate user interfaces and helper applications have been made available to ensure that the grid resources can be used without requiring expertise in grid technology. These tools enlarge the number of grid users from a few production administrators to potentially all participating physicists. ATLAS makes use of three grid infrastructures for distributed analysis: the EGEE sites, the Open Science Grid, and NorduGrid. These grids are managed by the gLite workload management system, the PanDA workload management system, and ARC middleware; many sites can be accessed via both the gLite WMS and PanDA. Users can choose between two front-end tools to access the distributed resources. Ganga is a tool co-developed with LHCb to provide a common interface to the multitude of execution backends (local, batch, and grid). The PanDA workload management system provides a set of utilities called PanDA Client; with these tools users can easily submit Athena analysis jobs to the PanDA-managed resources. Distributed data are managed by Don Quijote 2 (DQ2), a system developed by ATLAS; DQ2 is used to replicate datasets according to the data distribution policies and maintains a central catalog of file locations. The operation of the grid resources is continually monitored by the Ganga Robot functional testing system, and infrequent site stress tests are performed using the HammerCloud system. In addition, the DAST shift team is a group of power users who take shifts to provide distributed analysis user support; this team has effectively relieved the support burden from the developers.

Nilsson, Paul; Atlas Collaboration

2012-12-01

72

Distribution-free discriminant analysis

This report describes our experience in implementing a non-parametric (distribution-free) discriminant analysis module for use in a wide range of pattern recognition problems. Issues discussed include performance results on both real and simulated data sets, comparisons to other methods, and the computational environment. In some cases, this module performs better than other existing methods. Nearly all cases can benefit from the application of multiple methods.

Burr, T.; Doak, J.

1997-05-01

73

Sugar Cane Nutrient Distribution Analysis

NASA Astrophysics Data System (ADS)

Neutron Activation Analysis (NAA), Molecular Absorption Spectrometry (UV-Vis), and Flame Photometry techniques were applied to measure plant nutrient concentrations of Br, Ca, Cl, K, Mn, N, Na and P in sugar-cane root, stalk and leaves. These data will be used to explore the behavior of element concentration in different parts of the sugar-cane to better understand the plant nutrient distribution during its development.

Zamboni, C. B.; da Silveira, M. A. G.; Gennari, R. F.; Garcia, I.; Medina, N. H.

2011-08-01

74

Weibull Statistics for Upper Ocean Currents with the Fokker-Planck Equation

NASA Astrophysics Data System (ADS)

Upper oceans typically exhibit a surface mixed layer with a thickness of a few to several hundred meters. This mixed layer is a key component in studies of climate, biological productivity, and marine pollution. It is the link between the atmosphere and the deep ocean and directly affects the air-sea exchange of heat, momentum, and gases. Vertically averaged horizontal currents across the mixed layer are driven by the residual between the Ekman transport and surface wind stress, and damped by Rayleigh friction. A set of stochastic differential equations is derived for the two components of the current vector (u, v). The joint probability distribution function of (u, v) satisfies the Fokker-Planck equation (Chu, 2008, 2009), with the Weibull distribution as the solution for the current speed. To verify this, the PDF of the upper (0-50 m) tropical Pacific current speeds (w) was calculated from hourly ADCP data (1990-2007) at six stations of the Tropical Atmosphere Ocean project. It satisfies the two-parameter Weibull distribution reasonably well, with different characteristics between El Nino and La Nina events: in the western Pacific, the PDF of w has a larger peakedness during La Nina events than during El Nino events, and vice versa in the eastern Pacific. However, the PDF of w for the lower layer (100-200 m) does not fit the Weibull distribution as well as that of the upper layer. This is due to the different stochastic differential equations for the upper and lower layers in the tropical Pacific. For the upper layer, the stochastic differential equations, established on the basis of Ekman dynamics, have an analytical solution, i.e., the Rayleigh distribution (the simplest form of the Weibull distribution), for constant eddy viscosity K.
In addition, the Weibull distribution is also identified in the near-real-time ocean surface currents derived from satellite altimeter (JASON-1, GFO, ENVISAT) and scatterometer (QSCAT) data at 1° × 1° resolution for the world oceans (60°S to 60°N), the "Ocean Surface Current Analyses - Real Time (OSCAR)" product. This PDF has little seasonal and interannual variation. Knowledge of the PDF of w, including during El Nino and La Nina events, will improve ensemble horizontal flux calculations, which contribute to climate studies. References: Chu, P. C., 2008: Probability distribution function of the upper equatorial Pacific current speeds. Geophysical Research Letters, 35, doi:10.1029/2008GL033669. Chu, P. C., 2009: Statistical characteristics of the global surface current speeds obtained from satellite altimeter and scatterometer data. IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing, 2(1), 27-32.

Chu, P. C.

2012-12-01

75

Statistical modeling of tornado intensity distributions

NASA Astrophysics Data System (ADS)

We address the question of an appropriate general functional shape for observed tornado intensity distributions. Recently, it was suggested that in the limit of long and large tornado records, exponential distributions over all positive Fujita or TORRO scale classes would result. Yet our analysis shows that, even for large databases, observations contradict the validity of exponential distributions for weak (F0) and violent (F5) tornadoes. We show that observed tornado intensities can be described much better by Weibull distributions, of which the exponential remains a special case. Weibull fits in either the v or F scale reproduce the observations significantly better than exponentials. In addition, we suggest applying the original definition of negative intensity scales down to F-2 and T-4 (corresponding to v=0 m s -1), at least for climatological analyses. Weibull distributions allow for an improved risk assessment of violent tornadoes up to F6, and better estimates of total tornado occurrence, the degree of underreporting, and the existence of subcritical tornadic circulations below damaging intensity. Our results are therefore relevant for climatologists and risk assessment managers alike.
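The exponential-versus-Weibull comparison can be sketched as a likelihood contest on synthetic intensities; the simulated record and its parameters are stand-ins, not the paper's tornado database. The exponential is the Weibull special case with shape k = 1.

```python
# Sketch: fit both models to synthetic wind-speed intensities and compare
# log-likelihoods. Shape and scale values are invented for illustration.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
v = stats.weibull_min.rvs(1.6, scale=40.0, size=5_000, random_state=rng)  # m/s

k, _, c = stats.weibull_min.fit(v, floc=0)
ll_weibull = stats.weibull_min.logpdf(v, k, scale=c).sum()
lam = v.mean()  # MLE scale of an exponential with location fixed at zero
ll_expon = stats.expon.logpdf(v, scale=lam).sum()
print(f"shape k = {k:.2f}; logL(Weibull) - logL(exponential) = {ll_weibull - ll_expon:.1f}")
```

When the fitted shape differs clearly from 1, the likelihood gap quantifies how much the exponential assumption costs, which is the kind of evidence the abstract cites against exponential tails.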

Dotzek, Nikolai; Grieser, Jürgen; Brooks, Harold E.

76

Weibull Effective Area for Hertzian Ring Crack Initiation Stress

Spherical (Hertzian) indentation is used to characterize and guide the development of engineered ceramics under consideration for diverse applications involving contact, wear, rolling fatigue, and impact. Ring crack initiation can be one important damage mechanism of Hertzian indentation. It is caused by sufficiently high, surface-located, radial tensile stresses in an annular ring located adjacent to and outside of the Hertzian contact circle. While the maximum radial tensile stress is known to depend on the elastic properties of the sphere and target, the diameter of the sphere, the applied compressive force, and the coefficient of friction, the Weibull effective area will also be affected by those parameters. However, estimating the maximum radial tensile stress and Weibull effective area is difficult because the coefficient of friction during Hertzian indentation is complex, likely intractable, and not known a priori. To circumvent this, Weibull effective area expressions are derived here for the two extremes that bracket all coefficients of friction: (1) the classical, frictionless Hertzian case where only complete slip occurs, and (2) the case where no slip occurs or the coefficient of friction is infinite.

Jadaan, Osama M. [University of Wisconsin, Platteville; Wereszczak, Andrew A [ORNL; Johanns, Kurt E [ORNL

2011-01-01

77

Analysis of Area Burned by Wildfires Through the Partitioning of a Probability Model1

An analysis of forest fires using a partitioned probability distribution is presented. Area burned during a fire is fitted to a probability model. This model is partitioned into small, medium, and large fires. Conditional expected values are computed for each partition. Two cases are presented: the two-parameter Weibull and the truncated shifted Pareto probability models. The methodology allows a comparison

Ernesto Alvarado; David V. Sandberg; Bruce B. Bare

78

A fuzzy-monte carlo simulation approach for fault tree analysis

Fault tree analysis is one of the key approaches used to analyze the reliability of critical systems. Fault trees are usually analyzed using mathematical approaches or Monte Carlo simulation (MCS). This paper presents a fuzzy-Monte Carlo simulation (FMCS) approach in which the uncertain data is generated by the MCS approach. The FMCS approach is applied to the Weibull probability distribution

Saman Aliari Zonouz; Seyed Ghassem Miremadi

2006-01-01

79

Distribution analysis of airborne nicotine concentrations in hospitality facilities.

A number of publications report statistical summaries for environmental tobacco smoke (ETS) concentrations. Despite compelling evidence for the data not being normally distributed, these publications typically report the arithmetic mean and standard deviation of the data, thereby losing important information related to the distribution of values contained in the original data. We were interested in the frequency distributions of reported nicotine concentrations in hospitality environments and subjected available data to distribution analyses. The distribution of experimental indoor airborne nicotine concentration data taken from hospitality facilities worldwide was fit to lognormal, Weibull, exponential, Pearson (Type V), logistic, and loglogistic distribution models. Comparison of goodness of fit (GOF) parameters and indications from the literature verified the selection of a lognormal distribution as the overall best model. When individual data were not reported in the literature, statistical summaries of results were used to model sets of lognormally distributed data that are intended to mimic the original data distribution. Grouping the data into various categories led to 31 frequency distributions that were further interpreted. The median values in nonsmoking environments are about half of the median values in smoking sections. When different continents are compared, Asian, European, and North American median values in restaurants are about a factor of three below levels encountered in other hospitality facilities. On a comparison of nicotine concentrations in North American smoking sections and nonsmoking sections, median values are about one-third of the European levels. The results obtained may be used to address issues related to exposure to ETS in the hospitality sector. PMID:11868665
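The model-selection step described above can be sketched as follows. The candidate set is trimmed, the data are synthetic lognormal draws rather than the measured concentrations, and AIC stands in for the several goodness-of-fit parameters the authors compared.

```python
# Sketch: fit several candidate distributions to synthetic concentration data
# and rank them by AIC. Data and candidate set are illustrative only.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
nicotine = rng.lognormal(mean=1.0, sigma=0.8, size=500)  # hypothetical concentrations

candidates = {
    "lognormal": stats.lognorm,
    "weibull": stats.weibull_min,
    "exponential": stats.expon,
    "logistic": stats.logistic,
}
aic = {}
for name, dist in candidates.items():
    # Fix location at zero for the positive-support models.
    params = dist.fit(nicotine) if name == "logistic" else dist.fit(nicotine, floc=0)
    ll = dist.logpdf(nicotine, *params).sum()
    n_free = len(params) - (0 if name == "logistic" else 1)  # loc was fixed
    aic[name] = 2 * n_free - 2 * ll

print("AIC ranking (best first):", sorted(aic, key=aic.get))
```

With data that are genuinely lognormal, the lognormal model should dominate the ranking, consistent with the abstract's conclusion for nicotine concentrations.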

Schorp, Matthias K; Leyden, Donald E

2002-02-01

80

Effects of cyclic stress distribution models on fatigue life predictions

NASA Astrophysics Data System (ADS)

The fatigue analysis of a wind turbine component typically uses representative samples of cyclic loads to determine lifetime loads. In this paper, several techniques currently in use are compared to one another based on fatigue life analyses. The generalized Weibull fitting technique is used to remove the artificial truncation of large-amplitude cycles that is inherent in relatively short data sets. Using data from the Sandia/DOE 34-m Test Bed, the generalized Weibull fitting technique is shown to be excellent for matching the body of the distribution of cyclic loads and for extrapolating the tail of the distribution. However, the data also illustrate that the fitting technique is not a substitute for an adequate database.

Sutherland, H. J.; Veers, P. S.

1994-10-01

81

Experimental design strategy for the Weibull dose-response model (journal version)

The objective of the research was to determine optimum design-point allocation for estimation of relative yield losses from ozone pollution when the true and fitted yield-ozone dose-response relationship follows the Weibull. The optimum design is dependent on the values of the Weibull model parameters. A transformation was developed that allowed the optimum design (by the determinant criterion) for one parametric situation to be translated to any other, and permitted the search for optimum designs to be restricted to one set of Weibull parameters. Optimum designs were determined for the case where the Weibull parameters are assumed known, and effects of deviating from the optimum designs were investigated. Several alternative design strategies were considered for protecting against incorrectly guessing the Weibull model parameters when their true values are not known.
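The Weibull dose-response form referred to above is commonly written y(d) = α exp(−(d/σ)^γ). A brief sketch with invented parameter values (not from the article) shows how relative yield loss follows from it.

```python
# Sketch: relative yield loss under a Weibull dose-response curve,
# 1 - exp(-(d/sigma)^gamma). Parameter values are hypothetical.
import math

alpha, sigma, gamma_ = 100.0, 0.12, 2.5  # yield scale, ozone dose scale (ppm), shape

def rel_yield_loss(dose):
    # Loss relative to the zero-dose yield alpha; alpha cancels out.
    return 1.0 - math.exp(-((dose / sigma) ** gamma_))

for d in (0.04, 0.08, 0.12):
    print(f"dose {d:.2f} ppm -> relative yield loss {rel_yield_loss(d):.1%}")
```

At d = σ the loss is exactly 1 − e^(−1) ≈ 63%, which is why σ acts as the characteristic dose; the optimal design-point allocation studied in the article depends on where σ and γ actually lie.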

Dassel, K.A.; Rawlings, J.O.

1988-01-01

82

Towards Distributed Memory Parallel Program Analysis

This paper presents a parallel attribute evaluation for distributed-memory parallel computer architectures, where previously only shared-memory parallel support for this technique had been developed. Attribute evaluation is part of how attribute grammars are used for program analysis within modern compilers. In this work, we have extended ROSE, an open compiler infrastructure, with a distributed-memory parallel attribute evaluation mechanism to support user-defined global program analysis, required for some forms of security analysis that cannot be addressed by a file-by-file view of large-scale applications. As a result, user-defined security analyses may now run in parallel without the user having to specify how data is communicated between processors. The automation of communication enables an extensible open-source parallel program analysis infrastructure.

Quinlan, D; Barany, G; Panas, T

2008-06-17

83

Distributed leadership as a unit of analysis

This article proposes a new unit of analysis in the study of leadership. As an alternative to the current focus, which is primarily on the deeds of individual leaders, the article proposes distributed leadership. The article shows how conventional constructs of leadership have difficulty accommodating changes in the division of labor in the workplace, especially, new patterns of interdependence and

Peter Gronn

2002-01-01

84

Parton distributions: a new global analysis

We present a new analysis of parton distributions of the proton. This incorporates a wide range of new data, an improved treatment of heavy flavours and a re-examination of prompt photon production. The new set (MRST) shows systematic differences from previous sets of partons which can be identified with particular features of the new data and with improvements in the

A. D. Martin; R. G. Roberts; W. J. Stirling; R. S. Thorne

1998-01-01

85

Gene Microarray Analysis using Angular Distribution Decomposition

Gene Microarray Analysis using Angular Distribution Decomposition. Karen Lees, Stephen Roberts. ...of microarray data to group genes with similar expression profiles. The similarity of expression profiles is used to define the similarity of gene expression patterns. The pairwise comparisons of experimental conditions...

Roberts, Stephen

86

The Lindley distribution applied to competing risks lifetime data.

Competing risks data usually arise in studies in which the death or failure of an individual or item may be classified into one of k ≥ 2 mutually exclusive causes. In this paper a simple competing risks distribution is proposed as a possible alternative to the exponential or Weibull distributions usually considered in lifetime data analysis. We consider the case when the competing risks have a Lindley distribution. We also assume that the competing events are uncorrelated and that each subject can experience only one type of event at any particular time. PMID:21550685

Mazucheli, Josmar; Achcar, Jorge A

2011-11-01

87

Generalized parton distributions: analysis and applications

Results from a recent analysis of the zero-skewness generalized parton distributions (GPDs) for valence quarks are discussed. The analysis is based on a physically motivated parameterization of the GPDs with a few free parameters adjusted to the available nucleon form factor data. Various moments of the GPDs, as well as their Fourier transforms, the quark densities in the impact parameter plane, are also presented. The 1/x moments of the zero-skewness GPDs are form factors specific to Compton scattering off protons within the handbag approach. The results of the GPD analysis enable one to predict Compton scattering.

P. Kroll

2004-12-14

88

Calibration of three-parameter Weibull stress model for 15Kh2NMFA RPV steel

NASA Astrophysics Data System (ADS)

The Weibull stress model is a basic local-approach model used in the ductile-to-brittle transition region for the description and prediction of cleavage fracture in materials of both PWR and WWER reactor pressure vessels. In the Weibull stress model used most frequently until now [1], the parameters are determined by a calibration procedure using the fracture toughness values of high- and low-constraint specimens. In the present paper, results from SEN(B) pre-cracked specimens of 10 × 20 × 120 mm size, with deep and shallow cracks, are utilized. The specimens were made of the material of a WWER-1000 reactor pressure vessel and were tested at the Nuclear Research Institute Rez. The Weibull stress was determined both with and without the plastic strain correction in the Weibull stress formula.

Beleznai, Robert; Lenkey, Gyöngyvér B.; Lauerova, Dana

2010-11-01

89

A power series beta Weibull regression model for predicting breast carcinoma.

The postmastectomy survival rates are often based on previous outcomes of large numbers of women who had the disease, but they do not accurately predict what will happen in any particular patient's case. Pathologic explanatory variables such as disease multifocality, tumor size, tumor grade, lymphovascular invasion, and enhanced lymph node staining are prognostically significant for predicting these survival rates. We propose a new cure rate survival regression model for predicting breast carcinoma survival in women who underwent mastectomy. We assume that the unknown number of competing causes that can influence the survival time is given by a power series distribution, and that the time for the tumor cells left active after the mastectomy to metastasize follows the beta Weibull distribution. The new compounding regression model includes as special cases several well-known cure rate models discussed in the literature. The model parameters are estimated by maximum likelihood. Further, simulations are performed for different parameter settings, sample sizes, and censoring percentages. We derive the appropriate matrices for assessing local influence on the parameter estimates under different perturbation schemes and present some ways to assess local influence. The ability of the new regression model to accurately predict breast carcinoma mortality is illustrated by means of real data. Copyright © 2015 John Wiley & Sons, Ltd. PMID:25620602

Ortega, Edwin M M; Cordeiro, Gauss M; Campelo, Ana K; Kattan, Michael W; Cancho, Vicente G

2015-04-15

90

Group Sequential Design for Randomized Phase III Trials under the Weibull Model.

In this paper, a parametric sequential test is proposed under the Weibull model. The proposed test is asymptotically normal with an independent increments structure. The sample size for the fixed-sample test is derived for the purpose of group sequential trial design. In addition, a multi-stage group sequential procedure is given under the Weibull model by applying the Brownian motion property of the test statistic and the sequential conditional probability ratio test methodology. PMID:25322440

Wu, Jianrong; Xiong, Xiaoping

2014-10-16

91

The pharmacokinetics of valproic acid after oral administration of sustained-release formulations were studied in 12 healthy volunteers. The objective of the present study was to find an appropriate mathematical model to describe the complex drug intake process. The concentration of valproic acid in plasma was measured by HPLC. For each subject, during the input process a double peak phenomenon was observed, the plasma concentrations were fitted according to a single or a double Weibull input function, and then a first-order elimination rate was used to describe the observed data. The Weibull model was considered as an approximation of the overall process. The mean peak plasma concentration, 34.6 +/- 8.9 mg/L, was reached after 8.6 +/- 2.7 h. A single Weibull function adequately described the observed data for three subjects; the mean Weibull parameters were td (the time necessary to transfer 63% of the administered drug into the systemic circulation) of 7.87 +/- 3.53 h and gamma (shape) of 1.16 +/- 0.66. A double Weibull input function was used for nine subjects; the mean Weibull parameters were td1 = 2.35 +/- 1.18 h and td2 = 9.36 +/- 4.47 h and gamma 1 = 1.77 +/- 2.27 and gamma 2 = 3.68 +/- 3.26. The mean half-life value of the elimination phase was 14.4 +/- 4.6 h. PMID:7884670
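The meaning of the reported td parameter (the time to transfer 63% of the dose) follows directly from the Weibull input function F(t) = 1 − exp(−(t/td)^γ). A short check, using the single-Weibull mean parameters quoted above:

```python
# Sketch of the Weibull input function: cumulative fraction of the dose
# transferred to the systemic circulation by time t.
import math

td, gamma_ = 7.87, 1.16  # mean td (h) and shape from the single-Weibull fits above

def fraction_absorbed(t):
    return 1.0 - math.exp(-((t / td) ** gamma_))

print(f"fraction absorbed at t = td: {fraction_absorbed(td):.3f}")  # 1 - 1/e ≈ 0.632
```

Regardless of the shape parameter γ, F(td) = 1 − e^(−1) ≈ 0.632, which is exactly the 63% definition of td given in the abstract.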

Bressolle, F; Gomeni, R; Alric, R; Royer-Morrot, M J; Necciari, J

1994-10-01

92

Ceramics Analysis and Reliability Evaluation of Structures (CARES). Users and programmers manual

NASA Technical Reports Server (NTRS)

This manual describes how to use the Ceramics Analysis and Reliability Evaluation of Structures (CARES) computer program. The primary function of the code is to calculate the fast fracture reliability or failure probability of macroscopically isotropic ceramic components. These components may be subjected to complex thermomechanical loadings, such as those found in heat engine applications. The program uses results from MSC/NASTRAN or ANSYS finite element analysis programs to evaluate component reliability due to inherent surface and/or volume type flaws. CARES utilizes the Batdorf model and the two-parameter Weibull cumulative distribution function to describe the effect of multiaxial stress states on material strength. The principle of independent action (PIA) and the Weibull normal stress averaging models are also included. Weibull material strength parameters, the Batdorf crack density coefficient, and other related statistical quantities are estimated from four-point bend bar or uniform uniaxial tensile specimen fracture strength data. Parameter estimation can be performed for single or multiple failure modes by using the least-squares analysis or the maximum likelihood method. Kolmogorov-Smirnov and Anderson-Darling goodness-of-fit tests, ninety percent confidence intervals on the Weibull parameters, and Kanofsky-Srinivasan ninety percent confidence band values are also provided. The probabilistic fast-fracture theories used in CARES, along with the input and output for CARES, are described. Example problems to demonstrate various features of the program are also included. This manual describes the MSC/NASTRAN version of the CARES program.
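The maximum likelihood parameter-estimation step the manual describes can be sketched independently of CARES; the simulated fracture strengths and parameter values below are illustrative stand-ins for bend-bar data.

```python
# Sketch: estimate two-parameter Weibull strength parameters (modulus m and
# characteristic strength) from a small fracture-strength sample by MLE.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
# Hypothetical fracture strengths (MPa) for 30 specimens.
strengths = stats.weibull_min.rvs(12.0, scale=450.0, size=30, random_state=rng)

m_hat, _, s0_hat = stats.weibull_min.fit(strengths, floc=0)
print(f"Weibull modulus m ≈ {m_hat:.1f}, characteristic strength ≈ {s0_hat:.0f} MPa")
```

With only 30 specimens the shape estimate scatters noticeably around the true modulus, which is why CARES also reports confidence intervals and goodness-of-fit tests alongside the point estimates.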

Nemeth, Noel N.; Manderscheid, Jane M.; Gyekenyesi, John P.

1990-01-01

93

Thermal fatigue reliability analysis for PBGA with Sn3.8Ag0.7Cu solder joints

In this work, thermal cycling reliability test and analysis for PBGA components with Sn-3.8Ag-0.7Cu solder joints were investigated. Based on test results, a two-parameter Weibull distribution model was used to determine the mean time to failure (MTTF) of PBGA components. The MTTF was used for validation of finite element analysis (FEA) results. FEA analysis using quarter model and submodeling method

F. X. Che; J. H. L. Pang

2004-01-01

94

Parton distributions: a new global analysis

We present a new analysis of parton distributions of the proton. This incorporates a wide range of new data, an improved treatment of heavy flavours and a re-examination of prompt photon production. The new set (MRST) shows systematic differences from previous sets of partons which can be identified with particular features of the new data and with improvements in

A. D. Martin; R. G. Roberts; W. J. Stirling; R. S. Thorne

1998-01-01

95

Parton distributions: a new global analysis

We present a new analysis of parton distributions of the proton. This incorporates a wide range of new data, an improved treatment of heavy flavours and a re-examination of prompt photon production. The new set (MRST) shows systematic differences from previous sets of partons which can be identified with particular features of the new data and with improvements in the

A. D. Martin; R. G. Roberts; W. J. Stirling; R. S. Thorne

1998-01-01

96

Performance of Weibull and Linear Semi-logarithmic Models in Simulating Inactivation in Waters.

Modeling inactivation of indicator microorganisms is a necessary component of microbial water quality forecast and management recommendations. The linear semi-logarithmic (LSL) model is commonly used to simulate the dependence of bacterial concentrations in waters on time. There were indications that the assumption of semi-logarithmic linearity may not be accurate enough in waters. The objective of this work was to compare the performance of the LSL and two-parameter Weibull inactivation models with data on survival of indicator organisms in various types of water from a representative database of 167 laboratory experiments. The Weibull model was preferred in >99% of all cases when the root mean squared errors and Nash-Sutcliffe statistics were compared. Comparison of corrected Akaike statistic values gave the preference to the Weibull model in only 35% of cases. This was caused by (i) a small number of experimental points on some inactivation curves, (ii) closeness of the shape parameter of the Weibull equation to one, and (iii) piecewise log-linear inactivation dynamics that could be well described by neither of the two models compared. Based on the Akaike test, the Weibull model was favored in agricultural, lake, and pristine waters, whereas the LSL model was preferred for groundwater, wastewater, rivers, and marine waters. The decimal reduction time parameter of both the LSL and Weibull models exhibited an Arrhenius-type dependence on temperature. Overall, the existing inactivation data indicate that the application of the Weibull model can improve the predictive capabilities of microbial water quality modeling. PMID:25603241

Stocker, M D; Pachepsky, Y A; Shelton, D R

2014-09-01
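The LSL and Weibull inactivation models compared in the abstract above can be contrasted on a small synthetic data set. This is a sketch, not the authors' procedure: the survival curve below is generated from an assumed Weibull with shape < 1 (tailing), the LSL fit uses closed-form least squares through the origin, and a coarse grid search stands in for a proper nonlinear fit.

```python
import math

# Synthetic survival data: log-reduction y(t) = -(t/delta)**p (Weibull, tailing)
delta_true, p_true = 5.0, 0.6
times = [1, 2, 4, 8, 16, 32]
y = [-((t / delta_true) ** p_true) for t in times]  # log10(N/N0)

# LSL model: y = -t/D; least squares through the origin gives 1/D = -sum(t*y)/sum(t^2)
inv_D = -sum(t * yi for t, yi in zip(times, y)) / sum(t * t for t in times)
rmse_lsl = math.sqrt(sum((yi + inv_D * t) ** 2 for t, yi in zip(times, y)) / len(times))

# Weibull model: coarse grid search over (delta, p) minimizing squared error
best = (float("inf"), None, None)
for d10 in range(10, 100):
    for p10 in range(2, 20):
        d, p = d10 / 10.0, p10 / 10.0
        sse = sum((yi + (t / d) ** p) ** 2 for t, yi in zip(times, y))
        if sse < best[0]:
            best = (sse, d, p)
rmse_wb = math.sqrt(best[0] / len(times))

print(f"LSL RMSE={rmse_lsl:.3f}  Weibull RMSE={rmse_wb:.3f}")  # Weibull fits better
```

Because the data are concave on the semi-log scale, the straight-line LSL model leaves systematic residuals that the extra Weibull shape parameter absorbs.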

97

Distributed processing for multiresolution dynamic scene analysis

High computational cost has long been a problem in computer vision. A model to alleviate this problem in motion analysis is presented. The model is a hybrid of three techniques, namely, hierarchical structures, parallelism, and selective analysis control. A pipelined pyramid image structure is constructed in the model by continually converging incoming images into successively coarser resolutions. The model also contains a set of processes that work concurrently and asynchronously on subimages at different levels of this pyramid structure. These processes initially watch for interesting features in the coarsest resolution rendition of the scene. Processes working on promising areas individually but cooperatively proceed to progressively finer-resolution levels according to a selective-analysis control strategy. The control mechanism is afforded through a blackboard structure, which also permits a unified-scene interpretation. The model was implemented in a simulated distributed system. Implementation results with different image sequences including ambiguous and occlusion scenes are presented.

Tan, C.L.

1986-01-01

98

Performance optimisations for distributed analysis in ALICE

NASA Astrophysics Data System (ADS)

Performance is a critical issue in a production system accommodating hundreds of analysis users. Compared to a local session, distributed analysis is exposed to services and network latencies, remote data access and heterogeneous computing infrastructure, creating a more complex performance and efficiency optimization matrix. During the last 2 years, ALICE analysis shifted from a fast development phase to more mature and stable code. At the same time, the frameworks and tools for deployment, monitoring and management of large productions have evolved considerably too. The ALICE Grid production system is currently used by a fair share of organized and individual user analysis, consuming up to 30% of the available resources and ranging from fully I/O-bound analysis code to CPU-intensive correlations or resonances studies. While the intrinsic analysis performance is unlikely to improve by a large factor during the LHC long shutdown (LS1), the overall efficiency of the system still has to be improved by a significant factor to satisfy the analysis needs. We have instrumented all analysis jobs with "sensors" collecting comprehensive monitoring information on the job running conditions and performance in order to identify bottlenecks in the data processing flow. These data are collected by the MonALISa-based ALICE Grid monitoring system and are used to steer and improve the job submission and management policy, to identify operational problems in real time and to perform automatic corrective actions. In parallel with an upgrade of our production system we are aiming for low-level improvements related to data format, data management and merging of results to allow for a better performing ALICE analysis.

Betev, L.; Gheata, A.; Gheata, M.; Grigoras, C.; Hristov, P.

2014-06-01

99

Time-dependent reliability analysis of ceramic engine components

The computer program CARES/LIFE calculates the time-dependent reliability of monolithic ceramic components subjected to thermomechanical and/or proof test loading. This program is an extension of the CARES (Ceramics Analysis and Reliability Evaluation of Structures) computer program. CARES/LIFE accounts for the phenomenon of subcritical crack growth (SCG) by utilizing either the power or Paris law relations. The two-parameter Weibull cumulative distribution

Noel N. Nemeth

1993-01-01

100

Analysis of Voltage Rise Effect on Distribution Network with Distributed

and Formby (2000). Since the modern distribution systems are designed to accept bulk power from. Hence, there are dramatic changes in the nature of distribution networks with distributed generation, and disconnection of a large penetration of distributed generation may become problematic, which may lead to sudden

Pota, Himanshu Roy

101

Analysis and control of distributed cooperative systems.

As part of DARPA Information Processing Technology Office (IPTO) Software for Distributed Robotics (SDR) Program, Sandia National Laboratories has developed analysis and control software for coordinating tens to thousands of autonomous cooperative robotic agents (primarily unmanned ground vehicles) performing military operations such as reconnaissance, surveillance and target acquisition; countermine and explosive ordnance disposal; force protection and physical security; and logistics support. Due to the nature of these applications, the control techniques must be distributed, and they must not rely on high bandwidth communication between agents. At the same time, a single soldier must easily direct these large-scale systems. Finally, the control techniques must be provably convergent so as not to cause undue harm to civilians. In this project, provably convergent, moderate communication bandwidth, distributed control algorithms have been developed that can be regulated by a single soldier. We have simulated in great detail the control of low numbers of vehicles (up to 20) navigating throughout a building, and we have simulated in lesser detail the control of larger numbers of vehicles (up to 1000) trying to locate several targets in a large outdoor facility. Finally, we have experimentally validated the resulting control algorithms on smaller numbers of autonomous vehicles.

Feddema, John Todd; Parker, Eric Paul; Wagner, John S.; Schoenwald, David Alan

2004-09-01

102

Stochastic Analysis of Wind Energy for Wind Pump Irrigation in Coastal Andhra Pradesh, India

NASA Astrophysics Data System (ADS)

The rapid escalation in the prices of oil and gas as well as increasing demand for energy has attracted the attention of scientists and researchers to explore the possibility of generating and utilizing the alternative and renewable sources of wind energy in the long coastal belt of India with considerable wind energy resources. A detailed analysis of wind potential is a prerequisite to harvest the wind energy resources efficiently. Keeping this in view, the present study was undertaken to analyze the wind energy potential to assess the feasibility of a wind-pump operated irrigation system in the coastal region of Andhra Pradesh, India, where high ground water table conditions are available. Wind speed data were analyzed stochastically and tested for fit against probability distributions describing the wind energy potential in the region. The normal and Weibull probability distributions were tested, and on the basis of the chi-square test, the Weibull distribution gave better results. Hence, it was concluded that the Weibull probability distribution may be used to stochastically describe the annual wind speed data of coastal Andhra Pradesh with better accuracy. The size of the complete irrigation system was determined with mass curve analysis to satisfy various daily irrigation demands at different risk levels.

Raju, M. M.; Kumar, A.; Bisht, D.; Rao, D. B.

2014-09-01
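A chi-square goodness-of-fit check of a Weibull model against binned wind speeds, as used in the study above, can be sketched as follows. All numbers here are hypothetical (the bin counts and the fitted scale/shape are invented for illustration, not taken from the paper), and parameter fitting itself is assumed to have been done beforehand.

```python
import math

def weibull_cdf(v, c, k):
    """CDF of the two-parameter Weibull with scale c and shape k."""
    return 1.0 - math.exp(-((v / c) ** k))

# Observed counts of wind speed (m/s) in bins [0,2),[2,4),...; last bin is open-ended
edges = [0, 2, 4, 6, 8, 10, 1e9]  # 1e9 stands in for infinity
observed = [8, 25, 34, 21, 9, 3]
n = sum(observed)

# Assumed fitted parameters (hypothetical): scale c = 4.5 m/s, shape k = 2.0
c, k = 4.5, 2.0
expected = [n * (weibull_cdf(edges[i + 1], c, k) - weibull_cdf(edges[i], c, k))
            for i in range(len(observed))]

chi2 = sum((o - e) ** 2 / e for o, e in zip(observed, expected))
dof = len(observed) - 1 - 2  # bins minus one, minus two estimated parameters
print(f"chi-square statistic = {chi2:.2f} on {dof} degrees of freedom")
```

The statistic would then be compared against the chi-square critical value at the chosen significance level; a smaller statistic for the Weibull than for the normal model is what led the authors to prefer the Weibull.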

103

Bayesian inference based on dual generalized order statistics from the exponentiated Weibull model

NASA Astrophysics Data System (ADS)

Bayesian estimation for the two parameters and the reliability function of the exponentiated Weibull model are obtained based on dual generalized order statistics (DGOS). Also, Bayesian prediction bounds for future DGOS from exponentiated Weibull model are obtained. The symmetric and asymmetric loss functions are considered for Bayesian computations. The Markov chain Monte Carlo (MCMC) methods are used for computing the Bayes estimates and prediction bounds. The results have been specialized to the lower record values. Comparisons are made between Bayesian and maximum likelihood estimators via Monte Carlo simulation.

Al Sobhi, Mashail M.

2015-02-01
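The MCMC machinery referenced in the abstract can be illustrated with a much simpler stand-in: a random-walk Metropolis sampler for the shape parameter of an ordinary Weibull likelihood with the scale fixed at 1 and a flat prior. This is a didactic sketch only; it is not the exponentiated Weibull/DGOS analysis of the paper, and all numbers are synthetic.

```python
import math, random

random.seed(42)

# Synthetic lifetimes from a Weibull(shape=2, scale=1) via inverse-CDF sampling
data = [(-math.log(1.0 - random.random())) ** 0.5 for _ in range(200)]

def log_lik(shape):
    """Weibull log-likelihood with scale fixed at 1 (simplification for the sketch)."""
    if shape <= 0:
        return -math.inf
    return sum(math.log(shape) + (shape - 1) * math.log(x) - x ** shape for x in data)

# Random-walk Metropolis with a flat prior on the shape parameter
theta, cur_ll, samples = 1.0, log_lik(1.0), []
for i in range(5000):
    prop = theta + random.gauss(0.0, 0.1)
    prop_ll = log_lik(prop)
    if math.log(random.random()) < prop_ll - cur_ll:  # accept/reject step
        theta, cur_ll = prop, prop_ll
    if i >= 1000:  # discard burn-in
        samples.append(theta)

post_mean = sum(samples) / len(samples)
print(f"posterior mean of shape ~ {post_mean:.2f}")  # should land near the true value 2
```

A full treatment along the paper's lines would sample both parameters jointly, use the DGOS likelihood, and evaluate Bayes estimates under symmetric and asymmetric loss functions.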

104

NASA Astrophysics Data System (ADS)

We analyze waiting times for price changes in a foreign currency exchange rate. Recent empirical studies of high-frequency financial data support that trades in financial markets do not follow a Poisson process and the waiting times between trades are not exponentially distributed. Here we show that our data are well approximated by a Weibull distribution rather than an exponential distribution in the non-asymptotic regime. Moreover, we quantitatively evaluate how far the empirical data are from an exponential distribution using a Weibull fit. Finally, we discuss a transition between a Weibull law and a power law in the long-time asymptotic regime.

Sazuka, Naoya

2007-03-01
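Comparing a Weibull fit against an exponential fit by maximum likelihood, as in the study above, can be sketched on synthetic waiting times. The data below are generated (not real exchange-rate data), and a grid search with a profiled scale parameter stands in for a full numerical optimizer; the exponential is the k = 1 special case, so the Weibull log-likelihood can never be worse.

```python
import math, random

random.seed(7)
# Synthetic waiting times with more short waits than an exponential would give:
# Weibull(shape=0.6, scale=1) via inverse-CDF sampling
wt = [(-math.log(1.0 - random.random())) ** (1 / 0.6) for _ in range(500)]
n = len(wt)

# Exponential MLE: rate = 1/mean; maximized log-likelihood is n*(log(rate) - 1)
mean = sum(wt) / n
ll_exp = n * (math.log(1.0 / mean) - 1.0)

def ll_weibull(k):
    """Profile log-likelihood: for shape k the MLE scale is (mean of x**k)**(1/k)."""
    lam = (sum(x ** k for x in wt) / n) ** (1.0 / k)
    return sum(math.log(k / lam) + (k - 1) * math.log(x / lam) - (x / lam) ** k
               for x in wt)

ll_wb = max(ll_weibull(k / 100.0) for k in range(20, 200))
print(f"log-lik exponential={ll_exp:.1f}  Weibull={ll_wb:.1f}")  # Weibull wins
```

In practice the two fits would be compared with a likelihood-ratio test or an information criterion that penalizes the Weibull's extra parameter.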

105

Buffered Communication Analysis in Distributed Multiparty Sessions

NASA Astrophysics Data System (ADS)

Many communication-centred systems today rely on asynchronous messaging among distributed peers to make efficient use of parallel execution and resource access. With such asynchrony, the communication buffers can grow considerably over time. This paper proposes a static verification methodology based on multiparty session types which can efficiently compute the upper bounds on buffer sizes. Our analysis relies on a uniform causality audit of the entire collaboration pattern - an examination that is not always possible from each end-point type. We extend this method to design algorithms that allocate communication channels in order to optimise the memory requirements of session executions. From these analyses, we propose two refinement methods which respect buffer bounds: a global protocol refinement that automatically inserts confirmation messages to guarantee stipulated buffer sizes and a local protocol refinement to optimise asynchronous messaging without buffer overflow. Finally our work is applied to overcome a buffer overflow problem of the multi-buffering algorithm.

Deniélou, Pierre-Malo; Yoshida, Nobuko

106

Fracture mechanics concepts in reliability analysis of monolithic ceramics

NASA Technical Reports Server (NTRS)

Basic design concepts for high-performance, monolithic ceramic structural components are addressed. The design of brittle ceramics differs from that of ductile metals because of the inability of ceramic materials to redistribute high local stresses caused by inherent flaws. Random flaw size and orientation requires that a probabilistic analysis be performed in order to determine component reliability. The current trend in probabilistic analysis is to combine linear elastic fracture mechanics concepts with the two parameter Weibull distribution function to predict component reliability under multiaxial stress states. Nondestructive evaluation supports this analytical effort by supplying data during verification testing. It can also help to determine statistical parameters which describe the material strength variation, in particular the material threshold strength (the third Weibull parameter), which in the past was often taken as zero for simplicity.

Manderscheid, Jane M.; Gyekenyesi, John P.

1987-01-01

107

ATLAS Distributed Data Analysis: challenges and performance

In the LHC operations era the key goal is to analyse the results of the collisions of high-energy particles as a way of probing the fundamental forces of nature. The ATLAS experiment at the LHC at CERN is recording and simulating several tens of petabytes of data per year. The ATLAS Computing Model was designed around the concepts of Grid Computing. Large data volumes from the detectors and simulations require a large number of CPUs and storage space for data processing. To cope with this challenge a global network known as the Worldwide LHC Computing Grid (WLCG) was built. This is the most sophisticated data taking and analysis system ever built. ATLAS accumulated more than 140 PB of data between 2009 and 2014. To analyse these data ATLAS developed, deployed and now operates a mature and stable distributed analysis (DA) service on the WLCG. The service is actively used: more than half a million user jobs run daily on DA resources, submitted by more than 1500 ATLAS physicists. A significant reliability of the...

Fassi, Farida; The ATLAS collaboration

2015-01-01

108

Holistic schedulability analysis for distributed hard real-time systems

This paper extends the current analysis associated with static priority pre-emptive based scheduling to address the wider problem of analysing schedulability of a distributed hard real-time system; in particular it derives analysis for a distributed system where tasks with arbitrary deadlines communicate by message passing and shared data areas. A simple TDMA protocol is assumed, and analysis developed to bound

Ken Tindell; John Clark

1994-01-01

109

Likelihood analysis of earthquake focal mechanism distributions

NASA Astrophysics Data System (ADS)

In our paper published earlier we discussed forecasts of earthquake focal mechanism and ways to test the forecast efficiency. Several verification methods were proposed, but they were based on ad hoc, empirical assumptions, thus their performance is questionable. We apply a conventional likelihood method to measure the skill of earthquake focal mechanism orientation forecasts. The advantage of such an approach is that earthquake rate prediction can be adequately combined with focal mechanism forecast, if both are based on the likelihood scores, resulting in a general forecast optimization. We measure the difference between two double-couple sources as the minimum rotation angle that transforms one into the other. We measure the uncertainty of a focal mechanism forecast (the variability), and the difference between observed and forecasted orientations (the prediction error), in terms of these minimum rotation angles. To calculate the likelihood score we need to compare actual forecasts or occurrences of predicted events with the null hypothesis that the mechanism's 3-D orientation is random (or equally probable). For 3-D rotation the random rotation angle distribution is not uniform. To better understand the resulting complexities, we calculate the information (likelihood) score for two theoretical rotational distributions (Cauchy and von Mises-Fisher), which are used to approximate earthquake source orientation pattern. We then calculate the likelihood score for earthquake source forecasts and for their validation by future seismicity data. Several issues need to be explored when analyzing observational results: their dependence on forecast and data resolution, internal dependence of scores on forecasted angle and random variability of likelihood scores. Here, we propose a simple tentative solution but extensive theoretical and statistical analysis is needed.

Kagan, Yan Y.; Jackson, David D.

2015-06-01

110

Development of pair distribution function analysis

This is the final report of a 3-year LDRD project at LANL. It has become more and more evident that structural coherence in the CuO2 planes of high-Tc superconducting materials over some intermediate length scale (nm range) is important to superconductivity. In recent years, the pair distribution function (PDF) analysis of powder diffraction data has been developed for extracting structural information on these length scales. This project sought to expand and develop this technique, use it to analyze neutron powder diffraction data, and apply it to problems. In particular, interest is in the area of high-Tc superconductors, although we planned to extend the study to the closely related perovskite ferroelectric materials and other materials where the local structure affects the properties and where detailed knowledge of the local and intermediate-range structure is important. In addition, we planned to carry out single crystal experiments to look for diffuse scattering. This information augments the information from the PDF.

Vondreele, R.; Billinge, S.; Kwei, G.; Lawson, A.

1996-09-01

111

NASA Technical Reports Server (NTRS)

The SCARE (Structural Ceramics Analysis and Reliability Evaluation) computer program on statistical fast fracture reliability analysis with quadratic elements for volume distributed imperfections is enhanced to include the use of linear finite elements and the capability of designing against concurrent surface flaw induced ceramic component failure. The SCARE code is presently coupled as a postprocessor to the MSC/NASTRAN general purpose, finite element analysis program. The improved version now includes the Weibull and Batdorf statistical failure theories for both surface and volume flaw based reliability analysis. The program uses the two-parameter Weibull fracture strength cumulative failure probability distribution model with the principle of independent action for poly-axial stress states, and Batdorf's shear-sensitive as well as shear-insensitive statistical theories. The shear-sensitive surface crack configurations include the Griffith crack and Griffith notch geometries, using the total critical coplanar strain energy release rate criterion to predict mixed-mode fracture. Weibull material parameters based on both surface and volume flaw induced fracture can also be calculated from modulus of rupture bar tests, using the least squares method with known specimen geometry and grouped fracture data. The statistical fast fracture theories for surface flaw induced failure, along with selected input and output formats and options, are summarized. An example problem to demonstrate various features of the program is included.

Gyekenyesi, John P.; Nemeth, Noel N.

1987-01-01

112

Building Finite Element Analysis Programs in Distributed Services Environment

Practicing engineers today usually perform finite element structural analyses. Traditional finite element analysis (FEA) programs are typically built

Stanford University

113

It has been shown in [2] that the strength of Aramide fibers can be described with a sufficient degree of accuracy by means of the statistical theory of strength [3]. Weibull's distribution was taken as the distribution function for the strength of monofibers. It was found that Weibull's function can be unimodal or bimodal, depending on the structure of the

L. V. Kompaniets; V. V. Potapov; G. A. Grigoryan; A. M. Kuperman; L. V. Puchkov; É. S. Zelenskii; É. V. Prut; N. S. Enikolopyan

1983-01-01

114

Time-dependent reliability analysis of ceramic engine components

The computer program CARES/LIFE calculates the time-dependent reliability of monolithic ceramic components subjected to thermomechanical and/or proof test loading. This program is an extension of the CARES (Ceramics Analysis and Reliability Evaluation of Structures) computer program. CARES/LIFE accounts for the phenomenon of subcritical crack growth (SCG) by utilizing either the power or Paris law relations. The two-parameter Weibull cumulative distribution function is used to characterize the variation in component strength. The effects of multiaxial stresses are modeled using either the principle of independent action (PIA), the Weibull normal stress averaging method (NSA), or the Batdorf theory. Inert strength and fatigue parameters are estimated from rupture strength data of naturally flawed specimens loaded in static, dynamic, or cyclic fatigue. Two example problems demonstrating proof testing and fatigue parameter estimation are given.

Nemeth, N.N.

1993-10-01

115

Time-dependent reliability analysis of ceramic engine components

NASA Technical Reports Server (NTRS)

The computer program CARES/LIFE calculates the time-dependent reliability of monolithic ceramic components subjected to thermomechanical and/or proof test loading. This program is an extension of the CARES (Ceramics Analysis and Reliability Evaluation of Structures) computer program. CARES/LIFE accounts for the phenomenon of subcritical crack growth (SCG) by utilizing either the power or Paris law relations. The two-parameter Weibull cumulative distribution function is used to characterize the variation in component strength. The effects of multiaxial stresses are modeled using either the principle of independent action (PIA), the Weibull normal stress averaging method (NSA), or the Batdorf theory. Inert strength and fatigue parameters are estimated from rupture strength data of naturally flawed specimens loaded in static, dynamic, or cyclic fatigue. Two example problems demonstrating proof testing and fatigue parameter estimation are given.

Nemeth, Noel N.

1993-01-01

116

Design and analysis of a distributed regenerative frequency divider using distributed mixer

In this paper we present the design and analysis of a distributed regenerative frequency divider (DRFD) based on a novel distributed single-balanced mixer. Artificial transmission lines are incorporated in the distributed single-balanced mixer to absorb the parasitic capacitances. The circuit is realized in a 0.18 µm standard CMOS process. It shows a division by two for

Amin Q. Safarian; Payam Heydari

2004-01-01

117

On Distributions of Generalized Order Statistics

In a wide subclass of generalized order statistics, representations of marginal density and distribution functions are developed. The results are applied to obtain several relations, such as recurrence relations, and explicit expressions for the moments of generalized order statistics from Pareto, power function and Weibull distributions. Moreover, characterizations of exponential distributions are shown by means of a distributional

Udo Kamps; Erhard Cramer

118

The Complexity of Distributed Collaborative Learning: Unit of Analysis

The problem area of this paper is manifested in the new conditions that characterise distributed collaborative learning as a phenomenon.

Boyer, Edmond

119

Bayesian Analysis of Zero-Inflated Distributions

In this paper zero-inflated distributions (ZID) are studied from the Bayesian point of view using the data augmentation algorithm. This type of discrete model arises in count data with excess of zeros. The zero-inflated Poisson distribution (ZIP) and an illustrative example via MCMC algorithm are considered.

Josemar Rodrigues

2003-01-01

120

Rate Distortion Analysis in Distributed Video Coding

In recent years, distributed video coding emerged as an alternative to conventional video coding. A central problem in lossy video coding is rate-distortion (RD) analysis, which provides possible tradeoffs between

Namuduri, Kamesh

121

Shark: Fast Data Analysis Using Coarse-grained Distributed Memory

Shark is a research data analysis system built on a novel coarse-grained distributed shared-memory abstraction. Shark

California at Irvine, University of

122

Closed form expressions for choice probabilities in the Weibull case

For a probabilistic discrete choice model with independent reversed Gumbel distributed random costs, closed form expressions for the choice probabilities are known under the assumption that the variance is the same for all choice alternatives. This assumption is highly disputable in many cases in reality. In this paper, closed form expressions for the choice probabilities are derived for the case

Enrique Castillo; José María Menéndez; Pilar Jiménez; Ana Rivas

2008-01-01
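The equal-variance case that the abstract above calls disputable has the classical multinomial-logit closed form: with i.i.d. reversed Gumbel random costs of common scale θ, the probability of choosing alternative i is a softmax over the negated costs. A minimal sketch (the cost values are illustrative):

```python
import math

def choice_probabilities(costs, theta=1.0):
    """Closed-form choice probabilities for i.i.d. reversed Gumbel random costs
    with common scale theta (the classical multinomial logit result):
    P(i) = exp(-c_i/theta) / sum_j exp(-c_j/theta)."""
    weights = [math.exp(-c / theta) for c in costs]
    total = sum(weights)
    return [w / total for w in weights]

probs = choice_probabilities([1.0, 2.0, 3.0])
print(probs)  # lowest-cost alternative is most likely
```

The paper's contribution is precisely to relax the common-scale assumption, for which this simple softmax form no longer holds.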

123

Robust Cluster Analysis Via Mixtures of Multivariate t-Distributions

in the important field of cluster analysis. Besides having a sound mathematical basis, this approach ... it is highly desirable for methods of cluster analysis to be robust. By robustness, it is meant tha

McLachlan, Geoff

124

Distributed reachability analysis in timed automata

We evaluate a distributed reachability algorithm suitable for verification of real time critical systems modeled as timed automata. It is discovered that the algorithm suffers from load balancing problems and a high communication overhead. The load balancing problems are caused by the symbolic nature of the representation of the states of a timed automaton. We propose alternative data structures for

Gerd Behrmann

2005-01-01

125

Hierarchical analysis of power distribution networks

Careful design and verification of the power distribution network of a chip are of critical importance to ensure its reliable performance. With the increasing number of transistors on a chip, the size of the power network has grown so large as to make the verification task very challenging. The available computational power and memory resources impose limitations on the size

Min Zhao; Rajendran V. Panda; Sachin S. Sapatnekar; Tim Edwards; Rajat Chaudhry; David Blaauw

2000-01-01

126

Hierarchical analysis of power distribution networks

Careful design and verification of the power distribution network of a chip are of critical importance to ensure its reliable performance. With the increasing number of transistors on a chip, the size of the power network has grown so large as to make the verification task very challenging. The available computational power and memory resources impose limitations on the size

Min Zhao; Rajendran V. Panda; Sachin S. Sapatnekar; David T. Blaauw

2002-01-01

127

DISTRIBUTION SYSTEM RELIABILITY ANALYSIS USING A MICROCOMPUTER

Distribution system reliability for most utilities is maintained by the knowledge of a few key personnel. Generally, these water maintenance personnel use a good memory, repair records, a large wall map and a hydraulic model of the larger transmission mains to help identify probl...

128

Economic analysis of efficient distribution transformer trends

This report outlines an approach that will account for uncertainty in the development of evaluation factors used to identify transformer designs with the lowest total owning cost (TOC). The TOC methodology is described and the most highly variable parameters are discussed. The model is developed to account for uncertainties as well as statistical distributions for the important parameters. Sample calculations are presented. The TOC methodology is applied to data provided by two utilities in order to test its validity.

Downing, D.J.; McConnell, B.W.; Barnes, P.R.; Hadley, S.W.; Van Dyke, J.W.

1998-03-01
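The total owning cost (TOC) evaluation described above is conventionally computed as the bid price plus capitalized loss costs: TOC = price + A × (no-load loss) + B × (load loss), where A and B are utility-specific $/W evaluation factors. A minimal sketch with hypothetical numbers (the prices, losses, and A/B factors below are invented for illustration):

```python
def total_owning_cost(bid_price, no_load_loss_w, load_loss_w, a_factor, b_factor):
    """Standard TOC evaluation formula for distribution transformers:
    TOC = bid price + A * no-load loss + B * load loss,
    where A and B ($/W) capitalize the lifetime cost of core and winding losses."""
    return bid_price + a_factor * no_load_loss_w + b_factor * load_loss_w

# Hypothetical comparison of two designs with A = 8 $/W, B = 2 $/W
cheap = total_owning_cost(5000.0, 180.0, 900.0, a_factor=8.0, b_factor=2.0)
efficient = total_owning_cost(6200.0, 100.0, 500.0, a_factor=8.0, b_factor=2.0)
print(cheap, efficient)  # the pricier, lower-loss design wins on TOC here
```

The report's point is that A and B are themselves uncertain, so treating them as statistical distributions rather than fixed values changes which design minimizes TOC.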

129

A Comparison of Distribution Free and Non-Distribution Free Factor Analysis Methods

ERIC Educational Resources Information Center

Many researchers recognize that factor analysis can be conducted on both correlation matrices and variance-covariance matrices. Although most researchers extract factors from non-distribution free or parametric methods, researchers can also extract factors from distribution free or non-parametric methods. The nature of the data dictates the method…

Ritter, Nicola L.

2012-01-01

130

WATER DISTRIBUTION SYSTEM ANALYSIS: FIELD STUDIES, MODELING AND MANAGEMENT

The user's guide entitled "Water Distribution System Analysis: Field Studies, Modeling and Management" is a reference guide for water utilities and an extensive summarization of information designed to provide drinking water utility personnel (and related consultants and research...

131

Silk Fiber Mechanics from Multiscale Force Distribution Analysis

the molecular determinants for the extreme toughness of spider silk fibers. Our bottom-up computational approach ... of the internal strain distribution and load-carrying motifs in silk fibers on scales of both molecular

Gräter, Frauke

132

Bayesian analysis for inversion of aerosol size distribution data

Obtaining the continuous aerosol size distribution from a set of discrete measurements is an ill-posed problem. We use a new methodology based on a synthesis of Bayesian probability analysis and Monte Carlo simulations to estimate the parameters of a bimodal lognormal size distribution, i.e. the mass median diameters and geometric standard deviations of the two modes, and the fraction of

Gurumurthy Ramachandran; Milind Kandlikar

1996-01-01

133

Applying Quantile Regression to Analysis of AFIS Cotton Fiber Distribution

Varying fiber length distributions of cotton, Gossypium hirsutum L., impact its spinning performance. Advanced Fiber Information System (AFIS) facilitates the analysis of the length distribution of individual fibers in cotton. Quantile regression is a variant of standard regression with which conditional quantile values can be calculated by minimizing weighted sums of absolute deviations across the

Brian W. Gardunia; Chris Braden; Eric Hequet; C. Wayne Smith

2008-01-01

134

The Complexity of Distributed Collaborative Learning: Unit of Analysis

The problem area of this paper is manifested in the new conditions that characterise distributed collaborative learning. The core argument is that distributed collaborative learning implies an interconnected complexity that can only be properly understood by extending the unit of analysis from technology and pedagogy themselves to real-life social contexts in which networked computers are being used. Experiments and small-scale

Annita Fjuk; Sten Ludvigsen

2001-01-01

135

Performance analysis of static locking in replicated distributed database systems

NASA Technical Reports Server (NTRS)

Data replication and transaction deadlocks can severely affect the performance of distributed database systems. Many current evaluation techniques ignore these aspects because they are difficult to evaluate through analysis and time-consuming to evaluate through simulation. Here, a technique is discussed that combines simulation and analysis to closely illustrate the impact of deadlock and evaluate the performance of replicated distributed databases with both shared and exclusive locks.

Kuang, Yinghong; Mukkamala, Ravi

1991-01-01

136

Performance analysis of static locking in replicated distributed database systems

NASA Technical Reports Server (NTRS)

Data replication and transaction deadlocks can severely affect the performance of distributed database systems. Many current evaluation techniques ignore these aspects because they are difficult to evaluate through analysis and time-consuming to evaluate through simulation. A technique is used that combines simulation and analysis to closely illustrate the impact of deadlock and evaluate the performance of replicated distributed databases with both shared and exclusive locks.

Kuang, Yinghong; Mukkamala, Ravi

1991-01-01

137

Strength statistics and the distribution of earthquake interevent times

NASA Astrophysics Data System (ADS)

The Weibull distribution is often used to model the earthquake interevent times distribution (ITD). We propose a link between the earthquake ITD on single faults with the Earth’s crustal shear strength distribution by means of a phenomenological stick-slip model. For single faults or fault systems with homogeneous strength statistics and power-law stress accumulation we obtain the Weibull ITD. We prove that the moduli of the interevent times and crustal shear strength are linearly related, while the time scale is an algebraic function of the scale of crustal shear strength. We also show that logarithmic stress accumulation leads to the log-Weibull ITD. We investigate deviations of the ITD tails from the Weibull model due to sampling bias, magnitude cutoff thresholds, and non-homogeneous strength parameters. Assuming the Gutenberg-Richter law and independence of the Weibull modulus from the magnitude threshold, we deduce that the interevent time scale drops exponentially with the magnitude threshold. We demonstrate that a microearthquake sequence from the island of Crete and a seismic sequence from Southern California conform reasonably well to the Weibull model.

Hristopulos, Dionissios T.; Mouslopoulou, Vasiliki

2013-02-01

138

Competitive Analysis of Distributed Algorithms

Most applications of competitive analysis have involved on-line problems in which the system may fail or behave badly. In a distributed system, processes may crash or run at wildly varying speeds. By exploiting the split between the adversary and the environment provided by the system below, it is possible, among other things

Aspnes, James

139

Statistical distribution analysis of rubber fatigue data

Average rubber fatigue resistance has previously been related to such factors as elastomer type, cure system, cure temperature, and stress history. This paper extends this treatment to a full statistical analysis of rubber fatigue data. Analyses of laboratory fatigue data are used to predict service life. Particular emphasis is given to the prediction of early tire splice failures, and to

J. L. DeRudder

1981-01-01

140

Elemental distribution analysis of urinary crystals

Various crystals are seen in human urine. Some of them, particularly calcium oxalate dihydrate, are seen normally. Pathological crystals indicate crystal formation initiating urinary stones. Unfortunately, many of the relevant crystals are not recognized in light microscopic analysis of the urinary deposit performed in most clinical laboratories. Many crystals are not clearly identifiable under ordinary light microscopy.

Y. M. Fazil Marickar; P. R. Lekshmi; Luxmi Varma; Peter Koshy

2009-01-01

141

Modelling and Analysis of Distributed Simulation Protocols with Distributed Graph Transformation

Distributed graph transformation (DGT) [16] was developed with the aim of naturally expressing distributed systems in a way amenable to analysis. We show that DGT is a suitable framework for the modelling (i.e. design) and analysis of distributed simulation protocols. The framework of DGT allows describing both the protocol and the specification language semantics.

de Lara, Juan

142

Performance Analysis for Teraflop Computers: A Distributed Automatic Approach

Performance analysis for applications on teraflop computers requires a new combination of concepts: online processing, automation, and distribution. The article presents the design of a new analysis system that performs an automatic search for performance problems. This search is guided by a specification of performance properties based on the APART Specification Language. The system is being implemented as a network

Michael Gerndt; Andreas Schmidt; Martin Schulz; Roland Wismüller

2002-01-01

143

Cost-benefit analysis of potassium iodide distribution programs

An analysis has been performed to provide guidance to policymakers concerning the effectiveness of potassium iodide (KI) as a thyroid blocking agent in potential reactor accident situations, the distance to which (or area within which) it should be distributed and its relative effectiveness compared to other available protective measures. The analysis was performed using the Reactor Safety Study (WASH-1400) consequence

Aldrich

1982-01-01

144

Grammatical analysis as a distributed neurobiological function.

Language processing engages large-scale functional networks in both hemispheres. Although it is widely accepted that left perisylvian regions have a key role in supporting complex grammatical computations, patient data suggest that some aspects of grammatical processing could be supported bilaterally. We investigated the distribution and the nature of grammatical computations across language processing networks by comparing two types of combinatorial grammatical sequences--inflectionally complex words and minimal phrases--and contrasting them with grammatically simple words. Novel multivariate analyses revealed that they engage a coalition of separable subsystems: inflected forms triggered left-lateralized activation, dissociable into dorsal processes supporting morphophonological parsing and ventral, lexically driven morphosyntactic processes. In contrast, simple phrases activated a consistently bilateral pattern of temporal regions, overlapping with inflectional activations in L middle temporal gyrus. These data confirm the role of the left-lateralized frontotemporal network in supporting complex grammatical computations. Critically, they also point to the capacity of bilateral temporal regions to support simple, linear grammatical computations. This is consistent with a dual neurobiological framework where phylogenetically older bihemispheric systems form part of the network that supports language function in the modern human, and where significant capacities for language comprehension remain intact even following severe left hemisphere damage. PMID:25421880

Bozic, Mirjana; Fonteneau, Elisabeth; Su, Li; Marslen-Wilson, William D

2015-03-01

145

Grammatical Analysis as a Distributed Neurobiological Function

Language processing engages large-scale functional networks in both hemispheres. Although it is widely accepted that left perisylvian regions have a key role in supporting complex grammatical computations, patient data suggest that some aspects of grammatical processing could be supported bilaterally. We investigated the distribution and the nature of grammatical computations across language processing networks by comparing two types of combinatorial grammatical sequences—inflectionally complex words and minimal phrases—and contrasting them with grammatically simple words. Novel multivariate analyses revealed that they engage a coalition of separable subsystems: inflected forms triggered left-lateralized activation, dissociable into dorsal processes supporting morphophonological parsing and ventral, lexically driven morphosyntactic processes. In contrast, simple phrases activated a consistently bilateral pattern of temporal regions, overlapping with inflectional activations in L middle temporal gyrus. These data confirm the role of the left-lateralized frontotemporal network in supporting complex grammatical computations. Critically, they also point to the capacity of bilateral temporal regions to support simple, linear grammatical computations. This is consistent with a dual neurobiological framework where phylogenetically older bihemispheric systems form part of the network that supports language function in the modern human, and where significant capacities for language comprehension remain intact even following severe left hemisphere damage. PMID:25421880

Bozic, Mirjana; Fonteneau, Elisabeth; Su, Li; Marslen-Wilson, William D

2015-01-01

146

Complex network analysis of water distribution systems

NASA Astrophysics Data System (ADS)

This paper explores a variety of strategies for understanding the formation, structure, efficiency, and vulnerability of water distribution networks. Water supply systems are studied as spatially organized networks for which the practical applications of abstract evaluation methods are critically evaluated. Empirical data from benchmark networks are used to study the interplay between network structure and operational efficiency, reliability, and robustness. Structural measurements are undertaken to quantify properties such as redundancy and optimal-connectivity, herein proposed as constraints in network design optimization problems. The role of the supply demand structure toward system efficiency is studied, and an assessment of the vulnerability to failures based on the disconnection of nodes from the source(s) is undertaken. The absence of conventional degree-based hubs (observed through uncorrelated nonheterogeneous sparse topologies) prompts an alternative approach to studying structural vulnerability based on the identification of network cut-sets and optimal-connectivity invariants. A discussion on the scope, limitations, and possible future directions of this research is provided.
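The vulnerability assessment based on disconnection of nodes from the source(s) can be illustrated with a brute-force cut-vertex search on a toy network; the node names and topology below are invented for illustration:

```python
from collections import deque

def connected(adj, removed=frozenset()):
    """Breadth-first check that all non-removed nodes remain reachable."""
    nodes = [n for n in adj if n not in removed]
    if not nodes:
        return True
    seen, queue = {nodes[0]}, deque([nodes[0]])
    while queue:
        u = queue.popleft()
        for v in adj[u]:
            if v not in removed and v not in seen:
                seen.add(v)
                queue.append(v)
    return len(seen) == len(nodes)

def critical_nodes(adj):
    """Cut vertices found by brute force: remove each node in turn and
    test whether the remaining network stays connected."""
    return sorted(n for n in adj if not connected(adj, frozenset([n])))

# Toy looped network with a dead-end branch hanging off node D.
net = {
    "A": ["B", "C"], "B": ["A", "C", "D"], "C": ["A", "B", "D"],
    "D": ["B", "C", "E"], "E": ["D"],
}
```

In this sketch the looped core has no critical node, while D is critical because the branch to E has no redundant path, which is the kind of cut-set structure the paper proposes as a design constraint.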

Yazdani, Alireza; Jeffrey, Paul

2011-03-01

147

Adaptive walks and distribution of beneficial fitness effects.

We study the adaptation dynamics of a maladapted asexual population on rugged fitness landscapes with many local fitness peaks. The distribution of beneficial fitness effects is assumed to belong to one of the three extreme value domains, viz. Weibull, Gumbel, and Fréchet. We work in the strong selection-weak mutation regime in which beneficial mutations fix sequentially, and the population performs an uphill walk on the fitness landscape until a local fitness peak is reached. A striking prediction of our analysis is that the fitness difference between successive steps follows a pattern of diminishing returns in the Weibull domain and accelerating returns in the Fréchet domain, as the initial fitness of the population is increased. These trends are found to be robust with respect to fitness correlations. We believe that this result can be exploited in experiments to determine the extreme value domain of the distribution of beneficial fitness effects. Our work here differs significantly from the previous ones that assume the selection coefficient to be small. On taking large effect mutations into account, we find that the length of the walk shows different qualitative trends from those derived using small selection coefficient approximation. PMID:24274696

Seetharaman, Sarada; Jain, Kavita

2014-04-01

148

Modeling and analysis of solar distributed generation

NASA Astrophysics Data System (ADS)

Recent changes in the global economy are creating a big impact on our daily life. The price of oil is increasing and reserves shrink every day. Also, dramatic demographic changes are impacting the viability of the electric infrastructure and ultimately the economic future of the industry. These are some of the reasons that many countries are looking to alternative energy to produce electric energy. The most common form of green energy in our daily life is solar energy. Converting solar energy into electrical energy requires solar panels, dc-dc converters, power control, sensors, and inverters. In this work, a photovoltaic module (PVM) model using the electrical characteristics provided by the manufacturer data sheet is presented for power system applications. Experimental results from testing are shown, verifying the proposed PVM model. Also in this work, three maximum power point tracker (MPPT) algorithms are presented to obtain the maximum power from a PVM. The first MPPT algorithm is based on Rolle's and Lagrange's theorems and can provide at least an approximate answer to a family of transcendental functions that cannot be solved using differential calculus. The second MPPT algorithm is based on the approximation of the proposed PVM model using fractional polynomials, where the shape, boundary conditions and performance of the proposed PVM model are satisfied. The third MPPT algorithm is based on the determination of the optimal duty cycle for a dc-dc converter and previous knowledge of the load or load matching conditions. Also, four algorithms to calculate the effective irradiance level and temperature over a photovoltaic module are presented in this work.
The main reasons to develop these algorithms are monitoring climate conditions, the elimination of temperature and solar irradiance sensors, cost reductions for a photovoltaic inverter system, and development of new algorithms to be integrated with maximum power point tracking algorithms. Finally, several PV power applications are presented, such as circuit analysis for a load connected to two different PV arrays, speed control for a dc motor connected to a PVM, and a novel single-phase photovoltaic inverter system using the Z-source converter.

Ortiz Rivera, Eduardo Ivan

149

Canonical approximation in the performance analysis of distributed systems

The problem of analyzing distributed systems arises in many areas of computer science, such as communication networks, distributed data bases, packet radio networks, VLSI communications and switching mechanisms. Analysis of distributed systems is difficult since one must deal with many tightly-interacting components. For the stochastic models of these systems, whose steady-state probability is of the product form, many global-performance measures of interest can be computed once one knows the normalization constant of the steady-state probability distribution. This study introduces a new approximation technique to analyze a variety of such models of distributed systems. This technique, the method of canonical approximation, is similar to that developed in statistical physics to compute the partition function. The new method gives a closed-form approximation of the partition function and of the global-performance measures.
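For small product-form models, the normalization constant that the canonical approximation targets can be computed exactly by convolution (Buzen's algorithm), which provides a baseline against which a closed-form approximation can be checked. A sketch for single-server stations with given relative service demands (this is the classical exact method, not the paper's approximation):

```python
def normalization_constant(service_demands, n_customers):
    """Buzen's convolution algorithm: exact normalization constant G(N)
    of a closed product-form queueing network of single-server stations,
    given each station's relative service demand."""
    g = [1.0] + [0.0] * n_customers  # G for the empty subnetwork
    for demand in service_demands:
        for n in range(1, n_customers + 1):
            g[n] += demand * g[n - 1]  # fold in one more station
    return g[n_customers]
```

For example, a single station with demand X gives G(N) = X**N, and two stations with unit demands give G(N) = N + 1; the exact G quickly becomes expensive for large populations and many stations, which is the regime where closed-form approximations pay off.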

Pinsky, E.

1986-01-01

150

Software component quality has a major influence in software development project performances such as lead-time, time to market and cost. It also affects the other projects within the organization, the people assigned into the projects and the organization in general. Software development organization must have indication and prediction about software component quality and project performances in general. One of the

Lovre Hribar

2009-01-01

151

A model for the survival curves of spores is presented. This model was suitable for downward concavity curves (Bacillus cereus and Bacillus pumilus) as well as for upward concavity curves (Clostridium botulinum). The model was successfully checked for Clostridium botulinum and Bacillus stearothermophilus spores.

Paris-Sud XI, Université de

152

The aim of this research was to study the behaviour of the drying kinetics of pepino fruit (Solanum muricatum Ait.) at five temperatures (50, 60, 70, 80 and 90 °C). In addition, desorption isotherms were determined at 20, 40 and 60 °C over a water activity range from 0.10 to 0.90. The Guggenheim, Anderson and de Boer model was suitable to depict

Elsa Uribe; Antonio Vega-Gálvez; Karina Di Scala; Romina Oyanadel; Jorge Saavedra Torrico; Margarita Miranda

153

An application of Bayesian analysis to medical follow-up data.

Posterior distributions can provide effective summaries of the main conclusions of medical follow-up studies. In this article, we use Bayesian methods for the analysis of survival data. We describe posterior distributions for various parameters of clinical interest in the presence of arbitrary right censorship. Non-informative reference priors result from transformation of a two-parameter Weibull model into a location-scale family. We suggest an approach for checking adequacy. For illustration, we apply the methods to a well-known acute leukemia data set. PMID:4089352
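A minimal sketch of a Bayesian Weibull analysis with right censoring: observed failures contribute the density, censored times the survival function, and posterior summaries follow from the weighted grid. This uses a flat prior and synthetic data for illustration, not the reference priors or the leukemia data of the paper:

```python
import math, random

def weibull_loglik(k, lam, times, events):
    """Weibull log-likelihood under right censorship: an observed failure
    (event 1) contributes log f(t), a censored time (event 0) log S(t)."""
    ll = 0.0
    for t, d in zip(times, events):
        ll -= (t / lam) ** k                      # log S(t) term, common to both
        if d:
            ll += math.log(k / lam) + (k - 1.0) * math.log(t / lam)
    return ll

def posterior_means(times, events, ks, lams):
    """Posterior means of (k, lam) on a grid, flat prior for illustration."""
    pts = [(k, l, weibull_loglik(k, l, times, events)) for k in ks for l in lams]
    m = max(ll for _, _, ll in pts)               # shift to avoid underflow
    ws = [(k, l, math.exp(ll - m)) for k, l, ll in pts]
    total = sum(w for _, _, w in ws)
    return (sum(k * w for k, _, w in ws) / total,
            sum(l * w for _, l, w in ws) / total)

# Synthetic data from Weibull(shape 1.5, scale 10), right-censored at t = 12.
random.seed(7)
raw = [10.0 * (-math.log(1.0 - random.random())) ** (1.0 / 1.5) for _ in range(300)]
times = [min(t, 12.0) for t in raw]
events = [1 if t < 12.0 else 0 for t in raw]
ks = [0.8 + 0.04 * i for i in range(51)]     # shape grid 0.8 .. 2.8
lams = [7.0 + 0.15 * i for i in range(41)]   # scale grid 7.0 .. 13.0
k_hat, lam_hat = posterior_means(times, events, ks, lams)
```

The grid is a stand-in for whatever integration scheme one prefers; the paper's transformation to a location-scale family serves the same purpose analytically.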

Achcar, J A; Brookmeyer, R; Hunter, W G

1985-01-01

154

ANALYSIS OF DISTRIBUTION FEEDER LOSSES DUE TO ADDITION OF DISTRIBUTED PHOTOVOLTAIC GENERATORS

Distributed generators (DG) are small scale power supplying sources owned by customers or utilities and scattered throughout the power system distribution network. Distributed generation can be both renewable and non-renewable. Addition of distributed generation is primarily to increase feeder capacity and to provide peak load reduction. However, this addition comes with several impacts on the distribution feeder. Several studies have shown that addition of DG leads to reduction of feeder loss. However, most of these studies have considered lumped load and distributed load models to analyze the effects on system losses, where the dynamic variation of load due to seasonal changes is ignored. It is very important for utilities to minimize the losses under all scenarios to decrease revenue losses, promote efficient asset utilization, and therefore, increase feeder capacity. This paper will investigate an IEEE 13-node feeder populated with photovoltaic generators on detailed residential houses with water heater, Heating Ventilation and Air conditioning (HVAC) units, lights, and other plug and convenience loads. An analysis of losses for different power system components, such as transformers, underground and overhead lines, and triplex lines, will be performed. The analysis will utilize different seasons and different solar penetration levels (15%, 30%).

Tuffner, Francis K.; Singh, Ruchi

2011-08-09

155

Energy loss analysis of an integrated space power distribution system

NASA Technical Reports Server (NTRS)

The results of studies related to conceptual topologies of an integrated utility-like space power system are described. The system topologies are comparatively analyzed by considering their transmission energy losses as functions of mainly distribution voltage level and load composition. The analysis is expedited by use of a Distribution System Analysis and Simulation (DSAS) software. This recently developed computer program by the Electric Power Research Institute (EPRI) uses improved load models to solve the power flow within the system. However, present shortcomings of the software with regard to space applications, and incompletely defined characteristics of a space power system make the results applicable to only the fundamental trends of energy losses of the topologies studied. Accountability, such as included, for the effects of the various parameters on the system performance can constitute part of a planning tool for a space power distribution system.

Kankam, M. David; Ribeiro, P. F.

1992-01-01

156

GIS-based poverty and population distribution analysis in China

NASA Astrophysics Data System (ADS)

Geographically, poverty status is not only related to social-economic factors but is also strongly affected by the geographical environment. In this paper, a GIS-based poverty and population distribution analysis method is introduced for revealing their regional differences. More than 100000 poor villages and 592 national key poor counties are chosen for the analysis. The results show that poverty distribution tends to concentrate in most of west China and mountainous rural areas of mid China. Furthermore, the fifth census data are overlaid on those poor areas in order to capture the internal diversity of their social-economic characteristics. By overlaying poverty-related social-economic parameters, such as sex ratio, illiteracy, education level, percentage of ethnic minorities, and family composition, the findings show that poverty distribution is strongly correlated with high illiteracy rates, a high percentage of minorities, and larger families.

Cui, Jing; Wang, Yingjie; Yan, Hong

2009-07-01

157

Scenario-Driven Dynamic Analysis of Distributed Architectures

Software architecture constitutes a promising approach to the development of large-scale distributed systems, but architecture description languages (ADLs) and their associated architectural analysis techniques suffer from several important shortcomings. This paper presents a novel approach that reconceptualizes ADLs within the model-driven engineering (MDE) paradigm to address their shortcomings. Our approach combines extensible modeling languages based on

George Edwards; Sam Malek; Nenad Medvidovic

2007-01-01

158

Analysis of Hawaii Biomass Energy Resources for Distributed Energy Applications

Analysis of Hawaii Biomass Energy Resources for Distributed Energy Applications. Prepared for the State of Hawaii by the Energy Institute, School of Ocean and Earth Sciences and Technology. The report gives concentrations on a unit energy basis for sugar cane varieties and biomass samples. Scott Q. Turn; Vheissu Keffer; Milton

159

Data synthesis and display programs for wave distribution function analysis

NASA Technical Reports Server (NTRS)

At the National Space Science Data Center (NSSDC) software was written to synthesize and display artificial data for use in developing the methodology of wave distribution analysis. The software comprises two separate interactive programs, one for data synthesis and the other for data display.

Storey, L. R. O.; Yeh, K. J.

1992-01-01

160

The Distributive Economic Impacts of Hawaii's Fishery: A SAM Analysis

Objective: construct a Social Accounting Matrix (SAM) model, a set of accounts throughout the economy that is both a data system and a conceptual framework useful for policy analysis, tracing income distribution (profits, rents, interest, etc.) among socioeconomic household groups, companies, and government.

Hawai'i at Manoa, University of

161

Combining Probability Distributions From Experts in Risk Analysis

This paper concerns the combination of experts’ probability distributions in risk analysis, discussing a variety of combination methods, and attempting to highlight the important conceptual and practical issues to be considered in designing a combination process in practice. The role of experts is important because their judgments can provide valuable information, particularly in view of the limited availability of “hard
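Among the simplest combination methods in this literature is the linear opinion pool, a weighted average of the experts' distributions. A sketch for discrete distributions, with invented expert probabilities:

```python
def linear_pool(distributions, weights):
    """Linear opinion pool: the combined distribution is a weighted
    average of the experts' discrete distributions (outcome -> prob)."""
    outcomes = set().union(*distributions)
    total = sum(weights)
    return {o: sum(w * d.get(o, 0.0) for d, w in zip(distributions, weights)) / total
            for o in outcomes}

# Two hypothetical experts assessing a failure event, equally weighted.
combined = linear_pool(
    [{"fail": 0.2, "ok": 0.8}, {"fail": 0.4, "ok": 0.6}],
    [1.0, 1.0],
)
```

The design questions the paper raises, how to choose the weights and whether to aggregate mathematically or behaviorally, sit on top of this mechanical step.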

Robert T. Clemen; Robert L. Winkler

1999-01-01

162

Distributional uncertainty analysis using power series and polynomial chaos expansions

This paper provides an overview of computationally efficient approaches for quantifying the influence of parameter uncertainties on the states and outputs of nonlinear dynamical systems with finite-time control trajectories, focusing primarily on computing probability distributions. The advantages and disadvantages of various uncertainty analysis approaches, which use approximate representations of the full nonlinear model using power series or polynomial chaos expansions,

Z. K. Nagy; R. D. Braatz

2007-01-01

163

Approximate methods for uncertainty analysis of water distribution systems

Monte Carlo simulation (MCS) has been commonly applied for uncertainty analysis of model predictions. However, when modelling a water distribution system under unsteady conditions, the computational demand of MCS is quite high even for a reasonably sized system. The aim of this study is to evaluate alternative approximation schemes and examine their ability to predict model prediction uncertainty with less
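The trade-off between full Monte Carlo simulation and a cheap approximation scheme can be sketched with first-order second-moment (FOSM) propagation through a headloss-like power law; the exponent 1.852 is the Hazen-Williams value, and the numbers are illustrative rather than from the study:

```python
import math, random

def fosm_std(f, mu, sigma, h=1e-6):
    """First-order second-moment approximation: sd(f(X)) ~ |f'(mu)| * sd(X)."""
    deriv = (f(mu + h) - f(mu - h)) / (2.0 * h)  # central difference
    return abs(deriv) * sigma

def mcs_std(f, mu, sigma, n=100000, seed=1):
    """Monte Carlo estimate of sd(f(X)) for X ~ Normal(mu, sigma)."""
    rng = random.Random(seed)
    ys = [f(rng.gauss(mu, sigma)) for _ in range(n)]
    mean = sum(ys) / n
    return math.sqrt(sum((y - mean) ** 2 for y in ys) / (n - 1))

# Headloss-like power law in flow q.
f = lambda q: q ** 1.852
approx = fosm_std(f, 10.0, 0.5)
full = mcs_std(f, 10.0, 0.5)
```

One function evaluation per derivative point versus a hundred thousand samples is the computational gap the study exploits; the approximation degrades as the nonlinearity, and hence the neglected curvature term, grows.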

D. S. Kang; M. F. K. Pasha; K. Lansey

2009-01-01

164

Tense Usage Analysis in Verb Distribution in Brazilian Portuguese.

ERIC Educational Resources Information Center

This section of a four-part research project investigating the syntax of Brazilian Portuguese presents data concerning tense usage in verb distribution. The data are derived from the analysis of selected literary samples from representative and contemporary writers. The selection of authors and tabulation of data are also described. Materials…

Hoge, Henry W., Comp.

165

Can Distributed Volunteers Accomplish Massive Data Analysis Tasks?

NASA Technical Reports Server (NTRS)

We argue that many image analysis tasks can be performed by distributed amateurs. Our pilot study, with crater surveying and classification, has produced encouraging results in terms of both quantity (100,000 crater entries in 2 months) and quality. Additional information is contained in the original extended abstract.

Kanefsky, B.; Barlow, N. G.; Gulick, V. C.

2001-01-01

166

REGIONAL FLOOD FREQUENCY ANALYSIS WITH A THEORETICALLY DERIVED DISTRIBUTION FUNCTION

The estimation of flood frequency curves in ungauged basins is central to the estimation of flood quantiles and to transferring hydrological information between basins. In this context

Poggi, Davide

167

Probabilistic Life and Reliability Analysis of Model Gas Turbine Disk

NASA Technical Reports Server (NTRS)

In 1939, W. Weibull developed what is now commonly known as the "Weibull Distribution Function," primarily to determine the cumulative strength distribution of small sample sizes of elemental fracture specimens. In 1947, G. Lundberg and A. Palmgren, using the Weibull Distribution Function, developed a probabilistic lifing protocol for ball and roller bearings. In 1987, E. V. Zaretsky, using the Weibull Distribution Function, modified the Lundberg and Palmgren approach to life prediction. His method incorporates the results of coupon fatigue testing to compute the life of elemental stress volumes of a complex machine element to predict system life and reliability. This paper examines the Zaretsky method to determine the probabilistic life and reliability of a model gas turbine disk using experimental data from coupon specimens. The predicted results are compared to experimental disk endurance data.
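The Lundberg-Palmgren style combination of elemental lives into a system life, which the Zaretsky method builds on, can be sketched as a generic strict-series Weibull rule; this is a textbook form under a common Weibull slope, not the paper's full stress-volume procedure:

```python
import math

def system_life(component_lives, weibull_slope):
    """Strict-series combination of elemental Weibull lives
    (Lundberg-Palmgren rule): L_sys^-e = sum of L_i^-e."""
    e = weibull_slope
    return sum(L ** -e for L in component_lives) ** (-1.0 / e)

def reliability(t, l10, weibull_slope):
    """Survival probability at time t for a Weibull population whose
    10-percent-failure life is l10 (so S(l10) = 0.9)."""
    return math.exp(-math.log(1.0 / 0.9) * (t / l10) ** weibull_slope)
```

Two equal elemental lives combine to a system life shorter than either, by the factor 2**(-1/e), which is why finer stress-volume discretizations must be combined consistently rather than averaged.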

Holland, Frederic A.; Melis, Matthew E.; Zaretsky, Erwin V.

2002-01-01

168

Rapid Analysis of Mass Distribution of Radiation Shielding

NASA Technical Reports Server (NTRS)

Radiation Shielding Evaluation Toolset (RADSET) is a computer program that rapidly calculates the spatial distribution of mass of an arbitrary structure for use in ray-tracing analysis of the radiation-shielding properties of the structure. RADSET was written to be used in conjunction with unmodified commercial computer-aided design (CAD) software that provides access to data on the structure and generates selected three-dimensional-appearing views of the structure. RADSET obtains raw geometric, material, and mass data on the structure from the CAD software. From these data, RADSET calculates the distribution(s) of the masses of specific materials about any user-specified point(s). The results of these mass-distribution calculations are imported back into the CAD computing environment, wherein the radiation-shielding calculations are performed.

Zapp, Edward

2007-01-01

169

Molecular Isotopic Distribution Analysis (MIDAs) with Adjustable Mass Accuracy

NASA Astrophysics Data System (ADS)

In this paper, we present Molecular Isotopic Distribution Analysis (MIDAs), a new software tool designed to compute molecular isotopic distributions with adjustable accuracies. MIDAs offers two algorithms, one polynomial-based and one Fourier-transform-based, both of which compute molecular isotopic distributions accurately and efficiently. The polynomial-based algorithm contains few novel aspects, whereas the Fourier-transform-based algorithm consists mainly of improvements to other existing Fourier-transform-based algorithms. We have benchmarked the performance of the two algorithms implemented in MIDAs with that of eight software packages (BRAIN, Emass, Mercury, Mercury5, NeutronCluster, Qmass, JFC, IC) using a consensus set of benchmark molecules. Under the proposed evaluation criteria, MIDAs's algorithms, JFC, and Emass compute with comparable accuracy the coarse-grained (low-resolution) isotopic distributions and are more accurate than the other software packages. For fine-grained isotopic distributions, we compared IC, MIDAs's polynomial algorithm, and MIDAs's Fourier transform algorithm. Among the three, IC and MIDAs's polynomial algorithm compute isotopic distributions that better resemble their corresponding exact fine-grained (high-resolution) isotopic distributions. MIDAs can be accessed freely through a user-friendly web-interface at http://www.ncbi.nlm.nih.gov/CBBresearch/Yu/midas/index.html.
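The polynomial-based approach can be illustrated at coarse (nominal-mass) resolution: each atom contributes a small polynomial of isotope abundances, and the molecular distribution is their product, computed by repeated convolution. A sketch with standard terrestrial abundances (rounded), independent of the MIDAs implementation:

```python
from collections import defaultdict

# Nominal-mass isotope abundances: offset from lightest isotope -> probability.
ISOTOPES = {
    "C": {0: 0.9893, 1: 0.0107},
    "H": {0: 0.999885, 1: 0.000115},
    "N": {0: 0.99636, 1: 0.00364},
    "O": {0: 0.99757, 1: 0.00038, 2: 0.00205},
}

def convolve(p, q):
    """Multiply two abundance polynomials (offset -> probability)."""
    out = defaultdict(float)
    for a, pa in p.items():
        for b, qb in q.items():
            out[a + b] += pa * qb
    return dict(out)

def coarse_distribution(formula):
    """Coarse-grained isotopic distribution by polynomial multiplication,
    one convolution per atom in the formula."""
    dist = {0: 1.0}
    for element, count in formula.items():
        for _ in range(count):
            dist = convolve(dist, ISOTOPES[element])
    return dist

# Glucose, C6H12O6, as an example.
d = coarse_distribution({"C": 6, "H": 12, "O": 6})
```

Fine-grained (high-resolution) distributions require tracking exact isotope masses rather than integer offsets, which is where the Fourier-transform formulation and the pruning strategies of the benchmarked tools come in.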

Alves, Gelio; Ogurtsov, Aleksey Y.; Yu, Yi-Kuo

2014-01-01

170

Distributed signal analysis of free-floating paraboloidal membrane shells

NASA Astrophysics Data System (ADS)

Multifarious thin paraboloidal shell structures with unique geometric characteristics are utilized in aerospace, telecommunication and other engineering applications over the years. Governing equations of motion of paraboloidal shells are complicated and closed-form analytical solutions of these partial differential equations (PDEs) are difficult to derive. Furthermore, distributed monitoring technique and its resulting global sensing signals of thin flexible membrane shells are not well understood. This study focuses on spatially distributed modal sensing characteristics of free-floating flexible paraboloidal membrane shells laminated with distributed sensor patches based on a new set of assumed mode shape functions. In order to evaluate overall sensing/control effects, microscopic sensing signal characteristic, sensor segmentation and location of distributed sensors on thin paraboloidal membrane shells with different curvatures are investigated. Parametric analysis suggests that the signal generation depends on modal membrane strains in the meridional and circumferential directions in which the latter is more significant than the former, while all bending strains vanish in membrane shells. This study (1) demonstrates an analysis method for distributed sensors laminated on lightweight paraboloidal flexible structures and (2) identifies critical components and regions that generate significant signals for various shell modes.

Yue, H. H.; Deng, Z. Q.; Tzou, H. S.

2007-07-01

171

Integrating software architectures for distributed simulations and simulation analysis communities.

The one-year Software Architecture LDRD (No.79819) was a cross-site effort between Sandia California and Sandia New Mexico. The purpose of this research was to further develop and demonstrate integrating software architecture frameworks for distributed simulation and distributed collaboration in the homeland security domain. The integrated frameworks were initially developed through the Weapons of Mass Destruction Decision Analysis Center (WMD-DAC), sited at SNL/CA, and the National Infrastructure Simulation & Analysis Center (NISAC), sited at SNL/NM. The primary deliverable was a demonstration of both a federation of distributed simulations and a federation of distributed collaborative simulation analysis communities in the context of the same integrated scenario, which was the release of smallpox in San Diego, California. To our knowledge this was the first time such a combination of federations under a single scenario has ever been demonstrated. A secondary deliverable was the creation of the standalone GroupMeld{trademark} collaboration client, which uses the GroupMeld{trademark} synchronous collaboration framework. In addition, a small pilot experiment that used both integrating frameworks allowed a greater range of crisis management options to be performed and evaluated than would have been possible without the use of the frameworks.

Goldsby, Michael E.; Fellig, Daniel; Linebarger, John Michael; Moore, Patrick Curtis; Sa, Timothy J.; Hawley, Marilyn F.

2005-10-01

172

Influence Of Lateral Load Distributions On Pushover Analysis Effectiveness

The effectiveness of two simple load distributions for pushover analysis, recently proposed by the authors, is investigated through a comparative study involving static and dynamic analyses of the seismic response of eccentrically braced frames. It is shown that in the upper floors only multimodal pushover procedures provide results close to the dynamic profile, while the proposed load patterns are always conservative in the lower floors. They overestimate the seismic response less than the uniform distribution, representing a reliable alternative to the uniform or more sophisticated adaptive procedures proposed by seismic codes.

Colajanni, P.; Potenzone, B. [Dipartimento di Ingegneria Civile, Università di Messina, Contrada Di Dio, S. Agata, 98166 Messina (Italy)]

2008-07-08

173

Quantitative analysis of tritium distribution in austenitic stainless steels welds

NASA Astrophysics Data System (ADS)

Tritium autoradiography was used to study the tritium distribution in laser and arc (TIG) weldments performed on tritiated AISI 316 samples. Quantitative values of the local tritium concentration were obtained from microdensitometric analysis of the autoradiographs. This procedure was used to map the tritium concentration in the samples before and after laser and TIG treatments. The effects of the detritiation conditions and of welding on the tritium distribution in the material are extensively characterized. The results illustrate the value of the technique for predicting possible embrittlement of the material associated with a local enhancement of the tritium concentration and the presence of helium-3 generated by tritium decay.

Roustila, A.; Kuromoto, N.; Brass, A. M.; Chêne, J.

1994-08-01

174

A comprehensive study of distribution laws for the fragments of Košice meteorite

NASA Astrophysics Data System (ADS)

In this study, we conduct a detailed analysis of the Košice meteorite fall (February 28, 2010) to derive a reliable law describing the mass distribution among the recovered fragments. In total, 218 fragments of the Košice meteorite, with a total mass of 11.285 kg, were analyzed. Bimodal Weibull, bimodal Grady, and bimodal lognormal distributions are found to be the most appropriate for describing the Košice fragmentation process. Based on the assumption of bimodal lognormal, bimodal Grady, bimodal sequential, and bimodal Weibull fragmentation distributions, we suggest that, prior to further extensive fragmentation in the lower atmosphere, the Košice meteoroid was initially represented by two independent pieces with cumulative residual masses of approximately 2 and 9 kg, respectively. The smaller piece produced about 2 kg of multiple lightweight meteorite fragments with a mean of around 12 g. The larger one resulted in 9 kg of meteorite fragments recovered on the ground, including the two heaviest pieces of 2.374 kg and 2.167 kg, with a mean of around 140 g. Based on our investigations, we conclude that two to three larger fragments of 500-1000 g each should exist, but were either not recovered or not reported by illegal meteorite hunters.
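As an illustration of the bimodal lognormal fitting described above, the sketch below runs a small expectation-maximization (EM) loop on log-masses. All fragment masses here are synthetic (drawn to loosely resemble the reported means of ~12 g and ~140 g), and the helper `fit_bimodal_lognormal` is our own illustrative construction, not code from the cited study.

```python
# Illustrative two-component lognormal mixture fit via EM on log-masses.
# The data below are synthetic; they are NOT the Kosice fragment masses.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic fragment masses in grams: a "light" and a "heavy" population.
light = rng.lognormal(mean=np.log(12.0), sigma=0.6, size=180)
heavy = rng.lognormal(mean=np.log(140.0), sigma=0.5, size=40)
masses = np.concatenate([light, heavy])

def fit_bimodal_lognormal(m, iters=300):
    """EM for a 2-component Gaussian mixture on log-masses."""
    x = np.log(m)
    # Crude initialisation: split the log-masses at the median.
    mu = np.array([x[x <= np.median(x)].mean(), x[x > np.median(x)].mean()])
    sd = np.array([x.std(), x.std()])
    w = np.array([0.5, 0.5])
    for _ in range(iters):
        # E-step: responsibilities of each component for each fragment.
        pdf = np.exp(-0.5 * ((x[:, None] - mu) / sd) ** 2) / (sd * np.sqrt(2 * np.pi))
        r = w * pdf
        r /= r.sum(axis=1, keepdims=True)
        # M-step: update weights, means, and standard deviations.
        n_k = r.sum(axis=0)
        w = n_k / len(x)
        mu = (r * x[:, None]).sum(axis=0) / n_k
        sd = np.sqrt((r * (x[:, None] - mu) ** 2).sum(axis=0) / n_k)
    return w, np.exp(mu), sd  # weights, median masses (g), log-sigmas

w, med, sd = fit_bimodal_lognormal(masses)
print(w, med)
```

With well-separated modes like these, EM recovers the two populations reliably; for real fragment data one would typically also compare the fit against the bimodal Weibull and Grady alternatives mentioned above.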

Gritsevich, Maria; Vinnikov, Vladimir; Kohout, Tomáš; Tóth, Juraj; Peltoniemi, Jouni; Turchak, Leonid; Virtanen, Jenni

2014-03-01

175

HammerCloud: A Stress Testing System for Distributed Analysis

NASA Astrophysics Data System (ADS)

Distributed analysis of LHC data is an I/O-intensive activity which places large demands on the internal network, storage, and local disks at remote computing facilities. Commissioning and maintaining a site to provide an efficient distributed analysis service is therefore a challenge which can be aided by tools to help evaluate a variety of infrastructure designs and configurations. HammerCloud is one such tool; it is a stress testing service which is used by central operations teams, regional coordinators, and local site admins to (a) submit an arbitrary number of analysis jobs to a number of sites, (b) maintain a predefined number of jobs running at steady state at the sites under test, (c) produce web-based reports summarizing the efficiency and performance of the sites under test, and (d) present a web interface for historical test results to both evaluate progress and compare sites. HammerCloud was built around the distributed analysis framework Ganga, exploiting its API for grid job management. HammerCloud has been employed by the ATLAS experiment for continuous testing of many sites worldwide, and also during large scale computing challenges such as STEP'09 and UAT'09, where the scale of the tests exceeded 10,000 concurrently running and 1,000,000 total jobs over multi-day periods. In addition, HammerCloud is being adopted by the CMS experiment; the plugin structure of HammerCloud allows the execution of CMS jobs using their official tool (CRAB).

van der Ster, Daniel C.; Elmsheuser, Johannes; Úbeda García, Mario; Paladin, Massimo

2011-12-01

176

Comparative sensitivity analysis of four distributed erosion models

NASA Astrophysics Data System (ADS)

Using a previously defined framework, we performed a comparative sensitivity analysis of four very different distributed erosion models (MHYDAS, STREAM, PESERA, and MESALES). We investigated their sensitivities to input fluxes, hydrological submodels, and specific erosion parameters gathered into equivalent slope and equivalent erodibility for each model, thus allowing explicit comparisons between models. Tests involved multiple combinations of rain intensities and runoff conditions for selected screenings of the equivalent parameter space, resorting to one-at-a-time displacements and Latin hypercube samples. Sensitivity to spatial distributions of erosion parameters was calculated as a normalized index of numerical spread of soil loss results, obtained at the outlet of a nine-cell virtual catchment endowed with a fixed flow pattern. Spatially homogeneous or distributed parameterizations yielded responses of comparable magnitudes. Equivalent erodibility was often the key parameter, while sensitivity trends depended on input fluxes and the propensity of soils for runoff, affecting continuous and discrete models in clearly dissimilar ways.

Cheviron, B.; Le Bissonnais, Y.; Desprats, J. F.; Couturier, A.; Gumiere, S. J.; Cerdan, O.; Darboux, F.; Raclot, D.

2011-01-01

177

Comparing Distributions of Environmental Outcomes for Regulatory Environmental Justice Analysis

Economists have long been interested in measuring distributional impacts of policy interventions. As environmental justice (EJ) emerged as an ethical issue in the 1970s, the academic literature has provided statistical analyses of the incidence and causes of various environmental outcomes as they relate to race, income, and other demographic variables. In the context of regulatory impacts, however, there is a lack of consensus regarding what information is relevant for EJ analysis, and how best to present it. This paper helps frame the discussion by suggesting a set of questions fundamental to regulatory EJ analysis, reviewing past approaches to quantifying distributional equity, and discussing the potential for adapting existing tools to the regulatory context. PMID:21655146

Maguire, Kelly; Sheriff, Glenn

2011-01-01

178

Sensitivity analysis of distributed erosion models - Application to four models

NASA Astrophysics Data System (ADS)

We applied a previously defined framework [1] for sensitivity analysis to four very different distributed erosion models (MHYDAS, STREAM, PESERA, MESALES). We investigated their sensitivities to input fluxes, hydrological sub-models and specific erosion parameters gathered into equivalent slope and equivalent erodibility. Tests involved multiple combinations of rain intensities and runoff conditions in addition to selected screenings of the equivalent parameter space, resorting to one-at-a-time displacements and Latin hypercube samples. Sensitivity to spatial distributions of erosion parameters was calculated as an index of numerical spread of soil loss results, obtained at the outlet of a nine-cell virtual catchment endowed with a fixed flow chart. Spatially homogeneous or distributed parameterizations yielded soil loss of comparable magnitudes. Models were more sensitive to equivalent erodibility than to equivalent slope, while each model had sensitivity trends varying with input fluxes and the propensity of soils to runoff. [1] Cheviron et al. (2010), Sensitivity analysis of distributed erosion models - Framework, Water Resources Research, accepted.

Cheviron, Bruno; Le Bissonnais, Yves; Desprats, Jean-François; Couturier, Alain; José Gumiere, Silvio; Cerdan, Olivier; Darboux, Frédéric; Raclot, Damien

2010-05-01

179

A Study of ATLAS Grid Performance for Distributed Analysis

NASA Astrophysics Data System (ADS)

In the past two years the ATLAS Collaboration at the LHC has collected a large volume of data and published a number of groundbreaking papers. The Grid-based ATLAS distributed computing infrastructure played a crucial role in enabling timely analysis of the data. We will present a study of the performance and usage of the ATLAS Grid as a platform for physics analysis in 2011. This includes studies of general properties as well as timing properties of user jobs (wait time, run time, etc.). These studies are based on mining of data archived by the PanDA workload management system.

Panitkin, Sergey; Fine, Valery; Wenaus, Torre

2012-12-01

180

Distributed computing finite element electromagnetic analysis of traveling wave tubes

Purpose – The paper's aim is to focus on the utilization of the GRID distributed computing environment in order to reduce simulation time for parameter studies of travelling wave tube (TWT) electron guns and helix slow-wave structures. Design/methodology/approach – Two TWT finite-element analysis modules were adapted to run on the GRID; for this purpose, scripts were written to submit

Salvatore Coco; Antonino Laudani; Giuseppe Pollicino

2008-01-01

181

Automatic analysis of attack data from distributed honeypot network

NASA Astrophysics Data System (ADS)

There are many ways of getting real data about malicious activity in a network. One of them relies on disguising monitoring servers as production ones. These servers are called honeypots, and data about attacks on them bring valuable information about actual attacks and the techniques used by hackers. The article describes a distributed topology of honeypots, which was developed with a strong orientation toward monitoring of IP telephony traffic. IP telephony servers can be easily exposed to various types of attacks, and without protection this situation can lead to loss of money and other unpleasant consequences. Using a distributed topology with honeypots placed in different geographical locations and networks provides more valuable and independent results. With an automatic system for gathering information from all honeypots, it is possible to work with all information at one centralized point. Communication between the honeypots and the centralized data store uses secure SSH tunnels, and the server communicates only with authorized honeypots. The centralized server also automatically analyses the data from each honeypot. Results of this analysis, as well as other statistical data about malicious activity, are easily accessible through a built-in web server. All statistical and analysis reports serve as the information basis for an algorithm which classifies the different types of VoIP attacks used. The web interface then provides a tool for quick comparison and evaluation of actual attacks in all monitored networks. The article describes both the honeypot nodes in the distributed architecture, which monitor suspicious activity, and the methods and algorithms used on the server side for analysis of the gathered data.

Safarik, Jakub; Voznak, MIroslav; Rezac, Filip; Partila, Pavol; Tomala, Karel

2013-05-01

182

Global sensitivity analysis in wind energy assessment

NASA Astrophysics Data System (ADS)

Wind energy is one of the most promising renewable energy sources. Nevertheless, it is not yet a common source of energy, although there is enough wind potential to supply the world's energy demand. One of the most prominent obstacles to employing wind energy is the uncertainty associated with wind energy assessment. Global sensitivity analysis (SA) studies how the variation of input parameters in an abstract model affects the variation of the variable of interest, the output variable. It also provides ways to calculate explicit measures of importance of input variables (first-order and total effect sensitivity indices) with regard to their influence on the variation of the output variable. Two methods of determining the above-mentioned indices were applied and compared: the brute force method and the best practice estimation procedure. In this study a methodology for conducting global SA of wind energy assessment at a planning stage is proposed. Three sampling strategies which are part of the SA procedure were compared: sampling based on Sobol' sequences (SBSS), Latin hypercube sampling (LHS), and pseudo-random sampling (PRS). A case study of Masdar City, a showcase of sustainable living in the UAE, is used to exemplify application of the proposed methodology. Sources of uncertainty in wind energy assessment are very diverse. In the case study the following were identified as uncertain input parameters: the Weibull shape parameter, the Weibull scale parameter, availability of a wind turbine, lifetime of a turbine, air density, electrical losses, blade losses, and ineffective time losses. Ineffective time losses are defined as losses during the time when the actual wind speed is lower than the cut-in speed or higher than the cut-out speed. The output variable in the case study is the lifetime energy production. The most influential factors for lifetime energy production are identified by ranking the total effect sensitivity indices.
The results of the present research show that the brute force method is best suited for wind assessment purposes, and that SBSS outperforms the other sampling strategies in the majority of cases. The results indicate that the Weibull scale parameter, turbine lifetime, and Weibull shape parameter are the three most influential variables in the case study setting. The following conclusions can be drawn from these results: 1) SBSS should be recommended for use in Monte Carlo experiments, 2) the brute force method should be recommended for conducting sensitivity analysis in wind resource assessment, and 3) little variation in the Weibull scale causes significant variation in energy production. The presence of the two distribution parameters among the top three influential variables (the Weibull shape and scale) emphasizes the importance of accuracy in (a) choosing the distribution to model the wind regime at a site and (b) estimating the probability distribution parameters. This can be labeled the most important conclusion of this research because it opens a field for further research which, the authors believe, could change the wind energy field tremendously.
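First-order sensitivity indices of the kind ranked above can be estimated with a simple pick-freeze (Sobol-style) Monte Carlo scheme. The sketch below uses a made-up linear "energy production" model with three inputs; the model, coefficients, and sample sizes are our own illustration, not the Masdar City case study.

```python
# Pick-freeze estimation of first-order Sobol indices for a toy model.
# The linear model and its coefficients are invented for illustration.
import numpy as np

rng = np.random.default_rng(1)

def model(x):
    # Toy output, linear in three independent uniform inputs.
    return 4.0 * x[:, 0] + 2.0 * x[:, 1] + 1.0 * x[:, 2]

n, d = 200_000, 3
a = rng.uniform(0.0, 1.0, size=(n, d))   # base sample
b = rng.uniform(0.0, 1.0, size=(n, d))   # independent resample
ya = model(a)
var_y = ya.var()

s1 = []
for i in range(d):
    ab = b.copy()
    ab[:, i] = a[:, i]                   # freeze input i, resample the rest
    yi = model(ab)
    # S_i = Cov(Y, Y_i) / Var(Y): variance explained by input i alone.
    s1.append(np.mean(ya * yi) - np.mean(ya) * np.mean(yi))
s1 = np.array(s1) / var_y
print(s1)   # analytic first-order indices for this model: 16/21, 4/21, 1/21
```

For this additive model the first-order and total indices coincide; for interacting models (as in real wind assessment) the total effect indices require an additional estimator.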

Tsvetkova, O.; Ouarda, T. B.

2012-12-01

183

Distribution System Reliability Analysis for Smart Grid Applications

NASA Astrophysics Data System (ADS)

Reliability of power systems is a key aspect in modern power system planning, design, and operation. The ascendance of the smart grid concept has raised high hopes of developing an intelligent network capable of being a self-healing grid, offering the ability to overcome the interruption problems that face the utility and cost it tens of millions in repairs and losses. To address these reliability concerns, power utilities and interested parties have spent an extensive amount of time and effort analyzing and studying the reliability of the generation and transmission sectors of the power grid. Only recently has attention shifted to improving the reliability of the distribution network, the connection joint between the power providers and the consumers, where most electricity problems occur. In this work, we examine the effect of smart grid applications on improving the reliability of power distribution networks. The test system used in this thesis is the IEEE 34-node test feeder, released in 2003 by the Distribution System Analysis Subcommittee of the IEEE Power Engineering Society. The objective is to analyze the feeder for the optimal placement of automatic switching devices and quantify their proper installation based on the performance of the distribution system. The measures are the changes in the system reliability indices, including SAIDI, SAIFI, and EUE. The goal is to design and simulate the effect of installing Distributed Generators (DGs) on the utility's distribution system and measure the potential improvement of its reliability. The software used in this work is DISREL, an intelligent power distribution tool developed by General Reliability Co.
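The reliability indices mentioned above reduce to simple ratios over outage records. The sketch below computes SAIFI, SAIDI, and (for completeness) CAIDI from a hypothetical set of sustained interruptions; the feeder data are invented, not taken from the IEEE 34-node study.

```python
# Minimal SAIFI/SAIDI/CAIDI computation from hypothetical outage records.
customers_served = 1000

# (customers_interrupted, outage_duration_hours) per sustained interruption.
outages = [(200, 1.5), (50, 4.0), (400, 0.5)]

# SAIFI: average number of interruptions per customer served.
saifi = sum(n for n, _ in outages) / customers_served
# SAIDI: average outage duration (hours) per customer served.
saidi = sum(n * h for n, h in outages) / customers_served
# CAIDI: average restoration time per interruption experienced.
caidi = saidi / saifi

print(saifi, saidi, caidi)
```

Placement studies like the one described above then compare these indices before and after adding switching devices or DGs to the feeder model.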

Aljohani, Tawfiq Masad

184

Reliability analysis of a structural ceramic combustion chamber

NASA Technical Reports Server (NTRS)

The Weibull modulus, fracture toughness, and thermal properties of a silicon nitride material used to make a gas turbine combustor were experimentally measured. The locations and nature of failure origins resulting from bend tests were determined with fractographic analysis. The measured Weibull parameters were used along with thermal and stress analysis to determine failure probabilities of the combustor with the CARES design code. The effects of data censoring, FEM mesh refinement, and fracture criterion were considered in the analysis.

Salem, Jonathan A.; Manderscheid, Jane M.; Freedman, Marc R.; Gyekenyesi, John P.

1990-01-01

185

Breakdown Probability Distribution and Equi-Probabilistic VT Characteristics of Transformer Oil

Breakdown probability distributions of transformer oil are investigated by repeatedly applying ac voltages with various durations. The test results suggest that a Weibull distribution with a shape parameter of ca. 8 fits the breakdown probability distribution of oil. The breakdown probability distributions of oil do not seem to depend on the stressing time.
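A Weibull shape parameter like the ca. 8 reported above is often read off a Weibull probability plot, where ln(-ln(1-F)) versus ln(x) is linear with slope equal to the shape. The sketch below simulates breakdown voltages (the true shape of 8 mirrors the value above, but the samples and the 100 kV scale are synthetic) and recovers the parameters by median-rank regression.

```python
# Median-rank (Weibull-plot) estimate of shape and scale from simulated
# breakdown data; all numbers are synthetic, not the transformer-oil data.
import numpy as np

rng = np.random.default_rng(2)
shape_true, scale_true = 8.0, 100.0   # assumed scale in kV, for illustration

# Inverse-CDF sampling of Weibull-distributed breakdown voltages.
u = rng.uniform(size=500)
x = scale_true * (-np.log(1.0 - u)) ** (1.0 / shape_true)

# Weibull plot: ln(-ln(1-F)) vs ln(x) is linear with slope = shape.
xs = np.sort(x)
n = len(xs)
f = (np.arange(1, n + 1) - 0.3) / (n + 0.4)   # Bernard's median-rank approximation
slope, intercept = np.polyfit(np.log(xs), np.log(-np.log(1.0 - f)), 1)
shape_est = slope
scale_est = np.exp(-intercept / slope)        # intercept = -shape * ln(scale)
print(shape_est, scale_est)
```

Maximum likelihood estimation is the usual alternative to this graphical method and is generally preferred for censored data.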

M. Ikeda; S. Menju

1979-01-01

186

Componential distribution analysis of food using near infrared ray image

NASA Astrophysics Data System (ADS)

The components of food related to "deliciousness" are usually evaluated by componential analysis. The component content and the type of components in the food are determined by this analysis. However, componential analysis cannot resolve measurements spatially, and the measurement is time-consuming. We propose a method to measure the two-dimensional distribution of a component in food using a near infrared (IR) image. The advantage of our method is the ability to visualize invisible components. Many components in food have characteristics such as absorption and reflection of light in the IR range. The component content is measured using subtraction between two wavelengths of near IR light. In this paper, we describe a method to measure food components using near IR image processing, and we show an application to visualizing the saccharose in a pumpkin.

Yamauchi, Hiroki; Kato, Kunihito; Yamamoto, Kazuhiko; Ogawa, Noriko; Ohba, Kimie

2008-11-01

187

Stochastic sensitivity analysis and kernel inference via distributional data.

Cellular processes are noisy due to the stochastic nature of biochemical reactions. As such, it is impossible to predict the exact quantity of a molecule or other attributes at the single-cell level. However, the distribution of a molecule over a population is often deterministic and is governed by the underlying regulatory networks relevant to the cellular functionality of interest. Recent studies have started to exploit this property to infer network states. To facilitate the analysis of distributional data in a general experimental setting, we introduce a computational framework to efficiently characterize the sensitivity of distributional output to changes in external stimuli. Further, we establish a probability-divergence-based kernel regression model to accurately infer signal level based on distribution measurements. Our methodology is applicable to any biological system subject to stochastic dynamics and can be used to elucidate how population-based information processing may contribute to organism-level functionality. It also lays the foundation for engineering synthetic biological systems that exploit population decoding to more robustly perform various biocomputation tasks, such as disease diagnostics and environmental-pollutant sensing. PMID:25185560
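A toy version of divergence-based kernel inference of the kind described above can be built from a Jensen-Shannon divergence between histograms and a Nadaraya-Watson weighted average. Everything below (the histogram model, the bandwidth, the signal range) is our own illustrative construction, not the cited framework.

```python
# Nadaraya-Watson regression with a Jensen-Shannon-divergence kernel on
# histogram-valued observations; a toy sketch of divergence-based inference.
import numpy as np

rng = np.random.default_rng(4)

def js_divergence(p, q):
    """Jensen-Shannon divergence between two normalized histograms."""
    m = 0.5 * (p + q)
    def kl(a, b):
        mask = a > 0
        return np.sum(a[mask] * np.log(a[mask] / b[mask]))
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

def make_histogram(signal, n=2000, bins=30):
    # Toy population response: the distribution shifts with the signal level.
    samples = rng.normal(loc=signal, scale=1.0 + 0.2 * signal, size=n)
    hist, _ = np.histogram(samples, bins=bins, range=(-5.0, 15.0))
    return hist / hist.sum()

# "Training" distributions measured at known signal levels.
train_signals = np.linspace(0.0, 8.0, 17)
train_hists = [make_histogram(s) for s in train_signals]

def infer_signal(hist, bandwidth=0.05):
    # Kernel weights decay with distributional distance; labels are averaged.
    w = np.array([np.exp(-js_divergence(hist, h) / bandwidth) for h in train_hists])
    return np.sum(w * train_signals) / np.sum(w)

est = infer_signal(make_histogram(3.0))
print(est)   # should land near the true signal level of 3.0
```

The bandwidth plays the usual bias-variance role: smaller values trust only the most similar training distributions.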

Li, Bochong; You, Lingchong

2014-09-01

188

Asymptotic Normality of Posterior Distributions. Journal of Multivariate Analysis 74, 49-68 (2000). We study consistency and asymptotic normality of posterior distributions of the natural parameter; under certain growth restrictions on the dimension, we show that the posterior distributions concentrate

Ghoshal, Subhashis

189

Performance Analysis of an Actor-Based Distributed Simulation

NASA Technical Reports Server (NTRS)

Object-oriented design of simulation programs appears to be very attractive because of the natural association of components in the simulated system with objects. There is great potential in distributing the simulation across several computers for the purpose of parallel computation and its consequent handling of larger problems in less elapsed time. One approach to such a design is to use "actors", that is, active objects with their own thread of control. Because these objects execute concurrently, communication is via messages. This is in contrast to an object-oriented design using passive objects where communication between objects is via method calls (direct calls when they are in the same address space and remote procedure calls when they are in different address spaces or different machines). This paper describes a performance analysis program for the evaluation of a design for distributed simulations based upon actors.

Schoeffler, James D.

1998-01-01

190

Principal Component Analysis for Normal-Distribution-Valued Symbolic Data.

This paper puts forward a new approach to principal component analysis (PCA) for normal-distribution-valued symbolic data, which has a vast potential of applications in the economic and management field. We derive a full set of numerical characteristics and the variance-covariance structure for such data, which forms the foundation for our analytical PCA approach. Our approach is able to use more of the variance information in the original data than the prevailing representative-type approach in the literature, which uses only centers, vertices, etc. The paper also provides an accurate approach to constructing the observations in a PC space based on the linear additivity property of the normal distribution. The effectiveness of the proposed method is illustrated by simulated numerical experiments. Finally, our method is applied to explain the puzzle of the risk-return tradeoff in China's stock market. PMID:25095276
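A bare-bones stand-in for the approach above is ordinary PCA applied to numerical characteristics of the distribution-valued observations. The sketch below runs PCA via an eigendecomposition of the covariance of (mean, std) pairs; the data and the two-characteristic representation are our own simplification, not the paper's full variance-covariance construction.

```python
# PCA via covariance eigendecomposition on (mean, std) summaries of
# normal-distribution-valued observations; data are synthetic.
import numpy as np

rng = np.random.default_rng(3)

# Each row summarizes one normal-distribution-valued observation.
means = rng.normal(10.0, 3.0, size=50)
stds = 0.5 * means + rng.normal(0.0, 0.3, size=50)   # correlated with the means
X = np.column_stack([means, stds])

Xc = X - X.mean(axis=0)                    # center the characteristics
cov = Xc.T @ Xc / (len(X) - 1)
eigval, eigvec = np.linalg.eigh(cov)
order = np.argsort(eigval)[::-1]           # sort components by variance
eigval, eigvec = eigval[order], eigvec[:, order]

scores = Xc @ eigvec                       # observations in the PC space
explained = eigval / eigval.sum()
print(explained)
```

Because the two characteristics are strongly correlated here, the first component captures almost all of the variance, which is exactly the redundancy PCA is meant to expose.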

Wang, Huiwen; Chen, Meiling; Shi, Xiaojun; Li, Nan

2014-07-29

191

Prediction of the Inert Strength Distribution of Si3N4 Diesel Valves

Censored Weibull strength distributions were generated from NT551 silicon nitride four-point flexure data using ASTM C1161-B and 5.0 mm diameter cylindrical specimens. Utilizing finite element models and AlliedSignal's life prediction codes, the inert (fast fracture) strength failure probability of a ceramic diesel valve was estimated from these data sets. The failure probability predictions derived from each data set were found to be more conservative than the valve strength data. Fractographic analysis of the test specimens and valves showed that the cylindrical specimens failed from a different flaw population than the prismatic flexure bars and the valves. The study emphasizes the prerequisite of having coincident flaw populations homogeneously distributed in both the test specimen and the ceramic component. Lastly, it suggests that unless material homogeneity exists, meaningful life prediction or reliability analysis of a component may not be possible.
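Fast-fracture predictions of this kind rest on the weakest-link form of the two-parameter Weibull model, where the failure probability grows with both stress and stressed volume. The sketch below evaluates that formula for illustrative parameter values (the modulus, characteristic strength, and volume ratio are invented, not the NT551 data).

```python
# Weakest-link (two-parameter Weibull) failure probability under uniform
# stress; parameter values are illustrative, not measured NT551 values.
import math

m = 10.0            # Weibull modulus (shape), assumed
sigma_0 = 500.0     # characteristic strength in MPa for the reference volume
v_ratio = 4.0       # stressed volume relative to the reference volume

def failure_probability(sigma, m, sigma_0, v_ratio=1.0):
    """P_f = 1 - exp(-(V/V0) * (sigma/sigma_0)^m)."""
    return 1.0 - math.exp(-v_ratio * (sigma / sigma_0) ** m)

# A larger stressed volume samples more flaws, so P_f rises at fixed stress:
p_small = failure_probability(400.0, m, sigma_0, v_ratio=1.0)
p_large = failure_probability(400.0, m, sigma_0, v_ratio=v_ratio)
print(p_small, p_large)
```

This volume (or area) scaling is why the study's point about coincident flaw populations matters: if specimen and component fail from different flaw types, the extrapolation implied by the formula is invalid.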

Andrews, M.J.; Breder, K.; Wereszczak, A.A.

1999-01-25

192

Multiobjective sensitivity analysis and optimization of distributed hydrologic model MOBIDIC

NASA Astrophysics Data System (ADS)

Calibration of distributed hydrologic models usually involves dealing with a large number of distributed parameters and with optimization problems having multiple but often conflicting objectives that arise in a natural fashion. This study presents a multiobjective sensitivity and optimization approach to handle these problems for the MOBIDIC (MOdello di Bilancio Idrologico DIstribuito e Continuo) distributed hydrologic model, which combines two sensitivity analysis techniques (the Morris method and the state-dependent parameter (SDP) method) with the multiobjective optimization (MOO) approach ε-NSGAII (Non-dominated Sorting Genetic Algorithm-II). This approach was implemented to calibrate MOBIDIC in its application to the Davidson watershed, North Carolina, with three objective functions, i.e., the standardized root mean square error (SRMSE) of logarithmic transformed discharge, the water balance index, and the mean absolute error of the logarithmic transformed flow duration curve, and its results were compared with those of a single objective optimization (SOO) with the traditional Nelder-Mead simplex algorithm used in MOBIDIC, taking the objective function as the Euclidean norm of these three objectives. Results show that (1) the two sensitivity analysis techniques are effective and efficient for determining the sensitive processes and insensitive parameters: surface runoff and evaporation are very sensitive processes to all three objective functions, while groundwater recession and soil hydraulic conductivity are not sensitive and were excluded in the optimization. (2) Both MOO and SOO lead to acceptable simulations; e.g., for MOO, the average Nash-Sutcliffe value is 0.75 in the calibration period and 0.70 in the validation period. (3) Evaporation and surface runoff show similar importance for watershed water balance, while the contribution of baseflow can be ignored. 
(4) Compared to SOO, which was dependent on the initial starting location, MOO provides more insight into parameter sensitivity and the conflicting characteristics of these objective functions. Multiobjective sensitivity analysis and optimization provide an alternative way for future MOBIDIC modeling.

Yang, J.; Castelli, F.; Chen, Y.

2014-10-01

193

Time series power flow analysis for distribution connected PV generation.

Distributed photovoltaic (PV) projects must go through an interconnection study process before connecting to the distribution grid. These studies are intended to identify the likely impacts and mitigation alternatives. In the majority of the cases, system impacts can be ruled out or mitigation can be identified without an involved study, through a screening process or a simple supplemental review study. For some proposed projects, expensive and time-consuming interconnection studies are required. The challenges to performing the studies are twofold. First, every study scenario is potentially unique, as the studies are often highly specific to the amount of PV generation capacity that varies greatly from feeder to feeder and is often unevenly distributed along the same feeder. This can cause location-specific impacts and mitigations. The second challenge is the inherent variability in PV power output which can interact with feeder operation in complex ways, by affecting the operation of voltage regulation and protection devices. The typical simulation tools and methods in use today for distribution system planning are often not adequate to accurately assess these potential impacts. This report demonstrates how quasi-static time series (QSTS) simulation and high time-resolution data can be used to assess the potential impacts in a more comprehensive manner. The QSTS simulations are applied to a set of sample feeders with high PV deployment to illustrate the usefulness of the approach. The report describes methods that can help determine how PV affects distribution system operations. The simulation results are focused on enhancing the understanding of the underlying technical issues. The examples also highlight the steps needed to perform QSTS simulation and describe the data needed to drive the simulations. 
The goal of this report is to make the methodology of time series power flow analysis readily accessible to utilities and others responsible for evaluating potential PV impacts.

Broderick, Robert Joseph; Quiroz, Jimmy Edward; Ellis, Abraham; Reno, Matthew J. [Georgia Institute of Technology, Atlanta, GA]; Smith, Jeff [Electric Power Research Institute, Knoxville, TN]; Dugan, Roger [Electric Power Research Institute, Knoxville, TN]

2013-01-01

194

Numerical analysis of decoy state quantum key distribution protocols

Decoy state protocols are a useful tool for many quantum key distribution systems implemented with weak coherent pulses, allowing significantly better secret bit rates and longer maximum distances. In this paper we present a method to numerically find optimal three-level protocols, and we examine how the secret bit rate and the optimized parameters are dependent on various system properties, such as session length, transmission loss, and visibility. Additionally, we show how to modify the decoy state analysis to handle partially distinguishable decoy states as well as uncertainty in the prepared intensities.

Patrick Rice; Jim Harrington

2009-01-23

195

Numerical analysis of decoy state quantum key distribution protocols

Decoy state protocols are a useful tool for many quantum key distribution systems implemented with weak coherent pulses, allowing significantly better secret bit rates and longer maximum distances. In this paper we present a method to numerically find optimal three-level protocols, and we examine how the secret bit rate and the optimized parameters are dependent on various system properties, such as session length, transmission loss, and visibility. Additionally, we show how to modify the decoy state analysis to handle partially distinguishable decoy states as well as uncertainty in the prepared intensities.

Rice, Patrick

2009-01-01

196

Analysis of dilepton angular distributions in a parity breaking medium

NASA Astrophysics Data System (ADS)

We investigate how local parity breaking due to large topological fluctuations may affect hadron physics. A modified dispersion relation is derived for the lightest vector mesons ρ and ω. They exhibit a mass splitting depending on their polarization. We present a detailed analysis of the angular distribution associated with the lepton pairs created from these mesons, searching for polarization dependencies. We propose two angular variables that carry information related to the parity breaking effect. Possible signatures for experimental detection of local parity breaking that could potentially be seen by the PHENIX and STAR collaborations are discussed.

Andrianov, A. A.; Andrianov, V. A.; Espriu, D.; Planells, X.

2014-08-01

197

Numerical analysis of decoy state quantum key distribution protocols

Decoy state protocols are a useful tool for many quantum key distribution systems implemented with weak coherent pulses, allowing significantly better secret bit rates and longer maximum distances. In this paper we present a method to numerically find optimal three-level protocols, and we examine how the secret bit rate and the optimized parameters are dependent on various system properties, such as session length, transmission loss, and visibility. Additionally, we show how to modify the decoy state analysis to handle partially distinguishable decoy states as well as uncertainty in the prepared intensities.

Harrington, Jim W [Los Alamos National Laboratory; Rice, Patrick R [Los Alamos National Laboratory

2008-01-01

198

Subsystem Interaction Analysis in Power Distribution Systems of Next Generation Airlifters

1 Subsystem Interaction Analysis in Power Distribution Systems of Next Generation Airlifters Sriram power distribution system of a next generation transport aircraft is addressed. Detailed analysis with the analysis of subsystem integration in power distribution systems of next generation transport aircraft

Lindner, Douglas K.

199

Analysis of Fuel Ethanol Transportation Activity and Potential Distribution Constraints

This paper provides an analysis of fuel ethanol transportation activity and potential distribution constraints if the total 36 billion gallons of renewable fuel use by 2022 is mandated by EPA under the Energy Independence and Security Act (EISA) of 2007. Ethanol transport by domestic truck, marine, and rail distribution systems from ethanol refineries to blending terminals is estimated using Oak Ridge National Laboratory s (ORNL s) North American Infrastructure Network Model. Most supply and demand data provided by EPA were geo-coded and using available commercial sources the transportation infrastructure network was updated. The percentage increases in ton-mile movements by rail, waterways, and highways in 2022 are estimated to be 2.8%, 0.6%, and 0.13%, respectively, compared to the corresponding 2005 total domestic flows by various modes. Overall, a significantly higher level of future ethanol demand would have minimal impacts on transportation infrastructure. However, there will be spatial impacts and a significant level of investment required because of a considerable increase in rail traffic from refineries to ethanol distribution terminals.

Das, Sujit [ORNL; Peterson, Bruce E [ORNL; Chin, Shih-Miao [ORNL

2010-01-01

200

Data intensive high energy physics analysis in a distributed cloud

NASA Astrophysics Data System (ADS)

We show that distributed Infrastructure-as-a-Service (IaaS) compute clouds can be effectively used for the analysis of high energy physics data. We have designed a distributed cloud system that works with any application using large input data sets requiring a high throughput computing environment. The system uses IaaS-enabled science and commercial clusters in Canada and the United States. We describe the process in which a user prepares an analysis virtual machine (VM) and submits batch jobs to a central scheduler. The system boots the user-specific VM on one of the IaaS clouds, runs the jobs and returns the output to the user. The user application accesses a central database for calibration data during the execution of the application. Similarly, the data is located in a central location and streamed by the running application. The system can easily run one hundred simultaneous jobs in an efficient manner and should scale to many hundreds and possibly thousands of user jobs.

Charbonneau, A.; Agarwal, A.; Anderson, M.; Armstrong, P.; Fransham, K.; Gable, I.; Harris, D.; Impey, R.; Leavett-Brown, C.; Paterson, M.; Podaima, W.; Sobie, R. J.; Vliet, M.

2012-02-01

201

Distributed Principal Component Analysis for Wireless Sensor Networks

The Principal Component Analysis (PCA) is a data dimensionality reduction tech-nique well-suited for processing data from sensor networks. It can be applied to tasks like compression, event detection, and event recognition. This technique is based on a linear trans-form where the sensor measurements are projected on a set of principal components. When sensor measurements are correlated, a small set of principal components can explain most of the measurements variability. This allows to significantly decrease the amount of radio communication and of energy consumption. In this paper, we show that the power iteration method can be distributed in a sensor network in order to compute an approximation of the principal components. The proposed implementation relies on an aggregation service, which has recently been shown to provide a suitable framework for distributing the computation of a linear transform within a sensor network. We also extend this previous work by providing a detailed analysis of the computational, memory, and communication costs involved. A com-pression experiment involving real data validates the algorithm and illustrates the tradeoffs between accuracy and communication costs.

Le Borgne, Yann-Aël; Raybaud, Sylvain; Bontempi, Gianluca

2008-01-01

202

Cartographic system for spatial distribution analysis of corneal endothelial cells.

A combined cartographic and morphometric endothelium analyser has been developed by integrating the HISTO 2000 histological imaging and analysis system with a prototype human corneal endothelium analyser. The complete system allows the elaboration and analysis of cartographies of corneal endothelial tissue, and hence the in vitro study of the spatial distribution of corneal endothelial cells, according to their regional morphometric characteristics (cell size and polygonality). The global cartographic reconstruction is obtained by sequential integration of the data analysed for each microscopic field. Subsequently, the location of each microscopically analysed field is referred to its real position on the histologic preparation by means of X-Y co-ordinates; both are provided by micrometric optoelectronic sensors installed on the optical microscope stage. Some cartographies of an excised human corneal keratoconus button in vitro are also presented. These cartographic images allow a macroscopic view of endothelial cells analysed microscopically. Parametric colour images show the spatial distribution of endothelial cells, according to their specific morphometric parameters, and exhibit the variability in size and cellular shape which depend on the analysed area. PMID:7967808

Corkidi, G; Márquez, J; García-Ruiz, M; Díaz-Cintra, S; Graue, E

1994-07-01

203

A theoretical analysis of basin-scale groundwater temperature distribution

NASA Astrophysics Data System (ADS)

The theory of regional groundwater flow is critical for explaining heat transport by moving groundwater in basins. Domenico and Palciauskas's (1973) pioneering study on convective heat transport in a simple basin assumed that convection has a small influence on redistributing groundwater temperature. Moreover, there has been no research focused on the temperature distribution around stagnation zones among flow systems. In this paper, the temperature distribution in the simple basin is reexamined and that in a complex basin with nested flow systems is explored. In both basins, compared to the temperature distribution due to conduction, convection leads to a lower temperature in most parts of the basin except for a small part near the discharge area. There is a high-temperature anomaly around the basin-bottom stagnation point where two flow systems converge due to a low degree of convection and a long travel distance, but there is no anomaly around the basin-bottom stagnation point where two flow systems diverge. In the complex basin, there are also high-temperature anomalies around internal stagnation points. Temperature around internal stagnation points could be very high when they are close to the basin bottom, for example, due to the small permeability anisotropy ratio. The temperature distribution revealed in this study could be valuable when using heat as a tracer to identify the pattern of groundwater flow in large-scale basins. Domenico PA, Palciauskas VV (1973) Theoretical analysis of forced convective heat transfer in regional groundwater flow. Geological Society of America Bulletin 84:3803-3814

An, Ran; Jiang, Xiao-Wei; Wang, Jun-Zhi; Wan, Li; Wang, Xu-Sheng; Li, Hailong

2015-03-01

204

CRACK GROWTH ANALYSIS OF SOLID OXIDE FUEL CELL ELECTROLYTES

Defects and Flaws control the structural and functional property of ceramics. In determining the reliability and lifetime of ceramics structures it is very important to quantify the crack growth behavior of the ceramics. In addition, because of the high variability of the strength and the relatively low toughness of ceramics, a statistical design approach is necessary. The statistical nature of the strength of ceramics is currently well recognized, and is usually accounted for by utilizing Weibull or similar statistical distributions. Design tools such as CARES using a combination of strength measurements, stress analysis, and statistics are available and reasonably well developed. These design codes also incorporate material data such as elastic constants as well as flaw distributions and time-dependent properties. The fast fracture reliability for ceramics is often different from their time-dependent reliability. Further confounding the design complexity, the time-dependent reliability varies with the environment/temperature/stress combination. Therefore, it becomes important to be able to accurately determine the behavior of ceramics under simulated application conditions to provide a better prediction of the lifetime and reliability for a given component. In the present study, Yttria stabilized Zirconia (YSZ) of 9.6 mol% Yttria composition was procured in the form of tubes of length 100 mm. The composition is of interest as tubular electrolytes for Solid Oxide Fuel Cells. Rings cut from the tubes were characterized for microstructure, phase stability, mechanical strength (Weibull modulus) and fracture mechanisms. The strength at operating condition of SOFCs (1000 C) decreased to 95 MPa as compared to room temperature strength of 230 MPa. However, the Weibull modulus remains relatively unchanged. Slow crack growth (SCG) parameter, n = 17 evaluated at room temperature in air was representative of well studied brittle materials. 
Based on the results, further work was planned to evaluate the strength degradation, modulus and failure in more representative environment of the SOFCs.

S. Bandopadhyay; N. Nagabhushana

2003-10-01

205

Evaluation of Distribution Analysis Software for DER Applications

The term ''Distributed energy resources'' or DER refers to a variety of compact, mostly self-contained power-generating technologies that can be combined with energy management and storage systems and used to improve the operation of the electricity distribution system, whether or not those technologies are connected to an electricity grid. Implementing DER can be as simple as installing a small electric generator to provide backup power at an electricity consumer's site. Or it can be a more complex system, highly integrated with the electricity grid and consisting of electricity generation, energy storage, and power management systems. DER devices provide opportunities for greater local control of electricity delivery and consumption. They also enable more efficient utilization of waste heat in combined cooling, heating and power (CHP) applications--boosting efficiency and lowering emissions. CHP systems can provide electricity, heat and hot water for industrial processes, space heating and cooling, refrigeration, and humidity control to improve indoor air quality. DER technologies are playing an increasingly important role in the nation's energy portfolio. They can be used to meet base load power, peaking power, backup power, remote power, power quality, as well as cooling and heating needs. DER systems, ranging in size and capacity from a few kilowatts up to 50 MW, can include a number of technologies (e.g., supply-side and demand-side) that can be located at or near the location where the energy is used. Information pertaining to DER technologies, application solutions, successful installations, etc., can be found at the U.S. Department of Energy's DER Internet site [1]. Market forces in the restructured electricity markets are making DER, both more common and more active in the distribution systems throughout the US [2]. If DER devices can be made even more competitive with central generation sources this trend will become unstoppable. 
In response, energy providers will be forced to both fully acknowledge the trend and plan for accommodating DER [3]. With bureaucratic barriers [4], lack of time/resources, tariffs, etc. still seen in certain regions of the country, changes still need to be made. Given continued technical advances in DER, the time is fast approaching when the industry, nation-wide, must not only accept DER freely but also provide or review in-depth technical assessments of how DER should be integrated into and managed throughout the distribution system. Characterization studies are needed to fully understand how both the utility system and DER devices themselves will respond to all reasonable events (e.g., grid disturbances, faults, rapid growth, diverse and multiple DER systems, large reactive loads). Some of this work has already begun as it relates to operation and control of DER [5] and microturbine performance characterization [6,7]. One of the most urgently needed tools that can provide these types of analyses is a distribution network analysis program in combination with models for various DER. Together, they can be used for (1) analyzing DER placement in distribution networks and (2) helping to ensure that adequate transmission reliability is maintained. Surveys of the market show products that represent a partial match to these needs; specifically, software that has been developed to plan electrical distribution systems and analyze reliability (in a near total absence of DER). The first part of this study (Sections 2 and 3 of the report) looks at a number of these software programs and provides both summary descriptions and comparisons. The second part of this study (Section 4 of the report) considers the suitability of these analysis tools for DER studies. It considers steady state modeling and assessment work performed by ORNL using one commercially available tool on feeder data provided by a southern utility. 
Appendix A provides a technical report on the results of this modeling effort.

Staunton, RH

2003-01-23

206

Accelerated Life Testing Model for a Generalized Birnbaum-Saunders Distribution

-S distribution in order to estimate reliability at normal operating conditions. A. Birnbaum-Saunders Life ModelAccelerated Life Testing Model for a Generalized Birnbaum-Saunders Distribution Yao Cheng and E. A modeled by Birnbaum-Saunders (B-S) and Weibull distributions. Sometimes, materials with high cycle fatigue

Boyer, Edmond

207

Timing and Schedulability Analysis for Distributed Automotive Control Applications

High-end cars today consist of more than 100 electronic control units (ECUs) that are connected to a set of sensors and actuators and run multiple distributed control applications. The design flow of such architectures consists of specifying control applications as Simulink/Stateflow models, followed by generating code from them and finally mapping such code onto multiple ECUs. In addition, the scheduling policies and parameters on both the ECUs and the communication buses over which they communicate also need to be specified. These policies and parameters are computed from high-level timing and control performance constraints. The proposed tutorial will cover different aspects of this design flow, with a focus on timing and schedulability problems. After reviewing the basic concepts of worst-case execution time analysis and schedulability analysis, we will discuss the differences between meeting timing constraints (as in classical real-time systems) and meeting control performance constraints (e.g., stability, steady and transient state performance). We will then describe various control performance related schedulability analysis techniques and how they may be tied to model-based software development. Finally, we will discuss various schedule synthesis techniques, both for ECUs as well as for communication protocols like FlexRay, so that control performance constraints specified at the model-level may be satisfied. Throughout the tutorial different commercial as well as academic tools will be discussed and demonstrated.

Samarjit Chakraborty; Martin Lukasiewycz; Marco Di Natale; Heiko Falk; Frank Slomka

208

A step increasing strain accelerated fatigue test has been developed and validated for the evaluation of candidate elastomeric materials for the artificial heart program. Whereas standard fatigue tests can be approximated by a log-normal or Weibull distribution, the increasing strain accelerated fatigue test has the general appearance of being normally distributed (i.e., a Gaussian distribution). The hypothesis that the data is indeed normally distributed was examined using a variety of statistical tests. The mean and median were equivalent in all data sets compared, as they would be for normally distributed data. There was very little positive or negative skew found in data collected under a wide variety of conditions. The data was found to have a slightly stronger than expected central tendency (positive kurtosis), but most of this disappeared when the data were normalized. Chi-squared analysis found normally distributed data in most subset of the data except for those with small numbers of test specimens per test. Normalized test data was not found to differ significantly from a Gaussian distribution by the Kolmogorov-Smirnov test. It therefore appears that increasing strain accelerated fatigue test data can be approximated by a normal distribution. This allows for easy data interpretation and aids in the extrapolation of incomplete data sets. PMID:1842511

McMillin, C R

1991-01-01

209

SATMC: Spectral energy distribution Analysis Through Markov Chains

NASA Astrophysics Data System (ADS)

We present the general purpose spectral energy distribution (SED) fitting tool SED Analysis Through Markov Chains (SATMC). Utilizing Monte Carlo Markov Chain (MCMC) algorithms, SATMC fits an observed SED to SED templates or models of the user's choice to infer intrinsic parameters, generate confidence levels and produce the posterior parameter distribution. Here, we describe the key features of SATMC from the underlying MCMC engine to specific features for handling SED fitting. We detail several test cases of SATMC, comparing results obtained from traditional least-squares methods, which highlight its accuracy, robustness and wide range of possible applications. We also present a sample of submillimetre galaxies (SMGs) that have been fitted using the SED synthesis routine GRASIL as input. In general, these SMGs are shown to occupy a large volume of parameter space, particularly in regards to their star formation rates which range from ˜30 to 3000 M? yr-1 and stellar masses which range from ˜1010 to 1012 M?. Taking advantage of the Bayesian formalism inherent to SATMC, we also show how the fitting results may change under different parametrizations (i.e. different initial mass functions) and through additional or improved photometry, the latter being crucial to the study of high-redshift galaxies.

Johnson, S. P.; Wilson, G. W.; Tang, Y.; Scott, K. S.

2013-12-01

210

A Distributed Flocking Approach for Information Stream Clustering Analysis

Intelligence analysts are currently overwhelmed with the amount of information streams generated everyday. There is a lack of comprehensive tool that can real-time analyze the information streams. Document clustering analysis plays an important role in improving the accuracy of information retrieval. However, most clustering technologies can only be applied for analyzing the static document collection because they normally require a large amount of computation resource and long time to get accurate result. It is very difficult to cluster a dynamic changed text information streams on an individual computer. Our early research has resulted in a dynamic reactive flock clustering algorithm which can continually refine the clustering result and quickly react to the change of document contents. This character makes the algorithm suitable for cluster analyzing dynamic changed document information, such as text information stream. Because of the decentralized character of this algorithm, a distributed approach is a very natural way to increase the clustering speed of the algorithm. In this paper, we present a distributed multi-agent flocking approach for the text information stream clustering and discuss the decentralized architectures and communication schemes for load balance and status information synchronization in this approach.

Cui, Xiaohui [ORNL; Potok, Thomas E [ORNL

2006-01-01

211

A meta-analysis of parton distribution functions

NASA Astrophysics Data System (ADS)

A "meta-analysis" is a method for comparison and combination of nonperturbative parton distribution functions (PDFs) in a nucleon obtained with heterogeneous procedures and assumptions. Each input parton distribution set is converted into a "meta-parametrization" based on a common functional form. By analyzing parameters of the meta-parametrizations from all input PDF ensembles, a combined PDF ensemble can be produced that has a smaller total number of PDF member sets than the original ensembles. The meta-parametrizations simplify the computation of the PDF uncertainty in theoretical predictions and provide an alternative to the 2010 PDF4LHC convention for combination of PDF uncertainties. As a practical example, we construct a META ensemble for computation of QCD observables at the Large Hadron Collider using the next-to-next-to-leading order PDF sets from CTEQ, MSTW, and NNPDF groups as the input. The META ensemble includes a central set that reproduces the average of LHC predictions based on the three input PDF ensembles and Hessian eigenvector sets for computing the combined PDF+? s uncertainty at a common QCD coupling strength of 0.118.

Gao, Jun; Nadolsky, Pavel

2014-07-01

212

Phylogenetic analysis reveals a scattered distribution of autumn colours

Background and Aims Leaf colour in autumn is rarely considered informative for taxonomy, but there is now growing interest in the evolution of autumn colours and different hypotheses are debated. Research efforts are hindered by the lack of basic information: the phylogenetic distribution of autumn colours. It is not known when and how autumn colours evolved. Methods Data are reported on the autumn colours of 2368 tree species belonging to 400 genera of the temperate regions of the world, and an analysis is made of their phylogenetic relationships in order to reconstruct the evolutionary origin of red and yellow in autumn leaves. Key Results Red autumn colours are present in at least 290 species (70 genera), and evolved independently at least 25 times. Yellow is present independently from red in at least 378 species (97 genera) and evolved at least 28 times. Conclusions The phylogenetic reconstruction suggests that autumn colours have been acquired and lost many times during evolution. This scattered distribution could be explained by hypotheses involving some kind of coevolutionary interaction or by hypotheses that rely on the need for photoprotection. PMID:19126636

Archetti, Marco

2009-01-01

213

Weibull statistical analysis of area effect on the breakdown strength in polymer films

The area and film thickness effects on the breakdown strength were examined for different polymer films at room temperature. Electrodes of four different diameters were used for this experimental work, which were in the range of 1\\/2 to 2 inches in diameter. Materials of various thickness used for this investigation were aramid paper (NOMEX type 410), Polyimide film (KAPTON), Mylar

Saeed UI-Haq; G. R. G. Raju

2002-01-01

214

Statistical analysis and modelling of small satellite reliability

NASA Astrophysics Data System (ADS)

This paper attempts to characterize failure behaviour of small satellites through statistical analysis of actual in-orbit failures. A unique Small Satellite Anomalies Database comprising empirical failure data of 222 small satellites has been developed. A nonparametric analysis of the failure data has been implemented by means of a Kaplan-Meier estimation. An innovative modelling method, i.e. Bayesian theory in combination with Markov Chain Monte Carlo (MCMC) simulations, has been proposed to model the reliability of small satellites. An extensive parametric analysis using the Bayesian/MCMC method has been performed to fit a Weibull distribution to the data. The influence of several characteristics such as the design lifetime, mass, launch year, mission type and the type of satellite developers on the reliability has been analyzed. The results clearly show the infant mortality of small satellites. Compared with the classical maximum-likelihood estimation methods, the proposed Bayesian/MCMC method results in better fitting Weibull models and is especially suitable for reliability modelling where only very limited failures are observed.

Guo, Jian; Monas, Liora; Gill, Eberhard

2014-05-01

215

Statistical distribution of mechanical properties for three graphite-epoxy material systems

NASA Technical Reports Server (NTRS)

Graphite-epoxy composites are playing an increasing role as viable alternative materials in structural applications necessitating thorough investigation into the predictability and reproducibility of their material strength properties. This investigation was concerned with tension, compression, and short beam shear coupon testing of large samples from three different material suppliers to determine their statistical strength behavior. Statistical results indicate that a two Parameter Weibull distribution model provides better overall characterization of material behavior for the graphite-epoxy systems tested than does the standard Normal distribution model that is employed for most design work. While either a Weibull or Normal distribution model provides adequate predictions for average strength values, the Weibull model provides better characterization in the lower tail region where the predictions are of maximum design interest. The two sets of the same material were found to have essentially the same material properties, and indicate that repeatability can be achieved.

Reese, C.; Sorem, J., Jr.

1981-01-01

216

Crystal size distribution analysis of plagioclase in experimentally decompressed hydrous rhyodacite November 2010 Editor: T.M. Harrison Keywords: crystal size distribution plagioclase decompression) of plagioclase forming during decompression experiments of hydrous rhyodacite magma. Samples were annealed at 130

Hammer, Julia Eve

217

Novel physical interpretations of K-distributed reverberation

Interest in describing and modeling envelope distributions of sea-floor backscatter has increased recently, particularly with regard to high-resolution active sonar systems. Sea-floor scattering that results in heavy-tailed-matched-filter-envelope probability distribution functions (i.e., non-Rayleigh distributions exemplified by the K, Weibull, Rayleigh mixture, or log-normal distributions) is often the limiting factor in the performance of these types of sonar systems and in this

Douglas A. Abraham; Anthony P. Lyons

2002-01-01

218

Response Time Analysis for Distributed Real-Time Systems with Bursty Job Arrivals

Response Time Analysis for Distributed Real-Time Systems with Bursty Job Arrivals Chengzhi Li schedulability analysis methodology for distributed hard real-time systems with bursty job arrivals. If the job set in the system is static, design-time analysis validates that no tim- ing constraints

Bettati, Riccardo

219

Response Time Analysis for Distributed RealTime Systems with Bursty Job Arrivals

Response Time Analysis for Distributed RealÂTime Systems with Bursty Job Arrivals Chengzhi Li schedulability analysis methodology for distributed hard realÂtime systems with bursty job arrivals. If the job set in the system is static, designÂtime analysis validates that no timÂ ing constraints

Bettati, Riccardo

220

Condition analysis of overhead power distribution system insulators using combined support vector machine (SVM) and wavelet multi-resolution analysis (MRA) seems to be promising for distribution system monitoring (DSM) automation to cope with the increasing system complexity. Though system well-being analysis for engineering applications has been used mostly for electric power system reliability studies, the same principle has been extended for

Velaga Sreerama Murthy; K. Tarakanath; D. K. Mohanta; Sumit Gupta

2010-01-01

221

Stability Analysis of Distributed Order Fractional Chen System

We first investigate sufficient and necessary conditions of stability of nonlinear distributed order fractional system and then we generalize the integer-order Chen system into the distributed order fractional domain. Based on the asymptotic stability theory of nonlinear distributed order fractional systems, the stability of distributed order fractional Chen system is discussed. In addition, we have found that chaos exists in the double fractional order Chen system. Numerical solutions are used to verify the analytical results. PMID:24489508

Aminikhah, H.; Refahi Sheikhani, A.; Rezazadeh, H.

2013-01-01

222

Distribution and Phylogenetic Analysis of Family 19 Chitinases in Actinobacteria

In organisms other than higher plants, family 19 chitinase was first discovered in Streptomyces griseus HUT6037, and later, the general occurrence of this enzyme in Streptomyces species was demonstrated. In the present study, the distribution of family 19 chitinases in the class Actinobacteria and the phylogenetic relationship of Actinobacteria family 19 chitinases with family 19 chitinases of other organisms were investigated. Forty-nine strains were chosen to cover almost all the suborders of the class Actinobacteria, and chitinase production was examined. Of the 49 strains, 22 formed cleared zones on agar plates containing colloidal chitin and thus appeared to produce chitinases. These 22 chitinase-positive strains were subjected to Southern hybridization analysis by using a labeled DNA fragment corresponding to the catalytic domain of ChiC, and the presence of genes similar to chiC of S. griseus HUT6037 in at least 13 strains was suggested by the results. PCR amplification and sequencing of the DNA fragments corresponding to the major part of the catalytic domains of the family 19 chitinase genes confirmed the presence of family 19 chitinase genes in these 13 strains. The strains possessing family 19 chitinase genes belong to 6 of the 10 suborders in the order Actinomycetales, which account for the greatest part of the Actinobacteria. Phylogenetic analysis suggested that there is a close evolutionary relationship between family 19 chitinases found in Actinobacteria and plant class IV chitinases. The general occurrence of family 19 chitinase genes in Streptomycineae and the high sequence similarity among the genes found in Actinobacteria suggest that the family 19 chitinase gene was first acquired by an ancestor of the Streptomycineae and spread among the Actinobacteria through horizontal gene transfer. PMID:14766598

Kawase, Tomokazu; Saito, Akihiro; Sato, Toshiya; Kanai, Ryo; Fujii, Takeshi; Nikaidou, Naoki; Miyashita, Kiyotaka; Watanabe, Takeshi

2004-01-01

223

Slack Bus Modeling and Cost Analysis of Distributed Generator Installations

The installation and operation of distributed generators DGs has great potential for local utilities to improve distribution system reliability and lower their operating and expansion planning costs. To evaluate this potential, distribution system analyses must reflect its new operating environment with significant DG. Resulting tools can be utilized by both utilities and DG owners to improve their decision making algorithms.

Shiqiong Tong; Karen Miu

2007-01-01

224

Bivariate extreme value distributions

NASA Technical Reports Server (NTRS)

In certain engineering applications, such as those occurring in the analyses of ascent structural loads for the Space Transportation System (STS), some of the load variables have a lower bound of zero. Thus, the need for practical models of bivariate extreme value probability distribution functions with lower limits was identified. We discuss the Gumbel models and present practical forms of bivariate extreme probability distributions of Weibull and Frechet types with two parameters. Bivariate extreme value probability distribution functions can be expressed in terms of the marginal extremal distributions and a 'dependence' function subject to certain analytical conditions. Properties of such bivariate extreme distributions, sums and differences of paired extremals, as well as the corresponding forms of conditional distributions, are discussed. Practical estimation techniques are also given.
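As an illustrative aside, a bivariate extreme-value CDF of the kind described can be built from two marginal CDFs and a dependence function. The sketch below uses the logistic dependence model with two-parameter Weibull marginals; all parameter values are hypothetical, and the logistic form is only one admissible dependence function, not necessarily the one the report develops.

```python
import math

def weibull_cdf(x, shape, scale):
    """Two-parameter Weibull CDF, defined for x >= 0 (lower bound of zero)."""
    if x <= 0:
        return 0.0
    return 1.0 - math.exp(-((x / scale) ** shape))

def bivariate_logistic_cdf(x, y, r, m1=(2.0, 1.0), m2=(1.5, 2.0)):
    """Bivariate extreme-value CDF from Weibull marginals and the logistic
    dependence function: F(x, y) = exp(-[u^r + v^r]^(1/r)), where
    u = -ln F1(x), v = -ln F2(y), and r >= 1 controls dependence
    (r = 1 recovers independence, F = F1 * F2)."""
    F1 = weibull_cdf(x, *m1)
    F2 = weibull_cdf(y, *m2)
    if F1 == 0.0 or F2 == 0.0:
        return 0.0
    u, v = -math.log(F1), -math.log(F2)
    return math.exp(-((u ** r + v ** r) ** (1.0 / r)))

# Stronger dependence (larger r) raises the joint CDF at a given point.
print(bivariate_logistic_cdf(1.0, 1.0, 1.0))   # independent case
print(bivariate_logistic_cdf(1.0, 1.0, 2.0))   # dependent case
```

Since [u^r + v^r]^(1/r) <= u + v for r >= 1, the dependent joint CDF always dominates the independent one, which is a quick sanity check on the implementation.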

Elshamy, M.

1992-01-01

225

Modeling Crown Structure from LiDAR Data with Statistical Distributions

(Abstract fragment.) ... for data near the base of the crown, the mixture distribution seemed to better resemble the shape of the live crown ... starts at zero, the other end of the Weibull function (the bottom of the live crown) could ...

Cao, Quang V.

226

SPATIAL DISTRIBUTION AND GEOMORPHIC CONDITION OF FISH HABITAT IN STREAMS: AN ANALYSIS USING

(Abstract fragment.) ... fish species using detailed two-dimensional hydraulic models and spatial analysis techniques ... patches of stream channel habitat. Copyright © 2008 John Wiley & Sons, Ltd. Key words: geomorphology ...

Vermont, University of

227

Causality and sensitivity analysis in distributed design simulation

Numerous collaborative design frameworks have been developed to accelerate product development, and recently environments for building distributed simulations have been proposed. For example, a simulation framework ...

Kim, Jaehyun, 1970-

2002-01-01

228

Pitch angle distribution analysis of radiation belt electrons based on Combined Release ...

(Abstract fragment.) An analysis of pitch angle distributions (PADs) of energetic electrons is performed. The distributions are classified ... where α is the local pitch angle; a profile of the parameter n versus L-shell is produced for local times corresponding ...

Li, Xinlin

229

Integration of distributed generators into distribution three-phase load flow analysis

An unbalanced three-phase load flow program taking the mathematical models of distributed generators (DGs) into account is proposed in this paper to analyze and simulate the penetration of DGs in distribution systems. The load flow method based on the direct approach algorithm is employed in this paper. The mathematical models of DGs are then developed and integrated into the proposed

Jen-Hao Teng

2005-01-01

230

The traditional approach in an electrical power system is to have centralized, large-capacity power plants feeding power to distant load centers through an extensive transmission and distribution network. Distributed generation (DG) emerged as an alternative to upgrading transmission lines and increasing the capacity of remote power plants. The connection of DG to a distribution system poses various challenges, such as power

S. S. Darly; P. Vanaja Ranjan; K. V. Bindu; A. Srikrishnan; P. R. Krishnan; B. J. Rabi

2010-01-01

231

Statistical analysis of the electrical breakdown time delay distributions in krypton

The statistical analysis of the experimentally observed electrical breakdown time delay distributions in the krypton-filled diode tube at 2.6 mbar is presented. The experimental distributions are obtained on the basis of 1000 successive and independent measurements. The theoretical electrical breakdown time delay distribution is evaluated as the convolution of the exponentially distributed statistical time delay and the Gaussian-distributed discharge formative time. The distribution parameters are estimated by stochastic modelling of the time delay distributions and by comparing them with the experimental distributions for different relaxation times, voltages, and intensities of UV radiation. The transition of distribution shapes, from Gaussian-type to exponential-like, is investigated by calculating the corresponding skewness and excess kurtosis parameters. It is shown that the mathematical model based on the convolution of two random variable distributions describes the experimentally obtained time delay distributions and the separation of the total breakdown time delay into the statistical and formative time delays.
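The convolution model described (exponential statistical delay plus Gaussian formative time) is the exponentially modified Gaussian distribution. A minimal sketch with illustrative parameters (not the paper's fitted values) cross-checks the analytic model against a Monte Carlo convolution:

```python
import numpy as np
from scipy import stats

# Hypothetical parameters: exponential statistical delay with mean tau,
# Gaussian formative time with mean mu and standard deviation sigma.
tau, mu, sigma = 50.0, 200.0, 10.0   # arbitrary time units

# The convolution of an exponential and a Gaussian is the exponentially
# modified Gaussian; scipy parameterizes it as exponnorm(K, loc, scale)
# with K = tau / sigma.
model = stats.exponnorm(K=tau / sigma, loc=mu, scale=sigma)

# Monte Carlo convolution of the two components for comparison.
rng = np.random.default_rng(0)
samples = rng.exponential(tau, 100_000) + rng.normal(mu, sigma, 100_000)

print(model.mean())     # analytic mean of the convolution is mu + tau
print(samples.mean())   # should agree closely
```

The positive skewness of the fitted model mirrors the Gaussian-to-exponential shape transition the abstract describes: as tau grows relative to sigma, skewness and excess kurtosis increase.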

Maluckov, Cedomir A.; Karamarkovic, Jugoslav P.; Radovic, Miodrag K.; Pejovic, Momcilo M. [Technical Faculty in Bor, University of Belgrade, Vojske Jugoslavije 24, 19210 Bor (Serbia and Montenegro); Faculty of Civil Engineering and Architecture, University of Nis, Beogradska 14, 18000 Nis (Serbia and Montenegro); Faculty of Sciences and Mathematics, University of Nis, P.O. Box 224, 18001 Nis (Serbia and Montenegro); Faculty of Electronic Engineering, University of Nis, P.O. Box 73, 18001 Nis (Serbia and Montenegro)

2006-08-15

232

A global analysis of root distributions for terrestrial biomes

Understanding and predicting ecosystem functioning (e.g., carbon and water fluxes) and the role of soils in carbon storage requires an accurate assessment of plant rooting distributions. Here, in a comprehensive literature synthesis, we analyze rooting patterns for terrestrial biomes and compare distributions for various plant functional groups. We compiled a database of 250 root studies, subdividing suitable results into 11

R. B. Jackson; J. Canadell; J. R. Ehleringer; H. A. Mooney; O. E. Sala; E. D. Schulze

1996-01-01

233

Analysis Model for Domestic Hot Water Distribution Systems: Preprint

A thermal model was developed to estimate the energy losses from prototypical domestic hot water (DHW) distribution systems for homes. The developed model, using the TRNSYS simulation software, allows researchers and designers to better evaluate the performance of hot water distribution systems in homes. Modeling results were compared with past experimental study results and showed good agreement.

Maguire, J.; Krarti, M.; Fang, X.

2011-11-01

234

Network secondary distribution system fault current analysis and application

Network secondary distribution systems are typically designed to ensure sufficient current is available to clear the most severe type of fault, the solid fault. The procedure presented in this paper analyzes a network secondary distribution system for the expected performance of secondary cable circuits during solid-type faults. By satisfying the criteria presented in this paper, it is

Daniel J. Mungovan; David R. Smith

2011-01-01

235

Monotone Log-Odds Rate Distributions in Reliability Analysis

Monotone failure rate models [Barlow, Richard E., Marshall, A. W., Proschan, Frank. (1963). Properties of probability distributions with monotone failure rate. Annals of Mathematical Statistics 34:375–389, and Barlow, Richard E., Proschan, Frank. (1965). Mathematical Theory of Reliability. New York: John Wiley & Sons, Barlow, Richard E., Proschan, Frank. (1966a). Tolerance and confidence limits for classes of distributions based on failure

Yao Wang; Anwar M. Hossain; William J. Zimmer

2003-01-01

236

Sampling error in the bending strength distribution of dimension lumber

Summary Information is presented on the magnitude of errors associated with various sampling simulation schemes of the distribution of three different populations, representing actual bending strength of dimension lumber. Errors were determined between the simulated and actual distributions. Graphical evaluations indicated good fits with the three-parameter form of the Weibull distribution for both original and simulated bending strength data, as well

P. J. Pellicane; J. Bodig

1981-01-01

237

Analysis of soil carbon transit times and age distributions using network theories

(Abstract fragment.) ... be approximated by networks of linear compartments, permitting theoretical analysis of transit time ... systems, and models assuming a continuous distribution of decay constants. We also derive the transit time ...

Katul, Gabriel

238

Distributed Medical Image Analysis and Diagnosis through Crowd-Sourced Games: A Malaria Case Study

(Abstract fragment.) ... of malaria-infected red blood cells with an accuracy that is within 1.25% of the diagnostic decisions made ... Published in PLoS ONE 7.

Jalali, Bahram

239

DATALITE: a Distributed Architecture for Traffic Analysis via Light-weight Traffic Digest

(Abstract fragment.) DATALITE: a Distributed Architecture for Traffic Analysis via LIght-weight Traffic digEst, which introduces a set of new distributed algorithms ... digests (TDs) amongst the network nodes. A TD for N packets only requires O(log log N) bits of memory ...

Chao, Jonathan

240

CARES/LIFE Ceramics Analysis and Reliability Evaluation of Structures Life Prediction Program

NASA Technical Reports Server (NTRS)

This manual describes the Ceramics Analysis and Reliability Evaluation of Structures Life Prediction (CARES/LIFE) computer program. The program calculates the time-dependent reliability of monolithic ceramic components subjected to thermomechanical and/or proof test loading. CARES/LIFE is an extension of the CARES (Ceramic Analysis and Reliability Evaluation of Structures) computer program. The program uses results from MSC/NASTRAN, ABAQUS, and ANSYS finite element analysis programs to evaluate component reliability due to inherent surface and/or volume type flaws. CARES/LIFE accounts for the phenomenon of subcritical crack growth (SCG) by utilizing the power law, Paris law, or Walker law. The two-parameter Weibull cumulative distribution function is used to characterize the variation in component strength. The effects of multiaxial stresses are modeled by using either the principle of independent action (PIA), the Weibull normal stress averaging method (NSA), or the Batdorf theory. Inert strength and fatigue parameters are estimated from rupture strength data of naturally flawed specimens loaded in static, dynamic, or cyclic fatigue. The probabilistic time-dependent theories used in CARES/LIFE, along with the input and output for CARES/LIFE, are described. Example problems to demonstrate various features of the program are also included.
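The two-parameter Weibull strength model and the PIA multiaxial rule named above can be sketched as follows. This is a simplified illustration with hypothetical material parameters, not the CARES/LIFE implementation (which also integrates the risk of rupture over surface and volume flaw populations):

```python
import math

def weibull_failure_prob(stress, m, sigma0):
    """Two-parameter Weibull cumulative failure probability for a single
    uniaxial stress: Pf = 1 - exp(-(stress/sigma0)^m), with Weibull
    modulus m and characteristic strength sigma0."""
    return 1.0 - math.exp(-((stress / sigma0) ** m))

def pia_failure_prob(principal_stresses, m, sigma0):
    """Principle of independent action (PIA): tensile principal stresses
    are treated as acting independently, so their risks of rupture add
    inside the exponent; compressive stresses are ignored."""
    risk = sum((s / sigma0) ** m for s in principal_stresses if s > 0)
    return 1.0 - math.exp(-risk)

# Hypothetical ceramic: Weibull modulus m = 10, characteristic strength
# sigma0 = 400 MPa.
print(weibull_failure_prob(400.0, 10, 400.0))        # 1 - 1/e, by definition
print(pia_failure_prob([300.0, 100.0, -50.0], 10, 400.0))
```

By construction, the PIA failure probability for a multiaxial state is never lower than the uniaxial failure probability of its largest tensile principal stress.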

Nemeth, Noel N.; Powers, Lynn M.; Janosik, Lesley A.; Gyekenyesi, John P.

2003-01-01

241

CARES/LIFE Ceramics Analysis and Reliability Evaluation of Structures Life Prediction Program

NASA Astrophysics Data System (ADS)

This manual describes the Ceramics Analysis and Reliability Evaluation of Structures Life Prediction (CARES/LIFE) computer program. The program calculates the time-dependent reliability of monolithic ceramic components subjected to thermomechanical and/or proof test loading. CARES/LIFE is an extension of the CARES (Ceramic Analysis and Reliability Evaluation of Structures) computer program. The program uses results from MSC/NASTRAN, ABAQUS, and ANSYS finite element analysis programs to evaluate component reliability due to inherent surface and/or volume type flaws. CARES/LIFE accounts for the phenomenon of subcritical crack growth (SCG) by utilizing the power law, Paris law, or Walker law. The two-parameter Weibull cumulative distribution function is used to characterize the variation in component strength. The effects of multiaxial stresses are modeled by using either the principle of independent action (PIA), the Weibull normal stress averaging method (NSA), or the Batdorf theory. Inert strength and fatigue parameters are estimated from rupture strength data of naturally flawed specimens loaded in static, dynamic, or cyclic fatigue. The probabilistic time-dependent theories used in CARES/LIFE, along with the input and output for CARES/LIFE, are described. Example problems to demonstrate various features of the program are also included.

Nemeth, Noel N.; Powers, Lynn M.; Janosik, Lesley A.; Gyekenyesi, John P.

2003-02-01

242

CO2 concentrations recorded for two years using a Picarro G1301 analyser at a rural site were studied applying two procedures. Firstly, the smoothing kernel method, which to date has been used with one linear and another circular variable, was used with pairs of circular variables: wind direction, time of day, and time of year, showing that the daily cycle was the prevailing cyclical evolution and that the highest concentrations were justified by the influence of one nearby city source, which was only revealed by directional analysis. Secondly, histograms were obtained, and these revealed most observations to be located between 380 and 410 ppm, and that there was a sharp contrast during the year. Finally, histograms were fitted to 14 distributions, the best known using analytical procedures, and the remainder using numerical procedures. RMSE was used as the goodness of fit indicator to compare and select distributions. Most functions provided similar RMSE values. However, the best fits were obtained using numerical procedures due to their greater flexibility, the triangular distribution being the simplest function of this kind. This distribution allowed us to identify directions and months of noticeable CO2 input (SSE and April-May, respectively) as well as the daily cycle of the distribution symmetry. Among the functions whose parameters were calculated using an analytical expression, Erlang distributions provided satisfactory fits for monthly analysis, and gamma for the rest. By contrast, the Rayleigh and Weibull distributions gave the worst RMSE values. PMID:23602977

Pérez, Isidro A; Sánchez, M Luisa; García, M Ángeles; Pardo, Nuria

2013-07-01

243

Analysis of temperature distribution in liquid-cooled turbine blades

NASA Technical Reports Server (NTRS)

The temperature distribution in liquid-cooled turbine blades determines the amount of cooling required to reduce the blade temperature to permissible values at specified locations. This report presents analytical methods for computing temperature distributions in liquid-cooled turbine blades, or in simplified shapes used to approximate sections of the blade. The individual analyses are first presented in terms of their mathematical development. By means of numerical examples, comparisons are made between simplified and more complete solutions and the effects of several variables are examined. Nondimensional charts to simplify some temperature-distribution calculations are also given.

Livingood, John N B; Brown, W Byron

1952-01-01

244

Analysis of fire causes spatial and temporal distribution in France

NASA Astrophysics Data System (ADS)

The goal of the paper was to create a statistical model explaining spatial and temporal occurrences of forest fires depending on their causes. In the forest fire causes databases, fire ignitions were located according to the third level of the 2003 Nomenclature of Territorial Units for Statistics (NUTS 3). 15,469 records were considered on the 2005 - 2008 period on the French territory. Global fire ignition density as well as fire ignition cause densities related to lightning, negligence and arson were considered. Descriptive variables (land cover, topography and climate) were used to divide the whole country into homogeneous regions. According to a clustering based on multidimensional projection (Sammon's projection), NUTS 3 presenting the nearest characteristics in terms of land cover, topography or climate conditions were merged into regions. The analysis of these variables led to 3 regions: the northwest France, the eastern central France and the Mediterranean region. In this paper, Partial Least Square regression was performed on each region to identify the main explanatory spatial variables and to model the fire density due to the different causes. 32 explanatory variables relative to human and biophysical variables were used in these analyses. 
Results of the statistical analyses performed on the spatial distribution of fire density due to the different types of cause in the different French regions showed that: (i) Fire density due to natural cause was mainly favoured by land-cover variables (such as the proportion of overall vegetation, the proportion of shrubland, the surface area of farms) and was mainly mitigated by some agricultural variables (such as proportion of non-irrigated crops or pasture, farm density); (ii) Fire density due to negligence was mainly favoured by network and socio-economic variables and was mainly mitigated by land-cover and climate variables depending on the region; (iii) Fire density due to arson was mainly favoured by network, topographic and socio-economic variables and was mainly mitigated by climate variables depending on the region. The negligence and arson cause categories may be too broad; using more detailed causes might yield better results. Moreover, in most studies, statistical analyses are carried out on georeferenced fire ignition points, allowing the use of more accurate explanatory variables such as distance to the road, distance to the forest, etc.

Long, M.; Ganteaume, A.; Jappiot, M.; Andrienko, G.; Andrienko, N.

2012-04-01

245

Numerical Analysis of a Cold Air Distribution System

Cold air distribution systems may reduce the operating energy consumption of an air-conditioned air supply system and improve outside air volume percentages and indoor air quality. However, indoor temperature patterns and velocity fields are easily...

Zhu, L.; Li, R.; Yuan, D.

2006-01-01

246

Rapid Spatial Distribution Seismic Loss Analysis for Multistory Buildings

approach is thus recommended that incorporates the effects of spatial distribution of earthquake induced damage to frame buildings. Moreover, the approach aims to discriminate between required repair and replacement damages. Suites of earthquakes...

Deshmukh, Pankaj Bhagvatrao

2012-07-16

247

Determination analysis of energy conservation standards for distribution transformers

This report contains information for US DOE to use in making a determination on proposing energy conservation standards for distribution transformers as required by the Energy Policy Act of 1992. The potential for saving energy with more efficient liquid-immersed and dry-type distribution transformers could be significant because these transformers account for an estimated 140 billion kWh of the annual energy lost in the delivery of electricity. The objective was to determine whether energy conservation standards for distribution transformers would have the potential for significant energy savings, be technically feasible, and be economically justified from a national perspective. It was found that energy conservation for distribution transformers would be technically and economically feasible. Based on the energy conservation options analyzed, 3.6-13.7 quads of energy could be saved from 2000 to 2030.

Barnes, P.R.; Van Dyke, J.W.; McConnell, B.W.; Das, S.

1996-07-01

248

Analysis of floc size distributions in a mixing tank

The aim of this work is to analyse the relation between floc size distribution and hydrodynamics in a mixing tank. The objectives are to answer the two following questions: (1) Does a steady state exist in flocculation? (2) Do the floc size distributions depend on the impeller? Flocculation experiments are realised for fixed physico-chemical conditions (pH 3.5) in a standardized

Carole Coufort; Claire Dumas; Denis Bouyer; Alain Liné

2008-01-01

249

Thermal Analysis of Antenna Structures. Part 2: Panel Temperature Distribution

NASA Technical Reports Server (NTRS)

This article is the second in a series that analyzes the temperature distribution in microwave antennas. An analytical solution in a series form is obtained for the temperature distribution in a flat plate analogous to an antenna surface panel under arbitrary temperature and boundary conditions. The solution includes the effects of radiation and air convection from the plate. Good agreement is obtained between the numerical and analytical solutions.

Schonfeld, D.; Lansing, F. L.

1983-01-01

250

Reverse-link performance analysis in CDMA distributed antenna systems

An exact reverse-link Eb/I0 expression for a multi-cell CDMA generalized distributed antenna system (GDAS) is derived, and the reverse-link outage capacity of various antenna structures is investigated by taking the power control dynamic range into account. Our investigation shows that distributed antenna structures enhance capacity and save power compared with traditional centralized antenna structures. Furthermore, GDAS is a cost-effective solution

Chen Peng; Wu Wei-ling; Su Jie

2005-01-01

251

The temperature dependence of the dynamics of mesophilic and thermophilic dihydrofolate reductase is examined using elastic incoherent neutron scattering. It is demonstrated that the distribution of atomic displacement amplitudes can be derived from the elastic scattering data by assuming a (Weibull) functional form that resembles distributions seen in molecular dynamics simulations. The thermophilic enzyme has a significantly broader distribution than

Lars Meinhold; David Clement; Moeava Tehei; Roy Daniel; John L. Finney; Jeremy C. Smith

2008-01-01

252

Performance Analysis of Distributed Object-Oriented Applications

NASA Technical Reports Server (NTRS)

The purpose of this research was to evaluate the efficiency of a distributed simulation architecture which creates individual modules which are made self-scheduling through the use of a message-based communication system used for requesting input data from another module which is the source of that data. To make the architecture as general as possible, the message-based communication architecture was implemented using standard remote object architectures (Common Object Request Broker Architecture (CORBA) and/or Distributed Component Object Model (DCOM)). A series of experiments were run in which different systems are distributed in a variety of ways across multiple computers and the performance evaluated. The experiments were duplicated in each case so that the overhead due to message communication and data transmission can be separated from the time required to actually perform the computational update of a module each iteration. The software used to distribute the modules across multiple computers was developed in the first year of the current grant and was modified considerably to add a message-based communication scheme supported by the DCOM distributed object architecture. The resulting performance was analyzed using a model created during the first year of this grant which predicts the overhead due to CORBA and DCOM remote procedure calls and includes the effects of data passed to and from the remote objects. A report covering the distributed simulation software and the results of the performance experiments has been submitted separately. The above report also discusses possible future work to apply the methodology to dynamically distribute the simulation modules so as to minimize overall computation time.

Schoeffler, James D.

1998-01-01

253

Income distribution dependence of poverty measure: A theoretical analysis

NASA Astrophysics Data System (ADS)

Using a modified deprivation (or poverty) function, in this paper, we theoretically study the changes in poverty with respect to the ‘global’ mean and variance of the income distribution using Indian survey data. We show that when the income obeys a log-normal distribution, a rising mean income generally indicates a reduction in poverty while an increase in the variance of the income distribution increases poverty. This altruistic view for a developing economy, however, is not tenable anymore once the poverty index is found to follow a Pareto distribution. Here, although a rising mean income indicates a reduction in poverty, due to the presence of an inflexion point in the poverty function, there is a critical value of the variance below which poverty decreases with increasing variance while beyond this value, poverty undergoes a steep increase followed by a decrease with respect to higher variance. Identifying this inflexion point as the poverty line, we show that the Pareto poverty function satisfies all three standard axioms of a poverty index [N.C. Kakwani, Econometrica 43 (1980) 437; A.K. Sen, Econometrica 44 (1976) 219] whereas the log-normal distribution falls short of this requisite. Following these results, we make quantitative predictions to correlate a developing with a developed economy.
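The qualitative behavior claimed for the log-normal case (poverty falls with a rising mean, rises with variance) can be checked numerically through the headcount ratio. The poverty line and income units below are hypothetical; this is a check of the log-normal claim only, not of the paper's modified deprivation function:

```python
import math

def headcount_ratio(mean_income, variance, poverty_line):
    """Fraction of the population below the poverty line when income is
    log-normal with the given ('global') mean and variance."""
    # Recover the underlying normal parameters from the log-normal moments.
    sigma2 = math.log(1.0 + variance / mean_income ** 2)
    mu = math.log(mean_income) - 0.5 * sigma2
    z = (math.log(poverty_line) - mu) / math.sqrt(sigma2)
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))  # standard normal CDF

# Hypothetical units: poverty line 1.0, baseline mean income 2.0, variance 1.0.
base = headcount_ratio(2.0, 1.0, 1.0)
print(base)
print(headcount_ratio(3.0, 1.0, 1.0))   # higher mean  -> less poverty
print(headcount_ratio(2.0, 2.0, 1.0))   # higher variance -> more poverty
```

Both monotonicities hold whenever the poverty line lies below the mean income, consistent with the log-normal result stated in the abstract.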

Chattopadhyay, Amit K.; Mallick, Sushanta K.

2007-04-01

254

A network analysis of food flows within the United States of America.

The world food system is globalized and interconnected, in which trade plays an increasingly important role in facilitating food availability. We present a novel application of network analysis to domestic food flows within the USA, a country with global importance as a major agricultural producer and trade power. We find normal node degree distributions and Weibull node strength and betweenness centrality distributions. An unassortative network structure with high clustering coefficients exists. These network properties indicate that the USA food flow network is highly social and well-mixed. However, a power law relationship between node betweenness centrality and node degree indicates potential network vulnerability to the disturbance of key nodes. We perform an equality analysis, which serves as a benchmark for global food trade, where the Gini coefficient = 0.579, Lorenz asymmetry coefficient = 0.966, and Hoover index = 0.442. These findings provide insight into trade network scaling and proxy free trade and equitable network architectures. PMID:24773310
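Two of the equality measures reported (Gini coefficient and Hoover index) can be computed directly from node strengths. A sketch with toy values (not the paper's data, where Gini = 0.579 and Hoover = 0.442; the Lorenz asymmetry coefficient is omitted for brevity):

```python
import numpy as np

def gini(x):
    """Gini coefficient of a non-negative 1-D array (0 = perfect equality),
    via the standard rank-weighted-sum formula on sorted values."""
    x = np.sort(np.asarray(x, dtype=float))
    n = x.size
    return (2.0 * np.sum(np.arange(1, n + 1) * x) / (n * np.sum(x))) - (n + 1) / n

def hoover(x):
    """Hoover index: the share of the total that would have to be
    redistributed to equalize all nodes."""
    x = np.asarray(x, dtype=float)
    return 0.5 * np.sum(np.abs(x - x.mean())) / np.sum(x)

# Node strengths for a toy flow network (hypothetical values).
strengths = np.array([1.0, 1.0, 2.0, 5.0, 11.0])
print(gini(strengths))    # 0.48 for these values
print(hoover(strengths))  # 0.40 for these values
```

Applied to the strength of every node in a flow network, these two numbers summarize how unevenly the total flow is distributed across nodes.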

Lin, Xiaowen; Dang, Qian; Konar, Megan

2014-05-20

255

Analysis and machine mapping of the distribution of band recoveries

A method of calculating distance and bearing from banding site to recovery location based on the solution of a spherical triangle is presented. X and Y distances on an ordinate grid were applied to computer plotting of recoveries on a map. The advantages and disadvantages of tables of recoveries by State or degree block, axial lines, and distance of recovery from banding site for presentation and comparison of the spatial distribution of band recoveries are discussed. A special web-shaped partition formed by concentric circles about the point of banding and great circles at 30-degree intervals through the point of banding has certain advantages over other methods. Comparison of distributions by means of a χ² contingency test is illustrated. The statistic V = χ²/N can be used as a measure of difference between two distributions of band recoveries and its possible use is illustrated as a measure of the degree of migrational homing.
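The distance-and-bearing computation from banding site to recovery location can be sketched as follows. This uses the haversine form and an assumed spherical Earth radius rather than the paper's exact spherical-triangle solution:

```python
import math

def distance_and_bearing(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance (haversine) and initial bearing from the
    banding site (lat1, lon1) to the recovery location (lat2, lon2),
    coordinates in decimal degrees, bearing clockwise from true north."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    # Haversine distance.
    a = (math.sin((p2 - p1) / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dlon / 2) ** 2)
    dist = 2 * radius_km * math.asin(math.sqrt(a))
    # Initial bearing from the spherical triangle.
    y = math.sin(dlon) * math.cos(p2)
    x = math.cos(p1) * math.sin(p2) - math.sin(p1) * math.cos(p2) * math.cos(dlon)
    bearing = math.degrees(math.atan2(y, x)) % 360.0
    return dist, bearing

# Example: a recovery one degree due south (~111 km, bearing 180).
d, b = distance_and_bearing(45.0, -100.0, 44.0, -100.0)
print(round(d), round(b))
```

Given distance and bearing, each recovery can then be assigned to a cell of the web-shaped partition (distance ring x 30-degree sector) described in the abstract.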

Cowardin, L.M.

1977-01-01

256

Evaluation of frequency distributions for flood hazard analysis

Many different frequency distributions and fitting methods are used to determine the magnitude and frequency of floods and rainfall. Ten different combinations of frequency distributions and fitting methods are evaluated by summarizing the differences in the 0.002 exceedance probability quantile (500-year event), presenting graphical displays of the 10 estimates of the 0.002 quantile, and performing statistical tests to determine if differences are statistically significant. This evaluation indicated there are some statistically significant differences among the methods but, from an engineering standpoint, these differences may not be significant.
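A minimal sketch of estimating the 0.002 exceedance-probability quantile (the 500-year event) from an annual-peak series. The data here are synthetic and the two fitted families are stand-ins for the ten distribution/fitting-method combinations the paper evaluates:

```python
import numpy as np
from scipy import stats

# Synthetic annual-peak flow series (hypothetical units and parameters).
rng = np.random.default_rng(3)
peaks = stats.gumbel_r.rvs(loc=1000.0, scale=300.0, size=80, random_state=rng)

# Fit candidate distributions and compare their 500-year estimates:
# the quantile with exceedance probability 0.002 is ppf(1 - 0.002).
for dist in (stats.gumbel_r, stats.genextreme):
    params = dist.fit(peaks)
    q500 = dist.ppf(1.0 - 0.002, *params)
    print(dist.name, round(q500))
```

Repeating this across families and fitting methods, and comparing the resulting 0.002-quantile estimates, reproduces the kind of side-by-side evaluation the abstract describes.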

Thomas, Wilbert O., Jr.; Kuriki, Minoru; Suetsugi, Tadashi

1995-01-01

257

Incremental and on-demand random walk for iterative power distribution network analysis

Power distribution networks (PDNs) are designed and analyzed iteratively. Random walk is among the most efficient methods for PDN analysis. We develop in this paper an incremental and on-demand random walk to reduce iterative analysis time. During each iteration, we map the design changes as positive or negative random walks for observed nodes. To update PDN analysis result,
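The underlying idea, estimating one observed node's voltage by random walks instead of re-solving the whole network, can be sketched on a toy resistive grid. This is a generic random-walk PDN solver under assumed node values, not the authors' incremental algorithm:

```python
import random

# Toy power-distribution network (hypothetical): node 0 is the fixed
# VDD pad at 1.0 V; free nodes 1 and 2 each draw 0.1 A of load current.
fixed = {0: 1.0}                       # known-voltage nodes
neighbors = {1: {0: 1.0, 2: 1.0},      # node -> {neighbor: conductance (S)}
             2: {1: 1.0}}
load = {1: 0.1, 2: 0.1}                # current drawn at each free node (A)

def walk(start, rng):
    """One walk realizing v_i = sum_j (g_ij/G_i) v_j - I_i/G_i: move to a
    neighbor with probability g_ij/G_i, pay -I_i/G_i at each visited free
    node, and stop on hitting a known-voltage node."""
    node, reward = start, 0.0
    while node not in fixed:
        G = sum(neighbors[node].values())
        reward -= load[node] / G
        r, acc = rng.random() * G, 0.0
        for nxt, g in neighbors[node].items():
            acc += g
            if r <= acc:
                node = nxt
                break
    return reward + fixed[node]

rng = random.Random(42)
v1 = sum(walk(1, rng) for _ in range(20000)) / 20000
print(v1)   # exact nodal analysis of this toy grid gives v1 = 0.8 V
```

Because each estimate touches only the nodes a walk visits, a design change local to one region only perturbs walks through that region, which is the property incremental schemes exploit.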

Yiyu Shi; Wei Yao; Jinjun Xiong; Lei He

2009-01-01

258

Incremental and on-demand random walk for iterative power distribution network analysis

Power distribution networks (PDNs) are designed and analyzed iteratively. Random walk is among the most efficient methods for PDN analysis. We develop in this paper an incremental and on-demand random walk to reduce iterative analysis time. During each iteration, we map the design changes as positive or negative random walks for observed nodes. To update PDN analysis result, we only

Yiyu Shi; Wei Yao; Jinjun Xiong; Lei He

2009-01-01

259

Preliminary distributional analysis of US endangered bird species

A first exploration of applications of ecological niche modeling and geographic distributional prediction to endangered species protection is developed. Foci of richness of endangered bird species are identified in coastal California and along the southern fringe of the United States. Species included on the Endangered Species List on the basis of peripheral populations inflate these concentrations considerably. Species without protection

MANDALINE E. GODOWN; A. TOWNSEND PETERSON

2000-01-01

260

A Performance Study of Distributed Timed Automata Reachability Analysis

We experimentally evaluate an existing distributed reachability algorithm for timed automata on a Linux Beowulf cluster. It is discovered that the algorithm suffers from load balancing problems and a high communication overhead. The load balancing problems are caused by inclusion checking performed between symbolic states unique to the timed automaton reachability algorithm. We propose adding a proportional load balancing controller

Gerd Behrmann

2002-01-01

261

RAINFALL DATA ANALYSIS USING THE GAMMA DISTRIBUTION FUNCTION

The Gamma distribution function can be useful for fitting rainfall data. An integral part of the assessment of storm loads on water quality is the statistical evaluation of rainfall records. Hourly rainfall records of many years duration are cumbersome and difficult to analyze. The ...

262

Analysis of vegetation distribution in Interior Alaska and sensitivity to

(Abstract fragment.) ... distribution of four major vegetation types: tundra, deciduous forest, black spruce forest and white spruce ... by elevation, precipitation and south-to-north aspect. At the second step, forest was separated into deciduous ... temperatures exceeded a critical limit (+2 °C). Deciduous forests expand their range the most when any two ...

McGuire, A. David

263

Distribution and Phylogenetic Analysis of Family 19 Chitinases in Actinobacteria

In organisms other than higher plants, family 19 chitinase was first discovered in Streptomyces griseus HUT6037, and later, the general occurrence of this enzyme in Streptomyces species was demonstrated. In the present study, the distribution of family 19 chitinases in the class Actinobacteria and the phylogenetic relationship of Actinobacteria family 19 chitinases with family 19 chitinases of other organisms

Tomokazu Kawase; Akihiro Saito; Toshiya Sato; Ryo Kanai; Takeshi Fujii; Naoki Nikaidou; Kiyotaka Miyashita; Takeshi Watanabe

2004-01-01

264

Greenhouse Gas Emission Analysis for Distributed Energy System

The greenhouse effect directly affects human health. This paper first introduces the greenhouse effect and the distributed energy system. Four systems, including a natural gas micro-turbine system, an internal combustion biogas engine system, a Phosphoric Acid Fuel Cell combined heat and power system, and a Proton Exchange Membrane Fuel Cell system, are taken as the targets. Greenhouse gas concentration and emission rate

Zhai Rong-rong; Yang Yong-ping; Duan Li-qiang

2008-01-01

265

THE EPANET PROGRAMMER'S TOOLKIT FOR ANALYSIS OF WATER DISTRIBUTION SYSTEMS

The EPANET Programmer's Toolkit is a collection of functions that helps simplify computer programming of water distribution network analyses. The functions can be used to read in a pipe network description file, modify selected component properties, run multiple hydraulic and wa...

266

ANALYSIS OF COARSELY GROUPED DATA FROM THE LOGNORMAL DISTRIBUTION

A missing information technique is applied to blood lead data that is both grouped and assumed to be lognormally distributed. These maximum likelihood techniques are extended from the simple lognormal case to obtain solutions for a general linear model case. Various models are fi...

267

Voltage analysis of distribution systems with DFIG wind turbines

Wind energy is becoming the most viable renewable energy source, driven by growing concerns over carbon emissions, uncertainty in fossil fuel supplies, and government policy impetus. The increasing penetration of wind power in distribution systems may significantly affect voltage stability of the systems, particularly during wind turbine cut-in and cut-off disturbances. Currently, doubly fed induction generator

Baohua Dong; Sohrab Asgarpoor; Wei Qiao

2009-01-01

268

Metagenomic Analysis of Water Distribution System Bacterial Communities

The microbial quality of drinking water is assessed using culture-based methods that are highly selective and that tend to underestimate the densities and diversity of microbial populations inhabiting distribution systems. In order to better understand the effect of different dis...

269

Conduits and dike distribution analysis in San Rafael Swell, Utah

NASA Astrophysics Data System (ADS)

Volcanic fields generally consist of scattered monogenetic volcanoes, such as cinder cones and maars. The temporal and spatial distribution of monogenetic volcanoes and the probability of future activity within volcanic fields are studied with the goals of understanding the origins of these volcano groups and forecasting potential future volcanic hazards. The subsurface magmatic plumbing systems associated with volcanic fields, however, are rarely observed or studied. Therefore, we investigated a highly eroded and exposed magmatic plumbing system on the San Rafael Swell (UT) that consists of dikes, volcano conduits and sills. San Rafael Swell is part of the Colorado Plateau and is located east of the Rocky Mountain seismic belt and the Basin and Range. The overburden thickness at the time of mafic magma intrusion (Pliocene; ca. 4 Ma) into Jurassic sandstone is estimated to be ~800 m based on paleotopographical reconstructions. Based on a geologic map by P. Delaney and colleagues, and new field research, a total of 63 conduits are mapped in this former volcanic field. The conduits each reveal features of root zones and/or lower diatremes, including rapid dike expansion, peperite, and brecciated intrusive and host rocks. A recrystallized baked zone in the host rock is also observed around many conduits. Most conduits are basaltic or shonkinitic, with thicknesses of >10 m, and are associated with feeder dikes intruded along N-S trending joints in the host rock, whereas two conduits are syenitic, suggesting development from underlying cognate sills. Conduit distribution, analyzed by a kernel function method with elliptical bandwidth, shows a N-S elongated area of higher conduit density regardless of the alignment azimuths of closely spaced conduits (nearest-neighbor distance <200 m). In addition, dike density was calculated as total dike length per unit area (km/km^2). Conduit and sill distribution is concordant with the high dike density area.
Notably, the distribution of conduits is not random with respect to the dike distribution (greater than 99% confidence, Kolmogorov-Smirnov test). On the other hand, dike density at each conduit location suggests that there is no dike-density threshold for conduit formation; in other words, conduits may develop even from short mapped dikes in low dike density areas. These results show the effectiveness of studying volcanic vent distribution to infer the size of the magmatic system below volcanic fields, and they highlight the uncertainty of forecasting the location of new monogenetic volcanoes in active fields, which may be associated with a single dike intrusion.

Kiyosugi, K.; Connor, C.; Wetmore, P. H.; Ferwerda, B. P.; Germa, A.

2011-12-01
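A kernel-density estimate of vent locations of the kind used in this study can be sketched with SciPy. The conduit coordinates below are hypothetical (the San Rafael data are not reproduced here), and `gaussian_kde`'s covariance-based bandwidth is only a rough stand-in for the elliptical-bandwidth kernel the authors describe:

```python
import numpy as np
from scipy.stats import gaussian_kde

# Hypothetical conduit coordinates (km); a N-S elongated cluster mimicking
# the mapped pattern of 63 conduits.
rng = np.random.default_rng(1)
conduits = np.column_stack([
    rng.normal(0.0, 0.5, 63),   # E-W (x), tight
    rng.normal(0.0, 2.0, 63),   # N-S (y), elongated
])

# gaussian_kde estimates a full (elliptical) covariance for its kernel,
# so the fitted density inherits the N-S elongation of the data.
kde = gaussian_kde(conduits.T)

# Density is higher at the cluster centre than well off to the east.
centre = kde([[0.0], [0.0]])[0]
edge = kde([[3.0], [0.0]])[0]
print(centre, edge)
```

A map of such density values over a grid is what delineates the "higher conduit density area" referred to in the abstract.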

270

NASA Astrophysics Data System (ADS)

As a major aspect of marine pollution, oil released into the sea has serious biological and environmental impacts. Among remote sensing systems, synthetic aperture radar (SAR), which offers a non-destructive investigation method, can provide valuable synoptic information about the position and size of an oil spill thanks to its wide area coverage and day/night, all-weather capabilities. In this paper we present a new automated method for oil-spill monitoring, based on the combination of the Weibull Multiplicative Model and machine learning techniques to differentiate between dark spots and the background. First, a filter based on the Weibull Multiplicative Model is applied to each sub-image. Second, the sub-image is segmented by two different neural network techniques (Pulse-Coupled Neural Networks and Multilayer Perceptron Neural Networks). As the last step, a very simple filtering process eliminates false targets. The proposed approaches were tested on 20 ENVISAT and ERS2 images containing dark spots, with the same parameters used in all tests. For the overall dataset, average accuracies of 94.05% and 95.20% were obtained for the PCNN and MLP methods, respectively. The average computational time for dark-spot detection in a 256 × 256 image is about 4 s for PCNN segmentation using IDL software, currently the fastest in this field. Our experimental results demonstrate that the proposed approach is fast, robust and effective, and can be applied to future spaceborne SAR images.

Taravat, A.; Del Frate, F.

2013-09-01

271

PanDA: distributed production and distributed analysis system for ATLAS

NASA Astrophysics Data System (ADS)

A new distributed software system was developed in the fall of 2005 for the ATLAS experiment at the LHC. This system, called PANDA, provides an integrated service architecture with late binding of jobs, maximal automation through layered services, tight binding with the ATLAS Distributed Data Management system [1], advanced error discovery and recovery procedures, and other features. In this talk, we will describe the PANDA software system. Special emphasis will be placed on the evolution of PANDA based on one and a half years of real experience in carrying out Computer System Commissioning data production [2] for ATLAS. The architecture of PANDA is well suited for the computing needs of the ATLAS experiment, which is expected to be one of the first HEP experiments to operate at the petabyte scale.

Maeno, T.

2008-07-01

272

In this paper, the element incidence matrix is extended to develop a comprehensive three-phase power flow program for radial distribution systems. Three-phase overhead or underground primary feeders and double-phase or single-phase line sections near the ends of feeder laterals are considered. Unbalanced loads of different types, including constant power, constant current and constant impedance, are modeled

Hany E. Farag; E. F. El-Saadany; Ramadan El Shatshat; Aboelsood Zidan

2011-01-01

273

NASA Astrophysics Data System (ADS)

The paper considers the problem of reliability analysis of distributed computer systems with a client-server architecture. A distributed computer system is a set of hardware and software implementing the following main functions: processing, storage, transmission and data protection. The paper presents a scheme of distributed computer system functioning represented as a graph whose vertices are the functional states of the system and whose arcs are transitions from one state to another depending on the prevailing conditions. In the reliability analysis we consider indicators such as the probability of the system transitioning into stopping and accident states, as well as the intensities of these transitions. The proposed model allows us to obtain correlations for the reliability parameters of a distributed computer system without any assumptions about the distribution laws of random variables or the number of elements in the system.

Kovalev, I. V.; Zelenkov, P. V.; Karaseva, M. V.; Tsarev, M. Yu; Tsarev, R. Yu

2015-01-01

274

Exploring Vector Fields with Distribution-based Streamline Analysis

Streamline-based techniques are designed based on the idea that properties of streamlines are indicative of features in the underlying field. In this paper, we show that statistical distributions of measurements along the trajectory of a streamline can be used as a robust and effective descriptor to measure the similarity between streamlines. With the distribution-based approach, we present a framework for interactive exploration of 3D vector fields with streamline query and clustering. Streamline queries allow us to rapidly identify streamlines that share similar geometric features with a target streamline. Streamline clustering allows us to group together streamlines of similar shapes. Based on the user's selection, different clusters with different features at different levels of detail can be visualized to highlight features in 3D flow fields. We demonstrate the utility of our framework with simulation data sets of varying nature and size.

Lu, Kewei; Chaudhuri, Abon; Lee, Teng-Yok; Shen, Han-Wei; Wong, Pak C.

2013-02-26
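One way to realize such a distribution-based similarity measure is to compare samples of a quantity measured along each trajectory with a distribution distance. The sketch below uses hypothetical per-point curvature samples and the 1D Wasserstein distance; the paper's exact descriptor and metric may differ:

```python
import numpy as np
from scipy.stats import wasserstein_distance

# Hypothetical curvature samples along three streamlines.
rng = np.random.default_rng(3)
s1 = rng.normal(0.20, 0.05, 300)  # gently curving
s2 = rng.normal(0.22, 0.05, 300)  # similar shape to s1
s3 = rng.normal(0.80, 0.10, 300)  # strongly curving

# A distribution distance as the streamline similarity measure: small for
# streamlines with similar measurement distributions, large otherwise.
d_similar = wasserstein_distance(s1, s2)
d_different = wasserstein_distance(s1, s3)
print(d_similar, d_different)
```

A query then amounts to ranking all streamlines by this distance to the target, and clustering amounts to running any standard clustering algorithm on the pairwise distance matrix.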

275

Modeling and Analysis of Power Distribution Networks for Gigabit Applications

As the operating frequency of digital systems increases and voltage swing decreases, it becomes increasingly important to accurately characterize and analyze power distribution networks (PDN). This paper presents the modeling, simulation, and measurement of a PDN in a high-speed FR4 printed circuit board (PCB) designed for chip-to-chip communication at a data rate of 3.2 Gbps and above. The test board

Chuck Yuan; Joong-ho Kim; Madhavan Swaminathan

2003-01-01

276

Quantum key distribution using multilevel encoding: security analysis

We propose an extension of quantum key distribution based on encoding the key into quNits, i.e. quantum states in an N-dimensional Hilbert space. We estimate both the mutual information between the legitimate parties and the eavesdropper, and the error rate, as a function of the dimension of the Hilbert space. We derive the information gained by an eavesdropper using optimal

Mohamed Bourennane; Anders Karlsson; Gunnar Björk; Nicolas Gisin; Nicolas J. Cerf

2002-01-01

277

Human Face Analysis Based on Distributed 2d Appearance Models

We propose a new framework, called DtTOPS (Distributed Concurrent TOpdown Processing Scheme), for a computer vision system which is suitable in a parallel processing environment. A set of multiple top-down analyses are performed concurrently and distributively. Each of the analyses is based on a different 2D model corresponding to a different appearance of a 3D object.

Yasushi Sumi; Yuichi Ohta

1992-01-01

278

Analysis of phase distribution phenomena in microgravity environments

NASA Technical Reports Server (NTRS)

The purpose of the research presented in this paper is to demonstrate the ability of multidimensional two-fluid models for bubbly two-phase flow to accurately predict lateral phase distribution phenomena in microgravity environments. If successful, this research should provide NASA with mechanistically-based analytical methods which can be used for multiphase space system design and evaluation, and should be the basis for future shuttle experiments for model verification.

Lahey, Richard T., Jr.; Bonetto, F.

1994-01-01

279

We compare the performance of various analytical retroreflecting bidirectional reflectance distribution function (BRDF) models to assess how accurately they reproduce measured data of retroreflecting materials. We introduce a new parametrization, the back vector parametrization, to analyze retroreflecting data, and we show that this parametrization better preserves the isotropy of the data. Furthermore, we update existing BRDF models to improve the representation of retroreflective data. PMID:25606744

Belcour, Laurent; Pacanowski, Romain; Delahaie, Marion; Laville-Geay, Aude; Eupherte, Laure

2014-12-01

280

Periodic analysis of total ozone and its vertical distribution

NASA Technical Reports Server (NTRS)

Both total ozone and vertical distribution ozone data from the period 1957 to 1972 are analyzed. For total ozone, improved monthly zonal means for both hemispheres are computed by weighting individual station monthly means by a factor which compensates for the close grouping of stations in certain regions of latitude bands. Longitudinal variability shows maxima in summer in both hemispheres but, in winter, only in the Northern Hemisphere. The geographical distributions of the long-term mean and the annual, quasi-biennial and semiannual waves in total ozone over the Northern Hemisphere are presented. The extratropical amplitude of the annual wave is by far the largest of the three, as much as 120 m atm cm over northern Siberia. There is a tendency for all three waves to have maxima in high latitudes. Monthly means of the vertical distribution of ozone determined from 3 to 8 years of ozonesonde data over North America are presented. Number density is highest in the Arctic near 18 km. The region of maximum number density slopes upward toward 10° N, where the long-term mean is 45 × 10^11 molecules/cm^3 near 26 km.

Wilcox, R. W.; Nastrom, G. D.; Belmont, A. D.

1975-01-01
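The station-weighting idea in this abstract can be sketched in a few lines. The stations, values, and 30-degree sector width below are all hypothetical; the weight 1/(stations per sector) is a simple stand-in for the compensation factor the authors describe:

```python
import numpy as np

# Hypothetical station monthly means (m atm cm) in one latitude band.
# Three stations cluster in one longitude sector, one stands alone;
# a plain mean would be biased toward the cluster.
lon = np.array([10.0, 12.0, 14.0, 200.0])
ozone = np.array([350.0, 352.0, 348.0, 300.0])

# Weight each station by 1 / (number of stations in its 30-degree sector).
sector = (lon // 30).astype(int)
counts = np.bincount(sector, minlength=sector.max() + 1)
w = 1.0 / counts[sector]

plain = ozone.mean()                     # 337.5, pulled toward the cluster
weighted = np.average(ozone, weights=w)  # 325.0, each sector counts once
print(plain, weighted)
```

The weighted zonal mean treats each occupied longitude sector equally, which is the effect the compensation factor is meant to achieve.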

281

Analysis of magnetic electron lens with secant hyperbolic field distribution

NASA Astrophysics Data System (ADS)

Electron-optical imaging instruments like Scanning Electron Microscope (SEM) and Transmission Electron Microscope (TEM) use specially designed solenoid electromagnets for focusing of the electron beam. Indicators of imaging performance of these instruments, like spatial resolution, have a strong correlation with the focal characteristics of the magnetic lenses, which in turn have been shown to be sensitive to the details of the spatial distribution of the axial magnetic field. Owing to the complexity of designing practical lenses, empirical mathematical expressions are important to obtain the desired focal properties. Thus the degree of accuracy of such models in representing the actual field distribution determines accuracy of the calculations and ultimately the performance of the lens. Historically, the mathematical models proposed by Glaser [1] and Ramberg [2] have been extensively used. In this paper the authors discuss another model with a secant-hyperbolic type magnetic field distribution function, and present a comparison between models, utilizing results from finite element-based field simulations as the reference for evaluating performance.

Pany, S. S.; Ahmed, Z.; Dubey, B. P.

2014-12-01

282

NASA Technical Reports Server (NTRS)

A computer program was developed for calculating the statistical fast fracture reliability and failure probability of ceramic components. The program includes the two-parameter Weibull material fracture strength distribution model, using the principle of independent action for polyaxial stress states and Batdorf's shear-sensitive as well as shear-insensitive crack theories, all for volume distributed flaws in macroscopically isotropic solids. Both penny-shaped cracks and Griffith cracks are included in the Batdorf shear-sensitive crack response calculations, using Griffith's maximum tensile stress or critical coplanar strain energy release rate criteria to predict mixed mode fracture. Weibull material parameters can also be calculated from modulus of rupture bar tests, using the least squares method with known specimen geometry and fracture data. The reliability prediction analysis uses MSC/NASTRAN stress, temperature and volume output, obtained from the use of three-dimensional, quadratic, isoparametric, or axisymmetric finite elements. The statistical fast fracture theories employed, along with selected input and output formats and options, are summarized. An example problem to demonstrate various features of the program is included.

Gyekenyesi, J. P.

1985-01-01
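For a uniformly stressed volume under uniaxial tension, the two-parameter volume-flaw Weibull model named in this abstract reduces to Pf = 1 − exp(−V·(σ/σ0)^m). A minimal sketch with illustrative parameters (not values from the report; the full program additionally handles polyaxial stress states and Batdorf crack theories):

```python
import numpy as np

def weibull_failure_probability(stress_mpa, m, sigma0_mpa, volume_mm3):
    """Two-parameter volume-flaw Weibull failure probability for a uniformly
    stressed volume: Pf = 1 - exp(-V * (sigma/sigma0)^m). sigma0 is the
    characteristic strength per unit volume (units MPa * mm^(3/m))."""
    risk = volume_mm3 * (stress_mpa / sigma0_mpa) ** m
    return 1.0 - np.exp(-risk)

# Illustrative numbers: Weibull modulus m = 10, sigma0 = 500, V = 100 mm^3.
pf_low = weibull_failure_probability(200.0, 10, 500.0, 100.0)   # ~1%
pf_high = weibull_failure_probability(400.0, 10, 500.0, 100.0)  # ~100%
print(pf_low, pf_high)
```

The steep jump between the two stress levels reflects the high Weibull modulus: a large m concentrates the strength distribution, so failure probability rises sharply near the characteristic strength.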

283

This study provides a step-wise analysis of a conceptual grid-based distributed rainfall-runoff model, the United States National Weather Service (US NWS) Hydrology Laboratory Research Distributed Hydrologic Model (HL-RDHM). It evaluates model parameter sensitivities for annual, monthly, and event time periods with the intent of elucidating the key parameters impacting the distributed model's forecasts. This study demonstrates a methodology that balances

Y. Tang; P. Reed; K. van Werkhoven; T. Wagener

2007-01-01

284

APPROXIMATE NULL DISTRIBUTION OF THE LARGEST ROOT IN MULTIVARIATE ANALYSIS

The greatest root distribution occurs everywhere in classical multivariate analysis, but even under the null hypothesis the exact distribution has required extensive tables or special-purpose software. We describe a simple approximation, based on the Tracy–Widom distribution, that in many cases can be used instead of tables or software, at least for initial screening. The quality of the approximation is studied, and its use is illustrated in a variety of settings. PMID:20526465

Johnstone, Iain M.

2010-01-01

285

Comparative QSAR- and Fragments Distribution Analysis of Drugs, Druglikes, Metabolic Substances, and Antimicrobial Compounds. Emre Karakoc, S. Cenk Sahinalp, and Artem Cherkasov, School of Computing Science, Simon ... that include conventional drugs, inactive druglikes, antimicrobial substituents, and bacterial and human ...

286

To identify genetic loci influencing central obesity and fat distribution, we performed a meta-analysis of 16 genome-wide association studies (GWAS, N = 38,580) informative for adult waist circumference (WC) and waist–hip ...

Hunter, David J.

287

Evolvability Analysis: Distribution of Hyperblobs in a Variable-Length Protein Genotype Space

Evolvability Analysis: Distribution of Hyperblobs in a Variable-Length Protein Genotype Space (...-gun, Kyoto 619-0288, Japan). Abstract: A variable-length protein genotype space (the whole amino-acid sequence ...) ... functional (viable) genotypes is estimated. Functional genotypes are assumed to distribute as a 'hyperblob'

Nehaniv, Chrystopher

288

ANALYSIS AND REGIONALIZATION OF THE DIURNAL DISTRIBUTION OF TORNADOES IN THE UNITED STATES

The central and eastern United States is divided into 157 overlapping square cells. Within each cell, the tornadoes that occurred from 1916 to 1964 are summed in I-hr increments. The resulting histograms are subjected to harmonic analysis. The spatial distributions of the reduction of variance and phase angles of the harmonic components suggest substantial variation in the diurnal distribution of

RICHARD H. SKAGGS

1969-01-01

289

Distributional Effects in a General Equilibrium Analysis of Social Security

Distributional Effects in a General Equilibrium Analysis of Social Security, by Laurence J. Kotlikoff. This paper reviews our recent general equilibrium analyses of the distributional effects of social security ... realistic pattern of births and length of life. We reach six conclusions. First, Social Security ...

Spence, Harlan Ernest

290

Postmortem Analysis of Neuron Distributions in the Locus Coeruleus of Alcoholics and Suicidal Victims. Donna K. Pauler, August 25, 1994. Abstract: We investigate neuron distribution and morphology ... suicidal alcoholics, and suicidal alcoholics. The data consist of postmortem neuron measurements, counts, and spatial ...

291

NASA Technical Reports Server (NTRS)

A numerical method is presented for the stress analysis of stiffened-shell structures of arbitrary cross section under nonuniform temperature distributions. The method is based on a previously published procedure that is extended to include temperature effects and multicell construction. The application of the method to practical problems is discussed and an illustrative analysis is presented of a two-cell box beam under the combined action of vertical loads and a nonuniform temperature distribution.

Heldenfels, Richard R

1951-01-01

292

MINERALOGICAL ANALYSIS AND URANIUM DISTRIBUTION OF THE SEDIMENTS FROM THE UPPER JACKSON FORMATION, KARNES COUNTY, TEXAS. A Thesis by PAUL HAROLD FISHMAN. Submitted to the Graduate College of Texas A&M University in partial fulfillment of the requirement for the degree of MASTER OF SCIENCE, December 1978. Major Subject: Geology.

Fishman, Paul Harold

1978-01-01

293

Goodness-of-fit analysis for lumber data

Four different probability distributions were studied to evaluate their relative goodness-of-fit in describing the modulus of rupture (MOR) and modulus of elasticity (MOE) of populations of dimension lumber. The distributions under consideration were the normal, lognormal, Weibull and Johnson's SB. The populations of lumber consisted of 96 data sets of various species groups, mechanical properties, sizes, structural grades and growth

P. J. Pellicane; Fort Collins

1985-01-01
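A goodness-of-fit comparison of the kind this entry describes can be sketched with SciPy. The MOR values below are synthetic stand-ins for one of the 96 lumber data sets, and the Kolmogorov-Smirnov statistic is used as the relative fit measure; the original study's exact criterion may differ:

```python
import numpy as np
from scipy import stats

# Synthetic modulus-of-rupture data (MPa): Weibull, shape 5, scale 40.
rng = np.random.default_rng(2)
mor = rng.weibull(5.0, 2000) * 40.0

# Fit two candidate distributions by maximum likelihood (location at 0).
wb = stats.weibull_min.fit(mor, floc=0)
ln = stats.lognorm.fit(mor, floc=0)

# Smaller KS statistic = better agreement with the empirical distribution.
d_wb = stats.kstest(mor, "weibull_min", args=wb).statistic
d_ln = stats.kstest(mor, "lognorm", args=ln).statistic
print(f"KS Weibull={d_wb:.3f}  KS lognormal={d_ln:.3f}")
```

Because the synthetic data are truly Weibull (and slightly left-skewed, which a lognormal cannot reproduce), the Weibull fit achieves the smaller KS distance here.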

294

NASA Astrophysics Data System (ADS)

This research addresses two aspects of spatially distributed modeling: uncertainty analysis (UA), described as propagation of uncertainty from spatially distributed input factors on model outputs; and sensitivity analysis (SA) defined as assessment of relative importance of spatially distributed factors on the model output variance. An evaluation framework for spatially distributed models is proposed based on a combination of sequential Gaussian simulation (sGs) and the global, variance-based, SA method of Sobol to quantify model output uncertainty due to spatially distributed input factors, together with the corresponding sensitivity measures. The framework is independent of model assumptions; it explores the whole space of input factors, provides measures of factor’s importance (first-order effects) and their interactions (higher-order effects), and assesses the effect of spatial resolution of the model input factors, one of the least understood contributors to uncertainty and sensitivity of distributed models. A spatially distributed hydrological model (Regional Simulation Model, RSM), applied to a site in South Florida (Water Conservation Area-2A), is used as a benchmark for the study. The model domain is spatially represented by triangular elements (average size of 1.1 km2). High resolution land elevation measurements (400 x 400 m, +/-0.15 m vertical error) obtained by the USGS' Airborne Height Finder survey are used in the study. The original survey data (approximately 2,600 points) together with smaller density subsets drawn from this data (1/2, 1/4, 1/8, 1/16, 1/32 of original density) are used for generating equiprobable maps of effective land elevation factor values via sGs. These alternative realizations are sampled pseudo-randomly and used as inputs for model runs. In this way, uncertainty regarding a spatial representation of the elevation surface is transferred into uncertainty of model outputs. 
The results show that below a specific threshold of data density (1/8 of the original), model uncertainty and sensitivity are impacted by the density of land elevation data used for deriving effective land elevation factor values: uncertainty of model outputs increases as the density of elevation data decreases. A similar pattern is observed for the relative importance of the sensitivity indices of the land elevation factor. The results indicate that reduced land elevation data density could be used without significantly compromising the certainty of RSM predictions and the subsequent decision-making process for the specific WCA-2A conditions. The methodology proposed in this research is useful for model quality control and for guiding field measurement campaigns by optimizing data collection in terms of cost-benefit analysis.

Zajac, Z. B.; Munoz-Carpena, R.; Vanderlinden, K.

2009-12-01

295

Groundwork for Integrated Analysis of Distributed S3C Data

NASA Astrophysics Data System (ADS)

We present a vision for the evolution of existing and future NASA Sun Solar System Connection (S3C) data resources into an integrated data analysis environment. Focusing first on the time-series datasets typical of in-situ measurements, we describe an architecture designed to overcome the traditional barriers to integrated analysis, namely data access and format variations. We are building a groundwork layer on top of which it will be possible to construct common science libraries for plotting and analysis, very similar to the common set of image processing tools developed within the Solar community (for which the data format problem is greatly simplified due to the nearly universal use of FITS). The core of our approach is the use of a set of internal data models, one for each type of in-situ science data. We present preliminary versions of the data models for comment. Furthermore, the areas in which data model design have a significant impact on science analysis capabilities will be discussed. The approach presented here is currently deployed in the DataShop analysis tool (http://sd-www.jhuapl.edu/datashop) and is also planned for use in the Virtual Heliospheric Observatory (VHO, http://vho.nasa.gov) and the Virtual Space Physics Observatory (VSPO, http://vspo.gsfc.nasa.gov).

Vandegriff, J.; Roberts, A.; Szabo, A.

2006-05-01

296

Time-Score Analysis in Criterion-Referenced Tests. Final Report.

ERIC Educational Resources Information Center

The family of Weibull distributions was investigated as a model for the distributions of response times for items in computer-based criterion-referenced tests. The fits of these distributions were, with a few exceptions, good to excellent according to the Kolmogorov-Smirnov test. For a few relatively simple items, the two-parameter gamma…

Tatsuoka, Kikumi K.; Tatsuoka, Maurice M.

297

Single Cell Analysis of Drug Distribution by Intravital Imaging

Recent advances in the field of intravital imaging have for the first time allowed us to conduct pharmacokinetic and pharmacodynamic studies at the single cell level in live animal models. Due to these advances, there is now a critical need for automated analysis of pharmacokinetic data. To address this, we began by surveying common thresholding methods to determine which would be most appropriate for identifying fluorescently labeled drugs in intravital imaging. We then developed a segmentation algorithm that allows semi-automated analysis of pharmacokinetic data at the single cell level. Ultimately, we were able to show that drug concentrations can indeed be extracted from serial intravital imaging in an automated fashion. We believe that the application of this algorithm will be of value to the analysis of intravital microscopy imaging particularly when imaging drug action at the single cell level. PMID:23593370

Giedt, Randy J.; Koch, Peter D.; Weissleder, Ralph

2013-01-01

298

Complexity analysis of pipeline mapping problems in distributed heterogeneous networks

Large-scale scientific applications require using various system resources to execute complex computing pipelines in distributed networks to support collaborative research. System resources are typically shared in the Internet or over dedicated connections based on their location, availability, capability, and capacity. Optimizing the network performance of computing pipelines in such distributed environments is critical to the success of these applications. We consider two types of large-scale distributed applications: (1) interactive applications where a single dataset is sequentially processed along a pipeline; and (2) streaming applications where a series of datasets continuously flow through a pipeline. The computing pipelines of these applications consist of a number of modules executed in a linear order in network environments with heterogeneous resources under different constraints. Our goal is to find an efficient mapping scheme that allocates the modules of a pipeline to network nodes for minimum end-to-end delay or maximum frame rate. We formulate the pipeline mappings in distributed environments as optimization problems and categorize them into six classes with different optimization goals and mapping constraints: (1) Minimum End-to-end Delay with No Node Reuse (MEDNNR), (2) Minimum End-to-end Delay with Contiguous Node Reuse (MEDCNR), (3) Minimum End-to-end Delay with Arbitrary Node Reuse (MEDANR), (4) Maximum Frame Rate with No Node Reuse or Share (MFRNNRS), (5) Maximum Frame Rate with Contiguous Node Reuse and Share (MFRCNRS), and (6) Maximum Frame Rate with Arbitrary Node Reuse and Share (MFRANRS). Here, 'contiguous node reuse' means that multiple contiguous modules along the pipeline may run on the same node and 'arbitrary node reuse' imposes no restriction on node reuse. Note that in interactive applications, a node can be reused but its resource is not shared. We prove that MEDANR is polynomially solvable and the rest are NP-complete.
MEDANR, where either contiguous or noncontiguous modules in the pipeline can be mapped onto the same node, is essentially the Maximum n-hop Shortest Path problem, and can be solved using a dynamic programming method. In MEDNNR and MFRNNRS, any network node can be used only once, which requires selecting the same number of nodes for one-to-one onto mapping. We show its NP-completeness by reducing from the Hamiltonian Path problem. Node reuse is allowed in MEDCNR, MFRCNRS and MFRANRS, which are similar to the Maximum n-hop Shortest Path problem that considers resource sharing. We prove their NP-completeness by reducing from the Disjoint-Connecting-Path problem and the Widest Path with Linear Capacity Constraints problem, respectively.

Lin, Ying [University of Tennessee, Knoxville (UTK); Wu, Qishi [ORNL; Zhu, Mengxia [ORNL; Rao, Nageswara S [ORNL

2009-04-01
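The polynomially solvable MEDANR case can be sketched as a layered-graph dynamic program over (module, node) states. The `proc` and `link` tables below are a hypothetical toy instance, not data from the paper:

```python
def min_end_to_end_delay(proc, link):
    """DP sketch of the MEDANR case: map an ordered list of modules onto
    nodes (any node freely reusable) minimising total processing plus
    transfer delay.
    proc[j][v] = delay of module j on node v
    link[u][v] = transfer delay from node u to node v (0 on the diagonal)
    """
    n_mod, n_node = len(proc), len(proc[0])
    # best[v] = min delay to finish modules 0..j with module j on node v
    best = list(proc[0])
    for j in range(1, n_mod):
        best = [min(best[u] + link[u][v] for u in range(n_node)) + proc[j][v]
                for v in range(n_node)]
    return min(best)

# Tiny illustrative instance: 3 modules, 2 nodes. The optimum alternates
# nodes (cost 1 per module) and pays two unit transfers: total 5.
proc = [[1, 4], [4, 1], [1, 4]]  # proc[module][node]
link = [[0, 1], [1, 0]]          # unit delay between distinct nodes
print(min_end_to_end_delay(proc, link))  # -> 5
```

Each DP layer costs O(n_node^2), so the whole mapping runs in O(n_mod * n_node^2) time, consistent with the claimed polynomial solvability of MEDANR.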

299

NASA Technical Reports Server (NTRS)

The strength distribution of fibers within a two-dimensional laminate ceramic/ceramic composite consisting of an eight harness satin weave of Nicalon continuous fiber within a chemically vapor infiltrated SiC matrix was determined from analysis of the fracture mirrors of the fibers. Comparison of the fiber strengths and the Weibull moduli with those for Nicalon fibers prior to incorporation into composites suggests that fiber damage may occur either during the weaving or during another stage of the composite manufacture. Observations also indicate that it is the higher-strength fibers that experience the greatest extent of fiber pullout and thus make a larger contribution to the overall composite toughness than do the weaker fibers.

Eckel, Andrew J.; Bradt, Richard C.

1989-01-01

300

A data analysis expert system for large established distributed databases

NASA Technical Reports Server (NTRS)

A design for a natural language database interface system, called the Deductively Augmented NASA Management Decision support System (DANMDS), is presented. The DANMDS system components have been chosen on the basis of the following considerations: maximal employment of the existing NASA IBM-PC computers and supporting software; local structuring and storing of external data via the entity-relationship model; a natural, easy-to-use, error-free database query language; user ability to alter the query language vocabulary and data analysis heuristics; and significant artificial intelligence data analysis heuristic techniques that allow the system to become progressively and automatically more useful.

Gnacek, Anne-Marie; An, Y. Kim; Ryan, J. Patrick

1987-01-01

301

Cluster analysis of roadside ultrafine particle size distributions

NASA Astrophysics Data System (ADS)

This study reports the diurnal, seasonal, and annual variation of ultrafine particle size distributions in downtown Toronto. The k-means clustering algorithm was applied to five years of size-resolved data for particles with diameters less than 100 nm. Continuous particle number concentrations were measured 16 m from a major arterial roadway between March 2006 and May 2011 using a Fast Mobility Particle Sizer. Eight particle size distribution (PSD) types were identified. The PSD types exhibited distinct weekday-weekend and diurnal patterns. The relative frequency with which each PSD type occurred varied with season and wind direction and was correlated with other pollutants. These temporal patterns and correlations helped elucidate the sources and processes that each of the eight PSD types represents. Finally, similar PSD types were observed in residential areas located 6 and 15 km away from the central monitoring site, suggesting that these PSD types may be generalizable to other sites. Identification of PSD types was found to be a valuable tool to support the interpretation of PSD data so as to elucidate the sources and processes contributing to ultrafine particle concentrations.

Sabaliauskas, Kelly; Jeong, Cheol-Heon; Yao, Xiaohong; Jun, Yun-Seok; Evans, Greg

2013-05-01
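The clustering step in the abstract above can be sketched as follows. This is a minimal illustration, not the authors' pipeline: the 16-bin, 300-sample synthetic data and k = 8 are assumed values, and a plain k-means is implemented directly so the example stays self-contained:

```python
import random

def kmeans(points, k, iters=50, seed=0):
    """Minimal k-means for lists of equal-length feature vectors."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)
    labels = [0] * len(points)
    for _ in range(iters):
        # Assignment step: nearest centroid by squared Euclidean distance.
        labels = [min(range(k),
                      key=lambda j: sum((p - c) ** 2
                                        for p, c in zip(pt, centroids[j])))
                  for pt in points]
        # Update step: mean of each cluster (keep old centroid if cluster empty).
        new = []
        for j in range(k):
            members = [pt for pt, lab in zip(points, labels) if lab == j]
            if members:
                new.append([sum(col) / len(members) for col in zip(*members)])
            else:
                new.append(centroids[j])
        if new == centroids:
            break
        centroids = new
    return centroids, labels

# Hypothetical hourly size distributions: 300 samples x 16 size bins (<100 nm),
# each row normalized to sum to 1 so cluster shape matters more than magnitude.
rng = random.Random(0)
data = []
for _ in range(300):
    row = [rng.random() for _ in range(16)]
    total = sum(row)
    data.append([v / total for v in row])

centroids, labels = kmeans(data, k=8)
print(len(centroids), len(centroids[0]))  # 8 clusters, each a 16-bin mean PSD shape
```

In practice each centroid would be inspected as a candidate PSD type and cross-tabulated against hour of day, season, and wind direction, as the study describes.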

302

Studying bubble-particle interactions by zeta potential distribution analysis.

Over a decade ago, Xu and Masliyah pioneered an approach to characterize the interactions between particles in dynamic environments of multicomponent systems by measuring zeta potential distributions of individual components and their mixtures. Using a Zetaphoremeter, the measured zeta potential distributions of individual components and their mixtures were used to determine the conditions of preferential attachment in multicomponent particle suspensions. The technique has been applied to study the attachment of nano-sized silica and alumina particles to sub-micron size bubbles in solutions with and without the addition of surface active agents (SDS, DAH and DF250). The degree of attachment between gas bubbles and particles is shown to be a function of the interaction energy governed by the dispersion, electrostatic double layer and hydrophobic forces. Under certain chemical conditions, the attachment of nano-particles to sub-micron size bubbles is shown to be enhanced by in-situ gas nucleation induced by hydrodynamic cavitation for the weakly interacting systems, where mixing of the two individual components results in negligible attachment. Preferential interaction in complex tertiary particle systems demonstrated strong attachment between micron-sized alumina and gas bubbles, with little attachment between micron-sized alumina and silica, possibly due to instability of the aggregates in the shear flow environment. PMID:25731913

Wu, Chendi; Wang, Louxiang; Harbottle, David; Masliyah, Jacob; Xu, Zhenghe

2015-07-01

303

Measuring the atmospheric organic aerosol volatility distribution: a theoretical analysis

NASA Astrophysics Data System (ADS)

Organic compounds represent a significant fraction of submicrometer atmospheric aerosol mass. Even if most of these compounds are semi-volatile in atmospheric concentrations, the ambient organic aerosol volatility is quite uncertain. The most common volatility measurement method relies on the use of a thermodenuder (TD). The aerosol passes through a heated tube where its more volatile components evaporate, leaving the less volatile components behind in the particulate phase. The typical result of a thermodenuder measurement is the mass fraction remaining (MFR), which depends, among other factors, on the organic aerosol (OA) vaporization enthalpy and the accommodation coefficient. We use a new method combining forward modeling, introduction of "experimental" error, and inverse modeling with error minimization for the interpretation of TD measurements. The OA volatility distribution, its effective vaporization enthalpy, the mass accommodation coefficient and the corresponding uncertainty ranges are calculated. Our results indicate that existing TD-based approaches quite often cannot estimate reliably the OA volatility distribution, leading to large uncertainties, since there are many different combinations of the three properties that can lead to similar thermograms. We propose an improved experimental approach combining TD and isothermal dilution measurements. We evaluate this experimental approach using the same model, and show that it is suitable for studies of OA volatility in the lab and the field.

Karnezi, E.; Riipinen, I.; Pandis, S. N.

2014-09-01

304

Measuring the atmospheric organic aerosol volatility distribution: a theoretical analysis

NASA Astrophysics Data System (ADS)

Organic compounds represent a significant fraction of submicrometer atmospheric aerosol mass. Even if most of these compounds are semi-volatile in atmospheric concentrations, the ambient organic aerosol volatility is quite uncertain. The most common volatility measurement method relies on the use of a thermodenuder (TD). The aerosol passes through a heated tube where its more volatile components evaporate, leaving the less volatile components behind in the particulate phase. The typical result of a thermodenuder measurement is the mass fraction remaining (MFR), which depends, among other factors, on the organic aerosol (OA) vaporization enthalpy and the accommodation coefficient. We use a new method combining forward modeling, introduction of "experimental" error, and inverse modeling with error minimization for the interpretation of TD measurements. The OA volatility distribution, its effective vaporization enthalpy, the mass accommodation coefficient and the corresponding uncertainty ranges are calculated. Our results indicate that existing TD-based approaches quite often cannot estimate reliably the OA volatility distribution, leading to large uncertainties, since there are many different combinations of the three properties that can lead to similar thermograms. We propose an improved experimental approach combining TD and isothermal dilution measurements. We evaluate this experimental approach using the same model and show that it is suitable for studies of OA volatility in the lab and the field.

Karnezi, E.; Riipinen, I.; Pandis, S. N.

2014-01-01

305

Analysis of an algorithm for distributed recognition and accountability

Computer and network systems are vulnerable to attacks. Abandoning the existing huge infrastructure of possibly insecure computer and network systems is impossible, and replacing them with totally secure systems may not be feasible or cost effective. A common element in many attacks is that a single user will often attempt to intrude upon multiple resources throughout a network. Detecting the attack becomes significantly easier when evidence of such intrusion attempts is compiled and integrated across the network, rather than assessed from the vantage point of only a single host. To solve this problem, we suggest an approach for distributed recognition and accountability (DRA), which consists of algorithms that "process," at a central location, distributed and asynchronous "reports" generated by computers (or a subset thereof) throughout the network. Our highest-priority objectives are to observe the ways an individual moves around in a network of computers, including changing user names to possibly hide his or her true identity, and to associate all activities of multiple instances of the same individual with the same network-wide user. We present the DRA algorithm and a sketch of its proof under an initial set of simplifying albeit realistic assumptions. We then relax these assumptions to accommodate pragmatic aspects such as missing or delayed "reports," clock skew, tampered "reports," etc. We believe that such algorithms will have widespread applications in the future, particularly in intrusion-detection systems.

Ko, C.; Frincke, D.A.; Goan, T. Jr.; Heberlein, L.T.; Levitt, K.; Mukherjee, B.; Wee, C. [California Univ., Davis, CA (United States). Dept. of Computer Science

1993-08-01

306

Visualization and analysis of lipopolysaccharide distribution in binary phospholipid bilayers

Lipopolysaccharide (LPS) is an endotoxin released from the outer membrane of Gram-negative bacteria during infections. It has been reported that LPS may play a role in the outer membrane of bacteria similar to that of cholesterol in eukaryotic plasma membranes. In this article we compare the effect of introducing LPS or cholesterol into liposomes made of dipalmitoylphosphatidylcholine/dioleoylphosphatidylcholine on the solubilization process by Triton X-100. The results show that liposomes containing LPS or cholesterol are more resistant to solubilization by Triton X-100 than the binary phospholipid mixtures at 4 °C. The LPS distribution was analyzed on GUVs of DPPC:DOPC using FITC-LPS. Solid and liquid-crystalline domains were visualized by labeling the GUVs with LAURDAN, and GP images were acquired using a two-photon microscope. The images show a selective distribution of LPS in gel domains. Our results support the hypothesis that LPS could aggregate and concentrate selectively in biological membranes, providing a mechanism to bring together several components of the LPS-sensing machinery.

Henning, Maria Florencia [Instituto de Investigaciones Bioquimicas La Plata (INIBIOLP), CCT-La Plata, CONICET, Facultad de Ciencias Medicas, UNLP, Calles 60 y 120, 1900 La Plata (Argentina)]; Sanchez, Susana [Laboratory for Fluorescence Dynamics, University of California-Irvine, Irvine, CA (United States)]; Bakas, Laura, E-mail: lbakas@biol.unlp.edu.ar [Instituto de Investigaciones Bioquimicas La Plata (INIBIOLP), CCT-La Plata, CONICET, Facultad de Ciencias Medicas, UNLP, Calles 60 y 120, 1900 La Plata (Argentina); Departamento de Ciencias Biologicas, Facultad de Ciencias Exactas, UNLP, Calles 47 y 115, 1900 La Plata (Argentina)]

2009-05-22

307

Husimi distribution and phase space analysis of Dicke model quantum phase transition

The Husimi distribution is proposed for a phase space analysis of quantum phase transitions in the Dicke model of spin-boson interactions. We show that the inverse participation ratio and Wehrl entropy of the Husimi distribution give sharp signatures of the quantum phase transition. The analysis has been done using two frameworks: a numerical treatment and an analytical variational approximation. Additionally we have proposed a new characterization of the Dicke model quantum phase transition by means of the zeros of the Husimi distribution in the variational approach.

E. Romera; R. del Real; M. Calixto

2014-09-19

308

The sensitivity of fracture location distribution in brittle materials

NASA Technical Reports Server (NTRS)

The Weibull weak-link theory allows for the computation of distribution functions for both the fracture location and the applied far-field stress. Several authors have suggested using the fracture location information from tests to infer Weibull parameters, and others have used the predictive capabilities of the theory to calculate average fracture locations for brittle bodies. By a simple set of example calculations, it is shown that the fracture location distribution function is distinctly more sensitive to perturbations in the stress state than the fracture stress distribution function is. In general, the average fracture location is more subject to stress perturbations than the average fracture stress. The results indicate that care must be exercised in applying fracture location theory.

Wetherhold, Robert C.

1991-01-01

309

Statistical analysis of slow crack growth experiments

Common approaches for determining slow crack growth (SCG) parameters are the static and dynamic loading methods. Since materials with a small Weibull modulus show a large variability in strength, a correct statistical analysis of the data is indispensable. In this work we propose the use of the Maximum Likelihood Method and a Bayesian analysis, which, in contrast to

Tobias Pfingsten; Karsten Glien

2006-01-01
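As a sketch of the maximum likelihood step mentioned above (not the authors' implementation), the Weibull modulus m satisfies a one-dimensional likelihood equation that can be solved by bisection, with the characteristic strength following in closed form. The synthetic strength sample, modulus 8, and 400 MPa scale below are assumptions for illustration:

```python
import math
import random

def weibull_mle(strengths, lo=0.1, hi=50.0, tol=1e-9):
    """Maximum-likelihood estimates of the Weibull modulus m and the
    characteristic strength sigma0 for a sample of fracture strengths."""
    logs = [math.log(x) for x in strengths]
    mean_log = sum(logs) / len(logs)

    def g(m):
        # Root function from d(log-likelihood)/dm = 0; monotone increasing in m:
        # sum(x^m ln x)/sum(x^m) - 1/m - mean(ln x) = 0
        s_xm = sum(x ** m for x in strengths)
        s_xml = sum((x ** m) * lx for x, lx in zip(strengths, logs))
        return s_xml / s_xm - 1.0 / m - mean_log

    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if g(mid) < 0.0:
            lo = mid
        else:
            hi = mid
    m = 0.5 * (lo + hi)
    sigma0 = (sum(x ** m for x in strengths) / len(strengths)) ** (1.0 / m)
    return m, sigma0

# Synthetic check against a known Weibull (modulus 8, characteristic strength 400 MPa).
random.seed(1)
sample = [random.weibullvariate(400.0, 8.0) for _ in range(2000)]
m_hat, s_hat = weibull_mle(sample)
print(round(m_hat, 2), round(s_hat, 1))  # estimates should land near 8 and 400
```

A Bayesian treatment, as the abstract proposes, would replace the point estimate with a posterior over (m, sigma0), which matters most for the small samples typical of strength testing.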

310

Parallelization of Finite Element Analysis Codes Using Heterogeneous Distributed Computing

NASA Technical Reports Server (NTRS)

Performance gains in computer design are quickly consumed as users seek to analyze larger problems to a higher degree of accuracy. Innovative computational methods, such as parallel and distributed computing, seek to multiply the power of existing hardware technology to satisfy the computational demands of large applications. In the early stages of this project, experiments were performed using two large, coarse-grained applications, CSTEM and METCAN. These applications were parallelized on an Intel iPSC/860 hypercube. It was found that the overall speedup was very low, due to large, inherently sequential code segments present in the applications. The overall execution time T(sub par), of the application is dependent on these sequential segments. If these segments make up a significant fraction of the overall code, the application will have a poor speedup measure.

Ozguner, Fusun

1996-01-01

311

Analysis of phase distribution phenomena in microgravity environments

NASA Technical Reports Server (NTRS)

In the past, one of NASA's primary emphases has been on identifying single and multiphase flow experiments which can produce new discoveries that are not possible except in a microgravity environment. While such experiments are obviously of great scientific interest, they do not necessarily provide NASA with the ability to use multiphase processes for power production and/or utilization in space. The purpose of the research presented in this paper is to demonstrate the ability of multidimensional two-fluid models for bubbly two-phase flow to accurately predict lateral phase distribution phenomena in microgravity environments. If successful, this research should provide NASA with mechanistically based analytical methods which can be used for multiphase space design and evaluation, and should be the basis for future shuttle experiments for model verification.

Lahey, Richard, Jr.; Bonetto, Fabian

1994-01-01

312

Quantitative analysis of inclusion distributions in hot pressed silicon carbide

Depth of penetration measurements in hot pressed SiC have exhibited significant variability that may be influenced by microstructural defects. To obtain a better understanding of the role of microstructural defects under highly dynamic conditions, fragments of hot pressed SiC plates subjected to impact tests were examined. Two types of inclusion defects were identified: carbonaceous and an aluminum-iron-oxide phase. A disproportionate number of large inclusions were found in the rubble, indicating that the inclusion defects took part in the fragmentation process. Distribution functions were plotted to compare the inclusion populations. Fragments from the superior performing sample had an inclusion population consisting of more numerous but smaller inclusions. One possible explanation for this result is that the superior sample withstood a greater stress before failure, causing a greater number of smaller inclusions to participate in fragmentation than in the weaker sample.

Michael Paul Bakas

2012-12-01

313

Advanced analysis of metal distributions in human hair

A variety of techniques (secondary electron microscopy with energy dispersive X-ray analysis, time-of-flight-secondary ion mass spectrometry, and synchrotron X-ray fluorescence) were utilized to distinguish metal contamination occurring in hair arising from endogenous uptake from an individual exposed to a polluted environment, in this case a lead smelter. Evidence was sought for elements less affected by contamination and potentially indicative of biogenic activity. The unique combination of surface sensitivity, spatial resolution, and detection limits used here has provided new insight regarding hair analysis. Metals such as Ca, Fe, and Pb appeared to have little representative value of endogenous uptake and were mainly due to contamination. Cu and Zn, however, demonstrate behaviors worthy of further investigation into relating hair concentrations to endogenous function.

Kempson, Ivan M.; Skinner, William M. (U. South Australia)

2008-06-09

314

Power systems analysis for direct current (dc) distribution systems

Many standards, guidelines, and similar documents currently exist which provide guidance for dc power systems analysis. These documents are scattered throughout the industry (i.e., IEEE, UL, NEMA, GE, etc.) and primarily treat the subject as though hand calculations are being performed. The intent of this paper is to provide guidance for performing computer-aided dc power systems analyses. The paper covers load flow/voltage drop and short circuit calculations.

Fleischer, K. [Public Service Electric and Gas, Hancock's Bridge, NJ (United States). Salem Nuclear Generating Station]; Munnings, R.S. [VECTRA Technologies, Inc., Lincolnshire, IL (United States)]

1996-09-01
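The load flow/voltage drop and short circuit calculations referred to above reduce, in the simplest radial case, to the Ohm's-law arithmetic that such tools automate. A minimal sketch with assumed example values (125 V battery bus, 40 A load, 50 milliohm loop resistance, 10 milliohm source resistance):

```python
def voltage_drop(v_source, i_load, r_loop):
    """Two-wire dc feeder: V_load = V_source - I * R_loop."""
    return v_source - i_load * r_loop

def bolted_fault_current(v_source, r_source, r_loop):
    """Bolted short circuit at the load end: I_sc = V / (R_source + R_loop)."""
    return v_source / (r_source + r_loop)

v_load = voltage_drop(125.0, 40.0, 0.05)        # 125 V bus, 40 A load, 50 mOhm loop
i_sc = bolted_fault_current(125.0, 0.01, 0.05)  # 10 mOhm source resistance
print(v_load, round(i_sc, 1))  # 123.0 2083.3
```

A full analysis package iterates this over every branch of the dc network and adds battery discharge curves, cable temperature corrections, and protective device coordination, which is where computer-aided methods earn their keep.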

315

Distributed finite element analysis using a transputer network

NASA Technical Reports Server (NTRS)

The principal objective of this research effort was to demonstrate the extraordinarily cost effective acceleration of finite element structural analysis problems using a transputer-based parallel processing network. This objective was accomplished in the form of a commercially viable parallel processing workstation. The workstation is a desktop size, low-maintenance computing unit capable of supercomputer performance yet costs two orders of magnitude less. To achieve the principal research objective, a transputer based structural analysis workstation termed XPFEM was implemented with linear static structural analysis capabilities resembling commercially available NASTRAN. Finite element model files, generated using the on-line preprocessing module or external preprocessing packages, are downloaded to a network of 32 transputers for accelerated solution. The system currently executes at about one third Cray X-MP24 speed but additional acceleration appears likely. For the NASA selected demonstration problem of a Space Shuttle main engine turbine blade model with about 1500 nodes and 4500 independent degrees of freedom, the Cray X-MP24 required 23.9 seconds to obtain a solution while the transputer network, operated from an IBM PC-AT compatible host computer, required 71.7 seconds. Consequently, the $80,000 transputer network demonstrated a cost-performance ratio about 60 times better than the $15,000,000 Cray X-MP24 system.

Watson, James; Favenesi, James; Danial, Albert; Tombrello, Joseph; Yang, Dabby; Reynolds, Brian; Turrentine, Ronald; Shephard, Mark; Baehmann, Peggy

1989-01-01

316

…of the maturation process, a phenomenon referred to as T2 time lag compared to T1. The proposed method has poten… Keywords: MRI, longitudinal analysis, distribution statistics, MR contrast analysis. 1. INTRODUCTION The brain undergoes far more significant changes with respect to shape, structure, size, and chemical composition

Prastawa, Marcel

317

Correspondence analysis is a multivariate technique used to visualize categorical data, usually data in a two-way contingency table. Some extensions of correspondence analysis to a continuous bivariate distribution are presented, firstly from a canonical correlation analysis perspective and then from a continuous scaling perspective. These extensions are applied to the Farlie-Gumbel-Morgenstern (FGM) family of bivariate distributions with given marginals, and also to a generalization of this...

C. m. Cuadras; J. Fortiana

1998-01-01

318

SIR distribution analysis in cellular networks considering the joint impact of path-loss and shadowing

In this paper, we propose an analysis of the joint impact of path-loss and shadowing on the SIR distribution in cellular networks … (or broadcast channels) and thus to attain a certain SIR threshold on these channels with high probability.

Coupechoux, Marceau

319

Composition and On Demand Deployment of Distributed Brain Activity Analysis Application on Global … are brain science and high-energy physics. The analysis of brain activity data gathered from the MEG … analyze brain functions and requires access to large-scale computational resources. The potential platform

Abramson, David

320

This paper reports software developed for power systems analysis using several techniques combining the object-oriented programming paradigm, computer graphics techniques, distributed databases, and software engineering. Three software packages have been implemented: CAPS, SIGVNS, and TREINOM. The CAPS package is oriented toward interactive analysis of power systems and electric markets. SIGVNS is intended for calculation of the

V. L. Paucar; I. O. Almeida; M. J. Rider; M. F. Bedrinana; J. H. Santos

2004-01-01

321

Analysis and visualization of the geographical distribution of atlantic forest bromeliads species

This work presents a spatial distribution analysis of the Brazilian Atlantic Forest Bromeliad species catalogued by the Rio de Janeiro Botanical Gardens Research Institute. Our analysis aims at identifying probable endangered species and conservation areas (environmental reservations) for specific Bromeliad species. For that, we propose a solution using data mining techniques and data visualization through the TerraView tool. Thus, it

Stainam N. Brandao; Wagner N. Silva; Luis A. E. Silva; Vladimir Fagundes; Carlos Eduardo R. De Mello; Geraldo Zimbrão; Jano Moreira De Souza

2009-01-01

322

Fluorescence fluctuation methods such as fluorescence correlation spectroscopy and fluorescence intensity distribution analysis (FIDA) have proven to be versatile tools for studying molecular interactions with single molecule sensitivity. Another well-known fluorescence technique is the measurement of the fluorescence lifetime. Here, we introduce a method that combines the benefits of both FIDA and fluorescence lifetime analysis. It is based on fitting

Kaupo Palo; Leif Brand; Christian Eggeling; Stefan Jäger; Peet Kask; Karsten Gall

2002-01-01

323

Consideration of tip speed limitations in preliminary analysis of minimum COE wind turbines

NASA Astrophysics Data System (ADS)

A relation between Cost Of Energy, COE, maximum allowed tip speed, and rated wind speed is obtained for wind turbines with a given goal rated power. The wind regime is characterised by the corresponding parameters of the probability density function of wind speed. The non-dimensional characteristics of the rotor (number of blades and the blade radial distributions of local solidity, twist angle, and airfoil type) play the role of parameters in the mentioned relation. The COE is estimated using a cost model commonly used by designers. This cost model requires basic design data such as the rotor radius and the ratio between the hub height and the rotor radius. Certain design options, DO, related to the technology of the power plant, tower, and blades are also required as inputs. The function obtained for the COE can be explored to find those values of rotor radius that give rise to minimum cost of energy for a given wind regime as the tip speed limitation changes. The analysis reveals that iso-COE lines evolve parallel to iso-radius lines for large values of limit tip speed, but that this is not the case for small values of the tip speed limits. It is concluded that, as the tip speed limit decreases, the optimum decision for keeping minimum COE values can be: (a) reducing the rotor radius for sites with a high Weibull scale parameter, or (b) increasing the rotor radius for sites with a low Weibull scale parameter.

Cuerva-Tejero, A.; Yeow, T. S.; Lopez-Garcia, O.; Gallego-Castillo, C.

2014-12-01

324

Sensitivity Analysis of Distributed Soil Moisture Profiles by Active Distributed Temperature Sensing

NASA Astrophysics Data System (ADS)

Monitoring and measuring the fluctuations of soil moisture at large scales in the field remains a challenge. Although sensors based on measurement of dielectric properties, such as Time Domain Reflectometers (TDR) and capacity-based probes, can guarantee reasonable responses, they always operate over limited spatial ranges. On the other hand, optical fibers attached to a Distributed Temperature Sensing (DTS) system allow for high-precision soil temperature measurements over distances of kilometers. A recently developed technique called Active DTS (ADTS), consisting of a heat pulse of a certain duration and power along the metal sheath covering the optical fiber buried in the soil, has proven a promising alternative to spatially-limited probes. Two approaches have been investigated to infer distributed soil moisture profiles in the region surrounding the fiber optic cable by analyzing the temperature variations during the heating and the cooling phases. One directly relates the change of temperature to the (independently measured) soil moisture to develop a specific calibration curve for the soil used; the other requires inferring the thermal properties and then obtaining the soil moisture by inversion of known relationships. To test and compare the two approaches over a broad range of saturation conditions, a large lysimeter has been homogeneously filled with loamy soil and 52 meters of fiber optic cable have been buried in the shallower 0.8 meters in a double-coil rigid structure of 15 loops, along with a series of capacity-based sensors (calibrated for the soil used) to provide independent soil moisture measurements at the same depths as the optical fiber. Thermocouples have also been wrapped around the fiber to investigate the effects of the insulating cover surrounding the cable, and placed in between each layer in order to monitor heat diffusion at several centimeters. A high performance DTS has been used to measure the temperature along the fiber optic cable.
Several soil moisture profiles have been generated in the lysimeter either varying the water table height or by wetting the soil from the top. The sensitivity of the ADTS method for heat pulses of different duration and power and ranges of spatial and temporal resolution are presented.

Ciocca, F.; Van De Giesen, N.; Assouline, S.; Huwald, H.; Lunati, I.

2012-12-01

325

Evolution History of Asteroid Itokawa Based on Block Distribution Analysis

NASA Astrophysics Data System (ADS)

This work investigates trends in the global and regional distribution of blocks on asteroid 25143 Itokawa in order to better understand the history of this asteroid. Itokawa is a near-Earth object, and the first asteroid that was targeted for a sample return mission. Trends in the block population provide new insights regarding Itokawa's current appearance following the disruption of a possible parent body, and how its surface might have changed since then. Here blocks are defined as rocks or features with distinctive positive relief that are larger than a few meters in size. The size and distribution of blocks are measured by mapping the outlines of the blocks using the Small Body Mapping Tool (SBMT) created by the Johns Hopkins University Applied Physics Laboratory [1]. The SBMT allows the user to overlay correctly geo-located Hayabusa images [2] onto the Itokawa shape model. This study provides additional inferences on the original disruption and subsequent re-accretion of Itokawa's "head" and "body" from block analyses. A new approach is taken by analyzing the population of blocks with respect to latitude for both Itokawa's current state and a hypothetical elliptical body. Itokawa currently rotates approximately about its maximum moment of inertia, which is expected from conservation of momentum and minimum energy arguments. After the possible disruption of the parent body of Itokawa, the "body" of Itokawa would have tended to a similar rotation. The shape of this body is made by removing the head of Itokawa and applying a hemispherical cap. Using the method of [3], inertial properties of this object are calculated. With the assumption that this object had settled to its stable rotational axis, it is found that the pole axis could have been tilted about 13° away from the current axis in the direction opposite the head, equivalent to a 33 meter change in the center of mass.
The results of this study provide means to test the hypothesis of whether or not Itokawa is a contact binary. References: [1] E. G. Kahn, et al. A tool for the visualization of small body data. In LPSC XLII, 2011. [2] A. Fujiwara, et al. The rubble-pile asteroid Itokawa as observed by Hayabusa. Science, 312(5778):1330-1334, June 2006. [3] A. F. Cheng, et al. Small-scale topography of 433 Eros from laser altimetry and imaging. Icarus, 155(1):51-74, 2002

Mazrouei, Sara; Daly, Michael; Barnouin, Olivier; Ernst, Carolyn

2013-04-01

326

Numerical analysis of atomic density distribution in arc driven negative ion sources

The purpose of this study is to calculate the atomic (H⁰) density distribution in the JAEA 10 ampere negative ion source. A collisional radiative model is developed for the calculation of the H⁰ density distribution. The non-equilibrium feature of the electron energy distribution function (EEDF), which mainly determines the H⁰ production rate, is included by substituting the EEDF calculated from a 3D electron transport analysis. In this paper, the H⁰ production rate, the ionization rate, and the density distribution in the source chamber are calculated. In the region where high energy electrons exist, the H⁰ production and the ionization are enhanced. The calculated H⁰ density distribution without the effect of H⁰ transport is relatively small in the upper region. In the next step, this effect should be taken into account to obtain a more realistic H⁰ distribution.

Yamamoto, T., E-mail: t.yamamoto@ppl.appi.keio.ac.jp; Shibata, T.; Hatayama, A. [Graduate School of Science and Technology, Keio University, 3-14-1 Hiyoshi, Yokohama 223-8522 (Japan)]; Kashiwagi, M.; Hanada, M. [Japan Atomic Energy Agency (JAEA), 801-1 Mukouyama, Naka 311-0193 (Japan)]; Sawada, K. [Faculty of Engineering, Shinshu University, 4-17-1 Wakasato, Nagano 380-8553 (Japan)]

2014-02-15

327

Numerical analysis of atomic density distribution in arc driven negative ion sources.

The purpose of this study is to calculate atomic (H(0)) density distribution in JAEA 10 ampere negative ion source. A collisional radiative model is developed for the calculation of the H(0) density distribution. The non-equilibrium feature of the electron energy distribution function (EEDF), which mainly determines the H(0) production rate, is included by substituting the EEDF calculated from 3D electron transport analysis. In this paper, the H(0) production rate, the ionization rate, and the density distribution in the source chamber are calculated. In the region where high energy electrons exist, the H(0) production and the ionization are enhanced. The calculated H(0) density distribution without the effect of the H(0) transport is relatively small in the upper region. In the next step, the effect should be taken into account to obtain more realistic H(0) distribution. PMID:24593565

Yamamoto, T; Shibata, T; Kashiwagi, M; Hatayama, A; Sawada, K; Hanada, M

2014-02-01

328

Probability Distribution Function of the Upper Equatorial Pacific Current Speeds

NASA Astrophysics Data System (ADS)

The probability distribution function (PDF) of the upper (0-50 m) tropical Pacific current speeds (w), constructed from hourly ADCP data (1990-2007) at six stations for the TOGA-TAO project, satisfies the two-parameter Weibull distribution reasonably well, with different characteristics between El Nino and La Nina events: in the western Pacific, the PDF of w has a larger peakedness during La Nina events than during El Nino events, and vice versa in the eastern Pacific. However, the PDF of w for the lower layer (100-200 m) does not fit the Weibull distribution as well as that of the upper layer. This is due to the different stochastic differential equations governing the upper and lower layers in the tropical Pacific. For the upper layer, the stochastic differential equations, established on the basis of Ekman dynamics, have an analytical solution, i.e., the Rayleigh distribution (the simplest form of the Weibull distribution), for constant eddy viscosity K. Knowledge of the PDF of w during El Nino and La Nina events will improve ensemble horizontal flux calculations, which contribute to climate studies.
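As a rough illustration of the kind of fit described above, a two-parameter Weibull can be fitted to speed data with SciPy. The data and seed below are invented for the sketch; a Rayleigh sample (i.e. Weibull with shape k = 2) stands in for the upper-layer speeds:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Synthetic "current speed" sample; the Rayleigh distribution is the
# special case of the Weibull with shape parameter k = 2.
speeds = rng.rayleigh(scale=0.3, size=5000)

# Fit a two-parameter Weibull (location fixed at 0, as befits speeds).
k, loc, lam = stats.weibull_min.fit(speeds, floc=0)
print(f"shape k = {k:.2f}, scale lambda = {lam:.3f}")

# Goodness of fit via the Kolmogorov-Smirnov statistic.
ks = stats.kstest(speeds, stats.weibull_min(k, loc, lam).cdf)
print(f"KS statistic = {ks.statistic:.4f}")
```

For a Rayleigh sample with scale sigma, the fitted parameters should recover k near 2 and lambda near sigma*sqrt(2); a noticeably worse KS statistic for lower-layer data would mirror the weaker fit reported above.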

Chu, P. C.

2008-12-01

329

Microcanonical thermostatistics analysis has become an important tool to reveal essential aspects of phase transitions in complex systems. An efficient way to estimate the microcanonical inverse temperature $\\beta(E)$ and the microcanonical entropy $S(E)$ is achieved with the statistical temperature weighted histogram analysis method (ST-WHAM). The strength of this method lies in its flexibility, as it can be used to analyse data produced by algorithms with generalised sampling weights. However, for any sampling weight, ST-WHAM requires the calculation of derivatives of energy histograms $H(E)$, which leads to non-trivial and tedious binning tasks for models with continuous energy spectrum such as those for biomolecular and colloidal systems. Here, we discuss two alternative methods that avoid the need for such energy binning to obtain continuous estimates for $H(E)$ in order to evaluate $\\beta(E)$ by using ST-WHAM: (i) a series expansion to estimate probability densities from the empirical cumulative distrib...

Alves, Nelson A; Rizzi, Leandro G

2015-01-01

330

This short communication determines the strength of two glass polyalkenoate cements that differ from each other through the composition of their glass phase. Sample sets of n=5, 10, 20 and 30 were formulated and tested in biaxial flexure. The derived mean for each sample set was compared against the Weibull characteristic strength. The mean and corresponding characteristic strength show a maximum percentage difference of 10.1%, and the 95% confidence intervals calculated from the mean data encompass the corresponding characteristic strength down to a sample set of n=5. This suggests that, for brittle materials such as glass polyalkenoate cements, it is acceptable to test only five samples of each material in biaxial flexure and the resultant 95% confidence intervals will encompass the corresponding Weibull characteristic strength of the material. PMID:25553555
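The comparison described above can be sketched as follows. The strength values are synthetic and the parameter choices are invented; the quantities computed (Weibull characteristic strength, sample mean, 95% confidence interval of the mean) are the ones compared in the study:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
# Hypothetical biaxial flexure strengths (MPa) for n = 5 specimens,
# drawn from a Weibull with modulus m = 8 and scale 40 MPa.
strengths = rng.weibull(8.0, size=5) * 40.0

# Weibull fit with location fixed at 0: the characteristic strength
# sigma0 is the scale parameter, i.e. the stress at 63.2 % failure
# probability.
m, _, sigma0 = stats.weibull_min.fit(strengths, floc=0)

# Sample mean and its 95 % confidence interval from the t distribution.
mean = strengths.mean()
half = stats.t.ppf(0.975, df=len(strengths) - 1) * stats.sem(strengths)
ci = (mean - half, mean + half)
print(f"sigma0 = {sigma0:.1f} MPa, mean = {mean:.1f} MPa, CI = {ci}")
```

The paper's observation corresponds to the fitted sigma0 falling inside the interval `ci` even at n = 5.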

Mehrvar, Cina; Curran, Declan J; Alhalawani, Adel M F; Boyd, Daniel; Towler, Mark

2015-03-01

331

A new parallel-vector finite element analysis software on distributed-memory computers

A new parallel-vector finite element analysis software package MPFEA (Massively Parallel-vector Finite Element Analysis) is developed for large-scale structural analysis on massively parallel computers with distributed memory. MPFEA is designed for parallel generation and assembly of the global finite element stiffness matrices as well as parallel solution of the simultaneous linear equations, since these are often the major time-consuming parts of

J. Qin; D. T. Nguyen

1993-01-01

332

NASA Technical Reports Server (NTRS)

This paper presents a spherical harmonic analysis of the plasma velocity distribution function using high-angular, energy, and time resolution Cluster data obtained from the PEACE spectrometer instrument to demonstrate how this analysis models the particle distribution function and its moments and anisotropies. The results show that spherical harmonic analysis produced a robust physical representation model of the velocity distribution function, resolving the main features of the measured distributions. From the spherical harmonic analysis, a minimum set of nine spectral coefficients was obtained from which the moment (up to the heat flux), anisotropy, and asymmetry calculations of the velocity distribution function were obtained. The spherical harmonic method provides a potentially effective "compression" technique that can be easily carried out onboard a spacecraft to determine the moments and anisotropies of the particle velocity distribution function for any species. These calculations were implemented using three different approaches, namely, the standard traditional integration, the spherical harmonic (SPH) spectral coefficients integration, and the singular value decomposition (SVD) on the spherical harmonic methods. A comparison among the various methods shows that both SPH and SVD approaches provide remarkable agreement with the standard moment integration method.

Gurgiolo, Chris; Vinas, Adolfo F.

2009-01-01

333

Distribution of Modelling Spatial Processes Using Geostatistical Analysis

NASA Astrophysics Data System (ADS)

The Geostatistical Analyst uses sample points taken at different locations in a landscape and creates (interpolates) a continuous surface. The Geostatistical Analyst provides two groups of interpolation techniques: deterministic and geostatistical. All methods rely on the similarity of nearby sample points to create the surface. Deterministic techniques use mathematical functions for interpolation. Geostatistics relies on both statistical and mathematical methods, which can be used to create surfaces and assess the uncertainty of the predictions. The first step in geostatistical analysis is variography: computing and modelling a semivariogram. A semivariogram is one of the significant functions for indicating spatial correlation in observations measured at sample locations. It is commonly represented as a graph that shows the variance in measure with distance between all pairs of sampled locations. Such a graph is helpful for building a mathematical model that describes the variability of the measure with location. Modelling the relationship among sample locations to indicate the variability of the measure with distance of separation is called semivariogram modelling. It is applied in estimating the value of a measure at a new location. Our work presents the analysis of the data in the following steps: identification of data set periods, construction and modelling of the empirical semivariogram for a single location, and use of the kriging mapping function to model TEC maps at mid-latitudes during disturbed and quiet days. Based on the semivariogram, weights for the kriging interpolation are estimated. Additional observations do not, in general, provide relevant extra information to the interpolation, because the spatial correlation is well described by the semivariogram. Keywords: Semivariogram, Kriging, modelling, Geostatistics, TEC
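The variography step described above starts from the classical (Matheron) empirical semivariogram. A minimal sketch, with a toy field in place of TEC observations:

```python
import numpy as np

def empirical_semivariogram(coords, values, bin_edges):
    """Classical estimator: gamma(h) = (1 / 2N(h)) * sum over pairs
    separated by roughly h of (z_i - z_j)^2, binned by lag distance."""
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    sq = (values[:, None] - values[None, :]) ** 2
    iu = np.triu_indices(len(values), k=1)          # count each pair once
    dist, sqdiff = d[iu], sq[iu]
    gamma = np.empty(len(bin_edges) - 1)
    for k in range(len(bin_edges) - 1):
        mask = (dist >= bin_edges[k]) & (dist < bin_edges[k + 1])
        gamma[k] = 0.5 * sqdiff[mask].mean() if mask.any() else np.nan
    return gamma

# Toy spatial field: a smooth trend plus measurement noise at
# random sample locations in a 10 x 10 domain.
rng = np.random.default_rng(2)
coords = rng.uniform(0, 10, size=(200, 2))
values = np.sin(coords[:, 0]) + 0.1 * rng.standard_normal(200)
gamma = empirical_semivariogram(coords, values, np.linspace(0, 5, 11))
```

A model (spherical, exponential, Gaussian, etc.) fitted to `gamma` then supplies the kriging weights.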

Grynyshyna-Poliuga, Oksana; Stanislawska, Iwona; Swiatek, Anna

334

Reliability assessment in electrical power systems: the Weibull-Markov stochastic model

The field of power system reliability is dominated by the use of homogeneous Markov models. The negative exponential distributions in these models, however, are unrealistic in the case of repair or switching times. The use of homogeneous Markov models is often justified by arguing that the use of other models makes it impossible to perform analytical or nonsequential calculations. It

Jasper F. L. van Casteren; Math H. J. Bollen; Martin E. Schmieg

2000-01-01

335

Biomechanical analysis of force distribution in human finger extensor mechanisms.

The complexities of the function and structure of human fingers have long been recognised. The in vivo forces in the human finger tendon network during different activities are critical information for clinical diagnosis, surgical treatment, prosthetic finger design, and biomimetic hand development. In this study, we propose a novel method for in vivo force estimation for the finger tendon network by combining a three-dimensional motion analysis technique and a novel biomechanical tendon network model. The extensor mechanism of a human index finger is represented by an interconnected tendinous network moving around the phalanx's dorsum. A novel analytical approach based on the "Principle of Minimum Total Potential Energy" is used to calculate the forces and deformations throughout the tendon network of the extensor mechanism when subjected to an external load and with the finger posture defined by measurement data. The predicted deformations and forces in the tendon network are in broad agreement with the results obtained by previous experimental in vitro studies. The proposed methodology provides a promising tool for investigating the biomechanical function of complex interconnected tendon networks in vivo. PMID:25126576

Hu, Dan; Ren, Lei; Howard, David; Zong, Changfu

2014-01-01

336

Distributional benefit analysis of a national air quality rule.

Under Executive Order 12898, the U.S. Environmental Protection Agency (EPA) must perform environmental justice (EJ) reviews of its rules and regulations. EJ analyses address the hypothesis that environmental disamenities are experienced disproportionately by poor and/or minority subgroups. Such analyses typically use communities as the unit of analysis. While community-based approaches make sense when considering where polluting sources locate, they are less appropriate for national air quality rules affecting many sources and pollutants that can travel thousands of miles. We compare exposures and health risks of EJ-identified individuals rather than communities to analyze EPA's Heavy Duty Diesel (HDD) rule as an example national air quality rule. Air pollutant exposures are estimated within grid cells by air quality models; all individuals in the same grid cell are assigned the same exposure. Using an inequality index, we find that inequality within racial/ethnic subgroups far outweighs inequality between them. We find, moreover, that the HDD rule leaves between-subgroup inequality essentially unchanged. Changes in health risks depend also on subgroups' baseline incidence rates, which differ across subgroups. Thus, health risk reductions may not follow the same pattern as reductions in exposure. These results are likely representative of other national air quality rules as well. PMID:21776207
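The abstract does not name its inequality index, but the within/between decomposition it describes is exactly what the Theil T index provides. The sketch below uses synthetic exposure data for two hypothetical subgroups:

```python
import numpy as np

def theil_decomposition(x, groups):
    """Decompose the Theil T index of exposures x into between-group
    and within-group components: T = T_between + sum_g s_g * T_g,
    where s_g is group g's share of total exposure."""
    x = np.asarray(x, dtype=float)
    mu = x.mean()
    total = np.mean(x / mu * np.log(x / mu))
    between = within = 0.0
    for g in np.unique(groups):
        xg = x[groups == g]
        share = xg.sum() / x.sum()
        mug = xg.mean()
        between += share * np.log(mug / mu)
        within += share * np.mean(xg / mug * np.log(xg / mug))
    return total, between, within

rng = np.random.default_rng(3)
# Two hypothetical subgroups with similar means but wide spread
# within each, mimicking the paper's finding.
x = np.concatenate([rng.lognormal(0.0, 0.8, 500),
                    rng.lognormal(0.1, 0.8, 500)])
groups = np.repeat([0, 1], 500)
total, between, within = theil_decomposition(x, groups)
```

With near-equal group means, `within` dominates `between`, the qualitative pattern reported for the HDD rule.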

Post, Ellen S; Belova, Anna; Huang, Jin

2011-06-01

338

Convergence Analysis of Consensus-Based Distributed Clustering

This paper deals with clustering of spatially distributed data using wireless sensor networks, including convergence of the algorithm and its stability analysis. Index Terms: Clustering methods, Unsupervised learning, Distributed

Forero, Pedro A.; Cano, Alfonso

339

Monte Carlo models and analysis of galactic disk gamma-ray burst distributions

NASA Technical Reports Server (NTRS)

Gamma-ray bursts are transient astronomical phenomena which have no quiescent counterparts in any region of the electromagnetic spectrum. Although temporal and spectral properties indicate that these events are likely energetic, their unknown spatial distribution complicates astrophysical interpretation. Monte Carlo samples of gamma-ray burst sources are created which belong to Galactic disk populations. Spatial analysis techniques are used to compare these samples to the observed distribution. From this, both quantitative and qualitative conclusions are drawn concerning allowed luminosity and spatial distributions of the actual sample. Although the Burst and Transient Source Experiment (BATSE) on the Gamma Ray Observatory (GRO) will significantly improve knowledge of the gamma-ray burst source spatial characteristics within only a few months of launch, the analysis techniques described herein will not be superseded. Rather, they may be used with BATSE results to obtain detailed information about both the luminosity and spatial distributions of the sources.

Hakkila, Jon

1989-01-01

340

FASEP ultra-automated analysis of fibre length distribution in glass-fibre-reinforced products

NASA Astrophysics Data System (ADS)

Reinforced plastic materials are widely used in highly sophisticated applications. The length distribution of the fibres influences the mechanical properties of the final product. A method for automatic determination of this length distribution was developed. After separating the fibres out of the composite material without any damage, and preparing them for microscopic analysis, a mosaic of microscope pictures is taken. After image processing and analysis with mathematical methods, a complete statistic of the fibre length distribution could be determined. A correlation between fibre length distribution and mechanical properties, measured e.g. with material test methods like tensile and impact tests, was found. This is a method to optimize the process and selection of material for the plastic parts. As a result this enhances customer satisfaction and, perhaps more importantly, reduces costs for the manufacturer.

Hartwich, Mark R.; Höhn, Norbert; Mayr, Helga; Sandau, Konrad; Stengler, Ralph

2009-06-01

341

NASA Astrophysics Data System (ADS)

In this work we report an experimental modal analysis of a cantilever beam, carried out by use of a Brillouin optical time-domain analysis (BOTDA) setup operated at a fixed pump-probe frequency shift. The employed technique permitted us to carry out distributed strain measurements along the vibrating beam at a maximum acquisition rate of 108 Hz. The mode shapes of the first three bending modes (1.7, 10.8, 21.6 Hz) were measured for the structure under test. The good agreement between the experimental and numerical results based on a finite-element method (FEM) analysis demonstrates that Brillouin based distributed sensors are well suited to perform the modal analysis of a vibrating structure. This type of analysis may be useful for applications in structural health monitoring where changes in mode shapes are used as indicators of the damage to the structure.

Minardo, Aldo; Coscetta, Agnese; Pirozzi, Salvatore; Bernini, Romeo; Zeni, Luigi

2012-12-01

342

Advancing Collaborative Climate Studies through Globally Distributed Geospatial Analysis

NASA Astrophysics Data System (ADS)

(note: acronym glossary at end of abstract) For scientists to have confidence in the veracity of data sets and computational processes not under their control, operational transparency must be much greater than previously required. Having a universally understood and machine-readable language for describing such things as the completeness of metadata, data provenance and uncertainty, and the discrete computational steps in a complex process takes on increased importance. OGC has been involved with technological issues associated with climate change since 2005 when we, along with the IEEE Committee on Earth Observation, began a close working relationship with GEO and GEOSS (http://earthobservations.org). GEO/GEOSS provide the technology platform to GCOS, which in turn represents the earth observation community to the UNFCCC. OGC and IEEE are the organizers of the GEO/GEOSS Architecture Implementation Pilot (see http://www.ogcnetwork.net/AIpilot). This continuing work involves working closely with GOOS (Global Ocean Observing System) and WMO (World Meteorological Organization). This session reports on the findings of recent work within the OGC's community of software developers and users to apply geospatial web services to the climate studies domain. The value of this work is to evolve OGC web services, moving from data access and query to geo-processing and workflows. Two projects will be described, the GEOSS AIP-2 and the CCIP. AIP is a task of the GEOSS Architecture and Data Committee. During its duration, two GEO Tasks defined the project: AIP-2 began as GEO Task AR-07-02, to lead the incorporation of contributed components consistent with the GEOSS Architecture using a GEO Web Portal and a Clearinghouse search facility to access services through GEOSS Interoperability Arrangements in support of the GEOSS Societal Benefit Areas. 
AIP-2 concluded as GEO Task AR-09-01b, to develop and pilot new process and infrastructure components for the GEOSS Common Infrastructure and the broader GEOSS architecture. Of specific interest to this session is the work on geospatial workflows and geo-processing and data discovery and access. CCIP demonstrates standards-based interoperability between geospatial applications in the service of Climate Change analysis. CCIP is planned to be a yearly exercise. It consists of a network of online data services (WCS, WFS, SOS), analysis services (WPS, WCPS, WMS), and clients that exercise those services. In 2009, CCIP focuses on Australia, and the initial application of existing OGC services to climate studies. The results of the 2009 CCIP will serve as requirements for more complex geo-processing services to be developed for CCIP 2010. The benefits of CCIP include accelerating the implementation of the GCOS, and building confidence that implementations using multi-vendor interoperable technologies can help resolve vexing climate change questions. AIP-2: Architecture Implementation Pilot, Phase 2 CCIP: Climate Challenge Integration Plugfest GEO: Group on Earth Observations GEOSS: Global Earth Observing System of Systems GCOS: Global Climate Observing System OGC: Open Geospatial Consortium SOS: Sensor Observation Service WCS: Web Coverage Service WCPS: Web Coverage Processing Service WFS: Web Feature Service WMS: Web Mapping Service

Singh, R.; Percivall, G.

2009-12-01

343

QCD Analysis for Nuclear Parton Distributions in the Next to Leading Order

A QCD analysis of the nuclear parton distributions and structure functions in the NLO is performed using the world data. By having bounded parton distributions for a nucleus with atomic number A, we can obtain the nuclear structure function in x space. Our results for the nuclear structure function ratio $F^A_2 /F^D_2$ for several different values of A are in good agreement with the experimental data. We compare our results for the LO and NLO approximations.

S. Atashbar Tehrani; Ali N. Khorramian

2008-01-19

344

Design and analysis of an ultrawide-band distributed CMOS mixer

This paper presents the design and analysis of a novel distributed CMOS mixer for ultrawide-band (UWB) receivers. To achieve the UWB RF frequency range required for UWB communications, the proposed mixer incorporates artificial inductance-capacitance (LC) delay lines in the radio frequency (RF), local oscillator (LO), and intermediate frequency signal paths, and single-balanced mixer cells that are distributed along these LC

Amin Q. Safarian; Ahmad Yazdi; Payam Heydari

2005-01-01

345

Preliminary analysis of the span-distributed-load concept for cargo aircraft design

A simplified computer analysis of the span-distributed-load airplane (in which payload is placed within the wing structure) has shown that the span-distributed-load concept has high potential for application to future air cargo transport design. Significant increases in payload fraction over current wide-bodied freighters are shown for gross weights in excess of 0.5 Gg (1,000,000 lb). A cruise-matching calculation shows that

A. H. Jr

1975-01-01

346

Development of distribution system reliability and risk analysis models. Final report

This report describes work completed in fulfillment of EPRI Research Project 1356-1, Develop Distribution System Reliability and Risk Analysis Model. This volume comprises the background material needed to determine the state-of-the-art in reliability assessment techniques used by the distribution sector of the electric power utility industry. The research identified two distinct reliability assessment methods, namely historical assessment and predictive assessment.

T. D. Vismor; J. E. D. Northcote-Green; R. Billinton

1981-01-01

347

In this paper we analyze the distribution of the gender wage gap. Using microdata for Switzerland we estimate conditional wage distribution functions and find that the total wage gap and its discrimination component are not constant over the range of wages. At low wages a disproportionately large part of the wage gap is due to discrimination. In a further analysis

Dorothe Bonjour; Michael Gerfin

2001-01-01

348

We measured size distributions in model wetlands to detect stressor effects at the community level. Two experiments investigated the individual and combined effects of methyl mercury, chlorpyrifos, atrazine, monosodium methane arsonate, and UV-B light on the system. The statistical analysis of the metric using size distributions, which integrated information about organisms 0.2–4750 µm in size, detected effects in the planktonic

Robert M. Baca; Stephen T. Threlkeld

2000-01-01

349

NASA Astrophysics Data System (ADS)

In the statistical analysis of survey data, a large number of data points having a continuously distributed observed variable may be grouped into ranges of constant width, a process known as "binning". For example, galaxies are often placed into groups by redshift. In binning data, a certain amount of information about the object is often lost: any information at a higher degree of accuracy than needed to place it into a bin is discarded. A methodology is proposed for the determination of population distributions allowing for full retention of the measured value for each observation in cases where the uncertainties are expected to be Gaussian. If the uncertainties are normally distributed, a distinct Gaussian function may be determined for each measurement, with the observed value as the mean and the uncertainty as the standard deviation. These Gaussian distributions - one for each observation - are then summed, creating a continuous probability density distribution. In this manner, all available data may be incorporated into the final population distribution without loss of information due to binning. Because the contribution of each individual point to the overall distribution falls off exponentially with distance from the point, small variations from a true Gaussian for the errors will not significantly impact the final overall distribution. Therefore, this technique may be applied for error distributions that are only approximately Gaussian. This research has been supported in part by the Scott Smith Research Fund at the University of Toledo Department of Physics and Astronomy.
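The proposed binning-free estimator is straightforward to sketch: one normalised Gaussian per observation, summed. The redshift values and uncertainties below are invented for illustration:

```python
import numpy as np

def gaussian_sum_density(x_grid, measurements, uncertainties):
    """Population density built as the sum of one normalised Gaussian
    per observation (mean = measured value, sigma = its uncertainty),
    divided by the number of observations so the result has unit area."""
    x = x_grid[:, None]
    mu = np.asarray(measurements)[None, :]
    sig = np.asarray(uncertainties)[None, :]
    g = np.exp(-0.5 * ((x - mu) / sig) ** 2) / (sig * np.sqrt(2 * np.pi))
    return g.sum(axis=1) / len(measurements)

# Hypothetical redshift measurements with individual uncertainties.
z = np.array([0.10, 0.12, 0.31, 0.33, 0.34])
dz = np.array([0.01, 0.02, 0.01, 0.03, 0.01])
grid = np.linspace(0.0, 0.5, 1001)
pdf = gaussian_sum_density(grid, z, dz)
area = pdf.sum() * (grid[1] - grid[0])   # should be close to 1
```

Unlike a histogram, the result uses each measurement at full precision, and each point's influence decays smoothly with distance.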

Corliss, David

2010-01-01

350

This paper introduces our approach to modeling the mechanical behavior of cellular ceramics, through the example of calcium phosphate scaffolds made by robocasting for bone-tissue engineering. The Weibull theory is used to deal with the statistical failure of the scaffolds’ constitutive rods, and the Sanchez-Palencia theory of periodic homogenization is used to link the rod and scaffold scales. Uniaxial compression of scaffolds and three-point bending of rods were performed to calibrate and validate the model. While calibration based on rod-scale data leads to over-conservative predictions of the scaffold’s properties (as the rods’ successive failures are not taken into account), we show that, for a given rod diameter, calibration based on scaffold-scale data leads to very satisfactory predictions for a wide range of rod spacings, i.e. of scaffold porosity, as well as for different loading conditions. This work establishes the proposed model as a reliable tool for understanding and optimizing the mechanical properties of cellular ceramics. PMID:23439936

Genet, Martin; Houmard, Manuel; Eslava, Salvador; Saiz, Eduardo; Tomsia, Antoni P.

2012-01-01

351

Oil spills represent a major threat to ocean ecosystems and their environmental status. Previous studies have shown that Synthetic Aperture Radar (SAR), as its recording is independent of clouds and weather, can be effectively used for the detection and classification of oil spills. Dark formation detection is the first and critical stage in oil-spill detection procedures. In this paper, a novel approach for automated dark-spot detection in SAR imagery is presented. A new approach combining an adaptive Weibull Multiplicative Model (WMM) and MultiLayer Perceptron (MLP) neural networks is proposed to differentiate between dark spots and the background. The results have been compared with those of a model combining a non-adaptive WMM and pulse coupled neural networks. The presented approach removes the need to manually set the non-adaptive WMM filter parameters by developing an adaptive WMM model, a step toward fully automatic dark-spot detection. The proposed approach was tested on 60 ENVISAT and ERS2 images which contained dark spots. For the overall dataset, an average accuracy of 94.65% was obtained. Our experimental results demonstrate that the proposed approach is very robust and effective where the non-adaptive WMM & pulse coupled neural network (PCNN) model generates poor accuracies. PMID:25474376

Taravat, Alireza; Oppelt, Natascha

2014-01-01

353

In this study, we performed thermal stress analyses on a ventilated disk brake with a three-dimensional model for two cases (whether the pressure distribution on a contact surface is uniform or not). A pressure distribution analysis was performed to determine the pressure distribution on the contact surface. Then, by using the results that were obtained from the pressure distribution analyses,

Dae-Jin Kim; Young-Min Lee; Jae-Sil Park; Chang-Sung Seok

2008-01-01

354

Photon correlation spectroscopy (PCS) utilizes the Doppler frequency shift of photons scattered off particles undergoing Brownian motion to determine the size of colloids suspended in water. Photosedimentation analysis (PSA) measures the time-dependent change in optical density of a suspension of colloidal particles undergoing centrifugation. A description of both techniques, important underlying assumptions, and limitations are given. Results for a series of river water samples show that the colloid-size distribution means are statistically identical as determined by both techniques. This also is true of the mass median diameter (MMD), even though MMD values determined by PSA are consistently smaller than those determined by PCS. Because of this small negative bias, the skew parameters for the distributions are generally smaller for the PCS-determined distributions than for the PSA-determined distributions. Smaller polydispersity indices for the distributions are also determined by PCS. -from Author

Rees, T.F.

1990-01-01

355

A new method for the size-distribution analysis of polymers by sedimentation velocity analytical ultracentrifugation is described. It exploits the ability of Lamm equation modeling to discriminate between the spreading of the sedimentation boundary arising from sample heterogeneity and from diffusion. Finite element solutions of the Lamm equation for a large number of discrete noninteracting species are combined with maximum entropy regularization to represent a continuous size-distribution. As in the program CONTIN, the parameter governing the regularization constraint is adjusted by variance analysis to a predefined confidence level. Estimates of the partial specific volume and the frictional ratio of the macromolecules are used to calculate the diffusion coefficients, resulting in relatively high-resolution sedimentation coefficient distributions c(s) or molar mass distributions c(M). It can be applied to interference optical data that exhibit systematic noise components, and it does not require solution or solvent plateaus to be established. More details on the size-distribution can be obtained than from van Holde-Weischet analysis. The sensitivity to the values of the regularization parameter and to the shape parameters is explored with the help of simulated sedimentation data of discrete and continuous model size distributions, and by applications to experimental data of continuous and discrete protein mixtures. PMID:10692345

Schuck, P

2000-01-01

356

powerlaw: A Python Package for Analysis of Heavy-Tailed Distributions

Power laws are theoretically interesting probability distributions that are also frequently used to describe empirical data. In recent years, effective statistical methods for fitting power laws have been developed, but appropriate use of these techniques requires significant programming and statistical insight. In order to greatly decrease the barriers to using good statistical methods for fitting power law distributions, we developed the powerlaw Python package. This software package provides easy commands for basic fitting and statistical analysis of distributions. Notably, it also seeks to support a variety of user needs by being exhaustive in the options available to the user. The source code is publicly available and easily extensible. PMID:24489671
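The core of such a fit for continuous data is the maximum-likelihood estimator of the power-law exponent from Clauset, Shalizi and Newman (alpha_hat = 1 + n / sum(ln(x_i/xmin))), which the sketch below implements directly in NumPy rather than through the package; the sample is synthetic:

```python
import numpy as np

def powerlaw_alpha_mle(x, xmin):
    """Continuous power-law MLE (Clauset, Shalizi & Newman 2009):
    alpha_hat = 1 + n / sum(ln(x_i / xmin)), over the tail x_i >= xmin."""
    tail = np.asarray(x, dtype=float)
    tail = tail[tail >= xmin]
    return 1.0 + len(tail) / np.log(tail / xmin).sum()

# Sample from a pure power law with alpha = 2.5 by inverse transform:
# x = xmin * (1 - u) ** (-1 / (alpha - 1)) for u uniform on [0, 1).
rng = np.random.default_rng(4)
u = rng.uniform(size=100_000)
x = 1.0 * (1 - u) ** (-1 / 1.5)
alpha_hat = powerlaw_alpha_mle(x, xmin=1.0)
```

With the package itself, the equivalent fit is roughly `powerlaw.Fit(x).power_law.alpha`, which additionally selects xmin automatically and supports comparisons against alternative distributions.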

Alstott, Jeff; Bullmore, Ed; Plenz, Dietmar

2014-01-01

358

NASA Astrophysics Data System (ADS)

In this study, we propose an EEG phase synchronization analysis that considers not only the average strength of synchronization but also its distribution and directions, under conditions of emotion evoked by musical stimuli. The experiment was performed with two different musical stimuli that evoke happiness or sadness for 150 seconds. We found that while the average strength of synchronization shows no difference between the right and left sides of the frontal lobe during the happy stimulus, the distribution and directions show significant differences. The proposed analysis is therefore useful for detecting emotional condition, because it provides information that cannot be obtained from the average strength of synchronization alone.
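As a concrete illustration of measuring the average strength of synchronization between two channels, the phase-locking value (PLV) is a commonly used quantity. The sketch below assumes instantaneous phases have already been extracted (e.g. via a Hilbert transform); the function name and toy phase data are hypothetical, not the study's actual pipeline:

```python
import cmath
import math
import random

def phase_locking_value(phases_a, phases_b):
    """PLV: magnitude of the mean unit phasor of the phase differences.
    1 = perfect synchronization, near 0 = no synchronization."""
    acc = sum(cmath.exp(1j * (a - b)) for a, b in zip(phases_a, phases_b))
    return abs(acc) / len(phases_a)

random.seed(3)
n = 4000
# Channel pair 1: phase-locked with a fixed lag plus jitter.
base = [random.uniform(0, 2 * math.pi) for _ in range(n)]
locked = [p + 0.3 + random.gauss(0, 0.2) for p in base]
# Channel pair 2: completely independent phases.
independent = [random.uniform(0, 2 * math.pi) for _ in range(n)]

plv_locked = phase_locking_value(base, locked)
plv_indep = phase_locking_value(base, independent)
print(round(plv_locked, 2), round(plv_indep, 2))
```

The direction information the study emphasizes can be recovered from the same phasor sum (its angle gives the mean phase lead/lag), which the plain PLV magnitude discards.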

Ogawa, Yutaro; Ikeda, Akira; Kotani, Kiyoshi; Jimbo, Yasuhiko

359

An analysis of the size distribution of Italian firms by age

NASA Astrophysics Data System (ADS)

In this paper we analyze the size distribution of Italian firms by age. In other words, we want to establish whether the way that the size of firms is distributed varies as firms become old. As a proxy of size we use capital. In [L.M.B. Cabral, J. Mata, On the evolution of the firm size distribution: Facts and theory, American Economic Review 93 (2003) 1075-1090], the authors study the distribution of Portuguese firms and find that, while the size distribution of all firms is fairly stable over time, the distributions of firms by age groups are appreciably different. In particular, as the age of the firms increases, their size distribution on the log scale shifts to the right, the left tail becomes thinner and the right tail thicker, with a clear decrease of the skewness. In this paper, we perform a similar analysis with Italian firms using the CEBI database, also considering firms' growth rates. Although there are several papers dealing with Italian firms and their size distribution, to our knowledge a similar study concerning size and age has not yet been performed for Italy, especially with such a large panel.

Cirillo, Pasquale

2010-02-01

360

Sensitivity Analysis of CLIMEX Parameters in Modelling Potential Distribution of Lantana camara L.

A process-based niche model of L. camara L. (lantana), a highly invasive shrub species, was developed to estimate its potential distribution using CLIMEX. Model development was carried out using its native and invasive distribution and validation was carried out with the extensive Australian distribution. A good fit was observed, with 86.7% of herbarium specimens collected in Australia occurring within the suitable and highly suitable categories. A sensitivity analysis was conducted to identify the model parameters that had the most influence on lantana distribution. The changes in suitability were assessed by mapping the regions where the distribution changed with each parameter alteration. This allowed an assessment of where, within Australia, the modification of each parameter was having the most impact, particularly in terms of the suitable and highly suitable locations. The sensitivity of various parameters was also evaluated by calculating the changes in area within the suitable and highly suitable categories. The limiting low temperature (DV0), limiting high temperature (DV3) and limiting low soil moisture (SM0) showed highest sensitivity to change. The other model parameters were relatively insensitive to change. Highly sensitive parameters require extensive research and data collection to be fitted accurately in species distribution models. The results from this study can inform more cost effective development of species distribution models for lantana. Such models form an integral part of the management of invasive species and the results can be used to streamline data collection requirements for potential distribution modelling. PMID:22815881
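The one-at-a-time sensitivity procedure described above (perturb a parameter, remap suitability, measure the change in suitable area) can be sketched with a toy suitability rule. The rule, grid and parameter values below are hypothetical stand-ins, not the actual CLIMEX growth index:

```python
# One-at-a-time (OAT) sensitivity sketch: perturb each parameter by +/-10%,
# recount the cells classified as suitable, and rank parameters by the change.

def suitability(temp, moisture, params):
    """1 if the cell lies inside the tolerated temperature and moisture
    window, else 0 (a crude stand-in for a CLIMEX-style index)."""
    ok_t = params["DV0"] <= temp <= params["DV3"]
    ok_m = params["SM0"] <= moisture <= params["SM3"]
    return 1 if ok_t and ok_m else 0

def suitable_area(cells, params):
    return sum(suitability(t, m, params) for t, m in cells)

# Hypothetical climate grid: (mean temperature degC, soil moisture index)
cells = [(t, m) for t in range(0, 41) for m in (0.1, 0.3, 0.5, 0.7, 0.9)]
base = {"DV0": 8.0, "DV3": 33.0, "SM0": 0.2, "SM3": 0.8}
base_area = suitable_area(cells, base)

sensitivity = {}
for name in base:
    for factor in (0.9, 1.1):                      # +/-10% perturbation
        p = dict(base, **{name: base[name] * factor})
        delta = abs(suitable_area(cells, p) - base_area)
        sensitivity[name] = max(sensitivity.get(name, 0), delta)

ranked = sorted(sensitivity, key=sensitivity.get, reverse=True)
print(ranked)  # most sensitive parameter first
```

With this toy rule DV3 dominates, mirroring the kind of ranking the study derives; a real analysis would also map *where* the suitability class changes, not just the area totals.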

Taylor, Subhashni; Kumar, Lalit

2012-01-01

362

Using CLIMEX and the Taguchi Method, a process-based niche model was developed to estimate potential distributions of Phoenix dactylifera L. (date palm), an economically important crop in many countries. Development of the model was based on both its native and invasive distribution, and validation was carried out against its extensive distribution in Iran. To identify the model parameters having the greatest influence on the distribution of date palm, a sensitivity analysis was carried out. Changes in suitability were established by mapping regions where the estimated distribution changed with parameter alterations. This facilitated the assessment of certain areas in Iran where parameter modifications had the most impact, particularly in relation to suitable and highly suitable locations. Parameter sensitivities were also evaluated by calculating area changes within the suitable and highly suitable categories. The upper optimal temperature (DV2), high temperature limit (DV3), upper optimal soil moisture (SM2) and high soil moisture limit (SM3) had the greatest impact on sensitivity, while the other parameters were relatively insensitive to change. For an accurate fit in species distribution models, highly sensitive parameters require more extensive research and data collection. Results of this study demonstrate a more cost-effective method for developing date palm distribution models, an integral element in species management, and may prove useful for streamlining data collection requirements in potential distribution modeling for other species as well. PMID:24722140

Shabani, Farzin; Kumar, Lalit

2014-01-01

363

State and local policymakers show increasing interest in spurring the development of customer-sited distributed generation (DG), in particular solar photovoltaic (PV) markets. Prompted by that interest, this analysis examines the use of state policy as a tool to support the development of a robust private investment market. This analysis builds on previous studies that focus on government subsidies to reduce installation costs of individual projects and provides an evaluation of the impacts of policies on stimulating private market development.

Doris, E.; Krasko, V.A.

2012-10-01

364

NASA Astrophysics Data System (ADS)

The variational methods widely used for other environmental systems are applied to a spatially distributed flash flood model coupling kinematic wave overland flow and Green Ampt infiltration. Using an idealized configuration where only parametric uncertainty is addressed, the potential of this approach is illustrated for sensitivity analysis and parameter estimation. Adjoint sensitivity analysis provides an extensive insight into the relation between model parameters and the hydrological response and enables the use of efficient gradient based optimization techniques.

Castaings, W.; Dartus, D.; Le Dimet, F.-X.; Saulnier, G.-M.

2007-02-01

365

STAT 525: Notes on the log-logistic hazard and survreg in R. The log-logistic distribution is defined as the exponentiation of a logistic variable, which belongs to a location-scale family; the underlying variable in the standard parameterization is called a standard logistic random variable. Recall that the Weibull distribution is the only…

Hunter, David

366

Can Data Recognize Its Parent Distribution?

This study is concerned with model selection of lifetime and survival distributions arising in engineering reliability or in the medical sciences. We compare various distributions, including the gamma, Weibull and lognormal, with a new distribution called geometric extreme exponential. Except for the lognormal distribution, the other three distributions all have the exponential distribution as a special case. A Monte Carlo simulation was performed to determine sample sizes for which survival distributions can distinguish data generated by their own families. The two decision methods compared are maximum likelihood and Kolmogorov distance. Neither method is uniformly best. The probability of correct selection with more than one alternative shows some surprising results when the choices are close to the exponential distribution.
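The Kolmogorov-distance decision rule can be sketched directly: fit each candidate family by maximum likelihood, then select the family whose fitted CDF lies closest to the empirical CDF. The example below compares an exponential and a Rayleigh candidate on simulated exponential data; this pair is chosen because both have closed-form MLEs, and it is not the paper's exact candidate set:

```python
import math
import random

def ks_distance(data, cdf):
    """Kolmogorov distance between the empirical CDF of `data`
    and a fitted model CDF."""
    xs = sorted(data)
    n = len(xs)
    d = 0.0
    for i, x in enumerate(xs, start=1):
        f = cdf(x)
        d = max(d, i / n - f, f - (i - 1) / n)
    return d

random.seed(1)
data = [random.expovariate(0.5) for _ in range(2000)]  # true parent: exponential

# Candidate 1: exponential, MLE rate = 1 / sample mean
rate = len(data) / sum(data)
d_exp = ks_distance(data, lambda x: 1.0 - math.exp(-rate * x))

# Candidate 2: Rayleigh (Weibull with shape 2), MLE sigma^2 = mean(x^2) / 2
s2 = sum(x * x for x in data) / (2 * len(data))
d_ray = ks_distance(data, lambda x: 1.0 - math.exp(-x * x / (2 * s2)))

best = "exponential" if d_exp < d_ray else "Rayleigh"
print(best, round(d_exp, 3), round(d_ray, 3))
```

Repeating this over many simulated samples at varying n gives the probability-of-correct-selection curves the study tabulates.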

Marshall, A. W.; Meza, J. C.; Olkin, I.

1999-05-01

367

The distributed dislocation method applied to the analysis of elastoplastic strain concentrations

NASA Astrophysics Data System (ADS)

Conventionally, the use of continuous distributions of dislocations to model plasticity has been confined to the analysis of crack tip plasticity using linear arrays of dislocations, within the framework of plane analysis. By expanding this technique to a distribution of dislocations over an area, a method is developed to model the plasticity at stress-raising features such as notches or holes under plane strain conditions. The method explicitly takes account of the boundary conditions by using a dislocation solution which accounts for the presence of the stress raiser itself. Other free boundaries may be modelled more approximately using boundary elements which also correctly include the presence of the stress raiser. The dislocations are distributed over finite-sized cells, and the solutions found for the strain fields compare favourably with both finite element results and the bounding Neuber and Glinka results.

Blomerus, P. M.; Hills, D. A.; Kelly, P. A.

1999-04-01

368

The analysis of reaction time (RT) distributions has become a recognized standard in studies on the stimulus response correspondence (SRC) effect as it allows exploring how this effect changes as a function of response speed. In this study, we compared the spatial SRC effect (the classic Simon effect) with the motion SRC effect using RT distribution analysis. Four experiments were conducted, in which we manipulated factors of space position and motion for stimulus and response, in order to obtain a clear distinction between positional SRC and motion SRC. Results showed that these two types of SRC effects differ in their RT distribution functions as the space positional SRC effect showed a decreasing function, while the motion SRC showed an increasing function. This suggests that different types of codes underlie these two SRC effects. Potential mechanisms and processes are discussed. PMID:24605178
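A common form of the RT distribution analysis referred to here computes the SRC effect at several quantiles of the RT distribution (a "delta plot"). The sketch below uses hypothetical simulated RTs in which the effect shrinks for slower responses, producing the decreasing function reported for the spatial Simon effect; the data and numbers are illustrative only:

```python
import random

def quantiles(xs, probs):
    """Linear-interpolation sample quantiles (type-7, as in numpy)."""
    s = sorted(xs)
    out = []
    for p in probs:
        h = (len(s) - 1) * p
        lo = int(h)
        hi = min(lo + 1, len(s) - 1)
        out.append(s[lo] + (h - lo) * (s[hi] - s[lo]))
    return out

# Hypothetical RTs (ms): incompatible trials are slower on average but
# less variable, so the compatibility effect shrinks at slow responses.
random.seed(2)
compatible   = [random.gauss(420, 60) for _ in range(5000)]
incompatible = [random.gauss(450, 50) for _ in range(5000)]

probs = [0.1, 0.3, 0.5, 0.7, 0.9]
delta = [i - c for c, i in zip(quantiles(compatible, probs),
                               quantiles(incompatible, probs))]
print([round(d, 1) for d in delta])  # SRC effect at each RT quantile
```

A decreasing sequence here corresponds to the decreasing delta-plot function found for the positional SRC effect, while an increasing sequence would match the motion SRC pattern.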

Styrkowiec, Piotr; Szczepanowski, Remigiusz

2013-01-01

369

Analysis of the melanin distribution in different ethnic groups by in vivo laser scanning microscopy

NASA Astrophysics Data System (ADS)

The aim of this study was to determine whether laser scanning confocal microscopy (LSM) is able to visualize differences in melanin content and distribution in different Skin Phototypes. The investigations were carried out on six healthy volunteers with Skin Phototypes II, IV, and VI. Representative skin samples of Skin Phototypes II, V, and VI were obtained for histological analysis from remaining tissue of skin grafts and were used for LSM-pathologic correlation. LSM evaluation showed significant differences in melanin distribution in Skin Phototypes II, IV, and VI, respectively. Based on the differences in overall reflectivity and image brightness, a visual evaluation scheme showed increasing brightness of the basal and suprabasal layers with increasing Skin Phototypes. The findings correlated well with histological analysis. The results demonstrate that LSM may serve as a promising adjunctive tool for real-time assessment of melanin content and distribution in human skin, with numerous clinical applications and therapeutic and preventive implications.

Antoniou, C.; Lademann, J.; Richter, H.; Astner, S.; Patzelt, A.; Zastrow, L.; Sterry, W.; Koch, S.

2009-05-01

370

Superstatistics analysis of the ion current distribution function: Met3PbCl influence study.

A novel analysis of ion current time series is proposed. It is shown that higher (second, third and fourth) statistical moments of the ion current probability distribution function (PDF) can yield new information about ion channel properties. The method is illustrated on a two-state model where the PDF of the compound states are given by normal distributions. The proposed method was applied to the analysis of the SV cation channels of vacuolar membrane of Beta vulgaris and the influence of trimethyllead chloride (Met(3)PbCl) on the ion current probability distribution. Ion currents were measured by patch-clamp technique. It was shown that Met(3)PbCl influences the variance of the open-state ion current but does not alter the PDF of the closed-state ion current. Incorporation of higher statistical moments into the standard investigation of ion channel properties is proposed. PMID:20354691
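Computing the higher central moments of an ion current trace is straightforward. The sketch below derives skewness and excess kurtosis from a hypothetical two-state mixture record; the model and numbers are illustrative stand-ins, not the measured SV-channel data:

```python
import random

def central_moments(xs):
    """Mean, variance, skewness and excess kurtosis of a time series."""
    n = len(xs)
    mean = sum(xs) / n
    m2 = sum((x - mean) ** 2 for x in xs) / n
    m3 = sum((x - mean) ** 3 for x in xs) / n
    m4 = sum((x - mean) ** 4 for x in xs) / n
    return mean, m2, m3 / m2 ** 1.5, m4 / m2 ** 2 - 3.0

# Hypothetical two-state channel record: a closed-state baseline mixed
# with a less frequent open state carrying larger current and variance.
random.seed(4)
closed = [random.gauss(0.0, 0.5) for _ in range(8000)]
opened = [random.gauss(5.0, 1.5) for _ in range(2000)]
trace = closed + opened

mean, var, skew, kurt = central_moments(trace)
print(round(skew, 2))  # strongly positive for this two-state mixture
```

In the two-state picture of the paper, a treatment that widens only the open-state current distribution changes the higher moments of the overall PDF while leaving the closed-state component untouched, which is exactly what moment-by-moment analysis can detect.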

Miśkiewicz, Janusz; Trela, Zenon; Przestalski, Stanisław; Karcz, Waldemar

2010-09-01

371

Mechanical Response of Silk Crystalline Units from Force-Distribution Analysis

The mechanical strength of silk fibers is thought to be caused by embedded crystalline units acting as cross-links. We examine the stress-strain relationships of four different models, from spider and Bombyx mori silk peptides, in antiparallel and parallel…

Gräter, Frauke

372

Analysis of liquid–liquid distribution constants of organophosphorus-based extractants

Linear solvation energy relationship (LSER) analysis of the distribution of organophosphorus-based solutes between an aqueous phase and organic solvents has been performed, applying the Kamlet-Taft and Leggett models of solvent effects. A correlation permitting the prediction of unknown and/or correction of uncertain values of Kabachnik induction constants from the structure of alkyl and alkoxyl substituents attached to phosphorus…

Wiesław Apostoluk; Waldemar Robak

2005-01-01

373

Distributed Sensor Analysis for Fault Detection in Tightly-Coupled Multi-Robot Team Tasks

Xingyan Li and Lynne E. Parker, Proc. of the IEEE International Conference on Robotics and Automation, Kobe, Japan. This work addresses fault detection in tightly-coupled multi-robot team tasks. While the centralized version of SAFDetection was shown to be successful…

Parker, Lynne E.

374

Technology Transfer Automated Retrieval System (TEKTRAN)

Starch was isolated from flour of four wheats representing hard red winter (Karl), hard red spring (Gunner), durum (Belfield 3), and spelt (WK 86035-8) wheat classes. Digital image analysis (IA) coupled to light microscopy was used to determine starch size distributions where the volume of granules...

375

A Study of Optimal Mean Photon Number Analysis in Quantum Key Distribution (QKD)

This paper presents a study of optimal mean photon number (mu) analysis in quantum key distribution (QKD) experiments. The rationale of this study is to analyze the eavesdropping-technology assumptions underlying the optimal mean photon number. This work limits its scope to fiber-based QKD implementations.

L. I. A. Ghazali; W. A. W. Adnan; M. Mokhtar; A. F. Abas; M. A. Mahdi

2008-01-01

376

Regional Distribution of Consultancy Firms Servicing the MAPCON Scheme: A Preliminary Analysis

VINCENT P., CHELL E. and HAWORTH J. (1987) Regional distribution of consultancy firms servicing the MAPCON scheme: a preliminary analysis, Reg. Studies 21, 505-518. This paper examines published data pertaining to the MAPCON scheme, which is one of a number of government schemes aimed at the revival of British industry. The paper outlines the argument regarding the role of information in…

Peter Vincent; Elizabeth Chell; Jean Haworth

1987-01-01

377

Novel technique for current density distribution analysis of solidly modeled coil

In this paper, a novel technique for current density distribution analysis of a solidly modeled filamentary coil is proposed. The proposed method overcomes the disadvantages of the conventional methods based on the current vector potential formulation. The method is verified by application to a simple test problem and to a deflection yoke coil model.

Chang-Hwan Im; Hong-Kyu Kim; Hyun-Kyo Jung

2002-01-01

378

The Beach Study: An Empirical Analysis of the Distribution of Coastal Property Values

Empirical evidence suggests that coastal properties, and particularly those proximate to a beach, have…

Omiecinski, Curtis

379

Dwell-Time Distribution Analysis of Polyprotein Unfolding Using Force-Clamp Spectroscopy

Using a recently developed single-molecule force-clamp technique, we quantitatively measure the kinetics of conformational changes of polyprotein molecules at a constant force. In response to an applied force of 110 pN, we…

Fernandez, Julio M.

380

An Ecological Analysis of the Geographic Distribution of Veterinarians in the United States

ERIC Educational Resources Information Center

Measures of the ecological characteristics of states were developed through factor analysis. Then ecological characteristics of states and cities were related to the geographic distribution of veterinarians and physicians. Population size is the strongest correlate of the number of health professionals. Results for pet veterinarians resemble…

Richards, James M., Jr.

1977-01-01

381

Nonsmooth analysis and sonar-based implementation of distributed coordination algorithms

This work concerns the nonsmooth analysis and sonar-based implementation of distributed coordination algorithms for mobile robots equipped with sonar. We develop novel approaches for improving single-point sonar scans; a platform equipped with rotating sonar sensors is used. Successful implementation of the algorithms is largely…

Brennan, Sean

382

VISUAL INTERFACE FOR THE CONCEPT DISTRIBUTION ANALYSIS IN VIDEO SEARCH RESULTS

Despite the current performance of 'traditional' search engines, video search engines… The method currently most used in search engines is tagging…

Paris-Sud XI, Université de

383

Overlap distributions and taxonomy analysis of spin glass states with equal weights

N. Parga. Techniques of numerical taxonomy are used to verify the ultrametricity of spin glass states with equal weights; … the differences between samples disappear…

Paris-Sud XI, Université de

384

Structure-Independent Analysis of the Breadth of the Positional Distribution of Disordered Groups in

…association. Experimental analysis, however, of the positional distribution of disordered groups … both extrinsic fluorophores are attached to the macromolecule through mobile linkers; small-angle X-ray methods are insensitive to translational motions. These issues have hindered experimental studies on the positional…

Clore, G. Marius

385

Distributional analysis of regional benefits and cost of air quality control

The methodology and results of an analysis of benefits and costs of air quality control for an urban region in Florida are given. The framework considers the spatial distribution of: (a) emission sources; (b) the ambient levels resulting from local meteorological conditions and geographic features; and (c) the socioeconomic characteristics of the impacted population groups. This facilitates an examination…

E. T. Loehman; S. V. Berg; A. A. Arroyo; R. A. Hedinger; J. M. Schwartz; M. E. Shaw; R. W. Fahien; V. H. De; R. P. Fishe; D. E. Rio

1979-01-01

386

Many areas of ecological inquiry require the ability to detect and characterize change in ecological variables across both space and time. The purpose of this study was to investigate ways in which geographic boundary analysis techniques could be used to characterize the pattern of change over space in plant distributions in a forested wetland mosaic. With vegetation maps created using

Kimberly R. Hall; Susan L. Maruca

2001-01-01

387

Exploratory Data Analysis to Identify Factors Influencing Spatial Distributions of Weed Seed Banks

Technology Transfer Automated Retrieval System (TEKTRAN)

Comparing distributions of different species in multiple fields will help us understand the spatial dynamics of weed seed banks, but analyzing observational data requires non-traditional statistical methods. We used classification and regression tree analysis (CART) to investigate factors that influ...

388

The biancana landscapes, rather common in several Italian areas, have a very complex morphology. Different geomorphic features often occur along the same hillslope. From a morphological survey of an experimental site in southern Tuscany, the forms were classified, and their distribution analysed. Spatial analysis of the biancane provided insights into the range of forces responsible for their formation and evolution.

Costanza Calzolari; Fabrizio Ungaro

1998-01-01

389

Distributional Impacts of Nationwide Car Road Pricing: A CGE Analysis for Austria*

Keywords: emissions, passenger transport policy, income distribution. JEL: C68, D58, H23, R48. … of the road pricing burden. In our analysis we focus on the CGE modelling structure of private transport by households. The paper proceeds in four steps. First, the transport and consumption databases are merged…

Steininger, Karl W.

390

Analysis of a cone-based distributed topology control algorithm for wireless multi-hop networks

The topology of a wireless multi-hop network can be controlled by varying the transmission power at each node. In this paper, we give a detailed analysis of a cone-based distributed topology control algorithm. This algorithm, introduced in [16], does not assume that nodes have GPS information available; rather it depends only on directional information. Roughly speaking, the basic idea of
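The core geometric test in a cone-based algorithm can be sketched as follows: a node may stop increasing its transmission power once every cone of angle alpha around it contains a neighbor, which holds iff the largest angular gap between successive neighbor directions is below alpha. The alpha = 5*pi/6 value used below is the connectivity threshold identified in this line of work (an assumption of this sketch), and the function name is illustrative:

```python
import math

def cone_gap_satisfied(neighbor_angles, alpha):
    """CBTC stopping test: every cone of angle `alpha` around the node
    contains a neighbor iff the largest circular gap between successive
    neighbor directions (in radians) is less than alpha."""
    if not neighbor_angles:
        return False
    a = sorted(th % (2 * math.pi) for th in neighbor_angles)
    gaps = [b - c for c, b in zip(a, a[1:])]
    gaps.append(a[0] + 2 * math.pi - a[-1])   # wrap-around gap
    return max(gaps) < alpha

alpha = 5 * math.pi / 6   # assumed connectivity-preserving cone angle

# Four well-spread neighbors cover every cone; three clustered ones do not.
print(cone_gap_satisfied([0.5, 2.0, 3.5, 5.0], alpha),
      cone_gap_satisfied([0.5, 1.0, 1.5], alpha))   # True False
```

Note that only directional information enters the test, matching the paper's premise that nodes need no GPS positions.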

Li Li; Joseph Y. Halpern; Paramvir Bahl; Yi-Min Wang; Roger Wattenhofer

2001-01-01

391

Analysis of Saturn's thermal emission at 2.2-cm wavelength: Spatial distribution of ammonia vapor

Available online 27 June 2013. Keywords: Saturn. This work focuses on determining the latitudinal structure of ammonia vapor in Saturn's cloud…

392

An investigation on the intra-sample distribution of cotton color by using image analysis

Technology Transfer Automated Retrieval System (TEKTRAN)

The colorimeter principle is widely used to measure cotton color. This method provides the sample's color grade, but the result does not include information about the color distribution or any variation within the sample. We conducted an investigation that used an image analysis method to study the ...

393

This paper presents stability analysis of a distributed generation (DG) controller, in an islanded mode, based on the Mapping Theorem and the Zero Exclusion Condition, and validates the results based on a laboratory scale experimental setup. The DG unit is interfaced to the host system through a voltage-sourced converter (VSC). The control strategy regulates the load voltage at the desired

Behrooz Bahrani; Houshang Karimi; Reza Iravani

2010-01-01

394

SPE 167844. Geographically-Distributed Databases: A Big Data Technology for Production Analysis. This paper brings advances in the scientific field of "big data" to the upstream Oil & Gas industry… off-the-shelf IT technologies currently employed in the data management of Oil & Gas production operations. Most current…

395

An Analysis of Variance Approach for the Estimation of Response Time Distributions in Tests

ERIC Educational Resources Information Center

Generalizability theory and analysis of variance methods are employed, together with the concept of objective time pressure, to estimate response time distributions and the degree of time pressure in timed tests. By estimating response time variance components due to person, item, and their interaction, and fixed effects due to item types and…

Attali, Yigal

2010-01-01

396

Measurement and analysis of residence time distribution (RTD) data is the focus of this study where data collection methods were developed specifically for the pretreatment reactor environment. Augmented physical sampling and automated online detection methods were developed and applied. Both the measurement techniques themselves and the produced RTD data are presented and discussed.
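Once a tracer response C(t) has been measured, the standard RTD moments follow by numerical integration: normalize C(t) to obtain the residence time distribution E(t), then integrate t*E(t) for the mean residence time. A minimal sketch with hypothetical pulse-tracer data (not the study's measurements):

```python
def rtd_moments(times, conc):
    """Mean residence time and variance from a pulse-tracer response,
    using trapezoidal integration of C(t)."""
    def trapz(ys):
        return sum((ys[i] + ys[i + 1]) / 2.0 * (times[i + 1] - times[i])
                   for i in range(len(times) - 1))
    area = trapz(conc)                                 # total tracer signal
    e = [c / area for c in conc]                       # E(t), normalized RTD
    tau = trapz([t * x for t, x in zip(times, e)])     # mean residence time
    var = trapz([(t - tau) ** 2 * x for t, x in zip(times, e)])
    return tau, var

# Hypothetical tracer data: time (min) and outlet concentration (arb. units)
times = [0, 1, 2, 3, 4, 5, 6, 7, 8]
conc  = [0, 1, 5, 8, 10, 8, 6, 4, 0]
tau, var = rtd_moments(times, conc)
print(round(tau, 2))  # mean residence time in minutes
```

The variance (second central moment) additionally characterizes axial dispersion, which is why both moments are typically reported alongside the raw RTD curve.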

Sievers, D.; Kuhn, E.; Tucker, M.; Stickel, J.; Wolfrum, E.

2013-06-01

397

Fluorescence energy transfer is widely used for determination of intramolecular distances in macromolecules. The time dependence of the rate of energy transfer is a function of the donor/acceptor distance distribution and fluctuations between the various conformations which may occur during the lifetime of the excited state. Previous attempts to recover both distance distributions and segmental diffusion from time-resolved experiments have been unsuccessful due to the extreme correlation between fitting parameters. A method has been developed, based on global analysis of both donor and acceptor fluorescence decay curves, which overcomes this extreme cross-correlation and allows the parameters of the equilibrium distance distributions and intramolecular diffusion constants to be recovered with high statistical significance and accuracy. Simulation studies of typical intramolecular energy transfer experiments reveal that both static and dynamic conformational distribution information can thus be obtained at a single temperature and viscosity. PMID:2765658

Beechem, J M; Haas, E

1989-01-01

398

NASA Astrophysics Data System (ADS)

Tests and calibration of sprayers are considered an important task for reducing chemical use in agriculture and for improving plant phytosanitary protection. A reliable, affordable and easy-to-use method to observe the distribution in the field is required, and infrared thermoimage analysis can be considered a potential method based on non-contact imaging technologies. The basic idea is that applying water colder (10 °C less) than the leaf surface makes it possible to distinguish and measure the targeted areas by infrared thermoimage analysis, based on significant and time-persistent thermal differences. Trials were carried out on a hedge of Prunus laurocerasus, 2.1 m high with a homogeneous canopy. A trailed orchard sprayer was employed with different spraying configurations. A FLIR S40 thermocamera was used to acquire thermal videos (at 50 Hz) from a fixed position, at a frame rate of 10 images/s, for nearly 3 min. Distribution quality was assessed from the temperature differences (ΔT) between pre-treatment and post-treatment thermal images, according to two analyses: the time trend of average ΔT values at different hedge heights, and imaging of the ΔT distribution and area coverage by segmentation with k-means clustering after 30 s of spraying. The chosen spraying configuration gave a fairly good distribution over the entire hedge height, excluding the lower part (0-1 m from the ground) and the upper part (>1.9 m). Through segmentation of the ΔT image by k-means clustering, it was possible to obtain a more detailed visual appreciation of distribution quality over the entire hedge. Thermoimage analysis thus shows interesting potential for evaluating distribution quality from orchard sprayers.

Menesatti, P.; Biocca, M.

2007-09-01

399

Efficient, Distributed and Interactive Neuroimaging Data Analysis Using the LONI Pipeline

The LONI Pipeline is a graphical environment for construction, validation and execution of advanced neuroimaging data analysis protocols (Rex et al., 2003). It enables automated data format conversion, allows Grid utilization, facilitates data provenance, and provides a significant library of computational tools. There are two main advantages of the LONI Pipeline over other graphical analysis workflow architectures. It is built as a distributed Grid computing environment and permits efficient tool integration, protocol validation and broad resource distribution. To integrate existing data and computational tools within the LONI Pipeline environment, no modification of the resources themselves is required. The LONI Pipeline provides several types of process submissions based on the underlying server hardware infrastructure. Only workflow instructions and references to data, executable scripts and binary instructions are stored within the LONI Pipeline environment. This makes it portable, computationally efficient, distributed and independent of the individual binary processes involved in pipeline data-analysis workflows. We have expanded the LONI Pipeline (V.4.2) to include server-to-server (peer-to-peer) communication and a 3-tier failover infrastructure (Grid hardware, Sun Grid Engine/Distributed Resource Management Application API middleware, and the Pipeline server). Additionally, the LONI Pipeline provides three layers of background-server executions for all users/sites/systems. These new LONI Pipeline features facilitate resource-interoperability, decentralized computing, construction and validation of efficient and robust neuroimaging data-analysis workflows. Using brain imaging data from the Alzheimer's Disease Neuroimaging Initiative (Mueller et al., 2005), we demonstrate integration of disparate resources, graphical construction of complex neuroimaging analysis protocols and distributed parallel computing. 
The LONI Pipeline, its features, specifications, documentation and usage are available online (http://Pipeline.loni.ucla.edu). PMID:19649168

Dinov, Ivo D.; Van Horn, John D.; Lozev, Kamen M.; Magsipoc, Rico; Petrosyan, Petros; Liu, Zhizhong; MacKenzie-Graham, Allan; Eggert, Paul; Parker, Douglas S.; Toga, Arthur W.

2009-01-01

400

Probability distributions of land surface wind speeds over North America

NASA Astrophysics Data System (ADS)

Knowledge of the probability distributions of surface wind speeds (SWS) is essential for surface flux estimation, wind power estimation, and wind risk assessments. The two-parameter Weibull distribution is the most widely used empirical distribution for SWS. This study considers the probability density function (PDF) of 3-hourly observations from 720 weather stations over North America for the period 1979-1999. The PDF of SWS is classified by season, time of day, and land surface type. The Weibull PDF is characterized by a particular relationship between the mean, standard deviation, and skewness. While the moments of the observed daytime SWS PDF are found to collapse around this Weibull relationship, the observed nighttime PDF has a broader range of values and is significantly more skewed than the Weibull PDF over rough surfaces. An idealized model shows that SWS skewness has a much greater rate of change with both the mean and standard deviation of surface buoyancy flux under conditions of stable stratification than that of unstable stratification. This result suggests that surface buoyancy flux plays an important role in generating diurnal variation of SWS PDF. Two global reanalysis products (ERA-40 and NCEP-NCAR) and three regional climate models (RCMs) (Rossby Centre Atmospheric Model version 3 (RCA3), limited area version of Global Environmental Multiscale Model (GEM-LAM), and Canadian Regional Climate Model, version 4 (CRCM4)) all have a less skewed nighttime PDF and a narrower range of the normal wind speed during day and night. Among them, two of the RCMs capture the observed SWS differences across different land cover types, and only one of the RCMs produces the observed seasonal peak of SWS PDF.
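
The Weibull moment relationship invoked above can be made concrete: for a two-parameter Weibull distribution, mean, standard deviation, and skewness all follow from the gamma function, and the skewness depends on the shape parameter alone. A minimal sketch (function name is illustrative):

```python
import math

def weibull_moments(k, lam=1.0):
    """Mean, standard deviation, and skewness of a two-parameter
    Weibull distribution with shape k and scale lam."""
    g = lambda n: math.gamma(1.0 + n / k)
    mean = lam * g(1)
    var = lam ** 2 * (g(2) - g(1) ** 2)
    std = math.sqrt(var)
    skew = (lam ** 3 * (g(3) - 3 * g(1) * g(2) + 2 * g(1) ** 3)) / std ** 3
    return mean, std, skew

# Skewness is a function of shape k alone: it is 2 for k=1 (the
# exponential case) and falls toward zero near k ~ 3.6.
for k in (1.0, 2.0, 3.6):
    m, s, g1 = weibull_moments(k)
    print(f"k={k}: mean={m:.3f} std={s:.3f} skew={g1:.3f}")
```

Observed (mean, std, skewness) triples that fall off the one-parameter curve traced out by this family are exactly what the study uses to flag non-Weibull nighttime behavior.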

He, Yanping; Monahan, Adam Hugh; Jones, Colin G.; Dai, Aiguo; Biner, Sebastien; Caya, Daniel; Winger, Katja

2010-01-01

401

Study of Solid State Drives performance in PROOF distributed analysis system

NASA Astrophysics Data System (ADS)

Solid State Drives (SSDs) are a promising storage technology for High Energy Physics parallel analysis farms. Their combination of low random access time and relatively high read speed is very well suited to situations where multiple jobs concurrently access data located on the same drive. SSDs also have lower energy consumption and higher vibration tolerance than Hard Disk Drives (HDDs), which makes them an attractive choice in many applications ranging from personal laptops to large analysis farms. The Parallel ROOT Facility (PROOF) is a distributed analysis system which makes it possible to exploit the inherent event-level parallelism of high energy physics data. PROOF is especially efficient together with distributed local storage systems like Xrootd, when data are distributed over computing nodes. In such an architecture the local disk subsystem I/O performance becomes a critical factor, especially when computing nodes use multi-core CPUs. We will discuss our experience with SSDs in a PROOF environment, comparing the performance of HDDs and SSDs in I/O intensive analysis scenarios. In particular, we will discuss how PROOF system performance scales with the number of simultaneously running analysis jobs.

Panitkin, S. Y.; Ernst, M.; Petkus, R.; Rind, O.; Wenaus, T.

2010-04-01

402

Fractal analysis of the dark matter and gas distributions in the Mare-Nostrum universe

We develop a method of multifractal analysis of N-body cosmological simulations that improves on the customary counts-in-cells method by taking special care of the effects of discreteness and large scale homogeneity. The analysis of the Mare-Nostrum simulation with our method provides strong evidence of self-similar multifractal distributions of dark matter and gas, with a halo mass function that is of Press-Schechter type but has a power-law exponent -2, as corresponds to a multifractal. Furthermore, our analysis shows that the dark matter and gas distributions are indistinguishable as multifractals. To determine if there is any gas biasing, we calculate the cross-correlation coefficient, with negative but inconclusive results. Hence, we develop an effective Bayesian analysis connected with information theory, which clearly demonstrates that the gas is biased in a long range of scales, up to the scale of homogeneity. However, entropic measures related to the Bayesian analysis show that this gas bias is small (in a precise sense) and is such that the fractal singularities of both distributions coincide and are identical. We conclude that this common multifractal cosmic web structure is determined by the dynamics and is independent of the initial conditions.
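
The customary counts-in-cells estimator that the paper improves on can be sketched as follows: partition space into cells of size eps, form the partition sum Z(q, eps) over cell occupation probabilities, and read the generalized dimension D_q from its scaling with eps. This is the plain estimator, without the paper's corrections for discreteness and large-scale homogeneity; names and the uniform test data are illustrative.

```python
import math, random

def partition_function(points, eps, q):
    """Counts-in-cells partition sum Z(q, eps) = sum_i p_i^q
    over occupied cells of linear size eps."""
    cells = {}
    for x, y in points:
        key = (int(x // eps), int(y // eps))
        cells[key] = cells.get(key, 0) + 1
    n = len(points)
    return sum((c / n) ** q for c in cells.values())

# Sanity check on a homogeneous 2-D point set, where D_q = 2 for all q.
random.seed(0)
pts = [(random.random(), random.random()) for _ in range(20000)]
e1, e2 = 0.1, 0.05
z1 = partition_function(pts, e1, 2)
z2 = partition_function(pts, e2, 2)
# tau(q) is the log-log slope of Z; D_q = tau(q) / (q - 1).
d2 = (math.log(z1) - math.log(z2)) / (math.log(e1) - math.log(e2))
print(f"estimated correlation dimension D2 ~ {d2:.2f}")
```

A genuinely multifractal set, by contrast, yields a D_q that varies with q, which is the signature the Mare-Nostrum analysis looks for.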

Gaite, José, E-mail: jose.gaite@upm.es [Instituto de Microgravedad IDR, EIAE, Universidad Politécnica de Madrid, Pza. Cardenal Cisneros 3, E-28040 Madrid (Spain)

2010-03-01

403

Generalized recurrence plots for the analysis of images from spatially distributed systems

NASA Astrophysics Data System (ADS)

We propose a new method for the analysis of images showing patterns emerging from the evolution of spatially distributed systems. The generalized recurrence plot (GRP) and the generalized recurrence quantification analysis (GRQA) are exploited for the investigation of such patterns. We focus on snapshots of spatio-temporal processes such as the formation of Turing structures and traveling waves in the Belousov-Zhabotinsky reaction, satellite images of spatial chlorophyll distribution in seas and oceans (similar to turbulent flows), colonies of Dyctiostelium discoideum, fractals, and noise. The method is based on the GRP and GRQA and particularly on the measures determinism (DET) and entropy (ENT), providing a new criterion for the assessment and classification of images based on the simultaneous evaluation of their global and local structure. The DET-ENT diagram is introduced and compared with the classical image analysis entropy defined on the pixels’ values. The method proposed provides appealing performances in the case of images showing complex spatial patterns.
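
The DET measure used in the DET-ENT diagram is the fraction of recurrence points that form diagonal lines of at least a minimum length. The generalized RP of the paper builds its recurrence matrix from spatial neighborhoods in an image; the one-dimensional sketch below conveys the measure itself, with illustrative names and a scalar distance threshold.

```python
def recurrence_matrix(series, eps):
    """Binary recurrence matrix: R[i][j] = 1 when states i and j
    are closer than eps."""
    n = len(series)
    return [[1 if abs(series[i] - series[j]) < eps else 0
             for j in range(n)] for i in range(n)]

def determinism(R, lmin=2):
    """RQA determinism (DET): fraction of off-diagonal recurrence
    points lying on diagonal lines of length >= lmin."""
    n = len(R)
    total = sum(R[i][j] for i in range(n) for j in range(n) if i != j)
    in_lines = 0
    for d in range(-(n - 1), n):      # scan every off-main diagonal
        if d == 0:
            continue
        run = 0
        for i in range(n):
            j = i + d
            if not (0 <= j < n):
                continue
            if R[i][j]:
                run += 1
            else:
                if run >= lmin:
                    in_lines += run
                run = 0
        if run >= lmin:
            in_lines += run
    return in_lines / total if total else 0.0
```

A strictly periodic signal gives DET = 1, while noise scatters recurrence points into isolated dots and drives DET toward 0, which is why DET separates structured patterns from noise in the paper's classification.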

Facchini, Angelo; Mocenni, Chiara; Vicino, Antonio

2009-01-01

404

NASA Technical Reports Server (NTRS)

Random numbers were generated with the aid of a digital computer and transformed such that the probability density function of a discrete random load history composed of these random numbers had one of the following non-Gaussian distributions: Poisson, binomial, log-normal, Weibull, and exponential. The resulting random load histories were analyzed to determine their peak statistics and were compared with cumulative peak maneuver-load distributions for fighter and transport aircraft in flight.
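
Transforming computer-generated uniform random numbers into a prescribed non-Gaussian distribution, as described above, is classically done by the inverse-CDF method. A sketch for two of the listed distributions (function names are illustrative):

```python
import math, random

def weibull_sample(u, shape, scale=1.0):
    """Inverse-CDF transform: map uniform u in (0,1) to a Weibull variate."""
    return scale * (-math.log(1.0 - u)) ** (1.0 / shape)

def exponential_sample(u, rate=1.0):
    """Inverse-CDF transform for the exponential distribution."""
    return -math.log(1.0 - u) / rate

# A discrete random load history with Weibull-distributed amplitudes.
random.seed(1)
history = [weibull_sample(random.random(), shape=2.0) for _ in range(10000)]
mean = sum(history) / len(history)
print(f"sample mean {mean:.3f} vs theoretical {math.gamma(1.5):.3f}")
```

Peak statistics of such a synthetic history can then be tabulated and compared against measured maneuver-load exceedance curves, as the report does.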

Leybold, H. A.

1971-01-01

405

Water distribution system vulnerability analysis using weighted and directed network models

NASA Astrophysics Data System (ADS)

The reliability and robustness against failures of networked water distribution systems are central tenets of water supply system design and operation. The ability of such networks to continue to supply water when components are damaged or fail is dependent on the connectivity of the network and the role and location of the individual components. This paper employs a set of advanced network analysis techniques to study the connectivity of water distribution systems, its relationship with system robustness, and susceptibility to damage. Water distribution systems are modeled as weighted and directed networks by using the physical and hydraulic attributes of system components. A selection of descriptive measurements is utilized to quantify the structural properties of benchmark systems at both local (component) and global (network) scales. Moreover, a novel measure of component criticality, the demand-adjusted entropic degree, is proposed to support identification of critical nodes and their ranking according to failure impacts. The application and value of this metric is demonstrated through two case study networks in the USA and UK. Discussion focuses on the potential for gradual evolution of abstract graph-based tools and techniques to more practical network analysis methods, where a theoretical framework for the analysis of robustness and vulnerability of water distribution networks to better support planning and management decisions is presented.
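
The demand-adjusted entropic degree is the paper's own construction; the plain entropic degree it builds on (a known weighted-network metric) combines a node's total link strength with how evenly that strength is spread over its links. A hedged sketch of that underlying form, with an illustrative function name:

```python
import math

def entropic_degree(weights):
    """Entropic degree of a node from the weights of its incident links:
    total strength scaled by (1 + entropy of the weight distribution),
    so a node with many evenly loaded links ranks above one whose
    strength is concentrated in a single pipe."""
    s = sum(weights)
    if s == 0:
        return 0.0
    p = [w / s for w in weights]
    h = -sum(pi * math.log(pi) for pi in p if pi > 0)
    return (1.0 + h) * s
```

A demand adjustment, as in the paper, would further weight this score by the consumption served at the node, so that criticality reflects failure impact on supply rather than topology alone.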

Yazdani, Alireza; Jeffrey, Paul

2012-06-01

406

Evolution of the ATLAS PanDA Production and Distributed Analysis System

NASA Astrophysics Data System (ADS)

The PanDA (Production and Distributed Analysis) system has been developed to meet ATLAS production and analysis requirements for a data-driven workload management system capable of operating at LHC data processing scale. PanDA has performed well with high reliability and robustness during the two years of LHC data-taking, while being actively evolved to meet the rapidly changing requirements for analysis use cases. We will present an overview of system evolution including automatic rebrokerage and reattempt for analysis jobs, adaptation for the CernVM File System, support for the multi-cloud model through which Tier-2 sites act as members of multiple clouds, pledged resource management and preferential brokerage, and monitoring improvements. We will also describe results from the analysis of two years of PanDA usage statistics, current issues, and plans for the future.

Maeno, T.; De, K.; Wenaus, T.; Nilsson, P.; Walker, R.; Stradling, A.; Fine, V.; Potekhin, M.; Panitkin, S.; Compostella, G.

2012-12-01

407

Spatial sensitivity analysis of snow cover data in a distributed rainfall-runoff model

NASA Astrophysics Data System (ADS)

As the availability of spatially distributed data sets for distributed rainfall-runoff modelling is strongly growing, more attention should be paid to the influence of the quality of the data on the calibration. While a lot of progress has been made on using distributed data in simulations of hydrological models, sensitivity of spatial data with respect to model results is not well understood. In this paper we develop a spatial sensitivity analysis (SA) method for snow cover fraction input data (SCF) for a distributed rainfall-runoff model to investigate if the model is differently subjected to SCF uncertainty in different zones of the model. The analysis was focused on the relation between the SCF sensitivity and the physical, spatial parameters and processes of a distributed rainfall-runoff model. The methodology is tested for the Biebrza River catchment, Poland for which a distributed WetSpa model is setup to simulate two years of daily runoff. The SA uses the Latin-Hypercube One-factor-At-a-Time (LH-OAT) algorithm, which uses different response functions for each 4 km × 4 km snow zone. The results show that the spatial patterns of sensitivity can be easily interpreted by co-occurrence of different environmental factors such as: geomorphology, soil texture, land-use, precipitation and temperature. Moreover, the spatial pattern of sensitivity under different response functions is related to different spatial parameters and physical processes. The results clearly show that the LH-OAT algorithm is suitable for the spatial sensitivity analysis approach and that the SCF is spatially sensitive in the WetSpa model.
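
The LH-OAT algorithm combines a Latin-Hypercube sample of the parameter space with One-factor-At-a-Time perturbations around each sample point, averaging the relative partial effects. A simplified sketch under the assumption of a scalar model response (the real study uses several response functions per snow zone; all names are illustrative):

```python
import random

def lh_oat(model, bounds, n_strata=10, frac=0.05, seed=0):
    """Latin-Hypercube One-factor-At-a-Time sensitivity analysis.
    Each LH base point is perturbed in one parameter at a time;
    the mean relative partial effect per parameter is returned."""
    rng = random.Random(seed)
    k = len(bounds)
    # Latin hypercube: one draw per stratum per parameter, then shuffle
    # each parameter's column so the pairings are random.
    strata = [[(s + rng.random()) / n_strata for s in range(n_strata)]
              for _ in range(k)]
    for col in strata:
        rng.shuffle(col)
    effects = [0.0] * k
    for s in range(n_strata):
        base = [lo + strata[p][s] * (hi - lo)
                for p, (lo, hi) in enumerate(bounds)]
        y0 = model(base)
        for p in range(k):
            pert = list(base)
            pert[p] *= (1.0 + frac)      # one-at-a-time perturbation
            y1 = model(pert)
            effects[p] += abs((y1 - y0) / (0.5 * (y1 + y0))) / frac
    return [e / n_strata for e in effects]

# Toy model whose output depends far more on x0 than on x1.
sens = lh_oat(lambda x: 10.0 * x[0] + 0.1 * x[1], [(1, 2), (1, 2)])
print(sens)
```

Ranking the returned effects per snow zone is what produces the spatial sensitivity patterns discussed in the abstract.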

Berezowski, T.; Nossent, J.; Chormański, J.; Batelaan, O.

2014-10-01

408

Lognormal distribution is often used as a default model for regression analysis of particle size distribution (PSD) data; however, its goodness-of-fit to particulate matter (PM) sampled from animal buildings and its comparison to other PSD models have not been well examined. This study aimed to evaluate and to compare the goodness-of-fit of six PSD models to total suspended particulate matter (TSP) samples collected from 15 animal buildings. Four particle size analyzers were used for PSD measurement. The models' goodness-of-fit was evaluated based on adjusted R2, Akaike's information criterion (AIC), and mean squared error (MSE) values. Results showed that the models' approximation of measured PSDs differed with particle size analyzer. The lognormal distribution model offered overall good approximations to measured PSD data, but was inferior to the gamma and Weibull distribution models when applied to PSD data derived from the Horiba and Malvern analyzers. Single-variable models including the exponential, Khrgian-Mazin, and Chen's empirical models provided relatively poor approximations and, thus, were not recommended for future investigations. A further examination of model-predicted PSD parameters revealed that even the best-fit model of the six could significantly misestimate mean diameter, median diameter, and variance. However, compared with other models, the best-fit model still offered the relatively best estimates of mean and median diameters, whereas the best predicted variances were given by the gamma distribution model. PMID:22788111
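
An AIC-based comparison of candidate distributions, as used in the study, can be sketched for two of the simpler candidates: fit each model by maximum likelihood and prefer the one with the lower AIC = 2k - 2 ln L. This is a generic sketch, not the study's six-model pipeline; the synthetic data and names are illustrative.

```python
import math, random

def aic_lognormal(x):
    """AIC for a lognormal fit (MLE of mu, sigma on log-data; k = 2)."""
    logs = [math.log(v) for v in x]
    n = len(x)
    mu = sum(logs) / n
    var = sum((l - mu) ** 2 for l in logs) / n
    ll = sum(-math.log(v * math.sqrt(2 * math.pi * var))
             - (math.log(v) - mu) ** 2 / (2 * var) for v in x)
    return 2 * 2 - 2 * ll

def aic_exponential(x):
    """AIC for an exponential fit (MLE rate = 1/mean; k = 1)."""
    n = len(x)
    rate = n / sum(x)
    ll = n * math.log(rate) - rate * sum(x)
    return 2 * 1 - 2 * ll

# Lognormal-looking sizes should yield a lower (better) lognormal AIC.
rng = random.Random(42)
data = [math.exp(rng.gauss(1.0, 0.3)) for _ in range(2000)]
print(aic_lognormal(data) < aic_exponential(data))
```

The same scaffolding extends to gamma or Weibull candidates once their maximum-likelihood fits are available.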

Yang, Xufei; Lee, Jongmin; Barker, Douglas E; Wang, Xinlei; Zhang, Yuanhui

2012-06-01

409

The effect of solar radiation on thermal distribution in thin high arch dams is investigated. The differential equation governing thermal behavior of mass concrete in three-dimensional space is solved applying appropriate boundary conditions. Solar radiation is implemented considering the dam face direction relative to the sun, the slope relative to the horizon, the region cloud cover, and the surrounding topography. It has been observed that solar radiation changes the surface temperature drastically and leads to nonuniform temperature distribution. Solar radiation effects should be considered in thermal transient analysis of thin arch dams. PMID:24695817

Mirzabozorg, H; Hariri-Ardebili, M A; Shirkhan, M; Seyed-Kolbadi, S M

2014-01-01

410

Analysis of large-scale distributed knowledge sources via autonomous cooperative graph mining

NASA Astrophysics Data System (ADS)

In this paper, we present a model for processing distributed relational data across multiple autonomous heterogeneous computing resources in environments with limited control, resource failures, and communication bottlenecks. Our model exploits dependencies in the data to enable collaborative distributed querying in noisy data. The collaboration policy for computational resources is efficiently constructed from the belief propagation algorithm. To scale to large data sizes, we employ a combination of priority-based filtering, incremental processing, and communication compression techniques. Our solution achieved high accuracy of analysis results and orders of magnitude improvements in computation time compared to the centralized graph matching solution.

Levchuk, Georgiy; Ortiz, Andres; Yan, Xifeng

2014-05-01


412

NASA Astrophysics Data System (ADS)

Statistical and fractal properties of the spatial distribution of earthquakes in the central zone of Chile are studied. In particular, data are shown to behave according to the well-known Gutenberg-Richter law. The fractal structure is evident for epicenters, not for hypocenters. The multifractal spectrum is also determined, both for the spatial distribution of epicenters and hypocenters. For negative values of the index of multifractal measure q, the multifractal spectrum, which usually cannot be reliably found from data, is calculated from a generalized Cantor-set model, which fits the multifractal spectrum for q>0, a technique which has been previously applied for analysis of solar wind data.
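
The Gutenberg-Richter law referenced above states that event counts fall off exponentially with magnitude, log10 N(>=M) = a - bM, and the b-value has a standard maximum-likelihood estimator (Aki's formula). A minimal sketch with an illustrative function name:

```python
import math

def b_value_mle(magnitudes, m_c):
    """Aki's maximum-likelihood estimate of the Gutenberg-Richter
    b-value, using only events at or above the completeness
    magnitude m_c: b = log10(e) / (mean(M) - m_c)."""
    above = [m for m in magnitudes if m >= m_c]
    mean_m = sum(above) / len(above)
    return math.log10(math.e) / (mean_m - m_c)
```

A catalog whose mean magnitude above the completeness threshold exceeds that threshold by about 0.43 therefore has b close to 1, the value typically seen in regional seismicity.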

Pastén, Denisse; Muñoz, Víctor; Cisternas, Armando; Rogan, José; Valdivia, Juan Alejandro

2011-12-01

413

Empirical analysis of the rank distribution of relevant documents in web search

Holte, Robert

414

Evaluating Domestic Hot Water Distribution System Options With Validated Analysis Models

A developing body of work is forming that collects data on domestic hot water consumption, water use behaviors, and energy efficiency of various distribution systems. A full distribution system developed in TRNSYS has been validated using field monitoring data and then exercised in a number of climates to understand climate impact on performance. This study builds upon previous analysis modelling work to evaluate differing distribution systems and the sensitivities of water heating energy and water use efficiency to variations of climate, load, distribution type, insulation and compact plumbing practices. Overall 124 different TRNSYS models were simulated. Of the configurations evaluated, distribution losses account for 13-29% of the total water heating energy use and water use efficiency ranges from 11-22%. The base case, an uninsulated trunk and branch system, sees the most improvement in energy consumption by insulating and locating the water heater central to all fixtures. Demand recirculation systems are not projected to provide significant energy savings and in some cases increase energy consumption. Water use is most efficient with demand recirculation systems, followed by the insulated trunk and branch system with a central water heater. Compact plumbing practices and insulation have the most impact on energy consumption (2-6% for insulation and 3-4% per 10 gallons of enclosed volume reduced). The results of this work are useful in informing future development of water heating best practices guides as well as more accurate (and simulation-time-efficient) distribution models for annual whole house simulation programs.

Weitzel, E.; Hoeschele, M.

2014-09-01

415

Strain analysis from objects with a random distribution: A generalized center-to-center method

NASA Astrophysics Data System (ADS)

Existing methods of strain analysis such as the center-to-center method and the Fry method estimate strain from the spatial relationship between point objects in the deformed state. They assume a truncated Poisson distribution of point objects in the pre-deformed state. Significant deviations occur in nature and diffuse the central vacancy in a Fry plot, limiting its effectiveness as a strain gauge. Therefore, a generalized center-to-center method is proposed to deal with point objects with the more general Poisson distribution, where the method outcomes do not depend on an analysis of a graphical central vacancy. This new method relies upon the probability mass function for the Poisson distribution, and adopts the maximum likelihood function method to solve for strain. The feasibility of the method is demonstrated by applying it to artificial data sets generated for known strains. Further analysis of these sets by use of the bootstrap method shows that the accuracy of the strain estimate has a strong tendency to increase either with point number or with the inclusion of more pre-deformation nearest neighbors. A poorly sorted, well packed, deformed conglomerate is analyzed, yielding a strain estimate similar to the vector mean of the major axis directions of pebbles and the harmonic mean of their axial ratios from a shape-based strain determination method. These outcomes support the applicability of the new method to the analysis of deformed rocks with appropriate strain markers.
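
The classical Fry construction that the paper generalizes is simple to state: plot every pairwise displacement vector between object centers, and read the strain ellipse from the shape of the central vacancy. A minimal sketch of that construction (the paper's contribution replaces the graphical reading with a maximum-likelihood fit; names are illustrative):

```python
def fry_points(centers):
    """All ordered pairwise displacement vectors between object
    centers. In a classical Fry plot these vectors scatter around a
    central vacancy whose elliptical shape records the finite strain."""
    return [(xj - xi, yj - yi)
            for i, (xi, yi) in enumerate(centers)
            for j, (xj, yj) in enumerate(centers) if i != j]
```

For n centers this yields n(n-1) vectors; when the pre-deformation point process departs from the truncated Poisson assumption, the vacancy blurs, which is exactly the failure mode motivating the generalized method.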

Shan, Yehua; Liang, Xinquan

2014-03-01

416

Gaussian distributions on Lie groups and their application to statistical shape analysis.

The Gaussian distribution is the basis for many methods used in the statistical analysis of shape. One such method is principal component analysis, which has proven to be a powerful technique for describing the geometric variability of a population of objects. The Gaussian framework is well understood when the data being studied are elements of a Euclidean vector space. This is the case for geometric objects that are described by landmarks or dense collections of boundary points. We have been using medial representations, or m-reps, for modelling the geometry of anatomical objects. The medial parameters are not elements of a Euclidean space, and thus standard PCA is not applicable. In our previous work we have shown that the m-rep model parameters are instead elements of a Lie group. In this paper we develop the notion of a Gaussian distribution on this Lie group. We then derive the maximum likelihood estimates of the mean and the covariance of this distribution. Analogous to principal component analysis of covariance in Euclidean spaces, we define principal geodesic analysis on Lie groups for the study of anatomical variability in medially-defined objects. Results of applying this framework on a population of hippocampi in a schizophrenia study are presented. PMID:15344479

Fletcher, P Thomas; Joshi, Sarang; Lu, Conglin; Pizer, Stephen

2003-07-01

417

NASA Technical Reports Server (NTRS)

Robot coordination and control systems for remote teleoperation applications are by necessity implemented on distributed computers. Modeling and performance analysis of these distributed robotic systems is difficult, but important for economic system design. Performance analysis methods originally developed for conventional distributed computer systems are often unsatisfactory for evaluating real-time systems. The paper introduces a formal model of distributed robotic control systems; and a performance analysis method, based on scheduling theory, which can handle concurrent hard-real-time response specifications. Use of the method is illustrated by a case of remote teleoperation which assesses the effect of communication delays and the allocation of robot control functions on control system hardware requirements.

Lefebvre, D. R.; Sanderson, A. C.

1994-01-01

418

MIMO Radar Performance Analysis Under K-Distributed Clutter

Addresses the resolution limit in a MIMO radar context, i.e., the minimum angular separation required to resolve two closely-spaced targets, under K-distributed clutter. Keywords: K-distributed clutter, performance analysis, Cramér-Rao bound, resolution limit, MIMO radar.

Xin Zhang; Mohammed Nabil El Korso

419

The electric generating capacity of Turkey must be tripled by 2010 to meet Turkey’s electric power consumption, if the annual 8% growth in electric power consumption continues. Turkey has to make use of its renewable energy resources, such as wind and solar, not only to meet the increasing energy demand, but also for environmental reasons. Studies show that Iskenderun (36°35′N;

Ali Naci Celik

2004-01-01

420

NASA Astrophysics Data System (ADS)

In this study, we have investigated the real-time decoding feasibility of magnetic micro-barcodes in a microfluidic channel by using numerical analysis of the magnetic field distribution of the micro-barcodes. A vector potential model based on a molecular current has been used to obtain the magnetic stray field distribution of the ferromagnetic bars which constitute the micro-barcodes. It reveals that the stray field distribution of the micro-barcodes strongly depends on the geometry of the ferromagnetic bar. Interestingly, we have found that one can avoid the miniaturization process of a magnetic sensor device, otherwise needed to increase the sensitivity, by optimizing the geometry of the micro-barcodes. We also estimate the magnetic sensor response as a function of the flying height and lateral misalignment of the micro-barcodes over the sensor position, and find that control of the flying height is a crucial factor in enhancing the detection sensitivity and reproducibility of the magnetic sensor signal in suspension assay technology.

Thanh Son, Vo; Anandakumar, S.; Kim, CheolGi; Jeong, Jong-Ruyl

2011-12-01

421

Principal Effects of Axial Load on Moment-Distribution Analysis of Rigid Structures

NASA Technical Reports Server (NTRS)

This thesis presents the method of moment distribution modified to include the effect of axial load upon the bending moments. This modification makes it possible to analyze accurately complex structures, such as rigid fuselage trusses, that heretofore had to be analyzed by approximate formulas and empirical rules. The method is simple enough to be practicable even for complex structures, and it gives a means of analysis for continuous beams that is simpler than the extended three-moment equation now in common use. When the effect of axial load is included, it is found that the basic principles of moment distribution remain unchanged, the only difference being that the factors used, instead of being constants for a given member, become functions of the axial load. Formulas have been developed for these factors, and curves plotted so that their application requires no more work than moment distribution without axial load. Simple problems have been included to illustrate the use of the curves.
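
The unmodified moment distribution cycle that the thesis extends is: compute fixed-end moments, release each interior joint, distribute the unbalanced moment in proportion to member stiffnesses, and carry half of each balancing moment to the far end. A sketch for a beam fixed at A and C and continuous over joint B, with constant stiffness and carry-over factors (the thesis's point is that under axial load these become functions of the axial load; all names and the load case are illustrative):

```python
def moment_distribution_two_span(fem_ab, fem_ba, fem_bc, fem_cb,
                                 k_ab, k_bc, tol=1e-9):
    """Moment distribution for a beam fixed at A and C, continuous
    over joint B. Only joint B is released and balanced; half of each
    balancing moment is carried over to the fixed far ends."""
    df_ab = k_ab / (k_ab + k_bc)          # distribution factors at B
    df_bc = k_bc / (k_ab + k_bc)
    m_ab, m_ba, m_bc, m_cb = fem_ab, fem_ba, fem_bc, fem_cb
    while abs(m_ba + m_bc) > tol:         # balance joint B until closed
        unbalanced = m_ba + m_bc
        m_ba -= df_ab * unbalanced
        m_bc -= df_bc * unbalanced
        m_ab -= 0.5 * df_ab * unbalanced  # carry-over to fixed end A
        m_cb -= 0.5 * df_bc * unbalanced  # carry-over to fixed end C
    return m_ab, m_ba, m_bc, m_cb

# Equal spans (equal stiffness), uniform load w over span AB only.
w, L = 1.0, 1.0
m = moment_distribution_two_span(-w * L**2 / 12, w * L**2 / 12,
                                 0.0, 0.0, 1.0, 1.0)
print(m)
```

For this load case the iteration closes in a single balancing cycle, reproducing the textbook end moments M_BA = wL^2/24 and M_AB = -5wL^2/48. In the thesis's extension, the stiffness factors k and the carry-over factor 1/2 are replaced by the tabulated functions of axial load, and the cycle proceeds unchanged.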

James, Benjamin Wylie

1935-01-01

422

NASA Astrophysics Data System (ADS)

Distributed innovation processes are considered as a new option to handle both the complexity and the speed with which new products and services need to be prepared. Indeed, most research on innovation processes has focused on multinational companies with an intra-organisational perspective; the phenomena of innovation processes in networks - with an inter-organisational perspective - have been almost neglected. Collaborative networks present a perfect playground for such distributed innovation processes, and the authors highlight Virtual Organisations in particular because of their dynamic behaviour. Research activities supporting distributed innovation processes in VOs are rather new, so that little knowledge about the management of such research is available. With the presentation of the collaborative network relationship analysis this gap is addressed. It is shown that a qualitative planning of collaboration intensities can support real business cases by providing knowledge and planning data.

Eschenbächer, Jens; Seifert, Marcus; Thoben, Klaus-Dieter

423

The analysis of counting and catching errors of both catching and non-catching types of rain intensity gauges was recently possible over a wide variety of measuring principles and instrument design solutions, based on the work performed during the recent Field Intercomparison of Rainfall Intensity Gauges promoted by World Meteorological Organization (WMO). The analysis reported here concerns the assessment of accuracy and precision of various types of instruments based on extensive calibration tests performed in the laboratory during the first phase of this WMO Intercomparison. The non-parametric analysis of relative errors allowed us to conclude that the accuracy of the investigated RI gauges is generally high, after assuming that it should be at least contained within the limits set forth by WMO in this respect. The measuring principle exploited by the instrument is generally not very decisive in obtaining such good results in the laboratory. Rather, the attention paid by the manufacturer to suitably accounting and correcting for systematic errors and time-constant related effects was demonstrated to be influential. The analysis of precision showed that the observed frequency distribution of relative errors around their mean value is not indicative of an underlying Gaussian population, being much more peaked in most cases than can be expected from samples extracted from a Gaussian distribution. The analysis of variance (one-way ANOVA), assuming the instrument model as the only potentially affecting factor, does not confirm the hypothesis of a single common underlying distribution for all instruments. Pair-wise multiple comparison analysis revealed cases in which significant differences could be observed. PMID:22546787

Lanza, L G; Stagi, L

2012-01-01

424

Development of a Web Service for Analysis in a Distributed Network

Objective: We describe functional specifications and practicalities in the software development process for a web service that allows the construction of the multivariate logistic regression model, Grid Logistic Regression (GLORE), by aggregating partial estimates from distributed sites, with no exchange of patient-level data. Background: We recently developed and published a web service for model construction and data analysis in a distributed environment. This recent paper provided an overview of the system that is useful for users, but included very few details that are relevant for biomedical informatics developers or network security personnel who may be interested in implementing this or similar systems. We focus here on how the system was conceived and implemented. Methods: We followed a two-stage development approach by first implementing the backbone system and incrementally improving the user experience through interactions with potential users during the development. Our system went through various stages such as concept proof, algorithm validation, user interface development, and system testing. We used the Zoho Project management system to track tasks and milestones. We leveraged Google Code and Apache Subversion to share code among team members, and developed an applet-servlet architecture to support the cross platform deployment. Discussion: During the development process, we encountered challenges such as Information Technology (IT) infrastructure gaps and limited team experience in user-interface design. We figured out solutions as well as enabling factors to support the translation of an innovative privacy-preserving, distributed modeling technology into a working prototype. 
Conclusion: Using GLORE (a distributed model that we developed earlier) as a pilot example, we demonstrated the feasibility of building and integrating distributed modeling technology into a usable framework that can support privacy-preserving, distributed data analysis among researchers at geographically dispersed institutes.
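
The core of a GLORE-style fit is that each site transmits only its aggregate gradient and Hessian of the logistic log-likelihood; the server sums these and takes a Newton step, so patient-level records never leave a site and the result matches the pooled fit. A minimal single-coefficient sketch (the real system handles full multivariate models and a web-service transport; all names and data are illustrative):

```python
import math

def site_contribs(beta, data):
    """Per-site gradient and Hessian of the logistic log-likelihood
    for a single-coefficient model p = sigmoid(beta * x)."""
    g = h = 0.0
    for x, y in data:
        p = 1.0 / (1.0 + math.exp(-beta * x))
        g += (y - p) * x
        h += p * (1.0 - p) * x * x
    return g, h

def distributed_newton(sites, beta=0.0, iters=25):
    """Server-side aggregation: sum the sites' partial gradients and
    Hessians, then take a Newton-Raphson step. No record-level data
    is exchanged."""
    for _ in range(iters):
        parts = [site_contribs(beta, d) for d in sites]
        g = sum(p[0] for p in parts)
        h = sum(p[1] for p in parts)
        beta += g / h
    return beta

# The distributed fit agrees with the pooled fit on the combined data.
s1 = [(1.0, 1), (2.0, 1), (-1.0, 0)]
s2 = [(0.5, 0), (-2.0, 0), (-0.5, 1)]
pooled = distributed_newton([s1 + s2])
split = distributed_newton([s1, s2])
print(pooled, split)
```

Because summation of per-site sums equals the sum over the pooled data, the two estimates coincide up to floating-point ordering, which is the exactness property that distinguishes GLORE from meta-analysis of separately fitted models.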

Jiang, Xiaoqian; Wu, Yuan; Marsolo, Keith; Ohno-Machado, Lucila

2014-01-01

425

Site-energy distribution analysis of organic chemical sorption by soil organic matter

Sorption of several hydrophobic organic compounds by selected soils, and their humic substance fractions as well, was examined using batch equilibration methods. The results could not be explained by the well known partitioning mechanism alone, but were consistent with the dual-mode sorption model for soil organic matter (SOM) in which both solid-phase dissolution and hole-filling mechanisms take place. The heterogeneous nature of the natural sorbents was demonstrated by site-energy distributions derived from the common Freundlich model. The site-energy distribution analysis is useful for examining and understanding the energetic characteristics of a sorbent. This analysis lends further support for the dual-mode model of sorption to SOM.
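
The Freundlich model underlying the site-energy distribution analysis is the power-law isotherm q = K C^n, routinely fitted by linear regression in log-log space. A minimal sketch (function name is illustrative):

```python
import math

def fit_freundlich(c, q):
    """Least-squares fit of the Freundlich isotherm q = K * C**n
    in log-log space: log q = log K + n log C."""
    xs = [math.log(v) for v in c]
    ys = [math.log(v) for v in q]
    n_pts = len(xs)
    mx, my = sum(xs) / n_pts, sum(ys) / n_pts
    n = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    k = math.exp(my - n * mx)
    return k, n
```

The site-energy distribution is then derived from the fitted parameters; a Freundlich exponent n well below 1 signals the energetic heterogeneity that the dual-mode sorption model attributes to hole-filling sites.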

Yuan, G.; Xing, B.

1999-07-01

426

Passive-scheme analysis for solving untrusted source problem in quantum key distribution

As a practical method, the passive scheme is useful for monitoring the photon statistics of an untrusted source in a "Plug & Play" quantum key distribution (QKD) system. In a passive scheme, three kinds of monitor mode can be adopted: the average photon number (APN) monitor, the photon number analyzer (PNA), and the photon number distribution (PND) monitor. In this paper, the security analysis is rigorously given for the APN monitor, while for the PNA the analysis, including statistical fluctuation and random noise, is addressed with a confidence level. The results show that the PNA can achieve better performance than the APN monitor and can asymptotically approach the theoretical limit of the PND monitor. Also, the passive scheme with the PNA works efficiently when the signal-to-noise ratio ($R^{SN}$) is not too low, and so is highly applicable for solving the untrusted source problem in the QKD system.

Xiang Peng; Bingjie Xu; Hong Guo

2010-04-28

427

Risk analysis of highly combustible gas storage, supply, and distribution systems in PWR plants

This report presents the evaluation of the potential safety concerns for pressurized water reactors (PWRs) identified in Generic Safety Issue 106, Piping and the Use of Highly Combustible Gases in Vital Areas. A Westinghouse four-loop PWR plant was analyzed for the risk due to the use of combustible gases (predominantly hydrogen) within the plant. The analysis evaluated an actual hydrogen distribution configuration and conducted several sensitivity studies to determine the potential variability among PWRs. The sensitivity studies were based on hydrogen and safety-related equipment configurations observed at other PWRs within the United States. Several options for improving the hydrogen distribution system design were identified and evaluated for their effect on risk and core damage frequency. A cost/benefit analysis was performed to determine whether alternatives considered were justifiable based on the safety improvement and economics of each possible improvement.

Simion, G.P. [Science Applications International Corp., Albuquerque, NM (United States); VanHorn, R.L.; Smith, C.L.; Bickel, J.H.; Sattison, M.B. [EG and G Idaho, Inc., Idaho Falls, ID (United States); Bulmahn, K.D. [SCIENTECH, Inc., Idaho Falls, ID (United States)

1993-06-01

428

Impact of hadronic and nuclear corrections on global analysis of spin-dependent parton distributions

We present the first results of a new global next-to-leading order analysis of spin-dependent parton distribution functions from the most recent world data on inclusive polarized deep-inelastic scattering, focusing in particular on the large-x and low-Q^2 regions. By directly fitting polarization asymmetries we eliminate biases introduced by using polarized structure function data extracted under nonuniform assumptions for the unpolarized structure functions. For analysis of the large-x data we implement nuclear smearing corrections for deuterium and 3He nuclei, and systematically include target mass and higher twist corrections to the g_1 and g_2 structure functions at low Q^2. We also explore the effects of Q^2 and W^2 cuts in the data sets, and the potential impact of future data on the behavior of the spin-dependent parton distributions at large x.

Jimenez-Delgado, Pedro [Thomas Jefferson National Accelerator Facility, Newport News, VA (United States); Accardi, Alberto [Hampton University, Hampton, VA (United States); Thomas Jefferson National Accelerator Facility, Newport News, VA (United States); Melnitchouk, Wally [Thomas Jefferson National Accelerator Facility, Newport News, VA (United States)

2014-02-01

429

Medical terminologies and ontologies are important tools for natural language processing of health record narratives. To account for the variability of language use, synonyms need to be stored in a semantic resource as textual instantiations of a concept. Developing such resources manually is, however, prohibitively expensive and likely to result in low coverage. To facilitate and expedite the process of lexical resource development, distributional analysis of large corpora provides a powerful data-driven means of (semi-)automatically identifying semantic relations, including synonymy, between terms. In this paper, we demonstrate how distributional analysis of a large corpus of electronic health records – the MIMIC-II database – can be employed to extract synonyms of SNOMED CT preferred terms. A distinctive feature of our method is its ability to identify synonymous relations between terms of varying length. PMID:24551362
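The distributional method this abstract describes can be sketched in a few lines: each term is represented by a vector of co-occurrence counts within a context window, and cosine similarity between vectors ranks candidate synonyms. The toy corpus and window size below are illustrative assumptions, not the paper's actual pipeline, which applies the same idea at scale to MIMIC-II notes and SNOMED CT terms.

```python
# Minimal sketch of distributional similarity for candidate-synonym ranking:
# terms that occur in similar contexts acquire similar co-occurrence vectors.
from collections import defaultdict
import math

# Tiny illustrative corpus (assumed for the sketch, not from MIMIC-II).
corpus = [
    "patient denies chest pain on exertion",
    "patient reports chest discomfort on exertion",
    "no fever or chills reported",
]

window = 2  # symmetric context window size (assumption)
vectors = defaultdict(lambda: defaultdict(int))
for sent in corpus:
    toks = sent.split()
    for i, t in enumerate(toks):
        for j in range(max(0, i - window), min(len(toks), i + window + 1)):
            if i != j:
                vectors[t][toks[j]] += 1  # count each context word

def cosine(u, v):
    # Cosine similarity between two sparse count vectors.
    shared = set(u) & set(v)
    dot = sum(u[k] * v[k] for k in shared)
    nu = math.sqrt(sum(x * x for x in u.values()))
    nv = math.sqrt(sum(x * x for x in v.values()))
    return dot / (nu * nv) if nu and nv else 0.0

# "pain" and "discomfort" share the contexts "chest ... on exertion".
sim = cosine(vectors["pain"], vectors["discomfort"])
```

In practice, raw counts are usually reweighted and dimensionality-reduced before comparison, which is what makes corpus-scale application tractable.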

Henriksson, Aron; Conway, Mike; Duneld, Martin; Chapman, Wendy W.

2013-01-01

430

A new parallel-vector finite element analysis software on distributed-memory computers

NASA Technical Reports Server (NTRS)

A new parallel-vector finite element analysis software package, MPFEA (Massively Parallel-vector Finite Element Analysis), is developed for large-scale structural analysis on massively parallel computers with distributed memory. MPFEA is designed for parallel generation and assembly of the global finite element stiffness matrices as well as parallel solution of the simultaneous linear equations, since these are often the major time-consuming parts of a finite element analysis. A block-skyline storage scheme, along with vector-unrolling techniques, is used to enhance vector performance. Communications among processors are carried out concurrently with arithmetic operations to reduce the total execution time. Numerical results on Intel iPSC/860 computers (such as the Intel Gamma with 128 processors and the Intel Touchstone Delta with 512 processors) are presented, including an aircraft structure and some very large truss structures, to demonstrate the efficiency and accuracy of MPFEA.

Qin, Jiangning; Nguyen, Duc T.

1993-01-01

431

The increasingly complex research questions addressed by neuroimaging research impose substantial demands on computational infrastructures. These infrastructures need to support management of massive amounts of data in a way that affords rapid and precise data analysis, allows collaborative research, and achieves these aims securely and with minimum management overhead. Here we present an approach that overcomes many current limitations in data analysis and data sharing. This approach is based on open source database management systems that support complex data queries as an integral part of data analysis, flexible data sharing, and parallel and distributed data processing using cluster computing and Grid computing resources. We assess the strengths of these approaches as compared to current frameworks based on storage of binary or text files. We then describe in detail the implementation of such a system and provide a concrete description of how it was used to enable a complex analysis of fMRI time series data. PMID:17964812

Hasson, Uri; Skipper, Jeremy I.; Wilde, Michael J.; Nusbaum, Howard C.; Small, Steven L.

2007-01-01

432

Many forms of security analysis on large scale applications can be substantially automated but the size and complexity can exceed the time and memory available on conventional desktop computers. Most commercial tools are understandably focused on such conventional desktop resources. This paper presents research work on the parallelization of security analysis of both source code and binaries within our Compass tool, which is implemented using the ROSE source-to-source open compiler infrastructure. We have focused on both shared and distributed memory parallelization of the evaluation of rules implemented as checkers for a wide range of secure programming rules, applicable to desktop machines, networks of workstations and dedicated clusters. While Compass as a tool focuses on source code analysis and reports violations of an extensible set of rules, the binary analysis work uses the exact same infrastructure but is less well developed into an equivalent final tool.

Quinlan, D; Barany, G; Panas, T

2007-08-30

433

Recovering the spectral distribution of the illumination from spectral data by highlight analysis

NASA Astrophysics Data System (ADS)

The problem of color constancy, i.e. discounting the illumination color to obtain the apparent color of an object, has been the topic of much research in computer vision. By assuming neutral interface reflection and dichromatic reflection with highlights (i.e. highlights have the same color as the illuminant), various methods have been proposed that aim at recovering the illuminant color from color highlight analysis. In general, these methods are based on three color stimuli to approximate color. In this contribution, we estimate the spectral distribution from surface reflection using spectral information obtained by a spectrograph. The imaging spectrograph provides, at each pixel, a spectral range covering the visible wavelengths. Our method differs from existing methods by using a robust clustering technique to obtain the body and surface components in a multi-spectral space. These components determine the direction of the illumination spectral color. Then, we recover the illumination spectral power distribution by using principal component analysis for all wavelengths. To obtain the most reliable estimate of the spectral power distribution of the illuminant, all possible combinations of wavelengths are used to generate the optimal averaged estimation of the spectral power distribution of the scene illuminant. Our method is restricted to images containing a substantial amount of body reflection and highlights.

Stokman, Harro M.; Gevers, Theo

1999-09-01

434

Single-phase power distribution system power flow and fault analysis

NASA Technical Reports Server (NTRS)

Alternative methods for power flow and fault analysis of single-phase distribution systems are presented. The algorithms for both power flow and fault analysis utilize a generalized approach to network modeling. The generalized admittance matrix, formed using elements of linear graph theory, is an accurate network model for all possible single-phase network configurations. Unlike the standard nodal admittance matrix formulation algorithms, the generalized approach uses generalized component models for the transmission line and transformer. The standard assumption of a common node voltage reference point is not required to construct the generalized admittance matrix. Therefore, truly accurate simulation results can be obtained for networks that cannot be modeled using traditional techniques.
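The nodal admittance formulation that the paper's generalized approach extends can be sketched as follows: each branch admittance is "stamped" into a matrix Y, and the node voltages follow from solving Y V = I. This standard textbook construction (with node 0 as the common reference that the paper avoids assuming) uses illustrative values, not the paper's generalized component models for lines and transformers.

```python
# Sketch of standard nodal admittance analysis for a small single-phase
# network. Branch values are illustrative assumptions.
import numpy as np

# Branches: (from_node, to_node, admittance in siemens); node 0 is the reference.
branches = [(0, 1, 10.0), (1, 2, 5.0), (2, 0, 2.0)]
n = 3
Y = np.zeros((n, n))
for a, b, y in branches:
    # Standard admittance-matrix "stamp" for a two-terminal branch.
    Y[a, a] += y
    Y[b, b] += y
    Y[a, b] -= y
    Y[b, a] -= y

# Inject 1 A at node 1 (returning at node 0); ground node 0 by deleting
# its row and column, then solve the reduced system for node voltages.
I = np.array([0.0, 1.0, 0.0])
V = np.zeros(n)
V[1:] = np.linalg.solve(Y[1:, 1:], I[1:])
```

The reduced system here is [[15, -5], [-5, 7]] with injection [1, 0], giving V1 = 7/80 and V2 = 5/80 volts.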

Halpin, S. M.; Grigsby, L. L.

1992-01-01

435

Some physics and system issues in the security analysis of quantum key distribution protocols

NASA Astrophysics Data System (ADS)

In this paper, we review a number of issues on the security of quantum key distribution (QKD) protocols that bear directly on the relevant physics or mathematical representation of the QKD cryptosystem. It is shown that the cryptosystem representation itself may miss out many possible attacks, which are not accounted for in the security analysis and proofs. Hence, the final security claims drawn from such analysis are not reliable, apart from foundational issues about the security criteria that are discussed elsewhere. The cases of continuous-variable QKD and multi-photon sources are elaborated upon.

Yuen, Horace P.

2014-10-01

436

This paper describes the implementation of the instability analysis of wave growth on the liquid jet surface, and the maximum entropy principle (MEP) for prediction of the droplet diameter distribution in the primary breakup region. The early stage of the primary breakup, which contains the growth of waves on the liquid–gas interface, is deterministic; whereas the droplet formation stage at the end of primary breakup

E. Movahednejad; F. Ommi; S. M. Hosseinalipour; C. P. Chen; S. A. Mahdavi

437

CobWeb: a proactive analysis-driven approach to content distribution

CobWeb is an open-access content distribution network (CDN) that provides low latency lookups, resilience to flash crowds, and optimal utilization of network resources. Unlike traditional Web caches and CDNs, which rely on ad hoc heuristics for replica placement and cache management, CobWeb achieves superior performance through a unique analysis driven approach. CobWeb derives the optimal replica placement strategy by posing

Yee Jiun Song; Venugopalan Ramasubramanian; Emin Gün Sirer

2005-01-01

438

To identify genetic loci influencing central obesity and fat distribution, we performed a meta-analysis of 16 genome-wide association studies (GWAS, N = 38,580) informative for adult waist circumference (WC) and waist-hip ratio (WHR). We selected 26 SNPs for follow-up, for which the evidence of association with measures of central adiposity (WC and/or WHR) was strong and disproportionate to that for

Cecilia M. Lindgren; Iris M. Heid; Joshua C. Randall; Claudia Lamina; Suzannah J. Bumpstead; Stephen J. Chanock; Lynn Cherkas; Cyrus Cooper; Angela Doering; Anna Dominiczak; Alex S. F. Doney; Paul Elliott; Michael R. Erdos; Karol Estrada; Luigi Ferrucci; Guido Fischer; Nita G. Forouhi; Christian Gieger; Harald Grallert; Christopher J. Groves; Scott Grundy; David Hadley; Aki S. Havulinna; Albert Hofman; Rolf Holle; John W. Holloway; Thomas Illig; Bo Isomaa; Leonie C. Jacobs; Karen Jameson; Pekka Jousilahti; Johanna Kuusisto; G. Mark Lathrop; Debbie A. Lawlor; Massimo Mangino; Wendy L. McArdle; Thomas Meitinger; Mario A. Morken; Andrew P. Morris; Patricia Munroe; Anna Nordstrom; Peter Nordstrom; Ben A. Oostra; Colin N. A. Palmer; John F. Peden; Inga Prokopenko; Frida Renstrom; Aimo Ruokonen; Manjinder S. Sandhu; Laura J. Scott; Angelo Scuteri; Heather M. Stringham; Amy J. Swift; Manuela Uda; Peter Vollenweider; Gerard Waeber; Chris Wallace; G. Bragi Walters; Michael N. Weedon; Jacqueline C. M. Witteman; Cuilin Zhang; Weihua Zhang; Mark J. Caulfield

2009-01-01

439

To identify genetic loci influencing central obesity and fat distribution, we performed a meta-analysis of 16 genome-wide association studies (GWAS, N = 38,580) informative for adult waist circumference (WC) and waist–hip ratio (WHR). We selected 26 SNPs for follow-up, for which the evidence of association with measures of central adiposity (WC and/or WHR) was strong and disproportionate to that for

Cecilia M. Lindgren; Iris M. Heid; Joshua C. Randall; Claudia Lamina; Valgerdur Steinthorsdottir; Lu Qi; Elizabeth K. Speliotes; Gudmar Thorleifsson; Cristen J. Willer; Blanca M. Herrera; Anne U. Jackson; Noha Lim; Paul Scheet; Nicole Soranzo; Najaf Amin; Yurii S. Aulchenko; John C. Chambers; Alexander Drong; Jianan Luan; Helen N. Lyon; Fernando Rivadeneira; Serena Sanna; Nicholas J. Timpson; M. Carola Zillikens; Jing Hua Zhao; Peter Almgren; Stefania Bandinelli; Amanda J. Bennett; Richard N. Bergman; Lori L. Bonnycastle; Suzannah J. Bumpstead; Stephen J. Chanock; Lynn Cherkas; Peter Chines; Lachlan Coin; Cyrus Cooper; Gabriel Crawford; Angela Doering; Anna Dominiczak; Alex S. F. Doney; Shah Ebrahim; Paul Elliott; Michael R. Erdos; Karol Estrada; Luigi Ferrucci; Guido Fischer; Nita G. Forouhi; Christian Gieger; Harald Grallert; Christopher J. Groves; Scott Grundy; Candace Guiducci; David Hadley; Anders Hamsten; Aki S. Havulinna; Albert Hofman; Rolf Holle; John W. Holloway; Thomas Illig; Bo Isomaa; Leonie C. Jacobs; Karen Jameson; Pekka Jousilahti; Fredrik Karpe; Johanna Kuusisto; Jaana Laitinen; G. Mark Lathrop; Debbie A. Lawlor; Massimo Mangino; Wendy L. McArdle; Thomas Meitinger; Mario A. Morken; Andrew P. Morris; Patricia Munroe; Narisu Narisu; Anna Nordström; Peter Nordström; Ben A. Oostra; Colin N. A. Palmer; Felicity Payne; John F. Peden; Inga Prokopenko; Frida Renström; Aimo Ruokonen; Veikko Salomaa; Manjinder S. Sandhu; Laura J. Scott; Angelo Scuteri; Kaisa Silander; Kijoung Song; Xin Yuan; Heather M. Stringham; Amy J. Swift; Tiinamaija Tuomi; Manuela Uda; Peter Vollenweider; Gerard Waeber; Chris Wallace; G. Bragi Walters; Michael N. Weedon; Jacqueline C. M. Witteman; Cuilin Zhang; Weihua Zhang; Mark J. Caulfield; Francis S. Collins; George Davey Smith; Ian N. M. Day; Paul W. Franks; Andrew T. Hattersley; Frank B. Hu; Marjo-Riitta Jarvelin; Augustine Kong; Jaspal S. Kooner; Markku Laakso; Edward Lakatta; Vincent Mooser; Andrew D. 
Morris; Leena Peltonen; Nilesh J. Samani; Timothy D. Spector; David P. Strachan; Toshiko Tanaka; Jaakko Tuomilehto; André G. Uitterlinden; Cornelia M. van Duijn; Nicholas J. Wareham; Hugh Watkins for the PROCARDIS consortia; Dawn M. Waterworth; Michael Boehnke; Panos Deloukas; Leif Groop; David J. Hunter; Unnur Thorsteinsdottir; David Schlessinger; H.-Erich Wichmann; Timothy M. Frayling; Gonçalo R. Abecasis; Joel N. Hirschhorn; Ruth J. F. Loos; Kari Stefansson; Karen L. Mohlke; Inês Barroso

2009-01-01

440

Distribution Trees' Analysis of PULSE, an Unstructured P2P Live Streaming System

Distribution Trees' Analysis of PULSE, an Unstructured P2P Live Streaming System. Diego Perino. [Translated from the French abstract:] a live-streaming protocol based on a decentralized, unstructured network, which has been tested successfully on [...] We also study the reception delay of a stream broadcast by PULSE, and show that it is comparable to [...]

Paris-Sud XI, Université de

441

The electrical conductivity sigma of MgO single crystals shows a sharp increase at 500-800°C, in particular of the surface conductivity, generally attributed to surface contamination. Charge Distribution Analysis (CDA), a new technique that provides previously unavailable information on fundamental properties, allows for the determination of surface charges, their sign, and the associated internal electric field. Data on 99.99% purity, arc-fusion grown

Friedemann Freund; Minoru M. Freund; François Batllo

1993-01-01

442

NASA Astrophysics Data System (ADS)

Variational methods are widely used for the analysis and control of computationally intensive spatially distributed systems. In particular, the adjoint state method enables a very efficient calculation of the derivatives of an objective function (a response function to be analysed or a cost function to be optimised) with respect to model inputs. In this contribution, it is shown that variational methods hold significant potential for distributed catchment-scale hydrology. A distributed flash flood model, coupling kinematic wave overland flow and Green-Ampt infiltration, is applied to a small catchment of the Thoré basin and used as a relatively simple (synthetic observations) but didactic application case. It is shown that forward and adjoint sensitivity analysis provide a local but extensive insight into the relation between the assigned model parameters and the simulated hydrological response. Spatially distributed parameter sensitivities can be obtained for a very modest computational effort (~6 times the computing time of a single model run), and the singular value decomposition (SVD) of the Jacobian matrix provides an interesting perspective for the analysis of the rainfall-runoff relation. For the estimation of model parameters, adjoint-based derivatives were found exceedingly efficient in driving a bound-constrained quasi-Newton algorithm. The reference parameter set is retrieved independently of the optimization initial condition when the very common dimension reduction strategy (i.e. scalar multipliers) is adopted. Furthermore, the sensitivity analysis results suggest that most of the variability in this high-dimensional parameter space can be captured with a few orthogonal directions. A parametrization based on the SVD leading singular vectors was found very promising, but should be combined with another regularization strategy in order to prevent overfitting.
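The SVD step described in this abstract can be sketched as follows: the leading singular vectors of the model Jacobian (derivative of the simulated response with respect to the parameters) identify the few parameter directions that dominate the response. The Jacobian below is a synthetic low-rank stand-in for one computed with an adjoint model; all dimensions are illustrative assumptions.

```python
# Sketch of SVD-based sensitivity analysis on a synthetic Jacobian.
import numpy as np

rng = np.random.default_rng(1)
n_obs, n_par, rank = 50, 20, 3   # illustrative sizes (assumed)

# Synthetic Jacobian with effective rank 3, standing in for the
# adjoint-computed d(response)/d(parameters) matrix.
J = rng.normal(size=(n_obs, rank)) @ rng.normal(size=(rank, n_par))

U, s, Vt = np.linalg.svd(J, full_matrices=False)

# Fraction of the Jacobian's "energy" captured by the first few directions;
# close to 1 here, so a low-dimensional parametrization suffices.
captured = np.sum(s[:rank] ** 2) / np.sum(s ** 2)

# Rows of Vt[:rank] span the identifiable parameter subspace that a
# reduced (SVD-based) parametrization would use.
leading_directions = Vt[:rank]
```

This mirrors the abstract's observation that most of the variability in the high-dimensional parameter space is captured by a few orthogonal directions.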

Castaings, W.; Dartus, D.; Le Dimet, F.-X.; Saulnier, G.-M.

2009-04-01

443

The examination of brain tumor growth and its variability among cancer patients is an important aspect of epidemiologic and medical data. Several studies of brain tumors have interpreted descriptive data; in this study we perform inference to the extent possible, suggesting possible explanations for the differentiation in the survival rates apparent in the epidemiologic data. Population-based information from nine registries in the USA is classified with respect to age, gender, race, and tumor histology to study tumor size variation. The Weibull and Dagum distributions are fitted to the highly skewed tumor size distributions; the parametric analysis of the tumor sizes showed significant differentiation between sexes, increased skewness for both the male and female populations, as well as decreased kurtosis for the black female population. The effect of population characteristics on the distribution of tumor sizes is estimated by a quantile regression model and then compared with the ordinary least squares results. The higher quantiles of the distribution of tumor sizes for whites are significantly higher than those of other races. Our model predicted that the effect of age in the lower quantiles of the tumor size distribution is negative, given the variables race and sex. We apply probability and regression models to explore the effects of demographic and histology types and observe significant racial and gender differences in the form of the distributions. Efforts are made to link tumor size data with available survival rates in relation to other prognostic variables. PMID:23675268
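The two statistical steps this abstract describes can be sketched on synthetic data: a maximum-likelihood Weibull fit to a skewed size sample, and a quantile regression of size on age, solved here as a linear program minimising the pinball loss. All numbers (shape 1.4, scale 10, slope 0.5) are illustrative assumptions, not figures from the study, and the Dagum fit is omitted.

```python
# Sketch: (1) Weibull MLE on skewed "size" data, (2) quantile regression.
import numpy as np
from scipy import stats, optimize

rng = np.random.default_rng(0)
n = 300
age = rng.uniform(20, 80, size=n)
sizes = stats.weibull_min.rvs(c=1.4, scale=10.0, size=n, random_state=rng)

# (1) Maximum-likelihood Weibull fit, location fixed at 0 (sizes > 0).
shape, loc, scale = stats.weibull_min.fit(sizes, floc=0)

# (2) Quantile regression as a linear program:
#     min  tau*sum(u) + (1-tau)*sum(v)
#     s.t. y - X@beta = u - v,  u, v >= 0
def quantile_regression(X, y, tau):
    m, p = X.shape
    c = np.concatenate([np.zeros(p), tau * np.ones(m), (1 - tau) * np.ones(m)])
    A_eq = np.hstack([X, np.eye(m), -np.eye(m)])
    bounds = [(None, None)] * p + [(0, None)] * (2 * m)
    res = optimize.linprog(c, A_eq=A_eq, b_eq=y, bounds=bounds, method="highs")
    return res.x[:p]

# Synthetic outcome with a true age slope of 0.5 at every quantile.
y = sizes + 0.5 * age
X = np.column_stack([np.ones(n), age])
intercept, slope = quantile_regression(X, y, tau=0.9)
```

The LP formulation recovers the same estimates a dedicated quantile regression routine would, since minimising the check (pinball) loss is exactly this linear program.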

Pokhrel, Keshav P.; Vovoras, Dimitrios; Tsokos, Chris P.

2012-01-01

444

Phylogenetic analysis and subgenotypic distribution of the hepatitis B virus in Recife, Brazil.

The analysis of the genomes of the hepatitis B virus in human hosts identifies phylogenetic variants called viral genotypes. Indeed, clinical and epidemiological observations suggest that differences in viral genotypes lead to distinct biological and clinical behaviors. The aim of this study was to evaluate the subgenotypic distribution and to conduct a phylogenetic analysis by the Bayesian method of the hepatitis B virus (HBV) in patients from Recife, Brazil. From July 2009 to December 2010, 60 HBV-infected patients were examined; 39 (65%) were males, and the mean age was 50 years. Thirty-three patients (55%) were genotyped by obtaining and amplifying a 1306-bp fragment comprising part of the DNA polymerase and the surface antigen of the HBV. The sequencing was performed on an ABI 3500 automatic sequencer, and the consensus sequences were obtained by aligning both sequenced strands (clockwise and anti-clockwise) using SEQUENCHER software. Phylogenetic analysis was conducted using the Markov chain Monte Carlo simulation implemented by the Bayesian evolutionary method of sampling trees. The following subgenotypic distribution was observed: A1 (79%), F2a (12%), A2 (6%), and F4 (3%); those identified as subgenotype A1 fell in the same cluster of the phylogenetic tree. In this study, the majority of patients presented the A1 subgenotype from the same viral strain. As per the distribution in the phylogenetic tree by the Bayesian method, this subgenotype was possibly in the genetic make-up of Africans brought in centuries past to Brazil as slaves. PMID:23268113

Moura, Izolda Fernandes; Lopes, Edmundo Pessoa; Alvarado-Mora, Mónica Viviana; Pinho, João Renato; Carrilho, Flair José

2013-03-01

445

A class of distribution-free models for longitudinal mediation analysis.

Mediation analysis constitutes an important part of a treatment study, identifying the mechanisms by which an intervention achieves its effect. The structural equation model (SEM) is a popular framework for modeling such causal relationships. However, current methods impose various restrictions on the study designs and data distributions, limiting the utility of the information they provide in real study applications. In particular, in longitudinal studies missing data is commonly addressed under the assumption of missing at random (MAR), where current methods are unable to handle such missing data if parametric assumptions are violated. In this paper, we propose a new, robust approach to address the limitations of current SEM within the context of longitudinal mediation analysis by utilizing a class of functional response models (FRM). Being distribution-free, the FRM-based approach does not impose any parametric assumption on data distributions. In addition, by extending the inverse probability weighted (IPW) estimates to the current context, the FRM-based SEM provides valid inference for longitudinal mediation analysis under the two most popular missing data mechanisms: missing completely at random (MCAR) and missing at random (MAR). We illustrate the approach with both real and simulated data. PMID:24271505

Gunzler, D; Tang, W; Lu, N; Wu, P; Tu, X M

2014-10-01

446

Space station electrical power distribution analysis using a load flow approach

NASA Technical Reports Server (NTRS)

The space station's electrical power system will evolve and grow much as terrestrial electric utility systems have. The initial baseline reference configuration will contain more than 50 nodes or busses, inverters, transformers, overcurrent protection devices, distribution lines, solar arrays, and/or solar dynamic power generating sources. The system is designed to manage and distribute 75 kW of power, single phase or three phase, at 20 kHz, to grow to a level of 300 kW steady state, and to be capable of operating at a peak of 450 kW for 5 to 10 min. In order to plan far into the future and keep pace with load growth, a load flow power system analysis approach must be developed and utilized. This method is a well-known energy assessment and management tool that is widely used throughout the electric power utility industry. The results of a comprehensive evaluation and assessment of an Electrical Distribution System Analysis Program (EDSA) are discussed. Its potential use as an analysis and design tool for the 20 kHz space station electrical power system is addressed.
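As a concrete illustration of the load-flow calculation referred to above, here is a Gauss-Seidel iteration for a minimal two-bus per-unit system (a slack bus plus one constant-power load bus). The admittance and load values are assumptions for the sketch and bear no relation to the space station figures or to EDSA's internals.

```python
# Gauss-Seidel load flow for a two-bus per-unit system.
import numpy as np

y = 1.0 - 5.0j              # series line admittance, per unit (assumed)
V_slack = 1.0 + 0.0j        # slack bus holds the reference voltage
P, Q = 0.5, 0.2             # constant-power load at bus 2, per unit (assumed)

# Gauss-Seidel update: V2 <- (S2*/V2* - Y21*V1) / Y22, with Y22 = y,
# Y21 = -y, and injected power S2 = -(P + jQ) at the load bus.
V2 = 1.0 + 0.0j             # flat start
for _ in range(100):
    V2 = (-(P - 1j * Q) / np.conj(V2) + y * V_slack) / y

# After convergence the load-bus power balance V2 * conj(y*(V2 - V1)) = S2
# holds, and the load-bus voltage magnitude sags below 1.0 per unit.
```

Real tools iterate the same fixed point over every PQ bus (with separate handling of PV buses) until the largest power mismatch falls below a tolerance.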

Emanuel, Ervin M.

1987-01-01

447

We characterized the relationship between the genetic diversity of indigenous soybean-nodulating bradyrhizobia from weakly acidic soils in Japan and their geographical distribution in an ecological study of indigenous soybean rhizobia. We isolated bradyrhizobia from three kinds of Rj-genotype soybeans. Their genetic diversity and community structure were analyzed by PCR-RFLP analysis of the 16S–23S rRNA gene internal transcribed spacer (ITS) region with 11 Bradyrhizobium USDA strains as references. We used data from the present study and previous studies to carry out mathematical ecological analyses, multidimensional scaling analysis with the Bray-Curtis index, polar ordination analysis, and multiple regression analyses to characterize the relationship between soybean-nodulating bradyrhizobial community structures and their geographical distribution. The mathematical ecological approaches used in this study demonstrated the presence of ecological niches and suggested the geographical distribution of soybean-nodulating bradyrhizobia to be a function of latitude and the related climate, with clusters in the order Bj123, Bj110, Bj6, and Be76 from north to south in Japan. PMID:24240318

Saeki, Yuichi; Shiro, Sokichi; Tajima, Toshiyuki; Yamamoto, Akihiro; Sameshima-Saito, Reiko; Sato, Takashi; Yamakawa, Takeo

2013-01-01

448

A distributed analysis and visualization system for model and observational data

NASA Technical Reports Server (NTRS)

Software was developed with NASA support to aid in the analysis and display of the massive amounts of data generated from satellites, observational field programs, and model simulations. This software was developed in the context of the PATHFINDER (Probing ATmospHeric Flows in an Interactive and Distributed EnviRonment) Project. The overall aim of this project is to create a flexible, modular, and distributed environment for data handling, modeling simulations, data analysis, and visualization of atmospheric and fluid flows. Software completed with NASA support includes GEMPAK analysis, data handling, and display modules, for which collaborators at NASA had primary responsibility, and prototype software modules for three-dimensional interactive and distributed control and display as well as data handling, for which NCSA was responsible. Overall process control was handled through a scientific and visualization application builder from Silicon Graphics known as the Iris Explorer. In addition, the GEMPAK-related work (GEMVIS) was also ported to the Advanced Visualization System (AVS) application builder. Many modules were developed to enhance those already available in Iris Explorer, including HDF file support, improved visualization and display, simple lattice math, and the handling of metadata through development of a new grid datatype. Complete source and runtime binaries along with on-line documentation are available via the World Wide Web at: http://redrock.ncsa.uiuc.edu/PATHFINDER/pathre12/top/top.html

Wilhelmson, Robert B.

1994-01-01

449

Extracted Fragment Ion Mobility Distributions: A New Method for Complex Mixture Analysis

A new method is presented for constructing ion mobility distributions of precursor ions based upon the extraction of drift time distributions that are monitored for selected fragment ions. The approach is demonstrated with a recently designed instrument that combines ion mobility spectrometry (IMS) with ion trap mass spectrometry (MS) and ion fragmentation, as shown in a recent publication [J. Am. Soc. Mass Spectrom. 22 (2011) 1477–1485]. Here, we illustrate the method by examining selected charge states of electrosprayed ubiquitin ions, an extract from diesel fuel, and a mixture of phosphorylated peptide isomers. For ubiquitin ions, extraction of all drift times over small mass-to-charge (m/z) ranges corresponding to unique fragments of a given charge state allows the determination of precursor ion mobility distributions. A second example of the utility of the approach includes the distinguishing of precursor ion mobility distributions for isobaric, basic components from commercially available diesel fuel. Extraction of data for a single fragment ion is sufficient to distinguish the precursor ion mobility distribution of cycloalkyl-pyridine derivatives from pyrindan derivatives. Finally, the method is applied for the analysis of phosphopeptide isomers (LFpTGHPESLER and LFTGHPEpSLER) in a mixture. The approach alleviates several analytical challenges that include separation and characterization of species having similar (or identical) m/z values within complex mixtures. PMID:22518092

Lee, Sunyoung; Li, Zhiyu; Valentine, Stephen J.; Zucker, Steven M.; Webber, Nathaniel; Reilly, James P.; Clemmer, David E.

2011-01-01


451

CARES/PC - CERAMICS ANALYSIS AND RELIABILITY EVALUATION OF STRUCTURES

NASA Technical Reports Server (NTRS)

The beneficial properties of structural ceramics include their high-temperature strength, light weight, hardness, and corrosion and oxidation resistance. For advanced heat engines, ceramics have demonstrated functional abilities at temperatures well beyond the operational limits of metals. This is offset by the fact that ceramic materials tend to be brittle. When a load is applied, their lack of significant plastic deformation causes the material to crack at microscopic flaws, destroying the component. CARES/PC performs statistical analysis of data obtained from the fracture of simple, uniaxial tensile or flexural specimens and estimates the Weibull and Batdorf material parameters from this data. CARES/PC is a subset of the program CARES (COSMIC program number LEW-15168) which calculates the fast-fracture reliability or failure probability of ceramic components utilizing the Batdorf and Weibull models to describe the effects of multi-axial stress states on material strength. CARES additionally requires that the ceramic structure be modeled by a finite element program such as MSC/NASTRAN or ANSYS. The more limited CARES/PC does not perform fast-fracture reliability estimation of components. CARES/PC estimates ceramic material properties from uniaxial tensile or from three- and four-point bend bar data. In general, the parameters are obtained from the fracture stresses of many specimens (30 or more are recommended) whose geometry and loading configurations are held constant. Parameter estimation can be performed for single or multiple failure modes by using the least-squares analysis or the maximum likelihood method. Kolmogorov-Smirnov and Anderson-Darling goodness-of-fit tests measure the accuracy of the hypothesis that the fracture data comes from a population with a distribution specified by the estimated Weibull parameters. 
Ninety-percent confidence intervals on the Weibull parameters and the unbiased value of the shape parameter for complete samples are provided when the maximum likelihood technique is used. CARES/PC is written and compiled with the Microsoft FORTRAN v5.0 compiler using the VAX FORTRAN extensions and dynamic array allocation supported by this compiler for the IBM/MS-DOS or OS/2 operating systems. The dynamic array allocation routines allow the user to match the number of fracture sets and test specimens to the memory available. Machine requirements include IBM PC compatibles with optional math coprocessor. Program output is designed to fit 80-column format printers. Executables for both DOS and OS/2 are provided. CARES/PC is distributed on one 5.25 inch 360K MS-DOS format diskette in compressed format. The expansion tool PKUNZIP.EXE is supplied on the diskette. CARES/PC was developed in 1990. IBM PC and OS/2 are trademarks of International Business Machines. MS-DOS and MS OS/2 are trademarks of Microsoft Corporation. VAX is a trademark of Digital Equipment Corporation.
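The parameter-estimation step can be illustrated with a minimal two-parameter Weibull maximum-likelihood fit over a complete (uncensored) sample of fracture stresses. This is a generic numpy sketch under those assumptions, not the CARES/PC FORTRAN code, and the simulated stresses are invented for illustration.

```python
import numpy as np

def weibull_mle(x, tol=1e-8):
    """Maximum-likelihood Weibull shape and scale for a complete sample,
    solving the profiled shape equation by bisection."""
    x = np.asarray(x, dtype=float)
    ln = np.log(x)

    def g(m):  # zero when m satisfies the shape likelihood equation
        xm = x ** m
        return (xm * ln).sum() / xm.sum() - 1.0 / m - ln.mean()

    lo, hi = 0.01, 100.0
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if g(mid) < 0.0:
            lo = mid
        else:
            hi = mid
    shape = 0.5 * (lo + hi)
    scale = (x ** shape).mean() ** (1.0 / shape)
    return shape, scale

# Simulated fracture stresses (MPa) from a Weibull with shape 10, scale 400.
rng = np.random.default_rng(1)
stresses = 400.0 * rng.weibull(10.0, size=200)
m_hat, s_hat = weibull_mle(stresses)
```

On the simulated sample the fitted shape and scale land near the true values of 10 and 400; in CARES/PC these estimates then feed the confidence intervals and goodness-of-fit tests described above.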

Szatmary, S. A.

1994-01-01

452

NASA Astrophysics Data System (ADS)

As the aerospace industry increases its use of composite materials in primary structures, techniques must be developed to nondestructively predict and monitor structural integrity at low proof stresses. This paper demonstrates the feasibility of predicting ultimate strengths at stress levels less than 25 percent of the expected ultimate strength, thereby reducing the unintentional structural damage caused by higher proof loads. The research presented herein has shown that an ultimate strength prediction equation can be generated for ASTM D-3039 unidirectional graphite/epoxy tensile specimens. From an original sample set of six specimens, a multivariate statistical analysis was used to generate an ultimate strength prediction equation. The variables of the multivariate statistical analysis were obtained through the mathematical modelling of the low amplitude (matrix cracking) portion of the specimens' acoustic emission (AE) amplitude distributions produced during the early stages of proof testing. A Weibull distribution was used to represent the amplitude band, and its parameters were correlated with known failure strengths to produce ultimate strength prediction equations. Ultimate strengths were then accurately predicted at proof stresses less than 25 percent of the expected failure stress for several randomly drawn tensile coupons.
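The prediction-equation idea, regressing known ultimate strengths on per-specimen Weibull parameters fitted to AE amplitude distributions, can be sketched with ordinary least squares. Every number below (shape b, scale theta, strengths) is hypothetical and invented for illustration; the paper's actual multivariate analysis may differ in form.

```python
import numpy as np

# Hypothetical per-specimen Weibull shape (b) and scale (theta) values fitted
# to the matrix-cracking band of each AE amplitude distribution, plus the
# specimens' measured ultimate strengths (MPa). All values are invented.
b = np.array([2.1, 2.4, 1.9, 2.8, 2.5, 2.2])
theta = np.array([48.0, 52.0, 45.0, 58.0, 54.0, 50.0])
sigma_ult = np.array([710.0, 745.0, 690.0, 800.0, 760.0, 725.0])

# Linear prediction equation: sigma_ult ~ c0 + c1*b + c2*theta
X = np.column_stack([np.ones_like(b), b, theta])
coef, *_ = np.linalg.lstsq(X, sigma_ult, rcond=None)
predicted = X @ coef
```

With the equation in hand, the ultimate strength of a new coupon would be predicted from the Weibull parameters extracted during a low proof load, well below failure.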

Walker, James L., II; Hill, Eric V. K.

453

Likelihood-based methods are commonplace in phylogenetic systematics. Although much effort has been directed toward likelihood-based models for molecular data, comparatively less work has addressed models for discrete morphological character (DMC) data. Among-character rate variation (ACRV) may confound phylogenetic analysis, but there have been few analyses of the magnitude and distribution of rate heterogeneity among DMCs. Using 76 data sets covering a range of plants, invertebrates, and vertebrate animals, we used a modified version of MrBayes to test equal, gamma-distributed and lognormally distributed models of ACRV, integrating across phylogenetic uncertainty using Bayesian model selection. We found that in approximately 80% of data sets, unequal-rates models outperformed equal-rates models, especially among larger data sets. Moreover, although most data sets were equivocal, more data sets favored the lognormal rate distribution relative to the gamma rate distribution, lending some support for more complex character correlations than in molecular data. Parsimony estimation of the underlying rate distributions in several data sets suggests that the lognormal distribution is preferred when there are many slowly evolving characters and fewer quickly evolving characters. The commonly adopted four rate category discrete approximation used for molecular data was found to be sufficient to approximate a gamma rate distribution with discrete characters. However, among the two data sets tested that favored a lognormal rate distribution, the continuous distribution was better approximated with at least eight discrete rate categories. Although the effect of rate model on the estimation of topology was difficult to assess across all data sets, it appeared relatively minor between the unequal-rates models for the one data set examined carefully. 
As in molecular analyses, we argue that researchers should test and adopt the most appropriate model of rate variation for the data set in question. As discrete characters are increasingly used in more sophisticated likelihood-based phylogenetic analyses, it is important that these studies be built on the most appropriate and carefully selected underlying models of evolution. PMID:25527198
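The k-category discrete approximation mentioned above is commonly built with the median method: take the gamma quantile at the midpoint of each of k equal-probability bins and rescale so the mean rate is 1. A sketch, assuming SciPy is available (a generic implementation, not the modified MrBayes code):

```python
import numpy as np
from scipy.stats import gamma

def discrete_gamma_rates(alpha, k=4):
    """Median approximation to k equal-probability rate categories of a
    mean-1 gamma distribution (shape=alpha, scale=1/alpha)."""
    midpoints = (2.0 * np.arange(k) + 1.0) / (2.0 * k)
    rates = gamma.ppf(midpoints, a=alpha, scale=1.0 / alpha)
    return rates / rates.mean()  # renormalize so the mean rate is exactly 1

rates = discrete_gamma_rates(0.5, k=4)  # strong among-character rate variation
```

The same routine with k=8 gives the finer discretization the authors found necessary for lognormal-favoring data sets (substituting lognormal quantiles for the gamma ones).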

Harrison, Luke B; Larsson, Hans C E

2015-03-01

454

Nearest-neighbor analysis and the distribution of sinkholes: an introduction to spatial statistics

NSDL National Science Digital Library

This is an exercise I use in an upper-division geomorphology course to introduce students to nearest-neighbor analysis, a basic technique in spatial statistics. Nearest-neighbor analysis is a method of comparing the observed average distance between points and their nearest neighbors to the expected average nearest-neighbor distance in a random pattern of points. The pattern of points on a map or 2-D graph can be classified into three categories: CLUSTERED, RANDOM, or REGULAR. Nearest-neighbor analysis provides an objective method for distinguishing among these possible spatial distributions. The technique also produces a population statistic, the nearest-neighbor index, which can be compared from area to area. In general, nearest-neighbor analysis can be applied to any geoscience phenomenon or feature whose spatial distribution can be categorized as a point pattern. The basic distance data can come from topographic maps, aerial photographs, or field measurements. The exercise presented here applies this technique to the study of karst landforms on topographic maps, specifically the spatial distribution of sinkholes. The advantages of introducing nearest-neighbor analysis in an undergraduate lab are that: (1) it reinforces important concepts related to data collection (e.g., significant figures), map use (e.g., scale and the UTM grid), and basic statistics (e.g., hypothesis testing); (2) the necessary calculations are easily handled by most students; and (3) once learned, the technique can be widely applied in geoscience problem-solving.
Designed for a geomorphology course. Addresses student fear of quantitative aspects and/or inadequate quantitative skills.
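The comparison the exercise teaches is usually summarized by the Clark-Evans nearest-neighbor index R: the observed mean nearest-neighbor distance divided by the mean expected for a random (Poisson) pattern of the same density, 1/(2*sqrt(n/A)). A sketch with an invented sinkhole pattern (four points on a regular grid):

```python
import numpy as np

def nearest_neighbor_index(points, area):
    """Clark-Evans index R for 2-D points in a study area.
    R < 1: clustered; R near 1: random; R up to ~2.15: regular."""
    pts = np.asarray(points, dtype=float)
    n = len(pts)
    d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=2)
    np.fill_diagonal(d, np.inf)              # ignore each point's distance to itself
    d_obs = d.min(axis=1).mean()             # observed mean nearest-neighbor distance
    d_exp = 1.0 / (2.0 * np.sqrt(n / area))  # expectation under randomness
    return d_obs / d_exp

# Four "sinkholes" on a regular 2x2 grid inside a unit square.
grid = [(0.25, 0.25), (0.25, 0.75), (0.75, 0.25), (0.75, 0.75)]
R = nearest_neighbor_index(grid, area=1.0)
print(R)  # well above 1, indicating a regular (dispersed) pattern
```

With sinkhole coordinates read off a topographic map (e.g., UTM eastings and northings) and the study-area extent, the same function yields the index students compare against the random expectation.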

Rick Ford

455

The antimicrobial effect of oregano (Origanum vulgare L.) and lemongrass (Cymbopogon citratus (DC.) Stapf.) essential oils (EOs) against Salmonella enterica serotype Enteritidis was evaluated in in vitro experiments and in inoculated ground bovine meat during refrigerated storage (4 ± 2 °C) for 6 days. The Weibull model was tested to fit the survival/inactivation curves (estimating the p and δ parameters). The minimum inhibitory concentration (MIC) of both EOs against S. Enteritidis was 3.90 μl/ml. The EO concentrations applied in the ground beef were 3.90, 7.80 and 15.60 μl/g, based on the MIC levels and on the possible reduction of activity by food constituents. Both EOs showed antimicrobial effects at all tested levels, with microbial populations decreasing (p ≤ 0.05) over storage time. Based on the fit-quality parameters (RSS and RSE), the Weibull model was able to describe the inactivation curves of the EOs against S. Enteritidis. The application of EOs in processed meats can be used to control pathogens during refrigerated shelf-life. PMID:23273476
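The Weibull (Mafart) survival model referenced in the abstract is log10(N/N0) = -(t/δ)^p, where δ is the time to the first decimal reduction and p controls curve shape (p < 1 gives tailing, p > 1 shouldering). A sketch of fitting it with SciPy to a synthetic survival curve; the data points are invented, not the study's measurements.

```python
import numpy as np
from scipy.optimize import curve_fit

def weibull_log_survival(t, delta, p):
    """Mafart form of the Weibull model: log10(N/N0) = -(t/delta)**p."""
    return -(t / delta) ** p

# Synthetic survival data (storage time in days vs. log10 reduction)
# generated with delta = 2 days and p = 0.8 (a tailing curve).
t = np.array([0.5, 1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
log_red = weibull_log_survival(t, 2.0, 0.8)

(delta_hat, p_hat), _ = curve_fit(weibull_log_survival, t, log_red, p0=(1.0, 1.0))
```

Fit quality would then be compared across treatments via residual sum of squares (RSS) and residual standard error (RSE), as the abstract describes.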

de Oliveira, Thales Leandro Coutinho; Soares, Rodrigo de Araújo; Piccoli, Roberta Hilsdorf

2013-03-01