1

Using Weibull Distribution Analysis to Evaluate ALARA Performance

As Low as Reasonably Achievable (ALARA) is the underlying principle for protecting nuclear workers from potential health outcomes related to occupational radiation exposure. Radiation protection performance is currently evaluated by measures such as collective dose and average measurable dose, which do not indicate ALARA performance. The purpose of this work is to show how statistical modeling of individual doses using the Weibull distribution can provide objective supplemental performance indicators for comparing ALARA implementation among sites and for insights into ALARA practices within a site. Maximum likelihood methods were employed to estimate the Weibull shape and scale parameters used for performance indicators. The shape parameter reflects the effectiveness of maximizing the number of workers receiving lower doses and is represented as the slope of the fitted line on a Weibull probability plot. Additional performance indicators derived from the model parameters include the 99th percentile and the exceedance fraction. When grouping sites by collective total effective dose equivalent (TEDE) and ranking by 99th percentile with confidence intervals, differences in performance among sites can be readily identified. Applying this methodology will enable more efficient and complete evaluation of the effectiveness of ALARA implementation.

E. L. Frome, J. P. Watkins, and D. A. Hagemeyer

2009-10-01

2

Reliability analysis of structural ceramic components using a three-parameter Weibull distribution

NASA Technical Reports Server (NTRS)

Described here are nonlinear regression estimators for the three-parameter Weibull distribution. Issues relating to the bias and invariance associated with these estimators are examined numerically using Monte Carlo simulation methods. The estimators were used to extract parameters from sintered silicon nitride failure data. A reliability analysis was performed on a turbopump blade utilizing the three-parameter Weibull distribution and the estimates from the sintered silicon nitride data.

Duffy, Stephen F.; Powers, Lynn M.; Starlinger, Alois

1992-01-01

4

Modern estimation of the parameters of the Weibull wind speed distribution for wind energy analysis

Three methods for calculating the parameters of the Weibull wind speed distribution for wind energy analysis are presented: the maximum likelihood method, the proposed modified maximum likelihood method, and the commonly used graphical method. The application of each method is demonstrated using a sample wind speed data set, and a comparison of the accuracy of each method is also performed.

J. V. Seguro; T. W. Lambert

2000-01-01

5

Weibull-distributed sea clutter

Weibull-distributed sea clutter was measured using a fixed antenna of an L-band air-route surveillance radar at low grazing angles between 0.50 and 0.72 deg. It is shown that the sea-clutter amplitude statistics obey a Weibull distribution with a shape parameter of 1.585.

Matsuo Sekine; Toshimitsu Musha; Yuichi Tomita; Toshihiko Hagisawa; Takeru Irabu; Eiichi Kiuchi

1983-01-01

6

Weibull-distributed sea clutter

NASA Astrophysics Data System (ADS)

Weibull-distributed sea clutter was measured using a fixed antenna of an L-band air-route surveillance radar at low grazing angles between 0.50 and 0.72 deg. It is shown that the sea-clutter amplitude statistics obey a Weibull distribution with a shape parameter of 1.585.

Sekine, M.; Musha, T.; Tomita, Y.; Hagisawa, T.; Irabu, T.; Kiuchi, E.

1983-08-01

7

Weibull-distributed ground clutter

NASA Astrophysics Data System (ADS)

Weibull-distributed ground clutter of cultivated land was measured using an L-band long-range air-route surveillance radar (ARSR) having a 3.0-µs pulse width and a 1.23-deg beamwidth at very low grazing angles between 0.21 and 0.32 deg. It is shown that the shape parameter of the Weibull distribution varied from c = 1.507 to c = 2.0, the latter corresponding to the Rayleigh distribution.

Sekine, M.; Ohtani, S.; Musha, T.; Irabu, T.; Kiuchi, E.; Hagisawa, T.; Tomita, Y.

1981-07-01

8

Weibull-distributed ground clutter

Weibull-distributed ground clutter of cultivated land was measured using an L-band long-range air-route surveillance radar (ARSR) having a 3.0-µs pulse width and a 1.23-deg beamwidth at very low grazing angles between 0.21 and 0.32 deg. It is shown that the shape parameter of the Weibull distribution varied from c = 1.507 to c = 2.0, the latter corresponding to the Rayleigh distribution.

M. Sekine; S. Ohtani; T. Musha; T. Irabu; E. Kiuchi; T. Hagisawa; Y. Tomita

1981-01-01

9

The International Standard IEC 61400-12 and other international recommendations suggest the use of the two-parameter Weibull probability distribution function (PDF) to estimate the Annual Energy Production (AEP) of a wind turbine. Most of the commercial software uses the unimodal Weibull PDF as the default option to carry out estimations of AEP, which in turn, are used to optimise wind farm

O. A. Jaramillo; M. A. Borja

2004-01-01

10

The Rosin-Rammler-Sperling-Weibull distribution and its use in the analysis of complex data are explained with reference to metoprolol and acebutolol AUC values and isoniazid plasma concentrations. The technique is then applied to sparteine and debrisoquine data to resolve populations into distinct sub-groups. Goodness of fit is measured by applying the chi-squared test to the untransformed data. The method is simple to use, and sub-groups can be identified rapidly. Each sub-group can be characterised by a simple exponential equation. PMID:6653637

Jack, D B

1983-01-01

11

NASA Technical Reports Server (NTRS)

The suitability of the two-parameter Weibull distribution for describing highly censored cat motion sickness latency data was evaluated by estimating the parameters with the maximum likelihood method and testing for goodness of fit with the Kolmogorov-Smirnov statistic. A procedure for determining confidence levels and testing for significance of the difference between Weibull parameters is described. Computer programs for these procedures may be obtained from an archival source.

Park, Won J.; Crampton, George H.

1988-01-01

12

Binary data regression: Weibull distribution

NASA Astrophysics Data System (ADS)

The problem of estimation in binary response data has received a great number of alternative statistical solutions. Generalized linear models allow for a wide range of statistical models for regression data. The most widely used model is logistic regression; see Hosmer et al. [6]. However, as Chen et al. [5] mention, when the probability of a given binary response approaches 0 at a different rate than it approaches 1, symmetric links are inappropriate. A class of models based on the Weibull distribution, indexed by three parameters, is introduced here. Maximum likelihood methods are employed to estimate the parameters. The objective of the present paper is to show a solution for the estimation problem under the Weibull model. An example illustrating the quality of the model compares it with the alternative probit and logit models.

Caron, Renault; Polpo, Adriano

2009-12-01

13

A FORTRAN program for fitting Weibull distribution and generating samples

A computer program has been developed to estimate the parameters of the two-parameter Weibull distribution describing a given data set and to generate random numbers following a given Weibull distribution. The Weibull distribution parameters, namely the Weibull modulus and the scale parameter, are estimated using two methods: (i) linear regression after logarithmic transformation of the data and (ii) the maximum-likelihood estimator. The cumulative distribution function has been estimated using the order statistics.

Amitava Ghosh

1999-01-01

14

NASA Technical Reports Server (NTRS)

The Weibull distribution has been widely adopted for the statistical description and inference of fatigue data. This document provides user instructions, examples, and verification for software to analyze gear fatigue test data. The software was developed presuming the data are adequately modeled using a two-parameter Weibull distribution. The calculations are based on likelihood methods, and the approach taken is valid for data that include type I censoring. The software was verified by reproducing results published by others.

Krantz, Timothy L.

2002-01-01

15

NASA Technical Reports Server (NTRS)

The Weibull distribution has been widely adopted for the statistical description and inference of fatigue data. This document provides user instructions, examples, and verification for software to analyze gear fatigue test data. The software was developed presuming the data are adequately modeled using a two-parameter Weibull distribution. The calculations are based on likelihood methods, and the approach taken is valid for data that include type I censoring. The software was verified by reproducing results published by others.

Krantz, Timothy L.

2002-01-01

16

Weibull Distribution From Interval Inspection Data

NASA Technical Reports Server (NTRS)

Most likely failure sequence assumed. Memorandum discusses application of Weibull distribution to statistics of failures of turbopump blades. It is a generalization of the well-known exponential random probability distribution and is useful in describing component-failure modes, including aging effects. Parameters are found from experimental data by the method of maximum likelihood.

Rheinfurth, Mario H.

1987-01-01

17

Empirical Bayes Estimation in the Weibull Distribution

In part I empirical Bayes estimation procedures are introduced and employed to obtain an estimator for the unknown random scale parameter of a two-parameter Weibull distribution with known shape parameter. In part II, procedures are developed for estimating both the random scale and shape parameters. These estimators use a sequence of maximum likelihood estimates from related reliability experiments to form

D. J. Couture; H. F. Martz

1972-01-01

18

Unification of the Fréchet and Weibull Distribution

Well-known results for the Fréchet and Weibull distributions are streamlined using a unifying parametrisation. Expected values for order statistics follow through a fractional matrix power, and the likelihood surface in the case of a log-linear specification for the scale parameter is shown to have just two stationary points.

Peter ter Berg

2009-01-01

19

Weibull-distributed ground clutter in the frequency domain

NASA Astrophysics Data System (ADS)

Ground clutter from cultivated land was measured using an L-band long range air-route surveillance radar at very low grazing angles, between 0.21 and 0.32 deg. It is shown that the Weibull-distributed ground clutter obeys a Weibull distribution in the frequency domain after processing by the discrete Fourier transform. Thus the new Weibull/CFAR should be considered to suppress such clutter in the frequency domain.

Sekine, M.; Musha, T.; Tomita, Y.; Hagisawa, T.; Kiuchi, E.

1985-06-01

20

Weibull-distributed ground clutter in the frequency domain

Ground clutter from cultivated land was measured using an L-band long range air-route surveillance radar at very low grazing angles, between 0.21 and 0.32 deg. It is shown that the Weibull-distributed ground clutter obeys a Weibull distribution in the frequency domain after processing by the discrete Fourier transform. Thus the new Weibull/CFAR should be considered to suppress such clutter in the frequency domain.

M. Sekine; T. Musha; Y. Tomita; T. Hagisawa; E. Kiuchi

1985-01-01

21

Independent Orbiter Assessment (IOA): Weibull analysis report

NASA Technical Reports Server (NTRS)

The Auxiliary Power Unit (APU) and Hydraulic Power Unit (HPU) Space Shuttle Subsystems were reviewed as candidates for demonstrating the Weibull analysis methodology. Three hardware components were identified as analysis candidates: the turbine wheel, the gearbox, and the gas generator. Detailed review of subsystem level wearout and failure history revealed the lack of actual component failure data. In addition, component wearout data were not readily available or would require a separate data accumulation effort by the vendor. Without adequate component history data being available, the Weibull analysis methodology application to the APU and HPU subsystem group was terminated.

Raffaelli, Gary G.

1987-01-01

22

A FORTRAN program for fitting Weibull distribution and generating samples

NASA Astrophysics Data System (ADS)

A computer program has been developed to estimate the parameters of the two-parameter Weibull distribution describing a given data set and to generate random numbers following a given Weibull distribution. The Weibull distribution parameters, namely Weibull modulus and scale parameter, are estimated using two methods: (i) linear regression after logarithmic transformation of the data and (ii) maximum-likelihood estimator. The cumulative distribution function (i.e. probability of failure of a specimen if the data set characterize strength of a component or probability of drought if the data represent water level, etc.) has been estimated using the order statistics. Both the Weibull modulus and scale parameter are necessary to generate random numbers with a specified Weibull distribution. Two example problems using published data sets are used to verify the code. Values of Weibull distribution parameters, estimated using this code, match published solutions obtained graphically. Parameters estimated from one of the example problems are used to generate a set of random numbers. Descriptive statistics and Weibull parameters estimated from the generated data match well with those obtained from the original data.

Ghosh, Amitava

1999-08-01

23

Is the Weibull distribution really suited for wind statistics modeling?

NASA Astrophysics Data System (ADS)

Wind speed statistics are generally modeled using the Weibull distribution. This distribution is convenient since, with only two parameters (shape and scale), it analytically characterizes the shape of the distribution and the different moments of the wind speed. However, the Weibull distribution rests on empirical rather than physical justification and can display strong limitations in its applications. In this article, we analyze wind speed distributions from meteorological stations and report that they deviate from the Weibull distribution. We further investigate the wind components rather than the wind speed statistic. This approach provides more physical insight into the validity domain of the Weibull distribution as a relevant model for wind statistics and quantifies the error made by using such a distribution. We thereby propose alternative, better-suited expressions for the wind speed distribution using super-statistical distributions.

Drobinski, Philippe; Coulais, Corentin; Jourdier, Bénédicte

2014-05-01

24

Estimation of Wind Power Potential Using Weibull Distribution

The main objective of the present study is to estimate wind power potential using the two Weibull parameters of the wind speed distribution function, the shape parameter k (dimensionless) and the scale parameter c (m/s). In this regard, a methodology that uses three different techniques (maximum likelihood, least squares, and the method of moments) for estimating the Weibull parameters was given
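Given fitted k and c, the mean wind power density follows from the third moment of the Weibull distribution. A minimal sketch (the default air density is an assumption, not stated in the abstract):

```python
import math

def wind_power_density(k, c, rho=1.225):
    """Mean wind power density in W/m^2 implied by a Weibull(k, c) wind
    speed model: E[v**3] = c**3 * Gamma(1 + 3/k), so P = 0.5 * rho * E[v**3]."""
    return 0.5 * rho * c ** 3 * math.gamma(1.0 + 3.0 / k)
```

This is why accurate estimates of k and c matter: the power density depends on the cube of the scale parameter.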

Asir Genc; Murat Erisoglu; Ahmet Pekgor; Galip Oturanc; Arif Hepbasli; Koray Ulgen

2005-01-01

25

Program for Weibull Analysis of Fatigue Data

NASA Technical Reports Server (NTRS)

A Fortran computer program has been written for performing statistical analyses of fatigue-test data that are assumed to be adequately represented by a two-parameter Weibull distribution. This program calculates the following: (1) maximum-likelihood estimates of the Weibull-distribution parameters; (2) data for contour plots of relative likelihood for the two parameters; (3) data for contour plots of joint confidence regions; (4) data for the profile likelihood of the Weibull-distribution parameters; (5) data for the profile likelihood of any percentile of the distribution; and (6) likelihood-based confidence intervals for parameters and/or percentiles of the distribution. The program can account for tests that are suspended without failure (the statistical term for such suspension of tests is "censoring"). The analytical approach followed in this program is valid for type-I censoring, which is the removal of unfailed units at pre-specified times. Confidence regions and intervals are calculated by use of the likelihood-ratio method.

Krantz, Timothy L.

2005-01-01

26

A wind energy analysis of Grenada: an estimation using the ‘Weibull’ density function

The Weibull density function has been used to estimate the wind energy potential in Grenada, West Indies. Based on historical recordings of mean hourly wind velocity, this analysis shows the importance of incorporating the variation in wind energy potential during diurnal cycles. Wind energy assessments that are based on a Weibull distribution using average daily/seasonal wind speeds fail to acknowledge that

D Weisser

2003-01-01

27

Weibull distribution based on maximum likelihood with interval inspection data

NASA Technical Reports Server (NTRS)

The two Weibull parameters based upon the method of maximum likelihood are determined. The test data used were failures observed at inspection intervals. The application was the reliability analysis of the SSME oxidizer turbine blades.

Rheinfurth, M. H.

1985-01-01

28

Bayesian estimation of life parameters in the Weibull distribution.

NASA Technical Reports Server (NTRS)

Development of a Bayesian analysis of the scale and shape parameters in the Weibull distribution and the corresponding reliability function with respect to the usual life-testing procedures. For the scale parameter theta, Bayesian estimates of theta and reliability are obtained for the uniform, exponential, and inverted gamma prior probability densities. Bhattacharya's results (1967) for the one-parameter exponential life-testing distribution are reduced to a special case of these results. A fully Bayesian analysis of both the scale and shape parameters is developed by assuming independent prior distributions; since analytical tractability is not possible in this case, Bayesian estimates are obtained through a conjunction of Monte Carlo simulation and numerical-integration techniques. In both cases, a computer simulation is carried out, and a comparison is made between the Bayesian and the corresponding minimum-variance unbiased, or maximum likelihood, estimates. As expected, the Bayesian estimates are superior.

Canavos, G. C.; Tsokos, C. P.

1973-01-01

29

Weibull Analysis of Mechanical Data for Castings II: Weibull Mixtures and Their Interpretation

NASA Astrophysics Data System (ADS)

The interpretation of Weibull probability plots of mechanical testing data from castings was discussed in Part I (M. Tiryakioğlu, J. Campbell: Metall. Mater. Trans. A, 41 (2010) 3121-3129). In Part II, details about the mathematical models of Weibull mixtures are introduced. The links between the occurrence of Weibull mixtures and casting process parameters are discussed. Worked examples are introduced in five case studies in which six datasets from the literature were reanalyzed. Results show that tensile and fatigue life data should be interpreted differently. In tensile data, Weibull mixtures are due to two distinct defect distributions, namely "old" and "young" bifilms, which are a result of prior processing and mold filling, respectively. "Old" bifilms are the predominant defect and result in the lower distribution, whereas "young" bifilms result in the upper distribution. In fatigue life data, Weibull mixtures are due to two failure mechanisms being active: failure due to cracks initiating from surface defects and from interior defects. Surface defects are predominant, and interior defects lead to fatigue failure only when there are no cracks initiated by surface defects. In all cases, only the mutually exclusive Weibull mixture model was found to be applicable.

Tiryakioğlu, Murat

2015-01-01

30

Capacity factor and economic analysis for 4 selected turbines (1,000, 1,300, 2,000, and 2,300 kW) with 6 different hub heights (50, 60, 70, 80, 90, and 100 m) have been calculated according to 13 different situations by using real time series and Weibull parameters determined for the Çanakkale region in Part 1. As a result of the study,

S. A. Akdağ; Ö. Güler

2009-01-01

31

Investigation of Weibull statistics in fracture analysis of cast aluminum

NASA Technical Reports Server (NTRS)

The fracture strengths of two large batches of A357-T6 cast aluminum coupon specimens were compared by using two-parameter Weibull analysis. The minimum number of these specimens necessary to find the fracture strength of the material was determined. The applicability of three-parameter Weibull analysis was also investigated. A design methodology based on the combination of elementary stress analysis and Weibull statistical analysis is advanced and applied to the design of a spherical pressure vessel shell. The results from this design methodology are compared with results from the applicable ASME pressure vessel code.

Holland, Frederic A., Jr.; Zaretsky, Erwin V.

1989-01-01

32

Investigation of Weibull statistics in fracture analysis of cast aluminum

NASA Technical Reports Server (NTRS)

The fracture strengths of two large batches of A357-T6 cast aluminum coupon specimens were compared by using two-parameter Weibull analysis. The minimum number of these specimens necessary to find the fracture strength of the material was determined. The applicability of three-parameter Weibull analysis was also investigated. A design methodology based on the combination of elementary stress analysis and Weibull statistical analysis is advanced and applied to the design of a spherical pressure vessel shell. The results from this design methodology are compared with results from the applicable ASME pressure vessel code.

Holland, F. A., Jr.; Zaretsky, E. V.

1989-01-01

33

NASA Technical Reports Server (NTRS)

A Bayesian inference process for system logistical planning is presented which provides a method for incorporating actual failures with prediction data for ongoing and improving reliability estimates. The process uses the Weibull distribution and provides a means for examining and updating logistical and maintenance support needs.

Giuntini, Michael E.; Giuntini, Ronald E.

1991-01-01

34

Table for estimating parameters of Weibull distribution

NASA Technical Reports Server (NTRS)

Table yields best linear invariant (BLI) estimates for the log of reliable life under censored life tests, permitting reliability estimation in failure analysis of items with multiple flaws. These BLI estimates have uniformly smaller expected loss than Gauss-Markov best linear unbiased estimates.

Mann, N. R.

1971-01-01

35

Bayesian Analysis of Simple Step-stress Model under Weibull Lifetimes

A. Ganguly, Department of Statistics, University of Pune, Pune 411 007, India; D. Kundu, Department of Mathematics. Abbreviations: BE, Bayes estimate; CDF, cumulative distribution function; CEM, cumulative exposure model; CRI, …

Kundu, Debasis

36

Weibull-distributed radar clutter reflected from sea ice

NASA Astrophysics Data System (ADS)

Sea-ice clutter was measured using an X-band radar which is located at the city of Mombetsu in Hokkaido. The pulse width of the radar was 80 ns. To sample at 40 ns and record digitally, an emitter-coupled logic (ECL) was used as a high-speed IC. The sampled data were first transferred to a 64-kbyte dynamic-memory board and then to a 5-inch floppy disk through an 8-bit microcomputer. These data were processed by a 16-bit microcomputer. As a result, it is shown that the amplitude of sea ice obeys a Weibull distribution with shape parameters c = 0.50-1.65. Thus the amplitude statistics deviate from the well-known Rayleigh distribution of c = 2.0, in which a logarithmic/constant-false-alarm-rate (LOG/CFAR) circuit is useful. It is concluded that the new Weibull/CFAR should be considered to suppress sea-ice clutter.

Ogawa, Hiroshi; Sekine, Matsuo; Musha, Toshimitsu; Aota, Masaaki; Ohi, Masayuki

1987-02-01

37

Lower bound on reliability for Weibull distribution when shape parameter is not estimated accurately

NASA Technical Reports Server (NTRS)

The mathematical relationships between the shape parameter Beta and estimates of reliability and a life limit lower bound for the two parameter Weibull distribution are investigated. It is shown that under rather general conditions, both the reliability lower bound and the allowable life limit lower bound (often called a tolerance limit) have unique global minimums over a range of Beta. Hence lower bound solutions can be obtained without assuming or estimating Beta. The existence and uniqueness of these lower bounds are proven. Some real data examples are given to show how these lower bounds can be easily established and to demonstrate their practicality. The method developed here has proven to be extremely useful when using the Weibull distribution in analysis of no-failure or few-failures data. The results are applicable not only in the aerospace industry but anywhere that system reliabilities are high.

Huang, Zhaofeng; Porter, Albert A.

1990-01-01

38

Lower bound on reliability for Weibull distribution when shape parameter is not estimated accurately

NASA Technical Reports Server (NTRS)

The mathematical relationships between the shape parameter Beta and estimates of reliability and a life limit lower bound for the two parameter Weibull distribution are investigated. It is shown that under rather general conditions, both the reliability lower bound and the allowable life limit lower bound (often called a tolerance limit) have unique global minimums over a range of Beta. Hence lower bound solutions can be obtained without assuming or estimating Beta. The existence and uniqueness of these lower bounds are proven. Some real data examples are given to show how these lower bounds can be easily established and to demonstrate their practicality. The method developed here has proven to be extremely useful when using the Weibull distribution in analysis of no-failure or few-failures data. The results are applicable not only in the aerospace industry but anywhere that system reliabilities are high.

Huang, Zhaofeng; Porter, Albert A.

1991-01-01

39

NASA Astrophysics Data System (ADS)

We consider the Yule-type multiplicative growth and division process, and describe the ubiquitous emergence of Weibull and log-normal distributions in a single framework. With the help of the integral transform and series expansion, we show that both distributions serve as asymptotic solutions of the time evolution equation for the branching process. In particular, the maximum likelihood method is employed to discriminate between the emergence of the Weibull distribution and that of the log-normal distribution. Further, the detailed conditions for the distinguished emergence of the Weibull distribution are probed. It is observed that the emergence depends on the manner of the division process for the two different types of distribution. Numerical simulations are also carried out, confirming the results obtained analytically.

Goh, Segun; Kwon, H. W.; Choi, M. Y.

2014-06-01

40

Large-Scale Weibull Analysis of H-451 Nuclear- Grade Graphite Specimen Rupture Data

NASA Technical Reports Server (NTRS)

A Weibull analysis was performed of the strength distribution and size effects for 2000 specimens of H-451 nuclear-grade graphite. The data, generated elsewhere, measured the tensile and four-point-flexure room-temperature rupture strength of specimens excised from a single extruded graphite log. Strength variation was compared with specimen location, size, and orientation relative to the parent body. In our study, data were progressively and extensively pooled into larger data sets to discriminate overall trends from local variations and to investigate the strength distribution. The CARES/Life and WeibPar codes were used to investigate issues regarding the size effect, Weibull parameter consistency, and nonlinear stress-strain response. Overall, the Weibull distribution described the behavior of the pooled data very well. However, the issue regarding the smaller-than-expected size effect remained. This exercise illustrated that a conservative approach using a two-parameter Weibull distribution is best for designing graphite components with low probability of failure for the in-core structures in the proposed Generation IV (Gen IV) high-temperature gas-cooled nuclear reactors. This exercise also demonstrated the continuing need to better understand the mechanisms driving stochastic strength response. Extensive appendixes are provided with this report to show all aspects of the rupture data and analytical results.

Nemeth, Noel N.; Walker, Andrew; Baker, Eric H.; Murthy, Pappu L.; Bratton, Robert L.

2012-01-01

41

Analysis of the clutter map CFAR in Weibull clutter

Non-Rayleigh clutter statistics result when sea or ground clutter is viewed with high-resolution radars (pulse width < 0.5 µs) at low grazing angles (< 5°). In this paper, we consider the problem of clutter-map constant false alarm rate (CMAP-CFAR) detection in Weibull-distributed clutter with a known shape parameter. The target is assumed to fluctuate according to the Swerling I model. Closed-form expressions for

M. Hamadouche; Mourad Barkat; M. Khodja

2000-01-01

42

A comparison of the generalized gamma and exponentiated Weibull distributions.

This paper provides a comparison of the three-parameter exponentiated Weibull (EW) and generalized gamma (GG) distributions. The connection between these two different families is that the hazard functions of both have the four standard shapes (increasing, decreasing, bathtub, and arc shaped), and in fact, the shape of the hazard is the same for identical values of the three parameters. For a given EW distribution, we define a matching GG using simulation and also by matching the 5th, 50th, and 95th percentiles. We compare EW and matching GG distributions graphically and using the Kullback-Leibler distance. We find that the survival functions for the EW and matching GG are graphically indistinguishable, and only the hazard functions can sometimes be seen to be slightly different. The Kullback-Leibler distances are very small and decrease with increasing sample size. We conclude that the similarity between the two distributions is striking, and therefore, the EW represents a convenient alternative to the GG with the identical richness of hazard behavior. More importantly, these results suggest that having the four basic hazard shapes may to some extent be an important structural characteristic of any family of distributions. PMID:24700647

Cox, Christopher; Matheson, Matthew

2014-09-20

43

NASA Technical Reports Server (NTRS)

Material characterization parameters obtained from naturally flawed specimens are necessary for reliability evaluation of non-deterministic advanced ceramic structural components. The least squares best fit method is applied to the three parameter uniaxial Weibull model to obtain the material parameters from experimental tests on volume or surface flawed specimens subjected to pure tension, pure bending, four point or three point loading. Several illustrative example problems are provided.

Gross, Bernard

1996-01-01

44

NASA Astrophysics Data System (ADS)

The two-parameter Weibull distribution (2PWD), similar to an instantaneous unit hydrograph (IUH), is parameterized in terms of Horton catchment ratios on the basis of a geomorphologic model of catchment response. For this the shape and scale parameters of the Weibull distribution are expressed analytically in terms of Horton's catchment ratios. The two parameters of the IUH derived using Nash's model, which is a two-parameter gamma distribution (2PGD), are also expressed analytically in terms of Horton ratios. The performance of the proposed methods is tested for describing a synthetic unit hydrograph (SUH) under limited data conditions. A comparison is made with the unit hydrographs derived from the event data of two real catchments, and with the existing geomorphologically based 2PGD for developing SUH given by Rosso (1984). The sensitivity analysis of the 2PWD to the nondimensional UH parameter (the product of peak discharge and time to peak) shows it to be more sensitive to the shape parameter a than to the scale parameter b. Further examination to find any similarity between the behavior of 2PWD and 2PGD showed that a in 2PWD corresponds to the shape parameter n in the 2PGD, and b behaves similar to the scale parameter k in the 2PGD. Finally, practical applicability of the proposed approach to ungauged catchments is tested using field data.
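The 2PWD-as-IUH idea can be illustrated directly. A minimal sketch (shape a and scale b are hypothetical, not Horton-derived): it treats the Weibull pdf as the IUH, computes the time to peak from the analytic mode, and verifies that the hydrograph carries unit volume:

```python
import math

def weibull_iuh(t, a, b):
    """Two-parameter Weibull pdf used as an instantaneous unit hydrograph."""
    return (a / b) * (t / b) ** (a - 1) * math.exp(-((t / b) ** a))

a, b = 2.5, 4.0   # hypothetical shape and scale (hours)

# Time to peak of the IUH is the mode of the pdf (valid for a > 1).
t_peak = b * ((a - 1) / a) ** (1 / a)
q_peak = weibull_iuh(t_peak, a, b)

# A unit hydrograph must carry unit volume: integrate the pdf numerically.
dt, volume, t = 0.001, 0.0, 0.001
while t < 60.0:
    volume += weibull_iuh(t, a, b) * dt
    t += dt
```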

Bhunya, P. K.; Berndtsson, R.; Singh, P. K.; Hubert, P.

2008-04-01

45

Composite Weibull-Inverse Transformed Gamma distribution and its actuarial application

NASA Astrophysics Data System (ADS)

This paper introduces a new composite model, namely, the composite Weibull-Inverse Transformed Gamma distribution, which assumes a Weibull distribution for the head up to a specified threshold and an inverse transformed gamma distribution beyond it. The closed form of the probability density function (pdf) as well as the estimation of parameters by the maximum likelihood method is presented. The model is compared with several benchmark distributions and their performances are measured. A well-known data set, the Danish fire loss data, is used for this purpose and its Value at Risk (VaR) under the new model is computed. In comparison to several standard models, the composite Weibull-Inverse Transformed Gamma model proved to be a competitive candidate.

Maghsoudi, Mastoureh; Bakar, Shaiful Anuar Abu; Hamzah, Nor Aishah

2014-07-01

46

NASA Astrophysics Data System (ADS)

The statistical properties of fracture strength of brittle and quasibrittle materials are often described in terms of the Weibull distribution. However, the weakest-link hypothesis, commonly used to justify it, is expected to fail when fracture occurs after significant damage accumulation. Here we show that this implies that the Weibull distribution is unstable in a renormalization-group sense for a large class of quasibrittle materials. Our theoretical arguments are supported by numerical simulations of disordered fuse networks. We also find that for brittle materials such as ceramics, the common assumption that the strength distribution can be derived from the distribution of preexisting microcracks by using Griffith's criteria is invalid. We attribute this discrepancy to crack bridging. Our findings raise questions about the applicability of Weibull statistics to most practical cases.

Bertalan, Zsolt; Shekhawat, Ashivni; Sethna, James P.; Zapperi, Stefano

2014-09-01

47

NASA Astrophysics Data System (ADS)

We describe a physically based derivation of the Weibull distribution with respect to fragmentation processes. In this approach we consider the result of a single-event fragmentation leading to a branching tree of cracks that show geometric scale invariance (fractal behavior). With this approach, because the Rosin-Rammler type distribution is just the integral form of the Weibull distribution, it, too, has a physical basis. In further consideration of mass distributions developed by fragmentation processes, we show that one particular mass distribution closely resembles the empirical lognormal distribution. This result suggests that the successful use of the lognormal distribution to describe fragmentation distributions may have been simply fortuitous.

Brown, Wilbur K.; Wohletz, Kenneth H.

1995-08-01

48

Weibull, log-Weibull and K-distributed ground clutter modeling analyzed by AIC

Ground clutter of cultivated land was measured using an L-band long-range air-route surveillance radar (ARSR) having a 3.0 µs pulsewidth and a 1.23° beamwidth at very low grazing angles between 0.21° and 0.32°. To determine the ground clutter amplitude, we introduce the Akaike Information Criterion (AIC), which gives a more rigorous fit of the distribution to the data than the least

S. Sayama; H. Sekine

2001-01-01

49

Thirteen sample trees of various sizes in a 29-year-old hinoki [Chamaecyparis obtusa (Sieb. et Zucc.) Endl.] plantation were felled and subjected to the stratified clip technique. Crown profile of foliage area fitted well with the Weibull distribution. The crown profile tended to be more skewed toward the top of crowns in smaller trees than in larger trees. This tendency was

Shigeta Mori; Akio Hagihara

1991-01-01

50

An EOQ Model with Two-Parameter Weibull Distribution Deterioration and Price-Dependent Demand

ERIC Educational Resources Information Center

An inventory replenishment policy is developed for a deteriorating item and price-dependent demand. The rate of deterioration is taken to be time-proportional and the time to deterioration is assumed to follow a two-parameter Weibull distribution. A power law form of the price dependence of demand is considered. The model is solved analytically…

Mukhopadhyay, Sushanta; Mukherjee, R. N.; Chaudhuri, K. S.

2005-01-01

51

Weibull-distributed dyke thickness reflects probabilistic character of host-rock strength

Magmatic sheet intrusions (dykes) constitute the main form of magma transport in the Earth’s crust. The size distribution of dykes is a crucial parameter that controls volcanic surface deformation and eruption rates and is required to realistically model volcano deformation for eruption forecasting. Here we present statistical analyses of 3,676 dyke thickness measurements from different tectonic settings and show that dyke thickness consistently follows the Weibull distribution. Known from materials science, power law-distributed flaws in brittle materials lead to Weibull-distributed failure stress. We therefore propose a dynamic model in which dyke thickness is determined by variable magma pressure that exploits differently sized host-rock weaknesses. The observed dyke thickness distributions are thus site-specific because rock strength, rather than magma viscosity and composition, exerts the dominant control on dyke emplacement. Fundamentally, the strength of geomaterials is scale-dependent and should be approximated by a probability distribution. PMID:24513695

Krumbholz, Michael; Hieronymus, Christoph F.; Burchardt, Steffi; Troll, Valentin R.; Tanner, David C.; Friese, Nadine

2014-01-01

52

Estimating Weibull parameters for materials

NASA Technical Reports Server (NTRS)

The statistical analysis of strength and fracture of materials in general, with application to fiber composites, is discussed. The weakest link model is considered in a fairly general form, and the resulting equations are demonstrated by using a Weibull distribution for flaws. This distribution appears naturally in a variety of problems, and therefore additional attention is devoted to analysis and statistical estimation connected with this distribution. Special working charts are included to facilitate interpretation of observed data and estimation of parameters. Implications of the size effect are considered for various kinds of flaw distributions. Failure and damage in a fiber-reinforced system are described. Some useful graphs are included for predicting the strength of such a system. Recent data on an organic-fiber (PRD 49) composite material are analyzed by the Weibull distribution with the methods presented.
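Maximum likelihood estimation for the two-parameter Weibull, as used throughout analyses of this kind, can be sketched as follows (simulated data, not the PRD 49 measurements; the profiled shape equation is solved by bisection):

```python
import math, random

def weibull_mle(data):
    """MLE for the two-parameter Weibull: bisect the profiled shape equation."""
    logs = [math.log(t) for t in data]
    mean_log = sum(logs) / len(logs)

    def g(k):
        # Profile log-likelihood score for the shape parameter k.
        s = sum(t ** k for t in data)
        s_log = sum((t ** k) * math.log(t) for t in data)
        return s_log / s - 1.0 / k - mean_log

    lo, hi = 0.05, 50.0
    for _ in range(80):            # g is increasing in k, so bisection works
        mid = 0.5 * (lo + hi)
        if g(mid) < 0:
            lo = mid
        else:
            hi = mid
    k = 0.5 * (lo + hi)
    lam = (sum(t ** k for t in data) / len(data)) ** (1.0 / k)
    return k, lam

random.seed(7)
true_k, true_lam = 2.0, 1.0
# Inverse-transform sampling: T = lam * (-ln U)^(1/k)
sample = [true_lam * (-math.log(random.random())) ** (1 / true_k)
          for _ in range(2000)]
k_hat, lam_hat = weibull_mle(sample)
```

With 2000 observations the estimates land close to the generating values; the same routine applies unchanged to measured strength data.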

Robinson, E. Y.

1971-01-01

53

We compared two regression models, which are based on the Weibull and probit functions, for the analysis of pesticide toxicity data from laboratory studies on Illinois crop and native plant species. Both mathematical models are continuous, differentiable, strictly positive, and...

54

The cure fraction models are usually used to model lifetime data with long-term survivors. In the present article, we introduce a Bayesian analysis of the four-parameter generalized modified Weibull (GMW) distribution in presence of cure fraction, censored data and covariates. In order to include the proportion of "cured" patients, mixture and non-mixture formulation models are considered. To demonstrate the ability of using this model in the analysis of real data, we consider an application to data from patients with gastric adenocarcinoma. Inferences are obtained by using MCMC (Markov Chain Monte Carlo) methods. PMID:24008248
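The mixture formulation mentioned above writes the population survival as S_pop(t) = pi + (1 - pi) * S(t), where pi is the cured fraction. A minimal sketch with a plain Weibull baseline (a simplification of the paper's four-parameter GMW; all values hypothetical):

```python
import math

def weibull_survival(t, k, lam):
    """Baseline Weibull survival S(t) = exp(-(t/lam)^k)."""
    return math.exp(-((t / lam) ** k))

def mixture_cure_survival(t, pi_cured, k, lam):
    """Mixture cure model: a fraction pi_cured never experiences the event."""
    return pi_cured + (1.0 - pi_cured) * weibull_survival(t, k, lam)

pi_cured, k, lam = 0.3, 1.4, 24.0   # hypothetical values (time in months)

s0    = mixture_cure_survival(0.0, pi_cured, k, lam)    # = 1 at t = 0
s_far = mixture_cure_survival(600.0, pi_cured, k, lam)  # plateaus at pi_cured
```

The long-term plateau at pi_cured is what distinguishes a cure model from an ordinary survival model, whose survival function decays to zero.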

Martinez, Edson Z; Achcar, Jorge A; Jácome, Alexandre A A; Santos, José S

2013-12-01

55

In reliability engineering, component failures are generally classified in one of three ways: (1) early life failures; (2) failures having random onset times; and (3) late life or 'wear out' failures. When the time-distribution of failures of a population of components is analysed in terms of a Weibull distribution, these failure types may be associated with shape parameters β having
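The three failure classes correspond to ranges of the Weibull shape parameter: values below 1 give a decreasing hazard (early-life failures), a value of exactly 1 a constant hazard (random onset, the exponential case), and values above 1 an increasing hazard (wear-out). A sketch with hypothetical parameters:

```python
def weibull_hazard(t, beta, eta):
    """Weibull hazard rate h(t) = (beta/eta) * (t/eta)**(beta - 1)."""
    return (beta / eta) * (t / eta) ** (beta - 1)

eta = 1000.0   # hypothetical characteristic life (hours)
times = (100.0, 500.0, 900.0)

# beta < 1: decreasing hazard (early-life / infant-mortality failures)
early = [weibull_hazard(t, 0.5, eta) for t in times]
# beta = 1: constant hazard (random-onset failures, exponential case)
random_onset = [weibull_hazard(t, 1.0, eta) for t in times]
# beta > 1: increasing hazard (wear-out failures)
wearout = [weibull_hazard(t, 3.0, eta) for t in times]
```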

P. L Hall; J. E Strutt

2003-01-01

56

Weibull analysis characterizes the breaking properties of dry-cured ham slices.

The breaking strength (σ) and stress-strain relation of several muscles [biceps femoris (BF), semitendinosus (ST) and semimembranosus (SM)] and the subcutaneous fat (SF) from a Spanish dry cured ham (Protected Designation of Origin of white ham from "Teruel") have been analysed by the uniaxial tensile test in order to predict the mechanical behaviour of this meat product. Thirty pieces were analysed and the stress-strain curves were obtained. A great dispersion of the σ values was observed. This leads to the necessity of employing statistical analyses to illustrate the extent to which strength values may vary. The Weibull analysis was applied to estimate the fracture probability. SM and SF showed the highest characteristic strength. The low values of the Weibull modulus indicate that dry-cured ham tissues behave as brittle materials. The stress-strain curves present characteristic forms for BF, ST and SM, which may be associated with their composition and the extent to which they are affected by the curing process. PMID:24769143
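The fracture-probability estimate behind such an analysis is the two-parameter Weibull form P_f = 1 - exp(-(stress/sigma0)**m); a low modulus m produces exactly the wide scatter of strengths the abstract describes. A sketch with hypothetical modulus and characteristic strength (not the ham-tissue values):

```python
import math

def fracture_probability(stress, m, sigma0):
    """Two-parameter Weibull failure probability at a given stress."""
    return 1.0 - math.exp(-((stress / sigma0) ** m))

# Hypothetical Weibull modulus and characteristic strength (kPa).
m, sigma0 = 3.0, 80.0

p_at_char = fracture_probability(sigma0, m, sigma0)   # always 1 - 1/e
p_low  = fracture_probability(40.0, m, sigma0)
p_high = fracture_probability(120.0, m, sigma0)
```

At the characteristic strength the failure probability is 1 - 1/e (about 63.2 %) for any modulus, which is why sigma0 is the conventional location summary in Weibull strength analysis.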

Romero de Ávila, M Dolores; Escudero, Rosa; Ordóñez, Juan A; Isabel Cambero, M

2014-08-01

57

Estimating Weibull parameters for composite materials.

NASA Technical Reports Server (NTRS)

This paper deals with the statistical analysis of strength and fracture of materials in general, with application to fiber composites. The 'weakest link' model is considered in a fairly general form, and the resulting equations are demonstrated by using a Weibull distribution for flaws. This distribution appears naturally in a variety of problems, and therefore additional attention is devoted to analysis and statistical estimation connected with this distribution. Special working charts are included to facilitate interpretation of observed data and estimation of parameters. Implications of the size effect are considered for various kinds of flaw distributions. The paper describes failure and damage in fiber-reinforced systems.

Robinson, E. Y.

1972-01-01

58

A delay metric for RC circuits based on the Weibull distribution

Physical design optimizations such as placement, interconnect synthesis, floorplanning, and routing require fast and accurate analysis of RC networks. Because of its simple closed form and fast evaluation, the Elmore delay metric has been widely adopted. The recently proposed delay metrics PRIMO and H-gamma match the first three circuit moments to the probability density function of a Gamma statistical distribution.
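A Weibull delay metric of the kind the title refers to can be sketched by moment matching (a generic sketch, not necessarily the paper's exact fitting procedure): the coefficient of variation fixes the shape k, found by bisection since CV^2 is monotone decreasing in k, and the scale then follows from the mean:

```python
import math

def fit_weibull_moments(mean, var):
    """Match a Weibull(k, lam) to a given mean and variance.
    CV^2 = Gamma(1+2/k)/Gamma(1+1/k)^2 - 1 is decreasing in k: bisect it."""
    target = var / mean ** 2

    def cv2(k):
        return math.gamma(1 + 2 / k) / math.gamma(1 + 1 / k) ** 2 - 1

    lo, hi = 0.1, 20.0
    for _ in range(80):
        mid = 0.5 * (lo + hi)
        if cv2(mid) > target:    # CV^2 too large means k is too small
            lo = mid
        else:
            hi = mid
    k = 0.5 * (lo + hi)
    lam = mean / math.gamma(1 + 1 / k)
    return k, lam

# Exponential case as a sanity check: mean = 2, var = 4 implies k = 1, lam = 2.
k, lam = fit_weibull_moments(2.0, 4.0)
# A 50 % delay estimate is then the Weibull median:
t50 = lam * math.log(2.0) ** (1 / k)
```

In a delay-metric setting the mean and variance would come from the first two circuit moments of the impulse response rather than being given directly.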

Frank Liu; Chandramouli V. Kashyap; Charles J. Alpert

2002-01-01

59

NASA Technical Reports Server (NTRS)

The Weibull process, identified as the inhomogeneous Poisson process with the Weibull intensity function, is used to model the reliability growth assessment of the space shuttle main engine test and flight failure data. Additional tables of percentage-point probabilities for several different values of the confidence coefficient have been generated for setting (1-alpha)100-percent two sided confidence interval estimates on the mean time between failures. The tabled data pertain to two cases: (1) time-terminated testing, and (2) failure-terminated testing. The critical values of the three test statistics, namely Cramer-von Mises, Kolmogorov-Smirnov, and chi-square, were calculated and tabled for use in the goodness of fit tests for the engine reliability data. Numerical results are presented for five different groupings of the engine data that reflect the actual response to the failures.

Wheeler, J. T.

1990-01-01

60

NASA Astrophysics Data System (ADS)

A stochastic rainfall model is presented for the generation of hourly rainfall data in an urban area in Malaysia. In view of the high temporal and spatial variability of rainfall within the tropical rain belt, the Spatial-Temporal Neyman-Scott Rectangular Pulse model was used. The model, which is governed by the Neyman-Scott process, employs a reasonable number of parameters to represent the physical attributes of rainfall. A common approach is to attach each attribute to a mathematical distribution. With respect to rain cell intensity, this study proposes the use of a mixed exponential distribution. The performance of the proposed model was compared to a model that employs the Weibull distribution. Hourly and daily rainfall data from four stations in the Damansara River basin in Malaysia were used as input to the models, and simulations of hourly series were performed for an independent site within the basin. The performance of the models was assessed based on how closely the statistical characteristics of the simulated series resembled the statistics of the observed series. The findings obtained based on graphical representation revealed that the statistical characteristics of the simulated series for both models compared reasonably well with the observed series. However, a further assessment using the AIC, BIC and RMSE showed that the proposed model yields better results. The results of this study indicate that for tropical climates, the proposed model, using a mixed exponential distribution, is the best choice for generation of synthetic data for ungauged sites or for sites with insufficient data within the limit of the fitted region.
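Sampling rain-cell intensity from the proposed mixed exponential distribution is straightforward; the sketch below uses hypothetical mixture parameters, not the fitted Damansara values:

```python
import random

def mixed_exponential(p, mu1, mu2, rng):
    """Draw a rain-cell intensity from a two-component mixed exponential:
    with probability p use mean mu1, otherwise mean mu2."""
    mean = mu1 if rng.random() < p else mu2
    return rng.expovariate(1.0 / mean)

rng = random.Random(42)
# Hypothetical mixture weight and component means (mm/h).
p, mu1, mu2 = 0.7, 1.5, 8.0

sample = [mixed_exponential(p, mu1, mu2, rng) for _ in range(50000)]
sample_mean = sum(sample) / len(sample)
expected_mean = p * mu1 + (1 - p) * mu2
```

The heavier second component lets the mixture reproduce the occasional intense cells that a single exponential underestimates, which is the motivation the abstract gives for preferring it.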

Abas, Norzaida; Daud, Zalina M.; Yusof, Fadhilah

2014-11-01

61

The author comments on an article by Choe concerning the use of the Weibull survival function to fit curves to infant and childhood mortality data. "As an alternative to the computer program [employed by Choe], this note gives a simple linearization procedure that is easily executed by hand calculator. This procedure allows one to check the fit of the Weibull curve locally...for a range of ages (e.g. those from which one wishes to extrapolate). It also provides a simple way to find the Weibull curve parameters of optimum local fit." The procedure is illustrated using data for the period 1970-1971 for males and females in Taiwan. PMID:12312465
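The linearization the note describes follows from S(t) = exp(-(t/eta)**beta), so ln(-ln S(t)) = beta*ln t - beta*ln eta: plotting ln(-ln S) against ln t gives a straight line with slope beta. A sketch using exact Weibull survival values (hypothetical parameters, not the Taiwan data) so the hand-calculator regression can be checked against the truth:

```python
import math

# Survival fractions at several ages; exactly Weibull here (beta = 1.3,
# eta = 5) so the linearized fit can be verified against known values.
beta_true, eta_true = 1.3, 5.0
ages = [0.5, 1.0, 2.0, 3.0, 5.0]
surv = [math.exp(-((t / eta_true) ** beta_true)) for t in ages]

# Linearize: y = ln(-ln S) = beta * ln t - beta * ln eta
xs = [math.log(t) for t in ages]
ys = [math.log(-math.log(s)) for s in surv]

# Ordinary least-squares slope and intercept, computable by hand calculator.
n = len(xs)
xbar, ybar = sum(xs) / n, sum(ys) / n
slope = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) / \
        sum((x - xbar) ** 2 for x in xs)
intercept = ybar - slope * xbar

beta_hat = slope
eta_hat = math.exp(-intercept / slope)
```

With real mortality data the points scatter about the line, and restricting the fit to a local range of ages gives the "optimum local fit" the note recommends.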

Luther, N Y

1982-11-01

62

A semi-Markov model based on generalized Weibull distribution with an illustration for HIV disease.

Multi-state stochastic models are useful tools for studying complex dynamics such as chronic diseases. Semi-Markov models explicitly define distributions of waiting times, giving an extension of continuous time and homogeneous Markov models based implicitly on exponential distributions. This paper develops a parametric model adapted to complex medical processes. (i) We introduced a hazard function of waiting times with a U or inverse U shape. (ii) These distributions were specifically selected for each transition. (iii) The vector of covariates was also selected for each transition. We applied this method to the evolution of HIV infected patients. We used a sample of 1244 patients followed up at the hospital in Nice, France. PMID:16450855

Foucher, Yohann; Mathieu, Eve; Saint-Pierre, Philippe; Durand, Jean-François; Daurès, Jean-Pierre

2005-12-01

63

Weibull-Based Design Methodology for Rotating Aircraft Engine Structures

NASA Technical Reports Server (NTRS)

The NASA Energy Efficient Engine (E(sup 3)-Engine) is used as the basis of a Weibull-based life and reliability analysis. Each component's life, and thus the engine's life, is defined by high-cycle fatigue (HCF) or low-cycle fatigue (LCF). Knowing the cumulative life distribution of each of the components making up the engine as represented by a Weibull slope is a prerequisite to predicting the life and reliability of the entire engine. As the engine Weibull slope increases, the predicted lives decrease. The predicted engine lives L(sub 5) (95 % probability of survival) of approximately 17,000 and 32,000 hr do correlate with current engine maintenance practices without and with refurbishment, respectively. The individual high pressure turbine (HPT) blade lives necessary to obtain a blade system life L(sub 0.1) (99.9 % probability of survival) of 9000 hr for Weibull slopes of 3, 6 and 9 are 47,391, 20,652, and 15,658 hr, respectively. For a design life of the HPT disks having probable points of failure equal to or greater than 36,000 hr at a probability of survival of 99.9 %, the predicted disk system life L(sub 0.1) can vary from 9,408 to 24,911 hr.
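The L-lives quoted above are Weibull percentiles: the age at survival probability R is t_R = eta * (-ln R)**(1/beta). A sketch (hypothetical target life, not the E3 data) showing how the characteristic life required to meet a fixed L0.1 target falls as the Weibull slope increases, the same trend the abstract reports for the individual blade lives:

```python
import math

def life_at_survival(R, beta, eta):
    """Age at which a Weibull population still has survival probability R."""
    return eta * (-math.log(R)) ** (1.0 / beta)

def char_life_for_target(L_target, R, beta):
    """Characteristic life eta needed so the R-survival life equals L_target."""
    return L_target / (-math.log(R)) ** (1.0 / beta)

# Hypothetical target: a system life L0.1 (99.9 % survival) of 9000 hr.
etas = {beta: char_life_for_target(9000.0, 0.999, beta)
        for beta in (3.0, 6.0, 9.0)}

# Round trip: with that eta, the 99.9 %-survival life is the target again.
check = life_at_survival(0.999, 3.0, etas[3.0])
```

A steeper slope concentrates failures near the characteristic life, so the low-percentile life sits closer to eta and a smaller eta suffices for the same target.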

Zaretsky, Erwin; Hendricks, Robert C.; Soditus, Sherry

2002-01-01

64

Radar Detection in Weibull Clutter

Radar detection in Weibull clutter is examined from a statistical detection viewpoint. Weibull clutter parameters are determined and related to measured values of land and sea clutter. Optimum performance in Weibull clutter is determined, and practical receivers that approach this performance are identified. Receiver performance in Rayleigh, log-normal, and Weibull clutter is evaluated and compared.

D. C. Schleher

1976-01-01

65

NASA Technical Reports Server (NTRS)

The calculation of shape and scale parameters of the two-parameter Weibull distribution is described using the least-squares analysis and maximum likelihood methods for volume- and surface-flaw-induced fracture in ceramics with complete and censored samples. Detailed procedures are given for evaluating 90 percent confidence intervals for maximum likelihood estimates of shape and scale parameters, the unbiased estimates of the shape parameters, and the Weibull mean values and corresponding standard deviations. Furthermore, the necessary steps are described for detecting outliers and for calculating the Kolmogorov-Smirnov and the Anderson-Darling goodness-of-fit statistics and 90 percent confidence bands about the Weibull distribution. It also shows how to calculate the Batdorf flaw-density constants by using the Weibull distribution statistical parameters. The techniques described were verified with several example problems, from the open literature, and were coded in the Structural Ceramics Analysis and Reliability Evaluation (SCARE) design program.

Pai, Shantaram S.; Gyekenyesi, John P.

1989-01-01

66

Finite-size effects on return interval distributions for weakest-link-scaling systems

NASA Astrophysics Data System (ADS)

The Weibull distribution is a commonly used model for the strength of brittle materials and earthquake return intervals. Deviations from Weibull scaling, however, have been observed in earthquake return intervals and the fracture strength of quasibrittle materials. We investigate weakest-link scaling in finite-size systems and deviations of empirical return interval distributions from the Weibull distribution function. Our analysis employs the ansatz that the survival probability function of a system with complex interactions among its units can be expressed as the product of the survival probability functions for an ensemble of representative volume elements (RVEs). We show that if the system comprises a finite number of RVEs, it obeys the κ-Weibull distribution. The upper tail of the κ-Weibull distribution declines as a power law in contrast with Weibull scaling. The hazard rate function of the κ-Weibull distribution decreases linearly after a waiting time τ_c ∝ n^(1/m), where m is the Weibull modulus and n is the system size in terms of representative volume elements. We conduct statistical analysis of experimental data and simulations which show that the κ-Weibull provides competitive fits to the return interval distributions of seismic data and of avalanches in a fiber bundle model. In conclusion, using theoretical and statistical analysis of real and simulated data, we demonstrate that the κ-Weibull distribution is a useful model for extreme-event return intervals in finite-size systems.

Hristopulos, Dionissios T.; Petrakis, Manolis P.; Kaniadakis, Giorgio

2014-05-01

67

A mixed-Weibull regression model for the analysis of automotive warranty data

This paper presents a case study regarding the reliability analysis of some automotive components based on field failure warranty data. The components exhibit two different failure modes, namely early and wearout failures, and are mounted on different vehicles, which differ among themselves for car model and engine type, thus involving different operating conditions. Hence, the failure time of each component

Laura Attardi; Maurizio Guida; Gianpaolo Pulcini

2005-01-01

68

Robust Fitting of a Weibull Model with Optional Censoring

The Weibull family is widely used to model failure data, or lifetime data, although the classical two-parameter Weibull distribution is limited to positive data and monotone failure rate. The parameters of the Weibull model are commonly obtained by maximum likelihood estimation; however, it is well-known that this estimator is not robust when dealing with contaminated data. A new robust procedure is introduced to fit a Weibull model by using L2 distance, i.e. integrated square distance, of the Weibull probability density function. The Weibull model is augmented with a weight parameter to robustly deal with contaminated data. Results comparing a maximum likelihood estimator with an L2 estimator are given in this article, based on both simulated and real data sets. It is shown that this new L2 parametric estimation method is more robust and does a better job than maximum likelihood in the newly proposed Weibull model when data are contaminated. The same preference for the L2 distance criterion and the new Weibull model also holds for right-censored data with contamination. PMID:23888090
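The L2 (integrated squared) distance criterion for a density f_theta can be written L2(theta) = integral of f_theta^2 minus (2/n) * sum of f_theta(x_i). The sketch below (plain Weibull, without the paper's weight-parameter augmentation; simulated data) evaluates this objective and checks that it prefers the generating parameters over a distant candidate:

```python
import math, random

def weibull_pdf(t, k, lam):
    """Two-parameter Weibull density."""
    return (k / lam) * (t / lam) ** (k - 1) * math.exp(-((t / lam) ** k))

def l2_criterion(data, k, lam, hi=20.0, n_grid=4000):
    """L2E objective: integral of f^2 (rectangle rule) minus 2/n * sum f(x_i)."""
    h = hi / n_grid
    integral = 0.0
    for i in range(1, n_grid + 1):   # skip t = 0 (pdf can blow up for k < 1)
        t = i * h
        integral += weibull_pdf(t, k, lam) ** 2 * h
    return integral - 2.0 / len(data) * sum(weibull_pdf(x, k, lam) for x in data)

random.seed(3)
# Simulated lifetimes from Weibull(k = 2, lam = 1) via inverse transform.
data = [(-math.log(random.random())) ** 0.5 for _ in range(1000)]

score_true = l2_criterion(data, 2.0, 1.0)   # generating parameters
score_off  = l2_criterion(data, 0.8, 3.0)   # a distant candidate
```

Minimizing this objective over (k, lam) yields the L2 estimate; because each observation enters only through its density value, isolated outliers contribute little, which is the source of the robustness the abstract describes.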

Yang, Jingjing; Scott, David W.

2013-01-01

69

The Savannah River Site (SRS) spring operated pressure relief valve (SORV) maintenance intervals were evaluated using an approach provided by the American Petroleum Institute (API RP 581) for risk-based inspection technology (RBI). In addition, the impact of extending the inspection schedule was evaluated using Monte Carlo Simulation (MCS). The API RP 581 approach is characterized as a Weibull analysis with modified Bayesian updating provided by SRS SORV proof testing experience. Initial Weibull parameter estimates were updated as per SRS's historical proof test records contained in the Center for Chemical Process Safety (CCPS) Process Equipment Reliability Database (PERD). The API RP 581 methodology was used to estimate the SORV's probability of failing on demand (PFD), and the annual expected risk. The API RP 581 methodology indicates that the current SRS maintenance plan is conservative. Cost savings may be attained in certain mild service applications that present low PFD and overall risk. Current practices are reviewed and recommendations are made for extending inspection intervals. The paper gives an illustration of the inspection costs versus the associated risks by using API RP 581 Risk Based Inspection (RBI) Technology. A cost effective maintenance frequency balancing both financial risk and inspection cost is demonstrated.

Harris, S.; Gross, R.; Mitchell, E.

2011-01-18

70

A Weibull characterization for tensile fracture of multicomponent brittle fibers

NASA Technical Reports Server (NTRS)

A statistical characterization for multicomponent brittle fibers is presented. The method, which is an extension of usual Weibull distribution procedures, statistically considers the components making up a fiber (e.g., substrate, sheath, and surface) as separate entities and taken together as in a fiber. Tensile data for silicon carbide fiber and for an experimental carbon-boron alloy fiber are evaluated in terms of the proposed multicomponent Weibull characterization.

Barrows, R. G.

1977-01-01

71

Weibull crack density coefficient for polydimensional stress states

NASA Technical Reports Server (NTRS)

A structural ceramic analysis and reliability evaluation code has recently been developed encompassing volume and surface flaw induced fracture, modeled by the two-parameter Weibull probability density function. A segment of the software involves computing the Weibull polydimensional stress state crack density coefficient from uniaxial stress experimental fracture data. The relationship of the polydimensional stress coefficient to the uniaxial stress coefficient is derived for a shear-insensitive material with a random surface flaw population.

Gross, Bernard; Gyekenyesi, John P.

1989-01-01

72

Gas turbine safety improvement through risk analysis

This paper is intended to provide the engineer with the information necessary to understand certain statistical methods that are used to improve system safety. It will provide an understanding of Weibull analysis, in that it describes when the Weibull distribution is appropriate, how to construct a Weibull plot, and how to use the parameters of the Weibull distribution to calculate risk. The paper will also provide the engineer with a comprehension of Monte Carlo simulation as it relates to quantifying safety risk. The basic components of Monte Carlo simulation are discussed as well as the formulation of a system model and its application in the gas turbine industry.

Crosby, T.M.; Reinman, G.L.

1988-04-01

73

Performance Sampling and Weibull Distributions.

ERIC Educational Resources Information Center

Concerning their study of Wisconsin school superintendents, the authors comment briefly on small differences between their own tactics for modeling mobility and the tactics used by some others, including Schmittlein and Morrison. (Author/MLF)

March, James C.; March, James G.

1981-01-01

74

Modeling of wind speed variation is an essential requirement in the estimation of the wind energy potential for a typical site. In this paper, the wind energy potential in Ibadan (Lat. 7.43°N; Long. 3.9°E; Alt. 227.2 m) is statistically analyzed using daily wind speed data for 10 years (1995-2004) obtained from the International Institute of Tropical Agriculture

D. A. Fadare

75

A Weibull characterization for tensile fracture of multicomponent brittle fibers

NASA Technical Reports Server (NTRS)

Necessary to the development and understanding of brittle fiber reinforced composites is a means to statistically describe fiber strength and strain-to-failure behavior. A statistical characterization for multicomponent brittle fibers is presented. The method, which is an extension of usual Weibull distribution procedures, statistically considers the components making up a fiber (e.g., substrate, sheath, and surface) as separate entities and taken together as in a fiber. Tensile data for silicon carbide fiber and for an experimental carbon-boron alloy fiber are evaluated in terms of the proposed multicomponent Weibull characterization.

Barrows, R. G.

1977-01-01

76

NASA Technical Reports Server (NTRS)

The calculation of shape and scale parameters of the two-parameter Weibull distribution is described using the least-squares analysis and maximum likelihood methods for volume- and surface-flaw-induced fracture in ceramics with complete and censored samples. Detailed procedures are given for evaluating 90 percent confidence intervals for maximum likelihood estimates of shape and scale parameters, the unbiased estimates of the shape parameters, and the Weibull mean values and corresponding standard deviations. Furthermore, the necessary steps are described for detecting outliers and for calculating the Kolmogorov-Smirnov and the Anderson-Darling goodness-of-fit statistics and 90 percent confidence bands about the Weibull distribution. It also shows how to calculate the Batdorf flaw-density constants by using the Weibull distribution statistical parameters. The techniques described were verified with several example problems from the open literature, and were coded in the Structural Ceramics Analysis and Reliability Evaluation (SCARE) design program.

Pai, Shantaram S.; Gyekenyesi, John P.

1988-01-01

77

The CMS experiment expects to manage several Pbytes of data each year during the LHC programme, distributing them over many computing sites around the world and enabling data access at those centers for analysis. CMS has identified the distributed sites as the primary location for physics analysis to support a wide community with thousands of potential users. This represents an unprecedented

Alessandra Fanfani; M. Anzar Afaq; Jose Afonso Sanches; Julia Andreeva; Giusepppe Bagliesi; Lothar Bauerdick; Stefano Belforte; Patricia Bittencourt Sampaio; Ken Bloom; Barry Blumenfeld; Chris Brew; Marco Calloni; Daniele Cesini; Mattia Cinquilli; Giuseppe Codispoti; Jorgen D’Hondt; Liang Dong; Danilo Dongiovanni; Giacinto Donvito; David Dykstra; Erik Edelmann; Ricky Egeland; Peter Elmer; Giulio Eulisse; Dave Evans; Federica Fanzago; Fabio Farina; Derek Feichtinger; Ian Fisk; Josep Flix; Claudio Grandi; Yuyi Guo; Kalle Happonen; José M. Hernàndez; Chih-Hao Huang; Kejing Kang; Edward Karavakis; Matthias Kasemann; Carlos Kavka; Akram Khan; Bockjoo Kim; Jukka Klem; Jesper Koivumäki; Thomas Kress; Peter Kreuzer; Tibor Kurca; Valentin Kuznetsov; Stefano Lacaprara; Kati Lassila-Perini; James Letts; Tomas Lindén; Lee Lueking; Joris Maes; Nicolò Magini; Gerhild Maier; Patricia Mcbride; Simon Metson; Vincenzo Miccio; Sanjay Padhi; Haifeng Pi; Hassen Riahi; Daniel Riley; Paul Rossman; Pablo Saiz; Andrea Sartirana; Andrea Sciabà; Vijay Sekhri; Daniele Spiga; Lassi Tuura; Eric Vaandering; Lukas Vanelderen; Petra Van Mulders; Aresh Vedaee; Ilaria Villella; Eric Wicklund; Tony Wildish; Christoph Wissing; Frank Würthwein

2010-01-01

78

Impact of Three-Parameter Weibull Models in Probabilistic Assessment of Earthquake Hazards

NASA Astrophysics Data System (ADS)

This paper investigates the suitability of a three-parameter (scale, shape, and location) Weibull distribution in probabilistic assessment of earthquake hazards. The performance is also compared with two other popular models from the same Weibull family, namely the two-parameter Weibull model and the inverse Weibull model. A complete and homogeneous earthquake catalog (Yadav et al. in Pure Appl Geophys 167:1331-1342, 2010) of 20 events (M ≥ 7.0), spanning the period 1846 to 1995 from north-east India and its surrounding region (20°-32°N and 87°-100°E), is used to perform this study. The model parameters are initially estimated from graphical plots and later confirmed from statistical estimations such as maximum likelihood estimation (MLE) and method of moments (MoM). The asymptotic variance-covariance matrix for the MLE estimated parameters is further calculated on the basis of the Fisher information matrix (FIM). The model suitability is appraised using different statistical goodness-of-fit tests. For the study area, the estimated conditional probability for an earthquake within a decade comes out to be very high (≈0.90) for an elapsed time of 18 years (i.e., 2013). The study also reveals that the use of the location parameter provides more flexibility to the three-parameter Weibull model in comparison to the two-parameter Weibull model. Therefore, it is suggested that the three-parameter Weibull model has high importance in empirical modeling of earthquake recurrence and seismic hazard assessment.
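The conditional probability quoted above follows from the three-parameter Weibull survival function S(t) = exp(-(((t - gamma)/eta)**beta)) for t > gamma, via P(T <= t + delta | T > t) = 1 - S(t + delta)/S(t). A sketch with hypothetical recurrence parameters (not the fitted values of the study):

```python
import math

def survival_3p_weibull(t, beta, eta, gamma):
    """Three-parameter Weibull survival; the support starts at location gamma."""
    if t <= gamma:
        return 1.0
    return math.exp(-(((t - gamma) / eta) ** beta))

def conditional_probability(t_elapsed, delta, beta, eta, gamma):
    """P(event in (t, t + delta] | no event up to t)."""
    s_t = survival_3p_weibull(t_elapsed, beta, eta, gamma)
    s_td = survival_3p_weibull(t_elapsed + delta, beta, eta, gamma)
    return (s_t - s_td) / s_t

# Hypothetical recurrence parameters in years (shape, scale, location).
beta, eta, gamma = 1.5, 8.0, 3.0

# Probability of an event within the next decade after 18 elapsed years.
p10 = conditional_probability(18.0, 10.0, beta, eta, gamma)
```

With beta > 1 the hazard increases with elapsed time, so the same ten-year window carries a higher conditional probability late in the cycle than early, which is the behavior hazard-based recurrence models are meant to capture.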

Pasari, Sumanta; Dikshit, Onkar

2014-07-01

79

NASA Technical Reports Server (NTRS)

This manual describes the operation and theory of the PC-CARES (Personal Computer-Ceramic Analysis and Reliability Evaluation of Structures) computer program for the IBM PC and compatibles running the PC-DOS/MS-DOS or IBM/MS OS/2 (version 1.1 or higher) operating systems. The primary purpose of this code is to estimate Weibull material strength parameters, the Batdorf crack density coefficient, and other related statistical quantities. Included in the manual is a description of the calculation of the shape and scale parameters of the two-parameter Weibull distribution, using least-squares analysis and maximum likelihood methods, for volume- and surface-flaw-induced fracture in ceramics with complete and censored samples. The methods for detecting outliers, for calculating the Kolmogorov-Smirnov and Anderson-Darling goodness-of-fit statistics and 90 percent confidence bands about the Weibull line, and the techniques for calculating the Batdorf flaw-density constants are also described.
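The censored-sample maximum likelihood estimation the manual covers can be sketched as a generic two-parameter Weibull MLE (this is not the PC-CARES implementation); the strength data, censoring limit, and starting values below are invented for illustration.

```python
# Minimal sketch of two-parameter Weibull MLE (shape m, scale s) with
# right-censored observations. Failures contribute the log-density,
# censored items the log-survival. All numbers are illustrative.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)
true_m, true_s = 8.0, 400.0
x = true_s * rng.weibull(true_m, size=60)
censor = x > 430.0                  # suspend observations above a fixed limit
x = np.where(censor, 430.0, x)

def neg_log_lik(theta):
    m, s = np.exp(theta)            # optimize in log-space to keep m, s > 0
    z = (x / s) ** m
    # censored: log S(x) = -z; failures: log f(x) = log(m/s) + (m-1) log(x/s) - z
    ll = np.where(censor, -z, np.log(m / s) + (m - 1) * np.log(x / s) - z)
    return -ll.sum()

res = minimize(neg_log_lik, x0=np.log([2.0, x.mean()]), method="Nelder-Mead")
m_hat, s_hat = np.exp(res.x)
print(f"shape ≈ {m_hat:.1f}, scale ≈ {s_hat:.0f}")
```

The least-squares (probability-plot) alternative the manual also describes differs only in fitting a straight line to transformed ranks rather than maximizing this likelihood.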

Szatmary, Steven A.; Gyekenyesi, John P.; Nemeth, Noel N.

1990-01-01

80

[Weibull equation and dissolution kinetics].

In modelling the kinetics of dissolution, two variants of the Weibull equation, in two-parameter or three-parameter form, are used, and the estimates of their parameters are not entirely comparable. The introductory part of the paper sums up four variants of the equation and characterizes their relationship to the granulometric Rosin-Rammler-Sperling equation. The kinetics of liberation of pilocarpinium chloride from lyophilized lamellae with graded amounts of hydroxypropylmethylcellulose is expressed by the above-mentioned equations and their parameters are compared. The conclusion discusses the general usability of the equations and their correct interpretation in modelling the dissolution of active ingredients from dosage forms. PMID:15631003

Zatloukal, Z

2004-11-01

81

Cruciform beam fracture mechanics specimens have been developed in the Heavy Section Steel Technology (HSST) Program at Oak Ridge National Laboratory (ORNL) to introduce a prototypic, far-field, out-of-plane biaxial bending stress component in the test section that approximates the nonlinear biaxial stresses resulting from pressurized-thermal-shock or pressure-temperature loading of a nuclear reactor pressure vessel (RPV). Matrices of cruciform beam tests were developed to investigate and quantify the effects of temperature, biaxial loading, and specimen size on the fracture initiation toughness of two-dimensional (constant depth), shallow, surface flaws. Tests were conducted under biaxial load ratios ranging from uniaxial to equibiaxial. These tests demonstrated that biaxial loading can have a pronounced effect on shallow-flaw fracture toughness in the lower transition temperature region for RPV materials. Two- and three-parameter Weibull models have been calibrated using a new scheme (developed at the University of Illinois) that maps toughness data from test specimens with distinctly different levels of crack-tip constraint to a small-scale-yielding (SSY) Weibull stress space. These models, using a new hydrostatic stress criterion in place of the more commonly used maximum principal stress in the kernel of the Weibull stress integral definition, have been shown to correlate the experimentally observed biaxial effect in cruciform specimens, thereby providing a scaling mechanism between uniaxial and biaxial loading states.

Bass, B.R.; McAfee, W.J.; Williams, P.T.

1999-08-01

82

Balanced condition in networks leads to Weibull statistics

The importance of the balance in inhibitory and excitatory couplings in the brain has increasingly been realized. Despite the key role played by inhibitory-excitatory couplings in the functioning of brain networks, the impact of a balanced condition on the stability properties of underlying networks remains largely unknown. We investigate properties of the largest eigenvalues of networks having such couplings, and find that they follow completely different statistics when in the balanced situation. Based on numerical simulations, we demonstrate that the transition from Weibull to Fréchet via the Gumbel distribution can be controlled by the variance of the column sum of the adjacency matrix, which depends monotonically on the denseness of the underlying network. As a balanced condition is imposed, the largest real part of the eigenvalue emulates a transition to the generalized extreme value statistics, independent of the inhibitory connection probability. Furthermore, the transition to the Weibull statistics...

Jalan, Sarika

2013-01-01

83

An exploratory model analysis device we call CDF knotting is introduced. It is a technique we have found useful for exploring relationships between points in the parameter space of a model and global properties of associated distribution functions. It can be used to alert the model builder to a condition we call lack of distinguishability which is to nonlinear models

K. T. Wallenius; A. S. Korkotsides

1990-01-01

84

NASA Astrophysics Data System (ADS)

The Weibull distribution between volume and square root of isopach area has been recently introduced for determining volume of tephra deposits, which is crucial to the assessment of the magnitude and hazards of explosive volcanoes. We show how the decay of the size of the largest lithics with the square root of isopleth area can also be well described using a Weibull function and how plume height correlates strongly with corresponding Weibull parameters. Variations of median grain size (Mdφ) values with square root of area of the associated contours can similarly be well fitted with a Weibull function. Weibull parameters, derived for both the thinning of tephra deposits and the decrease of grain size (both maximum lithic diameter and Mdφ) with a proxy for the distance from vent (e.g., square root of isoline areas), can be combined to classify the style of explosive volcanic eruptions. Accounting for the uncertainty in the derivation of eruptive parameters (e.g., plume height and volume of tephra deposits) is crucial to any classification of eruptive style and hazard assessment. Considering a typical uncertainty of 20 % for the determination of plume height, a new eruption classification scheme based on selected Weibull parameters is proposed. Ultraplinian, Plinian, Subplinian, and small-moderate explosive eruptions are defined on the grounds of plume height and mass eruption rate. Overall, the Weibull fitting represents a versatile and reliable strategy for the estimation of both the volume of tephra deposits and the height of volcanic plumes and for the classification of eruptive style. Nonetheless, due to the typically large uncertainties (mainly due to availability of data, compilation of isopach and isopleth maps, and discrepancies from empirical best fits), plume height, volume, and magnitude of explosive eruptions cannot be considered as absolute values, regardless of the technique used.
It is important that various empirical and analytical methods are applied in order to assess such an uncertainty.

Bonadonna, Costanza; Costa, Antonio

2013-08-01

85

Weibull-k Revisited: "Tall" Profiles and Height Variation of Wind Statistics

NASA Astrophysics Data System (ADS)

The Weibull distribution is commonly used to describe climatological wind-speed distributions in the atmospheric boundary layer. While vertical profiles of mean wind speed in the atmospheric boundary layer have received significant attention, the variation of the shape of the wind distribution with height is less understood. Previously we derived a probabilistic model based on similarity theory for calculating the effects of stability and planetary boundary-layer depth upon long-term mean wind profiles. However, some applications (e.g. wind energy estimation) require the Weibull shape parameter (k), as well as mean wind speed. Towards the aim of improving predictions of the Weibull-k profile, we develop expressions for the profile of long-term variance of wind speed, including a method extending our probabilistic wind-profile theory; together these two profiles lead to a profile of Weibull shape parameter. Further, an alternate model for the vertical profile of the Weibull shape parameter is made, improving upon a basis set forth by Wieringa (Boundary-Layer Meteorol, 1989, Vol. 47, 85-110), and connecting with a newly-corrected corollary of the perturbed geostrophic-drag theory of Troen and Petersen (European Wind Atlas, 1989, Risø National Laboratory, Roskilde). Comparing the models for Weibull-k profiles, a new interpretation and explanation is given for the vertical variation of the shape of wind-speed distributions. Results of the modelling are shown for a number of sites, with a discussion of the models' efficacy and applicability. The latter includes a comparative evaluation of Wieringa-type empirical models and perturbed-geostrophic forms with regard to surface-layer behaviour, as well as for heights where climatological wind-speed variability is not dominated by surface effects.

Kelly, Mark; Troen, Ib; Jørgensen, Hans E.

2014-07-01

86

Transmission overhaul and replacement predictions using Weibull and renewal theory

NASA Technical Reports Server (NTRS)

A method to estimate the frequency of transmission overhauls is presented. This method is based on the two-parameter Weibull statistical distribution for component life. A second method is presented to estimate the number of replacement components needed to support the transmission overhaul pattern. The second method is based on renewal theory. Confidence statistics are applied with both methods to improve the statistical estimate of sample behavior. A transmission example is also presented to illustrate the use of the methods. Transmission overhaul frequency and component replacement calculations are included in the example.
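The renewal-theory step can be illustrated with a Monte Carlo estimate of the renewal function, i.e., the expected number of component replacements per socket before the overhaul; the Weibull parameters and overhaul interval below are assumed for illustration, not the paper's values.

```python
# Sketch: Monte Carlo estimate of the renewal function M(t) for a component
# whose life follows a two-parameter Weibull distribution. Each failure before
# the overhaul time triggers an immediate replacement with a fresh unit.
# Parameter values are illustrative.
import numpy as np

rng = np.random.default_rng(42)
shape, scale = 2.5, 5000.0          # assumed Weibull life model (hours)
t_overhaul = 4000.0                 # overhaul interval (hours)

def renewals(n_paths=20000):
    counts = np.zeros(n_paths, dtype=int)
    clock = np.zeros(n_paths)
    alive = np.ones(n_paths, dtype=bool)
    while alive.any():
        # draw the next lifetime for every socket still inside the interval
        clock[alive] += scale * rng.weibull(shape, alive.sum())
        newly = alive & (clock < t_overhaul)
        counts[newly] += 1          # failure before overhaul -> one replacement
        alive = newly
    return counts.mean()

expected = renewals()
print(f"Expected replacements per socket by {t_overhaul:.0f} h: {expected:.3f}")
```

Analytic renewal functions for the Weibull have no closed form, which is why tabulated or simulated values (with confidence statistics, as in the paper) are the practical route.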

Savage, M.; Lewicki, D. G.

1989-01-01


88

Modeling root reinforcement using a root-failure Weibull survival function

NASA Astrophysics Data System (ADS)

Root networks contribute to slope stability through complex interactions with soil that include mechanical compression and tension. Due to the spatial heterogeneity of root distribution and the dynamics of root turnover, the quantification of root reinforcement on steep slopes is challenging, and consequently so is the calculation of slope stability. Although considerable progress has been made, some important aspects of root mechanics remain neglected. In this study we address specifically the role of root-strength variability on the mechanical behavior of a root bundle. Many factors contribute to the variability of root mechanical properties even within a single class of diameter. This work presents a new approach for quantifying root reinforcement that considers the variability of mechanical properties of each root diameter class. Using the data of laboratory tensile tests and field pullout tests, we calibrate the parameters of the Weibull survival function to implement the variability of root strength in a numerical model for the calculation of root reinforcement (RBMw). The results show that, for both laboratory and field data sets, the parameters of the Weibull distribution may be considered constant with the exponent equal to 2 and the normalized failure displacement equal to 1. Moreover, the results show that the variability of root strength in each root diameter class has a major influence on the behavior of a root bundle with important implications when considering different approaches in slope stability calculation. Sensitivity analysis shows that the calibration of the equations of the tensile force, the elasticity of the roots, and the root distribution are the most important steps. The new model allows the characterization of root reinforcement in terms of maximum pullout force, stiffness, and energy. Moreover, it simplifies the implementation of root reinforcement in slope stability models.
The realistic quantification of root reinforcement for tensile, shear and compression behavior allows for the consideration of the stabilization effects of root networks on steep slopes and the influence that this has on the triggering of shallow landslides.
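A toy version of the central idea, weighting each root class's elastic force by a Weibull survival function of normalized displacement with the reported exponent of 2, might look like the sketch below; all class counts, stiffnesses, and failure displacements are invented, and this is not the RBMw model itself.

```python
# Highly simplified sketch: each root diameter class carries a linear-elastic
# force, weighted by the Weibull survival probability S(x) = exp(-x**w) of the
# normalized displacement (w = 2, as reported). Root properties are invented.
import numpy as np

w = 2.0                              # Weibull exponent from the paper
classes = [                          # (count, stiffness N/mm, mean failure displacement mm)
    (120, 4.0, 3.0),
    (40, 12.0, 5.0),
    (10, 30.0, 8.0),
]

def bundle_force(dx):
    total = 0.0
    for n, k, dx_fail in classes:
        survival = np.exp(-((dx / dx_fail) ** w))   # fraction of the class intact
        total += n * k * dx * survival
    return total

disps = np.linspace(0.0, 12.0, 241)
forces = [bundle_force(d) for d in disps]
peak = max(forces)
print(f"Peak bundle force ≈ {peak:.0f} N at Δx ≈ {disps[int(np.argmax(forces))]:.2f} mm")
```

The smooth peak and post-peak softening of the force-displacement curve is exactly the bundle behavior that a single deterministic failure threshold cannot reproduce.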

Schwarz, M.; Giadrossich, F.; Cohen, D.

2013-11-01

89

Measuring the Weibull modulus of microscope slides

NASA Technical Reports Server (NTRS)

The objectives are that students will understand why a three-point bending test is used for ceramic specimens, learn how Weibull statistics are used to measure the strength of brittle materials, and appreciate the amount of variation in the strength of brittle materials with low Weibull modulus. They will understand how the modulus of rupture is used to represent the strength of specimens in a three-point bend test. In addition, students will learn that a logarithmic transformation can be used to convert an exponent into the slope of a straight line. The experimental procedures are explained.
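The logarithmic transformation the exercise teaches can be sketched directly: the Weibull CDF linearizes under a double-log transform, so the modulus appears as the slope of a fitted line. The modulus-of-rupture values below are invented for illustration.

```python
# Sketch: F = 1 - exp(-(s/s0)**m) becomes ln(-ln(1-F)) = m*ln(s) - m*ln(s0),
# so the Weibull modulus m is the slope of a straight-line fit on the
# transformed axes. Strength values (MPa) are invented.
import numpy as np

mor = np.sort(np.array([52., 61., 64., 70., 73., 78., 81., 85., 90., 97.]))
n = mor.size
F = (np.arange(1, n + 1) - 0.5) / n       # Hazen plotting positions

slope, intercept = np.polyfit(np.log(mor), np.log(-np.log(1.0 - F)), 1)
modulus = slope                            # Weibull modulus m
sigma0 = np.exp(-intercept / modulus)      # characteristic strength (63.2nd percentile)
print(f"Weibull modulus ≈ {modulus:.1f}; characteristic strength ≈ {sigma0:.0f} MPa")
```

A low modulus corresponds to the wide strength scatter of brittle materials that the exercise asks students to observe.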

Sorensen, Carl D.

1992-01-01

90

Theory of radar detection in coherent Weibull clutter

The problem of detecting radar targets embedded in coherent Weibull clutter is considered. A mathematical procedure is developed that makes it possible to obtain a coherent sequence having a Weibull pdf for the amplitude, a uniform pdf for the phase, and an autocorrelation function, between the successive samples, selected at will. The newly introduced 'coherent Weibull clutter' (CWC) represents a
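One standard recipe for such a sequence (not necessarily the authors' exact procedure) passes a correlated complex Gaussian through a memoryless envelope transform, mapping its Rayleigh envelope onto the desired Weibull distribution while keeping the uniform phase; the target autocorrelation is only approximately preserved by the nonlinearity. Parameters below are illustrative.

```python
# Sketch: correlated sequence with Weibull amplitude and uniform phase via
# a memoryless transform of an AR(1) complex Gaussian. Illustrative only.
import numpy as np

rng = np.random.default_rng(7)
N, rho = 50000, 0.9                 # samples, AR(1) correlation of the Gaussian
c, b = 1.2, 1.0                     # target Weibull shape and scale of the envelope

# AR(1) correlated complex Gaussian, unit variance per component
g = np.empty(N, dtype=complex)
g[0] = rng.standard_normal() + 1j * rng.standard_normal()
innov = np.sqrt(1 - rho**2) * (rng.standard_normal(N) + 1j * rng.standard_normal(N))
for k in range(1, N):
    g[k] = rho * g[k - 1] + innov[k]

u = 1.0 - np.exp(-np.abs(g) ** 2 / 2.0)   # Rayleigh(sigma=1) CDF of the envelope
amp = b * (-np.log1p(-u)) ** (1.0 / c)    # Weibull inverse CDF
z = amp * np.exp(1j * np.angle(g))        # keep the (uniform) phase

print(f"envelope mean ≈ {amp.mean():.3f}")
```

Mapping through the two CDFs guarantees the marginal amplitude distribution exactly; matching a prescribed autocorrelation exactly, as the paper does, requires compensating for the distortion the nonlinearity introduces.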

A. Farina; A. Russo; F. Scannapieco; S. Barbarossa

1987-01-01

91

Incorporating finite element analysis into component life and reliability

NASA Technical Reports Server (NTRS)

A method for calculating a component's design survivability by incorporating finite element analysis and probabilistic material properties was developed. The method evaluates design parameters through direct comparisons of component survivability expressed in terms of Weibull parameters. The analysis was applied to a rotating disk with mounting bolt holes. The highest probability of failure occurred at, or near, the maximum shear stress region of the bolt holes. Distribution of failure as a function of Weibull slope affects the probability of survival. Where Weibull parameters are unknown for a rotating disk, it may be permissible to assume Weibull parameters, as well as the stress-life exponent, in order to determine the disk speed where the probability of survival is highest.

August, Richard; Zaretsky, Erwin V.

1991-01-01

92

Linear Bayesian inference for accelerated Weibull model.

In this paper, we present a Bayesian approach for inference from accelerated life tests when the underlying life model is Weibull. Our approach is based on the General Linear Models framework of West, Harrison and Migon (1985). We discuss inference for the model and show that computable results can be obtained using linear Bayesian methods. We illustrate the usefulness of our approach by applying it to some actual data from accelerated life tests. PMID:9384626

Mazzuchi, T A; Soyer, R; Vopatek, A L

1997-01-01

93

Effect of Individual Component Life Distribution on Engine Life Prediction

NASA Technical Reports Server (NTRS)

The effect of individual engine component life distributions on engine life prediction was determined. A Weibull-based life and reliability analysis of the NASA Energy Efficient Engine was conducted. The engine's life at a 95 and 99.9 percent probability of survival was determined based upon the engine manufacturer's original life calculations and assumed values of each of the component's cumulative life distributions as represented by a Weibull slope. The lives of the high-pressure turbine (HPT) disks and blades were also evaluated individually and as a system in a similar manner. Knowing the statistical cumulative distribution of each engine component with reasonable engineering certainty is a condition precedent to predicting the life and reliability of an entire engine. The life of a system at a given reliability will be less than the lowest-lived component in the system at the same reliability (probability of survival). Where Weibull slopes of all the engine components are equal, the Weibull slope had a minimal effect on engine L(sub 0.1) life prediction. However, at a probability of survival of 95 percent (L(sub 5) life), life decreased with increasing Weibull slope.
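The system-life statement above (system life at a given reliability is below that of the lowest-lived component at the same reliability) can be checked numerically; the component Weibull slopes and characteristic lives here are illustrative, not the engine study's values.

```python
# Sketch: system reliability as the product of independent component
# reliabilities, each two-parameter Weibull; solve for the L5 life
# (95 percent probability of survival). Parameters are illustrative.
import numpy as np
from scipy.optimize import brentq

components = [(2.0, 9000.0), (3.0, 12000.0), (1.5, 15000.0)]  # (Weibull slope, characteristic life h)

def system_reliability(t):
    return np.prod([np.exp(-((t / eta) ** beta)) for beta, eta in components])

def life_at(R, rel_fn):
    return brentq(lambda t: rel_fn(t) - R, 1e-6, 1e6)

L5_system = life_at(0.95, system_reliability)
L5_components = [life_at(0.95, lambda t, b=b, e=e: np.exp(-((t / e) ** b)))
                 for b, e in components]
print(f"System L5 = {L5_system:.0f} h; lowest component L5 = {min(L5_components):.0f} h")
```

Because the product of survival probabilities is always below the smallest factor, the system L5 life necessarily falls below the weakest component's L5 life.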

Zaretsky, Erwin V.; Hendricks, Robert C.; Soditus, Sherry M.

2003-01-01

94

The ATLAS distributed analysis system

NASA Astrophysics Data System (ADS)

In the LHC operations era, analysis of the multi-petabyte ATLAS data sample by globally distributed physicists is a challenging task. To attain the required scale the ATLAS Computing Model was designed around the concept of Grid computing, realized in the Worldwide LHC Computing Grid (WLCG), the largest distributed computational resource existing in the sciences. The ATLAS experiment currently stores over 140 PB of data and runs about 140,000 concurrent jobs continuously at WLCG sites. During the first run of the LHC, the ATLAS Distributed Analysis (DA) service has operated stably and scaled as planned. More than 1600 users submitted jobs in 2012, with 2 million or more analysis jobs per week, peaking at about a million jobs per day. The system dynamically distributes popular data to expedite processing and maximally utilize resources. The reliability of the DA service is high and steadily improving; Grid sites are continually validated against a set of standard tests, and a dedicated team of expert shifters provides user support and communicates user problems to the sites. Both the user support techniques and the direct feedback of users have been effective in improving the success rate and user experience when utilizing the distributed computing environment. In this contribution a description of the main components, activities and achievements of ATLAS distributed analysis is given. Several future improvements being undertaken will be described.

Legger, F.; Atlas Collaboration

2014-06-01

95

Weibull: a regression model for survival time studies.

WEIBULL is a FORTRAN program which calculates the regression coefficients, their standard errors and the maximised log-likelihood of the data for a Weibull regression model. It incorporates routines from the NAG library in order to perform the maximisation. A chi-squared statistic is computed to check the adequacy of the fit. PMID:7460541

O'Quigley, J; Roberts, A

1980-08-01

96

An extension of Gompertzian growth dynamics: Weibull and Frechet models.

In this work a new probabilistic and dynamical approach to an extension of the Gompertz law is proposed. A generalized family of probability density functions, designated by Beta•(p,q), which is proportional to the right hand side of the Tsoularis-Wallace model, is studied. In particular, for p=2, the investigation is extended to the extreme value models of Weibull and Frechet type. These models, described by differential equations, are proportional to the hyper-Gompertz growth model. It is proved that the Beta•(2,q) densities are a power of betas mixture, and that its dynamics are determined by a non-linear coupling of probabilities. The dynamical analysis is performed using techniques of symbolic dynamics and the system complexity is measured using topological entropy. Generally, the natural history of a malignant tumour is reflected through bifurcation diagrams, in which are identified regions of regression, stability, bifurcation, chaos and terminus. PMID:23458306

Rocha, J Leonel; Aleixo, Sandra M

2013-04-01

97

Aerospace Applications of Weibull and Monte Carlo Simulation with Importance Sampling

NASA Technical Reports Server (NTRS)

Recent developments in reliability modeling and computer technology have made it practical to use the Weibull time to failure distribution to model the system reliability of complex fault-tolerant computer-based systems. These system models are becoming increasingly popular in space systems applications as a result of mounting data that support the decreasing Weibull failure distribution and the expectation of increased system reliability. This presentation introduces the new reliability modeling developments and demonstrates their application to a novel space system application. The application is a proposed guidance, navigation, and control (GN&C) system for use in a long duration manned spacecraft for a possible Mars mission. Comparisons to the constant failure rate model are presented and the ramifications of doing so are discussed.

Bavuso, Salvatore J.

1998-01-01

98

Distributed data analysis in ATLAS

NASA Astrophysics Data System (ADS)

Data analysis using grid resources is one of the fundamental challenges to be addressed before the start of LHC data taking. The ATLAS detector will produce petabytes of data per year, and roughly one thousand users will need to run physics analyses on this data. Appropriate user interfaces and helper applications have been made available to ensure that the grid resources can be used without requiring expertise in grid technology. These tools enlarge the number of grid users from a few production administrators to potentially all participating physicists. ATLAS makes use of three grid infrastructures for the distributed analysis: the EGEE sites, the Open Science Grid, and NorduGrid. These grids are managed by the gLite workload management system, the PanDA workload management system, and ARC middleware; many sites can be accessed via both the gLite WMS and PanDA. Users can choose between two front-end tools to access the distributed resources. Ganga is a tool co-developed with LHCb to provide a common interface to the multitude of execution backends (local, batch, and grid). The PanDA workload management system provides a set of utilities called PanDA Client; with these tools users can easily submit Athena analysis jobs to the PanDA-managed resources. Distributed data is managed by Don Quixote 2, a system developed by ATLAS; DQ2 is used to replicate datasets according to the data distribution policies and maintains a central catalog of file locations. The operation of the grid resources is continually monitored by the Ganga Robot functional testing system, and infrequent site stress tests are performed using the HammerCloud system. In addition, the DAST shift team is a group of power users who take shifts to provide distributed analysis user support; this team has effectively relieved the burden of support from the developers.

Nilsson, Paul; Atlas Collaboration

2012-12-01

99

The vertical distributions of the leaf area and inclination angle were investigated for a 46-year-old stand of Japanese cypress (Chamaecyparis obtusa Endl.) in central Japan. The vertical distribution of leaf area per tree was measured by destructive sampling (n=9) and fitted to a modified Weibull cumulative distribution function (Weibull CDF). The two model parameters of Weibull CDF were derived from

Hajime Utsugi; Masatake Araki; Tatsuroo Kawasaki; Moriyoshi Ishizuka

2006-01-01

100

Comparison of Weibull strength parameters from flexure and spin tests of brittle materials

NASA Technical Reports Server (NTRS)

Fracture data from five series of four point bend tests of beam and spin tests of flat annular disks were reanalyzed. Silicon nitride and graphite were the test materials. The experimental fracture strengths of the disks were compared with the predicted strengths based on both volume flaw and surface flaw analyses of four point bend data. Volume flaw analysis resulted in a better correlation between disks and beams in three of the five test series than did surface flaw analysis. The Weibull (moduli) and characteristic gage strengths for the disks and beams were also compared. Differences in the experimental Weibull slopes were not statistically significant. It was shown that results from the beam tests can predict the fracture strength of rotating disks.
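The volume-flaw comparison rests on the classical Weibull size effect: at equal failure probability, characteristic strengths of two specimens scale with their effective volumes as σ1/σ2 = (V2/V1)^(1/m). A minimal check of this relation, with invented numbers rather than the silicon nitride or graphite data:

```python
# Sketch of the Weibull volume size-effect scaling used to predict disk
# strength from beam (bend-test) strength. All numbers are illustrative.
m = 10.0                      # Weibull modulus from bend tests (assumed)
sigma_beam = 600.0            # MPa, characteristic strength of the beams (assumed)
V_beam, V_disk = 2.0, 40.0    # effective volumes, mm^3 (assumed)

# Larger effective volume -> more flaws sampled -> lower strength
sigma_disk = sigma_beam * (V_beam / V_disk) ** (1.0 / m)
print(f"Predicted disk characteristic strength ≈ {sigma_disk:.0f} MPa")
```

With these assumed numbers the twenty-fold larger disk volume lowers the predicted characteristic strength by roughly a quarter, which is the kind of beam-to-disk prediction the reanalysis tests.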

Holland, Frederic A., Jr.; Zaretsky, Erwin V.

1991-01-01

101

The non-isothermal pyrolysis kinetics of Acetocell (organosolv) and Lignoboost® (kraft) lignins, in an inert atmosphere, have been studied by thermogravimetric analysis. Using isoconversional analysis, it was concluded that the apparent activation energy for all lignins strongly depends on conversion, showing that the pyrolysis of lignins is not a single chemical process. It was identified that the pyrolysis of Acetocell and Lignoboost® lignin takes place over three reaction steps, which was confirmed by the appearance of the corresponding isokinetic relationships (IKR). It was found that the major pyrolysis stage of both lignins is characterized by stilbene pyrolysis reactions, which are subsequently followed by decomposition reactions of the products derived from the stilbene pyrolytic process. It was concluded that the non-isothermal pyrolysis of Acetocell and Lignoboost® lignins is best described by n-th order (n>1) reaction kinetics, using the Weibull mixture model (as a distributed reactivity model) with alternating shape parameters. PMID:21852115

Janković, Bojan

2011-10-01

102

To evaluate the durability of machinable dental restorative materials, this study performed an experiment to evaluate the flexural strength and Weibull statistics of a machinable lithium disilicate glass-ceramic and a machinable composite resin after thermocycling. A total of 40 bar-shaped specimens were prepared with dimensions of 20 mm × 4 mm × 2 mm and divided into four groups of 10 specimens. Ten specimens of machinable lithium disilicate glass-ceramic (IPS e.max CAD, Ivoclar Vivadent, Liechtenstein) and 10 specimens of machinable composite resin (Paradigm MZ 100, 3M ESPE, USA) were subjected to a 3-point flexural strength test. The other 10 specimens of each material were thermocycled between water temperatures of 5 and 55 °C for 10,000 cycles and then tested in the same 3-point flexural strength test. Statistical analysis was performed using two-way analysis of variance and Tukey multiple comparisons. Weibull analysis was performed to evaluate the reliability of the strength. Mean strengths and their standard deviations were: thermocycled IPS e.max CAD 389.10 (50.75), non-thermocycled IPS e.max CAD 349.96 (38.34), thermocycled Paradigm MZ 100 157.51 (12.85), non-thermocycled Paradigm MZ 100 153.33 (19.97). Within each material group, there was no significant difference in flexural strength between thermocycled and non-thermocycled specimens. Considering the Weibull analysis, there was no statistical difference in Weibull modulus among the experimental groups. Within the limitations of this study, the results showed no significant effect of thermocycling on the flexural strength and Weibull modulus of a machinable glass-ceramic or a machinable composite resin. PMID:25489161

Peampring, Chaimongkon; Sanohkan, Sasiwimol

2014-12-01

103

Statistical modeling of tornado intensity distributions

NASA Astrophysics Data System (ADS)

We address the issue of determining an appropriate general functional shape for observed tornado intensity distributions. Recently, it was suggested that in the limit of long and large tornado records, exponential distributions over all positive Fujita or TORRO scale classes would result. Yet our analysis shows that, even for large databases, observations contradict the validity of exponential distributions for weak (F0) and violent (F5) tornadoes. We show that observed tornado intensities can be much better described by Weibull distributions, of which the exponential remains a special case. Weibull fits in either the v or F scale reproduce the observations significantly better than exponentials. In addition, we suggest applying the original definition of negative intensity scales down to F-2 and T-4 (corresponding to v = 0 m s⁻¹), at least for climatological analyses. Weibull distributions allow for an improved risk assessment of violent tornadoes up to F6, and better estimates of total tornado occurrence, degree of underreporting, and the existence of subcritical tornadic circulations below damaging intensity. Therefore, our results are relevant for climatologists and risk assessment managers alike.
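The model comparison can be sketched by fitting both candidates to synthetic intensity data and comparing log-likelihoods; since the exponential is the shape = 1 special case of the Weibull, the fitted Weibull can only do at least as well, and the question is by how much. The data below are synthetic wind speeds, not the tornado records used in the study.

```python
# Sketch: Weibull vs. exponential fit to (synthetic) intensity data,
# compared by maximized log-likelihood. Values are illustrative.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
v = stats.weibull_min.rvs(1.6, scale=40.0, size=2000, random_state=rng)

c, _, s = stats.weibull_min.fit(v, floc=0)               # two-parameter Weibull MLE
ll_weib = stats.weibull_min.logpdf(v, c, scale=s).sum()
ll_expon = stats.expon.logpdf(v, scale=v.mean()).sum()   # exponential MLE scale

print(f"shape={c:.2f}; Δlog-likelihood (Weibull - exponential) = {ll_weib - ll_expon:.0f}")
```

A large likelihood gap for a fitted shape well away from 1 is the quantitative form of the paper's point that the exponential under-fits both tails.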

Dotzek, Nikolai; Grieser, Jürgen; Brooks, Harold E.

104

Distributed computing and nuclear reactor analysis

Large-scale scientific and engineering calculations for nuclear reactor analysis can now be carried out effectively in a distributed computing environment, at costs far lower than for traditional mainframes. The distributed computing environment must include support for traditional system services, such as a queuing system for batch work, reliable filesystem backups, and parallel processing capabilities for large jobs. All ANL computer codes for reactor analysis have been adapted successfully to a distributed system based on workstations and X-terminals. Distributed parallel processing has been demonstrated to be effective for long-running Monte Carlo calculations.

Brown, F.B.; Derstine, K.L.; Blomquist, R.N.

1994-03-01

105

Data flow analysis of distributed communicating processes

Data flow analysis is a technique essential to the compile-time optimization of computer programs, wherein facts relevant to program optimizations are discovered by the global propagation of facts obvious locally. This paper extends several known techniques for data flow analysis of sequential programs to the static analysis of distributed communicating processes. In particular, we present iterative algorithms for detecting unreachable

John H. Reif; Scott A. Smolka

1990-01-01

106

Probabilistic Weibull behavior and mechanical properties of MEMS brittle materials

The objective of this work is to present a brief overview of a probabilistic design methodology for brittle structures, review the literature for evidence of probabilistic behavior in the mechanical properties of MEMS (especially strength), and to investigate whether evidence exists that a probabilistic Weibull effect exists at the structural microscale. Since many MEMS devices are fabricated from brittle materials,

O. M. Jadaan; N. N. Nemeth; J. Bagdahn; W. N. Sharpe

2003-01-01

107

Weibull Prediction Intervals for a Future Number of Failures

Weibull Prediction Intervals for a Future Number of Failures. Daniel J. Nordman, Dept. of Statistics. … prediction intervals for the number of failures that will be observed in a future inspection of a sample … based prediction intervals perform better than the alternatives. Key Words: Coverage probability, Prediction bounds


109

Absolute Continuous Bivariate Generalized Exponential Distribution

Generalized exponential distribution has been used quite effectively to model positively skewed lifetime data as an alternative to the well known Weibull or gamma distributions

Kundu, Debasis; Gupta, Rameshwar D.

110

Distributed Checkpointing: Analysis and Benchmarks

This work proposes a metric for the analysis and benchmarking of checkpointing algorithms through simulation; the results obtained show that the metric is a good checkpoint overhead indicator. The metric is implemented by ChkSim, a simulator that has been used to compare 18 quasi-synchronous checkpointing algorithms. A survey of previous analyses of checkpointing shows our study to be the most

Gustavo M. D. Vieira; Luiz E. Buzato

111

The average bit error rate (BER) for binary phase-shift keying (BPSK) modulation in free-space optical (FSO) links over a turbulent atmosphere modeled by the exponentiated Weibull (EW) distribution is investigated in detail. The effects of aperture averaging on the average BERs for BPSK modulation under weak-to-strong turbulence conditions are studied. The average BERs of the EW distribution are compared with the Lognormal (LN) and Gamma-Gamma (GG) distributions in weak and strong turbulence, respectively. The outage probability is also obtained for different turbulence strengths and receiver aperture sizes. The analytical results deduced by the generalized Gauss-Laguerre quadrature rule are verified by Monte Carlo simulation. This work is helpful for the design of receivers for FSO communication systems. PMID:25321286

Wang, Ping; Zhang, Lu; Guo, Lixin; Huang, Feng; Shang, Tao; Wang, Ranran; Yang, Yintang

2014-08-25

112

Effects of cyclic stress distribution models on fatigue life predictions

NASA Astrophysics Data System (ADS)

The fatigue analysis of a wind turbine component typically uses representative samples of cyclic loads to determine lifetime loads. In this paper, several techniques currently in use are compared to one another based on fatigue life analyses. The generalized Weibull fitting technique is used to remove the artificial truncation of large-amplitude cycles that is inherent in relatively short data sets. Using data from the Sandia/DOE 34-m Test Bed, the generalized Weibull fitting technique is shown to be excellent for matching the body of the distribution of cyclic loads and for extrapolating the tail of the distribution. However, the data also illustrate that the fitting technique is not a substitute for an adequate data base.

Sutherland, H. J.; Veers, P. S.

1994-10-01
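The body-fit-plus-tail-extrapolation idea in the record above can be sketched with a plain two-parameter Weibull; the load data, shape, and scale below are synthetic illustrations (not the generalized Weibull procedure or the 34-m Test Bed data), using `scipy.stats.weibull_min`:

```python
import numpy as np
from scipy.stats import weibull_min

rng = np.random.default_rng(0)
# Synthetic cyclic-load amplitudes; shape/scale chosen for illustration only
loads = weibull_min.rvs(c=1.8, scale=12.0, size=2000, random_state=rng)

# Fit a two-parameter Weibull (location fixed at zero) by maximum likelihood
shape, loc, scale = weibull_min.fit(loads, floc=0)

# Extrapolate the tail: the amplitude exceeded with probability 1e-6,
# well beyond the largest cycle actually observed in the short record
tail_amp = weibull_min.ppf(1.0 - 1e-6, shape, loc=0, scale=scale)
print(f"fitted shape={shape:.2f}, scale={scale:.2f}, 1e-6 amplitude={tail_amp:.1f}")
```

The fitted distribution lets the analyst estimate large-amplitude cycles that a short measurement campaign truncates, which is exactly the role the abstract assigns to the fitting technique.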

113

DIAL: Distributed Interactive Analysis of Large Datasets

DIAL will enable users to analyze very large, event-based datasets using an application that is natural to the data format. Both the dataset and the processing may be distributed over a farm, a site (collection of farms) or a grid (collection of sites). Here we describe the goals of the project, the current design and implementation, and plans for future development. DIAL is being developed within PPDG to understand the requirements that interactive analysis places on the grid and within ATLAS to enable distributed interactive analysis of event data.

D. L. Adams

2003-05-08

114

Weibull Effective Area for Hertzian Ring Crack Initiation Stress

Spherical or Hertzian indentation is used to characterize and guide the development of engineered ceramics under consideration for diverse applications involving contact, wear, rolling fatigue, and impact. Ring crack initiation can be one important damage mechanism of Hertzian indentation. It is caused by sufficiently-high, surface-located, radial tensile stresses in an annular ring located adjacent to and outside of the Hertzian contact circle. While the maximum radial tensile stress is known to be dependent on the elastic properties of the sphere and target, the diameter of the sphere, the applied compressive force, and the coefficient of friction, the Weibull effective area too will be affected by those parameters. However, the estimations of a maximum radial tensile stress and Weibull effective area are difficult to obtain because the coefficient of friction during Hertzian indentation is complex, likely intractable, and not known a priori. Circumventing this, the Weibull effective area expressions are derived here for the two extremes that bracket all coefficients of friction; namely, (1) the classical, frictionless, Hertzian case where only complete slip occurs, and (2) the case where no slip occurs or where the coefficient of friction is infinite.

Jadaan, Osama M. [University of Wisconsin, Platteville; Wereszczak, Andrew A [ORNL; Johanns, Kurt E [ORNL

2011-01-01

115

Weibull modulus and fracture strength of highly porous hydroxyapatite.

Porous hydroxyapatite (HA) is used in a variety of applications including biomedical materials such as engineered bone materials and microbe filters. Despite the utility of the Weibull modulus, m, as a gauge of the mechanical reliability of brittle solids, there have been very few studies of m for porous HA. A recent study of porous HA that included the current authors (Fan, X., Case, E.D., Ren, F., Shu, Y., Baumann, M.J., 2012a. Journal of the Mechanical Behavior of Biomedical Materials. 8, 21-36) showed increases in m for porosity, P, approaching PG, the porosity of the green (unfired) specimen. In this paper, 18 groups of highly porous HA specimens (12 groups fabricated in this study and 6 groups from Fan et al., 2012a) were analyzed with P values from 0.59 to 0.62, where PG=0.62. The partially sintered HA specimens were fractured in biaxial flexure using a ring-on-ring test fixture. The fracture strength decreased monotonically with decreasing sintering temperature, Tsinter, from 4.8 MPa for specimens sintered at 1025°C to 0.66 MPa for specimens sintered at 350°C. However, the Weibull modulus remained surprisingly high, ranging from 6.6 to 15.5. In comparison, for HA specimens with intermediate values of P, from about 0.1 to 0.55, the Weibull modulus tended to be lower (ranging from about 4 to 11) than for the highly porous specimens included in this study. PMID:23478051

Fan, X; Case, E D; Gheorghita, I; Baumann, M J

2013-04-01

116

CRAB: Distributed analysis tool for CMS

NASA Astrophysics Data System (ADS)

CMS has a distributed computing model, based on a hierarchy of tiered regional computing centers, and adopts a data-driven model for end-user analysis. This model foresees that jobs are submitted to the analysis resources where data are hosted. The increasing complexity of the whole computing infrastructure makes the simple analysis work flow more and more complicated for the end user. CMS has developed and deployed a dedicated tool named CRAB (CMS Remote Analysis Builder) in order to guarantee physicists efficient access to the distributed data whilst hiding the underlying complexity. This tool is used by CMS to enable the running of physics analysis jobs in a transparent manner over data distributed across sites. It factorizes out the interaction with the underlying batch farms, grid infrastructure and CMS data management tools, allowing the user to deal only with a simple and intuitive interface. We present the CRAB architecture, as well as the current status and lessons learnt in deploying this tool for use by the CMS collaboration. We also present the future development of the CRAB system.

Sala, Leonardo; CMS Collaboration

2012-12-01

117

Assessing a Tornado Climatology from Global Tornado Intensity Distributions

Recent work demonstrated that the shape of tornado intensity distributions from various regions worldwide is well described by Weibull functions. This statistical modeling revealed a strong correlation between the fit parameters c for shape and b for scale regardless of the data source. In the present work it is shown that the quality of the Weibull fits is optimized if

Bernold Feuerstein; Nikolai Dotzek; Jürgen Grieser

2005-01-01

118

Performance optimisations for distributed analysis in ALICE

NASA Astrophysics Data System (ADS)

Performance is a critical issue in a production system accommodating hundreds of analysis users. Compared to a local session, distributed analysis is exposed to services and network latencies, remote data access and heterogeneous computing infrastructure, creating a more complex performance and efficiency optimization matrix. During the last 2 years, ALICE analysis shifted from a fast development phase to more mature and stable code. At the same time, the frameworks and tools for deployment, monitoring and management of large productions have evolved considerably too. The ALICE Grid production system is currently used by a fair share of organized and individual user analysis, consuming up to 30% of the available resources and ranging from fully I/O-bound analysis code to CPU-intensive correlation or resonance studies. While the intrinsic analysis performance is unlikely to improve by a large factor during the LHC long shutdown (LS1), the overall efficiency of the system still has to be improved by an important factor to satisfy the analysis needs. We have instrumented all analysis jobs with "sensors" collecting comprehensive monitoring information on the job running conditions and performance in order to identify bottlenecks in the data processing flow. These data are collected by the MonALISA-based ALICE Grid monitoring system and are used to steer and improve the job submission and management policy, to identify operational problems in real time and to perform automatic corrective actions. In parallel with an upgrade of our production system, we are aiming for low-level improvements related to data format, data management and merging of results to allow for better performing ALICE analysis.

Betev, L.; Gheata, A.; Gheata, M.; Grigoras, C.; Hristov, P.

2014-06-01

119

Test Population Selection from Weibull-Based, Monte Carlo Simulations of Fatigue Life

NASA Technical Reports Server (NTRS)

Fatigue life is probabilistic and not deterministic. Experimentally establishing the fatigue life of materials, components, and systems is both time consuming and costly. As a result, conclusions regarding fatigue life are often inferred from a statistically insufficient number of physical tests. A proposed methodology for comparing life results as a function of variability due to Weibull parameters, variability between successive trials, and variability due to size of the experimental population is presented. Using Monte Carlo simulation of randomly selected lives from a large Weibull distribution, the variation in the L10 fatigue life of aluminum alloy AL6061 rotating rod fatigue tests was determined as a function of population size. These results were compared to the L10 fatigue lives of small (10 each) populations from AL2024, AL7075 and AL6061. For aluminum alloy AL6061, a simple algebraic relationship was established for the upper and lower L10 fatigue life limits as a function of the number of specimens failed. For most engineering applications where less than 30 percent variability can be tolerated in the maximum and minimum values, at least 30 to 35 test samples are necessary. The variability of test results based on small sample sizes can be greater than actual differences, if any, that exist between materials and can result in erroneous conclusions. The fatigue life of AL2024 is statistically longer than AL6061 and AL7075. However, there is no statistical difference between the fatigue lives of AL6061 and AL7075 even though AL7075 had a fatigue life 30 percent greater than AL6061.

Vlcek, Brian L.; Zaretsky, Erwin V.; Hendricks, Robert C.

2012-01-01
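The Monte Carlo procedure described in the record above — drawing small test populations from a known Weibull life distribution and watching the spread of the L10 estimate shrink with population size — can be sketched as follows. The shape and scale values are illustrative assumptions, not the AL6061 parameters:

```python
import numpy as np

rng = np.random.default_rng(42)
# Hypothetical Weibull life population; parameters are illustrative, not AL6061 values
shape, scale = 2.0, 1.0e6                            # Weibull slope, characteristic life (cycles)
true_l10 = scale * (-np.log(0.9)) ** (1.0 / shape)   # life at which 90% of specimens survive

widths = {}
for n in (10, 35, 100):
    # Draw 2000 test populations of size n and record each one's L10 estimate
    estimates = [np.percentile(scale * rng.weibull(shape, n), 10) for _ in range(2000)]
    lo, hi = np.percentile(estimates, [5, 95])
    widths[n] = hi - lo
    print(f"n={n:3d}: 90% of L10 estimates in [{lo:.3g}, {hi:.3g}] cycles (true {true_l10:.3g})")
```

The shrinking interval width with growing n mirrors the paper's conclusion that roughly 30 to 35 specimens are needed before the L10 scatter becomes tolerable for most applications.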

120

Test Population Selection from Weibull-Based, Monte Carlo Simulations of Fatigue Life

NASA Technical Reports Server (NTRS)

Fatigue life is probabilistic and not deterministic. Experimentally establishing the fatigue life of materials, components, and systems is both time consuming and costly. As a result, conclusions regarding fatigue life are often inferred from a statistically insufficient number of physical tests. A proposed methodology for comparing life results as a function of variability due to Weibull parameters, variability between successive trials, and variability due to size of the experimental population is presented. Using Monte Carlo simulation of randomly selected lives from a large Weibull distribution, the variation in the L10 fatigue life of aluminum alloy AL6061 rotating rod fatigue tests was determined as a function of population size. These results were compared to the L10 fatigue lives of small (10 each) populations from AL2024, AL7075 and AL6061. For aluminum alloy AL6061, a simple algebraic relationship was established for the upper and lower L10 fatigue life limits as a function of the number of specimens failed. For most engineering applications where less than 30 percent variability can be tolerated in the maximum and minimum values, at least 30 to 35 test samples are necessary. The variability of test results based on small sample sizes can be greater than actual differences, if any, that exist between materials and can result in erroneous conclusions. The fatigue life of AL2024 is statistically longer than AL6061 and AL7075. However, there is no statistical difference between the fatigue lives of AL6061 and AL7075 even though AL7075 had a fatigue life 30 percent greater than AL6061.

Vlcek, Brian L.; Zaretsky, Erwin V.; Hendricks, Robert C.

2008-01-01

121

Ceramics Analysis and Reliability Evaluation of Structures (CARES). Users and programmers manual

NASA Technical Reports Server (NTRS)

This manual describes how to use the Ceramics Analysis and Reliability Evaluation of Structures (CARES) computer program. The primary function of the code is to calculate the fast fracture reliability or failure probability of macroscopically isotropic ceramic components. These components may be subjected to complex thermomechanical loadings, such as those found in heat engine applications. The program uses results from MSC/NASTRAN or ANSYS finite element analysis programs to evaluate component reliability due to inherent surface and/or volume type flaws. CARES utilizes the Batdorf model and the two-parameter Weibull cumulative distribution function to describe the effect of multiaxial stress states on material strength. The principle of independent action (PIA) and the Weibull normal stress averaging models are also included. Weibull material strength parameters, the Batdorf crack density coefficient, and other related statistical quantities are estimated from four-point bend bar or uniform uniaxial tensile specimen fracture strength data. Parameter estimation can be performed for single or multiple failure modes by using the least-square analysis or the maximum likelihood method. Kolmogorov-Smirnov and Anderson-Darling goodness-of-fit tests, ninety percent confidence intervals on the Weibull parameters, and Kanofsky-Srinivasan ninety percent confidence band values are also provided. The probabilistic fast-fracture theories used in CARES, along with the input and output for CARES, are described. Example problems to demonstrate various features of the program are also included. This manual describes the MSC/NASTRAN version of the CARES program.

Nemeth, Noel N.; Manderscheid, Jane M.; Gyekenyesi, John P.

1990-01-01
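The least-squares route to Weibull strength parameters mentioned in the record above can be sketched with a probability-plot (median-rank) regression; the bend-bar strengths below are hypothetical illustration data, and Bernard's median-rank approximation is one common choice of ranking formula (CARES also offers maximum likelihood estimation):

```python
import numpy as np

# Hypothetical four-point-bend fracture strengths in MPa (illustrative data only)
strengths = np.array([467., 478., 495., 512., 522., 530., 548., 561., 585., 602.])

# Bernard's median-rank approximation for the failure probability of each ranked specimen
s = np.sort(strengths)
n = len(s)
F = (np.arange(1, n + 1) - 0.3) / (n + 0.4)

# On a Weibull probability plot, ln(-ln(1-F)) = m*ln(sigma) - m*ln(sigma0) is linear
x = np.log(s)
y = np.log(-np.log(1.0 - F))
m, intercept = np.polyfit(x, y, 1)     # slope m is the Weibull modulus
sigma0 = np.exp(-intercept / m)        # characteristic strength (63.2% failure probability)
print(f"Weibull modulus m = {m:.1f}, characteristic strength sigma0 = {sigma0:.0f} MPa")
```

A higher slope m means less scatter in strength, which is why the Weibull modulus serves as the reliability gauge in the ceramic-strength records throughout this listing.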

122

Data Distribution Analysis and Optimization for Pointer-Based Distributed Programs

High Performance Fortran (HPF) provides distributed arrays to efficiently support a global name space on distributed memory architectures. The distributed data structures supported by HPF, however

Lee, Jenq-Kuen

123

Distribution System Analysis Tools for Studying High Penetration of PV

Distribution System Analysis Tools for Studying High Penetration of PV with Grid Support Features. Electric Energy System.

124

Analysis and control of distributed cooperative systems.

As part of DARPA Information Processing Technology Office (IPTO) Software for Distributed Robotics (SDR) Program, Sandia National Laboratories has developed analysis and control software for coordinating tens to thousands of autonomous cooperative robotic agents (primarily unmanned ground vehicles) performing military operations such as reconnaissance, surveillance and target acquisition; countermine and explosive ordnance disposal; force protection and physical security; and logistics support. Due to the nature of these applications, the control techniques must be distributed, and they must not rely on high bandwidth communication between agents. At the same time, a single soldier must easily direct these large-scale systems. Finally, the control techniques must be provably convergent so as not to cause undue harm to civilians. In this project, provably convergent, moderate communication bandwidth, distributed control algorithms have been developed that can be regulated by a single soldier. We have simulated in great detail the control of low numbers of vehicles (up to 20) navigating throughout a building, and we have simulated in lesser detail the control of larger numbers of vehicles (up to 1000) trying to locate several targets in a large outdoor facility. Finally, we have experimentally validated the resulting control algorithms on smaller numbers of autonomous vehicles.

Feddema, John Todd; Parker, Eric Paul; Wagner, John S.; Schoenwald, David Alan

2004-09-01

125

Time-dependent reliability analysis of ceramic engine components

The computer program CARES/LIFE calculates the time-dependent reliability of monolithic ceramic components subjected to thermomechanical and/or proof test loading. This program is an extension of the CARES (Ceramics Analysis and Reliability Evaluation of Structures) computer program. CARES/LIFE accounts for the phenomenon of subcritical crack growth (SCG) by utilizing either the power or Paris law relations. The two-parameter Weibull cumulative distribution

Noel N. Nemeth

1993-01-01

126

Nonparametric Distribution Analysis for Text Mining

NASA Astrophysics Data System (ADS)

A number of new algorithms for nonparametric distribution analysis based on Maximum Mean Discrepancy measures have been recently introduced. These novel algorithms operate in Hilbert space and can be used for nonparametric two-sample tests. Coupled with recent advances in string kernels, these methods extend the scope of kernel-based methods in the area of text mining. We review these kernel-based two-sample tests focusing on text mining where we will propose novel applications and present an efficient implementation in the kernlab package. We also present an efficient and integrated environment for applying modern machine learning methods to complex text mining problems through the combined use of the tm (for text mining) and the kernlab (for kernel-based learning) R packages.

Karatzoglou, Alexandros; Feinerer, Ingo; Hornik, Kurt

127

Analysis on voltage distribution performance of HVDC thyristor valves

Analysis and simulation of the voltage distribution of HVDC thyristor valves under a variety of voltage stresses are performed. The summarized influencing factors on voltage distribution indicate that they differ with frequency. Due to the importance and difficulty of approaching an even distribution at high frequency, the voltage distribution with different module structures and different power circuit parameters is also

H. Guo; G. F. Tang; J. L. Wen; K. P. Zha; X. G. Wei

2010-01-01

128

Two-Component Extreme Value Distribution for Flood Frequency Analysis

Theoretical considerations, supported by statistical analysis of 39 annual flood series (AFS) of Italian basins, suggest that the two-component extreme value (TCEV) distribution can be assumed as a parent flood distribution, i.e., one closely representative of the real flood experience. This distribution belongs to the family of distributions of the annual maximum of a compound Poisson process, which is a

Fabio Rossi; Mauro Fiorentino; Pasquale Versace

1984-01-01

129

Group reaction time distributions and an analysis of distribution statistics

Describes a method of obtaining an average reaction time (RT) distribution for a group of Ss. The method is particularly useful for cases in which data from many Ss are available but there are only 10–20 RT observations per S per cell. Essentially, RTs for each S are organized in ascending order, and quantiles are calculated. The quantiles are then averaged

Roger Ratcliff

1979-01-01
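The quantile-averaging procedure summarized above (often called Vincent averaging) can be sketched on simulated reaction times; the subject count, trial count, and shifted-exponential RT model below are illustrative assumptions, not Ratcliff's data:

```python
import numpy as np

rng = np.random.default_rng(1)
# Simulated reaction times (ms) for 8 subjects, 15 trials each; the shifted-exponential
# RT model and the group/trial counts are illustrative assumptions
subjects = [300.0 + rng.exponential(150.0, size=15) for _ in range(8)]

# "Vincentize": compute quantiles within each subject, then average across subjects
probs = np.linspace(0.1, 0.9, 9)
per_subject_q = np.array([np.quantile(rts, probs) for rts in subjects])
group_q = per_subject_q.mean(axis=0)     # group RT distribution, quantile by quantile

for p, q in zip(probs, group_q):
    print(f"{p:.1f} quantile: {q:.0f} ms")
```

Averaging quantiles rather than pooling raw RTs preserves the shape of the individual distributions even when each subject contributes only a handful of observations, which is the point of the method.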

130

Group Sequential Design for Randomized Phase III Trials under the Weibull Model.

In this paper, a parametric sequential test is proposed under the Weibull model. The proposed test is asymptotically normal with an independent increments structure. The sample size for the fixed-sample test is derived for the purpose of group sequential trial design. In addition, a multi-stage group sequential procedure is given under the Weibull model by applying the Brownian motion property of the test statistic and sequential conditional probability ratio test methodology. PMID:25322440

Wu, Jianrong; Xiong, Xiaoping

2014-10-16

131

Non resonant transmission modelling with Statistical modal Energy distribution Analysis

Non resonant transmission modelling with Statistical modal Energy distribution Analysis, L. Maxit (Villeurbanne, France). Statistical modal Energy distribution Analysis (SmEdA) can be used as an alternative to Statistical Energy Analysis for describing subsystems with low modal overlap

Boyer, Edmond

132

Analysis of measured airfoil pressure distributions

NASA Technical Reports Server (NTRS)

A method for evaluating the Glauert coefficients from airfoil pressure distributions is investigated. The linear operating range of the airfoils in steady-state and periodic operating conditions is considered. A rational method for quantitatively characterizing airfoil pressure distributions relative to their geometry and aerodynamic operating environment is developed. The characteristics of the airfoil operating environment are determined from its measured pressure distribution.

Piziali, R. A.

1975-01-01

133

NASA Astrophysics Data System (ADS)

A method is described which enables one to find from an experimentally determined number or mass drop size distribution the 'best' fit to standard distribution functions such as Rosin-Rammler, log-normal, root-normal, Nukiyama-Tanasawa, upper limit, as well as various means and parameters of these distributions, together with their variances. An example analysis of the drop size distribution of a spray is presented.

Bayvel, L. P.

134

As data-rich medical datasets are becoming routinely collected, there is a growing demand for regression methodology that facilitates variable selection over a large number of predictors. Bayesian variable selection algorithms offer an attractive solution, whereby a sparsity inducing prior allows inclusion of sets of predictors simultaneously, leading to adjusted effect estimates and inference of which covariates are most important. We present a new implementation of Bayesian variable selection, based on a Reversible Jump MCMC algorithm, for survival analysis under the Weibull regression model. A realistic simulation study is presented comparing against an alternative LASSO-based variable selection strategy in datasets of up to 20,000 covariates. Across half the scenarios, our new method achieved identical sensitivity and specificity to the LASSO strategy, and a marginal improvement otherwise. Runtimes were comparable for both approaches, taking approximately a day for 20,000 covariates. Subsequently, we present a real data application in which 119 protein-based markers are explored for association with breast cancer survival in a case cohort of 2287 patients with oestrogen receptor-positive disease. Evidence was found for three independent prognostic tumour markers of survival, one of which is novel. Our new approach demonstrated the best specificity. PMID:25193065

Newcombe, Pj; Raza Ali, H; Blows, Fm; Provenzano, E; Pharoah, Pd; Caldas, C; Richardson, S

2014-09-01

135

Effect of covariate omission in Weibull accelerated failure time model: a caution.

The accelerated failure time model is presented as an alternative to the proportional hazards model in the analysis of survival data. We investigate the effect of covariate omission in the case of applying a Weibull accelerated failure time model. In an uncensored setting, the asymptotic bias of the treatment effect is theoretically zero when important covariates are omitted; however, the asymptotic variance estimator of the treatment effect could be biased and then the size of the Wald test for the treatment effect is likely to exceed the nominal level. In some cases, the test size could be more than twice the nominal level. In a simulation study, in both censored and uncensored settings, Type I error for the test of the treatment effect was likely inflated when the prognostic covariates are omitted. This work cautions against careless use of the accelerated failure time model. We recommend the use of the robust sandwich variance estimator in order to avoid the inflation of the Type I error in the accelerated failure time model, although the robust variance is not commonly used in survival data analyses. PMID:24895154

Gosho, Masahiko; Maruo, Kazushi; Sato, Yasunori

2014-11-01

136

NASA Technical Reports Server (NTRS)

The SCARE (Structural Ceramics Analysis and Reliability Evaluation) computer program on statistical fast fracture reliability analysis with quadratic elements for volume distributed imperfections is enhanced to include the use of linear finite elements and the capability of designing against concurrent surface flaw induced ceramic component failure. The SCARE code is presently coupled as a postprocessor to the MSC/NASTRAN general purpose, finite element analysis program. The improved version now includes the Weibull and Batdorf statistical failure theories for both surface and volume flaw based reliability analysis. The program uses the two-parameter Weibull fracture strength cumulative failure probability distribution model with the principle of independent action for poly-axial stress states, and Batdorf's shear-sensitive as well as shear-insensitive statistical theories. The shear-sensitive surface crack configurations include the Griffith crack and Griffith notch geometries, using the total critical coplanar strain energy release rate criterion to predict mixed-mode fracture. Weibull material parameters based on both surface and volume flaw induced fracture can also be calculated from modulus of rupture bar tests, using the least squares method with known specimen geometry and grouped fracture data. The statistical fast fracture theories for surface flaw induced failure, along with selected input and output formats and options, are summarized. An example problem to demonstrate various features of the program is included.

Gyekenyesi, John P.; Nemeth, Noel N.

1987-01-01

137

Stochastic ordering of extreme value distributions

We investigate the second-order stochastic ordering of the extreme value distributions within each of three families: the Gumbel distribution, the Frechet distribution, and the Weibull distribution. We give conditions for second-order stochastic dominance, conditional second-order stochastic dominance, and order statistics second-order stochastic dominance within the three families

Ganghuai Wang; James H. Lambert; Yacov Y. Haimes

1999-01-01

138

Time-dependent reliability analysis of ceramic engine components

NASA Technical Reports Server (NTRS)

The computer program CARES/LIFE calculates the time-dependent reliability of monolithic ceramic components subjected to thermomechanical and/or proof test loading. This program is an extension of the CARES (Ceramics Analysis and Reliability Evaluation of Structures) computer program. CARES/LIFE accounts for the phenomenon of subcritical crack growth (SCG) by utilizing either the power or Paris law relations. The two-parameter Weibull cumulative distribution function is used to characterize the variation in component strength. The effects of multiaxial stresses are modeled using either the principle of independent action (PIA), the Weibull normal stress averaging method (NSA), or the Batdorf theory. Inert strength and fatigue parameters are estimated from rupture strength data of naturally flawed specimens loaded in static, dynamic, or cyclic fatigue. Two example problems demonstrating proof testing and fatigue parameter estimation are given.

Nemeth, Noel N.

1993-01-01
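The principle of independent action (PIA) named in the record above combines the tensile principal stresses as if each acted alone on the component. A minimal single-element sketch of that combination rule, with illustrative Weibull parameters rather than CARES/LIFE material data, is:

```python
import numpy as np

# Principle of independent action (PIA): each tensile principal stress is assumed
# to contribute to failure independently. Parameter values below are illustrative
# only, not CARES/LIFE material data.
m, sigma0 = 10.0, 400.0                      # Weibull modulus, characteristic strength (MPa)
principal = np.array([250.0, 120.0, -40.0])  # principal stresses in one element (MPa)

tensile = principal[principal > 0]           # compressive stresses do not contribute
risk = np.sum((tensile / sigma0) ** m)       # summed risk-of-rupture terms
pf = 1.0 - np.exp(-risk)                     # element fast-fracture failure probability
print(f"element failure probability = {pf:.4f}")
```

In a full analysis this per-element risk is integrated over the stressed volume or surface from the finite element solution; the sketch shows only the multiaxial combination step that distinguishes PIA from the normal stress averaging and Batdorf models.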

139

Time-dependent reliability analysis of ceramic engine components

The computer program CARES/LIFE calculates the time-dependent reliability of monolithic ceramic components subjected to thermomechanical and/or proof test loading. This program is an extension of the CARES (Ceramics Analysis and Reliability Evaluation of Structures) computer program. CARES/LIFE accounts for the phenomenon of subcritical crack growth (SCG) by utilizing either the power or Paris law relations. The two-parameter Weibull cumulative distribution function is used to characterize the variation in component strength. The effects of multiaxial stresses are modeled using either the principle of independent action (PIA), the Weibull normal stress averaging method (NSA), or the Batdorf theory. Inert strength and fatigue parameters are estimated from rupture strength data of naturally flawed specimens loaded in static, dynamic, or cyclic fatigue. Two example problems demonstrating proof testing and fatigue parameter estimation are given.

Nemeth, N.N.

1993-10-01

140

Harmonic analysis of electrical distribution systems

This report presents data pertaining to research on harmonics of electric power distribution systems. Harmonic data is presented on RMS and average measurements for determination of harmonics in buildings; fluorescent ballast; variable frequency drive; georator geosine harmonic data; uninterruptible power supply; delta-wye transformer; westinghouse suresine; liebert datawave; and active injection mode filter data.

NONE

1996-03-01

141

Economic analysis of efficient distribution transformer trends

This report outlines an approach that will account for uncertainty in the development of evaluation factors used to identify transformer designs with the lowest total owning cost (TOC). The TOC methodology is described and the most highly variable parameters are discussed. The model is developed to account for uncertainties as well as statistical distributions for the important parameters. Sample calculations are presented. The TOC methodology is applied to data provided by two utilities in order to test its validity.

Downing, D.J.; McConnell, B.W.; Barnes, P.R.; Hadley, S.W.; Van Dyke, J.W.

1998-03-01

142

A Comparison of Distribution Free and Non-Distribution Free Factor Analysis Methods

ERIC Educational Resources Information Center

Many researchers recognize that factor analysis can be conducted on both correlation matrices and variance-covariance matrices. Although most researchers extract factors from non-distribution free or parametric methods, researchers can also extract factors from distribution free or non-parametric methods. The nature of the data dictates the method…

Ritter, Nicola L.

2012-01-01

143

Karst database implementation in Minnesota: analysis of sinkhole distribution

This paper presents the overall sinkhole distributions and conducts hypothesis tests of sinkhole distributions and sinkhole formation using data stored in the Karst Feature Database (KFD) of Minnesota. Nearest neighbor analysis (NNA) was extended to include different orders of NNA, different scales of concentrated zones of sinkholes, and directions to the nearest sinkholes. The statistical results, along with the sinkhole

Y. Gao; E. C. Alexander Jr; R. J. Barnes

2005-01-01

144

Distributed Antenna System: Performance Analysis in Multi-user Scenario

This paper provides a comparative study of the distributed antenna system (DAS) and the conventional co-located antenna system (CAS) in the multi-user scenario. It is demonstrated that, thanks to the decrease…

Dai, Lin

145

Silk Fiber Mechanics from Multiscale Force Distribution Analysis

This work examines the molecular determinants of the extreme toughness of spider silk fibers through a bottom-up computational approach, resolving the internal strain distribution and load-carrying motifs in silk fibers on scales of both molecular…

Gräter, Frauke

146

A Low Cost Distributed System for FEM Parallel Structural Analysis

In this paper, a distributed computational system for finite element structural analysis and some strategies for improving its efficiency are described. The system consists of a set of programs that performs the structural analysis in a distributed computer network environment. This set is composed by a pre-processor, a post-processor, a program responsible for partitioning the model in substructures, and by…

Célio Oda Moretti; Túlio Nogueira Bittencourt; Luiz Fernando Martha

1998-01-01

147

Performance analysis of static locking in replicated distributed database systems

NASA Technical Reports Server (NTRS)

Data replications and transaction deadlocks can severely affect the performance of distributed database systems. Many current evaluation techniques ignore these aspects, because it is difficult to evaluate through analysis and time consuming to evaluate through simulation. Here, a technique is discussed that combines simulation and analysis to closely illustrate the impact of deadlock and evaluate performance of replicated distributed databases with both shared and exclusive locks.

Kuang, Yinghong; Mukkamala, Ravi

1991-01-01

148

Stability and Dissipativity Analysis of Distributed Delay Cellular Neural Networks

In this brief, the problems of delay-dependent stability analysis and strict (Q,S,R)-α-dissipativity analysis are investigated for cellular neural networks (CNNs) with distributed delay. First, by introducing an integral partitioning technique, two new forms of Lyapunov-Krasovskii functionals are constructed, and improved distributed-delay-dependent stability conditions are established in terms of linear matrix inequalities. Based on this criterion, a new sufficient…

Zhiguang Feng; James Lam

2011-01-01

149

LARGE-SCALE NORMAL COORDINATE ANALYSIS ON DISTRIBUTED MEMORY PARALLEL SYSTEMS

Normal Coordinate Analysis (NCA) (Wilson et al. 1955) … the substitution phase of the computation … the dynamic role protein vibration plays in the photosynthetic center of green plants (Renger 1998)…

Raghavan, Padma

150

Performance Analysis for Teraflop Computers: A Distributed Automatic Approach

Performance analysis for applications on teraflop computers requires a new combination of concepts: online processing, automation, and distribution. The article presents the design of a new analysis system that performs an automatic search for performance problems. This search is guided by a specification of performance properties based on the APART Specification Language. The system is being implemented as a network

Michael Gerndt; Andreas Schmidt; Martin Schulz; Roland Wismüller

2002-01-01

151

Grammatical analysis as a distributed neurobiological function.

Language processing engages large-scale functional networks in both hemispheres. Although it is widely accepted that left perisylvian regions have a key role in supporting complex grammatical computations, patient data suggest that some aspects of grammatical processing could be supported bilaterally. We investigated the distribution and the nature of grammatical computations across language processing networks by comparing two types of combinatorial grammatical sequences (inflectionally complex words and minimal phrases) and contrasting them with grammatically simple words. Novel multivariate analyses revealed that they engage a coalition of separable subsystems: inflected forms triggered left-lateralized activation, dissociable into dorsal processes supporting morphophonological parsing and ventral, lexically driven morphosyntactic processes. In contrast, simple phrases activated a consistently bilateral pattern of temporal regions, overlapping with inflectional activations in the left middle temporal gyrus. These data confirm the role of the left-lateralized frontotemporal network in supporting complex grammatical computations. Critically, they also point to the capacity of bilateral temporal regions to support simple, linear grammatical computations. This is consistent with a dual neurobiological framework where phylogenetically older bihemispheric systems form part of the network that supports language function in the modern human, and where significant capacities for language comprehension remain intact even following severe left hemisphere damage. Hum Brain Mapp 36:1190-1201, 2015. © 2014 The Authors. Human Brain Mapping published by Wiley Periodicals, Inc. PMID:25421880

Bozic, Mirjana; Fonteneau, Elisabeth; Su, Li; Marslen-Wilson, William D

2015-03-01

152

EXPERIMENTAL DESIGN STRATEGY FOR THE WEIBULL DOSE RESPONSE MODEL (JOURNAL VERSION)

The objective of the research was to determine optimum design point allocation for estimation of relative yield losses from ozone pollution when the true and fitted yield-ozone dose response relationship follows the Weibull. The optimum design is dependent on the values of the We...

153

Strength statistics and the distribution of earthquake interevent times

NASA Astrophysics Data System (ADS)

The Weibull distribution is often used to model the earthquake interevent times distribution (ITD). We propose a link between the earthquake ITD on single faults with the Earth’s crustal shear strength distribution by means of a phenomenological stick-slip model. For single faults or fault systems with homogeneous strength statistics and power-law stress accumulation we obtain the Weibull ITD. We prove that the moduli of the interevent times and crustal shear strength are linearly related, while the time scale is an algebraic function of the scale of crustal shear strength. We also show that logarithmic stress accumulation leads to the log-Weibull ITD. We investigate deviations of the ITD tails from the Weibull model due to sampling bias, magnitude cutoff thresholds, and non-homogeneous strength parameters. Assuming the Gutenberg-Richter law and independence of the Weibull modulus on the magnitude threshold, we deduce that the interevent time scale drops exponentially with the magnitude threshold. We demonstrate that a microearthquake sequence from the island of Crete and a seismic sequence from Southern California conform reasonably well to the Weibull model.
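
The maximum-likelihood Weibull fit central to the abstract above can be sketched as follows, with synthetic interevent times standing in for a real catalog and the modulus and time-scale values chosen purely for illustration:

```python
# Sketch: ML fit of a two-parameter Weibull to earthquake interevent times.
# The data are synthetic stand-ins; parameter values are illustrative.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# hypothetical interevent times (days) from a Weibull with modulus 0.8
times = stats.weibull_min.rvs(c=0.8, scale=30.0, size=5000, random_state=rng)

# MLE fit with the location fixed at zero (two-parameter Weibull)
c_hat, loc, scale_hat = stats.weibull_min.fit(times, floc=0)
print(f"modulus ~ {c_hat:.2f}, time scale ~ {scale_hat:.1f} days")
```

A modulus below 1 corresponds to clustering (short intervals are over-represented relative to an exponential ITD), which is the regime typically reported for seismicity.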

Hristopulos, Dionissios T.; Mouslopoulou, Vasiliki

2013-02-01

154

First Experiences with LHC Grid Computing and Distributed Analysis

In this presentation, the experiences of the LHC experiments with grid computing are described, with a focus on distributed analysis. After many years of development, preparation, exercises, and validation, the LHC (Large Hadron Collider) experiments are in operation. The computing infrastructure has been heavily utilized in the first six months of data collection. The general experience of exploiting the grid infrastructure for organized processing and preparation is described, as well as the successes in employing the infrastructure for distributed analysis. Finally, the expected evolution and future plans are outlined.

Fisk, Ian

2010-12-01

155

Probabilistic analysis of the distributed power generation in weakly meshed distribution systems

This paper presents an approach for probabilistic analysis of unbalanced three-phase weakly meshed distribution systems considering uncertainty in load demand. In order to achieve high computational efficiency this approach uses both an efficient method for probabilistic analysis and a radial power flow. The probabilistic approach used is the well-known Two-Point Estimate Method. Meanwhile, the compensation-based radial power flow is used

C. A. Peñuela; G. E. Mauricio; J. R. S. Mantovani

2010-01-01

156

ANALYSIS OF DISTRIBUTION FEEDER LOSSES DUE TO ADDITION OF DISTRIBUTED PHOTOVOLTAIC GENERATORS

Distributed generators (DG) are small-scale power-supplying sources owned by customers or utilities and scattered throughout the power system distribution network. Distributed generation can be both renewable and non-renewable. Addition of distributed generation is primarily to increase feeder capacity and to provide peak load reduction. However, this addition comes with several impacts on the distribution feeder. Several studies have shown that addition of DG leads to reduction of feeder loss. However, most of these studies have considered lumped and distributed load models to analyze the effects on system losses, where the dynamic variation of load due to seasonal changes is ignored. It is very important for utilities to minimize losses under all scenarios to decrease revenue losses, promote efficient asset utilization, and thereby increase feeder capacity. This paper will investigate an IEEE 13-node feeder populated with photovoltaic generators on detailed residential houses with water heaters; heating, ventilation, and air conditioning (HVAC) units; lights; and other plug and convenience loads. An analysis of losses for different power system components, such as transformers, underground and overhead lines, and triplex lines, will be performed. The analysis will utilize different seasons and different solar penetration levels (15%, 30%).

Tuffner, Francis K.; Singh, Ruchi

2011-08-09

157

Energy loss analysis of an integrated space power distribution system

NASA Technical Reports Server (NTRS)

The results of studies related to conceptual topologies of an integrated utility-like space power system are described. The system topologies are comparatively analyzed by considering their transmission energy losses as functions of, mainly, distribution voltage level and load composition. The analysis is expedited by use of Distribution System Analysis and Simulation (DSAS) software. This recently developed computer program by the Electric Power Research Institute (EPRI) uses improved load models to solve the power flow within the system. However, present shortcomings of the software with regard to space applications, and incompletely defined characteristics of a space power system, make the results applicable only to the fundamental trends of energy losses of the topologies studied. An accounting, such as is included here, of the effects of the various parameters on system performance can constitute part of a planning tool for a space power distribution system.

Kankam, M. David; Ribeiro, P. F.

1992-01-01

158

Can Distributed Volunteers Accomplish Massive Data Analysis Tasks?

NASA Technical Reports Server (NTRS)

We argue that many image analysis tasks can be performed by distributed amateurs. Our pilot study, with crater surveying and classification, has produced encouraging results in terms of both quantity (100,000 crater entries in 2 months) and quality. Additional information is contained in the original extended abstract.

Kanefsky, B.; Barlow, N. G.; Gulick, V. C.

2001-01-01

159

Pointers and linked lists in electric power distribution circuit analysis

Electric power distribution circuit analysis programs must efficiently manage a large quantity of system and equipment data. Utility engineers now wish to use integrated software packages with several functions that work efficiently and share data. The use of data structures stored in linked lists and processed through pointers is described. The pointers and linked lists compact the data storage and

R. P. Broadwater; J. C. Thompson; T. E. McDermott

1991-01-01

160

Principal Component Analysis for Distributed Data Sets with Updating

… data sets is a key requirement in data mining. A powerful technique for this purpose is principal component analysis (PCA). PCA-based clustering algorithms are effective when the data sets are found…

Chan, Raymond

161

A Requirement Analysis for High Performance Distributed Computing over LAN's

The combined computing power of workstations is comparable to supercomputers; this computing power is sizable and, if harnessed, can provide a cost-effective alternative to expensive supercomputing resources. However, a number of obstacles have to be overcome before the full…

Parashar, Manish

162

Fréchet sensitivity analysis for partial differential equations with distributed parameters

This paper reviews Fréchet sensitivity analysis for partial differential equations with variations in distributed parameters. The Fréchet derivative provides a linear map between parametric variations and the linearized response of the solution. We propose a methodology based on representations of the Fréchet derivative operator to find those variations that lead to the largest changes to the solution (the…

Jeff Borggaard; Vitor Leite Nunes

2011-01-01

163

Rapid Analysis of Mass Distribution of Radiation Shielding

NASA Technical Reports Server (NTRS)

Radiation Shielding Evaluation Toolset (RADSET) is a computer program that rapidly calculates the spatial distribution of mass of an arbitrary structure for use in ray-tracing analysis of the radiation-shielding properties of the structure. RADSET was written to be used in conjunction with unmodified commercial computer-aided design (CAD) software that provides access to data on the structure and generates selected three-dimensional-appearing views of the structure. RADSET obtains raw geometric, material, and mass data on the structure from the CAD software. From these data, RADSET calculates the distribution(s) of the masses of specific materials about any user-specified point(s). The results of these mass-distribution calculations are imported back into the CAD computing environment, wherein the radiation-shielding calculations are performed.

Zapp, Edward

2007-01-01

164

Analyzing Distributed Functions in an Integrated Hazard Analysis

NASA Technical Reports Server (NTRS)

Large scale integration of today's aerospace systems is achievable through the use of distributed systems. Validating the safety of distributed systems is significantly more difficult as compared to centralized systems because of the complexity of the interactions between simultaneously active components. Integrated hazard analysis (IHA), a process used to identify unacceptable risks and to provide a means of controlling them, can be applied to either centralized or distributed systems. IHA, though, must be tailored to fit the particular system being analyzed. Distributed systems, for instance, must be analyzed for hazards in terms of the functions that rely on them. This paper will describe systems-oriented IHA techniques (as opposed to traditional failure-event or reliability techniques) that should be employed for distributed systems in aerospace environments. Special considerations will be addressed when dealing with specific distributed systems such as active thermal control, electrical power, command and data handling, and software systems (including the interaction with fault management systems). Because of the significance of second-order effects in large scale distributed systems, the paper will also describe how to analyze secondary functions to secondary functions through the use of channelization.

Morris, A. Terry; Massie, Michael J.

2010-01-01

165

Molecular Isotopic Distribution Analysis (MIDAs) with Adjustable Mass Accuracy

NASA Astrophysics Data System (ADS)

In this paper, we present Molecular Isotopic Distribution Analysis (MIDAs), a new software tool designed to compute molecular isotopic distributions with adjustable accuracies. MIDAs offers two algorithms, one polynomial-based and one Fourier-transform-based, both of which compute molecular isotopic distributions accurately and efficiently. The polynomial-based algorithm contains few novel aspects, whereas the Fourier-transform-based algorithm consists mainly of improvements to other existing Fourier-transform-based algorithms. We have benchmarked the performance of the two algorithms implemented in MIDAs with that of eight software packages (BRAIN, Emass, Mercury, Mercury5, NeutronCluster, Qmass, JFC, IC) using a consensus set of benchmark molecules. Under the proposed evaluation criteria, MIDAs's algorithms, JFC, and Emass compute with comparable accuracy the coarse-grained (low-resolution) isotopic distributions and are more accurate than the other software packages. For fine-grained isotopic distributions, we compared IC, MIDAs's polynomial algorithm, and MIDAs's Fourier transform algorithm. Among the three, IC and MIDAs's polynomial algorithm compute isotopic distributions that better resemble their corresponding exact fine-grained (high-resolution) isotopic distributions. MIDAs can be accessed freely through a user-friendly web-interface at http://www.ncbi.nlm.nih.gov/CBBresearch/Yu/midas/index.html.
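
The polynomial-based idea behind coarse-grained isotopic distributions can be sketched briefly: each element's isotope pattern is a polynomial in the mass-shift variable, and the molecular pattern is their product, i.e. a repeated convolution. The molecule (C2H5) is an arbitrary example and is not taken from the MIDAs benchmarks; abundances are standard values rounded for illustration:

```python
# Sketch of the polynomial (convolution) method for coarse-grained isotopic
# distributions; molecule and abundances are illustrative.
import numpy as np

carbon = np.array([0.9893, 0.0107])       # 12C, 13C abundances
hydrogen = np.array([0.99988, 0.00012])   # 1H, 2H abundances

dist = np.array([1.0])
for pattern, count in ((carbon, 2), (hydrogen, 5)):
    for _ in range(count):
        dist = np.convolve(dist, pattern)  # polynomial multiplication

print(f"monoisotopic fraction {dist[0]:.4f}, +1 fraction {dist[1]:.4f}")
```

Index k of `dist` is the probability of a total mass shift of k neutrons, which is exactly the coarse-grained (unit-mass-binned) distribution the abstract refers to.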

Alves, Gelio; Ogurtsov, Aleksey Y.; Yu, Yi-Kuo

2014-01-01

166

The truncated mean of an asymmetric distribution

This paper investigates a simple procedure to estimate robustly the mean of an asymmetric distribution. The procedure removes the observations which are larger or smaller than certain limits and takes the arithmetic mean of the remaining observations, the limits being determined with the help of a parametric model, e.g., the Gamma, the Weibull or the Lognormal distribution. The breakdown point,
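
The truncation procedure described above can be sketched as follows: fit a parametric model (a Gamma here, one of the suggested choices), set cut-off limits from its quantiles, and average what remains. The 1%/99% quantile choice is an illustrative assumption, not the paper's tuning:

```python
# Sketch of a parametric truncated mean; cut-off quantiles are illustrative.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
x = stats.gamma.rvs(a=2.0, scale=3.0, size=2000, random_state=rng)

# fit the parametric model, then derive truncation limits from its quantiles
a_hat, loc, scale_hat = stats.gamma.fit(x, floc=0)
lo, hi = stats.gamma.ppf([0.01, 0.99], a_hat, scale=scale_hat)

trimmed = x[(x >= lo) & (x <= hi)]
print(f"plain mean {x.mean():.2f}, truncated mean {trimmed.mean():.2f}")
```

For a right-skewed distribution the upper-tail trimming dominates, so the truncated mean sits below the arithmetic mean while being far less sensitive to stray large observations.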

A. Marazzi; C. Ruffieux

1999-01-01

167

Adaptive walks and distribution of beneficial fitness effects.

We study the adaptation dynamics of a maladapted asexual population on rugged fitness landscapes with many local fitness peaks. The distribution of beneficial fitness effects is assumed to belong to one of the three extreme value domains, viz. Weibull, Gumbel, and Fréchet. We work in the strong selection-weak mutation regime in which beneficial mutations fix sequentially, and the population performs an uphill walk on the fitness landscape until a local fitness peak is reached. A striking prediction of our analysis is that the fitness difference between successive steps follows a pattern of diminishing returns in the Weibull domain and accelerating returns in the Fréchet domain, as the initial fitness of the population is increased. These trends are found to be robust with respect to fitness correlations. We believe that this result can be exploited in experiments to determine the extreme value domain of the distribution of beneficial fitness effects. Our work here differs significantly from the previous ones that assume the selection coefficient to be small. On taking large effect mutations into account, we find that the length of the walk shows different qualitative trends from those derived using small selection coefficient approximation. PMID:24274696

Seetharaman, Sarada; Jain, Kavita

2014-04-01

168

Distributed signal analysis of free-floating paraboloidal membrane shells

NASA Astrophysics Data System (ADS)

Multifarious thin paraboloidal shell structures with unique geometric characteristics are utilized in aerospace, telecommunication and other engineering applications over the years. Governing equations of motion of paraboloidal shells are complicated and closed-form analytical solutions of these partial differential equations (PDEs) are difficult to derive. Furthermore, distributed monitoring technique and its resulting global sensing signals of thin flexible membrane shells are not well understood. This study focuses on spatially distributed modal sensing characteristics of free-floating flexible paraboloidal membrane shells laminated with distributed sensor patches based on a new set of assumed mode shape functions. In order to evaluate overall sensing/control effects, microscopic sensing signal characteristic, sensor segmentation and location of distributed sensors on thin paraboloidal membrane shells with different curvatures are investigated. Parametric analysis suggests that the signal generation depends on modal membrane strains in the meridional and circumferential directions in which the latter is more significant than the former, while all bending strains vanish in membrane shells. This study (1) demonstrates an analysis method for distributed sensors laminated on lightweight paraboloidal flexible structures and (2) identifies critical components and regions that generate significant signals for various shell modes.

Yue, H. H.; Deng, Z. Q.; Tzou, H. S.

2007-07-01

169

Parametrizing the local dark matter speed distribution: A detailed analysis

NASA Astrophysics Data System (ADS)

In a recent paper, a new parametrization for the dark matter (DM) speed distribution f(v) was proposed for use in the analysis of data from direct detection experiments. This parametrization involves expressing the logarithm of the speed distribution as a polynomial in the speed v. We present here a more detailed analysis of the properties of this parametrization. We show that the method leads to statistically unbiased mass reconstructions and exact coverage of credible intervals. The method performs well over a wide range of DM masses, even when finite energy resolution and backgrounds are taken into account. We also show how to select the appropriate number of basis functions for the parametrization. Finally, we look at how the speed distribution itself can be reconstructed, and how the method can be used to determine if the data are consistent with some test distribution. In summary, we show that this parametrization performs consistently well over a wide range of input parameters and over large numbers of statistical ensembles and can therefore reliably be used to reconstruct both the DM mass and speed distribution from direct detection data.
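
The parametrization described above (the logarithm of the speed distribution written as a polynomial in v) can be sketched directly; the coefficients and the cut-off speed below are placeholder assumptions, not fitted halo values:

```python
# Sketch: log f(v) as a polynomial in v, normalized numerically.
# Coefficients and speed range are illustrative assumptions.
import numpy as np
from scipy.integrate import trapezoid

v = np.linspace(0.0, 760.0, 1000)   # speeds up to an assumed escape speed (km/s)
a = [2.0e-2, -7.0e-5]               # hypothetical polynomial coefficients a_k

log_f = sum(ak * v ** (k + 1) for k, ak in enumerate(a))
f = np.exp(log_f)
f /= trapezoid(f, v)                # normalize so the speed pdf integrates to 1
norm = trapezoid(f, v)
print(f"normalization check: {norm:.3f}")
```

Writing the *logarithm* as the polynomial guarantees f(v) > 0 for any coefficient values, which is what makes the basis convenient for unbiased reconstructions.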

Kavanagh, Bradley J.

2014-04-01

170

False-Alarm Regulation in Log-Normal and Weibull Clutter

Automatic detection radars require some method of adapting to variations in the background clutter in order to control their false-alarm rate. Conventional cell-averaging techniques designed to maintain a constant false-alarm rate in Rayleigh clutter will fail to control the false-alarm rate in more severe clutter environments such as log-normal or Weibull clutter. A processor is described which is capable of
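
The conventional cell-averaging CFAR baseline referred to above has a closed-form threshold in Rayleigh (exponential-power) clutter, which is precisely the multiplier that stops being valid in log-normal or Weibull clutter. As a sketch, with illustrative choices of reference-cell count and design false-alarm rate:

```python
# Sketch of the cell-averaging CFAR threshold multiplier for Rayleigh clutter;
# N and Pfa are illustrative choices.
N = 16        # number of reference cells averaged
pfa = 1e-6    # design false-alarm probability

# Pfa = (1 + alpha/N)^(-N)  =>  alpha = N * (Pfa^(-1/N) - 1)
alpha = N * (pfa ** (-1.0 / N) - 1)
print(f"threshold multiplier alpha ~ {alpha:.2f}")
```

The detection threshold is then alpha times the reference-cell average; in heavier-tailed clutter this fixed multiplier lets the false-alarm rate climb, motivating the processor the paper describes.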

G. B. Goldstein

1973-01-01

171

Spatial Distribution Analysis of Scrub Typhus in Korea

Objective: This study analyzes the spatial distribution of scrub typhus in Korea. Methods: A spatial distribution of Orientia tsutsugamushi occurrence using a geographic information system (GIS) is presented, and analyzed by means of spatial clustering and correlations. Results: The provinces of Gangwon-do and Gyeongsangbuk-do show a low incidence throughout the year. Some districts have almost identical environmental conditions of scrub typhus incidence. The land use change of districts does not directly affect the incidence rate. Conclusion: GIS analysis shows the spatial characteristics of scrub typhus. This research can be used to construct a spatial-temporal model to understand the epidemic tsutsugamushi. PMID:24159523

Jin, Hong Sung; Chu, Chaeshin; Han, Dong Yeob

2013-01-01

172

Analysis of neurotransmitter receptor distribution patterns in the cerebral cortex.

The concentration of transmitter receptors varies between different locations in the human cerebral cortex, but also between the different cortical layers within the same area. Analyzing the regional differences in the laminar distribution patterns of various neurotransmitter receptor binding sites by means of quantitative receptor autoradiography may thus provide a functionally relevant insight into the organization of the cortex. Here we introduce an approach to the analysis of in vitro receptor autoradiographic data, providing a framework for the assessment of differences in both mean concentration and laminar distribution patterns across multiple subjects. Initially, laminar receptor distribution patterns for cortical areas are quantified by sampling density profiles in a series of regions of interest (ROIs) from digitalized autoradiographs and computing a mean profile per ROI. These ROI mean profiles are then corrected for distortions in the laminar pattern introduced by cortical folding and averaged to yield a mean profile per area, receptor and hemisphere. Differences in mean binding site concentration between areas are quantified by the asymmetry coefficient which is the difference of the mean concentrations divided by their sum. To quantify differences in laminar receptor distribution patterns between areas, the effects of absolute binding site concentration are first removed by dividing each profile by its mean value. Differences in the laminar pattern were then quantified by calculating the Euclidean distance between these mean corrected profiles. For single subject analysis, we propose a permutation test, comparing the differences between the mean profiles for two areas to differences between groups of profiles randomly assembled from all ROIs sampled in either area. Group inference can then be based on a between-subject conjunction analysis after correcting p-values to control for multiple comparisons. 
The feasibility of the presented approach is demonstrated by an exemplary analysis of the neurochemical differences between the ventral parts of the second and the third visual area. PMID:17182260
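
The two quantities defined in the abstract above, the asymmetry coefficient and the laminar-pattern distance, can be sketched with made-up profiles standing in for real autoradiographic ROI data:

```python
# Sketch of the asymmetry coefficient and mean-corrected profile distance;
# the profiles are invented stand-ins for autoradiographic data.
import numpy as np

rng = np.random.default_rng(2)
depth = np.linspace(0, np.pi, 20)  # cortical depth bins (arbitrary units)
profile_a = 50 + 10 * np.sin(depth) + rng.normal(0, 1, 20)
profile_b = 70 + 5 * np.cos(depth) + rng.normal(0, 1, 20)

# asymmetry coefficient: difference of mean concentrations over their sum
asym = (profile_a.mean() - profile_b.mean()) / (profile_a.mean() + profile_b.mean())

# laminar-pattern distance: divide each profile by its own mean to remove
# absolute concentration, then take the Euclidean distance
d = np.linalg.norm(profile_a / profile_a.mean() - profile_b / profile_b.mean())
print(f"asymmetry {asym:.3f}, laminar distance {d:.3f}")
```

The permutation test the authors propose then compares the observed distance against distances between groups of profiles randomly reassembled from the two areas' ROIs.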

Eickhoff, Simon B; Schleicher, Axel; Scheperjans, Filip; Palomero-Gallagher, Nicola; Zilles, Karl

2007-02-15

173

Electrical Power Distribution and Control Modeling and Analysis

NASA Technical Reports Server (NTRS)

This slide presentation reviews the use of Electrical Power Distribution and Control (EPD&C) Modeling and how modeling can support analysis. The presentation discusses using the EASY5 model to simulate and analyze the Space Shuttle Electric Auxiliary Power Unit. Diagrams of the model schematics are included, as well as graphs of the battery cell impedance, hydraulic load dynamics, and EPD&C response to hydraulic load variations.

Fu, Johnny S.; Liffring, Mark; Mehdi, Ishaque S.

2001-01-01

174

Dynamical analysis of Cohen Grossberg neural networks with distributed delays

NASA Astrophysics Data System (ADS)

In this Letter, a model describing dynamics of Cohen Grossberg neural networks with distributed delays is considered. Without assuming Lipschitz conditions on activation functions, by employing Brouwer's fixed point theorem and applying inequality technique, some new sufficient conditions on the existence, uniqueness and exponential stability of equilibrium point are obtained. Finally, two examples with their numerical simulations are provided to show the correctness of our analysis.

Mao, Zisen; Zhao, Hongyong

2007-04-01

175

Automatic analysis of attack data from distributed honeypot network

NASA Astrophysics Data System (ADS)

There are many ways of getting real data about malicious activity in a network. One of them relies on masquerading monitoring servers as production ones. These servers are called honeypots, and data about attacks on them brings us valuable information about actual attacks and techniques used by hackers. The article describes a distributed topology of honeypots, which was developed with a strong orientation toward monitoring of IP telephony traffic. IP telephony servers can be easily exposed to various types of attacks, and without protection this situation can lead to loss of money and other unpleasant consequences. Using a distributed topology with honeypots placed in different geographical locations and networks provides more valuable and independent results. With an automatic system for gathering information from all honeypots, it is possible to work with all information at one centralized point. Communication between the honeypots and the centralized data store uses secure SSH tunnels, and the server communicates only with authorized honeypots. The centralized server also automatically analyzes data from each honeypot. Results of this analysis, along with other statistical data about malicious activity, are easily accessible through a built-in web server. All statistical and analysis reports serve as the information basis for an algorithm which classifies different types of VoIP attacks. The web interface then provides a tool for quick comparison and evaluation of actual attacks in all monitored networks. The article describes both the honeypot nodes in the distributed architecture, which monitor suspicious activity, and the methods and algorithms used on the server side for analysis of the gathered data.

Safarik, Jakub; Voznak, MIroslav; Rezac, Filip; Partila, Pavol; Tomala, Karel

2013-05-01

176

Distribution System Reliability Analysis for Smart Grid Applications

NASA Astrophysics Data System (ADS)

Reliability of power systems is a key aspect in modern power system planning, design, and operation. The ascendance of the smart grid concept has brought high hopes of developing an intelligent, self-healing network, offering the ability to overcome the interruption problems that face the utility and cost it tens of millions of dollars in repairs and losses. To address these reliability concerns, the power utilities and interested parties have spent extensive amounts of time and effort to analyze and study the reliability of the generation and transmission sectors of the power grid. Only recently has attention shifted to improving the reliability of the distribution network, the connection joint between the power providers and the consumers where most electricity problems occur. In this work, we examine the effect of smart grid applications on the reliability of power distribution networks. The test system used in this thesis is the IEEE 34-node test feeder, released in 2003 by the Distribution System Analysis Subcommittee of the IEEE Power Engineering Society. The objective is to analyze the feeder for the optimal placement of automatic switching devices and quantify their proper installation based on the performance of the distribution system, measured by changes in the reliability indices, including SAIDI, SAIFI, and EUE. The goal is to design and simulate the effect of installing Distributed Generators (DGs) on the utility's distribution system and to measure the potential improvement in its reliability. The software used in this work is DISREL, an intelligent power-distribution package developed by General Reliability Co.
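
The reliability indices named in the abstract above follow directly from an outage log; as a sketch, with an invented set of interruption events and an assumed customer count (CAIDI, the average interruption duration, is added as the standard derived index):

```python
# Sketch of SAIFI/SAIDI computation from a toy outage log; all events and
# customer counts are invented for illustration.
TOTAL_CUSTOMERS = 1000

# (customers interrupted, outage duration in hours) per sustained interruption
events = [(200, 1.5), (50, 4.0), (400, 0.5)]

saifi = sum(n for n, _ in events) / TOTAL_CUSTOMERS      # interruptions/customer
saidi = sum(n * h for n, h in events) / TOTAL_CUSTOMERS  # hours/customer
caidi = saidi / saifi                                    # hours/interruption
print(f"SAIFI {saifi:.2f}, SAIDI {saidi:.2f} h, CAIDI {caidi:.2f} h")
```

Placing automatic switches or DGs so that fewer customers see each fault (or see it for less time) lowers exactly these sums, which is how the optimization in the thesis is scored.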

Aljohani, Tawfiq Masad

177

Probabilistic Life and Reliability Analysis of Model Gas Turbine Disk

NASA Technical Reports Server (NTRS)

In 1939, W. Weibull developed what is now commonly known as the "Weibull Distribution Function," primarily to determine the cumulative strength distribution of small sample sizes of elemental fracture specimens. In 1947, G. Lundberg and A. Palmgren, using the Weibull Distribution Function, developed a probabilistic lifing protocol for ball and roller bearings. In 1987, E. V. Zaretsky, using the Weibull Distribution Function, modified the Lundberg and Palmgren approach to life prediction. His method incorporates the results of coupon fatigue testing to compute the life of elemental stress volumes of a complex machine element in order to predict system life and reliability. This paper examines the Zaretsky method to determine the probabilistic life and reliability of a model gas turbine disk using experimental data from coupon specimens. The predicted results are compared to experimental disk endurance data.
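
The weakest-link idea underlying this lifing approach can be sketched with a two-parameter Weibull survival function per stressed element, combined by multiplication; the shape and characteristic lives below are illustrative assumptions, not values from the disk study:

```python
# Sketch of weakest-link system reliability from elemental Weibull survival
# functions; all parameter values are illustrative.
import math

def weibull_survival(t, beta, eta):
    """Probability an element survives to life t (two-parameter Weibull)."""
    return math.exp(-((t / eta) ** beta))

beta = 1.5                      # Weibull slope (shape) from coupon fatigue tests
etas = [2.0e4, 3.0e4, 5.0e4]    # characteristic lives (cycles) of three elements

t = 1.0e4
system_survival = 1.0
for eta in etas:
    system_survival *= weibull_survival(t, beta, eta)  # series (weakest-link) system
print(f"system reliability at {t:.0f} cycles: {system_survival:.3f}")
```

Because the survivals multiply, the system is always less reliable than its weakest element, which is why subdividing a disk into many elemental stress volumes drives the predicted system life down toward the observed endurance data.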

Holland, Frederic A.; Melis, Matthew E.; Zaretsky, Erwin V.

2002-01-01

178

Area and Flux Distributions of Active Regions, Sunspot Groups, and Sunspots: A Multi-Database Study

In this work we take advantage of eleven different sunspot group, sunspot, and active region databases to characterize the area and flux distributions of photospheric magnetic structures. We find that, when taken separately, different databases are better fitted by different distributions (as has been reported previously in the literature). However, we find that all our databases can be reconciled by the simple application of a proportionality constant, and that, in reality, different databases are sampling different parts of a composite distribution. This composite distribution is made up of a linear combination of Weibull and log-normal distributions, where a pure Weibull (log-normal) characterizes the distribution of structures with fluxes below $10^{21}$ Mx (above $10^{22}$ Mx). Additionally, we demonstrate that the Weibull distribution shows the expected linear behaviour of a power-law distribution (when extended into smaller fluxes), making our results compatible with the results of Parnell et al. (200...

Muñoz-Jaramillo, Andrés; Windmueller, John C; Amouzou, Ernest C; Longcope, Dana W; Tlatov, Andrey G; Nagovitsyn, Yury A; Pevtsov, Alexei A; Chapman, Gary A; Cookson, Angela M; Yeates, Anthony R; Watson, Fraser T; Balmaceda, Laura A; DeLuca, Edward E; Martens, Petrus C H

2014-01-01

179

GPS FOM Chimney Analysis using Generalized Extreme Value Distribution

NASA Technical Reports Server (NTRS)

Many a time, an objective of a statistical analysis is to estimate a limit value, such as a 3-sigma 95% confidence upper limit, from a data sample. The Generalized Extreme Value (GEV) distribution method can be profitably employed in many such situations. It is well known that, according to the Central Limit Theorem, the mean value of a large data set is normally distributed irrespective of the distribution of the data from which the mean is derived. In a somewhat similar fashion, it is observed that the extreme value of a data set often has a distribution that can be formulated as a Generalized Extreme Value distribution. In Space Shuttle entry with 3-string GPS navigation, the Figure of Merit (FOM) value gives a measure of GPS navigated-state accuracy. A GPS navigated state with a FOM of 6 or higher is deemed unacceptable and is said to form a FOM-6-or-higher chimney; a FOM chimney is a period of time during which the FOM value stays higher than 5. A longer period with a FOM value of 6 or higher causes the navigated state to accumulate more error for lack of state updates. For an acceptable landing, it is imperative that the state error remain low; hence, at low altitude during entry, GPS data with FOM greater than 5 must not last more than 138 seconds. To test GPS performance, many entry test cases were simulated at the Avionics Development Laboratory. Only high-value FOM chimneys are consequential, and the extreme value statistical technique is applied to analyze them. The maximum likelihood method is used to determine the parameters that characterize the GEV distribution, and the limit value statistics are then estimated.
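A minimal sketch of the block-maxima workflow the abstract describes, with one simplification: instead of full maximum likelihood over the three-parameter GEV, it fits the Gumbel member of the GEV family (shape = 0) by the method of moments and reads off a high quantile as the limit value. The data are simulated; nothing here comes from the shuttle analysis.

```python
import math
import random
import statistics

# Simulate block maxima: the max of each block of 500 Gaussian samples.
random.seed(1)
block_maxima = [max(random.gauss(0, 1) for _ in range(500)) for _ in range(200)]

# Method-of-moments fit of the Gumbel (GEV with shape = 0) to the maxima.
mean = statistics.fmean(block_maxima)
std = statistics.stdev(block_maxima)
beta = std * math.sqrt(6) / math.pi          # Gumbel scale
mu = mean - 0.5772156649 * beta              # location (Euler-Mascheroni const.)

# Limit value: the p-quantile of the fitted Gumbel distribution.
p = 0.95
limit = mu - beta * math.log(-math.log(p))
print(f"Gumbel fit: mu={mu:.3f}, beta={beta:.3f}, 95% limit={limit:.3f}")
```

A maximum likelihood fit of the full GEV (as in the paper) would additionally estimate the shape parameter, distinguishing Fréchet, Gumbel, and Weibull tails.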

Ott, Rick; Frisbee, Joe; Saha, Kanan

2004-01-01

180

A comprehensive study of distribution laws for the fragments of Košice meteorite

In this study, we conduct a detailed analysis of the Košice meteorite fall (February 28, 2010), in order to derive a reliable law describing the mass distribution among the recovered fragments. In total, 218 fragments of the Košice meteorite, with a total mass of 11.285 kg, were analyzed. Bimodal Weibull, bimodal Grady and bimodal lognormal distributions are found to be the most appropriate for describing the Košice fragmentation process. Based on the assumption of bimodal lognormal, bimodal Grady, bimodal sequential and bimodal Weibull fragmentation distributions, we suggest that, prior to further extensive fragmentation in the lower atmosphere, the Košice meteoroid was initially represented by two independent pieces with cumulative residual masses of approximately 2 kg and 9 kg respectively. The smaller piece produced about 2 kg of multiple lightweight meteorite fragments with the mean around 12 g. The larger one resulted in 9 kg of meteorite fragments, recovered on the ground, including the...

Gritsevich, Maria; Kohout, Tomáš; Tóth, Juraj; Peltoniemi, Jouni; Turchak, Leonid; Virtanen, Jenni

2014-01-01

181

A comprehensive study of distribution laws for the fragments of Košice meteorite

NASA Astrophysics Data System (ADS)

In this study, we conduct a detailed analysis of the Košice meteorite fall (February 28, 2010), to derive a reliable law describing the mass distribution among the recovered fragments. In total, 218 fragments of the Košice meteorite, with a total mass of 11.285 kg, were analyzed. Bimodal Weibull, bimodal Grady, and bimodal lognormal distributions are found to be the most appropriate for describing the Košice fragmentation process. Based on the assumption of bimodal lognormal, bimodal Grady, bimodal sequential, and bimodal Weibull fragmentation distributions, we suggest that, prior to further extensive fragmentation in the lower atmosphere, the Košice meteoroid was initially represented by two independent pieces with cumulative residual masses of approximately 2 and 9 kg, respectively. The smaller piece produced about 2 kg of multiple lightweight meteorite fragments with the mean around 12 g. The larger one resulted in 9 kg of meteorite fragments, recovered on the ground, including the two heaviest pieces of 2.374 kg and 2.167 kg with the mean around 140 g. Based on our investigations, we conclude that two to three larger fragments of 500-1000 g each should exist, but were either not recovered or not reported by illegal meteorite hunters.

Gritsevich, Maria; Vinnikov, Vladimir; Kohout, Tomáš; Tóth, Juraj; Peltoniemi, Jouni; Turchak, Leonid; Virtanen, Jenni

2014-03-01

182

Thermographic Analysis of Stress Distribution in Welded Joints

NASA Astrophysics Data System (ADS)

The fatigue life prediction of welded joints based on S-N curves in conjunction with nominal stresses is generally not reliable. The stress distribution in the welded area, affected by geometrical inhomogeneity, an irregular welded surface, and the weld toe radius, is quite complex, so the local (structural) stress concept has been adopted in recent papers. The aim of this paper is to determine the stress distribution in plate-type aluminum welded joints, to analyze the reliability of TSA (Thermal Stress Analysis) in this kind of investigation, and to obtain numerical values of stress concentration factors for practical use. The stress distribution in aluminum butt and fillet welded joints is determined using three different methods: strain gauge measurement, thermal stress analysis, and FEM. The obtained results show good agreement: TSA confirmed both the FEM model and the stresses measured by strain gauges. According to these results, TSA, as a relatively new measurement technique, may in the future become a standard tool for the experimental investigation of stress concentration and fatigue in welded joints, helping to develop more accurate numerical tools for fatigue life prediction.

Piršić, T.; Krstulović Opara, L.; Domazet, Ž.

2010-06-01

183

Statistical Extreme Value Analysis Methodological improvements

Statistical extreme value theory: the Generalized Extreme Value (GEV) distribution, F(x) = exp{-[1 + ξ(x - μ)/σ]^(-1/ξ)}, unifies the heavy-tailed Fréchet distribution (ξ > 0), the bounded Weibull distribution (ξ < 0), and the Gumbel distribution (ξ = 0). Asymptotic theory says that the distribution of block maxima (or minima) follows a GEV distribution.

Paciorek, Chris

184

Analysis of advertising material distributed through pharmacies and drugstores.

A documental analysis was conducted to evaluate advertising material distributed through pharmacies and drugstores for compliance with Resolutions n. 102/2000 and 96/2008 of the Collegiate Board of Brazil's National Agency for Sanitary Surveillance. Brochures distributed through five pharmacies and drugstores in the city of Tubarão, Southern Brazil, were collected between May and November 2008. The 17 analyzed brochures advertised 2,444 products, of which 680 were medicines. Of these, 13.7% were controlled drugs; half of them had no registration number with the Ministry of Health, and 77.9% had a registration number that did not match. Information on drug indications and safety was omitted. The results showed that the drug advertising materials were not in accordance with the aforementioned resolutions. PMID:21181059

Galato, Dayani; Pereira, Greicy Borges; Valgas, Cleidson

2011-02-01

185

Performance Analysis of an Actor-Based Distributed Simulation

NASA Technical Reports Server (NTRS)

Object-oriented design of simulation programs appears to be very attractive because of the natural association of components in the simulated system with objects. There is great potential in distributing the simulation across several computers for the purpose of parallel computation and its consequent handling of larger problems in less elapsed time. One approach to such a design is to use "actors", that is, active objects with their own thread of control. Because these objects execute concurrently, communication is via messages. This is in contrast to an object-oriented design using passive objects where communication between objects is via method calls (direct calls when they are in the same address space and remote procedure calls when they are in different address spaces or different machines). This paper describes a performance analysis program for the evaluation of a design for distributed simulations based upon actors.

Schoeffler, James D.

1998-01-01

186

Principal Component Analysis for Normal-Distribution-Valued Symbolic Data.

This paper puts forward a new approach to principal component analysis (PCA) for normal-distribution-valued symbolic data, which has a vast potential of applications in the economic and management fields. We derive a full set of numerical characteristics and the variance-covariance structure for such data, which forms the foundation for our analytical PCA approach. Our approach is able to use all of the variance information in the original data, rather than only centers, vertices, etc., as in the prevailing representative-type approach in the literature. The paper also provides an accurate approach to constructing the observations in a PC space based on the linear additivity property of the normal distribution. The effectiveness of the proposed method is illustrated by simulated numerical experiments. Finally, our method is applied to explain the puzzle of the risk-return tradeoff in China's stock market. PMID:25095276

Wang, Huiwen; Chen, Meiling; Shi, Xiaojun; Li, Nan

2014-07-29

187

Fast Electromagnetic Interference Analysis of Distributed Networks using Longitudinal Partitioning -- In this paper, a waveform relaxation algorithm for the fast electromagnetic interference analysis of distributed networks is presented, and an example is provided to demonstrate the validity of the proposed algorithm. Index Terms -- Electromagnetic interference

Roy, Sourajeet

188

Multiobjective sensitivity analysis and optimization of distributed hydrologic model MOBIDIC

NASA Astrophysics Data System (ADS)

Calibration of distributed hydrologic models usually involves how to deal with the large number of distributed parameters and optimization problems with multiple but often conflicting objectives that arise in a natural fashion. This study presents a multiobjective sensitivity and optimization approach to handle these problems for the MOBIDIC (MOdello di Bilancio Idrologico DIstribuito e Continuo) distributed hydrologic model, which combines two sensitivity analysis techniques (the Morris method and the state-dependent parameter (SDP) method) with multiobjective optimization (MOO) approach ε-NSGAII (Non-dominated Sorting Genetic Algorithm-II). This approach was implemented to calibrate MOBIDIC with its application to the Davidson watershed, North Carolina, with three objective functions, i.e., the standardized root mean square error (SRMSE) of logarithmic transformed discharge, the water balance index, and the mean absolute error of the logarithmic transformed flow duration curve, and its results were compared with those of a single objective optimization (SOO) with the traditional Nelder-Mead simplex algorithm used in MOBIDIC by taking the objective function as the Euclidean norm of these three objectives. Results show that (1) the two sensitivity analysis techniques are effective and efficient for determining the sensitive processes and insensitive parameters: surface runoff and evaporation are very sensitive processes to all three objective functions, while groundwater recession and soil hydraulic conductivity are not sensitive and were excluded in the optimization. (2) Both MOO and SOO lead to acceptable simulations; e.g., for MOO, the average Nash-Sutcliffe value is 0.75 in the calibration period and 0.70 in the validation period. (3) Evaporation and surface runoff show similar importance for watershed water balance, while the contribution of baseflow can be ignored. 
(4) Compared to SOO, which was dependent on the initial starting location, MOO provides more insight into parameter sensitivity and the conflicting characteristics of these objective functions. Multiobjective sensitivity analysis and optimization provide an alternative way for future MOBIDIC modeling.

Yang, J.; Castelli, F.; Chen, Y.

2014-10-01

189

Time series power flow analysis for distribution connected PV generation.

Distributed photovoltaic (PV) projects must go through an interconnection study process before connecting to the distribution grid. These studies are intended to identify the likely impacts and mitigation alternatives. In the majority of the cases, system impacts can be ruled out or mitigation can be identified without an involved study, through a screening process or a simple supplemental review study. For some proposed projects, expensive and time-consuming interconnection studies are required. The challenges to performing the studies are twofold. First, every study scenario is potentially unique, as the studies are often highly specific to the amount of PV generation capacity that varies greatly from feeder to feeder and is often unevenly distributed along the same feeder. This can cause location-specific impacts and mitigations. The second challenge is the inherent variability in PV power output which can interact with feeder operation in complex ways, by affecting the operation of voltage regulation and protection devices. The typical simulation tools and methods in use today for distribution system planning are often not adequate to accurately assess these potential impacts. This report demonstrates how quasi-static time series (QSTS) simulation and high time-resolution data can be used to assess the potential impacts in a more comprehensive manner. The QSTS simulations are applied to a set of sample feeders with high PV deployment to illustrate the usefulness of the approach. The report describes methods that can help determine how PV affects distribution system operations. The simulation results are focused on enhancing the understanding of the underlying technical issues. The examples also highlight the steps needed to perform QSTS simulation and describe the data needed to drive the simulations. 
The goal of this report is to make the methodology of time series power flow analysis readily accessible to utilities and others responsible for evaluating potential PV impacts.

Broderick, Robert Joseph; Quiroz, Jimmy Edward; Ellis, Abraham; Reno, Matthew J. [Georgia Institute of Technology, Atlanta, GA; Smith, Jeff [Electric Power Research Institute, Knoxville, TN; Dugan, Roger [Electric Power Research Institute, Knoxville, TN

2013-01-01

190

Analysis of dilepton angular distributions in a parity breaking medium

NASA Astrophysics Data System (ADS)

We investigate how local parity breaking due to large topological fluctuations may affect hadron physics. A modified dispersion relation is derived for the lightest vector mesons ρ and ω. They exhibit a mass splitting depending on their polarization. We present a detailed analysis of the angular distribution associated to the lepton pairs created from these mesons searching for polarization dependencies. We propose two angular variables that carry information related to the parity breaking effect. Possible signatures for experimental detection of local parity breaking that could potentially be seen by the PHENIX and STAR collaborations are discussed.

Andrianov, A. A.; Andrianov, V. A.; Espriu, D.; Planells, X.

2014-08-01

191

Numerical analysis of decoy state quantum key distribution protocols

Decoy state protocols are a useful tool for many quantum key distribution systems implemented with weak coherent pulses, allowing significantly better secret bit rates and longer maximum distances. In this paper we present a method to numerically find optimal three-level protocols, and we examine how the secret bit rate and the optimized parameters are dependent on various system properties, such as session length, transmission loss, and visibility. Additionally, we show how to modify the decoy state analysis to handle partially distinguishable decoy states as well as uncertainty in the prepared intensities.

Patrick Rice; Jim Harrington

2009-01-23

192

Numerical analysis of decoy state quantum key distribution protocols

Decoy state protocols are a useful tool for many quantum key distribution systems implemented with weak coherent pulses, allowing significantly better secret bit rates and longer maximum distances. In this paper we present a method to numerically find optimal three-level protocols, and we examine how the secret bit rate and the optimized parameters are dependent on various system properties, such as session length, transmission loss, and visibility. Additionally, we show how to modify the decoy state analysis to handle partially distinguishable decoy states as well as uncertainty in the prepared intensities.

Harrington, Jim W [Los Alamos National Laboratory; Rice, Patrick R [Los Alamos National Laboratory

2008-01-01

193

Clustering Analysis of Seismicity and Aftershock Identification

We introduce a statistical methodology for clustering analysis of seismicity in the time-space-energy domain and use it to establish the existence of two statistically distinct populations of earthquakes: clustered and nonclustered. This result can be used, in particular, for nonparametric aftershock identification. The proposed approach expands the analysis of Baiesi and Paczuski [Phys. Rev. E 69, 066106 (2004)] based on the space-time-magnitude nearest-neighbor distance η between earthquakes. We show that for a homogeneous Poisson marked point field with exponential marks, the distance η has the Weibull distribution, which bridges our results with classical correlation analysis for point fields. The joint 2D distribution of spatial and temporal components of η is used to identify the clustered part of a point field. The proposed technique is applied to several seismicity models and to the observed seismicity of southern California.
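A simplified, self-contained illustration of the Weibull connection: for a homogeneous planar Poisson process the nearest-neighbor distance R satisfies P(R > r) = exp(-λπr²), a Weibull law with shape 2, and the shape can be read off as the slope of a Weibull probability plot. This is a toy 2-D spatial case, not the full space-time-magnitude distance η of the paper.

```python
import math
import random

# Simulate an approximately homogeneous Poisson field in the unit square.
random.seed(7)
n = 1000
pts = [(random.random(), random.random()) for _ in range(n)]

def nn_dist(i):
    """Brute-force nearest-neighbor distance of point i."""
    xi, yi = pts[i]
    return min(math.hypot(xi - x, yi - y)
               for j, (x, y) in enumerate(pts) if j != i)

d = sorted(nn_dist(i) for i in range(n))

# Weibull probability plot: log(-log(1 - F)) vs log(d) is linear with
# slope equal to the Weibull shape parameter (theory predicts 2 here).
xs, ys = [], []
for k, r in enumerate(d):
    f = (k + 0.5) / n                 # plotting position
    if 0.05 < f < 0.95:               # avoid unstable tails
        xs.append(math.log(r))
        ys.append(math.log(-math.log(1.0 - f)))
mx = sum(xs) / len(xs)
my = sum(ys) / len(ys)
slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
print(f"estimated Weibull shape ~ {slope:.2f} (theory: 2)")
```

Departures from the theoretical Weibull line are exactly what the clustering methodology exploits: clustered events produce an excess of short nearest-neighbor distances.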

Zaliapin, Ilya; Gabrielov, Andrei; Keilis-Borok, Vladimir; Wong, Henry [Department of Mathematics and Statistics, University of Nevada, Reno, Nevada 89557-0084 (United States); Departments of Mathematics and Earth and Atmospheric Sciences, Purdue University, West Lafayette, Indiana 47907-1395 (United States); Institute of Geophysics and Planetary Physics, and Department of Earth and Space Sciences, University of California Los Angeles, 3845 Slichter Hall, Los Angeles, California 90095-1567 (United States)

2008-07-04

194

Subsystem Interaction Analysis in Power Distribution Systems of Next Generation Airlifters

Subsystem Interaction Analysis in Power Distribution Systems of Next Generation Airlifters -- The analysis of subsystem interactions in the power distribution system of a next generation transport aircraft is addressed, with detailed analysis of subsystem integration in power distribution systems of next generation transport aircraft.

Lindner, Douglas K.

195

Circularly symmetric distributed feedback semiconductor laser: An analysis

We analyze the near-threshold behavior of a circularly symmetric distributed feedback laser by developing a coupled-mode theory analysis for all azimuthal modes. We show that the equations that describe the low-order azimuthal modes are, to a very good approximation, the same as those for the one-dimensional (linear) distributed feedback laser. We examine the behavior of higher-order azimuthal modes by numerically solving the exact coupled-mode equations. We find that while a significant amount of mode discrimination exists among radial (longitudinal) modes, as in the one-dimensional distributed feedback laser, there is a much smaller degree of discrimination among azimuthal modes, indicating probability of multimode operation. Despite the multimode behavior, we find that the frequency bandwidth associated with modes that do lase ought to be smaller than the spacing between Fabry-Perot modes of a typical semiconductor laser. This laser is an excellent candidate for a surface-emitting laser---it should have a superb quality output beam and is well-suited for array operation.

Erdogan, T.; Hall, D.G. (The Institute of Optics, University of Rochester, Rochester, New York 14627 (USA))

1990-08-15

196

Circularly symmetric distributed feedback semiconductor laser: An analysis

We analyze the near-threshold behavior of a circularly symmetric distributed feedback laser by developing a coupled-mode theory analysis for all azimuthal modes. We show that the equations that describe the low-order azimuthal modes are, to a very good approximation, the same as those for the one-dimensional (linear) distributed feedback laser. We examine the behavior of higher-order azimuthal modes by numerically solving the exact coupled-mode equations. We find that while a significant amount of mode discrimination exists among radial (longitudinal) modes, as in the one-dimensional distributed feedback laser, there is a much smaller degree of discrimination among azimuthal modes, indicating probability of multimode operation. Despite the multimode behavior, we find that the frequency bandwidth associated with modes that do lase ought to be smaller than the spacing between Fabry-Perot modes of a typical semiconductor laser. This laser is an excellent candidate for a surface-emitting laser: it should have a superb quality output beam and is well-suited for array operation.

Erdogan, T.; Hall, D.G.

1990-08-15

197

Analysis of Fuel Ethanol Transportation Activity and Potential Distribution Constraints

This paper provides an analysis of fuel ethanol transportation activity and potential distribution constraints if the total 36 billion gallons of renewable fuel use by 2022 is mandated by EPA under the Energy Independence and Security Act (EISA) of 2007. Ethanol transport by domestic truck, marine, and rail distribution systems from ethanol refineries to blending terminals is estimated using Oak Ridge National Laboratory's (ORNL's) North American Infrastructure Network Model. Most supply and demand data provided by EPA were geo-coded, and the transportation infrastructure network was updated using available commercial sources. The percentage increases in ton-mile movements by rail, waterways, and highways in 2022 are estimated to be 2.8%, 0.6%, and 0.13%, respectively, compared to the corresponding 2005 total domestic flows by various modes. Overall, a significantly higher level of future ethanol demand would have minimal impacts on transportation infrastructure. However, there will be spatial impacts and a significant level of investment required because of a considerable increase in rail traffic from refineries to ethanol distribution terminals.

Das, Sujit [ORNL; Peterson, Bruce E [ORNL; Chin, Shih-Miao [ORNL

2010-01-01

198

Reliability analysis of a structural ceramic combustion chamber

NASA Technical Reports Server (NTRS)

The Weibull modulus, fracture toughness and thermal properties of a silicon nitride material used to make a gas turbine combustor were experimentally measured. The location and nature of failure origins resulting from bend tests were determined with fractographic analysis. The measured Weibull parameters were used along with thermal and stress analysis to determine failure probabilities of the combustor with the CARES design code. The effect of data censoring, FEM mesh refinement, and fracture criterion were considered in the analysis.
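The CARES-style volume-flaw calculation described above can be sketched as a weakest-link sum over finite elements: with a Weibull modulus m and characteristic strength σ0 from bend tests, the fast-fracture failure probability follows from the stressed volumes. All numbers below are illustrative assumptions, not data from the combustor study.

```python
import math

def failure_probability(elements, m, sigma0):
    """Weakest-link (two-parameter Weibull) fast-fracture probability.
    elements: list of (stress_MPa, volume_mm3) from a stress analysis.
    m: Weibull modulus; sigma0: characteristic strength (MPa·mm^(3/m) units
    folded in for this sketch). Compressive elements contribute no risk."""
    risk = sum(v * (s / sigma0) ** m for s, v in elements if s > 0)
    return 1.0 - math.exp(-risk)

# Hypothetical mesh output: (principal stress in MPa, element volume in mm^3)
elements = [(250.0, 4.0), (310.0, 2.5), (180.0, 6.0)]
pf = failure_probability(elements, m=10.0, sigma0=600.0)
print(f"fast-fracture failure probability = {pf:.4f}")
```

Mesh refinement matters here (as the abstract notes) because the risk sum depends on how well the element stresses resolve the true stress field.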

Salem, Jonathan A.; Manderscheid, Jane M.; Freedman, Marc R.; Gyekenyesi, John P.

1990-01-01

199

Estimation of the parameters of the Weibull distribution from multi-censored samples


Sprinkle, Edgar Eugene

2012-06-07

200

Software component quality has a major influence on software development project performance measures such as lead time, time to market, and cost. It also affects other projects within the organization, the people assigned to the projects, and the organization in general. A software development organization must have indications and predictions about software component quality and project performance in general. One of the

Lovre Hribar

2009-01-01

201

FITTING WEIBULL AND LOGNORMAL DISTRIBUTIONS TO MEDIUM-DENSITY FIBERBOARD FIBER AND WOOD

be accurately detected and measured. Rotation age, especially juvenile and adult woods (Haygreen and Bowyer 1994); goodness of fit tests, MDF fiber, maximum likelihood estimation, non-parametric confidence bands, probability in the pulp and paper industry. Compared with other automated techniques, image analyzers have high accuracy

202

Fractal Analysis of Aerosol Mass-Size Distribution

NASA Astrophysics Data System (ADS)

Fractal geometry has been widely used in the description of complicated natural phenomena. The objectives of this study were (i) to apply fractal scaling to the aerosol mass-size distribution and (ii) to carry out one- and two-domain fractal analyses of the aerosol mass-size distribution and compare the two approaches. A total of 20 different elements were considered, of which Mg, Al, Ca, Si, K, and Fe were the main constituents of atmospheric aerosols over the Mount Yulongxue Region, 15 km north of Lijiang in China, accounting for more than 82% of the total of 20 elements. For the one-domain fractal analysis, the fractal dimension ranged from 2.213 for Fe to 2.874 for Zn, and was significantly correlated with the total mass of aerosols of size less than or equal to 0.25 μm (R2=0.98). The goodness of fit (R2) was in the range of 0.698 for Cu to 0.996 for Ca. Elements such as Mg and Ca showed a single fractal domain, whereas other elements indicated more than one fractal domain. For the two-domain fractal analysis, D1 and D2 covered the small and large aerosol sizes, respectively. D1 ranged from 1.093 for Mn to 2.748 for Zn, and D2 from 2.386 for Fe to 2.935 for Zn. The goodness of fit for the two-domain fractal analysis was greater than 0.96. For all samples except Ca, D1 was less than D2, and for 16 elements dc, the cutoff of the whole domain, was between 0 and 1 μm. Acknowledgement: The authors are thankful to Dr. Zhen-xing Shen, Department of Environmental Science and Engineering of Xi'an Jiaotong University, for providing the data set used in this study.
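The one-domain analysis above amounts to a log-log regression: under the mass-size scaling M(≤ r)/M_total = (r/r_max)^(3-D), the fractal dimension D is 3 minus the slope of log cumulative mass fraction against log size. The sketch below recovers a known D from synthetic data; the size classes and D value are illustrative, not from the study.

```python
import math

# Synthetic mass-size data generated with a known fractal dimension.
D_true = 2.5
sizes = [0.25 * 2 ** k for k in range(8)]            # size classes (um)
r_max = sizes[-1]
frac = [(r / r_max) ** (3 - D_true) for r in sizes]  # cumulative mass fraction

# Ordinary least-squares slope of log(frac) vs log(size).
xs = [math.log(r) for r in sizes]
ys = [math.log(f) for f in frac]
mx = sum(xs) / len(xs)
my = sum(ys) / len(ys)
slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
D_est = 3.0 - slope
print(f"estimated fractal dimension D = {D_est:.3f}")
```

A two-domain analysis would fit the same regression separately on the size classes below and above a cutoff dc, yielding D1 and D2.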

Ghanbarian-Alavijeh, Behzad; Liaghat, Abdolmajid

2010-05-01

203

Prediction of the Inert Strength Distribution of Si3N4 Diesel Valves

Censored Weibull strength distributions were generated from NT551 silicon nitride four-point flexure data using ASTM C1161-B and 5.0 mm diameter cylindrical specimens. Utilizing finite element models and AlliedSignal's life prediction codes, the inert (fast fracture) strength failure probability of a ceramic diesel valve was estimated from these data sets. The failure probability predictions derived from each data set were found to be more conservative than the valve strength data. Fractographic analysis of the test specimens and valves showed that the cylindrical specimens failed from a different flaw population than the prismatic flexure bars and the valves. The study emphasizes the prerequisite of having coincident flaw populations homogeneously distributed in both the test specimen and the ceramic component. Lastly, it suggests that unless material homogeneity exists, any meaningful life prediction or reliability analysis of a component may not be possible.

Andrews, M.J.; Breder, K.; Wereszczak, A.A.

1999-01-25

204

Phylogenetic analysis on the soil bacteria distributed in karst forest

Phylogenetic composition of bacterial community in soil of a karst forest was analyzed by culture-independent molecular approach. The bacterial 16S rRNA gene was amplified directly from soil DNA and cloned to generate a library. After screening the clone library by RFLP, 16S rRNA genes of representative clones were sequenced and the bacterial community was analyzed phylogenetically. The 16S rRNA gene inserts of 190 clones randomly selected were analyzed by RFLP and generated 126 different RFLP types. After sequencing, 126 non-chimeric sequences were obtained, generating 113 phylotypes. Phylogenetic analysis revealed that the bacteria distributed in soil of the karst forest included the members assigning into Proteobacteria, Acidobacteria, Planctomycetes, Chloroflexi (Green nonsulfur bacteria), Bacteroidetes, Verrucomicrobia, Nitrospirae, Actinobacteria (High G+C Gram-positive bacteria), Firmicutes (Low G+C Gram-positive bacteria) and candidate divisions (including the SPAM and GN08). PMID:24031430

Zhou, JunPei; Huang, Ying; Mo, MingHe

2009-01-01

205

Optimized pair distribution function analysis of gold nanoparticles

NASA Astrophysics Data System (ADS)

A fast and efficient method for creating quantitative pair distribution functions (PDFs) was developed for use on a range of sample types. A volume correction was modelled in parallel with work on an optimized PDF calculation method. It was shown that the optimized calculation method eliminated the need for specific physical corrections which otherwise contaminate PDFs, thus greatly improving the efficiency of PDF calculation. Proof of this principle and of the PDF's quantitative accuracy was demonstrated using a standard sample, bulk gold. To demonstrate the versatility and power of the new PDF analysis method, non-standard samples consisting of gold nanoparticles of various sizes were analyzed. For the first time, properties such as thermal expansion, lattice constant and Debye-Waller factors of these nanoparticles were measured via the optimized PDF method and are presented here.

Pritchard, Bayden

206

Size distribution measurement for densely binding bubbles via image analysis

NASA Astrophysics Data System (ADS)

For densely binding bubble clusters, conventional image analysis methods are unable to provide an accurate measurement of the bubble size distribution because of the difficulties with clearly identifying the outline edges of individual bubbles. In contrast, the bright centroids of individual bubbles can be distinctly defined and thus accurately measured. By taking this advantage, we developed a new measurement method based on a linear relationship between the bubble radius and the radius of its bright centroid so to avoid the need to identify the bubble outline edges. The linear relationship and method were thoroughly tested for 2D bubble clusters in a highly binding condition and found to be effective and robust for measuring the bubble sizes.
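The measurement idea above reduces to a linear calibration: fit bubble radius against bright-centroid radius on bubbles whose outlines are clearly measurable, then size densely packed bubbles from their centroids alone. The calibration pairs below are hypothetical illustrations.

```python
def fit_line(xs, ys):
    """Least-squares slope and intercept for y = a*x + b."""
    mx = sum(xs) / len(xs)
    my = sum(ys) / len(ys)
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return a, my - a * mx

# Hypothetical calibration set: isolated bubbles with clear outline edges.
centroid_r = [1.0, 1.5, 2.2, 3.1]   # bright-centroid radii (px)
bubble_r   = [2.1, 3.0, 4.5, 6.2]   # outline-measured bubble radii (px)

a, b = fit_line(centroid_r, bubble_r)

# Size a densely packed bubble from its centroid radius alone.
estimate = a * 2.8 + b
print(f"calibration: R = {a:.2f}*r + {b:.2f}; estimated radius = {estimate:.2f} px")
```

Once calibrated, only centroid detection is needed on the dense clusters, sidestepping the edge-identification problem entirely.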

Ma, Ye; Yan, Guanxi; Scheuermann, Alexander; Bringemeier, Detlef; Kong, Xiang-Zhao; Li, Ling

2014-12-01

207

Monolithic ceramic analysis using the SCARE program

NASA Technical Reports Server (NTRS)

The Structural Ceramics Analysis and Reliability Evaluation (SCARE) computer program calculates the fast fracture reliability of monolithic ceramic components. The code is a post-processor to the MSC/NASTRAN general purpose finite element program. The SCARE program automatically accepts the MSC/NASTRAN output necessary to compute reliability. This includes element stresses, temperatures, volumes, and areas. The SCARE program computes two-parameter Weibull strength distributions from input fracture data for both volume and surface flaws. The distributions can then be used to calculate the reliability of geometrically complex components subjected to multiaxial stress states. Several fracture criteria and flaw types are available for selection by the user, including out-of-plane crack extension theories. The theoretical basis for the reliability calculations was proposed by Batdorf. These models combine linear elastic fracture mechanics (LEFM) with Weibull statistics to provide a mechanistic failure criterion. Other fracture theories included in SCARE are the normal stress averaging technique and the principle of independent action. The objective of this presentation is to summarize these theories, including their limitations and advantages, and to provide a general description of the SCARE program, along with example problems.
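The volume-flaw risk-of-rupture at the core of this kind of post-processor can be sketched in a few lines. The element data and Weibull parameters below are hypothetical toys; SCARE itself integrates multiaxial principal stresses over MSC/NASTRAN element volumes and areas and offers the Batdorf, normal-stress-averaging and independent-action criteria:

```python
import math

def fast_fracture_reliability(elements, m, s0):
    """Weakest-link survival probability for a component discretized into
    finite elements, each given as (volume, representative tensile stress).
    Two-parameter Weibull volume-flaw model: risk = sum V * (sigma/s0)^m,
    reliability R = exp(-risk).  Compressive stresses contribute no risk."""
    risk = sum(vol * (max(stress, 0.0) / s0) ** m for vol, stress in elements)
    return math.exp(-risk)

# toy component: (element volume in mm^3, max principal stress in MPa)
elements = [(2.0, 120.0), (1.5, 200.0), (0.5, 300.0)]
R = fast_fracture_reliability(elements, m=10.0, s0=400.0)
```

Note that in this simplified form the scale parameter s0 absorbs the volume units; production codes keep track of the unit-of-volume normalization explicitly.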

Manderscheid, Jane M.

1988-01-01

208

Phylogenetic analysis reveals a scattered distribution of autumn colours

Background and Aims Leaf colour in autumn is rarely considered informative for taxonomy, but there is now growing interest in the evolution of autumn colours and different hypotheses are debated. Research efforts are hindered by the lack of basic information: the phylogenetic distribution of autumn colours. It is not known when and how autumn colours evolved. Methods Data are reported on the autumn colours of 2368 tree species belonging to 400 genera of the temperate regions of the world, and an analysis is made of their phylogenetic relationships in order to reconstruct the evolutionary origin of red and yellow in autumn leaves. Key Results Red autumn colours are present in at least 290 species (70 genera), and evolved independently at least 25 times. Yellow is present independently from red in at least 378 species (97 genera) and evolved at least 28 times. Conclusions The phylogenetic reconstruction suggests that autumn colours have been acquired and lost many times during evolution. This scattered distribution could be explained by hypotheses involving some kind of coevolutionary interaction or by hypotheses that rely on the need for photoprotection. PMID:19126636

Archetti, Marco

2009-01-01

209

A meta-analysis of parton distribution functions

NASA Astrophysics Data System (ADS)

A "meta-analysis" is a method for comparison and combination of nonperturbative parton distribution functions (PDFs) in a nucleon obtained with heterogeneous procedures and assumptions. Each input parton distribution set is converted into a "meta-parametrization" based on a common functional form. By analyzing parameters of the meta-parametrizations from all input PDF ensembles, a combined PDF ensemble can be produced that has a smaller total number of PDF member sets than the original ensembles. The meta-parametrizations simplify the computation of the PDF uncertainty in theoretical predictions and provide an alternative to the 2010 PDF4LHC convention for combination of PDF uncertainties. As a practical example, we construct a META ensemble for computation of QCD observables at the Large Hadron Collider using the next-to-next-to-leading order PDF sets from CTEQ, MSTW, and NNPDF groups as the input. The META ensemble includes a central set that reproduces the average of LHC predictions based on the three input PDF ensembles and Hessian eigenvector sets for computing the combined PDF+αs uncertainty at a common QCD coupling strength of 0.118.

Gao, Jun; Nadolsky, Pavel

2014-07-01

210

A Distributed Flocking Approach for Information Stream Clustering Analysis

Intelligence analysts are currently overwhelmed by the amount of information streams generated every day, and there is a lack of comprehensive tools that can analyze these streams in real time. Document clustering analysis plays an important role in improving the accuracy of information retrieval. However, most clustering technologies can only be applied to analyzing static document collections because they normally require a large amount of computational resources and a long time to produce accurate results. It is very difficult to cluster dynamically changing text information streams on an individual computer. Our early research resulted in a dynamic reactive flock clustering algorithm which can continually refine the clustering result and quickly react to changes in document contents. This characteristic makes the algorithm suitable for cluster analysis of dynamically changing document information, such as text information streams. Because of the decentralized character of this algorithm, a distributed approach is a natural way to increase its clustering speed. In this paper, we present a distributed multi-agent flocking approach for text information stream clustering and discuss the decentralized architectures and communication schemes for load balancing and status information synchronization in this approach.

Cui, Xiaohui [ORNL]; Potok, Thomas E [ORNL]

2006-01-01

211

Silk Fiber Mechanics from Multiscale Force Distribution Analysis

Here we decipher the molecular determinants for the extreme toughness of spider silk fibers. Our bottom-up computational approach incorporates molecular dynamics and finite element simulations. Therefore, the approach allows the analysis of the internal strain distribution and load-carrying motifs in silk fibers on scales of both molecular and continuum mechanics. We thereby dissect the contributions from the nanoscale building blocks, the soft amorphous and the strong crystalline subunits, to silk fiber mechanics. We identify the amorphous subunits not only to give rise to high elasticity, but to also ensure efficient stress homogenization through the friction between entangled chains, which also allows the crystals to withstand stresses as high as 2 GPa in the context of the amorphous matrix. We show that the maximal toughness of silk is achieved at 10–40% crystallinity depending on the distribution of crystals in the fiber. We also determined a serial arrangement of the crystalline and amorphous subunits in lamellae to outperform a random or a parallel arrangement, putting forward what we believe to be a new structural model for silk and other semicrystalline materials. The multiscale approach, not requiring any empirical parameters, is applicable to other partially ordered polymeric systems. Hence, it is an efficient tool for the design of artificial silk fibers. PMID:21354403

Cetinkaya, Murat; Xiao, Senbo; Markert, Bernd; Stacklies, Wolfram; Gräter, Frauke

2011-01-01

212

NSDL National Science Digital Library

This online, interactive lesson on distributions provides examples, exercises, and applets which explore the basic types of probability distributions and the ways distributions can be defined using density functions, distribution functions, and quantile functions.

Siegrist, Kyle

213

Brittle Fracture of Soil Aggregates: Weibull Models and Methods of Parameter Estimation. [Fragmentary abstract: discusses Weibull models for the brittle fracture of soil aggregates and methods of parameter estimation (Trustrum and Jayatilaka, 1979; Seguro and Lambert, 2000; Mahdi and Ashkar, 2004), including size effects. Published in the Soil Science Society of America Journal. L. Munkholm and E. Perfect.]

Perfect, Ed

214

The Subcellular Distribution of Small Molecules: A Meta-Analysis

To explore the extent to which current knowledge about the organelle-targeting features of small molecules may be applicable towards controlling the accumulation and distribution of exogenous chemical agents inside cells, molecules with known subcellular localization properties (as reported in the scientific literature) were compiled into a single data set. This data set was compared to a reference data set of approved drug molecules derived from the DrugBank database, and to a reference data set of random organic molecules derived from the PubChem database. Cheminformatic analysis revealed that molecules with reported subcellular localizations were comparably diverse. However, the calculated physicochemical properties of molecules reported to accumulate in different organelles were markedly overlapping. In relation to the reference sets of DrugBank and PubChem molecules, molecules with reported subcellular localizations were biased towards larger, more complex chemical structures possessing multiple ionizable functional groups and higher lipophilicity. Stratifying molecules based on molecular weight revealed that many physicochemical property trends associated with specific organelles were reversed in smaller vs. larger molecules. Most likely, these reversed trends are due to the different transport mechanisms determining the subcellular localization of molecules of different sizes. Molecular weight can be dramatically altered by tagging molecules with fluorophores or by incorporating organelle-targeting motifs. Generally, in order to better exploit structure-localization relationships, subcellular targeting strategies would benefit from analysis of the biodistribution effects resulting from variations in the size of the molecules. PMID:21774504

Zheng, Nan; Tsai, Hobart Ng; Zhang, Xinyuan; Shedden, Kerby; Rosania, Gus R.

2011-01-01

215

Non resonant transmission modelling with Statistical modal Energy distribution Analysis

Statistical modal Energy distribution Analysis (SmEdA) can be used as an alternative to Statistical Energy Analysis for describing subsystems with low modal overlap. In its original form, SmEdA predicts the power flow exchanged between the resonant modes of different subsystems. In the case of sound transmission through a thin structure, it is well-known that the non resonant response of the structure plays a significant role in transmission below the critical frequency. In this paper, we present an extension of SmEdA that takes into account the contributions of the non resonant modes of a thin structure. The dual modal formulation (DMF) is used to describe the behaviour of two acoustic cavities separated by a thin structure, with prior knowledge of the modal basis of each subsystem. Condensation in the DMF equations is achieved on the amplitudes of the non resonant modes and a new coupling scheme between the resonant modes of the three subsystems is obtained after several simplifications. We show that the contribution of the non resonant panel mode results in coupling the cavity modes of stiffness type, characterised by the mode shapes of both the cavities and the structure.

Maxit, Laurent; Totaro, Nicolas; Guyader, Jean-Louis

2014-01-01

216

Non resonant transmission modelling with statistical modal energy distribution analysis

NASA Astrophysics Data System (ADS)

Statistical modal Energy distribution Analysis (SmEdA) can be used as an alternative to Statistical Energy Analysis for describing subsystems with low modal overlap. In its original form, SmEdA predicts the power flow exchanged between the resonant modes of different subsystems. In the case of sound transmission through a thin structure, it is well-known that the non resonant response of the structure plays a significant role in transmission below the critical frequency. In this paper, we present an extension of SmEdA that takes into account the contributions of the non resonant modes of a thin structure. The dual modal formulation (DMF) is used to describe the behaviour of two acoustic cavities separated by a thin structure, with prior knowledge of the modal basis of each subsystem. Condensation in the DMF equations is achieved on the amplitudes of the non resonant modes and a new coupling scheme between the resonant modes of the three subsystems is obtained after several simplifications. We show that the contribution of the non resonant panel mode results in coupling the cavity modes of stiffness type, characterised by the mode shapes of both the cavities and the structure. Comparisons with reference results demonstrate that the present approach can take into account the non resonant contributions of the structure in the evaluation of the transmission loss.

Maxit, L.; Ege, K.; Totaro, N.; Guyader, J. L.

2014-01-01

217

The Distributed Object Group Framework (DOGF) we constructed supports the grouping of distributed objects that are required for a distributed application. With the DOGF, we manage a distributed application as a single logical view by applying the concept of an object group; the framework can therefore provide distribution transparency for clients' requests and binding services between/among objects. The DOGF also has an adaptive structure

Chang-sun Shin; Chang-won Jeong; Su-chong Joo

2004-01-01

218

CRACK GROWTH ANALYSIS OF SOLID OXIDE FUEL CELL ELECTROLYTES

Defects and flaws control the structural and functional properties of ceramics. In determining the reliability and lifetime of ceramic structures, it is very important to quantify the crack growth behavior of the ceramics. In addition, because of the high variability of the strength and the relatively low toughness of ceramics, a statistical design approach is necessary. The statistical nature of the strength of ceramics is currently well recognized, and is usually accounted for by utilizing Weibull or similar statistical distributions. Design tools such as CARES, using a combination of strength measurements, stress analysis, and statistics, are available and reasonably well developed. These design codes also incorporate material data such as elastic constants as well as flaw distributions and time-dependent properties. The fast fracture reliability for ceramics is often different from their time-dependent reliability. Further confounding the design complexity, the time-dependent reliability varies with the environment/temperature/stress combination. It is therefore important to be able to accurately determine the behavior of ceramics under simulated application conditions to provide a better prediction of the lifetime and reliability for a given component. In the present study, yttria-stabilized zirconia (YSZ) of 9.6 mol% yttria composition was procured in the form of tubes of length 100 mm. The composition is of interest for tubular electrolytes of solid oxide fuel cells. Rings cut from the tubes were characterized for microstructure, phase stability, mechanical strength (Weibull modulus) and fracture mechanisms. The strength at the operating temperature of SOFCs (1000 °C) decreased to 95 MPa, compared with a room-temperature strength of 230 MPa. However, the Weibull modulus remained relatively unchanged. The slow crack growth (SCG) parameter, n = 17, evaluated at room temperature in air, was representative of well-studied brittle materials.
Based on these results, further work was planned to evaluate strength degradation, modulus and failure in a more representative SOFC environment.
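A Weibull modulus such as the one characterized above is commonly extracted from strength data on a Weibull probability plot. A minimal sketch using synthetic strengths drawn around the paper's room-temperature characteristic strength (the modulus, sample size and plotting positions here are illustrative choices, not the study's):

```python
import math, random

def weibull_fit(strengths):
    """Least-squares fit on the linearized two-parameter Weibull CDF:
    ln(-ln(1 - F)) = m ln(s) - m ln(s0), using median-rank plotting
    positions F_i = (i - 0.3) / (n + 0.4)."""
    s = sorted(strengths)
    n = len(s)
    xs = [math.log(v) for v in s]
    ys = [math.log(-math.log(1.0 - (i + 0.7) / (n + 0.4))) for i in range(n)]
    xbar, ybar = sum(xs) / n, sum(ys) / n
    m = (sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
         / sum((x - xbar) ** 2 for x in xs))
    s0 = math.exp(xbar - ybar / m)      # from intercept = -m ln(s0)
    return m, s0

# synthetic strengths from Weibull(m=10, s0=230 MPa) via inverse transform
rng = random.Random(42)
data = [230.0 * (-math.log(1.0 - rng.random())) ** (1.0 / 10.0)
        for _ in range(500)]
m_hat, s0_hat = weibull_fit(data)
```

Maximum-likelihood estimation is usually preferred over this probability-plot regression for final reported values, but the plot remains the standard visual check.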

S. Bandopadhyay; N. Nagabhushana

2003-10-01

219

The probability distributions of external-corrosion pit depth and pit growth rate were investigated in underground pipelines using Monte Carlo simulations. The study combines a predictive pit growth model developed by the authors with the observed distributions of the model variables in a range of soils. Depending on the pipeline age, any of the three maximal extreme value distributions, i.e. Weibull,

F. Caleyo; J. C. Velázquez; A. Valor; J. M. Hallen

2009-01-01

220

A mathematical analysis of the DCT coefficient distributions for images

Over the past two decades, there have been various studies on the distributions of the DCT coefficients for images. However, they have concentrated only on fitting the empirical data from some standard pictures with a variety of well-known statistical distributions, and then comparing their goodness of fit. The Laplacian distribution is the dominant choice balancing simplicity of the model and

Edmund Y. Lam; Joseph W. Goodman

2000-01-01

221

Fourier analysis of polar cap electric field and current distributions

NASA Technical Reports Server (NTRS)

A theoretical study of high-latitude electric fields and currents, using analytic Fourier analysis methods, is conducted. A two-dimensional planar model of the ionosphere with an enhanced conductivity auroral belt and field-aligned currents at the edges is employed. Two separate topics are treated. A field-aligned current element near the cusp region of the polar cap is included to investigate the modifications to the convection pattern by the east-west component of the interplanetary magnetic field. It is shown that a sizable one-cell structure is induced near the cusp which diverts equipotential contours to the dawnside or duskside, depending on the sign of the cusp current. This produces characteristic dawn-dusk asymmetries to the electric field that have been previously observed over the polar cap. The second topic is concerned with the electric field configuration obtained in the limit of perfect shielding, where the field is totally excluded equatorward of the auroral oval. When realistic field-aligned current distributions are used, the result is to produce severely distorted, crescent-shaped equipotential contours over the cap. Exact, analytic formulae applicable to this case are also provided.

Barbosa, D. D.

1984-01-01

222

Emergence of skew distributions in controlled growth processes

NASA Astrophysics Data System (ADS)

Starting from a master equation, we derive the evolution equation for the size distribution of elements in an evolving system, where each element can grow, divide into two, and produce new elements. We then probe general solutions of the evolution equation, to obtain such skew distributions as power-law, log-normal, and Weibull distributions, depending on the growth or division and production. Specifically, repeated production of elements of uniform size leads to power-law distributions, whereas production of elements with the size distributed according to the current distribution as well as no production of new elements results in log-normal distributions. Finally, division into two, or binary fission, bears Weibull distributions. Numerical simulations are also carried out, confirming the validity of the obtained solutions.
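The log-normal branch of this result is easy to reproduce numerically: with pure multiplicative growth and no production of new elements, each log-size performs a random walk, so the size distribution becomes log-normal. A small illustrative simulation (parameters arbitrary, not taken from the paper):

```python
import math, random, statistics

def multiplicative_growth(n_elements=20_000, n_steps=50, step_sd=0.1, seed=0):
    """Each step multiplies every element's size by exp(N(0, step_sd)),
    so ln(size) is a sum of i.i.d. normals -> log-normal sizes."""
    rng = random.Random(seed)
    sizes = [1.0] * n_elements
    for _ in range(n_steps):
        sizes = [s * math.exp(rng.gauss(0.0, step_sd)) for s in sizes]
    return sizes

sizes = multiplicative_growth()
logs = [math.log(s) for s in sizes]
# ln(size) should be approximately Normal(0, 0.1 * sqrt(50))
mu, sd = statistics.fmean(logs), statistics.stdev(logs)
```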

Goh, Segun; Kwon, H. W.; Choi, M. Y.; Fortin, J.-Y.

2010-12-01

223

Emergence of skew distributions in controlled growth processes.

Starting from a master equation, we derive the evolution equation for the size distribution of elements in an evolving system, where each element can grow, divide into two, and produce new elements. We then probe general solutions of the evolution equation, to obtain such skew distributions as power-law, log-normal, and Weibull distributions, depending on the growth or division and production. Specifically, repeated production of elements of uniform size leads to power-law distributions, whereas production of elements with the size distributed according to the current distribution as well as no production of new elements results in log-normal distributions. Finally, division into two, or binary fission, bears Weibull distributions. Numerical simulations are also carried out, confirming the validity of the obtained solutions. PMID:21230652

Goh, Segun; Kwon, H W; Choi, M Y; Fortin, J-Y

2010-12-01

224

Analysis of current distribution in a large superconductor

NASA Astrophysics Data System (ADS)

An imbalanced current distribution, which is often observed in cable-in-conduit (CIC) superconductors composed of multistaged, triplet-type sub-cables, can deteriorate the performance of the coils. It is hence very important to analyze the current distribution in a superconductor and find methods to realize a homogeneous current distribution in the conductor. We apply magnetic flux conservation in a loop contoured by the electric center lines of filaments in two arbitrary strands located on adjacent layers in a coaxial multilayer superconductor, and thereby analyze the current distribution in the conductor. A generalized formula governing the current distribution can be described as an explicit function of the superconductor construction parameters, such as the twist pitch, twist direction and radius of each individual layer. We numerically analyze a homogeneous current distribution as a function of the twist pitches of layers, using the fundamental formula. Moreover, it is demonstrated that we can control the current distribution in the coaxial superconductor.

Hamajima, Takataro; Alamgir, A. K. M.; Harada, Naoyuki; Tsuda, Makoto; Ono, Michitaka; Takano, Hirohisa

225

Statistical analysis and modelling of small satellite reliability

NASA Astrophysics Data System (ADS)

This paper attempts to characterize failure behaviour of small satellites through statistical analysis of actual in-orbit failures. A unique Small Satellite Anomalies Database comprising empirical failure data of 222 small satellites has been developed. A nonparametric analysis of the failure data has been implemented by means of a Kaplan-Meier estimation. An innovative modelling method, i.e. Bayesian theory in combination with Markov Chain Monte Carlo (MCMC) simulations, has been proposed to model the reliability of small satellites. An extensive parametric analysis using the Bayesian/MCMC method has been performed to fit a Weibull distribution to the data. The influence of several characteristics such as the design lifetime, mass, launch year, mission type and the type of satellite developers on the reliability has been analyzed. The results clearly show the infant mortality of small satellites. Compared with the classical maximum-likelihood estimation methods, the proposed Bayesian/MCMC method results in better fitting Weibull models and is especially suitable for reliability modelling where only very limited failures are observed.
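The nonparametric step mentioned above can be sketched with a plain Kaplan-Meier estimator. The failure and censoring times below are toy values, not the satellite database:

```python
def kaplan_meier(times, events):
    """Kaplan-Meier estimate of the survivor function.
    times  : observation time for each unit (failure or censoring)
    events : 1 if a failure was observed at that time, 0 if censored.
    Returns (time, survival probability) at each observed failure time."""
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    surv, curve, i = 1.0, [], 0
    while i < len(data):
        t = data[i][0]
        j, deaths = i, 0
        while j < len(data) and data[j][0] == t:
            deaths += data[j][1]
            j += 1
        if deaths:
            surv *= 1.0 - deaths / n_at_risk
            curve.append((t, surv))
        n_at_risk -= j - i      # everyone observed at t leaves the risk set
        i = j
    return curve

# toy data: failures at t = 1, 2, 4; censored observations at t = 3, 5
curve = kaplan_meier([1, 2, 3, 4, 5], [1, 1, 0, 1, 0])
```

Censored units reduce the risk set without stepping the curve down, which is exactly why this estimator suits in-orbit data where most satellites have not yet failed.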

Guo, Jian; Monas, Liora; Gill, Eberhard

2014-05-01

226

Statistical distribution of mechanical properties for three graphite-epoxy material systems

NASA Technical Reports Server (NTRS)

Graphite-epoxy composites are playing an increasing role as viable alternative materials in structural applications, necessitating a thorough investigation into the predictability and reproducibility of their material strength properties. This investigation was concerned with tension, compression, and short beam shear coupon testing of large samples from three different material suppliers to determine their statistical strength behavior. Statistical results indicate that a two-parameter Weibull distribution model provides better overall characterization of material behavior for the graphite-epoxy systems tested than does the standard Normal distribution model that is employed for most design work. While either a Weibull or Normal distribution model provides adequate predictions for average strength values, the Weibull model provides better characterization in the lower tail region, where the predictions are of maximum design interest. Two sets of the same material were found to have essentially the same material properties, indicating that repeatability can be achieved.
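The lower-tail point can be illustrated directly: match a Normal distribution to a Weibull's mean and standard deviation, and the two models disagree most at low stresses, where design allowables are set. The shape and scale values below are hypothetical, not the tested materials' fitted parameters:

```python
import math

def weibull_cdf(x, m, s0):
    """Two-parameter Weibull CDF (failure probability at stress x)."""
    return 1.0 - math.exp(-((x / s0) ** m))

def normal_cdf(x, mu, sigma):
    """Normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

# hypothetical strength population: Weibull(shape m = 15, scale 600 MPa)
m, s0 = 15.0, 600.0
mu = s0 * math.gamma(1.0 + 1.0 / m)                       # Weibull mean
sigma = s0 * math.sqrt(math.gamma(1.0 + 2.0 / m)
                       - math.gamma(1.0 + 1.0 / m) ** 2)  # Weibull std dev

# failure probability each model predicts at a low design stress
x = 450.0
p_weibull = weibull_cdf(x, m, s0)
p_normal = normal_cdf(x, mu, sigma)   # Normal matched to same mean and std
```

For these parameters the Weibull model predicts several times more failures at the design stress than the moment-matched Normal, which is the practical reason the lower-tail characterization matters.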

Reese, C.; Sorem, J., Jr.

1981-01-01

227

Analysis of current distribution in a large superconductor

An imbalanced current distribution, which is often observed in cable-in-conduit (CIC) superconductors composed of multistaged, triplet-type sub-cables, can deteriorate the performance of the coils. It is hence very important to analyze the current distribution in a superconductor and find methods to realize a homogeneous current distribution in the conductor. We apply magnetic flux conservation in a loop contoured

Takataro Hamajima; A. K. M. Alamgir; Naoyuki Harada; Makoto Tsuda; Michitaka Ono; Hirohisa Takano

2000-01-01

228

Causality and sensitivity analysis in distributed design simulation

Numerous collaborative design frameworks have been developed to accelerate product development, and recently environments for building distributed simulations have been proposed. For example, a simulation framework ...

Kim, Jaehyun, 1970-

2002-01-01

229

Distributed object-oriented nonlinear finite element analysis

This research extends an existing general-purpose object-oriented finite element application program to accommodate parallel processing, develops a new nonlinear substructuring algorithm, and implements the algorithm in the parallel framework. The new programming design distributes the elements and performs the element computations concurrently. Consequently, not just the elements are distributed to the available processors, but also the associated nodes and final

Hung-Ming Chen

2002-01-01

230

Analysis Model for Domestic Hot Water Distribution Systems: Preprint

A thermal model was developed to estimate the energy losses from prototypical domestic hot water (DHW) distribution systems for homes. The developed model, using the TRNSYS simulation software, allows researchers and designers to better evaluate the performance of hot water distribution systems in homes. Modeling results were compared with past experimental study results and showed good agreement.
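The dominant physics in such a model is first-order heat loss from the pipe run to ambient. A minimal steady-state sketch (the UA value, flow rate and temperatures are hypothetical; the TRNSYS model referenced above resolves this transiently per pipe segment):

```python
import math

def outlet_temperature(t_in, t_amb, ua_per_m, length, m_dot, cp=4186.0):
    """Steady-state outlet temperature of water flowing through a pipe that
    loses heat to ambient: T(L) = T_amb + (T_in - T_amb)
    * exp(-UA' * L / (m_dot * cp)), with UA' in W/(m*K), m_dot in kg/s."""
    return t_amb + (t_in - t_amb) * math.exp(-ua_per_m * length / (m_dot * cp))

# hypothetical 15 m insulated run, 0.1 kg/s draw, 60 C supply, 20 C ambient
t_out = outlet_temperature(60.0, 20.0, ua_per_m=0.8, length=15.0, m_dot=0.1)
loss_kw = 0.1 * 4.186 * (60.0 - t_out)   # heat lost along the run, kW
```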

Maguire, J.; Krarti, M.; Fang, X.

2011-11-01

231

A global analysis of root distributions for terrestrial biomes

Understanding and predicting ecosystem functioning (e.g., carbon and water fluxes) and the role of soils in carbon storage requires an accurate assessment of plant rooting distributions. Here, in a comprehensive literature synthesis, we analyze rooting patterns for terrestrial biomes and compare distributions for various plant functional groups. We compiled a database of 250 root studies, subdividing suitable results into 11

R. B. Jackson; J. Canadell; J. R. Ehleringer; H. A. Mooney; O. E. Sala; E. D. Schulze

1996-01-01

232

Distributed Medical Image Analysis and Diagnosis through Crowd-Sourced Games: A Malaria Case Study

[Fragmentary abstract: describes distributed medical image analysis and diagnosis through crowd-sourced games, diagnosing malaria-infected red blood cells with an accuracy within 1.25% of the diagnostic decisions made. Published in PLoS ONE 7.]

Jalali, Bahram

233

Analysis of soil carbon transit times and age distributions using network theories

[Fragmentary abstract: soil carbon dynamics can be approximated by networks of linear compartments, permitting theoretical analysis of transit time and age distributions; comparisons include models assuming a continuous distribution of decay constants. Stefano Manzoni et al.]

Katul, Gabriel

234

DATALITE: a Distributed Architecture for Traffic Analysis via Light-weight Traffic Digest

[Fragmentary abstract: DATALITE, a Distributed Architecture for Traffic Analysis via LIght-weight Traffic digEst, introduces a set of new distributed algorithms based on exchanging traffic digests (TDs) among network nodes; a TD for N packets requires only O(log log N) bits of memory.]

Chao, Jonathan

235

ROBUST BAYESIAN ANALYSIS WITH DISTRIBUTION BANDS 1 Sanjib Basu and Anirban DasGupta

[Fragmentary abstract: considers the robustness of Bayesian analysis when the prior cdf lies in a distribution band, including neighborhoods of a fixed cdf such as Kolmogorov and Lévy neighborhoods; general methods are described for finding ranges of posterior quantities.]

Basu, Sanjib

236

A diameter distribution model for even-aged beech in Denmark

We developed a diameter distribution model for even-aged stands of European beech in Denmark using the Weibull distribution. The model parameters were estimated using a large dataset from

Cao, Quang V.

237

Analysis of DNS Cache Effects on Query Distribution

This paper studies the DNS cache effects that occur on query distribution at the CN top-level domain (TLD) server. We first filter out the malformed DNS queries to purify the log data pollution according to six categories. A model for DNS resolution, more specifically DNS caching, is presented. We demonstrate the presence and magnitude of DNS cache effects and the cache sharing effects on the request distribution through an analytic model and simulation. CN TLD log data results are provided and analyzed based on the cache model. The approximate TTL distribution for domain names is inferred quantitatively. PMID:24396313
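TTL-driven caching of the kind modelled here has a simple closed form under Poisson query arrivals: a record cached for T seconds after each miss yields a hit ratio of λT/(1 + λT). A small simulation checking that relation (the rate and TTL values are hypothetical, and this is an illustration of TTL caching in general, not the paper's model of the CN TLD data):

```python
import random

def simulate_ttl_cache(lam, ttl, n_queries, seed=1):
    """Single-name TTL cache fed by Poisson(lam) queries: a miss fetches
    the record, which then answers queries from cache for `ttl` seconds."""
    rng = random.Random(seed)
    t, expires, hits = 0.0, -1.0, 0
    for _ in range(n_queries):
        t += rng.expovariate(lam)      # exponential inter-arrival times
        if t < expires:
            hits += 1                  # answered from cache
        else:
            expires = t + ttl          # miss: refresh, cache for ttl
    return hits / n_queries

lam, ttl = 2.0, 1.0
hit_ratio = simulate_ttl_cache(lam, ttl, 200_000)
analytic = lam * ttl / (1.0 + lam * ttl)   # = 2/3 for these values
```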

2013-01-01

238

Analysis of DNS cache effects on query distribution.

This paper studies the DNS cache effects that occur on query distribution at the CN top-level domain (TLD) server. We first filter out the malformed DNS queries to purify the log data pollution according to six categories. A model for DNS resolution, more specifically DNS caching, is presented. We demonstrate the presence and magnitude of DNS cache effects and the cache sharing effects on the request distribution through analytic model and simulation. CN TLD log data results are provided and analyzed based on the cache model. The approximate TTL distribution for domain name is inferred quantificationally. PMID:24396313

Wang, Zheng

2013-01-01

239

Analysis of fire causes spatial and temporal distribution in France

NASA Astrophysics Data System (ADS)

The goal of the paper was to create a statistical model explaining the spatial and temporal occurrence of forest fires depending on their causes. In the forest fire cause databases, fire ignitions were located according to the third level of the 2003 Nomenclature of Territorial Units for Statistics (NUTS 3). 15,469 records were considered over the 2005-2008 period on the French territory. Global fire ignition density as well as fire ignition cause densities related to lightning, negligence and arson were considered. Descriptive variables (land cover, topography and climate) were used to divide the whole country into homogeneous regions. Using a clustering based on multidimensional projection (Sammon's projection), NUTS 3 units presenting the most similar characteristics in terms of land cover, topography or climate conditions were merged into regions. The analysis of these variables led to 3 regions: northwest France, eastern central France and the Mediterranean region. In this paper, Partial Least Squares regression was performed on each region to identify the main explanatory spatial variables and to model the fire density due to the different causes. 32 explanatory variables relating to human and biophysical factors were used in these analyses.
Results of the statistical analyses performed on the spatial distribution of fire density due to the different types of cause in the different French regions showed that: (i) fire density due to natural causes was mainly favoured by land-cover variables (such as the proportion of overall vegetation, the proportion of shrubland, and the surface area of farms) and was mainly mitigated by some agricultural variables (such as the proportion of non-irrigated crops or pasture, and farm density); (ii) fire density due to negligence was mainly favoured by network and socio-economic variables and was mainly mitigated by land-cover and climate variables depending on the region; (iii) fire density due to arson was mainly favoured by network, topographic and socio-economic variables and was mainly mitigated by climate variables depending on the region. The negligence and arson categories may have been too broad; using more detailed causes might give better results. Moreover, in most related work, the statistical analyses were carried out on georeferenced fire ignition points, allowing the use of more accurate explanatory variables such as the distance to the road, distance to the forest, etc.

Long, M.; Ganteaume, A.; Jappiot, M.; Andrienko, G.; Andrienko, N.

2012-04-01

240

Determination analysis of energy conservation standards for distribution transformers

This report contains information for the US DOE to use in making a determination on proposing energy conservation standards for distribution transformers, as required by the Energy Policy Act of 1992. The potential for saving energy with more efficient liquid-immersed and dry-type distribution transformers could be significant, because these transformers account for an estimated 140 billion kWh of the annual energy lost in the delivery of electricity. The objective was to determine whether energy conservation standards for distribution transformers would have the potential for significant energy savings, be technically feasible, and be economically justified from a national perspective. It was found that energy conservation standards for distribution transformers would be technically and economically feasible. Based on the energy conservation options analyzed, 3.6-13.7 quads of energy could be saved from 2000 to 2030.

Barnes, P.R.; Van Dyke, J.W.; McConnell, B.W.; Das, S.

1996-07-01

241

Numerical Analysis of a Cold Air Distribution System

Cold air distribution systems may reduce the operating energy consumption of air-conditioned air supply systems and improve the outside air volume percentage and indoor air quality. However, the indoor temperature patterns and velocity field are easily...

Zhu, L.; Li, R.; Yuan, D.

2006-01-01

242

Probability distribution of energetic-statistical size effect in quasibrittle fracture

The physical sources of randomness in quasibrittle fracture described by the cohesive crack model are discussed and theoretical arguments for the basic form of the probability distribution are presented. The probability distribution of the size effect on the nominal strength of structures made of heterogeneous quasibrittle materials is derived, under certain simplifying assumptions, from the nonlocal generalization of Weibull theory.

Zdenek P. Bazant

243

Probability distributions for offshore wind speeds

Keywords: wind turbine energy output; Weibull distribution; extreme wind. In planning offshore wind farms, the probability distribution of wind speed serves as the primary substitute for data when estimating design...

Vogel, Richard M.

244

The purpose of this report is to provide a set of reference or standard values of wind profiles, wind speed distributions and their effects on wind turbine performance for engineering design applications. Based on measured Weibull distribution parameters, representative average, low, and high variance data are given for height profiles of mean, 25 percentile, and 75 percentile wind speeds; and

C. G. Justus; W. R. Hargraves; A. Mikhail

1976-01-01

245

Thermal Analysis of Antenna Structures. Part 2: Panel Temperature Distribution

NASA Technical Reports Server (NTRS)

This article is the second in a series that analyzes the temperature distribution in microwave antennas. An analytical solution in a series form is obtained for the temperature distribution in a flat plate analogous to an antenna surface panel under arbitrary temperature and boundary conditions. The solution includes the effects of radiation and air convection from the plate. Good agreement is obtained between the numerical and analytical solutions.

Schonfeld, D.; Lansing, F. L.

1983-01-01

246

Measured Sea Ice Thickness Distributions Reformulated for Model Applications

NASA Astrophysics Data System (ADS)

With the increased interest in the state and fate of sea ice as a climate indicator, both old and new sea ice thickness techniques are being investigated. In addition, the modeling community has requested sea ice thickness measurements in a model-friendly format. In particular, there is a need to consolidate the tens of thousands of sea ice thickness point measurements into regional and seasonal distributions in a form that can be utilized in global climate model (GCM) studies, data assimilation, and model validation. In this presentation, we address these needs by considering the roughly 20,000 ship-based sea ice thickness measurements from around Antarctica from 1980 to the present, currently catalogued and archived by the Antarctic Sea ice Processes and Climate (ASPeCt) group under the Scientific Committee on Antarctic Research (SCAR). The technique is three-fold. First, ship observations from each station are extracted as a six-level distribution with up to three level-ice and up to three deformed-ice categories, and an uncertainty is associated with each measurement to track data quality. Next, the station distributions are consolidated into weighted regional and seasonal probability distributions. Finally, a two-parameter Weibull curve is fitted to each distribution, and the fit is quality-controlled with a thickness-conserving estimate that quantifies the deficit or surplus cumulative thickness of the curve fit relative to the original data. The outcome is the reduction of a large point-measurement data set to a well-known mathematical function and a short table with three numbers for each region and season, representing the shape, scale, and accuracy of a Weibull distribution relative to the archived data set. Additional meta-data include the number of stations and their propagated uncertainties for use in validation and data assimilation.
We present results of this method using the ASPeCt data archive. This compact data description is a computationally efficient means of storing and regenerating regional and seasonal sea ice thickness distributions within numerical models. The quality control steps provide a means of tracking the uncertainty of both the measurement and the analysis used to arrive at the model input parameters.
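The curve-fitting and quality-control steps can be sketched with SciPy. The thickness sample below is synthetic, and the thickness-conserving check is reduced here to a simple comparison of mean thickness, which is an assumption rather than the authors' exact diagnostic:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Synthetic regional sample of sea ice thicknesses (metres), standing
# in for one consolidated ASPeCt station distribution (hypothetical).
thickness = stats.weibull_min.rvs(1.4, scale=0.9, size=2000, random_state=rng)

# Fit the two-parameter Weibull (location pinned at zero), so each
# region and season reduces to one shape and one scale value.
shape, loc, scale = stats.weibull_min.fit(thickness, floc=0)

# Simplified thickness-conserving check: compare the mean thickness
# implied by the fit against the data mean, as a deficit/surplus
# diagnostic for the curve fit.
surplus = stats.weibull_min.mean(shape, loc=0, scale=scale) - thickness.mean()
```

Each region/season then stores only `shape`, `scale`, and the accuracy diagnostic, from which the full thickness distribution can be regenerated inside a model.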

Geiger, C. A.; Worby, A. P.; Deliberty, T.; Ackley, S. F.; van Woert, M.

2006-12-01

247

Inductance and Current Distribution Analysis of a Prototype HTS Cable

NASA Astrophysics Data System (ADS)

This project is partly supported by NSFC Grant 51207146, the RAEng Research Exchange Scheme of the UK, and EPSRC EP/K01496X/1. Superconducting cable is an emerging technology for electric power transmission. Since high-capacity HTS transmission cables are manufactured with a multi-layer conductor structure, the current distribution among the layers would be non-uniform without proper optimization, leading to large transmission losses. Therefore, a novel optimization method has been developed to achieve evenly distributed current among the layers, considering the HTS cable structure parameters (radius, pitch angle, and winding direction) that determine the self and mutual inductance. A prototype HTS cable was built using BSCCO tape and tested to validate the optimal design method. A superconductor characterization system was developed using LabVIEW and an NI data acquisition system; it can measure the AC loss and current distribution of short HTS cables.

Zhu, Jiahui; Zhang, Zhenyu; Zhang, Huiming; Zhang, Min; Qiu, Ming; Yuan, Weijia

2014-05-01

248

Analysis and machine mapping of the distribution of band recoveries

A method of calculating distance and bearing from banding site to recovery location based on the solution of a spherical triangle is presented. X and Y distances on an ordinate grid were applied to computer plotting of recoveries on a map. The advantages and disadvantages of tables of recoveries by State or degree block, axial lines, and distance of recovery from banding site for presentation and comparison of the spatial distribution of band recoveries are discussed. A special web-shaped partition formed by concentric circles about the point of banding and great circles at 30-degree intervals through the point of banding has certain advantages over other methods. Comparison of distributions by means of a χ² contingency test is illustrated. The statistic V = χ²/N can be used as a measure of difference between two distributions of band recoveries, and its possible use as a measure of the degree of migrational homing is illustrated.
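The distance-and-bearing computation can be sketched as follows. The haversine form shown here is one common solution of the spherical triangle; the paper's exact formulation may differ, and the Earth radius is a standard mean value:

```python
import math

def distance_bearing(lat1, lon1, lat2, lon2, r_km=6371.0):
    """Great-circle distance (km) and initial bearing (degrees) from a
    banding site to a recovery location, by spherical trigonometry."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    # Haversine formula for the great-circle distance.
    a = (math.sin((p2 - p1) / 2.0) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dlon / 2.0) ** 2)
    dist = 2.0 * r_km * math.asin(math.sqrt(a))
    # Initial bearing measured clockwise from true north.
    y = math.sin(dlon) * math.cos(p2)
    x = math.cos(p1) * math.sin(p2) - math.sin(p1) * math.cos(p2) * math.cos(dlon)
    bearing = (math.degrees(math.atan2(y, x)) + 360.0) % 360.0
    return dist, bearing
```

From the distance and bearing, the X and Y offsets on an ordinate grid follow directly for computer plotting of recoveries on a map.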

Cowardin, L.M.

1977-01-01

249

A fractal approach to dynamic inference and distribution analysis

Event-distributions inform scientists about the variability and dispersion of repeated measurements. This dispersion can be understood from a complex systems perspective, and quantified in terms of fractal geometry. The key premise is that a distribution's shape reveals information about the governing dynamics of the system that gave rise to the distribution. Two categories of characteristic dynamics are distinguished: additive systems governed by component-dominant dynamics and multiplicative or interdependent systems governed by interaction-dominant dynamics. A logic by which systems governed by interaction-dominant dynamics are expected to yield mixtures of lognormal and inverse power-law samples is discussed. These mixtures are described by a so-called cocktail model of response times derived from human cognitive performances. The overarching goals of this article are twofold: First, to offer readers an introduction to this theoretical perspective and second, to offer an overview of the related statistical methods. PMID:23372552
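A mixture of the kind the cocktail model predicts, a lognormal body with an inverse power-law tail, can be simulated as follows. The weights and parameters are illustrative assumptions, not values fitted to the cognitive-performance data:

```python
import numpy as np

rng = np.random.default_rng(4)

# Cocktail-style mixture (illustrative parameters): a lognormal body,
# as expected from multiplicative interactions, plus an inverse
# power-law tail from interaction-dominant dynamics.
n = 10_000
w = 0.8                                    # lognormal mixture weight
body = rng.lognormal(mean=-0.5, sigma=0.4, size=int(w * n))
tail = rng.pareto(2.5, size=n - int(w * n)) + 1.0  # power-law tail
rt = np.concatenate([body, tail])          # simulated response times
```

Comparing the empirical tail of `rt` against a pure lognormal fit is one way to diagnose the interaction-dominant regime the article describes.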

van Rooij, Marieke M. J. W.; Nash, Bertha A.; Rajaraman, Srinivasan; Holden, John G.

2013-01-01

250

Evaluation of frequency distributions for flood hazard analysis

Many different frequency distributions and fitting methods are used to determine the magnitude and frequency of floods and rainfall. Ten different combinations of frequency distributions and fitting methods are evaluated by summarizing the differences in the 0.002 exceedance probability quantile (500-year event), presenting graphical displays of the 10 estimates of the 0.002 quantile, and performing statistical tests to determine if differences are statistically significant. This evaluation indicated there are some statistically significant differences among the methods but, from an engineering standpoint, these differences may not be significant.
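One such combination of frequency distribution and fitting method can be sketched as follows: fit a candidate distribution to an annual peak series and evaluate the 0.002 exceedance probability quantile (the 500-year event). The Gumbel choice and the synthetic peak-flow data are illustrative; the study compares ten such combinations:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)

# Hypothetical annual peak-flow series (m^3/s).
peaks = stats.gumbel_r.rvs(loc=500.0, scale=120.0, size=80, random_state=rng)

# Fit one candidate distribution by maximum likelihood and evaluate
# the 0.002 exceedance probability quantile, i.e. the 500-year event.
loc, scale = stats.gumbel_r.fit(peaks)
q500 = stats.gumbel_r.ppf(1.0 - 0.002, loc=loc, scale=scale)
```

Repeating this for each distribution/fitting-method pair yields the ten estimates of the 0.002 quantile whose differences the evaluation summarizes.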

Thomas, Wilbert O., Jr.; Kuriki, Minoru; Suetsugi, Tadashi

1995-01-01

251

Optimizing Distributed Practice: Theoretical Analysis and Practical Implications

ERIC Educational Resources Information Center

More than a century of research shows that increasing the gap between study episodes using the same material can enhance retention, yet little is known about how this so-called distributed practice effect unfolds over nontrivial periods. In two three-session laboratory studies, we examined the effects of gap on retention of foreign vocabulary,…

Cepeda, Nicholas J.; Coburn, Noriko; Rohrer, Doug; Wixted, John T.; Mozer, Michael C.; Pashler, Harold

2009-01-01

252

THE EPANET PROGRAMMER'S TOOLKIT FOR ANALYSIS OF WATER DISTRIBUTION SYSTEMS

The EPANET Programmer's Toolkit is a collection of functions that helps simplify computer programming of water distribution network analyses. The functions can be used to read in a pipe network description file, modify selected component properties, and run multiple hydraulic and wa...

253

Preliminary distributional analysis of US endangered bird species

A first exploration of applications of ecological niche modeling and geographic distributional prediction to endangered species protection is developed. Foci of richness of endangered bird species are identified in coastal California and along the southern fringe of the United States. Species included on the Endangered Species List on the basis of peripheral populations inflate these concentrations considerably. Species without protection

MANDALINE E. GODOWN; A. TOWNSEND PETERSON

2000-01-01

254

The Elementary and Secondary Education Act: A Distributional Analysis.

ERIC Educational Resources Information Center

This study analyzes interstate redistribution of Federal tax money under Title One of the Elementary and Secondary Education Act of 1965. First, the consistency of the criteria used to distribute funds is studied to see if people of similar financial positions are treated equally. Results show that when compared with an alternative--the Orshansky…

Barkin, David; Hettich, Walter

255

Analysis of vegetation distribution in Interior Alaska and sensitivity to

distribution of four major vegetation types: tundra, deciduous forest, black spruce forest, and white spruce forest, by elevation, precipitation, and south-to-north aspect. At the second step, forest was separated into deciduous... temperatures exceeded a critical limit (+2 °C). Deciduous forests expand their range the most when any two

McGuire, A. David

256

Metagenomic Analysis of Water Distribution System Bacterial Communities

The microbial quality of drinking water is assessed using culture-based methods that are highly selective and that tend to underestimate the densities and diversity of microbial populations inhabiting distribution systems. In order to better understand the effect of different dis...

257

ANALYSIS OF COARSELY GROUPED DATA FROM THE LOGNORMAL DISTRIBUTION

A missing information technique is applied to blood lead data that is both grouped and assumed to be lognormally distributed. These maximum likelihood techniques are extended from the simple lognormal case to obtain solutions for a general linear model case. Various models are fi...

258

Distribution of frequencies of digits via multifractal analysis

We study the Hausdorff dimension of a large class of sets in the real line defined in terms of the distribution of frequencies of digits for the representation in some integer base. In particular, our results unify and extend classical work of Borel, Besicovitch, Eggleston, and Billingsley in several directions. Our methods are based on recent results concerning the multifractal

L. Barreira; B. Saussol; J. Schmeling

2002-01-01

259

Rényi dimensions analysis of soil particle-size distributions

Dynamism and complexity are terms that have been related to soils. New approaches and points of view have been researched during past decades considering soil’s multiple functions and integrating physical, chemical, and biological soil attributes. Also new mathematical models have been explored such as fractal sets and multifractal measures. Characterization of soil particle-size distribution is a key move towards modeling

Eloísa Montero

2005-01-01

260

ERIC Data Base; Pagination Field Frequency Distribution Analysis.

ERIC Educational Resources Information Center

A definitive study of the sizes of documents in the ERIC Data Base is reported by the ERIC Processing and Reference Facility. This is a statistical and frequency distribution study encompassing every item in the file whose record contains pagination data in machine readable form. The study provides pagination data that could be used by present and…

Brandhorst, Wesley T.; Marra, Samuel J.; Price, Douglas S.

261

Conduits and dike distribution analysis in San Rafael Swell, Utah

NASA Astrophysics Data System (ADS)

Volcanic fields generally consist of scattered monogenetic volcanoes, such as cinder cones and maars. The temporal and spatial distribution of monogenetic volcanoes and the probability of future activity within volcanic fields are studied with the goals of understanding the origins of these volcano groups and forecasting potential future volcanic hazards. The subsurface magmatic plumbing systems associated with volcanic fields, however, are rarely observed or studied. We therefore investigated a highly eroded and exposed magmatic plumbing system on the San Rafael Swell (UT) that consists of dikes, volcano conduits, and sills. San Rafael Swell is part of the Colorado Plateau and is located east of the Rocky Mountain seismic belt and the Basin and Range. The overburden thickness at the time of mafic magma intrusion (Pliocene; ca. 4 Ma) into Jurassic sandstone is estimated at ~800 m based on paleotopographical reconstructions. Based on a geologic map by P. Delaney and colleagues, and new field research, a total of 63 conduits were mapped in this former volcanic field. The conduits each reveal features of root zones and/or lower diatremes, including rapid dike expansion, peperite, and brecciated intrusive and host rocks. A recrystallized baked zone of host rock is also observed around many conduits. Most conduits are basaltic or shonkinitic, >10 m thick, and associated with feeder dikes intruded along N-S-trending joints in the host rock, whereas two conduits are syenitic, suggesting development from underlying cognate sills. The conduit distribution, analyzed by a kernel function method with elliptical bandwidth, shows a N-S elongated area of higher conduit density regardless of the azimuth of closely spaced conduit alignments (nearest-neighbor distance <200 m). In addition, dike density was calculated as total dike length per unit area (km/km²). The conduit and sill distribution is concordant with the high dike density area.
In particular, the distribution of conduits is non-random with respect to the dike distribution, with greater than 99% confidence on the basis of the Kolmogorov-Smirnov test. On the other hand, the dike density at individual conduit locations suggests that there is no threshold dike density for conduit formation; in other words, conduits may develop even from short mapped dikes in low dike density areas. These results show the effectiveness of studying volcanic vent distributions to infer the size of the magmatic system below volcanic fields, and highlight the uncertainty in forecasting the location of new monogenetic volcanoes in active fields, which may be associated with a single dike intrusion.
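The randomness test can be illustrated with a two-sample Kolmogorov-Smirnov comparison of dike density at conduit locations against the field as a whole. The density values below are synthetic stand-ins, constructed so that conduits favour denser areas, not the mapped San Rafael values:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)

# Synthetic dike-density values (km/km^2): one sample over the whole
# field, one at the 63 mapped conduit locations (the conduit sample
# is drawn from a denser population by construction).
field_density = rng.gamma(2.0, 1.0, size=500)
conduit_density = rng.gamma(3.5, 1.0, size=63)

# Two-sample Kolmogorov-Smirnov test of whether conduit placement is
# random with respect to the dike-density distribution.
stat, p = stats.ks_2samp(conduit_density, field_density)
nonrandom_99 = p < 0.01  # non-random with >99% confidence
```

A small p-value rejects random placement, while inspecting the minimum of `conduit_density` addresses the separate question of whether a density threshold exists.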

Kiyosugi, K.; Connor, C.; Wetmore, P. H.; Ferwerda, B. P.; Germa, A.

2011-12-01

262

CARES/LIFE Ceramics Analysis and Reliability Evaluation of Structures Life Prediction Program

NASA Astrophysics Data System (ADS)

This manual describes the Ceramics Analysis and Reliability Evaluation of Structures Life Prediction (CARES/LIFE) computer program. The program calculates the time-dependent reliability of monolithic ceramic components subjected to thermomechanical and/or proof test loading. CARES/LIFE is an extension of the CARES (Ceramic Analysis and Reliability Evaluation of Structures) computer program. The program uses results from MSC/NASTRAN, ABAQUS, and ANSYS finite element analysis programs to evaluate component reliability due to inherent surface and/or volume type flaws. CARES/LIFE accounts for the phenomenon of subcritical crack growth (SCG) by utilizing the power law, Paris law, or Walker law. The two-parameter Weibull cumulative distribution function is used to characterize the variation in component strength. The effects of multiaxial stresses are modeled by using either the principle of independent action (PIA), the Weibull normal stress averaging method (NSA), or the Batdorf theory. Inert strength and fatigue parameters are estimated from rupture strength data of naturally flawed specimens loaded in static, dynamic, or cyclic fatigue. The probabilistic time-dependent theories used in CARES/LIFE, along with the input and output for CARES/LIFE, are described. Example problems to demonstrate various features of the program are also included.
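The strength model at the core of the program can be sketched as follows: a two-parameter Weibull survival probability per stress component, combined across principal stresses by the principle of independent action (PIA). The modulus and scale values are illustrative, and this is a conceptual sketch rather than the CARES/LIFE implementation:

```python
import math

def weibull_reliability(stress_mpa, m=10.0, sigma0=400.0):
    """Element survival probability under uniaxial tensile stress from
    the two-parameter Weibull cumulative distribution function.
    m (Weibull modulus) and sigma0 (scale) are illustrative values."""
    return math.exp(-((stress_mpa / sigma0) ** m))

def pia_reliability(principal_stresses, m=10.0, sigma0=400.0):
    """Principle of independent action (PIA): multiaxial reliability is
    the product of uniaxial reliabilities over the tensile principal
    stresses; compressive stresses are assumed not to drive failure."""
    r = 1.0
    for s in principal_stresses:
        if s > 0.0:
            r *= weibull_reliability(s, m, sigma0)
    return r
```

In the full program these element reliabilities are evaluated over finite element stress results and integrated over surface or volume flaw populations; the PIA product above shows why multiaxial tension is always more severe than any single component.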

Nemeth, Noel N.; Powers, Lynn M.; Janosik, Lesley A.; Gyekenyesi, John P.

2003-02-01

263

CARES/LIFE Ceramics Analysis and Reliability Evaluation of Structures Life Prediction Program

NASA Technical Reports Server (NTRS)

This manual describes the Ceramics Analysis and Reliability Evaluation of Structures Life Prediction (CARES/LIFE) computer program. The program calculates the time-dependent reliability of monolithic ceramic components subjected to thermomechanical and/or proof test loading. CARES/LIFE is an extension of the CARES (Ceramic Analysis and Reliability Evaluation of Structures) computer program. The program uses results from MSC/NASTRAN, ABAQUS, and ANSYS finite element analysis programs to evaluate component reliability due to inherent surface and/or volume type flaws. CARES/LIFE accounts for the phenomenon of subcritical crack growth (SCG) by utilizing the power law, Paris law, or Walker law. The two-parameter Weibull cumulative distribution function is used to characterize the variation in component strength. The effects of multiaxial stresses are modeled by using either the principle of independent action (PIA), the Weibull normal stress averaging method (NSA), or the Batdorf theory. Inert strength and fatigue parameters are estimated from rupture strength data of naturally flawed specimens loaded in static, dynamic, or cyclic fatigue. The probabilistic time-dependent theories used in CARES/LIFE, along with the input and output for CARES/LIFE, are described. Example problems to demonstrate various features of the program are also included.

Nemeth, Noel N.; Powers, Lynn M.; Janosik, Lesley A.; Gyekenyesi, John P.

2003-01-01

264

ERIC Educational Resources Information Center

This paper introduces the generalized beta (GB) model as a new modeling tool in the educational assessment area and evaluation analysis, specifically. Unlike the normal model, the GB model captures some real characteristics of the data and is an important tool for understanding the phenomenon of learning. This paper develops a contrast with the…

Campos, Jose Alejandro Gonzalez; Moraga, Paulina Saavedra; Del Pozo, Manuel Freire

2013-01-01

265

The distribution laws of atmospheric transmittance in the visible and IR bands

NASA Astrophysics Data System (ADS)

Distribution laws of atmospheric transmittance in the visible and IR bands have been derived by analyzing experimental data on radiation attenuation of a CO2 laser and on the visibility range. It is found that in horizontal directions the empirical distribution of transmittance in the visible and IR bands can be well-approximated using the truncated Weibull distribution. In oblique directions, transmittance can be approximated using the truncated Rayleigh distribution and modified beta-distribution.

Miliutin, E. R.; Iaremenko, Iu. I.

1991-11-01

266

Nonlinear structural analysis on distributed-memory computers

NASA Technical Reports Server (NTRS)

A computational strategy is presented for the nonlinear static and postbuckling analyses of large complex structures on massively parallel computers. The strategy is designed for distributed-memory, message-passing parallel computer systems. The key elements of the proposed strategy are: (1) a multiple-parameter reduced basis technique; (2) a nested dissection (or multilevel substructuring) ordering scheme; (3) parallel assembly of global matrices; and (4) a parallel sparse equation solver. The effectiveness of the strategy is assessed by applying it to thermo-mechanical postbuckling analyses of stiffened composite panels with cutouts, and nonlinear large-deflection analyses of HSCT models on Intel Paragon XP/S computers. The numerical studies presented demonstrate the advantages of nested dissection-based solvers over traditional skyline-based solvers on distributed memory machines.

Watson, Brian C.; Noor, Ahmed K.

1995-01-01

267

NASA Astrophysics Data System (ADS)

The paper considers the reliability analysis of distributed computer systems with client-server architecture. A distributed computer system is a set of hardware and software implementing the following main functions: processing, storage, transmission, and data protection. This paper discusses the client-server architecture of distributed computer systems and presents a scheme of system functioning represented as a graph, where vertices are the functional states of the system and arcs are transitions from one state to another depending on the prevailing conditions. In the reliability analysis we consider such reliability indicators as the probability of the system's transition into the stopping and accident states, as well as the intensity of these transitions. The proposed model allows us to obtain relations for the reliability parameters of the distributed computer system without any assumptions about the distribution laws of the random variables or the number of elements in the system.
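The state-graph idea can be illustrated with a small discrete-state sketch: long-run state probabilities follow from the transition structure alone, without distributional assumptions on the underlying random variables. The three states and transition probabilities below are hypothetical, not taken from the paper:

```python
import numpy as np

# Hypothetical 3-state functional graph for a client-server system:
# state 0 = operational, 1 = stopped, 2 = accident. Entries are
# one-step transition probabilities between states (illustrative).
P = np.array([[0.96, 0.03, 0.01],
              [0.80, 0.20, 0.00],
              [0.50, 0.00, 0.50]])

# Long-run fraction of time spent in each state: the left eigenvector
# of P for eigenvalue 1, normalised to sum to one.
vals, vecs = np.linalg.eig(P.T)
pi = np.real(vecs[:, np.argmax(np.real(vals))])
pi = pi / pi.sum()
```

The entries of `pi` for the stopping and accident states play the role of the transition-probability indicators the model evaluates.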

Kovalev, I. V.; Zelenkov, P. V.; Karaseva, M. V.; Tsarev, M. Yu; Tsarev, R. Yu

2015-01-01

268

Applying Bayesian network to distribution system reliability analysis

This paper proposes a Bayesian network (BN) model for distribution-system-reliability assessment and reassessment. The zone-based FMEA technique is applied to construct the BN model. During forward propagation of probabilistic information, load-point and system reliability indices are calculated. And the weak points are identified during backward propagation for what-if studies. A dual isomorphic model is used to obtain outage probabilistic indices

Wang Chengshan; Xie Yinghua

2004-01-01

269

GSM-Distributed RTK for Precise Analysis of Speed Skiing

The application to dynamic sports of an accurate analysis of trajectories represents a new perspective for carrier phase-based GPS positioning. Introduced in competitive skiing, the GPS technique provides all the qualitative data for a complete analysis of position/velocity/acceleration, so that the measured trajectories can be compared throughout the entire track. Consequently, it can help the athletes to find the fastest

J. Skaloud; H. Gontran; B. Merminod

2004-01-01

270

Analysis of magnetic electron lens with secant hyperbolic field distribution

NASA Astrophysics Data System (ADS)

Electron-optical imaging instruments like Scanning Electron Microscope (SEM) and Transmission Electron Microscope (TEM) use specially designed solenoid electromagnets for focusing of the electron beam. Indicators of imaging performance of these instruments, like spatial resolution, have a strong correlation with the focal characteristics of the magnetic lenses, which in turn have been shown to be sensitive to the details of the spatial distribution of the axial magnetic field. Owing to the complexity of designing practical lenses, empirical mathematical expressions are important to obtain the desired focal properties. Thus the degree of accuracy of such models in representing the actual field distribution determines accuracy of the calculations and ultimately the performance of the lens. Historically, the mathematical models proposed by Glaser [1] and Ramberg [2] have been extensively used. In this paper the authors discuss another model with a secant-hyperbolic type magnetic field distribution function, and present a comparison between models, utilizing results from finite element-based field simulations as the reference for evaluating performance.

Pany, S. S.; Ahmed, Z.; Dubey, B. P.

2014-12-01

271

Flaw strength distributions and statistical parameters for ceramic fibers: The normal distribution

NASA Astrophysics Data System (ADS)

The present paper investigates large sets of ceramic fibre failure strengths (500 to 1000 data) produced using tensile tests on tows that contained either 500 or 1000 filaments. The probability density function was determined through acoustic emission monitoring which allowed detection and counting of filament fractures. The statistical distribution of filament strengths was described using the normal distribution. The Weibull equation was then fitted to this normal distribution for estimation of statistical parameters. A perfect agreement between both distributions was obtained, and a quite negligible scatter in statistical parameters was observed, as opposed to the wide variability that is reported in the literature. Thus it was concluded that flaw strengths are distributed normally and that the statistical parameters that were derived are the true ones. In a second step, the conventional method of estimation of Weibull parameters was applied to these sets of data and, then, to subsets selected randomly. The influence of other factors involved in the conventional method of determination of statistical parameters is discussed. It is demonstrated that selection of specimens, sample size, and method of construction of so-called Weibull plots are responsible for statistical parameters variability.
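The conventional construction of a so-called Weibull plot discussed in the second step can be sketched as follows; the rank estimator and the synthetic strength sample are illustrative choices, not the authors' exact procedure:

```python
import numpy as np

def weibull_plot_fit(strengths):
    """Conventional Weibull-plot estimation: sort the strengths, assign
    probability ranks, and fit ln(-ln(1 - P)) against ln(sigma) by
    least squares. The slope estimates the Weibull modulus m and the
    intercept gives the scale sigma0."""
    s = np.sort(np.asarray(strengths, dtype=float))
    n = len(s)
    p = (np.arange(1, n + 1) - 0.5) / n   # one common rank estimator
    x = np.log(s)
    y = np.log(-np.log(1.0 - p))
    m, c = np.polyfit(x, y, 1)
    return m, np.exp(-c / m)

# Synthetic filament strengths (MPa) drawn from a known Weibull law.
rng = np.random.default_rng(5)
strengths = 300.0 * rng.weibull(5.0, size=800)
m_hat, sigma0_hat = weibull_plot_fit(strengths)
```

Re-running the fit on random subsets of `strengths` reproduces the sample-size and specimen-selection sensitivity of the estimated parameters that the paper demonstrates.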

R'Mili, M.; Godin, N.; Lamon, J.

2012-05-01

272

APPROXIMATE NULL DISTRIBUTION OF THE LARGEST ROOT IN MULTIVARIATE ANALYSIS

The greatest root distribution occurs everywhere in classical multivariate analysis, but even under the null hypothesis the exact distribution has required extensive tables or special purpose software. We describe a simple approximation, based on the Tracy–Widom distribution, that in many cases can be used instead of tables or software, at least for initial screening. The quality of approximation is studied, and its use illustrated in a variety of settings. PMID:20526465

Johnstone, Iain M.

2010-01-01

273

Use of a moments method for the analysis of flux distributions in subcritical assemblies

A moments method has been developed for the analysis of flux distributions in subcritical neutron-multiplying assemblies. The method determines values of the asymptotic axial and radial buckling, and of the extrapolated ...

Cheng, Hsiang-Shou

1968-01-01

274

VISUAL INTERFACE FOR THE CONCEPT DISTRIBUTION ANALYSIS IN VIDEO SEARCH RESULTS

Keywords: Information Retrieval, Visualization, Search Results, Visual Interface. Despite the current performance of the 'traditional' search engines, the video search engines

Paris-Sud XI, UniversitÃ© de

275

A framework is established for directly accommodating feedback interconnections of unstable distributed-parameter transfer functions in robust stability analysis via integral quadratic constraints (IQCs). This involves transfer function homotopies that are continuous in a...

Michael Cantoni; Ulf T. Jonsson; Chung-Yao Kao

2012-01-01

276

To identify genetic loci influencing central obesity and fat distribution, we performed a meta-analysis of 16 genome-wide association studies (GWAS, N = 38,580) informative for adult waist circumference (WC) and waist–hip ...

Hunter, David J.

277

Brillouin optical-fiber frequency-domain analysis is a new sensing technique for the distributed measurement of temperature and strain. Extensive theoretical investigations and experimental results of distributed temperature and strain measurements demonstrate the feasibility of this new concept. In an experimental demonstration, a spatial resolution of 3 m was achieved for a 1 km-long single-mode fiber

D. Garcus; Torsten Gogolla; Katerina Krebber; Frank Schliep

1997-01-01

278

Analysis of Saturn's thermal emission at 2.2-cm wavelength: Spatial distribution of ammonia vapor

This work focuses on determining the latitudinal structure of ammonia vapor in Saturn's cloud ... of the spatial distribution of ammonia vapor, since ammonia gas is the only effective opacity source

279

Progressive supranuclear palsy is characterized neuropathologically by the presence of high densities of neurofibrillary tangles in several subcortical structures. In some cases, neurofibrillary tangles have also been described in the cerebral cortex. We performed a quantitative regional and laminar analysis of the distribution of these lesions in six cases of progressive supranuclear palsy. We observed that the neurofibrillary tangle distribution

P. R. Hof; A. Delacourte; C. Bouras

1992-01-01

280

A Java-Based Distributed Computation Framework for Finite Element Analysis

Parallel computation in the finite element method for structural analysis has emerged as a major technique for solving large-scale and complex problems. However, in most of the existing research, multiprocessor computers have been the focus of implementation rather than distributed computing systems. With the ongoing rapid development of distributed computing systems, they have become a promising computation tool for

Kevin K.-H. Tseng; Buntono

2001-01-01

281

Formal description and analysis of a distributed location service for mobile ad hoc networks

In this paper, we define a distributed abstract state machine (DASM) model of the network or routing layer protocol for mobile ad hoc networks. In conjunction with the chosen routing strategy, we propose a distributed logical topology based location service (LTLS) protocol and give a formal description and analysis of this protocol on the DASM model. The high dynamics of

Uwe Glässer; Qian-ping Gu

2005-01-01

282

This paper presents a hybrid thermal model with distributed heat sources for thermal analysis of soft magnetic composite (SMC) motors. The model uses a combination of lumped and distributed thermal parameters, which can be obtained from motor dimensions and thermal constants. The model can be used to calculate the core loss in each part, in combination with three-dimensional magnetic field

Youguang Guo; Jian Guo Zhu; Wei Wu

2005-01-01

283

Nonlocal approach to the analysis of the stress distribution in granular systems. II. Application

… involves the use of (1) pressure sensors or strain gauges within, or at the edge of, a compact, and (2) direct mapping of the density distribution within a compact. The first approach

Kenkre, V.M.

284

Spatial analysis of pallid sturgeon Scaphirhynchus albus distribution in the Missouri River, South …

The distribution of the endangered pallid sturgeon Scaphirhynchus albus has generally been documented using radio … requirements of pallid sturgeon. The pallid sturgeon Scaphirhynchus albus is a federally

285

Making distributed ALICE analysis simple using the GRID plug-in

NASA Astrophysics Data System (ADS)

We have developed an interface within the ALICE analysis framework that allows transparent usage of the experiment's distributed resources. This analysis plug-in makes it possible to configure back-end specific parameters from a single interface and to run with no change the same custom user analysis in many computing environments, from local workstations to PROOF clusters or GRID resources. The tool is used now extensively in the ALICE collaboration for both end-user analysis and large scale productions.

Gheata, A.; Gheata, M.

2012-06-01

286

NASA Technical Reports Server (NTRS)

A computer program was developed for calculating the statistical fast fracture reliability and failure probability of ceramic components. The program includes the two-parameter Weibull material fracture strength distribution model, using the principle of independent action for polyaxial stress states and Batdorf's shear-sensitive as well as shear-insensitive crack theories, all for volume distributed flaws in macroscopically isotropic solids. Both penny-shaped cracks and Griffith cracks are included in the Batdorf shear-sensitive crack response calculations, using Griffith's maximum tensile stress or critical coplanar strain energy release rate criteria to predict mixed mode fracture. Weibull material parameters can also be calculated from modulus of rupture bar tests, using the least squares method with known specimen geometry and fracture data. The reliability prediction analysis uses MSC/NASTRAN stress, temperature and volume output, obtained from the use of three-dimensional, quadratic, isoparametric, or axisymmetric finite elements. The statistical fast fracture theories employed, along with selected input and output formats and options, are summarized. An example problem to demonstrate various features of the program is included.
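The two-parameter Weibull fracture-strength model that the program above is built on can be sketched in a few lines. This is an illustrative weakest-link form, P_f = 1 − exp[−(V/V₀)(σ/σ₀)^m], not the NASA program itself; the function and parameter names are chosen for illustration:

```python
import math

def weibull_failure_prob(sigma, m, sigma0, v_ratio=1.0):
    """Two-parameter Weibull weakest-link failure probability.

    sigma   : applied uniaxial tensile stress
    m       : Weibull modulus (shape parameter)
    sigma0  : characteristic strength (scale parameter)
    v_ratio : stressed volume relative to the reference volume V0
    """
    if sigma <= 0:
        return 0.0
    return 1.0 - math.exp(-v_ratio * (sigma / sigma0) ** m)
```

At the characteristic strength (sigma = sigma0, v_ratio = 1) the failure probability is 1 − e⁻¹ ≈ 63.2%, and a larger stressed volume raises the failure probability at the same stress, which is the size effect that volume-distributed flaw models capture.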

Gyekenyesi, J. P.

1986-01-01

287

Aggregate Characterization of User Behavior in Twitter and Analysis of the Retweet Graph

Most previous analysis of Twitter user behavior is focused on individual information cascades and the social followers graph. We instead study aggregate user behavior and the retweet graph with a focus on quantitative descriptions. We find that the lifetime tweet distribution is a type-II discrete Weibull stemming from a power law hazard function, the tweet rate distribution, although asymptotically power law, exhibits a lognormal cutoff over finite sample intervals, and the inter-tweet interval distribution is power law with exponential cutoff. The retweet graph is small-world and scale-free, like the social graph, but is less disassortative and has much stronger clustering. These differences are consistent with it better capturing the real-world social relationships of and trust between users. Beyond just understanding and modeling human communication patterns and social networks, applications for alternative, decentralized microblogging systems (both predicting real-world performance and detecting spam) are d...
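The defining property of a type-II discrete Weibull, as invoked above, is a power-law discrete hazard. A minimal sketch of drawing lifetimes with such a hazard follows; the particular constants (c, alpha) and the decaying form of the hazard are illustrative assumptions, not fitted values from the study:

```python
import random

def sample_lifetime(c, alpha, rng, kmax=10**6):
    """Draw a discrete lifetime whose hazard follows a power law:
    h(k) = min(1, c * k**(-alpha)) at age k = 1, 2, ...
    The lifetime is the first age at which a Bernoulli(h(k)) trial succeeds."""
    for k in range(1, kmax):
        if rng.random() < min(1.0, c * k ** (-alpha)):
            return k
    return kmax  # truncation guard for very long-lived draws

rng = random.Random(42)
lifetimes = [sample_lifetime(0.5, 0.3, rng) for _ in range(10000)]
```

Plotting the empirical survival function of `lifetimes` on log-log axes is the usual way to check the heavy-tailed behavior such a hazard produces.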

Bild, David R; Dick, Robert P; Mao, Z Morley; Wallach, Dan S

2014-01-01

288

The Poisson distribution is the most widely recognised and commonly used distribution for cytogenetic radiation biodosimetry. However, it is recognised that, due to the complexity of radiation exposure cases, other distributions may be more properly applied. Here, the Poisson, gamma, negative binomial, beta, Neyman type-A and Hermite distributions are compared in terms of their applicability to 'real-life' radiation exposure situations. The identification of the most appropriate statistical model in each particular exposure situation more correctly characterises data. The results show that for acute, homogeneous (whole-body) exposures, the Poisson distribution can still give a good fit to the data. For localised partial-body exposures, the Neyman type-A model was found to be the most robust. Overall, no single distribution was found to be universally appropriate. A distribution-specific method of analysis of cytogenetic data is therefore recommended. Such an approach may lead potentially to more accurate biological dose estimates. PMID:23325781
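A quick, standard first check when choosing among the count distributions compared above is the variance-to-mean (dispersion) index: values near 1 are consistent with Poisson, values above 1 suggest overdispersed alternatives such as the negative binomial or Neyman type-A. The sketch below uses hypothetical dicentric counts per cell for illustration only, not data from the study:

```python
import math

def poisson_loglik(counts, lam):
    """Poisson log-likelihood of integer counts at mean lam."""
    return sum(k * math.log(lam) - lam - math.lgamma(k + 1) for k in counts)

def dispersion_index(counts):
    """Sample variance-to-mean ratio; ~1 for Poisson, >1 indicates overdispersion."""
    n = len(counts)
    mean = sum(counts) / n
    var = sum((k - mean) ** 2 for k in counts) / (n - 1)
    return var / mean

# hypothetical dicentric counts per cell (illustrative only)
cells = [0] * 380 + [1] * 90 + [2] * 25 + [3] * 5
lam = sum(cells) / len(cells)  # MLE of the Poisson mean
```

Here the dispersion index exceeds 1, so a purely Poisson fit (log-likelihood via `poisson_loglik(cells, lam)`) would be compared against an overdispersed model before estimating dose.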

Ainsbury, Elizabeth A; Vinnikov, Volodymyr A; Maznyk, Nataliya A; Lloyd, David C; Rothkamm, Kai

2013-07-01

289

Complexity analysis of pipeline mapping problems in distributed heterogeneous networks

Large-scale scientific applications require using various system resources to execute complex computing pipelines in distributed networks to support collaborative research. System resources are typically shared in the Internet or over dedicated connections based on their location, availability, capability, and capacity. Optimizing the network performance of computing pipelines in such distributed environments is critical to the success of these applications. We consider two types of large-scale distributed applications: (1) interactive applications where a single dataset is sequentially processed along a pipeline; and (2) streaming applications where a series of datasets continuously flow through a pipeline. The computing pipelines of these applications consist of a number of modules executed in a linear order in network environments with heterogeneous resources under different constraints. Our goal is to find an efficient mapping scheme that allocates the modules of a pipeline to network nodes for minimum end-to-end delay or maximum frame rate. We formulate the pipeline mappings in distributed environments as optimization problems and categorize them into six classes with different optimization goals and mapping constraints: (1) Minimum End-to-end Delay with No Node Reuse (MEDNNR), (2) Minimum End-to-end Delay with Contiguous Node Reuse (MEDCNR), (3) Minimum End-to-end Delay with Arbitrary Node Reuse (MEDANR), (4) Maximum Frame Rate with No Node Reuse or Share (MFRNNRS), (5) Maximum Frame Rate with Contiguous Node Reuse and Share (MFRCNRS), and (6) Maximum Frame Rate with Arbitrary Node Reuse and Share (MFRANRS). Here, 'contiguous node reuse' means that multiple contiguous modules along the pipeline may run on the same node and 'arbitrary node reuse' imposes no restriction on node reuse. Note that in interactive applications, a node can be reused but its resource is not shared. We prove that MEDANR is polynomially solvable and the rest are NP-complete.
MEDANR, where either contiguous or noncontiguous modules in the pipeline can be mapped onto the same node, is essentially the Maximum n-hop Shortest Path problem, and can be solved using a dynamic programming method. In MEDNNR and MFRNNRS, any network node can be used only once, which requires selecting the same number of nodes for one-to-one onto mapping. We show its NP-completeness by reducing from the Hamiltonian Path problem. Node reuse is allowed in MEDCNR, MFRCNRS and MFRANRS, which are similar to the Maximum n-hop Shortest Path problem that considers resource sharing. We prove their NP-completeness by reducing from the Disjoint-Connecting-Path Problem and Widest path with the Linear Capacity Constraints problem, respectively.
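The polynomially solvable case (MEDANR) admits a simple dynamic program over (module, node) states: the best delay of placing module i on node v extends the best placement of module i-1 on any node u by the transfer delay u→v. The cost model and names below are illustrative assumptions, not the paper's notation:

```python
def min_end_to_end_delay(proc, link):
    """MEDANR sketch: map modules 0..m-1 onto nodes with arbitrary node reuse.

    proc[i][v] : processing delay of module i on node v
    link[u][v] : transfer delay from node u to node v (link[v][v] = 0)
    Returns the minimum achievable end-to-end delay.
    """
    m, n = len(proc), len(proc[0])
    dp = list(proc[0])  # best delay with module 0 placed on each node
    for i in range(1, m):
        dp = [min(dp[u] + link[u][v] for u in range(n)) + proc[i][v]
              for v in range(n)]
    return min(dp)
```

With m modules and n nodes this runs in O(m·n²), illustrating why arbitrary node reuse keeps the problem tractable while the no-reuse variants become NP-complete.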

Lin, Ying [University of Tennessee, Knoxville (UTK); Wu, Qishi [ORNL; Zhu, Mengxia [ORNL; Rao, Nageswara S [ORNL

2009-04-01

290

Distribution factors play a key role in many system security analysis and market applications … of the other distribution factors. The line outage distribution factors (LODFs) may be computed using the ISFs. Index terms: distribution factors, line outage distribution factors, multiple-line outages, system security.

291

The analysis of biodiversity using rank abundance distributions.

Biodiversity is an important topic of ecological research. A common form of data collected to investigate patterns of biodiversity is the number of individuals of each species at a series of locations. These data contain information on the number of individuals (abundance), the number of species (richness), and the relative proportion of each species within the sampled assemblage (evenness). If there are enough sampled locations across an environmental gradient then the data should contain information on how these three attributes of biodiversity change over gradients. We show that the rank abundance distribution (RAD) representation of the data provides a convenient method for quantifying these three attributes constituting biodiversity. We present a statistical framework for modeling RADs and allow their multivariate distribution to vary according to environmental gradients. The method relies on three models: a negative binomial model, a truncated negative binomial model, and a novel model based on a modified Dirichlet-multinomial that allows for a particular type of heterogeneity observed in RAD data. The method is motivated by, and applied to, a large-scale marine survey off the coast of Western Australia, Australia. It provides a rich description of biodiversity and how it changes with environmental conditions. PMID:19432789

Foster, Scott D; Dunstan, Piers K

2010-03-01

292

Analysis of an algorithm for distributed recognition and accountability

Computer and network systems are vulnerable to attacks. Abandoning the existing huge infrastructure of possibly-insecure computer and network systems is impossible, and replacing them by totally secure systems may not be feasible or cost effective. A common element in many attacks is that a single user will often attempt to intrude upon multiple resources throughout a network. Detecting the attack can become significantly easier by compiling and integrating evidence of such intrusion attempts across the network rather than attempting to assess the situation from the vantage point of only a single host. To solve this problem, we suggest an approach for distributed recognition and accountability (DRA), which consists of algorithms which "process," at a central location, distributed and asynchronous "reports" generated by computers (or a subset thereof) throughout the network. Our highest-priority objectives are to observe ways by which an individual moves around in a network of computers, including changing user names to possibly hide his/her true identity, and to associate all activities of multiple instances of the same individual to the same network-wide user. We present the DRA algorithm and a sketch of its proof under an initial set of simplifying albeit realistic assumptions. Later, we relax these assumptions to accommodate pragmatic aspects such as missing or delayed "reports," clock slew, tampered "reports," etc. We believe that such algorithms will have widespread applications in the future, particularly in intrusion-detection systems.

Ko, C.; Frincke, D.A.; Goan, T. Jr.; Heberlein, L.T.; Levitt, K.; Mukherjee, B.; Wee, C. [California Univ., Davis, CA (United States). Dept. of Computer Science

1993-08-01

293

VISUALIZATION AND ANALYSIS OF LPS DISTRIBUTION IN BINARY PHOSPHOLIPID BILAYERS

Lipopolysaccharide (LPS) is an endotoxin released from the outer membrane of Gram-negative bacteria during infections. It has been reported that LPS may play a role in the outer membrane of bacteria similar to that of cholesterol in eukaryotic plasma membranes. In this article we compare the effect of introducing LPS or cholesterol into liposomes made of dipalmitoylphosphatidylcholine/dioleoylphosphatidylcholine on the solubilization process by Triton X-100. The results show that liposomes containing LPS or cholesterol are more resistant to solubilization by Triton X-100 than the binary phospholipid mixtures at 4°C. The LPS distribution was analyzed on GUVs of DPPC:DOPC using FITC-LPS. Solid and liquid-crystalline domains were visualized by labeling the GUVs with LAURDAN, and GP images were acquired using a two-photon microscope. The images show a selective distribution of LPS in gel domains. Our results support the hypothesis that LPS could aggregate and concentrate selectively in biological membranes, providing a mechanism to bring together several components of the LPS-sensing machinery. PMID:19324006

Florencia, Henning María; Susana, Sanchez; Laura, Bakás

2010-01-01

294

Heterogeneity effects in radiation dose distribution analysis for BNCT

Calculation of the various radiation dose components that will exist in the treatment volume during boron neutron capture therapy (BNCT) is a complex, three-dimensional problem. These components all have different spatial distributions and relative biological effectiveness (RBE). The typical approach to such calculations has been to approximate the highly heterogeneous calculational geometry of the irradiation volume by either a spatially homogenized model or by a simplified few-region model. The accuracy of such models should be validated by comparison with calculated results obtained by modeling the actual heterogeneous geometry and tissue variations as faithfully as possible. The results of such an exercise for the geometry of the canine head are presented. There are basically three types of tissue-heterogeneity effects that influence radiation dose distributions in BNCT. First, macroscopic spatial fluence perturbations can occur as a result of spatial variations in the radiation transport properties of the tissues in the irradiation volume. Second, tissues with different elemental compositions will have different kerma factors and, therefore, different absorbed doses, even in the same radiation fluence. Finally, there are macroscopic and microscopic effects caused by local secondary charged-particle imbalance in heterogeneous media. The work presented in this paper concentrates on the first type of heterogeneity effects.

Moran, J.M.; Nigg, D.W. (Idaho National Engineering Lab., Idaho Falls (United States))

1992-01-01

295

Visualization and analysis of lipopolysaccharide distribution in binary phospholipid bilayers

Lipopolysaccharide (LPS) is an endotoxin released from the outer membrane of Gram-negative bacteria during infections. It has been reported that LPS may play a role in the outer membrane of bacteria similar to that of cholesterol in eukaryotic plasma membranes. In this article we compare the effect of introducing LPS or cholesterol into liposomes made of dipalmitoylphosphatidylcholine/dioleoylphosphatidylcholine on the solubilization process by Triton X-100. The results show that liposomes containing LPS or cholesterol are more resistant to solubilization by Triton X-100 than the binary phospholipid mixtures at 4°C. The LPS distribution was analyzed on GUVs of DPPC:DOPC using FITC-LPS. Solid and liquid-crystalline domains were visualized by labeling the GUVs with LAURDAN, and GP images were acquired using a two-photon microscope. The images show a selective distribution of LPS in gel domains. Our results support the hypothesis that LPS could aggregate and concentrate selectively in biological membranes, providing a mechanism to bring together several components of the LPS-sensing machinery.

Henning, Maria Florencia [Instituto de Investigaciones Bioquimicas La Plata (INIBIOLP), CCT-La Plata, CONICET, Facultad de Ciencias Medicas, UNLP, Calles 60 y 120, 1900 La Plata (Argentina)]; Sanchez, Susana [Laboratory for Fluorescence Dynamics, University of California-Irvine, Irvine, CA (United States)]; Bakas, Laura, E-mail: lbakas@biol.unlp.edu.ar [Instituto de Investigaciones Bioquimicas La Plata (INIBIOLP), CCT-La Plata, CONICET, Facultad de Ciencias Medicas, UNLP, Calles 60 y 120, 1900 La Plata (Argentina); Departamento de Ciencias Biologicas, Facultad de Ciencias Exactas, UNLP, Calles 47 y 115, 1900 La Plata (Argentina)]

2009-05-22

296

Freight distribution systems with cross-docking: a multidisciplinary analysis

companies have changed their logistics strategies to better adapt them to changing demand. Moreover … framework for logistics and transport pooling systems, as well as a simulation method for strategic planning … The freight transport industry is a major source of employment

297

Real Time Web Usage Mining with a Distributed Navigation Analysis

The behaviour of a Web site's users may change so quickly that attempting to make predictions, according to the frequent patterns coming from the analysis of an access log file, becomes challenging. In order for the obsolescence of the behavioural patterns to be as small as possible, the ideal method would provide frequent patterns in real time, allowing the result

Florent Masseglia; Maguelonne Teisseire; Pascal Poncelet

2002-01-01

298

NASA Technical Reports Server (NTRS)

The results of the Independent Orbiter Assessment (IOA) of the Failure Modes and Effects Analysis (FMEA) and Critical Items List (CIL) are presented. The IOA approach features a top-down analysis of the hardware to determine failure modes, criticality, and potential critical items. To preserve independence, this analysis was accomplished without reliance upon the results contained within the NASA FMEA/CIL documentation. This report documents the independent analysis results corresponding to the Orbiter Electrical Power Distribution and Control (EPD and C) hardware. The EPD and C hardware performs the functions of distributing, sensing, and controlling 28 volt DC power and of inverting, distributing, sensing, and controlling 117 volt 400 Hz AC power to all Orbiter subsystems from the three fuel cells in the Electrical Power Generation (EPG) subsystem. Volume 2 continues the presentation of IOA analysis worksheets and contains the potential critical items list.

Schmeckpeper, K. R.

1987-01-01

299

An empirical analysis of the distribution of overshoots in a stationary Gaussian stochastic process

NASA Technical Reports Server (NTRS)

The frequency distribution of overshoots in a stationary Gaussian stochastic process is analyzed. The primary processes involved in this analysis are computer simulation and statistical estimation. Computer simulation is used to simulate stationary Gaussian stochastic processes that have selected autocorrelation functions. An analysis of the simulation results reveals a frequency distribution for overshoots with a functional dependence on the mean and variance of the process. Statistical estimation is then used to estimate the mean and variance of a process. It is shown that, given an autocorrelation function together with the mean and variance of the process, a frequency distribution for overshoots can be estimated.
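The simulate-then-count procedure described can be sketched for the simplest case: a stationary Gaussian AR(1) process (a particular choice of autocorrelation function, assumed here for illustration) and a fixed threshold, counting upward crossings as overshoots:

```python
import math
import random

def simulate_ar1(n, phi, rng):
    """Stationary Gaussian AR(1): x_t = phi*x_{t-1} + e_t, scaled to unit marginal variance."""
    sigma_e = math.sqrt(1 - phi ** 2)
    x = [rng.gauss(0, 1)]
    for _ in range(n - 1):
        x.append(phi * x[-1] + sigma_e * rng.gauss(0, 1))
    return x

def count_upcrossings(x, level):
    """Overshoot count: adjacent sample pairs where the process crosses the level upward."""
    return sum(1 for a, b in zip(x, x[1:]) if a <= level < b)

rng = random.Random(0)
x = simulate_ar1(100_000, 0.9, rng)
rate = count_upcrossings(x, 1.0) / len(x)  # empirical overshoot frequency per sample
```

Repeating this over several autocorrelation choices and thresholds gives the empirical frequency distribution whose dependence on the process mean and variance the paper estimates.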

Carter, M. C.; Madison, M. W.

1973-01-01

300

An analysis of the Seasat Satellite Data Distribution System

NASA Technical Reports Server (NTRS)

A computerized data distribution network for remote accessing of Seasat generated data is described. The service is intended as an experiment to determine user needs and operational abilities for utilizing on-line satellite generated oceanographic data. Synoptic weather observations are input to the U.S. Fleet Numerical Oceanographic Central for preparation and transfer to a PDP 11/60 central computer, from which all access trunks originate. The data available includes meteorological and sea-state information in the form of analyses and forecasts, and users are being monitored for reactions to the system design, data products, system operation, and performance evaluation. The system provides data on sea level and upper atmospheric pressure, sea surface temperature, wind magnitude and direction, significant wave heights, direction, and periods, and spectral wave data. Transmissions have a maximum rate of 1.1 kbit/sec over the telephone line.

Ferrari, A. J.; Renfrow, J. T.

1980-01-01

301

A pair distribution function analysis of zeolite beta

We describe the structural refinement of zeolite beta using the local structure obtained with the pair distribution function (PDF) method. A high quality synchrotron and two neutron scattering datasets were obtained on two samples of siliceous zeolite beta. The two polytypes that make up zeolite beta have the same local structure; therefore refinement of the two structures was possible using the same experimental PDF. Optimized structures of polytypes A and B were used to refine the structures using the program PDFfit. Refinements using only the synchrotron or the neutron datasets gave results inconsistent with each other but a cyclic refinement with the two datasets gave a good fit to both PDFs. The results show that the PDF method is a viable technique to analyze the local structure of disordered zeolites. However, given the complexity of most zeolite frameworks, the use of both X-ray and neutron radiation and high-resolution patterns is essential to obtain reliable refinements.

Martinez-Inesta, M.M.; Peral, I.; Proffen, T.; Lobo, R.F. (Delaware); (LANL)

2010-07-20

302

Quantitative analysis of inclusion distributions in hot pressed silicon carbide

Depth of penetration measurements in hot pressed SiC have exhibited significant variability that may be influenced by microstructural defects. To obtain a better understanding of the role of microstructural defects under highly dynamic conditions, fragments of hot pressed SiC plates subjected to impact tests were examined. Two types of inclusion defects were identified, carbonaceous and an aluminum-iron-oxide phase. A disproportionate number of large inclusions were found on the rubble, indicating that the inclusion defects were a part of the fragmentation process. Distribution functions were plotted to compare the inclusion populations. Fragments from the superior performing sample had an inclusion population consisting of more numerous but smaller inclusions. One possible explanation for this result is that the superior sample withstood a greater stress before failure, causing a greater number of smaller inclusions to participate in fragmentation than in the weaker sample.

Michael Paul Bakas

2012-12-01

303

Distributional Impacts of Agricultural Growth in Pakistan: A Multiplier Analysis

In spite of substantial growth in agricultural GDP in the 1990s, rural poverty rates in Pakistan did not decline. This paper explores the reasons for this lack of correlation between increases in agricultural production and poverty reduction through an analysis of growth linkages using a 2001-02 Social Accounting Matrix (SAM)-based semi-input-output model. Model simulations indicate that expansion of traditional crop

Paul Dorosh; Muhammad Khan Niazi; Hina Nazli

2003-01-01

304

A social network analysis of customer-level revenue distribution

Social network analysis has been a topic of regular interest in the marketing discipline. Previous studies have largely focused on similarities in product/brand choice decisions within the same social network, often in the context of product innovation adoption. Not much is known, however, about the importance of social network effects once customers have been acquired. Using the customer base of

Michael Haenlein

2011-01-01

305

We're closer than you think: Portland's geographic location and Oregon's transportation infrastructure offer unmatched connectivity and time savings to international and domestic markets. Our economic development practices combine project-ready property with efficient, high-capacity infrastructure to create today's logistics advantages. Connecting people, places and products is the core of Portland's distribution and logistics industry sector.

F. Gregory; B. Boyd; R. Bridges; D. Mitchell; J. Halsell; S. Fancher; D. King; R. Fore; E. Mango; D. Berlinrut; M. Leinbach; M. Maier; M. Wetmore; H. Herring; J. Guidi; M. Coolidge; J. Heald; T. Knox; D. Bartine; R. Bailey; H. Delgado; P. Conant; J. Madura; R. Thomas; F. Merceret; G. Allen; E. Bensman; R. Dittemore; N. Feldman; C. Boykin; H. Tileston; F. Brody; L. Hagerman; S. Pearson; L. Uccellini; W. Vaughan; J. Golden; D. Johnson; J. McQueen; B. Roberts; L. Freeman; G. Jasper; B. Hagemeyer; A. McCool; X. W. Proenza; S. Glover

2006-01-01

306

Sensitivity Analysis of Distributed Soil Moisture Profiles by Active Distributed Temperature Sensing

NASA Astrophysics Data System (ADS)

Monitoring and measuring the fluctuations of soil moisture at large scales in the field remains a challenge. Although sensors based on measurement of dielectric properties such as Time Domain Reflectometers (TDR) and capacity-based probes can guarantee reasonable responses, they always operate on limited spatial ranges. On the other hand optical fibers, attached to a Distributed Temperature Sensing (DTS) system, can allow for high precision soil temperature measurements over distances of kilometers. A recently developed technique called Active DTS (ADTS), consisting of a heat pulse of a certain duration and power along the metal sheath covering the optical fiber buried in the soil, has proven a promising alternative to spatially-limited probes. Two approaches have been investigated to infer distributed soil moisture profiles in the region surrounding the optic fiber cable by analyzing the temperature variations during the heating and the cooling phases. One directly relates the change of temperature to the soil moisture (independently measured) to develop a specific calibration curve for the soil used; the other requires inferring the thermal properties and then obtaining the soil moisture by inversion of known relationships. To test and compare the two approaches over a broad range of saturation conditions a large lysimeter has been homogeneously filled with loamy soil and 52 meters of fiber optic cable have been buried in the shallower 0.8 meters in a double coil rigid structure of 15 loops along with a series of capacity-based sensors (calibrated for the soil used) to provide independent soil moisture measurements at the same depths of the optical fiber. Thermocouples have also been wrapped around the fiber to investigate the effects of the insulating cover surrounding the cable, and in between each layer in order to monitor heat diffusion at several centimeters. A high performance DTS has been used to measure the temperature along the fiber optic cable.
Several soil moisture profiles have been generated in the lysimeter either varying the water table height or by wetting the soil from the top. The sensitivity of the ADTS method for heat pulses of different duration and power and ranges of spatial and temporal resolution are presented.
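The second approach above (infer thermal properties, then invert to soil moisture) is commonly done with the infinite-line-source approximation for a heated cable: at late times during a constant heat pulse, ΔT(t) ≈ (q / 4πλ)·ln(t) + C, so the slope of ΔT versus ln(t) yields the thermal conductivity λ. This is a standard heat-pulse-probe relation assumed here for illustration, not necessarily the authors' exact inversion, and the function name is hypothetical:

```python
import math

def conductivity_from_heating(times, delta_t, q):
    """Fit the late-time slope of Delta_T vs ln(t) and invert the
    line-source relation Delta_T ~ (q / (4*pi*lambda)) * ln(t) + C.

    times   : sample times (s), late-time portion of the heating phase
    delta_t : temperature rise at those times (K)
    q       : heating power per unit cable length (W/m)
    Returns the thermal conductivity lambda in W m^-1 K^-1.
    """
    n = len(times)
    lx = [math.log(t) for t in times]
    mx = sum(lx) / n
    my = sum(delta_t) / n
    slope = (sum((a - mx) * (b - my) for a, b in zip(lx, delta_t))
             / sum((a - mx) ** 2 for a in lx))
    return q / (4 * math.pi * slope)
```

Applying this independently at every point along the fiber turns one DTS heat pulse into a distributed conductivity profile, which a soil-specific λ(θ) relationship then converts to moisture.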

Ciocca, F.; Van De Giesen, N.; Assouline, S.; Huwald, H.; Lunati, I.

2012-12-01

307

Analysis of Drying Kinetics and Moisture Distribution in Convective Textile Fabric Drying

The drying process of crude cotton fabric is analyzed under two main aspects: analysis of moisture distribution inside the textile sheet, and analysis of certain operational convective drying process variables. The experimental apparatus consisted of a drying chamber in which samples of pure cotton textile were suspended and exposed to a convective hot air flow. The influence

Luiza Helena C. D. Sousa; Oswaldo. C. Motta Lima; Nehemias C. Pereira

2006-01-01

308

Composition and On Demand Deployment of Distributed Brain Activity Analysis Application on Global …

… are brain science and high-energy physics. The analysis of brain activity data gathered from the MEG … requires access to large-scale computational resources. The potential platform

Abramson, David

309

Global QCD Analysis, the Gluon Distribution, and High $E_t$ Inclusive Jet Data

We report on an extensive global QCD analysis of new DIS and hadronic inclusive jet production data emphasizing the impact of these recent data on the determination of the gluon distribution, and on the interpretation of the high $E_t$ jets highlighted by the CDF collaboration. This analysis results in (i) a better handle on the range of uncertainty of the

Wu-Ki Tung

1996-01-01

310

Evolution History of Asteroid Itokawa Based on Block Distribution Analysis

NASA Astrophysics Data System (ADS)

This work investigates trends in the global and regional distribution of blocks on asteroid 25143 Itokawa in order to discover new findings to better understand the history of this asteroid. Itokawa is a near-Earth object, and the first asteroid that was targeted for a sample return mission. Trends in block population provide new insights in regards to Itokawa's current appearance following the disruption of a possible parent body, and how its surface might have changed since then. Here blocks are defined as rocks or features with distinctive positive relief that are larger than a few meters in size. The size and distribution of blocks are measured by mapping the outline of the blocks using the Small Body Mapping Tool (SBMT) created by the Johns Hopkins University Applied Physics Laboratory [1]. The SBMT allows the user to overlap correctly geo-located Hayabusa images [2] onto the Itokawa shape model. This study provides additional inferences on the original disruption and subsequent re-accretion of Itokawa's "head" and "body" from block analyses. A new approach is taken by analyzing the population of blocks with respect to latitude for both Itokawa's current state, and a hypothetical elliptical body. Itokawa currently rotates approximately about its maximum moment of inertia, which is expected due to conservation of momentum and minimum energy arguments. After the possible disruption of the parent body of Itokawa, the "body" of Itokawa would have tended to a similar rotation. The shape of this body is made by removing the head of Itokawa and applying a semispherical cap. Using the method of [3] inertial properties of this object are calculated. With the assumption that this object had settled to its stable rotational axis, it is found that the pole axis could have been tilted about 13° away from the current axis in the direction opposite the head, equivalent to a 33 meter change in the center of mass. 
The results of this study provide a means of testing the hypothesis of whether or not Itokawa is a contact binary. References: [1] E. G. Kahn, et al. A tool for the visualization of small body data. In LPSC XLII, 2011. [2] A. Fujiwara, et al. The rubble-pile asteroid Itokawa as observed by Hayabusa. Science, 312(5778):1330-1334, June 2006. [3] A. F. Cheng, et al. Small-scale topography of 433 Eros from laser altimetry and imaging. Icarus, 155(1):51-74, 2002.

Mazrouei, Sara; Daly, Michael; Barnouin, Olivier; Ernst, Carolyn

2013-04-01

311

Numerical analysis of atomic density distribution in arc driven negative ion sources

The purpose of this study is to calculate the atomic (H⁰) density distribution in the JAEA 10 ampere negative ion source. A collisional radiative model is developed for the calculation of the H⁰ density distribution. The non-equilibrium feature of the electron energy distribution function (EEDF), which mainly determines the H⁰ production rate, is included by substituting the EEDF calculated from 3D electron transport analysis. In this paper, the H⁰ production rate, the ionization rate, and the density distribution in the source chamber are calculated. In the region where high energy electrons exist, H⁰ production and ionization are enhanced. The calculated H⁰ density, which neglects H⁰ transport, is relatively low in the upper region of the chamber. In the next step, transport effects should be taken into account to obtain a more realistic H⁰ distribution.

Yamamoto, T., E-mail: t.yamamoto@ppl.appi.keio.ac.jp; Shibata, T.; Hatayama, A. [Graduate School of Science and Technology, Keio University, 3-14-1 Hiyoshi, Yokohama 223-8522 (Japan)]; Kashiwagi, M.; Hanada, M. [Japan Atomic Energy Agency (JAEA), 801-1 Mukouyama, Naka 311-0193 (Japan)]; Sawada, K. [Faculty of Engineering, Shinshu University, 4-17-1 Wakasato, Nagano 380-8553 (Japan)]

2014-02-15

312

Motion synthesis and force distribution analysis for a biped robot.

In this paper, a method of generating biped robot motion using recorded human gait is presented. The recorded data were modified taking into account the velocities available from the robot drives. The data include only selected joint angles; the missing values were therefore obtained by enforcing the dynamic postural stability of the robot, which means obtaining an adequate motion trajectory of the so-called Zero Moment Point (ZMP). Also described is the authors' method of determining the distribution of ground reaction forces during the biped robot's dynamically stable walk. Following the description of the equations characterizing the dynamics of the robot's motion, the components of the ground reaction forces were determined symbolically, as were the coordinates of the points of contact between the robot's feet and the ground. The theoretical considerations have been supported by computer simulation and animation of the robot's motion, carried out using the Matlab/Simulink package and the Simulink 3D Animation Toolbox, which validated the proposed method. PMID:21761810

Trojnacki, Maciej T; Zielińska, Teresa

2011-01-01

313

Correlation Spectroscopy of Minor Species: Signal Purification and Distribution Analysis

We are performing experiments that use fluorescence resonance energy transfer (FRET) and fluorescence correlation spectroscopy (FCS) to monitor the movement of an individual donor-labeled sliding clamp protein molecule along acceptor-labeled DNA. In addition to the FRET signal sought from the sliding clamp-DNA complexes, the detection channel for FRET contains undesirable signal from free sliding clamp and free DNA. When multiple fluorescent species contribute to a correlation signal, it is difficult or impossible to distinguish between contributions from individual species. As a remedy, we introduce "purified FCS" (PFCS), which uses single molecule burst analysis to select a species of interest and extract the correlation signal for further analysis. We show that by expanding the correlation region around a burst, the correlated signal is retained and the functional forms of FCS fitting equations remain valid. We demonstrate the use of PFCS in experiments with DNA sliding clamps. We also introduce "single molecule FCS", which obtains diffusion time estimates for each burst using expanded correlation regions. By monitoring the detachment of weakly-bound 30-mer DNA oligomers from a single-stranded DNA plasmid, we show that single molecule FCS can distinguish between bursts from species that differ by a factor of 5 in diffusion constant.

Laurence, T A; Kwon, Y; Yin, E; Hollars, C; Camarero, J A; Barsky, D

2006-06-21

314

Magnetoencephalography (MEG), which acquires neuromagnetic fields in the brain, is a useful diagnostic tool in the presurgical evaluation of epilepsy. Previous studies have shown that MEG affects the planning of intracranial electroencephalography placement and correlates with surgical outcomes when a single dipole model is used. Spatiotemporal source analysis using distributed source models is an advanced method for analyzing MEG, and has recently been introduced for analyzing epileptic spikes. It has advantages over conventional single dipole analysis for obtaining accurate sources and for understanding the propagation of epileptic spikes. In this article, we review source analysis methods, describe the techniques of distributed source analysis and the interpretation of source distribution maps, and discuss the benefits and feasibility of this method in the evaluation of epilepsy. PMID:24574999

Tanaka, Naoaki; Stufflebeam, Steven M.

2014-01-01

315

NASA Technical Reports Server (NTRS)

A computer program was developed for calculating the statistical fast fracture reliability and failure probability of ceramic components. The program includes the two-parameter Weibull material fracture strength distribution model, using the principle of independent action for polyaxial stress states and Batdorf's shear-sensitive as well as shear-insensitive crack theories, all for volume distributed flaws in macroscopically isotropic solids. Both penny-shaped cracks and Griffith cracks are included in the Batdorf shear-sensitive crack response calculations, using Griffith's maximum tensile stress or critical coplanar strain energy release rate criteria to predict mixed mode fracture. Weibull material parameters can also be calculated from modulus of rupture bar tests, using the least squares method with known specimen geometry and fracture data. The reliability prediction analysis uses MSC/NASTRAN stress, temperature and volume output, obtained from the use of three-dimensional, quadratic, isoparametric, or axisymmetric finite elements. The statistical fast fracture theories employed, along with selected input and output formats and options, are summarized. An example problem to demonstrate various features of the program is included.
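The volume-flaw reliability calculation described above can be sketched in miniature for the uniaxial case. This is a simplified illustration, not the program itself: the element stresses, volumes, and Weibull parameters below are hypothetical, and the polyaxial principle-of-independent-action and Batdorf crack treatments are omitted.

```python
import math

def failure_probability(stresses, volumes, m, sigma0):
    """Two-parameter Weibull volume-flaw fast-fracture probability:
    P_f = 1 - exp(-sum_i V_i * (sigma_i / sigma0)^m),
    summing only elements in tension (compressive stresses do not
    contribute to fast fracture in this model).
    m is the Weibull modulus, sigma0 the characteristic strength."""
    risk = sum(v * (s / sigma0) ** m
               for s, v in zip(stresses, volumes) if s > 0)
    return 1.0 - math.exp(-risk)

# Hypothetical finite-element output: stresses (MPa) and volumes (mm^3)
stresses = [250.0, 180.0, 90.0, -40.0]   # last element is in compression
volumes = [2.0, 3.0, 5.0, 2.0]
pf = failure_probability(stresses, volumes, m=10.0, sigma0=400.0)
```

In a real analysis the stresses and volumes would come from a finite-element solution, as the abstract describes for MSC/NASTRAN output.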

Gyekenyesi, J. P.

1985-01-01

316

Analysis of spatial distribution in tropospheric temperature trends

NASA Astrophysics Data System (ADS)

Regional patterns in tropospheric and sea surface temperature (SST) trends are examined for the period 1979-2001 using MSU, NCEP-NCAR, ECMWF reanalyses, NOAA OI SST, and the CARDS radiosonde data set. Trends are estimated using a nonparametric approach. Substantial regional variability in temperature trends is seen in all data sets, with the magnitude of the variability (including substantial regions with cooling trends) far exceeding the average warming trend. The global analyses from MSU and the reanalyses are used to identify sampling problems in using the radiosonde network to infer global trends. Analysis of tropospheric temperature trends concurrent with trends in SST shows regions where the signs disagree, for both surface cooling and surface warming. Interpretation of these differing trends using the reanalyses suggests that the models used for the reanalyses are simulating the dynamics/thermodynamics that could lead to a tropospheric cooling in contrast to a surface warming (and vice versa).
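The abstract does not name the nonparametric trend estimator used; a common choice for climate time series is the Theil-Sen slope, the median of all pairwise slopes, which is robust to outliers. A minimal sketch under that assumption, with hypothetical anomaly data:

```python
import numpy as np

def theil_sen_slope(t, y):
    """Theil-Sen trend estimate: the median of all pairwise slopes
    (y_j - y_i) / (t_j - t_i) for i < j. Nonparametric and robust."""
    t, y = np.asarray(t, float), np.asarray(y, float)
    i, j = np.triu_indices(len(t), k=1)
    return np.median((y[j] - y[i]) / (t[j] - t[i]))

# Hypothetical monthly anomalies, 1979-2001, with a 0.02 K/yr trend plus noise
rng = np.random.default_rng(0)
years = np.arange(1979, 2002, 1 / 12)
anoms = 0.02 * (years - 1979) + rng.normal(0.0, 0.2, years.size)
trend = theil_sen_slope(years, anoms)  # K per year, close to 0.02
```

The median over all point pairs is what makes the estimate insensitive to the occasional bad radiosonde value, which is the usual motivation for a nonparametric approach here.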

Agudelo, Paula A.; Curry, Judith A.

2004-11-01

317

Differentiating cerebral lymphomas and GBMs featuring luminance distribution analysis

NASA Astrophysics Data System (ADS)

Differentiating lymphomas and glioblastoma multiformes (GBMs) is important for proper treatment planning. A number of works have been proposed but some problems remain. For example, many works depend on thresholding a single feature value, which is susceptible to noise, and non-typical cases that defeat such simple thresholding are easy to find. In other cases, experienced observers are required to extract the feature values or to provide some interactions with the system, which is costly. Even if experts are involved, inter-observer variance becomes another problem. In addition, most of the works use only one or a few slice(s), because 3D tumor segmentation is difficult and time-consuming. In this paper, we propose a tumor classification system that analyzes the luminance distribution of the whole tumor region. The 3D MRIs are segmented within a few tens of seconds by using our fast 3D segmentation algorithm. Then, the luminance histogram of the whole tumor region is generated. The typical cases are classified by histogram range thresholding and apparent diffusion coefficient (ADC) thresholding. The non-typical cases are learned and classified by a support vector machine (SVM). Most of the processing elements are semi-automatic except for the ADC value extraction. Therefore, even novice users can use the system easily and get almost the same results as experts. The experiments were conducted using 40 MRI datasets (20 lymphomas and 20 GBMs) including non-typical cases. The classification accuracy of the proposed method was 91.1% without the ADC thresholding and 95.4% with the ADC thresholding. In comparison, the baseline method, conventional ADC thresholding, yielded only 67.5% accuracy.

Yamasaki, Toshihiko; Chen, Tsuhan; Hirai, Toshinori; Murakami, Ryuji

2013-02-01

318

Recently, the distribution of radioactivity among a population of cells labeled with 210Po was shown to be well described by a log normal distribution function (J Nucl Med 47, 6 (2006) 1049-1058) with the aid of an autoradiographic approach. To ascertain the influence of Poisson statistics on the interpretation of the autoradiographic data, the present work reports a detailed statistical analysis of these data. Methods: The measured distributions of alpha particle tracks per cell were subjected to statistical tests with Poisson (P), log normal (LN), and Poisson-log normal (P-LN) models. Results: The LN distribution function best describes the distribution of radioactivity among cell populations exposed to 0.52 and 3.8 kBq/mL 210Po-citrate. When cells were exposed to 67 kBq/mL, the P-LN distribution function gave a better fit; however, the underlying activity distribution remained log normal. Conclusions: The present analysis generally provides further support for the use of LN distributions to describe the cellular uptake of radioactivity. Care should be exercised when analyzing autoradiographic data on activity distributions to ensure that Poisson processes do not distort the underlying LN distribution. PMID:16741316
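The distinction between pure Poisson counting and a Poisson-log normal mixture can be illustrated with a small simulation. The parameters below are hypothetical, not those of the 210Po experiments; the point is only that Poisson counting layered on a log normal activity distribution is overdispersed, which is the signature the statistical tests above look for.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical cell population: log normal activity sets the mean number
# of alpha tracks per cell; observed tracks are Poisson counts on top.
mean_tracks = rng.lognormal(mean=1.0, sigma=0.8, size=10_000)
observed = rng.poisson(mean_tracks)

# A pure Poisson process has variance == mean (Fano factor 1).
# The Poisson-log normal mixture is overdispersed (Fano factor > 1),
# which is what reveals the underlying LN activity distribution
# beneath the counting statistics.
fano = observed.var() / observed.mean()
```

Testing whether the Fano factor of the track-count data exceeds 1 is thus a quick check that Poisson noise alone cannot explain the spread.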

Neti, Prasad V.S.V.; Howell, Roger W.

2008-01-01

319

NASA Technical Reports Server (NTRS)

This paper presents a spherical harmonic analysis of the plasma velocity distribution function, using high angular, energy, and time resolution Cluster data obtained from the PEACE spectrometer instrument, to demonstrate how this analysis models the particle distribution function and its moments and anisotropies. The results show that spherical harmonic analysis produced a robust physical representation model of the velocity distribution function, resolving the main features of the measured distributions. From the spherical harmonic analysis, a minimum set of nine spectral coefficients was obtained, from which the moments (up to the heat flux), anisotropies, and asymmetries of the velocity distribution function were calculated. The spherical harmonic method provides a potentially effective "compression" technique that can be easily carried out onboard a spacecraft to determine the moments and anisotropies of the particle velocity distribution function for any species. These calculations were implemented using three different approaches, namely, the standard traditional integration, the spherical harmonic (SPH) spectral coefficients integration, and singular value decomposition (SVD) on the spherical harmonics. A comparison among the various methods shows that both the SPH and SVD approaches agree remarkably well with the standard moment integration method.

Gurgiolo, Chris; Vinas, Adolfo F.

2009-01-01

320

NASA Astrophysics Data System (ADS)

We have developed a multifractal approach to the analysis of size-frequency distributions of craters on planetary surfaces. We demonstrate the use of the method and study the relationship between the multifractal spectrum and the size-frequency distribution of craters. We show that if the multifractal spectrum of a crater size distribution can be approximated by a parabolic function, the size-frequency distribution of craters is lognormal. To demonstrate our approach, we analyzed distributions of craters on selected Phobos areas using Mars Express HRSC images. We demonstrate that the distributions of the craters are very well approximated by lognormal curves, as our technique suggests. Using the multifractal approach we show that the size-frequency distributions of small craters on the sub-Mars and anti-Mars sides of Phobos' surface appear to be different. We suggest that this approach may be used for analysis of size-frequency distributions of craters on other planetary bodies. This research was funded by the Ministry of Education and Science of the Russian Federation (MEGA-GRANT, Project name: "Geodesy, cartography and the study of planets and satellites", contract No. 11.G34.31.0021).

Uchaev, Dm. V.; Malinnikov, V. A.; Uchaev, D. V.; Oberst, J.

2012-04-01

321

NASA Technical Reports Server (NTRS)

The strength distribution of fibers within a two-dimensional laminate ceramic/ceramic composite, consisting of an eight harness satin weave of Nicalon continuous fiber within a chemically vapor infiltrated SiC matrix, was determined from analysis of the fracture mirrors of the fibers. Comparison of the fiber strengths and Weibull moduli with those of Nicalon fibers prior to incorporation into composites suggests that fiber damage may occur either during weaving or during another stage of composite manufacture. Observations also indicate that it is the higher-strength fibers which experience the greatest extent of fiber pullout and thus make a larger contribution to the overall composite toughness than do the weaker fibers.

Eckel, Andrew J.; Bradt, Richard C.

1989-01-01

322

Time-Score Analysis in Criterion-Referenced Tests. Final Report.

ERIC Educational Resources Information Center

The family of Weibull distributions was investigated as a model for the distributions of response times for items in computer-based criterion-referenced tests. The fit of these distributions was, with a few exceptions, good to excellent according to the Kolmogorov-Smirnov test. For a few relatively simple items, the two-parameter gamma…

Tatsuoka, Kikumi K.; Tatsuoka, Maurice M.

323

Incidence, histopathologic analysis and distribution of tumours of the hand

Background The aim of this large collective and meticulous study of primary bone tumours and tumourous lesions of the hand was to enhance the knowledge about findings of traumatological radiographs and improve differential diagnosis. Methods This retrospective study reviewed data collected from 1976 until 2006 in our Bone Tumour Registry. The following data was documented: age, sex, radiological investigations, tumour location, histopathological features including type and dignity of the tumour, and diagnosis. Results The retrospective analysis yielded 631 patients with a mean age of 35.9 ± 19.2 years. The majority of primary hand tumours were found in the phalanges (69.7%), followed by 24.7% in the metacarpals and 5.6% in the carpals. Only 10.6% of all cases were malignant. The major lesion type was cartilage derived at 69.1%, followed by bone cysts 11.3% and osteogenic tumours 8.7%. The dominant tissue type found in phalanges and metacarpals was of cartilage origin. Osteogenic tumours were predominant in carpal bones. Enchondroma was the most commonly detected tumour in the hand (47.1%). Conclusions All primary skeletal tumours can be found in the hand and are most often of cartilage origin followed by bone cysts and osteogenic tumours. This study furthermore raises awareness about uncommon or rare tumours and helps clinicians to establish proper differential diagnosis, as the majority of detected tumours of the hand are asymptomatic and accidental findings on radiographs. PMID:24885007

2014-01-01

324

Biomechanical Analysis of Force Distribution in Human Finger Extensor Mechanisms

The complexities of the function and structure of human fingers have long been recognised. The in vivo forces in the human finger tendon network during different activities are critical information for clinical diagnosis, surgical treatment, prosthetic finger design, and biomimetic hand development. In this study, we propose a novel method for in vivo force estimation for the finger tendon network by combining a three-dimensional motion analysis technique and a novel biomechanical tendon network model. The extensor mechanism of a human index finger is represented by an interconnected tendinous network moving around the phalanx's dorsum. A novel analytical approach based on the “Principle of Minimum Total Potential Energy” is used to calculate the forces and deformations throughout the tendon network of the extensor mechanism when subjected to an external load and with the finger posture defined by measurement data. The predicted deformations and forces in the tendon network are in broad agreement with the results obtained by previous experimental in vitro studies. The proposed methodology provides a promising tool for investigating the biomechanical function of complex interconnected tendon networks in vivo. PMID:25126576

Hu, Dan; Ren, Lei; Howard, David; Zong, Changfu

2014-01-01

325

Advancing Collaborative Climate Studies through Globally Distributed Geospatial Analysis

NASA Astrophysics Data System (ADS)

(note: acronym glossary at end of abstract) For scientists to have confidence in the veracity of data sets and computational processes not under their control, operational transparency must be much greater than previously required. Having a universally understood and machine-readable language for describing such things as the completeness of metadata, data provenance and uncertainty, and the discrete computational steps in a complex process takes on increased importance. OGC has been involved with technological issues associated with climate change since 2005, when we, along with the IEEE Committee on Earth Observation, began a close working relationship with GEO and GEOSS (http://earthobservations.org). GEO/GEOSS provide the technology platform to GCOS, which in turn represents the earth observation community to UNFCCC. OGC and IEEE are the organizers of the GEO/GEOSS Architecture Implementation Pilot (see http://www.ogcnetwork.net/AIpilot). This continuing work involves close collaboration with GOOS (Global Ocean Observing System) and WMO (World Meteorological Organization). This session reports on the findings of recent work within the OGC's community of software developers and users to apply geospatial web services to the climate studies domain. The value of this work is to evolve OGC web services, moving from data access and query to geo-processing and workflows. Two projects will be described: the GEOSS AIP-2 and the CCIP. AIP is a task of the GEOSS Architecture and Data Committee. Over its duration, two GEO Tasks defined the project: AIP-2 began as GEO Task AR-07-02, to lead the incorporation of contributed components consistent with the GEOSS Architecture, using a GEO Web Portal and a Clearinghouse search facility to access services through GEOSS Interoperability Arrangements in support of the GEOSS Societal Benefit Areas.
AIP-2 concluded as GEO Task AR-09-01b, to develop and pilot new process and infrastructure components for the GEOSS Common Infrastructure and the broader GEOSS architecture. Of specific interest to this session is the work on geospatial workflows, geo-processing, and data discovery and access. CCIP demonstrates standards-based interoperability between geospatial applications in the service of climate change analysis. CCIP is planned to be a yearly exercise. It consists of a network of online data services (WCS, WFS, SOS), analysis services (WPS, WCPS, WMS), and clients that exercise those services. In 2009, CCIP focuses on Australia and the initial application of existing OGC services to climate studies. The results of the 2009 CCIP will serve as requirements for more complex geo-processing services to be developed for CCIP 2010. The benefits of CCIP include accelerating the implementation of the GCOS, and building confidence that implementations using multi-vendor interoperable technologies can help resolve vexing climate change questions. Acronyms: AIP-2: Architecture Implementation Pilot, Phase 2; CCIP: Climate Challenge Integration Plugfest; GEO: Group on Earth Observations; GEOSS: Global Earth Observing System of Systems; GCOS: Global Climate Observing System; OGC: Open Geospatial Consortium; SOS: Sensor Observation Service; WCS: Web Coverage Service; WCPS: Web Coverage Processing Service; WFS: Web Feature Service; WMS: Web Mapping Service

Singh, R.; Percivall, G.

2009-12-01

326

The sensitivity of fracture location distribution in brittle materials

NASA Technical Reports Server (NTRS)

The Weibull weak-link theory allows for the computation of distribution functions for both the fracture location and the applied far-field stress. Several authors have suggested using the fracture location information from tests to infer Weibull parameters, and others have used the predictive capabilities of the theory to calculate average fracture locations for brittle bodies. By a simple set of example calculations, it is shown that the fracture location distribution function is distinctly more sensitive to perturbations in the stress state than the fracture stress distribution function is. In general, the average fracture location is more subject to stress perturbations than the average fracture stress. The results indicate that care must be exercised in applying fracture location theory.

Wetherhold, Robert C.

1991-01-01

327

In this study, a logical, stepwise, efficient approach was used to develop and validate particle size distribution analysis methods for 58 different pharmaceutical bulk powders in a timely fashion. Image analysis was used to determine particle morphology, and laser diffraction particle size distribution analysis was used to evaluate the dispersion medium, dispersion concentration, sonication time, and dispersion stability. Ruggedness validation was performed by two different analysts, on different days, with different instruments, on two preparations each of two different lots of material. It was determined that if the relative standard deviation (RSD) of the median volume diameters (d50) of the four preparations for each lot was below 20%, the method was suitably rugged for use in a quality control setting. Data for methyldopa, metoprolol tartrate, and metronidazole are presented as typical method validation results for three different modes of analysis. Data at three points (d10, d50, and d90) on the distributions were tabulated and evaluated for all 58 methods validated. The median volume diameter (d50) was found to be adequate for method validation. The approach rapidly generated valid, reproducible particle size distribution analysis methodology. PMID:9653752

Barber, D; Keuter, J; Kravig, K

1998-05-01

328

Design and analysis of a low-threshold polymer circular-grating distributed-feedback laser

NASA Astrophysics Data System (ADS)

A transfer-matrix technique is used to calculate the lasing thresholds of second-order circular-grating polymer lasers operating at 630 nm. By use of poly[2-methoxy-5-(2'-ethyl-hexyloxy)-p-phenylenevinylene] as an example polymer material, it is also shown how known optical properties of polymeric materials may be incorporated into the analysis of both the transverse waveguiding and the distributed feedback in circular-grating distributed-feedback polymer lasers.

Barlow, Guy F.; Shore, Alan; Turnbull, Graham A.; Samuel, Ifor D. W.

2004-12-01

329

We measured size distributions in model wetlands to detect stressor effects at the community level. Two experiments investigated the individual and combined effects of methyl mercury, chlorpyrifos, atrazine, monosodium methane arsonate, and UV-B light on the system. The statistical analysis of the metric using size distributions, which integrated information about organisms 0.2–4750 µm in size, detected effects in the planktonic

Robert M. Baca; Stephen T. Threlkeld

2000-01-01

330

An Economic Analysis of Alternative Sprinkler Irrigation Distribution Systems on the Southern High Plains of Texas. A thesis by John Christopher Pearce, submitted in partial fulfillment of the requirement for the degree of Master of Science, August 1973; major subject: Agricultural Economics. This presents a dual problem in that producers in this area do not have sufficient information on which to base decisions regarding irrigation distribution system investments, and the time required to manually evaluate alternative systems is prohibitive due...

Pearce, John Christopher

1973-01-01

331

Photon correlation spectroscopy (PCS) utilizes the Doppler frequency shift of photons scattered off particles undergoing Brownian motion to determine the size of colloids suspended in water. Photosedimentation analysis (PSA) measures the time-dependent change in optical density of a suspension of colloidal particles undergoing centrifugation. A description of both techniques, important underlying assumptions, and limitations are given. Results for a series of river water samples show that the colloid-size distribution means are statistically identical as determined by both techniques. This also is true of the mass median diameter (MMD), even though MMD values determined by PSA are consistently smaller than those determined by PCS. Because of this small negative bias, the skew parameters for the distributions are generally smaller for the PCS-determined distributions than for the PSA-determined distributions. Smaller polydispersity indices for the distributions are also determined by PCS.

Rees, T.F.

1990-01-01

332

NASA Astrophysics Data System (ADS)

In the energy landscape picture, the dynamics of glasses and crystals is usually decomposed into two separate contributions: interbasin and intrabasin dynamics. The intrabasin dynamics depends partially on the quadratic displacement distribution on a given metabasin. Here we show that such a distribution can be approximated by a Gamma function, with a mean that depends linearly on the temperature and on the inverse second moment of the density of vibrational states. The width of the distribution also depends on this last quantity, and thus the contribution of the boson peak in glasses is evident in the tail of the distribution function. It causes the distribution of the mean-square displacement to decay more slowly in glasses than in crystals. When a statistical analysis is performed over many energy basins, we obtain a Gaussian whose width is regulated by the mean inverse second moment of the density of states. Simulations performed in binary glasses are in agreement with this result.

Flores-Ruiz, Hugo M.; Naumis, Gerardo G.

2012-04-01

333

Analysis of the 3D distribution of stacked self-assembled quantum dots by electron tomography

The 3D distribution of self-assembled stacked quantum dots (QDs) is a key parameter to obtain the highest performance in a variety of optoelectronic devices. In this work, we have measured this distribution in 3D using a combined procedure of needle-shaped specimen preparation and electron tomography. We show that conventional 2D measurements of the distribution of QDs are not reliable, and only 3D analysis allows an accurate correlation between the growth design and the structural characteristics. PMID:23249477

2012-01-01

334

Inclusive electron scattering from nuclei at low momentum transfer (corresponding to x>1) and moderate Q^2 is dominated by quasifree scattering from nucleons. In the impulse approximation, the cross section can be directly connected to the nucleon momentum distribution via the scaling function F(y). The breakdown of the y-scaling assumptions in certain kinematic regions has prevented extraction of nucleon momentum distributions from such a scaling analysis. With a slight modification to the y-scaling assumptions, it is found that scaling functions can be extracted which are consistent with the expectations for the nucleon momentum distributions.

J. Arrington

2003-06-13

335

powerlaw: A Python Package for Analysis of Heavy-Tailed Distributions

Power laws are theoretically interesting probability distributions that are also frequently used to describe empirical data. In recent years, effective statistical methods for fitting power laws have been developed, but appropriate use of these techniques requires significant programming and statistical insight. In order to greatly decrease the barriers to using good statistical methods for fitting power law distributions, we developed the powerlaw Python package. This software package provides easy commands for basic fitting and statistical analysis of distributions. Notably, it also seeks to support a variety of user needs by being exhaustive in the options available to the user. The source code is publicly available and easily extensible. PMID:24489671
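The core fitting step in packages of this kind rests on the continuous power-law maximum-likelihood estimator of Clauset, Shalizi and Newman. A from-scratch sketch of that estimator (not the powerlaw package's own API) on synthetic data:

```python
import numpy as np

def powerlaw_mle_alpha(x, xmin):
    """Continuous power-law MLE (Clauset, Shalizi & Newman, 2009):
    alpha_hat = 1 + n / sum(ln(x_i / xmin)), over the tail x_i >= xmin."""
    tail = np.asarray(x, float)
    tail = tail[tail >= xmin]
    return 1.0 + tail.size / np.log(tail / xmin).sum()

# Draw from a pure power law with alpha = 2.5 by inverse-transform sampling:
# if U ~ Uniform(0,1), then xmin * (1 - U)^(-1/(alpha - 1)) is power-law.
rng = np.random.default_rng(7)
alpha_true, xmin = 2.5, 1.0
u = rng.random(50_000)
samples = xmin * (1.0 - u) ** (-1.0 / (alpha_true - 1.0))
alpha_hat = powerlaw_mle_alpha(samples, xmin)  # close to 2.5
```

A full analysis would also estimate xmin (e.g., by minimizing the Kolmogorov-Smirnov distance) and run likelihood-ratio comparisons against alternatives such as the lognormal, which is the extra machinery such packages automate.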

Alstott, Jeff; Bullmore, Ed; Plenz, Dietmar

2014-01-01

336

Size distribution of magnetic iron oxide nanoparticles using Warren-Averbach XRD analysis

NASA Astrophysics Data System (ADS)

We use the Fourier transform based Warren-Averbach (WA) analysis to separate the contributions to X-ray diffraction (XRD) profile broadening from crystallite size and microstrain for magnetic iron oxide nanoparticles. The profile shape of the column length distribution, obtained from WA analysis, is used to analyze the shape of the magnetic iron oxide nanoparticles. From the column length distribution, the crystallite size and its distribution are estimated for these nanoparticles, and compared with the size distribution obtained from dynamic light scattering measurements. The crystallite size and size distribution obtained from WA analysis are explained in terms of the experimental parameters employed in the preparation of these magnetic iron oxide nanoparticles. The variation of volume weighted diameter (Dv, from WA analysis) with saturation magnetization (Ms) fits well to a core-shell model wherein Ms = Mbulk(1 - 6g/Dv), with Mbulk the bulk magnetization of iron oxide and g the magnetically disordered shell thickness.
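The core-shell relation quoted above is simple to evaluate. The numbers below are illustrative only (a bulk saturation magnetization of roughly 92 emu/g is typical of magnetite), not values taken from the paper:

```python
def saturation_magnetization(m_bulk, g, d_v):
    """Core-shell model: Ms = M_bulk * (1 - 6*g/Dv), where g is the
    magnetically disordered (dead) shell thickness and Dv the
    volume-weighted particle diameter, in the same length units."""
    return m_bulk * (1.0 - 6.0 * g / d_v)

# Illustrative values: ~92 emu/g bulk magnetite, 1 nm dead layer, 12 nm particle
ms = saturation_magnetization(92.0, 1.0, 12.0)  # -> 46.0 emu/g
```

The factor 6g/Dv is the volume fraction of a thin shell of thickness g on a sphere of diameter Dv, which is why Ms approaches the bulk value as the particles grow.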

Mahadevan, S.; Behera, S. P.; Gnanaprakash, G.; Jayakumar, T.; Philip, J.; Rao, B. P. C.

2012-07-01

337

Laser diffraction (LD) and static image analysis (SIA) of rectangular particles [United States Pharmacopeia, USP30-NF25, General Chapter , Optical Microscopy.] have been systematically studied. To rule out sample dispersion and particle orientation as the root cause of differences in size distribution profiles, we immobilize powder samples on a glass plate by means of a dry disperser. For a defined region

A. P. Tinke; A. Carnicer; R. Govoreanu; G. Scheltjens; L. Lauwerysen; N. Mertens; K. Vanhoutte; M. E. Brewster

2008-01-01

338

Computer image analysis of wear debris with its applications to machine wear diagnosis

NASA Astrophysics Data System (ADS)

The ferrographic technique is a quasi-quantitative method widely used in the wear diagnosis of machines. In this paper, an improved ferrographic technique that makes use of a microcomputer image processing system and image analysis techniques was used to quantify the parameters of wear debris. The debris was sampled under several wear conditions. The Weibull function was applied to fit the actual distributions of size and roundness. It was shown that the transition region of the wear states can be predicted by a test of the mean and variance of the debris size distribution. Additionally, it was shown that the wear mode in certain mechanisms can be diagnosed by the identification and classification of debris according to its roundness, length ratio of major to minor axes, concavity, dispersion, linearity, etc.
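A hedged sketch of the Weibull fitting step for debris sizes, using SciPy's `weibull_min` in place of whatever routine the authors used; the sizes are simulated, not ferrographic measurements:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
shape_true, scale_true = 1.8, 12.0   # assumed debris-size distribution (µm)
sizes = stats.weibull_min.rvs(shape_true, scale=scale_true,
                              size=20_000, random_state=rng)

# Fix the location at 0 so only shape and scale are estimated by MLE.
shape_hat, loc, scale_hat = stats.weibull_min.fit(sizes, floc=0)

# A shift in the fitted mean/variance between samples taken at different
# times can then flag a transition between wear states.
mean_hat = stats.weibull_min.mean(shape_hat, loc=0, scale=scale_hat)
```

Comparing fitted parameters across sampling epochs gives the mean/variance test of the size distribution that the abstract describes.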

Zuo, Hong-Fu; Zhan, Jing-Lan; Li, Bao-Chen; Tao, Gui-Chun

339

Consideration of tip speed limitations in preliminary analysis of minimum COE wind turbines

NASA Astrophysics Data System (ADS)

A relation between cost of energy (COE), maximum allowed tip speed, and rated wind speed is obtained for wind turbines with a given goal rated power. The wind regime is characterised by the corresponding parameters of the probability density function of wind speed. The non-dimensional characteristics of the rotor (number of blades, the blade radial distributions of local solidity, twist angle, and airfoil type) play the role of parameters in this relation. The COE is estimated using a cost model commonly used by designers. This cost model requires basic design data such as the rotor radius and the ratio between the hub height and the rotor radius. Certain design options (DO) related to the technology of the power plant, tower and blades are also required as inputs. The function obtained for the COE can be explored to find those values of rotor radius that give rise to minimum cost of energy for a given wind regime as the tip speed limitation changes. The analysis reveals that iso-COE lines evolve parallel to iso-radius lines for large values of limit tip speed, but that this is not the case for small values of the tip speed limits. It is concluded that, as the tip speed limit decreases, the optimum decision for keeping minimum COE values can be: (a) reducing the rotor radius for sites with a high Weibull scale parameter, or (b) increasing the rotor radius for sites with a low Weibull scale parameter.
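The role of the Weibull scale parameter in the available wind energy can be illustrated with the mean cubed wind speed, which has the closed form E[v³] = c³Γ(1 + 3/k) for a Weibull(k, c) regime. A small cross-check under assumed (k, c) values, not data from the paper:

```python
import math
import numpy as np

k, c = 2.0, 8.0   # assumed Weibull shape and scale (m/s) for illustration

# Closed form for the mean cubed wind speed, which drives wind power density.
closed_form = c**3 * math.gamma(1.0 + 3.0 / k)

# Cross-check by numerically integrating v^3 against the Weibull pdf.
v = np.linspace(1e-9, 60.0, 200_000)
pdf = (k / c) * (v / c) ** (k - 1) * np.exp(-((v / c) ** k))
numeric = np.sum(v**3 * pdf) * (v[1] - v[0])
```

Because E[v³] grows as c³, a site's Weibull scale parameter enters the COE trade-off far more strongly than the mean wind speed alone would suggest.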

Cuerva-Tejero, A.; Yeow, T. S.; Lopez-Garcia, O.; Gallego-Castillo, C.

2014-12-01

340

Sensitivity analysis of CLIMEX parameters in modelling potential distribution of Lantana camara L.

A process-based niche model of L. camara L. (lantana), a highly invasive shrub species, was developed to estimate its potential distribution using CLIMEX. Model development was carried out using its native and invasive distribution and validation was carried out with the extensive Australian distribution. A good fit was observed, with 86.7% of herbarium specimens collected in Australia occurring within the suitable and highly suitable categories. A sensitivity analysis was conducted to identify the model parameters that had the most influence on lantana distribution. The changes in suitability were assessed by mapping the regions where the distribution changed with each parameter alteration. This allowed an assessment of where, within Australia, the modification of each parameter was having the most impact, particularly in terms of the suitable and highly suitable locations. The sensitivity of various parameters was also evaluated by calculating the changes in area within the suitable and highly suitable categories. The limiting low temperature (DV0), limiting high temperature (DV3) and limiting low soil moisture (SM0) showed highest sensitivity to change. The other model parameters were relatively insensitive to change. Highly sensitive parameters require extensive research and data collection to be fitted accurately in species distribution models. The results from this study can inform more cost effective development of species distribution models for lantana. Such models form an integral part of the management of invasive species and the results can be used to streamline data collection requirements for potential distribution modelling. PMID:22815881

Taylor, Subhashni; Kumar, Lalit

2012-01-01

342

Using CLIMEX and the Taguchi Method, a process-based niche model was developed to estimate potential distributions of Phoenix dactylifera L. (date palm), an economically important crop in many countries. Development of the model was based on both its native and invasive distribution and validation was carried out in terms of its extensive distribution in Iran. To identify model parameters having greatest influence on distribution of date palm, a sensitivity analysis was carried out. Changes in suitability were established by mapping of regions where the estimated distribution changed with parameter alterations. This facilitated the assessment of certain areas in Iran where parameter modifications impacted the most, particularly in relation to suitable and highly suitable locations. Parameter sensitivities were also evaluated by the calculation of area changes within the suitable and highly suitable categories. The low temperature limit (DV2), high temperature limit (DV3), upper optimal temperature (SM2) and high soil moisture limit (SM3) had the greatest impact on sensitivity, while other parameters showed relatively less sensitivity or were insensitive to change. For an accurate fit in species distribution models, highly sensitive parameters require more extensive research and data collection methods. Results of this study demonstrate a more cost effective method for developing date palm distribution models, an integral element in species management, and may prove useful for streamlining requirements for data collection in potential distribution modeling for other species as well. PMID:24722140

Shabani, Farzin; Kumar, Lalit

2014-01-01

343

State and local policymakers show increasing interest in spurring the development of customer-sited distributed generation (DG), in particular solar photovoltaic (PV) markets. Prompted by that interest, this analysis examines the use of state policy as a tool to support the development of a robust private investment market. This analysis builds on previous studies that focus on government subsidies to reduce installation costs of individual projects and provides an evaluation of the impacts of policies on stimulating private market development.

Doris, E.; Krasko, V.A.

2012-10-01

344

The analysis of reaction time (RT) distributions has become a recognized standard in studies on the stimulus response correspondence (SRC) effect as it allows exploring how this effect changes as a function of response speed. In this study, we compared the spatial SRC effect (the classic Simon effect) with the motion SRC effect using RT distribution analysis. Four experiments were conducted, in which we manipulated factors of space position and motion for stimulus and response, in order to obtain a clear distinction between positional SRC and motion SRC. Results showed that these two types of SRC effects differ in their RT distribution functions as the space positional SRC effect showed a decreasing function, while the motion SRC showed an increasing function. This suggests that different types of codes underlie these two SRC effects. Potential mechanisms and processes are discussed. PMID:24605178
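A minimal sketch of the RT-distribution comparison: compute the SRC effect at several quantiles (a delta plot) and check whether it increases or decreases with response speed. The RTs and effect sizes below are simulated, not the experimental data:

```python
import numpy as np

rng = np.random.default_rng(7)
# Hypothetical RTs (ms): incongruent trials are slower and more variable,
# which produces an effect that grows toward the slow tail.
congruent = rng.normal(450.0, 60.0, 5000)
incongruent = rng.normal(480.0, 80.0, 5000)

# Effect size at quantiles 0.1 ... 0.9 (the "delta plot").
qs = np.arange(0.1, 1.0, 0.2)
delta = np.quantile(incongruent, qs) - np.quantile(congruent, qs)

# An increasing function (as for the motion SRC in this study) vs. a
# decreasing one (as for the positional Simon effect) distinguishes the
# two effect types.
increasing = bool(np.all(np.diff(delta) > 0))
```

Real analyses would compute these quantiles per participant and average them (Vincentizing), but the per-quantile effect is the quantity of interest either way.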

Styrkowiec, Piotr; Szczepanowski, Remigiusz

2013-01-01

345

Cartilage tissue engineering can provide a valuable tool for controlled studies of tissue development. As an example, analysis of the spatial distribution of glycosaminoglycans (GAG) in sections of cartilaginous tissues engineered under different culture conditions could be used to correlate the effects of environmental factors with the structure of the regenerated tissue. In this paper we describe a computer-based technique

Ivan Martin; Bojana Obradovic; Lisa E. Freed; Gordana Vunjak-Novakovic

1999-01-01

346

A Queuing Analysis of Server Sharing Collectives for Content Distribution. Daniel Villela, Dan... ...sharing its resources to a point where its ability to service its own clients is unsatisfactory. ...of servers to closely meet client demand for the content objects it hosts. However, selecting the best...

347

Nonsmooth analysis and sonar-based implementation of distributed coordination algorithms

Craig L... ...mobile robots equipped with sonar. We develop novel approaches for improving single point sonar scans... equipped with rotating sonar sensors is used. Successful implementation of the algorithms is largely...

Brennan, Sean

348

MAINE GAP ANALYSIS VERTEBRATE DATA -PART II: DISTRIBUTION, HABITAT RELATIONS, AND STATUS OF

Randall B. Boone, Department of Wildlife Ecology and Maine Cooperative Fish and Wildlife Research Unit, University of Maine, Orono, ME 04469-5755, and William B. Krohn, USGS Biological...

Boone, Randall B.

349

MAINE GAP ANALYSIS VERTEBRATE DATA -PART I: DISTRIBUTION, HABITAT RELATIONS, AND STATUS OF

Randall B. Boone, Department of Wildlife Ecology and Maine Cooperative Fish and Wildlife Research Unit, University of Maine, Orono, ME 04469-5755, and William B. Krohn...

Boone, Randall B.

350

...and increase the reliability of orientation solutions. The misorientation distribution function shows a strong... (2004). With automated serial sectioning using focused ion beam milling, one can obtain three..., thereby making it possible to study polycrystalline materials not possible by SEM-EBSD analysis (Dingley...

Ferreira, Paulo J.

351

Distributed Modeling and Economic Analysis of Erosion in GIS for Watershed Restoration. Jan Boll, Washington State University, Pullman, Washington; USDA-ARS, Pullman, Washington; USDA-NRCS, Boise, Idaho 83844-0904. Keywords: integrated systems approach, hydrology, spatial variability, water quality, crop

Walter, M.Todd

352

The Beach Study: An Empirical Analysis of the Distribution of Coastal Property Values

...empirical evidence suggests that coastal properties, and particularly those proximate to a beach, have...

Omiecinski, Curtis

353

Distributed Sensor Analysis for Fault Detection in Tightly-Coupled Multi-Robot Team Tasks

Xingyan Li and Lynne E. Parker, Proc. of IEEE International Conference on Robotics and Automation, Kobe, Japan. ...multi-robot team tasks. While the centralized version of SAFDetection was shown to be successful...

Parker, Lynne E.

354

Particle size distribution (PSD) analysis of the Gregory mine coal rejects, over a period of time is presented. The PSD has important implications on several processes contributing to the acid mine drainage. In this paper, correlation between the particle breakdown in the coal rejects and the rate of oxidation of sulphides is presented. Difference between the initial and the subsequent values

Sheila Devasahayam

2007-01-01

355

Quo vadis efficiency analysis of water distribution? A comparative literature review

Recognizing the growing importance of scientific benchmarking in water distribution, we provide a comprehensive survey of the available literature. We begin with a discussion about the (limited) use of benchmarking in the regulation of UK water utilities, and then extend the analysis to regulated water sectors in other countries. We find no clear impact of public or private ownership; instead,

Matthias Walter; Astrid Cullmann; Christian von Hirschhausen; Robert Wand; Michael Zschille

2009-01-01

356

The distribution of knowledge (by scientists) and data sources (advanced scientific instruments), and the need of large-scale computational resources for analyzing massive scientific data are two major problems commonly observed in scientific disciplines. The two popular scientific disciplines of this nature are brain science and high-energy physics. The analysis of brain activity data gathered from the MEG (Magnetoencephalography) instrument is

R. Buyya; S. Date; Y. Mizuno-Matsumoto; S. Venugopal; D. Abramson

357

Systematic analysis of transverse momentum distribution and non-extensive thermodynamics theory

A systematic analysis of transverse momentum distribution of hadrons produced in ultrarelativistic p+p and A+A collisions is presented. We investigate the effective temperature and the entropic parameter from the non-extensive thermodynamic theory of strong interaction. We conclude that the existence of a limiting effective temperature and of a limiting entropic parameter is in accordance with experimental data.

Sena, I.; Deppman, A. [Instituto de Fisica - Universidade de Sao Paulo (Brazil)

2013-03-25

358

Overlap distributions and taxonomy analysis of spin glass states with equal weights

N. Parga. Résumé (translated from French): We use numerical taxonomy techniques to verify the ultrametricity of the... between samples disappear. Abstract: Techniques of numerical taxonomy are used to make...

Université Paris-Sud XI

359

Technology Transfer Automated Retrieval System (TEKTRAN)

Starch was isolated from flour of four wheats representing hard red winter (Karl), hard red spring (Gunner), durum (Belfield 3), and spelt (WK 86035-8) wheat classes. Digital image analysis (IA) coupled to light microscopy was used to determine starch size distributions where the volume of granules...

360

Analysis and minimization of losses in electric-power transmission and distribution systems

A comprehensive approach is taken to the loss analysis and minimization in electrical power systems at both transmission and distribution levels. New methods were developed to calculate the power and energy losses in bulk-power transmission systems. The methods incorporate the effect of the actual on-line control and operation of the system to improve the accuracy of calculations. To calculate the

Baran

1988-01-01

361

Does a powder surface contain all necessary information for particle size distribution analysis?

The aim of this study was to utilise a new approach where digital image information is used in the characterisation of particle size distributions of a large set of pharmaceutical powders. A novel optical set-up was employed to create images and calculate a stereometric parameter from the digital images of powder surfaces. Analysis was made of 40 granule batches with

Niklas Laitinen; Osmo Antikainen; Jouko Yliruusi

2002-01-01

362

Measurement and analysis of residence time distribution (RTD) data are the focus of this study, in which data collection methods were developed specifically for the pretreatment reactor environment. Augmented physical sampling and automated online detection methods were developed and applied. Both the measurement techniques themselves and the resulting RTD data are presented and discussed.

Sievers, D.; Kuhn, E.; Tucker, M.; Stickel, J.; Wolfrum, E.

2013-06-01

363

This paper considers the robust stability analysis of cellular neural networks with discrete and distributed delays. Based on the Lyapunov stability theory and linear matrix inequality (LMI) technique, a novel stability criterion guaranteeing the global robust convergence of the equilibrium point is derived. The criterion can be solved easily by various convex optimization algorithms. An example is given to illustrate

Ju H. Park

2007-01-01

364

GenoMap: A Distributed System for Unifying Genotyping and Genetic Linkage Analysis

...to be managed include: informative sets of polymorphic markers; databases of patient demographic, pedigree... Todd E. Scheetz, Dept. of Electrical and Computer Engineering and the Dept. of Genetics, University of Iowa.

Casavant, Tom

365

Qualitative Analysis of Distributed Physical Systems with Applications to Control Synthesis

...fields. Analyzing and controlling these physical processes and systems are common tasks in many... and controllability. This paper develops an ontological abstraction and a structure-based design mechanism...

Bailey-Kellogg, Chris

366

Considering carcinogenesis as a microevolutionary process, best described in the context of metapopulation dynamics, provides the basis for theoretical and empirical studies that indicate it is possible to estimate the relative contribution of genetic instability and selection to the process of tumor formation. We show that mutational load distribution analysis (MLDA) of DNA found in pancreatic fluids yields biometrics that

Gemma Tarafa; David Tuck; Daniela Ladner; Mark Topazian; Randall Brand; Carolyn Deters; Victor Moreno; Gabriel Capella; Henry Lynch; Paul Lizardi; Jose Costa

2008-01-01

367

Mechanical Response of Silk Crystalline Units from Force-Distribution Analysis

Senbo Xiao, Wolfram... ...of silk fibers is thought to be caused by embedded crystalline units acting as cross links of silk... ...strain relationships of four different models, from spider and Bombyx mori silk peptides, in antiparallel and parallel...

Gräter, Frauke

368

Distributional analysis of regional benefits and cost of air quality control

The methodology and results of an analysis of benefits and costs of air quality control for an urban region in Florida are given. The machinery used considers the spatial distribution of: (a) emission sources; (b) the ambient levels resulting from local meteorological conditions and geographic features; and (c) the socioeconomic characteristics of the impacted population groups. This facilitates an examination

E. T. Loehman; S. V. Berg; A. A. Arroyo; R. A. Hedinger; J. M. Schwartz; M. E. Shaw; R. W. Fahien; V. H. De; R. P. Fishe; D. E. Rio

1979-01-01

369

Generalized Analysis of a Distributed Energy Efficient Algorithm for Change Detection

Taposh..., Dept. of ECE, Indian Institute of Science, Bangalore, India (vinod@ece.iisc.ernet.in). ACM, 2009, Tenerife, Canary Islands, Spain. An energy efficient... model finds...

Sharma, Vinod

370

An Analysis of Variance Approach for the Estimation of Response Time Distributions in Tests

ERIC Educational Resources Information Center

Generalizability theory and analysis of variance methods are employed, together with the concept of objective time pressure, to estimate response time distributions and the degree of time pressure in timed tests. By estimating response time variance components due to person, item, and their interaction, and fixed effects due to item types and…
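A minimal variance-components sketch in the spirit of this ANOVA approach, assuming a fully crossed persons × items design with one response time per cell; the data and variance magnitudes are simulated, not from the study:

```python
import numpy as np

rng = np.random.default_rng(3)
P, I = 300, 40                                     # persons, items
sd_person, sd_item, sd_error = 20.0, 10.0, 50.0    # assumed components (ms)

# y_ij = mu + person_i + item_j + error_ij
y = (600.0
     + rng.normal(0.0, sd_person, (P, 1))
     + rng.normal(0.0, sd_item, (1, I))
     + rng.normal(0.0, sd_error, (P, I)))

grand = y.mean()
row = y.mean(axis=1, keepdims=True)   # person means
col = y.mean(axis=0, keepdims=True)   # item means

# Mean squares for the person effect and the person-by-item residual.
ms_person = I * ((row - grand) ** 2).sum() / (P - 1)
ms_error = ((y - row - col + grand) ** 2).sum() / ((P - 1) * (I - 1))

# E[MS_person] = sigma_e^2 + I * sigma_p^2, so:
var_person_hat = (ms_person - ms_error) / I
var_error_hat = ms_error
```

The person and residual components recovered here are the ingredients from which generalizability coefficients (and, in the paper's application, time-pressure indices) are built.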

Attali, Yigal

2010-01-01

371

Monte Carlo simulation and theoretical modeling are used to study the statistical failure modes in unidirectional composites consisting of elastic fibers in an elastic matrix. Both linear and hexagonal fiber arrays are considered, forming 2D and 3D composites, respectively. Failure is idealized using the chain-of-bundles model in terms of ...-bundles of length ..., which is the length-scale of fiber load

SIVASAMBU MAHESH; S. LEIGH PHOENIX; IRENE J. BEYERLEIN

2002-01-01

372

NASA Astrophysics Data System (ADS)

We consider a constitutive model for polycrystalline ice, which contains delayed-elastic and viscous deformations, and a damage variable. The damage variable is coupled to the delayed-elastic deformation by a fiber bundle ansatz. We construct an isotropic theory, which can be calibrated with experimental data. Furthermore, we generalize the theory to a damage model in terms of rank-four tensors. This general model allows the evolution of anisotropic damage.

Keller, Arne; Hutter, Kolumban

2014-04-01

373

NASA Astrophysics Data System (ADS)

Tests and calibration of sprayers are considered very important tasks for reducing chemical use in agriculture and for improving plant phytosanitary protection. A reliable, affordable and easy-to-use method to observe the distribution in the field is required, and infrared thermoimage analysis can be considered a potential method based on non-contact imaging technologies. The basic idea is that applying water colder (10 °C less) than the leaf surface makes it possible to distinguish and measure the targeted areas by means of an infrared thermoimage analysis based on significant and time-persistent thermal differences. Trials were carried out on a hedge of Prunus laurocerasus, 2.1 m in height with a homogeneous canopy. A trailed orchard sprayer was employed with different spraying configurations. A FLIR S40 thermocamera was used to acquire (@ 50 Hz) thermal videos, in a fixed position, at a frame rate of 10 images/s, for nearly 3 min. Distribution quality was compared to the temperature differences between pre-treatment and post-treatment (ΔT) obtained from the thermal images, according to two analyses: the time trend of ΔT average values for different hedge heights, and imaging of the ΔT distribution and area coverage by segmentation with k-means clustering after 30 s of spraying. The chosen spraying configuration presented quite good distribution for the entire hedge height, excluding the lower (0-1 m from the ground) and the upper part (>1.9 m). Through segmentation of the ΔT image by k-means clustering, it was possible to obtain a more detailed visual appreciation of distribution quality across the entire hedge. The thermoimage analysis revealed interesting potential for evaluating distribution quality from orchard sprayers.

Menesatti, P.; Biocca, M.

2007-09-01

374

This paper explores the possibility of using commercial software for thermoluminescence glow curve deconvolution (GCD) analysis. The program PEAKFIT has been used to perform GCD analysis of complex glow curves of quartz and dosimetric materials. First-order TL peaks were represented successfully using the Weibull distribution function. Second-order and general-order TL peaks were represented accurately by using the Logistic asymmetric functions with varying symmetry parameters. Analytical expressions were derived for determining the energy E from the parameters of the Logistic asymmetric functions. The accuracy of these analytical expressions for E was tested for a wide variety of kinetic parameters and was found to be comparable to the commonly used expressions in the TL literature. The effectiveness of fit of the analytical functions used here was tested using the figure of merit (FOM) and was found to be comparable to the accuracy of recently published GCD expressions for first- and general-order kinetics. PMID:12382713
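A sketch of the GCD idea for a first-order peak: represent the glow peak with a Weibull-shaped function and fit its parameters by nonlinear least squares. The peak parameters and the use of SciPy's `curve_fit` (rather than PEAKFIT) are illustrative assumptions:

```python
import numpy as np
from scipy.optimize import curve_fit

def weibull_peak(T, A, T0, w, b):
    """Weibull-shaped glow peak: zero below onset T0, peaked above it."""
    x = np.clip((T - T0) / w, 1e-12, None)
    return A * (b / w) * x ** (b - 1) * np.exp(-(x ** b))

# Synthetic first-order peak with invented parameters (not from the paper).
T = np.linspace(300.0, 500.0, 400)                 # temperature grid (K)
signal = weibull_peak(T, A=5000.0, T0=330.0, w=60.0, b=2.2)

# Deconvolution step: recover the peak parameters from the curve.
popt, _ = curve_fit(weibull_peak, T, signal, p0=[5200.0, 329.0, 58.0, 2.1])
A_hat, T0_hat, w_hat, b_hat = popt
```

In the paper's workflow, analytical expressions then map such fitted shape parameters to the activation energy E; here we only verify that the Weibull form can be deconvolved reliably.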

Pagonis, V; Kitis, G

2002-01-01

375

Efficient, Distributed and Interactive Neuroimaging Data Analysis Using the LONI Pipeline.

The LONI Pipeline is a graphical environment for construction, validation and execution of advanced neuroimaging data analysis protocols (Rex et al., 2003). It enables automated data format conversion, allows Grid utilization, facilitates data provenance, and provides a significant library of computational tools. There are two main advantages of the LONI Pipeline over other graphical analysis workflow architectures. It is built as a distributed Grid computing environment and permits efficient tool integration, protocol validation and broad resource distribution. To integrate existing data and computational tools within the LONI Pipeline environment, no modification of the resources themselves is required. The LONI Pipeline provides several types of process submissions based on the underlying server hardware infrastructure. Only workflow instructions and references to data, executable scripts and binary instructions are stored within the LONI Pipeline environment. This makes it portable, computationally efficient, distributed and independent of the individual binary processes involved in pipeline data-analysis workflows. We have expanded the LONI Pipeline (V.4.2) to include server-to-server (peer-to-peer) communication and a 3-tier failover infrastructure (Grid hardware, Sun Grid Engine/Distributed Resource Management Application API middleware, and the Pipeline server). Additionally, the LONI Pipeline provides three layers of background-server executions for all users/sites/systems. These new LONI Pipeline features facilitate resource-interoperability, decentralized computing, construction and validation of efficient and robust neuroimaging data-analysis workflows. Using brain imaging data from the Alzheimer's Disease Neuroimaging Initiative (Mueller et al., 2005), we demonstrate integration of disparate resources, graphical construction of complex neuroimaging analysis protocols and distributed parallel computing. 
The LONI Pipeline, its features, specifications, documentation and usage are available online (http://Pipeline.loni.ucla.edu). PMID:19649168

Dinov, Ivo D; Van Horn, John D; Lozev, Kamen M; Magsipoc, Rico; Petrosyan, Petros; Liu, Zhizhong; Mackenzie-Graham, Allan; Eggert, Paul; Parker, Douglas S; Toga, Arthur W

2009-01-01

376

Fractal analysis of the dark matter and gas distributions in the Mare-Nostrum universe

We develop a method of multifractal analysis of N-body cosmological simulations that improves on the customary counts-in-cells method by taking special care of the effects of discreteness and large scale homogeneity. The analysis of the Mare-Nostrum simulation with our method provides strong evidence of self-similar multifractal distributions of dark matter and gas, with a halo mass function that is of Press-Schechter type but has a power-law exponent -2, as corresponds to a multifractal. Furthermore, our analysis shows that the dark matter and gas distributions are indistinguishable as multifractals. To determine if there is any gas biasing, we calculate the cross-correlation coefficient, with negative but inconclusive results. Hence, we develop an effective Bayesian analysis connected with information theory, which clearly demonstrates that the gas is biased in a long range of scales, up to the scale of homogeneity. However, entropic measures related to the Bayesian analysis show that this gas bias is small (in a precise sense) and is such that the fractal singularities of both distributions coincide and are identical. We conclude that this common multifractal cosmic web structure is determined by the dynamics and is independent of the initial conditions.
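The counts-in-cells idea that the multifractal method refines can be sketched with plain box counting: the number of occupied cells N(ε) scales as ε^(−D). For a homogeneous 2D point set (a stand-in for the simulation snapshot) the estimate should approach D = 2:

```python
import numpy as np

def occupied(points, n_cells):
    """Count occupied cells on an n_cells x n_cells grid over [0, 1]^2."""
    idx = np.minimum((points * n_cells).astype(int), n_cells - 1)
    return np.unique(idx, axis=0).shape[0]

# Homogeneous 2D point set; a clustered (fractal) set would give D < 2.
rng = np.random.default_rng(5)
pts = rng.random((100_000, 2))

n1, n2 = 8, 16
N1, N2 = occupied(pts, n1), occupied(pts, n2)
dim = np.log(N2 / N1) / np.log(n2 / n1)
```

The paper's multifractal analysis generalizes this single exponent to a spectrum of dimensions and adds corrections for discreteness and large-scale homogeneity, which plain box counting lacks.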

Gaite, José, E-mail: jose.gaite@upm.es [Instituto de Microgravedad IDR, EIAE, Universidad Politécnica de Madrid, Pza. Cardenal Cisneros 3, E-28040 Madrid (Spain)

2010-03-01

377

Spatial Latent Class Analysis Model for Spatially Distributed Multivariate Binary Data

A spatial latent class analysis model that extends the classic latent class analysis model by adding spatial structure to the latent class distribution through the use of the multinomial probit model is introduced. Linear combinations of independent Gaussian spatial processes are used to develop multivariate spatial processes that are underlying the categorical latent classes. This allows the latent class membership to be correlated across spatially distributed sites and it allows correlation between the probabilities of particular types of classes at any one site. The number of latent classes is assumed fixed but is chosen by model comparison via cross-validation. An application of the spatial latent class analysis model is shown using soil pollution samples where 8 heavy metals were measured to be above or below government pollution limits across a 25 square kilometer region. Estimation is performed within a Bayesian framework using MCMC and is implemented using the OpenBUGS software. PMID:20161235

Wall, Melanie M.; Liu, Xuan

2009-01-01

378

NASA Astrophysics Data System (ADS)

As a major aspect of marine pollution, oil release into the sea has serious biological and environmental impacts. Among remote sensing systems (which offer a non-destructive investigation method), synthetic aperture radar (SAR) can provide valuable synoptic information about the position and size of an oil spill, owing to its wide area coverage and day/night, all-weather capability. In this paper we present a new automated method for oil-spill monitoring, based on the combination of a Weibull Multiplicative Model and machine learning techniques to differentiate between dark spots and the background. First, a filter based on the Weibull Multiplicative Model is applied to each sub-image. Second, the sub-image is segmented by two different neural network techniques (Pulse-Coupled Neural Networks and Multilayer Perceptron Neural Networks). As the last step, a very simple filtering process is used to eliminate false targets. The proposed approaches were tested on 20 ENVISAT and ERS2 images which contained dark spots. The same parameters were used in all tests. For the overall dataset, average accuracies of 94.05% and 95.20% were obtained for the PCNN and MLP methods, respectively. The average computational time for dark-spot detection in a 256 × 256 image is about 4 s for PCNN segmentation using IDL software, currently the fastest in this field. Our experimental results demonstrate that the proposed approach is fast, robust and effective, and can be applied to future spaceborne SAR images.

Taravat, A.; Del Frate, F.

2013-09-01

379

Spatial sensitivity analysis of snow cover data in a distributed rainfall-runoff model

NASA Astrophysics Data System (ADS)

As the availability of spatially distributed data sets for distributed rainfall-runoff modelling is growing strongly, more attention should be paid to the influence of data quality on the calibration. While much progress has been made on using distributed data in simulations of hydrological models, the sensitivity of model results to spatial data is not well understood. In this paper we develop a spatial sensitivity analysis (SA) method for snow cover fraction (SCF) input data for a distributed rainfall-runoff model, to investigate whether the model is differently subjected to SCF uncertainty in different zones of the model. The analysis focused on the relation between the SCF sensitivity and the physical and spatial parameters and processes of a distributed rainfall-runoff model. The methodology is tested for the Biebrza River catchment, Poland, for which a distributed WetSpa model is set up to simulate two years of daily runoff. The SA uses the Latin-Hypercube One-factor-At-a-Time (LH-OAT) algorithm with different response functions for each 4 km × 4 km snow zone. The results show that the spatial patterns of sensitivity can be easily interpreted by the co-occurrence of different environmental factors such as geomorphology, soil texture, land use, precipitation and temperature. Moreover, the spatial pattern of sensitivity under different response functions is related to different spatial parameters and physical processes. The results clearly show that the LH-OAT algorithm is suitable for the spatial sensitivity analysis approach and that the SCF is spatially sensitive in the WetSpa model.

Berezowski, T.; Nossent, J.; Chormański, J.; Batelaan, O.

2014-10-01

380

Can Data Recognize Its Parent Distribution?

This study is concerned with model selection of lifetime and survival distributions arising in engineering reliability or in the medical sciences. We compare various distributions, including the gamma, Weibull and lognormal, with a new distribution called the geometric extreme exponential. Except for the lognormal distribution, the other three distributions all have the exponential distribution as a special case. A Monte Carlo simulation was performed to determine sample sizes for which survival distributions can distinguish data generated by their own families. Two decision methods are compared: maximum likelihood and Kolmogorov distance. Neither method is uniformly best. The probability of correct selection with more than one alternative shows some surprising results when the choices are close to the exponential distribution.

A. W. Marshall; J. C. Meza; I. Olkin

1999-05-01
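
The two decision methods described in the abstract above can be sketched with SciPy; the function below, the candidate list, and the `floc=0` convention for lifetime data are illustrative assumptions, not the authors' code:

```python
import numpy as np
from scipy import stats

def select_distribution(data, candidates=("gamma", "weibull_min", "lognorm")):
    """Compare candidate lifetime families by maximized log-likelihood and
    report each family's Kolmogorov distance to the fitted model."""
    results = {}
    for name in candidates:
        dist = getattr(stats, name)
        params = dist.fit(data, floc=0)   # fix location at 0 for lifetime data
        loglik = np.sum(dist.logpdf(data, *params))
        ks = stats.kstest(data, name, args=params).statistic
        results[name] = {"loglik": loglik, "ks": ks}
    best_ml = max(results, key=lambda k: results[k]["loglik"])
    return best_ml, results

# Data drawn from a Weibull family should be recognized at this sample size.
sample = stats.weibull_min.rvs(2.5, scale=3.0, size=2000,
                               random_state=np.random.default_rng(1))
best_ml, results = select_distribution(sample)
print(best_ml)
```

With samples this large the parent family wins the likelihood comparison; the paper's point is that for small samples, or for shapes near the shared exponential special case, the selection probability degrades.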

381

Pair distribution function analysis of La(Fe1-xRux)AsO compounds

NASA Astrophysics Data System (ADS)

The local structures of La(Fe1-xRux)AsO (0.00 ≤ x ≤ 0.80) compounds were investigated by means of pair distribution function analysis at room temperature; no phase separation or clustering was found to take place. Local distortions are no longer correlated beyond ~15 Å for both pure and substituted samples, indicating that the presence of Ru atoms does not produce a notable variation in the length scale of the local distortion. Different types of short-range correlation between Fe and Ru atoms do not produce significant changes in the pair distribution function.

Martinelli, A.; Palenzona, A.; Ferdeghini, C.; Mazzani, M.; Bonfà, P.; Allodi, G.

2014-12-01

382

The effect of solar radiation on thermal distribution in thin high arch dams is investigated. The differential equation governing the thermal behavior of mass concrete in three-dimensional space is solved applying appropriate boundary conditions. Solar radiation is implemented considering the dam face direction relative to the sun, the slope relative to the horizon, the regional cloud cover, and the surrounding topography. It has been observed that solar radiation changes the surface temperature drastically and leads to a nonuniform temperature distribution. Solar radiation effects should therefore be considered in transient thermal analysis of thin arch dams. PMID:24695817

Mirzabozorg, H.; Hariri-Ardebili, M. A.; Shirkhan, M.; Seyed-Kolbadi, S. M.

2014-01-01

384

Evaluating Domestic Hot Water Distribution System Options With Validated Analysis Models

A developing body of work is forming that collects data on domestic hot water consumption, water use behaviors, and energy efficiency of various distribution systems. A full distribution system developed in TRNSYS has been validated using field monitoring data and then exercised in a number of climates to understand climate impact on performance. This study builds upon previous analysis modelling work to evaluate differing distribution systems and the sensitivities of water heating energy and water use efficiency to variations of climate, load, distribution type, insulation and compact plumbing practices. Overall 124 different TRNSYS models were simulated. Of the configurations evaluated, distribution losses account for 13-29% of the total water heating energy use and water use efficiency ranges from 11-22%. The base case, an uninsulated trunk and branch system sees the most improvement in energy consumption by insulating and locating the water heater central to all fixtures. Demand recirculation systems are not projected to provide significant energy savings and in some cases increase energy consumption. Water use is most efficient with demand recirculation systems, followed by the insulated trunk and branch system with a central water heater. Compact plumbing practices and insulation have the most impact on energy consumption (2-6% for insulation and 3-4% per 10 gallons of enclosed volume reduced). The results of this work are useful in informing future development of water heating best practices guides as well as more accurate (and simulation time efficient) distribution models for annual whole house simulation programs.

Weitzel, E.; Hoeschele, M.

2014-09-01

385

Numerical Analysis of Temperature Distribution in Friction Welding of Carbon Steel

NASA Astrophysics Data System (ADS)

The purpose of this study is to estimate the temperature distribution in the vicinity of the weld interface during a friction welding process involving an upset process. On the basis of a simple model of friction heat input, a non-steady heat conduction analysis was carried out by the finite element method. Comparison of the estimated temperature distribution with experimental data showed that a friction heat input model allowing for the effects of temperature and linear velocity on the friction coefficient was appropriate. This heat input model could adequately simulate the change in friction heat input and temperature distribution during a friction welding process. As a result, the relationship between burn-off length and temperature distribution in the upset process has been explained, and the relationship between temperature distribution and the width of the heat-affected zone has been obtained. This heat input model allows us to estimate the temperature distribution in friction welding from friction pressure, rotation speed and the thermal properties of the base metal, even where a friction-welding machine does not have a torque measurement function.

Isshiki, Yoshihiro; Yamaguchi, Hiroshi; Kawai, Gosaku; Ogawa, Koichi

386

MIMO Radar Performance Analysis under K-Distributed Clutter

Xin Zhang; Mohammed Nabil El Korso

[Extraction fragment] ...in the MIMO radar context, i.e., the minimum angular separation required to resolve two closely-spaced targets. Keywords: K-distributed clutter, performance analysis, Cramér-Rao bound, resolution limit, MIMO radar.

387

NASA Astrophysics Data System (ADS)

Reasonable prediction of landslide occurrences in a given area requires the choice of an appropriate probability distribution of recurrence time intervals. Although landslides are widespread and frequent in many parts of the world, complete databases of landslide occurrences over long periods are missing, and such natural disasters are often treated as processes uncorrelated in time and, therefore, Poisson distributed. In this paper, we examine the recurrence time statistics of landslide events simulated by a cellular automaton model that reproduces well the actual frequency-size statistics of landslide catalogues. The complex time series are analysed by varying both the threshold above which the time between events is recorded and the values of the key model parameters. The synthetic recurrence time probability distribution is shown to be strongly dependent on the rate at which instability is approached, providing a smooth crossover from a power-law regime to a Weibull regime. Moreover, a Fano factor analysis shows a clear indication of different degrees of correlation in landslide time series. This finding supports, at least in part, a recent first analysis of a historical landslide time series over a fifty-year window.

Piegari, E.; Di Maio, R.; Avella, A.

2013-12-01
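
As a toy illustration of the non-Poissonian recurrence statistics discussed above, one can check that clustered waiting times are fitted better by a Weibull model than by an exponential (Poisson) one; the synthetic data and SciPy-based fit below are illustrative assumptions, not the paper's cellular-automaton output:

```python
import numpy as np
from scipy import stats

# Waiting times drawn with Weibull shape < 1 mimic temporally clustered events.
rng = np.random.default_rng(7)
waits = stats.weibull_min.rvs(0.7, scale=10.0, size=5000, random_state=rng)

# Fit a Weibull (location fixed at 0) and an exponential to the same data.
shape, loc, scale = stats.weibull_min.fit(waits, floc=0)
ll_weib = stats.weibull_min.logpdf(waits, shape, loc, scale).sum()
ll_expo = stats.expon.logpdf(waits, *stats.expon.fit(waits, floc=0)).sum()

# A fitted shape below 1 signals clustering; the exponential is the shape = 1
# special case, so the Weibull log-likelihood can only be at least as large.
print(shape < 1.0, ll_weib > ll_expo)
```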

388

Validation results of the IAG Dancer project for distributed GPS analysis

NASA Astrophysics Data System (ADS)

The number of permanent GPS stations in the world has grown far too large to allow processing of all this data at analysis centers. The majority of these GPS sites do not even make their observation data available to the analysis centers, for various valid reasons. The current ITRF solution is still based on centralized analysis by the IGS, and subsequent densification of the reference frame via regional network solutions. Minor inconsistencies in analysis methods, software systems and data quality imply that this centralized approach is unlikely to ever reach the ambitious accuracy objectives of GGOS. The dependence on published data also makes it clear that a centralized approach will never provide a true global ITRF solution for all GNSS receivers in the world. If the data does not come to the analysis, the only alternative is to bring the analysis to the data. The IAG Dancer project has implemented a distributed GNSS analysis system on the internet in which each receiver can have its own analysis center in the form of a freely distributed JAVA peer-to-peer application. Global parameters for satellite orbits, clocks and polar motion are solved via a distributed least squares solution among all participating receivers. A Dancer instance can run on any computer that has simultaneous access to the receiver data and to the public internet. In the future, such a process may be embedded in the receiver firmware directly. GPS network operators can join the Dancer ITRF realization without having to publish their observation data or estimation products. GPS users can run a Dancer process without contributing to the global solution, to have direct access to the ITRF in near real-time. The Dancer software has been tested on-line since late 2011. A global network of processes has gradually evolved to allow stabilization and tuning of the software in order to reach a fully operational system. 
This presentation reports on the current performance of the Dancer system and demonstrates the benefits of distributed analysis of geodetic data in general.

Boomkamp, H.

2012-12-01

389

Category induction via distributional analysis: Evidence from a serial reaction time task

Category formation lies at the heart of a number of higher-order behaviors, including language. We assessed the ability of human adults to learn, from distributional information alone, categories embedded in a sequence of input stimuli using a serial reaction time task. Artificial grammars generated corpora of input strings containing a predetermined and constrained set of sequential statistics. After training, learners were presented with novel input strings, some of which contained violations of the category membership defined by distributional context. Category induction was assessed by comparing performance on novel and familiar strings. Results indicate that learners develop increasing sensitivity to the category structure present in the input, and become sensitive to fine-grained differences in the pre- and post-element contexts that define category membership. Results suggest that distributional analysis plays a significant role in the development of visuomotor categories, and may play a similar role in the induction of linguistic form-class categories. PMID:20177430

Hunt, Ruskin H.; Aslin, Richard N.

2009-01-01

390

NASA Astrophysics Data System (ADS)

Distributed innovation processes are considered a new option for handling both the complexity and the speed with which new products and services need to be prepared. Most research on innovation processes has focused on multinational companies with an intra-organisational perspective; the phenomena of innovation processes in networks, from an inter-organisational perspective, have been almost neglected. Collaborative networks present a perfect playground for such distributed innovation processes, and the authors focus in particular on Virtual Organisations because of their dynamic behaviour. Research activities supporting distributed innovation processes in VOs are rather new, so little knowledge about the management of such research is available. The presentation of the collaborative network relationship analysis addresses this gap. It is shown that qualitative planning of collaboration intensities can support real business cases by providing knowledge and planning data.

Eschenbächer, Jens; Seifert, Marcus; Thoben, Klaus-Dieter

391

In this work we present an improved approach for the analysis of (1)H double-quantum nuclear magnetic resonance build-up data, mainly for the determination of residual dipolar coupling constants and distributions thereof in polymer gels and elastomers, yielding information on crosslink density and potential spatial inhomogeneities. We introduce a new generic build-up function, for use as component fitting function in linear superpositions, or as kernel function in fast Tikhonov regularization (ftikreg). As opposed to the previously used inverted Gaussian build-up function based on a second-moment approximation, this method yields faithful coupling constant distributions, as limitations on the fitting limit are now lifted. A robust method for the proper estimation of the error parameter used for the regularization is established, and the approach is demonstrated for different inhomogeneous elastomers with coupling constant distributions. PMID:21280798

Chassé, Walter; López Valentín, Juan; Genesky, Geoffrey D; Cohen, Claude; Saalwächter, Kay

2011-01-28

392

Precise dipolar coupling constant distribution analysis in proton multiple-quantum NMR of elastomers

NASA Astrophysics Data System (ADS)

In this work we present an improved approach for the analysis of 1H double-quantum nuclear magnetic resonance build-up data, mainly for the determination of residual dipolar coupling constants and distributions thereof in polymer gels and elastomers, yielding information on crosslink density and potential spatial inhomogeneities. We introduce a new generic build-up function, for use as component fitting function in linear superpositions, or as kernel function in fast Tikhonov regularization (ftikreg). As opposed to the previously used inverted Gaussian build-up function based on a second-moment approximation, this method yields faithful coupling constant distributions, as limitations on the fitting limit are now lifted. A robust method for the proper estimation of the error parameter used for the regularization is established, and the approach is demonstrated for different inhomogeneous elastomers with coupling constant distributions.

Chassé, Walter; Valentín, Juan López; Genesky, Geoffrey D.; Cohen, Claude; Saalwächter, Kay

2011-01-01

393

NASA Astrophysics Data System (ADS)

One component of clinical treatment validation, for example in the commissioning of new radiotherapy techniques or in patient specific quality assurance, is the evaluation and verification of planned and delivered dose distributions. Gamma and related tests (such as the chi evaluation) have become standard clinical tools for such work. Both functions provide quantitative comparisons between dose distributions, combining dose difference and distance to agreement criteria. However, there are some practical considerations in their utilization that can compromise the integrity of the tests, and these are occasionally overlooked especially when the tests are too readily adopted from commercial software. In this paper we review the evaluation tools and describe some practical concerns. The intent is to provide users with some guidance so that their use of these evaluations will provide valid rapid analysis and visualization of the agreement between planned and delivered dose distributions.

Schreiner, L. J.; Holmes, O.; Salomons, G.

2013-06-01
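
The gamma evaluation reviewed above has a direct definition that can be written out in one dimension; the tolerances and the synthetic profile below are illustrative, and a clinical implementation would add interpolation and 2-D/3-D search:

```python
import numpy as np

def gamma_index(ref_dose, eval_dose, positions, dose_tol=0.03, dist_tol=3.0):
    """1-D global gamma evaluation: for each reference point, take the minimum
    over evaluated points of sqrt((dDose/dose_tol)^2 + (dr/dist_tol)^2), with
    dose_tol as a fraction of the reference maximum and dist_tol in mm."""
    dmax = ref_dose.max()
    gammas = np.empty(ref_dose.size)
    for i, (r, d) in enumerate(zip(positions, ref_dose)):
        dd = (eval_dose - d) / (dose_tol * dmax)   # dose differences
        dr = (positions - r) / dist_tol            # distances to agreement
        gammas[i] = np.sqrt(dd ** 2 + dr ** 2).min()
    return gammas  # a point passes when gamma <= 1

x = np.linspace(0.0, 30.0, 31)             # positions in mm
dose = np.exp(-((x - 15.0) / 8.0) ** 2)    # synthetic dose profile
g = gamma_index(dose, dose, x)             # identical distributions pass everywhere
print(g.max())                             # 0.0
```

One practical concern the paper raises is visible even here: the result depends on the grid spacing of `positions`, since the minimum is searched only over sampled points.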

394

Once a homogeneous ensemble of a protein ligand is taken from solution and immobilized to a surface, for many reasons the resulting ensemble of surface binding sites may be heterogeneous. For example, this can be due to the intrinsic surface roughness causing variations in the local microenvironment, non-uniform density distribution of polymeric linkers, or non-uniform chemical attachment producing different protein orientations and conformations. We have previously described a computational method for determining the distribution of affinity and rate constants of surface sites from the analysis of experimental surface binding data. It fully exploits the high signal/noise ratio and reproducibility provided by optical biosensor technology, such as surface plasmon resonance. Since the computational analysis is ill-conditioned, the previous approach used a regularization strategy assuming a priori all binding parameters to be equally likely, resulting in the broadest possible parameter distribution consistent with the experimental data. We have now extended this method in a Bayesian approach to incorporate the opposite assumption, i.e. that the surface sites a priori are expected to be uniform (as one would expect in free solution). This results in a distribution of binding parameters as close to monodispersity as possible, given the experimental data. Using several model protein systems immobilized on a carboxymethyl dextran surface and probed with surface plasmon resonance, we show micro-heterogeneity of the surface sites, in addition to broad populations of significantly altered affinity. The distributions obtained are highly reproducible. Immobilization conditions and the total surface density of immobilized sites can have a substantial impact on the functional distribution of the binding sites. PMID:18816013

Gorshkova, Inna I.; Svitel, Juraj; Razjouyan, Faezeh; Schuck, Peter

2008-01-01

395

Risk analysis of highly combustible gas storage, supply, and distribution systems in PWR plants

This report presents the evaluation of the potential safety concerns for pressurized water reactors (PWRs) identified in Generic Safety Issue 106, Piping and the Use of Highly Combustible Gases in Vital Areas. A Westinghouse four-loop PWR plant was analyzed for the risk due to the use of combustible gases (predominantly hydrogen) within the plant. The analysis evaluated an actual hydrogen distribution configuration and conducted several sensitivity studies to determine the potential variability among PWRs. The sensitivity studies were based on hydrogen and safety-related equipment configurations observed at other PWRs within the United States. Several options for improving the hydrogen distribution system design were identified and evaluated for their effect on risk and core damage frequency. A cost/benefit analysis was performed to determine whether alternatives considered were justifiable based on the safety improvement and economics of each possible improvement.

Simion, G.P. [Science Applications International Corp., Albuquerque, NM (United States); VanHorn, R.L.; Smith, C.L.; Bickel, J.H.; Sattison, M.B. [EG and G Idaho, Inc., Idaho Falls, ID (United States); Bulmahn, K.D. [SCIENTECH, Inc., Idaho Falls, ID (United States)

1993-06-01

396

Impact of hadronic and nuclear corrections on global analysis of spin-dependent parton distributions

We present the first results of a new global next-to-leading order analysis of spin-dependent parton distribution functions from the most recent world data on inclusive polarized deep-inelastic scattering, focusing in particular on the large-x and low-Q^2 regions. By directly fitting polarization asymmetries we eliminate biases introduced by using polarized structure function data extracted under nonuniform assumptions for the unpolarized structure functions. For analysis of the large-x data we implement nuclear smearing corrections for deuterium and 3He nuclei, and systematically include target mass and higher twist corrections to the g_1 and g_2 structure functions at low Q^2. We also explore the effects of Q^2 and W^2 cuts in the data sets, and the potential impact of future data on the behavior of the spin-dependent parton distributions at large x.

Jimenez-Delgado, Pedro [Thomas Jefferson National Accelerator Facility, Newport News, VA (United States); Accardi, Alberto [Hampton University, Hampton, VA (United States); Thomas Jefferson National Accelerator Facility, Newport News, VA (United States); Melnitchouk, Wally [Thomas Jefferson National Accelerator Facility, Newport News, VA (United States)

2014-02-01

397

NASA Astrophysics Data System (ADS)

Probes made of carbon fibre composite NB41 were exposed to deuterium plasmas in the TEXTOR tokamak and in a simulator of plasma-wall interactions, PISCES. The aim was to assess the deuterium retention and its lateral and depth distribution. The analysis was performed by means of D(3He,p)4He and 12C(3He,p)14N nuclear reaction analysis using a standard (1 mm spot) and a micro-beam (20 μm resolution). The measurements revealed a non-uniform distribution of deuterium atoms in micro-regions: differences by a factor of 3 between the maximum and minimum deuterium concentrations. For samples exposed in PISCES, the differences were associated with the orientation and type of fibres. For surface structures in the erosion zone of samples exposed to the tokamak plasma, the micro-regions were more complex. Depth profiling indicated migration of fuel into the bulk of the material.

Petersson, P.; Kreter, A.; Possnert, G.; Rubel, M.

2010-06-01

398

Many forms of security analysis on large scale applications can be substantially automated but the size and complexity can exceed the time and memory available on conventional desktop computers. Most commercial tools are understandably focused on such conventional desktop resources. This paper presents research work on the parallelization of security analysis of both source code and binaries within our Compass tool, which is implemented using the ROSE source-to-source open compiler infrastructure. We have focused on both shared and distributed memory parallelization of the evaluation of rules implemented as checkers for a wide range of secure programming rules, applicable to desktop machines, networks of workstations and dedicated clusters. While Compass as a tool focuses on source code analysis and reports violations of an extensible set of rules, the binary analysis work uses the exact same infrastructure but is less well developed into an equivalent final tool.

Quinlan, D; Barany, G; Panas, T

2007-08-30

399

The increasingly complex research questions addressed by neuroimaging research impose substantial demands on computational infrastructures. These infrastructures need to support management of massive amounts of data in a way that affords rapid and precise data analysis, to allow collaborative research, and to achieve these aims securely and with minimum management overhead. Here we present an approach that overcomes many current limitations in data analysis and data sharing. This approach is based on open source database management systems that support complex data queries as an integral part of data analysis, flexible data sharing, and parallel and distributed data processing using cluster computing and Grid computing resources. We assess the strengths of these approaches as compared to current frameworks based on storage of binary or text files. We then describe in detail the implementation of such a system and provide a concrete description of how it was used to enable a complex analysis of fMRI time series data. PMID:17964812

Hasson, Uri; Skipper, Jeremy I; Wilde, Michael J; Nusbaum, Howard C; Small, Steven L

2008-01-15

400

Multiple Stress Effect Analysis on Pneumatic Cylinders Accelerated Life Testing

Accelerated life testing is a valuable tool for quickly obtaining information on lifetime distributions, achieved by subjecting the test units to conditions more severe than normal ones. This paper first describes a model for multiple-stress-type accelerated life data which is based on the widely known log-linear model and is formulated with the Weibull model for

Chen Juan; Wang Deyi; Wu Qiang; Wang Zhanlin

2009-01-01

401

Sensitivity analysis for large-deflection and postbuckling responses on distributed-memory computers

NASA Technical Reports Server (NTRS)

A computational strategy is presented for calculating sensitivity coefficients for the nonlinear large-deflection and postbuckling responses of laminated composite structures on distributed-memory parallel computers. The strategy is applicable to any message-passing distributed computational environment. The key elements of the proposed strategy are: (1) a multiple-parameter reduced basis technique; (2) a parallel sparse equation solver based on a nested dissection (or multilevel substructuring) node ordering scheme; and (3) a multilevel parallel procedure for evaluating hierarchical sensitivity coefficients. The hierarchical sensitivity coefficients measure the sensitivity of the composite structure response to variations in three sets of interrelated parameters; namely, laminate, layer and micromechanical (fiber, matrix, and interface/interphase) parameters. The effectiveness of the strategy is assessed by performing hierarchical sensitivity analysis for the large-deflection and postbuckling responses of stiffened composite panels with cutouts on three distributed-memory computers. The panels are subjected to combined mechanical and thermal loads. The numerical studies presented demonstrate the advantages of the reduced basis technique for hierarchical sensitivity analysis on distributed-memory machines.

Watson, Brian C.; Noor, Ahmed K.

1995-01-01

402

Some Physics And System Issues In The Security Analysis Of Quantum Key Distribution Protocols

In this paper we review a number of issues on the security of quantum key distribution (QKD) protocols that bear directly on the relevant physics or mathematical representation of the QKD cryptosystem. It is shown that the cryptosystem representation itself may miss out many possible attacks which are not accounted for in the security analysis and proofs. Hence the final security claims drawn from such analysis are not reliable, apart from foundational issues about the security criteria that are discussed elsewhere. The cases of continuous-variable QKD and multi-photon sources are elaborated upon.

Horace P. Yuen

2014-05-07

403

Some physics and system issues in the security analysis of quantum key distribution protocols

NASA Astrophysics Data System (ADS)

In this paper, we review a number of issues on the security of quantum key distribution (QKD) protocols that bear directly on the relevant physics or mathematical representation of the QKD cryptosystem. It is shown that the cryptosystem representation itself may miss out many possible attacks, which are not accounted for in the security analysis and proofs. Hence, the final security claims drawn from such analysis are not reliable, apart from foundational issues about the security criteria that are discussed elsewhere. The cases of continuous-variable QKD and multi-photon sources are elaborated upon.

Yuen, Horace P.

2014-10-01

404

Single-phase power distribution system power flow and fault analysis

NASA Technical Reports Server (NTRS)

Alternative methods for power flow and fault analysis of single-phase distribution systems are presented. The algorithms for both power flow and fault analysis utilize a generalized approach to network modeling. The generalized admittance matrix, formed using elements of linear graph theory, is an accurate network model for all possible single-phase network configurations. Unlike the standard nodal admittance matrix formulation algorithms, the generalized approach uses generalized component models for the transmission line and transformer. The standard assumption of a common node voltage reference point is not required to construct the generalized admittance matrix. Therefore, truly accurate simulation results can be obtained for networks that cannot be modeled using traditional techniques.

Halpin, S. M.; Grigsby, L. L.

1992-01-01
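
The admittance-matrix idea underlying the abstract above can be illustrated with a standard nodal (Y-bus) solve for a small radial feeder; all network values are invented for illustration, and this is the conventional nodal formulation, not the paper's generalized admittance matrix:

```python
import numpy as np

# 3-bus radial feeder in per-unit. The source is modelled as 1.0 p.u. behind
# a strong source admittance (Norton equivalent), so no common voltage
# reference assumption beyond the usual ground node is needed here.
y01, y12 = 10.0 - 30.0j, 8.0 - 24.0j       # series line admittances
y_src, y_load = 1000.0 + 0.0j, 0.5 - 0.2j  # source and load admittances

# Nodal admittance matrix: diagonal = sum of admittances at the node,
# off-diagonal = minus the admittance connecting the two nodes.
Y = np.array([
    [y_src + y01, -y01,        0.0],
    [-y01,         y01 + y12, -y12],
    [0.0,         -y12,        y12 + y_load],
])

I = np.array([y_src * 1.0, 0.0, 0.0])      # Norton current injection at bus 0
V = np.linalg.solve(Y, I)                  # node voltages
print(np.abs(V))                           # magnitudes drop along the feeder
```

For a fault study the same matrix is reused: a fault at bus k is just a large shunt admittance added to `Y[k, k]` before re-solving.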

405

Measuring protein interactions is key to understanding cell signaling mechanisms, but quantitative analysis of these interactions in situ has remained a major challenge. Here, we present spatial intensity distribution analysis (SpIDA), an analysis technique for image data obtained using standard fluorescence microscopy. SpIDA directly measures fluorescent macromolecule densities and oligomerization states sampled within single images. The method is based on fitting intensity histograms calculated from images to obtain density maps of fluorescent molecules and their quantal brightness. Because spatial distributions are acquired by imaging, SpIDA can be applied to the analysis of images of chemically fixed tissue as well as live cells. However, the technique does not rely on spatial correlations, freeing it from biases caused by subcellular compartmentalization and heterogeneity within tissue samples. Analysis of computer-based simulations and immunocytochemically stained GABAB receptors in spinal cord samples shows that the approach yields accurate measurements over a broader range of densities than established procedures. SpIDA is applicable to sampling within small areas (6 μm²) and reveals the presence of monomers and dimers with single-dye labeling. Finally, using GFP-tagged receptor subunits, we show that SpIDA can resolve dynamic changes in receptor oligomerization in live cells. The advantages and greater versatility of SpIDA over current techniques open the door to quantitative studies of protein interactions in native tissue using standard fluorescence microscopy. PMID:21482753

Godin, Antoine G.; Costantino, Santiago; Lorenzo, Louis-Etienne; Swift, Jody L.; Sergeev, Mikhail; Ribeiro-da-Silva, Alfredo; De Koninck, Yves; Wiseman, Paul W.

2011-01-01

406

To identify genetic loci influencing central obesity and fat distribution, we performed a meta-analysis of 16 genome-wide association studies (GWAS, N = 38,580) informative for adult waist circumference (WC) and waist–hip ratio (WHR). We selected 26 SNPs for follow-up, for which the evidence of association with measures of central adiposity (WC and/or WHR) was strong and disproportionate to that for

Cecilia M. Lindgren; Iris M. Heid; Joshua C. Randall; Claudia Lamina; Valgerdur Steinthorsdottir; Lu Qi; Elizabeth K. Speliotes; Gudmar Thorleifsson; Cristen J. Willer; Blanca M. Herrera; Anne U. Jackson; Noha Lim; Paul Scheet; Nicole Soranzo; Najaf Amin; Yurii S. Aulchenko; John C. Chambers; Alexander Drong; Jianan Luan; Helen N. Lyon; Fernando Rivadeneira; Serena Sanna; Nicholas J. Timpson; M. Carola Zillikens; Jing Hua Zhao; Peter Almgren; Stefania Bandinelli; Amanda J. Bennett; Richard N. Bergman; Lori L. Bonnycastle; Suzannah J. Bumpstead; Stephen J. Chanock; Lynn Cherkas; Peter Chines; Lachlan Coin; Cyrus Cooper; Gabriel Crawford; Angela Doering; Anna Dominiczak; Alex S. F. Doney; Shah Ebrahim; Paul Elliott; Michael R. Erdos; Karol Estrada; Luigi Ferrucci; Guido Fischer; Nita G. Forouhi; Christian Gieger; Harald Grallert; Christopher J. Groves; Scott Grundy; Candace Guiducci; David Hadley; Anders Hamsten; Aki S. Havulinna; Albert Hofman; Rolf Holle; John W. Holloway; Thomas Illig; Bo Isomaa; Leonie C. Jacobs; Karen Jameson; Pekka Jousilahti; Fredrik Karpe; Johanna Kuusisto; Jaana Laitinen; G. Mark Lathrop; Debbie A. Lawlor; Massimo Mangino; Wendy L. McArdle; Thomas Meitinger; Mario A. Morken; Andrew P. Morris; Patricia Munroe; Narisu Narisu; Anna Nordström; Peter Nordström; Ben A. Oostra; Colin N. A. Palmer; Felicity Payne; John F. Peden; Inga Prokopenko; Frida Renström; Aimo Ruokonen; Veikko Salomaa; Manjinder S. Sandhu; Laura J. Scott; Angelo Scuteri; Kaisa Silander; Kijoung Song; Xin Yuan; Heather M. Stringham; Amy J. Swift; Tiinamaija Tuomi; Manuela Uda; Peter Vollenweider; Gerard Waeber; Chris Wallace; G. Bragi Walters; Michael N. Weedon; Jacqueline C. M. Witteman; Cuilin Zhang; Weihua Zhang; Mark J. Caulfield; Francis S. Collins; George Davey Smith; Ian N. M. Day; Paul W. Franks; Andrew T. Hattersley; Frank B. Hu; Marjo-Riitta Jarvelin; Augustine Kong; Jaspal S. Kooner; Markku Laakso; Edward Lakatta; Vincent Mooser; Andrew D. 
Morris; Leena Peltonen; Nilesh J. Samani; Timothy D. Spector; David P. Strachan; Toshiko Tanaka; Jaakko Tuomilehto; André G. Uitterlinden; Cornelia M. van Duijn; Nicholas J. Wareham; Hugh Watkins for the PROCARDIS consortia; Dawn M. Waterworth; Michael Boehnke; Panos Deloukas; Leif Groop; David J. Hunter; Unnur Thorsteinsdottir; David Schlessinger; H.-Erich Wichmann; Timothy M. Frayling; Gonçalo R. Abecasis; Joel N. Hirschhorn; Ruth J. F. Loos; Kari Stefansson; Karen L. Mohlke; Inês Barroso

2009-01-01

407

Distributed strain sensing with millimeter-order spatial resolution is demonstrated in optical fibers based on Brillouin optical correlation domain analysis. A novel beat lock-in detection scheme is introduced to suppress background noises coming from the reflection of Brillouin pump waves. The Brillouin frequency shifts of 3 mm fiber sections are successfully measured with a theoretical spatial resolution of 1.6 mm.

Kwang Yong Song; Zuyuan He; Kazuo Hotate

2006-01-01

408

To identify genetic loci influencing central obesity and fat distribution, we performed a meta-analysis of 16 genome-wide association studies (GWAS, N = 38,580) informative for adult waist circumference (WC) and waist-hip ratio (WHR). We selected 26 SNPs for follow-up, for which the evidence of association with measures of central adiposity (WC and/or WHR) was strong and disproportionate to that for

Cecilia M. Lindgren; Iris M. Heid; Joshua C. Randall; Claudia Lamina; Suzannah J. Bumpstead; Stephen J. Chanock; Lynn Cherkas; Cyrus Cooper; Angela Doering; Anna Dominiczak; Alex S. F. Doney; Paul Elliott; Michael R. Erdos; Karol Estrada; Luigi Ferrucci; Guido Fischer; Nita G. Forouhi; Christian Gieger; Harald Grallert; Christopher J. Groves; Scott Grundy; David Hadley; Aki S. Havulinna; Albert Hofman; Rolf Holle; John W. Holloway; Thomas Illig; Bo Isomaa; Leonie C. Jacobs; Karen Jameson; Pekka Jousilahti; Johanna Kuusisto; G. Mark Lathrop; Debbie A. Lawlor; Massimo Mangino; Wendy L. McArdle; Thomas Meitinger; Mario A. Morken; Andrew P. Morris; Patricia Munroe; Anna Nordstrom; Peter Nordstrom; Ben A. Oostra; Colin N. A. Palmer; John F. Peden; Inga Prokopenko; Frida Renstrom; Aimo Ruokonen; Manjinder S. Sandhu; Laura J. Scott; Angelo Scuteri; Heather M. Stringham; Amy J. Swift; Manuela Uda; Peter Vollenweider; Gerard Waeber; Chris Wallace; G. Bragi Walters; Michael N. Weedon; Jacqueline C. M. Witteman; Cuilin Zhang; Weihua Zhang; Mark J. Caulfield

2009-01-01

409

Application of digital image analysis for size distribution measurements of microbubbles

This work employs digital image analysis to measure the size distribution of microbubbles generated by the process of electroflotation for use in solid/liquid separation processes. Microbubbles are used for separations in the mineral processing industry and also in the treatment of potable water and wastewater. As the bubbles move upward in a solid/liquid column due to buoyancy, particles collide with and

S. E. Burns; S. Yiacoumi; D. Frost; C. Tsouris

1997-01-01

410

Computerized analysis of the transmural distribution of myocardial echo-contrast effect

The authors describe a method for the automatic analysis of the transmural distribution of the myocardial echo contrast effect. In anesthetized open-chest dogs the contrast agent SHU-454 was bolus-injected into the aortic root during short-axis 2-D echo, both at baseline and during coronary stenosis. End-diastolic echo images were digitized offline. In the first image of the sequence, the left-ventricular (LV)

Ezio Maria Ferdeghini; Daniele Rovai; Massimo Lombardi; Antonio Benassi; A. L'Abbate

1988-01-01

411

A numerical analysis of transient heat pipe performance including nonconventional heat pipes with nonuniform heat distributions is presented. A body-fitted grid system was applied to a three-dimensional wall and wick model, which was coupled with a transient compressible quasi-one-dimensional vapor flow model. The numerical results were first compared with experimental data from cylindrical heat pipes with good agreement. Numerical calculations

Y. Cao; A. Faghri

1991-01-01

412

Economical Analysis of the Cold Air Distribution System: A Case Study

economical analysis. 1. INTRODUCTION. Recently, in commercial buildings, ice storage systems have helped building owners reduce the expense of energy consumption. Operating costs can be cut further by taking full advantage... of the chilled water available with ice storage to reduce the temperature of the supply air. The technology of cold air distribution first appeared in 1947; at that time it was limited to some special occasions. In the 1980s, people found there were some...

Zhou, Z.; Xu, W.; Li, J.; Zhao, J.; Niu, L.

2006-01-01

413

Polarized parton distributions from NLO QCD analysis of world DIS and SIDIS data

The combined analysis of polarized DIS and SIDIS data is performed in NLO QCD. A new parametrization of the polarized PDFs is constructed. The uncertainties on the PDFs and their first moments are estimated by applying the modified Hessian method. Special attention is paid to the impact of the novel SIDIS data on the polarized distributions of the light sea and strange quarks. In particular, the important question of polarized sea symmetry is studied in comparison with the latest results on this subject.

A. Sissakian; O. Shevchenko; O. Ivanov

2009-08-23

414

We characterized the relationship between the genetic diversity of indigenous soybean-nodulating bradyrhizobia from weakly acidic soils in Japan and their geographical distribution in an ecological study of indigenous soybean rhizobia. We isolated bradyrhizobia from three kinds of Rj-genotype soybeans. Their genetic diversity and community structure were analyzed by PCR-RFLP analysis of the 16S-23S rRNA gene internal transcribed spacer (ITS) region with 11 Bradyrhizobium USDA strains as references. We used data from the present study and previous studies to carry out mathematical ecological analyses, multidimensional scaling analysis with the Bray-Curtis index, polar ordination analysis, and multiple regression analyses to characterize the relationship between soybean-nodulating bradyrhizobial community structures and their geographical distribution. The mathematical ecological approaches used in this study demonstrated the presence of ecological niches and suggested the geographical distribution of soybean-nodulating bradyrhizobia to be a function of latitude and the related climate, with clusters in the order Bj123, Bj110, Bj6, and Be76 from north to south in Japan. PMID:24240318

Saeki, Yuichi; Shiro, Sokichi; Tajima, Toshiyuki; Yamamoto, Akihiro; Sameshima-Saito, Reiko; Sato, Takashi; Yamakawa, Takeo

2013-01-01

415

NASA Astrophysics Data System (ADS)

Spatial visibility analysis is an important direction in the study of pedestrian behavior, because visual perception of space is the most direct way to acquire environmental information and guide one's movements. Based on agent modeling and a top-down method, this paper develops a framework for analyzing pedestrian flow as a function of visibility. We use viewsheds in the visibility analysis and impose the resulting parameters on an agent simulation to direct the agents' motion through urban space. Pedestrian behavior is analyzed at both the micro-scale and the macro-scale of urban open space. At the micro-scale, an individual agent uses visual affordance to determine its direction of motion along an urban street or within a district. At the macro-scale, we compare the distribution of pedestrian flow with the spatial configuration of the urban environment and mine the relationship between pedestrian flow and the distribution of urban facilities and functions. The paper first computes the visibility conditions at vantage points in urban open space, such as the street network, and quantifies the visibility parameters. The multiple agents use these visibility parameters to decide their directions of motion, and through the multiple-agent simulation the pedestrian flow finally reaches a stable state in the urban environment. We compare the morphology of the visibility parameters and the pedestrian distribution with the urban function and facilities layout to confirm the consistency between them, which can be used for decision support in urban design.

Ying, Shen; Li, Lin; Gao, Yurong

2009-10-01

416

Space station electrical power distribution analysis using a load flow approach

NASA Technical Reports Server (NTRS)

The space station's electrical power system will evolve and grow much as present terrestrial electrical power utilities have. The initial baseline reference configuration will contain more than 50 nodes or busses, inverters, transformers, overcurrent protection devices, distribution lines, solar arrays, and/or solar dynamic power generating sources. The system is designed to manage and distribute 75 kW of power, single phase or three phase at 20 kHz, to grow to a level of 300 kW steady state, and to be capable of operating at a peak of 450 kW for 5 to 10 min. In order to plan far into the future and keep pace with load growth, a load flow power system analysis approach must be developed and utilized. This method is a well-known energy assessment and management tool that is widely used throughout the electrical power utility industry. The results of a comprehensive evaluation and assessment of the Electrical Distribution System Analysis (EDSA) program are discussed, and its potential use as an analysis and design tool for the 20 kHz space station electrical power system is addressed.
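A load flow study of the kind described solves for bus voltages given power injections and the network admittance matrix. Below is a minimal Gauss-Seidel sketch on a hypothetical two-bus system; this is the textbook iteration that tools like EDSA scale up to many buses, not the program itself, and all numbers are invented.

```python
import numpy as np

def gauss_seidel_pf(Y, S, slack_v, tol=1e-10, max_iter=500):
    """Solve bus voltages given Y-bus, complex injections S, and a slack
    voltage. Bus 0 is the slack; the remaining buses are PQ buses."""
    n = Y.shape[0]
    V = np.ones(n, dtype=complex)
    V[0] = slack_v
    for _ in range(max_iter):
        V_old = V.copy()
        for k in range(1, n):
            I = np.conj(S[k] / V[k])                      # S_k* / V_k*
            V[k] = (I - (Y[k] @ V - Y[k, k] * V[k])) / Y[k, k]
        if np.max(np.abs(V - V_old)) < tol:
            break
    return V

# Two-bus example: line admittance 10 - 20j pu, 0.5 + 0.2j pu load at bus 1
y = 10 - 20j
Y = np.array([[y, -y], [-y, y]])
S = np.array([0, -(0.5 + 0.2j)])   # negative injection = consumption
V = gauss_seidel_pf(Y, S, slack_v=1.0 + 0j)
```

For heavier loading or weaker lines, the iteration may need acceleration or a Newton-Raphson formulation, which is the usual production-grade choice.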

Emanuel, Ervin M.

1987-01-01

417

We characterized the relationship between the genetic diversity of indigenous soybean-nodulating bradyrhizobia from weakly acidic soils in Japan and their geographical distribution in an ecological study of indigenous soybean rhizobia. We isolated bradyrhizobia from three kinds of Rj-genotype soybeans. Their genetic diversity and community structure were analyzed by PCR-RFLP analysis of the 16S–23S rRNA gene internal transcribed spacer (ITS) region with 11 Bradyrhizobium USDA strains as references. We used data from the present study and previous studies to carry out mathematical ecological analyses, multidimensional scaling analysis with the Bray-Curtis index, polar ordination analysis, and multiple regression analyses to characterize the relationship between soybean-nodulating bradyrhizobial community structures and their geographical distribution. The mathematical ecological approaches used in this study demonstrated the presence of ecological niches and suggested the geographical distribution of soybean-nodulating bradyrhizobia to be a function of latitude and the related climate, with clusters in the order Bj123, Bj110, Bj6, and Be76 from north to south in Japan. PMID:24240318

Saeki, Yuichi; Shiro, Sokichi; Tajima, Toshiyuki; Yamamoto, Akihiro; Sameshima-Saito, Reiko; Sato, Takashi; Yamakawa, Takeo

2013-01-01

418

A distributed analysis and visualization system for model and observational data

NASA Technical Reports Server (NTRS)

Software was developed with NASA support to aid in the analysis and display of the massive amounts of data generated from satellites, observational field programs, and from model simulations. This software was developed in the context of the PATHFINDER (Probing ATmospHeric Flows in an Interactive and Distributed EnviRonment) Project. The overall aim of this project is to create a flexible, modular, and distributed environment for data handling, modeling simulations, data analysis, and visualization of atmospheric and fluid flows. Software completed with NASA support includes GEMPAK analysis, data handling, and display modules for which collaborators at NASA had primary responsibility, and prototype software modules for three-dimensional interactive and distributed control and display as well as data handling, for which NCSA was responsible. Overall process control was handled through a scientific and visualization application builder from Silicon Graphics known as the Iris Explorer. In addition, the GEMPAK related work (GEMVIS) was also ported to the Advanced Visualization System (AVS) application builder. Many modules were developed to enhance those already available in Iris Explorer including HDF file support, improved visualization and display, simple lattice math, and the handling of metadata through development of a new grid datatype. Complete source and runtime binaries along with on-line documentation are available via the World Wide Web at: http://redrock.ncsa.uiuc.edu/PATHFINDER/pathre12/top/top.html.

Wilhelmson, Robert B.

1994-01-01

419

Two-stage hierarchical modeling for analysis of subpopulations in conditional distributions

In this work, we develop a modeling and estimation approach for the analysis of cross-sectional clustered data with multimodal conditional distributions, where the main interest is in the analysis of subpopulations. It is proposed to model such data hierarchically, with the conditional distributions viewed as finite mixtures of normal components. With a large number of observations in the lowest-level clusters, a two-stage estimation approach is used. In the first stage, the normal mixture parameters in each lowest-level cluster are estimated using robust methods. Robust alternatives to maximum likelihood estimation are used to provide stable results even when the components of the conditional distributions do not quite meet normality assumptions. The lowest-level cluster-specific means and standard deviations are then modeled in a mixed effects model in the second stage. A small simulation study was conducted to compare the performance of finite normal mixture population parameter estimates based on robust and maximum likelihood estimation in stage 1. The proposed modeling approach is illustrated through the analysis of mouse tendon fibril diameter data. The results address genotype differences between corresponding components in the mixtures and demonstrate the advantages of robust estimation in stage 1. PMID:22523443
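Stage 1 of this approach fits a finite normal mixture within each lowest-level cluster. Below is a minimal maximum-likelihood EM sketch for a two-component mixture on synthetic data; the paper's contribution is robust alternatives to this plain ML baseline, which this sketch does not implement.

```python
import numpy as np

def em_two_normals(x, iters=200):
    """Fit a two-component normal mixture by the EM algorithm (plain ML)."""
    x = np.asarray(x, float)
    # crude initialization: split the sample at its median
    m = np.median(x)
    mu = np.array([x[x <= m].mean(), x[x > m].mean()])
    sd = np.array([x.std(), x.std()])
    w = np.array([0.5, 0.5])
    for _ in range(iters):
        # E-step: responsibility of each component for each point
        dens = np.array([w[k] * np.exp(-0.5 * ((x - mu[k]) / sd[k]) ** 2)
                         / (sd[k] * np.sqrt(2 * np.pi)) for k in range(2)])
        r = dens / dens.sum(axis=0)
        # M-step: update weights, means, and standard deviations
        nk = r.sum(axis=1)
        w = nk / len(x)
        mu = (r * x).sum(axis=1) / nk
        sd = np.sqrt((r * (x - mu[:, None]) ** 2).sum(axis=1) / nk)
    return w, mu, sd

# Synthetic bimodal "diameter" sample: 400 values near 60, 200 near 110
rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(60, 5, 400), rng.normal(110, 10, 200)])
w, mu, sd = em_two_normals(x)
```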

Chervoneva, Inna; Zhan, Tingting; Iglewicz, Boris; Hauck, Walter W.; Birk, David E.

2011-01-01

420

Bayesian analysis of nanodosimetric ionisation distributions due to alpha particles and protons.

Track nanodosimetry aims to investigate the stochastic aspect of ionisation events in particle tracks by evaluating the probability distribution of the number of ionisations produced in a nanometric target volume positioned at distance d from a particle track. Such measurements make use of electron (or ion) gas detectors whose detection efficiencies are non-uniformly distributed inside the target volume. This makes the reconstruction of the true ionisation distributions, which correspond to an ideal efficiency of 100%, non-trivial. Bayesian unfolding has been applied to ionisation distributions produced by 5.4 MeV alpha particles and 20 MeV protons in cylindrical volumes of propane of 20 nm equivalent size, positioned at different impact parameters with respect to the primary beam. It is shown that a Bayesian analysis performed by subdividing the target volume into sub-regions of different detection efficiencies is able to provide a good reconstruction of the true nanodosimetric ionisation distributions. PMID:21112893
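Bayesian unfolding of a measured distribution given a known response can be sketched with the iterative (D'Agostini-style) scheme below on a toy three-bin problem. The response matrix and bin contents are invented for illustration and are not the paper's detector efficiencies.

```python
import numpy as np

def bayes_unfold(measured, R, iterations=100):
    """Iterative Bayesian unfolding of `measured` counts, given a response
    matrix R[i, j] = P(measure bin i | true bin j). Starts from a flat prior
    and repeatedly applies Bayes' theorem to update the true-bin estimate."""
    n_true = R.shape[1]
    prior = np.full(n_true, 1.0 / n_true)
    for _ in range(iterations):
        joint = R * prior                          # shape (n_meas, n_true)
        post = joint / joint.sum(axis=1, keepdims=True)   # P(true j | meas i)
        eff = R.sum(axis=0)                        # efficiency per true bin
        unfolded = (post * measured[:, None]).sum(axis=0) / eff
        prior = unfolded / unfolded.sum()
    return unfolded

# Toy check: smear a known "true" ionisation distribution, then unfold it
true = np.array([50.0, 30.0, 20.0])
R = np.array([[0.80, 0.15, 0.05],
              [0.15, 0.70, 0.15],
              [0.05, 0.15, 0.80]])
measured = R @ true
unfolded = bayes_unfold(measured, R)
```

With consistent data and a well-conditioned response, the iteration relaxes toward the true distribution; in practice the number of iterations is a regularization choice.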

De Nardo, L; Ferretti, A; Colautti, P; Grosswendt, B

2011-02-01

421

NASA Astrophysics Data System (ADS)

In this paper a procedure is presented for deriving Flood Design Hydrographs (FDH) using a bivariate representation of the rainfall forcing (rainfall duration and intensity) based on copulas, which describe and model the correlation between these two variables independently of the marginal laws involved, coupled with a distributed rainfall-runoff model. Rainfall-runoff modelling for estimating the hydrological response at the outlet of the watershed used a conceptual, fully distributed procedure based on the Soil Conservation Service curve number method as the excess rainfall model and a distributed unit hydrograph with climatic dependencies for the flow routing. Travel time computation, based on the definition of a distributed unit hydrograph, was performed by implementing a procedure that uses flow paths determined from a digital elevation model (DEM) and roughness parameters obtained from distributed geographical information. In order to estimate the return period of the FDH, the flood peaks and flow volumes obtained through rainfall-runoff modelling were statistically treated via copulas. The shape of the hydrograph was generated from the modelled flood events via cluster analysis. The procedure was applied to a case study of the Imera catchment in Sicily, Italy. The methodology allows reliable estimation of the Design Flood Hydrograph and can be used in all flood risk applications, i.e. evaluation, management, mitigation, etc.
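The excess rainfall model named above, the SCS curve number method, has a runoff relation simple enough to state directly. A sketch with hypothetical storm depth and curve number values (the catchment-specific inputs of the study are not reproduced here):

```python
def scs_runoff(P_mm, CN):
    """SCS curve-number excess rainfall Q (mm) for storm depth P_mm, using
    the usual initial abstraction Ia = 0.2*S, where the potential maximum
    retention in mm is S = 25400/CN - 254."""
    S = 25400.0 / CN - 254.0
    Ia = 0.2 * S
    if P_mm <= Ia:
        return 0.0                    # all rainfall abstracted, no runoff
    return (P_mm - Ia) ** 2 / (P_mm - Ia + S)

# Hypothetical 100 mm storm on terrain with CN = 80
Q = scs_runoff(100.0, 80.0)
```

In a distributed model this relation is evaluated per grid cell with cell-specific curve numbers before the routing step.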

Candela, A.; Brigandí, G.; Aronica, G. T.

2014-01-01

422

Nearest-neighbor analysis and the distribution of sinkholes: an introduction to spatial statistics

NSDL National Science Digital Library

This is an exercise I use in an upper-division geomorphology course to introduce students to nearest-neighbor analysis, a basic technique in spatial statistics. Nearest-neighbor analysis is a method of comparing the observed average distance between points and their nearest neighbor to the expected average nearest-neighbor distance in a random pattern of points. The pattern of points on a map or 2-D graph can be classified into three categories: CLUSTERED, RANDOM, REGULAR. Nearest-neighbor analysis provides an objective method for distinguishing among these possible spatial distributions. The technique also produces a population statistic, the nearest-neighbor index, which can be compared from area to area. In general, nearest-neighbor analysis can be applied to any geoscience phenomenon or feature whose spatial distribution can be categorized as a point pattern. The basic distance data can come from topographic maps, aerial photographs, or field measurements. The exercise presented here applies this technique to the study of karst landforms on topographic maps, specifically the spatial distribution of sinkholes. The advantages of introducing nearest-neighbor analysis in an undergraduate lab are that: (1) it reinforces important concepts related to data collection (e.g. significant figures), map use (e.g. scale and the UTM grid), and basic statistics (e.g. hypothesis testing); (2) the necessary calculations are easily handled by most students; and (3) once learned, the technique can be widely applied in geoscience problem-solving. Designed for a geomorphology course; addresses student fear of quantitative work and/or inadequate quantitative skills.
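The nearest-neighbor index described above is the ratio of the observed mean nearest-neighbor distance to the expectation under a random (Poisson) pattern. A short sketch on a synthetic point pattern; a perfectly regular square grid gives an index of 2, the random expectation gives 1, and clustering pushes it below 1.

```python
import numpy as np

def nearest_neighbor_index(points, area):
    """Nearest-neighbor index R = observed / expected mean nearest-neighbor
    distance. R ~ 1 for random patterns, R < 1 clustered, R > 1 regular."""
    pts = np.asarray(points, float)
    n = len(pts)
    d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=2)
    np.fill_diagonal(d, np.inf)            # exclude each point from itself
    observed = d.min(axis=1).mean()
    expected = 0.5 / np.sqrt(n / area)     # Poisson-pattern expectation
    return observed / expected

# Regular 10 x 10 grid of "sinkholes" in a unit square: R should be 2
g = np.linspace(0.05, 0.95, 10)
grid = np.array([(x, y) for x in g for y in g])
R = nearest_neighbor_index(grid, area=1.0)
```

For map exercises, the point coordinates would come from UTM readings of sinkhole centers and `area` from the map extent; edge corrections, omitted here, matter for small samples.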

Rick Ford

423

Pulsed-field gradient nuclear magnetic resonance, previously used for measuring droplet size distributions in emulsions, has been used to measure bubble size distributions in a non-overflowing pneumatic gas-liquid foam that has been created by sparging propane into an aqueous solution of 1.5 g/l (5.20 mM) SDS. The bubble size distributions measured were reproducible and approximated a Weibull distribution. However, the bubble size distributions did not materially change with position at which they were measured within the froth. An analysis of foam coarsening due to Ostwald ripening in a non-overflowing foam indicates that, for the experimental conditions employed, one would not expect this to be a significant effect. It is therefore apparent that the eventual collapse of the foam is due to bubble bursting (or surface coalescence) rather than Ostwald ripening. This surface coalescence occurs because of evaporation from the free surface of the foam. An analytical solution for the liquid fraction profile for a certain class of non-overflowing pneumatic foam is given, and a mean bubble size that is appropriate for drainage calculations is suggested. PMID:20832808
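Checking that a measured size distribution "approximates a Weibull distribution" starts with fitting one. Below is a self-contained maximum-likelihood sketch using the standard fixed-point iteration for the shape parameter, run on synthetic diameters rather than the NMR data of the paper.

```python
import numpy as np

def weibull_mle(x, iters=100):
    """Two-parameter Weibull ML fit: fixed-point iteration on the shape k
    (1/k = sum(x^k ln x)/sum(x^k) - mean(ln x)), then closed-form scale."""
    x = np.asarray(x, float)
    logx = np.log(x)
    k = 1.0
    for _ in range(iters):
        xk = x ** k
        k = 1.0 / ((xk * logx).sum() / xk.sum() - logx.mean())
    scale = ((x ** k).sum() / len(x)) ** (1.0 / k)
    return k, scale

# Synthetic "bubble diameters" (mm): Weibull with shape 2.0, scale 0.5 mm
rng = np.random.default_rng(1)
diam = 0.5 * rng.weibull(2.0, size=2000)
k_hat, scale_hat = weibull_mle(diam)
```

A goodness-of-fit check (e.g. a probability plot of the fitted distribution against the empirical one) would follow the fit.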

Stevenson, Paul; Sederman, Andrew J; Mantle, Mick D; Li, Xueliang; Gladden, Lynn F

2010-12-01

424

Reliability analysis of uniaxially ground brittle materials

The fast fracture strength distribution of uniaxially ground, alpha silicon carbide was investigated as a function of grinding angle relative to the principal stress direction in flexure. Both as-ground and ground/annealed surfaces were investigated. The resulting flexural strength distributions were used to verify reliability models and predict the strength distribution of larger plate specimens tested in biaxial flexure. Complete fractography was done on the specimens. Failures occurred from agglomerates, machining cracks, or hybrid flaws that consisted of a machining crack located at a processing agglomerate. Annealing eliminated failures due to machining damage. Reliability analyses were performed using two and three-parameter Weibull and Batdorf methodologies. The Weibull size effect was demonstrated for machining flaws. Mixed mode reliability models reasonably predicted the strength distributions of uniaxial flexure and biaxial plate specimens.
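The Weibull size effect demonstrated here follows from weakest-link statistics: under a two-parameter Weibull model, the characteristic strength of two specimen geometries scales with their effective volumes (or areas) as sigma_2 = sigma_1 * (V_1/V_2)^(1/m). A sketch with hypothetical numbers, not the silicon carbide data of the paper:

```python
def weibull_size_scaling(sigma_1, V1, V2, m):
    """Predicted characteristic strength of a specimen with effective
    volume V2, given characteristic strength sigma_1 at volume V1 and
    Weibull modulus m, under weakest-link (two-parameter Weibull) scaling."""
    return sigma_1 * (V1 / V2) ** (1.0 / m)

# Hypothetical: flexure bars with effective volume 1 mm^3 and characteristic
# strength 400 MPa, m = 10; predict plates with 100x the effective volume.
sigma_plate = weibull_size_scaling(400.0, 1.0, 100.0, 10.0)
```

The larger specimen is predicted to be weaker, since a greater volume is more likely to sample a critical flaw; a low modulus m amplifies the effect.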

Salem, J.A.; Nemeth, N.N. [National Aeronautics and Space Administration, Cleveland, OH (United States). Lewis Research Center; Powers, L.M.; Choi, S.R. [Cleveland State Univ., OH (United States). Dept. of Civil Engineering

1996-10-01

425

Reliability Analysis of Uniaxially Ground Brittle Materials

NASA Technical Reports Server (NTRS)

The fast fracture strength distribution of uniaxially ground, alpha silicon carbide was investigated as a function of grinding angle relative to the principal stress direction in flexure. Both as-ground and ground/annealed surfaces were investigated. The resulting flexural strength distributions were used to verify reliability models and predict the strength distribution of larger plate specimens tested in biaxial flexure. Complete fractography was done on the specimens. Failures occurred from agglomerates, machining cracks, or hybrid flaws that consisted of a machining crack located at a processing agglomerate. Annealing eliminated failures due to machining damage. Reliability analyses were performed using two and three parameter Weibull and Batdorf methodologies. The Weibull size effect was demonstrated for machining flaws. Mixed mode reliability models reasonably predicted the strength distributions of uniaxial flexure and biaxial plate specimens.

Salem, Jonathan A.; Nemeth, Noel N.; Powers, Lynn M.; Choi, Sung R.

1995-01-01

426

Reliability analysis of uniaxially ground brittle materials

NASA Astrophysics Data System (ADS)

The fast fracture strength distribution of uniaxially ground, alpha silicon carbide was investigated as a function of grinding angle relative to the principal stress direction in flexure. Both as-ground and ground/annealed surfaces were investigated. The resulting flexural strength distributions were used to verify reliability models and predict the strength distribution of larger plate specimens tested in biaxial flexure. Complete fractography was done on the specimens. Failures occurred from agglomerates, machining cracks, or hybrid flaws that consisted of a machining crack located at a processing agglomerate. Annealing eliminated failures due to machining damage. Reliability analyses were performed using two and three parameter Weibull and Batdorf methodologies. The Weibull size effect was demonstrated for machining flaws. Mixed mode reliability models reasonably predicted the strength distributions of uniaxial flexure and biaxial plate specimens.

Salem, Jonathan A.; Nemeth, Noel N.; Powers, Lynn M.; Choi, Sung R.

1995-02-01

427

This short communication determines the strength of two glass polyalkenoate cements that differ from each other through the composition of their glass phase. Sample sets of n=5, 10, 20 and 30 were formulated and tested in biaxial flexure. The derived mean for each sample set was compared against the Weibull characteristic strength. The mean and corresponding characteristic strength show a maximum percentage difference of 10.1%, and the 95% confidence intervals calculated from the mean data encompass the corresponding characteristic strength down to a sample set of n=5. This suggests that, for brittle materials such as glass polyalkenoate cements, it is acceptable to test only five samples of each material in biaxial flexure and the resultant 95% confidence intervals will encompass the corresponding Weibull characteristic strength of the material. PMID:25553555

Mehrvar, Cina; Curran, Declan J; Alhalawani, Adel M F; Boyd, Daniel; Towler, Mark

2015-03-01

428

A FORTRAN program for multivariate survival analysis on the personal computer.

In this paper a FORTRAN program is presented for multivariate survival or life table regression analysis in a competing-risks situation. The relevant failure rate (for example, a particular disease or mortality rate) is modelled as a log-linear function of a vector of (possibly time-dependent) explanatory variables. The explanatory variables may also include the variable time itself, which is useful for parameterizing piecewise exponential time-to-failure distributions in a Gompertz-like or Weibull-like way as a more efficient alternative to Cox's proportional hazards model. Maximum likelihood estimates of the coefficients of the log-linear relationship are obtained from the iterative Newton-Raphson method. The program runs on a personal computer under DOS; running time is quite acceptable, even for large samples. PMID:3180754
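The core of such a program is Newton-Raphson maximization of a likelihood with a log-linear rate. Below is a minimal sketch for an exponential failure-time model with rate lambda_i = exp(x_i . beta), without the censoring, competing risks, or time-dependent covariates the program handles; the data are synthetic.

```python
import numpy as np

def exp_loglinear_fit(X, t, iters=25):
    """ML fit of exponential failure times with log-linear rate
    lambda_i = exp(X_i . beta), via Newton-Raphson.
    Log-likelihood: sum_i [X_i.beta - exp(X_i.beta) * t_i]."""
    beta = np.zeros(X.shape[1])
    for _ in range(iters):
        lam = np.exp(X @ beta)
        grad = X.T @ (1.0 - lam * t)          # score vector
        hess = -(X.T * (lam * t)) @ X         # negative definite Hessian
        beta -= np.linalg.solve(hess, grad)   # Newton step
    return beta

# Synthetic check: intercept + one covariate, true beta = (0.5, -1.0)
rng = np.random.default_rng(2)
X = np.column_stack([np.ones(5000), rng.normal(size=5000)])
lam_true = np.exp(X @ np.array([0.5, -1.0]))
t = rng.exponential(1.0 / lam_true)
beta = exp_loglinear_fit(X, t)
```

Because the log-likelihood is concave in beta, the Newton iterations converge rapidly, which is why this scheme was practical even on 1980s personal computers.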

Mulder, P G

1988-01-01

429

NASA Astrophysics Data System (ADS)

We have developed a new analysis method for obtaining the power spectrum in the horizontal phase velocity domain from airglow intensity image data to study atmospheric gravity waves. This method can deal with extensive amounts of imaging data obtained on different years and at various observation sites without bias caused by different event extraction criteria for the person processing the data. The new method was applied to sodium airglow data obtained in 2011 at Syowa Station (69°S, 40°E), Antarctica. The results were compared with those obtained from a conventional event analysis in which the phase fronts were traced manually in order to estimate horizontal characteristics, such as wavelengths, phase velocities, and wave periods. The horizontal phase velocity of each wave event in the airglow images corresponded closely to a peak in the spectrum. The statistical results of spectral analysis showed an eastward offset of the horizontal phase velocity distribution. This could be interpreted as the existence of wave sources around the stratospheric eastward jet. Similar zonal anisotropy was also seen in the horizontal phase velocity distribution of the gravity waves by the event analysis. Both methods produce similar statistical results about directionality of atmospheric gravity waves. Galactic contamination of the spectrum was examined by calculating the apparent velocity of the stars and found to be limited for phase speeds lower than 30 m/s. In conclusion, our new method is suitable for deriving the horizontal phase velocity characteristics of atmospheric gravity waves from an extensive amount of imaging data.

Matsuda, Takashi S.; Nakamura, Takuji; Ejiri, Mitsumu K.; Tsutsumi, Masaki; Shiokawa, Kazuo

2014-08-01

430

The first criticality accident in Japan occurred in a uranium processing plant in Tokai-mura on September 30, 1999. The accident, which occurred while a large amount of enriched uranyl nitrate solution was being loaded into a tank, led to a chain reaction that continued for 20 h. Two workers who were pouring the uranium solution into the tank at the time were heterogeneously exposed to neutrons and gamma rays produced by nuclear fission. Analysis of dose distributions was essential for the understanding of the clinical course observed in the skin and organs of these workers. We developed a numerical simulation system, which consists of mathematical human models and Monte Carlo radiation transport programs, for analyzing dose distributions in various postures and applied the system to the dose analysis for the two workers. This analysis revealed the extreme heterogeneity of the doses from neutrons and gamma rays in the skin and body, which depended on the positions and postures of the workers. The detailed dose analysis presented here using color maps is indispensable for an understanding of the biological effects of high-dose exposure to a mixed field of neutrons and gamma rays as well as for the development of emergency treatments for victims of radiation exposure. PMID:12643798

Endo, Akira; Yamaguchi, Yasuhiro

2003-04-01

431

A quantitative quantum-chemical analysis tool for the distribution of mechanical force in molecules

The promising field of mechanochemistry suffers from a general lack of understanding of the distribution and propagation of force in a stretched molecule, which limits its applicability up to the present day. In this article, we introduce the JEDI (Judgement of Energy DIstribution) analysis, which is the first quantum chemical method that provides a quantitative understanding of the distribution of mechanical stress energy among all degrees of freedom in a molecule. The method is carried out on the basis of static or dynamic calculations under the influence of an external force and makes use of a Hessian matrix in redundant internal coordinates (bond lengths, bond angles, and dihedral angles), so that all relevant degrees of freedom of a molecule are included and mechanochemical processes can be interpreted in a chemically intuitive way. The JEDI method is characterized by its modest computational effort, with the calculation of the Hessian being the rate-determining step, and delivers, except for the harmonic approximation, exact ab initio results. We apply the JEDI analysis to several example molecules in both static quantum chemical calculations and Born-Oppenheimer Molecular Dynamics simulations in which molecules are subject to an external force, thus studying not only the distribution and the propagation of strain in mechanically deformed systems, but also gaining valuable insights into the mechanochemically induced isomerization of trans-3,4-dimethylcyclobutene to trans,trans-2,4-hexadiene. The JEDI analysis can potentially be used in the discussion of sonochemical reactions, molecular motors, mechanophores, and photoswitches as well as in the development of molecular force probes.

Stauch, Tim; Dreuw, Andreas, E-mail: dreuw@uni-heidelberg.de [Interdisciplinary Center for Scientific Computing, University of Heidelberg, Im Neuenheimer Feld 368, 69120 Heidelberg (Germany)]

2014-04-07

432

NASA Astrophysics Data System (ADS)

The effects of specimen size on the compressive strength and Weibull modulus were investigated for nuclear graphite of different coke particle sizes: IG-110 and NBG-18 (average coke particle size for IG-110: 25 μm; NBG-18: 300 μm). Two types of cylindrical specimens, i.e., where the diameter to length ratio was 1:2 (ASTM C 695-91 type specimen, 1:2 specimen) or 1:1 (1:1 specimen), were prepared for six diameters (3, 4, 5, 10, 15, and 20 mm) and tested at room temperature (compressive strain rate: 2.08 × 10⁻⁴ s⁻¹). Anisotropy was considered during specimen preparation for NBG-18. The results showed that the effects of specimen size appeared negligible for the compressive strength, but grade-dependent for the Weibull modulus. In view of specimen miniaturization, deviations from the ASTM C 695-91 specimen size requirements require an investigation into the effects of size for the grade of graphite of interest, and the specimen size effects should be considered for Weibull modulus determination.

Chi, Se-Hwan

2013-05-01
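The Weibull modulus determination discussed above can be illustrated with a small sketch. This is not the study's actual procedure or data; it assumes synthetic strengths drawn from a known Weibull law and estimates the modulus by linear regression on the linearized CDF (the probability-plot method) with median-rank plotting positions:

```python
import numpy as np

def weibull_modulus(strengths):
    """Estimate the Weibull modulus (shape) and characteristic strength
    (scale) by linear regression on the linearized two-parameter
    Weibull CDF: ln(-ln(1 - F)) = m*ln(sigma) - m*ln(sigma0)."""
    s = np.sort(np.asarray(strengths, dtype=float))
    n = len(s)
    F = (np.arange(1, n + 1) - 0.5) / n      # median-rank plotting positions
    x = np.log(s)
    y = np.log(-np.log(1.0 - F))
    m, c = np.polyfit(x, y, 1)               # slope is the Weibull modulus
    sigma0 = np.exp(-c / m)                  # characteristic strength
    return m, sigma0

# Synthetic check: strengths drawn from a known Weibull(m=10, sigma0=80 MPa)
rng = np.random.default_rng(0)
sample = 80.0 * rng.weibull(10.0, size=500)
m_hat, sigma0_hat = weibull_modulus(sample)
```

With a few hundred specimens the recovered modulus and characteristic strength land close to the generating values; for small samples the regression estimator is known to be mildly biased, which is one reason specimen-size effects matter for modulus determination.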

433

NASA Astrophysics Data System (ADS)

Distributed hydrological models enhance the analysis and explanation of environmental processes. As more spatial input data and time series become available, more analysis of the sensitivity of simulations to these data is required. Most research so far has focused on the sensitivity of distributed hydrological models to precipitation data. However, such results cannot be compared until a universal approach to quantifying the sensitivity of a model to spatial data is available. Among remote sensing data, snow cover is frequently tested and used in distributed models. Snow cover fraction (SCF) remote sensing products are readily available, e.g. the MODIS snow cover product MOD10A1 (daily snow cover fraction at 500 m spatial resolution). In this work a spatial sensitivity analysis (SA) of remotely sensed SCF from MOD10A1 was conducted with the distributed WetSpa model. The aim is to investigate whether the WetSpa model is differently subjected to SCF uncertainty in different areas of the model domain. The analysis was extended to look not only at SA quantities but also to relate them to the physical parameters and processes in the study area. The study area is the Biebrza River catchment, Poland, a semi-natural catchment subject to a spring snowmelt regime. Hydrological simulations are performed with the distributed WetSpa model over a simulation period of 2 hydrological years. For the SA, the Latin-Hypercube One-factor-At-a-Time (LH-OAT) algorithm is used, with a set of different response functions on a regular 4 × 4 km grid. The results show that the spatial patterns of sensitivity can be easily interpreted by the co-occurrence of different landscape features. Moreover, the spatial patterns of the SA results are related to the WetSpa spatial parameters and to different physical processes. Based on these results, it is clear that a spatial approach to SA can be performed with the proposed algorithm and that the MOD10A1 SCF is spatially sensitive in the WetSpa model.

Berezowski, Tomasz; Chormański, Jarosław; Nossent, Jiri; Batelaan, Okke

2014-05-01
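The LH-OAT algorithm used above combines a Latin-Hypercube sample of base points with one-at-a-time perturbations around each of them. A minimal sketch, using a toy response function rather than the WetSpa model, and averaging absolute relative output changes as the sensitivity measure:

```python
import numpy as np

def lh_oat(model, bounds, n_samples=10, delta=0.05, seed=1):
    """Latin-Hypercube One-factor-At-a-Time sensitivity (minimal sketch).
    For each LH base point, each parameter is perturbed in turn by a
    fraction `delta` of its range; the absolute relative change in the
    model response is averaged over all base points."""
    rng = np.random.default_rng(seed)
    bounds = np.asarray(bounds, dtype=float)
    k = len(bounds)
    # Latin-Hypercube sample in [0, 1]^k: one stratum per sample and
    # parameter, with an independent permutation per parameter
    strata = np.tile(np.arange(n_samples), (k, 1))
    u = (rng.permuted(strata, axis=1).T + rng.random((n_samples, k))) / n_samples
    base = bounds[:, 0] + u * (bounds[:, 1] - bounds[:, 0])
    sens = np.zeros(k)
    for x in base:
        y0 = model(x)
        for j in range(k):
            xp = x.copy()
            xp[j] += delta * (bounds[j, 1] - bounds[j, 0])
            sens[j] += abs((model(xp) - y0) / y0)
    return sens / n_samples

# Toy response: strongly driven by p0, weakly by p1
f = lambda p: 10.0 * p[0] + 0.1 * p[1] + 5.0
S = lh_oat(f, bounds=[(0.0, 1.0), (0.0, 1.0)])
```

The dominant parameter receives a much larger sensitivity index than the weak one; in the spatial application described above, one such index is computed per grid cell and response function.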

434

The particle size distribution of foods during gastric digestion indicates the amount of physical breakdown that occurred due to the peristaltic movement of the stomach walls in addition to the breakdown that initially occurred during oral processing. The objective of this study was to present an image analysis technique that was rapid, simple, and could distinguish between food components (that is, rice kernel and bran layer in brown rice). The technique was used to quantify particle breakdown of brown and white rice during gastric digestion in growing pigs (used as a model for an adult human) over 480 min of digestion. The particle area distributions were fit to a Rosin-Rammler distribution function. Brown and white rice exhibited considerable breakdown as the number of particles per image decreased over time. The median particle area (x(50)) increased during digestion, suggesting a gastric sieving phenomenon, where small particles were emptied and larger particles were retained for additional breakdown. Brown rice breakdown was further quantified by an examination of the bran layer fragments and rice grain pieces. The percentage of total particle area composed of bran layer fragments was greater in the distal stomach than the proximal stomach in the first 120 min of digestion. The results of this study showed that image analysis may be used to quantify particle breakdown of a soft food product during gastric digestion, discriminate between different food components, and help to clarify the role of food structure and processing in food breakdown during gastric digestion. PMID:23923993

Bornhorst, Gail M; Kostlan, Kevin; Singh, R Paul

2013-09-01
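The Rosin-Rammler fit of the particle area distributions can be sketched as follows. The data here are synthetic (drawn from a known Rosin-Rammler/Weibull law), not the digestion measurements; the fit recovers the parameters from the empirical undersize curve and derives the median particle area x(50) from them:

```python
import numpy as np
from scipy.optimize import curve_fit

def rosin_rammler_cdf(x, x_mean, n):
    """Cumulative Rosin-Rammler (Weibull) undersize fraction."""
    return 1.0 - np.exp(-(x / x_mean) ** n)

# Synthetic particle areas from a known Rosin-Rammler distribution
# (assumed parameters: mean size 3.0, spread exponent 1.8)
rng = np.random.default_rng(2)
areas = 3.0 * rng.weibull(1.8, size=400)

# Empirical undersize curve from the sorted areas
a = np.sort(areas)
F_emp = (np.arange(1, len(a) + 1) - 0.5) / len(a)

(x_mean_hat, n_hat), _ = curve_fit(rosin_rammler_cdf, a, F_emp, p0=(1.0, 1.0))

# Median particle area x50 follows from the fitted parameters
x50 = x_mean_hat * np.log(2.0) ** (1.0 / n_hat)
```

Tracking x50 over digestion time, as in the study above, then quantifies gastric sieving: a rising x50 indicates preferential emptying of small particles.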

435

Statistical analysis of secondary particle distributions in relativistic nucleus-nucleus collisions

NASA Technical Reports Server (NTRS)

The use is described of several statistical techniques to characterize structure in the angular distributions of secondary particles from nucleus-nucleus collisions in the energy range 24 to 61 GeV/nucleon. The objective of this work was to determine whether there are correlations between emitted particle intensity and angle that may be used to support the existence of the quark gluon plasma. The techniques include chi-square null hypothesis tests, the method of discrete Fourier transform analysis, and fluctuation analysis. We have also used the method of composite unit vectors to test for azimuthal asymmetry in a data set of 63 JACEE-3 events. Each method is presented in a manner that provides the reader with some practical detail regarding its application. Of those events with relatively high statistics, Fe approaches 0 at 55 GeV/nucleon was found to possess an azimuthal distribution with a highly non-random structure. No evidence of non-statistical fluctuations was found in the pseudo-rapidity distributions of the events studied. It is seen that the most effective application of these methods relies upon the availability of many events or single events that possess very high multiplicities.

Mcguire, Stephen C.

1987-01-01
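A chi-square null hypothesis test of the kind applied to the angular distributions can be sketched as follows, using synthetic azimuthal angles rather than the JACEE event data; events are binned in azimuth and the observed counts compared with the uniform (random-emission) expectation:

```python
import numpy as np
from scipy.stats import chi2

def azimuthal_chi2(phi, n_bins=12):
    """Chi-square test of the null hypothesis that azimuthal emission
    angles are uniform on [0, 2*pi); returns statistic and p-value."""
    counts, _ = np.histogram(phi, bins=n_bins, range=(0.0, 2.0 * np.pi))
    expected = len(phi) / n_bins
    stat = np.sum((counts - expected) ** 2 / expected)
    return stat, chi2.sf(stat, df=n_bins - 1)

rng = np.random.default_rng(3)
# Isotropic events: the null should not be strongly rejected...
_, p_iso = azimuthal_chi2(rng.uniform(0.0, 2.0 * np.pi, 2000))
# ...while a strongly anisotropic sample should reject it decisively
aniso = np.concatenate([rng.normal(np.pi, 0.3, 1500) % (2.0 * np.pi),
                        rng.uniform(0.0, 2.0 * np.pi, 500)])
_, p_aniso = azimuthal_chi2(aniso)
```

As the abstract notes, the power of such tests depends strongly on multiplicity: with few particles per event, only gross non-randomness is detectable.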

436

To investigate whether treating cancer patients with erythropoiesis-stimulating agents (ESAs) would increase the mortality risk, Bennett et al. [Journal of the American Medical Association 299 (2008) 914–924] conducted a meta-analysis with the data from 52 phase III trials comparing ESAs with placebo or standard of care. With a standard parametric random effects modeling approach, the study concluded that ESA administration was significantly associated with increased average mortality risk. In this article we present a simple nonparametric inference procedure for the distribution of the random effects. We re-analyzed the ESA mortality data with the new method. Our results about the center of the random effects distribution were markedly different from those reported by Bennett et al. Moreover, our procedure, which estimates the distribution of the random effects, as opposed to just a simple population average, suggests that the ESA may be beneficial to mortality for approximately a quarter of the study populations. This new meta-analysis technique can be implemented with study-level summary statistics. In contrast to existing methods for parametric random effects models, the validity of our proposal does not require the number of studies involved to be large. From the results of an extensive numerical study, we find that the new procedure performs well even with moderate individual study sample sizes.

Wang, Rui; Tian, Lu; Cai, Tianxi; Wei, L. J.

2011-01-01

437

Application of digital image analysis for size distribution measurements of microbubbles

This work employs digital image analysis to measure the size distribution of microbubbles generated by the process of electroflotation for use in solid/liquid separation processes. Microbubbles are used for separations in the mineral processing industry and also in the treatment of potable water and wastewater. As the bubbles move upward in a solid/liquid column due to buoyancy, particles collide with and attach to the bubbles and are carried to the surface of the column, where they are removed by skimming. The removal efficiency of solids is strongly affected by the size of the bubbles; in general, higher separation is achieved by a smaller bubble size. The primary focus of this study was to characterize the size and size distribution of bubbles generated in electroflotation using image analysis. The study found that bubble diameter increased slightly as the current density applied to the system was increased. Additionally, electroflotation produces a uniform bubble size with a narrow distribution, which optimizes the removal of fine particles from solution.

Burns, S.E.; Yiacoumi, S.; Frost, D. [Georgia Inst. of Tech., Atlanta, GA (United States). School of Civil and Environmental Engineering; Tsouris, C. [Oak Ridge National Lab., TN (United States). Chemical Technology Div.

1997-03-01

438

Analysis of crater distribution in mare units on the lunar far side

NASA Astrophysics Data System (ADS)

Mare material is asymmetrically distributed on the moon. The earth-facing hemisphere, where the crust is believed to be 26 km thinner than on the farside, contains substantially more basaltic mare material. Using Lunar Topographic Orthophoto Maps, the thickness of the mare material in three farside craters, Aitken (0.59 km), Isaev (1.0 km), and Tsiolkovskiy (1.75 km), was calculated. Crater frequency distributions in five farside mare units (Aitken, Isaev, Lacus Solitudinis, Langemak, and Tsiolkovskiy) and one light plains unit (in Mendeleev) were also studied. Nearly 10,000 farside craters were counted. Analysis of the crater frequency on the light plains unit gives an age of 4.3 billion yr. Crater frequency distributions on the mare units indicate ages of 3.7 and 3.8 billion yr, suggesting that the units were emplaced over a narrow time period of approximately 100 million yr. Returned lunar samples from nearside maria give dates as young as 3.1 billion yr. The results of this study suggest that mare basalt emplacement on the far side ceased before it did on the near side.

Walker, A. S.; El-Baz, F.

1982-08-01

439

Mode-distribution analysis of quasielastic neutron scattering and application to liquid water.

A quasielastic neutron scattering (QENS) experiment is a particular technique that endeavors to define a relationship between time and space for the diffusion dynamics of atoms and molecules. However, in most cases, analyses of QENS data are model dependent, which may distort attempts to elucidate the actual diffusion dynamics. We have developed a method for processing QENS data without a specific model, wherein all modes can be described as combinations of the relaxations based on the exponential law. By this method, we can obtain a distribution function B(Q,ω), which we call the mode-distribution function (MDF), to represent the number of relaxation modes and distributions of the relaxation times in the modes. The deduction of MDF is based on the maximum entropy method and is very versatile in QENS data analysis. To verify this method, reproducibility was checked against several analytical models, such as that with a mode of distributed relaxation time, that with two modes closely located, and that represented by the Kohlrausch-Williams-Watts function. We report the first application to experimental data of liquid water. In addition to the two known modes, the existence of a relaxation mode of water molecules with an intermediate time scale has been discovered. We propose that the fast mode might be assigned to an intermolecular motion and the intermediate motion might be assigned to a rotational motion of the water molecules instead of to the fast mode. PMID:23848682

Kikuchi, Tatsuya; Nakajima, Kenji; Ohira-Kawamura, Seiko; Inamura, Yasuhiro; Yamamuro, Osamu; Kofu, Maiko; Kawakita, Yukinobu; Suzuya, Kentaro; Nakamura, Mitsutaka; Arai, Masatoshi

2013-06-01

440

Analysis of the Effects of Streamwise Lift Distribution on Sonic Boom Signature

NASA Technical Reports Server (NTRS)

Investigation of sonic boom has been one of the major areas of study in aeronautics due to the benefits a low-boom aircraft has in both civilian and military applications. This work conducts a numerical analysis of the effects of streamwise lift distribution on the shock coalescence characteristics. A simple wing-canard-stabilator body model is used in the numerical simulation. The streamwise lift distribution is varied by fixing the canard at a deflection angle while trimming the aircraft with the wing and the stabilator at the desired lift coefficient. The lift and the pitching moment coefficients are computed using Missile DATCOM v. 707. The flow field around the wing-canard-stabilator body model is resolved using the OVERFLOW-2 flow solver. Overset/chimera grid topology is used to simplify the grid generation of various configurations representing different streamwise lift distributions. The numerical simulations are performed without viscosity unless it is required for numerical stability. All configurations are simulated at Mach 1.4, an angle of attack of 1.5°, a lift coefficient of 0.05, and a pitching moment coefficient of approximately 0. Four streamwise lift distribution configurations were tested.

Yoo, Paul

2013-01-01

441

The examination of brain tumor growth and its variability among cancer patients is an important aspect of epidemiologic and medical data. Several studies of brain tumors have interpreted descriptive data; in this study we perform inference to the extent possible, suggesting possible explanations for the differences in survival rates apparent in the epidemiologic data. Population-based information from nine registries in the USA is classified with respect to age, gender, race, and tumor histology to study tumor size variation. The Weibull and Dagum distributions are fitted to the highly skewed tumor size distributions; the parametric analysis of the tumor sizes showed significant differentiation between sexes, increased skewness for both the male and female populations, as well as decreased kurtosis for the black female population. The effect of population characteristics on the distribution of tumor sizes is estimated by a quantile regression model and then compared with ordinary least squares results. The higher quantiles of the distribution of tumor sizes for whites are significantly higher than those of other races. Our model predicted that the effect of age in the lower quantiles of the tumor size distribution is negative, given the variables race and sex. We apply probability and regression models to explore the effects of demographic and histology types and observe significant racial and gender differences in the form of the distributions. Efforts are made to link tumor size data with available survival rates in relation to other prognostic variables. PMID:23675268

Pokhrel, Keshav P.; Vovoras, Dimitrios; Tsokos, Chris P.

2012-01-01
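The Weibull fit to skewed size data can be sketched with standard maximum-likelihood tools. This uses synthetic data rather than the registry tumor sizes, fixes the location parameter at zero, and derives an upper quantile from the fitted parameters:

```python
import numpy as np
from scipy.stats import weibull_min

# Synthetic, right-skewed "size" data from a known Weibull law
# (assumed shape 1.5, scale 25.0)
rng = np.random.default_rng(4)
sizes = weibull_min.rvs(1.5, loc=0, scale=25.0, size=1000, random_state=rng)

# Maximum-likelihood fit of the two-parameter Weibull (location fixed at 0)
shape_hat, loc_hat, scale_hat = weibull_min.fit(sizes, floc=0)

# Upper quantiles of the fitted distribution, e.g. the 99th percentile
p99 = weibull_min.ppf(0.99, shape_hat, loc=0, scale=scale_hat)
```

Model-based upper quantiles such as p99 are more stable than their empirical counterparts in the sparse tail, which is why both this abstract and the ALARA work in the header report percentiles derived from the fitted parameters.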

442

Phylogenetic Analysis and Comparative Genomics of Purine Riboswitch Distribution in Prokaryotes

Riboswitches are regulatory RNA that control gene expression by undergoing conformational changes on ligand binding. Using phylogenetic analysis and comparative genomics we have been able to identify the class of genes/operons regulated by the purine riboswitch and obtain a high-resolution map of purine riboswitch distribution across all bacterial groups. In the process, we are able to explain the absence of purine riboswitches upstream to specific genes in certain genomes. We also identify the point of origin of various purine riboswitches and argue that not all purine riboswitches are of primordial origin, and that some purine riboswitches must have originated after the divergence of certain Firmicute orders in the course of evolution. Our study also reveals the role of horizontal transfer events in accounting for the presence of purine riboswitches in some gammaproteobacterial species. Our work provides significant insights into the origin, distribution and regulatory role of purine riboswitches in prokaryotes. PMID:23170063

Singh, Payal; Sengupta, Supratim

2012-01-01

443

Analysis and modeling of information flow and distributed expertise in space-related operations.

Evolving space operations requirements and mission planning for long-duration expeditions require detailed examinations and evaluations of information flow dynamics, knowledge-sharing processes, and information technology use in distributed expert networks. This paper describes the work conducted with flight controllers in the Mission Control Center (MCC) of NASA's Johnson Space Center. This MCC work describes the behavior of experts in a distributed supervisory coordination framework, which extends supervisory control/command and control models of human task performance. Findings from this work are helping to develop analysis techniques, information architectures, and system simulation capabilities for knowledge sharing in an expert community. These findings are being applied to improve knowledge-sharing processes applied to a research program in advanced life support for long-duration space flight. Additional simulation work is being developed to create interoperating modules of information flow and novice/expert behavior patterns. PMID:15835058

Caldwell, Barrett S

2005-01-01

444

This study introduces two semi-quantitative methods, Structured Subjective Assessment (SSA) and Control of Substances Hazardous to Health (COSHH) Essentials, in conjunction with two-dimensional Monte Carlo simulations for determining prior probabilities. A prior distribution based on expert judgment was included for comparison. Practical applications of the proposed methods were demonstrated using personal exposure measurements of isoamyl acetate in an electronics manufacturing facility and of isopropanol in a printing shop. The applicability of these methods in real workplaces was discussed based on the advantages and disadvantages of each method. Although these methods could not be completely independent of expert judgments, this study demonstrated a methodological improvement in the estimation of the prior distribution for the Bayesian decision analysis tool. The proposed methods provide a logical basis for the decision process by considering determinants of worker exposure. PMID:23252451

Lee, Eun Gyung; Kim, Seung Won; Feigley, Charles E; Harper, Martin

2013-01-01

445

Preliminary analysis of the span-distributed-load concept for cargo aircraft design

NASA Technical Reports Server (NTRS)

A simplified computer analysis of the span-distributed-load airplane (in which payload is placed within the wing structure) has shown that the span-distributed-load concept has high potential for application to future air cargo transport design. Significant increases in payload fraction over current wide-bodied freighters are shown for gross weights in excess of 0.5 Gg (1,000,000 lb). A cruise-matching calculation shows that the trend toward higher aspect ratio improves overall efficiency; that is, less thrust and fuel are required. The optimal aspect ratio probably is not determined by structural limitations. Terminal-area constraints and increasing design-payload density, however, tend to limit aspect ratio.

Whitehead, A. H., Jr.

1975-01-01

446

EXERGY ANALYSIS OF THE CRYOGENIC HELIUM DISTRIBUTION SYSTEM FOR THE LARGE HADRON COLLIDER (LHC)

The Large Hadron Collider (LHC) at CERN features the world's largest helium cryogenic system, spreading over the 26.7 km circumference of the superconducting accelerator. With a total equivalent capacity of 145 kW at 4.5 K including 18 kW at 1.8 K, the LHC refrigerators produce an unprecedented exergetic load, which must be distributed efficiently to the magnets in the tunnel over the 3.3 km length of each of the eight independent sectors of the machine. We recall the main features of the LHC cryogenic helium distribution system at different temperature levels and present its exergy analysis, thus enabling us to qualify second-principle efficiency and to identify the main remaining sources of irreversibility.

Claudet, S.; Lebrun, Ph.; Tavian, L.; Wagner, U. [CERN, CH-1211, Geneva, 23 (Switzerland)

2010-04-09
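The exergetic load quoted above can be bounded with the reversible (Carnot) limit: extracting heat Q at temperature Tc against an ambient T0 requires at least Q(T0/Tc − 1) of mechanical power. A minimal sketch, assuming a 300 K ambient and, as a simplification of the quoted "equivalent capacity", treating the loads as 18 kW at 1.8 K plus the remaining 127 kW at 4.5 K:

```python
def carnot_exergy(q_watts, t_cold, t_ambient=300.0):
    """Minimum (reversible) mechanical power needed to extract heat
    q_watts at t_cold against an ambient at t_ambient:
    E = Q * (T0/Tc - 1), the exergy of the refrigeration load."""
    return q_watts * (t_ambient / t_cold - 1.0)

# Figures quoted in the abstract above, split as a simplification
e_18 = carnot_exergy(18e3, 1.8)            # 18 kW at 1.8 K
e_45 = carnot_exergy(145e3 - 18e3, 4.5)    # remaining 127 kW at 4.5 K
total_exergy_mw = (e_18 + e_45) / 1e6
```

The reversible load alone comes to roughly 11 MW; real refrigerators operate at a fraction of Carnot efficiency, which is exactly what a second-principle (exergy) analysis of the distribution system quantifies.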

447

Methods and apparatuses for information analysis on shared and distributed computing systems

Apparatuses and computer-implemented methods for analyzing, on shared and distributed computing systems, information comprising one or more documents are disclosed according to some aspects. In one embodiment, information analysis can comprise distributing one or more distinct sets of documents among each of a plurality of processes, wherein each process performs operations on a distinct set of documents substantially in parallel with other processes. Operations by each process can further comprise computing term statistics for terms contained in each distinct set of documents, thereby generating a local set of term statistics for each distinct set of documents. Still further, operations by each process can comprise contributing the local sets of term statistics to a global set of term statistics, and participating in generating a major term set from an assigned portion of a global vocabulary.

Bohn, Shawn J [Richland, WA; Krishnan, Manoj Kumar [Richland, WA; Cowley, Wendy E [Richland, WA; Nieplocha, Jarek [Richland, WA

2011-02-22
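The local-to-global term statistics pattern described in the abstract above can be sketched sequentially (a real implementation would run the per-partition step in parallel processes); the helper names here are illustrative, not from the patent:

```python
from collections import Counter

def local_term_stats(docs):
    """Per-process step: term frequencies and document frequencies
    for one distinct set of documents."""
    tf, df = Counter(), Counter()
    for doc in docs:
        terms = doc.lower().split()
        tf.update(terms)        # every occurrence counts
        df.update(set(terms))   # each document counts a term once
    return tf, df

def merge_global(local_stats):
    """Reduction step: combine the local sets of term statistics
    into a single global set."""
    g_tf, g_df = Counter(), Counter()
    for tf, df in local_stats:
        g_tf += tf
        g_df += df
    return g_tf, g_df

# Two "processes", each owning a distinct set of documents
partitions = [["the cat sat", "the dog ran"],
              ["a cat ran", "the cat slept"]]
g_tf, g_df = merge_global(local_term_stats(p) for p in partitions)
```

Because the local counters are additive, the merge is order-independent, which is what lets each process work on its partition substantially in parallel before contributing to the global statistics.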

448

Temporal modeling frameworks often operate on scalar variables by summarizing data at initial stages as statistical summaries of the underlying distributions. For instance, DTI analysis often employs summary statistics, like the mean, for regions of interest and properties along fiber tracts for population studies and hypothesis testing. This reduction via discarding of variability information may introduce significant errors which propagate through the procedures. We propose a novel framework which uses distribution-valued variables to retain and utilize the local variability information. Classic linear regression is adapted to employ these variables for model estimation. The increased stability and reliability of our proposed method, when compared with regression using single-valued statistical summaries, is demonstrated in a validation experiment with synthetic data. Our driving application is the modeling of age-related changes along DTI white matter tracts. Results are shown for the spatiotemporal population trajectory of the genu tract estimated from 45 healthy infants and compared with that of a patient with Krabbe disease. PMID:25356194

Sharma, Anuja; Fletcher, P. Thomas; Gilmore, John H.; Escolar, Maria L.; Gupta, Aditya; Styner, Martin; Gerig, Guido

2014-01-01

449

Ion energy distribution analysis of the TVA plasma ignited in carbon vapours using RFA

NASA Astrophysics Data System (ADS)

In order to understand plasma processes and to obtain technological control in thin film deposition, the study of surface-plasma interactions is essential. Apart from the type and flux of the impinging ions/neutral atoms on the surface, the ion energy distribution (IED) is an important parameter in understanding surface modification due to the plasma. In this paper, results of ion energy analysis of the Thermionic Vacuum Arc (TVA) plasma ignited in carbon vapours are presented. An in-house, computer-controlled retarding field analyzer was used to determine experimentally the ion energy distributions of the carbon ions arriving at the substrate. The correlation of the carbon IED with the applied arc voltage in the TVA plasma was demonstrated for the first time.

Surdu-Bob, C. C.; Badulescu, M.; Iacob, C.; Porosnicu, C.; Lungu, C. P.

2010-01-01
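The retarding field analyzer reduction can be sketched as follows: the collector current as a function of retarding voltage is the integral of the ion energy distribution above that voltage, so the IED is recovered (up to a constant) as the negative derivative of the I-V curve. This example uses synthetic data with an assumed Gaussian IED, not the TVA measurements:

```python
import numpy as np

def ied_from_iv(v_retard, i_collector):
    """The ion energy distribution (for singly charged ions) is
    proportional to the negative derivative of collector current
    with respect to retarding voltage; clip noise-induced negatives."""
    ied = -np.gradient(i_collector, v_retard)
    return np.clip(ied, 0.0, None)

# Synthetic I-V curve for an assumed Gaussian IED centred at 60 eV
v = np.linspace(0.0, 150.0, 301)
true_ied = np.exp(-((v - 60.0) / 10.0) ** 2)
dv = v[1] - v[0]
i_coll = true_ied[::-1].cumsum()[::-1] * dv   # current above each voltage

est = ied_from_iv(v, i_coll)
peak_ev = v[np.argmax(est)]
```

In practice the measured I-V curve is noisy and differentiation amplifies that noise, so smoothing is usually applied before or during the derivative step.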

450

A distributed analysis and visualization system for model and observational data

NASA Technical Reports Server (NTRS)

The objective of this proposal is to develop an integrated and distributed analysis and display software system which can be applied to all areas of the Earth System Science to study numerical model and earth observational data from storm to global scale. This system will be designed to be easy to use, portable, flexible and easily extensible and to adhere to current and emerging standards whenever possible. It will provide an environment for visualization of the massive amounts of data generated from satellites and other observational field measurements and from model simulations during or after their execution. Two- and three-dimensional animation will also be provided. This system will be based on a widely used software package from NASA called GEMPAK and prototype software for three dimensional interactive displays built at NCSA. The underlying foundation of the system will be a set of software libraries which can be distributed across a UNIX based supercomputer and workstations.

Wilhelmson, Robert; Koch, Steven

1992-01-01

451

A distributed analysis and visualization system for model and observational data

NASA Technical Reports Server (NTRS)

The objective of this proposal is to develop an integrated and distributed analysis and display software system which can be applied to all areas of the Earth System Science to study numerical model and earth observational data from storm to global scale. This system will be designed to be easy to use, portable, flexible and easily extensible and designed to adhere to current and emerging standards whenever possible. It will provide an environment for visualization of the massive amounts of data generated from satellites and other observational field measurements and from model simulations during or after their execution. Two- and three-dimensional animation will also be provided. This system will be based on a widely used software package from NASA called GEMPAK and prototype software for three-dimensional interactive displays built at NCSA. The underlying foundation of the system will be a set of software libraries which can be distributed across a UNIX based supercomputer and workstations.

Wilhelmson, Robert; Koch, Steven

1993-01-01

452

X-ray fluorescence analysis of iron and manganese distribution in primary dopaminergic neurons.

Transition metals have been suggested to play a pivotal role in the pathogenesis of Parkinson's disease. X-ray microscopy combined with a cryogenic setup is a powerful method for elemental imaging in low concentrations and high resolution in intact cells, eliminating the need for fixation and sectioning of the specimen. Here, we performed an elemental distribution analysis in cultured primary midbrain neurons with a step size in the order of 300 nm and ~ 0.1 ppm sensitivity under cryo conditions by using X-ray fluorescence microscopy. We report the elemental mappings on the subcellular level in primary mouse dopaminergic (DAergic) and non-DAergic neurons after treatment with transition metals. Application of Fe(2+) resulted in largely extracellular accumulation of iron without preference for the neuronal transmitter subtype. A quantification of different Fe oxidation states was performed using X-ray absorption near edge structure analysis. After treatment with Mn(2+) , a cytoplasmic/paranuclear localization of Mn was observed preferentially in DAergic neurons, while no prominent signal was detectable after Mn(3+) treatment. Immunocytochemical analysis correlated the preferential Mn uptake to increased expression of voltage-gated calcium channels in DAergic neurons. We discuss the implications of this differential elemental distribution for the selective vulnerability of DAergic neurons and Parkinson's disease pathogenesis. PMID:23106162

Dučić, Tanja; Barski, Elisabeth; Salome, Murielle; Koch, Jan C; Bähr, Mathias; Lingor, Paul

2013-01-01

453

A method is proposed for the local atomic distribution function analysis of amorphous materials. This method is based on local halo-electron diffraction intensity analysis with nano-sized electron probes as small as 25 to approximately 3 nm, taking advantage of the intensity recording with imaging plate. Nanodiffraction and selected area electron diffraction (SAED) patterns from an amorphous SiNx (x approximately 4/3) thin film were taken using a conventional transmission electron microscope operated at 200 kV and recorded on imaging plates. An intensity correction to omit inelastic intensity was made using electron energy-loss spectroscopy. When a beam-convergence angle is larger than 1 × 10⁻³ rad, the Wiener-filter deconvolution method becomes helpful in producing atomic pair distribution functions (PDFs) from the nano-diffraction intensity profiles that are more similar to the PDF from the SAED intensity. This technique was applied to the analysis of local amorphous structures of SiO2 layers formed by an oxygen-ion implantation into single crystal SiC. PMID:11918407

Hirotsu, Y; Ishimaru, M; Ohkubo, T; Hanada, T; Sugiyama, M

2001-01-01
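Producing a pair distribution function from a diffraction intensity profile ultimately reduces to a sine Fourier transform of the reduced structure factor. A minimal numerical sketch with a synthetic single-distance F(Q), not the SiNx data, where the recovered G(r) should peak at the assumed interatomic distance:

```python
import numpy as np

def reduced_pdf(q, f_q, r):
    """Reduced pair distribution function
    G(r) = (2/pi) * integral F(Q) sin(Qr) dQ, with F(Q) = Q[S(Q) - 1],
    evaluated as a simple Riemann sum on a uniform Q grid."""
    dq = q[1] - q[0]
    return (2.0 / np.pi) * (f_q * np.sin(np.outer(r, q))).sum(axis=1) * dq

# Synthetic reduced structure factor for a single interatomic distance
# of 2.35 angstrom, Gaussian-damped to mimic a finite measured Q range
q = np.linspace(0.05, 20.0, 800)
f_q = np.sin(q * 2.35) * np.exp(-0.01 * q ** 2)

r = np.linspace(0.5, 6.0, 551)
g_r = reduced_pdf(q, f_q, r)
r_peak = r[np.argmax(g_r)]
```

The finite Q range is what causes termination ripples in G(r), and it is also why deconvolution of the probe convergence (as in the abstract above) matters before the transform.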

454

NASA Astrophysics Data System (ADS)

Foraminiferal data from 36 offshore wells on the Labrador Shelf, Grand Banks, and Scotian Shelf have been analyzed statistically for biostratigraphic correlation and for systematic trends in distribution related to paleobiogeography. Ranking and Scaling (RASC) of the data allows the recognition of reliable assemblage zones, grouped for this analysis into six well-defined time slices. Correspondence analysis shows clearly geographic trends in faunal distribution, differing according to latitude. About one-half of the taxa are planktonic; many of these are restricted to southern and more offshore wells that were influenced by the presence of a proto-Gulf Stream. The remaining taxa are predominantly benthonic, and may be allocated broadly to two groups, one with widespread species occurring throughout the region, and a smaller group that is restricted to northern wells on the Labrador Shelf, possibly favored by the influence of terrigenous sediment supply. This threefold effect of southern planktonics, ubiquitous benthonics, and minor northern benthonics is recognized throughout the Cenozoic, with minor fluctuations. During Middle—Late Eocene, a large percentage of taxa are restricted northerly benthonics, reflecting the fossilferous, thick terrigenous mudstone sequence in northern wells. During Early—Middle Miocene, the southerly restricted planktonics predominate, reflecting Gulf Stream influence during climatic warming. In the late Neogene, a small group of benthonics are relatively ubiquitous due to the onset of the shelfbound Labrador current. The combined use of RASC and correspondence analysis provides a good tool for unscrambling the influence of both time and paleoenvironment on this dataset.

Bonham-Carter, G. F.; Gradstein, F. M.; D'Iorio, M. A.

455

X-ray fluorescence analysis of iron and manganese distribution in primary dopaminergic neurons

Transition metals have been suggested to play a pivotal role in the pathogenesis of Parkinson's disease. X-ray microscopy combined with a cryogenic setup is a powerful method for elemental imaging in low concentrations and high resolution in intact cells, eliminating the need for fixation and sectioning of the specimen. Here, we performed an elemental distribution analysis in cultured primary midbrain neurons with a step size in the order of 300 nm and ? 0.1 ppm sensitivity under cryo conditions by using X-ray fluorescence microscopy. We report the elemental mappings on the subcellular level in primary mouse dopaminergic (DAergic) and non-DAergic neurons after treatment with transition metals. Application of Fe2+ resulted in largely extracellular accumulation of iron without preference for the neuronal transmitter subtype. A quantification of different Fe oxidation states was performed using X-ray absorption near edge structure analysis. After treatment with Mn2+, a cytoplasmic/paranuclear localization of Mn was observed preferentially in DAergic neurons, while no prominent signal was detectable after Mn3+ treatment. Immunocytochemical analysis correlated the preferential Mn uptake to increased expression of voltage-gated calcium channels in DAergic neurons. We discuss the implications of this differential elemental distribution for the selective vulnerability of DAergic neurons and Parkinson's disease pathogenesis. PMID:23106162

Dučić, Tanja; Barski, Elisabeth; Salome, Murielle; Koch, Jan C; Bähr, Mathias; Lingor, Paul

2013-01-01

456

Simulations based on finite element analysis (FEA) have attracted increasing interest in dentistry and dental anthropology for evaluating the stress and strain distribution in teeth under occlusal loading conditions. Nonetheless, FEA is usually applied without considering changes in contacts between antagonistic teeth during the occlusal power stroke. In this contribution we show how occlusal information can be used to investigate the stress distribution with 3D FEA in lower first molars (M1). The antagonistic crowns M1 and P2–M1 of two dried modern human skulls were scanned by µCT in maximum intercuspation (centric occlusion) contact. A virtual analysis of the occlusal power stroke between M1 and P2–M1 was carried out in the Occlusal Fingerprint Analyser (OFA) software, and the occlusal trajectory path was recorded, while contact areas per time-step were visualized and quantified. The stress distribution in the M1 at selected occlusal stages was analyzed in Strand7, using occlusal information taken from the OFA results to set the loading direction and loading area at each stage. Our FEA results show that the stress pattern changes considerably during the power stroke, suggesting that wear facets have a crucial influence on the distribution of stress across the whole tooth. Grooves and fissures on the occlusal surface are critical locations, as tensile stresses are concentrated at these features. Properly accounting for the power-stroke kinematics of occluding teeth yields quite different results (less tensile stress in the crown) than the usual loading scenarios based on forces parallel to the long axis of the tooth. This leads to the conclusion that functional studies considering the kinematics of teeth are important to understand biomechanics and to interpret the morphological adaptation of teeth. PMID:21615398

Benazzi, Stefano; Kullmer, Ottmar; Grosse, Ian R; Weber, Gerhard W

2011-01-01

457

NASA Astrophysics Data System (ADS)

The XMM-Newton Scientific Analysis System (SAS) is the software used for the reduction and calibration of data taken with the XMM-Newton satellite instruments, leading to almost 400 refereed scientific papers published in the last 2.5 years. Its maintenance, further development and distribution are the responsibility of the XMM-Newton Science Operations Centre together with the Survey Science Centre, representing a collaborative effort of more than 30 scientific institutes. Developed in C++, Fortran 90/95 and Perl, the SAS makes extensive use of open software packages such as ds9 for image display (SAO-R&D Software Suite), Grace, LHEASOFT and cfitsio (HEASARC project), pgplot, fftw and the non-commercial version of Qt (TrollTech). The combination of supporting several versions of SAS on multiple platforms (including SunOS, DEC, many Linux flavours and MacOS) in a widely distributed development process that relies on a suite of external packages and libraries poses substantial challenges for the integrity of SAS maintenance and development. A further challenge comes from the need to keep the software flexible enough to evolve with progress in instrument calibration and analysis refinement, while at the same time remaining the source of all official products of the XMM-Newton mission. To cope with these requirements, a sophisticated system for the continuous integration and testing of different branches on several platforms has been put in place, on top of a refined development model designed for this special software-development case. The SAS is now considered a mature system. We present the different aspects of its development, maintenance and distribution, extracting lessons learned for present and future projects of this magnitude.

Gabriel, C.; Denby, M.; Fyfe, D. J.; Hoar, J.; Ibarra, A.; Ojero, E.; Osborne, J.; Saxton, R. D.; Lammers, U.; Vacanti, G.

2004-07-01

458

This paper introduces our approach to modeling the mechanical behavior of cellular ceramics, through the example of calcium phosphate scaffolds made by robocasting for bone-tissue engineering. The Weibull theory is used to describe the statistical failure of the scaffolds' constitutive rods, and the Sanchez-Palencia theory of periodic homogenization is used to link the rod and scaffold scales. Uniaxial compression of scaffolds and three-point bending of rods were performed to calibrate and validate the model. While calibration based on rod-scale data leads to over-conservative predictions of scaffold properties (as successive rod failures are not taken into account), we show that, for a given rod diameter, calibration based on scaffold-scale data leads to very satisfactory predictions for a wide range of rod spacings, i.e. of scaffold porosity, as well as for different loading conditions. This work establishes the proposed model as a reliable tool for understanding and optimizing the mechanical properties of cellular ceramics. PMID:23439936
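The Weibull treatment of rod failure mentioned above can be illustrated with a short numerical sketch: bending strengths are ranked, assigned median-rank failure probabilities, and fitted on the linearized Weibull plot to recover the Weibull modulus m and characteristic strength. The strength values and variable names below are invented for illustration; this is not the authors' calibration code.

```python
import numpy as np

# Hypothetical three-point-bending strengths (MPa), sorted ascending.
strengths = np.sort(np.array([312., 355., 371., 398., 410., 425., 447., 480.]))
n = strengths.size

# Median-rank probability estimator: P_i = (i - 0.3) / (n + 0.4)
P = (np.arange(1, n + 1) - 0.3) / (n + 0.4)

# Linearized Weibull plot: ln(-ln(1 - P)) = m * ln(s) - m * ln(s0)
x = np.log(strengths)
y = np.log(-np.log(1.0 - P))
m, intercept = np.polyfit(x, y, 1)   # slope = Weibull modulus m
s0 = np.exp(-intercept / m)          # characteristic strength (63.2% failure)

def P_f(s):
    """Failure probability at stress s (unit reference volume assumed)."""
    return 1.0 - np.exp(-(s / s0) ** m)
```

By construction, `P_f(s0)` equals 1 - e^(-1) ≈ 0.632; a homogenization step (not sketched) would then propagate this rod-scale law to the scaffold scale.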

Genet, Martin; Houmard, Manuel; Eslava, Salvador; Saiz, Eduardo; Tomsia, Antoni P.

2012-01-01

460

Oil spills represent a major threat to ocean ecosystems and their environmental status. Previous studies have shown that Synthetic Aperture Radar (SAR), whose acquisitions are independent of cloud cover and weather, can be used effectively for the detection and classification of oil spills. Dark-formation detection is the first and critical stage in oil-spill detection procedures. In this paper, a novel approach for automated dark-spot detection in SAR imagery is presented, combining an adaptive Weibull Multiplicative Model (WMM) with MultiLayer Perceptron (MLP) neural networks to differentiate between dark spots and the background. The results have been compared with those of a model combining a non-adaptive WMM with pulse-coupled neural networks. The presented approach removes the need to set the non-adaptive WMM filter parameters manually by developing an adaptive WMM model, a step toward fully automatic dark-spot detection. The proposed approach was tested on 60 ENVISAT and ERS2 images containing dark spots. For the overall dataset, an average accuracy of 94.65% was obtained. Our experimental results demonstrate that the proposed approach is robust and effective in cases where the non-adaptive WMM and pulse-coupled neural network (PCNN) model yields poor accuracies. PMID:25474376
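As a rough illustration of the Weibull side of such a pipeline (not the authors' adaptive WMM/MLP implementation), the sketch below fits a Weibull distribution to a synthetic SAR-like intensity image by regression on the empirical CDF and flags pixels below a low fitted quantile as dark-spot candidates. The synthetic image, the parameter values, and the 5% quantile are all assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic sea-clutter intensity: Weibull(shape=2, scale=1) speckle,
# with a darkened patch standing in for an oil-like slick.
sea = rng.weibull(2.0, size=(64, 64))
sea[20:30, 20:30] *= 0.1

# Fit Weibull shape k and scale by linear regression on the linearized
# empirical CDF: ln(-ln(1 - P)) = k * ln(v) - k * ln(scale).
v = np.sort(sea.ravel())
n = v.size
P = (np.arange(1, n + 1) - 0.3) / (n + 0.4)
k, b = np.polyfit(np.log(v), np.log(-np.log(1.0 - P)), 1)
scale = np.exp(-b / k)

# Flag pixels below the fitted 5% quantile as dark-spot candidates;
# a classifier stage (the MLP in the paper) would then screen these.
threshold = scale * (-np.log(0.95)) ** (1.0 / k)
dark_mask = sea < threshold
```

The darkened patch is flagged far more densely than the surrounding clutter; making the fit local and adaptive, as the paper does, is what handles spatially varying sea state.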

Taravat, Alireza; Oppelt, Natascha

2014-01-01