
1

Using Weibull Distribution Analysis to Evaluate ALARA Performance

As Low as Reasonably Achievable (ALARA) is the underlying principle for protecting nuclear workers from potential health outcomes related to occupational radiation exposure. Radiation protection performance is currently evaluated by measures such as collective dose and average measurable dose, which do not indicate ALARA performance. The purpose of this work is to show how statistical modeling of individual doses using the Weibull distribution can provide objective supplemental performance indicators for comparing ALARA implementation among sites and for insights into ALARA practices within a site. Maximum likelihood methods were employed to estimate the Weibull shape and scale parameters used for performance indicators. The shape parameter reflects the effectiveness of maximizing the number of workers receiving lower doses and is represented as the slope of the fitted line on a Weibull probability plot. Additional performance indicators derived from the model parameters include the 99th percentile and the exceedance fraction. When grouping sites by collective total effective dose equivalent (TEDE) and ranking by 99th percentile with confidence intervals, differences in performance among sites can be readily identified. Applying this methodology will enable more efficient and complete evaluation of the effectiveness of ALARA implementation.

E. L. Frome, J. P. Watkins, and D. A. Hagemeyer

2009-10-01
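The performance indicators in the abstract above (MLE shape and scale, the 99th percentile, and the exceedance fraction) can be sketched directly. Below is a minimal stdlib-only Python illustration, not the authors' code: the profile-likelihood bisection and the two indicator formulas are standard two-parameter Weibull results, and the synthetic "doses" are hypothetical.

```python
import math, random

def fit_weibull_mle(x):
    """Two-parameter Weibull fit by maximum likelihood.

    The shape k solves the standard profile-likelihood equation
        sum(x^k ln x)/sum(x^k) - 1/k - mean(ln x) = 0
    (monotone in k, so plain bisection works); the scale is then
        c = (mean(x^k))**(1/k).
    """
    n = len(x)
    mean_log = sum(math.log(v) for v in x) / n

    def g(k):
        s1 = sum(v ** k * math.log(v) for v in x)
        s0 = sum(v ** k for v in x)
        return s1 / s0 - 1.0 / k - mean_log

    lo, hi = 0.05, 50.0
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        if g(mid) > 0.0:
            hi = mid
        else:
            lo = mid
    k = 0.5 * (lo + hi)
    c = (sum(v ** k for v in x) / n) ** (1.0 / k)
    return k, c

def percentile(k, c, p):
    """Inverse CDF: x_p = c * (-ln(1 - p))**(1/k); p = 0.99 gives the
    99th-percentile indicator described in the abstract."""
    return c * (-math.log(1.0 - p)) ** (1.0 / k)

def exceedance_fraction(k, c, limit):
    """Modeled fraction of workers above a dose limit: exp(-(limit/c)**k)."""
    return math.exp(-((limit / c) ** k))

if __name__ == "__main__":
    random.seed(42)
    # Synthetic "doses" drawn from Weibull(shape=2, scale=5) by inversion.
    doses = [5.0 * (-math.log(random.random())) ** 0.5 for _ in range(5000)]
    k, c = fit_weibull_mle(doses)
    print(f"shape={k:.3f}, scale={c:.3f}, q99={percentile(k, c, 0.99):.2f}")
```

On a Weibull probability plot, the fitted shape k is the slope of the line, which is why the abstract uses it as a pooling-effectiveness indicator.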

2

Reliability analysis of structural ceramic components using a three-parameter Weibull distribution

NASA Technical Reports Server (NTRS)

Described here are nonlinear regression estimators for the three-parameter Weibull distribution. Issues relating to the bias and invariance associated with these estimators are examined numerically using Monte Carlo simulation methods. The estimators were used to extract parameters from sintered silicon nitride failure data. A reliability analysis was performed on a turbopump blade utilizing the three-parameter Weibull distribution and the estimates from the sintered silicon nitride data.

Duffy, Stephen F.; Powers, Lynn M.; Starlinger, Alois

1992-01-01

3

The beta exponentiated Weibull distribution

The Weibull distribution is one of the most important distributions in reliability. For the first time, we introduce the beta exponentiated Weibull distribution which extends recent models by Lee et al. [Beta-Weibull distribution: some properties and applications to censored data, J. Mod. Appl. Statist. Meth. 6 (2007), pp. 173–186] and Barreto-Souza et al. [The beta generalized exponential distribution, J. Statist.

Gauss M. Cordeiro; Antonio Eduardo Gomes; Cibele Queiroz da-Silva; Edwin M. M. Ortega

2011-01-01

4

NASA Technical Reports Server (NTRS)

The Weibull distribution has been widely adopted for the statistical description and inference of fatigue data. This document provides user instructions, examples, and verification for software to analyze gear fatigue test data. The software was developed presuming the data are adequately modeled using a two-parameter Weibull distribution. The calculations are based on likelihood methods, and the approach taken is valid for data that include type I censoring. The software was verified by reproducing results published by others.

Krantz, Timothy L.

2002-01-01

5

NASA Technical Reports Server (NTRS)

The Weibull distribution has been widely adopted for the statistical description and inference of fatigue data. This document provides user instructions, examples, and verification for software to analyze gear fatigue test data. The software was developed presuming the data are adequately modeled using a two-parameter Weibull distribution. The calculations are based on likelihood methods, and the approach taken is valid for data that include type I censoring. The software was verified by reproducing results published by others.

Krantz, Timothy L.

2002-01-01

6

Estimation of P(Y < X) for Weibull Distribution Based on Censored Samples

This paper deals with the estimation of R = P(Y < X), where X and Y are two independent Weibull random variables with different scale parameters but the same shape parameter. The results are based on censored data. Different methods for estimating R are proposed; the MLE, UMVUE and Bayes estimators are obtained. A numerical illustration is presented to compare the

Abd Elfattah

7

Weibull distribution parameters for fault feature extraction of rolling bearing

A novel approach to fault feature extraction using Weibull distribution parameters is proposed. After the original signal of bearing vibration is modeled as the Weibull distribution, its scale parameter is extracted as a new feature vector for the bearing running state. The test results of fault diagnosis of the rolling bearing verify that this new feature can catch the regularity

Peng Tao; Jiang Haiyan; Xie Yong

2011-01-01

8

Note on inventory models with Weibull distribution deterioration

This article explores the inventory model with a general demand rate function in which both the Weibull distributed deterioration and partial backlogging are considered. The inventory model discussed here is based on the important finding by Wu [Wu, K.S., 2001. An EOQ inventory model for items with Weibull distribution deterioration, ramp type demand rate and partial backlogging. Production Planning and

Gino K. Yang; Robert Lin; Jennifer Lin; Kuo-Chen Hung; Peter Chu; Wayne Chouhuang

2011-01-01

9

Program for Weibull Analysis of Fatigue Data

NASA Technical Reports Server (NTRS)

A Fortran computer program has been written for performing statistical analyses of fatigue-test data that are assumed to be adequately represented by a two-parameter Weibull distribution. This program calculates the following: (1) Maximum-likelihood estimates of the Weibull-distribution parameters; (2) Data for contour plots of relative likelihood for two parameters; (3) Data for contour plots of joint confidence regions; (4) Data for the profile likelihood of the Weibull-distribution parameters; (5) Data for the profile likelihood of any percentile of the distribution; and (6) Likelihood-based confidence intervals for parameters and/or percentiles of the distribution. The program can account for tests that are suspended without failure (the statistical term for such suspension of tests is "censoring"). The analytical approach followed in this program is valid for type-I censoring, which is the removal of unfailed units at pre-specified times. Confidence regions and intervals are calculated by use of the likelihood-ratio method.

Krantz, Timothy L.

2005-01-01
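The type-I censored likelihood idea this program implements can be outlined compactly. The sketch below is a rough stdlib-Python illustration, not a port of the Fortran code: suspended units enter the power sums of the standard censored profile-likelihood equation, while only actual failures enter the log-mean term. The synthetic gear lives and suspension time are hypothetical.

```python
import math, random

def fit_weibull_censored(times, failed):
    """Two-parameter Weibull MLE under type-I censoring: every unit
    contributes its observed time (failure time, or the pre-specified
    suspension time), and `failed` flags which units actually failed.

    Profile equation for the shape k (r = number of failures):
        sum(t^k ln t)/sum(t^k) - 1/k - (1/r) sum_failures(ln t) = 0
    then scale c = (sum(t^k)/r)**(1/k).
    """
    r = sum(failed)
    fail_log_mean = sum(math.log(t) for t, f in zip(times, failed) if f) / r

    def g(k):
        s1 = sum(t ** k * math.log(t) for t in times)
        s0 = sum(t ** k for t in times)
        return s1 / s0 - 1.0 / k - fail_log_mean

    lo, hi = 0.05, 50.0
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        if g(mid) > 0.0:
            hi = mid
        else:
            lo = mid
    k = 0.5 * (lo + hi)
    c = (sum(t ** k for t in times) / r) ** (1.0 / k)
    return k, c

if __name__ == "__main__":
    random.seed(1)
    # Synthetic lives from Weibull(shape=1.5, scale=10), suspended at t = 12.
    raw = [10.0 * (-math.log(random.random())) ** (1.0 / 1.5) for _ in range(4000)]
    times = [min(t, 12.0) for t in raw]
    failed = [t <= 12.0 for t in raw]
    print(fit_weibull_censored(times, failed))
```

Confidence regions, as the abstract notes, come from the likelihood-ratio method; this sketch stops at the point estimates.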

10

Weibull distribution based on maximum likelihood with interval inspection data

NASA Technical Reports Server (NTRS)

The two Weibull parameters are determined by the method of maximum likelihood. The test data used were failures observed at inspection intervals. The application was the reliability analysis of the SSME oxidizer turbine blades.

Rheinfurth, M. H.

1985-01-01

11

Weibull parameters for wind speed distribution in Saudi Arabia

The shape and scale parameters of a Weibull density distribution function are calculated for 10 locations in Saudi Arabia. The daily mean wind speed data from 1970 to mid-1990 are used for this purpose. It is found that the numerical values of the shape parameter vary between 1.7 and 2.7, whereas the value of the scale parameter is found to vary between 3 and 6. It is also concluded from this study that wind data are very well represented by the Weibull distribution function.

Rehman, S.; Halawani, T.O.; Husain, T. (King Fahd Univ. of Petroleum & Minerals, Dhahran (Saudi Arabia))

1994-12-01
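Given the shape range (1.7 to 2.7) and scale range (3 to 6) reported above, the mean wind speed and the third raw moment (which sets the mean wind-power density) follow from the standard Weibull moment formulas. A minimal Python sketch; the parameter pairs in the demo are illustrative endpoints, not site-specific fits:

```python
import math

def weibull_mean(k, c):
    """Mean of a Weibull(shape=k, scale=c) variable: c * Gamma(1 + 1/k)."""
    return c * math.gamma(1.0 + 1.0 / k)

def weibull_mean_cube(k, c):
    """Third raw moment E[v^3]; mean wind-power density is
    0.5 * rho * E[v^3]. Formula: c**3 * Gamma(1 + 3/k)."""
    return c ** 3 * math.gamma(1.0 + 3.0 / k)

if __name__ == "__main__":
    # Endpoints of the parameter ranges reported in the abstract.
    for k, c in [(1.7, 3.0), (2.7, 6.0)]:
        print(f"k={k}, c={c}: mean={weibull_mean(k, c):.2f} m/s, "
              f"E[v^3]={weibull_mean_cube(k, c):.1f}")
```

Because E[v^3] grows much faster with the scale than the mean does, small differences in c between sites translate into large differences in available wind power.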

12

Investigation of Weibull statistics in fracture analysis of cast aluminum

NASA Technical Reports Server (NTRS)

The fracture strengths of two large batches of A357-T6 cast aluminum coupon specimens were compared by using two-parameter Weibull analysis. The minimum number of these specimens necessary to find the fracture strength of the material was determined. The applicability of three-parameter Weibull analysis was also investigated. A design methodology based on the combination of elementary stress analysis and Weibull statistical analysis is advanced and applied to the design of a spherical pressure vessel shell. The results from this design methodology are compared with results from the applicable ASME pressure vessel code.

Holland, F. A., Jr.; Zaretsky, E. V.

1989-01-01

13

Weibull parameters for wind speed distribution in Saudi Arabia

The shape and scale parameters of a Weibull density distribution function are calculated for 10 locations in Saudi Arabia. The daily mean wind speed data from 1970 to mid-1990 are used for this purpose. It is found that the numerical values of the shape parameter vary between 1.7 and 2.7, whereas the value of the scale parameter is found to

S. Rehman; T. O. Halawani; T. Husain

1994-01-01

14

SIGNIFICANCE OF THE WEIBULL DISTRIBUTION AND ITS SUB-MODELS IN NATURAL IMAGE STATISTICS

Keywords: natural image statistics, Weibull distribution, model selection. Abstract: The contrast statistics and the occurrence of the content classes, as related to the global statistics, local statistics, and to human

Geusebroek, Jan-Mark

15

Large-Scale Weibull Analysis of H-451 Nuclear- Grade Graphite Specimen Rupture Data

NASA Technical Reports Server (NTRS)

A Weibull analysis was performed of the strength distribution and size effects for 2000 specimens of H-451 nuclear-grade graphite. The data, generated elsewhere, measured the tensile and four-point-flexure room-temperature rupture strength of specimens excised from a single extruded graphite log. Strength variation was compared with specimen location, size, and orientation relative to the parent body. In our study, data were progressively and extensively pooled into larger data sets to discriminate overall trends from local variations and to investigate the strength distribution. The CARES/Life and WeibPar codes were used to investigate issues regarding the size effect, Weibull parameter consistency, and nonlinear stress-strain response. Overall, the Weibull distribution described the behavior of the pooled data very well. However, the issue regarding the smaller-than-expected size effect remained. This exercise illustrated that a conservative approach using a two-parameter Weibull distribution is best for designing graphite components with low probability of failure for the in-core structures in the proposed Generation IV (Gen IV) high-temperature gas-cooled nuclear reactors. This exercise also demonstrated the continuing need to better understand the mechanisms driving stochastic strength response. Extensive appendixes are provided with this report to show all aspects of the rupture data and analytical results.

Nemeth, Noel N.; Walker, Andrew; Baker, Eric H.; Murthy, Pappu L.; Bratton, Robert L.

2012-01-01

16

Parameter estimation for a modified Weibull distribution, for progressively type-II censored samples

In this paper, the estimation of parameters based on a progressively Type-II censored sample from a modified Weibull distribution is studied. The likelihood equations, and the maximum likelihood estimators are derived. The estimators based on a least-squares fit of a multiple linear regression on a Weibull probability paper plot are compared with the MLE via Monte Carlo simulations. The observed

H. K. T. Ng

2005-01-01

17

NASA Astrophysics Data System (ADS)

Manufacturing of AgMg sheathed Bi2Sr2CaCu2O8+x superconducting tapes involves multiple processes. Microstructural studies across tape sections have shown that the microstructure is nonuniform across the tape. These nonuniformities are largely due to manufacturing defects, even in well-controlled manufacturing processes. Consequently, the electrical and mechanical properties vary in these different sections. Here, we report results from analyzing the electromechanical properties of AgMg sheathed Bi2Sr2CaCu2O8+x tapes in different sections using a statistical approach. Twenty-four samples were studied at each of three strains (0%, 0.25%, and 0.349%), for a total of 72 samples. The probability of electrical and mechanical failures of the tapes is then analyzed using two- and three-parameter Weibull distributions. It is found that the mechanical failure of these tapes is homogeneous, consistent with failure in the AgMg sheath, but that the electromechanical failure is inhomogeneous within the conductor and as a function of strain, indicating that this failure is dictated by failure in the inhomogeneous ceramic oxide superconducting filaments. This has important implications for the designs of superconducting magnets.

Mbaruku, A. L.; Schwartz, J.

2007-04-01

18

NASA Astrophysics Data System (ADS)

The powder sample of nickel oxide was synthesized by a sol-gel procedure. The isothermal reduction of nickel oxide using hydrogen was investigated by thermogravimetric analysis at five operating temperatures: 245, 255, 265, 275 and 300 °C. The kinetic triplet (Ea, A and f(α)) was determined using conventional and Weibull kinetic analysis. Both kinetic procedures show that the reduction process considered can be explained with a two-step kinetic model. It is established that at lower temperatures (245 °C ≤ T ≤ 255 °C) the reduction process is governed by the two-parameter Šesták-Berggren autocatalytic model (first step), and at higher temperatures (T ≥ 265 °C) by the Fn reaction model with different values of the parameter n (second step). In this paper, the complex manner of dependence of the Weibull shape parameter (β) on temperature is established. From the alteration of the Weibull shape parameter between lower temperatures (β > 1) and higher temperatures (β < 1), it was concluded that the isothermal reduction process of NiO using hydrogen can be described by a multistep reaction mechanism. These results are confirmed by the evaluated density distribution functions (ddf) of apparent activation energies (Ea), which show variations in basic characteristics at lower and higher operating temperature regions. Also, it is shown that the shape parameter (β) of the Weibull distribution function can represent a behaviour index, which indicates the kinetic pattern of the mechanism controlling the process studied.

Janković, B.

2007-12-01

19

The main contribution of this paper is to provide reasonable confidence intervals for maximum likelihood estimates of percentile points in dielectric breakdown voltage probability distributions. The Weibull distributions which include threshold values are often used in breakdown voltage distributions. In some breakdown voltage samples, there exist cases in which the maximum likelihood estimates of all the three Weibull parameters diverge;

H. Hirose

1996-01-01

20

On Weighted Least Squares Estimation for the Parameters of Weibull Distribution

The two-parameter Weibull distribution is one of the most widely used life distributions in reliability studies. It has been shown to be satisfactory in modeling the phenomena of fatigue and the life of many devices such as ball bearings, electric bulbs, capacitors, transistors, motors and automotive radiators. In recent years, a number of modifications of the traditional Weibull distribution have been proposed

L. F. Zhang; M. Xie; L. C. Tang

21

Predictive Failure of Cylindrical Coatings Using Weibull Analysis

NASA Technical Reports Server (NTRS)

Rotating, coated wiping rollers used in a high-speed printing application failed primarily from fatigue. Two coating materials were evaluated: a hard, cross-linked, plasticized polyvinyl chloride (PVC) and a softer, plasticized PVC. A total of 447 tests was conducted with these coatings in a production facility. The data were evaluated using Weibull analysis. The softer coating produced more than twice the life of the harder cross-linked coating and reduced the wiper replacement rate by two-thirds, resulting in minimum production interruption.

Vlcek, Brian L.; Hendricks, Robert C.; Zaretsky, Erwin V.

2002-01-01

22

NASA Technical Reports Server (NTRS)

Material characterization parameters obtained from naturally flawed specimens are necessary for reliability evaluation of non-deterministic advanced ceramic structural components. The least squares best fit method is applied to the three-parameter uniaxial Weibull model to obtain the material parameters from experimental tests on volume or surface flawed specimens subjected to pure tension, pure bending, four point or three point loading. Several illustrative example problems are provided.

Gross, Bernard

1996-01-01

23

Is the Weibull distribution really suited for wind statistics modeling and wind power evaluation?

Wind speed statistics are generally modeled using the Weibull distribution. This distribution is convenient since, with only two parameters (shape and scale), it fully characterizes analytically the shape of the distribution and the different moments of the wind speed (mean, standard deviation, skewness and kurtosis). It is broadly used in the wind energy sector to produce maps of wind energy potential. However, the Weibull distribution rests on empirical rather than physical justification and might display strong limitations in its applications. The philosophy of this article is to model the wind components instead of the wind speed itself. This provides more physical insight into the validity domain of the Weibull distribution as a possible model for wind statistics, and allows quantification of the error made by using such a distribution. We thereby propose alternative expressions for more suitable wind speed distributions.

Drobinski, Philippe

2012-01-01
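The component-based view advocated above has a simple special case worth keeping in mind: if the two horizontal wind components are independent zero-mean Gaussians with equal variance, the speed is exactly Rayleigh distributed, i.e. Weibull with shape k = 2. A quick stdlib-Python check of this with synthetic data (not the article's); the coefficient of variation is used as a scale-free fingerprint of the shape:

```python
import math, random

def simulated_speeds(sigma, n, seed=0):
    """Wind speeds built from two i.i.d. N(0, sigma^2) components;
    the resulting speed is Rayleigh, i.e. Weibull with shape k = 2."""
    rng = random.Random(seed)
    return [math.hypot(rng.gauss(0.0, sigma), rng.gauss(0.0, sigma))
            for _ in range(n)]

def coefficient_of_variation(values):
    """std/mean; for a Weibull this ratio pins down the shape k alone.
    For k = 2 (Rayleigh) it equals sqrt(4/pi - 1), about 0.523."""
    m = sum(values) / len(values)
    var = sum((v - m) ** 2 for v in values) / len(values)
    return math.sqrt(var) / m
```

When observed speeds have a coefficient of variation far from 0.523, the Gaussian-isotropic component model (and hence the k = 2 Weibull) cannot hold, which is one concrete way the article's component-level reasoning constrains the speed distribution.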

24

Flexural strength of fused silica: Weibull statistical analysis

NASA Astrophysics Data System (ADS)

The development of high-energy lasers has focused attention on the need to assess the mechanical strength of optical components made of fused silica (SiO2). The strength of this material is known to be highly dependent on the stressed area and the surface finish, but has not yet been properly characterized in the published literature. Recently, Detrio and collaborators at the University of Dayton Research Institute (UDRI) performed extensive ring-on-ring flexural strength measurements on fused SiO2 specimens ranging in size from 1 to 9 inches in diameter and of widely differing surface quality. In this contribution, we report on a Weibull statistical analysis of the UDRI data, an analysis based on the procedure outlined in Proc. SPIE 4375, 241 (2001). We demonstrate that: (a) a two-parameter Weibull model, including the area-scaling principle, applies; (b) the shape parameter (m ≈ 10) is essentially independent of the stressed area as well as the surface finish; (c) the characteristic strength (1-cm2 uniformly stressed area) obeys a linear law, σC (in MPa) ≈ 160 − 2.83 × PBS® (in ppm/sr), where PBS® measures the surface/subsurface "damage." In this light, we evaluate the cumulative failure probability of optically polished and superpolished fused SiO2 windows as a function of the biaxial tensile stress, for uniformly stressed areas ranging from 0.3 to 100 cm2.

Klein, Claude A.

2009-05-01

25

Composite Weibull-Inverse Transformed Gamma distribution and its actuarial application

NASA Astrophysics Data System (ADS)

This paper introduces a new composite model, namely, the composite Weibull-Inverse Transformed Gamma distribution, which assumes a Weibull distribution for the head up to a specified threshold and an inverse transformed gamma distribution beyond it. The closed form of the probability density function (pdf) as well as the estimation of parameters by the maximum likelihood method is presented. The model is compared with several benchmark distributions and their performances are measured. A well-known data set, the Danish fire loss data, is used for this purpose and its Value at Risk (VaR) using the new model is computed. In comparison to several standard models, the composite Weibull-Inverse Transformed Gamma model proved to be a competitive candidate.

Maghsoudi, Mastoureh; Bakar, Shaiful Anuar Abu; Hamzah, Nor Aishah

2014-07-01

26

NASA Astrophysics Data System (ADS)

The statistical properties of fracture strength of brittle and quasibrittle materials are often described in terms of the Weibull distribution. However, the weakest-link hypothesis, commonly used to justify it, is expected to fail when fracture occurs after significant damage accumulation. Here we show that this implies that the Weibull distribution is unstable in a renormalization-group sense for a large class of quasibrittle materials. Our theoretical arguments are supported by numerical simulations of disordered fuse networks. We also find that for brittle materials such as ceramics, the common assumption that the strength distribution can be derived from the distribution of preexisting microcracks by using Griffith's criteria is invalid. We attribute this discrepancy to crack bridging. Our findings raise questions about the applicability of Weibull statistics to most practical cases.

Bertalan, Zsolt; Shekhawat, Ashivni; Sethna, James P.; Zapperi, Stefano

2014-09-01

27

FITTING WEIBULL AND LOGNORMAL DISTRIBUTIONS TO MEDIUM-DENSITY FIBERBOARD FIBER AND WOOD PARTICLE

Suitable statistical functions that can be used to accurately describe wood fiber length distribution over... Tasman (1972) analyzed fiber lengths with a Bauer-McNett classifier. He discovered that the distribution

28

A Weibull distribution with power-law tails is confirmed as a good candidate to describe the first passage time process of foreign currency exchange rates. The Lorentz curve and the corresponding Gini coefficient for a Weibull distribution are derived analytically. We show that the coefficient is in good agreement with the same quantity calculated from the empirical data. We also calculate

Naoya Sazuka; Jun-Ichi Inoue

2007-01-01
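For a Weibull with shape k, the Gini coefficient derived from the Lorenz curve has the well-known closed form G = 1 − 2^(−1/k), independent of the scale. A minimal stdlib-Python cross-check (an illustration of the standard identity G = 1 − (1/μ)∫₀^∞ S(x)² dx with survival function S(x) = exp(−(x/c)^k), not the authors' empirical analysis):

```python
import math

def weibull_gini_closed(k):
    """Closed-form Gini coefficient of a Weibull(shape=k): 1 - 2**(-1/k).
    The scale drops out; the Gini is scale-free."""
    return 1.0 - 2.0 ** (-1.0 / k)

def weibull_gini_numeric(k, c, upper=None, steps=60000):
    """Cross-check via G = 1 - (1/mean) * integral_0^inf S(x)^2 dx,
    with S(x) = exp(-(x/c)**k), integrated by the trapezoid rule."""
    if upper is None:
        upper = 20.0 * c  # S(upper)^2 is negligible well before this
    mean = c * math.gamma(1.0 + 1.0 / k)
    h = upper / steps
    total = 0.0
    for i in range(steps + 1):
        x = i * h
        w = 0.5 if i in (0, steps) else 1.0
        total += w * math.exp(-2.0 * (x / c) ** k)
    return 1.0 - total * h / mean
```

The closed form makes the comparison with empirical Gini coefficients in the abstract a one-parameter check: once the shape of the fitted Weibull is known, the model's Gini is fixed.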

29

Bamboo is a composite material with a radial gradient variation, but the vascular bundles in the inner layer are evenly distributed. The objective is to determine the regular size pattern and Weibull statistical analysis of the vascular bundle tensile strength in the inner layer of Moso bamboo. The size and shape of vascular bundles in the inner layer are similar, with an average area of about 0.1550 mm2. A statistical evaluation of the tensile strength of the vascular bundles was conducted by means of Weibull statistics; the results show that the Weibull modulus m is 6.1121, and an accurate reliability assessment of the vascular bundle is determined. PMID:25016270

Le, Cui; Wanxi, Peng; Zhengjun, Sun; Lili, Shang; Guoning, Chen

2014-07-01

30

An EOQ Model with Two-Parameter Weibull Distribution Deterioration and Price-Dependent Demand

ERIC Educational Resources Information Center

An inventory replenishment policy is developed for a deteriorating item and price-dependent demand. The rate of deterioration is taken to be time-proportional and the time to deterioration is assumed to follow a two-parameter Weibull distribution. A power law form of the price dependence of demand is considered. The model is solved analytically…

Mukhopadhyay, Sushanta; Mukherjee, R. N.; Chaudhuri, K. S.

2005-01-01

31

Weibull-distributed dyke thickness reflects probabilistic character of host-rock strength.

Magmatic sheet intrusions (dykes) constitute the main form of magma transport in the Earth's crust. The size distribution of dykes is a crucial parameter that controls volcanic surface deformation and eruption rates and is required to realistically model volcano deformation for eruption forecasting. Here we present statistical analyses of 3,676 dyke thickness measurements from different tectonic settings and show that dyke thickness consistently follows the Weibull distribution. Known from materials science, power law-distributed flaws in brittle materials lead to Weibull-distributed failure stress. We therefore propose a dynamic model in which dyke thickness is determined by variable magma pressure that exploits differently sized host-rock weaknesses. The observed dyke thickness distributions are thus site-specific because rock strength, rather than magma viscosity and composition, exerts the dominant control on dyke emplacement. Fundamentally, the strength of geomaterials is scale-dependent and should be approximated by a probability distribution. PMID:24513695

Krumbholz, Michael; Hieronymus, Christoph F; Burchardt, Steffi; Troll, Valentin R; Tanner, David C; Friese, Nadine

2014-01-01

32

Weibull-distributed dyke thickness reflects probabilistic character of host-rock strength

Magmatic sheet intrusions (dykes) constitute the main form of magma transport in the Earth’s crust. The size distribution of dykes is a crucial parameter that controls volcanic surface deformation and eruption rates and is required to realistically model volcano deformation for eruption forecasting. Here we present statistical analyses of 3,676 dyke thickness measurements from different tectonic settings and show that dyke thickness consistently follows the Weibull distribution. Known from materials science, power law-distributed flaws in brittle materials lead to Weibull-distributed failure stress. We therefore propose a dynamic model in which dyke thickness is determined by variable magma pressure that exploits differently sized host-rock weaknesses. The observed dyke thickness distributions are thus site-specific because rock strength, rather than magma viscosity and composition, exerts the dominant control on dyke emplacement. Fundamentally, the strength of geomaterials is scale-dependent and should be approximated by a probability distribution. PMID:24513695

Krumbholz, Michael; Hieronymus, Christoph F.; Burchardt, Steffi; Troll, Valentin R.; Tanner, David C.; Friese, Nadine

2014-01-01

33

The study presents the analysis of wind speed data from seven stations in Saudi Arabia, measured at 20, 30, and 40 m height above ground level (AGL) over a period varying from 2 to 5 years. Specifically, Weibull parameters were calculated using five different methods, four of them based on the statistical analysis of the collected data and a fifth

Haralambos S. Bagiorgas; Mihalakakou Giouli; Shafiqur Rehman; Luai M. Al-Hadhrami

2011-01-01

34

Estimation of the parameters of the Weibull distribution from multi-censored samples

function. Vaughn [1965] also considered the maximum likelihood estimators of the Weibull distribution from Type I data. His method of iteration was the Newton-Raphson method. 4. UNEQUAL CENSORING NUMBERS. Maximum Likelihood Estimation... must be by iterative techniques. Equation (4.13) is to be solved for one parameter by the Newton-Raphson method, and the other parameter is then calculated from equation (4.12) with that value substituted. The Newton-Raphson method is applicable to functions of one variable which...

Sprinkle, Edgar Eugene

2012-06-07

35

Probabilistic Analysis for Comparing Fatigue Data Based on Johnson-Weibull Parameters

NASA Technical Reports Server (NTRS)

Probabilistic failure analysis is essential when analysis of stress-life (S-N) curves is inconclusive in determining the relative ranking of two or more materials. In 1964, L. Johnson published a methodology for establishing the confidence that two populations of data are different. Simplified algebraic equations for confidence numbers were derived based on the original work of L. Johnson. Using the ratios of mean life, the resultant values of confidence numbers deviated less than one percent from those of Johnson. It is possible to rank the fatigue lives of different materials with a reasonable degree of statistical certainty based on combined confidence numbers. These equations were applied to rotating beam fatigue tests that were conducted on three aluminum alloys at three stress levels each. These alloys were AL 2024, AL 6061, and AL 7075. The results were analyzed and compared using ASTM Standard E739-91 and the Johnson-Weibull analysis. The ASTM method did not statistically distinguish between AL 6061 and AL 7075. Based on the Johnson-Weibull analysis confidence numbers greater than 99 percent, AL 2024 was found to have the longest fatigue life, followed by AL 7075, and then AL 6061. The ASTM Standard and the Johnson-Weibull analysis result in the same stress-life exponent p for each of the three aluminum alloys at the median or L(sub 50) lives.

Hendricks, Robert C.; Zaretsky, Erwin V.; Vlcek, Brian L.

2007-01-01

36

Weibull mixture model for isoconversional kinetic analysis of biomass oxidative pyrolysis

NASA Astrophysics Data System (ADS)

In this work, the possibility of applying the weighted sum of three cumulative Weibull distribution functions for the fitting of the kinetic conversion data of biomass oxidative pyrolysis has been investigated. The kinetic conversion data of the thermal decomposition of olive oil solid waste in oxygen atmosphere for different heating rates have been analyzed. The results have shown that the experimental data can be perfectly reproduced by the general fitting function. Therefore, it is possible to obtain the corresponding conversion rate values of biomass oxidative pyrolysis by differentiating directly the fitted kinetic conversion data. Additionally, the logistic mixture model has been applied to the same experimental data. It can be found that the newly proposed function can provide a better fit of the data than the logistic mixture model. Based on the fitting of the Weibull mixture model, the kinetic triplets (E, A and f(α)) of oxidative pyrolysis of olive solid waste were obtained by means of Friedman's differential isoconversional method.

Cai, J. M.; Chen, S. Y.

2010-03-01

37

We compared two regression models, which are based on the Weibull and probit functions, for the analysis of pesticide toxicity data from laboratory studies on Illinois crop and native plant species. Both mathematical models are continuous, differentiable, strictly positive, and...

38

Probabilistic Analysis for Comparing Fatigue Data Based on Johnson-Weibull Parameters

NASA Technical Reports Server (NTRS)

Leonard Johnson published a methodology for establishing the confidence that two populations of data are different. Johnson's methodology is dependent on limited combinations of test parameters (Weibull slope, mean life ratio, and degrees of freedom) and a set of complex mathematical equations. In this report, a simplified algebraic equation for confidence numbers is derived based on the original work of Johnson. The confidence numbers calculated with this equation are compared to those obtained graphically by Johnson. Using the ratios of mean life, the resultant values of confidence numbers at the 99 percent level deviate less than 1 percent from those of Johnson. At a 90 percent confidence level, the calculated values differ between +2 and 4 percent. The simplified equation is used to rank the experimental lives of three aluminum alloys (AL 2024, AL 6061, and AL 7075), each tested at three stress levels in rotating beam fatigue, analyzed using the Johnson-Weibull method, and compared to the ASTM Standard (E739-91) method of comparison. The ASTM Standard did not statistically distinguish between AL 6061 and AL 7075. However, it is possible to rank the fatigue lives of different materials with a reasonable degree of statistical certainty based on combined confidence numbers using the Johnson-Weibull analysis. AL 2024 was found to have the longest fatigue life, followed by AL 7075, and then AL 6061. The ASTM Standard and the Johnson-Weibull analysis result in the same stress-life exponent p for each of the three aluminum alloys at the median, or L(sub 50), lives.

Vlcek, Brian L.; Hendricks, Robert C.; Zaretsky, Erwin V.

2013-01-01

39

Visual fields measured with standard automated perimetry are a benchmark test for determining retinal function in ocular pathologies such as glaucoma. Their monitoring over time is crucial in detecting change in disease course and, therefore, in prompting clinical intervention and defining endpoints in clinical trials of new therapies. However, conventional change detection methods do not take into account non-stationary measurement variability or spatial correlation present in these measures. An inferential statistical model, denoted ‘Analysis with Non-Stationary Weibull Error Regression and Spatial enhancement’ (ANSWERS), was proposed. In contrast to commonly used ordinary linear regression models, which assume normally distributed errors, ANSWERS incorporates non-stationary variability modelled as a mixture of Weibull distributions. Spatial correlation of measurements was also included into the model using a Bayesian framework. It was evaluated using a large dataset of visual field measurements acquired from electronic health records, and was compared with other widely used methods for detecting deterioration in retinal function. ANSWERS was able to detect deterioration significantly earlier than conventional methods, at matched false positive rates. Statistical sensitivity in detecting deterioration was also significantly better, especially in short time series. Furthermore, the spatial correlation utilised in ANSWERS was shown to improve the ability to detect deterioration, compared to equivalent models without spatial correlation, especially in short follow-up series. ANSWERS is a new efficient method for detecting changes in retinal function. It allows for better detection of change, more efficient endpoints and can potentially shorten the time in clinical trials for new therapies. PMID:24465636

Zhu, Haogang; Russell, Richard A.; Saunders, Luke J.; Ceccon, Stefano; Garway-Heath, David F.; Crabb, David P.

2014-01-01

40

We model the average channel capacity of optical wireless communication systems for weak to strong turbulence channels, using the exponentiated Weibull distribution model. The joint effects of beam wander and spread, pointing errors, atmospheric attenuation, and the spectral index of non-Kolmogorov turbulence on system performance are included. Our results show that the average capacity decreases steeply as the propagation length L increases from 0 to 200 m, and decreases slowly or tends to a stable value once L exceeds 200 m. In the weak turbulence region, increasing the detection aperture improves the average channel capacity, and atmospheric visibility is an important factor affecting it. In the strong turbulence region, increasing the radius of the detection aperture cannot reduce the effects of atmospheric turbulence on the average channel capacity, and the effect of atmospheric visibility on the channel information capacity can be ignored. The effect of the spectral power exponent on the average channel capacity is greater in the strong turbulence region than in the weak turbulence region. Irrespective of the details determining the turbulent channel, pointing errors have a significant effect on the average channel capacity of optical wireless communication systems in turbulence channels. PMID:24979434

Cheng, Mingjian; Zhang, Yixin; Gao, Jie; Wang, Fei; Zhao, Fengsheng

2014-06-20

41

Software for Weibull Inference

Exact inference for the Weibull distribution parameters and percentiles, that is, confidence limits and hypothesis tests based on maximum likelihood estimation in complete or Type II censored samples, requires the determination via simulation of percentage points of the distribution of certain pivotal quantities. Tables have been published for a limited range of sample sizes and are scattered through the literature.
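The simulation the abstract refers to can be sketched as follows. For complete samples the ratio of the shape MLE to the true shape is pivotal (its distribution does not depend on the true parameters), so its percentage points can be estimated by repeatedly fitting samples generated with shape and scale equal to 1. The sample size, replication count, and confidence level below are illustrative choices, not values from the paper.

```python
import numpy as np
from scipy.stats import weibull_min

rng = np.random.default_rng(42)

def pivot_percentiles(n, reps=500, probs=(0.05, 0.95)):
    """Estimate percentage points of the pivotal quantity shape_mle/shape
    for complete samples of size n (simulate with true shape = scale = 1)."""
    pivots = []
    for _ in range(reps):
        x = weibull_min.rvs(1.0, scale=1.0, size=n, random_state=rng)
        c_hat, _, _ = weibull_min.fit(x, floc=0)  # MLE with location fixed at 0
        pivots.append(c_hat)                      # equals shape_mle/shape here
    return np.percentile(pivots, [100 * p for p in probs])

# 90% confidence interval for the shape parameter of an observed sample
lo_q, hi_q = pivot_percentiles(n=20)
x_obs = weibull_min.rvs(2.0, scale=10.0, size=20, random_state=rng)
c_obs, _, _ = weibull_min.fit(x_obs, floc=0)
shape_ci = (c_obs / hi_q, c_obs / lo_q)
```

The same recipe, with the appropriate pivotal quantities, yields limits for the scale parameter and percentiles; published tables simply cache these simulated percentage points.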

John I. McCool

2011-01-01

42

Effects of thermal cycling and surface roughness on the Weibull distribution of porcelain strength.

The objective of this study was to test the hypothesis that thermal cycling weakens the flexural strength of porcelain. Specimens of Deguceram Gold and Vita Omega 900 were tested in four groups of 30 specimens each: in the original glazed condition, and after grinding with 1000-grit, 600-grit, and 100-grit silicon carbide abrasives. Corresponding to these four surface treatments, four further groups of 30 specimens each underwent 5,000 thermal cycles. Flexural strength was measured using a four-point flexural test, and the Weibull modulus was calculated. Within each surface treatment, thermal cycling did not decrease flexural strength, although it reduced the Weibull modulus, except for the control and thermal-cycled groups of the 600-grit surface treatment. PMID:19721280
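The Weibull modulus reported in studies like this one is conventionally estimated by linear regression on the linearized Weibull CDF of the ranked strengths. A minimal sketch on synthetic data with assumed parameter values (not data from this study):

```python
import numpy as np

def weibull_modulus(strengths):
    """Estimate the Weibull modulus m and characteristic strength sigma0 by
    least squares on the linearized CDF: ln(-ln(1-F)) = m*ln(s) - m*ln(sigma0)."""
    s = np.sort(np.asarray(strengths, dtype=float))
    n = len(s)
    F = (np.arange(1, n + 1) - 0.5) / n       # median-rank probability estimate
    y = np.log(-np.log(1.0 - F))
    x = np.log(s)
    m, intercept = np.polyfit(x, y, 1)        # slope of the Weibull plot = m
    sigma0 = np.exp(-intercept / m)
    return m, sigma0

# synthetic check: strengths drawn from a Weibull with m = 10, sigma0 = 100 MPa
rng = np.random.default_rng(0)
sample = 100.0 * rng.weibull(10.0, size=200)
m_hat, s0_hat = weibull_modulus(sample)
```

A smaller fitted modulus, as the abstract reports after thermal cycling, means a wider scatter of strengths at the same mean.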

Nakamura, Yoshiharu; Hojo, Satoru; Sato, Hideaki

2009-07-01

43

A delay metric for RC circuits based on the Weibull distribution

Physical design optimizations such as placement, interconnect synthesis, floorplanning, and routing require fast and accurate analysis of RC networks. Because of its simple closed form and fast evaluation, the Elmore delay metric has been widely adopted. The recently proposed delay metrics PRIMO and H-gamma match the first three circuit moments to the probability density function of a Gamma statistical distribution.
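A Weibull-based metric in the same spirit matches low-order circuit moments to a Weibull distribution and reads the delay off the fitted CDF. The sketch below matches the first two moments m1 = E[T] and m2 = E[T^2]; the exact moment set and delay definition used in the paper may differ, and the 50% point is an illustrative choice.

```python
import math
from scipy.optimize import brentq

def weibull_from_moments(m1, m2):
    """Match a Weibull(shape k, scale lam) to the first two moments.
    The ratio m2/m1^2 = Gamma(1+2/k)/Gamma(1+1/k)^2 is decreasing in k,
    so the shape can be found by one-dimensional root finding."""
    ratio = m2 / (m1 * m1)
    f = lambda k: math.gamma(1 + 2 / k) / math.gamma(1 + 1 / k) ** 2 - ratio
    k = brentq(f, 0.1, 50.0)
    lam = m1 / math.gamma(1 + 1 / k)
    return k, lam

def delay_50(m1, m2):
    """Median (50%) delay of the moment-matched Weibull."""
    k, lam = weibull_from_moments(m1, m2)
    return lam * math.log(2.0) ** (1.0 / k)

# exponential sanity check: m1 = 1, m2 = 2 should recover k = 1, lam = 1
k_fit, lam_fit = weibull_from_moments(1.0, 2.0)
t50 = delay_50(1.0, 2.0)   # equals ln 2 in the exponential case
```

Like the Gamma-based metrics, this keeps evaluation cheap: one root find and a couple of gamma-function calls per net.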

Frank Liu; Chandramouli V. Kashyap; Charles J. Alpert

2002-01-01

44

A possible choice of duration distributions to describe first-passage processes of financial markets is discussed. To represent market data which possess a relatively long duration, we use two types of distributions, namely, a distribution derived from the so-called Mittag-Leffler survival function and a Weibull distribution. For the survival function of Mittag-Leffler type, we find that the average

Naoya Sazuka; Jun-ichi Inoue; Enrico Scalas

45

Analysis of interval-censored data with Weibull lifetime distribution

& OR Unit, Indian Statistical Institute, 203 B.T. Road, Kolkata, Pin 700108, India. Department of Mathematics and Statistics, Indian Institute of Technology Kanpur, Pin 208016, India. Interval-censored data arise in diverse fields, such as biology, demography, economics, engineering, epidemiology, and medicine…
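For interval-censored lifetimes, the Weibull likelihood contribution of an observation known only to fall in (l, r] is F(r) − F(l). A minimal maximum-likelihood sketch with synthetic inspection data; the true parameters and the inspection grid are illustrative assumptions, not values from the paper.

```python
import numpy as np
from scipy.optimize import minimize

def weibull_cdf(t, k, lam):
    """CDF of the two-parameter Weibull distribution."""
    return 1.0 - np.exp(-((t / lam) ** k))

def fit_interval_censored(lo, hi):
    """MLE of the Weibull shape k and scale lam when each lifetime is known
    only to lie in the interval (lo_i, hi_i]."""
    lo = np.asarray(lo, float)
    hi = np.asarray(hi, float)

    def nll(theta):
        k, lam = np.exp(theta)          # log-parameterization keeps k, lam > 0
        p = weibull_cdf(hi, k, lam) - weibull_cdf(lo, k, lam)
        return -np.sum(np.log(np.maximum(p, 1e-300)))

    res = minimize(nll, x0=[0.0, np.log(hi.mean())],
                   method="Nelder-Mead", options={"maxiter": 2000})
    return np.exp(res.x)

# synthetic check: true shape 1.5, scale 10, inspections every 2 time units
rng = np.random.default_rng(1)
t_true = 10.0 * rng.weibull(1.5, size=400)
lo_obs = np.floor(t_true / 2.0) * 2.0   # last inspection before failure
hi_obs = lo_obs + 2.0                   # first inspection after failure
k_hat, lam_hat = fit_interval_censored(lo_obs, hi_obs)
```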

Kundu, Debasis

46

Pitfalls in Using Weibull Tailed Distributions Alexandru V. Asimit, Deyuan Li & Liang Peng

distributions, which are more useful in estimating high quantiles and extreme tail probabilities. Estimating extreme tail probabilities has attracted much attention recently, and various estimators exist in the literature; see Beirlant, Bouquiaux and Werker (2006

Sidorov, Nikita

47

Inference for a Simple StepStress Model With Type-II Censoring, and Weibull Distributed Lifetimes

The simple step-stress model under type-II censoring based on Weibull lifetimes, which provides a more flexible model than the exponential model, is considered in this paper. For this model, the maximum likelihood estimates (MLE) of its parameters, as well as the corresponding observed Fisher information matrix, are derived. The likelihood equations do not lead to closed-form expressions for the MLE,

Maria Kateri; Narayanaswamy Balakrishnan

2008-01-01

48

Analysis of DC current accelerated life tests of GaN LEDs using a Weibull-based statistical model

Gallium-nitride-based light-emitting diode (LED) accelerated life tests were carried out over devices adopting two different packaging schemes (i.e., with plastic transparent encapsulation or with pure metallic package). Data analyses were done using a Weibull-based statistical description with the aim of estimating the effect of high current on device performance. A consistent statistical model was found with the capability to estimate

S. Levada; M. Meneghini; G. Meneghesso; E. Zanoni

2005-01-01

49

Finite-size effects on return interval distributions for weakest-link-scaling systems

NASA Astrophysics Data System (ADS)

The Weibull distribution is a commonly used model for the strength of brittle materials and earthquake return intervals. Deviations from Weibull scaling, however, have been observed in earthquake return intervals and the fracture strength of quasibrittle materials. We investigate weakest-link scaling in finite-size systems and deviations of empirical return interval distributions from the Weibull distribution function. Our analysis employs the ansatz that the survival probability function of a system with complex interactions among its units can be expressed as the product of the survival probability functions for an ensemble of representative volume elements (RVEs). We show that if the system comprises a finite number of RVEs, it obeys the κ-Weibull distribution. The upper tail of the κ-Weibull distribution declines as a power law in contrast with Weibull scaling. The hazard rate function of the κ-Weibull distribution decreases linearly after a waiting time τc ∝ n^(1/m), where m is the Weibull modulus and n is the system size in terms of representative volume elements. We conduct statistical analysis of experimental data and simulations which show that the κ-Weibull provides competitive fits to the return interval distributions of seismic data and of avalanches in a fiber bundle model. In conclusion, using theoretical and statistical analysis of real and simulated data, we demonstrate that the κ-Weibull distribution is a useful model for extreme-event return intervals in finite-size systems.
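One common form of the κ-Weibull builds on the Kaniadakis κ-exponential; this specific parameterization is an assumption, not quoted from the paper, but it makes the power-law upper tail easy to verify numerically:

```python
import numpy as np

def exp_kappa(x, kappa):
    """Kaniadakis kappa-exponential; reduces to exp(x) as kappa -> 0."""
    return (np.sqrt(1.0 + kappa ** 2 * x ** 2) + kappa * x) ** (1.0 / kappa)

def kappa_weibull_survival(t, m, lam, kappa):
    """Survival function exp_kappa(-(t/lam)^m).  For kappa -> 0 this recovers
    ordinary Weibull scaling; the upper tail instead decays as a power law
    t^(-m/kappa) rather than a stretched exponential."""
    return exp_kappa(-((t / lam) ** m), kappa)

# tail check: the log-log slope of the survival function approaches -m/kappa
m, lam, kappa = 1.5, 1.0, 0.5
t1, t2 = 1e3, 1e4
s1 = kappa_weibull_survival(t1, m, lam, kappa)
s2 = kappa_weibull_survival(t2, m, lam, kappa)
slope = (np.log(s2) - np.log(s1)) / (np.log(t2) - np.log(t1))
```

The heavy power-law tail is what lets the κ-Weibull accommodate the deviations from Weibull scaling that the abstract describes for finite-size systems.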

Hristopulos, Dionissios T.; Petrakis, Manolis P.; Kaniadakis, Giorgio

2014-05-01

50

Weibull-like Model of Cancer Development in Aging.

Mathematical modeling of cancer development is aimed at assessing the risk factors leading to cancer. Aging is a common risk factor for all adult cancers. The risk of getting cancer in aging is presented by a hazard function that can be estimated from the observed incidence rates collected in cancer registries. Recent analyses of the SEER database show that the cancer hazard function initially increases with age, and then it turns over and falls at the end of the lifetime. Such behavior of the hazard function is poorly modeled by the exponential or compound exponential-linear functions mainly utilized for the modeling. In this work, for mathematical modeling of cancer hazards, we proposed to use the Weibull-like function, derived from the Armitage-Doll multistage concept of carcinogenesis and an assumption that the number of clones developed at age t from mutated cells follows the Poisson distribution. This function is characterized by three parameters, two of which (r and λ) are the conventional parameters of the Weibull probability distribution function, and an additional parameter, C(0), that adjusts the model to the observational data. The biological meanings of these parameters are: r, the number of stages in carcinogenesis; λ, an average number of clones developed from the mutated cells during the first year of carcinogenesis; and C(0), a data adjustment parameter that characterizes the fraction of the age-specific population that will get this cancer in their lifetime. To test the validity of the proposed model, nonlinear regression analysis was performed for the lung cancer (LC) data collected in the SEER 9 database for white men and women during 1975-2004. The obtained results suggest that: (i) modeling can be improved by the use of another parameter, A, the age at the beginning of carcinogenesis; and (ii) in white men and women, the processes of LC carcinogenesis vary by A and C(0), while the corresponding values of r and λ are nearly the same. 
Overall, the proposed Weibull-like model provides an excellent fit to the estimates of the LC hazard function in aging. It is expected that the Weibull-like model can be applicable to fit estimates of hazard functions of other adult cancers as well. PMID:20838610
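The qualitative behavior described here, a hazard that rises with age and then turns over, can be illustrated with a hazard proportional to a Weibull density. This functional form and all numbers below are illustrative assumptions, not the paper's fitted model:

```python
import numpy as np

def weibull_like_hazard(t, r, theta, C0):
    """Illustrative hazard proportional to a Weibull density:
    h(t) = C0 * theta * r * t^(r-1) * exp(-theta * t^r).
    It increases with age and then turns over and falls."""
    return C0 * theta * r * t ** (r - 1) * np.exp(-theta * t ** r)

r, theta, C0 = 5.0, 1e-9, 0.07          # hypothetical parameter values
t = np.linspace(1.0, 110.0, 1000)       # age in years
h = weibull_like_hazard(t, r, theta, C0)

t_peak = t[np.argmax(h)]                            # numerical turnover age
t_star = ((r - 1) / (theta * r)) ** (1.0 / r)       # analytic turnover age
```

Setting the derivative of the hazard to zero gives the turnover age t* = ((r−1)/(θr))^(1/r), which for the values above falls in late life, matching the turnover the SEER analyses report.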

Mdzinarishvili, Tengiz; Sherman, Simon

2010-01-01

51

Modeling root reinforcement using root-failure Weibull survival function

NASA Astrophysics Data System (ADS)

Root networks contribute to slope stability through complicated interactions that include mechanical compression and tension. Due to the spatial heterogeneity of root distribution and the dynamics of root turnover, the quantification of root reinforcement on steep slopes is challenging, and consequently so is the calculation of slope stability. Despite considerable advances in root reinforcement modeling, some important aspects remain neglected. In this study we address in particular the role of root strength variability on the mechanical behavior of a root bundle. Many factors may contribute to the variability of root mechanical properties, even within a single diameter class. This work presents a new approach for quantifying root reinforcement that considers the variability of mechanical properties of each root diameter class. Using the data of laboratory tensile tests and field pullout tests, we calibrate the parameters of the Weibull survival function to implement the variability of root strength in a numerical model for the calculation of root reinforcement (RBMw). The results show that, for both laboratory and field datasets, the parameters of the Weibull distribution may be considered constant, with the exponent equal to 2 and the normalized failure displacement equal to 1. Moreover, the results show that the variability of root strength in each root diameter class has a major influence on the behavior of a root bundle, with important implications when considering different approaches in slope stability calculation. Sensitivity analysis shows that the calibration of the tensile force and the elasticity of the roots are the most important equations, as well as the root distribution. The new model allows the characterization of root reinforcement in terms of maximum pullout force, stiffness, and energy. Moreover, it simplifies the implementation of root reinforcement in slope stability models. 
The realistic quantification of root reinforcement for tensile, shear and compression behavior allows the consideration of the stabilization effects of root networks on steep slopes and the influence that this has on the triggering of shallow landslides.
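The calibration reported in the abstract (Weibull exponent 2, normalized failure displacement 1) suggests a simple RBMw-style sketch: each diameter class contributes a spring force weighted by the Weibull survival fraction of its roots. The class stiffnesses, failure displacements, and root counts below are hypothetical:

```python
import numpy as np

def weibull_survival(x, omega=2.0, lam=1.0):
    """Fraction of roots in a class still intact at normalized displacement x,
    using the calibrated exponent omega = 2 and normalized failure
    displacement lam = 1 reported in the abstract."""
    return np.exp(-((x / lam) ** omega))

def bundle_force(delta, stiffness, mean_fail_disp, counts):
    """Total pullout force of a root bundle at displacement delta (mm).
    Each diameter class is modeled as linear springs whose survival follows
    the Weibull function of displacement normalized by the class mean
    failure displacement.  (Hypothetical sketch, not the full RBMw model.)"""
    total = 0.0
    for k, d_f, n in zip(stiffness, mean_fail_disp, counts):
        total += n * k * delta * weibull_survival(delta / d_f)
    return total

stiffness = [50.0, 120.0]      # N/mm per root, two hypothetical classes
mean_fail_disp = [4.0, 8.0]    # mm, hypothetical
counts = [30, 10]
grid = np.linspace(0.0, 20.0, 2001)
forces = np.array([bundle_force(d, stiffness, mean_fail_disp, counts)
                   for d in grid])
peak_force = forces.max()      # maximum root reinforcement of the bundle
```

Because each class fails progressively rather than all at once, the bundle force rises to a peak and then decays, which is why the model characterizes reinforcement by maximum pullout force, stiffness, and energy.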

Schwarz, M.; Giadrossich, F.; Cohen, D.

2013-03-01

52

Estimating System Reliability of Competing Weibull Failures with Censored Sampling

In this paper, we consider the estimation of R = P(Y < X) where X and Y have two independent Weibull distributions with different scale parameters and the same shape parameter. We used different methods for estimating R. Assuming that the common shape parameter is known, the maximum likelihood, uniformly minimum variance unbiased, and Bayes estimators for R

A. M. Abd-Elfattah; Marwa O. Mohamed

2009-01-01

53

Cruciform beam fracture mechanics specimens have been developed in the Heavy Section Steel Technology (HSST) Program at Oak Ridge National Laboratory (ORNL) to introduce a prototypic, far-field, out-of-plane biaxial bending stress component in the test section that approximates the nonlinear biaxial stresses resulting from pressurized-thermal-shock or pressure-temperature loading of a nuclear reactor pressure vessel (RPV). Matrices of cruciform beam tests were developed to investigate and quantify the effects of temperature, biaxial loading, and specimen size on the fracture initiation toughness of two-dimensional (constant depth), shallow, surface flaws. Tests were conducted under biaxial load ratios ranging from uniaxial to equibiaxial. These tests demonstrated that biaxial loading can have a pronounced effect on shallow-flaw fracture toughness in the lower transition temperature region for RPV materials. Two- and three-parameter Weibull models have been calibrated using a new scheme (developed at the University of Illinois) that maps toughness data from test specimens with distinctly different levels of crack-tip constraint to a small-scale-yielding (SSY) Weibull stress space. These models, using the new hydrostatic stress criterion in place of the more commonly used maximum principal stress in the kernel of the Weibull stress integral definition, have been shown to correlate the experimentally observed biaxial effect in cruciform specimens, thereby providing a scaling mechanism between uniaxial and biaxial loading states.

Bass, B.R.; McAfee, W.J.; Williams, P.T.

1999-08-01

54

Impact of Three-Parameter Weibull Models in Probabilistic Assessment of Earthquake Hazards

NASA Astrophysics Data System (ADS)

This paper investigates the suitability of a three-parameter (scale, shape, and location) Weibull distribution in probabilistic assessment of earthquake hazards. The performance is also compared with two other popular models from the same Weibull family, namely the two-parameter Weibull model and the inverse Weibull model. A complete and homogeneous earthquake catalog (Yadav et al. in Pure Appl Geophys 167:1331-1342, 2010) of 20 events (M ≥ 7.0), spanning the period 1846 to 1995 from north-east India and its surrounding region (20°-32°N and 87°-100°E), is used to perform this study. The model parameters are initially estimated from graphical plots and later confirmed from statistical estimations such as maximum likelihood estimation (MLE) and method of moments (MoM). The asymptotic variance-covariance matrix for the MLE estimated parameters is further calculated on the basis of the Fisher information matrix (FIM). The model suitability is appraised using different statistical goodness-of-fit tests. For the study area, the estimated conditional probability for an earthquake within a decade comes out to be very high (≈0.90) for an elapsed time of 18 years (i.e., 2013). The study also reveals that the use of the location parameter provides more flexibility to the three-parameter Weibull model in comparison to the two-parameter Weibull model. Therefore, it is suggested that the three-parameter Weibull model has high importance in empirical modeling of earthquake recurrence and seismic hazard assessment.
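A sketch of a three-parameter Weibull fit and the conditional (time-dependent) recurrence probability, using synthetic inter-event times rather than the actual catalog; in `scipy.stats.weibull_min` the location is the third parameter and acts as a minimum recurrence time:

```python
import numpy as np
from scipy.stats import weibull_min

rng = np.random.default_rng(7)
# synthetic inter-event times (years), true shape 1.4, location 3, scale 20
data = weibull_min.rvs(1.4, loc=3.0, scale=20.0, size=500, random_state=rng)

# three-parameter MLE: shape c, location, scale
c3, loc3, scale3 = weibull_min.fit(data)

# two-parameter fit for comparison, location pinned at zero
c2, _, scale2 = weibull_min.fit(data, floc=0)

def cond_prob(tau0, tau, c, loc, scale):
    """Conditional probability of an event within tau years, given that
    tau0 years have already elapsed since the last event:
    P = [F(tau0 + tau) - F(tau0)] / [1 - F(tau0)]."""
    F = lambda t: weibull_min.cdf(t, c, loc=loc, scale=scale)
    return (F(tau0 + tau) - F(tau0)) / (1.0 - F(tau0))

p = cond_prob(18.0, 10.0, c3, loc3, scale3)   # 18 years elapsed, next decade
```

The elapsed time of 18 years mirrors the abstract's setup, but the fitted values and `p` here come from the synthetic sample, not the north-east India catalog.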

Pasari, Sumanta; Dikshit, Onkar

2014-07-01

55

CONFIDENCE INTERVAL ESTIMATION ALGORITHMS FOR WEIBULL PARAMETERS AND PERCENTILES

This study concerns the use of Weibull distribution in statistical component reliability. Recently, estimation of confidence intervals and confidence lower bounds for Weibull parameters and percentiles in small samples has received increasing attention in the literature. In expensive or long experiments, it is crucial to keep the sample size to a minimum, however, the estimates become less reliable as the

Mehmet Akif DANACI; Endüstri Mühendisliği Bölümü

2009-01-01

56

Background We establish that the occurrence of protein folds among genomes can be accurately described with a Weibull function. Systems which exhibit Weibull character can be interpreted with reliability theory commonly used in engineering analysis. For instance, Weibull distributions are widely used in reliability, maintainability and safety work to model time-to-failure of mechanical devices, mechanisms, building constructions and equipment. Results We have found that the Weibull function describes protein fold distribution within and among genomes more accurately than conventional power functions which have been used in a number of structural genomic studies reported to date. It has also been found that the Weibull reliability parameter β for protein fold distributions varies between genomes and may reflect differences in rates of gene duplication in evolutionary history of organisms. Conclusions The results of this work demonstrate that reliability analysis can provide useful insights and testable predictions in the fields of comparative and structural genomics. PMID:15274750

Cherkasov, Artem; Ho Sui, Shannan J; Brunham, Robert C; Jones, Steven JM

2004-01-01

57

NASA Astrophysics Data System (ADS)

The distributed analysis experience to date at LHCb has been positive: job success rates are high and wait times for high-priority jobs are low. LHCb users access the grid using the GANGA job-management package, while the LHCb virtual organization manages its resources using the DIRAC package. This clear division of labor has benefitted LHCb and its users greatly; it is a major reason why distributed analysis at LHCb has been so successful. The newly formed LHCb distributed analysis support team has also proved to be a success.

Williams, Mike; Egede, Ulrik; Paterson, Stuart; LHCb Collaboration

2011-12-01

58

NASA Astrophysics Data System (ADS)

Recently a new proposal to model the fading channel in free-space optical links, namely, the exponentiated Weibull (EW) distribution, has been made. It has been suggested that the EW distribution can model the probability density function (PDF) of the irradiance under weak-to-strong conditions in the presence of aperture averaging. Here, we carry out an analysis of probability of fade and bit error-rate (BER) performance using simulation results and experimental data. The BER analysis assumes intensity modulation/direct detection with on-off keying, and new expressions are derived. Data is modeled following the statistics of the EW fading channel model, and compared with the Gamma-Gamma and Lognormal distributions, as the most widely accepted models nowadays. It is shown how the proposed EW model is valid in all the tested conditions, and even outperforms the GG and LN distributions, that are only valid under certain scenarios.
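The EW distribution's closed-form CDF makes probability-of-fade calculations direct; `scipy.stats.exponweib` implements this family. The channel parameters below are illustrative, not fitted values from the experiment:

```python
import numpy as np
from scipy.stats import exponweib

# hypothetical EW channel parameters: alpha, beta (shapes) and eta (scale)
alpha, beta, eta = 5.0, 1.5, 0.8

# probability of fade: probability the normalized irradiance drops below a
# threshold I_th; for the EW model this is simply the CDF,
# F(I) = (1 - exp(-(I/eta)**beta))**alpha
I_th = 0.2
p_fade = exponweib.cdf(I_th, alpha, beta, scale=eta)

# closed form, to cross-check the scipy parameterization
p_closed = (1.0 - np.exp(-((I_th / eta) ** beta))) ** alpha
```

With alpha = 1 the EW reduces to an ordinary Weibull, which is one reason it can span the weak-to-strong turbulence regimes the abstract compares against the Gamma-Gamma and Lognormal models.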

Barrios, Ricardo; Dios, Federico

2012-10-01

59

Modeling root reinforcement using a root-failure Weibull survival function

NASA Astrophysics Data System (ADS)

Root networks contribute to slope stability through complex interactions with soil that include mechanical compression and tension. Due to the spatial heterogeneity of root distribution and the dynamics of root turnover, the quantification of root reinforcement on steep slopes is challenging and consequently the calculation of slope stability also. Although considerable progress has been made, some important aspects of root mechanics remain neglected. In this study we address specifically the role of root-strength variability on the mechanical behavior of a root bundle. Many factors contribute to the variability of root mechanical properties even within a single class of diameter. This work presents a new approach for quantifying root reinforcement that considers the variability of mechanical properties of each root diameter class. Using the data of laboratory tensile tests and field pullout tests, we calibrate the parameters of the Weibull survival function to implement the variability of root strength in a numerical model for the calculation of root reinforcement (RBMw). The results show that, for both laboratory and field data sets, the parameters of the Weibull distribution may be considered constant with the exponent equal to 2 and the normalized failure displacement equal to 1. Moreover, the results show that the variability of root strength in each root diameter class has a major influence on the behavior of a root bundle with important implications when considering different approaches in slope stability calculation. Sensitivity analysis shows that the calibration of the equations of the tensile force, the elasticity of the roots, and the root distribution are the most important steps. The new model allows the characterization of root reinforcement in terms of maximum pullout force, stiffness, and energy. Moreover, it simplifies the implementation of root reinforcement in slope stability models. 
The realistic quantification of root reinforcement for tensile, shear and compression behavior allows for the consideration of the stabilization effects of root networks on steep slopes and the influence that this has on the triggering of shallow landslides.

Schwarz, M.; Giadrossich, F.; Cohen, D.

2013-11-01

60

Markovian & Weibull Approach to Concrete Pavement Monitoring

[Slide and figure residue: skid resistance and pressure-sensor plots for MnROAD Cell 85 pervious and impervious concrete pavements (393-789 kPa); presentation by Bernard Igbafen Izevbekhai, P.E., Ph.D., 4-Jun-09.]

Minnesota, University of

61

NASA Astrophysics Data System (ADS)

Analogous to crustal earthquakes in natural fault systems, we here consider the dynasty collapses as extreme events in human society. Duration data of ancient Chinese and Egyptian dynasties provides a good chance of exploring the collective behavior of the so-called social atoms. By means of the rank-ordering statistics, we demonstrate that the duration data of those ancient dynasties could be described with good accuracy by the Weibull distribution. It is thus amazing that the distribution of time to failure of human society, i.e. the disorder of a historical dynasty, follows the widely accepted Weibull process as natural material fails.

Chen, Chien-Chih; Tseng, Chih-Yuan; Telesca, Luciano; Chi, Sung-Ching; Sun, Li-Chung

2012-02-01

62

Aerospace Applications of Weibull and Monte Carlo Simulation with Importance Sampling

NASA Technical Reports Server (NTRS)

Recent developments in reliability modeling and computer technology have made it practical to use the Weibull time to failure distribution to model the system reliability of complex fault-tolerant computer-based systems. These system models are becoming increasingly popular in space systems applications as a result of mounting data that support the decreasing Weibull failure distribution and the expectation of increased system reliability. This presentation introduces the new reliability modeling developments and demonstrates their application to a novel space system application. The application is a proposed guidance, navigation, and control (GN&C) system for use in a long duration manned spacecraft for a possible Mars mission. Comparisons to the constant failure rate model are presented and the ramifications of doing so are discussed.
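The contrast the abstract draws between a Weibull time-to-failure model with decreasing failure rate (shape < 1) and the constant-failure-rate model can be seen in a small sketch; the parameter values are illustrative:

```python
import math

def weibull_reliability(t, beta, eta):
    """R(t) = exp(-(t/eta)**beta); beta < 1 gives a decreasing failure rate."""
    return math.exp(-((t / eta) ** beta))

def exponential_reliability(t, mttf):
    """Constant-failure-rate model with the same mean time to failure."""
    return math.exp(-t / mttf)

# hypothetical decreasing-rate component; match the exponential model's MTTF
# to the Weibull mean eta * Gamma(1 + 1/beta)
beta, eta = 0.6, 10000.0
mttf = eta * math.gamma(1.0 + 1.0 / beta)

# early in life the decreasing-rate Weibull predicts LOWER reliability than
# the constant-rate model; late in life it predicts HIGHER reliability
r_early = (weibull_reliability(100.0, beta, eta),
           exponential_reliability(100.0, mttf))
r_late = (weibull_reliability(50000.0, beta, eta),
          exponential_reliability(50000.0, mttf))
```

This crossover is the kind of ramification the presentation discusses when comparing Weibull system models against the constant failure rate assumption.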

Bavuso, Salvatore J.

1998-01-01

63

A Weibull brittle material failure model for the ABAQUS computer program

A statistical failure theory for brittle materials that traces its origins to the Weibull distribution function is developed for use in the general purpose ABAQUS finite element computer program. One of the fundamental assumptions for this development is that Mode 1 microfractures perpendicular to the direction of the principal stress contribute independently to the fast fracture. The theory is implemented by a user subroutine for ABAQUS. Example problems illustrating the capability and accuracy of the model are given. 24 refs., 12 figs.
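The weakest-link statistics underlying such Weibull failure models imply the classic size effect, which a brief sketch makes concrete. The modulus and reference strength below are hypothetical, and a real component requires integrating the stress field element by element, as the ABAQUS user subroutine does; uniform tension is assumed here for simplicity:

```python
import math

def failure_probability(sigma, volume, m, sigma0, V0=1.0):
    """Two-parameter Weibull weakest-link model for a brittle component under
    uniform tension: Pf = 1 - exp(-(V/V0) * (sigma/sigma0)**m)."""
    return 1.0 - math.exp(-(volume / V0) * (sigma / sigma0) ** m)

def strength_at(prob, volume, m, sigma0, V0=1.0):
    """Stress giving failure probability prob; exhibits the Weibull size
    effect, with strength scaling as V**(-1/m)."""
    return sigma0 * (-math.log(1.0 - prob) * V0 / volume) ** (1.0 / m)

m, sigma0 = 10.0, 400.0   # hypothetical ceramic: modulus 10, 400 MPa reference
s_small = strength_at(0.5, 1.0, m, sigma0)
s_large = strength_at(0.5, 8.0, m, sigma0)  # 8x the stressed volume
ratio = s_small / s_large                   # equals 8**(1/m)
```

The larger component is weaker at equal failure probability because it samples more flaws, which is exactly the behavior the Mode 1 microfracture assumption encodes.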

Bennett, J.

1991-08-01

64

Incorporating finite element analysis into component life and reliability

NASA Technical Reports Server (NTRS)

A method for calculating a component's design survivability by incorporating finite element analysis and probabilistic material properties was developed. The method evaluates design parameters through direct comparisons of component survivability expressed in terms of Weibull parameters. The analysis was applied to a rotating disk with mounting bolt holes. The highest probability of failure occurred at, or near, the maximum shear stress region of the bolt holes. Distribution of failure as a function of Weibull slope affects the probability of survival. Where Weibull parameters are unknown for a rotating disk, it may be permissible to assume Weibull parameters, as well as the stress-life exponent, in order to determine the disk speed where the probability of survival is highest.

August, Richard; Zaretsky, Erwin V.

1991-01-01

65

Time-dependent fiber bundles with local load sharing. II. General Weibull fibers

NASA Astrophysics Data System (ADS)

Fiber bundle models (FBMs) are useful tools in understanding failure processes in a variety of material systems. While the fibers and load sharing assumptions are easily described, FBM analysis is typically difficult. Monte Carlo methods are also hampered by the severe computational demands of large bundle sizes, which overwhelm just as behavior relevant to real materials starts to emerge. For large size scales, interest continues in idealized FBMs that assume either equal load sharing (ELS) or local load sharing (LLS) among fibers, rules that reflect features of real load redistribution in elastic lattices. The present work focuses on a one-dimensional bundle of N fibers under LLS where life consumption in a fiber follows a power law in its load, with exponent ρ, and integrated over time. This life consumption function is further embodied in a functional form resulting in a Weibull distribution for lifetime under constant fiber stress and with Weibull exponent, β. Thus the failure rate of a fiber depends on its past load history, except for β=1. We develop asymptotic results validated by Monte Carlo simulation using a computational algorithm developed in our previous work [Phys. Rev. E 63, 021507 (2001)] that greatly increases the size, N, of treatable bundles (e.g., 10^6 fibers in 10^3 realizations). In particular, our algorithm is O(N ln N), in contrast with former algorithms, which were O(N^2), making this investigation possible. Regimes are found for (ρ, β) pairs that yield contrasting behavior for large N. For β>1 and large N, brittle weakest-volume behavior emerges in terms of characteristic elements (groupings of fibers) derived from critical cluster formation, and the lifetime eventually goes to zero as N → ∞, unlike ELS, which yields a finite limiting mean. 
For 1/2 ≤ β ≤ 1, however, LLS has remarkably similar behavior to ELS (appearing to be virtually identical for β=1) with an asymptotic Gaussian lifetime distribution and a finite limiting mean for large N. The coefficient of variation follows a power law in increasing N but, except for β=1, the value of the negative exponent is clearly less than 1/2, unlike in ELS bundles where the exponent remains 1/2 for 1/2 ≤ β ≤ 1 … distribution for the longest lived of a parallel group of independent elements, which applies exactly to ρ=0. The lower the value of ρ, the higher the transition value of β below which such extreme-value behavior occurs. No evidence was found for limiting Gaussian behavior for β>1 but with 0 < … Weibull exponent for fiber strength.

Phoenix, S. Leigh; Newman, William I.

2009-12-01

66

Comparison of Weibull strength parameters from flexure and spin tests of brittle materials

NASA Technical Reports Server (NTRS)

Fracture data from five series of four point bend tests of beam and spin tests of flat annular disks were reanalyzed. Silicon nitride and graphite were the test materials. The experimental fracture strengths of the disks were compared with the predicted strengths based on both volume flaw and surface flaw analyses of four point bend data. Volume flaw analysis resulted in a better correlation between disks and beams in three of the five test series than did surface flaw analysis. The Weibull moduli and characteristic gage strengths for the disks and beams were also compared. Differences in the experimental Weibull slopes were not statistically significant. It was shown that results from the beam tests can predict the fracture strength of rotating disks.

Holland, Frederic A., Jr.; Zaretsky, Erwin V.

1991-01-01

67

Effect of Individual Component Life Distribution on Engine Life Prediction

NASA Technical Reports Server (NTRS)

The effect of individual engine component life distributions on engine life prediction was determined. A Weibull-based life and reliability analysis of the NASA Energy Efficient Engine was conducted. The engine's life at a 95 and 99.9 percent probability of survival was determined based upon the engine manufacturer's original life calculations and assumed values of each of the components' cumulative life distributions as represented by a Weibull slope. The lives of the high-pressure turbine (HPT) disks and blades were also evaluated individually and as a system in a similar manner. Knowing the statistical cumulative distribution of each engine component with reasonable engineering certainty is a condition precedent to predicting the life and reliability of an entire engine. The life of a system at a given reliability will be less than the life of the lowest-lived component in the system at the same reliability (probability of survival). Where Weibull slopes of all the engine components are equal, the Weibull slope had a minimal effect on engine L(sub 0.1) life prediction. However, at a probability of survival of 95 percent (L(sub 5) life), life decreased with increasing Weibull slope.
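The system-versus-component relation stated here follows from multiplying component survival probabilities in a series system. A sketch with hypothetical Weibull slopes and characteristic lives (not the engine's actual values):

```python
import math

def weibull_life(R, beta, eta):
    """Life at probability of survival R for a Weibull component:
    t = eta * (-ln R)**(1/beta)."""
    return eta * (-math.log(R)) ** (1.0 / beta)

def system_life(R, components):
    """Life at which a series system of independent Weibull components reaches
    survival probability R, found by bisection on
    R_sys(t) = prod_i exp(-(t/eta_i)**beta_i)."""
    def r_sys(t):
        return math.exp(-sum((t / eta) ** beta for beta, eta in components))
    lo, hi = 0.0, 10.0 * max(eta for _, eta in components)
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if r_sys(mid) > R:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# hypothetical (Weibull slope, characteristic life in hours) per component
components = [(2.0, 9000.0), (1.5, 12000.0), (3.0, 15000.0)]
L5_components = [weibull_life(0.95, b, e) for b, e in components]
L5_system = system_life(0.95, components)
```

Because every component must survive, `L5_system` comes out below even the lowest-lived component's L(sub 5) life, which is the qualitative point the abstract makes.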

Zaretsky, Erwin V.; Hendricks, Robert C.; Soditus, Sherry M.

2003-01-01

68

Mixture distributions of wind speed in the UAE

NASA Astrophysics Data System (ADS)

Wind speed probability distributions are commonly used to estimate potential wind energy. The 2-parameter Weibull distribution has been most widely used to characterize the distribution of wind speed. However, it cannot properly model wind speed regimes whose distributions present bimodal and kurtotic shapes. Several studies have concluded that the Weibull distribution should not be used for frequency analysis of wind speed without first investigating the wind speed distribution. Given these mixture distributional characteristics of wind speed data, the application of mixture distributions should be further investigated in the frequency analysis of wind speed. A number of studies have investigated the potential wind energy in different parts of the Arabian Peninsula, and mixture distributional characteristics of wind speed were detected in some of them. Nevertheless, mixture distributions have not been employed for wind speed modeling in the Arabian Peninsula. In order to improve our understanding of wind energy potential in the Arabian Peninsula, mixture distributions should be tested for the frequency analysis of wind speed. The aim of the current study is to assess the suitability of mixture distributions for the frequency analysis of wind speed in the UAE. Hourly mean wind speed data at 10-m height from 7 stations were used in the current study. The Weibull and Kappa distributions were employed as representatives of the conventional non-mixture distributions. Ten mixture distributions were constructed by mixing four probability distributions: the Normal, Gamma, Weibull, and Extreme Value type-one (EV-1) distributions. Three parameter estimation methods, namely the Expectation-Maximization algorithm, the Least Squares method, and the Meta-Heuristic Maximum Likelihood (MHML) method, were employed to estimate the parameters of the mixture distributions.
In order to compare the goodness-of-fit of the tested distributions and parameter estimation methods for the sample wind data, the adjusted coefficient of determination, the Bayesian Information Criterion (BIC), and Chi-squared statistics were computed. Results indicate that MHML gives the best parameter estimation performance for the mixture distributions used. At most of the 7 stations, mixture distributions give the best fit, and most of the wind speed regimes showing mixture distributional characteristics are kurtotic. In particular, applications of mixture distributions at these stations show a significant improvement in explaining the whole wind speed regime. In addition, the Weibull-Weibull mixture distribution presents the best fit for the wind speed data in the UAE.
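
A two-component Weibull-Weibull mixture density of the kind fitted above can be sketched directly. The weights and parameters below are invented for illustration; the study's EM/MHML fitting procedures are not reproduced here:

```python
import math

def weibull_pdf(x, k, lam):
    # Two-parameter Weibull density with shape k and scale lam
    return (k / lam) * (x / lam) ** (k - 1) * math.exp(-(x / lam) ** k)

def mixture_pdf(x, w, k1, l1, k2, l2):
    # Convex combination of two Weibull densities: a simple bimodal
    # wind-speed model (parameters are assumed, not fitted)
    return w * weibull_pdf(x, k1, l1) + (1.0 - w) * weibull_pdf(x, k2, l2)

# Crude Riemann-sum check that the mixture integrates to ~1 over (0, 50] m/s
dx = 0.01
total = sum(mixture_pdf(i * dx, 0.6, 2.0, 3.0, 4.0, 9.0) * dx
            for i in range(1, 5001))
```

With distinct component scales (here 3 and 9 m/s) the mixture can reproduce the bimodal and kurtotic shapes that a single Weibull cannot.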

Shin, J.; Ouarda, T.; Lee, T. S.

2013-12-01

69

This paper aims to quantify the influence that probability distribution selected to fit wind speed data has on the estimation of the annual mean energy production of wind turbines. To perform this task, a comparative analysis between the well-known two parameter wind speed Weibull distribution and alternative mixture of finite distribution models (less simple but providing better fits in many

Romero-Ternero Vicente

2008-01-01

70

A novel approach for statistical analysis of comet assay data (i.e., tail moment) is proposed, employing public-domain statistical software, the R system. The analytical strategy takes into account that the distribution of comet assay data, like the tail moment, is usually skewed and does not follow a normal distribution. Probability distributions used to model comet assay data included: the Weibull,

Pablo E. Verde; Laura A. Geracitano; Lílian L. Amado; Carlos E. Rosa; Adalto Bianchini; José M. Monserrat

2006-01-01

71

Distributed data analysis in ATLAS

NASA Astrophysics Data System (ADS)

Data analysis using grid resources is one of the fundamental challenges to be addressed before the start of LHC data taking. The ATLAS detector will produce petabytes of data per year, and roughly one thousand users will need to run physics analyses on this data. Appropriate user interfaces and helper applications have been made available to ensure that the grid resources can be used without requiring expertise in grid technology. These tools enlarge the number of grid users from a few production administrators to potentially all participating physicists. ATLAS makes use of three grid infrastructures for the distributed analysis: the EGEE sites, the Open Science Grid, and NorduGrid. These grids are managed by the gLite workload management system, the PanDA workload management system, and ARC middleware; many sites can be accessed via both the gLite WMS and PanDA. Users can choose between two front-end tools to access the distributed resources. Ganga is a tool co-developed with LHCb to provide a common interface to the multitude of execution backends (local, batch, and grid). The PanDA workload management system provides a set of utilities called PanDA Client; with these tools users can easily submit Athena analysis jobs to the PanDA-managed resources. Distributed data is managed by Don Quijote 2, a system developed by ATLAS; DQ2 is used to replicate datasets according to the data distribution policies and maintains a central catalog of file locations. The operation of the grid resources is continually monitored by the Ganga Robot functional testing system, and infrequent site stress tests are performed using the HammerCloud system. In addition, the DAST shift team is a group of power users who take shifts to provide distributed analysis user support; this team has effectively relieved the burden of support from the developers.

Nilsson, Paul; Atlas Collaboration

2012-12-01

72

Application of Weibull Criterion to failure prediction in composites

Fiber-reinforced composite materials are being widely used in engineered structures. This report examines how the general form of the Weibull Criterion, including the evaluation of the parameters and the scaling of the parameter values, can be used for the prediction of component failure.

Cain, W. D.; Knight, Jr., C. E.

1981-04-20

73

A general Bayes Weibull inference model for accelerated life testing

A general Bayes inference model is developed for accelerated life testing. The failure times at a constant stress level are assumed to follow a Weibull distribution. Because it is often too time consuming and too costly to test these items in their use (or nominal) environment, it has become a standard procedure [1] to test these items under more severe environments than

van Dorp, Johan René

74

NASA Astrophysics Data System (ADS)

In this work, a model is proposed for heterogeneous nucleation on substrates whose size distribution can be described by Weibull statistics. It is found that the nuclei density, N_nuc, can be given in terms of the maximum undercooling, ΔT_m, by N_nuc = N_s exp(−b/ΔT_m), where N_s is the density of nucleation sites in the melt and b is the nucleation coefficient (b > 0). When nucleation occurs on all possible substrates, the graphite nodule density, N_V,n, or eutectic cell density, N_V, after solidification equals N_s. In this work, measurements of N_V,n and N_V values were carried out on experimental nodular and flake graphite iron castings processed under various inoculation conditions. The volumetric nodule count, N_V,n, or graphite eutectic cell count, N_V, was estimated from the area nodule count, N_A,n, or eutectic cell count, N_A, on polished cast iron surface sections by stereological means. In addition, maximum undercoolings, ΔT_m, were measured using thermal analysis. The experimental outcome indicates that the N_V,n or N_V count can be properly described by the proposed expression N_V,n = N_V = N_s exp(−b/ΔT_m). Moreover, the N_s and b values were experimentally determined. In particular, the proposed model suggests that the size distribution of nucleation sites is exponential in nature.
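
A hedged sketch of the nucleation-site model above. Since b > 0 and the nuclei density must approach N_s at deep undercooling, the exponent is taken here as −b/ΔT_m; the numeric values are invented:

```python
import math

def nuclei_density(delta_t_max, n_s, b):
    # N = N_s * exp(-b / dT_max): deeper undercooling activates more
    # of the available substrates, saturating at N_s
    return n_s * math.exp(-b / delta_t_max)

n_s = 1.0e6   # density of nucleation sites in the melt (assumed units)
b = 10.0      # nucleation coefficient in kelvin (assumed)
shallow = nuclei_density(2.0, n_s, b)    # small maximum undercooling
deep = nuclei_density(50.0, n_s, b)      # large maximum undercooling
```

The monotone rise toward N_s with increasing ΔT_m is what makes the nodule or cell count a usable probe of the nucleation-site population.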

Fraś, E.; Wiencek, K.; Górny, M.; López, H. F.

2007-02-01

75

A Monte Carlo simulation is used to obtain the statistical properties of the Weibull parameters estimated by the linear regression, weighted linear regression and maximum likelihood schemes, respectively. Results reveal that the natural logarithm of the Weibull size parameter can be determined with about the same precision as the Weibull modulus. For Weibull modulus estimation, a maximum likelihood method results

Dongfang Wu; Yongdan Li; Jianpo Zhang; Liu Chang; Dihua Wu; Zhiping Fang; Yahua Shi

2001-01-01

76

The average bit error rate (BER) for binary phase-shift keying (BPSK) modulation in free-space optical (FSO) links over a turbulent atmosphere modeled by the exponentiated Weibull (EW) distribution is investigated in detail. The effects of aperture averaging on the average BER for BPSK modulation under weak-to-strong turbulence conditions are studied. The average BERs of the EW distribution are compared with those of the Lognormal (LN) and Gamma-Gamma (GG) distributions in weak and strong turbulence, respectively. The outage probability is also obtained for different turbulence strengths and receiver aperture sizes. The analytical results deduced by the generalized Gauss-Laguerre quadrature rule are verified by Monte Carlo simulation. This work is helpful for the design of receivers for FSO communication systems. PMID:25321286
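
The exponentiated Weibull model admits simple inverse-CDF sampling, which gives a quick Monte Carlo cross-check of the kind used to verify the quadrature results. The conditional-BER form and every parameter value below are assumptions for illustration, not the paper's settings:

```python
import math
import random

random.seed(3)

def ew_sample(alpha, beta, eta):
    # Inverse CDF of the exponentiated Weibull distribution,
    # F(I) = [1 - exp(-(I/eta)**beta)]**alpha
    u = random.random()
    return eta * (-math.log(1.0 - u ** (1.0 / alpha))) ** (1.0 / beta)

def q_func(x):
    # Gaussian tail probability Q(x)
    return 0.5 * math.erfc(x / math.sqrt(2.0))

snr = 10.0     # average electrical SNR (assumed)
n = 100_000
# Average BER for BPSK, assuming conditional BER Q(sqrt(2*snr)*I)
# over the fading irradiance I (a common but here hypothetical form)
ber = sum(q_func(math.sqrt(2.0 * snr) * ew_sample(2.0, 1.5, 1.0))
          for _ in range(n)) / n
```

Averaging the conditional BER over many fading draws approximates the same integral the Gauss-Laguerre quadrature evaluates analytically.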

Wang, Ping; Zhang, Lu; Guo, Lixin; Huang, Feng; Shang, Tao; Wang, Ranran; Yang, Yintang

2014-08-25

77

Weibull Effective Area for Hertzian Ring Crack Initiation Stress

Spherical or Hertzian indentation is used to characterize and guide the development of engineered ceramics under consideration for diverse applications involving contact, wear, rolling fatigue, and impact. Ring crack initiation can be one important damage mechanism of Hertzian indentation. It is caused by sufficiently high, surface-located radial tensile stresses in an annular ring located adjacent to and outside of the Hertzian contact circle. While the maximum radial tensile stress is known to be dependent on the elastic properties of the sphere and target, the diameter of the sphere, the applied compressive force, and the coefficient of friction, the Weibull effective area too will be affected by those parameters. However, estimates of the maximum radial tensile stress and Weibull effective area are difficult to obtain because the coefficient of friction during Hertzian indentation is complex, likely intractable, and not known a priori. Circumventing this, the Weibull effective area expressions are derived here for the two extremes that bracket all coefficients of friction; namely, (1) the classical, frictionless, Hertzian case where only complete slip occurs, and (2) the case where no slip occurs or where the coefficient of friction is infinite.

Jadaan, Osama M. [University of Wisconsin, Platteville; Wereszczak, Andrew A [ORNL; Johanns, Kurt E [ORNL

2011-01-01

78

Characteristic strength, Weibull modulus, and failure probability of fused silica glass

NASA Astrophysics Data System (ADS)

The development of high-energy lasers has focused attention on the requirement to assess the mechanical strength of optical components made of fused silica or fused quartz (SiO2). The strength of this material is known to be highly dependent on the stressed area and the surface finish, but has not yet been properly characterized in the published literature. Recently, Detrio and collaborators at the University of Dayton Research Institute (UDRI) performed extensive ring-on-ring flexural strength measurements on fused SiO2 specimens ranging in size from 1 to 9 in. in diameter and of widely differing surface qualities. We report on a Weibull statistical analysis of the UDRI data, an analysis based on the procedure outlined in Proc. SPIE 4375, 241 (2001). We demonstrate that (1) a two-parameter Weibull model, including the area-scaling principle, applies; (2) the shape parameter (m ≈ 10) is essentially independent of the stressed area as well as the surface finish; and (3) the characteristic strength (1-cm2 uniformly stressed area) obeys a linear law, σ_C (in megapascals) ≈ 160 − 2.83 × PBS (in parts per million per steradian), where PBS characterizes the surface/subsurface "damage" of an appropriate set of test specimens. In this light, we evaluate the cumulative failure probability and the failure probability density of polished and superpolished fused SiO2 windows as a function of the biaxial tensile stress, for stressed areas ranging from 0.3 to 100 cm2.
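
The area-scaled two-parameter Weibull model with the characteristic-strength law quoted above can be sketched directly (A0 = 1 cm², m = 10 as reported; the function names are ours):

```python
import math

def char_strength_mpa(pbs):
    # Empirical law from the abstract: sigma_C ~= 160 - 2.83 * PBS (ppm/sr)
    return 160.0 - 2.83 * pbs

def failure_probability(stress_mpa, area_cm2, pbs, m=10.0):
    # Two-parameter Weibull with area scaling relative to A0 = 1 cm^2
    sigma_c = char_strength_mpa(pbs)
    return 1.0 - math.exp(-area_cm2 * (stress_mpa / sigma_c) ** m)
```

At the characteristic strength and unit area the failure probability is 1 − e⁻¹ ≈ 63.2%, and it increases with the stressed area, which is the area-scaling principle the paper validates.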

Klein, Claude A.

2009-11-01

79

Tensile properties of short fiber composites with fiber strength distribution

The influence of fiber rupture, fiber pull-out and fiber tensile strength distribution on the post-cracking behavior of short-randomly-distributed fiber reinforced brittle-matrix composites has been analyzed using an approach based on the Weibull weakest-link statistics. The analysis led to the development of a predictive model for the composite bridging stress versus crack opening displacement (σ_c−δ) law, a fundamental material property necessary for

M. Maalej

2001-01-01

80

Incorporating product retirement in field performance reliability analysis

Statistical time-to-failure analysis is a very powerful and versatile analytical tool available to reliability engineers and statisticians for understanding and communicating the failure risk and reliability of a component, device, or system. The typical approach to characterizing time to failure involves fitting a parametric distribution, such as a Weibull probability function, using time series data on sales and records of

Ke Zhao; D. Steffey; J. Loud

2010-01-01

81

An empirical analysis of waiting times for price changes and orders in a financial market

NASA Astrophysics Data System (ADS)

We discuss an empirical analysis of the waiting time distribution for price changes and orders in a financial market and its Weibull approximation. It is widely assumed that trades in financial markets occur independently and that the waiting time distribution is exponential. However, recent empirical results [Raberto et al. 2002, Scalas et al. 2005, etc.] on high frequency financial data show that the distribution is non-exponential. Therefore, in order to understand market behavior quantitatively and systematically, it is important to check the validity of the exponential distribution hypothesis and to determine which non-exponential distribution is appropriate. In this talk, we analyze the waiting times of the Sony bank USD/JPY rate and orders. We show that the waiting time distribution for not only price changes but also orders is non-exponential, using non-double-auction market data. We also measure exactly how much better the Weibull distribution is as an approximation by using Weibull probability paper and divergence measurements. Moreover, the estimated value of the shape parameter of the Weibull distribution is similar for both the price change and order waiting time distributions.

Sazuka, Naoya

2006-03-01

82

New distributional modelling approaches for gap analysis

Synthetic products based on biodiversity information such as gap analysis depend critically on accurate models of species' geographic distributions that simultaneously minimize error in both overprediction and omission. ...

Peterson, A. Townsend; Kluza, D. A.

2003-02-01

83

This paper analyses the use of a general probability distribution obtained through application of the maximum entropy principle (MEP), constrained by the low-order statistical moments of a given set of wind speed data, in the estimation of wind energy. For this purpose, a comparison is made between the two parameter Weibull distribution and the distributions obtained through the MEP. This

Penélope Ramírez; José Antonio Carta

2006-01-01

84

Distributed Clustering Using Collective Principal Component Analysis

This paper considers distributed clustering of high dimensional heterogeneous data using a distributed Principal Component Analysis (PCA) technique called the Collective PCA. It presents the Collective PCA technique which can be used independent of the clustering application. It shows a way to integrate the Collective PCA with a given off-the-shelf clustering algorithm in order to develop a distributed clustering technique.

Hillol Kargupta; Weiyun Huang; Krishnamoorthy Sivakumar; Erik L. Johnson

2001-01-01

85

Survival curves of heated bacterial spores: Effect of environmental factors on Weibull parameters

… heat resistance for non-log-linear survival curves. One simple model derived from the Weibull … calculation. However, in many cases the survival curves of heated bacteria do not present a log-linear …
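
The Weibull-type survival model alluded to in the fragment is commonly written log10(N/N0) = −(t/δ)^p, where δ is the time to the first decimal reduction and p controls the curvature; p = 1 recovers the classical log-linear curve. A quick sketch (parameter values invented):

```python
def log10_survival_ratio(t, delta, p):
    # Weibull-type model for non-log-linear survival curves:
    # log10(N/N0) = -(t/delta)**p
    return -(t / delta) ** p

def time_for_reductions(decades, delta, p):
    # Invert the model: time needed for a given number of decimal reductions
    return delta * decades ** (1.0 / p)
```

For p < 1 (upward concavity) each successive decimal reduction takes disproportionately longer, which is the shoulder/tailing behavior a log-linear model cannot capture.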

Brest, Université de

86

Fatigue and Reliability Analysis of Unidirectional GFRP Composites under Rotating Bending Loads

Rotating bending fatigue tests have been conducted on unidirectional glass fiber reinforced polyester (GFRP) composites. Standard test specimens were manufactured in the form of circular rods with various fiber volume fraction (Vf) ratios. Failure modes of the composite rods have been examined using scanning electron microscopy. The two-parameter Weibull distribution function was used for the statistical analysis of the experimental

U. A. Khashaba

2003-01-01

87

A family of weakest link models for fiber strength distribution

It is well known that the most widely used distribution function for fiber tensile strength, the two-parameter Weibull distribution, does not always adequately describe the experimentally observed fiber strength scatter and the strength dependence on fiber length. To remedy this discrepancy, modifications of the Weibull distribution have been proposed that, while providing a good empirical fit to the strength data,

Yu. Paramonov; J. Andersons

2007-01-01

88

Influence analysis in truncated distributions

Conditional bias and asymptotic mean sensitivity curve (AMSC) are useful measures to assess the possible effect of an observation on an estimator when sampling from a parametric model. In this paper we obtain expressions for these measures in truncated distributions and study their theoretical properties. Specific results are given for the UMVUE of a parametric function. We note that the

I. Barranco-Chamorro; J. L. Moreno-Rebollo; J. M. Muñoz-Pichardo

2011-01-01

89

NSDL National Science Digital Library

This online, interactive lesson on special distributions provides examples, exercises, and applets covering the normal, gamma, chi-square, Student t, F, bivariate normal, multivariate normal, beta, Weibull, zeta, Pareto, logistic, lognormal, and extreme value distributions. Overall, this lesson covers a plethora of topics and, for this reason, is a valuable resource.

Siegrist, Kyle

2008-12-24

90

Application of Weibull statistics to tensile strength prediction in laminated composites with open holes is revisited. Quasi-isotropic carbon fiber laminates with two stacking sequences, [45/0/−45/90]s and [0/45/90/−45]s, with three different hole sizes of 2.54, 6.35 and 12.7 mm were considered for analysis and experimental examination. The first laminate showed 20% lower strength for the smaller and 10% for the larger hole sizes.

E. V. Iarve; D. Mollenhauer; T. J. Whitney; R. Kim

2006-01-01

91

Distributed Checkpointing: Analysis and Benchmarks

This work proposes a metric for the analysis and benchmarking of checkpointing algorithms through simulation; the results obtained show that the metric is a good checkpoint overhead indicator. The metric is implemented by ChkSim, a simulator that has been used to compare 18 quasi-synchronous checkpointing algorithms. A survey of previous analyses of checkpointing shows our study to be the most

Gustavo M. D. Vieira; Luiz E. Buzato

92

Modified Weibull-derived spectrum for deep water significant wave height estimation

The modified Weibull spectrum is utilized to calculate the zeroth spectral moment (m_0) using Monte Carlo integration methods. The significant wave height (H_s) is then calculated using the formula H_s = 4√m_0. This is validated with observed buoy data and numerical wave model (WAM) predicted significant wave heights. The Weibull parameters have been calculated using energy
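
The H_s = 4√m_0 step, with m_0 obtained by Monte Carlo integration of the spectrum, can be sketched as follows. The spectrum used here is a generic illustrative shape with invented coefficients, not the paper's modified Weibull-derived spectrum:

```python
import math
import random

random.seed(1)

def spectrum(f, a=1e-3, b=3.125e-5):
    # Illustrative one-sided wave spectrum S(f) in m^2/Hz; a placeholder
    # shape, chosen only so the integral is analytically checkable
    return a * f ** -5 * math.exp(-4.0 * b * f ** -4)

# Monte Carlo estimate of the zeroth moment m0 = integral of S(f) df
f_lo, f_hi = 0.03, 1.0    # frequency band in Hz (assumed)
n = 200_000
m0 = (f_hi - f_lo) * sum(spectrum(random.uniform(f_lo, f_hi))
                         for _ in range(n)) / n
hs = 4.0 * math.sqrt(m0)  # significant wave height in metres
```

For this spectrum the exact band integral is very close to a/(16b) = 2.0 m², so the Monte Carlo H_s should land near 4√2 ≈ 5.66 m.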

G. Muraleedharan; Mourani Sinha; A. D. Rao; G. Latha; S. K. Dube

2009-01-01

93

Towards Distributed Memory Parallel Program Analysis

This paper presents a parallel attribute evaluation for distributed memory parallel computer architectures, where previously only shared memory parallel support for this technique had been developed. Attribute evaluation is part of how attribute grammars are used for program analysis within modern compilers. Within this work, we have extended ROSE, an open compiler infrastructure, with a distributed memory parallel attribute evaluation mechanism to support user-defined global program analysis required for some forms of security analysis which cannot be addressed by a file-by-file view of large scale applications. As a result, user-defined security analyses may now run in parallel without the user having to specify the way data is communicated between processors. The automation of communication enables an extensible open-source parallel program analysis infrastructure.

Quinlan, D; Barany, G; Panas, T

2008-06-17

94

CRAB: Distributed analysis tool for CMS

NASA Astrophysics Data System (ADS)

CMS has a distributed computing model, based on a hierarchy of tiered regional computing centers and adopts a data driven model for the end user analysis. This model foresees that jobs are submitted to the analysis resources where data are hosted. The increasing complexity of the whole computing infrastructure makes the simple analysis work flow more and more complicated for the end user. CMS has developed and deployed a dedicated tool named CRAB (CMS Remote Analysis Builder) in order to guarantee the physicists an efficient access to the distributed data whilst hiding the underlying complexity. This tool is used by CMS to enable the running of physics analysis jobs in a transparent manner over data distributed across sites. It factorizes out the interaction with the underlying batch farms, grid infrastructure and CMS data management tools, allowing the user to deal only with a simple and intuitive interface. We present the CRAB architecture, as well as the current status and lessons learnt in deploying this tool for use by the CMS collaboration. We also present the future development of the CRAB system.

Sala, Leonardo; CMS Collaboration

2012-12-01

95

Assessing a Tornado Climatology from Global Tornado Intensity Distributions

Recent work demonstrated that the shape of tornado intensity distributions from various regions worldwide is well described by Weibull functions. This statistical modeling revealed a strong correlation between the fit parameters c for shape and b for scale regardless of the data source. In the present work it is shown that the quality of the Weibull fits is optimized if

Bernold Feuerstein; Nikolai Dotzek; Jürgen Grieser

2005-01-01

96

The analysis of SWIFT GRB redshift distribution.

NASA Astrophysics Data System (ADS)

At the beginning of 2008 the SWIFT GRB set with known redshift consisted of approximately 100 bursts, most of which (≈92%) are not short bursts. In this article the GRB redshift distribution for bursts with duration t90 > 2 s from the SWIFT catalogue is presented and its shape is discussed. The shape of the redshift distribution for a uniform set of sources in our Metagalaxy is defined by the properties of space, which is Euclidean at small redshifts, and by the cosmological parameters. As examples of real uniform sets, the shapes of the normalized z-distributions for the set of SNIa used to determine Ω_M and Ω_Λ for our Metagalaxy and for the first 604 QSO from the 2QZ 6QZ catalogue are analyzed. It is revealed that 2-9% of events, depending on the sample size, must lie in the tails of the distributions above the 3σ level for a significant one-peak fit of a uniform set at a 95-99% confidence level. However, analysis of the single-peak approximation of the GRB redshift distribution has shown that it has a very heavy tail containing 37% of the set, and the confidence level of this fit is only ≈70%. Analysis of a two-peak fit of the GRB z-distribution gives characteristic redshifts z = 0.8 ± 0.1 and z = 2.8 ± 0.4 for the two burst subgroups. Only (4 ± 2)% of GRB lie outside the 3σ level for this fit, which corresponds to a combination of two uniform subsets. This approximation is more significant than the single-peak one; its confidence level is ≈95% and is limited only by the size of the GRB set with known redshift. Thus, it allows the conclusion that the SWIFT GRB source set is not uniform, and at least two subgroups can be separated in the redshift distribution of long GRB.

Arkhangelskaja, Irene

97

Effect of covariate omission in Weibull accelerated failure time model: A caution.

The accelerated failure time model is presented as an alternative to the proportional hazard model in the analysis of survival data. We investigate the effect of covariate omission when applying a Weibull accelerated failure time model. In an uncensored setting, the asymptotic bias of the treatment effect is theoretically zero when important covariates are omitted; however, the asymptotic variance estimator of the treatment effect can be biased, and the size of the Wald test for the treatment effect is then likely to exceed the nominal level. In some cases, the test size can be more than twice the nominal level. In a simulation study, in both censored and uncensored settings, the Type I error for the test of the treatment effect was likely to be inflated when prognostic covariates were omitted. This work cautions against careless use of the accelerated failure time model. We recommend the robust sandwich variance estimator in order to avoid inflation of the Type I error in the accelerated failure time model, although the robust variance is not commonly used in survival data analyses. PMID:24895154

Gosho, Masahiko; Maruo, Kazushi; Sato, Yasunori

2014-11-01

98

Analysis and control of distributed cooperative systems.

As part of DARPA Information Processing Technology Office (IPTO) Software for Distributed Robotics (SDR) Program, Sandia National Laboratories has developed analysis and control software for coordinating tens to thousands of autonomous cooperative robotic agents (primarily unmanned ground vehicles) performing military operations such as reconnaissance, surveillance and target acquisition; countermine and explosive ordnance disposal; force protection and physical security; and logistics support. Due to the nature of these applications, the control techniques must be distributed, and they must not rely on high bandwidth communication between agents. At the same time, a single soldier must easily direct these large-scale systems. Finally, the control techniques must be provably convergent so as not to cause undue harm to civilians. In this project, provably convergent, moderate communication bandwidth, distributed control algorithms have been developed that can be regulated by a single soldier. We have simulated in great detail the control of low numbers of vehicles (up to 20) navigating throughout a building, and we have simulated in lesser detail the control of larger numbers of vehicles (up to 1000) trying to locate several targets in a large outdoor facility. Finally, we have experimentally validated the resulting control algorithms on smaller numbers of autonomous vehicles.

Feddema, John Todd; Parker, Eric Paul; Wagner, John S.; Schoenwald, David Alan

2004-09-01

99

EXPERIMENTAL DESIGN STRATEGY FOR THE WEIBULL DOSE RESPONSE MODEL (JOURNAL VERSION)

The objective of the research was to determine optimum design point allocation for estimation of relative yield losses from ozone pollution when the true and fitted yield-ozone dose response relationship follows the Weibull. The optimum design is dependent on the values of the We...

100

A review of integrated analysis of production–distribution systems

This paper reviews recent work on integrated analysis of production–distribution systems, and identifies important areas where further research is needed. By integrated analysis we understand analysis performed on models that integrate decisions of different production and distribution functions for a simultaneous optimization. We review work that explicitly considers the transportation system in the analysis, since we are interested in the

Ana Maria Sarmiento; Rakesh Nagi

1999-01-01

101

Modeling distributions of stem characteristics of genetically improved loblolly pine

succeeding mathematical expressions (Bailey and Dell 1973). The Weibull has also been adapted for predicting changes in the diameter distribution with stand age (Bailey 1980). The Weibull function has been utilized in several recent yield studies. Yield...-normal, gamma, and normal distributions, were fit to diameter and height data from even-aged pine stands. Hafley and Schreuder measured the flexibility of the six distributions in regard to their changes in shape through the use of the skewness coefficient...

Janssen, Jill Elizabeth

2012-06-07

102

Buffered Communication Analysis in Distributed Multiparty Sessions

NASA Astrophysics Data System (ADS)

Many communication-centred systems today rely on asynchronous messaging among distributed peers to make efficient use of parallel execution and resource access. With such asynchrony, the communication buffers can grow without bound over time. This paper proposes a static verification methodology based on multiparty session types which can efficiently compute the upper bounds on buffer sizes. Our analysis relies on a uniform causality audit of the entire collaboration pattern - an examination that is not always possible from each end-point type. We extend this method to design algorithms that allocate communication channels in order to optimise the memory requirements of session executions. From these analyses, we propose two refinement methods which respect buffer bounds: a global protocol refinement that automatically inserts confirmation messages to guarantee stipulated buffer sizes and a local protocol refinement to optimise asynchronous messaging without buffer overflow. Finally our work is applied to overcome a buffer overflow problem of the multi-buffering algorithm.

Deniélou, Pierre-Malo; Yoshida, Nobuko

103

Stochastic Analysis of Wind Energy for Wind Pump Irrigation in Coastal Andhra Pradesh, India

NASA Astrophysics Data System (ADS)

The rapid escalation in the prices of oil and gas, as well as increasing demand for energy, has attracted the attention of scientists and researchers to explore the possibility of generating and utilizing alternative and renewable wind energy in the long coastal belt of India with considerable wind energy resources. A detailed analysis of wind potential is a prerequisite to harvesting wind energy resources efficiently. Keeping this in view, the present study analyzed the wind energy potential to assess the feasibility of a wind-pump operated irrigation system in the coastal region of Andhra Pradesh, India, where high ground water table conditions are available. The wind speed data were tested stochastically for fit to a probability distribution that describes the wind energy potential in the region. The normal and Weibull probability distributions were tested, and on the basis of the chi-square test the Weibull distribution gave better results. Hence, it was concluded that the Weibull probability distribution may be used to stochastically describe the annual wind speed data of coastal Andhra Pradesh with better accuracy. The size of the complete irrigation system was also determined with mass curve analysis to satisfy daily irrigation demands at various risk levels.
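
A minimal version of fitting a Weibull to wind speed data and recovering its parameters, here using synthetic data and the common empirical moment estimator k ≈ (σ/μ)^−1.086; the study's own estimation and chi-square testing details are not reproduced:

```python
import math
import random

random.seed(7)

# Synthetic hourly wind speeds from a known Weibull (shape k=2, scale c=6 m/s),
# drawn by inverse-CDF sampling; real station data would be used instead
speeds = [6.0 * (-math.log(1.0 - random.random())) ** (1.0 / 2.0)
          for _ in range(5000)]

mean = sum(speeds) / len(speeds)
var = sum((v - mean) ** 2 for v in speeds) / (len(speeds) - 1)

k_hat = (math.sqrt(var) / mean) ** -1.086        # empirical shape estimate
c_hat = mean / math.gamma(1.0 + 1.0 / k_hat)     # scale recovered from the mean
```

With 5000 samples the estimates land close to the true (k, c) = (2, 6), illustrating why moment-based Weibull fits are a standard first pass on wind data.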

Raju, M. M.; Kumar, A.; Bisht, D.; Rao, D. B.

2014-09-01

104

Group reaction time distributions and an analysis of distribution statistics

Describes a method of obtaining an average reaction time (RT) distribution for a group of Ss. The method is particularly useful for cases in which data from many Ss are available but there are only 10–20 RT observations per S cell. Essentially, RTs for each S are organized in ascending order, and quantiles are calculated. The quantiles are then averaged across Ss.
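The averaging procedure sketched in the abstract (sort each S's RTs, compute quantiles, average the quantiles across Ss) can be written compactly; the subject counts, trial counts, and RT distribution below are synthetic assumptions:

```python
# Minimal sketch of group quantile averaging for RT distributions:
# per-subject quantiles are computed, then averaged element-wise.
import numpy as np

rng = np.random.default_rng(0)
n_subjects, n_trials = 20, 15          # ~10-20 RT observations per S cell
quantile_points = np.linspace(0.1, 0.9, 5)

# Each subject contributes a small sample of RTs (ms); shifted gamma is a
# conventional stand-in for an RT-like positively skewed distribution.
subject_rts = [300 + rng.gamma(shape=2.0, scale=80.0, size=n_trials)
               for _ in range(n_subjects)]

# Quantiles per subject, then the group distribution as their average.
per_subject = np.array([np.quantile(rts, quantile_points) for rts in subject_rts])
group_quantiles = per_subject.mean(axis=0)
print(dict(zip(quantile_points.round(1), group_quantiles.round(1))))
```

Because averaging is done quantile-by-quantile rather than on raw RTs, the group distribution preserves the typical shape of the individual distributions.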

Roger Ratcliff

1979-01-01

105

Holistic schedulability analysis for distributed hard real-time systems

This paper extends the current analysis associated with static priority pre-emptive based scheduling to address the wider problem of analysing schedulability of a distributed hard real-time system; in particular it derives analysis for a distributed system where tasks with arbitrary deadlines communicate by message passing and shared data areas. A simple TDMA protocol is assumed, and analysis developed to bound

Ken Tindell; John Clark

1994-01-01

106

It has been shown in [2] that the strength of aramid fibers can be described with a sufficient degree of accuracy by means of the statistical theory of strength [3]. Weibull's distribution was taken as the distribution function for the strength of monofibers. It was found that Weibull's function can be unimodal or bimodal, depending on the structure of the

L. V. Kompaniets; V. V. Potapov; G. A. Grigoryan; A. M. Kuperman; L. V. Puchkov; É. S. Zelenskii; É. V. Prut; N. S. Enikolopyan

1983-01-01

107

Time-dependent reliability analysis of ceramic engine components

NASA Technical Reports Server (NTRS)

The computer program CARES/LIFE calculates the time-dependent reliability of monolithic ceramic components subjected to thermomechanical and/or proof test loading. This program is an extension of the CARES (Ceramics Analysis and Reliability Evaluation of Structures) computer program. CARES/LIFE accounts for the phenomenon of subcritical crack growth (SCG) by utilizing either the power or Paris law relations. The two-parameter Weibull cumulative distribution function is used to characterize the variation in component strength. The effects of multiaxial stresses are modeled using either the principle of independent action (PIA), the Weibull normal stress averaging method (NSA), or the Batdorf theory. Inert strength and fatigue parameters are estimated from rupture strength data of naturally flawed specimens loaded in static, dynamic, or cyclic fatigue. Two example problems demonstrating proof testing and fatigue parameter estimation are given.
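The two-parameter Weibull model used to characterize strength variation has the standard form P_f(σ) = 1 − exp(−(σ/σ₀)^m). The sketch below evaluates it with an illustrative Weibull modulus m and characteristic strength σ₀; these are assumed values, not parameters from CARES/LIFE:

```python
# Sketch of the two-parameter Weibull strength model: the probability that a
# component fails at or below a given applied stress. Parameter values are
# illustrative assumptions only.
import math

def failure_probability(stress, m=10.0, sigma_0=500.0):
    """P_f = 1 - exp(-(stress/sigma_0)^m), stress and sigma_0 in MPa."""
    return 1.0 - math.exp(-((stress / sigma_0) ** m))

for s in (300, 400, 500, 600):
    print(f"{s} MPa -> P_f = {failure_probability(s):.4f}")
```

At the characteristic strength σ₀, P_f is exactly 1 − e⁻¹ ≈ 0.632; a larger modulus m makes the transition from survival to failure sharper, reflecting lower scatter in strength.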

Nemeth, Noel N.

1993-01-01

108

PERFORMANCE ANALYSIS OF DISTRIBUTED DATA BASE SYSTEMS

the design of a distributed relational data base system. Then, we discuss experimental observations ... distribute E at Berkeley where E.dept = "shoe"; at Paris where E.dept = "toy"; at Boston where E.dept != "toy" and E.dept != "shoe". Berkeley, Paris and Boston are logical names of machines which are mapped to site

California at Irvine, University of

109

Functional Analysis of Manufacturing Execution System Distribution

Manufacturing Execution Systems are essential elements for vertical integration in modern automation systems because they provide the link between the centralized enterprise level and the distributed shop floor. Existing centralized MES implementations cannot, however, properly cope with an unpredictable order flow, changes on the shop floor, or the increasing complexity of both. Distribution of the MES functionalities would

Aleksey Bratukhin; Thilo Sauter

2011-01-01

110

Shark: Fast Data Analysis Using Coarse-grained Distributed Memory

Shark: Fast Data Analysis Using Coarse-grained Distributed Memory. Cliff Engle, Antonio Lupher, et al., UC Berkeley. Abstract: Shark is a research data analysis system built on a novel coarse-grained distributed shared-memory abstraction. Shark

California at Irvine, University of

111

Automatically acquiring phrase structure using distributional analysis

In this paper, we present evidence that the acquisition of the phrase structure of a natural language is possible without supervision and with a very small initial grammar. We describe a language learner that extracts distributional information from a corpus annotated with parts of speech and is able to use this extracted information to accurately parse short sentences. The phrase

Eric Brill; Mitchell Marcus

1992-01-01

112

A Framework for Callosal Fiber Distribution Analysis

This paper presents a framework for analyzing the spatial distribution of neural fibers in the brain, with emphasis on interhemispheric fiber bundles crossing through the corpus callosum. The proposed approach combines methodologies for fiber tracking and spatial normalization and is applied on diffusion tensor images and standard magnetic resonance images.

Dongrong Xu; Susumu Mori; Meiyappan Solaiyappan; Peter C. M. van Zijl; Christos Davatzikos

2002-01-01

113

DISTRIBUTION SYSTEM RELIABILITY ANALYSIS USING A MICROCOMPUTER

Distribution system reliability for most utilities is maintained by the knowledge of a few key personnel. Generally, these water maintenance personnel use a good memory, repair records, a large wall map and a hydraulic model of the larger transmission mains to help identify probl...

114

Economic analysis of efficient distribution transformer trends

This report outlines an approach that accounts for uncertainty in the development of the evaluation factors used to identify transformer designs with the lowest total owning cost (TOC). The TOC methodology is described and the most highly variable parameters are discussed. The model is developed to account for uncertainties as well as statistical distributions of the important parameters. Sample calculations are presented. The TOC methodology is applied to data provided by two utilities in order to test its validity.
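One common form of the TOC evaluation is TOC = bid price + A·(no-load loss) + B·(load loss), where the A and B factors capitalize the two loss components. Propagating uncertainty in those factors can be sketched with a simple Monte Carlo; all prices, loss values, and distributions below are illustrative assumptions, not the report's data:

```python
# Hedged sketch: Monte Carlo propagation of uncertain A/B loss-evaluation
# factors through the total owning cost formula TOC = price + A*NLL + B*LL.
import numpy as np

rng = np.random.default_rng(1)
n = 100_000
price = 10_000.0                       # bid price, $ (illustrative)
A = rng.normal(3.0, 0.5, n)            # $/W factor for no-load loss
B = rng.normal(1.0, 0.2, n)            # $/W factor for load loss
nll, ll = 150.0, 700.0                 # rated no-load and load losses, W

toc = price + A * nll + B * ll
print(f"mean TOC = ${toc.mean():,.0f}, 95th pct = ${np.percentile(toc, 95):,.0f}")
```

The spread of the resulting TOC distribution, rather than a single point value, is what lets competing transformer designs be compared under uncertainty.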

Downing, D.J.; McConnell, B.W.; Barnes, P.R.; Hadley, S.W.; Van Dyke, J.W.

1998-03-01

115

Statistical distribution analysis of rubber fatigue data

NASA Astrophysics Data System (ADS)

Average rubber fatigue resistance has previously been related to such factors as elastomer type, cure system, cure temperature, and stress history. This paper extends this treatment to a full statistical analysis of rubber fatigue data. Analyses of laboratory fatigue data are used to predict service life. Particular emphasis is given to the prediction of early tire splice failures, and to adaptations of statistical fatigue analysis for the particular service conditions of the rubber industry.

DeRudder, J. L.

1981-10-01

116

Mechanical Network in Titin Immunoglobulin from Force Distribution Analysis

Mechanical Network in Titin Immunoglobulin from Force Distribution Analysis. Wolfram Stacklies et al., Stuttgart, Germany. Abstract: The role of mechanical force in cellular processes is increasingly revealed. How force propagates within proteins determines their mechanical behavior yet remains largely unknown. We

Gräter, Frauke

117

WATER DISTRIBUTION SYSTEM ANALYSIS: FIELD STUDIES, MODELING AND MANAGEMENT

The user's guide entitled "Water Distribution System Analysis: Field Studies, Modeling and Management" is a reference guide for water utilities and an extensive summarization of information designed to provide drinking water utility personnel (and related consultants and research...

118

PARTICULATE SIZE DISTRIBUTIONS AND SEQUENTIAL FRAGMENTATION/TRANSPORT THEORY

Particulate size distributions offer the scientist important clues about the mechanism(s) responsible for their formation. These distributions are complex, often being the combination of several subpopulations that bear the signature of specific fragmentation and particulate transport processes. Historically, such distributions have been characterized by some empirical distribution law such as the lognormal or Weibull distributions, but such

Wilbur Brown

119

The Distribution Analysis of the Inflation Components of Turkey

This paper investigates the distribution of inflation items using various measures of skewness and kurtosis for Turkey, covering the period 1996-2007. Considering the sensitivity of traditional distribution measures to outlying observations, robust skewness and kurtosis are also computed as a novelty. Analysis results mainly reveal that inflation components are right-skewed and fat-tailed, as documented by previous studies. However, due

A. Nazif Catik; A. Özlem Önder

2009-01-01

120

A Framework for Callosal Fiber Distribution Analysis

Dongrong Xu, Susumu Mori, Meiyappan Solaiyappan, et al. This paper presents a framework for analyzing the spatial distribution of neural fibers in the brain, with emphasis on interhemispheric fiber bundles crossing through the corpus callosum. The proposed approach

121

The Complexity of Distributed Collaborative Learning: Unit of Analysis

The Complexity of Distributed Collaborative Learning: Unit of Analysis. Annita Fjuk, Telenor R... Distributed collaborative learning is a product of complex interconnections between several aspects, such as theories ... The core of this paper is manifested in the new conditions that characterise distributed collaborative learning.

Boyer, Edmond

122

An Information-Theoretic Analysis of Distributed Resource Allocation

An Information-Theoretic Analysis of Distributed Resource Allocation. Tansu Alpcan and Subhrakanti ... between the system and its users. This information exchange is, however, limited by communication ... of information (flow) in a well-known distributed resource allocation algorithm using concepts from Shannon

Alpcan, Tansu

123

Analysis statistical fluctuation for quantum key distribution system

Every security analysis of quantum key distribution relies on a faithful modeling of the employed quantum states. The properties of a decoy-state protocol based on only two decoy states and one signal state are analyzed. The performance of the decoy-state quantum key distribution protocol with finite resources is calculated by considering the statistical fluctuation of the yield and error rate

Jiao Rong-zhen; Han Qing-yao; Tang Shao-jie

2010-01-01

124

Software component quality has a major influence on software development project performance measures such as lead time, time to market, and cost. It also affects other projects within the organization, the people assigned to the projects, and the organization in general. A software development organization must have indicators and predictions of software component quality and of project performance in general. One of the

Lovre Hribar

2009-01-01

125

The aim of this research was to study the behaviour of the drying kinetics of pepino fruit (Solanum muricatum Ait.) at five temperatures (50, 60, 70, 80 and 90 °C). In addition, desorption isotherms were determined at 20, 40 and 60 °C over a water activity range from 0.10 to 0.90. The Guggenheim, Anderson and de Boer model was suitable to depict

Elsa Uribe; Antonio Vega-Gálvez; Karina Di Scala; Romina Oyanadel; Jorge Saavedra Torrico; Margarita Miranda

126

QMPE: Estimating Lognormal, Wald and Weibull RT distributions with a parameter-dependent lower bound

Penultimate draft, in press, Behavior Research Methods, Instruments and Computers, 2004. ... with parameter-dependent lower bounds.

Cousineau, Denis

127

Likelihood analysis of earthquake focal mechanism distributions

In an earlier paper we discussed forecasts of earthquake focal mechanisms and ways to test forecast efficiency. Several verification methods were proposed, but they were based on ad hoc, empirical assumptions, so their performance is questionable. In this work we apply a conventional likelihood method to measure the skill of a forecast. The advantage of such an approach is that earthquake rate prediction can in principle be adequately combined with focal mechanism forecasts, if both are based on likelihood scores, resulting in a general forecast optimization. To calculate the likelihood score we need to compare actual forecasts or occurrences of predicted events with the null hypothesis that the mechanism's 3-D orientation is random. For double-couple source orientation the random probability distribution function is not uniform, which complicates the calculation of the likelihood value. To better understand the resulting complexities we calculate the information (likelihood) score for two rota...

Kagan, Y Y

2014-01-01

128

Distribution System Analysis Tools for Studying High Penetration of PV

Distribution System Analysis Tools for Studying High Penetration of PV with Grid Support Features ... In order to understand and analyze the impact of high penetration of inverter-interfaced PV ...

129

Credibility in Context: An Analysis of Feature Distributions in Twitter

Credibility in Context: An Analysis of Feature Distributions in Twitter. John O'Donovan, Byungkyu ... Abstract: Twitter is a major forum for rapid dissemination of user-provided content in real time. As such ... focus on an analysis that highlights the utility of the individual features in Twitter, such as hashtags

Hollerer, Tobias

130

Building finite element analysis programs in distributed services environment

Traditional finite element analysis (FEA) programs are typically built as stand-alone desktop software. This paper describes an Internet-enabled framework that facilitates building a FEA program as distributed web services. The framework allows users easy access to the FEA core service and the analysis results by using a web browser or other application programs. In addition, the framework enables new as

Jun Peng; Kincho H. Law

2004-01-01

131

Complex network analysis of water distribution systems

This paper explores a variety of strategies for understanding the formation, structure, efficiency and vulnerability of water distribution networks. Water supply systems are studied as spatially organized networks for which the practical applications of abstract evaluation methods are critically evaluated. Empirical data from benchmark networks are used to study the interplay between network structure and operational efficiency, reliability and robustness. Structural measurements are undertaken to quantify properties such as redundancy and optimal-connectivity, herein proposed as constraints in network design optimization problems. The role of the supply-demand structure towards system efficiency is studied and an assessment of the vulnerability to failures based on the disconnection of nodes from the source(s) is undertaken. The absence of conventional degree-based hubs (observed through uncorrelated non-heterogeneous sparse topologies) prompts an alternative approach to studying structural vulnerability based on the identification of network cut-sets and optimal connectivity invariants. A discussion on the scope, limitations and possible future directions of this research is provided.

A. Yazdani; P. Jeffrey

2011-04-01

132

A single neuronal model incorporating distributed delay (memory) is proposed. The stochastic model has been formulated as a Stochastic Integro-Differential Equation (SIDE) which results in the underlying process being non-Markovian. A detailed analysis of the model when the distributed delay kernel has exponential form (weak delay) has been carried out. The selection of exponential kernel has enabled the transformation of the

Karmeshu; Varun Gupta; K. V. Kadambari

2011-01-01

133

Modeling and analysis of solar distributed generation

NASA Astrophysics Data System (ADS)

Recent changes in the global economy are having a big impact on our daily life. The price of oil is increasing and reserves are dwindling. Dramatic demographic changes are also affecting the viability of the electric infrastructure and ultimately the economic future of the industry. These are some of the reasons many countries are looking to alternative energy sources to produce electric energy. The most common form of green energy in our daily life is solar energy. Converting solar energy into electrical energy requires solar panels, dc-dc converters, power control, sensors, and inverters. In this work, a photovoltaic module (PVM) model using the electrical characteristics provided by the manufacturer data sheet is presented for power system applications. Experimental results from testing are shown, verifying the proposed PVM model. Also in this work, three maximum power point tracking (MPPT) algorithms are presented to obtain the maximum power from a PVM. The first MPPT algorithm is based on Rolle's and Lagrange's theorems and can provide at least an approximate answer to a family of transcendental functions that cannot be solved using differential calculus. The second MPPT algorithm is based on approximating the proposed PVM model using fractional polynomials, where the shape, boundary conditions, and performance of the proposed PVM model are satisfied. The third MPPT algorithm is based on determining the optimal duty cycle for a dc-dc converter given previous knowledge of the load or load-matching conditions. Also, four algorithms to calculate the effective irradiance level and temperature over a photovoltaic module are presented in this work.
The main reasons to develop these algorithms are monitoring climate conditions, eliminating temperature and solar irradiance sensors, reducing the cost of a photovoltaic inverter system, and developing new algorithms to be integrated with maximum power point tracking algorithms. Finally, several PV power applications are presented, including circuit analysis for a load connected to two different PV arrays, speed control for a dc motor connected to a PVM, and a novel single-phase photovoltaic inverter system using the Z-source converter.

Ortiz Rivera, Eduardo Ivan

134

Data exploration in meta-analysis with smooth latent distributions.

Meta-analysis with discrete outcomes is interpreted as the estimation (in one or two dimensions) of a non-parametric smooth latent distribution of event probabilities (or rates). A simple but efficient EM algorithm is presented. A fine grid is used and fast smoothing is done by penalized least squares. Data exploration is the primary goal, but the estimated distribution can also be used to compute useful statistics of treatment effects. PMID:17216668
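The fast smoothing step the abstract mentions can be illustrated with a Whittaker-style penalized-least-squares smoother, which minimizes ||y − z||² + λ||Dz||² for a difference matrix D. The penalty order, strength, and test signal below are illustrative assumptions, not necessarily the paper's exact formulation:

```python
# Minimal penalized-least-squares smoother on a fine grid: the closed-form
# solution is z = (I + lam * D'D)^{-1} y for a difference operator D.
import numpy as np

def pls_smooth(y, lam=10.0, order=2):
    """Solve min_z ||y - z||^2 + lam * ||D z||^2, D = order-th differences."""
    m = len(y)
    D = np.diff(np.eye(m), n=order, axis=0)   # (m-order) x m difference matrix
    return np.linalg.solve(np.eye(m) + lam * D.T @ D, y)

rng = np.random.default_rng(2)
x = np.linspace(0, 1, 100)
noisy = np.sin(2 * np.pi * x) + rng.normal(0, 0.2, x.size)
smooth = pls_smooth(noisy, lam=50.0)
```

Larger λ yields a smoother estimate; on a fine grid the linear solve stays fast because the penalized system is banded and small.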

Eilers, Paul H C

2007-07-30

135

Solar-cell design based on a distributed diode analysis

The front surface of a p-n junction solar cell has resistive losses associated with the diffused layer, the metal-semiconductor contact, and the grid structure. These losses are analyzed by considering the spatially distributed nature of the p-n junction and the grid conductors. This distributed diode analysis is especially useful for solar cells operated under concentrated sunlight conditions. The results show

J. L. Boone; T. P. van Doren

1978-01-01

136

Feature extraction from interferograms for phase distribution analysis

NASA Astrophysics Data System (ADS)

In several applications of interferogram analysis, e.g. automated nondestructive testing, it is necessary to detect irregular interference phase distributions or to compare interference phase distributions with each other. For that purpose it is useful to represent the essential information of phase distributions by characteristic features. We propose features which can be extracted both from interferograms as well as from phase distributions. For feature extraction we developed new image processing methods analyzing the local structure of gray-level images. The feature extraction is demonstrated with examples of a cantilever beam and a pressure vessel using holographic interferometry. Finally we show the use of the features for defect detection and phase distribution comparison.

Merz, Torsten; Paulus, Dietrich W.; Niemann, Heinrich

1998-06-01

137

Concerning the glass fiber strength distribution function

The statistical strength distribution functions of glass monofilaments of various composition are considered. On the basis of the experimental data it is shown that the fiber strength distributions can be described by a three-parameter function of the Weibull type.
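Fitting a three-parameter Weibull-type function of the kind described above (shape, threshold/location, scale) can be sketched with scipy's maximum-likelihood fitter; the synthetic strength data and parameter values below are assumptions for illustration only:

```python
# Sketch: maximum-likelihood fit of a three-parameter Weibull (shape, location
# threshold, scale) to synthetic fiber-strength data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
true_shape, true_loc, true_scale = 4.0, 1.0, 2.0   # GPa, illustrative
strengths = stats.weibull_min.rvs(true_shape, loc=true_loc,
                                  scale=true_scale, size=500, random_state=rng)

# All three parameters left free; scipy estimates them jointly by MLE.
shape, loc, scale = stats.weibull_min.fit(strengths)
print(f"shape={shape:.2f} loc={loc:.2f} scale={scale:.2f}")
```

The location (threshold) parameter is what distinguishes this model from the two-parameter form: it represents a stress below which no fiber fails, and the fitted value must lie below the smallest observed strength.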

S. L. Roginskii; V. S. Strelyaev; L. L. Sachkovskaya

1970-01-01

138

Requirements Analysis of Distribution in Databases for Telecommunications

Current and future telecommunication systems will rely heavily on some key IT baseline technologies including databases. We present our analysis of distribution requirements in databases for telecommunications. We discuss those needs in GSM and IN CS-2. Our requirements analysis indicates that the following are important issues in databases for telecommunications: single-node read and temporal transactions, clustered read and temporal transactions,

Juha Taina; Kimmo E. E. Raatikainen

1999-01-01

139

GIS-based poverty and population distribution analysis in China

NASA Astrophysics Data System (ADS)

Geographically, poverty status is not only related to social-economic factors but is also strongly affected by the geographical environment. In this paper, a GIS-based poverty and population distribution analysis method is introduced for revealing regional differences. More than 100000 poor villages and 592 national key poor counties are chosen for the analysis. The results show that poverty distribution tends to concentrate in most of west China and the mountainous rural areas of mid China. Furthermore, data from the fifth census are overlaid on those poor areas in order to capture their internal diversity of social-economic characteristics. By overlaying poverty-related social-economic parameters, such as sex ratio, illiteracy, education level, percentage of ethnic minorities, and family composition, the findings show that poverty distribution is strongly correlated with a high illiteracy rate, a high percentage of ethnic minorities, and larger family size.

Cui, Jing; Wang, Yingjie; Yan, Hong

2009-07-01

140

Energy loss analysis of an integrated space power distribution system

NASA Technical Reports Server (NTRS)

The results of studies related to conceptual topologies of an integrated utility-like space power system are described. The system topologies are comparatively analyzed by considering their transmission energy losses as functions of mainly distribution voltage level and load composition. The analysis is expedited by use of Distribution System Analysis and Simulation (DSAS) software. This recently developed computer program by the Electric Power Research Institute (EPRI) uses improved load models to solve the power flow within the system. However, present shortcomings of the software with regard to space applications, and incompletely defined characteristics of a space power system, make the results applicable only to the fundamental trends of energy losses of the topologies studied. Accounting, as included here, for the effects of the various parameters on system performance can form part of a planning tool for a space power distribution system.

Kankam, M. David; Ribeiro, P. F.

1992-01-01

141

Analysis of a Realistic Fault Model for Large Distributed Systems

Analysis of a Realistic Fault Model for Large Distributed Systems. M.H. Azadmanesh, A.W. Krings. ... will be transformed into a singular set of relations and tables so that their convergence rate and fault tolerance can ... for the convergence rate and fault tolerance of a broad family of convergent voting algorithms called Mean

Krings, Axel W.

142

Analysis of Hawaii Biomass Energy Resources for Distributed Energy Applications

Analysis of Hawaii Biomass Energy Resources for Distributed Energy Applications. Prepared for the State ... concentrations on a unit energy basis for sugar cane varieties and biomass samples ... Table 1-A: Analyses of biomass materials found in the State of Hawaii

143

Water quality based reliability analysis for water distribution networks

The research work in the past two decades on reliability analysis of water distribution networks (WDNs) is primarily focused on assessing the reliability from the point of view of hydraulics, that is, supplying the required quantity of water at the desired pressure, using various qualitative and quantitative measures. A minimum desired level of residual chlorine at consumer's tap is needed

Rajesh Gupta; Sushma Dhapade; Soumitra Ganguly; Pramod R. Bhave

2012-01-01

144

Shark: Fast Data Analysis Using Coarse-grained Distributed Memory.

National Technical Information Service (NTIS)

Shark is a research data analysis system built on a novel coarse-grained distributed shared-memory abstraction. Shark marries query processing with deep data analysis, providing a unified system for easy data manipulation using SQL and pushing sophistic...

C. Engle

2013-01-01

145

Analysis of particle distribution according to sizes with improved resolution

A new method of analysis of emulsions is proposed with the aid of light scattering, in which it is possible to separate in the scattered signal the contributions of the particles of different sizes and thus to determine the particle distribution according to the sizes with improved accuracy. The effect is achieved by introducing additionally an ultrasonic beam into the

Vladimir Rysakov; Feliks Rejmund

2004-01-01

146

Data synthesis and display programs for wave distribution function analysis

NASA Technical Reports Server (NTRS)

At the National Space Science Data Center (NSSDC) software was written to synthesize and display artificial data for use in developing the methodology of wave distribution analysis. The software comprises two separate interactive programs, one for data synthesis and the other for data display.

Storey, L. R. O.; Yeh, K. J.

1992-01-01

147

Rapid Analysis of Mass Distribution of Radiation Shielding

NASA Technical Reports Server (NTRS)

Radiation Shielding Evaluation Toolset (RADSET) is a computer program that rapidly calculates the spatial distribution of mass of an arbitrary structure for use in ray-tracing analysis of the radiation-shielding properties of the structure. RADSET was written to be used in conjunction with unmodified commercial computer-aided design (CAD) software that provides access to data on the structure and generates selected three-dimensional-appearing views of the structure. RADSET obtains raw geometric, material, and mass data on the structure from the CAD software. From these data, RADSET calculates the distribution(s) of the masses of specific materials about any user-specified point(s). The results of these mass-distribution calculations are imported back into the CAD computing environment, wherein the radiation-shielding calculations are performed.

Zapp, Edward

2007-01-01

148

Analyzing Distributed Functions in an Integrated Hazard Analysis

NASA Technical Reports Server (NTRS)

Large scale integration of today's aerospace systems is achievable through the use of distributed systems. Validating the safety of distributed systems is significantly more difficult as compared to centralized systems because of the complexity of the interactions between simultaneously active components. Integrated hazard analysis (IHA), a process used to identify unacceptable risks and to provide a means of controlling them, can be applied to either centralized or distributed systems. IHA, though, must be tailored to fit the particular system being analyzed. Distributed systems, for instance, must be analyzed for hazards in terms of the functions that rely on them. This paper will describe systems-oriented IHA techniques (as opposed to traditional failure-event or reliability techniques) that should be employed for distributed systems in aerospace environments. Special considerations will be addressed when dealing with specific distributed systems such as active thermal control, electrical power, command and data handling, and software systems (including the interaction with fault management systems). Because of the significance of second-order effects in large scale distributed systems, the paper will also describe how to analyze secondary functions to secondary functions through the use of channelization.

Morris, A. Terry; Massie, Michael J.

2010-01-01

149

Parametrizing the local dark matter speed distribution: A detailed analysis

NASA Astrophysics Data System (ADS)

In a recent paper, a new parametrization for the dark matter (DM) speed distribution f(v) was proposed for use in the analysis of data from direct detection experiments. This parametrization involves expressing the logarithm of the speed distribution as a polynomial in the speed v. We present here a more detailed analysis of the properties of this parametrization. We show that the method leads to statistically unbiased mass reconstructions and exact coverage of credible intervals. The method performs well over a wide range of DM masses, even when finite energy resolution and backgrounds are taken into account. We also show how to select the appropriate number of basis functions for the parametrization. Finally, we look at how the speed distribution itself can be reconstructed, and how the method can be used to determine if the data are consistent with some test distribution. In summary, we show that this parametrization performs consistently well over a wide range of input parameters and over large numbers of statistical ensembles and can therefore reliably be used to reconstruct both the DM mass and speed distribution from direct detection data.

Kavanagh, Bradley J.

2014-04-01

150

Area and Flux Distributions of Active Regions, Sunspot Groups, and Sunspots: A Multi-Database Study

In this work we take advantage of eleven different sunspot group, sunspot, and active region databases to characterize the area and flux distributions of photospheric magnetic structures. We find that, when taken separately, different databases are better fitted by different distributions (as has been reported previously in the literature). However, we find that all our databases can be reconciled by the simple application of a proportionality constant, and that, in reality, different databases are sampling different parts of a composite distribution. This composite distribution is made up by a linear combination of Weibull and log-normal distributions, where a pure Weibull (log-normal) characterizes the distribution of structures with fluxes below (above) $10^{21}$ Mx ($10^{22}$ Mx). Additionally, we demonstrate that the Weibull distribution shows the expected linear behaviour of a power-law distribution (when extended into smaller fluxes), making our results compatible with the results of Parnell et al. (200...
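A composite distribution of the kind described above, a linear combination of a Weibull and a log-normal over magnetic flux, can be sketched as a two-component mixture density. The mixture weight and all parameter values below are illustrative assumptions, not the paper's fitted values:

```python
# Hedged sketch: a Weibull + log-normal composite density over flux (Mx).
# A small Weibull shape (< 1) gives power-law-like behaviour at low flux,
# while the log-normal dominates at high flux.
import numpy as np
from scipy import stats

def composite_pdf(flux, w=0.6, wb_shape=0.5, wb_scale=1e21,
                  ln_sigma=1.0, ln_scale=1e22):
    """w * Weibull pdf + (1 - w) * log-normal pdf, evaluated at flux."""
    weibull = stats.weibull_min.pdf(flux, wb_shape, scale=wb_scale)
    lognorm = stats.lognorm.pdf(flux, ln_sigma, scale=ln_scale)
    return w * weibull + (1 - w) * lognorm

flux = np.logspace(19, 24, 6)
print(composite_pdf(flux))
```

Because each component integrates to one, any weight 0 ≤ w ≤ 1 keeps the mixture a proper density; fitting w and the component parameters to the pooled databases is the step the study performs.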

Muñoz-Jaramillo, Andrés; Windmueller, John C; Amouzou, Ernest C; Longcope, Dana W; Tlatov, Andrey G; Nagovitsyn, Yury A; Pevtsov, Alexei A; Chapman, Gary A; Cookson, Angela M; Yeates, Anthony R; Watson, Fraser T; Balmaceda, Laura A; DeLuca, Edward E; Martens, Petrus C H

2014-01-01

151

Spatial Distribution Analysis of Scrub Typhus in Korea

Objective: This study analyzes the spatial distribution of scrub typhus in Korea. Methods: A spatial distribution of Orientia tsutsugamushi occurrence using a geographic information system (GIS) is presented, and analyzed by means of spatial clustering and correlations. Results: The provinces of Gangwon-do and Gyeongsangbuk-do show a low incidence throughout the year. Some districts have almost identical environmental conditions for scrub typhus incidence. Land use change in districts does not directly affect the incidence rate. Conclusion: GIS analysis shows the spatial characteristics of scrub typhus. This research can be used to construct a spatial-temporal model to understand tsutsugamushi epidemics. PMID:24159523

Jin, Hong Sung; Chu, Chaeshin; Han, Dong Yeob

2013-01-01

152

HammerCloud: A Stress Testing System for Distributed Analysis

NASA Astrophysics Data System (ADS)

Distributed analysis of LHC data is an I/O-intensive activity which places large demands on the internal network, storage, and local disks at remote computing facilities. Commissioning and maintaining a site to provide an efficient distributed analysis service is therefore a challenge which can be aided by tools to help evaluate a variety of infrastructure designs and configurations. HammerCloud is one such tool; it is a stress testing service which is used by central operations teams, regional coordinators, and local site admins to (a) submit an arbitrary number of analysis jobs to a number of sites, (b) maintain at steady state a predefined number of jobs running at the sites under test, (c) produce web-based reports summarizing the efficiency and performance of the sites under test, and (d) present a web interface for historical test results to both evaluate progress and compare sites. HammerCloud was built around the distributed analysis framework Ganga, exploiting its API for grid job management. HammerCloud has been employed by the ATLAS experiment for continuous testing of many sites worldwide, and also during large scale computing challenges such as STEP'09 and UAT'09, where the scale of the tests exceeded 10,000 concurrently running and 1,000,000 total jobs over multi-day periods. In addition, HammerCloud is being adopted by the CMS experiment; the plugin structure of HammerCloud allows the execution of CMS jobs using their official tool (CRAB).

van der Ster, Daniel C.; Elmsheuser, Johannes; Úbeda García, Mario; Paladin, Massimo

2011-12-01

153

A comprehensive study of distribution laws for the fragments of Košice meteorite

In this study, we conduct a detailed analysis of the Košice meteorite fall (February 28, 2010), in order to derive a reliable law describing the mass distribution among the recovered fragments. In total, 218 fragments of the Košice meteorite, with a total mass of 11.285 kg, were analyzed. Bimodal Weibull, bimodal Grady and bimodal lognormal distributions are found to be the most appropriate for describing the Košice fragmentation process. Based on the assumption of bimodal lognormal, bimodal Grady, bimodal sequential and bimodal Weibull fragmentation distributions, we suggest that, prior to further extensive fragmentation in the lower atmosphere, the Košice meteoroid was initially represented by two independent pieces with cumulative residual masses of approximately 2 kg and 9 kg respectively. The smaller piece produced about 2 kg of multiple lightweight meteorite fragments with the mean around 12 g. The larger one resulted in 9 kg of meteorite fragments, recovered on the ground, including the...

Gritsevich, Maria; Kohout, Tomáš; Tóth, Juraj; Peltoniemi, Jouni; Turchak, Leonid; Virtanen, Jenni

2014-01-01

154

A comprehensive study of distribution laws for the fragments of Košice meteorite

NASA Astrophysics Data System (ADS)

In this study, we conduct a detailed analysis of the Košice meteorite fall (February 28, 2010), to derive a reliable law describing the mass distribution among the recovered fragments. In total, 218 fragments of the Košice meteorite, with a total mass of 11.285 kg, were analyzed. Bimodal Weibull, bimodal Grady, and bimodal lognormal distributions are found to be the most appropriate for describing the Košice fragmentation process. Based on the assumption of bimodal lognormal, bimodal Grady, bimodal sequential, and bimodal Weibull fragmentation distributions, we suggest that, prior to further extensive fragmentation in the lower atmosphere, the Košice meteoroid was initially represented by two independent pieces with cumulative residual masses of approximately 2 and 9 kg, respectively. The smaller piece produced about 2 kg of multiple lightweight meteorite fragments with the mean around 12 g. The larger one resulted in 9 kg of meteorite fragments, recovered on the ground, including the two heaviest pieces of 2.374 kg and 2.167 kg with the mean around 140 g. Based on our investigations, we conclude that two to three larger fragments of 500-1000 g each should exist, but were either not recovered or not reported by illegal meteorite hunters.

Gritsevich, Maria; Vinnikov, Vladimir; Kohout, Tomáš; Tóth, Juraj; Peltoniemi, Jouni; Turchak, Leonid; Virtanen, Jenni

2014-03-01

155

Thermal analysis of directly buried conduit heat-distribution systems

The calculations of heat losses and temperature fields for directly buried conduit heat distribution systems were performed using finite element computer programs. The finite element analysis solved two-dimensional, steady-state heat transfer problems involving two insulated parallel pipes encased in the same conduit casing and in separate casings, and the surrounding earth. Descriptions of the theoretical basis, computational scheme, and the data inputs and outputs of the developed computer programs are presented. Numerical calculations were carried out for predicting the temperature distributions within the existing high temperature hot water distribution system and two insulated pipes covered in the same metallic conduit and the surrounding soil. The predicted results generally agree with the experimental data obtained at the test site.

Fang, J.B.

1990-08-01

156

Vibration analysis of distributed-lumped rotor systems

In this paper a distributed-lumped model for the analysis of the flexural vibrations of a rotor-bearing system is considered. A general formula for the determinant of the tri-diagonal partitioned matrix description of the system is derived. This enables the irrational characteristic determinant of the system model to be obtained by the dynamic stiffness matrix method. The results obtained are compared

M. Aleyaasin; M. Ebrahimi; R. Whalley

2000-01-01

157

Distributive analysis of rural land size and price relationships

[Table-of-contents fragments recovered in place of an abstract:] Market Movements; Frictional Adjustments; Economies of Size; Factors of Consideration; Market Division; Methodology (Bases, Percentages, Blocks, Groups, Appreciation, Market Distribution); The Models; Statistical Analysis of Sale Price and Acres (General Price Levels, Significance Related to Blocks, Appreciation Rates); Summary.

Rothe, Robert Joseph

2012-06-07

158

Distribution System Reliability Analysis for Smart Grid Applications

NASA Astrophysics Data System (ADS)

Reliability of power systems is a key aspect in modern power system planning, design, and operation. The ascendance of the smart grid concept has provided high hopes of developing an intelligent network that is capable of being a self-healing grid, offering the ability to overcome the interruption problems that face the utility and cost it tens of millions in repair and loss. To address its reliability concerns, the power utilities and interested parties have spent an extensive amount of time and effort to analyze and study the reliability of the generation and transmission sectors of the power grid. Only recently has attention shifted to be focused on improving the reliability of the distribution network, the connection joint between the power providers and the consumers, where most of the electricity problems occur. In this work, we will examine the effect of the smart grid applications in improving the reliability of the power distribution networks. The test system used in conducting this thesis is the IEEE 34 node test feeder, released in 2003 by the Distribution System Analysis Subcommittee of the IEEE Power Engineering Society. The objective is to analyze the feeder for the optimal placement of the automatic switching devices and quantify their proper installation based on the performance of the distribution system. The measures will be the changes in the reliability system indices including SAIDI, SAIFI, and EUE. The goal is to design and simulate the effect of the installation of the Distributed Generators (DGs) on the utility's distribution system and measure the potential improvement of its reliability. The software used in this work is DISREL, which is intelligent power distribution software developed by General Reliability Co.

Aljohani, Tawfiq Masad
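The reliability indices named in the abstract above are simple ratios over outage records: SAIFI counts customer interruptions per customer served, SAIDI sums customer interruption hours per customer. A minimal sketch with invented outage data (all customer counts and durations are hypothetical):

```python
# Each outage record: (customers_interrupted, duration_hours); values invented.
outages = [(120, 1.5), (60, 0.5), (200, 2.0)]
total_customers = 1000  # hypothetical customers served by the feeder

# SAIFI: average number of sustained interruptions per customer served.
saifi = sum(c for c, _ in outages) / total_customers
# SAIDI: average interruption duration (hours) per customer served.
saidi = sum(c * d for c, d in outages) / total_customers
# CAIDI: average restoration time per interruption experienced.
caidi = saidi / saifi

print(saifi, saidi, caidi)  # 0.38 interruptions, 0.61 h, ~1.61 h per interruption
```

Placing switches or DGs so that fewer customers see each fault lowers the customer counts in the records, which is exactly how the indices register an improvement.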

159

Reliability assessment of aged electrical components in the presence of over-stresses (e.g. voltage surges) is not an easy task. In this paper, a new methodology for solving this problem is illustrated, which is based on a Bayesian approach applied to a novel Weibull stress-strength probabilistic model. This model holds, under proper simplifying hypotheses, for electrical components progressively degraded by service

E. Chiodo; G. Mazzanti

2006-01-01

160

PERISCOPE: An Online-Based Distributed Performance Analysis Tool

NASA Astrophysics Data System (ADS)

This paper presents PERISCOPE - an online distributed performance analysis tool that searches for a wide range of performance bottlenecks in parallel applications. It consists of a set of agents that capture and analyze application and hardware-related properties in an autonomous fashion. The paper focuses on the Periscope design, the different search methodologies, and the steps involved in doing an online performance analysis. A new graphical user-friendly interface based on Eclipse is introduced. Through the use of this new easy-to-use graphical interface, remote execution, selection of the type of analysis, and the inspection of the found properties can be performed in an intuitive and easy way. In addition, a real-world application, namely the GENE code, a grand challenge problem of plasma physics, is analyzed using Periscope. The results are illustrated in terms of found properties and scalability issues.

Benedict, Shajulin; Petkov, Ventsislav; Gerndt, Michael

161

Scalable Visual Reasoning: Supporting Collaboration through Distributed Analysis

We present a visualization environment called the Scalable Reasoning System (SRS) that provides a suite of tools for the collection, analysis, and dissemination of reasoning products. This environment is designed to function across multiple platforms, bringing the display of visual information and the capture of reasoning associated with that information to both mobile and desktop clients. The service-oriented architecture of SRS promotes collaboration and interaction between users regardless of their location or platform. Visualization services allow data processing to be centralized and analysis results collected from distributed clients in real time. We use the concept of “reasoning artifacts” to capture the analytic value attached to individual pieces of information and collections thereof, helping to fuse the foraging and sense-making loops in information analysis. Reasoning structures composed of these artifacts can be shared across platforms while maintaining references to the analytic activity (such as interactive visualization) that produced them.

Pike, William A.; May, Richard A.; Baddeley, Bob; Riensche, Roderick M.; Bruce, Joe; Younkin, Katarina

2007-05-21

162

Clustering analysis of seismicity and aftershock identification.

We introduce a statistical methodology for clustering analysis of seismicity in the time-space-energy domain and use it to establish the existence of two statistically distinct populations of earthquakes: clustered and nonclustered. This result can be used, in particular, for nonparametric aftershock identification. The proposed approach expands the analysis of Baiesi and Paczuski [Phys. Rev. E 69, 066106 (2004); doi:10.1103/PhysRevE.69.066106] based on the space-time-magnitude nearest-neighbor distance η between earthquakes. We show that for a homogeneous Poisson marked point field with exponential marks, the distance η has the Weibull distribution, which bridges our results with classical correlation analysis for point fields. The joint 2D distribution of spatial and temporal components of η is used to identify the clustered part of a point field. The proposed technique is applied to several seismicity models and to the observed seismicity of southern California. PMID:18764159

Zaliapin, Ilya; Gabrielov, Andrei; Keilis-Borok, Vladimir; Wong, Henry

2008-07-01
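The Weibull law for nearest-neighbor distances invoked above can be verified numerically in the simplest setting: for a homogeneous planar Poisson process of intensity λ, P(R > r) = exp(-λπr²), a Weibull distribution with shape 2. A self-contained check by simulation on a unit torus (the point count is arbitrary; the torus metric avoids edge effects):

```python
import math, random

random.seed(0)
n = 400  # points in a unit torus; intensity lam = n
pts = [(random.random(), random.random()) for _ in range(n)]

def torus_dist(a, b):
    """Distance on the unit torus (wrap-around avoids edge effects)."""
    dx = min(abs(a[0] - b[0]), 1 - abs(a[0] - b[0]))
    dy = min(abs(a[1] - b[1]), 1 - abs(a[1] - b[1]))
    return math.hypot(dx, dy)

# Nearest-neighbor distance for every point (O(n^2), fine at this size).
nn = sorted(min(torus_dist(p, q) for q in pts if q is not p) for p in pts)
empirical_median = 0.5 * (nn[n // 2 - 1] + nn[n // 2])

# Theory: P(R > r) = exp(-lam * pi * r^2), a Weibull with shape 2,
# so the median nearest-neighbor distance is sqrt(ln 2 / (lam * pi)).
theoretical_median = math.sqrt(math.log(2) / (n * math.pi))
print(empirical_median, theoretical_median)
```

Clustered seismicity shows up precisely as departures of the observed η distribution from this Weibull baseline.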

163

Clustering Analysis of Seismicity and Aftershock Identification

We introduce a statistical methodology for clustering analysis of seismicity in the time-space-energy domain and use it to establish the existence of two statistically distinct populations of earthquakes: clustered and nonclustered. This result can be used, in particular, for nonparametric aftershock identification. The proposed approach expands the analysis of Baiesi and Paczuski [Phys. Rev. E 69, 066106 (2004)] based on the space-time-magnitude nearest-neighbor distance η between earthquakes. We show that for a homogeneous Poisson marked point field with exponential marks, the distance η has the Weibull distribution, which bridges our results with classical correlation analysis for point fields. The joint 2D distribution of spatial and temporal components of η is used to identify the clustered part of a point field. The proposed technique is applied to several seismicity models and to the observed seismicity of southern California.

Zaliapin, Ilya; Gabrielov, Andrei; Keilis-Borok, Vladimir; Wong, Henry [Department of Mathematics and Statistics, University of Nevada, Reno, Nevada 89557-0084 (United States); Departments of Mathematics and Earth and Atmospheric Sciences, Purdue University, West Lafayette, Indiana 47907-1395 (United States); Institute of Geophysics and Planetary Physics, and Department of Earth and Space Sciences, University of California Los Angeles, 3845 Slichter Hall, Los Angeles, California 90095-1567 (United States)

2008-07-04

164

Crystal size distribution analysis of plagioclase in experimentally decompressed hydrous rhyodacite

November 2010; Editor: T.M. Harrison. Keywords: crystal size distribution; plagioclase; decompression experiments; growth rate; nucleation rate; residence time. This study presents crystal size distributions (CSD...

Hammer, Julia Eve

165

Multiobjective sensitivity analysis and optimization of distributed hydrologic model MOBIDIC

NASA Astrophysics Data System (ADS)

Calibration of distributed hydrologic models usually involves how to deal with the large number of distributed parameters and optimization problems with multiple but often conflicting objectives that arise in a natural fashion. This study presents a multiobjective sensitivity and optimization approach to handle these problems for the MOBIDIC (MOdello di Bilancio Idrologico DIstribuito e Continuo) distributed hydrologic model, which combines two sensitivity analysis techniques (the Morris method and the state-dependent parameter (SDP) method) with the multiobjective optimization (MOO) approach ε-NSGAII (Non-dominated Sorting Genetic Algorithm-II). This approach was implemented to calibrate MOBIDIC with its application to the Davidson watershed, North Carolina, with three objective functions, i.e., the standardized root mean square error (SRMSE) of logarithmic transformed discharge, the water balance index, and the mean absolute error of the logarithmic transformed flow duration curve, and its results were compared with those of a single objective optimization (SOO) with the traditional Nelder-Mead simplex algorithm used in MOBIDIC by taking the objective function as the Euclidean norm of these three objectives. Results show that (1) the two sensitivity analysis techniques are effective and efficient for determining the sensitive processes and insensitive parameters: surface runoff and evaporation are very sensitive processes to all three objective functions, while groundwater recession and soil hydraulic conductivity are not sensitive and were excluded in the optimization. (2) Both MOO and SOO lead to acceptable simulations; e.g., for MOO, the average Nash-Sutcliffe value is 0.75 in the calibration period and 0.70 in the validation period. (3) Evaporation and surface runoff show similar importance for watershed water balance, while the contribution of baseflow can be ignored.
(4) Compared to SOO, which was dependent on the initial starting location, MOO provides more insight into parameter sensitivity and the conflicting characteristics of these objective functions. Multiobjective sensitivity analysis and optimization provide an alternative way for future MOBIDIC modeling.

Yang, J.; Castelli, F.; Chen, Y.

2014-10-01
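The Morris screening step mentioned above rests on elementary effects: finite differences of the model output along one-at-a-time parameter moves, averaged in absolute value (μ*). A toy sketch, where a three-parameter analytic function is an invented stand-in for a MOBIDIC run:

```python
import random

random.seed(1)

def model(x):
    """Toy response: x[1] dominates, x[2] is nearly inert.
    Hypothetical stand-in for an expensive hydrologic model run."""
    return 2.0 * x[0] + 10.0 * x[1] ** 2 + 0.01 * x[2]

def morris_mu_star(f, dim, r=20, delta=0.1):
    """Mean absolute elementary effect per parameter over r random
    one-at-a-time passes through the unit hypercube."""
    mu = [0.0] * dim
    for _ in range(r):
        base = [random.uniform(0, 1 - delta) for _ in range(dim)]
        f0 = f(base)
        for i in range(dim):
            stepped = list(base)
            stepped[i] += delta
            mu[i] += abs(f(stepped) - f0) / delta
    return [m / r for m in mu]

mu_star = morris_mu_star(model, 3)
print(mu_star)  # screening: parameter 1 most influential, parameter 2 negligible
```

Parameters with small μ*, like the groundwater parameters in the abstract, can then be frozen before the expensive multiobjective optimization.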

166

Time series power flow analysis for distribution connected PV generation.

Distributed photovoltaic (PV) projects must go through an interconnection study process before connecting to the distribution grid. These studies are intended to identify the likely impacts and mitigation alternatives. In the majority of the cases, system impacts can be ruled out or mitigation can be identified without an involved study, through a screening process or a simple supplemental review study. For some proposed projects, expensive and time-consuming interconnection studies are required. The challenges to performing the studies are twofold. First, every study scenario is potentially unique, as the studies are often highly specific to the amount of PV generation capacity that varies greatly from feeder to feeder and is often unevenly distributed along the same feeder. This can cause location-specific impacts and mitigations. The second challenge is the inherent variability in PV power output which can interact with feeder operation in complex ways, by affecting the operation of voltage regulation and protection devices. The typical simulation tools and methods in use today for distribution system planning are often not adequate to accurately assess these potential impacts. This report demonstrates how quasi-static time series (QSTS) simulation and high time-resolution data can be used to assess the potential impacts in a more comprehensive manner. The QSTS simulations are applied to a set of sample feeders with high PV deployment to illustrate the usefulness of the approach. The report describes methods that can help determine how PV affects distribution system operations. The simulation results are focused on enhancing the understanding of the underlying technical issues. The examples also highlight the steps needed to perform QSTS simulation and describe the data needed to drive the simulations. 
The goal of this report is to make the methodology of time series power flow analysis readily accessible to utilities and others responsible for evaluating potential PV impacts.

Broderick, Robert Joseph; Quiroz, Jimmy Edward; Ellis, Abraham; Reno, Matthew J. [Georgia Institute of Technology, Atlanta, GA]; Smith, Jeff [Electric Power Research Institute, Knoxville, TN]; Dugan, Roger [Electric Power Research Institute, Knoxville, TN]

2013-01-01

167

Numerical analysis of decoy state quantum key distribution protocols

Decoy state protocols are a useful tool for many quantum key distribution systems implemented with weak coherent pulses, allowing significantly better secret bit rates and longer maximum distances. In this paper we present a method to numerically find optimal three-level protocols, and we examine how the secret bit rate and the optimized parameters are dependent on various system properties, such as session length, transmission loss, and visibility. Additionally, we show how to modify the decoy state analysis to handle partially distinguishable decoy states as well as uncertainty in the prepared intensities.

Harrington, Jim W [Los Alamos National Laboratory]; Rice, Patrick R [Los Alamos National Laboratory]

2008-01-01

168

Numerical analysis of decoy state quantum key distribution protocols

Decoy state protocols are a useful tool for many quantum key distribution systems implemented with weak coherent pulses, allowing significantly better secret bit rates and longer maximum distances. In this paper we present a method to numerically find optimal three-level protocols, and we examine how the secret bit rate and the optimized parameters are dependent on various system properties, such as session length, transmission loss, and visibility. Additionally, we show how to modify the decoy state analysis to handle partially distinguishable decoy states as well as uncertainty in the prepared intensities.

Patrick Rice; Jim Harrington

2008-12-30

169

[Ordination analysis on relationship between bryophyte distribution and climatic factors].

Based on the data of climate and bryoflora in 21 mountainous regions of China, 61 moss families, 23 genera of Dicranaceae, 17 species of genus Campylopus and 35 species of genus Dicranum were analyzed by Canonical Correspondence Analysis (CCA) and Detrended Canonical Correspondence Analysis (DCCA) to reveal their distribution relationships with nine climatic factors, including annual average temperature, January average temperature, July average temperature, annual average rainfall, annual average fog days, annual average frost days and annual average light hours. The similarity of geographical elements among nine mountains in China and their relationships with climatic factors were also analyzed. The methods of applying DCCA and CCA to analyze the relationships between bryophyte distribution and climatic factors were thus introduced. The studies indicate that CCA and DCCA are applicable in florology and phytogeography. PMID:11767521

Cao, T; Guo, S; Gao, C

2000-10-01

170

Analysis of Fuel Ethanol Transportation Activity and Potential Distribution Constraints

This paper provides an analysis of fuel ethanol transportation activity and potential distribution constraints if the total 36 billion gallons of renewable fuel use by 2022 is mandated by EPA under the Energy Independence and Security Act (EISA) of 2007. Ethanol transport by domestic truck, marine, and rail distribution systems from ethanol refineries to blending terminals is estimated using Oak Ridge National Laboratory's (ORNL's) North American Infrastructure Network Model. Most supply and demand data provided by EPA were geo-coded and using available commercial sources the transportation infrastructure network was updated. The percentage increases in ton-mile movements by rail, waterways, and highways in 2022 are estimated to be 2.8%, 0.6%, and 0.13%, respectively, compared to the corresponding 2005 total domestic flows by various modes. Overall, a significantly higher level of future ethanol demand would have minimal impacts on transportation infrastructure. However, there will be spatial impacts and a significant level of investment required because of a considerable increase in rail traffic from refineries to ethanol distribution terminals.

Das, Sujit [ORNL; Peterson, Bruce E [ORNL; Chin, Shih-Miao [ORNL

2010-01-01

171

We analyze waiting times for price changes in a foreign currency exchange rate. Recent empirical studies of high-frequency financial data support that trades in financial markets do not follow a Poisson process and the waiting times between trades are not exponentially distributed. Here we show that our data is well approximated by a Weibull distribution rather than an exponential distribution

Naoya Sazuka

2007-01-01
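The Weibull-versus-exponential comparison above turns on the fitted shape parameter: shape 1 recovers the exponential, while shape below 1 signals the bursty clustering of short waits seen in high-frequency data. A sketch of the shape fit by maximum likelihood, solving the one-dimensional MLE equation by bisection (the simulated waiting times are hypothetical, not the paper's data):

```python
import math, random

random.seed(2)

# Simulated waiting times from a Weibull with shape < 1, a hypothetical
# stand-in for inter-trade durations; an exponential has shape exactly 1.
k_true, lam = 0.6, 1.0
data = [lam * (-math.log(random.random())) ** (1.0 / k_true) for _ in range(2000)]

def weibull_shape_mle(x, lo=0.05, hi=10.0, iters=80):
    """Solve the Weibull shape MLE equation g(k) = 0 by bisection;
    g is monotone increasing in k, so the bracket [lo, hi] suffices."""
    mean_log = sum(math.log(v) for v in x) / len(x)
    def g(k):
        num = sum(v ** k * math.log(v) for v in x)
        den = sum(v ** k for v in x)
        return num / den - 1.0 / k - mean_log
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if g(mid) > 0:
            hi = mid
        else:
            lo = mid
    return 0.5 * (lo + hi)

k_hat = weibull_shape_mle(data)
print(k_hat)  # well below 1, so a pure exponential fit is rejected
```

The same fit applied to exponential data would return a shape estimate close to 1, which is the diagnostic the abstract relies on.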

172

Support vector machine optimization via margin distribution analysis

NASA Astrophysics Data System (ADS)

Support Vector Machines (SVMs) have generated excitement and interest in the pattern recognition community due to their generalization, performance, and ability to operate in high dimensional feature spaces. Although SVMs are generated without the use of user-specified models, required hyperparameters, such as Gaussian kernel width, are usually user-specified and/or experimentally derived. This effort presents an alternative approach for the selection of the Gaussian kernel width via analysis of the distributional characteristics of the training data projected on the 'trained' SVM (margin values). The efficacy of a particular kernel width can be visually determined via one-dimensional density estimate plots of the training data margin values. Projecting the data onto the SVM hyperplane allows the one-dimensional analysis of the data from the viewpoint of the 'trained' SVM. The effect of kernel parameter selection on class-conditional margin distributions is demonstrated in the one-dimensional projection subspace, and a criterion for unsupervised optimization of kernel width is discussed. Empirical results are given for two classification problems: the 'toy' checkerboard problem and a high dimensional classification problem using simulated High-Resolution Radar (HRR) targets projected into a wavelet packet feature space.

Waagen, Donald; Cassabaum, Mary; Schmitt, Harry A.; Pollock, Bruce

2003-09-01
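The width-selection idea above can be mimicked without training a full SVM: project points onto a simple kernel discriminant and inspect the class-conditional margin values as the Gaussian width varies. In this sketch the "mean kernel similarity" classifier is only a stand-in for a trained SVM, and all data are synthetic; it shows the qualitative effect that too-small and too-large widths both collapse the margin distribution toward zero:

```python
import math, random

random.seed(3)

def rbf(a, b, width):
    """Gaussian (RBF) kernel between two 2-D points."""
    return math.exp(-sum((x - y) ** 2 for x, y in zip(a, b)) / (2 * width ** 2))

# Two synthetic Gaussian classes, well separated.
pos = [(random.gauss(1, 0.5), random.gauss(1, 0.5)) for _ in range(100)]
neg = [(random.gauss(-1, 0.5), random.gauss(-1, 0.5)) for _ in range(100)]

def margin(p, width):
    """Signed 'margin value': mean similarity to the positive class
    minus mean similarity to the negative class."""
    s_pos = sum(rbf(p, q, width) for q in pos) / len(pos)
    s_neg = sum(rbf(p, q, width) for q in neg) / len(neg)
    return s_pos - s_neg

means = {}
for width in (0.1, 1.0, 10.0):
    m = [margin(p, width) for p in pos]
    means[width] = sum(m) / len(m)
    print(width, round(means[width], 3))
```

The intermediate width yields the largest mean positive-class margin; density plots of these margin values per class are exactly the one-dimensional diagnostic the abstract describes.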

173

CRACK GROWTH ANALYSIS OF SOLID OXIDE FUEL CELL ELECTROLYTES

Defects and flaws control the structural and functional properties of ceramics. In determining the reliability and lifetime of ceramic structures it is very important to quantify the crack growth behavior of the ceramics. In addition, because of the high variability of the strength and the relatively low toughness of ceramics, a statistical design approach is necessary. The statistical nature of the strength of ceramics is currently well recognized, and is usually accounted for by utilizing Weibull or similar statistical distributions. Design tools such as CARES using a combination of strength measurements, stress analysis, and statistics are available and reasonably well developed. These design codes also incorporate material data such as elastic constants as well as flaw distributions and time-dependent properties. The fast fracture reliability for ceramics is often different from their time-dependent reliability. Further confounding the design complexity, the time-dependent reliability varies with the environment/temperature/stress combination. Therefore, it becomes important to be able to accurately determine the behavior of ceramics under simulated application conditions to provide a better prediction of the lifetime and reliability for a given component. In the present study, Yttria-stabilized Zirconia (YSZ) of 9.6 mol% Yttria composition was procured in the form of tubes of length 100 mm. The composition is of interest as tubular electrolytes for Solid Oxide Fuel Cells. Rings cut from the tubes were characterized for microstructure, phase stability, mechanical strength (Weibull modulus) and fracture mechanisms. The strength at operating condition of SOFCs (1000 °C) decreased to 95 MPa as compared to room temperature strength of 230 MPa. However, the Weibull modulus remains relatively unchanged. The slow crack growth (SCG) parameter, n = 17, evaluated at room temperature in air was representative of well-studied brittle materials.
Based on the results, further work was planned to evaluate the strength degradation, modulus and failure in more representative environment of the SOFCs.

S. Bandopadhyay; N. Nagabhushana

2003-10-01
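The Weibull modulus reported above is commonly estimated from a probability plot: ln(-ln(1 - F)) against ln σ is linear with slope equal to the modulus, with F taken from median ranks. A sketch with simulated strengths (the shape and scale values are hypothetical, not the paper's YSZ measurements):

```python
import math, random

random.seed(4)

# Hypothetical strength sample drawn from a Weibull law
# (shape m = 10, scale 240 MPa), standing in for measured ring strengths.
m_true, s0 = 10.0, 240.0
strengths = sorted(s0 * (-math.log(random.random())) ** (1.0 / m_true)
                   for _ in range(200))

# Weibull probability plot: ln(-ln(1 - F_i)) vs ln(sigma_i) is linear with
# slope equal to the Weibull modulus; F_i from the median-rank approximation.
n = len(strengths)
xs = [math.log(s) for s in strengths]
ys = [math.log(-math.log(1 - (i + 1 - 0.3) / (n + 0.4))) for i in range(n)]

# Least-squares slope of the probability plot = estimated modulus.
mean_x, mean_y = sum(xs) / n, sum(ys) / n
slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
         / sum((x - mean_x) ** 2 for x in xs))
print(round(slope, 2))  # estimated Weibull modulus
```

A high modulus means tightly clustered strengths; the abstract's observation that the modulus is unchanged at 1000 °C says the scatter, not just the mean strength, is preserved.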

174

A meta-analysis of parton distribution functions

NASA Astrophysics Data System (ADS)

A "meta-analysis" is a method for comparison and combination of nonperturbative parton distribution functions (PDFs) in a nucleon obtained with heterogeneous procedures and assumptions. Each input parton distribution set is converted into a "meta-parametrization" based on a common functional form. By analyzing parameters of the meta-parametrizations from all input PDF ensembles, a combined PDF ensemble can be produced that has a smaller total number of PDF member sets than the original ensembles. The meta-parametrizations simplify the computation of the PDF uncertainty in theoretical predictions and provide an alternative to the 2010 PDF4LHC convention for combination of PDF uncertainties. As a practical example, we construct a META ensemble for computation of QCD observables at the Large Hadron Collider using the next-to-next-to-leading order PDF sets from CTEQ, MSTW, and NNPDF groups as the input. The META ensemble includes a central set that reproduces the average of LHC predictions based on the three input PDF ensembles and Hessian eigenvector sets for computing the combined PDF+α_s uncertainty at a common QCD coupling strength of 0.118.

Gao, Jun; Nadolsky, Pavel

2014-07-01

175

A meta-analysis of parton distribution functions

A "meta-analysis" is a method for comparison and combination of nonperturbative parton distribution functions (PDFs) in a nucleon obtained with heterogeneous procedures and assumptions. Each input parton distribution set is converted into a "meta-parametrization" based on a common functional form. By analyzing parameters of the meta-parametrizations from all input PDF ensembles, a combined PDF ensemble can be produced that has a smaller total number of PDF member sets than the original ensembles. The meta-parametrizations simplify the computation of the PDF uncertainty in theoretical predictions and provide an alternative to the 2010 PDF4LHC convention for combination of PDF uncertainties. As a practical example, we construct a META ensemble for computation of QCD observables at the Large Hadron Collider using the next-to-next-to-leading order PDF sets from CTEQ, MSTW, and NNPDF groups as the input. The META ensemble includes a central set that reproduces the average of LHC predictions based on the three input PDF ensembles and Hessian eigenvector sets for computing the combined PDF+$\alpha_s$ uncertainty at a common QCD coupling strength of 0.118.

Jun Gao; Pavel Nadolsky

2013-12-30

176

A Distributed Flocking Approach for Information Stream Clustering Analysis

Intelligence analysts are currently overwhelmed with the amount of information streams generated every day. There is a lack of a comprehensive tool that can analyze information streams in real time. Document clustering analysis plays an important role in improving the accuracy of information retrieval. However, most clustering technologies can only be applied to static document collections because they normally require a large amount of computational resources and a long time to produce accurate results. It is very difficult to cluster a dynamically changing text information stream on an individual computer. Our early research resulted in a dynamic reactive flock clustering algorithm which can continually refine the clustering result and quickly react to changes in document contents. This characteristic makes the algorithm suitable for cluster analysis of dynamically changing document information, such as text information streams. Because of the decentralized character of this algorithm, a distributed approach is a very natural way to increase its clustering speed. In this paper, we present a distributed multi-agent flocking approach for text information stream clustering and discuss the decentralized architectures and communication schemes for load balancing and status information synchronization in this approach.

Cui, Xiaohui [ORNL; Potok, Thomas E [ORNL

2006-01-01

177

Computing statistical properties of hue distributions for color image analysis

NASA Astrophysics Data System (ADS)

Color images can be analyzed using two kinds of coordinate systems: rectangular systems based on primary colors (RGB), and cylindrical systems based on hue, saturation, and intensity (HSI). HSI systems match our intuitive understanding of colors and make it possible to name colors in knowledge bases, a significant advantage given the mushrooming use of declarative knowledge for image analysis. On the other hand, HSI systems give rise to singularities which result in undesirable instabilities, notably with respect to the statistical properties of hue distributions. Computing the mean and variance of a split distribution in the conventional manner would yield an unrealistically large variance and a mean hue in the blue-green region. The paper presents alternative ways of computing means and variances that avoid these effects. At the cost of a relatively slight numerical overhead, these computations generate results in agreement with our intuitive understanding of colors in split peak situations, and reduce to the standard definitions in well-behaved histograms. Recursive formulas are given for the calculation of these statistics, and an efficient algorithm is presented. Equivalence conditions between the results of the introduced procedures and conventional calculations are stated. Examples are given using actual color images.

Crevier, Daniel

1993-08-01

178

Pattern analysis and spatial distribution of neurons in culture.

The nervous system is a complex, highly ordered, integrated network of cells. Dispersed cultures of neurons enable investigations into intrinsic cellular functions without the complexities inherent in the intact nervous system. This culture process generates a homogeneously dispersed population that is assumed to be spatially random. Despite the vast number of studies utilizing dispersed neurons, few studies address the spatial distribution of large populations of neurons in vitro. We used ink-jet printing and surface chemistry to define patterned areas of poly-lysine adhesion (~50 µm spots) juxtaposed against a fluorinated-silane background. We quantitatively analysed populations of patterned neurons on printed protein spots, and unpatterned neurons. Using a microarray scanner, we acquired large images (72 mm × 22 mm) of patterns, and of neurons with and without patterns. Fast Fourier transform (FFT) image analysis was used to determine global alignment of neurons to patterns. Through point pattern analysis, we described the spatial organization of dispersed neurons with or without patterned substrates. Patterned neurons show spatial organization characteristics reminiscent of printed patterns, with spatial distributions representative of unpatterned neurons. Most notably, both patterned and unpatterned neurons show departure from null models of complete spatial randomness (CSR; a homogeneous Poisson process) at shorter distances, with conformity to CSR occurring at longer distances. Cellular morphometrics show that, when compared to their unpatterned counterparts, spot-patterned neurons exhibit a significant increase (p < 0.0001) in mean dendritic circularity and an increase in the number of more circular neurons. Through neurite tracing, we show that dendritic processes are also highly confined to patterned areas, and that they are on average 58% shorter than dendrites of neurons without patterns.
Our findings show that patterned areas change the spatial organization of the somata and dendrites of cultured neurons, and that traditional neuronal cultures deviate from CSR. PMID:22057472
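One elementary point-pattern statistic for detecting departure from CSR is the Clark-Evans nearest-neighbour ratio. The sketch below (plain Python, ignoring edge corrections) illustrates the idea; the study itself used fuller point-pattern and FFT methods:

```python
import math

def clark_evans(points, area):
    """Clark-Evans ratio R: observed mean nearest-neighbour distance
    divided by the CSR expectation 0.5/sqrt(density).  R ~ 1 under
    CSR, R < 1 for clustering, R > 1 for regularity (edge effects
    are ignored in this sketch)."""
    n = len(points)
    nearest = []
    for i, (xi, yi) in enumerate(points):
        d = min(math.hypot(xi - xj, yi - yj)
                for j, (xj, yj) in enumerate(points) if j != i)
        nearest.append(d)
    observed = sum(nearest) / n
    expected = 0.5 / math.sqrt(n / area)
    return observed / expected
```

A perfectly regular unit grid gives R = 2, while neurons packed onto isolated adhesion spots would pull R well below 1.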

Millet, Larry J; Collens, Mitchell B; Perry, George L W; Bashir, Rashid

2011-12-01

179

NSDL National Science Digital Library

This online, interactive lesson on distributions provides examples, exercises, and applets which explore the basic types of probability distributions and the ways distributions can be defined using density functions, distribution functions, and quantile functions.

Siegrist, Kyle

2008-12-24

180

An analysis of confidence limit calculations used in AAPM Task Group No. 119

Purpose: The report issued by AAPM Task Group No. 119 outlined a procedure for evaluating the effectiveness of IMRT commissioning. The procedure involves measuring gamma pass-rate indices for IMRT plans of standard phantoms and determining if the results fall within a confidence limit set by assuming normally distributed data. As stated in the TG report, the assumption of normally distributed gamma pass rates is a convenient approximation for commissioning purposes, but may not accurately describe the data. Here the authors attempt to better describe gamma pass-rate data by fitting it to different distributions. The authors then calculate updated confidence limits using those distributions and compare them to those derived using TG No. 119 method. Methods: Gamma pass-rate data from 111 head and neck patients are fitted using the TG No. 119 normal distribution, a truncated normal distribution, and a Weibull distribution. Confidence limits to 95% are calculated for each and compared. A more general analysis of the expected differences between the TG No. 119 method of determining confidence limits and a more time-consuming curve fitting method is performed. Results: The TG No. 119 standard normal distribution does not fit the measured data. However, due to the small range of measured data points, the inaccuracy of the fit has only a small effect on the final value of the confidence limits. The confidence limits for the 111 patient plans are within 0.1% of each other for all distributions. The maximum expected difference in confidence limits, calculated using TG No. 119's approximation and a truncated distribution, is 1.2%. Conclusions: A three-parameter Weibull probability distribution more accurately fits the clinical gamma index pass-rate data than the normal distribution adopted by TG No. 119. However, the sensitivity of the confidence limit on distribution fit is low outside of exceptional circumstances.
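The TG-119 normal-approximation limit discussed above fits in a few lines; this is a sketch of the published formula CL = (100 - mean) + 1.96 × SD applied to hypothetical pass-rate data, not the authors' analysis code:

```python
import statistics

def tg119_confidence_limit(pass_rates):
    """TG-119-style confidence limit under a normality assumption:
    roughly 95% of plans are expected to achieve a gamma pass rate
    above 100 - CL."""
    mu = statistics.mean(pass_rates)
    sd = statistics.stdev(pass_rates)      # sample standard deviation
    return (100.0 - mu) + 1.96 * sd
```

For illustrative pass rates of 95-99% this gives a confidence limit of about 6%, i.e. an action level near a 94% pass rate; per the abstract, refitting with a truncated normal or Weibull model would shift such limits by under 0.1% in typical clinical data.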

Knill, Cory; Snyder, Michael [Department of Radiation Oncology, Karmanos Cancer Center, Detroit, Michigan 48201 (United States); Department of Radiation Oncology, Karmanos Cancer Center and Wayne State University, Detroit, Michigan 48201 (United States)]

2011-04-15

181

The Subcellular Distribution of Small Molecules: A Meta-Analysis

To explore the extent to which current knowledge about the organelle-targeting features of small molecules may be applicable towards controlling the accumulation and distribution of exogenous chemical agents inside cells, molecules with known subcellular localization properties (as reported in the scientific literature) were compiled into a single data set. This data set was compared to a reference data set of approved drug molecules derived from the DrugBank database, and to a reference data set of random organic molecules derived from the PubChem database. Cheminformatic analysis revealed that molecules with reported subcellular localizations were comparably diverse. However, the calculated physicochemical properties of molecules reported to accumulate in different organelles were markedly overlapping. In relation to the reference sets of DrugBank and PubChem molecules, molecules with reported subcellular localizations were biased towards larger, more complex chemical structures possessing multiple ionizable functional groups and higher lipophilicity. Stratifying molecules based on molecular weight revealed that many physicochemical property trends associated with specific organelles were reversed in smaller vs. larger molecules. Most likely, these reversed trends are due to the different transport mechanisms determining the subcellular localization of molecules of different sizes. Molecular weight can be dramatically altered by tagging molecules with fluorophores or by incorporating organelle-targeting motifs. Generally, in order to better exploit structure-localization relationships, subcellular targeting strategies would benefit from analysis of the biodistribution effects resulting from variations in the size of the molecules. PMID:21774504

Zheng, Nan; Tsai, Hobart Ng; Zhang, Xinyuan; Shedden, Kerby; Rosania, Gus R.

2011-01-01

182

Spectral Analysis of Distributions: Finding Periodic Components in Eukaryotic Enzyme Length Data

We introduce the spectral analysis of distributions (SAD), a method for detecting and evaluating possible periodicity in experimental data distributions (histograms) of arbitrary shape. SAD determines whether a given empirical distribution contains a periodic component. We also propose a system of probabilistic mixture distributions to model a histogram consisting of a smooth background together with peaks at periodic

Eugene Kolker; Brian C. Tjaden; Robert Hubley; Edward N. Trifonov; Andrew F. Siegel

2002-01-01

183

Statistical structural analysis of rotor impact ice shedding

NASA Technical Reports Server (NTRS)

The statistical characteristics of impact ice shear strength are analyzed, with emphasis placed on the most probable shear strength and statistical distribution of an ice deposit. Several distribution types are considered: the Weibull, two-parameter Weibull, and exponential distributions, as well as the Gumbel distribution of the smallest extreme and the Gumbel distribution of the largest extreme. It is concluded that the Weibull distribution yields the best results; however, the expected life, shape parameter, and scale parameter should be determined separately for each case of varying wind speed and droplet size. The theoretical predictions of shear stresses in a specific rotating ice shape are compared, and it is noted that when the effects of lift are added to the theoretical model and the interference is calculated with a new mean and standard deviation, the probability of ice shedding is computed as 36.64 pct.

Kellacky, C. J.; Chu, M. L.; Scavuzzo, R. J.

1991-01-01

184

Reliability of products is frequently a prime safety consideration. Interpretation of reliability is both quantitative and qualitative. Extensive quantitative analysis employing probabilistic risk assessment has been widely performed to provide predicted hazard or accident minimization. Weibull probability data and information are a vital tool of these quantitative risk assessments, but so are qualitative methods such as fault tree analysis. Qualitative

David P. Weber

1994-01-01

185

Statistical distribution of mechanical properties for three graphite-epoxy material systems

NASA Technical Reports Server (NTRS)

Graphite-epoxy composites are playing an increasing role as viable alternative materials in structural applications, necessitating thorough investigation into the predictability and reproducibility of their material strength properties. This investigation was concerned with tension, compression, and short beam shear coupon testing of large samples from three different material suppliers to determine their statistical strength behavior. Statistical results indicate that a two-parameter Weibull distribution model provides better overall characterization of material behavior for the graphite-epoxy systems tested than does the standard normal distribution model that is employed for most design work. While either a Weibull or normal distribution model provides adequate predictions for average strength values, the Weibull model provides better characterization in the lower tail region, where the predictions are of maximum design interest. Two sets of the same material were found to have essentially the same material properties, indicating that repeatability can be achieved.
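A two-parameter Weibull fit of strength data can be sketched with maximum likelihood. The bisection scheme below is a standard textbook approach (not the code used in the study); the percentile helper shows why the fitted lower tail drives design values:

```python
import math

def weibull_mle(data):
    """Two-parameter Weibull fit by maximum likelihood.  The shape k
    solves sum(x^k ln x)/sum(x^k) - 1/k - mean(ln x) = 0; the left
    side is monotone increasing in k, so bisection suffices."""
    logs = [math.log(x) for x in data]
    mean_log = sum(logs) / len(logs)

    def score(k):
        xk = [x ** k for x in data]
        return sum(a * b for a, b in zip(xk, logs)) / sum(xk) - 1.0 / k - mean_log

    lo, hi = 1e-3, 100.0
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if score(mid) < 0.0:
            lo = mid
        else:
            hi = mid
    k = 0.5 * (lo + hi)
    scale = (sum(x ** k for x in data) / len(data)) ** (1.0 / k)
    return k, scale

def weibull_percentile(p, k, scale):
    """Strength below which a fraction p of specimens is expected to fail."""
    return scale * (-math.log(1.0 - p)) ** (1.0 / k)
```

Low percentiles such as weibull_percentile(0.01, k, scale) give the A/B-basis-style design values where, per the abstract, the Weibull and normal models disagree most.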

Reese, C.; Sorem, J., Jr.

1981-01-01

186

Toward Visualization and Analysis of Traceability Relationships in Distributed and Offshore Software

Keywords: traceability relationships, dependencies, visualization, distributed and global software development, offshore software development. Abstract: Offshore software development projects provoke new issues in the collaborative endeavor

Redmiles, David F.

187

Automated local bright feature image analysis of nuclear protein distribution identifies

As cells arrest proliferation, undergo apoptosis, or differentiate, the distribution of nuclear proteins changes with proliferation and differentiation (1, 2). Several nuclear proteins have been reported to display a specific

Knowles, David William

188

Crystal Size Distribution analysis of Merapi Andesites and cogenetic inclusions

NASA Astrophysics Data System (ADS)

Mt. Merapi in Central Java, Indonesia, is one of the most active stratovolcanoes on Earth [1]. Previous petrographic and geochemical studies [e.g. 2-4] identified the presence of a large plumbing system with a complex and multiphase history involving accumulation, magma mixing, crustal assimilation, equilibration and degassing. This study uses Crystal Size Distribution (CSD) analysis [5, 6] of recent Merapi eruptive products to assess whether the different crystal populations and the complex history identified by geochemical techniques can be resolved by CSD. Moreover, CSD analysis can give additional information on processes occurring during storage and ascent, as well as information on the timing of such processes [5, 6]. CSD analyses of plagioclase in four Merapi andesites indicate the presence of two main crystal populations. However, geochemical studies identified up to four plagioclase crystal types based on major element and isotope chemistry. The comparison of isotope and major element transects with the CSD analyses indicates that phenocrysts >2 mm represent a period of crystal growth that involved accumulation and crustal assimilation. This is followed by textural re-equilibration and a second phase of crystal growth, represented by plagioclase phenocrysts and microcrysts <2 mm in size. CSDs of igneous medium- to coarse-grained cogenetic inclusions indicate a third, <1 mm crystal population, potentially indicating decompressional crystallisation induced by degassing and amphibole breakdown, and/or crustal assimilation. Secondary processes such as resorption and compaction are recorded in the omission of intermediate crystal sizes in CSD plots for some igneous inclusions. Although CSD analysis provides additional information about processes that occur beneath Merapi, petrographic and geochemical information were required to draw unique conclusions.
Fluctuations in magma chamber conditions recorded by crystal geochemistry did not always form a new textural crystal population resolvable by CSD analysis. Merapi has a long eruptive history with emission of significant volumes of magma and regular replenishment. In order to produce the relatively homogenised whole rock and CSD data a large steady state magmatic system beneath Merapi is required. References: [1] Voight et al., 2000, J. Volcanol. Geotherm. Res. 100, 1-8 [2] Hammer et al., 2000, J. Volcanol. Geotherm. Res. 100, 165-192 [3] Gertisser & Keller, 2003, J. Petrol. 44, 457-489 [4] Chadwick et al., 2007, J. Petrol. 48, 1793-1812 [5] Chadwick, 2008, PhD Thesis, University Dublin [6] Marsh, 1988, Cont. Mineral. Petrol. 99, 277-291 [7] Higgins, 2006, Cambridge University Press

van der Zwan, F. M.; Chadwick, J. P.; Troll, V. R.

2009-04-01

189

We analyze waiting times for price changes in a foreign currency exchange rate. Recent empirical studies of high frequency financial data support that trades in financial markets do not follow a Poisson process and the waiting times between trades are not exponentially distributed. Here we show that our data is well approximated by a Weibull distribution rather than an exponential

Naoya Sazuka

2006-01-01

190

Electron microprobe analysis of elemental distribution in excavated human femurs

Elemental distributions have been determined for femur cross sections of eight individuals from the Gibson and Ledders Woodland sites. The analyses were obtained by x-ray fluorescence with a scanning electron microscope. Movement of an element from soil to bone should give rise to inhomogeneous distributions within the bone. We found that the distributions of zinc, strontium, and lead are homogeneous

Joseph B. Lambert; Sharon Vlasak Simpson; Jane E. Buikstra; Douglas Hanson

1983-01-01

191

Structural reliability analysis of laminated CMC components

NASA Technical Reports Server (NTRS)

For laminated ceramic matrix composite (CMC) materials to realize their full potential in aerospace applications, design methods and protocols are a necessity. The time independent failure response of these materials is focussed on and a reliability analysis is presented associated with the initiation of matrix cracking. A public domain computer algorithm is highlighted that was coupled with the laminate analysis of a finite element code and which serves as a design aid to analyze structural components made from laminated CMC materials. Issues relevant to the effect of the size of the component are discussed, and a parameter estimation procedure is presented. The estimation procedure allows three parameters to be calculated from a failure population that has an underlying Weibull distribution.

Duffy, Stephen F.; Palko, Joseph L.; Gyekenyesi, John P.

1991-01-01

192

Analysis of the Strehl ratio using the Wigner distribution function

The on-axis intensity distribution, or the Strehl ratio versus defocus, is described in a phase-space domain in terms of the Wigner distribution function. A coordinate transformation that is defined for a general annular aperture is employed to convert the pupil amplitude transmittance into a one-dimensional function. We show that the single display of the Wigner distribution function associated with this

Dobryna Zalvidea; Mario Lehman; Sergio Granieri; Enrique E. Sicre

1995-01-01

193

A pseudo-Bertrand distribution for time-scale analysis

Using the pseudo-Wigner time-frequency distribution as a guide, we derive two new time-scale representations: the pseudo-Bertrand and the smoothed pseudo-Bertrand distributions. Unlike the Bertrand distribution, these representations support efficient online operation at the same computational cost as the continuous wavelet transform. Moreover, they take advantage of the affine smoothing inherent in the sliding structure of their implementation to suppress cumbersome

P. Goncalves; R. G. Baraniuk

1996-01-01

194

Analysis of aerosol vertical distribution and variability in Hong Kong

NASA Astrophysics Data System (ADS)

Aerosol vertical distribution is an important piece of information for improving aerosol retrieval from satellite remote sensing. The aerosol extinction coefficient profile and its integral form, aerosol optical depth (AOD), as well as atmospheric boundary layer (ABL) height and haze layer height, can be derived using lidar measurements. In this paper, we used micropulse lidar measurements acquired from May 2003 to June 2004 to illustrate seasonal variations of AOD and ABL height in Hong Kong. On average, about 64% of monthly mean aerosol optical depth was contributed by aerosols within the mixing layer (with a maximum (~76%) in November and a minimum (~55%) in September), revealing the existence of a large abundance of aerosols above the ABL due to regional transport. The characteristics of seasonally averaged aerosol profiles over Hong Kong in the study period are presented to illustrate seasonal phenomena of aerosol transport and associated meteorological conditions. The correlation between AOD and surface extinction coefficient was found to be generally poor (r2 ~0.42), since elevated aerosol layers increase columnar aerosol abundance but not extinction at the surface. The typical aerosol extinction profile in the ABL can be characterized by a low value near the surface and values that increase with altitude up to the top of the ABL. When an aerosol vertical profile is assumed, the surface extinction coefficient can be derived from AOD using two algorithms, which are discussed in detail in this paper. Preliminary analysis showed that better estimates of the extinction coefficient at ground level could be obtained using two-layer aerosol extinction profiles (r2 ~0.78, slope ~0.82, and intercept ~0.15) than uniform profiles of extinction with height within the ABL (r2 ~0.65, slope ~0.27, and intercept ~0.03).
The improvement in correlation is promising on mapping satellite retrieved AOD to surface aerosol extinction coefficient for urban and regional environmental studies on air quality related issues.

He, Qianshan; Li, Chengcai; Mao, Jietai; Lau, Alexis Kai-Hon; Chu, D. A.

2008-07-01

195

Analysis Model for Domestic Hot Water Distribution Systems: Preprint

A thermal model was developed to estimate the energy losses from prototypical domestic hot water (DHW) distribution systems for homes. The developed model, using the TRNSYS simulation software, allows researchers and designers to better evaluate the performance of hot water distribution systems in homes. Modeling results were compared with past experimental study results and showed good agreement.

Maguire, J.; Krarti, M.; Fang, X.

2011-11-01

196

Evaluation of Distribution Analysis Software for DER Applications

The term ''Distributed energy resources'' or DER refers to a variety of compact, mostly self-contained power-generating technologies that can be combined with energy management and storage systems and used to improve the operation of the electricity distribution system, whether or not those technologies are connected to an electricity grid. Implementing DER can be as simple as installing a small electric

Staunton

2003-01-01

197

A global analysis of root distributions for terrestrial biomes

Understanding and predicting ecosystem functioning (e.g., carbon and water fluxes) and the role of soils in carbon storage requires an accurate assessment of plant rooting distributions. Here, in a comprehensive literature synthesis, we analyze rooting patterns for terrestrial biomes and compare distributions for various plant functional groups. We compiled a database of 250 root studies, subdividing suitable results into 11

R. B. Jackson; J. Canadell; J. R. Ehleringer; H. A. Mooney; O. E. Sala; E. D. Schulze

1996-01-01

198

Analysis and Determination of Ring Flux Distribution in Hysteresis Motors

This paper presents the results of an experimental determination of the flux distributions inside different layers of the hysteresis ring. The flux measurements were made by means of search coils embedded in the stationary ring of an inside-out three-phase hysteresis motor. It is found that the flux distribution is quite nonuniform and that there exists a phase shift between the flux

R. D. Jackson; M. A. Rahman; G. R. Slemon

1983-01-01

199

Analysis and Determination of Ring Flux Distribution in Hysteresis Motors

This paper presents the results of an experimental determination of the flux distributions inside different layers of the hysteresis ring. The flux measurements were made by means of search coils embedded in the stationary ring of an inside-out three-phase hysteresis motor. It is found that the flux distribution is quite non-uniform and that there exists a phase shift between the flux

R. D. Jackson; M. A. Rahman; G. R. Slemon

1983-01-01

200

The term “progressive credit periods” offered by the supplier to the retailer for settling the account is defined as follows: if the retailer settles the account by credit period M, then the supplier does not charge any interest. If the retailer pays after M but before N (N > M), then the supplier charges the retailer on the unpaid amount at the interest

Nita H. Shah; Poonam Pandey; Hardik Soni

2011-01-01

201

Analysis of Brillouin-Based Distributed Fiber Sensors Using Optical Pulse Coding

Keywords: fiber optics sensors; Brillouin scattering. Fiber optic distributed sensors are well suited to long-distance applications. In particular, the main advantage of fiber-based distributed temperature sensors, compared

Park, Namkyoo

202

Control and Analysis of Droop and Reverse Droop Controllers for Distributed Generations

Index terms: distributed generation, droop control, reverse droop control. This paper presents control and analysis of droop and reverse droop control for distributed generations (DG). The droop control is well known and applied

Vasquez, Juan Carlos

203

Analysis of random laser scattering pulse signals with lognormal distribution

NASA Astrophysics Data System (ADS)

The statistical distribution of natural phenomena is of great significance in studying the laws of nature. In order to study the statistical characteristics of a random pulse signal, a random process model is proposed theoretically for better studying of the random law of measured results. Moreover, a simple random pulse signal generation and testing system is designed for studying the counting distributions of three typical objects including particles suspended in the air, standard particles, and background noise. Both normal and lognormal distribution fittings are used for analyzing the experimental results and testified by chi-square distribution fit test and correlation coefficient for comparison. In addition, the statistical laws of three typical objects and the relations between them are discussed in detail. The relation is also the non-integral dimension fractal relation of statistical distributions of different random laser scattering pulse signal groups.
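One quantitative way to compare the normal and lognormal hypotheses is through their maximized log-likelihoods. This stdlib sketch is offered as a simple complement to the chi-square and correlation-coefficient tests used in the paper; all function names here are illustrative:

```python
import math
import statistics

def loglik_normal(xs):
    """Gaussian log-likelihood at the plug-in MLE (mean, population std)."""
    mu, sd = statistics.mean(xs), statistics.pstdev(xs)
    return sum(-0.5 * math.log(2.0 * math.pi * sd * sd)
               - (x - mu) ** 2 / (2.0 * sd * sd) for x in xs)

def loglik_lognormal(xs):
    """Lognormal log-likelihood: a Gaussian fit to ln(x) plus the
    Jacobian term -sum(ln x) from the change of variables."""
    logs = [math.log(x) for x in xs]
    return loglik_normal(logs) - sum(logs)
```

For strongly right-skewed pulse-amplitude data the lognormal likelihood exceeds the normal one; for roughly symmetric background-noise counts the ordering typically reverses.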

Yan, Zhen-Gang; Bian, Bao-Min; Wang, Shou-Yu; Lin, Ying-Lu; Wang, Chun-Yong; Li, Zhen-Hua

2013-06-01

204

NASA Astrophysics Data System (ADS)

As a major aspect of marine pollution, oil release into the sea has serious biological and environmental impacts. Among remote sensing systems (which offer a non-destructive investigation method), synthetic aperture radar (SAR) can provide valuable synoptic information about the position and size of an oil spill, owing to its wide-area coverage and day/night, all-weather capabilities. In this paper we present a new automated method for oil-spill monitoring. The approach is based on the combination of the Weibull multiplicative model and machine learning techniques to differentiate between dark spots and the background. First, a filter created from the Weibull multiplicative model is applied to each sub-image. Second, the sub-image is segmented by two different neural network techniques (pulse-coupled neural networks and multilayer perceptron neural networks). As the last step, a very simple filtering process is used to eliminate false targets. The proposed approaches were tested on 20 ENVISAT and ERS2 images which contained dark spots, with the same parameters used in all tests. For the overall dataset, average accuracies of 94.05% and 95.20% were obtained for the PCNN and MLP methods, respectively. The average computational time for dark-spot detection in a 256 × 256 image is about 4 s for PCNN segmentation using IDL software, which is the fastest in this field at present. Our experimental results demonstrate that the proposed approach is fast, robust, and effective, and can be applied to future spaceborne SAR images.

Taravat, A.; Del Frate, F.

2013-09-01

205

ERIC Educational Resources Information Center

In the article "Exploring the Sensitivity of Horn's Parallel Analysis to the Distributional Form of Random Data," Dinno (this issue) provides strong evidence that the distribution of random data does not have a significant influence on the outcome of the analysis. Hayton appreciates the thorough approach to evaluating this assumption, and agrees…

Hayton, James C.

2009-01-01

206

Analysis of Distributed Reservation Protocol for UWB-based WPANs with ECMA-368 MAC

We present a framework for the performance analysis of the medium access control (MAC) protocol standardized in ECMA-368 for UWB-based WPANs. Using a Markovian arrival process (MAP) and a phase-type (PH) distribution, we model this MAC layer as a MAP/PH/1 queueing system

Wong, Vincent

207

An analysis of amplitude distribution of acoustic emission (AE) signals during fatigue testing of low-strength, low carbon steel specimens used extensively in the fabrication of off-shore structures was carried out. The test results show three different amplitude distribution patterns which coincide with the three stages of fatigue reported in the literature. A statistical analysis showed that the statistical parameters like

R. Visweswaran; M. Manoharan; G. Jothinathan; O. Prabhakar

1985-01-01

208

Distributed Control and Stochastic Analysis of Hybrid Systems Supporting Safety Critical Real-Time Systems Design (HYBRIDGE). WP9: Risk assessment for a distributed control system; sequential Monte Carlo simulation

Del Moral, Pierre

209

Scaled momentum distributions of charged

Talk given at DIS09, 29th April 2009. Outline: 1. Theoretical framework: the MLLA and LPHD; physics motivation. 2. Physics analysis: analysis strategy; data

210

Numerical Analysis of a Cold Air Distribution System

Cold air distribution systems may reduce the operating energy consumption of an air-conditioning supply system and improve outside-air volume percentages and indoor air quality. However, indoor temperature patterns and velocity field are easily...

Zhu, L.; Li, R.; Yuan, D.

2006-01-01

211

Environment for Test and Analysis of Distributed Software (ETADS).

National Technical Information Service (NTIS)

This final report describes results of the Phase I SBIR research effort to develop new software testing techniques capable of satisfying the demands of distributed real-time software environments. Traditional software testing techniques are inadequate for...

R. C. Cox, B. Allen

1994-01-01

212

Thermal Analysis of Antenna Structures. Part 2: Panel Temperature Distribution

NASA Technical Reports Server (NTRS)

This article is the second in a series that analyzes the temperature distribution in microwave antennas. An analytical solution in a series form is obtained for the temperature distribution in a flat plate analogous to an antenna surface panel under arbitrary temperature and boundary conditions. The solution includes the effects of radiation and air convection from the plate. Good agreement is obtained between the numerical and analytical solutions.

Schonfeld, D.; Lansing, F. L.

1983-01-01

213

CO2 concentrations recorded over two years using a Picarro G1301 analyser at a rural site were studied with two procedures. First, the smoothing kernel method, which to date has been used with one linear and one circular variable, was applied to pairs of circular variables: wind direction, time of day, and time of year. This showed that the daily cycle was the prevailing cyclical evolution and that the highest concentrations were explained by the influence of one nearby city source, which was only revealed by directional analysis. Second, histograms were obtained; these revealed that most observations lay between 380 and 410 ppm and that there was a sharp contrast during the year. Finally, the histograms were fitted to 14 distributions, the best known using analytical procedures and the remainder using numerical procedures. RMSE was used as the goodness-of-fit indicator to compare and select distributions. Most functions provided similar RMSE values. However, the best fits were obtained using numerical procedures due to their greater flexibility, the triangular distribution being the simplest function of this kind. This distribution allowed us to identify directions and months of noticeable CO2 input (SSE and April-May, respectively), as well as the daily cycle of the distribution symmetry. Among the functions whose parameters were calculated using an analytical expression, Erlang distributions provided satisfactory fits for the monthly analysis, and the gamma distribution for the rest. By contrast, the Rayleigh and Weibull distributions gave the worst RMSE values. PMID:23602977
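The RMSE ranking described above amounts to comparing each candidate density against the density-normalized histogram. A minimal sketch, with equal-width bins assumed and the triangular family shown because it is the simplest numerically fitted shape mentioned:

```python
import math

def hist_rmse(centers, freqs, pdf):
    """RMSE between a density-normalized histogram and a candidate
    pdf evaluated at the bin centers (equal-width bins assumed)."""
    total = sum(freqs)
    width = centers[1] - centers[0]
    obs = [f / (total * width) for f in freqs]
    return math.sqrt(sum((o - pdf(c)) ** 2
                         for o, c in zip(obs, centers)) / len(centers))

def triangular_pdf(a, c, b):
    """Triangular density on [a, b] with mode c."""
    def f(x):
        if x < a or x > b:
            return 0.0
        if x <= c:
            return 2.0 * (x - a) / ((b - a) * (c - a))
        return 2.0 * (b - x) / ((b - a) * (b - c))
    return f
```

Evaluating hist_rmse for each fitted family and keeping the smallest value reproduces the selection rule used to rank the 14 distributions.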

Pérez, Isidro A; Sánchez, M Luisa; García, M Ángeles; Pardo, Nuria

2013-07-01

214

The distribution of first-passage times and durations in FOREX and future markets

Possible distributions are discussed for intertrade durations and first-passage processes in financial markets. The viewpoint of renewal theory is assumed. In order to represent market data with relatively long durations, two types of distributions are used, namely, a distribution derived from the so-called Mittag-Leffler survival function and the Weibull distribution. For Mittag-Leffler type distribution, the average waiting time (residual life

Naoya Sazuka; Jun-ichi Inoue; Enrico Scalas

2008-01-01
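A minimal sketch of the Weibull half of this analysis, using synthetic durations rather than the market data studied in the paper: fit a Weibull survival model to intertrade durations and read off the shape parameter (shape < 1 corresponds to a decreasing hazard, i.e., long durations being over-represented):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Synthetic intertrade durations (seconds); an assumed stand-in, not FOREX data.
durations = stats.weibull_min.rvs(0.6, scale=10.0, size=4000, random_state=rng)

# Fit with the location pinned at zero, since durations are non-negative.
shape, loc, scale = stats.weibull_min.fit(durations, floc=0)

def survival(t):
    """Weibull survival function S(t) = exp(-(t/scale)**shape)."""
    return np.exp(-(t / scale) ** shape)
```

By construction S(scale) = exp(-1) regardless of the shape, which is a quick sanity check on the fitted model.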

215

The distribution of first-passage times and durations in FOREX and future markets

Possible distributions are discussed for intertrade durations and first-passage processes in financial markets. The view-point of renewal theory is assumed. In order to represent market data with relatively long durations, two types of distributions are used, namely a distribution derived from the Mittag–Leffler survival function and the Weibull distribution. For the Mittag–Leffler type distribution, the average waiting time (residual life

Naoya Sazuka; Jun-Ichi Inoue; Enrico Scalas

2009-01-01

216

Income distribution dependence of poverty measure: A theoretical analysis

NASA Astrophysics Data System (ADS)

Using a modified deprivation (or poverty) function, in this paper, we theoretically study the changes in poverty with respect to the ‘global’ mean and variance of the income distribution using Indian survey data. We show that when the income obeys a log-normal distribution, a rising mean income generally indicates a reduction in poverty while an increase in the variance of the income distribution increases poverty. This altruistic view for a developing economy, however, is not tenable anymore once the poverty index is found to follow a Pareto distribution. Here although a rising mean income indicates a reduction in poverty, due to the presence of an inflexion point in the poverty function, there is a critical value of the variance below which poverty decreases with increasing variance while beyond this value, poverty undergoes a steep increase followed by a decrease with respect to higher variance. Identifying this inflexion point as the poverty line, we show that the Pareto poverty function satisfies all three standard axioms of a poverty index [N.C. Kakwani, Econometrica 43 (1980) 437; A.K. Sen, Econometrica 44 (1976) 219] whereas the log-normal distribution falls short of this requisite. Following these results, we make quantitative predictions to correlate a developing with a developed economy.

Chattopadhyay, Amit K.; Mallick, Sushanta K.

2007-04-01

217

Analysis of magnetic electron lens with secant hyperbolic field distribution

Electron-optical imaging instruments like the Scanning Electron Microscope (SEM) and Transmission Electron Microscope (TEM) use specially designed solenoid electromagnets for focusing of the electron beam probe. Indicators of the imaging performance of these instruments, like spatial resolution, have a strong correlation with the focal characteristics of the magnetic lenses, which in turn have been shown to be functions of the spatial distribution of the axial magnetic field generated by them. Owing to the complicated design of practical lenses, empirical mathematical expressions are deemed convenient for use in physics-based calculations of their focal properties. Thus, the degree of closeness of such models to the actual field distribution determines the accuracy of the calculations. Mathematical models proposed by Glaser [1] and Ramberg [1] have historically been put into extensive use. In this paper the authors discuss one such model with a secant-hyperbolic type magnetic field distribution function, and present a comparison among these models, ...

Pany, S S; Dubey, B P

2014-01-01

218

Inductance and Current Distribution Analysis of a Prototype HTS Cable

NASA Astrophysics Data System (ADS)

This project is partly supported by NSFC Grant 51207146, the RAEng Research Exchange scheme of the UK, and EPSRC EP/K01496X/1. Superconducting cable is an emerging technology for electric power transmission. Since high-capacity HTS transmission cables are manufactured using a multi-layer conductor structure, the current distribution among the layers would be nonuniform without proper optimization and hence lead to large transmission losses. Therefore a novel optimization method has been developed to achieve evenly distributed current among different layers considering the HTS cable structure parameters: radius, pitch angle, and winding direction, which determine the self and mutual inductance. A prototype HTS cable has been built using BSCCO tape and tested to validate the optimal design method. A superconductor characterization system has been developed using LabVIEW and an NI data acquisition system. It can be used to measure the AC loss and current distribution of short HTS cables.

Zhu, Jiahui; Zhang, Zhenyu; Zhang, Huiming; Zhang, Min; Qiu, Ming; Yuan, Weijia

2014-05-01

219

Analysis of recording in bit patterned media with parameter distributions

NASA Astrophysics Data System (ADS)

Recording in bit patterned media (BPM) requires strict synchronization of the write signal to limit misregistration. A scheme is presented based on micromagnetic simulation of a single element which allows defining a writing window (WW) permitting synchronized recording. The WW behavior for random distributions of the anisotropy field is studied. The width of the WW is shown to be determined by the deviations of the medium parameters, the BPM element separation, and the head field strength. It is shown that significant limitations can be imposed on the BPM density and required head fields because of the random distributions of the BPM properties.

Livshitz, Boris; Inomata, Akihiro; Bertram, H. Neal; Lomakin, Vitaliy

2009-04-01

220

Aggregate Characterization of User Behavior in Twitter and Analysis of the Retweet Graph

Most previous analysis of Twitter user behavior is focused on individual information cascades and the social followers graph. We instead study aggregate user behavior and the retweet graph with a focus on quantitative descriptions. We find that the lifetime tweet distribution is a type-II discrete Weibull stemming from a power law hazard function, the tweet rate distribution, although asymptotically power law, exhibits a lognormal cutoff over finite sample intervals, and the inter-tweet interval distribution is power law with exponential cutoff. The retweet graph is small-world and scale-free, like the social graph, but is less disassortative and has much stronger clustering. These differences are consistent with it better capturing the real-world social relationships of and trust between users. Beyond just understanding and modeling human communication patterns and social networks, applications for alternative, decentralized microblogging systems, both predicting real-world performance and detecting spam, are d...

Bild, David R; Dick, Robert P; Mao, Z Morley; Wallach, Dan S

2014-01-01

221

Analysis of sea clutter distribution variation with Doppler using the compound k-distribution

Sea clutter is the backscattered returns received by a radar system from the sea surface. Maritime radar signal processing has the ability to partially compensate for clutter to achieve effective detection of targets on or near the sea surface. This paper investigates the fit of the compound k-distribution model to sea clutter amplitude statistics, within individual Doppler bins across the

M. A. Ritchie; K. Woodbridge; A. G. Stove

2010-01-01

222

Optimizing Distributed Practice: Theoretical Analysis and Practical Implications

ERIC Educational Resources Information Center

More than a century of research shows that increasing the gap between study episodes using the same material can enhance retention, yet little is known about how this so-called distributed practice effect unfolds over nontrivial periods. In two three-session laboratory studies, we examined the effects of gap on retention of foreign vocabulary,…

Cepeda, Nicholas J.; Coburn, Noriko; Rohrer, Doug; Wixted, John T.; Mozer, Michael C.; Pashler, Harold

2009-01-01

223

Analysis of market price for distributed generators (DGs) in Microgrid

Renewable and nonconventional distributed energy resources (DERs), such as wind, solar PV, microturbines, fuel cells, and diesel generators, are gradually becoming more popular as energy-efficient and low-emission energy sources. Recent deregulation initiatives in the power and energy industry are currently influencing company decisions regarding the construction of new generation plants and transmission lines, and compliance with clean air legislation for

Arup Sinha; R. N. Lahiri; Sanjay Neogi; S. Chowdhury; C. T. Gaunt

2009-01-01

224

An implementation and analysis of a randomized distributed stack

This randomized distributed stack represents an experimental extension of the probabilistic quorum algorithm of Malki et al. [5,4] and the random regular register of Welch and Lee [3]. Employing the probabilistic quorum algorithm in the same manner as the random...

Kirkland, Dustin Charles

2013-02-22

225

THE EPANET PROGRAMMER'S TOOLKIT FOR ANALYSIS OF WATER DISTRIBUTION SYSTEMS

The EPANET Programmer's Toolkit is a collection of functions that helps simplify computer programming of water distribution network analyses. The functions can be used to read in a pipe network description file, modify selected component properties, run multiple hydraulic and wa...

226

Metagenomic Analysis of Water Distribution System Bacterial Communities

The microbial quality of drinking water is assessed using culture-based methods that are highly selective and that tend to underestimate the densities and diversity of microbial populations inhabiting distribution systems. In order to better understand the effect of different dis...

227

Voltage analysis of distribution systems with DFIG wind turbines

Wind energy is becoming the most viable renewable energy source mainly because of the growing concerns over carbon emissions and uncertainties in fossil fuel supplies and the government policy impetus. The increasing penetration of wind power in distribution systems may significantly affect voltage stability of the systems, particularly during wind turbine cut-in and cut-off disturbances. Currently, doubly fed induction generator

Baohua Dong; Sohrab Asgarpoor; Wei Qiao

2009-01-01

228

Distributed simulation for power system analysis including shipboard systems

Power systems are distributed in nature. Often they can be divided into sections or groups and treated separately. Terrestrial power systems are divided into separate utilities and are controlled by different regional transmission organizations (RTOs). Each RTO has detailed data for the area under its control, but only limited data and boundary measurements of the external network. Additionally, shipboard power

Jian Wu; Noel N. Schulz; Wenzhong Gao

2007-01-01

229

A Logic for Information Flow Analysis of Distributed Programs

When principals exchange sensitive information over a network, security and privacy issues arise immediately. For instance, in an online auction system we may want to ensure that no bidder knows the bids of any other. ... these models are important in many settings, they are not obviously well suited for distributed programs where

Lagergren, Jens

230

Analysis of factors affecting color distribution of white LEDs

The color uniformity is a critical index in the evaluation of high quality white light emitting diodes (LEDs). The main factor affecting the color distribution is the state of the phosphor. The secondary factor is the optical structure. This paper analyzes two parameters of the phosphor layer (thickness and concentration) and six optical structures. Results indicate that the structures with

Zongyuan Liu; Sheng Liu; Kai Wang; Xiaobing Luo

2008-01-01

231

A network analysis of food flows within the United States of America.

The world food system is globalized and interconnected, in which trade plays an increasingly important role in facilitating food availability. We present a novel application of network analysis to domestic food flows within the USA, a country with global importance as a major agricultural producer and trade power. We find normal node degree distributions and Weibull node strength and betweenness centrality distributions. An unassortative network structure with high clustering coefficients exists. These network properties indicate that the USA food flow network is highly social and well-mixed. However, a power law relationship between node betweenness centrality and node degree indicates potential network vulnerability to the disturbance of key nodes. We perform an equality analysis which serves as a benchmark for global food trade, where the Gini coefficient = 0.579, Lorenz asymmetry coefficient = 0.966, and Hoover index = 0.442. These findings shed insight into trade network scaling and proxy free trade and equitable network architectures. PMID:24773310

Lin, Xiaowen; Dang, Qian; Konar, Megan

2014-05-20
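The equality measures quoted in the abstract have simple closed forms; here is a numpy sketch (the node-strength data are not reproduced, so the example uses two illustrative extremes rather than the USA food flow network):

```python
import numpy as np

def gini(x):
    """Gini coefficient via the sorted-rank identity (O(n log n))."""
    x = np.sort(np.asarray(x, dtype=float))
    n = x.size
    return float((2 * np.arange(1, n + 1) - n - 1).dot(x) / (n * x.sum()))

def hoover(x):
    """Hoover (Robin Hood) index: the share of total flow that would have
    to be redistributed to equalize all nodes."""
    x = np.asarray(x, dtype=float)
    return float(0.5 * np.abs(x - x.mean()).sum() / x.sum())

# Perfect equality -> both indices are 0; extreme concentration -> near 1.
equal = np.ones(100)
skewed = np.zeros(100)
skewed[0] = 1.0
```

Values such as the paper's Gini = 0.579 and Hoover = 0.442 sit between these two extremes, indicating a moderately concentrated flow network.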

232

SEMANTIC SEGMENTATION OF RADIO PROGRAMS USING SOCIAL NETWORK ANALYSIS AND DURATION DISTRIBUTION: the first is based on Social Network Analysis, the second is based on Poisson Stochastic Processes; the results are promising and encourage continuing with the application of Social Network Analysis

233

To make stochastic (probabilistic) failure predictions of a conventional or highly crosslinked ultrahigh molecular weight polyethylene (UHMWPE) material, not only must a failure criterion be defined, but it is also necessary to specify a probability distribution of the failure strength. This study sought to evaluate both parametric and nonparametric statistical approaches to describing the failure properties of UHMWPE, based on the Normal and Weibull model distributions, respectively. Because fatigue and fracture properties of materials have historically been well described with the use of Weibull statistics, it was expected that a nonparametric approach would provide a better fit of the failure distributions than the parametric approach. The ultimate true stress, true strain, and ultimate chain stretch data at failure were analyzed from 60 tensile tests conducted previously. The ultimate load and ultimate displacement from 121 small punch tests conducted previously were also analyzed. It was found that both Normal and Weibull models provide a reasonable description of the central tendency of the failure distribution. The principal difference between the Normal and Weibull models can be appreciated in the predicted lower-bound response at the tail end of the distribution. The data support the use of both parametric and nonparametric methods to bracket the lower-bound failure prediction in order to simulate the failure threshold for UHMWPE. PMID:15772963

Kurtz, S M; Bergström, J; Rimnac, C M

2005-05-01
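The contrast the authors describe, similar central tendency but diverging lower-bound tails, can be reproduced with a standard two-way fit. The stress values below are synthetic stand-ins for the 60 tensile tests, and fixing the Weibull location at zero is an assumption of this sketch:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
# Synthetic ultimate true stress data (MPa); assumed stand-in for the tensile tests.
stress = rng.normal(loc=250.0, scale=20.0, size=60)

# Parametric Normal fit vs a Weibull fit with location fixed at zero.
mu, sigma = stats.norm.fit(stress)
shape, loc, scale = stats.weibull_min.fit(stress, floc=0)

# The two models agree near the centre of the distribution...
median_normal = stats.norm.ppf(0.5, mu, sigma)
median_weibull = stats.weibull_min.ppf(0.5, shape, loc, scale)

# ...but diverge in the lower tail, which drives lower-bound failure predictions.
p01_normal = stats.norm.ppf(0.01, mu, sigma)
p01_weibull = stats.weibull_min.ppf(0.01, shape, loc, scale)
```

Bracketing the lower-bound prediction between the two fitted 1st percentiles mirrors the paper's recommendation to use both models to bound the failure threshold.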

234

ERIC Educational Resources Information Center

This paper introduces the generalized beta (GB) model as a new modeling tool in the educational assessment area and evaluation analysis, specifically. Unlike normal model, GB model allows us to capture some real characteristics of data and it is an important tool for understanding the phenomenon of learning. This paper develops a contrast with the…

Campos, Jose Alejandro Gonzalez; Moraga, Paulina Saavedra; Del Pozo, Manuel Freire

2013-01-01

235

Directional data analysis under the general projected normal distribution

The projected normal distribution is an under-utilized model for explaining directional data. In particular, the general version provides flexibility, e.g., asymmetry and possible bimodality along with convenient regression specification. Here, we clarify the properties of this general class. We also develop fully Bayesian hierarchical models for analyzing circular data using this class. We show how they can be fit using MCMC methods with suitable latent variables. We show how posterior inference for distributional features such as the angular mean direction and concentration can be implemented as well as how prediction within the regression setting can be handled. With regard to model comparison, we argue for an out-of-sample approach using both a predictive likelihood scoring loss criterion and a cumulative rank probability score criterion. PMID:24046539

Wang, Fangpo; Gelfand, Alan E.

2013-01-01
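A projected normal draw is just a bivariate normal draw reduced to its angle, which makes the model easy to simulate. This sketch (with illustrative parameters, not the paper's Bayesian machinery) also computes the angular mean direction and a concentration measure mentioned in the abstract:

```python
import numpy as np

rng = np.random.default_rng(7)

def sample_projected_normal(mean, cov, size, rng):
    """Sample directions theta in [-pi, pi): draw from a bivariate normal
    and keep only the angle of each point (projection onto the unit circle)."""
    xy = rng.multivariate_normal(mean, cov, size=size)
    return np.arctan2(xy[:, 1], xy[:, 0])

# A mean vector far from the origin gives a concentrated, unimodal sample;
# these parameter values are illustrative only.
theta = sample_projected_normal([3.0, 0.0], np.eye(2), 10000, rng)

# Circular mean direction and mean resultant length (concentration proxy).
mean_dir = np.arctan2(np.sin(theta).mean(), np.cos(theta).mean())
resultant_len = np.hypot(np.sin(theta).mean(), np.cos(theta).mean())
```

Moving the mean vector toward the origin, or making the covariance asymmetric, produces the dispersed, skewed, or bimodal angular samples the general class is valued for.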

236

Nonlinear structural analysis on distributed-memory computers

NASA Technical Reports Server (NTRS)

A computational strategy is presented for the nonlinear static and postbuckling analyses of large complex structures on massively parallel computers. The strategy is designed for distributed-memory, message-passing parallel computer systems. The key elements of the proposed strategy are: (1) a multiple-parameter reduced basis technique; (2) a nested dissection (or multilevel substructuring) ordering scheme; (3) parallel assembly of global matrices; and (4) a parallel sparse equation solver. The effectiveness of the strategy is assessed by applying it to thermo-mechanical postbuckling analyses of stiffened composite panels with cutouts, and nonlinear large-deflection analyses of HSCT models on Intel Paragon XP/S computers. The numerical studies presented demonstrate the advantages of nested dissection-based solvers over traditional skyline-based solvers on distributed memory machines.

Watson, Brian C.; Noor, Ahmed K.

1995-01-01

237

Rapid Spatial Distribution Seismic Loss Analysis for Multistory Buildings

and incremental dynamic analysis along with the commercial software SAP2000 are used to establish demands from which story damage and financial losses are computed directly and aggregated for the entire structure. Rigorous and simplified methods are developed...

Deshmukh, Pankaj Bhagvatrao

2012-07-16

238

Reliability assessment in electrical power systems: the Weibull-Markov stochastic model

The field of power system reliability is dominated by the use of homogeneous Markov models. The negative exponential distributions in these models, however, are unrealistic in the case of repair or switching times. The use of homogeneous Markov models is often justified by arguing that the use of other models makes it impossible to perform analytical or nonsequential calculations. It

Jasper F. L. van Casteren; Math H. J. Bollen; Martin E. Schmieg

2000-01-01
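The objection to negative exponential repair times is their memorylessness; a short check makes the contrast with a Weibull concrete (parameter values are illustrative only, not from the paper):

```python
import numpy as np
from scipy import stats

# Memorylessness: for the exponential, P(T > s+t | T > s) == P(T > t).
# For a Weibull with shape != 1 this fails, which is why homogeneous Markov
# models misrepresent repair and switching times.
s, t = 2.0, 3.0
expo = stats.expon(scale=4.0)
weib = stats.weibull_min(0.5, scale=4.0)

def cond_survival(dist, s, t):
    """P(T > s + t | T > s) from the survival function."""
    return dist.sf(s + t) / dist.sf(s)

memoryless_gap_expo = abs(cond_survival(expo, s, t) - expo.sf(t))
memoryless_gap_weib = abs(cond_survival(weib, s, t) - weib.sf(t))
```

The exponential gap is zero to numerical precision, while the Weibull (shape 0.5, a decreasing hazard typical of repair processes) shows a substantial gap: having already survived time s changes its remaining-life distribution.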

239

Reliability assessment in electrical power systems: the Weibull-Markov stochastic model

The field of power system reliability is dominated by the use of homogeneous Markov models. The negative exponential distributions in these models, however, are unrealistic in the case of repair or switching times. The use of homogeneous Markov models is often justified by arguing that the use of other models makes it impossible to perform analytical or nonsequential calculations. A

Jasper van Casteren; Math Bollen; Martin Schmieg

1999-01-01

240

Statistical analysis of dendritic spine distributions in rat hippocampal cultures

Background Dendritic spines serve as key computational structures in brain plasticity. Much remains to be learned about their spatial and temporal distribution among neurons. Our aim in this study was to perform exploratory analyses based on the population distributions of dendritic spines with regard to their morphological characteristics and period of growth in dissociated hippocampal neurons. We fit a log-linear model to the contingency table of spine features such as spine type and distance from the soma to first determine which features were important in modeling the spines, as well as the relationships between such features. A multinomial logistic regression was then used to predict the spine types using the features suggested by the log-linear model, along with neighboring spine information. Finally, an important variant of Ripley’s K-function applicable to linear networks was used to study the spatial distribution of spines along dendrites. Results Our study indicated that in the culture system, (i) dendritic spine densities were "completely spatially random", (ii) spine type and distance from the soma were independent quantities, and most importantly, (iii) spines had a tendency to cluster with other spines of the same type. Conclusions Although these results may vary with other systems, our primary contribution is the set of statistical tools for morphological modeling of spines which can be used to assess neuronal cultures following gene manipulation such as RNAi, and to study induced pluripotent stem cells differentiated to neurons. PMID:24088199

2013-01-01

241

Structure-function analysis and psi, jet, W, and Z production: Determining the gluon distribution

We perform a next-to-leading-order structure-function analysis of deep-inelastic μN and νN scattering data and find acceptable fits for a range of input gluon distributions. We show three equally acceptable sets of parton distributions which correspond to gluon distributions which are (1) ''soft,'' (2) ''hard,'' and (3) which behave as xG(x) ≈ 1/√x at small x. J/ψ and prompt photon hadroproduction data

A. D. Martin; R. G. Roberts; W. J. Stirling

1988-01-01

242

Analysis of the 3D distribution of stacked self-assembled quantum dots by electron tomography

NASA Astrophysics Data System (ADS)

The 3D distribution of self-assembled stacked quantum dots (QDs) is a key parameter to obtain the highest performance in a variety of optoelectronic devices. In this work, we have measured this distribution in 3D using a combined procedure of needle-shaped specimen preparation and electron tomography. We show that conventional 2D measurements of the distribution of QDs are not reliable, and only 3D analysis allows an accurate correlation between the growth design and the structural characteristics.

Hernández-Saz, Jesús; Herrera, Miriam; Alonso-Álvarez, Diego; Molina, Sergio I.

2012-12-01

243

Subsystem Interaction Analysis in Power Distribution Systems of Next Generation Airlifters

... electrically. In other words, electrical power will be utilized for driving aircraft subsystems currently ... Sriram Chandrasekaran, Douglas K. Lindner, Konstantin Louganski, and Dushan Boroyevich, Center for Power Electronics

Lindner, Douglas K.

244

A Critique of Distributional Analysis In the Spatial Model Craig A. Tovey

Distributional analysis has been a widely used technique in the study of voting data. A portion of this work was supported by a National Science Foundation Presidential Young Investigator Award ECS-8451032.

Tovey, Craig A.

245

To identify genetic loci influencing central obesity and fat distribution, we performed a meta-analysis of 16 genome-wide association studies (GWAS, N = 38,580) informative for adult waist circumference (WC) and waist–hip ...

Hunter, David J.

246

Use of a moments method for the analysis of flux distributions in subcritical assemblies

A moments method has been developed for the analysis of flux distributions in subcritical neutron-multiplying assemblies. The method determines values of the asymptotic axial and radial buckling, and of the extrapolated ...

Cheng, Hsiang-Shou

1968-01-01

247

Finite element analysis of moisture distribution and hygrothermal stresses in TSOP IC packages

In this study, the finite element analysis of moisture absorption and residual stresses in plastic encapsulated IC packages is presented. During the moisture soaking test, moisture distributions in plastic encapsulated IC packages are evaluated by solving the diffusion equations. Thin LOC TSOP packages in various moisture soaking conditions are considered. The effects of temperature and humidity conditions on moisture distributions

Sung Yi; Kam Yim Sze

1998-01-01

248

Efficient Algorithm for ``On-the-Fly'' Error Analysis of Local or Distributed Serially Correlated Data

We describe the Dynamic Distributable Decorrelation Algorithm (DDDA), which efficiently calculates the statistical error of a calculation ``on-the-fly.'' Quantum Monte Carlo calculations are presented to illustrate ``on-the-fly'' ...

Goddard III, William A.

249

An Analysis of Congruence Gaps and Their Effect on Distributed Software Development

... structures, there is a hidden coordination cost. Modularity and even agile methods have had mixed success ... Software projects are frequently distributed

Valetto, Giuseppe "Peppo"

250

From a sample of 149 unrelated Spaniards, individuals were phenotyped for their ability to hydroxylate debrisoquin and O-demethylate dextromethorphan. The distribution of urinary metabolic ratios for each test was analyzed by univariate Gaussian mixture distribution analysis to determine the number of populations, the mean and standard deviation of the metabolic ratios for each population, and the proportion belonging to each

Thomas K Henthorn; Julio Benitez; Michael J Avram; Carmen Martinez; Adrián Llerena; Jesús Cobaleda; Tom C Krejcie; Robert D Gibbons

1989-01-01

251

Single Cell Analysis of Drug Distribution by Intravital Imaging

Recent advances in the field of intravital imaging have for the first time allowed us to conduct pharmacokinetic and pharmacodynamic studies at the single cell level in live animal models. Due to these advances, there is now a critical need for automated analysis of pharmacokinetic data. To address this, we began by surveying common thresholding methods to determine which would be most appropriate for identifying fluorescently labeled drugs in intravital imaging. We then developed a segmentation algorithm that allows semi-automated analysis of pharmacokinetic data at the single cell level. Ultimately, we were able to show that drug concentrations can indeed be extracted from serial intravital imaging in an automated fashion. We believe that the application of this algorithm will be of value to the analysis of intravital microscopy imaging particularly when imaging drug action at the single cell level. PMID:23593370

Giedt, Randy J.; Koch, Peter D.; Weissleder, Ralph

2013-01-01

252

Economic Analysis of Trickle Distribution System Texas High Plains.

The systems were evaluated for producing cotton and sorghum in solid and double-row planting methods. Estimated investment ranged from $49.19 to $60.61 per acre for the movable surface systems. For automated subsurface systems, estimated investment requirements per acre ranged from $562.57 to $1,860.17. Investment requirements per acre for the furrow distribution system were estimated to be $62.74. The lowest estimated costs per acre for cotton for the movable trickle distribution systems were...

Osborn, James E.; Young, Alan M.; Wilke, Otto C.; Wendt, Charles

1977-01-01

253

Analysis of the spatial distribution between successive earthquakes.

Spatial distances between subsequent earthquakes in southern California exhibit scale-free statistics, with a critical exponent delta approximately 0.6, as well as finite size scaling. The statistics are independent of the threshold magnitude as long as the catalog is complete, but depend strongly on the temporal ordering of events, rather than the geometry of the spatial epicenter distribution. Nevertheless, the spatial distance and waiting time between subsequent earthquakes are uncorrelated with each other. These observations contradict the theory of aftershock zone scaling with main shock magnitude. PMID:15783608

Davidsen, Jörn; Paczuski, Maya

2005-02-01

254

The Poisson distribution is the most widely recognised and commonly used distribution for cytogenetic radiation biodosimetry. However, it is recognised that, due to the complexity of radiation exposure cases, other distributions may be more properly applied. Here, the Poisson, gamma, negative binomial, beta, Neyman type-A and Hermite distributions are compared in terms of their applicability to 'real-life' radiation exposure situations. The identification of the most appropriate statistical model in each particular exposure situation more correctly characterises data. The results show that for acute, homogeneous (whole-body) exposures, the Poisson distribution can still give a good fit to the data. For localised partial-body exposures, the Neyman type-A model was found to be the most robust. Overall, no single distribution was found to be universally appropriate. A distribution-specific method of analysis of cytogenetic data is therefore recommended. Such an approach may lead potentially to more accurate biological dose estimates. PMID:23325781

Ainsbury, Elizabeth A; Vinnikov, Volodymyr A; Maznyk, Nataliya A; Lloyd, David C; Rothkamm, Kai

2013-07-01
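One standard way to decide whether the Poisson assumption holds for aberration counts is the dispersion index combined with Papworth's u statistic; this sketch uses simulated counts, and the u formula is the commonly quoted form rather than anything taken from this paper:

```python
import numpy as np

rng = np.random.default_rng(3)

def dispersion_u(counts):
    """Papworth u statistic: compares the dispersion index (variance/mean)
    of aberration counts with the Poisson expectation of 1.
    |u| > 1.96 suggests departure from Poisson at the 5% level."""
    counts = np.asarray(counts)
    n = counts.size
    d = counts.var(ddof=1) / counts.mean()
    return float((d - 1.0) * np.sqrt((n - 1.0) / (2.0 * (1.0 - 1.0 / counts.sum()))))

# Acute homogeneous (whole-body) exposure: Poisson-like counts, |u| small.
whole_body = rng.poisson(1.2, size=500)
# Localised partial-body exposure: zero-inflated counts, overdispersed, u large.
partial_body = np.where(rng.random(500) < 0.5, 0, rng.poisson(2.5, size=500))

u_whole = dispersion_u(whole_body)
u_partial = dispersion_u(partial_body)
```

A large positive u is the usual signal that a compound model such as the Neyman type-A, which the authors found most robust for partial-body exposures, should replace the plain Poisson.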

255

FIGURE 3. The arctangent distribution fit to the rat cancer data.

Survivor functions for the ball bearing lifetimes.

TABLE 1. Kolmogorov–Smirnov Goodness-of-Fit Statistics for the Ball Bearing Data.
Distribution   D23
Exponential    0.301
Weibull        0.152
Gamma          0.123
Inverse ...

``Statistical Investigation of the Fatigue Life of Deep-Groove Ball Bearings.'' Journal of Research

Leemis, Larry

256

Measuring the atmospheric organic aerosol volatility distribution: a theoretical analysis

NASA Astrophysics Data System (ADS)

Organic compounds represent a significant fraction of submicrometer atmospheric aerosol mass. Even if most of these compounds are semi-volatile in atmospheric concentrations, the ambient organic aerosol volatility is quite uncertain. The most common volatility measurement method relies on the use of a thermodenuder (TD). The aerosol passes through a heated tube where its more volatile components evaporate, leaving the less volatile components behind in the particulate phase. The typical result of a thermodenuder measurement is the mass fraction remaining (MFR), which depends, among other factors, on the organic aerosol (OA) vaporization enthalpy and the accommodation coefficient. We use a new method combining forward modeling, introduction of "experimental" error, and inverse modeling with error minimization for the interpretation of TD measurements. The OA volatility distribution, its effective vaporization enthalpy, the mass accommodation coefficient and the corresponding uncertainty ranges are calculated. Our results indicate that existing TD-based approaches quite often cannot estimate reliably the OA volatility distribution, leading to large uncertainties, since there are many different combinations of the three properties that can lead to similar thermograms. We propose an improved experimental approach combining TD and isothermal dilution measurements. We evaluate this experimental approach using the same model, and show that it is suitable for studies of OA volatility in the lab and the field.

Karnezi, E.; Riipinen, I.; Pandis, S. N.

2014-09-01

257

Measuring the atmospheric organic aerosol volatility distribution: a theoretical analysis

NASA Astrophysics Data System (ADS)

Organic compounds represent a significant fraction of submicrometer atmospheric aerosol mass. Even if most of these compounds are semi-volatile in atmospheric concentrations, the ambient organic aerosol volatility is quite uncertain. The most common volatility measurement method relies on the use of a thermodenuder (TD). The aerosol passes through a heated tube where its more volatile components evaporate, leaving the less volatile behind in the particulate phase. The typical result of a thermodenuder measurement is the mass fraction remaining (MFR), which depends among other factors on the organic aerosol (OA) vaporization enthalpy and the accommodation coefficient. We use a new method combining forward modeling, introduction of "experimental" error and inverse modeling with error minimization for the interpretation of TD measurements. The OA volatility distribution, its effective vaporization enthalpy, the mass accommodation coefficient and the corresponding uncertainty ranges are calculated. Our results indicate that existing TD-based approaches quite often cannot estimate reliably the OA volatility distribution, leading to large uncertainties, since there are many different combinations of the three properties that can lead to similar thermograms. We propose an improved experimental approach combining TD and isothermal dilution measurements. We evaluate this experimental approach using the same model and show that it is suitable for studies of OA volatility in the lab and the field.

Karnezi, E.; Riipinen, I.; Pandis, S. N.

2014-01-01

258

Visualization and analysis of lipopolysaccharide distribution in binary phospholipid bilayers

Lipopolysaccharide (LPS) is an endotoxin released from the outer membrane of Gram-negative bacteria during infections. It has been reported that LPS may play a role in the outer membrane of bacteria similar to that of cholesterol in eukaryotic plasma membranes. In this article we compare the effect of introducing LPS or cholesterol in liposomes made of dipalmitoylphosphatidylcholine/dioleoylphosphatidylcholine on the solubilization process by Triton X-100. The results show that liposomes containing LPS or cholesterol are more resistant to solubilization by Triton X-100 than the binary phospholipid mixtures at 4 °C. The LPS distribution was analyzed on GUVs of DPPC:DOPC using FITC-LPS. Solid and liquid-crystalline domains were visualized by labeling the GUVs with LAURDAN, and GP images were acquired using a two-photon microscope. The images show a selective distribution of LPS in gel domains. Our results support the hypothesis that LPS could aggregate and concentrate selectively in biological membranes, providing a mechanism to bring together several components of the LPS-sensing machinery.

Henning, Maria Florencia [Instituto de Investigaciones Bioquimicas La Plata (INIBIOLP), CCT-La Plata, CONICET, Facultad de Ciencias Medicas, UNLP, Calles 60 y 120, 1900 La Plata (Argentina)]; Sanchez, Susana [Laboratory for Fluorescence Dynamics, University of California-Irvine, Irvine, CA (United States)]; Bakas, Laura, E-mail: lbakas@biol.unlp.edu.ar [Instituto de Investigaciones Bioquimicas La Plata (INIBIOLP), CCT-La Plata, CONICET, Facultad de Ciencias Medicas, UNLP, Calles 60 y 120, 1900 La Plata (Argentina); Departamento de Ciencias Biologicas, Facultad de Ciencias Exactas, UNLP, Calles 47 y 115, 1900 La Plata (Argentina)]

2009-05-22

259

Heterogeneity effects in radiation dose distribution analysis for BNCT

Calculation of the various radiation dose components that will exist in the treatment volume during boron neutron capture therapy (BNCT) is a complex, three-dimensional problem. These components all have different spatial distributions and relative biological effectiveness (RBE). The typical approach to such calculations has been to approximate the highly heterogeneous calculational geometry of the irradiation volume by either a spatially homogenized model or by a simplified few-region model. The accuracy of such models should be validated by comparison with calculated results obtained by modeling the actual heterogeneous geometry and tissue variations as faithfully as possible. The results of such an exercise for the geometry of the canine head are presented. There are basically three types of tissue-heterogeneity effects that influence radiation dose distributions in BNCT. First, macroscopic spatial fluence perturbations can occur as a result of spatial variations in the radiation transport properties of the tissues in the irradiation volume. Second, tissues with different elemental compositions will have different kerma factors and, therefore, different absorbed doses, even in the same radiation fluence. Finally, there are macroscopic and microscopic effects caused by local secondary charged-particle imbalance in heterogeneous media. The work presented in this paper concentrates on the first type of heterogeneity effects.

Moran, J.M.; Nigg, D.W. (Idaho National Engineering Lab., Idaho Falls (United States))

1992-01-01

260

Analysis and Design Selection of Lightning Arrester for Distribution Substation

Abstract—Distribution substations feed power to consumers through distributors and service lines; their main equipment comprises generators and transformers. To protect this equipment and maintain stability, protection against over-voltages and over-currents must be considered. Lightning is one of the most serious causes of over-voltage: if power equipment, especially at an outdoor substation, is left unprotected, the over-voltage will burn its insulation. A lightning arrester can prevent such equipment damage. This paper describes the arrester type, lightning terminal, and earthing plan of the Dagon East substation in Myanmar. DynaVar station-class and intermediate-class arresters (Vrated = 72 kV and I charge (max) = 10 kA) are used in this substation. Most substation equipment is designed to match the insulation coordination, and higher insulation levels mean higher cost. To relax this requirement, the lightning arrester must be placed in front of the protected equipment and protected zone. For this purpose, the paper specifically addresses both the safety and the cost savings of over-voltage protection in a distribution substation. Keywords—Lightning arrester, Earthing plan, DynaVar station, Intermediate arrester.

Nay Kyi Htwe

261

Analysis of an algorithm for distributed recognition and accountability

Computer and network systems are vulnerable to attack. Abandoning the existing huge infrastructure of possibly-insecure computer and network systems is impossible, and replacing them by totally secure systems may not be feasible or cost effective. A common element in many attacks is that a single user will often attempt to intrude upon multiple resources throughout a network. Detecting the attack can become significantly easier by compiling and integrating evidence of such intrusion attempts across the network rather than attempting to assess the situation from the vantage point of only a single host. To solve this problem, we suggest an approach for distributed recognition and accountability (DRA), which consists of algorithms which "process," at a central location, distributed and asynchronous "reports" generated by computers (or a subset thereof) throughout the network. Our highest-priority objectives are to observe the ways by which an individual moves around in a network of computers, including changing user names to possibly hide his/her true identity, and to associate all activities of multiple instances of the same individual with the same network-wide user. We present the DRA algorithm and a sketch of its proof under an initial set of simplifying albeit realistic assumptions. Later, we relax these assumptions to accommodate pragmatic aspects such as missing or delayed "reports," clock skew, tampered "reports," etc. We believe that such algorithms will have widespread applications in the future, particularly in intrusion-detection systems.

Ko, C.; Frincke, D.A.; Goan, T. Jr.; Heberlein, L.T.; Levitt, K.; Mukherjee, B.; Wee, C. [California Univ., Davis, CA (United States). Dept. of Computer Science

1993-08-01

262

Project ENDEAVOR: distributed modeling for advanced marine vehicle performance analysis

Project ENDEAVOR (Environment for Design of Advanced Marine Vehicles and Operations Research) has devised a unique capability to support advanced marine vehicles (AMVs) in the areas of mission planning, design, and performance analysis. The Project has established an 8-year global NOAA WaveWatch III (WWIII) deepwater condition database to feed the Simulation of Waves Nearshore (SWAN; Booij et al., 1999) regional

Donald Fabozzi; J. Bergquist; John Winship; Demont Hansen

2005-01-01

263

Channel flow analysis. [velocity distribution throughout blade flow field

NASA Technical Reports Server (NTRS)

The design of a proper blade profile requires calculation of the blade row flow field in order to determine the velocities on the blade surfaces. An analysis theory is presented for several methods used for this calculation and associated computer programs that were developed are discussed.

Katsanis, T.

1973-01-01

264

Solving Electrical Distribution Problems Using Hybrid Evolutionary Data Analysis Techniques

Real-world electrical engineering problems can take advantage of the latest Data Analysis methodologies. In this paper we will show that Genetic Fuzzy Rule-Based Systems and Genetic Programming techniques are good choices for tackling some practical modeling problems. We claim that both evolutionary processes may produce good numerical results while providing us with a model that can be interpreted by

Oscar Cordón; Francisco Herrera; Luciano Sánchez

1999-01-01

265

NASA Astrophysics Data System (ADS)

The effects of specimen size on the compressive strength and Weibull modulus were investigated for nuclear graphite of different coke particle sizes: IG-110 and NBG-18 (average coke particle size for IG-110: 25 μm, NBG-18: 300 μm). Two types of cylindrical specimens, i.e., where the diameter to length ratio was 1:2 (ASTM C 695-91 type specimen, 1:2 specimen) or 1:1 (1:1 specimen), were prepared for six diameters (3, 4, 5, 10, 15, and 20 mm) and tested at room temperature (compressive strain rate: 2.08 × 10^-4 s^-1). Anisotropy was considered during specimen preparation for NBG-18. The results showed that the effects of specimen size appeared negligible for the compressive strength, but grade-dependent for the Weibull modulus. In view of specimen miniaturization, deviations from the ASTM C 695-91 specimen size requirements require an investigation into the effects of size for the grade of graphite of interest, and the specimen size effects should be considered for Weibull modulus determination.

Chi, Se-Hwan

2013-05-01

266

NASA Technical Reports Server (NTRS)

The results of the Independent Orbiter Assessment (IOA) of the Failure Modes and Effects Analysis (FMEA) and Critical Items List (CIL) are presented. The IOA approach features a top-down analysis of the hardware to determine failure modes, criticality, and potential critical items. To preserve independence, this analysis was accomplished without reliance upon the results contained within the NASA FMEA/CIL documentation. This report documents the independent analysis results corresponding to the Orbiter Electrical Power Distribution and Control (EPD and C) hardware. The EPD and C hardware performs the functions of distributing, sensing, and controlling 28 volt DC power and of inverting, distributing, sensing, and controlling 117 volt 400 Hz AC power to all Orbiter subsystems from the three fuel cells in the Electrical Power Generation (EPG) subsystem. Volume 2 continues the presentation of IOA analysis worksheets and contains the potential critical items list.

Schmeckpeper, K. R.

1987-01-01

267

a Weibull distribution for the probability of fracture (Beremin 1983), or end up in similar expressions starting from an assumed Poisson distribution for cleavage-triggering microcracks (Wallin et al. 1984) using distributions of microcracks situated on the plane of a main macrocrack. Detailed statistical

Ghoniem, Nasr M.

268

Analysis of the tropospheric water distribution during FIRE 2

NASA Technical Reports Server (NTRS)

The Penn State/NCAR mesoscale model, as adapted for use at ARC, was used as a testbed for the development and validation of cloud models for use in General Circulation Models (GCM's). This modeling approach also allows us to intercompare the predictions of the various cloud schemes within the same dynamical framework. The use of the PSU/NCAR mesoscale model also allows us to compare our results with FIRE-II (First International Satellite Cloud Climatology Project Regional Experiment) observations, instead of climate statistics. Though a promising approach, our work to date revealed several difficulties. First, the model by design is limited in spatial coverage and is only run for 12 to 48 hours at a time. Hence the quality of the simulation will depend heavily on the initial conditions. The poor quality of upper-tropospheric measurements of water vapor is well known and the situation is particularly bad for mid-latitude winter since the coupling with the surface is less direct than in summer so that relying on the model to spin-up a reasonable moisture field is not always successful. Though one of the most common atmospheric constituents, water vapor is relatively difficult to measure accurately, especially operationally over large areas. The standard NWS sondes have little sensitivity at the low temperatures where cirrus form and the data from the GOES 6.7 micron channel is difficult to quantify. For this reason, the goals of FIRE Cirrus II included characterizing the three-dimensional distribution of water vapor and clouds. In studying the data from FIRE Cirrus II, it was found that no single special observation technique provides accurate regional distributions of water vapor. The Raman lidar provides accurate measurements, but only at the Hub, for levels up to 10 km, and during nighttime hours. 
The CLASS sondes are more sensitive to moisture at low temperatures than are the NWS sondes, but the four stations only cover an area of two hundred kilometers on a side. The aircraft give the most accurate measurements of water vapor, but are limited in spatial and temporal coverage. This problem is partly alleviated by the use of the MAPS analyses, a four-dimensional data assimilation system that combines the previous 3-hour forecast with the available observations, but its upper-level moisture analyses are sometimes deficient because of the vapor measurement problem. An attempt was made to create a consistent four-dimensional description of the water vapor distribution during the second IFO by subjectively combining data from a variety of sources, including MAPS analyses, CLASS sondes, SPECTRE sondes, NWS sondes, GOES satellite analyses, radars, lidars, and microwave radiometers.

Westphal, Douglas L.

1993-01-01

269

An empirical analysis of the distribution of overshoots in a stationary Gaussian stochastic process

NASA Technical Reports Server (NTRS)

The frequency distribution of overshoots in a stationary Gaussian stochastic process is analyzed. The primary processes involved in this analysis are computer simulation and statistical estimation. Computer simulation is used to simulate stationary Gaussian stochastic processes that have selected autocorrelation functions. An analysis of the simulation results reveals a frequency distribution for overshoots with a functional dependence on the mean and variance of the process. Statistical estimation is then used to estimate the mean and variance of a process. It is shown that, given an autocorrelation function and estimates of the mean and variance of the process, a frequency distribution for overshoots can be estimated.
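The simulation half of this analysis is easy to sketch. The snippet below is illustrative only: it uses a simple Gaussian AR(1) recursion as the stationary process (one possible choice of autocorrelation function, not necessarily the one used in the paper) and counts upcrossings of a threshold, each of which begins an overshoot.

```python
import numpy as np

rng = np.random.default_rng(42)

def ar1(n, phi=0.9):
    """Stationary Gaussian AR(1) process scaled to unit marginal variance."""
    x = np.empty(n)
    x[0] = rng.normal()
    sigma_e = np.sqrt(1.0 - phi**2)    # keeps Var[x_t] = 1 for all t
    for t in range(1, n):
        x[t] = phi * x[t - 1] + sigma_e * rng.normal()
    return x

def count_overshoots(x, level):
    """Count upcrossings of `level`; each upcrossing starts one overshoot."""
    below = x[:-1] < level
    above = x[1:] >= level
    return int(np.sum(below & above))

x = ar1(100_000)
counts = {u: count_overshoots(x, u) for u in (0.0, 1.0, 2.0)}
print(counts)   # higher thresholds are crossed less often
```

Repeating this over many simulated realizations yields the empirical frequency distribution of overshoot counts that the statistical-estimation step then models.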

Carter, M. C.; Madison, M. W.

1973-01-01

270

Finite key analysis for symmetric attacks in quantum key distribution

We introduce a constructive method to calculate the achievable secret key rate for a generic class of quantum key distribution protocols, when only a finite number n of signals is given. Our approach is applicable to all scenarios in which the quantum state shared by Alice and Bob is known. In particular, we consider the six state protocol with symmetric eavesdropping attacks, and show that for a small number of signals, i.e., below n ≈ 10^4, the finite key rate differs significantly from the asymptotic value for n → ∞. However, for larger n, a good approximation of the asymptotic value is found. We also study secret key rates for protocols using higher-dimensional quantum systems.

Meyer, Tim; Kampermann, Hermann; Kleinmann, Matthias; Bruss, Dagmar [Institut fuer Theoretische Physik III, Heinrich-Heine-Universitaet Duesseldorf, D-40225 Duesseldorf (Germany)

2006-10-15

271

Optimizing distributed practice: theoretical analysis and practical implications.

More than a century of research shows that increasing the gap between study episodes using the same material can enhance retention, yet little is known about how this so-called distributed practice effect unfolds over nontrivial periods. In two three-session laboratory studies, we examined the effects of gap on retention of foreign vocabulary, facts, and names of visual objects, with test delays up to 6 months. An optimal gap improved final recall by up to 150%. Both studies demonstrated nonmonotonic gap effects: Increases in gap caused test accuracy to initially sharply increase and then gradually decline. These results provide new constraints on theories of spacing and confirm the importance of cumulative reviews to promote retention over meaningful time periods. PMID:19439395

Cepeda, Nicholas J; Coburn, Noriko; Rohrer, Doug; Wixted, John T; Mozer, Michael C; Pashler, Harold

2009-01-01

272

Completion report harmonic analysis of electrical distribution systems

Harmonic currents have increased dramatically in electrical distribution systems in the last few years due to the growth in non-linear loads found in most electronic devices. Because electrical systems have been designed for linear voltage and current waveforms (i.e., nearly sinusoidal), non-linear loads can cause serious problems such as overheating conductors or transformers, capacitor failures, inadvertent circuit breaker tripping, or malfunction of electronic equipment. The U.S. Army Center for Public Works has proposed a study to determine what devices are best for reducing or eliminating the effects of harmonics on power systems typical of those existing in their Command, Control, Communication and Intelligence (C3I) sites.
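The harmonic content such a study would quantify is commonly summarized as total harmonic distortion (THD). A minimal FFT-based sketch on a synthetic waveform (the 60 Hz fundamental and the 3rd/5th-harmonic amplitudes below are hypothetical, chosen for illustration):

```python
import numpy as np

fs, f0, cycles = 3000, 60, 10          # sample rate (Hz), fundamental, whole cycles
t = np.arange(cycles * fs // f0) / fs  # integer number of periods -> clean FFT bins

# Synthetic distorted current: fundamental plus 3rd and 5th harmonics
i = 1.0 * np.sin(2 * np.pi * f0 * t) \
  + 0.2 * np.sin(2 * np.pi * 3 * f0 * t) \
  + 0.1 * np.sin(2 * np.pi * 5 * f0 * t)

spec = np.abs(np.fft.rfft(i)) * 2 / len(t)      # single-sided amplitude spectrum
freqs = np.fft.rfftfreq(len(t), 1 / fs)
fund = spec[np.argmin(np.abs(freqs - f0))]
harmonics = [spec[np.argmin(np.abs(freqs - k * f0))] for k in range(2, 8)]
thd = np.sqrt(np.sum(np.square(harmonics))) / fund
print(f"THD = {thd:.1%}")   # sqrt(0.2^2 + 0.1^2) / 1.0 ~ 22.4% here
```

Sampling an exact integer number of periods keeps each harmonic in its own FFT bin, so no window function is needed for this sketch.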

Tolbert, L.M.

1996-03-01

273

A social network analysis of customer-level revenue distribution

Social network analysis has been a topic of regular interest in the marketing discipline. Previous studies have largely focused on similarities in product/brand choice decisions within the same social network, often in the context of product innovation adoption. Not much is known, however, about the importance of social network effects once customers have been acquired. Using the customer base of

Michael Haenlein

2011-01-01

274

Solving Electrical Distribution Problems Using Hybrid Evolutionary Data Analysis Techniques

Abstract. Real-world electrical engineering problems can take advantage of the latest Data Analysis methodologies. In this paper we will show that Genetic Fuzzy Rule-Based Systems and Genetic Programming techniques are good choices for tackling some practical modeling problems. We claim that both evolutionary processes may produce good numerical results while providing us with a model that can be interpreted by a human being. We

Oscar Cordón; Francisco Herrera; Luciano Sánchez

1999-01-01

275

Distributed finite element analysis using a transputer network

NASA Technical Reports Server (NTRS)

The principal objective of this research effort was to demonstrate the extraordinarily cost effective acceleration of finite element structural analysis problems using a transputer-based parallel processing network. This objective was accomplished in the form of a commercially viable parallel processing workstation. The workstation is a desktop size, low-maintenance computing unit capable of supercomputer performance yet costs two orders of magnitude less. To achieve the principal research objective, a transputer based structural analysis workstation termed XPFEM was implemented with linear static structural analysis capabilities resembling commercially available NASTRAN. Finite element model files, generated using the on-line preprocessing module or external preprocessing packages, are downloaded to a network of 32 transputers for accelerated solution. The system currently executes at about one third Cray X-MP24 speed but additional acceleration appears likely. For the NASA selected demonstration problem of a Space Shuttle main engine turbine blade model with about 1500 nodes and 4500 independent degrees of freedom, the Cray X-MP24 required 23.9 seconds to obtain a solution while the transputer network, operated from an IBM PC-AT compatible host computer, required 71.7 seconds. Consequently, the $80,000 transputer network demonstrated a cost-performance ratio about 60 times better than the $15,000,000 Cray X-MP24 system.
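The abstract's "about 60 times better" cost-performance claim can be checked directly from the figures it quotes:

```python
# Figures quoted in the abstract
cray_time, net_time = 23.9, 71.7           # seconds to solve the turbine-blade model
cray_cost, net_cost = 15_000_000, 80_000   # approximate system costs in US dollars

speed_ratio = net_time / cray_time         # Cray X-MP24 is ~3x faster...
cost_ratio = cray_cost / net_cost          # ...but ~187x more expensive
cost_performance_advantage = cost_ratio / speed_ratio
print(f"transputer network advantage ~ {cost_performance_advantage:.0f}x")
# roughly 60x, consistent with the abstract
```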

Watson, James; Favenesi, James; Danial, Albert; Tombrello, Joseph; Yang, Dabby; Reynolds, Brian; Turrentine, Ronald; Shephard, Mark; Baehmann, Peggy

1989-01-01

276

The impact of recent precision measurements of DIS structure functions and inclusive jet production at the Tevatron on the global QCD analysis of parton distribution functions is studied in detail. Particular emphasis is placed on exploring the range of variation of the gluon distribution G(x,Q) allowed by these new data. The strong coupling of G(x,Q) with α_s is fully taken into account. A new generation of CTEQ parton distributions, CTEQ4, is presented. It consists of the three standard sets (MS-bar, DIS and leading order), a series that gives a range of parton distributions with corresponding α_s's, and a set with a low starting value of Q. Previously obtained gluon distributions that are consistent with the high E_t jet cross-section are also discussed in the context of this new global analysis.

H. L. Lai; J. Huston; S. Kuhlmann; F. Olness; J. Owens; D. Soper; W. K. Tung; H. Weerts

1996-06-21

277

We're closer than you think: Portland's geographic location and Oregon's transportation infrastructure offer unmatched connectivity and time savings to international and domestic markets. Our economic development practices combine project-ready property with efficient, high-capacity infrastructure to create today's logistics advantages. Connecting people, places and products is the core of Portland's distribution and logistics industry sector.

F. Gregory; B. Boyd; R. Bridges; D. Mitchell; J. Halsell; S. Fancher; D. King; R. Fore; E. Mango; D. Berlinrut; M. Leinbach; M. Maier; M. Wetmore; H. Herring; J. Guidi; M. Coolidge; J. Heald; T. Knox; D. Bartine; R. Bailey; H. Delgado; P. Conant; J. Madura; R. Thomas; F. Merceret; G. Allen; E. Bensman; R. Dittemore; N. Feldman; C. Boykin; H. Tileston; F. Brody; L. Hagerman; S. Pearson; L. Uccellini; W. Vaughan; J. Golden; D. Johnson; J. McQueen; B. Roberts; L. Freeman; G. Jasper; B. Hagemeyer; A. McCool; X. W. Proenza; S. Glover

2006-01-01

278

Reliability analysis of GFRP pultruded composite rods

The mechanical properties of FRP composites show remarkable scatter, even when the specimens are prepared and tested under identical conditions. This paper proposes a new computerized method for accurately estimating the parameters (X0, ?, ?) of the Weibull distribution function, and calculates the safe design fatigue life of glass fibre (ER 1150 F-183) reinforced polyester (Q 8520 A) pultruded composite rods

M. H. Abdallah; Enayat M. Abdin; A. I. Selmy; U. A. Khashaba

1996-01-01

279

Statistical analysis of slow crack growth experiments

Common approaches for the determination of slow crack growth (SCG) parameters are the static and dynamic loading methods. Since materials with a small Weibull modulus show a large variability in strength, a correct statistical analysis of the data is indispensable. In this work we propose the use of the Maximum Likelihood Method and a Bayesian analysis, which, in contrast to
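A maximum-likelihood Weibull fit of the kind advocated here can be sketched with SciPy. The synthetic "strength" data and its parameters (shape, i.e. Weibull modulus, m = 8; scale 300 MPa) are hypothetical, and the location is fixed at zero to give a two-parameter fit:

```python
import numpy as np
from scipy.stats import weibull_min

rng = np.random.default_rng(7)
# Synthetic strength data drawn from a known Weibull law (illustrative values)
data = weibull_min.rvs(c=8.0, scale=300.0, size=500, random_state=rng)

# Maximum-likelihood fit; floc=0 pins the location parameter at zero
shape, loc, scale = weibull_min.fit(data, floc=0)
print(f"estimated modulus m = {shape:.2f}, scale = {scale:.1f} MPa")
```

With small samples, as the abstract notes, the likelihood surface is flat and point estimates of the modulus are unstable, which is where confidence intervals or a Bayesian posterior become essential.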

Tobias Pfingsten; Karsten Glien

2006-01-01

280

Evolution History of Asteroid Itokawa Based on Block Distribution Analysis

NASA Astrophysics Data System (ADS)

This work investigates trends in the global and regional distribution of blocks on asteroid 25143 Itokawa in order to better understand the history of this asteroid. Itokawa is a near-Earth object, and the first asteroid that was targeted for a sample return mission. Trends in block population provide new insights into Itokawa's current appearance following the disruption of a possible parent body, and how its surface might have changed since then. Here blocks are defined as rocks or features with distinctive positive relief that are larger than a few meters in size. The size and distribution of blocks are measured by mapping the outline of the blocks using the Small Body Mapping Tool (SBMT) created by the Johns Hopkins University Applied Physics Laboratory [1]. The SBMT allows the user to overlay correctly geo-located Hayabusa images [2] onto the Itokawa shape model. This study provides additional inferences on the original disruption and subsequent re-accretion of Itokawa's "head" and "body" from block analyses. A new approach is taken by analyzing the population of blocks with respect to latitude for both Itokawa's current state, and a hypothetical elliptical body. Itokawa currently rotates approximately about its axis of maximum moment of inertia, which is expected due to conservation of momentum and minimum energy arguments. After the possible disruption of the parent body of Itokawa, the "body" of Itokawa would have tended to a similar rotation. The shape of this body is made by removing the head of Itokawa and applying a hemispherical cap. Using the method of [3], inertial properties of this object are calculated. With the assumption that this object had settled to its stable rotational axis, it is found that the pole axis could have been tilted about 13° away from the current axis in the direction opposite the head, equivalent to a 33 meter change in the center of mass.
The results of this study provide a means of testing the hypothesis that Itokawa is a contact binary. References: [1] E. G. Kahn, et al. A tool for the visualization of small body data. In LPSC XLII, 2011. [2] A. Fujiwara, et al. The rubble-pile asteroid Itokawa as observed by Hayabusa. Science, 312(5778):1330-1334, June 2006. [3] A. F. Cheng, et al. Small-scale topography of 433 Eros from laser altimetry and imaging. Icarus, 155(1):51-74, 2002.

Mazrouei, Sara; Daly, Michael; Barnouin, Olivier; Ernst, Carolyn

2013-04-01

281

Analysis of Drying Kinetics and Moisture Distribution in Convective Textile Fabric Drying

The drying process of crude cotton fabric is analyzed under two main aspects: analysis of moisture distribution inside the textile sheet, and analysis of certain operational convective drying process variables. Experimental apparatus consisted of a drying chamber in which samples of pure cotton textile were suspended inside the drying chamber and exposed to a convective hot air flow. The influence

Luiza Helena C. D. Sousa; Oswaldo. C. Motta Lima; Nehemias C. Pereira

2006-01-01

282

Composition and On Demand Deployment of Distributed Brain Activity Analysis Application on Global ... are brain science and high-energy physics. The analysis of brain activity data gathered from the MEG ... brain functions and requires access to large-scale computational resources. The potential platform

Abramson, David

283

Distributed Control and Stochastic Analysis of Hybrid Systems Supporting Safety Critical Real-Time Systems Design (HYBRIDGE). WP3: Reachability analysis for probabilistic hybrid systems; Probabilistic Aircraft Conflict

Del Moral , Pierre

284

Unraveling the distributed neural code of facial identity through spatiotemporal pattern analysis

The present study investigates the neural code of facial identity perception with the aim of ascertaining ... -based brain mapping and dynamic discrimination analysis to locate spatiotemporal patterns that support face

Behrmann, Marlene

285

Performance analysis of a fault-tolerant distributed multimedia server

NASA Astrophysics Data System (ADS)

The evolving demands of networks to support Webtone, H.323, AIN and other advanced services require multimedia servers that can deliver a number of value-added capabilities such as negotiating protocols, delivering network services, and responding to QoS requests. The server is one of the primary limiters on network capacity. The next generation server must be based upon a flexible, robust, scalable, and reliable platform to keep abreast of the revolutionary pace of service demand and development while continuing to provide the same dependability that voice networks have provided for decades. A new distributed platform, which is based upon the Totem fault-tolerant messaging system, is described. Processor and network resources are modeled and analyzed. Quantitative results are presented that assess this platform in terms of messaging capacity and performance for various architecture and design options including processing technologies and fault-tolerance modes. The impacts of fault-tolerant messaging are identified based upon analytical modeling of the proposed server architecture.

Derryberry, Barbara

1998-12-01

286

Spectral Energy Distribution Analysis of Luminous Infrared Galaxies from GOALS

NASA Astrophysics Data System (ADS)

The spectral energy distributions (SEDs) of the local luminous and ultraluminous infrared galaxies (LIRGs and ULIRGs) were thought to be well understood and exemplified by that of Arp 220, the "poster child" of these objects; but in fact, Arp 220 has been shown to be special in more than one way. Here we present comprehensive SEDs (from radio through x-ray) for the 88 most luminous (U)LIRGs in the Great Observatories All-sky LIRG Survey (GOALS), which combines multiwavelength imaging and spectroscopic data from space telescopes (Spitzer, HST, GALEX, and Chandra) in an effort to fully understand galaxy evolution processes and the enhanced infrared emission in the local universe. Spanning the luminosity range 11.4 < log(L_ir/L_sun) < 12.5, our objects are a complete subset of the flux-limited IRAS Revised Bright Galaxy Sample. To complement spacecraft data, we also obtained optical imaging data from Mauna Kea and searched the literature in order to compile accurate and consistent photometry and fully characterize the spectral shapes of the SEDs. We then analyzed the ratios of the radio, infrared, optical, and x-ray emission as a function of infrared luminosity and discussed the trends observed.

U, Vivian; Sanders, D.; Evans, A.; Mazzarella, J.; Armus, L.; Iwasawa, K.; Vavilkin, T.; Surace, J.; Howell, J.; GOALS Team

2009-05-01

287

Distribution-function analysis of mesoscopic hopping conductance fluctuations

NASA Astrophysics Data System (ADS)

Variable-range hopping (VRH) conductance fluctuations in the gate-voltage characteristics of mesoscopic gallium arsenide and silicon transistors are analyzed by means of their full distribution functions (DF's). The forms of the DF predicted by the theory of Raikh and Ruzin have been verified under controlled conditions for both the long, narrow wire and the short, wide channel geometries. The variation of the mean square fluctuation size with temperature in wires fabricated from both materials is found to be described quantitatively by Lee's model of VRH along a one-dimensional chain. Armed with this quantitative validation of the VRH model, the DF method is applied to the problem of magnetoconductance in the insulating regime. Here a nonmonotonic variation of the magnetoconductance is observed in silicon metal-oxide-semiconductor field-effect transistors whose sign at low magnetic fields is dependent on the channel geometry. The origin of this effect is discussed within the framework of the interference model of VRH magnetoconductance in terms of a narrowing of the DF in a magnetic field.

Hughes, R. J. F.; Savchenko, A. K.; Frost, J. E. F.; Linfield, E. H.; Nicholls, J. T.; Pepper, M.; Kogan, E.; Kaveh, M.

1996-07-01

288

Classification of cerebral lymphomas and glioblastomas featuring luminance distribution analysis.

Differentiating lymphomas and glioblastomas is important for proper treatment planning. A number of works have been proposed but there are still some problems. For example, many works depend on thresholding a single feature value, which is susceptible to noise. In other cases, experienced observers are required to extract the feature values or to provide some interactions with the system. Even if experts are involved, interobserver variance becomes another problem. In addition, most of the works use only one or a few slice(s) because 3D tumor segmentation is time consuming. In this paper, we propose a tumor classification system that analyzes the luminance distribution of the whole tumor region. Typical cases are classified by the luminance range thresholding and the apparent diffusion coefficients (ADC) thresholding. Nontypical cases are classified by a support vector machine (SVM). Most of the processing elements are semiautomatic. Therefore, even novice users can use the system easily and get the same results as experts. The experiments were conducted using 40 MRI datasets. The classification accuracy of the proposed method was 91.1% without the ADC thresholding and 95.4% with the ADC thresholding. On the other hand, the baseline method, the conventional ADC thresholding, yielded only 67.5% accuracy. PMID:23840280

Yamasaki, Toshihiko; Chen, Tsuhan; Hirai, Toshinori; Murakami, Ryuji

2013-01-01

289

A Weibull Regression Model with Gamma Frailties for Multivariate Survival Data

Frequently in the analysis of survival data, survival times within the same group are correlated due to unobserved covariates. One way these covariates can be included in the model is as frailties. These frailty random block effects generate dependency between the survival times of the individuals, which are conditionally independent given the frailty. Using a conditional proportional hazards model, in
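The conditional construction sketched in this abstract can be made concrete with a small simulation (all parameter values below are assumed for illustration, not taken from the paper): a gamma-distributed frailty shared within each group multiplies the Weibull hazard, so survival times are conditionally independent given the frailty yet correlated marginally.

```python
import numpy as np

rng = np.random.default_rng(3)
n_groups, group_size = 500, 4
shape, scale = 1.5, 10.0                       # assumed Weibull baseline parameters
frailty = rng.gamma(2.0, 0.5, size=n_groups)   # shared mean-1 gamma frailty per group

# Conditional on a group's frailty w, the cumulative hazard is
# H(t) = w * (t / scale)**shape; setting H(t) equal to a unit-exponential
# draw E and inverting gives a survival time draw.
E = rng.exponential(size=(n_groups, group_size))
times = scale * (E / frailty[:, None]) ** (1.0 / shape)

# Members of the same group share a frailty, so their log survival times
# are positively correlated even though they are conditionally independent.
within = np.corrcoef(np.log(times[:, 0]), np.log(times[:, 1]))[0, 1]
```

Integrating the frailty out yields marginal survival times with heavier tails than the Weibull alone, which is exactly the extra between-subject dependence the frailty term is designed to capture.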

Sujit K. Sahu; Dipak K. Dey; Helen Aslanidou; Debajyoti Sinha

1997-01-01

290

Feature Extraction from Degree Distribution for Comparison and Analysis of Complex Networks

The degree distribution is an important characteristic of complex networks. In many data analysis applications, the networks should be represented as fixed-length feature vectors and therefore the feature extraction from the degree distribution is a necessary step. Moreover, many applications need a similarity function for comparison of complex networks based on their degree distributions. Such a similarity measure has many applications including classification and clustering of network instances, evaluation of network sampling methods, anomaly detection, and study of epidemic dynamics. The existing methods are unable to effectively capture the similarity of degree distributions, particularly when the corresponding networks have different sizes. Based on our observations about the structure of the degree distributions in networks over time, we propose a feature extraction and a similarity function for the degree distributions in complex networks. We propose to calculate the feature values based on the mean an...
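A minimal sketch of the general idea — not the authors' exact construction — maps a degree sequence of any size to a fixed-length feature vector and compares networks through distances between those vectors:

```python
import numpy as np

def degree_features(degrees, n_quantiles=8):
    """Map a degree sequence of arbitrary length to a fixed-length feature
    vector: interior quantiles of the log-scaled, mean-normalized degrees."""
    d = np.asarray(degrees, dtype=float)
    d = d / d.mean()                             # remove the network-size scale
    qs = np.linspace(0.05, 0.95, n_quantiles)    # interior quantiles avoid unstable extremes
    return np.quantile(np.log1p(d), qs)

def degree_similarity(deg_a, deg_b):
    """Similarity in (0, 1]: 1 means identical feature vectors."""
    fa, fb = degree_features(deg_a), degree_features(deg_b)
    return 1.0 / (1.0 + np.linalg.norm(fa - fb))

rng = np.random.default_rng(0)
heavy_small = rng.zipf(2.5, size=500)           # heavy-tailed degrees, small network
heavy_large = rng.zipf(2.5, size=5000)          # same law, ten times the size
uniform_net = rng.integers(1, 20, size=5000)    # qualitatively different law
```

Because the degrees are mean-normalized before the quantiles are taken, two networks drawn from the same degree law but with very different sizes map to nearby feature vectors, which is the size-independence property the abstract emphasizes.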

Aliakbary, Sadegh; Movaghar, Ali

2014-01-01

291

Early changes in thallium distribution. Effect on quantitative analysis

Thirty-two patients with coronary artery disease and an abnormality on an initial anterior view thallium scan had repeat images obtained after delays of 30 and 240 minutes. Scans were analyzed by quantitative criteria. Comparison of the initial stress study with the 30-minute redistribution scan showed no significant change in 11 patients, defects becoming smaller in 13 patients, and defects becoming larger in eight patients. When comparing the stress or the early redistribution images with the late redistribution scans, the diagnosis (eg, scar vs. ischemia) would have been affected in 14 cases. Analysis of the sources of variability showed that all the apparent worsening but only part of the defect resolution could be explained by variability inherent to repositioning the patient. Thus, the size of an initial defect is very sensitive to the time between the end of exercise and the onset of data collection and the nature of changes in scan appearance is complex.

Makler, P.T. Jr.; McCarthy, D.M.; Alavi, A.

1985-01-01

292

Distribution of Modelling Spatial Processes Using Geostatistical Analysis

NASA Astrophysics Data System (ADS)

The Geostatistical Analyst uses sample points taken at different locations in a landscape and creates (interpolates) a continuous surface. The Geostatistical Analyst provides two groups of interpolation techniques: deterministic and geostatistical. All methods rely on the similarity of nearby sample points to create the surface. Deterministic techniques use mathematical functions for interpolation. Geostatistics relies on both statistical and mathematical methods, which can be used to create surfaces and assess the uncertainty of the predictions. The first step in geostatistical analysis is variography: computing and modelling a semivariogram. A semivariogram is one of the significant functions to indicate spatial correlation in observations measured at sample locations. It is commonly represented as a graph that shows the variance in measure with distance between all pairs of sampled locations. Such a graph is helpful to build a mathematical model that describes the variability of the measure with location. Modelling the relationship among sample locations to describe the variability of the measure with separation distance is called semivariogram modelling. It is applied in applications that involve estimating the value of a measure at a new location. Our work presents the analysis of the data following the steps given below: identification of data set periods, constructing and modelling the empirical semivariogram for a single location, and using the Kriging mapping function for modelling of TEC maps at mid-latitude during disturbed and quiet days. Based on the semivariogram, weights for the kriging interpolation are estimated. Additional observations do not, in general, provide relevant extra information to the interpolation, because the spatial correlation is well described by the semivariogram. Keywords: Semivariogram, Kriging, modelling, Geostatistics, TEC
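The variography step described above can be illustrated with a minimal sketch (synthetic coordinates and values, not the authors' TEC data): the empirical semivariogram is half the mean squared difference between sample values, binned by pairwise separation distance.

```python
import numpy as np

def empirical_semivariogram(coords, values, n_bins=10):
    """Half the mean squared difference of values, binned by pairwise
    separation distance -- the empirical semivariogram."""
    i, j = np.triu_indices(len(values), k=1)            # all unique point pairs
    dist = np.linalg.norm(coords[i] - coords[j], axis=1)
    semivar = 0.5 * (values[i] - values[j]) ** 2
    edges = np.linspace(0.0, dist.max(), n_bins + 1)
    which = np.digitize(dist, edges[1:-1])              # bin index per pair
    lags = 0.5 * (edges[:-1] + edges[1:])               # bin midpoints
    gamma = np.array([semivar[which == b].mean() if np.any(which == b) else np.nan
                      for b in range(n_bins)])
    return lags, gamma

# Synthetic field: values vary smoothly in x, plus measurement noise
rng = np.random.default_rng(0)
coords = rng.uniform(0.0, 100.0, size=(200, 2))
values = np.sin(coords[:, 0] / 20.0) + 0.1 * rng.standard_normal(200)
lags, gamma = empirical_semivariogram(coords, values)
```

In a full geostatistical workflow, a parametric model (spherical, exponential, Gaussian) would then be fitted to the (lag, gamma) pairs, and the fitted model — not the raw empirical curve — supplies the kriging weights.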

Grynyshyna-Poliuga, Oksana; Stanislawska, Iwona; Swiatek, Anna

293

Assessing Ensemble Filter Estimates of the Analysis Error Distribution of the Day

NASA Astrophysics Data System (ADS)

Ensemble data assimilation algorithms (e.g., the Ensemble Kalman Filter) are often purported to return an estimate of the "analysis error distribution of the day"; a measure of the variability in the analysis that is consistent with the current state of the system. In this presentation, we demonstrate that in the presence of non-linearity this is not, in fact, the case. The true error distribution of the day given today's observations consists of the Bayesian posterior PDF formed via the conjunction of the prior forecast error distribution with the likelihood error distribution constructed from the observations of the day. In actuality, ensemble data assimilation algorithms return an estimate of the analysis error integrated over all prior realizations of the observations of the day. The result is consistent with the true posterior analysis uncertainty (as returned by a solution to Bayes) if the likelihood distribution produced by the observations of the day is approximately equal to the likelihood distribution integrated over all possible observations (or equivalently innovations).

Posselt, D. J.; Hodyss, D.; Bishop, C. H.

2013-12-01

294

NASA Astrophysics Data System (ADS)

This paper presents a spherical harmonic analysis of the plasma velocity distribution function using high-angular, energy, and time resolution Cluster data obtained from the PEACE spectrometer instrument to demonstrate how this analysis models the particle distribution function and its moments and anisotropies. The results show that spherical harmonic analysis produced a robust physical representation model of the velocity distribution function, resolving the main features of the measured distributions. From the spherical harmonic analysis, a minimum set of nine spectral coefficients was obtained from which the moment (up to the heat flux), anisotropy, and asymmetry calculations of the velocity distribution function were obtained. The spherical harmonic method provides a potentially effective ``compression'' technique that can be easily carried out onboard a spacecraft to determine the moments and anisotropies of the particle velocity distribution function for any species. These calculations were implemented using three different approaches, namely, the standard traditional integration, the spherical harmonic (SPH) spectral coefficients integration, and the singular value decomposition (SVD) on the spherical harmonic methods. A comparison among the various methods shows that both SPH and SVD approaches provide remarkable agreement with the standard moment integration method.

Viñas, Adolfo F.; Gurgiolo, Chris

2009-01-01

295

NASA Technical Reports Server (NTRS)

This paper presents a spherical harmonic analysis of the plasma velocity distribution function using high-angular, energy, and time resolution Cluster data obtained from the PEACE spectrometer instrument to demonstrate how this analysis models the particle distribution function and its moments and anisotropies. The results show that spherical harmonic analysis produced a robust physical representation model of the velocity distribution function, resolving the main features of the measured distributions. From the spherical harmonic analysis, a minimum set of nine spectral coefficients was obtained from which the moment (up to the heat flux), anisotropy, and asymmetry calculations of the velocity distribution function were obtained. The spherical harmonic method provides a potentially effective "compression" technique that can be easily carried out onboard a spacecraft to determine the moments and anisotropies of the particle velocity distribution function for any species. These calculations were implemented using three different approaches, namely, the standard traditional integration, the spherical harmonic (SPH) spectral coefficients integration, and the singular value decomposition (SVD) on the spherical harmonic methods. A comparison among the various methods shows that both SPH and SVD approaches provide remarkable agreement with the standard moment integration method.

Gurgiolo, Chris; Vinas, Adolfo F.

2009-01-01

296

Integral-moment analysis of the BATSE gamma-ray burst intensity distribution

NASA Technical Reports Server (NTRS)

We have applied the technique of integral-moment analysis to the intensity distribution of the first 260 gamma-ray bursts observed by the Burst and Transient Source Experiment (BATSE) on the Compton Gamma Ray Observatory. This technique provides direct measurement of properties such as the mean, variance, and skewness of the convolved luminosity-number density distribution, as well as associated uncertainties. Using this method, one obtains insight into the nature of the source distributions unavailable through computation of traditional single parameters such as V/V(sub max). If the luminosity function of the gamma-ray bursts is strongly peaked, giving bursts only a narrow range of luminosities, these results are then direct probes of the radial distribution of sources, regardless of whether the bursts are a local phenomenon, are distributed in a galactic halo, or are at cosmological distances. Accordingly, an integral-moment analysis of the intensity distribution of the gamma-ray bursts provides for the most complete analytic description of the source distribution available from the data, and offers the most comprehensive test of the compatibility of a given hypothesized distribution with observation.

Horack, John M.; Emslie, A. Gordon

1994-01-01

297

Distributional Benefit Analysis of a National Air Quality Rule

Under Executive Order 12898, the U.S. Environmental Protection Agency (EPA) must perform environmental justice (EJ) reviews of its rules and regulations. EJ analyses address the hypothesis that environmental disamenities are experienced disproportionately by poor and/or minority subgroups. Such analyses typically use communities as the unit of analysis. While community-based approaches make sense when considering where polluting sources locate, they are less appropriate for national air quality rules affecting many sources and pollutants that can travel thousands of miles. We compare exposures and health risks of EJ-identified individuals rather than communities to analyze EPA’s Heavy Duty Diesel (HDD) rule as an example national air quality rule. Air pollutant exposures are estimated within grid cells by air quality models; all individuals in the same grid cell are assigned the same exposure. Using an inequality index, we find that inequality within racial/ethnic subgroups far outweighs inequality between them. We find, moreover, that the HDD rule leaves between-subgroup inequality essentially unchanged. Changes in health risks depend also on subgroups’ baseline incidence rates, which differ across subgroups. Thus, health risk reductions may not follow the same pattern as reductions in exposure. These results are likely representative of other national air quality rules as well. PMID:21776207

Post, Ellen S.; Belova, Anna; Huang, Jin

2011-01-01

298

Incidence, histopathologic analysis and distribution of tumours of the hand

Background The aim of this large collective and meticulous study of primary bone tumours and tumourous lesions of the hand was to enhance the knowledge about findings of traumatological radiographs and improve differential diagnosis. Methods This retrospective study reviewed data collected from 1976 until 2006 in our Bone Tumour Registry. The following data was documented: age, sex, radiological investigations, tumour location, histopathological features including type and dignity of the tumour, and diagnosis. Results The retrospective analysis yielded 631 patients with a mean age of 35.9 ± 19.2 years. The majority of primary hand tumours were found in the phalanges (69.7%) followed by 24.7% in metacarpals and 5.6% in the carpals. Only 10.6% of all cases were malignant. The major lesion type was cartilage derived at 69.1%, followed by bone cysts 11.3% and osteogenic tumours 8.7%. The dominant tissue type found in phalanges and metacarpals was of cartilage origin. Osteogenic tumours were predominant in carpal bones. Enchondroma was the most commonly detected tumour in the hand (47.1%). Conclusions All primary skeletal tumours can be found in the hand and are most often of cartilage origin followed by bone cysts and osteogenic tumours. This study furthermore raises awareness about uncommon or rare tumours and helps clinicians to establish proper differential diagnosis, as the majority of detected tumours of the hand are asymptomatic and accidental findings on radiographs. PMID:24885007

2014-01-01

299

Advancing Collaborative Climate Studies through Globally Distributed Geospatial Analysis

NASA Astrophysics Data System (ADS)

(note: acronym glossary at end of abstract) For scientists to have confidence in the veracity of data sets and computational processes not under their control, operational transparency must be much greater than previously required. Being able to have a universally understood and machine-readable language for describing such things as the completeness of metadata, data provenance and uncertainty, and the discrete computational steps in a complex process takes on increased importance. OGC has been involved with technological issues associated with climate change since 2005 when we, along with the IEEE Committee on Earth Observation, began a close working relationship with GEO and GEOSS (http://earthobservations.org). GEO/GEOSS provide the technology platform to GCOS, which in turn represents the earth observation community to UNFCCC. OGC and IEEE are the organizers of the GEO/GEOSS Architecture Implementation Pilot (see http://www.ogcnetwork.net/AIpilot). This continuing work involves closely working with GOOS (Global Ocean Observing System) and WMO (World Meteorological Organization). This session reports on the findings of recent work within the OGC’s community of software developers and users to apply geospatial web services to the climate studies domain. The value of this work is to evolve OGC web services, moving from data access and query to geo-processing and workflows. Two projects will be described, the GEOSS AIP-2 and the CCIP. AIP is a task of the GEOSS Architecture and Data Committee. During its duration, two GEO Tasks defined the project: AIP-2 began as GEO Task AR-07-02, to lead the incorporation of contributed components consistent with the GEOSS Architecture using a GEO Web Portal and a Clearinghouse search facility to access services through GEOSS Interoperability Arrangements in support of the GEOSS Societal Benefit Areas. 
AIP-2 concluded as GEO Task AR-09-01b, to develop and pilot new process and infrastructure components for the GEOSS Common Infrastructure and the broader GEOSS architecture. Of specific interest to this session is the work on geospatial workflows and geo-processing and data discovery and access. CCIP demonstrates standards-based interoperability between geospatial applications in the service of Climate Change analysis. CCIP is planned to be a yearly exercise. It consists of a network of online data services (WCS, WFS, SOS), analysis services (WPS, WCPS, WMS), and clients that exercise those services. In 2009, CCIP focuses on Australia, and the initial application of existing OGC services to climate studies. The results of the 2009 CCIP will serve as requirements for more complex geo-processing services to be developed for CCIP 2010. The benefits of CCIP include accelerating the implementation of the GCOS, and building confidence that implementations using multi-vendor interoperable technologies can help resolve vexing climate change questions. Acronym glossary: AIP-2: Architecture Implementation Pilot, Phase 2; CCIP: Climate Challenge Integration Plugfest; GEO: Group on Earth Observations; GEOSS: Global Earth Observing System of Systems; GCOS: Global Climate Observing System; OGC: Open Geospatial Consortium; SOS: Sensor Observation Service; WCS: Web Coverage Service; WCPS: Web Coverage Processing Service; WFS: Web Feature Service; WMS: Web Mapping Service

Singh, R.; Percivall, G.

2009-12-01

300

Hyperdimensional Analysis of Amino Acid Pair Distributions in Proteins

Our manuscript presents a novel approach to protein structure analyses. We have organized an 8-dimensional data cube with protein 3D-structural information from 8706 high-resolution non-redundant protein chains with the aim of identifying packing rules at the amino acid pair level. The cube contains information about amino acid type, solvent accessibility, spatial and sequence distance, secondary structure and sequence length. We are able to pose structural queries to the data cube using the program ProPack. The response is a 1, 2 or 3D graph. While the response is statistical in nature, the user can obtain an instant list of all PDB structures where such a pair is found. The user may select a particular structure, which is displayed highlighting the pair in question. The user may pose millions of different queries and receives the answer to each in a few seconds. In order to demonstrate the capabilities of the data cube as well as the programs, we have selected well-known structural features, disulphide bridges and salt bridges, where we illustrate how the queries are posed and how answers are given. Motifs involving cysteines such as disulphide bridges, zinc-fingers and iron-sulfur clusters are clearly identified and differentiated. ProPack also reveals that whereas pairs of Lys residues virtually never appear in close spatial proximity, pairs of Arg are abundant and appear at close spatial distance, contrasting the belief that electrostatic repulsion would prevent this juxtaposition and that Arg-Lys is perceived as a conservative mutation. The presented programs can find and visualize novel packing preferences in protein structures, allowing the user to unravel correlations between pairs of amino acids. The new tools allow the user to view statistical information and instantly visualize the structures that underpin it, which is far from trivial with most other software tools for protein structure analysis. PMID:22174733

Henriksen, Svend B.; Arnason, Omar; Soring, Jon; Petersen, Steffen B.

2011-01-01

301

Theoretical analysis of the steady state particle size distribution in limited breakage processes

NASA Astrophysics Data System (ADS)

The steady state particle size distribution is examined, resulting from a breakage process with a maximum stable size and a homogeneous continuous kernel. The dynamic breakage problem is transformed into one that allows direct solutions for the steady state distribution. The latter depends on the breakage kernel and on the ratio of critical to initial size. As this ratio goes to zero the steady state distribution approaches its limiting form obtained by the authors previously [7]. A general theoretical analysis concerning the steady state distribution is presented herein. The asymptotic behaviour is determined with regard to various limits. Perturbation analysis for nearly uniform kernels reveals several interesting features of the problem. For the general continuous kernel, the problem can be cast in a matrix form amenable to a conventional theoretical treatment. Finally, comparisons of the new results with existing solutions of the dynamic problem, for large times, confirm their validity.

Kostoglou, M.; Karabelas, A. J.

1998-11-01

302

A new statistical distribution for characterizing the random strength of brittle materials

A new three-parameter statistical distribution is offered for the description of random strength of a brittle material. The distribution allows characterization of a wide range of relations regarding the strength-size effect. Thus, in contrast to the Weibull distribution, the non-linear character between the logarithm of average strength and the logarithm of the specimen size may be described, while retaining the

M. R GURVICH; A. T DiBENEDETTO; S. V RANADE

1997-01-01

303

A Graphical Paradigm for the Design and Analysis of Distributed Systems

A graphical paradigm for the design, representation, and analysis of distributed systems is discussed. For the purposes of this paper, we use the term distributed system to define client/server, peer-to-peer, multi-tier and other non-traditional hardware/software solutions that are typified by applications which employ the resources of multiple platforms to perform business processes rather than being traditional mainframe solutions. The primary

H. Pat Artis

1994-01-01

304

A Bayesian analysis of stochastic volatility (SV) models using the class of symmetric scale mixtures of normal (SMN) distributions is considered. In the face of non-normality, this provides an appealing robust alternative to the routine use of the normal distribution. Specific distributions examined include the normal, student-t, slash and the variance gamma distributions. Using a Bayesian paradigm, an efficient Markov chain Monte Carlo (MCMC) algorithm is introduced for parameter estimation. Moreover, the mixing parameters obtained as a by-product of the scale mixture representation can be used to identify outliers. The methods developed are applied to analyze daily stock returns data on S&P500 index. Bayesian model selection criteria as well as out-of-sample forecasting results reveal that the SV models based on heavy-tailed SMN distributions provide significant improvement in model fit as well as prediction to the S&P500 index data over the usual normal model. PMID:20730043

Abanto-Valle, C. A.; Bandyopadhyay, D.; Lachos, V. H.; Enriquez, I.

2009-01-01

305

In order to discuss the effect of different distributions of component concentrations on the accuracy of quantitative spectral analysis, ideal absorption spectra of samples with three components were established according to the Lambert-Beer law. Gaussian noise was added to the spectra. Correction and prediction models were built by partial least squares regression to reflect the unequal modeling and prediction results obtained with different component distributions. Results show that, in the case of pure linear absorption, the accuracy of the model is related to the distribution of component concentrations. For both the component of interest and the non-tested components, a calibration set whose concentrations cover a larger range more uniformly is essential to establish a universal model with satisfactory accuracy. This research supplies theoretical guidance for the reasonable choice of samples with a suitable concentration distribution, which enhances the quality of the model and reduces the prediction error on the prediction set. PMID:23016350
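The reported effect can be reproduced with a toy simulation under the Lambert-Beer assumption of linear absorption (all numbers below are assumed, and ordinary least squares stands in for the paper's partial least squares regression): a calibration set covering the full concentration range predicts unseen samples far better than one clustered in a narrow sub-range.

```python
import numpy as np

rng = np.random.default_rng(1)
n_wl = 100                                      # wavelength channels
eps = rng.uniform(0.1, 1.0, size=(3, n_wl))     # absorptivities, 3 components

def spectra(conc):
    """Lambert-Beer: absorbance is linear in concentration, plus Gaussian noise."""
    return conc @ eps + 0.01 * rng.standard_normal((len(conc), n_wl))

cal_wide = rng.uniform(0.0, 1.0, size=(50, 3))                  # covers the full range
cal_narrow = 0.45 + 0.05 * rng.uniform(0.0, 1.0, size=(50, 3))  # clustered sub-range

test_c = rng.uniform(0.0, 1.0, size=(30, 3))
test_x = spectra(test_c)

def prediction_rmse(cal_c):
    """Fit a linear calibration model on one set, evaluate on the test set."""
    B, *_ = np.linalg.lstsq(spectra(cal_c), cal_c, rcond=None)
    return float(np.sqrt(np.mean((test_x @ B - test_c) ** 2)))

rmse_wide, rmse_narrow = prediction_rmse(cal_wide), prediction_rmse(cal_narrow)
```

The narrow calibration set can fit its own samples, but its model must extrapolate over the test range and its error grows accordingly, which is the abstract's point about choosing calibration samples with a wide, uniform concentration distribution.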

Li, Gang; Zhao, Zhe; Wang, Hui-Quan; Lin, Ling; Zhang, Bao-Ju; Wu, Xiao-Rong

2012-07-01

306

NASA Astrophysics Data System (ADS)

The earthquake spatial distribution is being studied, using earthquake catalogs from different seismic regions (California, Canada, Central Asia, Greece, and Japan). The quality of the available catalogs, taking into account the completeness of the magnitude, is examined. Based on the analysis of the catalogs, it was determined that the probability densities of the inter-event distance distribution collapse into single distribution when the data is rescaled. The collapse of the data provides a clear illustration of earthquake-occurrence self-similarity in space.

Marekova, Elisaveta

2014-12-01

307

Analysis and synthesis of distributed-lumped-active networks by digital computer

NASA Technical Reports Server (NTRS)

The use of digital computational techniques in the analysis and synthesis of DLA (distributed lumped active) networks is considered. This class of networks consists of three distinct types of elements, namely, distributed elements (modeled by partial differential equations), lumped elements (modeled by algebraic relations and ordinary differential equations), and active elements (modeled by algebraic relations). Such a characterization is applicable to a broad class of circuits, especially including those usually referred to as linear integrated circuits, since the fabrication techniques for such circuits readily produce elements which may be modeled as distributed, as well as the more conventional lumped and active ones.

1973-01-01

308

NASA Astrophysics Data System (ADS)

The earthquake spatial distribution is being studied, using earthquake catalogs from different seismic regions (California, Canada, Central Asia, Greece, and Japan). The quality of the available catalogs, taking into account the completeness of the magnitude, is examined. Based on the analysis of the catalogs, it was determined that the probability densities of the inter-event distance distribution collapse into single distribution when the data is rescaled. The collapse of the data provides a clear illustration of earthquake-occurrence self-similarity in space.

Marekova, Elisaveta

2014-08-01

309

Size distribution of magnetic iron oxide nanoparticles using Warren-Averbach XRD analysis

NASA Astrophysics Data System (ADS)

We use the Fourier transform based Warren-Averbach (WA) analysis to separate the contributions of X-ray diffraction (XRD) profile broadening due to crystallite size and microstrain for magnetic iron oxide nanoparticles. The profile shape of the column length distribution, obtained from WA analysis, is used to analyze the shape of the magnetic iron oxide nanoparticles. From the column length distribution, the crystallite size and its distribution are estimated for these nanoparticles which are compared with size distribution obtained from dynamic light scattering measurements. The crystallite size and size distribution of crystallites obtained from WA analysis are explained based on the experimental parameters employed in preparation of these magnetic iron oxide nanoparticles. The variation of volume weighted diameter (Dv, from WA analysis) with saturation magnetization (Ms) fits well to a core shell model wherein it is known that Ms=Mbulk(1-6g/Dv) with Mbulk as bulk magnetization of iron oxide and g as magnetic shell disorder thickness.

Mahadevan, S.; Behera, S. P.; Gnanaprakash, G.; Jayakumar, T.; Philip, J.; Rao, B. P. C.

2012-07-01

310

NASA Astrophysics Data System (ADS)

In this study, we propose an EEG phase synchronization analysis that includes not only the average strength of the synchronization but also its distribution and directions, under conditions in which emotion is evoked by musical stimuli. The experiment is performed with two different musical stimuli that evoke happiness or sadness for 150 seconds. It is found that while the average strength of synchronization indicates no difference between the right and left sides of the frontal lobe during the happy stimulus, the distribution and directions indicate significant differences. Therefore, the proposed analysis is useful for detecting emotional state because it provides information that cannot be obtained from the average strength of synchronization alone.

Ogawa, Yutaro; Ikeda, Akira; Kotani, Kiyoshi; Jimbo, Yasuhiko

311

A method is introduced in which tensile tests can be performed in situ on micromachined structures. The testing equipment consists of a testing unit mounted on a micromanipulator in a scanning electron microscope. The fracture loads of micromachined beam structures made from thick and thin film polysilicon were measured, and the fracture strengths were then calculated via measurements of the

Staffan Greek; Fredric Ericson; Stefan Johansson; Jan-Åke Schweitz

1997-01-01

312

Numerous strategies have been advocated to reduce the potential for plastic shrinkage cracking in concrete through mixture proportioning, curing methods, or the use of fiber reinforcement. The effectiveness of each approach must be adequately quantified to determine whether the additional initial cost of each strategy is justified. The majority of current research to characterize plastic shrinkage cracking in concrete relies

C. Qi; J. Weiss; J. Olek

2003-01-01

313

Using CLIMEX and the Taguchi Method, a process-based niche model was developed to estimate potential distributions of Phoenix dactylifera L. (date palm), an economically important crop in many countries. Development of the model was based on both its native and invasive distribution, and validation was carried out in terms of its extensive distribution in Iran. To identify the model parameters having the greatest influence on the distribution of date palm, a sensitivity analysis was carried out. Changes in suitability were established by mapping of regions where the estimated distribution changed with parameter alterations. This facilitated the assessment of certain areas in Iran where parameter modifications impacted the most, particularly in relation to suitable and highly suitable locations. Parameter sensitivities were also evaluated by the calculation of area changes within the suitable and highly suitable categories. The low temperature limit (DV2), high temperature limit (DV3), upper optimal temperature (SM2) and high soil moisture limit (SM3) had the greatest impact on sensitivity, while other parameters showed relatively less sensitivity or were insensitive to change. For an accurate fit in species distribution models, highly sensitive parameters require more extensive research and data collection methods. Results of this study demonstrate a more cost effective method for developing date palm distribution models, an integral element in species management, and may prove useful for streamlining requirements for data collection in potential distribution modeling for other species as well. PMID:24722140

Shabani, Farzin; Kumar, Lalit

2014-01-01

314

NASA Astrophysics Data System (ADS)

Grain size is one of the factors which influence mechanical properties of metals like strength and fracture toughness. Ultrasonic waves propagating in polycrystalline materials are subject to attenuation dominated by grain boundary scattering. The importance of grain size estimation for industrial applications warrants the investigation of alternative methods of nondestructive grain size determination. Analysis of the power-law behavior of ultrasonic attenuation experimental data is used to link the wavelength dependence of the attenuation coefficient directly to the grain size distribution. The outcome is a simple relationship between the power-law which describes the grain size distribution and the power-law dependence of attenuation on wavelength. Justifications for the use of the power-law for the grain size distribution include scaling and self-similarity. Careful attention is given to the limitations in terms of a practical grain size distribution with finite limits. Two types of measurements are presented to verify the theoretical development: grain size distribution and ultrasonic attenuation. Nickel samples were prepared using three different annealing durations. The attenuation exponent is experimentally shown to be an appropriate nondestructive measurement of the grain size distribution exponent. Further scaling properties for different annealing durations are also explored. A nondestructive evaluation procedure is suggested for metal samples with identical grain size distribution exponents, where the shifts of the log-log representations of the attenuation curves can be used to characterize the different grain size distributions.
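The power-law link between attenuation and wavelength described above can be estimated from measured data by a linear fit in log-log space. A minimal sketch, using synthetic values rather than the paper's nickel data:

```python
import numpy as np

def attenuation_power_law(wavelength, alpha):
    """Estimate C and n in alpha = C * wavelength**(-n) by a linear fit
    in log-log space; n is the attenuation exponent linked to the grain
    size distribution exponent."""
    slope, intercept = np.polyfit(np.log(wavelength), np.log(alpha), 1)
    return np.exp(intercept), -slope

# Synthetic check with alpha = 2.0 * lam**(-3); measured data would replace this.
lam = np.linspace(0.5, 5.0, 50)
C, n = attenuation_power_law(lam, 2.0 * lam ** -3.0)
```

On noiseless synthetic data the fit recovers the generating exponent; on real attenuation measurements the recovered `n` is the quantity compared against the grain size distribution exponent.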

Bilgutay, N.; Onaral, B.; Nicoletti, D.

1992-12-01

315

Generalized recurrence plots for the analysis of images from spatially distributed systems

We propose a new method for the analysis of images showing patterns emerging from the evolution of spatially distributed systems. The generalized recurrence plot (GRP) and the generalized recurrence quantification analysis (GRQA) are exploited for the investigation of such patterns. We focus on snapshots of spatio-temporal processes such as the formation of Turing structures and traveling waves in the Belousov–Zhabotinsky reaction,

Angelo Facchini; Chiara Mocenni; Antonio Vicino

2009-01-01

316

This paper provides an in-depth theoretical analysis of subcarrier multiplexed quantum key distribution (SCM-QKD) systems, taking into account as many factors of impairment as possible and especially considering the influence of nonlinear signal mixing on the end-to-end quantum bit error rate (QBER) and the useful key rate. A detailed analysis of SCM-QKD is performed considering the different factors affecting the

José Capmany; Arturo Ortigosa-Blanch; José Mora; Antonio Ruiz-Alba; Waldimar Amaya; Alfonso Martínez

2009-01-01

317

The analysis of reaction time (RT) distributions has become a recognized standard in studies on the stimulus response correspondence (SRC) effect as it allows exploring how this effect changes as a function of response speed. In this study, we compared the spatial SRC effect (the classic Simon effect) with the motion SRC effect using RT distribution analysis. Four experiments were conducted, in which we manipulated factors of space position and motion for stimulus and response, in order to obtain a clear distinction between positional SRC and motion SRC. Results showed that these two types of SRC effects differ in their RT distribution functions as the space positional SRC effect showed a decreasing function, while the motion SRC showed an increasing function. This suggests that different types of codes underlie these two SRC effects. Potential mechanisms and processes are discussed. PMID:24605178

Styrkowiec, Piotr; Szczepanowski, Remigiusz

2013-01-01

318

The distributed dislocation method applied to the analysis of elastoplastic strain concentrations

NASA Astrophysics Data System (ADS)

Conventionally, the use of continuous distributions of dislocations to model plasticity has been confined to the analysis of crack tip plasticity using linear arrays of dislocations, within the framework of plane analysis. By expanding this technique into a distribution of dislocations over an area, a method is developed to model the plasticity at stress-raising features such as notches or holes under plane strain conditions. The method explicitly takes account of the boundary conditions by using a dislocation solution which accounts for the presence of the stress raiser itself. Other free boundaries may be modelled more approximately using boundary elements which also correctly include the presence of the stress raiser. The dislocations are distributed over finite sized cells, and the solutions found for the strain fields compare favourably with both finite element and bounding Neuber and Glinka results.

Blomerus, P. M.; Hills, D. A.; Kelly, P. A.

1999-04-01

319

Power density distribution and associated thermal analysis of an elliptical polarizing undulator

NASA Astrophysics Data System (ADS)

The Taiwan Photon Source (TPS) project is planning to construct a third-generation synchrotron accelerator in Taiwan. This 3 GeV, 500 mA high-energy accelerator will support 20 beamlines for insertion devices (ID) and 24 beamlines for bending magnets (BM). We will undertake an in-depth investigation of the power density distribution and thermal analysis of the accelerator as an important part in the design of the photon absorbers, masks, and mirrors at the front end and in the beamline areas. An elliptically polarizing undulator (EPU) will be one of the primary ID sources for this accelerator and will have a different power density distribution from the other sources. We have derived a general power density distribution for this EPU source and have performed a mathematical approximation suitable for practical engineering applications. We will also present a thermal analysis of the helical undulator using Gaussian and unit step-function approximations.

Sheng, I. C.; Kuan, C. K.

2011-04-01

320

Quantitative Analysis of Circumferential Plaque Distribution in Human Coronary Arteries in

Quantitative Analysis of Circumferential Plaque Distribution in Human Coronary Arteries in Relation · Symptomatic coronary artery disease and atherosclerosis are among the leading causes of death in many to be understood. · 3-D Fusion of x-ray coronary angiography and intravascular ultrasound (IVUS) data allows

Wahle, Andreas

321

Analysis of a cone-based distributed topology control algorithm for wireless multi-hop networks

The topology of a wireless multi-hop network can be controlled by varying the transmission power at each node. In this paper, we give a detailed analysis of a cone-based distributed topology control algorithm. This algorithm, introduced in [16], does not assume that nodes have GPS information available; rather it depends only on directional information. Roughly speaking, the basic idea of

Li Li; Joseph Y. Halpern; Paramvir Bahl; Yi-Min Wang; Roger Wattenhofer

2001-01-01

322

Determination and analysis of distribution coefficients of 137Cs in soils from Biscay (Spain)

The distribution coefficient of 137Cs has been determined in 58 soils from 12 sampling points in Biscay by treating 10 g with 25 ml of an aqueous solution with an activity of 1765 Bq of the radionuclide, shaking for 64 h, and measuring the residual activity with a suitable detector. Soils were characterised by sampling depth, particle size analysis
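The batch-sorption distribution coefficient implied by this procedure can be computed directly from the initial and residual activities. A minimal sketch; the residual activity value below is purely illustrative, not taken from the paper:

```python
def distribution_coefficient(a_initial_bq, a_residual_bq, volume_ml, soil_mass_g):
    """Batch-sorption distribution coefficient Kd in ml/g:
    (activity sorbed per gram of soil) / (residual activity per ml of solution)."""
    sorbed = a_initial_bq - a_residual_bq
    return (sorbed / soil_mass_g) / (a_residual_bq / volume_ml)

# Conditions from the abstract: 10 g of soil, 25 ml of solution, 1765 Bq of 137Cs.
# The residual activity (176.5 Bq) is illustrative only.
kd = distribution_coefficient(1765.0, 176.5, 25.0, 10.0)
```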

C. Elejalde; M. Herranz; F. Legarda; F. Romero

2000-01-01

323

Analysis of repetitive DNA distribution patterns in the Tribolium castaneum genome

BACKGROUND: Insect genomes vary widely in size, a large fraction of which is often devoted to repetitive DNA. Re-association kinetics indicate that up to 42% of the genome of the red flour beetle, Tribolium castaneum, is repetitive. Analysis of the abundance and distribution of repetitive DNA in the recently sequenced genome of T. castaneum is important for understanding the structure

Suzhi Wang; Marcé D Lorenzen; Richard W Beeman; Susan J Brown

2008-01-01

324

The load analysis for the distribution system and facilities has relied on measurement equipment. Moreover, load monitoring incurs huge costs in terms of installation and maintenance. This paper presents a new model to analyze the load of facilities under a feeder every 15 min using meter-reading data that can be obtained from a power consumer every 15 min or a

Jin-Ho Shin; Bong-Jae Yi; Young-Il Kim; Heon-Gyu Lee; Keun Ho Ryu

2011-01-01

325

Pitch angle distribution analysis of radiation belt electrons based on Combined Release and Radiation Effects Satellite Medium Electrons A data. Using data from the Medium Electrons A instrument on the Combined Release and Radiation Effects Satellite (CRRES), a survey

Li, Xinlin

326

Stochastic center manifold analysis in scalar nonlinear systems involving distributed delays

This work concerns the effects of stochasticity and timing constraints on the behavior of nonlinear systems, especially near dynamic transitions. It is shown that noise changes significantly the stability of delayed nonlinear systems and, further, that noise

Boyer, Edmond

327

Global Distribution of Tropospheric Aerosols: A 3-D Model Analysis of Satellite Data

NASA Technical Reports Server (NTRS)

This report describes objectives completed for the GACP (Global Climatology Aerosol Project). The objectives included the analysis of satellite aerosol data, including the optical properties and global distributions of major aerosol types, and human contributions to major aerosol types. The researchers have conducted simulations and field work.

Chin, Mian

2002-01-01

328

Many areas of ecological inquiry require the ability to detect and characterize change in ecological variables across both space and time. The purpose of this study was to investigate ways in which geographic boundary analysis techniques could be used to characterize the pattern of change over space in plant distributions in a forested wetland mosaic. With vegetation maps created using

Kimberly R. Hall; Susan L. Maruca

2001-01-01

329

The Beach Study: An Empirical Analysis of the Distribution of Coastal Property Values

Empirical evidence suggests that coastal properties, and particularly those proximate to a beach, have

Omiecinski, Curtis

330

Overlap distributions and taxonomy analysis of spin glass states with equal weights

Techniques of numerical taxonomy are used to verify the ultrametricity of the overlap distributions; … differences between samples disappear.

Université de Paris-Sud XI

331

Modeling and Performance Analysis of a Microturbine as a Distributed Energy Resource

This paper presents modeling, simulation, and analysis of load following behavior of a microturbine (MT) as a distributed energy resource (DER). The MT-generator (MTG) system consists of the MT coupled to a synchronous generator. Simulation is done in MATLAB for different loading conditions under islanded and grid-connected modes. The MTG model also incorporates a speed controller for maintaining constant speed

A. K. Saha; S. Chowdhury; P. A. Crossley

2009-01-01

332

Identification of Neutron Sources by Spectral Analysis of Pulse Height Distributions

This paper proposes a neutron source identification method based on the spectral analysis of neutron pulse height distributions obtained with liquid scintillation detectors. The fact that shielded and unshielded neutron sources have clearly defined spectral components with specific locations and intensities offers the possibility of identifying the sources based on spectral features alone, without having to unfold the energy spectra.

Senada Avdic; Predrag Marinkovic; Sara A. Pozzi; Marek Flaska; Vladimir Protopopescu

2009-01-01

333

affecting the uncertainty of delay estimates. The probability distributions have a slight effect … The uncertainty analysis of the HCM delay

Prevedouros, Panos D.

334

To address the problem of outlier detection in wireless sensor networks, in this paper we propose a robust principal component analysis based technique to detect anomalous or faulty sensor data in a distributed wireless sensor network, with a focus on data integrity and accuracy. The key features are that it considers the correlation existing among the sensor data

N. Chitradevi; V. Palanisamy; K. Baskaran; U. B. Nisha

2010-01-01

335

Hydraulic model analysis of water distribution system, Rockwell International, Rocky Flats, Colorado

Rockwell International requested an analysis of the existing plant site water supply distribution system at Rocky Flats, Colorado, to determine its adequacy. On September 26--29, 1988, Hughes Associates, Inc., Fire Protection Engineers, accompanied by Rocky Flats Fire Department engineers and suppression personnel, conducted water flow tests at the Rocky Flats plant site. Thirty-seven flows from various points throughout the plant

J. Perstein; J. A. Castellano

1989-01-01

336

Selecting the appropriate statistical distribution for the primary analysis: a case study

An article in The Lancet discussed a clinical trial of a product for a rare disease. The authors had modified the primary analysis from an unadjusted Wilcoxon rank sum test to a Poisson regression. This led to several questions including: Was a Poisson distribution appropriate? How were the covariates selected? What is the effect of outliers (there were some)? If

Peter A. Lachenbruch

2005-01-01

337

DISTRIBUTED VIRTUAL SYSTEM FOR DOLPHINS' SOUND ACQUISITION AND TIME FREQUENCY ANALYSIS

Dolphins are mammals whose sound emitting and hearing capabilities are very important tools for their underwater life. At the same time, assessment of dolphins' life activities uses their emitted sounds to localize them and to better understand their habits and behavior. The present work reports a distributed virtual instrument based solution for dolphin sound acquisition, transmission and analysis (Dolphin Monitoring Network).

Octavian Postolache; Pedro Silva Girão; Miguel Dias Pereira; Mário Figueiredo

338

Analysis and Design of Silicon Bipolar Distributed Oscillators Ali Hajimiri and Hui Wu

A systematic approach to the design of silicon bipolar distributed oscillators and VCOs is presented. The operation of distributed oscillators is analyzed and the general condition for oscillation is derived

Hajimiri, Ali

339

Systematic analysis of transverse momentum distribution and non-extensive thermodynamics theory

A systematic analysis of transverse momentum distribution of hadrons produced in ultrarelativistic p+p and A+A collisions is presented. We investigate the effective temperature and the entropic parameter from the non-extensive thermodynamic theory of strong interaction. We conclude that the existence of a limiting effective temperature and of a limiting entropic parameter is in accordance with experimental data.

Sena, I.; Deppman, A. [Instituto de Fisica - Universidade de Sao Paulo (Brazil)]

2013-03-25

340

Mechanical Response of Silk Crystalline Units from Force-Distribution Analysis

The mechanical response of silk fibers is thought to be caused by embedded crystalline units acting as cross-links of silk … crystals, opening up the road to predict full fiber mechanics.

Gräter, Frauke

341

A new method for the size-distribution analysis of polymers by sedimentation velocity analytical ultracentrifugation is described. It exploits the ability of Lamm equation modeling to discriminate between the spreading of the sedimentation boundary arising from sample heterogeneity and from diffusion. Finite element solutions of the Lamm equation for a large number of discrete noninteracting species are combined with maximum entropy

Peter Schuck

2000-01-01

342

An Ecological Analysis of the Geographic Distribution of Veterinarians in the United States

ERIC Educational Resources Information Center

Measures of the ecological characteristics of states were developed through factor analysis. Then ecological characteristics of states and cities were related to the geographic distribution of veterinarians and physicians. Population size is the strongest correlate of the number of health professionals. Results for pet veterinarians resemble…

Richards, James M., Jr.

1977-01-01

343

Analysis of laser induced acoustic pulse probing of charge distributions in dielectrics

… energy conversion efficiency. 3) The influence of the elastic properties of the material. … Pressure pulses are generated without notable variation in their shape, with a good conversion efficiency.

Boyer, Edmond

344

An Analysis of Variance Approach for the Estimation of Response Time Distributions in Tests

ERIC Educational Resources Information Center

Generalizability theory and analysis of variance methods are employed, together with the concept of objective time pressure, to estimate response time distributions and the degree of time pressure in timed tests. By estimating response time variance components due to person, item, and their interaction, and fixed effects due to item types and…

Attali, Yigal

2010-01-01

345

Can Data Recognize Its Parent Distribution?

This study is concerned with model selection of lifetime and survival distributions arising in engineering reliability or in the medical sciences. We compare various distributions, including the gamma, Weibull and lognormal, with a new distribution called the geometric extreme exponential. Except for the lognormal distribution, the other three distributions all have the exponential distribution as a special case. A Monte Carlo simulation was performed to determine sample sizes for which survival distributions can distinguish data generated by their own families. Two methods of decision are by maximum likelihood and by Kolmogorov distance. Neither method is uniformly best. The probability of correct selection with more than one alternative shows some surprising results when the choices are close to the exponential distribution.
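The two decision methods described, maximum likelihood and Kolmogorov distance, can be sketched with SciPy's distribution-fitting routines. The geometric extreme exponential is not available in SciPy, so the lognormal stands in as the third candidate family here, and the simulated lifetimes are illustrative rather than the paper's design:

```python
import numpy as np
from scipy import stats

# Simulated lifetimes from a Weibull parent (illustrative sample size and shape).
data = stats.weibull_min.rvs(1.8, scale=2.0, size=500,
                             random_state=np.random.default_rng(0))

candidates = {"weibull": stats.weibull_min, "gamma": stats.gamma,
              "lognorm": stats.lognorm}

loglik, ksdist = {}, {}
for name, dist in candidates.items():
    params = dist.fit(data, floc=0)              # lifetime data: location fixed at 0
    loglik[name] = np.sum(dist.logpdf(data, *params))
    ksdist[name] = stats.kstest(data, dist.cdf, args=params).statistic

best_ml = max(loglik, key=loglik.get)            # method 1: maximum likelihood
best_ks = min(ksdist, key=ksdist.get)            # method 2: Kolmogorov distance
```

Repeating this over many simulated samples and tabulating how often each criterion picks the generating family reproduces the kind of correct-selection probabilities the study reports.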

A. W. Marshall; J. C. Meza; I. Olkin

1999-05-01

346

NASA Astrophysics Data System (ADS)

Tests and calibration of sprayers have been considered a very important task for reducing chemical use in agriculture and for improving plant phytosanitary protection. A reliable, affordable and easy-to-use method to observe the distribution in the field is required, and infrared thermoimage analysis can be considered a potential method based on non-contact imaging technologies. The basic idea is that the application of water colder (10 °C less) than the leaf surface makes it possible to distinguish and measure the targeted areas by means of an infrared thermoimage analysis based on significant and time-persistent thermal differences. Trials were carried out on a hedge of Prunus laurocerasus, 2.1 m in height with a homogeneous canopy. A trailed orchard sprayer was employed with different spraying configurations. A FLIR S40 thermocamera was used to acquire (at 50 Hz) thermal videos, in a fixed position, at a frame rate of 10 images/s, for nearly 3 min. Distribution quality was compared to the temperature differences (ΔT) obtained from the thermal images between pre-treatment and post-treatment, according to two analyses: the time-trend of ΔT average values for different hedge heights, and imaging of the ΔT distribution and area coverage by segmentation with k-means clustering after 30 s of spraying. The chosen spraying configuration presented quite a good distribution for the entire hedge height with the exclusion of the lower part (0-1 m from the ground) and the upper part (>1.9 m). Through the segmentation of the ΔT image by k-means clustering, it was possible to have a more detailed and visual appreciation of the distribution quality along the entire hedge. The thermoimage analysis revealed interesting potential for evaluating distribution quality from orchard sprayers.
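Segmenting a ΔT image with k-means, as in the analysis above, can be sketched with a plain 1-D k-means on pixel values. This toy version uses quantile initialization and synthetic data, and is not the study's actual processing chain:

```python
import numpy as np

def kmeans_1d(values, k=2, iters=50):
    """Plain 1-D k-means (quantile-initialised), used here to split pixel
    ΔT values into 'sprayed' (large cooling) and 'unsprayed' clusters."""
    v = np.asarray(values, dtype=float).ravel()
    centers = np.quantile(v, np.linspace(0.0, 1.0, k))
    for _ in range(iters):
        labels = np.argmin(np.abs(v[:, None] - centers[None, :]), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = v[labels == j].mean()
    return labels.reshape(np.shape(values)), centers

# Toy ΔT "image": background pixels cool by ~1 °C, sprayed pixels by ~8 °C.
dT = np.concatenate([np.full(50, 1.0), np.full(50, 8.0)]).reshape(10, 10)
labels, centers = kmeans_1d(dT, k=2)
```

The fraction of pixels in the colder cluster then serves as the area-coverage estimate for the sprayed target.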

Menesatti, P.; Biocca, M.

2007-09-01

347

NASA Astrophysics Data System (ADS)

Reasonable prediction of landslide occurrences in a given area requires the choice of an appropriate probability distribution of recurrence time intervals. Although landslides are widespread and frequent in many parts of the world, complete databases of landslide occurrences over large periods are missing and often such natural disasters are treated as processes uncorrelated in time and, therefore, Poisson distributed. In this paper, we examine the recurrence time statistics of landslide events simulated by a cellular automaton model that reproduces well the actual frequency-size statistics of landslide catalogues. The complex time series are analysed by varying both the threshold above which the time between events is recorded and the values of the key model parameters. The synthetic recurrence time probability distribution is shown to be strongly dependent on the rate at which instability is approached, providing a smooth crossover from a power-law regime to a Weibull regime. Moreover, a Fano factor analysis shows a clear indication of different degrees of correlation in landslide time series. Such a finding supports, at least in part, a recent analysis performed for the first time of an historical landslide time series over a time window of fifty years.

Piegari, E.; Di Maio, R.; Avella, A.

2013-12-01

348

Spatial Latent Class Analysis Model for Spatially Distributed Multivariate Binary Data

A spatial latent class analysis model that extends the classic latent class analysis model by adding spatial structure to the latent class distribution through the use of the multinomial probit model is introduced. Linear combinations of independent Gaussian spatial processes are used to develop multivariate spatial processes that are underlying the categorical latent classes. This allows the latent class membership to be correlated across spatially distributed sites and it allows correlation between the probabilities of particular types of classes at any one site. The number of latent classes is assumed fixed but is chosen by model comparison via cross-validation. An application of the spatial latent class analysis model is shown using soil pollution samples where 8 heavy metals were measured to be above or below government pollution limits across a 25 square kilometer region. Estimation is performed within a Bayesian framework using MCMC and is implemented using the OpenBUGS software. PMID:20161235

Wall, Melanie M.; Liu, Xuan

2009-01-01

349

Work was done to study a hydride-dehydride method for producing uranium metal powder. Particle distribution analysis was conducted using digital microscopy and grayscale image analysis software. The particle size was found to be predominantly...

Sames, William

2011-08-08

350

Distribution characteristics analysis of sites around Taihu Lake basin based on DEM and ETM+

NASA Astrophysics Data System (ADS)

Many ancient sites from different cultural periods are distributed across the Taihu Lake basin. The natural environment is the material foundation on which mankind relies for existence and development, so it is necessary to understand the natural environment of the time in order to understand the culture of a site. This paper analyzed the morphological characteristics and the temporal and spatial distribution of ancient sites using GIS spatial analysis methods based on remote sensing imagery and DEM data. Several terrain indexes were selected as evaluation factors, including altitude, slope, aspect, slope shape and surface rolling, and were used to analyze the relation between the spatial locations of ancient sites and the natural environment. The research shows, through terrain and hydrological analysis, that the spatial locations of ancient sites have a close relationship with the natural environment. In particular, the water distribution plays an important role in constraining the distribution of ancient sites, while the ancient inhabitants showed a capacity to adapt to the environment. Throughout the development of the ancient sites, the inhabitants and nature depended on each other for existence.

Yu, Lijun; Nie, Yuping; Zhang, Yan

2012-01-01

351

Spatial sensitivity analysis of snow cover data in a distributed rainfall-runoff model

NASA Astrophysics Data System (ADS)

As the availability of spatially distributed data sets for distributed rainfall-runoff modelling is strongly growing, more attention should be paid to the influence of the quality of the data on the calibration. While a lot of progress has been made on using distributed data in simulations of hydrological models, sensitivity of spatial data with respect to model results is not well understood. In this paper we develop a spatial sensitivity analysis (SA) method for snow cover fraction input data (SCF) for a distributed rainfall-runoff model to investigate if the model is differently subjected to SCF uncertainty in different zones of the model. The analysis was focused on the relation between the SCF sensitivity and the physical, spatial parameters and processes of a distributed rainfall-runoff model. The methodology is tested for the Biebrza River catchment, Poland for which a distributed WetSpa model is setup to simulate two years of daily runoff. The SA uses the Latin-Hypercube One-factor-At-a-Time (LH-OAT) algorithm, which uses different response functions for each 4 km × 4 km snow zone. The results show that the spatial patterns of sensitivity can be easily interpreted by co-occurrence of different environmental factors such as: geomorphology, soil texture, land-use, precipitation and temperature. Moreover, the spatial pattern of sensitivity under different response functions is related to different spatial parameters and physical processes. The results clearly show that the LH-OAT algorithm is suitable for the spatial sensitivity analysis approach and that the SCF is spatially sensitive in the WetSpa model.

Berezowski, T.; Nossent, J.; Chormański, J.; Batelaan, O.

2014-10-01

352

The antimicrobial effect of oregano (Origanum vulgare L.) and lemongrass (Cymbopogon citratus (DC.) Stapf.) essential oils (EOs) against Salmonella enterica serotype Enteritidis was evaluated in in vitro experiments and in inoculated ground bovine meat during refrigerated storage (4±2 °C) for 6 days. The Weibull model was tested to fit survival/inactivation bacterial curves (estimating the p and δ parameters). The minimum inhibitory concentration (MIC) value for both EOs on S. Enteritidis was 3.90 µl/ml. The EO concentrations applied in the ground beef were 3.90, 7.80 and 15.60 µl/g, based on MIC levels and possible activity reduction by food constituents. Both evaluated EOs, at all tested levels, showed antimicrobial effects, with microbial populations reducing (p ≤ 0.05) along storage time. Evaluating fit-quality parameters (RSS and RSE), Weibull models are able to describe the inactivation curves of EOs against S. Enteritidis. The application of EOs in processed meats can be used to control pathogens during refrigerated shelf-life. PMID:23273476
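Assuming the common Mafart parameterization of the Weibull survival model, log10 N(t)/N0 = -(t/δ)^p, the curve fitting described above can be sketched with SciPy. The storage-time data below are synthetic, not the study's counts:

```python
import numpy as np
from scipy.optimize import curve_fit

def weibull_log_survival(t, delta, p):
    """Weibull (Mafart) inactivation model: log10 N(t)/N0 = -(t/delta)**p."""
    return -(t / delta) ** p

# Illustrative storage times (days) and log10 reductions; not the study's values.
t = np.array([0.0, 1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
logred = weibull_log_survival(t, 4.0, 1.5)

(delta, p), _ = curve_fit(weibull_log_survival, t, logred,
                          p0=(1.0, 1.0), bounds=((1e-6, 1e-6), (np.inf, np.inf)))
```

Here δ is the time for the first decimal reduction and p controls curve shape (p > 1 gives a shoulder, p < 1 tailing); residual sums of squares from fits like this underlie the RSS/RSE fit-quality comparison mentioned in the abstract.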

de Oliveira, Thales Leandro Coutinho; Soares, Rodrigo de Araújo; Piccoli, Roberta Hilsdorf

2013-03-01

353

Distributed hot-wire anemometry based on Brillouin optical time-domain analysis.

A distributed hot-wire anemometer based on Brillouin optical time-domain analysis is presented. The anemometer is created by passing a current through a stainless steel tube fibre bundle and monitoring Brillouin frequency changes in the presence of airflow. A wind tunnel is used to provide laminar airflow while the device response is calibrated against theoretical models. The sensitivity equation for this anemometer is derived and discussed. Airspeeds from 0 m/s to 10 m/s are examined, and the results show that a Brillouin scattering based distributed hot-wire anemometer is feasible. PMID:22772259

Wylie, Michael T V; Brown, Anthony W; Colpitts, Bruce G

2012-07-01

354

Global QCD Analysis of Parton Structure of the Nucleon: CTEQ5 Parton Distributions

An up-to-date global QCD analysis of high energy lepton-hadron and hadron-hadron interactions is performed to better determine the gluon and quark parton distributions in the nucleon. Improved experimental data on inclusive jet production, in conjunction with precise deep inelastic scattering data, place good constraints on the gluon over a wide range of x; while new data on asymmetries in Drell-Yan processes contribute to better determine the d/u ratio. Comparisons with results of other recent global analyses are made, and the differences are described. Open issues and the general problem of determining the uncertainties of parton distributions are discussed.

H. L. Lai; J. Huston; S. Kuhlmann; J. Morfin; F. Olness; J. F. Owens; J. Pumplin; W. K. Tung

1999-03-08

355

NASA Astrophysics Data System (ADS)

Statistical and fractal properties of the spatial distribution of earthquakes in the central zone of Chile are studied. In particular, data are shown to behave according to the well-known Gutenberg-Richter law. The fractal structure is evident for epicenters, not for hypocenters. The multifractal spectrum is also determined, both for the spatial distribution of epicenters and hypocenters. For negative values of the index of multifractal measure q, the multifractal spectrum, which usually cannot be reliably found from data, is calculated from a generalized Cantor-set model, which fits the multifractal spectrum for q>0, a technique which has been previously applied for analysis of solar wind data.
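A standard way to quantify the Gutenberg-Richter behavior noted above is Aki's maximum-likelihood b-value estimator. This sketch uses a synthetic catalogue with an assumed completeness magnitude, not the Chilean data:

```python
import numpy as np

def b_value(magnitudes, m_min):
    """Aki's maximum-likelihood estimate of the Gutenberg-Richter b-value,
    log10 N(>=M) = a - b*M, for continuous magnitudes M >= m_min."""
    m = np.asarray(magnitudes)
    m = m[m >= m_min]
    return np.log10(np.e) / (m.mean() - m_min)

# Synthetic catalogue: exponential magnitude excesses with mean log10(e)
# correspond to b = 1 in expectation.
rng = np.random.default_rng(42)
mags = 2.0 + rng.exponential(scale=np.log10(np.e), size=20_000)
b = b_value(mags, 2.0)
```

The choice of `m_min` (catalogue completeness) strongly affects the estimate, so it is normally checked against the frequency-magnitude histogram first.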

Pastén, Denisse; Muñoz, Víctor; Cisternas, Armando; Rogan, José; Valdivia, Juan Alejandro

2011-12-01

356

NASA Technical Reports Server (NTRS)

Random numbers were generated with the aid of a digital computer and transformed such that the probability density function of a discrete random load history composed of these random numbers had one of the following non-Gaussian distributions: Poisson, binomial, log-normal, Weibull, and exponential. The resulting random load histories were analyzed to determine their peak statistics and were compared with cumulative peak maneuver-load distributions for fighter and transport aircraft in flight.
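The transformation of uniform pseudo-random numbers into a prescribed non-Gaussian distribution can be done by inverse-CDF sampling. A sketch for the Weibull case, with illustrative parameters not taken from the report:

```python
import numpy as np

def weibull_load_history(n, shape, scale, seed=0):
    """Transform uniform random numbers into a Weibull-distributed load
    history via the inverse CDF: x = scale * (-ln(1 - u))**(1/shape)."""
    u = np.random.default_rng(seed).uniform(size=n)
    return scale * (-np.log1p(-u)) ** (1.0 / shape)

# Illustrative parameters; peak statistics would then be extracted from `loads`.
loads = weibull_load_history(100_000, shape=2.0, scale=10.0)
```

The same inverse-CDF idea covers the other distributions named in the abstract (exponential, log-normal), while discrete laws such as Poisson or binomial use the discrete inverse transform.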

Leybold, H. A.

1971-01-01

357

Reliability Analysis of Uniaxially Ground Brittle Materials

NASA Technical Reports Server (NTRS)

The fast fracture strength distribution of uniaxially ground, alpha silicon carbide was investigated as a function of grinding angle relative to the principal stress direction in flexure. Both as-ground and ground/annealed surfaces were investigated. The resulting flexural strength distributions were used to verify reliability models and predict the strength distribution of larger plate specimens tested in biaxial flexure. Complete fractography was done on the specimens. Failures occurred from agglomerates, machining cracks, or hybrid flaws that consisted of a machining crack located at a processing agglomerate. Annealing eliminated failures due to machining damage. Reliability analyses were performed using two- and three-parameter Weibull and Batdorf methodologies. The Weibull size effect was demonstrated for machining flaws. Mixed mode reliability models reasonably predicted the strength distributions of uniaxial flexure and biaxial plate specimens.
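The two-parameter Weibull fitting underlying such reliability analyses can be sketched with SciPy's maximum-likelihood fit. The strength values below are simulated, not the study's silicon carbide data:

```python
import numpy as np
from scipy import stats

# Simulated fast-fracture strengths (MPa): Weibull modulus 10, scale 400 MPa.
# Values are illustrative, not the silicon carbide measurements of the study.
strengths = stats.weibull_min.rvs(10.0, scale=400.0, size=300,
                                  random_state=np.random.default_rng(1))

# Two-parameter Weibull MLE: the threshold (location) parameter is fixed at zero,
# which is the defining assumption of the two-parameter form.
m_hat, loc, scale_hat = stats.weibull_min.fit(strengths, floc=0)
```

A three-parameter fit would instead estimate the location (threshold stress) as well, at the cost of the bias and invariance issues discussed in the abstract at the head of this listing.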

Salem, Jonathan A.; Nemeth, Noel N.; Powers, Lynn M.; Choi, Sung R.

1995-01-01

358

Strain analysis from objects with a random distribution: A generalized center-to-center method

NASA Astrophysics Data System (ADS)

Existing methods of strain analysis such as the center-to-center method and the Fry method estimate strain from the spatial relationship between point objects in the deformed state. They assume a truncated Poisson distribution of point objects in the pre-deformed state. Significant deviations occur in nature and diffuse the central vacancy in a Fry plot, limiting its effectiveness as a strain gauge. Therefore, a generalized center-to-center method is proposed to deal with point objects with the more general Poisson distribution, where the method outcomes do not depend on an analysis of a graphical central vacancy. This new method relies upon the probability mass function for the Poisson distribution, and adopts the maximum likelihood method to solve for strain. The feasibility of the method is demonstrated by applying it to artificial data sets generated for known strains. Further analysis of these sets by use of the bootstrap method shows that the accuracy of the strain estimate has a strong tendency to increase either with point number or with the inclusion of more pre-deformation nearest neighbors. A poorly sorted, well packed, deformed conglomerate is analyzed, yielding a strain estimate similar to the vector mean of the major axis directions of pebbles and the harmonic mean of their axial ratios from a shape-based strain determination method. These outcomes support the applicability of the new method to the analysis of deformed rocks with appropriate strain markers.

Shan, Yehua; Liang, Xinquan

2014-03-01

359

SpatTrack: An Imaging Toolbox for Analysis of Vesicle Motility and Distribution in Living Cells.

The endocytic pathway is a complex network of highly dynamic organelles, which has been traditionally studied by quantitative fluorescence microscopy. The data generated by this method can be overwhelming and its analysis, even for the skilled microscopist, is tedious and error-prone. We developed SpatTrack, an open source, platform-independent program collecting a variety of methods for analysis of vesicle dynamics and distribution in living cells. SpatTrack performs 2D particle tracking, trajectory analysis and fitting of diffusion models to the calculated mean square displacement. It allows for spatial analysis of detected vesicle patterns including calculation of the radial distribution function and particle-based colocalization. Importantly, all analysis tools are supported by Monte Carlo simulations of synthetic images. This allows the user to assess the reliability of the analysis and to study alternative scenarios. We demonstrate the functionality of SpatTrack by performing a detailed imaging study of internalized fluorescence-tagged Niemann Pick C2 (NPC2) protein in human disease fibroblasts. Using SpatTrack, we show that NPC2 rescued the cholesterol-storage phenotype from a subpopulation of late endosomes/lysosomes (LE/LYSs). This was paralleled by repositioning and active transport of NPC2-containing vesicles to the cell surface. The potential of SpatTrack for other applications in intracellular transport studies will be discussed. PMID:25243614

Lund, Frederik W; Jensen, Maria Louise V; Christensen, Tanja; Nielsen, Gitte K; Heegaard, Christian W; Wüstner, Daniel

2014-12-01

360

Validation results of the IAG Dancer project for distributed GPS analysis

NASA Astrophysics Data System (ADS)

The number of permanent GPS stations in the world has grown far too large to allow processing of all this data at analysis centers. The majority of these GPS sites do not even make their observation data available to the analysis centers, for various valid reasons. The current ITRF solution is still based on centralized analysis by the IGS, and subsequent densification of the reference frame via regional network solutions. Minor inconsistencies in analysis methods, software systems and data quality imply that this centralized approach is unlikely to ever reach the ambitious accuracy objectives of GGOS. The dependence on published data also makes it clear that a centralized approach will never provide a true global ITRF solution for all GNSS receivers in the world. If the data does not come to the analysis, the only alternative is to bring the analysis to the data. The IAG Dancer project has implemented a distributed GNSS analysis system on the internet in which each receiver can have its own analysis center in the form of a freely distributed JAVA peer-to-peer application. Global parameters for satellite orbits, clocks and polar motion are solved via a distributed least squares solution among all participating receivers. A Dancer instance can run on any computer that has simultaneous access to the receiver data and to the public internet. In the future, such a process may be embedded in the receiver firmware directly. GPS network operators can join the Dancer ITRF realization without having to publish their observation data or estimation products. GPS users can run a Dancer process without contributing to the global solution, to have direct access to the ITRF in near real-time. The Dancer software has been tested on-line since late 2011. A global network of processes has gradually evolved to allow stabilization and tuning of the software in order to reach a fully operational system. 
This presentation reports on the current performance of the Dancer system, and demonstrates the obvious benefits of distributed analysis of geodetic data in general.

Boomkamp, H.

2012-12-01

361

We report on the existing connection between power-law distributions and allometries. As it was first reported in [PLoS ONE 7, e40393 (2012)] for the relationship between homicides and population, when these urban indicators present asymptotic power-law distributions, they can also display specific allometries among themselves. Here, we present an extensive characterization of this connection when considering all possible pairs of relationships from twelve urban indicators of Brazilian cities (such as child labour, illiteracy, income, sanitation and unemployment). Our analysis reveals that all our urban indicators are asymptotically distributed as power laws and that the proposed connection also holds for our data when the allometric relationship displays enough correlations. We have also found that not all allometric relationships are independent and that they can be understood as a consequence of the allometric relationship between the urban indicator and the population size. We further show that the residua...

Alves, Luiz G A; Lenzi, Ervin K; Mendes, Renio S

2014-01-01

362

NASA Astrophysics Data System (ADS)

We investigate the distribution of work performed on a Brownian particle in a time-dependent asymmetric potential well. The potential has a harmonic component with a time-dependent force constant and a time-independent logarithmic barrier at the origin. For an arbitrary driving protocol, the problem of solving the Fokker-Planck equation for the joint probability density of work and particle position is reduced to the solution of the Riccati differential equation. For a particular choice of the driving protocol, an exact solution of the Riccati equation is presented. An asymptotic analysis of the resulting expression yields the tail behavior of the work distribution for small and large work values. In the limit of a vanishing logarithmic barrier, the work distribution for the breathing parabola model is obtained.

Ryabov, Artem; Dierl, Marcel; Chvosta, Petr; Einax, Mario; Maass, Philipp

2013-02-01

363

Risk analysis of highly combustible gas storage, supply, and distribution systems in PWR plants

This report presents the evaluation of the potential safety concerns for pressurized water reactors (PWRs) identified in Generic Safety Issue 106, Piping and the Use of Highly Combustible Gases in Vital Areas. A Westinghouse four-loop PWR plant was analyzed for the risk due to the use of combustible gases (predominantly hydrogen) within the plant. The analysis evaluated an actual hydrogen distribution configuration and conducted several sensitivity studies to determine the potential variability among PWRs. The sensitivity studies were based on hydrogen and safety-related equipment configurations observed at other PWRs within the United States. Several options for improving the hydrogen distribution system design were identified and evaluated for their effect on risk and core damage frequency. A cost/benefit analysis was performed to determine whether alternatives considered were justifiable based on the safety improvement and economics of each possible improvement.

Simion, G.P. [Science Applications International Corp., Albuquerque, NM (United States); VanHorn, R.L.; Smith, C.L.; Bickel, J.H.; Sattison, M.B. [EG and G Idaho, Inc., Idaho Falls, ID (United States); Bulmahn, K.D. [SCIENTECH, Inc., Idaho Falls, ID (United States)

1993-06-01

364

Medical terminologies and ontologies are important tools for natural language processing of health record narratives. To account for the variability of language use, synonyms need to be stored in a semantic resource as textual instantiations of a concept. Developing such resources manually is, however, prohibitively expensive and likely to result in low coverage. To facilitate and expedite the process of lexical resource development, distributional analysis of large corpora provides a powerful data-driven means of (semi-)automatically identifying semantic relations, including synonymy, between terms. In this paper, we demonstrate how distributional analysis of a large corpus of electronic health records – the MIMIC-II database – can be employed to extract synonyms of SNOMED CT preferred terms. A distinctive feature of our method is its ability to identify synonymous relations between terms of varying length. PMID:24551362

Henriksson, Aron; Conway, Mike; Duneld, Martin; Chapman, Wendy W.

2013-01-01

365

Passive-scheme analysis for solving the untrusted source problem in quantum key distribution

As a practical method, the passive scheme is useful to monitor the photon statistics of an untrusted source in a 'Plug and Play' quantum key distribution (QKD) system. In a passive scheme, three kinds of monitor mode can be adopted: average photon number (APN) monitor, photon number analyzer (PNA), and photon number distribution (PND) monitor. In this paper, the security analysis is rigorously given for the APN monitor, while for the PNA, the analysis, including statistical fluctuation and random noise, is addressed with a confidence level. The results show that the PNA can achieve better performance than the APN monitor and can asymptotically approach the theoretical limit of the PND monitor. Also, the passive scheme with the PNA works efficiently when the signal-to-noise ratio (R^SN) is not too low and so is highly applicable to solve the untrusted source problem in the QKD system.

Peng Xiang; Xu Bingjie; Guo Hong [CREAM Group, State Key Laboratory of Advanced Optical Communication Systems and Networks (Peking University) and Institute of Quantum Electronics, School of Electronics Engineering and Computer Science, Peking University, Beijing 100871 (China)

2010-04-15

366

Many forms of security analysis on large scale applications can be substantially automated but the size and complexity can exceed the time and memory available on conventional desktop computers. Most commercial tools are understandably focused on such conventional desktop resources. This paper presents research work on the parallelization of security analysis of both source code and binaries within our Compass tool, which is implemented using the ROSE source-to-source open compiler infrastructure. We have focused on both shared and distributed memory parallelization of the evaluation of rules implemented as checkers for a wide range of secure programming rules, applicable to desktop machines, networks of workstations and dedicated clusters. While Compass as a tool focuses on source code analysis and reports violations of an extensible set of rules, the binary analysis work uses the exact same infrastructure but is less well developed into an equivalent final tool.

Quinlan, D; Barany, G; Panas, T

2007-08-30

367

Some physics and system issues in the security analysis of quantum key distribution protocols

NASA Astrophysics Data System (ADS)

In this paper, we review a number of issues on the security of quantum key distribution (QKD) protocols that bear directly on the relevant physics or mathematical representation of the QKD cryptosystem. It is shown that the cryptosystem representation itself may miss out many possible attacks, which are not accounted for in the security analysis and proofs. Hence, the final security claims drawn from such analysis are not reliable, apart from foundational issues about the security criteria that are discussed elsewhere. The cases of continuous-variable QKD and multi-photon sources are elaborated upon.

Yuen, Horace P.

2014-10-01

368

Some Physics And System Issues In The Security Analysis Of Quantum Key Distribution Protocols

In this paper we review a number of issues on the security of quantum key distribution (QKD) protocols that bear directly on the relevant physics or mathematical representation of the QKD cryptosystem. It is shown that the cryptosystem representation itself may miss out many possible attacks which are not accounted for in the security analysis and proofs. Hence the final security claims drawn from such analysis are not reliable, apart from foundational issues about the security criteria that are discussed elsewhere. The cases of continuous-variable QKD and multi-photon sources are elaborated upon.

Horace P. Yuen

2014-05-07

369

Single-phase power distribution system power flow and fault analysis

NASA Technical Reports Server (NTRS)

Alternative methods for power flow and fault analysis of single-phase distribution systems are presented. The algorithms for both power flow and fault analysis utilize a generalized approach to network modeling. The generalized admittance matrix, formed using elements of linear graph theory, is an accurate network model for all possible single-phase network configurations. Unlike the standard nodal admittance matrix formulation algorithms, the generalized approach uses generalized component models for the transmission line and transformer. The standard assumption of a common node voltage reference point is not required to construct the generalized admittance matrix. Therefore, truly accurate simulation results can be obtained for networks that cannot be modeled using traditional techniques.
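The admittance-matrix assembly underlying both the standard and generalized formulations can be sketched from branch data; the feeder topology and impedances below are illustrative, not from the paper:

```python
import numpy as np

# Branch list for a small radial single-phase feeder:
# (from_bus, to_bus, series admittance), buses 0-indexed.
lines = [
    (0, 1, 1 / (0.01 + 0.05j)),
    (1, 2, 1 / (0.02 + 0.08j)),
]
n_bus = 3

# Standard admittance-matrix stamping: each branch adds its admittance to the
# two diagonal entries and subtracts it from the two off-diagonal entries.
Y = np.zeros((n_bus, n_bus), dtype=complex)
for i, j, y in lines:
    Y[i, i] += y
    Y[j, j] += y
    Y[i, j] -= y
    Y[j, i] -= y

# With no shunt elements every row sums to zero, so Y is singular until a
# voltage-reference constraint is imposed -- the assumption the generalized
# formulation in the paper is able to relax.
print(np.allclose(Y.sum(axis=1), 0))
```

Power flow then solves I = Y V together with the bus injection constraints; fault analysis reuses the same matrix with the faulted-node modification.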

Halpin, S. M.; Grigsby, L. L.

1992-01-01

370

Degradation by Joule heating is one of the important issues for realizing systems on a panel. We have investigated the thermal distribution of low-temperature polycrystalline silicon (poly-Si) p-channel thin-film transistors (TFTs) using an infrared imaging system. In order to design optimum circuits in which Joule heating is taken into account, a systematic analysis of Joule heating focusing on gate size

Shinichiro Hashimoto; Yukiharu Uraoka; Takashi Fuyuki; Yukihiro Morita

2006-01-01

371

To identify genetic loci influencing central obesity and fat distribution, we performed a meta-analysis of 16 genome-wide association studies (GWAS, N = 38,580) informative for adult waist circumference (WC) and waist-hip ratio (WHR). We selected 26 SNPs for follow-up, for which the evidence of association with measures of central adiposity (WC and/or WHR) was strong and disproportionate to that for

Cecilia M. Lindgren; Iris M. Heid; Joshua C. Randall; Claudia Lamina; Suzannah J. Bumpstead; Stephen J. Chanock; Lynn Cherkas; Cyrus Cooper; Angela Doering; Anna Dominiczak; Alex S. F. Doney; Paul Elliott; Michael R. Erdos; Karol Estrada; Luigi Ferrucci; Guido Fischer; Nita G. Forouhi; Christian Gieger; Harald Grallert; Christopher J. Groves; Scott Grundy; David Hadley; Aki S. Havulinna; Albert Hofman; Rolf Holle; John W. Holloway; Thomas Illig; Bo Isomaa; Leonie C. Jacobs; Karen Jameson; Pekka Jousilahti; Johanna Kuusisto; G. Mark Lathrop; Debbie A. Lawlor; Massimo Mangino; Wendy L. McArdle; Thomas Meitinger; Mario A. Morken; Andrew P. Morris; Patricia Munroe; Anna Nordstrom; Peter Nordstrom; Ben A. Oostra; Colin N. A. Palmer; John F. Peden; Inga Prokopenko; Frida Renstrom; Aimo Ruokonen; Manjinder S. Sandhu; Laura J. Scott; Angelo Scuteri; Heather M. Stringham; Amy J. Swift; Manuela Uda; Peter Vollenweider; Gerard Waeber; Chris Wallace; G. Bragi Walters; Michael N. Weedon; Jacqueline C. M. Witteman; Cuilin Zhang; Weihua Zhang; Mark J. Caulfield

2009-01-01

372

Stability analysis of a discrete Hutchinson equation with discrete and distributed delay

NASA Astrophysics Data System (ADS)

In this paper a Hutchinson equation with discrete and distributed delay is discretized by the Euler method. The dynamics of the obtained discrete system is then investigated. Specifically the stability of the positive fixed point is analyzed. It is found that for sufficiently small time-step of integration, the positive equilibrium undergoes a Neimark-Sacker bifurcation which is controlled by the discrete time delay. The results of analysis are then confirmed by some numerical simulations.
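The Euler discretization described above can be sketched as follows (discrete delay only; the parameter values are illustrative and the distributed-delay term is omitted):

```python
import numpy as np

# Hutchinson (delayed logistic) equation  N'(t) = r N(t) (1 - N(t - tau)/K),
# discretized by the explicit Euler method with step h.
r, K, tau, h = 0.5, 1.0, 1.0, 0.01
m = int(round(tau / h))       # delay expressed in time steps
steps = 5000

N = np.empty(steps + 1)
N[: m + 1] = 0.5              # constant initial history on [-tau, 0]
for k in range(m, steps):
    N[k + 1] = N[k] + h * r * N[k] * (1.0 - N[k - m] / K)

# For r*tau = 0.5 (below the continuous-time threshold pi/2) the positive
# equilibrium N = K is stable, so the trajectory settles at the carrying capacity.
print(N[-1])
```

Pushing the delay or the time-step past the stability boundary is where the abstract reports the Neimark-Sacker bifurcation of the discrete system.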

Suryanto, A.; Yanti, I.; Kusumawinahyu, W. M.

2014-02-01

373

Our purpose in this study was to implement three-dimensional (3D) gamma analysis for structures of interest such as the planning target volume (PTV) or clinical target volume (CTV), and organs at risk (OARs) for intensity-modulated radiation therapy (IMRT) dose verification. IMRT dose distributions for prostate and head and neck (HN) cancer patients were calculated with an analytical anisotropic algorithm in an Eclipse (Varian Medical Systems) treatment planning system (TPS) and by Monte Carlo (MC) simulation. The MC dose distributions were calculated with EGSnrc/BEAMnrc and DOSXYZnrc user codes under conditions identical to those for the TPS. The prescribed doses were 76 Gy/38 fractions with five-field IMRT for the prostate and 33 Gy/17 fractions with seven-field IMRT for the HN. TPS dose distributions were verified by the gamma passing rates for the whole calculated volume, PTV or CTV, and OARs by use of 3D gamma analysis with reference to MC dose distributions. The acceptance criteria for the 3D gamma analysis were 3 %/3 mm and 2 %/2 mm for the dose difference and distance to agreement. The gamma passing rates in PTV and OARs for the prostate IMRT plan were close to 100 %. For the HN IMRT plan, the passing rates of 2 %/2 mm in CTV and OARs were substantially lower because inhomogeneous tissues such as bone and air in the HN are included in the calculation area. 3D gamma analysis for individual structures is useful for IMRT dose verification. PMID:24796955
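A gamma evaluation of the kind used here can be sketched in one dimension (the formula is identical in 3D; the dose profiles below are synthetic, not patient data):

```python
import numpy as np

x = np.linspace(0.0, 50.0, 101)                  # positions in mm (0.5 mm grid)
ref = 100 * np.exp(-((x - 25.0) / 10.0) ** 2)    # reference (e.g. MC) dose
ev = 100 * np.exp(-((x - 25.5) / 10.0) ** 2)     # evaluated (e.g. TPS) dose

DD, DTA = 3.0, 3.0                               # 3 %/3 mm criteria
dd = DD / 100.0 * ref.max()                      # global dose criterion

# gamma(eval point) = min over reference points of
#   sqrt((distance/DTA)^2 + (dose difference/dd)^2); a point passes if gamma <= 1
dist2 = ((x[:, None] - x[None, :]) / DTA) ** 2
dose2 = ((ev[:, None] - ref[None, :]) / dd) ** 2
gamma = np.sqrt((dist2 + dose2).min(axis=1))

passing_rate = 100.0 * np.mean(gamma <= 1.0)
print(passing_rate)
```

A half-millimetre spatial shift passes easily at 3 %/3 mm; tightening to 2 %/2 mm or adding steep heterogeneity-driven gradients lowers the rate, consistent with what the abstract reports for the head-and-neck plan.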

Tomiyama, Yuuki; Araki, Fujio; Oono, Takeshi; Hioki, Kazunari

2014-07-01

374

A thermal conductivity model of nanofluids based on particle size distribution analysis

NASA Astrophysics Data System (ADS)

A model for predicting the thermal conductivity of nanofluids is proposed, in which the influence of nanoparticle clusters has been considered through the analysis of particle size distribution (PSD). The relation between PSD and thermal conductivity of nanofluids is established under the assumption that the nanoparticles in clusters aggregate with each other closely. A good agreement is achieved between the present model predictions and the experimental results in the literature.

Zhou, Dengqing; Wu, Huiying

2014-08-01

375

To identify genetic loci influencing central obesity and fat distribution, we performed a meta-analysis of 16 genome-wide association studies (GWAS, N = 38,580) informative for adult waist circumference (WC) and waist-hip ratio (WHR). We selected 26 SNPs for follow-up, for which the evidence of association with measures of central adiposity (WC and/or WHR) was strong and disproportionate to that for

Cecilia M. Lindgren; Iris M. Heid; Joshua C. Randall; Claudia Lamina; Valgerdur Steinthorsdottir; Lu Qi; Elizabeth K. Speliotes; Gudmar Thorleifsson; Cristen J. Willer; Blanca M. Herrera; Anne U. Jackson; Noha Lim; Paul Scheet; Nicole Soranzo; Najaf Amin; Yurii S. Aulchenko; John C. Chambers; Alexander Drong; Jianan Luan; Helen N. Lyon; Fernando Rivadeneira; Serena Sanna; Nicholas J. Timpson; M. Carola Zillikens; Jing Hua Zhao; Peter Almgren; Stefania Bandinelli; Amanda J. Bennett; Richard N. Bergman; Lori L. Bonnycastle; Suzannah J. Bumpstead; Stephen J. Chanock; Lynn Cherkas; Peter Chines; Lachlan Coin; Cyrus Cooper; Gabriel Crawford; Angela Doering; Anna Dominiczak; Alex S. F. Doney; Shah Ebrahim; Paul Elliott; Michael R. Erdos; Karol Estrada; Luigi Ferrucci; Guido Fischer; Nita G. Forouhi; Christian Gieger; Harald Grallert; Christopher J. Groves; Scott Grundy; Candace Guiducci; David Hadley; Anders Hamsten; Aki S. Havulinna; Albert Hofman; Rolf Holle; John W. Holloway; Thomas Illig; Bo Isomaa; Leonie C. Jacobs; Karen Jameson; Pekka Jousilahti; Fredrik Karpe; Johanna Kuusisto; Jaana Laitinen; G. Mark Lathrop; Debbie A. Lawlor; Massimo Mangino; Wendy L. McArdle; Thomas Meitinger; Mario A. Morken; Andrew P. Morris; Patricia Munroe; Narisu Narisu; Anna Nordström; Peter Nordström; Ben A. Oostra; Colin N. A. Palmer; Felicity Payne; John F. Peden; Inga Prokopenko; Frida Renström; Aimo Ruokonen; Veikko Salomaa; Manjinder S. Sandhu; Laura J. Scott; Angelo Scuteri; Kaisa Silander; Kijoung Song; Xin Yuan; Heather M. Stringham; Amy J. Swift; Tiinamaija Tuomi; Manuela Uda; Peter Vollenweider; Gerard Waeber; Chris Wallace; G. Bragi Walters; Michael N. Weedon; Jacqueline C. M. Witteman; Cuilin Zhang; Weihua Zhang; Mark J. Caulfield; Francis S. Collins; George Davey Smith; Ian N. M. Day; Paul W. Franks; Andrew T. Hattersley; Frank B. Hu; Marjo-Riitta Jarvelin; Augustine Kong; Jaspal S. Kooner; Markku Laakso; Edward Lakatta; Vincent Mooser; Andrew D. 
Morris; Leena Peltonen; Nilesh J. Samani; Timothy D. Spector; David P. Strachan; Toshiko Tanaka; Jaakko Tuomilehto; André G. Uitterlinden; Cornelia M. van Duijn; Nicholas J. Wareham; Hugh Watkins for the PROCARDIS consortia; Dawn M. Waterworth; Michael Boehnke; Panos Deloukas; Leif Groop; David J. Hunter; Unnur Thorsteinsdottir; David Schlessinger; H.-Erich Wichmann; Timothy M. Frayling; Gonçalo R. Abecasis; Joel N. Hirschhorn; Ruth J. F. Loos; Kari Stefansson; Karen L. Mohlke; Inês Barroso

2009-01-01

376

Various strengthening techniques for structural elements using different materials have been investigated. Recently, a new, reliable and cost-effective strengthening technique with distributed prestressed high strength steel wire rope (P-SWR technique) was proposed. This paper mainly focuses on theoretical analysis of the flexural behaviour of reinforced concrete (RC) beams strengthened with the P-SWR strengthening technique. First, mechanical properties of steel wire

Gang Wu; Zhishen Wu; Yang Wei; Jianbiao Jiang; Yi Cui

2012-01-01

377

Time-cost analysis of a quantum key distribution system clocked at 100 MHz.

We describe the realization of a quantum key distribution (QKD) system clocked at 100 MHz. The system includes classical postprocessing implemented via software, and is operated over a 12 km standard telecommunication dark fiber in a real-world environment. A time-cost analysis of the sifted, error-corrected, and secret key rates relative to the raw key rate is presented, and the scalability of our implementation with respect to higher secret key rates is discussed. PMID:21935140

Mo, X F; Lucio-Martinez, I; Chan, P; Healey, C; Hosier, S; Tittel, W

2011-08-29

378

Time-cost analysis of a quantum key distribution system clocked at 100 MHz

We describe the realization of a quantum key distribution (QKD) system clocked at 100 MHz. The system includes classical postprocessing implemented via software, and is operated over a 12 km standard telecommunication dark fiber in a real-world environment. A time-cost analysis of the sifted, error-corrected, and secret key rates relative to the raw key rate is presented, and the scalability of our implementation with respect to higher secret key rates is discussed.

Mo, Xiaofan; Chan, Philip; Healey, Chris; Hosier, Steve; Tittel, Wolfgang

2011-01-01

379

Time-cost analysis of a quantum key distribution system clocked at 100 MHz

We describe the realization of a quantum key distribution (QKD) system clocked at 100 MHz. The system includes classical postprocessing implemented via software, and is operated over a 12 km standard telecommunication dark fiber in a real-world environment. A time-cost analysis of the sifted, error-corrected, and secret key rates relative to the raw key rate is presented, and the scalability of our implementation with respect to higher secret key rates is discussed.

Xiaofan Mo; Itzel Lucio Martinez; Philip Chan; Chris Healey; Steve Hosier; Wolfgang Tittel

2011-05-18

380

Analysis of detector performance in a gigahertz clock rate quantum key distribution system

We present a detailed analysis of a gigahertz clock rate environmentally robust phase-encoded quantum key distribution (QKD) system utilizing several different single-photon detectors, including the first implementation of an experimental resonant cavity thin-junction silicon single-photon avalanche diode. The system operates at a wavelength of 850 nm using standard telecommunications optical fibre. A general-purpose theoretical model for the performance of QKD

Patrick J. Clarke; Robert J. Collins; Philip A. Hiskett; María-José García-Martínez; Nils J. Krichel; Aongus McCarthy; Michael G. Tanner; John A. O'Connor; Chandra M. Natarajan; Shigehito Miki; Masahide Sasaki; Zhen Wang; Mikio Fujiwara; Ivan Rech; Massimo Ghioni; Angelo Gulinatti; Robert H. Hadfield; Paul D. Townsend; Gerald S. Buller

2011-01-01

381

Time-cost analysis of a quantum key distribution system clocked at 100 MHz

We describe the realization of a quantum key distribution (QKD) system clocked at 100 MHz. The system includes classical postprocessing implemented via software, and is operated over a 12 km standard telecommunication dark fiber in a real-world environment. A time-cost analysis of the sifted, error-corrected, and secret key rates relative to the raw key rate is presented, and the scalability

X. F. Mo; I. Lucio-Martinez; P. Chan; C. Healey; S. Hosier; W. Tittel

2011-01-01

382

A formal approach to fault tree synthesis for the analysis of distributed fault tolerant systems

Designing cost-sensitive real-time control systems for safety-critical applications requires a careful analysis of both performance versus cost aspects and fault coverage of fault tolerant solutions. This further complicates the difficult task of deploying the embedded software that implements the control algorithms on a possibly distributed execution platform (for instance in automotive applications). In this paper, we present a novel technique

Mark L. McKelvin Jr.; Gabriel Eirea; Claudio Pinello; Sri Kanajan; Alberto L. Sangiovanni-Vincentelli

2005-01-01

383

Fluorescence correlation spectroscopy (FCS) has proven to be a powerful technique with single-molecule sensitivity. Recently, it has found a complement in the form of fluorescence intensity distribution analysis (FIDA). Here we introduce a fluorescence fluctuation method that combines the features of both techniques. It is based on the global analysis of a set of photon count number histograms, recorded with multiple widths of counting time intervals simultaneously. This fluorescence intensity multiple distributions analysis (FIMDA) distinguishes fluorescent species on the basis of both the specific molecular brightness and the translational diffusion time. The combined information, extracted from a single measurement, increases the readout effectively by one dimension and thus breaks the individual limits of FCS and FIDA. In this paper a theory is introduced that describes the dependence of photon count number distributions on diffusion coefficients. The theory is applied to a series of photon count number histograms corresponding to different widths of counting time intervals. Although the ability of the method to determine specific brightness values, diffusion times, and concentrations from mixtures is demonstrated on simulated data, its experimental utilization is shown by the determination of the binding constant of a protein-ligand interaction exemplifying its broad applicability in the life sciences. PMID:11106594

Palo, K; Mets, U; Jager, S; Kask, P; Gall, K

2000-01-01

384

Space station electrical power distribution analysis using a load flow approach

NASA Technical Reports Server (NTRS)

The space station's electrical power system will evolve and grow in a manner similar to present terrestrial electrical power utilities. The initial baseline reference configuration will contain more than 50 nodes or busses, inverters, transformers, overcurrent protection devices, distribution lines, solar arrays, and/or solar dynamic power generating sources. The system is designed to manage and distribute 75 kW of power, single phase or three phase, at 20 kHz, and grow to a level of 300 kW steady state, and must be capable of operating at a peak of 450 kW for 5 to 10 min. In order to plan far into the future and keep pace with load growth, a load flow power system analysis approach must be developed and utilized. This method is a well-known energy assessment and management tool that is widely used throughout the electrical power utility industry. The results of a comprehensive evaluation and assessment of an Electrical Distribution System Analysis Program (EDSA) are discussed. Its potential use as an analysis and design tool for the 20 kHz space station electrical power system is addressed.

Emanuel, Ervin M.

1987-01-01

385

NASA Astrophysics Data System (ADS)

In this paper a procedure is presented to derive Flood Design Hydrographs (FDH) by coupling a bivariate representation of rainfall forcing (rainfall duration and intensity) based on copulas, which describe and model the correlation between these two variables independently of the marginal laws involved, with a distributed rainfall-runoff model. Rainfall-runoff modelling for estimating the hydrological response at the outlet of a watershed used a conceptual, fully distributed procedure based on the Soil Conservation Service curve number method as the excess rainfall model and a distributed unit hydrograph with climatic dependencies for the flow routing. Travel time computation, based on the definition of a distributed unit hydrograph, has been performed by implementing a procedure using flow paths determined from a digital elevation model (DEM) and roughness parameters obtained from distributed geographical information. In order to estimate the return period of the FDH, which gives the probability of occurrence of a hydrograph, flood peaks and flow volumes obtained through R-R modelling have been statistically treated via copulas. The shape of the hydrograph has been generated on the basis of modelled flood events, via cluster analysis. The procedure described above was applied to a case study of the Imera catchment in Sicily, Italy. The methodology allows a reliable estimation of the Design Flood Hydrograph and can be used for all flood risk applications, i.e. evaluation, management, mitigation, etc.

Candela, A.; Brigandí, G.; Aronica, G. T.

2014-01-01

386

NASA Astrophysics Data System (ADS)

We have developed a new analysis method for obtaining the power spectrum in the horizontal phase velocity domain from airglow intensity image data to study atmospheric gravity waves. This method can deal with extensive amounts of imaging data obtained on different years and at various observation sites without bias caused by different event extraction criteria for the person processing the data. The new method was applied to sodium airglow data obtained in 2011 at Syowa Station (69°S, 40°E), Antarctica. The results were compared with those obtained from a conventional event analysis in which the phase fronts were traced manually in order to estimate horizontal characteristics, such as wavelengths, phase velocities, and wave periods. The horizontal phase velocity of each wave event in the airglow images corresponded closely to a peak in the spectrum. The statistical results of spectral analysis showed an eastward offset of the horizontal phase velocity distribution. This could be interpreted as the existence of wave sources around the stratospheric eastward jet. Similar zonal anisotropy was also seen in the horizontal phase velocity distribution of the gravity waves by the event analysis. Both methods produce similar statistical results about directionality of atmospheric gravity waves. Galactic contamination of the spectrum was examined by calculating the apparent velocity of the stars and was found to be limited to phase speeds lower than 30 m/s. In conclusion, our new method is suitable for deriving the horizontal phase velocity characteristics of atmospheric gravity waves from an extensive amount of imaging data.

Matsuda, Takashi S.; Nakamura, Takuji; Ejiri, Mitsumu K.; Tsutsumi, Masaki; Shiokawa, Kazuo

2014-08-01

387

Investigation of the distribution of minerals in coals by normative analysis

The distributions of minerals in 86 coals have been calculated from chemical analyses of high-temperature ash (normative analysis) and the results compared with mineralogical analyses based on X-ray diffraction and IR spectrometry of low-temperature ash. Our general conclusion is that, to the extent we can determine at present, the normative analysis does give a useful semiquantitative analysis of the minerals in coals - better than one might predict from the nature of the procedure. It could be used, and will be in the near future, to ascertain whether, for example, the distributions of minerals in coals formed in different provinces and basins show gross differences as a result of differing geology and geochemistry. It might also be useful in assessing whether these differences are important in the conversion of coals to gaseous and liquid fuels. We should point out that we do not consider normative analysis at all applicable to lignites or subbituminous coals, since with lignites, 30 to 50% or more of the ash-forming constituents consist of cations attached to the organic matter in ion-exchangeable form on carboxyl groups or as chelate complexes; a similar, though less marked, situation obtains with subbituminous coals.

Given, P.H.; Weldon, D.; Suhr, N.

1980-01-01

388

Background Phytochromes are photoreceptors, discovered in plants, that control a wide variety of developmental processes. They have also been found in bacteria and fungi, but for many species their biological role remains obscure. This work concentrates on the phytochrome system of Agrobacterium tumefaciens, a non-photosynthetic soil bacterium with two phytochromes. To identify proteins that might share common functions with phytochromes, a co-distribution analysis was performed on the basis of protein sequences from 138 bacteria. Results A database of protein sequences from 138 bacteria was generated. Each sequence was BLASTed against the entire database. The homolog distribution of each query protein was then compared with the homolog distribution of every other protein (target protein) of the same species, and the target proteins were sorted according to their probability of co-distribution under random conditions. As query proteins, phytochromes from Agrobacterium tumefaciens, Pseudomonas aeruginosa, Deinococcus radiodurans and Synechocystis PCC 6803 were chosen along with several phytochrome-related proteins from A. tumefaciens. The Synechocystis photosynthesis protein D1 was selected as a control. In the D1 analyses, the ratio between photosynthesis-related proteins and those not related to photosynthesis among the top 150 in the co-distribution tables was > 3:1, showing that the method is appropriate for finding partner proteins with common functions. The co-distribution of phytochromes with other histidine kinases was remarkably high, although most co-distributed histidine kinases were not direct BLAST homologs of the query protein. This finding implies that phytochromes and other histidine kinases share common functions as parts of signalling networks. All phytochromes tested, with one exception, also revealed a remarkably high co-distribution with glutamate synthase and methionine synthase. 
This result implies a general role of bacterial phytochromes in ammonium assimilation and amino acid metabolism. Conclusion It was possible to identify several proteins that might share common functions with bacterial phytochromes by the co-distribution approach. This computational approach might also be helpful in other cases. PMID:16539742

Lamparter, Tilman

2006-01-01

389

Distributions of diffusion measures from a local mean-square displacement analysis

NASA Astrophysics Data System (ADS)

In cell biology, time-resolved fluctuation analysis of tracer particles has recently gained great importance. One such method is the local mean-square displacement (MSD) analysis, which provides an estimate of two parameters as functions of time: the exponent of growth of the MSD and the diffusion coefficient. Here, we study the joint and marginal distributions of these parameters for Brownian motion with Gaussian velocity fluctuations, including the cases of vanishing correlations (overdamped Brownian motion) and of a finite negative velocity correlation (as observed in intracellular motion). Numerically, we demonstrate that a small number of MSD points is optimal for the estimation of the diffusion measures. Motivated by this observation, we derive an analytic approximation for the joint and marginal probability densities of the exponent and diffusion coefficient for the special case of two MSD points. These analytical results show good agreement with numerical simulations for sufficiently large window sizes. Our results might promote better statistical analysis of intracellular motility.
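A minimal sketch of such a local MSD analysis, assuming a plain 2D overdamped Brownian trajectory and the two-point estimators the abstract reports as near-optimal (window size and parameter values are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate a 2D overdamped Brownian trajectory (hypothetical parameters)
D_true, dt, n = 0.5, 0.01, 20000
steps = rng.normal(0.0, np.sqrt(2.0 * D_true * dt), size=(n, 2))
traj = np.cumsum(steps, axis=0)

def local_msd_estimates(x, window, dt):
    """Estimate the MSD growth exponent alpha and the diffusion
    coefficient D in non-overlapping windows, using only the first two
    MSD points (lags dt and 2*dt)."""
    alphas, Ds = [], []
    for start in range(0, len(x) - window, window):
        seg = x[start:start + window]
        msd1 = np.mean(np.sum((seg[1:] - seg[:-1]) ** 2, axis=1))
        msd2 = np.mean(np.sum((seg[2:] - seg[:-2]) ** 2, axis=1))
        alpha = np.log(msd2 / msd1) / np.log(2.0)
        D = msd1 / (4.0 * dt ** alpha)  # MSD = 4 D t^alpha in 2D
        alphas.append(alpha)
        Ds.append(D)
    return np.array(alphas), np.array(Ds)

alphas, Ds = local_msd_estimates(traj, window=200, dt=dt)
print(np.median(alphas), np.median(Ds))  # distributions centred near 1 and 0.5
```

Collecting `alphas` and `Ds` over many windows gives exactly the kind of empirical joint distribution of the two diffusion measures that the paper characterises analytically.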

Nandi, Amitabha; Heinrich, Doris; Lindner, Benjamin

2012-08-01

390

Probabilistic Analysis of Space Shuttle Body Flap Actuator Ball Bearings

A probabilistic analysis, using the two-parameter Weibull-Johnson method, was performed on experimental life test data from space shuttle actuator bearings. Experiments were performed on a test rig under simulated conditions to determine the life and failure mechanism of the grease lubricated bearings that support the input shaft of the space shuttle body flap actuators. The failure mechanism was wear that
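A two-parameter Weibull life analysis of the general kind named above can be sketched with median-rank regression; the failure times below are invented for illustration, not the shuttle bearing measurements:

```python
import numpy as np

# Hypothetical failure times (hours); not the actual shuttle bearing data
lives = np.array([310.0, 520.0, 640.0, 810.0, 1100.0, 1480.0, 1750.0, 2300.0])

def weibull_median_rank_fit(lives):
    """Two-parameter Weibull fit by median-rank regression, the kind of
    graphical estimate underlying a Weibull-Johnson life analysis.
    Returns (shape beta, scale eta)."""
    t = np.sort(lives)
    n = len(t)
    i = np.arange(1, n + 1)
    F = (i - 0.3) / (n + 0.4)            # Benard's median-rank approximation
    X = np.log(t)                        # Weibull-plot abscissa
    Y = np.log(-np.log(1.0 - F))         # Weibull-plot ordinate
    beta, c = np.polyfit(X, Y, 1)        # straight line on Weibull paper
    eta = np.exp(-c / beta)              # since Y = beta*ln t - beta*ln eta
    return beta, eta

beta, eta = weibull_median_rank_fit(lives)
L10 = eta * (-np.log(0.9)) ** (1.0 / beta)  # life at 10% failure probability
print(beta, eta, L10)
```

The shape parameter beta distinguishes infant mortality (beta < 1) from wear-out (beta > 1), which is how a wear failure mode shows up in such an analysis.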

Fred B. Oswald; Timothy R. Jett; Roamer E. Predmore; Erwin V. Zaretsky

2008-01-01

391

NASA Astrophysics Data System (ADS)

Distributed hydrological models enhance the analysis and explanation of environmental processes. As more spatial input data and time series become available, more analysis is required of the sensitivity of the simulations to the data. Most research so far has focused on the sensitivity of distributed hydrological models to precipitation data. However, these results cannot be compared until a universal approach to quantify the sensitivity of a model to spatial data is available. Among the most frequently tested and used remote sensing data for distributed models is snow cover. Snow cover fraction (SCF) remote sensing products are easily available from the internet, e.g. the MODIS snow cover product MOD10A1 (daily snow cover fraction at 500 m spatial resolution). In this work a spatial sensitivity analysis (SA) of remotely sensed SCF from MOD10A1 was conducted with the distributed WetSpa model. The aim is to investigate whether the WetSpa model is differently subjected to SCF uncertainty in different areas of the model domain. The analysis was extended to look not only at SA quantities but also to relate them to the physical parameters and processes in the study area. The study area is the Biebrza River catchment, Poland, which is considered a semi-natural catchment subject to a spring snow-melt regime. Hydrological simulations are performed with the distributed WetSpa model, with a simulation period of 2 hydrological years. For the SA the Latin-Hypercube One-factor-At-a-Time (LH-OAT) algorithm is used, with a set of different response functions in a regular 4 × 4 km grid. The results show that the spatial patterns of sensitivity can be easily interpreted by the co-occurrence of different landscape features. Moreover, the spatial patterns of the SA results are related to the WetSpa spatial parameters and to different physical processes. Based on the study results, it is clear that a spatial approach to SA can be performed with the proposed algorithm and that the WetSpa model is spatially sensitive to the MOD10A1 SCF.
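A generic sketch of the LH-OAT algorithm named above, applied to a toy response function rather than to WetSpa (sample counts, perturbation fraction, and the model are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(1)

def lh_oat(model, bounds, n_sites=20, frac=0.05):
    """Latin-Hypercube One-factor-At-a-Time (LH-OAT) sensitivity sketch.

    Latin-hypercube base points are drawn inside `bounds`; each parameter
    is then perturbed one at a time by a fixed fraction, and the absolute
    relative partial effects are averaged over the base points."""
    bounds = np.asarray(bounds, dtype=float)
    lo, hi = bounds[:, 0], bounds[:, 1]
    p = len(lo)
    # Latin-hypercube sample: one random permutation of strata per column
    strata = np.array([rng.permutation(n_sites) for _ in range(p)]).T
    base = lo + (strata + rng.random((n_sites, p))) / n_sites * (hi - lo)
    S = np.zeros(p)
    for theta in base:
        m0 = model(theta)
        for j in range(p):
            pert = theta.copy()
            pert[j] *= 1.0 + frac
            m1 = model(pert)
            # relative partial effect, normalised by the fractional change
            S[j] += abs((m1 - m0) / ((m1 + m0) / 2.0)) / frac
    return S / n_sites

# Toy response: strongly driven by theta[0], only weakly by theta[1]
model = lambda t: 10.0 * t[0] + 0.1 * t[1]
S = lh_oat(model, bounds=[(1.0, 2.0), (1.0, 2.0)])
print(S)  # first sensitivity much larger than the second
```

In the study this average is computed per grid cell with different response functions, which is what yields the spatial sensitivity maps.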

Berezowski, Tomasz; Chormański, Jarosław; Nossent, Jiri; Batelaan, Okke

2014-05-01

392

Application of digital image analysis for size distribution measurements of microbubbles

This work employs digital image analysis to measure the size distribution of microbubbles generated by the process of electroflotation for use in solid/liquid separation processes. Microbubbles are used for separations in the mineral processing industry and also in the treatment of potable water and wastewater. As the bubbles move upward in a solid/liquid column due to buoyancy, particles collide with and attach to the bubbles and are carried to the surface of the column, where they are removed by skimming. The removal efficiency of solids is strongly affected by the size of the bubbles; in general, higher separation efficiency is achieved with smaller bubbles. The primary focus of this study was to characterize the size and size distribution of bubbles generated in electroflotation using image analysis. The study found that bubble diameter increased slightly as the current density applied to the system was increased. Additionally, electroflotation produces a uniform bubble size with a narrow distribution, which optimizes the removal of fine particles from solution.

Burns, S.E.; Yiacoumi, S.; Frost, D. [Georgia Inst. of Tech., Atlanta, GA (United States). School of Civil and Environmental Engineering; Tsouris, C. [Oak Ridge National Lab., TN (United States). Chemical Technology Div.

1997-03-01

393

Statistical analysis of secondary particle distributions in relativistic nucleus-nucleus collisions

NASA Technical Reports Server (NTRS)

The use of several statistical techniques to characterize structure in the angular distributions of secondary particles from nucleus-nucleus collisions in the energy range 24 to 61 GeV/nucleon is described. The objective of this work was to determine whether there are correlations between emitted particle intensity and angle that may be used to support the existence of the quark-gluon plasma. The techniques include chi-square null hypothesis tests, the method of discrete Fourier transform analysis, and fluctuation analysis. We have also used the method of composite unit vectors to test for azimuthal asymmetry in a data set of 63 JACEE-3 events. Each method is presented in a manner that provides the reader with some practical detail regarding its application. Of those events with relatively high statistics, an Fe event at 55 GeV/nucleon was found to possess an azimuthal distribution with a highly non-random structure. No evidence of non-statistical fluctuations was found in the pseudo-rapidity distributions of the events studied. It is seen that the most effective application of these methods relies upon the availability of many events or single events that possess very high multiplicities.
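The chi-square null-hypothesis test mentioned above can be illustrated on synthetic azimuthal data (the bin count and modulation amplitude are invented for the example, not taken from the JACEE events):

```python
import numpy as np

rng = np.random.default_rng(2)

def chi2_uniformity(angles, n_bins=12):
    """Chi-square statistic for the null hypothesis of azimuthal
    isotropy: bin the angles and compare with the uniform expectation."""
    counts, _ = np.histogram(angles, bins=n_bins, range=(0.0, 2.0 * np.pi))
    expected = len(angles) / n_bins
    chi2 = np.sum((counts - expected) ** 2 / expected)
    return chi2, n_bins - 1  # statistic and degrees of freedom

# An isotropic 'event' versus one with a sinusoidal azimuthal modulation
iso = rng.uniform(0.0, 2.0 * np.pi, 600)
phi = rng.uniform(0.0, 2.0 * np.pi, 2000)
keep = rng.random(2000) < 0.5 * (1.0 + 0.8 * np.cos(phi))  # rejection sampling
mod = phi[keep][:600]

chi2_iso, dof = chi2_uniformity(iso)
chi2_mod, _ = chi2_uniformity(mod)
print(chi2_iso, chi2_mod)  # the modulated event yields the larger statistic
```

A statistic far out in the tail of the chi-square distribution with `dof` degrees of freedom rejects isotropy, which is the sense in which a "highly non-random" azimuthal structure is detected.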

McGuire, Stephen C.

1987-01-01

394

NASA Astrophysics Data System (ADS)

We report on the existing connection between power-law distributions and allometries. As first reported in Gomez-Lievano et al. (2012) for the relationship between homicides and population, when these urban indicators present asymptotic power-law distributions, they can also display specific allometries among themselves. Here, we present an extensive characterization of this connection when considering all possible pairs of relationships from twelve urban indicators of Brazilian cities (such as child labor, illiteracy, income, sanitation and unemployment). Our analysis reveals that all our urban indicators are asymptotically distributed as power laws and that the proposed connection also holds for our data when the allometric relationship displays enough correlations. We have also found that not all allometric relationships are independent and that they can be understood as a consequence of the allometric relationship between the urban indicator and the population size. We further show that the residual fluctuations around the allometries are characterized by an almost constant variance and log-normal distributions.
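Allometric relationships of the kind discussed above are conventionally fitted as power laws in log-log space; a minimal sketch on synthetic data (all numbers are illustrative, not the Brazilian-city indicators):

```python
import numpy as np

rng = np.random.default_rng(3)

def allometric_fit(x, y):
    """Least-squares fit of the allometric law y = c * x^b in log-log
    space; returns (exponent b, prefactor c)."""
    b, ln_c = np.polyfit(np.log(x), np.log(y), 1)
    return b, np.exp(ln_c)

# Synthetic 'urban indicator' scaling with population, log-normal residuals
pop = 10.0 ** rng.uniform(4.0, 7.0, 300)           # hypothetical city sizes
indicator = 2e-3 * pop ** 1.15 * rng.lognormal(0.0, 0.3, 300)

b, c = allometric_fit(pop, indicator)
print(b)  # recovers an exponent close to the assumed value 1.15
```

Note that the residuals here are log-normal with constant log-variance by construction, mirroring the behaviour the paper reports for the empirical allometries.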

Alves, L. G. A.; Ribeiro, H. V.; Lenzi, E. K.; Mendes, R. S.

2014-09-01

395

Modeling call holding time distributions for CCS network design and performance analysis

NASA Astrophysics Data System (ADS)

The message traffic offered to the CCS signalling network depends on and is modulated by the traffic characteristics of the circuit switched calls supported by the CCS network. Most previous analyses of CCS network engineering, performance evaluation and congestion control protocols generally assume an exponential holding time of circuit switched calls. Analysis of actual holding time distributions in conversations, facsimile and voice mail connections revealed that these distributions radically differ from the exponential distribution. Especially significant is the large proportion of very short calls in real traffic in comparison with the exponential distribution model. The diversity of calls (partial dialing, subscriber busy, no answer) and services results in a multi-component call mix, with even larger proportion of short time intervals between message-generating events. Very short call holding times can have a significant impact on the traffic stream presented to the CCS network: for calls with short holding times, the different CCS messages arrive relatively close to each other, and this manifests as burstiness in the CCS traffic stream.

Bolotin, Vladimir A.

1994-04-01

396

NASA Astrophysics Data System (ADS)

The burning of candles generates particulate matter of fine dimensions that degrades indoor air quality and may therefore have a harmful impact on human health. In this study solid aerosol particles from the burning of candles of different composition and from kerosene combustion were collected in a closed laboratory system. The present work describes particulate matter collection for structure analysis and the relationship between the source and the size distribution of the particulate matter. The formation mechanism of the particulate matter and the tendency of the particles to agglomerate are also described. Particles obtained from kerosene combustion have a normal size distribution, whereas particles generated from the burning of stearin candles have a distribution shifted towards the finer particle size range. When stearin is added to a paraffin candle, the particle size distribution likewise shifts towards finer particles. A tendency to form agglomerates in a short time is observed for particles obtained from kerosene combustion, while for particles obtained from burning of candles of different composition such a tendency is not observed. Particles from candle and kerosene combustion are Aitken and accumulation mode particles.

Baitimirova, M.; Osite, A.; Katkevics, J.; Viksna, A.

2012-08-01

397

Further Progress Applying the Generalized Wigner Distribution to Analysis of Vicinal Surfaces

NASA Astrophysics Data System (ADS)

Terrace width distributions (TWDs) can be well fit by the generalized Wigner distribution (GWD), generally better than by conventional Gaussians, which thus offers a convenient way to estimate the dimensionless elastic repulsion strength Ã from σ², the TWD variance (T.L. Einstein and O. Pierre-Louis, Surface Sci. 424, L299 (1999)). The GWD σ² accurately reproduces values for the two exactly soluble cases at small Ã and in the asymptotic limit. Taxing numerical simulations show that the GWD σ² interpolates well between these limits. Extensive applications have been made to experimental data, esp. on Cu (M. Giesen and T.L. Einstein, Surface Sci. 449, 191 (2000)). Recommended analysis procedures are catalogued (H.L. Richards, S.D. Cohen, T.L. Einstein, and M. Giesen, Surf. Sci. 453, 59 (2000)). Extensions of the GWD to multistep distributions are tested, with good agreement for second-neighbor distributions, less good for third (T.L. Einstein, H.L. Richards, S.D. Cohen, and O. Pierre-Louis, Proc. ISSI-PDSC2000, cond-mat/0012xxxxx). Alternatively, step-step correlation functions, about which there is more theoretical information, should be measured.
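A small sketch of the GWD machinery described above, with the distribution normalised so that the scaled width satisfies ⟨s⟩ = 1; the mapping from the fitted exponent to the repulsion strength Ã is quoted from memory of the cited literature and should be checked there:

```python
import numpy as np
from math import gamma

def gwd_pdf(s, rho):
    """Generalized Wigner distribution P(s) = a s^rho exp(-b s^2), with a
    and b fixed so that P is normalised and <s> = 1 (s = width / mean)."""
    b = (gamma((rho + 2.0) / 2.0) / gamma((rho + 1.0) / 2.0)) ** 2
    a = 2.0 * b ** ((rho + 1.0) / 2.0) / gamma((rho + 1.0) / 2.0)
    return a * s ** rho * np.exp(-b * s ** 2)

def gwd_variance(rho):
    """TWD variance sigma^2 = <s^2> - 1 implied by the GWD."""
    b = (gamma((rho + 2.0) / 2.0) / gamma((rho + 1.0) / 2.0)) ** 2
    return (rho + 1.0) / (2.0 * b) - 1.0

def rho_from_variance(var, lo=0.5, hi=20.0, tol=1e-10):
    """Invert gwd_variance by bisection (the variance decreases with rho).
    In the cited papers the repulsion strength is then reportedly
    A-tilde = rho*(rho - 2)/4, so rho = 2 (free fermions) gives A-tilde = 0;
    treat that mapping as an assumption to verify against the references."""
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if gwd_variance(mid) > var:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

print(gwd_variance(2.0))        # ~0.18 for the free-fermion case rho = 2
print(rho_from_variance(0.13))  # narrower measured TWD -> larger rho
```

Fitting a measured TWD for rho (or simply inverting its variance, as here) is what turns a terrace-width histogram into an estimate of the step-step repulsion.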

Einstein, T. L.; Richards, Howard L.; Cohen, S. D.

2001-03-01

398

NASA Astrophysics Data System (ADS)

In this study, an algorithm was developed to measure the distribution of pulmonary time constants (TCs) from dynamic computed tomography (CT) data sets during a sudden airway pressure step up. Simulations with synthetic data were performed to test the methodology as well as the influence of experimental noise. Furthermore the algorithm was applied to in vivo data. In five pigs sudden changes in airway pressure were imposed during dynamic CT acquisition in healthy lungs and in a saline lavage ARDS model. The fractional gas content in the imaged slice (FGC) was calculated by density measurements for each CT image. Temporal variations of the FGC were analysed assuming a model with a continuous distribution of exponentially decaying time constants. The simulations proved the feasibility of the method. The influence of experimental noise could be well evaluated. Analysis of the in vivo data showed that in healthy lungs ventilation processes can be more likely characterized by discrete TCs whereas in ARDS lungs continuous distributions of TCs are observed. The temporal behaviour of lung inflation and deflation can be characterized objectively using the described new methodology. This study indicates that continuous distributions of TCs reflect lung ventilation mechanics more accurately compared to discrete TCs.

Doebrich, Marcus; Markstaller, Klaus; Karmrodt, Jens; Kauczor, Hans-Ulrich; Eberle, Balthasar; Weiler, Norbert; Thelen, Manfred; Schreiber, Wolfgang G.

2005-04-01

399

Analysis of the Effects of Streamwise Lift Distribution on Sonic Boom Signature

NASA Technical Reports Server (NTRS)

Investigation of sonic boom has been one of the major areas of study in aeronautics due to the benefits a low-boom aircraft has in both civilian and military applications. This work conducts a numerical analysis of the effects of streamwise lift distribution on the shock coalescence characteristics. A simple wing-canard-stabilator body model is used in the numerical simulation. The streamwise lift distribution is varied by fixing the canard at a deflection angle while trimming the aircraft with the wing and the stabilator at the desired lift coefficient. The lift and the pitching moment coefficients are computed using the Missile DATCOM v. 707. The flow field around the wing-canard- stabilator body model is resolved using the OVERFLOW-2 flow solver. Overset/ chimera grid topology is used to simplify the grid generation of various configurations representing different streamwise lift distributions. The numerical simulations are performed without viscosity unless it is required for numerical stability. All configurations are simulated at Mach 1.4, angle-of-attack of 1.50, lift coefficient of 0.05, and pitching moment coefficient of approximately 0. Four streamwise lift distribution configurations were tested.

Yoo, Paul

2013-01-01

400

Analysis of the Efficiency Improvement in Small Wind Turbines when Speed Is Controlled

This paper analyzes the efficiency of three small commercial wind turbines, from different manufacturers and with different technologies. Their actual energy production is compared with the case where they use turbine speed control to follow the wind speed variations. The simulation uses a Weibull probability distribution for the wind speed. A fixed-pitch turbine with a permanent-magnet generator is simulated.
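The Weibull-weighted energy estimate underlying such a comparison can be sketched as follows (the power curve and Weibull parameters are hypothetical, not those of the three turbines studied):

```python
import numpy as np

def weibull_pdf(v, k, c):
    """Weibull wind-speed density with shape k and scale c (m/s)."""
    return (k / c) * (v / c) ** (k - 1) * np.exp(-((v / c) ** k))

def expected_power(power_curve, k, c, v_max=30.0, n=3000):
    """Mean electrical power: the turbine power curve weighted by the
    Weibull wind-speed distribution (rectangle-rule integration)."""
    v = np.linspace(0.0, v_max, n)
    dv = v[1] - v[0]
    return float(np.sum(power_curve(v) * weibull_pdf(v, k, c)) * dv)

# Hypothetical 1 kW small turbine: cut-in 3 m/s, rated 12 m/s, cut-out 25 m/s
def curve(v):
    p = np.clip((v / 12.0) ** 3, 0.0, 1.0) * 1000.0  # cubic up to rated power
    return np.where((v >= 3.0) & (v <= 25.0), p, 0.0)

p_mean = expected_power(curve, k=2.0, c=7.0)
print(p_mean)  # mean output in watts for the assumed wind climate
```

Evaluating two power curves (fixed-speed versus speed-controlled) against the same Weibull wind climate is the natural way to quantify the efficiency improvement the paper discusses.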

F. Martínez Rodrigo; L. C. Herrero de Lucas; Santiago de Pablo Gómez; J. M. G. de la Fuente

2007-01-01

401

A distributed analysis and visualization system for model and observational data

NASA Technical Reports Server (NTRS)

The objective of this proposal is to develop an integrated and distributed analysis and display software system which can be applied to all areas of the Earth System Science to study numerical model and earth observational data from storm to global scale. This system will be designed to be easy to use, portable, flexible and easily extensible and designed to adhere to current and emerging standards whenever possible. It will provide an environment for visualization of the massive amounts of data generated from satellites and other observational field measurements and from model simulations during or after their execution. Two- and three-dimensional animation will also be provided. This system will be based on a widely used software package from NASA called GEMPAK and prototype software for three-dimensional interactive displays built at NCSA. The underlying foundation of the system will be a set of software libraries which can be distributed across a UNIX based supercomputer and workstations.

Wilhelmson, Robert; Koch, Steven

1993-01-01

402

A distributed analysis and visualization system for model and observational data

NASA Technical Reports Server (NTRS)

The objective of this proposal is to develop an integrated and distributed analysis and display software system which can be applied to all areas of the Earth System Science to study numerical model and earth observational data from storm to global scale. This system will be designed to be easy to use, portable, flexible and easily extensible and to adhere to current and emerging standards whenever possible. It will provide an environment for visualization of the massive amounts of data generated from satellites and other observational field measurements and from model simulations during or after their execution. Two- and three-dimensional animation will also be provided. This system will be based on a widely used software package from NASA called GEMPAK and prototype software for three dimensional interactive displays built at NCSA. The underlying foundation of the system will be a set of software libraries which can be distributed across a UNIX based supercomputer and workstations.

Wilhelmson, Robert; Koch, Steven

1992-01-01

403

Ion energy distribution analysis of the TVA plasma ignited in carbon vapours using RFA

NASA Astrophysics Data System (ADS)

In order to understand plasma processes and to obtain technological control in thin film deposition, the study of surface-plasma interactions is essential. Apart from the type and flux of the ions/neutral atoms impinging on the surface, the ion energy distribution (IED) is an important parameter in understanding surface modification due to the plasma. In this paper, results of ion energy analysis of the Thermionic Vacuum Arc (TVA) plasma ignited in carbon vapours are presented. An in-house, computer-controlled retarding field analyzer was used to experimentally determine the ion energy distributions of the carbon ions arriving at the substrate. The correlation of the carbon IED with the applied arc voltage in the TVA plasma was demonstrated for the first time.

Surdu-Bob, C. C.; Badulescu, M.; Iacob, C.; Porosnicu, C.; Lungu, C. P.

2010-01-01

404

Phylogenetic Analysis and Comparative Genomics of Purine Riboswitch Distribution in Prokaryotes

Riboswitches are regulatory RNA that control gene expression by undergoing conformational changes on ligand binding. Using phylogenetic analysis and comparative genomics we have been able to identify the class of genes/operons regulated by the purine riboswitch and obtain a high-resolution map of purine riboswitch distribution across all bacterial groups. In the process, we are able to explain the absence of purine riboswitches upstream to specific genes in certain genomes. We also identify the point of origin of various purine riboswitches and argue that not all purine riboswitches are of primordial origin, and that some purine riboswitches must have originated after the divergence of certain Firmicute orders in the course of evolution. Our study also reveals the role of horizontal transfer events in accounting for the presence of purine riboswitches in some gammaproteobacterial species. Our work provides significant insights into the origin, distribution and regulatory role of purine riboswitches in prokaryotes. PMID:23170063

Singh, Payal; Sengupta, Supratim

2012-01-01

405

Theoretical analysis of the ice crystal size distribution in frozen aqueous specimens.

To estimate theoretically how well suited different freezing techniques are for freezing of freeze-etch specimens, it is necessary to know the relationship between specimen cooling rate and the resulting average ice crystal size. Using a somewhat simplified theoretical analysis, we have derived the approximate ice crystal size distribution of nonvitrified frozen aqueous specimens frozen at different cooling rates. The derived size distribution was used to calculate the relationship between the relative change in average ice crystal size, Δl/l, and the relative change in specimen cooling rate, Δ(dT/dt)/(dT/dt). We found this relationship to be Δl/l = -k · Δ(dT/dt)/(dT/dt), where k = 1.0 when specimen solidification takes place at about -6 °C, and k ≈ 1.3 when it takes place at about -40 °C. PMID:7171711
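The linearised relation derived in the abstract is easy to apply directly; a minimal sketch, using the two k values the abstract reports:

```python
def relative_size_change(rel_rate_change, k=1.0):
    """Relative change in average ice crystal size for a given relative
    change in cooling rate, per the abstract's linearised relation
    delta_l/l = -k * delta(dT/dt)/(dT/dt)."""
    return -k * rel_rate_change

# A 10% faster quench shrinks crystals by ~10% when solidification occurs
# near -6 C (k = 1.0) and by ~13% near -40 C (k = 1.3)
print(relative_size_change(0.10, k=1.0), relative_size_change(0.10, k=1.3))
```

The negative sign encodes the key qualitative point: faster cooling yields smaller average crystals.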

Kopstad, G; Elgsaeter, A

1982-01-01

406

NASA Technical Reports Server (NTRS)

Simulation and analysis results are described for a wideband fiber optic intermediate frequency distribution channel for a frequency division multiple access (FDMA) system where the antenna equipment is remotely located from the signal processing equipment. The fiber optic distribution channel accommodates multiple signals received from a single antenna with differing power levels. The performance parameters addressed are intermodulation degradations, laser noise, and adjacent channel interference, as they impact the overall system design. Simulation results showed that the laser diode modulation level can be allowed to reach 100 percent without considerable degradation. The laser noise must be controlled so as to provide a noise floor of less than -90 dBW/Hz. The fiber optic link increases the degradation due to power imbalance yet diminishes the effects of the transmit amplifier nonlinearity. Overall, optimal operating conditions can be found to yield a degradation level of about 0.1 dB caused by the fiber optic link.

Costello, Thomas A.; Brandt, C. Maite

1989-01-01

407

Distributed and/or grid-oriented approach to BTeV data analysis

The BTeV collaboration will record approximately 2 petabytes of raw data per year. It plans to analyze this data using the distributed resources of the collaboration as well as dedicated resources, primarily residing in the very large BTeV trigger farm, and resources accessible through the developing world-wide data grid. The data analysis system is being designed from the very start with this approach in mind. In particular, we plan a fully disk-based data storage system with multiple copies of the data distributed across the collaboration to provide redundancy and to optimize access. We will also position ourselves to take maximum advantage of shared systems, as well as dedicated systems, at our collaborating institutions.

Joel N. Butler

2002-12-23

408

Flavor stability analysis of dense supernova neutrinos with flavor-dependent angular distributions

Numerical simulations of the supernova (SN) neutrino self-induced flavor conversions, associated with the neutrino-neutrino interactions in the deepest stellar regions, have typically been carried out assuming the "bulb model". In this approximation, neutrinos are taken to be emitted half-isotropically by a common neutrinosphere. In the recent Ref. [Mirizzi:2011tu] we have removed this assumption by introducing flavor-dependent angular distributions for SN neutrinos, as suggested by core-collapse simulations. We have found that in this case a novel multi-angle instability in the self-induced flavor transitions can arise. In this work we perform an extensive study of this effect, carrying out a linearized flavor stability analysis for different SN neutrino energy fluxes and angular distributions, in both normal and inverted neutrino mass hierarchy. We confirm that spectra of different ν species which cross in angular space (where F_{\

Alessandro Mirizzi; Pasquale Dario Serpico

2012-08-01

409

Preliminary analysis of the span-distributed-load concept for cargo aircraft design

NASA Technical Reports Server (NTRS)

A simplified computer analysis of the span-distributed-load airplane (in which payload is placed within the wing structure) has shown that the span-distributed-load concept has high potential for application to future air cargo transport design. Significant increases in payload fraction over current wide-bodied freighters are shown for gross weights in excess of 0.5 Gg (1,000,000 lb). A cruise-matching calculation shows that the trend toward higher aspect ratio improves overall efficiency; that is, less thrust and fuel are required. The optimal aspect ratio probably is not determined by structural limitations. Terminal-area constraints and increasing design-payload density, however, tend to limit aspect ratio.

Whitehead, A. H., Jr.

1975-01-01

410

Unbiased analysis of CLEO data at NLO and the pion distribution amplitude

NASA Astrophysics Data System (ADS)

We discuss different QCD approaches to calculate the form factor F_{γ*γπ}(Q²) of the γ*γ → π⁰ transition, giving preference to the light-cone QCD sum rules (LCSR) approach as the most adequate. In this context we revise the previous analysis of the CLEO experimental data on F_{γ*γπ}(Q²) by Schmedding and Yakovlev. Special attention is paid to the sensitivity of the results to the (strong radiative) α_s corrections and to the value of the twist-four coupling δ². We present a full analysis of the CLEO data at the NLO level of LCSRs, focusing particular attention on the extraction of the relevant parameters to determine the pion distribution amplitude, i.e., the Gegenbauer coefficients a2, a4. Our analysis confirms our previous results and also the main findings of Schmedding and Yakovlev: both the asymptotic as well as the Chernyak-Zhitnitsky pion distribution amplitudes are completely excluded by the CLEO data. A novelty of our approach is to use the CLEO data as a means of determining the value of the QCD vacuum nonlocality parameter λ_q² ≈ 0.4 GeV², which specifies the average virtuality of the vacuum quarks.
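For context, the Gegenbauer coefficients a2, a4 referred to above parameterise the pion distribution amplitude in the standard twist-2 conformal expansion (textbook notation, not a formula quoted from this abstract):

```latex
\varphi_\pi(x,\mu^2) = 6\,x(1-x)\Bigl[\,1 + a_2(\mu^2)\,C_2^{3/2}(2x-1)
  + a_4(\mu^2)\,C_4^{3/2}(2x-1) + \cdots\Bigr],
\qquad \int_0^1 \varphi_\pi(x,\mu^2)\,dx = 1 .
```

The asymptotic distribution amplitude corresponds to a_2 = a_4 = 0, while the Chernyak-Zhitnitsky amplitude has a large a_2; the CLEO fit constrains the (a_2, a_4) pair.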

Bakulev, Alexander P.; Mikhailov, S. V.; Stefanis, N. G.

2003-04-01

411

Constraints on color-octet fermions from a global parton distribution analysis.

We report a parton distribution function analysis of a complete set of hadron scattering data, in which a color-octet fermion (such as a gluino of supersymmetry) is incorporated as an extra parton constituent along with the usual standard model constituents. The data set includes the most up-to-date results from deep inelastic scattering and from jet production in hadron collisions. Another feature is the inclusion in the fit of data from determinations of the strong coupling α_s(Q) at large and small values of the hard scale Q. Our motivation is to determine the extent to which the global parton distribution function analysis may provide constraints on the new fermion, as a function of its mass and α_s(M_Z), independent of assumptions such as the mechanism of gluino decays. Based on this analysis, we find that gluino masses as low as 30 to 50 GeV may be compatible with the current hadronic data. Gluino masses below 15 GeV (25 GeV) are excluded if α_s(M_Z) varies freely (is equal to 0.118). At the outset, stronger constraints had been anticipated from jet production cross sections, but experimental systematic uncertainties, particularly in normalization, reduce the discriminating power of these data.

Berger, E. L.; Guzzi, M.; Lai, H.-L.; Nadolsky, P. M.; Olness, F. I.; High Energy Physics; Southern Methodist Univ.; Taipei Municipal Univ. of Education

2010-01-01

412

X-ray fluorescence analysis of iron and manganese distribution in primary dopaminergic neurons

Transition metals have been suggested to play a pivotal role in the pathogenesis of Parkinson's disease. X-ray microscopy combined with a cryogenic setup is a powerful method for elemental imaging in low concentrations and high resolution in intact cells, eliminating the need for fixation and sectioning of the specimen. Here, we performed an elemental distribution analysis in cultured primary midbrain neurons with a step size in the order of 300 nm and ~0.1 ppm sensitivity under cryo conditions by using X-ray fluorescence microscopy. We report the elemental mappings on the subcellular level in primary mouse dopaminergic (DAergic) and non-DAergic neurons after treatment with transition metals. Application of Fe2+ resulted in largely extracellular accumulation of iron without preference for the neuronal transmitter subtype. A quantification of different Fe oxidation states was performed using X-ray absorption near edge structure analysis. After treatment with Mn2+, a cytoplasmic/paranuclear localization of Mn was observed preferentially in DAergic neurons, while no prominent signal was detectable after Mn3+ treatment. Immunocytochemical analysis correlated the preferential Mn uptake to increased expression of voltage-gated calcium channels in DAergic neurons. We discuss the implications of this differential elemental distribution for the selective vulnerability of DAergic neurons and Parkinson's disease pathogenesis. PMID:23106162

Dučić, Tanja; Barski, Elisabeth; Salomé, Murielle; Koch, Jan C; Bähr, Mathias; Lingor, Paul

2013-01-01

414

Simulations based on finite element analysis (FEA) have attracted increasing interest in dentistry and dental anthropology for evaluating the stress and strain distribution in teeth under occlusal loading conditions. Nonetheless, FEA is usually applied without considering changes in contacts between antagonistic teeth during the occlusal power stroke. In this contribution we show how occlusal information can be used to investigate the stress distribution with 3D FEA in lower first molars (M1). The antagonistic crowns M1 and P2–M1 of two dried modern human skulls were scanned by μCT in maximum intercuspation (centric occlusion) contact. A virtual analysis of the occlusal power stroke between M1 and P2–M1 was carried out in the Occlusal Fingerprint Analyser (OFA) software, and the occlusal trajectory path was recorded, while contact areas per time-step were visualized and quantified. Stress distributions of the M1 in selected occlusal stages were analyzed in Strand7, considering occlusal information taken from OFA results for individual loading direction and loading area. Our FEA results show that the stress pattern changes considerably during the power stroke, suggesting that wear facets have a crucial influence on the distribution of stress on the whole tooth. Grooves and fissures on the occlusal surface are seen as critical locations, as tensile stresses are concentrated at these features. Properly accounting for the power-stroke kinematics of occluding teeth yields quite different results (lower tensile stresses in the crown) than the usual loading scenarios based on forces parallel to the long axis of the tooth. This leads to the conclusion that functional studies considering the kinematics of teeth are important to understand biomechanics and interpret morphological adaptation of teeth. PMID:21615398

Benazzi, Stefano; Kullmer, Ottmar; Grosse, Ian R; Weber, Gerhard W

2011-01-01

415

NASA Astrophysics Data System (ADS)

Calibration and uncertainty analysis in hydrologic modeling are affected by measurement errors in input and response and errors in model structure. Recently, extending similar approaches in discrete time, a continuous time autoregressive error model was proposed for statistical inference and uncertainty analysis in hydrologic modeling. The major advantages over discrete time formulation are the use of a continuous time error model for describing continuous processes, the possibility of accounting for seasonal variations of parameters in the error model, the easier treatment of missing data or omitted outliers, and the opportunity for continuous time predictions. The model was developed for the Chaohe Basin in China and had some features specific for this semiarid climatic region (in particular, the seasonal variation of parameters in the error model in response to seasonal variation in precipitation). This paper tests and extends this approach with an application to the Thur River basin in Switzerland, which is subject to completely different climatic conditions. This application corroborates the general applicability of the approach but also demonstrates the necessity of accounting for the heavy tails in the distributions of residuals and innovations. This is done by replacing the normal distribution of the innovations by a Student t distribution, the degrees of freedom of which are adapted to best represent the shape of the empirical distribution of the innovations. We conclude that with this extension, the continuous time autoregressive error model is applicable and flexible for hydrologic modeling under different climatic conditions. The major remaining conceptual disadvantage is that this class of approaches does not lead to a separate identification of model input and model structural errors. The major practical disadvantage is the high computational demand characteristic for all Markov chain Monte Carlo techniques.
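The continuous-time autoregressive error model with heavy-tailed innovations can be sketched numerically. The snippet below simulates a continuous-time AR(1) (Ornstein-Uhlenbeck) error process on an irregular time grid with scaled Student t innovations; the parameter values (tau, sigma, nu) and the sampling grid are invented for illustration, not taken from the Chaohe or Thur applications.

```python
import numpy as np

def simulate_ou_errors(times, tau, sigma, nu, rng):
    """Continuous-time AR(1) errors on an irregular grid: exact OU decay
    over each time gap, with scaled Student-t innovations for heavy tails
    (nu > 2 so the innovation variance is finite)."""
    e = np.zeros(len(times))
    t_scale = np.sqrt((nu - 2.0) / nu)   # rescale t draws to unit variance
    for i in range(1, len(times)):
        dt = times[i] - times[i - 1]
        rho = np.exp(-dt / tau)          # correlation decay over the gap
        innov = rng.standard_t(nu) * t_scale
        e[i] = rho * e[i - 1] + sigma * np.sqrt(1.0 - rho ** 2) * innov
    return e

rng = np.random.default_rng(0)
times = np.sort(rng.uniform(0.0, 100.0, size=5000))   # irregular sampling
errs = simulate_ou_errors(times, tau=2.0, sigma=1.0, nu=5.0, rng=rng)
```

Replacing `rng.standard_t(nu)` with `rng.standard_normal()` recovers the Gaussian-innovation model; adapting `nu` to the empirical innovation distribution is the extension the abstract describes.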

Yang, Jing; Reichert, Peter; Abbaspour, Karim C.

2007-10-01

416

Statistical Distribution of Inflation on Lava Flows: Analysis of Flow Surfaces on Earth and Mars

NASA Technical Reports Server (NTRS)

The surface morphology of a lava flow results from processes that take place during the emplacement of the flow. Certain types of features, such as tumuli, lava rises and lava rise pits, are indicators of flow inflation or endogenous growth of a lava flow. Tumuli in particular have been identified as possible indicators of tube location, indicating that their distribution on the surface of a lava flow is a function of the internal pathways of lava present during flow emplacement. However, the distribution of tumuli on lava flows has not been examined in a statistically thorough manner. In order to more rigorously examine the distribution of tumuli on a lava flow, we examined a discrete flow lobe with numerous lava rises and tumuli on the 1969-1974 Mauna Ulu flow at Kilauea, Hawaii. The lobe is located in the distal portion of the flow below Holei Pali, which is characterized by hummocky pahoehoe flows emplaced from tubes. We chose this flow due to its discrete nature allowing complete mapping of surface morphologies, well-defined boundaries, well-constrained emplacement parameters, and known flow thicknesses. In addition, tube locations for this Mauna Ulu flow were mapped by Holcomb (1976) during flow emplacement. We also examine the distribution of tumuli on the distal portion of the hummocky Thrainsskjoldur flow field provided by Rossi and Gudmundsson (1996). Analysis of the Mauna Ulu and Thrainsskjoldur flow lobes and the availability of high-resolution MOC images motivated us to look for possible tumuli-dominated flow lobes on the surface of Mars. We identified a MOC image of a lava flow south of Elysium Mons with features morphologically similar to tumuli. The flow is characterized by raised elliptical to circular mounds, some with axial cracks, that are similar in size to the tumuli measured on Earth.
One potential avenue for determining whether they are tumuli is to examine their spatial distribution for patterns similar to those of tumuli-dominated terrestrial flows. Since tumuli form by the injection of lava beneath a crust, the distribution of tumuli on a flow should represent the distribution of thermally preferred pathways beneath the surface of the crust. That distribution of thermally preferred pathways may be a function of the evolution of a basaltic lava flow. As a longer-lived flow evolves, initially broad thermally preferred pathways would evolve to narrower, more well-defined tube-like pathways. The final flow morphology clearly preserves the growth of the flow over time, with inflation features indicating pathways that were not necessarily contemporaneously active. Here, we use statistical analysis to test whether this final flow morphology produces distinct distributions that can be used to readily determine the distribution of thermally preferred pathways beneath the surface of the crust.
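A standard first test of whether mapped mound centres (such as tumuli) depart from complete spatial randomness is the Clark-Evans nearest-neighbour index. The sketch below, on synthetic coordinates and without edge correction, illustrates that generic test; it is an assumed example of one plausible analysis, not the specific statistics used in this study.

```python
import numpy as np

def clark_evans_index(points, area):
    """Clark-Evans nearest-neighbour index R for a 2-D point pattern:
    R ~ 1 under complete spatial randomness, R < 1 for clustering,
    R > 1 for regular spacing.  Edge effects are ignored in this sketch."""
    pts = np.asarray(points, dtype=float)
    n = len(pts)
    d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)           # exclude self-distances
    mean_nn = d.min(axis=1).mean()        # observed mean NN distance
    expected = 0.5 / np.sqrt(n / area)    # expectation under randomness
    return mean_nn / expected

rng = np.random.default_rng(1)
tumuli_xy = rng.uniform(0, 100, size=(200, 2))  # synthetic mound centres
R_idx = clark_evans_index(tumuli_xy, area=100.0 * 100.0)
```

For a random pattern like this one, R_idx comes out near 1; clustered tube-fed pathways would pull it well below 1.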

Glazel, L. S.; Anderson, S. W.; Stofan, E. R.; Baloga, S.

2003-01-01

417

A Model Based on Environmental Factors for Diameter Distribution in Black Wattle in Brazil

This article discusses the dynamics of a diameter distribution in stands of black wattle throughout its growth cycle using the Weibull probability density function. Moreover, the parameters of this distribution were related to environmental variables from meteorological data and the surface soil horizon, with the aim of finding a model for diameter distribution whose coefficients were related to the environmental variables. We found that the diameter distribution of the stand changes only slightly over time and that the estimators of the Weibull function are correlated with various environmental variables, with accumulated rainfall foremost among them. Thus, a model was obtained in which the estimators of the Weibull function are dependent on rainfall. Such a function can have important applications, such as simulating growth potential in regions where historical growth data is lacking, as well as the behavior of the stand under different environmental conditions. The model can also be used to project growth in diameter, based on the rainfall affecting the forest over a certain time period. PMID:24932909
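The Weibull-fitting step underlying this kind of model can be sketched with SciPy; the shape and scale values and the diameter sample below are synthetic stand-ins for field measurements, not the paper's data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
# synthetic stand: diameters (cm) drawn from a known Weibull as stand-ins
diameters = stats.weibull_min.rvs(2.5, loc=0.0, scale=18.0, size=2000,
                                  random_state=rng)

# maximum-likelihood fit of a two-parameter Weibull (location fixed at 0)
c_hat, loc, scale_hat = stats.weibull_min.fit(diameters, floc=0)
```

In the paper's spirit, estimates from many stands would then be regressed on accumulated rainfall, e.g. scale = a + b * rainfall; that linear form is a placeholder here, not the authors' fitted model.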

Sanquetta, Carlos Roberto; Behling, Alexandre; Dalla Corte, Ana Paula; Pellico Netto, Sylvio; Rodrigues, Aurelio Lourenco; Simon, Augusto Arlindo

2014-01-01

418

More than 220 large landslides along the bluffs bordering the Mississippi alluvial plain between Cairo, Ill., and Memphis, Tenn., are analyzed by discriminant analysis and multiple linear regression to determine the relative effects of slope height and steepness, stratigraphic variation, slope aspect, and proximity to the hypocenters of the 1811-12 New Madrid, Mo., earthquakes on the distribution of these landslides. Three types of landslides are analyzed: (1) old, coherent slumps and block slides, which have eroded and revegetated features and no active analogs in the area; (2) old earth flows, which are also eroded and revegetated; and (3) young rotational slumps, which are present only along near-river bluffs, and which are the only young, active landslides in the area. Discriminant analysis shows that only one characteristic differs significantly between bluffs with and without young rotational slumps: failed bluffs tend to have sand and clay at their base, which may render them more susceptible to fluvial erosion. Bluffs having old coherent slides are significantly higher, steeper, and closer to the hypocenters of the 1811-12 earthquakes than bluffs without these slides. Bluffs having old earth flows are likewise higher and closer to the earthquake hypocenters. Multiple regression analysis indicates that the distribution of young rotational slumps is affected most strongly by slope steepness: about one-third of the variation in the distribution is explained by variations in slope steepness. The distribution of old coherent slides and earth flows is affected most strongly by slope height, but the proximity to the hypocenters of the 1811-12 earthquakes also significantly affects the distribution. The results of the statistical analyses indicate that the only recently active landsliding in the area is along actively eroding river banks, where rotational slumps formed as bluffs are undercut by the river. 
The analyses further indicate that the old coherent slides and earth flows in the area are spatially related to the 1811-12 earthquake hypocenters and were thus probably triggered by those earthquakes. These results are consistent with findings of other recent investigations of landslides in the area that presented field, historical, and analytical evidence to demonstrate that old landslides in the area formed during the 1811-12 New Madrid earthquakes. Results of the multiple linear regression can also be used to approximate the relative susceptibility of the bluffs in the study area to seismically induced landsliding. © 1989.

Jibson, R.W.; Keefer, D.K.

1989-01-01

419

Predictive analysis of thermal distribution and damage in thermotherapy on biological tissue

NASA Astrophysics Data System (ADS)

The use of optical techniques is increasing the possibilities and success of medical praxis in certain cases, either in tissue characterization or treatment. Photodynamic therapy (PDT) and low intensity laser treatment (LILT) are two examples of the latter. Another very interesting implementation is thermotherapy, which consists of controlling the temperature increase in a pathological biological tissue. With this method it is possible to induce an improvement in specific diseases, but a prior analysis of the treatment is needed so that the patient does not suffer any collateral damage, an essential point given the safety margins required in medical procedures. In this work, a predictive analysis of the thermal distribution in a biological tissue irradiated by an optical source is presented. Optical propagation is based on a Radiation Transport Theory (RTT) model solved via a numerical Monte Carlo method, in a multi-layered tissue. Data obtained are included in a bio-heat equation that models heat transference, taking into account conduction, convection, radiation, blood perfusion and vaporization depending on the specific problem. The spatial-temporal differential bio-heat equation is solved via a numerical finite difference approach. Experimental temperature distributions on animal tissue irradiated by laser radiation are shown. From the thermal distribution in tissue, thermal damage is studied, based on an Arrhenius analysis, as a way of predicting harmful effects. The complete model can be used for concrete treatment proposals, as a way of predicting treatment effects and consequently deciding which optical source parameters are appropriate for the specific disease, mainly wavelength and optical power, with reasonable safety margins in the process.
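The Arrhenius damage analysis mentioned above evaluates the integral Omega(t) = ∫ A·exp(−Ea/(R·T(t))) dt, with Omega = 1 often taken as the threshold of irreversible damage. The A and Ea below are commonly quoted protein-denaturation values used purely for illustration, not parameters from this work.

```python
import numpy as np

A_FREQ = 3.1e98     # frequency factor, 1/s (illustrative literature value)
E_A    = 6.28e5     # activation energy, J/mol (illustrative)
R_GAS  = 8.314      # gas constant, J/(mol K)

def arrhenius_damage(times, temps_kelvin):
    """Cumulative Arrhenius damage for a sampled temperature history,
    integrated with the trapezoidal rule."""
    rates = A_FREQ * np.exp(-E_A / (R_GAS * np.asarray(temps_kelvin)))
    dt = np.diff(times)
    return float(np.sum(0.5 * (rates[1:] + rates[:-1]) * dt))

t = np.linspace(0.0, 10.0, 1001)                                  # 10 s exposure
omega_hot  = arrhenius_damage(t, np.full_like(t, 273.15 + 60.0))  # sustained 60 C
omega_body = arrhenius_damage(t, np.full_like(t, 273.15 + 37.0))  # baseline 37 C
```

With these constants, a sustained 60 °C exposure crosses the Omega = 1 threshold within seconds while body temperature accumulates negligible damage, the kind of margin a predictive analysis is meant to quantify.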

Fanjul-Vélez, Félix; Arce-Diego, José Luis

2007-05-01

420

Multiobjective sensitivity analysis and optimization of a distributed hydrologic model MOBIDIC

NASA Astrophysics Data System (ADS)

Calibration of distributed hydrologic models must typically cope with a large number of distributed parameters and with optimization problems in which multiple, often conflicting objectives arise naturally. This study presents a multiobjective sensitivity and optimization approach to handle these problems for the distributed hydrologic model MOBIDIC, combining two sensitivity analysis techniques (the Morris method and the State Dependent Parameter method) with the multiobjective optimization (MOO) approach ϵ-NSGAII. This approach was used to calibrate MOBIDIC for the Davidson watershed, North Carolina, with three objective functions, i.e., the standardized root mean square error of logarithmically transformed discharge, a water balance index, and the mean absolute error of the logarithmically transformed flow duration curve, and its results were compared with those of a single objective optimization (SOO) with the traditional Nelder-Mead simplex algorithm used in MOBIDIC, taking the objective function as the Euclidean norm of these three objectives. Results show: (1) the two sensitivity analysis techniques are effective and efficient in determining the sensitive processes and insensitive parameters: surface runoff and evaporation are very sensitive processes for all three objective functions, while groundwater recession and soil hydraulic conductivity are not sensitive and were excluded from the optimization; (2) both MOO and SOO lead to acceptable simulations, e.g., for MOO, the average Nash-Sutcliffe efficiency is 0.75 in the calibration period and 0.70 in the validation period; (3) evaporation and surface runoff show similar importance for the watershed water balance, while the contribution of baseflow can be ignored; (4) compared with SOO, which was dependent on the initial starting location, MOO provides more insight into parameter sensitivity and the conflicting character of these objective functions.
Multiobjective sensitivity analysis and optimization provide an alternative way for future MOBIDIC modelling.
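Of the two sensitivity techniques named above, the Morris method admits a compact sketch. The implementation below is a minimal elementary-effects screening run on a toy response; the bounds, trajectory count, and model are invented for illustration and are not MOBIDIC's parameters.

```python
import numpy as np

def morris_mu_star(f, bounds, n_traj=20, delta=0.25, seed=0):
    """Minimal Morris elementary-effects screening: perturb one factor at
    a time along random trajectories and average |effect| per factor.
    A large mu* marks a sensitive parameter."""
    rng = np.random.default_rng(seed)
    k = len(bounds)
    lo, hi = np.array(bounds, dtype=float).T
    effects = np.zeros((n_traj, k))
    for t in range(n_traj):
        x = rng.uniform(0.0, 1.0 - delta, size=k)   # base point in unit cube
        for j in rng.permutation(k):                # one-at-a-time steps
            x2 = x.copy()
            x2[j] += delta
            y1 = f(lo + (hi - lo) * x)
            y2 = f(lo + (hi - lo) * x2)
            effects[t, j] = (y2 - y1) / delta
            x = x2
    return np.abs(effects).mean(axis=0)             # mu* per factor

# toy response: first factor dominates, third is inert
mu_star = morris_mu_star(lambda p: 10.0 * p[0] + p[1] + 0.0 * p[2],
                         bounds=[(0, 1), (0, 1), (0, 1)])
```

For this linear toy model the screening recovers the construction exactly: mu* of the dominant factor is ten times that of the second, and the inert factor scores zero, which is the pattern used to exclude insensitive parameters before optimization.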

Yang, J.; Castelli, F.; Chen, Y.

2014-03-01

421

Flow distribution analysis on the cooling tube network of ITER thermal shield

NASA Astrophysics Data System (ADS)

A thermal shield (TS) is to be installed between the vacuum vessel or the cryostat and the magnets in the ITER tokamak to reduce the thermal radiation load on the magnets operating at 4.2 K. The TS is cooled by pressurized helium gas at an inlet temperature of 80 K. The cooling tube is welded on the TS panel surface, and the resulting flow network of TS cooling tubes is complex. The flow rate in each panel should be matched to the thermal design value for effective radiation shielding. This paper presents a one-dimensional analysis of the flow distribution in the cooling tube network of the ITER TS. The hydraulic cooling tube network is modeled by an electrical analogy. Only the cooling tube on the TS surface and its connecting pipe from the manifold are considered in the analysis model. Considering the frictional factor and the local loss in the cooling tube, the hydraulic resistance is expressed as a linear function of mass flow rate. Sub-circuits in the TS are analyzed separately because each circuit is controlled independently by its own control valve. It is found that flow rates in some panels are insufficient compared with the design values. In order to improve the flow distribution, two design modifications are proposed. The first is to connect the tubes of adjacent panels; this increases the resistance of the tube on the panel where the flow rate is excessive. The second is to install an orifice at the exit of the tube routing where the flow rate is to be reduced. The analysis of these design suggestions shows that the flow maldistribution is improved significantly.
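For parallel branches fed from one manifold, the electrical analogy with linearized resistances (dp = R_i * m_i) reduces to equal pressure drops and flows proportional to conductance. The sketch below uses invented resistance values, not ITER TS data, and models the orifice fix simply as added resistance on one branch.

```python
import numpy as np

R = np.array([4.0, 2.0, 8.0])   # linearized branch resistances, dp = R * m
m_total = 1.0                   # total helium mass flow through the manifold

# parallel branches share one pressure drop, so m_i is proportional to 1/R_i
g = 1.0 / R
m = m_total * g / g.sum()       # flow split among the branches
dp = R * m                      # identical drop across every branch

# orifice trimming: extra resistance on the over-fed branch rebalances flow
R_trim = R + np.array([0.0, 3.0, 0.0])
g_trim = 1.0 / R_trim
m_trim = m_total * g_trim / g_trim.sum()
```

Connecting tubes of adjacent panels, the paper's other suggestion, changes the network topology instead and would require solving the full resistance network rather than this single-manifold split.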

Nam, Kwanwoo; Chung, Wooho; Noh, Chang Hyun; Kang, Dong Kwon; Kang, Kyoung-O.; Ahn, Hee Jae; Lee, Hyeon Gon

2014-01-01

422

Stochastic analysis of transport in hillslopes: Travel time distribution and source zone dispersion

NASA Astrophysics Data System (ADS)

A stochastic model is developed for the analysis of the traveltime distribution f_τ in a hillslope. The latter is described as made up of a surficial soil underlain by a less permeable subsoil or bedrock. The heterogeneous hydraulic conductivity K is described as a stationary random space function, and the model is based on the Lagrangian representation of transport. A first-order approach in the log conductivity variance is adopted in order to get closed form solutions for the principal statistical moments of the traveltime. Our analysis indicates that the soil is mainly responsible for the early branch of f_τ, i.e., the rapid release of solute which preferentially moves through the upper soil. The early branch of f_τ is a power law, with exponent variable between -1 and -0.5; the behavior is mainly determined by unsaturated transport. The subsoil response is slower than that of the soil. The subsoil is mainly responsible for the tail of f_τ, which in many cases resembles the classic linear reservoir model. The resulting shape of f_τ is similar to the Gamma distribution. Analysis of the f_τ moments indicates that the mean traveltime is weakly dependent on the hillslope size. The traveltime variance is ruled by the distribution of distances of the injected solute from the river; we term this effect source zone dispersion. The spreading due to the K heterogeneity is less important and is obscured by source zone dispersion. The model is tested against the numerical simulation of Fiori and Russo (2008), with reasonably good agreement and no fitting procedure.
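The link between the Gamma-like shape and the early power-law branch can be checked directly: a Gamma pdf behaves as t^(k-1) at small t, so exponents between -1 and -0.5 correspond to shape parameters k between 0 and 0.5. The parameter values below are illustrative, not fitted hillslope values.

```python
import numpy as np
from scipy import stats

k, theta = 0.3, 5.0          # Gamma shape and scale (mean travel time k*theta)
t = np.logspace(-3, -1, 50)  # early times, well below the mean

pdf = stats.gamma.pdf(t, k, scale=theta)
# log-log slope of the early branch should approach k - 1 = -0.7,
# inside the exponent range quoted for the soil-dominated early release
early_slope = float(np.polyfit(np.log(t), np.log(pdf), 1)[0])
```

Conversely, fitting the observed early-time slope of a traveltime distribution gives a quick estimate of the Gamma shape parameter as slope + 1.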

Fiori, A.; Russo, D.; di Lazzaro, M.

2009-08-01

423

Gibbs distribution analysis of temporal correlations structure in retina ganglion cells

We present a method to estimate Gibbs distributions with spatio-temporal constraints on spike train statistics. We apply this method to spike trains recorded from ganglion cells of the salamander retina in response to natural movies. Our analysis, restricted to a few neurons, performs more accurately than pairwise synchronization models (Ising) or 1-time-step Markov models (Marre et al., 2009) in describing the statistics of spatio-temporal spike patterns, and emphasizes the role of higher-order spatio-temporal interactions.

Vasquez, J C; Palacios, A G; Berry, M J; Cessac, B

2011-01-01

424

Directional pair distribution function for diffraction line profile analysis of atomistic models

The concept of the directional pair distribution function is proposed to describe line broadening effects in powder patterns calculated from atomistic models of nano-polycrystalline microstructures. The approach provides at the same time a description of the size effect for domains of any shape and a detailed explanation of the strain effect caused by the local atomic displacement. The latter is discussed in terms of different strain types, also accounting for strain field anisotropy and grain boundary effects. The results can in addition be directly read in terms of traditional line profile analysis, such as that based on the Warren–Averbach method. PMID:23396818
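A bare-bones numerical reading of a direction-resolved pair distribution is to histogram interatomic distances only for pairs whose separation vector lies within a narrow cone about a chosen direction. The sketch below runs on random toy coordinates and omits the paper's full formalism (no ideal-gas normalization, no strain decomposition); all names and parameters are illustrative assumptions.

```python
import numpy as np

def directional_pdf(coords, direction, r_max=10.0, nbins=50, cos_min=0.95):
    """Histogram of pair distances restricted to pairs whose separation
    vector is nearly parallel to `direction` (|cos angle| >= cos_min)."""
    d = np.asarray(direction, dtype=float)
    d /= np.linalg.norm(d)
    sep = coords[:, None, :] - coords[None, :, :]
    r = np.linalg.norm(sep, axis=-1)
    iu = np.triu_indices(len(coords), k=1)      # count each pair once
    sep, r = sep[iu], r[iu]
    cosang = np.abs(sep @ d) / np.where(r > 0, r, 1.0)
    mask = (cosang >= cos_min) & (r < r_max)    # pairs aligned with d
    hist, edges = np.histogram(r[mask], bins=nbins, range=(0.0, r_max))
    return hist, edges

rng = np.random.default_rng(3)
atoms = rng.uniform(0.0, 20.0, size=(400, 3))   # toy atomistic model
hist, edges = directional_pdf(atoms, direction=[0, 0, 1])
```

Comparing such histograms along different directions is one simple way to expose the strain-field anisotropy the abstract discusses.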

Leonardi, Alberto; Leoni, Matteo; Scardi, Paolo

2013-01-01

425

Analysis of counterfactual quantum key distribution using error-correcting theory

NASA Astrophysics Data System (ADS)

Counterfactual quantum key distribution is an interesting direction in quantum cryptography and has been realized by some researchers. However, it has been pointed out that it is information-theoretically insecure when used over a highly lossy channel. In this paper, we revisit its security from an error-correcting-theory point of view. The analysis indicates that the security flaw arises because, when the loss rate exceeds 50%, the error rate in the users' raw key pair is as high as it would be under Eve's attack.

Li, Yan-Bing

2014-10-01

426

Constraints on spin-dependent parton distributions at large x from global QCD analysis

NASA Astrophysics Data System (ADS)

We investigate the behavior of spin-dependent parton distribution functions (PDFs) at large parton momentum fractions x in the context of global QCD analysis. We explore the constraints from existing deep-inelastic scattering data, and from theoretical expectations for the leading x → 1 behavior based on hard gluon exchange in perturbative QCD. Systematic uncertainties from the dependence of the PDFs on the choice of parametrization are studied by considering functional forms motivated by orbital angular momentum arguments. Finally, we quantify the reduction in the PDF uncertainties that may be expected from future high-x data from Jefferson Lab at 12 GeV.

Jimenez-Delgado, P.; Avakian, H.; Melnitchouk, W.

2014-11-01

428

Methods for the determination of 18 trace elements in sea water by neutron activation analysis have been developed. Of these elements sufficient analyses have been completed for gold, selenium, antimony, silver, cobalt and nickel to permit a discussion of their distributions in the world ocean. The distributions of gold, selenium and antimony are more uniform than those of silver, cobalt

Donald F. Schutz; Karl K. Turekian

1965-01-01

429

We describe the mechanism of distribution measurement of fiber Brillouin spectrum by Brillouin Optical Correlation Domain Analysis (BOCDA), and numerically simulate the distribution of the Brillouin gain spectrum, which is expected to be measured in experiments. The simulation results agree well with the experimental results. We show the verification of the theoretical formulation of spatial resolution, by using the numerical

Toyohiko Yamauchi; Kazuo Hotate

2004-01-01

430

Linear Stability Analysis of a Collisionless Distribution Function for the Force-Free Harris Sheet

NASA Astrophysics Data System (ADS)

A discussion is presented of the first linear stability analysis of the collisionless distribution function recently found by Harrison and Neukirch for the force-free Harris sheet (Physical Review Letters 102, 135003, 2009). Macroscopic instabilities are considered, and the perturbations are assumed to be two-dimensional only. The stability analysis is based on the technique of integration over unperturbed orbits. Similarly to the Harris sheet case (Nuovo Cimento, 23:115, 1962), this is only possible by using approximations to the exact orbits, which are unknown. Furthermore, the approximations for the Harris sheet case cannot be used for the force-free Harris sheet, and so new techniques have to be developed in order to make analytical progress. In addition to the full problem, the long wavelength limit is considered, and the results of the two cases are compared. The dependence of the stability on various equilibrium parameters is investigated.

Wilson, Fiona; Neukirch, Thomas

2013-04-01

431

A Linear Stability Analysis of a Collisionless Distribution Function for the Force-Free Harris Sheet

NASA Astrophysics Data System (ADS)

A discussion is presented of the first linear stability analysis of the collisionless distribution function recently found by Harrison and Neukirch for the force-free Harris sheet (Physical Review Letters 102, 135003, 2009). Macroscopic instabilities are considered, and the perturbations are assumed to be two-dimensional only. The stability analysis is based on the technique of integration over unperturbed orbits, using approximations to the exact orbits, for which analytical expressions are not available. For the force-free Harris sheet, different approximations have to be used than in the Harris sheet case. Only the long-wavelength limit is considered, and the dependence of the stability on various equilibrium parameters is investigated.

Neukirch, T.; Wilson, F.

2012-12-01

432

NASA Astrophysics Data System (ADS)

The application of Stoeckli theory to determine the pore size distribution (PSD) of activated carbons using high pressure methane adsorption data is explored. Coconut shell was used as a raw material for the preparation of 16 different activated carbon samples. Four samples with higher methane adsorption were selected, and nitrogen adsorption on these adsorbents was also investigated. Some differences are found between the PSD obtained from the analysis of nitrogen adsorption isotherms and the PSD resulting from the same analysis using methane adsorption data. It is suggested that these differences may arise from the specific interactions between nitrogen molecules and activated carbon surfaces; therefore caution is required in the interpretation of PSD obtained from the nitrogen isotherm data.

Ahmadpour, A.; Okhovat, A.; Darabi Mahboub, M. J.

2013-06-01

433

New Statistical Methods for Analysis of Large Surveys: Distributions and Correlations

The aim of this paper is to describe new statistical methods for the determination of the correlations among, and distributions of, physical parameters from multivariate data with general and arbitrary truncations and selection biases. These methods, developed in collaboration with B. Efron of the Department of Statistics at Stanford, can be used for analysis of combined data from many surveys with different and varied observational selection criteria. For clarity we will use the luminosity function of AGNs and its evolution to demonstrate the methods. We will first describe the general features of data truncation and present a brief review of past methods of analysis. Then we will describe the new methods and results from simulations testing their accuracy. Finally we will present the results from application of the methods to a sample of quasars.

Vahé Petrosian

2001-12-19

434

Analysis of flux distribution and core losses in interior permanent magnet motor

The interior permanent magnet (IPM) motor with its robust rotor construction, hybrid torque production nature and flux-weakening capability is suitable for electric vehicle applications where wide speed and torque range is required. At high-speed operations, core losses become an important issue because they affect efficiency and raise operating temperatures. This paper discusses the results of two-dimensional finite element analysis into the relationship between flux distribution and core losses in the IPM motor. The analysis is further supported by flux measurements using search coils installed in an experimental motor. Three methods of predicting the core losses in IPM motor are also investigated. These methods are the empirical formula method, finite element computed waveform method and the search coil induced voltage method.

Tseng, K.J.; Wee, S.B.

1999-12-01

435

NASA Astrophysics Data System (ADS)

Due to ever more efficient and accurate laser scanning technologies, the analysis of 3D point clouds has become an important task in modern photogrammetry and remote sensing. To exploit the full potential of such data for structural analysis and object detection, reliable geometric features are of crucial importance. Since multiscale approaches have proved very successful for image-based applications, efforts are currently being made to apply similar approaches to 3D point clouds. In this paper we analyse common geometric covariance features, pinpointing some severe limitations regarding their performance on varying scales. Instead, we propose a different feature type based on shape distributions known from object recognition. These novel features show very reliable performance over a wide scale range, and in classification they outperform covariance features in all tested cases.
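The shape-distribution idea referenced above goes back to Osada et al.'s object-recognition work; its simplest member, the D2 feature (a normalized histogram of distances between random point pairs), is sketched below on synthetic neighbourhoods. This is an illustrative assumption about the family of features, not the exact descriptor set of this paper.

```python
import numpy as np

def d2_shape_distribution(points, n_pairs=10000, nbins=32, seed=0):
    """D2 shape distribution: histogram of distances between random point
    pairs, normalized to sum to 1, used as a scale-robust shape feature."""
    rng = np.random.default_rng(seed)
    i = rng.integers(0, len(points), size=n_pairs)
    j = rng.integers(0, len(points), size=n_pairs)
    d = np.linalg.norm(points[i] - points[j], axis=1)
    hist, _ = np.histogram(d, bins=nbins, range=(0.0, d.max() + 1e-9))
    return hist / hist.sum()

rng = np.random.default_rng(4)
sphere = rng.normal(size=(500, 3))
sphere /= np.linalg.norm(sphere, axis=1, keepdims=True)      # spherical cloud
plane = np.c_[rng.uniform(-1, 1, (500, 2)), np.zeros(500)]   # planar cloud

f_sphere = d2_shape_distribution(sphere)
f_plane = d2_shape_distribution(plane)
```

The two normalized histograms differ markedly, which is why a classifier fed such features can separate locally spherical from locally planar neighbourhoods.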

Blomley, R.; Weinmann, M.; Leitloff, J.; Jutzi, B.

2014-08-01

436

System analysis for the Huntsville Operation Support Center distributed computer system

NASA Technical Reports Server (NTRS)

A simulation model of the NASA Huntsville Operational Support Center (HOSC) was developed. This simulation model emulates the HYPERchannel Local Area Network (LAN) that ties together the various computers of HOSC. The HOSC system is a large installation of mainframe computers such as the Perkin-Elmer 3200 series and the DEC VAX series. A series of six simulation exercises of the HOSC model, using data sets provided by NASA, is described. An analytical analysis of the Ethernet LAN and the video terminal (VT) distribution system is presented, along with an interface analysis of the smart-terminal network model, which allows the data flow requirements due to VTs on the Ethernet LAN to be estimated.

Ingels, F. M.

1986-01-01

437

An Analysis of the Distribution of Rainfall and Some Rainfall Associations for Selected Stations in Western Colombia. A thesis by David Gordon Morris, submitted to the Graduate College of Texas A&M University in partial fulfillment of the requirements for the degree of Master of Science, May 1966. Major subject: Meteorology.

Morris, David Gordon

2012-06-07

438

NASA Astrophysics Data System (ADS)

The Underground Test Area (UGTA) Project of the U.S. Department of Energy, National Nuclear Security Administration Nevada Site Office is in the process of assessing and developing regulatory decision options based on modeling predictions of contaminant transport from underground testing of nuclear weapons at the Nevada Test Site (NTS). The UGTA Project is attempting to develop an effective modeling strategy that addresses and quantifies multiple components of uncertainty including natural variability, parameter uncertainty, conceptual/model uncertainty, and decision uncertainty in translating model results into regulatory requirements. The modeling task presents multiple unique challenges to the hydrological sciences as a result of the complex fractured and faulted hydrostratigraphy, the distributed locations of sources, the suite of reactive and non-reactive radionuclides, and uncertainty in conceptual models. Characterization of the hydrogeologic system is difficult and expensive because of deep groundwater in the arid desert setting and the large spatial setting of the NTS. Therefore, conceptual model uncertainty is partially addressed through the development of multiple alternative conceptual models of the hydrostratigraphic framework and multiple alternative models of recharge and discharge. Uncertainty in boundary conditions is assessed through development of alternative groundwater fluxes through multiple simulations using the regional groundwater flow model. Calibration of alternative models to heads and measured or inferred fluxes has not proven to provide clear measures of model quality. Therefore, model screening by comparison to independently-derived natural geochemical mixing targets through cluster analysis has also been invoked to evaluate differences between alternative conceptual models. Advancing multiple alternative flow models, sensitivity of transport predictions to parameter uncertainty is assessed through Monte Carlo simulations. 
The simulations are challenged by the distributed sources in each of the Corrective Action Units, by complex mass transfer processes, and by the size and complexity of the field-scale flow models. An efficient methodology utilizing particle tracking results and convolution integrals provides in situ concentrations appropriate for Monte Carlo analysis. Uncertainty in source releases and transport parameters including effective porosity, fracture apertures and spacing, matrix diffusion coefficients, sorption coefficients, and colloid load and mobility are considered. With the distributions of input uncertainties and output plume volumes, global analysis methods including stepwise regression, contingency table analysis, and classification tree analysis are used to develop sensitivity rankings of parameter uncertainties for each model considered, thus assisting a variety of decisions. The National Security Technologies, LLC component of this work is DOE/NV/25946--xxx and was done under contract number DE-AC52-06NA25946 with the U.S. Department of Energy.

Wolfsberg, A.; Kang, Q.; Li, C.; Ruskauff, G.; Bhark, E.; Freeman, E.; Prothro, L.; Drellack, S.

2007-12-01

439

Determination and analysis of distribution coefficients of 137Cs in soils from Biscay (Spain).

The distribution coefficient of 137Cs has been determined in 58 soils from 12 sampling points in Biscay by treating 10 g of soil with 25 ml of an aqueous solution containing 1765 Bq of the radionuclide, shaking for 64 h, and measuring the residual activity with a suitable detector. Soils were characterised by sampling depth, particle size analysis and the usual chemical parameters. The soils were then treated to determine the chemical forms of 137Cs speciation by successive extractions, quantifying the fractions that were exchangeable or associated with carbonates, iron oxides and organic matter, and obtaining by difference the amount taken up by the remaining soil constituents. For this part of the study, 16 soils from four points were selected from the previous samples. The greatest mean percentages of 137Cs sorption were associated with the residual (69.93%), exchangeable (13.17%) and organic matter (12.54%) fractions. The paper also includes the calculation of partial distribution coefficients for chemical species, as well as relations of the distribution coefficients both among themselves and with soil parameters. PMID:15092865
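The batch distribution coefficient follows from a simple mass balance on the solution activity, Kd = ((A0 - Aeq)/Aeq) * (V/m). A hedged Python sketch; the initial activity, volume and soil mass follow the abstract, but the residual activity below is an invented example, not a measured value from the study:

```python
# Hedged sketch: batch-method distribution coefficient from residual solution
# activity, Kd = ((A0 - A_eq) / A_eq) * (V / m). The initial activity, volume
# and soil mass follow the abstract; the residual activity below is an
# invented example, not a measured value from the study.
def kd_ml_per_g(a0_bq, a_residual_bq, volume_ml=25.0, soil_mass_g=10.0):
    sorbed = a0_bq - a_residual_bq          # activity taken up by the soil
    return (sorbed / a_residual_bq) * (volume_ml / soil_mass_g)

print(kd_ml_per_g(1765.0, 100.0))  # strong sorption gives a large Kd
```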

Elejalde, C; Herranz, M; Legarda, F; Romero, F

2000-10-01

440

Method for analysis and display of distribution of emphysema in CT scans

NASA Astrophysics Data System (ADS)

A novel method for the assessment and display of the distribution of emphysema in low-dose helical CT scans has been developed. The automated system segments the lung volume and estimates the degree of emphysema as a function of slice position within the lung. Eighty low-dose (120 kVp, 40 mA) high-resolution (2.5 mm slice thickness) CT scans were randomly selected from our lung cancer screening program. Three emphysema assessments were performed on each scan: the traditional method of averaging the degree of emphysema on four pre-selected CT slices, the total volumetric percentage of emphysema, and a graphical display of emphysema burden as a function of slice position based on a sliding window algorithm. The traditional four-slice estimates showed a high correlation (0.98) with the total volumetric percentages, yet provided limited spatial information. In those cases with a higher overall percentage of emphysema, the distribution within the lung as quantified by the new method was more skewed than that of less severe cases or normals. Analysis and display of the spatial distribution of emphysema allows for assessment of emphysema burden within each lung zone, which may be useful for quantitating the type of emphysema and the progression of disease over time.
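A per-slice emphysema profile of the kind described can be sketched as the fraction of segmented lung voxels below a density threshold, smoothed with a sliding window over slice position. A hedged Python illustration; the -950 HU threshold, window size and toy voxel values are assumptions, not the authors' parameters:

```python
# Hedged sketch of a per-slice emphysema profile: the fraction of segmented
# lung voxels below a density threshold, smoothed with a sliding window over
# slice position. The -950 HU threshold, window size, and toy voxel values
# are assumptions for illustration, not the authors' parameters.
def emphysema_profile(slices, threshold_hu=-950, window=3):
    raw = [sum(v < threshold_hu for v in s) / len(s) for s in slices]
    half = window // 2
    smoothed = []
    for i in range(len(raw)):
        win = raw[max(0, i - half):i + half + 1]
        smoothed.append(sum(win) / len(win))
    return smoothed

# Toy "scan": three slices of lung voxel values in Hounsfield units.
scan = [[-980, -900, -850], [-960, -970, -800], [-700, -820, -840]]
print(emphysema_profile(scan, window=1))
```

Plotting this profile against slice position gives the kind of spatial display described in the abstract, while its mean recovers the single volumetric percentage.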

Kostis, William J.; Fluture, Simina C.; Yankelevitz, David F.; Henschke, Claudia I.

2003-05-01

441

PLR analysis of optical packet switch with different packet length distribution

NASA Astrophysics Data System (ADS)

This paper studies the performance of an all-optical packet switch (OPS) for different packet lengths. The packet loss ratio (PLR) of the OPS, both without and with a fiber delay line (FDL) buffer, is analyzed for different packet length distributions under a Poisson arrival process and a bursty super-exponential arrival process, respectively. The experiments and analysis show that the performance for different packet lengths is influenced by the packet arrival process and the buffer size. Without a buffer, bursty traffic yields the higher PLR. Under Poisson arrivals the PLR is not influenced by the packet length distribution, while under super-exponential traffic it is: fixed-length packets yield a higher PLR than variable-length exponentially distributed packets or empirical Internet traffic. With a buffer, the PLR under Poisson arrivals is lower than under super-exponential arrivals at the same load; super-exponential traffic with empirical packet lengths leads to the highest PLR, while Poisson traffic with fixed packet lengths gives the lowest.

Liu, Huanlin; Chen, Qianbin; Pan, Yingjun

2006-09-01

442

Impedance analysis of a tight epithelium using a distributed resistance model.

This paper develops techniques for equivalent circuit analysis of tight epithelia by alternating-current impedance measurements, and tests these techniques on rabbit urinary bladder. Our approach consists of measuring transepithelial impedance, also measuring the DC voltage-divider ratio with a microelectrode, and extracting values of circuit parameters by computer fit of the data to an equivalent circuit model. We show that the commonly used equivalent circuit models of epithelia give significant misfits to the impedance data, because these models (so-called "lumped models") improperly represent the distributed resistors associated with long and narrow spaces such as lateral intercellular spaces (LIS). We develop a new "distributed model" of an epithelium to take account of these structures and thereby obtain much better fits to the data. The extracted parameters include the resistance and capacitance of the apical and basolateral cell membranes, the series resistance, and the ratio of the cross-sectional area to the length of the LIS. The capacitance values yield estimates of real area of the apical and basolateral membranes. Thus, impedance analysis can yield morphological information (configuration of the LIS, and real membrane areas) about a living tissue, independently of electron microscopy. The effects of transport-modifying agents such as amiloride and nystatin can be related to their effects on particular circuit elements by extracting parameter values from impedance runs before and during application of the agent. Calculated parameter values have been validated by independent electrophysiological and morphological measurements. PMID:262419
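The lumped-versus-distributed distinction can be illustrated with a one-dimensional RC transmission line: an open-ended distributed RC element has input impedance Z = Z0*coth(gamma) with gamma = sqrt(j*omega*R*C) and Z0 = sqrt(R/(j*omega*C)), which matches a lumped RC element at low frequency but diverges from it as frequency grows. A hedged Python sketch with illustrative component values, not fitted epithelial parameters:

```python
import cmath

# Hedged sketch contrasting a lumped RC element with a distributed
# (transmission-line) RC element, such as a long, narrow lateral space.
# R is the total series resistance, C the total shunt capacitance;
# the values used below are illustrative, not fitted tissue parameters.
def z_lumped(omega, r, c):
    return r + 1.0 / (1j * omega * c)

def z_distributed(omega, r, c):
    gamma = cmath.sqrt(1j * omega * r * c)       # propagation factor
    z0 = cmath.sqrt(r / (1j * omega * c))        # characteristic impedance
    return z0 / cmath.tanh(gamma)                # open-ended distributed line

# The two models agree at low frequency and diverge as omega grows.
for omega in (1.0, 1e4):
    print(abs(z_lumped(omega, 1e3, 1e-6)), abs(z_distributed(omega, 1e3, 1e-6)))
```

This frequency-dependent divergence is exactly why a lumped model misfits impedance data for tissue with long, narrow lateral intercellular spaces.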

Clausen, C; Lewis, S A; Diamond, J M

1979-01-01

443

Analysis of adipose tissue distribution using whole-body magnetic resonance imaging

NASA Astrophysics Data System (ADS)

Obesity is an increasing problem in the western world and triggers diseases like cancer, type 2 diabetes, and cardiovascular diseases. In recent years, magnetic resonance imaging (MRI) has become a clinically viable method to measure the amount and distribution of adipose tissue (AT) in the body. However, analysis of MRI images by manual segmentation is a tedious and time-consuming process. In this paper, we propose a semi-automatic method to quantify the amount of different AT types from whole-body MRI data with less user interaction. Initially, body fat is extracted by automatic thresholding. A statistical shape model of the abdomen is then used to differentiate between subcutaneous and visceral AT. Finally, fat in the bone marrow is removed using morphological operators. The proposed method was evaluated on 15 whole-body MRI images using manual segmentation as ground truth for adipose tissue. The resulting overlap for total AT was 93.7% +/- 5.5 with a volumetric difference of 7.3% +/- 6.4. Furthermore, we tested the robustness of the segmentation results with regard to the initial, interactively defined position of the shape model. In conclusion, the developed method proved suitable for the analysis of AT distribution from whole-body MRI data. For large studies, a fully automatic version of the segmentation procedure is expected in the near future.

Wald, Diana; Schwarz, Tobias; Dinkel, Julien; Delorme, Stefan; Teucher, Birgit; Kaaks, Rudolf; Meinzer, Hans-Peter; Heimann, Tobias

2011-03-01

444

In situ analysis of a bimodal size distribution of superparamagnetic nanoparticles.

The dispersed iron oxide nanoparticles of ferrofluids in aqueous solution are difficult to characterize due to their protective polymer coatings. We report on the bimodal size distribution of superparamagnetic iron oxide nanoparticles found in the MRI contrast agent Resovist, which is a representative example of commercial nanoparticle-based pharmaceutical formulations. The radii of the majority of the nanoparticles (>99%) range from 4 to 13 nm (less than 1% of the particles display radii up to 21 nm). The maxima of the size distributions are at 5.0 and 9.9 nm. The analysis was performed with in situ characterization of Resovist via online coupling of asymmetrical flow field-flow fractionation (A4F) with small-angle X-ray scattering (SAXS) using a standard copper X-ray tube as a radiation source. The outlet of the A4F was directly coupled to a flow capillary on the SAXS instrument. SAXS curves of nanoparticle fractions were recorded at 1-min time intervals. We recommend using the A4F-SAXS coupling as a routine method for analysis of dispersed nanoparticles with sizes in the range of 1-100 nm. It allows a fast and quantitative comparison of different batches without the need for sample preparation. PMID:19117457

Thünemann, Andreas F; Rolf, Simone; Knappe, Patrick; Weidner, Steffen

2009-01-01

445

NASA Astrophysics Data System (ADS)

Vein patterns can be used for access control, identification, and authentication, and are more reliable than classical identification methods. Furthermore, these patterns can be used for venipuncture in healthcare, to locate the veins of patients when they cannot be seen with the naked eye. In this paper, an image acquisition system is implemented to acquire digital images of people's hands in the near infrared. The image acquisition system consists of a CCD camera and a light source with peak emission at 880 nm. This radiation can penetrate the skin and is strongly absorbed by the deoxyhemoglobin present in the blood of the veins. Our method of analysis is composed of several steps, the first of which is the enhancement of the acquired images, implemented with spatial filters. After that, adaptive thresholding and mathematical morphology operations are used to obtain the distribution of the vein patterns. This process supports the recognition of people through images of their palm-dorsal vein distributions obtained in the near infrared. This work compares two different feature extraction techniques, moments and veincode. The classification task is carried out using artificial neural networks. Two databases are used to analyze the performance of the algorithms: the first belongs to the Hong Kong Polytechnic University and the second is our own.

Castro-Ortega, R.; Toxqui-Quitl, C.; Solís-Villarreal, J.; Padilla-Vivanco, A.; Castro-Ramos, J.

2014-09-01

446

Analysis of Regolith Simulant Ejecta Distributions from Normal Incident Hypervelocity Impact

NASA Technical Reports Server (NTRS)

The National Aeronautics and Space Administration (NASA) has established the Constellation Program, which has defined one of its many goals as long-term lunar habitation. Critical to the design of a lunar habitat is an understanding of the lunar surface environment; of specific importance is the primary meteoroid and subsequent ejecta environment. The document NASA SP-8013, 'Meteoroid Environment Model Near Earth to Lunar Surface', was developed for the Apollo program in 1969 and contains the latest definition of the lunar ejecta environment. There is concern that NASA SP-8013 may overestimate the lunar ejecta environment. NASA's Meteoroid Environment Office (MEO) has initiated several tasks to improve the accuracy of our understanding of the lunar surface ejecta environment. This paper reports the results of experiments on projectile impact into powdered pumice and unconsolidated JSC-1A Lunar Mare Regolith simulant targets. Projectiles were accelerated to velocities between 2.45 and 5.18 km/s at normal incidence using the Ames Vertical Gun Range (AVGR). The ejected particles were detected by thin aluminum foil targets strategically placed around the impact site, and angular ejecta distributions were determined. The analysis assumes spherical symmetry of the ejecta resulting from normal impact and that all ejecta particles were of the mean target particle size. This analysis produces a hemispherical flux density distribution of ejecta with sufficient velocity to penetrate the aluminum foil detectors.
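Under the stated symmetry assumption, converting foil hit counts into a hemispherical flux density amounts to dividing counts by the solid angle subtended by each annular band of elevation angles. A hedged Python sketch with invented counts and band geometry:

```python
import math

# Hedged sketch: converting foil hit counts into an ejecta flux density per
# steradian, assuming azimuthal symmetry about the normal-incidence impact
# point. Counts and elevation bands are invented for illustration.
def flux_per_steradian(hits, elev_lo_deg, elev_hi_deg):
    lo, hi = math.radians(elev_lo_deg), math.radians(elev_hi_deg)
    band_sr = 2.0 * math.pi * (math.sin(hi) - math.sin(lo))  # annular band
    return hits / band_sr

# Toy data: lower-elevation foils catch more ejecta in this invented set.
for hits, (lo, hi) in [(120, (10, 30)), (60, (30, 50)), (15, (50, 70))]:
    print((lo, hi), round(flux_per_steradian(hits, lo, hi), 1))
```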

Edwards, David L.; Cooke, William; Suggs, Rob; Moser, Danielle E.

2008-01-01

447

Every biological organism relies for its proper function on interactions between a multitude of molecular entities like RNA, proteins, and metabolites. The comprehensive measurement and the analysis of all these entities would therefore provide the basis for our functional and mechanistic understanding of most biological processes. Next to their amount and identity, it is most crucial to also gain information about the subcellular distribution and the flux of the measured compounds between the cellular compartments. That is, we want to understand not only the individual functions of cellular components but also their functional implications within the whole organism. While the analysis of macromolecules like DNA, RNA, and proteins is quite established and robust, analytical techniques for small metabolites, which are prone to diffusion and degradation processes, provide a host of unsolved challenges. The major limitations here are the metabolite conversion and relocation processes. In this protocol we describe a methodological workflow which includes a nonaqueous fractionation method, a fractionated two-phase liquid/liquid extraction protocol, and a software package, which together allow extracting and analyzing starch, proteins, and especially polar and lipophilic metabolites from a single sample towards the estimation of their subcellular distributions. PMID:24057387

Krueger, Stephan; Steinhauser, Dirk; Lisec, Jan; Giavalisco, Patrick

2014-01-01

448

Nonlinear Reduced-Order Analysis with Time-Varying Spatial Loading Distributions

NASA Technical Reports Server (NTRS)

Oscillating shocks acting in combination with high-intensity acoustic loadings present a challenge to the design of resilient hypersonic flight vehicle structures. This paper addresses some features of this loading condition and certain aspects of a nonlinear reduced-order analysis with emphasis on system identification leading to formation of a robust modal basis. The nonlinear dynamic response of a composite structure subject to the simultaneous action of locally strong oscillating pressure gradients and high-intensity acoustic loadings is considered. The reduced-order analysis used in this work has been previously demonstrated to be both computationally efficient and accurate for time-invariant spatial loading distributions, provided that an appropriate modal basis is used. The challenge of the present study is to identify a suitable basis for loadings with time-varying spatial distributions. Using a proper orthogonal decomposition and modal expansion, it is shown that such a basis can be developed. The basis is made more robust by incrementally expanding it to account for changes in the location, frequency and span of the oscillating pressure gradient.
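A proper orthogonal decomposition of response snapshots, as used above to form the modal basis, can be sketched in miniature: here the dominant POD mode is extracted by power iteration on the snapshot correlation matrix, a small stdlib stand-in for the SVD typically used in practice. All data below are synthetic:

```python
import math, random

# Hedged sketch: dominant proper orthogonal decomposition (POD) mode of a set
# of response snapshots, extracted by power iteration on the snapshot
# correlation matrix -- a small stdlib stand-in for the SVD used in practice.
def dominant_pod_mode(snapshots, iters=200):
    n = len(snapshots[0])
    c = [[sum(s[i] * s[j] for s in snapshots) / len(snapshots)
          for j in range(n)] for i in range(n)]
    rng = random.Random(0)
    v = [rng.random() for _ in range(n)]
    for _ in range(iters):
        w = [sum(c[i][j] * v[j] for j in range(n)) for i in range(n)]
        norm = math.sqrt(sum(x * x for x in w))
        v = [x / norm for x in w]
    return v

# Synthetic snapshots dominated by one spatial shape plus small noise.
shape = [1.0, 2.0, 3.0, 2.0, 1.0]
rng = random.Random(1)
snaps = [[a * s + 0.01 * rng.gauss(0.0, 1.0) for s in shape]
         for a in [rng.uniform(0.5, 1.5) for _ in range(20)]]
mode = dominant_pod_mode(snaps)
print([round(x, 2) for x in mode])
```

Incrementally expanding such a basis, as the paper describes, corresponds to appending further modes extracted from snapshots of the shifted loading.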

Prezekop, Adam

2008-01-01

449

Distributed Russian Tier-2 - RDIG in Simulation and Analysis of Alice Data From LHC

NASA Astrophysics Data System (ADS)

On the threshold of LHC data-taking there was intensive testing and upgrading of GRID application software for all LHC experiments on top of the modern LCG middleware (gLite). The upgrade of such software for the ALICE experiment at the LHC, AliEn [1], provided stable and secure operation of the sites processing LHC data. The activities of the Russian RDIG (Russian Data Intensive GRID) computing federation, which is a distributed Tier-2 centre, are devoted to simulation and analysis of LHC data in accordance with the ALICE computing model [2]. Eight sites of this federation participating in ALICE upgraded their middleware in accordance with the requirements of ALICE computing, which ensured the success of Monte Carlo production and end-user analysis at all eight sites. The occupancy and efficiency of each site during LHC operation are presented in the report, as are the results of CPU and disk space usage at RDIG sites for data simulation and the analysis of the first LHC data from the ALICE detector [3]. Information about the use of a parallel analysis facility based on PROOF [4] is also presented.

Bogdanov, A.; Jancurova, L.; Kiryanov, A.; Kotlyar, V.; Mitsyn, V.; Lyublev, Y.; Ryabinkin, E.; Shabratova, G.; Smirnov, S.; Stepanova, L.; Urazmetov, W.; Zarochentsev, A.

2011-12-01

450

Application-driven merging and analysis of person trajectories for distributed smart camera networks

NASA Astrophysics Data System (ADS)

Tracking of persons and analysis of their trajectories are important tasks of surveillance systems, as they support the monitoring personnel. However, there is an increasing demand for smarter camera networks that carry out surveillance tasks autonomously. This raises system complexity, so the requirements on the video analysis algorithms increase as well. In this paper, we present a system concept and application for the anonymous gathering, processing and analysis of trajectories in distributed smart camera networks. It allows a multitude of analysis techniques, such as inspecting individual properties of the observed movement in real time. Additionally, the anonymous movement data allows long-term storage and big-data analyses for statistical purposes. The system described in this paper has been implemented as a prototype system and deployed for proof of concept under real conditions at the entrance hall of the Leibniz University Hannover. It shows an overall stable performance, particularly with respect to significant illumination changes over hours, as well as regarding the reduction of false positives by post-processing and trajectory merging performed on top of a panorama-based person detection module.

Metzler, Jürgen; Monari, Eduardo; Kuntzsch, Colin

2014-03-01

451

Analysis of large physics data sets is a major computing task at Fermilab. One step in such an analysis involves culling "interesting" events via the use of complex query criteria. What makes this unusual is the scale required: hundreds of gigabytes of event data must be scanned at tens of megabytes per second for the typical queries that are applied, and data must be extracted from tens of terabytes based on the result of the query. The Physics Object Persistency Manager (POPM) system is a solution tailored to this scale of problem. A running POPM environment can support multiple queries in progress, each scanning at rates exceeding 10 megabytes per second, all of which share access to a very large persistent address space distributed across multiple disks on multiple hosts. Specifically, POPM employs the following techniques to permit this scale of performance and access. Persistent objects: experimental data to be scanned are "populated" as a data structure into the persistent address space supported by POPM; C++ classes with a few key overloaded operators provide nearly transparent semantics for access to the persistent storage. Distributed and parallel I/O: the persistent address space is automatically distributed across the disks of multiple "I/O nodes" within the POPM system; a striping-unit concept implemented in POPM permits fast parallel I/O across the storage nodes, even for small single queries. Efficient shared access: POPM implements an efficient mechanism for arbitration and multiplexing of I/O access among multiple queries on the same or separate compute nodes.
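The striping-unit idea can be sketched as a mapping from a global byte offset to an (I/O node, local offset) pair, so a sequential scan naturally fans out across nodes. A hedged Python illustration; the stripe size and node count are invented values, not POPM's actual configuration:

```python
# Hedged sketch of the striping-unit idea: a global byte offset in a large
# persistent address space maps to an (I/O node, local offset) pair so that a
# sequential scan fans out across nodes. Stripe size and node count are
# invented values, not POPM's actual configuration.
STRIPE_UNIT = 64 * 1024      # bytes per stripe unit
N_NODES = 4                  # number of I/O nodes

def locate(offset):
    unit = offset // STRIPE_UNIT
    node = unit % N_NODES                                  # round-robin
    local = (unit // N_NODES) * STRIPE_UNIT + offset % STRIPE_UNIT
    return node, local

print(locate(0), locate(64 * 1024), locate(5 * 64 * 1024 + 10))
```

Consecutive stripe units land on different nodes, which is why even a single small query can drive parallel reads.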

Fischler, M.S.; Isely, M.C.; Nigri, A.M.; Rinaldo, F.J.

1996-01-01

452

NASA Astrophysics Data System (ADS)

The internal flow of a pump evolves with the movement of the impeller. Across operating conditions, the peak load on a centrifugal blade changes with rotational speed and flow rate, which introduces error when the inertia load is applied in structural design with a safety coefficient that is difficult to ascertain. In order to accurately analyze the impeller stress under various conditions and improve the reliability of the pump, the stress distribution characteristics of a mixed-flow pump model were analyzed under different flow rates and rotational speeds. Based on a three-dimensional computational model including the impeller, guide blades, inlet and outlet, the three-dimensional incompressible turbulent flow in the centrifugal pump was simulated using the standard k-epsilon turbulence model. Based on a sequentially coupled simulation approach, a three-dimensional finite element model of the impeller was established, and the fluid-structure interaction method of blade load transfer is discussed. The blade pressure from the flow simulation, together with the inertia force acting on the blade, was used as the loading on the solid surface. The Finite Element Method (FEM) was used to calculate the stress distribution of the blade under inertia load, fluid load, and combined load, respectively. The results showed that the blade stress changes with flow rate and rotational speed. In all cases, the maximum stress on the blade appeared on the pressure side near the hub, and the maximum static stress increased with decreasing flow rate and increasing rotational speed. There was a large difference in the static stress when inertia load, fluid load and combined loads were applied respectively. In order to calculate the stress distribution more accurately, the structural analysis should be conducted with combined loads. The results provide a basis for the stress analysis and structural optimization of pumps.

Hu, F. F.; Chen, T.; Wu, D. Z.; Wang, L. Q.

2013-12-01

453

The composition, occurrence, distribution, and possible toxicity of chemical mixtures in the environment are research concerns of the U.S. Geological Survey and others. The presence of specific chemical mixtures may serve as indicators of natural phenomena or human-caused events. Chemical mixtures may also have ecological, industrial, geochemical, or toxicological effects. Chemical-mixture occurrences vary by analyte composition and concentration. Four related computer programs have been developed by the National Water-Quality Assessment Program of the U.S. Geological Survey for research of chemical-mixture compositions, occurrences, distributions, and possible toxicities. The compositions and occurrences are identified for the user-supplied data, and therefore the resultant counts are constrained by the user’s choices for the selection of chemicals, reporting limits for the analytical methods, spatial coverage, and time span for the data supplied. The distribution of chemical mixtures may be spatial, temporal, and (or) related to some other variable, such as chemical usage. Possible toxicities optionally are estimated from user-supplied benchmark data. The software for the analysis of chemical mixtures described in this report is designed to work with chemical-analysis data files retrieved from the U.S. Geological Survey National Water Information System but can also be used with appropriately formatted data from other sources. Installation and usage of the mixture software are documented. This mixture software was designed to function with minimal changes on a variety of computer-operating systems. To obtain the software described herein and other U.S. Geological Survey software, visit http://water.usgs.gov/software/.
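The core occurrence computation can be sketched as: for each sample, analytes at or above their reporting limits form the detected set, and every combination of detected analytes is tallied across samples. A hedged Python illustration with invented analytes, limits and concentrations; this is not the actual USGS software's logic:

```python
from collections import Counter
from itertools import combinations

# Hedged sketch of mixture-occurrence counting: analytes at or above their
# reporting limits form each sample's detected set, and every two-analyte
# combination is tallied across samples. Analytes, limits, and concentrations
# are invented, and this is not the actual USGS software's logic.
reporting_limit = {"atrazine": 0.05, "nitrate": 1.0, "chloride": 5.0}
samples = [
    {"atrazine": 0.20, "nitrate": 3.0, "chloride": 2.0},
    {"atrazine": 0.01, "nitrate": 4.0, "chloride": 9.0},
    {"atrazine": 0.30, "nitrate": 2.0, "chloride": 8.0},
]

counts = Counter()
for sample in samples:
    detected = sorted(a for a, v in sample.items() if v >= reporting_limit[a])
    counts.update(combinations(detected, 2))

for pair, n in counts.most_common():
    print(pair, n)
```

As the abstract notes, the resulting counts are constrained by the chosen analytes and reporting limits: lowering a limit changes which mixtures "occur".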

Scott, Jonathon C.; Skach, Kenneth A.; Toccalino, Patricia L.

2013-01-01

454

Background In the last decade, the moss Physcomitrella patens has emerged as a powerful plant model system, amenable to genetic manipulations not possible in any other plant. This moss is particularly well suited for plant polarized cell growth studies, as in its protonemal phase, expansion is restricted to the tip of its cells. Based on pollen tube and root hair studies, it is well known that tip growth requires active secretion and high polarization of the cellular components. However, such information is still missing in Physcomitrella patens. To gain insight into the mechanisms underlying the participation of organelle organization in tip growth, it is essential to determine the distribution and the dynamics of the organelles in moss cells. Results We used fluorescent protein fusions to visualize and track Golgi dictyosomes, mitochondria, and peroxisomes in live protonemal cells. We also visualized and tracked chloroplasts based on chlorophyll auto-fluorescence. We showed that in protonemata all four organelles are distributed in a gradient from the tip of the apical cell to the base of the sub-apical cell. For example, the density of Golgi dictyosomes is 4.7 and 3.4 times higher at the tip than at the base in caulonemata and chloronemata, respectively. While Golgi stacks are concentrated at the extreme tip of the caulonemata, chloroplasts and peroxisomes are totally excluded. Interestingly, caulonemata, which grow faster than chloronemata, also contain significantly more Golgi dictyosomes and fewer chloroplasts than chloronemata. Moreover, the motility analysis revealed that organelles in protonemata move with low persistency and average instantaneous speeds ranging from 29 to 75 nm/s, which are at least three orders of magnitude slower than those of pollen tube or root hair organelles.
Conclusions To our knowledge, this study reports the first quantitative analysis of organelles in Physcomitrella patens and will make possible comparisons of the distribution and dynamics of organelles from different tip growing plant cells, thus enhancing our understanding of the mechanisms of plant polarized cell growth. PMID:22594499
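The motility quantities reported above (average instantaneous speed, persistency) can be computed from tracked positions as step displacements over time intervals, with persistency taken here as net displacement over total path length. A hedged Python sketch with an invented toy trajectory:

```python
import math

# Hedged sketch: average instantaneous speed and directional persistency of a
# tracked organelle from (time, x, y) positions. Persistency is taken here as
# net displacement over total path length; the toy track is invented.
def speed_and_persistency(track):
    steps, speeds = [], []
    for (t0, x0, y0), (t1, x1, y1) in zip(track, track[1:]):
        dx, dy, dt = x1 - x0, y1 - y0, t1 - t0
        steps.append((dx, dy))
        speeds.append(math.hypot(dx, dy) / dt)
    net = math.hypot(sum(dx for dx, _ in steps), sum(dy for _, dy in steps))
    path = sum(math.hypot(dx, dy) for dx, dy in steps)
    return sum(speeds) / len(speeds), net / path

# Positions in (seconds, nm, nm): back-and-forth motion gives low persistency.
track = [(0, 0, 0), (1, 50, 0), (2, 10, 0), (3, 60, 0)]
print(speed_and_persistency(track))
```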

2012-01-01

455

Rainfall extremes: Toward reconciliation after the battle of distributions

NASA Astrophysics Data System (ADS)

This study attempts to reconcile the conflicting results reported in the literature concerning the behavior of peak-over-threshold (POT) daily rainfall extremes and their distribution. By using two worldwide data sets, the impact of threshold selection and record length on the upper tail behavior of POT observations is investigated. The rainfall process is studied within the framework of generalized Pareto (GP) exceedances according to the classical extreme value theory (EVT), with particular attention paid to the study of the GP shape parameter, which controls the heaviness of the upper tail of the GP distribution. A twofold effect is recognized. First, as the threshold decreases, and nonextreme values are progressively incorporated in the POT samples, the variance of the GP shape parameter reduces and the mean converges to positive values denoting a tendency to heavy tail behavior. Simultaneously, the EVT asymptotic hypotheses are less and less realistic, and the GP asymptote tends to be replaced by the Weibull penultimate asymptote whose upper tail is exponential but apparently heavy. Second, for a fixed high threshold, the variance of the GP shape parameter reduces as the record length (number of years) increases, and the mean values tend to be positive, thus denoting again the prevalence of heavy tail behavior. In both cases, i.e., threshold selection and record length effect, the heaviness of the tail may be ascribed to mechanisms such as the blend of extreme and nonextreme values, and fluctuations of the parent distributions. It is shown how these results provide a link between previous studies and pave the way for more comprehensive analyses which merge empirical, theoretical, and operational points of view. 
This study also provides several ancillary results, such as a set of formulae to correct the bias of the GP shape parameter estimates due to short record lengths accounting for uncertainty, thus avoiding systematic underestimation of extremes which results from the analysis of short time series.
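The threshold-dependent behavior of the GP shape parameter can be explored with a minimal peak-over-threshold sketch. Here a method-of-moments estimator, xi = (1 - m^2/v)/2 and sigma = m(1 + m^2/v)/2, is applied to exceedances of a synthetic exponential "rainfall" series, for which the true shape is zero; this is an illustration, not the authors' estimation procedure:

```python
import random, statistics

# Hedged sketch of peak-over-threshold (POT) analysis: a method-of-moments
# estimator of the generalized Pareto (GP) shape parameter,
#   xi = (1 - m^2/v) / 2,   sigma = m * (1 + m^2/v) / 2,
# applied to a synthetic exponential "daily rainfall" series (true xi = 0).
# This is an illustration, not the estimation procedure used in the study.
def gp_moments(excesses):
    m = statistics.fmean(excesses)
    v = statistics.variance(excesses)
    ratio = m * m / v
    return 0.5 * (1.0 - ratio), 0.5 * m * (1.0 + ratio)

rng = random.Random(42)
rain = [rng.expovariate(0.1) for _ in range(5000)]
threshold = 30.0
excesses = [x - threshold for x in rain if x > threshold]
xi, sigma = gp_moments(excesses)
print(len(excesses), round(xi, 3), round(sigma, 2))
```

Rerunning with a lower threshold mixes non-extreme values into the sample and shifts the estimated shape, mirroring the threshold effect the study describes.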

Serinaldi, Francesco; Kilsby, Chris G.

2014-01-01

456

Current (re-)Distribution inside an ITER Full-Size Conductor: a Qualitative Analysis

NASA Astrophysics Data System (ADS)

The comprehension of the current re-distribution phenomenon inside multi-filamentary conductors is a crucial point in the design of ITER-relevant coils, as it is now well established that current non-uniformity among cable sub-stages may strongly degrade the performance of Cable-in-Conduit Conductors (CICCs). The only feasible way to obtain information about the current flowing in CICC sub-stages is an indirect evaluation by self-field measurements in regions very close to the conductor surface. A 7 m full-size NbTi conductor (Bus-Bar III) was used as a short circuit during the test of an ITER Toroidal Field Coil HTS current lead at FzK. Its relatively simple shape and the absence of any other magnetic field source (background coils, etc.) made BBIII one of the most desirable candidates for a reliable measurement of the current distribution under controlled conditions. It was therefore instrumented ad hoc with different arrangements of Hall probes (rings and arrays), as well as with transverse and longitudinal voltage taps. This paper gives a qualitative interpretation of the current (re-)distribution events inside the conductor, as derived from the analysis of the Hall sensor and voltage tap signals during Tcs measurements and as a function of different dI/dt. It is shown that Hall probes are a very reliable tool for investigating this issue: re-distribution phenomena have been clearly observed during the transition, and even well before Tcs was reached, when the transverse voltage signals had not yet shown any appreciable onset.
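The self-field inversion behind such Hall-probe measurements can be illustrated in a much-simplified form. The sketch below is not the BBIII instrumentation: the six-petal geometry, the probe ring, and the 2-D infinite-straight-wire field model are all illustrative assumptions. It recovers non-uniform sub-cable currents from tangential field samples by linear least squares.

```python
# Illustrative sketch (assumed geometry, not the BBIII setup): infer individual
# sub-cable ("petal") currents from a ring of Hall probes via least squares.
import numpy as np

MU0 = 4e-7 * np.pi

def tangential_field(probe_xy, wires_xy, currents):
    """Tangential B at each probe from infinite straight wires (2-D model)."""
    phi = np.arctan2(probe_xy[:, 1], probe_xy[:, 0])  # probe angular positions
    b = np.zeros(len(probe_xy))
    for (xw, yw), i in zip(wires_xy, currents):
        dx = probe_xy[:, 0] - xw
        dy = probe_xy[:, 1] - yw
        r2 = dx**2 + dy**2
        bx = -MU0 * i * dy / (2 * np.pi * r2)         # Biot-Savart, wire along z
        by = MU0 * i * dx / (2 * np.pi * r2)
        b += -np.sin(phi) * bx + np.cos(phi) * by     # project onto ring tangent
    return b

# Assumed layout: 6 petals on r = 10 mm, 12 Hall probes on r = 25 mm
petals = np.array([[0.01 * np.cos(a), 0.01 * np.sin(a)]
                   for a in np.linspace(0, 2 * np.pi, 6, endpoint=False)])
probes = np.array([[0.025 * np.cos(a), 0.025 * np.sin(a)]
                   for a in np.linspace(0, 2 * np.pi, 12, endpoint=False)])

true_i = np.array([8e3, 12e3, 10e3, 9e3, 11e3, 10e3])  # non-uniform currents (A)

# Linear model B = G @ I: one column per unit petal current, then invert
G = np.column_stack([tangential_field(probes, petals[[k]], [1.0])
                     for k in range(6)])
b_meas = G @ true_i                                    # noiseless "measurement"
i_est, *_ = np.linalg.lstsq(G, b_meas, rcond=None)
print(np.round(i_est, 1))
```

With noiseless data and an overdetermined, full-rank system (12 probes, 6 unknowns) the petal currents are recovered exactly; in practice probe noise and 3-D cable geometry limit the resolution, which is why the paper's analysis remains qualitative.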

di Zenobio, A.; Muzzi, L.; Turtù, S.; Della Corte, A.; Verdini, L.

2006-06-01

457

NASA Astrophysics Data System (ADS)

Distributed watershed models are now widely used in practice to simulate runoff responses at high spatial and temporal resolutions. Counter to this purpose, diagnostic analyses of distributed models currently aggregate performance measures in space and/or time and are thus disconnected from the models' operational and scientific goals. To address this disconnect, this study contributes a novel approach for computing and visualizing time-varying global sensitivity indices for spatially distributed model parameters. The high-resolution model diagnostics employ the method of Morris to identify evolving patterns in dominant model processes at sub-daily timescales over a six-month period. The method is demonstrated on the United States National Weather Service's Hydrology Laboratory Research Distributed Hydrologic Model (HL-RDHM) in the Blue River watershed, Oklahoma, USA. Three hydrologic events are selected from within the six-month period to investigate the patterns in spatiotemporal sensitivities that emerge as a function of forcing patterns as well as wet-to-dry transitions. Surprisingly, events with similar magnitudes and durations exhibit significantly different performance controls in space and time, indicating that the diagnostic inferences drawn from representative events will be heavily biased by the a priori selection of those events. By contrast, this study demonstrates high-resolution time-varying sensitivity analysis, requiring no assumptions regarding representative events and allowing modelers to identify transitions between modeled hydrologic regimes a posteriori. The proposed approach details the dynamics of parameter sensitivity in nearly continuous time, providing critical diagnostic insights into the underlying model processes driving predictions. Furthermore, the approach offers the potential to identify transition points between hydrologic regimes under nonstationarity.
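A minimal sketch of the method of Morris (elementary effects) that underlies the sensitivity indices described above: build one-at-a-time trajectories in parameter space, difference the model output along each step, and summarize per-parameter effects with mu* and sigma. The toy response function and sampling choices are illustrative assumptions, not HL-RDHM.

```python
# Method of Morris: elementary effects from random one-at-a-time trajectories.
import numpy as np

def morris_sample(k, r, levels=4, rng=None):
    """r random trajectories in the unit k-cube, each with k one-at-a-time steps."""
    if rng is None:
        rng = np.random.default_rng()
    delta = levels / (2 * (levels - 1))          # standard Morris step size
    trajs = []
    for _ in range(r):
        base = rng.integers(0, levels // 2, size=k) / (levels - 1)
        order = rng.permutation(k)               # random order of perturbation
        x = base.copy()
        traj = [x.copy()]
        for j in order:
            x = x.copy()
            x[j] += delta                        # perturb one parameter per step
            traj.append(x)
        trajs.append((np.array(traj), order, delta))
    return trajs

def elementary_effects(f, trajs):
    k = trajs[0][0].shape[1]
    ee = [[] for _ in range(k)]
    for pts, order, delta in trajs:
        y = np.array([f(p) for p in pts])
        for step, j in enumerate(order):         # step `step` perturbed param j
            ee[j].append((y[step + 1] - y[step]) / delta)
    ee = np.array(ee)
    return np.abs(ee).mean(axis=1), ee.std(axis=1)   # mu*, sigma

# Toy "model": linear in x0, nonlinear in x1, x2 inert
f = lambda x: 4 * x[0] + x[1] ** 2
mu_star, sigma = elementary_effects(
    f, morris_sample(3, r=20, rng=np.random.default_rng(1)))
print(np.round(mu_star, 2))   # x0 dominates, x2 has no effect
```

Computing these indices within moving time windows of model output, rather than once over the whole series, is the essence of the time-varying analysis the abstract describes.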

Herman, J. D.; Kollat, J. B.; Reed, P. M.; Wagener, T.

2013-12-01

459

NASA Astrophysics Data System (ADS)

The Peruvian anchovy or anchoveta (Engraulis ringens) supports the highest fishery landings worldwide and varies in space and time over many scales. Here we present the first comprehensive sub-mesoscale study of anchoveta distribution in relation to the environment. During November 2004, we conducted a behavioural ecology survey off central Peru using a series of observational and sampling tools, including SST and CO2 sensors, Niskin bottles, CTD probes, zooplankton sampling, stomach content analysis, an echo-sounder, multibeam sonar, and bird observations. The sub-mesoscale survey areas were chosen from mesoscale acoustic surveys. A routine coast-wide (~2000 km) acoustic survey, performed just after the sub-mesoscale surveys, provided information at an even larger population scale. The availability of nearly concurrent sub-mesoscale, mesoscale, and coast-wide information on anchoveta distribution allowed a unique multi-scale synthesis. At the sub-mesoscale (100s of m to km), physical processes (internal waves and frontogenesis) concentrated plankton into patches and determined anchoveta spatial distribution. At the mesoscale (10s of km), location relative to the zone of active upwelling (and the age of the upwelled water) and the depth of the oxycline had strong impacts on the anchoveta. Finally, over 100s of km, the size of the productive area, as defined by the upwelled cold coastal waters, was the determining factor. We propose a conceptual view of the relative importance of social behaviour and environmental (biotic and abiotic) processes on the spatial distribution of anchoveta. Our ecological space has two axes: one based on self-organization (social behaviour) and the other on environmental processes. At scales from the individual (10s of cm) to the nucleus (m), social behaviour (e.g. the need to school) drives spatial organization. At scales larger than the school, environmental forces are the main driver of fish distribution.
The conceptual ecosystem models presented in this paper may provide the final links needed to develop accurate forecasts of the spatial distribution of anchoveta over multiple scales.

Bertrand, Arnaud; Gerlotto, François; Bertrand, Sophie; Gutiérrez, Mariano; Alza, Luis; Chipollini, Andres; Díaz, Erich; Espinoza, Pepe; Ledesma, Jesús; Quesquén, Roberto; Peraltilla, Salvador; Chavez, Francisco

2008-10-01

460

NASA Astrophysics Data System (ADS)

Understanding the interplay of structural disorder and strength properties at various length scales can lead to improvements in the strength reliability of heterogeneous brittle materials. Various studies in ordered fiber-matrix composites have shown the existence of critical clusters of breaks and macroscopic weak-link scaling behavior. The fiber network in paper is structurally disordered. We verify experimentally that the tensile strength of newsprint samples follows weak-link scaling and obtain an estimate for the link and critical-cluster sizes. However, a slight nonlinear behavior is observed in the Weibull plots of the experimental strength distributions. We propose that this is due to mesoscopic structural disorder (e.g., at length scales between millimeters and centimeters),
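The Weibull-plot and weak-link analysis the abstract refers to can be sketched with synthetic strength data; the modulus, scale, and sample sizes below are assumptions, not the newsprint measurements.

```python
# Weibull plot and weak-link scaling with synthetic strengths (illustrative only).
import numpy as np

rng = np.random.default_rng(42)
m_true, s0 = 8.0, 30.0                    # assumed Weibull modulus and scale (MPa)
strength = s0 * rng.weibull(m_true, 200)

x = np.sort(strength)
F = (np.arange(1, x.size + 1) - 0.5) / x.size     # plotting positions
# Weibull plot: ln(-ln(1-F)) vs ln(x) is linear with slope = modulus m;
# curvature in this plot is the nonlinearity the abstract discusses.
m_est, intercept = np.polyfit(np.log(x), np.log(-np.log(1 - F)), 1)
print(f"estimated Weibull modulus: {m_est:.2f}")

# Weak-link scaling: a chain of n links fails at the weakest link, so its
# strength is Weibull with the same modulus and scale s0 * n**(-1/m).
n = 10
chain = (s0 * rng.weibull(m_true, (5000, n))).min(axis=1)
print(f"median strength  n=1: {np.median(strength):.1f}   n={n}: {np.median(chain):.1f}")
```

The systematic drop in median strength with specimen size is the weak-link signature the study tests for in newsprint.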