Y. Wang; Y. C. Chan; Z. L. Gui; D. P. Webb; L. T. Li
1997-01-01
The Weibull distribution has been previously applied to the mechanical and dielectric failures of ceramics. In this paper, it is confirmed by experiment that this data treatment method is also valid for application in the dielectric failure of multilayer ceramic capacitors (MLCs) which have undergone screening. The Weibull modulus is found to be a useful parameter indicating the sharpness of
Binary data regression: Weibull distribution
NASA Astrophysics Data System (ADS)
Caron, Renault; Polpo, Adriano
2009-12-01
The problem of estimation in binary response data has received a great number of alternative statistical solutions. Generalized linear models allow for a wide range of statistical models for regression data. The most used model is logistic regression; see Hosmer et al. [6]. However, as Chen et al. [5] mention, when the probability of a given binary response approaches 0 at a different rate than it approaches 1, symmetric links are inappropriate. A class of models based on the Weibull distribution indexed by three parameters is introduced here. Maximum likelihood methods are employed to estimate the parameters. The objective of the present paper is to show a solution for the estimation problem under the Weibull model. An example illustrating the quality of the model is given, comparing it with the alternative probit and logit models.
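As a hedged sketch of the asymmetric-link idea (not the authors' three-parameter model), a two-parameter complementary log-log link, which corresponds to a Weibull-type response curve, can be fitted by maximum likelihood; the data and coefficients below are invented for illustration:

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# Simulate binary responses with an asymmetric (cloglog / Weibull-type) link:
# P(y=1 | x) = 1 - exp(-exp(b0 + b1*x)); coefficients are invented.
x = rng.normal(size=500)
b0_true, b1_true = -0.5, 1.2
p_true = 1.0 - np.exp(-np.exp(b0_true + b1_true * x))
y = rng.random(500) < p_true

def neg_log_lik(beta):
    """Negative Bernoulli log-likelihood under the cloglog link."""
    eta = beta[0] + beta[1] * x
    p = 1.0 - np.exp(-np.exp(eta))
    p = np.clip(p, 1e-12, 1 - 1e-12)   # guard against log(0)
    return -np.sum(np.where(y, np.log(p), np.log1p(-p)))

fit = minimize(neg_log_lik, x0=[0.0, 0.0], method="Nelder-Mead")
b0_hat, b1_hat = fit.x
print(b0_hat, b1_hat)   # close to the true (-0.5, 1.2)
```

Unlike the logit and probit links, the cloglog response approaches 0 and 1 at different rates, which is exactly the asymmetry motivating a Weibull-based model.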
Bayes Estimation for the Marshall-Olkin Bivariate Weibull Distribution
Kundu, Debasis
Bayes Estimation for the Marshall-Olkin Bivariate Weibull Distribution. Debasis Kundu & Arjun K. Gupta. Abstract: In this paper, we consider the Bayesian analysis of the Marshall-Olkin bivariate Weibull distribution. This is a generalization of the Marshall-Olkin bivariate exponential distribution. It is well known that the maximum
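The Marshall-Olkin construction can be illustrated by simulation. The sketch below (parameter values invented) uses the standard latent-shock representation, in which minima of exponential shocks raised to a power yield Weibull marginals and a positive probability that both components fail simultaneously:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
alpha, lam1, lam2, lam3 = 2.0, 1.0, 1.5, 0.5   # invented parameters
n = 20000

# Latent shocks: E_i ~ Exponential(rate lam_i); E_i**(1/alpha) is Weibull.
e1 = rng.exponential(1 / lam1, n)
e2 = rng.exponential(1 / lam2, n)
e3 = rng.exponential(1 / lam3, n)
x = np.minimum(e1, e3) ** (1 / alpha)   # X = min(U1, U3)
y = np.minimum(e2, e3) ** (1 / alpha)   # Y = min(U2, U3)

# Marginal check: X is Weibull with shape alpha, scale (lam1+lam3)**(-1/alpha)
ks = stats.kstest(x, "weibull_min", args=(alpha, 0, (lam1 + lam3) ** (-1 / alpha)))
print(ks.pvalue)          # large p-value: marginal matches
print(np.mean(x == y))    # positive mass on the singular part {X = Y}
```

The shared shock `e3` is what makes the distribution singular: the event {X = Y} has probability lam3/(lam1+lam2+lam3), here about 0.17.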
Estimation problems associated with the Weibull distribution
Bowman, K O; Shenton, L R
1981-09-01
Series in descending powers of the sample size are developed for the moments of the coefficient of variation v* for the Weibull distribution F(t) = 1 − exp(−(t/b)^c). A similar series for the moments of the estimator c* of the shape parameter c are derived from these. Comparisons are made with basic asymptotic assessments for the means and variances. From the first four moments, approximations are given to the distribution of v* and c*. In addition, an almost unbiased estimator of c is given when a sample is provided with the value of v*. Comments are given on the validity of the asymptotically normal assessments of the distributions.
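Since the coefficient of variation of a Weibull distribution depends only on the shape parameter c, an estimator of c given a sample value v* can be sketched by numerical inversion (all values below are illustrative, not the series-based estimator of the paper):

```python
import numpy as np
from scipy.special import gamma
from scipy.optimize import brentq

def weibull_cv(c):
    """Coefficient of variation of Weibull(c, b); depends on the shape c only."""
    g1, g2 = gamma(1 + 1 / c), gamma(1 + 2 / c)
    return np.sqrt(g2 - g1 ** 2) / g1

def shape_from_cv(v):
    """Moment-type estimator: solve weibull_cv(c) = v for the shape c."""
    return brentq(lambda c: weibull_cv(c) - v, 0.05, 50.0)

# Sanity check on simulated data with known shape c = 2 (CV ~ 0.52)
rng = np.random.default_rng(0)
t = rng.weibull(2.0, 100000) * 3.0          # scale b = 3
v_star = t.std(ddof=1) / t.mean()
c_star = shape_from_cv(v_star)
print(round(c_star, 2))   # close to the true shape 2.0
```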
Independent Orbiter Assessment (IOA): Weibull analysis report
NASA Technical Reports Server (NTRS)
Raffaelli, Gary G.
1987-01-01
The Auxiliary Power Unit (APU) and Hydraulic Power Unit (HPU) Space Shuttle Subsystems were reviewed as candidates for demonstrating the Weibull analysis methodology. Three hardware components were identified as analysis candidates: the turbine wheel, the gearbox, and the gas generator. Detailed review of subsystem level wearout and failure history revealed the lack of actual component failure data. In addition, component wearout data were not readily available or would require a separate data accumulation effort by the vendor. Without adequate component history data being available, the Weibull analysis methodology application to the APU and HPU subsystem group was terminated.
Estimation of Wind Power Potential Using Weibull Distribution
Asir Genc; Murat Erisoglu; Ahmet Pekgor; Galip Oturanc; Arif Hepbasli; Koray Ulgen
2005-01-01
The main objective of the present study is to estimate wind power potential using the two Weibull parameters of the wind speed distribution function: the shape parameter k (dimensionless) and the scale parameter c (m/s). In this regard, a methodology that uses three different techniques (maximum likelihood, least squares, and the method of moments) for estimating the Weibull parameters is presented
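A minimal sketch of the three estimation techniques on a synthetic wind-speed sample (parameter values invented; `scipy` fitting conventions assumed):

```python
import numpy as np
from scipy import stats
from scipy.special import gamma
from scipy.optimize import brentq

rng = np.random.default_rng(0)
k_true, c_true = 2.1, 6.0                 # shape (dimensionless), scale (m/s)
v = c_true * rng.weibull(k_true, 5000)    # synthetic wind-speed sample

# 1. Maximum likelihood (location fixed at zero)
k_ml, _, c_ml = stats.weibull_min.fit(v, floc=0)

# 2. Method of moments: match the sample coefficient of variation
cv = v.std(ddof=1) / v.mean()
f = lambda k: np.sqrt(gamma(1 + 2 / k) - gamma(1 + 1 / k) ** 2) / gamma(1 + 1 / k) - cv
k_mom = brentq(f, 0.1, 50)
c_mom = v.mean() / gamma(1 + 1 / k_mom)

# 3. Least squares on the linearized CDF: ln(-ln(1-F)) = k ln v - k ln c
vs = np.sort(v)
F = (np.arange(1, len(vs) + 1) - 0.5) / len(vs)   # common plotting positions
slope, intercept = np.polyfit(np.log(vs), np.log(-np.log(1 - F)), 1)
k_ls, c_ls = slope, np.exp(-intercept / slope)

print(k_ml, k_mom, k_ls)   # all near 2.1
print(c_ml, c_mom, c_ls)   # all near 6.0
```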
Discriminating between the Weibull and log-normal distributions
Debasis Kundu; Anubhav Manglick
2004-01-01
Log-Normal and Weibull distributions are the most popular distributions for modeling skewed data. In this paper, we consider the ratio of the maximized likelihoods in choosing between the two distributions. The asymptotic distribution of the logarithm of the maximized likelihood ratio has been obtained. It is observed that the asymptotic distribution is independent of the unknown parameters. The asymptotic
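The ratio-of-maximized-likelihoods statistic can be sketched as follows; positive values favour the Weibull model and negative values the log-normal (data simulated for illustration):

```python
import numpy as np
from scipy import stats

def rml_statistic(t):
    """Log of the ratio of maximized likelihoods: Weibull vs log-normal.
    Positive values favour the Weibull model, negative the log-normal."""
    c, _, b = stats.weibull_min.fit(t, floc=0)
    ll_weib = stats.weibull_min.logpdf(t, c, 0, b).sum()
    s, _, scale = stats.lognorm.fit(t, floc=0)
    ll_logn = stats.lognorm.logpdf(t, s, 0, scale).sum()
    return ll_weib - ll_logn

rng = np.random.default_rng(0)
t_weib = rng.weibull(2.0, 1000) * 5.0
t_logn = rng.lognormal(1.0, 0.5, 1000)
print(rml_statistic(t_weib) > 0)   # True: Weibull data correctly selected
print(rml_statistic(t_logn) < 0)   # True: log-normal data correctly selected
```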
FITTING WEIBULL AND LOGNORMAL DISTRIBUTIONS TO MEDIUM-DENSITY FIBERBOARD FIBER AND WOOD PARTICLE
For some of the samples, the lognormal distribution fit the data while the Weibull distribution did not. For three of the samples, the Weibull fit the data while the lognormal did not. For two of the samples, both the lognormal and Weibull
Bayesian estimation of life parameters in the Weibull distribution.
NASA Technical Reports Server (NTRS)
Canavos, G. C.; Tsokos, C. P.
1973-01-01
Development of a Bayesian analysis of the scale and shape parameters in the Weibull distribution and the corresponding reliability function with respect to the usual life-testing procedures. For the scale parameter theta, Bayesian estimates of theta and reliability are obtained for the uniform, exponential, and inverted gamma prior probability densities. Bhattacharya's results (1967) for the one-parameter exponential life-testing distribution are reduced to a special case of these results. A fully Bayesian analysis of both the scale and shape parameters is developed by assuming independent prior distributions; since analytical tractability is not possible in the latter case, Bayesian estimates are obtained through a combination of Monte Carlo simulation and numerical-integration techniques. In both cases, a computer simulation is carried out, and a comparison is made between the Bayesian and the corresponding minimum-variance unbiased, or maximum likelihood, estimates. As expected, the Bayesian estimates are superior.
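For the inverted gamma prior with known shape, the analysis is conjugate and can be sketched in a few lines (prior hyperparameters and data are invented; this illustrates only the scale-parameter case, not the full two-parameter Monte Carlo analysis):

```python
import numpy as np

# Hedged sketch: shape c known; theta = b**c is the scale parameter in the
# parameterization f(t) = (c/theta) t**(c-1) exp(-t**c/theta).
# With an inverted gamma prior, theta is conjugate:
#   theta ~ InvGamma(a0, beta0)  =>  theta | data ~ InvGamma(a0 + n, beta0 + sum t_i**c)
c = 2.0
a0, beta0 = 2.0, 3.0                      # assumed prior hyperparameters

rng = np.random.default_rng(0)
t = rng.weibull(c, 50) * 4.0              # lifetimes; true theta = 4**2 = 16

a_n = a0 + len(t)
beta_n = beta0 + np.sum(t ** c)
theta_mean = beta_n / (a_n - 1)           # posterior mean of theta
print(theta_mean)                          # near the true value 16

# Posterior mean reliability at t0: since 1/theta is Gamma(a_n, rate beta_n),
# E[exp(-t0**c / theta)] = (beta_n / (beta_n + t0**c))**a_n (Gamma MGF).
t0 = 2.0
rel_mean = (beta_n / (beta_n + t0 ** c)) ** a_n
print(rel_mean)
```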
Weibull parameters for wind speed distribution in Saudi Arabia
Rehman, S.; Halawani, T.O.; Husain, T. (King Fahd Univ. of Petroleum & Minerals, Dhahran, Saudi Arabia)
1994-12-01
The shape and scale parameters of a Weibull density distribution function are calculated for 10 locations in Saudi Arabia. The daily mean wind speed data from 1970 to mid-1990 are used for this purpose. It is found that the numerical values of the shape parameter vary between 1.7 and 2.7, whereas the value of the scale parameter is found to vary between 3 and 6. It is also concluded from this study that wind data are very well represented by the Weibull distribution function.
The Subexponential Product Convolution of Two Weibull-type Distributions
Liu, Yan; Tang, Qihe
Experience tells us that the product convolution is usually much more intractable than the sum convolution. It is shown that the distribution of the product X1X2, called the product convolution of F1 and F2, belongs to the class S and, hence
Fit of first order thermoluminescence glow peaks using the Weibull distribution function.
Pagonis, V; Mian, S M; Kitis, G
2001-01-01
A new thermoluminescence glow curve deconvolution (GCD) function is introduced which accurately describes first order thermoluminescence (TL) curves. The new GCD function is found to be accurate for first order TL peaks with a wide variety of the values of the TL kinetic parameters E and s. The 3-parameter Weibull probability function is used with the function variables being the maximum peak intensity (Im), the temperature of the maximum peak intensity (Tm) and the Weibull width parameter b. An analytical expression is derived from which the activation energy E can be calculated as a function of Tm and the Weibull width parameter b. The accuracy of the Weibull fit was tested using the ten reference glow curves of the GLOCANIN intercomparison program and the Weibull distribution was found to be highly effective in describing both single and complex TL glow curves. The goodness of fit of the Weibull function is described by the Figure of Merit (FOM) which is found to be of comparable accuracy to the best FOM values of the GLOCANIN program. The FOM values are also comparable to the FOM values obtained using the recently published GCD functions of Kitis et al. It is found that the TL kinetic analysis of complex first-order TL glow curves can be performed with high accuracy and speed by using commercially available software packages. PMID:11548321
NASA Technical Reports Server (NTRS)
Giuntini, Michael E.; Giuntini, Ronald E.
1991-01-01
A Bayesian inference process for system logistical planning is presented which provides a method for incorporating actual failures with prediction data for ongoing, improving reliability estimates. The process uses the Weibull distribution and provides a means for examining and updating logistical and maintenance support needs.
Weibull parameters for wind speed distribution in Saudi Arabia
S. Rehman; T. O. Halawani; T. Husain
1994-01-01
The shape and scale parameters of a Weibull density distribution function are calculated for 10 locations in Saudi Arabia. The daily mean wind speed data from 1970 to mid-1990 are used for this purpose. It is found that the numerical values of the shape parameter vary between 1.7 and 2.7, whereas the value of the scale parameter is found to
Table for estimating parameters of Weibull distribution
NASA Technical Reports Server (NTRS)
Mann, N. R.
1971-01-01
Table yields best linear invariant (BLI) estimates for the log of reliable life under censored life tests, permitting reliability estimation in failure analysis of items with multiple flaws. These BLI estimates have uniformly smaller expected loss than Gauss-Markov best linear unbiased estimates.
QMPE: Estimating Lognormal, Wald and Weibull RT distributions with a parameter-dependent lower bound
Cousineau, Denis
Estimation is considered for RT distributions with a parameter-dependent lower bound: the Lognormal, Wald and Weibull distributions. Estimation can be performed using either the standard maximum likelihood method or the quantile maximum probability estimator (QMPE), for the two-parameter Gumbel distribution and the three-parameter shifted Lognormal, shifted Wald
Understanding Web Browsing Behaviors through Weibull Analysis of Dwell Time
Liu, Chao (Microsoft); Dumais, Susan
This work studies how long users dwell on a Web page and, furthermore, what the distribution of dwell time tells us about the underlying browsing behaviors. In this paper, we draw an analogy between abandoning a page during Web browsing and a system failure in reliability analysis.
NASA Astrophysics Data System (ADS)
Goh, Segun; Kwon, H. W.; Choi, M. Y.
2014-06-01
We consider the Yule-type multiplicative growth and division process, and describe the ubiquitous emergence of Weibull and log-normal distributions in a single framework. With the help of the integral transform and series expansion, we show that both distributions serve as asymptotic solutions of the time evolution equation for the branching process. In particular, the maximum likelihood method is employed to discriminate between the emergence of the Weibull distribution and that of the log-normal distribution. Further, the detailed conditions for the distinguished emergence of the Weibull distribution are probed. It is observed that the emergence depends on the manner of the division process for the two different types of distribution. Numerical simulations are also carried out, confirming the results obtained analytically.
Large-Scale Weibull Analysis of H-451 Nuclear- Grade Graphite Specimen Rupture Data
NASA Technical Reports Server (NTRS)
Nemeth, Noel N.; Walker, Andrew; Baker, Eric H.; Murthy, Pappu L.; Bratton, Robert L.
2012-01-01
A Weibull analysis was performed of the strength distribution and size effects for 2000 specimens of H-451 nuclear-grade graphite. The data, generated elsewhere, measured the tensile and four-point-flexure room-temperature rupture strength of specimens excised from a single extruded graphite log. Strength variation was compared with specimen location, size, and orientation relative to the parent body. In our study, data were progressively and extensively pooled into larger data sets to discriminate overall trends from local variations and to investigate the strength distribution. The CARES/Life and WeibPar codes were used to investigate issues regarding the size effect, Weibull parameter consistency, and nonlinear stress-strain response. Overall, the Weibull distribution described the behavior of the pooled data very well. However, the issue regarding the smaller-than-expected size effect remained. This exercise illustrated that a conservative approach using a two-parameter Weibull distribution is best for designing graphite components with low probability of failure for the in-core structures in the proposed Generation IV (Gen IV) high-temperature gas-cooled nuclear reactors. This exercise also demonstrated the continuing need to better understand the mechanisms driving stochastic strength response. Extensive appendixes are provided with this report to show all aspects of the rupture data and analytical results.
NASA Astrophysics Data System (ADS)
Sazuka, Naoya; Inoue, Jun-Ichi
2007-03-01
A Weibull distribution with power-law tails is confirmed as a good candidate to describe the first passage time process of foreign currency exchange rates. The Lorenz curve and the corresponding Gini coefficient for a Weibull distribution are derived analytically. We show that the coefficient is in good agreement with the same quantity calculated from the empirical data. We also calculate the average waiting time, which is an important measure to estimate the time for customers to wait until the next price change after they log in to their computer systems. By assuming that the first passage time distribution might change its shape from the Weibull to the power-law at some critical time, we evaluate the averaged waiting time by means of the renewal-reward theorem. We find that our correction of the tails of the distribution makes the averaged waiting time much closer to the value obtained from empirical data analysis. We also discuss the deviation from the estimated average waiting time by deriving the waiting time distribution directly. These results lead us to conclude that the first passage process of the foreign currency exchange rates is well described by a Weibull distribution with power-law tails.
Predictive Failure of Cylindrical Coatings Using Weibull Analysis
NASA Technical Reports Server (NTRS)
Vlcek, Brian L.; Hendricks, Robert C.; Zaretsky, Erwin V.
2002-01-01
Rotating, coated wiping rollers used in a high-speed printing application failed primarily from fatigue. Two coating materials were evaluated: a hard, cross-linked, plasticized polyvinyl chloride (PVC) and a softer, plasticized PVC. A total of 447 tests was conducted with these coatings in a production facility. The data were evaluated using Weibull analysis. The softer coating produced more than twice the life of the harder cross-linked coating and reduced the wiper replacement rate by two-thirds, resulting in minimum production interruption.
Is the Weibull distribution really suited for wind statistics modeling and wind power evaluation?
Drobinski, Philippe
2012-01-01
Wind speed statistics are generally modeled using the Weibull distribution. This distribution is convenient since, with only two parameters (the shape and scale parameters), it fully characterizes analytically the shape of the distribution and the different moments of the wind speed (mean, standard deviation, skewness and kurtosis). This distribution is broadly used in the wind energy sector to produce maps of wind energy potential. However, the Weibull distribution is based on empirical rather than physical justification and might display strong limitations for its applications. The philosophy of this article is based on modeling the wind components instead of the wind speed itself. This provides more physical insight into the validity domain of the Weibull distribution as a possible relevant model for wind statistics and allows quantification of the error made by using such a distribution. We thereby propose alternative expressions for better-suited wind speed distributions.
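The component-based view can be illustrated with the textbook special case: independent, zero-mean, equal-variance Gaussian wind components give a Rayleigh speed distribution, i.e. a Weibull distribution with shape exactly 2 (simulation sketch, values invented):

```python
import numpy as np
from scipy import stats

# If the zonal and meridional wind components are independent, zero-mean
# Gaussians with equal variance sigma**2, the speed is Rayleigh distributed,
# i.e. Weibull with shape k = 2 and scale c = sigma * sqrt(2).
rng = np.random.default_rng(0)
sigma = 4.0
u = rng.normal(0, sigma, 100000)
w = rng.normal(0, sigma, 100000)
speed = np.hypot(u, w)

k_hat, _, c_hat = stats.weibull_min.fit(speed, floc=0)
print(k_hat)   # close to 2.0
print(c_hat)   # close to sigma * sqrt(2), about 5.66
```

Departures of the fitted k from 2 (anisotropy, non-zero mean wind, non-Gaussian fluctuations) are exactly what motivates modeling the components rather than the speed.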
Bayesian sequential reliability for Weibull and related distributions
Dongchu Sun; James O. Berger
1994-01-01
Assume that the probability density function for the lifetime of a newly designed product has the form [H′(t)/Q(θ)] exp{−H(t)/Q(θ)}. The Exponential E(θ), Rayleigh, Weibull W(θ, β), and Pareto pdfs are special cases. Q(θ) will be assumed to have an inverse Gamma prior. Assume that m independent products are to be tested with replacement. A Bayesian Sequential Reliability Demonstration Testing plan is used to either
Bayesian Weibull reliability estimation
DANIEL I. DE SOUZA JR; LEONARD R. LAMBERSON
1995-01-01
The Weibull distribution is widely used as a failure model, particularly for mechanical components. This distribution is rich in shape and requires a fairly large sample size to produce accurate statistical estimators, particularly for the lower percentiles, as is usually required for a reliability analysis. In practice, sample sizes are almost always small and subjective judgement is applied, aided by
NASA Astrophysics Data System (ADS)
Bertalan, Zsolt; Shekhawat, Ashivni; Sethna, James P.; Zapperi, Stefano
2014-09-01
The statistical properties of fracture strength of brittle and quasibrittle materials are often described in terms of the Weibull distribution. However, the weakest-link hypothesis, commonly used to justify it, is expected to fail when fracture occurs after significant damage accumulation. Here we show that this implies that the Weibull distribution is unstable in a renormalization-group sense for a large class of quasibrittle materials. Our theoretical arguments are supported by numerical simulations of disordered fuse networks. We also find that for brittle materials such as ceramics, the common assumption that the strength distribution can be derived from the distribution of preexisting microcracks by using Griffith's criteria is invalid. We attribute this discrepancy to crack bridging. Our findings raise questions about the applicability of Weibull statistics to most practical cases.
An EOQ Model with Two-Parameter Weibull Distribution Deterioration and Price-Dependent Demand
ERIC Educational Resources Information Center
Mukhopadhyay, Sushanta; Mukherjee, R. N.; Chaudhuri, K. S.
2005-01-01
An inventory replenishment policy is developed for a deteriorating item and price-dependent demand. The rate of deterioration is taken to be time-proportional and the time to deterioration is assumed to follow a two-parameter Weibull distribution. A power law form of the price dependence of demand is considered. The model is solved analytically…
Weibull-distributed dyke thickness reflects probabilistic character of host-rock strength
Krumbholz, Michael; Hieronymus, Christoph F.; Burchardt, Steffi; Troll, Valentin R.; Tanner, David C.; Friese, Nadine
2014-01-01
Magmatic sheet intrusions (dykes) constitute the main form of magma transport in the Earth’s crust. The size distribution of dykes is a crucial parameter that controls volcanic surface deformation and eruption rates and is required to realistically model volcano deformation for eruption forecasting. Here we present statistical analyses of 3,676 dyke thickness measurements from different tectonic settings and show that dyke thickness consistently follows the Weibull distribution. Known from materials science, power law-distributed flaws in brittle materials lead to Weibull-distributed failure stress. We therefore propose a dynamic model in which dyke thickness is determined by variable magma pressure that exploits differently sized host-rock weaknesses. The observed dyke thickness distributions are thus site-specific because rock strength, rather than magma viscosity and composition, exerts the dominant control on dyke emplacement. Fundamentally, the strength of geomaterials is scale-dependent and should be approximated by a probability distribution. PMID:24513695
NASA Astrophysics Data System (ADS)
Sun, Gengzhi; Pang, John H. L.; Zhou, Jinyuan; Zhang, Yani; Zhan, Zhaoyao; Zheng, Lianxi
2012-09-01
Fundamental studies on the effects of strain rate and size on the distribution of tensile strength of carbon nanotube (CNT) fibers are reported in this paper. Experimental data show that the mechanical strength of CNT fibers increases from 0.2 to 0.8 GPa as the strain rate increases from 0.00001 to 0.1 (1/s). In addition, the influence of fiber diameter at low and high strain rate conditions was investigated further with statistical analysis. A modified Weibull distribution model for characterizing the tensile strength distribution of CNT fibers taking into account the effect of strain rate and fiber diameter is proposed.
NASA Astrophysics Data System (ADS)
Drobinski, Philippe; Coulais, Corentin; Jourdier, Bénédicte
2015-10-01
Wind-speed statistics are generally modelled using the Weibull distribution. However, the Weibull distribution is based on empirical rather than physical justification and might display strong limitations for its applications. Here, we derive wind-speed distributions analytically with different assumptions on the wind components to model wind anisotropy, wind extremes and multiple wind regimes. We quantitatively confront these distributions with an extensive set of meteorological data (89 stations covering various sub-climatic regions in France) to identify distributions that perform best and the reasons for this, and we analyze the sensitivity of the proposed distributions to diurnal and seasonal variability. We find that local topography, unsteady wind fluctuations, as well as persistent wind regimes, are determinant for the performance of these distributions, as they induce anisotropy or non-Gaussian fluctuations of the wind components. A Rayleigh-Rice distribution is proposed to model the combination of weak isotropic wind and persistent wind regimes. It outperforms all other tested distributions (Weibull, elliptical and non-Gaussian) and is the only proposed distribution able to capture accurately the diurnal and seasonal variability.
Cheol Nam; Woan Hwang; Dong-Seong Sohn
1998-01-01
A statistical failure analysis is performed to obtain some insight into the rupture behavior of the metallic U-10Zr/HT-9 fast reactor fuel pins which were irradiated in the X447 subassembly of the EBR-II reactor. Generally, the primary factor that contributes to metallic fuel pin failure is believed to be plenum pressure buildup due to fission gas release. However, the calculated cumulative damage fraction
Exponentiated Weibull distribution family under aperture averaging Gaussian beam waves: comment.
Yura, H T; Rose, T S
2012-08-27
Recently, an exponentiated Weibull distribution model was presented for describing the effects of aperture averaging on scintillation of Gaussian beams propagating through atmospheric turbulence. The model uses three parameters that are derived from physical quantities so that in principle the model could be used to predict optical link performance. After reviewing this model, however, we find several inconsistencies that render it unusable for this purpose under any scintillation conditions. PMID:23037115
We compared two regression models, which are based on the Weibull and probit functions, for the analysis of pesticide toxicity data from laboratory studies on Illinois crop and native plant species. Both mathematical models are continuous, differentiable, strictly positive, and...
Statistical analysis of bivariate failure time data with Marshall–Olkin Weibull models
Li, Yang; Sun, Jianguo; Song, Shuguang
2013-01-01
This paper discusses parametric analysis of bivariate failure time data, which often occur in medical studies among other fields. For this, as in the case of univariate failure time data, exponential and Weibull models are probably the most commonly used ones. However, it is surprising that there seem to be no general estimation procedures available for fitting the bivariate Weibull model to bivariate right-censored failure time data, except some methods for special situations. We present and investigate two general but simple estimation procedures, one being a graphical approach and the other a marginal approach, for the problem. An extensive simulation study is conducted to assess the performance of the proposed approaches and shows that they work well in practical situations. An illustrative example is provided.
Probabilistic Analysis for Comparing Fatigue Data Based on Johnson-Weibull Parameters
NASA Technical Reports Server (NTRS)
Vlcek, Brian L.; Hendricks, Robert C.; Zaretsky, Erwin V.
2013-01-01
Leonard Johnson published a methodology for establishing the confidence that two populations of data are different. Johnson's methodology is dependent on limited combinations of test parameters (Weibull slope, mean life ratio, and degrees of freedom) and a set of complex mathematical equations. In this report, a simplified algebraic equation for confidence numbers is derived based on the original work of Johnson. The confidence numbers calculated with this equation are compared to those obtained graphically by Johnson. Using the ratios of mean life, the resultant values of confidence numbers at the 99 percent level deviate less than 1 percent from those of Johnson. At a 90 percent confidence level, the calculated values differ between +2 and 4 percent. The simplified equation is used to rank the experimental lives of three aluminum alloys (AL 2024, AL 6061, and AL 7075), each tested at three stress levels in rotating beam fatigue, analyzed using the Johnson-Weibull method, and compared to the ASTM Standard (E739-91) method of comparison. The ASTM Standard did not statistically distinguish between AL 6061 and AL 7075. However, it is possible to rank the fatigue lives of different materials with a reasonable degree of statistical certainty based on combined confidence numbers using the Johnson-Weibull analysis. AL 2024 was found to have the longest fatigue life, followed by AL 7075, and then AL 6061. The ASTM Standard and the Johnson-Weibull analysis result in the same stress-life exponent p for each of the three aluminum alloys at the median, or L(sub 50), lives.
Cheng, Mingjian; Zhang, Yixin; Gao, Jie; Wang, Fei; Zhao, Fengsheng
2014-06-20
We model the average channel capacity of optical wireless communication systems for cases of weak to strong turbulence channels, using the exponentiated Weibull distribution model. The joint effects of beam wander and spread, pointing errors, atmospheric attenuation, and the spectral index of non-Kolmogorov turbulence on system performance are included. Our results show that the average capacity decreases steeply as the propagation length L increases from 0 to 200 m, and decreases slowly or tends to a stable value once the propagation length L exceeds 200 m. In the weak turbulence region, increasing the detection aperture improves the average channel capacity, and atmospheric visibility is an important issue affecting the average channel capacity. In the strong turbulence region, increasing the radius of the detection aperture cannot reduce the effects of atmospheric turbulence on the average channel capacity, and the effect of atmospheric visibility on the channel information capacity can be ignored. The effect of the spectral power exponent on the average channel capacity is higher in the strong turbulence region than in the weak turbulence region. Irrespective of the details determining the turbulent channel, we can say that pointing errors have a significant effect on the average channel capacity of optical wireless communication systems in turbulence channels. PMID:24979434
Effects of thermal cycling and surface roughness on the Weibull distribution of porcelain strength.
Nakamura, Yoshiharu; Hojo, Satoru; Sato, Hideaki
2009-07-01
The objective of this study was to test the hypothesis that thermal cycling weakens the flexural strength of porcelain. Specimens of Deguceram Gold and Vita Omega 900 were tested in four groups of 30 specimens each: in the original glazed condition versus being ground with 1000-grit, 600-grit, and 100-grit silicon carbide abrasives. Corresponding to these four types of surface treatment, four further groups of 30 specimens per group underwent 5,000 thermal cycles. Flexural strength was measured using a four-point flexural test, and the Weibull modulus was calculated. Within each type of surface treatment, the thermal cycling treatment did not result in any decrease in flexural strength, although it caused the Weibull modulus to become smaller, except for the control and thermal-cycled groups of the 600-grit surface treatment. PMID:19721280
Discriminating Among the Log-Normal, Weibull, and Generalized Exponential Distributions
Arabin Kumar Dey; Debasis Kundu
2009-01-01
In this paper we consider the model selection/discrimination among three important lifetime distributions. All three distributions have been used quite effectively to analyze lifetime data in reliability analysis. We study the probability of correct selection using the maximized likelihood method, as it has been used in the literature. We further compute the asymptotic probability of correct
A practical and systematic review of Weibull statistics for reporting strengths of dental materials
Quinn, George D.; Quinn, Janet B.
2011-01-01
Objectives: To review the history, theory and current applications of Weibull analyses sufficient to make informed decisions regarding practical use of the analysis in dental material strength testing. Data: References are made to examples in the engineering and dental literature, but this paper also includes illustrative analyses of Weibull plots, fractographic interpretations, and Weibull distribution parameters obtained for a dense alumina, two feldspathic porcelains, and a zirconia. Sources: Informational sources include Weibull's original articles, later articles specific to applications and theoretical foundations of Weibull analysis, texts on statistics and fracture mechanics and the international standards literature. Study Selection: The chosen Weibull analyses are used to illustrate technique, the importance of flaw size distributions, physical meaning of Weibull parameters and concepts of “equivalent volumes” to compare measured strengths obtained from different test configurations. Conclusions: Weibull analysis has a strong theoretical basis and can be of particular value in dental applications, primarily because of test specimen size limitations and the use of different test configurations. Also endemic to dental materials, however, is increased difficulty in satisfying application requirements, such as confirming fracture origin type and diligence in obtaining quality strength data. PMID:19945745
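A basic Weibull strength analysis of the kind reviewed here can be sketched as a least-squares fit on the Weibull plot (specimen data simulated; the plotting-position choice is one common convention among several):

```python
import numpy as np

def weibull_plot_fit(strengths):
    """Least-squares fit on the Weibull plot:
    ln(-ln(1 - F_i)) = m * ln(sigma_i) - m * ln(sigma_0),
    returning the Weibull modulus m and characteristic strength sigma_0."""
    s = np.sort(np.asarray(strengths, dtype=float))
    n = len(s)
    F = (np.arange(1, n + 1) - 0.5) / n     # (i - 0.5)/n plotting positions
    y = np.log(-np.log(1.0 - F))
    m, b = np.polyfit(np.log(s), y, 1)
    return m, np.exp(-b / m)

rng = np.random.default_rng(0)
sig0, m_true = 400.0, 10.0                  # MPa; invented, ceramic-like values
data = sig0 * rng.weibull(m_true, 30)       # 30 simulated flexure specimens
m_hat, sig0_hat = weibull_plot_fit(data)
print(m_hat, sig0_hat)
```

With only 30 specimens the modulus estimate scatters noticeably, which is one reason the review stresses diligence in obtaining quality strength data.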
Estimating the Parameters of the Marshall-Olkin Bivariate Weibull Distribution by EM Algorithm
Kundu, Debasis
In this paper we consider the Marshall-Olkin bivariate Weibull distribution. The Marshall-Olkin bivariate Weibull distribution is a singular distribution, whose both
Analysis of interval-censored data with Weibull lifetime distribution
Kundu, Debasis
In our simulation experiments it is observed that the Newton-Raphson method may not converge many times. Keywords: Gibbs sampling; HPD credible interval; Lindley's approximation; importance sampling; SQC.
NASA Technical Reports Server (NTRS)
Wheeler, J. T.
1990-01-01
The Weibull process, identified as the inhomogeneous Poisson process with the Weibull intensity function, is used to model the reliability growth assessment of the space shuttle main engine test and flight failure data. Additional tables of percentage-point probabilities for several different values of the confidence coefficient have been generated for setting (1-alpha)100-percent two sided confidence interval estimates on the mean time between failures. The tabled data pertain to two cases: (1) time-terminated testing, and (2) failure-terminated testing. The critical values of the three test statistics, namely Cramer-von Mises, Kolmogorov-Smirnov, and chi-square, were calculated and tabled for use in the goodness of fit tests for the engine reliability data. Numerical results are presented for five different groupings of the engine data that reflect the actual response to the failures.
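The power-law (Weibull-intensity) process and its standard maximum-likelihood estimates for a time-terminated test can be sketched as follows (parameter values invented):

```python
import numpy as np

rng = np.random.default_rng(0)
lam, beta, T = 0.5, 0.6, 1000.0   # beta < 1: decreasing rate, i.e. reliability growth

# Simulate the Weibull (power-law) process: expected count Lambda(t) = lam * t**beta.
# Given N(T) = n, event times are iid with CDF Lambda(t)/Lambda(T), so map
# uniforms on (0, Lambda(T)) through the inverse of Lambda.
n = rng.poisson(lam * T ** beta)
s = np.sort(rng.uniform(0, lam * T ** beta, n))
t = (s / lam) ** (1 / beta)

# Standard maximum-likelihood estimates for a time-terminated test at T
beta_hat = n / np.sum(np.log(T / t))
lam_hat = n / T ** beta_hat
print(n, beta_hat, lam_hat)
```

The estimated beta below 1 indicates a failure rate decreasing with cumulative test time, the signature of reliability growth that the report assesses for the engine data.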
Jo, Byung Wan; Chakraborty, Sumit; Kim, Heon
2015-09-01
This data article provides comparison data for nano-cement based concrete (NCC) and ordinary Portland cement based concrete (OPCC). Concrete samples (OPCC) were fabricated using ten different mix designs and their characterization data is provided here. Optimization of curing time using the Weibull distribution model was done by analyzing the rate of change of compressive strength of the OPCC. Initially, the compressive strength of the OPCC samples was measured after completion of four desired curing times. Thereafter, the required curing time to achieve a particular rate of change of the compressive strength has been predicted utilizing the equation derived from the variation of the rate of change of compressive strength with the curing time, prior to the optimization of the curing time (at the 99.99% confidence level) using the Weibull distribution model. This data article complements the research article entitled "Prediction of the curing time to achieve maturity of the nano-cement based concrete using the Weibull distribution model" [1]. PMID:26217804
Weibull-Based Design Methodology for Rotating Aircraft Engine Structures
NASA Technical Reports Server (NTRS)
Zaretsky, Erwin; Hendricks, Robert C.; Soditus, Sherry
2002-01-01
The NASA Energy Efficient Engine (E(sup 3)-Engine) is used as the basis of a Weibull-based life and reliability analysis. Each component's life and thus the engine's life is defined by high-cycle fatigue (HCF) or low-cycle fatigue (LCF). Knowing the cumulative life distribution of each of the components making up the engine as represented by a Weibull slope is a prerequisite to predicting the life and reliability of the entire engine. As the engine Weibull slope increases, the predicted lives decrease. The predicted engine lives L(sub 5) (95% probability of survival) of approximately 17,000 and 32,000 hr do correlate with current engine maintenance practices without and with refurbishment, respectively. The individual high pressure turbine (HPT) blade lives necessary to obtain a blade system life L(sub 0.1) (99.9% probability of survival) of 9000 hr for Weibull slopes of 3, 6 and 9 are 47,391, 20,652 and 15,658 hr, respectively. For a design life of the HPT disks having probable points of failure equal to or greater than 36,000 hr at a probability of survival of 99.9%, the predicted disk system life L(sub 0.1) can vary from 9,408 to 24,911 hr.
Aragao, Glaucia M F; Corradini, Maria G; Normand, Mark D; Peleg, Micha
2007-11-01
Published survival curves of Escherichia coli in two growth media, with and without the presence of salt, at various temperatures and in a Greek eggplant salad having various levels of essential oil, all had a characteristic downward concavity when plotted on semilogarithmic coordinates. Some also exhibited what appeared as a 'shoulder' of considerable length. Regardless of whether a shoulder was noticed, the survival pattern could be considered as a manifestation of an underlying unimodal distribution of the cells' death times. Mathematically, the data could be described equally well by the Weibull and log normal distribution functions, which had similar modes, means, standard deviations and coefficients of skewness. When plotted in their probability density function (PDF) form, the curves also appeared very similar visually. This enabled us to quantify and compare the effect of temperature or essential oil concentration on the organism's survival in terms of these temporal distributions' characteristics. Increased lethality was generally expressed in a shorter mean and mode, a smaller standard deviation and increased overall symmetry as judged by the distributions' degree of skewness. The 'shoulder', as expected, simply indicated that the distribution's standard deviation was much smaller than its mode. Rate models based on the two distribution functions could be used to predict non-isothermal survival patterns. They were derived on the assumption that the momentary inactivation rate is the isothermal rate at the momentary temperature at a time that corresponds to the momentary survival ratio. In this application, however, the Weibullian model with a fixed power was not only simpler and more convenient mathematically than the one based on the log normal distribution, but it also provided more accurate estimates of the dynamic inactivation patterns. PMID:17869362
Robust Fitting of a Weibull Model with Optional Censoring
Yang, Jingjing; Scott, David W.
2013-01-01
The Weibull family is widely used to model failure data, or lifetime data, although the classical two-parameter Weibull distribution is limited to positive data and monotone failure rate. The parameters of the Weibull model are commonly obtained by maximum likelihood estimation; however, it is well-known that this estimator is not robust when dealing with contaminated data. A new robust procedure is introduced to fit a Weibull model by using L2 distance, i.e. integrated square distance, of the Weibull probability density function. The Weibull model is augmented with a weight parameter to robustly deal with contaminated data. Results comparing a maximum likelihood estimator with an L2 estimator are given in this article, based on both simulated and real data sets. It is shown that this new L2 parametric estimation method is more robust and does a better job than maximum likelihood in the newly proposed Weibull model when data are contaminated. The same preference for L2 distance criterion and the new Weibull model also happens for right-censored data with contamination. PMID:23888090
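The L2-distance (L2E) criterion described above can be sketched numerically: one minimizes the unbiased estimate of the integrated squared error, ∫f² dx − (2/n)Σf(xᵢ), over the Weibull parameters, which naturally downweights gross outliers. The following is an illustrative sketch on assumed synthetic data, not the authors' implementation (their model additionally carries a weight parameter, omitted here):

```python
import numpy as np
from scipy import integrate, optimize, stats

def l2e_objective(params, data):
    """L2E criterion for a two-parameter Weibull: int f^2 dx - 2*mean(f(x_i))."""
    k, lam = params
    # Guard: f^2 is not integrable near 0 for k <= 0.5
    if k <= 0.55 or lam <= 0:
        return np.inf
    pdf = lambda x: stats.weibull_min.pdf(x, k, scale=lam)
    int_f2, _ = integrate.quad(lambda x: pdf(x) ** 2, 0, np.inf)
    return int_f2 - 2.0 * np.mean(pdf(data))

rng = np.random.default_rng(0)
clean = stats.weibull_min.rvs(2.0, scale=1.0, size=200, random_state=rng)
contaminated = np.concatenate([clean, np.full(20, 25.0)])  # ~9% gross outliers

res = optimize.minimize(l2e_objective, x0=[1.0, 1.0], args=(contaminated,),
                        method="Nelder-Mead")
k_hat, lam_hat = res.x
```

Despite the contamination at 25.0, the L2E estimates stay close to the clean-data parameters (shape 2, scale 1), whereas plain maximum likelihood would be pulled toward the outliers.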
Harris, S.; Gross, R.; Mitchell, E.
2011-01-18
The Savannah River Site (SRS) spring operated pressure relief valve (SORV) maintenance intervals were evaluated using an approach provided by the American Petroleum Institute (API RP 581) for risk-based inspection technology (RBI). In addition, the impact of extending the inspection schedule was evaluated using Monte Carlo Simulation (MCS). The API RP 581 approach is characterized as a Weibull analysis with modified Bayesian updating provided by SRS SORV proof testing experience. Initial Weibull parameter estimates were updated as per SRS's historical proof test records contained in the Center for Chemical Process Safety (CCPS) Process Equipment Reliability Database (PERD). The API RP 581 methodology was used to estimate the SORV's probability of failing on demand (PFD), and the annual expected risk. The API RP 581 methodology indicates that the current SRS maintenance plan is conservative. Cost savings may be attained in certain mild service applications that present low PFD and overall risk. Current practices are reviewed and recommendations are made for extending inspection intervals. The paper gives an illustration of the inspection costs versus the associated risks by using API RP 581 Risk Based Inspection (RBI) Technology. A cost effective maintenance frequency balancing both financial risk and inspection cost is demonstrated.
Finite-size effects on return interval distributions for weakest-link-scaling systems
NASA Astrophysics Data System (ADS)
Hristopulos, Dionissios T.; Petrakis, Manolis P.; Kaniadakis, Giorgio
2014-05-01
The Weibull distribution is a commonly used model for the strength of brittle materials and earthquake return intervals. Deviations from Weibull scaling, however, have been observed in earthquake return intervals and the fracture strength of quasibrittle materials. We investigate weakest-link scaling in finite-size systems and deviations of empirical return interval distributions from the Weibull distribution function. Our analysis employs the ansatz that the survival probability function of a system with complex interactions among its units can be expressed as the product of the survival probability functions for an ensemble of representative volume elements (RVEs). We show that if the system comprises a finite number of RVEs, it obeys the κ-Weibull distribution. The upper tail of the κ-Weibull distribution declines as a power law in contrast with Weibull scaling. The hazard rate function of the κ-Weibull distribution decreases linearly after a waiting time τ_c ∝ n^(1/m), where m is the Weibull modulus and n is the system size in terms of representative volume elements. We conduct statistical analysis of experimental data and simulations which show that the κ-Weibull provides competitive fits to the return interval distributions of seismic data and of avalanches in a fiber bundle model. In conclusion, using theoretical and statistical analysis of real and simulated data, we demonstrate that the κ-Weibull distribution is a useful model for extreme-event return intervals in finite-size systems.
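For reference, the κ-Weibull survival function is commonly built on the Kaniadakis κ-exponential. The parametrization below is an assumption drawn from that literature, not from the abstract itself; it is included only to make the power-law upper tail concrete:

```python
import numpy as np

def kappa_exp(x, kappa):
    """Kaniadakis kappa-exponential; reduces to np.exp(x) as kappa -> 0."""
    return (np.sqrt(1.0 + kappa**2 * x**2) + kappa * x) ** (1.0 / kappa)

def kappa_weibull_survival(t, scale, m, kappa):
    # Assumed form: S(t) = exp_kappa(-(t/scale)**m).
    # For large t the tail behaves as a power law ~ t**(-m/kappa),
    # in contrast with the ordinary Weibull's stretched-exponential decay.
    return kappa_exp(-((t / scale) ** m), kappa)

s0 = kappa_weibull_survival(0.0, 1.0, 1.0, 0.5)  # survival starts at 1
# Log-log slope of the upper tail; should approach -m/kappa = -2 here
slope = (np.log(kappa_weibull_survival(1e4, 1.0, 1.0, 0.5))
         - np.log(kappa_weibull_survival(1e2, 1.0, 1.0, 0.5))) / np.log(1e2)
```

As κ → 0 the ordinary Weibull survival exp(−(t/scale)^m) is recovered, so κ controls how heavy the finite-size tail is.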
Unified physics of stretched exponential relaxation and Weibull fracture statistics
NASA Astrophysics Data System (ADS)
Mauro, John C.; Smedskjaer, Morten M.
2012-12-01
The complicated nature of materials often necessitates a statistical approach to understanding and predicting their underlying physics. One such example is the empirical Weibull distribution used to describe the fracture statistics of brittle materials such as glass and ceramics. The Weibull distribution adopts the same mathematical form as proposed by Kohlrausch for stretched exponential relaxation. Although it was also originally proposed as a strictly empirical expression, stretched exponential decay has more recently been derived from the Phillips diffusion-trap model, which links the dimensionless stretching exponent to the topology of excitations in a glassy network. In this paper we propose an analogous explanation as a physical basis for the Weibull distribution, with an ensemble of flaws in the brittle material serving as a substitute for the traps in the Phillips model. One key difference between stretched exponential relaxation and Weibull fracture statistics is the effective dimensionality of the system. We argue that the stochastic description of the flaw space in the Weibull distribution results in a negative dimensionality, which explains the difference in magnitude of the dimensionless Weibull modulus compared to the stretching relaxation exponent.
Weibull crack density coefficient for polydimensional stress states
NASA Technical Reports Server (NTRS)
Gross, Bernard; Gyekenyesi, John P.
1989-01-01
A structural ceramic analysis and reliability evaluation code has recently been developed encompassing volume and surface flaw induced fracture, modeled by the two-parameter Weibull probability density function. A segment of the software involves computing the Weibull polydimensional stress state crack density coefficient from uniaxial stress experimental fracture data. The relationship of the polydimensional stress coefficient to the uniaxial stress coefficient is derived for a shear-insensitive material with a random surface flaw population.
Bayes Estimation for the Block and Basu Bivariate and Multivariate Weibull
Kundu, Debasis
Bayes Estimation for the Block and Basu Bivariate and Multivariate Weibull Distributions. Biswabrata. Abstract: In this paper we consider Bayesian inference of the unknown parameters of the Block and Basu bivariate Weibull distribution. It is observed that the Bayes estimators of the unknown parameters cannot be obtained in explicit forms. We
Analysis of neuronal soma size distributions.
Oyster, C W; Takahashi, E S; Hurst, D C
1982-11-01
Neuronal soma size distributions are often very skewed, leading to difficulty in their characterization and in the application of statistics commonly used to describe normally distributed variables. A particular family of curves, called Weibull functions, can be shown to fit cell size data extremely well. These functions not only characterize these skewed distributions, but do so in a way which permits powerful, sensitive comparisons of differences between distributions. Because of the flexibility of the Weibull functions, they may be used to describe a variety of morphological attributes. Several applications of these functions to neuroanatomical data are described using methods which require only graph paper and hand calculator for curve fitting. An extension of the basic curve fitting method permits mixtures of Weibull functions to be used in describing multimodal size histograms. PMID:7154713
Wang, Mingyu; Han, Lijuan; Liu, Shasha; Zhao, Xuebing; Yang, Jinghua; Loh, Soh Kheang; Sun, Xiaomin; Zhang, Chenxi; Fang, Xu
2015-09-01
Renewable energy from lignocellulosic biomass has been deemed an alternative to depleting fossil fuels. In order to improve this technology, we aim to develop robust mathematical models for the enzymatic lignocellulose degradation process. By analyzing 96 groups of previously published and newly obtained lignocellulose saccharification results and fitting them to the Weibull distribution, we discovered Weibull statistics can accurately predict lignocellulose saccharification data, regardless of the type of substrates, enzymes and saccharification conditions. A mathematical model for enzymatic lignocellulose degradation was subsequently constructed based on Weibull statistics. Further analysis of the mathematical structure of the model and experimental saccharification data showed the significance of the two parameters in this model. In particular, the λ value, defined as the characteristic time, represents the overall performance of the saccharification system. This suggestion was further supported by statistical analysis of experimental saccharification data and analysis of the glucose production levels when λ and n values change. In conclusion, the constructed Weibull statistics-based model can accurately predict lignocellulose hydrolysis behavior and we can use the λ parameter to assess the overall performance of enzymatic lignocellulose degradation. Advantages and potential applications of the model and the λ value in saccharification performance assessment were discussed. PMID:26121186
Estimating System Reliability of Competing Weibull Failures with Censored Sampling
A. M. Abd-Elfattah; Marwa O. Mohamed; Saudi Arabia
2009-01-01
In this paper, we consider the estimation of R = P(Y < X) where X and Y have two independent Weibull distributions with different scale parameters and the same shape parameter. We used different methods for estimating R. Assuming that the common shape parameter is known, the maximum likelihood, uniformly minimum variance unbiased and Bayes estimators for R
NASA Technical Reports Server (NTRS)
Pai, Shantaram S.; Gyekenyesi, John P.
1988-01-01
The calculation of shape and scale parameters of the two-parameter Weibull distribution is described using the least-squares analysis and maximum likelihood methods for volume- and surface-flaw-induced fracture in ceramics with complete and censored samples. Detailed procedures are given for evaluating 90 percent confidence intervals for maximum likelihood estimates of shape and scale parameters, the unbiased estimates of the shape parameters, and the Weibull mean values and corresponding standard deviations. Furthermore, the necessary steps are described for detecting outliers and for calculating the Kolmogorov-Smirnov and the Anderson-Darling goodness-of-fit statistics and 90 percent confidence bands about the Weibull distribution. It also shows how to calculate the Batdorf flaw-density constants by using the Weibull distribution statistical parameters. The techniques described were verified with several example problems from the open literature, and were coded in the Structural Ceramics Analysis and Reliability Evaluation (SCARE) design program.
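As a minimal illustration of the maximum likelihood step described above (using assumed synthetic data and scipy's generic fitter, not the SCARE code itself):

```python
import numpy as np
from scipy import stats

# Synthetic fracture strengths (MPa) drawn from a known Weibull law
rng = np.random.default_rng(42)
strengths = stats.weibull_min.rvs(8.0, scale=350.0, size=30, random_state=rng)

# Two-parameter MLE: fix the location at zero so only the shape
# (Weibull modulus) and scale (characteristic strength) are estimated
shape_hat, _, scale_hat = stats.weibull_min.fit(strengths, floc=0)
```

With 30 specimens the shape estimate scatters noticeably around the true value of 8, which is why the report above also covers unbiasing corrections and 90 percent confidence intervals.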
Chakraborty, Snehasis; Rao, Pavuluri Srinivasa; Mishra, Hari Niwas
2015-10-15
High pressure inactivation of natural microbiota viz. aerobic mesophiles (AM), psychrotrophs (PC), yeasts and molds (YM), total coliforms (TC) and lactic acid bacteria (LAB) in pineapple puree was studied within the experimental domain of 0.1-600MPa and 30-50°C with a treatment time up to 20min. A complete destruction of yeasts and molds was obtained at 500MPa/50°C/15min; whereas no counts were detected for TC and LAB at 300MPa/30°C/15min. A maximum of two log cycle reductions was obtained for YM during pulse pressurization at the severe process intensity of 600MPa/50°C/20min. The Weibull model clearly described the non-linearity of the survival curves during the isobaric period. The tailing effect, as confirmed by the shape parameter (β) of the survival curve, was obtained in case of YM (β < 1); whereas a shouldering effect (β > 1) was observed for the other microbial groups. Analogous to thermal death kinetics, the activation energy (Ea, kJ·mol(-1)) and the activation volume (Va, mL·mol(-1)) values were computed further to describe the temperature and pressure dependencies of the scale parameter (δ, min), respectively. A higher δ value was obtained for each microbe at a lower temperature and it decreased with an increase in pressure. A secondary kinetic model was developed describing the inactivation rate (k, min(-1)) as a function of pressure (P, MPa) and temperature (T, K) including the dependencies of Ea and Va on P and T, respectively. PMID:26202323
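The Weibullian survival model referenced above is commonly written (in one assumed notation) as log10(N/N0) = −(t/δ)^β, which makes the shoulder/tailing distinction easy to see:

```python
import numpy as np

def log10_survival(t, delta, beta):
    """Weibullian survival model: log10(N/N0) = -(t/delta)**beta."""
    return -(t / delta) ** beta

# beta < 1 gives tailing (curve is concave up on semilog coordinates),
# beta > 1 gives a shoulder (curve is concave down)
t = np.linspace(0, 20, 5)            # treatment times, min (assumed values)
tailing = log10_survival(t, delta=5.0, beta=0.6)
shoulder = log10_survival(t, delta=5.0, beta=1.8)
```

At t = δ both curves pass through exactly one log cycle of reduction, which is the physical meaning of the scale parameter.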
NASA Astrophysics Data System (ADS)
Symons, Philip C.; Weaver, Robert D.
A discussion is presented of a microcomputer software package, WEIPLOT, that allows the user to fit observed failure data to Weibull distributions. The data are fitted and the results displayed using the Weibull characteristic life and shape factor. The software (for IBM-AT computers or clones) allows data to be entered from the keyboard or from a disk file. WEIPLOT allows the raw data to be displayed on screen on a linear time scale in addition to the Weibull plot. The plots can be printed using dot matrix or laser printers. The data can be fitted using least-squares or by-eye methods. WEIPLOT is user friendly and prevents attempts to fit non-Weibull distributions, e.g., data sets in which no failures have yet been observed or there has been only one failure, or a set in which all failures occurred at the same time. Confidence bands around the estimates are also calculated.
NASA Technical Reports Server (NTRS)
Szatmary, Steven A.; Gyekenyesi, John P.; Nemeth, Noel N.
1990-01-01
This manual describes the operation and theory of the PC-CARES (Personal Computer-Ceramic Analysis and Reliability Evaluation of Structures) computer program for the IBM PC and compatibles running PC-DOS/MS-DOS or IBM/MS-OS/2 (version 1.1 or higher) operating systems. The primary purpose of this code is to estimate Weibull material strength parameters, the Batdorf crack density coefficient, and other related statistical quantities. Included in the manual is the description of the calculation of shape and scale parameters of the two-parameter Weibull distribution using the least-squares analysis and maximum likelihood methods for volume- and surface-flaw-induced fracture in ceramics with complete and censored samples. The methods for detecting outliers and for calculating the Kolmogorov-Smirnov and the Anderson-Darling goodness-of-fit statistics and 90 percent confidence bands about the Weibull line, as well as the techniques for calculating the Batdorf flaw-density constants are also described.
Fanfani, Alessandra; Afaq, Anzar; Sanches, Jose Afonso; Andreeva, Julia; Bagliesi, Giusepppe; Bauerdick, Lothar; Belforte, Stefano; Bittencourt Sampaio, Patricia; Bloom, Ken; Blumenfeld, Barry; Bonacorsi, Daniele; /INFN, Bologna /Bologna U. /Rutherford
2010-03-20
The CMS experiment expects to manage several Pbytes of data each year during the LHC programme, distributing them over many computing sites around the world and enabling data access at those centers for analysis. CMS has identified the distributed sites as the primary location for physics analysis to support a wide community with thousands of potential users. This represents an unprecedented experimental challenge in terms of the scale of distributed computing resources and number of users. An overview of the computing architecture, the software tools and the distributed infrastructure is reported. Summaries of the experience in establishing efficient and scalable operations to get prepared for CMS distributed analysis are presented, followed by the user experience in their current analysis activities.
Cherkasov, Artem; Ho Sui, Shannan J; Brunham, Robert C; Jones, Steven JM
2004-01-01
Background We establish that the occurrence of protein folds among genomes can be accurately described with a Weibull function. Systems which exhibit Weibull character can be interpreted with reliability theory commonly used in engineering analysis. For instance, Weibull distributions are widely used in reliability, maintainability and safety work to model time-to-failure of mechanical devices, mechanisms, building constructions and equipment. Results We have found that the Weibull function describes protein fold distribution within and among genomes more accurately than conventional power functions which have been used in a number of structural genomic studies reported to date. It has also been found that the Weibull reliability parameter β for protein fold distributions varies between genomes and may reflect differences in rates of gene duplication in evolutionary history of organisms. Conclusions The results of this work demonstrate that reliability analysis can provide useful insights and testable predictions in the fields of comparative and structural genomics. PMID:15274750
Measuring the Weibull modulus of microscope slides
NASA Technical Reports Server (NTRS)
Sorensen, Carl D.
1992-01-01
The objectives are that students will understand why a three-point bending test is used for ceramic specimens, learn how Weibull statistics are used to measure the strength of brittle materials, and appreciate the amount of variation in the strength of brittle materials with low Weibull modulus. They will understand how the modulus of rupture is used to represent the strength of specimens in a three-point bend test. In addition, students will learn that a logarithmic transformation can be used to convert an exponent into the slope of a straight line. The experimental procedures are explained.
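The logarithmic transformation mentioned above can be sketched directly: taking ln(ln(1/(1−Pf))) of the two-parameter Weibull CDF turns the modulus m into the slope of a straight line in ln(strength). The data and the median-rank probability estimator below are assumptions for illustration:

```python
import numpy as np

# Simulated modulus-of-rupture data (MPa) from a three-point bend test
rng = np.random.default_rng(1)
m_true, sigma0_true = 10.0, 300.0
strengths = np.sort(sigma0_true * rng.weibull(m_true, size=30))

# Median-rank estimate of failure probability for each ranked specimen
n = len(strengths)
pf = (np.arange(1, n + 1) - 0.5) / n

# Linearization: ln(ln(1/(1 - Pf))) = m*ln(sigma) - m*ln(sigma0),
# so a straight-line fit recovers the Weibull modulus m as the slope
x = np.log(strengths)
y = np.log(np.log(1.0 / (1.0 - pf)))
m_hat, intercept = np.polyfit(x, y, 1)
sigma0_hat = np.exp(-intercept / m_hat)
```

This is exactly the graph-paper exercise: a low slope on the Weibull plot means large scatter in strength, which is the point the laboratory demonstrates.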
Bartsch, R.R.
1995-09-01
Key elements of the 36 MJ ATLAS capacitor bank have been evaluated for individual probabilities of failure. These have been combined to estimate system reliability which is to be greater than 95% on each experimental shot. This analysis utilizes Weibull or Weibull-like distributions with increasing probability of failure with the number of shots. For transmission line insulation, a minimum thickness is obtained and for the railgaps, a method for obtaining a maintenance interval from forthcoming life tests is suggested.
Legger, Federica; The ATLAS collaboration
2015-01-01
The ATLAS experiment accumulated more than 140 PB of data during the first run of the Large Hadron Collider (LHC) at CERN. The analysis of such an amount of data for the distributed physics community is a challenging task. The Distributed Analysis (DA) system of the ATLAS experiment is an established and stable component of the ATLAS distributed computing operations. About half a million user jobs are daily running on DA resources, submitted by more than 1500 ATLAS physicists. The reliability of the DA system during the first run of the LHC and the following shutdown period has been high thanks to the continuous automatic validation of the distributed analysis sites and the user support provided by a dedicated team of expert shifters. During the LHC shutdown, the ATLAS computing model has undergone several changes to improve the analysis workflows, including the re-design of the production system, a new analysis data format and event model, and the development of common reduction and analysis frameworks. We r...
Transmission overhaul and replacement predictions using Weibull and renewal theory
NASA Technical Reports Server (NTRS)
Savage, M.; Lewicki, D. G.
1989-01-01
A method to estimate the frequency of transmission overhauls is presented. This method is based on the two-parameter Weibull statistical distribution for component life. A second method is presented to estimate the number of replacement components needed to support the transmission overhaul pattern. The second method is based on renewal theory. Confidence statistics are applied with both methods to improve the statistical estimate of sample behavior. A transmission example is also presented to illustrate the use of the methods. Transmission overhaul frequency and component replacement calculations are included in the example.
Modeling of Geometric Size Distribution of Almond
Mahmood Mahmoodi; Javad Khazaei; Narjes Mohamadi
2010-01-01
The purpose of this study was to model the mass and size distribution of three varieties of almond and its kernel (seed) using the Weibull distribution function. Furthermore, some physical properties of the seeds were measured using an image processing technique. A two-parameter Weibull distribution function was chosen for modeling the size and mass distributions. The Weibull distribution of width was better modeled
NASA Astrophysics Data System (ADS)
Proussevitch, Alexander A.; Sahagian, Dork L.; Carlson, William D.
2007-07-01
A new analytical technique for the statistical analysis of bubble populations in volcanic rocks [Proussevitch, A.A., Sahagian, D.L. and Tsentalovich, E.P., 2007-this issue. Statistical analysis of bubble and crystal size distributions: Formulations and procedures. J. Volc. Geotherm. Res.] has been applied to a collection of Colorado Plateau basalts (96 samples). A variety of mono- and polymodal distributions has been found in the samples, all of which belong to the logarithmic family of statistical functions. Most samples have bimodal log normal distributions, while the others are represented by mono- or bimodal log logistic, and Weibull distributions. We have grouped the observed distributions into 11 groups depending on distribution types, mode location, and intensity. The nature of the curves within these groups can be interpreted as evolution of vesiculation processes. We conclude that within bimodal log normal distributions, the mode of smaller bubbles is the result of a second nucleation and growth event in a lava flow after eruption. In the case of log logistic distributions the larger mode results from coalescence of bubbles. Coalescence processes are reflected in growth of a larger mode and decreasing bubble number density. Another style of population evolution leads to a monomodal Weibull (or exponential) distribution as a result of superposition of multiple log normal distributions in which the modes are comparable in size and intensity. These various population distribution styles can be interpreted with an understanding of vesiculation processes that can be gained through appropriate numerical models of coalescence and population evolution. The applicable vesiculation processes include: a) a single nucleation-growth event, b) continuous multiple nucleation-growth events, c) coalescence, and d) Ostwald ripening.
Survival extrapolation using the poly-Weibull model
Lunn, David; Sharples, Linda D
2015-01-01
Recent studies of (cost-) effectiveness in cardiothoracic transplantation have required estimation of mean survival over the lifetime of the recipients. In order to calculate mean survival, the complete survivor curve is required but is often not fully observed, so that survival extrapolation is necessary. After transplantation, the hazard function is bathtub-shaped, reflecting latent competing risks which operate additively in overlapping time periods. The poly-Weibull distribution is a flexible parametric model that may be used to extrapolate survival and has a natural competing risks interpretation. In addition, treatment effects and subgroups can be modelled separately for each component of risk. We describe the model and develop inference procedures using freely available software. The methods are applied to two problems from cardiothoracic transplantation. PMID:21937472
Moment series for the coefficient of variation in Weibull sampling
Bowman, K.O.; Shenton, L.R.
1981-01-01
For the 2-parameter Weibull distribution function F(t) = 1 − exp(−(t/b)^c), t > 0, with c and b positive, a moment estimator c* for c is the solution of the equation Γ(1 + 2/c*)/Γ²(1 + 1/c*) = 1 + v*², where v* is the coefficient of variation in the form √m₂/m₁′, m₁′ being the sample mean and m₂ the sample second central moment (it is trivial in the present context to replace m₂ by the variance). One approach to the moments of c* (Bowman and Shenton, 1981) is to set up moment series for the scale-free v*. The series are apparently divergent and summation algorithms are essential; we consider methods due to Levin (1973) and one introduced by ourselves (Bowman and Shenton, 1976).
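The moment-estimator equation above has no closed-form solution for c*, but it is monotone in c and solves readily by root bracketing. A sketch (the bracketing interval is an assumption chosen to cover practical shape values):

```python
import numpy as np
from scipy.optimize import brentq
from scipy.special import gamma

def cv_squared(c):
    """Squared coefficient of variation of a Weibull(c, b) variate; b cancels."""
    return gamma(1.0 + 2.0 / c) / gamma(1.0 + 1.0 / c) ** 2 - 1.0

def moment_estimate_c(v):
    """Solve Gamma(1 + 2/c) / Gamma^2(1 + 1/c) = 1 + v**2 for the shape c."""
    return brentq(lambda c: cv_squared(c) - v**2, 0.05, 50.0)

# Sanity check: for c = 2 the coefficient of variation is sqrt(4/pi - 1)
c = moment_estimate_c(np.sqrt(4.0 / np.pi - 1.0))
```

Because cv_squared is strictly decreasing in c, the bracket guarantees a unique root whenever the sample coefficient of variation falls in the covered range.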
The distribution of first-passage times and durations in FOREX and future markets
NASA Astrophysics Data System (ADS)
Sazuka, Naoya; Inoue, Jun-ichi; Scalas, Enrico
2009-07-01
Possible distributions are discussed for intertrade durations and first-passage processes in financial markets. The view-point of renewal theory is assumed. In order to represent market data with relatively long durations, two types of distributions are used, namely a distribution derived from the Mittag-Leffler survival function and the Weibull distribution. For the Mittag-Leffler type distribution, the average waiting time (residual life time) is strongly dependent on the choice of a cut-off parameter t_max, whereas the results based on the Weibull distribution do not depend on such a cut-off. Therefore, a Weibull distribution is more convenient than a Mittag-Leffler type if one wishes to evaluate relevant statistics such as average waiting time in financial markets with long durations. On the other hand, we find that the Gini index is rather independent of the cut-off parameter. Based on the above considerations, we propose a good candidate for describing the distribution of first-passage time in a market: The Weibull distribution with a power-law tail. This distribution compensates the gap between theoretical and empirical results more efficiently than a simple Weibull distribution. It should be stressed that a Weibull distribution with a power-law tail is more flexible than the Mittag-Leffler distribution, which itself can be approximated by a Weibull distribution and a power-law. Indeed, the key point is that in the former case there is freedom of choice for the exponent of the power-law attached to the Weibull distribution, which can exceed 1 in order to reproduce decays faster than possible with a Mittag-Leffler distribution. We also give a useful formula to determine an optimal crossover point minimizing the difference between the empirical average waiting time and the one predicted from renewal theory.
Moreover, we discuss the limitation of our distributions by applying our distribution to the analysis of the BTP future and calculating the average waiting time. We find that our distribution is applicable as long as durations follow a Weibull law for short times and do not have too heavy a tail.
Fracture strength of ultrananocrystalline diamond thin films—identification of Weibull parameters
NASA Astrophysics Data System (ADS)
Espinosa, H. D.; Peng, B.; Prorok, B. C.; Moldovan, N.; Auciello, O.; Carlisle, J. A.; Gruen, D. M.; Mancini, D. C.
2003-11-01
The fracture strength of ultrananocrystalline diamond (UNCD) has been investigated using tensile testing of freestanding submicron films. Specifically, the fracture strength of UNCD membranes, grown by microwave plasma chemical vapor deposition (MPCVD), was measured using the membrane deflection experiment developed by Espinosa and co-workers. The data show that fracture strength follows a Weibull distribution. Furthermore, we show that the Weibull parameters are highly dependent on the seeding process used in the growth of the films. When seeding was performed with microsized diamond particles, using mechanical polishing, the stress resulting in a probability of failure of 63% was found to be 1.74 GPa, and the Weibull modulus was 5.74. By contrast, when seeding was performed with nanosized diamond particles, using ultrasonic agitation, the stress resulting in a probability of failure of 63%, increased to 4.13 GPa, and the Weibull modulus was 10.76. The tests also provided the elastic modulus of UNCD, which was found to vary from 940 to 970 GPa for both micro- and nanoseeding. The investigation highlights the role of microfabrication defects on material properties and reliability, as a function of seeding technique, when identical MPCVD chemistry is employed. The parameters identified in this study are expected to aid the designer of microelectromechanical systems devices employing UNCD films.
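The "63% probability of failure" stress quoted above is simply the Weibull scale parameter: in the two-parameter form Pf = 1 − exp(−(σ/σ0)^m), setting σ = σ0 gives Pf = 1 − 1/e ≈ 0.632 for any modulus m. A quick check with the reported UNCD values:

```python
import numpy as np

def failure_probability(sigma, sigma0, m):
    """Two-parameter Weibull probability of failure at stress sigma (GPa)."""
    return 1.0 - np.exp(-((sigma / sigma0) ** m))

# At the characteristic strength sigma0, Pf = 1 - 1/e regardless of m
pf_micro = failure_probability(1.74, sigma0=1.74, m=5.74)   # mechanical seeding
pf_nano = failure_probability(4.13, sigma0=4.13, m=10.76)   # ultrasonic seeding
```

The higher modulus of the nanoseeded films means the failure probability rises much more steeply around σ0, i.e. the strength distribution is tighter, not just stronger.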
Yi, Xiang; Yao, Mingwu
2015-02-01
In this paper, we present analytical expressions for the performance of urban free-space optical (FSO) communication systems under the combined influence of atmospheric turbulence- and misalignment-induced fading (pointing errors). The atmospheric turbulence channel is modeled by the exponentiated Weibull (EW) distribution, which can accurately describe the probability density function (PDF) of the irradiance fluctuations associated with a transmitted Gaussian-beam wave and a finite-sized receiving aperture. The nonzero boresight pointing error PDF model, recently proposed to account for the effects of both boresight and jitter, is adopted in the analysis. We derive a novel expression for the composite PDF in terms of a convergent double series involving a Meijer G-function. Based on these statistical results, exact expressions for the average bit error rate of the on-off keying modulation scheme and for the outage probability are developed. To provide more insight, we also perform an asymptotic error rate analysis at high average signal-to-noise ratio. Our analytical results indicate that the diversity gain for the zero boresight case is determined only by the ratio between the equivalent beamwidth at the receiver and the jitter standard deviation, while for the nonzero boresight case, the diversity gain is related to the ratio of the equivalent beamwidth to the jitter variance as well as to the parameter of the EW distribution. PMID:25836152
Aerospace Applications of Weibull and Monte Carlo Simulation with Importance Sampling
NASA Technical Reports Server (NTRS)
Bavuso, Salvatore J.
1998-01-01
Recent developments in reliability modeling and computer technology have made it practical to use the Weibull time to failure distribution to model the system reliability of complex fault-tolerant computer-based systems. These system models are becoming increasingly popular in space systems applications as a result of mounting data that support the decreasing Weibull failure distribution and the expectation of increased system reliability. This presentation introduces the new reliability modeling developments and demonstrates their application to a novel space system application. The application is a proposed guidance, navigation, and control (GN&C) system for use in a long duration manned spacecraft for a possible Mars mission. Comparisons to the constant failure rate model are presented and the ramifications of doing so are discussed.
Strength analysis of yttria-stabilized tetragonal zirconia polycrystals
Noguchi, K.; Matsuda, Y.; Oishi, M. (Toray Research Center, Inc., Otsu, Shiga 520 (JP)); Masaki, T.; Nakayama, S.; Mizushina, M. (Toray Industries, Inc., Otsu, Shiga 520 (JP))
1990-09-01
This paper reports the tensile strength of Y₂O₃-stabilized ZrO₂ polycrystals (Y-TZP) measured by a newly developed tensile testing method with a rectangular bar. The tensile strength of Y-TZP was lower than the three-point bend strength, and the shape of the tensile strength distribution was quite different from that of the three-point bend strength distribution. It was difficult to predict the distribution curve of the tensile strength from the three-point bend strength data using a one-modal Weibull distribution. The distribution of the tensile strength was therefore analyzed by a two- or three-modal Weibull distribution coupled with an analysis of fracture origins. The distribution curve of the three-point bend strength estimated by the multimodal Weibull distribution agreed favorably with the measured three-point bend strength values. A two-modal Weibull distribution function was formulated approximately from the distributions of the tensile and three-point bend strengths, and the estimated two-modal Weibull distribution function for the four-point bend strength agreed well with the measured four-point bend strength.
Hirose, H
1997-01-01
This paper proposes a new treatment of electrical insulation degradation. Insulation that has been used under various circumstances is considered to degrade at rates that depend on the stress circumstances. The cross-linked polyethylene (XLPE) insulated cables inspected by major Japanese electric companies clearly indicate such phenomena. By assuming that an inspected specimen is sampled from one of several clustered groups, a mixed degradation model can be constructed. Since the degradation of insulation under common circumstances is considered to follow a Weibull distribution, a mixture model and a Weibull power law can be combined; this is called the mixture Weibull power law model. By applying maximum likelihood estimation for the newly proposed model to Japanese 22 and 33 kV insulation class cables, the cables are clustered into a certain number of groups using the AIC and the generalized likelihood ratio test method. The reliability of the cables at specified years is assessed. PMID:9384621
Mixture distributions of wind speed in the UAE
NASA Astrophysics Data System (ADS)
Shin, J.; Ouarda, T.; Lee, T. S.
2013-12-01
Wind speed probability distributions are commonly used to estimate potential wind energy. The 2-parameter Weibull distribution has been most widely used to characterize the distribution of wind speed. However, it cannot properly model wind speed regimes whose distributions present bimodal or kurtotic shapes. Several studies have concluded that the Weibull distribution should not be used for frequency analysis of wind speed without first investigating the shape of the wind speed distribution. Because of these mixture distributional characteristics of wind speed data, the application of mixture distributions should be further investigated in the frequency analysis of wind speed. A number of studies have investigated the potential wind energy in different parts of the Arabian Peninsula, and mixture distributional characteristics of wind speed were detected in some of them. Nevertheless, mixture distributions have not been employed for wind speed modeling in the Arabian Peninsula. In order to improve our understanding of wind energy potential in the Arabian Peninsula, mixture distributions should be tested for the frequency analysis of wind speed. The aim of the current study is to assess the suitability of mixture distributions for the frequency analysis of wind speed in the UAE. Hourly mean wind speed data at 10-m height from 7 stations were used in the current study. The Weibull and Kappa distributions were employed as representatives of conventional non-mixture distributions. Ten mixture distributions were constructed by mixing four probability distributions: the Normal, Gamma, Weibull, and extreme value type-one (EV-1) distributions. Three parameter estimation methods, the Expectation Maximization algorithm, the Least Squares method, and the Meta-Heuristic Maximum Likelihood (MHML) method, were employed to estimate the parameters of the mixture distributions.
In order to compare the goodness-of-fit of the tested distributions and parameter estimation methods for the sample wind data, the adjusted coefficient of determination, the Bayesian Information Criterion (BIC), and Chi-squared statistics were computed. Results indicate that MHML presents the best parameter estimation performance for the mixture distributions used. At most of the 7 stations, mixture distributions give the best fit. When the wind speed regime shows mixture distributional characteristics, most of these regimes present the kurtotic statistical characteristic. In particular, applications of mixture distributions at these stations show a significant improvement in explaining the whole wind speed regime. In addition, the Weibull-Weibull mixture distribution presents the best fit for the wind speed data in the UAE.
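A two-component Weibull-Weibull mixture of the kind found to fit best here is simply a weighted sum of Weibull densities. A minimal sketch (the weights, shapes, and scales below are illustrative, not fitted values from the study):

```python
import math

def weibull_pdf(v, k, c):
    """Weibull density with shape k and scale c."""
    return (k / c) * (v / c) ** (k - 1) * math.exp(-((v / c) ** k))

def weibull_mixture_pdf(v, w, k1, c1, k2, c2):
    """Two-component Weibull-Weibull mixture density; w is the weight
    of the first component."""
    return w * weibull_pdf(v, k1, c1) + (1.0 - w) * weibull_pdf(v, k2, c2)

# Illustrative bimodal wind regime: a weak-wind mode and a strong-wind mode.
print(weibull_mixture_pdf(5.0, 0.4, 2.0, 3.0, 4.0, 9.0))
```

Because each component integrates to one, any convex combination of them is again a valid density, which is what makes such mixtures convenient for bimodal regimes.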
Effect of Individual Component Life Distribution on Engine Life Prediction
NASA Technical Reports Server (NTRS)
Zaretsky, Erwin V.; Hendricks, Robert C.; Soditus, Sherry M.
2003-01-01
The effect of individual engine component life distributions on engine life prediction was determined. A Weibull-based life and reliability analysis of the NASA Energy Efficient Engine was conducted. The engine's life at a 95 and 99.9 percent probability of survival was determined based upon the engine manufacturer's original life calculations and assumed values of each component's cumulative life distribution as represented by a Weibull slope. The lives of the high-pressure turbine (HPT) disks and blades were also evaluated individually and as a system in a similar manner. Knowing the statistical cumulative distribution of each engine component with reasonable engineering certainty is a condition precedent to predicting the life and reliability of an entire engine. The life of a system at a given reliability will be less than that of the lowest-lived component in the system at the same reliability (probability of survival). Where the Weibull slopes of all the engine components are equal, the Weibull slope had a minimal effect on engine L0.1 life prediction. However, at a probability of survival of 95 percent (L5 life), life decreased with increasing Weibull slope.
The ATLAS distributed analysis system
NASA Astrophysics Data System (ADS)
Legger, F.; Atlas Collaboration
2014-06-01
In the LHC operations era, analysis of the multi-petabyte ATLAS data sample by globally distributed physicists is a challenging task. To attain the required scale the ATLAS Computing Model was designed around the concept of Grid computing, realized in the Worldwide LHC Computing Grid (WLCG), the largest distributed computational resource existing in the sciences. The ATLAS experiment currently stores over 140 PB of data and runs about 140,000 concurrent jobs continuously at WLCG sites. During the first run of the LHC, the ATLAS Distributed Analysis (DA) service has operated stably and scaled as planned. More than 1600 users submitted jobs in 2012, with 2 million or more analysis jobs per week, peaking at about a million jobs per day. The system dynamically distributes popular data to expedite processing and maximally utilize resources. The reliability of the DA service is high and steadily improving; Grid sites are continually validated against a set of standard tests, and a dedicated team of expert shifters provides user support and communicates user problems to the sites. Both the user support techniques and the direct feedback of users have been effective in improving the success rate and user experience when utilizing the distributed computing environment. In this contribution a description of the main components, activities and achievements of ATLAS distributed analysis is given. Several future improvements being undertaken will be described.
Fitting the empirical distribution of intertrade durations
NASA Astrophysics Data System (ADS)
Politi, Mauro; Scalas, Enrico
2008-03-01
Based on the analysis of a tick-by-tick data set used in previous work by one of the authors (DJIA stocks traded at NYSE in October 1999), in this paper we reject the hypothesis that the tails of the empirical intertrade distribution are described by a power law. We further argue that Tsallis q-exponentials are a viable tool for fitting and describing the unconditional distribution of empirical intertrade durations, and that they compare well to the Weibull distribution.
Additive noise, Weibull functions and the approximation of psychometric functions.
Mortensen, U
2002-09-01
The Weibull function is frequently chosen to define psychometric functions. Tyler and Chen (Vis. Res. 40 (2000) 3121) criticised the high-threshold postulate implied by the Weibull function and argued that this function implies the assumption of multiplicative noise. It is shown in this paper that the Weibull function is in fact compatible with the assumption of additive noise, and that the Weibull function may be generalised to the case where detection is not high threshold. The derivations rest, however, on a representation of sensory activity lacking a satisfying degree of generality. Therefore, a more general representation of sensory activity in terms of stochastic processes is suggested, with detection defined as a level-crossing process, containing the original representation as a special case. Two classes of stochastic processes are considered: one where the noise is assumed to be additive, stationary Gaussian, and another resulting from cascaded Poisson processes, representing a form of multiplicative noise. While Weibull functions turn out to approximate well the psychometric functions generated by both types of stochastic processes, it also becomes obvious that there is no simple interpretation of the parameters of the fitted Weibull functions. Moreover, following Tyler and Chen's discussion of the role of multiplicative noise, particular sources of this type of noise are considered and shown to be compatible with the Weibull. It is indicated how multiplicative noise may be defined in general; however, it is argued that, in the light of certain empirical data, the role of this type of noise may be negligible in most detection tasks. PMID:12350425
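For readers unfamiliar with the form under discussion, a common parameterization of the Weibull psychometric function with a guess rate is sketched below (a textbook form, not the specific level-crossing representation derived in the paper):

```python
import math

def weibull_psychometric(x, alpha, beta, gamma=0.5, lapse=0.0):
    """Weibull psychometric function with guess rate gamma and lapse rate:
    P(x) = gamma + (1 - gamma - lapse) * (1 - exp(-(x / alpha) ** beta)).
    With gamma = 0 and lapse = 0 this reduces to the high-threshold form."""
    return gamma + (1.0 - gamma - lapse) * (1.0 - math.exp(-((x / alpha) ** beta)))

# At the threshold x = alpha, performance is gamma + (1 - gamma - lapse) * (1 - 1/e).
print(weibull_psychometric(1.0, 1.0, 2.0))
```

Here alpha sets the threshold, beta the steepness, and gamma the chance level (0.5 for two-alternative forced choice); all values above are illustrative.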
Distributed data analysis in LHCb
NASA Astrophysics Data System (ADS)
Paterson, S. K.; Maier, A.
2008-07-01
The LHCb distributed data analysis system consists of the Ganga job submission front-end and the DIRAC Workload and Data Management System (WMS). Ganga is jointly developed with ATLAS and allows LHCb users to submit jobs on several backends including: several batch systems, LCG and DIRAC. The DIRAC API provides a transparent and secure way for users to run jobs to the Grid and is the default mode of submission for the LHCb Virtual Organisation (VO). This is exploited by Ganga to perform distributed user analysis for LHCb. This system provides LHCb with a consistent, efficient and simple user experience in a variety of heterogeneous environments and facilitates the incremental development of user analysis from local test jobs to the Worldwide LHC Computing Grid. With a steadily increasing number of users, the LHCb distributed analysis system has been tuned and enhanced over the past two years. This paper will describe the recent developments to support distributed data analysis for the LHCb experiment on WLCG.
Moment series for moment estimators of the parameters of a Weibull density
Bowman, K.O.; Shenton, L.R.
1982-01-01
Taylor series for the first four moments of the coefficient of variation in sampling from a 2-parameter Weibull density are given; they are taken as far as the coefficient of n^(-24). From these, a four-moment approximating distribution is set up using summatory techniques on the series. The shape parameter is treated in a similar way, but here the moment equations are no longer explicit estimators, and terms only as far as those in n^(-12) are given. The validity of assessed moments and percentiles of the approximating distributions is studied. Consideration is also given to properties of the moment estimator for 1/c.
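The moment estimator of the shape parameter discussed here exploits the fact that the Weibull coefficient of variation depends on c alone. A sketch that inverts this relation numerically (the higher-order series corrections developed in the paper are not reproduced):

```python
import math

def weibull_cv(c):
    """Coefficient of variation of a Weibull distribution; it depends on the
    shape parameter c only: sqrt(Gamma(1 + 2/c) / Gamma(1 + 1/c)**2 - 1)."""
    g1 = math.gamma(1.0 + 1.0 / c)
    g2 = math.gamma(1.0 + 2.0 / c)
    return math.sqrt(g2 / g1 ** 2 - 1.0)

def shape_from_cv(v, lo=0.05, hi=50.0):
    """Moment estimator c* of the shape parameter: invert weibull_cv by
    bisection, using the fact that the CV is strictly decreasing in c."""
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if weibull_cv(mid) > v:
            lo = mid     # CV too large, so the shape must be larger
        else:
            hi = mid
    return 0.5 * (lo + hi)
```

For c = 1 (the exponential case) the CV equals 1, which gives a quick check on the inversion.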
Distributed data analysis in ATLAS
NASA Astrophysics Data System (ADS)
Nilsson, Paul; Atlas Collaboration
2012-12-01
Data analysis using grid resources is one of the fundamental challenges to be addressed before the start of LHC data taking. The ATLAS detector will produce petabytes of data per year, and roughly one thousand users will need to run physics analyses on this data. Appropriate user interfaces and helper applications have been made available to ensure that the grid resources can be used without requiring expertise in grid technology. These tools enlarge the number of grid users from a few production administrators to potentially all participating physicists. ATLAS makes use of three grid infrastructures for the distributed analysis: the EGEE sites, the Open Science Grid, and NorduGrid. These grids are managed by the gLite workload management system, the PanDA workload management system, and ARC middleware; many sites can be accessed via both the gLite WMS and PanDA. Users can choose between two front-end tools to access the distributed resources. Ganga is a tool co-developed with LHCb to provide a common interface to the multitude of execution backends (local, batch, and grid). The PanDA workload management system provides a set of utilities called PanDA Client; with these tools users can easily submit Athena analysis jobs to the PanDA-managed resources. Distributed data is managed by Don Quijote 2 (DQ2), a system developed by ATLAS; DQ2 is used to replicate datasets according to the data distribution policies and maintains a central catalog of file locations. The operation of the grid resources is continually monitored by the Ganga Robot functional testing system, and infrequent site stress tests are performed using the HammerCloud system. In addition, the DAST shift team is a group of power users who take shifts to provide distributed analysis user support; this team has effectively relieved the burden of support from the developers.
Sugar Cane Nutrient Distribution Analysis
NASA Astrophysics Data System (ADS)
Zamboni, C. B.; da Silveira, M. A. G.; Gennari, R. F.; Garcia, I.; Medina, N. H.
2011-08-01
Neutron Activation Analysis (NAA), Molecular Absorption Spectrometry (UV-Vis), and Flame Photometry techniques were applied to measure plant nutrient concentrations of Br, Ca, Cl, K, Mn, N, Na and P in sugar-cane root, stalk and leaves. These data will be used to explore the behavior of element concentration in different parts of the sugar-cane to better understand the plant nutrient distribution during its development.
Distribution-free discriminant analysis
Burr, T.; Doak, J.
1997-05-01
This report describes our experience in implementing a non-parametric (distribution-free) discriminant analysis module for use in a wide range of pattern recognition problems. Issues discussed include performance results on both real and simulated data sets, comparisons to other methods, and the computational environment. In some cases, this module performs better than other existing methods. Nearly all cases can benefit from the application of multiple methods.
Sampling times influence the estimate of parameters in the Weibull dissolution model.
Cupera, Jakub; Lansky, Petr; Sklubalova, Zdenka
2015-10-12
The aim is to determine how well the parameters of the Weibull model of dissolution can be estimated, in dependence on the times chosen to measure the empirical data. The approach is based on the theory of Fisher information. We show that in order to obtain the best estimates, the data should be collected at time instants when the tablets actively dissolve, or in their close proximity. This is in sharp contrast with commonly used experimental protocols, in which sampling times are distributed rather uniformly. PMID:26215461
Spatial and Temporal Patterns of Global Onshore Wind Speed Distribution
Zhou, Yuyu; Smith, Steven J.
2013-09-09
Wind power, a renewable energy source, can play an important role in electrical energy generation. Information regarding wind energy potential is important both for energy related modeling and for decision-making in the policy community. While wind speed datasets with high spatial and temporal resolution are often ultimately used for detailed planning, simpler assumptions are often used in analysis work. An accurate representation of the wind speed frequency distribution is needed in order to properly characterize wind energy potential. Using a power density method, this study estimated global variation in wind parameters as fitted to a Weibull density function using NCEP/CFSR reanalysis data. The estimated Weibull distribution performs well in fitting the time series wind speed data at the global level according to R2, root mean square error, and power density error. The spatial, decadal, and seasonal patterns of wind speed distribution were then evaluated. We also analyzed the potential error in wind power estimation when a commonly assumed Rayleigh distribution (Weibull k = 2) is used. We find that the assumption of the same Weibull parameter across large regions can result in substantial errors. While large-scale wind speed data is often presented in the form of average wind speeds, these results highlight the need to also provide information on the wind speed distribution.
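The power density comparison described above can be sketched as follows: for a fixed mean wind speed, the mean power density 0.5·ρ·E[v³] depends on the Weibull shape k through E[v³] = c³Γ(1+3/k), so assuming a Rayleigh distribution (k = 2) biases the estimate whenever the true k differs. The numbers below are illustrative, not values from the study:

```python
import math

def weibull_scale_from_mean(mean_v, k):
    """Scale c such that the Weibull(k, c) mean equals mean_v,
    since mean = c * Gamma(1 + 1/k)."""
    return mean_v / math.gamma(1.0 + 1.0 / k)

def mean_power_density(mean_v, k, rho=1.225):
    """Mean wind power density in W/m^2: 0.5 * rho * E[v**3],
    with E[v**3] = c**3 * Gamma(1 + 3/k) for a Weibull wind speed."""
    c = weibull_scale_from_mean(mean_v, k)
    return 0.5 * rho * c ** 3 * math.gamma(1.0 + 3.0 / k)

# Relative error of assuming Rayleigh (k = 2) when the true shape is k = 1.6,
# at the same 6 m/s mean wind speed (illustrative values):
true_pd = mean_power_density(6.0, 1.6)
rayleigh_pd = mean_power_density(6.0, 2.0)
print((rayleigh_pd - true_pd) / true_pd)
```

Because E[v³] grows as k decreases at fixed mean, the Rayleigh assumption underestimates power density for heavier-tailed (smaller-k) regimes.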
A general Bayes Weibull inference model for accelerated life testing
van Dorp, Johan René
A general Bayes Weibull inference model for accelerated life testing. J. René Van Dorp, Thomas A, Washington D.C. 20052 USA. Abstract: This article presents the development of a general Bayes inference model for the scale parameters at the various stress levels and the common shape parameter. Using the approach, Bayes
Probabilistic Weibull behavior and mechanical properties of MEMS brittle materials
O. M. Jadaan; N. N. Nemeth; J. Bagdahn; W. N. Sharpe
2003-01-01
The objective of this work is to present a brief overview of a probabilistic design methodology for brittle structures, review the literature for evidence of probabilistic behavior in the mechanical properties of MEMS (especially strength), and to investigate whether evidence exists that a probabilistic Weibull effect exists at the structural microscale. Since many MEMS devices are fabricated from brittle materials,
Application of the Weibull Criterion to failure prediction in composites
Cain, W. D.; Knight, Jr., C. E.
1981-04-20
Fiber-reinforced composite materials are being widely used in engineered structures. This report examines how the general form of the Weibull Criterion, including the evaluation of the parameters and the scaling of the parameter values, can be used for the prediction of component failure.
Estimation of P[Y < X] for Weibull Distribution
Kundu, Debasis
Department of Mathematics and Statistics, Indian Institute of Technology Kanpur, Pin 208016, India. The estimation of R = P[Y < X] is very common in the statistical literature; for example, X may be the strength of a system subjected to a stress Y. It should be mentioned that related problems have been widely studied in the statistical literature. The maximum likelihood
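When the two Weibull variables share a common shape parameter, R = P[Y < X] has a simple closed form (a standard stress-strength result; the paper's estimators are not reproduced here), which a Monte Carlo estimate can cross-check:

```python
import math
import random

def prob_y_less_x(shape, scale_x, scale_y):
    """P[Y < X] for independent Weibull X and Y sharing a common shape:
    raising both to the shape power gives exponentials, so
    R = scale_x**shape / (scale_x**shape + scale_y**shape)."""
    a = scale_x ** shape
    b = scale_y ** shape
    return a / (a + b)

def prob_y_less_x_mc(shape_x, scale_x, shape_y, scale_y, n=20000, seed=1):
    """Monte Carlo estimate of P[Y < X], valid for unequal shapes too;
    samples are drawn by inverse-transform sampling."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n):
        x = scale_x * (-math.log(1.0 - rng.random())) ** (1.0 / shape_x)
        y = scale_y * (-math.log(1.0 - rng.random())) ** (1.0 / shape_y)
        if y < x:
            hits += 1
    return hits / n
```

With equal scales the closed form gives R = 1/2 regardless of the shape, a convenient sanity check.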
Statistical analysis of the mechanical properties of composite materials
E Barbero; J Fernández-Sáez; C Navarro
2000-01-01
The Weibull statistic is currently used in designing mechanical components made of composite materials. This work presents useful formulae to describe the behaviour of the Weibull modulus estimator, which in turn may be described by means of a three-parameter Weibull distribution. Expressions for the parameters of this latter distribution, dependent on the sample size, are also given in the paper.
CRAB: A CMS application for distributed analysis
G. Codispoti; M. Cinquilli; A. Fanfani; F Fanzago; F. Farina; C. Kavka; S. Lacaprara; V. Miccio; D. Spiga; E. Vaandering
2008-01-01
Starting from 2008, the CMS experiment will produce several petabytes of data every year, to be distributed over many geographically dispersed computing centers in different countries. The CMS computing model defines how the data are to be distributed and accessed in order to enable physicists to run their analyses efficiently over the data. The analysis will thus be performed in
NASA Astrophysics Data System (ADS)
Lu, Chunsheng
2008-05-01
In a recent letter [Barber, Andrews, Schadler, and Wagner, Appl. Phys. Lett. 87, 203106 (2005)], it was indicated that Weibull-Poisson statistics could accurately model nanotube tensile strength data, and it was then concluded that the apparent strengthening mechanism in a multiwalled carbon nanotube (MWCNT) grown by chemical vapor deposition (CVD) is most likely caused by an enhanced interaction between the walls of the nanotube. In this comment, we show that their conclusion seems to be inconsistent with the assumption introduced in the data analysis by using a two-parameter Weibull distribution. Further statistical analysis provides a new explanation for the scattered strengths of MWCNTs. The effectiveness of Weibull-Poisson statistics at nanoscales is also discussed.
Weibull Effective Area for Hertzian Ring Crack Initiation Stress
Jadaan, Osama M.; Wereszczak, Andrew A; Johanns, Kurt E
2011-01-01
Spherical or Hertzian indentation is used to characterize and guide the development of engineered ceramics under consideration for diverse applications involving contact, wear, rolling fatigue, and impact. Ring crack initiation can be one important damage mechanism of Hertzian indentation. It is caused by sufficiently-high, surface-located, radial tensile stresses in an annular ring located adjacent to and outside of the Hertzian contact circle. While the maximum radial tensile stress is known to be dependent on the elastic properties of the sphere and target, the diameter of the sphere, the applied compressive force, and the coefficient of friction, the Weibull effective area too will be affected by those parameters. However, the estimations of a maximum radial tensile stress and Weibull effective area are difficult to obtain because the coefficient of friction during Hertzian indentation is complex, likely intractable, and not known a priori. Circumventing this, the Weibull effective area expressions are derived here for the two extremes that bracket all coefficients of friction; namely, (1) the classical, frictionless, Hertzian case where only complete slip occurs, and (2) the case where no slip occurs or where the coefficient of friction is infinite.
Distributed X-Ray Data and Distributed Analysis Tools
NASA Astrophysics Data System (ADS)
Plummer, D.; Schachter, J.; Elvis, M.; Garcia, M.; Conroy, M.
X-ray data and analysis packages can now be distributed directly to the desks of the user community using CDROMs and portable, familiar analysis packages (like IRAF/PROS). The Einstein IPC Slew Survey is an example of a complete X-ray data set distributed via CDROMs and is the first to use the new FITS standard for photon event lists (BINTABLE). Users can analyze the Slew data directly off the CDROM using PROS. As an example, we present a recipe for producing a radial profile of the Cygnus Loop using PROS and the Slew Survey CDROM data. CDROMs of the complete Einstein IPC and HRI archive data sets will soon be distributed in BINTABLE format allowing similar analysis with those data.
Distributed computing and nuclear reactor analysis
Brown, F.B.; Derstine, K.L.; Blomquist, R.N.
1994-03-01
Large-scale scientific and engineering calculations for nuclear reactor analysis can now be carried out effectively in a distributed computing environment, at costs far lower than for traditional mainframes. The distributed computing environment must include support for traditional system services, such as a queuing system for batch work, reliable filesystem backups, and parallel processing capabilities for large jobs. All ANL computer codes for reactor analysis have been adapted successfully to a distributed system based on workstations and X-terminals. Distributed parallel processing has been demonstrated to be effective for long-running Monte Carlo calculations.
A Weibull model to characterize lifetimes of aluminum alloy electrical wire connections
C. F. Joyce
1991-01-01
Samples of 12-gauge aluminum alloy electrical conductor wire and three different connector torques were tested to 1000 cycles using an accelerated test method which is independent of commercially available connectors. A Weibull probability model was fitted to the number of cycles to failure for each of the torques for each alloy. For all groups, β (the Weibull shape parameter, describing the
Test Population Selection from Weibull-Based, Monte Carlo Simulations of Fatigue Life
NASA Technical Reports Server (NTRS)
Vlcek, Brian L.; Zaretsky, Erwin V.; Hendricks, Robert C.
2008-01-01
Fatigue life is probabilistic and not deterministic. Experimentally establishing the fatigue life of materials, components, and systems is both time consuming and costly. As a result, conclusions regarding fatigue life are often inferred from a statistically insufficient number of physical tests. A proposed methodology for comparing life results as a function of variability due to Weibull parameters, variability between successive trials, and variability due to size of the experimental population is presented. Using Monte Carlo simulation of randomly selected lives from a large Weibull distribution, the variation in the L10 fatigue life of aluminum alloy AL6061 rotating rod fatigue tests was determined as a function of population size. These results were compared to the L10 fatigue lives of small (10 each) populations from AL2024, AL7075 and AL6061. For aluminum alloy AL6061, a simple algebraic relationship was established for the upper and lower L10 fatigue life limits as a function of the number of specimens failed. For most engineering applications where less than 30 percent variability can be tolerated in the maximum and minimum values, at least 30 to 35 test samples are necessary. The variability of test results based on small sample sizes can be greater than actual differences, if any, that exists between materials and can result in erroneous conclusions. The fatigue life of AL2024 is statistically longer than AL6061 and AL7075. However, there is no statistical difference between the fatigue lives of AL6061 and AL7075 even though AL7075 had a fatigue life 30 percent greater than AL6061.
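The Monte Carlo procedure described above can be sketched as follows, with illustrative Weibull parameters rather than the AL6061 values: draw lives by inverse-transform sampling and read off the L10 life as the 10th percentile of the sample.

```python
import math
import random

def sample_weibull_lives(n, eta, beta, rng):
    """Draw n fatigue lives from a Weibull(eta, beta) distribution
    by inverse-transform sampling."""
    return [eta * (-math.log(1.0 - rng.random())) ** (1.0 / beta)
            for _ in range(n)]

def l10(lives):
    """L10 life: the life exceeded by 90 percent of the population
    (the 10th percentile of the sorted sample)."""
    s = sorted(lives)
    return s[int(0.10 * len(s))]

rng = random.Random(42)
population = sample_weibull_lives(100000, 1000.0, 1.5, rng)
# Should lie near the analytic value eta * (-ln 0.9) ** (1 / beta).
print(l10(population))
```

Repeating the draw with small subpopulations (e.g. 10 lives each) and comparing their L10 values illustrates the sampling variability the study quantifies.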
NSDL National Science Digital Library
Siegrist, Kyle
This online, interactive lesson on special distributions provides examples, exercises, and applets covering the normal, gamma, chi-square, Student t, F, bivariate normal, multivariate normal, beta, Weibull, zeta, Pareto, logistic, lognormal, and extreme value distributions. Overall, this lesson covers a plethora of topics and, for this reason, is a valuable resource.
The Weibull functional form for SEP event spectra
NASA Astrophysics Data System (ADS)
Laurenza, M.; Consolini, G.; Storini, M.; Damiani, A.
2015-08-01
The evolution of the kinetic energy spectra of two Solar Energetic Particle (SEP) events has been investigated through Shannon's differential entropy during the different phases of the selected events, as proposed by [1]. Data from the LET and HET instruments onboard the STEREO spacecraft were used to cover a wide energy range from ~4 MeV to 100 MeV, as well as EPAM and ERNE data, on board the ACE and SOHO spacecraft, respectively, in the range 1.6-112 MeV. The spectral features were found to be consistent with a Weibull-like shape, both during the main phase of the SEP events and over their whole duration. A comparison of results obtained for energetic particles accelerated at corotating interaction regions (CIRs) and at transient-related interplanetary shocks is presented in the framework of shock acceleration.
Parametric probability distributions in reliability
Coolen, Frank
Keywords: Gamma distribution, Normal distribution, Poisson distribution, Weibull distribution. ... can be represented by an increasing hazard rate. In Section 2 we briefly consider the Normal distribution. ... The Normal distribution (also known as 'Gaussian distribution') is arguably the most important
Shuttle Electrical Power Analysis Program (SEPAP) distribution circuit analysis report
NASA Technical Reports Server (NTRS)
Torina, E. M.
1975-01-01
An analysis and evaluation was made of the operating parameters of the shuttle electrical power distribution circuit under load conditions encountered during a normal Sortie 2 Mission with emphasis on main periods of liftoff and landing.
DIAL: Distributed Interactive Analysis of Large Datasets
D. L. Adams
2003-05-08
DIAL will enable users to analyze very large, event-based datasets using an application that is natural to the data format. Both the dataset and the processing may be distributed over a farm, a site (collection of farms) or a grid (collection of sites). Here we describe the goals of the project, the current design and implementation, and plans for future development. DIAL is being developed within PPDG to understand the requirements that interactive analysis places on the grid and within ATLAS to enable distributed interactive analysis of event data.
Towards Distributed Memory Parallel Program Analysis
Quinlan, D; Barany, G; Panas, T
2008-06-17
This paper presents a parallel attribute evaluation for distributed memory parallel computer architectures, where previously only shared memory parallel support for this technique had been developed. Attribute evaluation is part of how attribute grammars are used for program analysis within modern compilers. Within this work, we have extended ROSE, an open compiler infrastructure, with a distributed memory parallel attribute evaluation mechanism to support user-defined global program analysis required for some forms of security analysis that cannot be addressed by a file-by-file view of large-scale applications. As a result, user-defined security analyses may now run in parallel without the user having to specify the way data is communicated between processors. The automation of communication enables an extensible open-source parallel program analysis infrastructure.
Fracture strength of ultrananocrystalline diamond thin films--identification of Weibull parameters
Espinosa, Horacio D.
The fracture strength of ultrananocrystalline diamond (UNCD) has been investigated using tensile testing of freestanding submicron films. Specifically, the fracture strength of UNCD membranes, grown by microwave plasma chemical vapor deposition (MPCVD)
Offshore wind resource assessment with Standard Wind Analysis Tool (SWAT): A Rhode Island case study
NASA Astrophysics Data System (ADS)
Crosby, Alexander Robert
Motivated by the current Rhode Island Ocean SAMP (Special Area Management Plan) project and the growing need in the foreseeable future, analysis tools for wind resource assessment are assembled into a toolkit that can be accessed from a GIS. The analysis is demonstrated by application to the ongoing wind resource assessment of Rhode Island's offshore waters by the Ocean SAMP. The tool is called the Standard Wind Analysis Tool (SWAT). SWAT utilizes a method for integrating observations from the study area or numerical model outputs to assemble the spatial distribution of the offshore wind resource. Available power is inferred from direct measurements of wind speed, but the shape of the atmospheric boundary layer, or wind speed profile, must be parameterized in order to extrapolate measurements to heights other than that of the measurements. The vertical wind speed profile is modeled with the basic power law, assuming a 1/7 exponent parameter representing near-neutral, or more accurately time-average, conditions. As an alternative, an estimate from year-long multi-level observations at a meteorological tower is employed. The basis for the power analysis is the 2-parameter Weibull probability distribution, recognized as standard in modeling typical wind speed distributions. A Monte Carlo simulation of the Weibull probability density function provides the expected power densities at observation sites. Application to Rhode Island's coastal waters yields an estimated Weibull shape parameter of roughly 2 for the offshore environment and a Weibull scale parameter that increases with distance from the coast. Estimates of power in the SAMP study area range from 525 to 850 W/m² at an elevation of 80 meters based on an observed profile in the SAMP study area. Like the Weibull scale parameter, annual mean wind power increases with distance offshore.
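The Weibull-based power analysis can be sketched as a Monte Carlo estimate of mean wind power density from a 2-parameter Weibull wind-speed distribution, checked against the closed form 0.5·ρ·c³·Γ(1+3/k). The shape value of 2 echoes the study's offshore finding; the scale value is an assumption for illustration only.

```python
import math
import random

random.seed(0)
RHO = 1.225  # air density at sea level, kg/m^3

def power_density_mc(scale, shape, n=100_000):
    """Monte Carlo estimate of mean wind power density (W/m^2) for wind
    speeds drawn from a 2-parameter Weibull distribution."""
    total = sum(0.5 * RHO * random.weibullvariate(scale, shape) ** 3
                for _ in range(n))
    return total / n

def power_density_exact(scale, shape):
    """Closed form for the same quantity: 0.5 * rho * c^3 * Gamma(1 + 3/k)."""
    return 0.5 * RHO * scale ** 3 * math.gamma(1 + 3 / shape)

# Shape ~ 2 matches the study's offshore estimate; the scale (9 m/s) is an
# assumed value for illustration.
print(power_density_mc(9.0, 2.0), power_density_exact(9.0, 2.0))
```

The Monte Carlo and closed-form values agree to within sampling error, which is a useful sanity check before applying the simulation to observed Weibull fits.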
Analysis of SAW distributed feedback resonators
NASA Astrophysics Data System (ADS)
Vandewege, J.; Lagasse, P. E.
1981-01-01
The main characteristics and advantages of the surface acoustic wave (SAW) distributed feedback resonator are discussed. A coupled mode analysis provides physical insight and simple formulas for the resonant frequency, the quality factor, and the input impedance. Those results are verified by means of a transmission line computer model and by a number of measurements in the frequency range 30-250 MHz. On YZ LiNbO3 substrates, quality factors of the order of 5,000 are routinely obtained.
Assessing a Tornado Climatology from Global Tornado Intensity Distributions
Bernold Feuerstein; Nikolai Dotzek; Jürgen Grieser
2005-01-01
Recent work demonstrated that the shape of tornado intensity distributions from various regions worldwide is well described by Weibull functions. This statistical modeling revealed a strong correlation between the fit parameters c for shape and b for scale regardless of the data source. In the present work it is shown that the quality of the Weibull fits is optimized if
Force distribution analysis of mechanochemically reactive dimethylcyclobutene.
Li, Wenjin; Edwards, Scott A; Lu, Lanyuan; Kubar, Tomas; Patil, Sandeep P; Grubmüller, Helmut; Groenhof, Gerrit; Gräter, Frauke
2013-08-26
Internal molecular forces can guide chemical reactions, yet are not straightforwardly accessible within a quantum mechanical description of the reacting molecules. Here, we present a force-matching force distribution analysis (FM-FDA) to analyze internal forces in molecules. We simulated the ring opening of trans-3,4-dimethylcyclobutene (tDCB) with on-the-fly semiempirical molecular dynamics. The self-consistent density functional tight binding (SCC-DFTB) method accurately described the force-dependent ring-opening kinetics of tDCB, showing quantitative agreement with both experimental and computational data at higher levels. Mechanical force was applied in two different ways, namely, externally by a constant pulling force and internally by embedding tDCB within a strained macrocycle-containing stiff stilbene. We analyzed the distribution of tDCB internal forces in the two different cases by FM-FDA and found that external force gave rise to a symmetric force distribution in the cyclobutene ring, which also scaled linearly with the external force, indicating that the force distribution was uniquely determined by the symmetric architecture of tDCB. In contrast, internal forces due to stiff stilbene resulted in an asymmetric force distribution within tDCB, which indicated a different geometry of force application and supported the important role of linkers in the mechanochemical reactivity of tDCB. In addition, three coordinates were identified through which the distributed forces contributed most to rate acceleration. These coordinates are mostly parallel to the coordinate connecting the two CH3 termini of tDCB. Our results confirm previous observations that the linker outside of the reactive moiety, such as a stretched polymer or a macrocycle, affects its mechanochemical reactivity. We expect FM-FDA to be of wide use to understand and quantitatively predict mechanochemical reactivity, including the challenging cases of systems within strained macrocycles. 
PMID:23843171
Performance optimisations for distributed analysis in ALICE
NASA Astrophysics Data System (ADS)
Betev, L.; Gheata, A.; Gheata, M.; Grigoras, C.; Hristov, P.
2014-06-01
Performance is a critical issue in a production system accommodating hundreds of analysis users. Compared to a local session, distributed analysis is exposed to services and network latencies, remote data access and heterogeneous computing infrastructure, creating a more complex performance and efficiency optimization matrix. During the last 2 years, ALICE analysis shifted from a fast development phase to more mature and stable code. At the same time, the frameworks and tools for deployment, monitoring and management of large productions have evolved considerably too. The ALICE Grid production system is currently used by a fair share of organized and individual user analysis, consuming up to 30% of the available resources and ranging from fully I/O-bound analysis code to CPU-intensive correlation or resonance studies. While the intrinsic analysis performance is unlikely to improve by a large factor during the LHC long shutdown (LS1), the overall efficiency of the system still has to be improved by a significant factor to satisfy the analysis needs. We have instrumented all analysis jobs with "sensors" collecting comprehensive monitoring information on the job running conditions and performance in order to identify bottlenecks in the data processing flow. These data are collected by the MonALISA-based ALICE Grid monitoring system and are used to steer and improve the job submission and management policy, to identify operational problems in real time and to perform automatic corrective actions. In parallel with an upgrade of our production system, we are aiming for low-level improvements related to data format, data management and merging of results to allow for better-performing ALICE analysis.
Ceramics Analysis and Reliability Evaluation of Structures (CARES). Users and programmers manual
NASA Technical Reports Server (NTRS)
Nemeth, Noel N.; Manderscheid, Jane M.; Gyekenyesi, John P.
1990-01-01
This manual describes how to use the Ceramics Analysis and Reliability Evaluation of Structures (CARES) computer program. The primary function of the code is to calculate the fast-fracture reliability or failure probability of macroscopically isotropic ceramic components. These components may be subjected to complex thermomechanical loadings, such as those found in heat engine applications. The program uses results from MSC/NASTRAN or ANSYS finite element analysis programs to evaluate component reliability due to inherent surface and/or volume type flaws. CARES utilizes the Batdorf model and the two-parameter Weibull cumulative distribution function to describe the effect of multiaxial stress states on material strength. The principle of independent action (PIA) and the Weibull normal stress averaging models are also included. Weibull material strength parameters, the Batdorf crack density coefficient, and other related statistical quantities are estimated from four-point bend bar or uniform uniaxial tensile specimen fracture strength data. Parameter estimation can be performed for single or multiple failure modes by using the least-squares analysis or the maximum likelihood method. Kolmogorov-Smirnov and Anderson-Darling goodness-of-fit tests, ninety percent confidence intervals on the Weibull parameters, and Kanofsky-Srinivasan ninety percent confidence band values are also provided. The probabilistic fast-fracture theories used in CARES, along with the input and output for CARES, are described. Example problems to demonstrate various features of the program are also included. This manual describes the MSC/NASTRAN version of the CARES program.
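For a uniform uniaxial stress, the two-parameter Weibull failure probability at the heart of such reliability codes reduces to Pf = 1 - exp(-(σ/σ0)^m). A minimal sketch with illustrative values only; CARES itself additionally integrates over multiaxial stress states and flaw populations:

```python
import math

def weibull_failure_prob(sigma, sigma0, m):
    """Two-parameter Weibull failure probability for a uniform uniaxial
    stress sigma, characteristic strength sigma0 and Weibull modulus m."""
    return 1.0 - math.exp(-((sigma / sigma0) ** m))

# Illustrative values only (sigma0 = 400 MPa, m = 10): a component loaded
# to 300 MPa fails with a probability of roughly 5%.
print(weibull_failure_prob(300.0, 400.0, 10.0))
```

A higher Weibull modulus m means a narrower strength distribution, which is why m is the key "sharpness" parameter in brittle-material reliability work.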
Distributed resource analysis capabilities of the distribution engineering workstation
Broadwater, R.; Thompson, J.; Ng, H.
1995-12-01
The Electric Power Research Institute Distribution Engineering Workstation, DEWorkstation, is based upon a comprehensive distribution system data schema coupled with an open-architecture database and software design. The data schema models aspects of distributed resources. DEWorkstation is supplied with a basic set of applications. These applications model features of distributed resources. Using the application programmer interface, additional applications may be added and/or custom studies may be performed. Several utilities are currently interfacing new application programs.
Distribution System Analysis Tools for Studying High Penetration of PV
Distribution System Analysis Tools for Studying High Penetration of PV with Grid Support Features
Introduction to Network Analysis: Generating Functions and Degree Distributions
Safro, Ilya
Distribution System Analysis to support the Smart Grid
R. C. Dugan; R. F. Arritt; T. E. McDermott; S. M. Brahma; K. Schneider
2010-01-01
The “Smart Grid” refers to various efforts to modernize the power grid through the application of intelligent devices. This paper describes current thinking by members of the Distribution System Analysis Subcommittee (DSA SC) on how distribution system analysis might evolve to support the Smart Grid. Various issues related to Smart Grid and distribution system analysis are identified. The essential characteristics
NSDL National Science Digital Library
Anderson-Cook, C.
This page, created by Virginia Tech's Department of Statistics, provides users with a myriad of different applets presenting different probability distributions. Some of these include: binomial, Poisson, negative binomial, geometric, t, chi-squared, gamma, Weibull, log-normal, beta and F distributions. Each applet is presented with an explanation of its use and also an example. Featuring eleven different applets, this is a useful and plentiful resource.
Bayesian inference based on dual generalized order statistics from the exponentiated Weibull model
NASA Astrophysics Data System (ADS)
Al Sobhi, Mashail M.
2015-02-01
Bayes estimates for the two parameters and the reliability function of the exponentiated Weibull model are obtained based on dual generalized order statistics (DGOS). Also, Bayesian prediction bounds for future DGOS from the exponentiated Weibull model are obtained. Symmetric and asymmetric loss functions are considered for the Bayesian computations. Markov chain Monte Carlo (MCMC) methods are used for computing the Bayes estimates and prediction bounds. The results have been specialized to the lower record values. Comparisons are made between Bayesian and maximum likelihood estimators via Monte Carlo simulation.
Psychotherapy and distributive justice: a Rawlsian analysis.
Wilmot, Stephen
2009-03-01
In this paper I outline an approach to the distribution of resources between psychotherapy modalities in the context of the UK's health care system, using recent discussions of Cognitive Behavioural Psychotherapy as a way of highlighting resourcing issues. My main goal is to offer an approach that is just, and that accommodates the diversity of different schools of psychotherapy. In order to do this I draw extensively on the theories of Justice and of Political Liberalism developed by the late John Rawls, and adapt these to the particular requirements of psychotherapy resourcing. I explore some of the implications of this particular analysis, and consider how the principles of Rawlsian justice might translate into ground rules for deliberation and decision-making. PMID:18536980
Shadow Imaging for Charge Distribution Analysis
NASA Astrophysics Data System (ADS)
Zhu, Yimei; Wu, Lijun
We briefly review the shadow imaging method for charge distribution analysis developed at Brookhaven. It is a unique electron-diffraction technique. Instead of focusing a small electron probe on the sample in conventional convergent beam electron diffraction, we focus the probe above (or below) the sample, resulting in parallel recording of dark-field images (shadow images), or PARODI. Because the method couples diffraction with imaging, it is thus suitable for studying crystals as well as their defects. We used this technique to accurately describe charge transfer that determines the functionality of technologically important materials. Examples are given for MgB2 superconductor and CaCu3Ti4O12 oxide that exhibits giant dielectric response. Discussions on non-spherical electron scattering factors and their parameterizations for direct observations of electron orbitals in atomic images are also included.
Buffered Communication Analysis in Distributed Multiparty Sessions
NASA Astrophysics Data System (ADS)
Deniélou, Pierre-Malo; Yoshida, Nobuko
Many communication-centred systems today rely on asynchronous messaging among distributed peers to make efficient use of parallel execution and resource access. With such asynchrony, the communication buffers can grow unboundedly over time. This paper proposes a static verification methodology based on multiparty session types which can efficiently compute upper bounds on buffer sizes. Our analysis relies on a uniform causality audit of the entire collaboration pattern, an examination that is not always possible from each end-point type. We extend this method to design algorithms that allocate communication channels in order to optimise the memory requirements of session executions. From these analyses, we propose two refinement methods which respect buffer bounds: a global protocol refinement that automatically inserts confirmation messages to guarantee stipulated buffer sizes, and a local protocol refinement to optimise asynchronous messaging without buffer overflow. Finally, our work is applied to overcome a buffer overflow problem of the multi-buffering algorithm.
ATLAS Distributed Data Analysis: challenges and performance
Fassi, Farida; The ATLAS collaboration
2015-01-01
In the LHC operations era the key goal is to analyse the results of the collisions of high-energy particles as a way of probing the fundamental forces of nature. The ATLAS experiment at the LHC at CERN is recording and simulating several tens of petabytes of data per year. The ATLAS Computing Model was designed around the concepts of Grid Computing. Large data volumes from the detectors and simulations require a large number of CPUs and storage space for data processing. To cope with this challenge a global network known as the Worldwide LHC Computing Grid (WLCG) was built. This is the most sophisticated data taking and analysis system ever built. ATLAS accumulated more than 140 PB of data between 2009 and 2014. To analyse these data ATLAS developed, deployed and now operates a mature and stable distributed analysis (DA) service on the WLCG. The service is actively used: more than half a million user jobs run daily on DA resources, submitted by more than 1500 ATLAS physicists. A significant reliability of the...
Toward Automating Analysis Support for Developers of Distributed Software
Avrunin, George S.
Jack C. Wileden, Software Development Laboratory, Computer and Information Science Department, University of Massachusetts. Hardware systems supporting distributed computing are constantly improving. Our ability to create software ...
Stochastic Analysis of Wind Energy for Wind Pump Irrigation in Coastal Andhra Pradesh, India
NASA Astrophysics Data System (ADS)
Raju, M. M.; Kumar, A.; Bisht, D.; Rao, D. B.
2014-09-01
The rapid escalation in the prices of oil and gas, as well as the increasing demand for energy, has attracted the attention of scientists and researchers to explore the possibility of generating and utilizing the alternative and renewable source of wind energy along the long coastal belt of India, which has considerable wind energy resources. A detailed analysis of wind potential is a prerequisite to harvesting wind energy resources efficiently. Keeping this in view, the present study was undertaken to analyze the wind energy potential to assess the feasibility of a wind-pump operated irrigation system in the coastal region of Andhra Pradesh, India, where high ground water table conditions are available. Wind speed data were tested for fit to probability distributions describing the wind energy potential in the region. The normal and Weibull probability distributions were tested, and on the basis of the chi-square test, the Weibull distribution gave better results. Hence, it was concluded that the Weibull probability distribution may be used to stochastically describe the annual wind speed data of coastal Andhra Pradesh with better accuracy. Using mass curve analysis, the size of the complete irrigation system was determined to satisfy various daily irrigation demands at different risk levels.
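Fitting a Weibull distribution to wind speed data can be sketched with the empirical-moment (Justus-type) approximation rather than the full maximum likelihood procedure the study would use; the synthetic data and parameter values below are assumptions for illustration:

```python
import math
import random
import statistics

random.seed(1)

def fit_weibull_moments(speeds):
    """Empirical-moment Weibull fit (Justus-type approximation, an
    assumption here): k ~ (sigma/mu)^-1.086, c = mu / Gamma(1 + 1/k)."""
    mu = statistics.mean(speeds)
    sigma = statistics.stdev(speeds)
    k = (sigma / mu) ** -1.086
    c = mu / math.gamma(1 + 1 / k)
    return c, k

# Synthetic "observed" wind speeds drawn from a known Weibull(c=7, k=2),
# so the fit can be checked against the truth.
speeds = [random.weibullvariate(7.0, 2.0) for _ in range(5000)]
c, k = fit_weibull_moments(speeds)
print(c, k)  # roughly (7, 2)
```

A chi-square goodness-of-fit test on binned speeds, as in the study, would then compare the fitted Weibull against a normal alternative.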
Statistical Analysis of Particle Distributions in Composite Materials
Wichert, Sofia
by Sofia Mucharreira de ..., Department of Probability and Statistics. Submitted: December 2000. ... be affected by the spatial distribution of the particles. Consequently, a statistical analysis of particle ...
Fracture mechanics concepts in reliability analysis of monolithic ceramics
NASA Technical Reports Server (NTRS)
Manderscheid, Jane M.; Gyekenyesi, John P.
1987-01-01
Basic design concepts for high-performance, monolithic ceramic structural components are addressed. The design of brittle ceramics differs from that of ductile metals because of the inability of ceramic materials to redistribute high local stresses caused by inherent flaws. Random flaw size and orientation requires that a probabilistic analysis be performed in order to determine component reliability. The current trend in probabilistic analysis is to combine linear elastic fracture mechanics concepts with the two parameter Weibull distribution function to predict component reliability under multiaxial stress states. Nondestructive evaluation supports this analytical effort by supplying data during verification testing. It can also help to determine statistical parameters which describe the material strength variation, in particular the material threshold strength (the third Weibull parameter), which in the past was often taken as zero for simplicity.
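With a threshold strength σu (the third Weibull parameter mentioned above), the survival probability becomes R = exp(-((σ-σu)/σ0)^m) for σ > σu and 1 otherwise. A minimal sketch with illustrative values only:

```python
import math

def reliability_3p(sigma, sigma_u, sigma0, m):
    """Three-parameter Weibull survival probability under stress sigma,
    with threshold strength sigma_u below which failure cannot occur."""
    if sigma <= sigma_u:
        return 1.0
    return math.exp(-(((sigma - sigma_u) / sigma0) ** m))

# Illustrative values only: threshold 100 MPa, scale 300 MPa, modulus 8.
print(reliability_3p(80.0, 100.0, 300.0, 8.0),
      reliability_3p(350.0, 100.0, 300.0, 8.0))
```

Setting sigma_u to zero recovers the two-parameter form that is often used for simplicity, as the abstract notes.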
NASA Astrophysics Data System (ADS)
Sazuka, Naoya
2007-03-01
We analyze waiting times for price changes in a foreign currency exchange rate. Recent empirical studies of high-frequency financial data support that trades in financial markets do not follow a Poisson process and that the waiting times between trades are not exponentially distributed. Here we show that our data are well approximated by a Weibull distribution rather than an exponential distribution in the non-asymptotic regime. Moreover, using the Weibull fit, we quantitatively evaluate how far the empirical data are from an exponential distribution. Finally, we discuss a transition between a Weibull law and a power law in the long-time asymptotic regime.
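One hedged way to quantify how far waiting-time data are from exponential is to compare the log-likelihood of a fitted Weibull against the exponential at its maximum-likelihood rate. The sketch below uses synthetic data and a crude grid search in place of a proper Weibull fit, so all parameter values are assumptions:

```python
import math
import random
import statistics

random.seed(7)

def loglik_expon(x):
    """Exponential log-likelihood evaluated at the MLE rate 1/mean."""
    lam = 1.0 / statistics.mean(x)
    return sum(math.log(lam) - lam * xi for xi in x)

def loglik_weibull(x, scale, shape):
    """Weibull log-likelihood for given scale c and shape k."""
    c, k = scale, shape
    return sum(math.log(k / c) + (k - 1) * math.log(xi / c) - (xi / c) ** k
               for xi in x)

# Synthetic waiting times from a Weibull with shape < 1 (bursty arrivals);
# both parameters are assumptions, not the paper's estimates.
waits = [random.weibullvariate(1.0, 0.6) for _ in range(3000)]

# A crude grid search stands in for a proper Weibull MLE.
best = max(((loglik_weibull(waits, c, k), c, k)
            for c in (0.6, 0.8, 1.0, 1.2)
            for k in (0.5, 0.6, 0.7, 1.0)), key=lambda t: t[0])
print(best[0] > loglik_expon(waits))  # the Weibull fit beats the exponential
```

Since the exponential is the shape = 1 special case of the Weibull, the likelihood gap directly measures the departure from exponential waiting times.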
Analysis of measured airfoil pressure distributions
NASA Technical Reports Server (NTRS)
Piziali, R. A.
1975-01-01
A method for evaluating the Glauert coefficients from airfoil pressure distributions is investigated. The linear operating range of the airfoils in steady-state and periodic operating conditions is considered. A rational method for quantitatively characterizing airfoil pressure distributions relative to their geometry and aerodynamic operating environment is developed. The characteristics of the airfoil operating environment are determined from its measured pressure distribution.
Acceleration of sensitivity analysis by distributed computation.
Sarai, N
2007-01-01
We have been developing a comprehensive mathematical model of the cardiac myocyte based on molecular functions, using a biological simulator, simBio. In this approach, expanding computing power is needed as the model becomes more intricate. Sensitivity analyses, which provide the dependency of model behavior on specific parameters, are also indispensable for developing and utilizing models. Meanwhile, distributed computation by a cluster of personal computers (PCs) became available using an open source package. A free software package named Jini orchestrates computers on a network, which we coupled with simBio. We connected 11 PCs and found that the time required for computing 504 models became 13 times shorter than that with a single PC. This method proved efficient for sensitivity analysis, because the calculations are independent and a linear decrease in computation time was obtained by adding PCs to the cluster. The visualization feature gives a researcher instant feedback, and thus this system accelerates model-driven study. The whole system with source code is available at www.sim-bio.org. PMID:18002164
Runs of geometrically distributed random variables: a probabilistic analysis
Louchard, Guy
Runs of geometrically distributed random variables: a probabilistic analysis. Guy Louchard. We analyze asymptotic properties of runs of geometrically distributed random variables: the limiting trajectories, the number of runs and the run length distribution, and the hitting time to a length-k run
Comparison of methods for analysis of selective genotyping survival data
McElroy, Joseph P; Zhang, Wuyan; Koehler, Kenneth J; Lamont, Susan J; Dekkers, Jack CM
2006-01-01
Survival traits and selective genotyping datasets are typically not normally distributed, thus common models used to identify QTL may not be statistically appropriate for their analysis. The objective of the present study was to compare models for identification of QTL associated with survival traits, in particular when combined with selective genotyping. Data were simulated to model the survival distribution of a population of chickens challenged with Marek's disease virus. Cox proportional hazards (CPH), linear regression (LR), and Weibull models were compared for their appropriateness to analyze the data, ability to identify associations of marker alleles with survival, and estimation of effects when all individuals were genotyped (full genotyping) and when selective genotyping was used. Little difference in power was found between the CPH and the LR model for low censoring cases for both full and selective genotyping. The simulated data were not transformed to follow a Weibull distribution and, as a result, the Weibull model generally resulted in less power than the other two models and overestimated effects. Effect estimates from LR and CPH were unbiased when all individuals were genotyped, but overestimated when selective genotyping was used. Thus, LR is preferred for analyzing survival data when the amount of censoring is low because of ease of implementation and interpretation. Including phenotypic data of non-genotyped individuals in selective genotyping analysis increased power, but resulted in LR having an inflated false positive rate, and therefore the CPH model is preferred for this scenario, although transformation of the data may also make the Weibull model appropriate for this case. The results from the research presented herein are directly applicable to interval mapping analyses. PMID:17129564
Dynamic Analysis of the Arrow Distributed Protocol
Fabian Kuhn, Department of Computer Science, ETH Zurich, 8092 Zurich, Switzerland; wattenhofer@inf.ethz.ch. Arrow is a prominent distributed ... We present a dynamic analysis of the Arrow protocol. We prove that Arrow is O(log D)-competitive, where D
Shark: Fast Data Analysis Using Coarse-grained Distributed Memory
University of California at Irvine
Cliff Engle, Antonio Lupher ({cengle, alupher, rxin, matei, franklin, shenker, istoica}@cs.berkeley.edu). Shark is a research data analysis system built on a novel coarse-grained distributed shared-memory abstraction. Shark
A Mixed Kijima Model Using the Weibull-Based Generalized Renewal Processes
2015-01-01
Generalized Renewal Processes are useful for approaching the rejuvenation of dynamical systems resulting from planned or unplanned interventions. We present new perspectives on Generalized Renewal Processes in general and on the Weibull-based Generalized Renewal Processes in particular. Departing from the literature, we present a mixed Generalized Renewal Processes approach involving Kijima Type I and II models, allowing one to infer the impact of distinct interventions on the performance of the system under study. The first and second theoretical moments of this model are introduced, as well as its maximum likelihood estimation and random sampling approaches. In order to illustrate the usefulness of the proposed Weibull-based Generalized Renewal Processes model, some real data sets involving improving, stable, and deteriorating systems are used. PMID:26197222
Analysis of Temperature Distributions in Nighttime Inversions
NASA Astrophysics Data System (ADS)
Telyak, Oksana; Krasouski, Aliaksandr; Svetashev, Alexander; Turishev, Leonid; Barodka, Siarhei
2015-04-01
Adequate prediction of temperature inversion in the atmospheric boundary layer is one of the prerequisites for successful forecasting of meteorological parameters and severe weather events. Examples include surface air temperature and precipitation forecasting as well as prediction of fog, frosts and smog with hazardous levels of atmospheric pollution. At the same time, reliable forecasting of temperature inversions remains an unsolved problem. For prediction of nighttime inversions over some specific territory, it is important to study characteristic features of local circulation cell formation and to properly take local factors into account to develop custom modeling techniques for operational use. The present study aims to investigate and analyze vertical temperature distributions in tropospheric inversions (isotherms) over the territory of Belarus. We study several specific cases of formation, evolution and decay of deep nighttime temperature inversions in Belarus by means of mesoscale numerical simulations with the WRF model, considering basic mechanisms of isothermal and inverse temperature layer formation in the troposphere and the impact of these layers on local circulation cells. Our primary goal is to assess the feasibility of advance prediction of inversion formation with WRF. Modeling results reveal that all cases under consideration have characteristic features of radiative inversions (e.g., their formation times, development phases, inversion intensities, etc). Regions of "blocking" layer formation are extensive and often spread over the entire territory of Belarus. Inversion decay starts from the lowermost (near surface) layer (altitudes of 5 to 50 m). In all cases, one can observe formation of temperature gradients that substantially differ from the basic inversion gradient, i.e. the layer splits into smaller layers, each having a different temperature stratification (isothermal, adiabatic, etc).
As opposed to various empirical techniques as well as theoretical approaches based on discriminant analysis, mesoscale modeling with WRF provides fairly successful forecasts of formation times and regions for all types of temperature inversions up to 3 days in advance. Furthermore, we conclude that without proper adjustment for the presence of thin isothermal layers (adiabatic and/or inversion layers), temperature data can affect results of statistical climate studies. Provided there are regions where a long-term, constant inversion is present (e.g., Antarctica or regions with continental climate), these data can contribute an uncompensated systematic error of 2 to 10 °C. We argue that this very fact may lead to inconsistencies in long-term temperature data interpretations (e.g., conclusions ranging from "global warming" to "global cooling" based on temperature observations for the same region and time period). Due to the importance of this problem from the scientific as well as practical point of view, our plans for further studies include analysis of autumn and wintertime inversions and convective inversions. At the same time, it seems promising to develop an algorithm of automatic recognition of temperature inversions based on a combination of WRF modeling results, surface and satellite observations.
Building Finite Element Analysis Programs in Distributed Services Environment
Stanford University
Jun Peng and Kincho H. Law. Abstract: Traditional finite element analysis (FEA) programs are typically built on a dedicated computer using the developments offered by a finite element analysis (FEA) program. Typically
DISTRIBUTION SYSTEM RELIABILITY ANALYSIS USING A MICROCOMPUTER
Distribution system reliability for most utilities is maintained by the knowledge of a few key personnel. Generally, these water maintenance personnel use a good memory, repair records, a large wall map and a hydraulic model of the larger transmission mains to help identify probl...
Cosmological Analysis of the Satellite Galaxy Distribution
NASA Astrophysics Data System (ADS)
Gómez-Flechoso, M. A.; Benjouali, L.; Tenreiro, R. Domínguez
There exist galaxy systems in the Local Universe formed by satellite dwarf galaxies orbiting a main one. Knowledge of the satellite distribution and its characteristics gives information about the formation and assembly processes of galaxies in the Universe. In this paper, we analyze the satellite distribution around disk galaxies in cosmological hydrodynamical simulations, both in three dimensions (3D) and in projection along random directions, mimicking observational strategies. We find that, at short 3D distances, the satellite orbits in rich systems (that is, systems with a high number of satellites) have on average a polar distribution. The orbital distribution in projection at short distances between the satellite and its host shows a lack (excess) of minor-axis alignments for poor (rich) systems. Therefore, the alignments onto a virtual sky would appear mostly isotropic (i.e., no preferred major- or minor-axis alignments), or even planar (i.e., major-axis alignments), depending on the selected sample, consistent with most observational analyses.
WATER DISTRIBUTION SYSTEM ANALYSIS: FIELD STUDIES, MODELING AND MANAGEMENT
The user's guide entitled "Water Distribution System Analysis: Field Studies, Modeling and Management" is a reference guide for water utilities and an extensive summary of information designed to provide drinking water utility personnel (and related consultants and research...
A Comparison of Distribution Free and Non-Distribution Free Factor Analysis Methods
ERIC Educational Resources Information Center
Ritter, Nicola L.
2012-01-01
Many researchers recognize that factor analysis can be conducted on both correlation matrices and variance-covariance matrices. Although most researchers extract factors from non-distribution free or parametric methods, researchers can also extract factors from distribution free or non-parametric methods. The nature of the data dictates the method…
Economic analysis of efficient distribution transformer trends
Downing, D.J.; McConnell, B.W.; Barnes, P.R.; Hadley, S.W.; Van Dyke, J.W.
1998-03-01
This report outlines an approach that will account for uncertainty in the development of evaluation factors used to identify transformer designs with the lowest total owning cost (TOC). The TOC methodology is described and the most highly variable parameters are discussed. The model is developed to account for uncertainties as well as statistical distributions for the important parameters. Sample calculations are presented. The TOC methodology is applied to data provided by two utilities in order to test its validity.
DRL interresponse-time distributions: quantification by peak deviation analysis.
Richards, J B; Sabol, K E; Seiden, L S
1993-01-01
Peak deviation analysis is a quantitative technique for characterizing interresponse-time distributions that result from training on differential-reinforcement-of-low-rate schedules of reinforcement. It compares each rat's obtained interresponse-time distribution to the corresponding negative exponential distribution that would have occurred if the rat had emitted the same number of responses randomly in time, at the same rate. The comparison of the obtained distributions with corresponding negative exponential distributions provides the basis for computing three standardized metrics (burst ratio, peak location, and peak area) that quantitatively characterize the profile of the obtained interresponse-time distributions. In Experiment 1 peak deviation analysis quantitatively described the difference between the interresponse-time distributions of rats trained on variable-interval 300-s and differential-reinforcement-of-low-rate 72-s schedules of reinforcement. In Experiment 2 peak deviation analysis differentiated between the effects of the psychomotor stimulant d-amphetamine, the anxiolytic compound chlordiazepoxide, and the antidepressant desipramine. The results suggest that peak deviation analysis of interresponse-time distributions may provide a useful behavioral assay system for characterizing the effects of drugs. PMID:8409824
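The comparison at the heart of this method, an obtained interresponse-time (IRT) distribution versus the negative exponential that the same number of responses emitted randomly at the same rate would produce, can be sketched as follows. This is a hedged illustration only; the function name, binning, and toy data are not from the paper:

```python
import math

def exponential_irt_profile(irts, bin_edges):
    """Compare an obtained IRT distribution (per-bin proportions) to the
    negative exponential expected under random responding at the same
    overall rate. Positive deviations mark bins where responding is more
    regular than chance. Illustrative sketch, not the authors' code."""
    n = len(irts)
    rate = n / sum(irts)  # responses per second at the same overall rate
    obtained, expected = [], []
    for lo, hi in zip(bin_edges, bin_edges[1:]):
        obtained.append(sum(lo <= t < hi for t in irts) / n)
        # probability mass of an Exponential(rate) IRT falling in [lo, hi)
        expected.append(math.exp(-rate * lo) - math.exp(-rate * hi))
    deviations = [o - e for o, e in zip(obtained, expected)]
    return obtained, expected, deviations

# toy data loosely resembling DRL-72-s responding: a peak near 72 s plus bursts
irts = [70, 71, 73, 74, 75, 76, 2, 3, 72, 74]  # seconds
obt, exp_, dev = exponential_irt_profile(irts, bin_edges=[0, 24, 48, 72, 96])
```

On such data the deviation is large and positive in the bin containing the reinforced IRT, which is the kind of peak the analysis standardizes into its burst-ratio, peak-location, and peak-area metrics.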
Design and Performance Analysis of a Distributed Java Virtual Machine
Moldovan, Dan I.
Mihai Surdeanu, Member, IEEE, and Dan Moldovan, Senior Member, IEEE. Abstract: This paper introduces DISK, a distributed Java Virtual [...] responsible for the large popularity of the Java language. Complex projects, on the other hand, often require
Performance analysis of static locking in replicated distributed database systems
NASA Technical Reports Server (NTRS)
Kuang, Yinghong; Mukkamala, Ravi
1991-01-01
Data replication and transaction deadlocks can severely affect the performance of distributed database systems. Many current evaluation techniques ignore these aspects, because it is difficult to evaluate through analysis and time consuming to evaluate through simulation. A technique is used that combines simulation and analysis to closely illustrate the impact of deadlock and evaluate performance of replicated distributed database with both shared and exclusive locks.
Credibility in Context: An Analysis of Feature Distributions in Twitter
California at Santa Barbara, University of
John O'Donovan, Byungkyu [...] Abstract: Twitter is a major forum for rapid dissemination of user-provided content in real time. As such, [...] focus on an analysis that highlights the utility of the individual features in Twitter such as hashtags
Complex network analysis of water distribution systems
A. Yazdani; P. Jeffrey
2011-04-01
This paper explores a variety of strategies for understanding the formation, structure, efficiency and vulnerability of water distribution networks. Water supply systems are studied as spatially organized networks for which the practical applications of abstract evaluation methods are critically evaluated. Empirical data from benchmark networks are used to study the interplay between network structure and operational efficiency, reliability and robustness. Structural measurements are undertaken to quantify properties such as redundancy and optimal-connectivity, herein proposed as constraints in network design optimization problems. The role of the supply-demand structure towards system efficiency is studied and an assessment of the vulnerability to failures based on the disconnection of nodes from the source(s) is undertaken. The absence of conventional degree-based hubs (observed through uncorrelated non-heterogeneous sparse topologies) prompts an alternative approach to studying structural vulnerability based on the identification of network cut-sets and optimal connectivity invariants. A discussion on the scope, limitations and possible future directions of this research is provided.
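The cut-set idea used here for structural vulnerability, finding elements whose removal disconnects consumers from the source, can be illustrated with a standard cut-vertex (articulation-point) search on a toy network. The network layout and all names below are hypothetical, not taken from the paper:

```python
def articulation_points(adj):
    """Cut vertices of an undirected graph given as an adjacency dict,
    via the classic DFS low-link rule. Removing any returned node splits
    the graph, the structural analogue of a single-point supply failure.
    Sketch only; the paper's actual cut-set method may differ."""
    disc, low, cuts = {}, {}, set()
    timer = [0]

    def dfs(u, parent):
        disc[u] = low[u] = timer[0]
        timer[0] += 1
        children = 0
        for v in adj[u]:
            if v == parent:
                continue
            if v in disc:                      # back edge
                low[u] = min(low[u], disc[v])
            else:                              # tree edge
                children += 1
                dfs(v, u)
                low[u] = min(low[u], low[v])
                if parent is not None and low[v] >= disc[u]:
                    cuts.add(u)
        if parent is None and children > 1:    # root rule
            cuts.add(u)

    for node in adj:
        if node not in disc:
            dfs(node, None)
    return cuts

# Hypothetical pipe network: source S feeds a loop (A, B, C) with a spur D off B.
network = {"S": ["A"], "A": ["S", "B", "C"], "B": ["A", "C", "D"],
           "C": ["A", "B"], "D": ["B"]}
```

Here `articulation_points(network)` flags A (the only link to the source) and B (the only link to spur D), while the looped nodes are redundant, mirroring the redundancy/optimal-connectivity measurements the abstract describes.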
M. Meniconi; D. M. Barry
1996-01-01
Statistical distributions have long been employed in the assessment of semiconductor device and product reliability. The use of the exponential distribution which is frequently preferred over mathematically more complex distributions, such as the Weibull and the lognormal among others, suggests that most engineers favour the application of simpler models to obtain failure rates and reliability figures quickly. It is therefore
Distributed transit compartments for arbitrary lifespan distributions in aging populations.
Koch, Gilbert; Schropp, Johannes
2015-09-01
Transit compartment models (TCMs) are often used to describe aging populations where every individual has its own lifespan. However, in the TCM approach these lifespans are gamma-distributed, which is a serious limitation because the Weibull or more complex distributions are often more realistic. Therefore, we extend the TCM concept to approximately describe any lifespan distribution and call this generalized concept distributed transit compartment models (DTCMs). The validity of DTCMs is established by convergence investigations. From the mechanistic perspective, the transit rates are directly controlled by the lifespan distribution. Further, DTCMs can be used to approximate the convolution of a signal with a probability density function. As an example, a stimulatory effect of a drug in an aging population with a Weibull-distributed lifespan is presented, where distribution and model parameters are estimated from simulated data. PMID:26100181
First Experiences with LHC Grid Computing and Distributed Analysis
Fisk, Ian
2010-12-01
This presentation describes the experiences of the LHC experiments with grid computing, with a focus on distributed analysis. After many years of development, preparation, exercises, and validation, the LHC (Large Hadron Collider) experiments are in operation. The computing infrastructure was heavily utilized in the first 6 months of data collection. The general experience of exploiting the grid infrastructure for organized processing and preparation is described, as well as the successes in employing the infrastructure for distributed analysis. Finally, the expected evolution and future plans are outlined.
Entropy Methods For Univariate Distributions in Decision Analysis
NASA Astrophysics Data System (ADS)
Abbas, Ali E.
2003-03-01
One of the most important steps in decision analysis practice is the elicitation of the decision-maker's belief about an uncertainty of interest in the form of a representative probability distribution. However, the probability elicitation process is a task that involves many cognitive and motivational biases. Alternatively, the decision-maker may provide other information about the distribution of interest, such as its moments, and the maximum entropy method can be used to obtain a full distribution subject to the given moment constraints. In practice, however, decision makers cannot readily provide moments for the distribution, and are much more comfortable providing information about the fractiles of the distribution of interest or bounds on its cumulative probabilities. In this paper we present a graphical method to determine the maximum entropy distribution between upper and lower probability bounds and provide an interpretation for the shape of the maximum entropy distribution subject to fractile constraints (FMED). We also discuss the limitations of the FMED, namely that it is discontinuous and flat over each fractile interval. We present a heuristic approximation for the case where, in addition to its fractiles, the distribution is known to be continuous, and work through full examples to illustrate the approach.
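The FMED's shape follows directly from the fractile constraints: between consecutive elicited fractiles, the entropy-maximizing density is flat, so the whole density is piecewise uniform (hence discontinuous at the fractile points, as the abstract notes). A minimal sketch, with hypothetical elicited fractiles:

```python
def fmed_density(fractiles):
    """Maximum-entropy density subject to fractile constraints (FMED).
    `fractiles` is a list of (x, F(x)) pairs with F increasing from 0 to 1;
    between consecutive points the density is a uniform slab carrying the
    probability mass of that interval. Illustrative sketch only."""
    def pdf(x):
        for (x0, p0), (x1, p1) in zip(fractiles, fractiles[1:]):
            if x0 <= x < x1:
                return (p1 - p0) / (x1 - x0)  # flat over the fractile interval
        return 0.0
    return pdf

# hypothetical elicitation: quartiles of a quantity supported on [0, 10]
pdf = fmed_density([(0, 0.0), (2, 0.25), (5, 0.5), (6, 0.75), (10, 1.0)])
```

Evaluating `pdf` at points in different intervals shows the jumps at each fractile, the discontinuity the paper's continuous heuristic approximation is designed to smooth out.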
ANALYSIS OF DISTRIBUTION FEEDER LOSSES DUE TO ADDITION OF DISTRIBUTED PHOTOVOLTAIC GENERATORS
Tuffner, Francis K.; Singh, Ruchi
2011-08-09
Distributed generators (DG) are small scale power supplying sources owned by customers or utilities and scattered throughout the power system distribution network. Distributed generation can be both renewable and non-renewable. Addition of distributed generation is primarily to increase feeder capacity and to provide peak load reduction. However, this addition comes with several impacts on the distribution feeder. Several studies have shown that addition of DG leads to reduction of feeder loss. However, most of these studies have considered lumped load and distributed load models to analyze the effects on system losses, where the dynamic variation of load due to seasonal changes is ignored. It is very important for utilities to minimize the losses under all scenarios to decrease revenue losses, promote efficient asset utilization, and therefore, increase feeder capacity. This paper will investigate an IEEE 13-node feeder populated with photovoltaic generators on detailed residential houses with water heater, Heating Ventilation and Air conditioning (HVAC) units, lights, and other plug and convenience loads. An analysis of losses for different power system components, such as transformers, underground and overhead lines, and triplex lines, will be performed. The analysis will utilize different seasons and different solar penetration levels (15%, 30%).
Individual patterns of motor deficits evident in movement distribution analysis
Huang, Felix C.; Patton, James L.
2015-01-01
Recent studies in rehabilitation have shown potential benefits of patient-initiated exploratory practice. Such findings, however, lead to new challenges in how to quantify and interpret movement patterns. We posit that changes in coordination are most evident in statistical distributions of movements. In a test on 10 chronic stroke subjects practicing for 3 days, we found that the inter-quartile range of motion did not show improvement. However, a multivariate Gaussian analysis required more complexity at the end of training. Beyond simply characterizing movement, linear discriminant classification of each patient's movement distribution also identified that each patient's motor deficit left a unique signature. The greatest distinctions were observed in the space of accelerations (rather than position or velocity). These results suggest that unique deficits are best detected with such a distribution analysis, and also point to the need for customized interventions that consider such patient-specific motor deficits. PMID:24187248
GIS-based poverty and population distribution analysis in China
NASA Astrophysics Data System (ADS)
Cui, Jing; Wang, Yingjie; Yan, Hong
2009-07-01
Geographically, poverty status is related not only to socioeconomic factors but is also strongly affected by the geographical environment. In this paper, a GIS-based poverty and population distribution analysis method is introduced for revealing their regional differences. More than 100,000 poor villages and 592 national key poor counties are chosen for the analysis. The results show that poverty distribution tends to concentrate in most of west China and in the mountainous rural areas of mid China. Furthermore, the fifth census data are overlaid on those poor areas in order to capture the internal diversity of their socioeconomic characteristics. By overlaying poverty-related socioeconomic parameters, such as sex ratio, illiteracy, education level, percentage of ethnic minorities, and family composition, the findings show that poverty distribution is strongly correlated with a high illiteracy rate, a high percentage of ethnic minorities, and larger family size.
Energy loss analysis of an integrated space power distribution system
NASA Technical Reports Server (NTRS)
Kankam, M. David; Ribeiro, P. F.
1992-01-01
The results of studies related to conceptual topologies of an integrated utility-like space power system are described. The system topologies are comparatively analyzed by considering their transmission energy losses as functions of, mainly, distribution voltage level and load composition. The analysis is expedited by use of the Distribution System Analysis and Simulation (DSAS) software. This recently developed computer program by the Electric Power Research Institute (EPRI) uses improved load models to solve the power flow within the system. However, present shortcomings of the software with regard to space applications, and the incompletely defined characteristics of a space power system, make the results applicable only to the fundamental trends of energy losses in the topologies studied. Accounting for the effects of the various parameters on system performance, as included here, can constitute part of a planning tool for a space power distribution system.
Distributed Clock Synchronization over Wireless Networks: Algorithms and Analysis
Distributed Clock Synchronization over Wireless Networks: Algorithms and Analysis
Arvind Giridhar [...] communicate is based on the exchange of time-stamped packets. Viewing the network as a graph, this corresponds to finding estimates of clock offsets across the edges of the graph. These estimates must then be processed
Analysis of Policy Anomalies on Distributed Network Security Setups
Garcia-Alfaro, Joaquin
J. G. Alfaro, F. Cuppens [...] the security rules expressed in this policy over different security components of the information system [...] detection systems (NIDSs), is the dominant method to survey and guarantee the security policy in current
Systematical Analysis on Angular Distribution of Bremsstrahlung Radiation
Otgooloi, B.; Enkhbat, N.
2009-03-31
A systematic analysis has been made of the measurement results for the relative angular distribution of gamma quanta with 11 to 16 MeV energy, using experimental data for Ta, W, Cu, Mo and Ti targets with various radiating-length thicknesses.
Prototype of distributed analysis software hierarchy for the SUBARU Telescope
NASA Astrophysics Data System (ADS)
Mizumoto, Yoshihiko; Chikada, Yoshihiro; Kosugi, George; Yagi, M.; Nishihara, Eiji; Takata, Tadafumi; Yoshida, Michitoshi; Ishihara, Yasuhide; Yanaka, Hiroshi; Morita, Yasuhiro; Nakamoto, Hiroyuki
1998-07-01
We are developing a data reduction and analysis system DASH for efficient data processing of the SUBARU telescope. We adopted CORBA as a distributed object environment and Java for a user interface in the prototype of DASH. Moreover, we introduced a data reduction procedure cube as a kind of visual procedure script.
Principal Component Analysis for Distributed Data Sets with Updating
Chan, Raymond
Zheng-Jian Bai, Raymond H. Chan [...] data sets is a key requirement in data mining. A powerful technique for this purpose is principal component analysis (PCA). PCA-based clustering algorithms are effective when the data sets are found
Analysis of AOA-TOA signal distribution in indoor environments
E. Tsalolihin; I. Bilik; N. Blaunstein; S. Shakya
2011-01-01
This work presents theoretical and experimental analysis of radio signals propagation in LOS and NLOS indoor propagation conditions. Specifically, propagation conditions in straight corridor and adjacent rooms are analysed. A statistical multi-parametric approach that was previously used in urban propagation conditions is adapted to the modelling of indoor communication channels. The model of the received signal distribution in the joint
Conflict Classification and Analysis of Distributed Firewall Policies
Wang, Yongge
Ehab Al-Shaer and Hazem [...]; Masum Hasan, Cisco Systems, San Jose, California, USA (masum@cisco.com). Abstract: Firewalls are core elements in network security. However, managing firewall rules, particularly in multi-firewall enterprise
Rapid Analysis of Mass Distribution of Radiation Shielding
NASA Technical Reports Server (NTRS)
Zapp, Edward
2007-01-01
Radiation Shielding Evaluation Toolset (RADSET) is a computer program that rapidly calculates the spatial distribution of mass of an arbitrary structure for use in ray-tracing analysis of the radiation-shielding properties of the structure. RADSET was written to be used in conjunction with unmodified commercial computer-aided design (CAD) software that provides access to data on the structure and generates selected three-dimensional-appearing views of the structure. RADSET obtains raw geometric, material, and mass data on the structure from the CAD software. From these data, RADSET calculates the distribution(s) of the masses of specific materials about any user-specified point(s). The results of these mass-distribution calculations are imported back into the CAD computing environment, wherein the radiation-shielding calculations are performed.
Analysis of exposure biomarker relationships with the Johnson SBB distribution.
Flynn, Michael R
2007-08-01
Application of the Johnson bivariate SB distribution, or alternatively the SBB distribution, is presented here as a tool for the analysis of concentration data and in particular for characterizing the relationship between exposures and biomarkers. Methods for fitting the marginal SB distributions are enhanced by maximizing the Shapiro-Wilk W statistic. The subsequent goodness of fit for the SBB distribution is evaluated with a multivariate Z statistic. Median regression results are extended here with methods for calculating the mean and standard deviation of the conditional array distributions. Application of these methods to the evaluation of the relationship between exposure to airborne bromopropane and the biomarker of serum bromide concentration suggests that the SBB distribution may be useful in stratifying workers by exposure based on using a biomarker. A comparison with the usual two-parameter log-normal approach shows that in some cases the SBB distribution may offer advantages. PMID:17693427
Analyzing Distributed Functions in an Integrated Hazard Analysis
NASA Technical Reports Server (NTRS)
Morris, A. Terry; Massie, Michael J.
2010-01-01
Large scale integration of today's aerospace systems is achievable through the use of distributed systems. Validating the safety of distributed systems is significantly more difficult as compared to centralized systems because of the complexity of the interactions between simultaneously active components. Integrated hazard analysis (IHA), a process used to identify unacceptable risks and to provide a means of controlling them, can be applied to either centralized or distributed systems. IHA, though, must be tailored to fit the particular system being analyzed. Distributed systems, for instance, must be analyzed for hazards in terms of the functions that rely on them. This paper will describe systems-oriented IHA techniques (as opposed to traditional failure-event or reliability techniques) that should be employed for distributed systems in aerospace environments. Special considerations will be addressed when dealing with specific distributed systems such as active thermal control, electrical power, command and data handling, and software systems (including the interaction with fault management systems). Because of the significance of second-order effects in large scale distributed systems, the paper will also describe how to analyze secondary functions to secondary functions through the use of channelization.
Tao, Zhong; Tagare, Hemant D; Beaty, James D
2006-11-01
Segmenting cardiac ultrasound images requires a model for the statistics of speckle in the images. Although the statistics of speckle are well understood for the raw transducer signal, the statistics of speckle in the image are not. This paper evaluates simple empirical models for first-order statistics for the distribution of gray levels in speckle. The models are created by analyzing over 100 images obtained from commercial ultrasound machines in clinical settings. The data in the images suggests a unimodal scalable family of distributions as a plausible model. Four families of distributions (Gamma, Weibull, Normal, and Log-normal) are compared with the data using goodness-of-fit and misclassification tests. Attention is devoted to the analysis of artifacts in images and to the choice of goodness-of-fit and misclassification tests. The distribution of parameters of one of the models is investigated and priors for the distribution are suggested. PMID:17117777
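The family-comparison step, fitting several candidate distributions to the observed gray levels and ranking them by goodness of fit, can be sketched with a hand-rolled Kolmogorov-Smirnov distance. For brevity this sketch fits only two of the paper's four families (Normal and Log-normal, by moments); the function names and toy data are illustrative, not the authors':

```python
import math

def ks_statistic(data, cdf):
    """Kolmogorov-Smirnov distance between a sample and a candidate CDF."""
    xs = sorted(data)
    n = len(xs)
    return max(max(abs((i + 1) / n - cdf(x)), abs(i / n - cdf(x)))
               for i, x in enumerate(xs))

def normal_cdf(mu, sigma):
    return lambda x: 0.5 * (1 + math.erf((x - mu) / (sigma * math.sqrt(2))))

def compare_families(gray_levels):
    """Fit Normal and Log-normal by moments and rank them by KS distance,
    a simplified stand-in for the paper's goodness-of-fit comparison
    (Gamma and Weibull omitted to keep the sketch short)."""
    n = len(gray_levels)
    mu = sum(gray_levels) / n
    sigma = (sum((x - mu) ** 2 for x in gray_levels) / n) ** 0.5
    logs = [math.log(x) for x in gray_levels]
    lmu = sum(logs) / n
    lsig = (sum((v - lmu) ** 2 for v in logs) / n) ** 0.5
    fits = {
        "normal": ks_statistic(gray_levels, normal_cdf(mu, sigma)),
        "lognormal": ks_statistic(gray_levels,
                                  lambda x: normal_cdf(lmu, lsig)(math.log(x))),
    }
    return min(fits, key=fits.get), fits

# right-skewed toy "gray levels" (positive, as speckle intensities are)
data = [math.exp(k / 2) for k in range(-4, 5)]
best, fits = compare_families(data)
```

On skewed positive data like this, the log-normal family wins the ranking, the same kind of verdict the paper reaches for its unimodal scalable families via goodness-of-fit and misclassification tests.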
van Dorp, Johan René
A General Bayes Weibull Inference Model for Accelerated Life Testing
J. René Van Dorp & Thomas A. [...] presents the development of a general Bayes inference model for accelerated life testing. [...] the failure times and the common shape parameter. Using the approach, Bayes point estimates as well as probability statements
Integrating software architectures for distributed simulations and simulation analysis communities.
Goldsby, Michael E.; Fellig, Daniel; Linebarger, John Michael; Moore, Patrick Curtis; Sa, Timothy J.; Hawley, Marilyn F.
2005-10-01
The one-year Software Architecture LDRD (No. 79819) was a cross-site effort between Sandia California and Sandia New Mexico. The purpose of this research was to further develop and demonstrate integrating software architecture frameworks for distributed simulation and distributed collaboration in the homeland security domain. The integrated frameworks were initially developed through the Weapons of Mass Destruction Decision Analysis Center (WMD-DAC), sited at SNL/CA, and the National Infrastructure Simulation & Analysis Center (NISAC), sited at SNL/NM. The primary deliverable was a demonstration of both a federation of distributed simulations and a federation of distributed collaborative simulation analysis communities in the context of the same integrated scenario, which was the release of smallpox in San Diego, California. To our knowledge this was the first time such a combination of federations under a single scenario has ever been demonstrated. A secondary deliverable was the creation of the standalone GroupMeld™ collaboration client, which uses the GroupMeld™ synchronous collaboration framework. In addition, a small pilot experiment that used both integrating frameworks allowed a greater range of crisis management options to be performed and evaluated than would have been possible without the use of the frameworks.
Spatial Distribution Analysis of Scrub Typhus in Korea
Jin, Hong Sung; Chu, Chaeshin; Han, Dong Yeob
2013-01-01
Objective: This study analyzes the spatial distribution of scrub typhus in Korea. Methods: The spatial distribution of Orientia tsutsugamushi occurrence is presented using a geographic information system (GIS) and analyzed by means of spatial clustering and correlations. Results: The provinces of Gangwon-do and Gyeongsangbuk-do show a low incidence throughout the year. Some districts have almost identical environmental conditions of scrub typhus incidence. The land use change of districts does not directly affect the incidence rate. Conclusion: GIS analysis shows the spatial characteristics of scrub typhus. This research can be used to construct a spatial-temporal model to understand the tsutsugamushi epidemic. PMID:24159523
Influence Of Lateral Load Distributions On Pushover Analysis Effectiveness
NASA Astrophysics Data System (ADS)
Colajanni, P.; Potenzone, B.
2008-07-01
The effectiveness of two simple load distributions for pushover analysis recently proposed by the authors is investigated through a comparative study, involving static and dynamic analyses of seismic response of eccentrically braced frames. It is shown that in the upper floors only multimodal pushover procedures provide results close to the dynamic profile, while the proposed load patterns are always conservative in the lower floors. They over-estimate the seismic response less than the uniform distribution, representing a reliable alternative to the uniform or more sophisticated adaptive procedures proposed by seismic codes.
Volumetric relief map for intracranial cerebrospinal fluid distribution analysis.
Lebret, Alain; Kenmochi, Yukiko; Hodel, Jérôme; Rahmouni, Alain; Decq, Philippe; Petit, Éric
2015-09-01
Cerebrospinal fluid imaging plays a significant role in the clinical diagnosis of brain disorders, such as hydrocephalus and Alzheimer's disease. While three-dimensional images of cerebrospinal fluid are very detailed, the complex structures they contain can be time-consuming and laborious to interpret. This paper presents a simple technique that represents the intracranial cerebrospinal fluid distribution as a two-dimensional image in such a way that the total fluid volume is preserved. We call this a volumetric relief map, and show its effectiveness in a characterization and analysis of fluid distributions and networks in hydrocephalus patients and healthy adults. PMID:26125975
HammerCloud: A Stress Testing System for Distributed Analysis
van der Ster, Daniel C; Ubeda Garcia, Mario; Paladin, Massimo
2011-01-01
Distributed analysis of LHC data is an I/O-intensive activity which places large demands on the internal network, storage, and local disks at remote computing facilities. Commissioning and maintaining a site to provide an efficient distributed analysis service is therefore a challenge which can be aided by tools to help evaluate a variety of infrastructure designs and configurations. HammerCloud (HC) is one such tool; it is a stress testing service which is used by central operations teams, regional coordinators, and local site admins to (a) submit an arbitrary number of analysis jobs to a number of sites, (b) maintain at steady state a predefined number of jobs running at the sites under test, (c) produce web-based reports summarizing the efficiency and performance of the sites under test, and (d) present a web interface for historical test results to both evaluate progress and compare sites. HC was built around the distributed analysis framework Ganga, exploiting its API for grid job management. HC has been ...
Probabilistic Life and Reliability Analysis of Model Gas Turbine Disk
NASA Technical Reports Server (NTRS)
Holland, Frederic A.; Melis, Matthew E.; Zaretsky, Erwin V.
2002-01-01
In 1939, W. Weibull developed what is now commonly known as the "Weibull Distribution Function," primarily to determine the cumulative strength distribution of small sample sizes of elemental fracture specimens. In 1947, G. Lundberg and A. Palmgren, using the Weibull Distribution Function, developed a probabilistic lifing protocol for ball and roller bearings. In 1987, E. V. Zaretsky, using the Weibull Distribution Function, modified the Lundberg and Palmgren approach to life prediction. His method incorporates the results of coupon fatigue testing to compute the life of elemental stress volumes of a complex machine element, in order to predict system life and reliability. This paper examines the Zaretsky method to determine the probabilistic life and reliability of a model gas turbine disk using experimental data from coupon specimens. The predicted results are compared to experimental disk endurance data.
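The core calculation, elemental Weibull survival probabilities combined multiplicatively into a system life, can be sketched as follows. The two-parameter form F(t) = 1 - exp(-(t/b)^c) is standard; the three-element "disk" and its (b, c) values are hypothetical, not the paper's data:

```python
import math

def weibull_survival(t, b, c):
    """Two-parameter Weibull: F(t) = 1 - exp(-(t/b)**c), so reliability at
    time t is exp(-(t/b)**c). b is the characteristic life (63.2% failed),
    c the shape parameter (Weibull modulus)."""
    return math.exp(-((t / b) ** c))

def system_life(elements, reliability):
    """Strict-series combination in the Lundberg-Palmgren spirit: the
    system survives only if every elemental stress volume survives, so
    system reliability is the product of element reliabilities. Returns
    the life t at which that product drops to `reliability`, by bisection
    on the monotone-decreasing survival curve. Sketch with assumed data."""
    def r_sys(t):
        p = 1.0
        for b, c in elements:
            p *= weibull_survival(t, b, c)
        return p
    lo, hi = 0.0, max(b for b, _ in elements) * 10
    for _ in range(100):
        mid = (lo + hi) / 2
        if r_sys(mid) > reliability:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

# hypothetical disk modeled as three elemental stress volumes (b, c)
elements = [(1000.0, 2.0), (1500.0, 2.0), (2000.0, 2.0)]
l10 = system_life(elements, reliability=0.90)  # L10 life: 90% survival
```

Note that the system L10 life comes out shorter than any single element's L10 life, which is exactly why coupon-level Weibull data must be aggregated over all stressed volumes to predict disk life.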
NASA Astrophysics Data System (ADS)
Han, Xue; Sandels, Claes; Zhu, Kun; Nordström, Lars
2013-08-01
Many statements have claimed that the large-scale deployment of Distributed Energy Resources (DERs) could eventually reshape future distribution grid operation in numerous ways. It is therefore necessary to introduce a framework to measure to what extent power system operation will be changed by various parameters of DERs. This article proposes a modelling framework for an overview analysis of the correlation between DERs and distribution grid operation. Furthermore, to validate the framework, the authors describe reference models for different categories of DERs with their unique characteristics, comprising distributed generation, active demand, and electric vehicles. Subsequently, a quantitative analysis is made on the basis of the current and envisioned DER deployment scenarios proposed for Sweden. Simulations are performed in two typical distribution network models for four seasons. The simulation results show that, in general, DER deployment brings possibilities to reduce power losses and voltage drops by compensating power from local generation and optimizing local load profiles.
Comparing Distributions of Environmental Outcomes for Regulatory Environmental Justice Analysis
Maguire, Kelly; Sheriff, Glenn
2011-01-01
Economists have long been interested in measuring distributional impacts of policy interventions. As environmental justice (EJ) emerged as an ethical issue in the 1970s, the academic literature has provided statistical analyses of the incidence and causes of various environmental outcomes as they relate to race, income, and other demographic variables. In the context of regulatory impacts, however, there is a lack of consensus regarding what information is relevant for EJ analysis, and how best to present it. This paper helps frame the discussion by suggesting a set of questions fundamental to regulatory EJ analysis, reviewing past approaches to quantifying distributional equity, and discussing the potential for adapting existing tools to the regulatory context. PMID:21655146
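One standard way to quantify distributional equity, in the spirit of the tools this paper reviews, is an inequality index such as the Gini coefficient applied to a distribution of environmental burdens across groups. A minimal sketch follows; the exposure values are made up for illustration.

```python
import numpy as np

def gini(x):
    """Gini coefficient via the sorted-rank (mean absolute difference) formula."""
    x = np.sort(np.asarray(x, dtype=float))
    n = x.size
    # G = 2 * sum_i(i * x_i) / (n * sum(x)) - (n + 1) / n, with i = 1..n
    i = np.arange(1, n + 1)
    return 2.0 * np.sum(i * x) / (n * np.sum(x)) - (n + 1.0) / n

equal = gini([5, 5, 5, 5])    # identical exposure across four groups
skewed = gini([0, 0, 0, 10])  # the entire burden falls on one group
```

For perfectly equal exposure the index is 0; concentrating the whole burden on one of four groups yields 0.75, approaching 1 as the number of groups grows.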
Distribution of MSCN coefficients Histogram of gradient of
Evans, Brian L.
Figure captions: histogram of the gradient of MSCN coefficients and the fitted Rayleigh, Weibull, and Nakagami distributions; skewness-kurtosis scatter plot of MSCN coefficients of synthetic images [Kundu 2014] and natural images [Martin 2001]; generalized Gaussian distribution fit to MSCN coefficients.
Automatic analysis of attack data from distributed honeypot network
NASA Astrophysics Data System (ADS)
Safarik, Jakub; Voznak, Miroslav; Rezac, Filip; Partila, Pavol; Tomala, Karel
2013-05-01
There are many ways of getting real data about malicious activity in a network. One of them relies on masquerading monitoring servers as production ones. These servers are called honeypots, and data about attacks on them give us valuable information about actual attacks and the techniques used by hackers. The article describes a distributed topology of honeypots, which was developed with a strong orientation towards monitoring IP telephony traffic. IP telephony servers can easily be exposed to various types of attacks, and without protection this situation can lead to loss of money and other unpleasant consequences. Using a distributed topology with honeypots placed in different geographical locations and networks provides more valuable and independent results. With an automatic system for gathering information from all honeypots, it is possible to work with all the information at one centralized point. Communication between the honeypots and the centralized data store uses secure SSH tunnels, and the server communicates only with authorized honeypots. The centralized server also automatically analyses the data from each honeypot. The results of this analysis, along with other statistical data about malicious activity, are easily accessible through a built-in web server. All statistical and analysis reports serve as the information basis for an algorithm which classifies the different types of VoIP attacks used. The web interface then provides a tool for quick comparison and evaluation of actual attacks in all monitored networks. The article describes both the honeypot nodes in the distributed architecture, which monitor suspicious activity, and the methods and algorithms used on the server side for analysis of the gathered data.
Karmeshu; Gupta, Varun; Kadambari, K V
2011-06-01
A single-neuron model incorporating distributed delay (memory) is proposed. The stochastic model has been formulated as a Stochastic Integro-Differential Equation (SIDE), which results in the underlying process being non-Markovian. A detailed analysis of the model when the distributed delay kernel has exponential form (weak delay) has been carried out. The selection of an exponential kernel has enabled the transformation of the non-Markovian model into a Markovian model in an extended state space. For the study of the First Passage Time (FPT) with an exponential delay kernel, the model has been transformed into a system of coupled Stochastic Differential Equations (SDEs) in a two-dimensional state space. Simulation studies of the SDEs provide insight into the effect of the weak delay kernel on the Inter-Spike Interval (ISI) distribution. A measure based on Jensen-Shannon divergence is proposed which can be used to choose between two competing models, viz. the distributed delay model vis-à-vis the LIF model. An interesting feature of the model is that the behavior of the coefficient of variation CV(t) of the ISI distribution with respect to the memory kernel time constant parameter reveals that the neuron can switch from a bursting state to a non-bursting state as the noise intensity parameter changes. The membrane potential exhibits a decaying auto-correlation structure, with or without damped oscillatory behavior depending on the choice of parameters. This behavior is in agreement with the empirically observed pattern of spike counts in a fixed time window. The power spectral density derived from the auto-correlation function is found to exhibit single and double peaks. The model is also examined for the case of strong delay, with a memory kernel having the form of a Gamma distribution. In contrast to the fast decay of the damped oscillations of the ISI distribution for the model with a weak delay kernel, the decay of the damped oscillations is found to be slower for the model with a strong delay kernel.
PMID:21701877
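The kind of simulation study described, a two-dimensional Markovian SDE system obtained from the exponential kernel, can be sketched with an Euler-Maruyama loop. This is a minimal sketch assuming a leaky voltage V coupled to an exponentially filtered memory variable z with threshold-and-reset spiking; all equations and parameter values are illustrative stand-ins, not the paper's model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters (ms units assumed), not taken from the paper
tau_m, tau_k = 10.0, 5.0    # membrane and memory-kernel time constants
mu, sigma = 1.2, 0.5        # drift and noise intensity
v_th, dt, T = 1.0, 0.01, 1000.0

v, z = 0.0, 0.0
spikes = []
t = 0.0
while t < T:
    dw = rng.normal(0.0, np.sqrt(dt))
    # Euler-Maruyama step for the coupled (V, z) system; the exponential
    # kernel is what makes the memory variable z Markovian.
    v += ((-v + mu + z) / tau_m) * dt + sigma * dw
    z += ((v - z) / tau_k) * dt
    if v >= v_th:               # threshold crossing: record spike, reset
        spikes.append(t)
        v = 0.0
    t += dt

isi = np.diff(spikes)
cv = isi.std() / isi.mean()     # coefficient of variation of the ISI
```

Sweeping the kernel time constant and noise intensity in such a loop is how the switch between bursting and non-bursting ISI statistics would be explored numerically.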
Modeling and convergence analysis of distributed coevolutionary algorithms.
Subbu, Raj; Sanderson, Arthur C
2004-04-01
A theoretical foundation is presented for modeling and convergence analysis of a class of distributed coevolutionary algorithms applied to optimization problems in which the variables are partitioned among p nodes. An evolutionary algorithm at each of the p nodes performs a local evolutionary search based on its own set of primary variables, and the secondary variable set at each node is clamped during this phase. An infrequent intercommunication between the nodes updates the secondary variables at each node. The local search and intercommunication phases alternate, resulting in a cooperative search by the p nodes. First, we specify a theoretical basis for a class of centralized evolutionary algorithms in terms of construction and evolution of sampling distributions over the feasible space. Next, this foundation is extended to develop a model for a class of distributed coevolutionary algorithms. Convergence and convergence rate analyses are pursued for basic classes of objective functions. Our theoretical investigation reveals that for certain unimodal and multimodal objectives, we can expect these algorithms to converge at a geometrical rate. The distributed coevolutionary algorithms are of most interest from the perspective of their performance advantage compared to centralized algorithms, when they execute in a network environment with significant local access and internode communication delays. The relative performance of these algorithms is therefore evaluated in a distributed environment with realistic parameters of network behavior. PMID:15376831
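The alternating local-search / intercommunication scheme can be sketched for p = 2 nodes on a simple sphere objective. The (1+1)-style mutation scheme, the shrinking step schedule, and the round counts below are illustrative assumptions, not the algorithm class analyzed in the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

def sphere(x):
    """Separable test objective; global minimum 0 at the origin."""
    return float(np.sum(x ** 2))

# Variables partitioned among p = 2 nodes; each node mutates only its
# primary block while its secondary block stays clamped.
x = rng.normal(0.0, 5.0, size=4)
blocks = [slice(0, 2), slice(2, 4)]

for epoch in range(50):                  # intercommunication rounds
    step = 0.5 * 0.95 ** epoch           # shrinking mutation strength
    for b in blocks:                     # local evolutionary search phase
        for _ in range(20):              # (1+1)-style mutation steps
            trial = x.copy()
            trial[b] += rng.normal(0.0, step, size=2)
            if sphere(trial) < sphere(x):
                x = trial
    # In a real distributed run, the nodes would now exchange their
    # updated primary blocks over the network.

best = sphere(x)
```

On a separable objective like this the blocks decouple, which is the easy case; the interesting convergence questions in the paper concern coupled variables and communication delays.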
Distribution System Reliability Analysis for Smart Grid Applications
NASA Astrophysics Data System (ADS)
Aljohani, Tawfiq Masad
Reliability of power systems is a key aspect in modern power system planning, design, and operation. The ascendance of the smart grid concept has brought high hopes of developing an intelligent network capable of being a self-healing grid, offering the ability to overcome the interruption problems that face the utility and cost it tens of millions in repair and loss. To address reliability concerns, power utilities and interested parties have spent an extensive amount of time and effort analyzing and studying the reliability of the generation and transmission sectors of the power grid. Only recently has attention shifted to improving the reliability of the distribution network, the connection joint between the power providers and the consumers, where most electricity problems occur. In this work, we examine the effect of smart grid applications on improving the reliability of power distribution networks. The test system used in this thesis is the IEEE 34-node test feeder, released in 2003 by the Distribution System Analysis Subcommittee of the IEEE Power Engineering Society. The objective is to analyze the feeder for the optimal placement of automatic switching devices and to quantify their proper installation based on the performance of the distribution system. The measures are the changes in the system reliability indices, including SAIDI, SAIFI, and EUE. A further goal is to design and simulate the effect of installing Distributed Generators (DGs) on the utility's distribution system and to measure the potential improvement in its reliability. The software used in this work is DISREL, an intelligent power distribution software package developed by General Reliability Co.
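The cited indices follow standard IEEE 1366-style definitions and can be computed directly from interruption records; the records and customer count below are hypothetical.

```python
# IEEE 1366-style reliability indices from interruption records.
# Each record: (customers_affected, outage_duration_hours).
interruptions = [(150, 2.0), (40, 0.5), (300, 1.5)]
customers_served = 1000

# SAIFI: average number of interruptions per customer served
saifi = sum(n for n, _ in interruptions) / customers_served
# SAIDI: average outage duration (hours) per customer served
saidi = sum(n * d for n, d in interruptions) / customers_served
# CAIDI: average restoration time per interrupted customer
caidi = saidi / saifi
```

Placing automatic switches or DGs changes which customers each fault interrupts and for how long, which is exactly what shifts these indices in a study like this one.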
Componential distribution analysis of food using near infrared ray image
NASA Astrophysics Data System (ADS)
Yamauchi, Hiroki; Kato, Kunihito; Yamamoto, Kazuhiko; Ogawa, Noriko; Ohba, Kimie
2008-11-01
The components of food related to "deliciousness" are usually evaluated by componential analysis, which determines the content and type of the components in the food. However, componential analysis cannot resolve the spatial distribution of the components, and the measurement is time consuming. We propose a method to measure the two-dimensional distribution of a component in food using a near infrared (IR) image. The advantage of our method is that it can visualize otherwise invisible components. Many components in food have characteristic absorption and reflection of light in the IR range. The component content is measured using subtraction between images at two near-IR wavelengths. In this paper, we describe a method to measure food components using near-IR image processing, and we show an application visualizing the saccharose in a pumpkin.
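The two-wavelength subtraction idea can be sketched with toy reflectance images: one taken inside the absorption band of the target component and one outside it. The arrays and values below are hypothetical, not measurements.

```python
import numpy as np

# Hypothetical reflectance images at two near-IR wavelengths:
# one inside the absorption band of the target component, one outside.
img_absorbing = np.array([[0.40, 0.80],
                          [0.35, 0.78]])
img_reference = np.array([[0.82, 0.81],
                          [0.80, 0.79]])

# Difference image: large values mean strong absorption at the band
# wavelength, i.e. higher local content of the component.
component_map = img_reference - img_absorbing
```

In this toy example the left column absorbs strongly at the band wavelength, so the difference image highlights it as component-rich while the right column stays near zero.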
Scalable Visual Reasoning: Supporting Collaboration through Distributed Analysis
Pike, William A.; May, Richard A.; Baddeley, Bob; Riensche, Roderick M.; Bruce, Joe; Younkin, Katarina
2007-05-21
We present a visualization environment called the Scalable Reasoning System (SRS) that provides a suite of tools for the collection, analysis, and dissemination of reasoning products. This environment is designed to function across multiple platforms, bringing the display of visual information and the capture of reasoning associated with that information to both mobile and desktop clients. The service-oriented architecture of SRS promotes collaboration and interaction between users regardless of their location or platform. Visualization services allow data processing to be centralized and analysis results collected from distributed clients in real time. We use the concept of “reasoning artifacts” to capture the analytic value attached to individual pieces of information and collections thereof, helping to fuse the foraging and sense-making loops in information analysis. Reasoning structures composed of these artifacts can be shared across platforms while maintaining references to the analytic activity (such as interactive visualization) that produced them.
GPS FOM Chimney Analysis using Generalized Extreme Value Distribution
NASA Technical Reports Server (NTRS)
Ott, Rick; Frisbee, Joe; Saha, Kanan
2004-01-01
Often an objective of a statistical analysis is to estimate a limit value, such as a 3-sigma 95% confidence upper limit, from a data sample. The Generalized Extreme Value (GEV) distribution method can be profitably employed in many such situations. It is well known that, according to the Central Limit Theorem, the mean value of a large data set is normally distributed irrespective of the distribution of the data from which the mean is derived. In a somewhat similar fashion, the extreme value of a data set often has a distribution that can be formulated as a generalized extreme value distribution. In Space Shuttle entry with 3-string GPS navigation, the Figure Of Merit (FOM) value gives a measure of GPS navigated-state accuracy. A GPS navigated state with a FOM of 6 or higher is deemed unacceptable and is said to form a FOM chimney, a period of time during which the FOM value stays higher than 5. A longer period with a FOM of 6 or higher causes the navigated state to accumulate more error for lack of a state update. For an acceptable landing it is imperative that the state error remain low; hence, at low altitude during entry, GPS data with a FOM greater than 5 must not last more than 138 seconds. To test GPS performance, many entry test cases were simulated at the Avionics Development Laboratory. Only high-value FOM chimneys are consequential, so the extreme value statistical technique is applied to analyze them. The maximum likelihood method is used to determine the parameters that characterize the GEV distribution, and the limit value statistics are then estimated.
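The maximum-likelihood GEV fit and limit-value extraction described above can be sketched with SciPy. The chimney-duration sample here is synthetic (drawn from a GEV with made-up parameters), standing in for the simulated entry data.

```python
import numpy as np
from scipy.stats import genextreme

# Synthetic stand-in for per-run maximum FOM-chimney durations (seconds);
# the real data would come from the entry simulations.
durations = genextreme.rvs(c=-0.1, loc=60.0, scale=15.0,
                           size=200, random_state=42)

# Maximum-likelihood fit of the GEV shape, location, and scale
c, loc, scale = genextreme.fit(durations)

# 95th-percentile limit value of the fitted distribution
limit_95 = genextreme.ppf(0.95, c, loc=loc, scale=scale)
```

The fitted quantile plays the role of the "limit value" in the abstract: a duration that the chimneys should exceed only 5% of the time under the fitted model.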
Hammer, Julia Eve
Crystal size distribution analysis of plagioclase in experimentally decompressed hydrous rhyodacite
2010-11-01
Editor: T.M. Harrison. Keywords: crystal size distribution; plagioclase; decompression experiments; growth rate; nucleation rate; residence time.
This study presents crystal size distributions (CSD ...
EECS 598 Special Topic Analysis of Electric Power Distribution Systems and Loads
Cafarella, Michael J.
This course covers the fundamentals of electric power distribution systems and electric loads. Most power system courses focus on analysis of transmission systems.
Current Distribution Analysis of Insulated Gate Bipolar Transistor Cells
NASA Astrophysics Data System (ADS)
Long, Hongyao; Sweet, Mark R.; Ngwendson, Luther-King; Sankara Narayanan, E. M.
2010-04-01
In current insulated gate bipolar transistor (IGBT) technology, a corner or centered gate pad is employed with polycrystalline silicon (poly-Si) to form the metal oxide semiconductor (MOS) gate structure, which forms a resistor-capacitor (RC) network across the die. This paper presents, for the first time, an analysis using the circuit simulator SABER of the influence of this network on the internal behavior of the IGBT. The difference in interconnect gate impedance between the cathode cells is found to influence their gate drive voltages, which results in a divergence of collector current between the cathode cells during transient periods. Proper distribution of the poly-Si gate impedance is necessary to achieve uniformity of current distribution in the device.
Principal Component Analysis for Normal-Distribution-Valued Symbolic Data.
Wang, Huiwen; Chen, Meiling; Shi, Xiaojun; Li, Nan
2014-07-29
This paper puts forward a new approach to principal component analysis (PCA) for normal-distribution-valued symbolic data, which has vast potential for applications in economics and management. We derive a full set of numerical characteristics and the variance-covariance structure for such data, which forms the foundation for our analytical PCA approach. Our approach is able to use all of the variance information in the original data, unlike the prevailing representative-type approaches in the literature, which use only centers, vertices, etc. The paper also provides an accurate approach to constructing the observations in a PC space based on the linear additivity property of the normal distribution. The effectiveness of the proposed method is illustrated by simulated numerical experiments. Finally, our method is applied to explain the puzzle of the risk-return tradeoff in China's stock market. PMID:25095276
Growing axons analysis by using Granulometric Size Distribution
NASA Astrophysics Data System (ADS)
Gonzalez, Mariela A.; Ballarin, Virginia L.; Rapacioli, Melina; Celín, A. R.; Sánchez, V.; Flores, V.
2011-09-01
Neurite growth (neuritogenesis) in vitro is a common methodology in the field of developmental neurobiology. Morphological analyses of growing neurites are usually difficult because their thinness and low contrast usually prevent clear observation of their shape, number, length, and spatial orientation. This paper presents the use of the granulometric size distribution to automatically obtain information about the shape, size, and spatial orientation of growing axons in tissue cultures. The results presented here show that the granulometric size distribution is a very useful morphological tool, since it allows the automatic detection of growing axons and the precise characterization of a relevant parameter indicative of the spatial orientation of axonal growth, namely the angle of deviation of the growing direction. The developed algorithms automatically quantify this orientation by facilitating the analysis of these images, which is important given the large number of images that need to be processed for this type of study.
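A granulometric size distribution (pattern spectrum) is obtained by applying morphological openings with structuring elements of increasing size and tracking how much image mass survives each opening. The synthetic image below, with one small and one large blob, stands in for a micrograph; it is an illustration, not the paper's pipeline.

```python
import numpy as np
from scipy import ndimage

# Synthetic binary image: one small and one large bright object,
# a stand-in for structures of different calibers.
img = np.zeros((40, 40))
img[5:8, 5:8] = 1.0        # 3x3 object
img[20:30, 20:30] = 1.0    # 10x10 object

# Granulometric size distribution: total mass remaining after
# morphological openings with structuring elements of growing size.
sizes = range(1, 15)
spectrum = [ndimage.grey_opening(img, size=(s, s)).sum() for s in sizes]
```

The spectrum drops each time the structuring element outgrows a population of objects (here at sizes 4 and 11), so the drop locations encode object sizes; using oriented (anisotropic) structuring elements extends the same idea to growth-direction estimation.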
Time series power flow analysis for distribution connected PV generation.
Broderick, Robert Joseph; Quiroz, Jimmy Edward; Ellis, Abraham; Reno, Matthew J.; Smith, Jeff; Dugan, Roger
2013-01-01
Distributed photovoltaic (PV) projects must go through an interconnection study process before connecting to the distribution grid. These studies are intended to identify the likely impacts and mitigation alternatives. In the majority of the cases, system impacts can be ruled out or mitigation can be identified without an involved study, through a screening process or a simple supplemental review study. For some proposed projects, expensive and time-consuming interconnection studies are required. The challenges to performing the studies are twofold. First, every study scenario is potentially unique, as the studies are often highly specific to the amount of PV generation capacity that varies greatly from feeder to feeder and is often unevenly distributed along the same feeder. This can cause location-specific impacts and mitigations. The second challenge is the inherent variability in PV power output which can interact with feeder operation in complex ways, by affecting the operation of voltage regulation and protection devices. The typical simulation tools and methods in use today for distribution system planning are often not adequate to accurately assess these potential impacts. This report demonstrates how quasi-static time series (QSTS) simulation and high time-resolution data can be used to assess the potential impacts in a more comprehensive manner. The QSTS simulations are applied to a set of sample feeders with high PV deployment to illustrate the usefulness of the approach. The report describes methods that can help determine how PV affects distribution system operations. The simulation results are focused on enhancing the understanding of the underlying technical issues. The examples also highlight the steps needed to perform QSTS simulation and describe the data needed to drive the simulations. 
The goal of this report is to make the methodology of time series power flow analysis readily accessible to utilities and others responsible for evaluating potential PV impacts.
Numerical analysis of decoy state quantum key distribution protocols
Harrington, Jim W; Rice, Patrick R
2008-01-01
Decoy state protocols are a useful tool for many quantum key distribution systems implemented with weak coherent pulses, allowing significantly better secret bit rates and longer maximum distances. In this paper we present a method to numerically find optimal three-level protocols, and we examine how the secret bit rate and the optimized parameters are dependent on various system properties, such as session length, transmission loss, and visibility. Additionally, we show how to modify the decoy state analysis to handle partially distinguishable decoy states as well as uncertainty in the prepared intensities.
E. Chiodo; G. Mazzanti
2006-01-01
Reliability assessment of aged electrical components in the presence of over-stresses (e.g. voltage surges) is not an easy task. In this paper, a new methodology for solving this problem is illustrated, which is based on a Bayesian approach applied to a novel Weibull stress-strength probabilistic model. This model holds, under proper simplifying hypotheses, for electrical components progressively degraded by service
Power Flow Analysis Algorithm for Islanded LV Microgrids Including Distributed ...
Vasquez, Juan Carlos; Guerrero, J. M.
2014-01-01
Subsystem Interaction Analysis in Power Distribution Systems of Next Generation Airlifters
Lindner, Douglas K.
The analysis of subsystem interactions in the power distribution system of a next-generation transport aircraft is addressed.
Spatial interpolation of precipitation distributions using copulas
NASA Astrophysics Data System (ADS)
Bárdossy, András; Pegram, Geoffrey
2013-04-01
The interpolation of precipitation distributions is important, for example, for climatological studies, interpolation of precipitation on different time scales, and extreme value analyses. Spatial interpolation requires a certain degree of spatial continuity, which is in our case measured with the help of a Cramér-von Mises type statistic. Examples of daily precipitation measured over 4 regions in South Germany, with the number of stations ranging from 222 to 748, show a high degree of spatial continuity of the distributions. As a further step, the interpolation itself can be carried out by interpolating
• the parameters of fitted Gamma or Weibull (or other appropriate) distributions,
• the moments of the distributions, with a subsequent fit of parametric distributions, or
• the quantiles of the distributions directly.
The interdependence between the variables to be interpolated makes this task extremely difficult in all three cases. However, a straightforward analysis of the higher quantiles shows that their interdependence is extremely strong, allowing simultaneous interpolation of quantiles using copulas. Lower quantiles are less well structured, but they are subject to higher observation errors and are likely to be of less importance in hydrology. Thus the interpolation was carried out on the basis of the quantiles corresponding to values greater than 1 mm/day. Topographical influence on precipitation is considered as a covariate. The applied copula is a mixed truncated-Gaussian and Gaussian copula, which reflects the asymmetrical dependence between topography and precipitation quantiles. A split-sampling and a cross-validation methodology are used to evaluate the quality of the interpolation.
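The first option listed, fitting a Weibull distribution to the wet-day amounts above the 1 mm threshold and extracting the quantiles to be interpolated, can be sketched with SciPy. The station data here are synthetic and the parameters are made up for illustration.

```python
import numpy as np
from scipy import stats

# Synthetic stand-in for wet-day precipitation amounts (> 1 mm/day)
# observed at a single station.
precip = stats.weibull_min.rvs(0.8, loc=1.0, scale=6.0,
                               size=500, random_state=42)

# Fit a Weibull distribution with the location fixed at the 1 mm
# threshold (floc pins that parameter during the MLE fit) ...
c, loc, scale = stats.weibull_min.fit(precip, floc=1.0)

# ... and extract the station quantiles that would then be
# interpolated spatially (e.g. via the copula model).
q = stats.weibull_min.ppf([0.5, 0.9, 0.99], c, loc=loc, scale=scale)
```

Interpolating the fitted quantiles rather than the raw parameters is attractive precisely because, as the abstract notes, neighboring stations' upper quantiles are strongly interdependent.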
Analysis of Fuel Ethanol Transportation Activity and Potential Distribution Constraints
Das, Sujit [ORNL]; Peterson, Bruce E. [ORNL]; Chin, Shih-Miao [ORNL]
2010-01-01
This paper provides an analysis of fuel ethanol transportation activity and potential distribution constraints under the mandate of 36 billion gallons of total renewable fuel use by 2022 set by EPA under the Energy Independence and Security Act (EISA) of 2007. Ethanol transport by domestic truck, marine, and rail distribution systems from ethanol refineries to blending terminals is estimated using Oak Ridge National Laboratory's (ORNL's) North American Infrastructure Network Model. Most supply and demand data provided by EPA were geo-coded, and the transportation infrastructure network was updated using available commercial sources. The percentage increases in ton-mile movements by rail, waterways, and highways in 2022 are estimated to be 2.8%, 0.6%, and 0.13%, respectively, compared to the corresponding 2005 total domestic flows by the various modes. Overall, a significantly higher level of future ethanol demand would have minimal impacts on the transportation infrastructure. However, there will be spatial impacts and a significant level of investment required because of a considerable increase in rail traffic from refineries to ethanol distribution terminals.
Circularly symmetric distributed feedback semiconductor laser: An analysis
Erdogan, T.; Hall, D.G. (The Institute of Optics, University of Rochester, Rochester, New York 14627 (USA))
1990-08-15
We analyze the near-threshold behavior of a circularly symmetric distributed feedback laser by developing a coupled-mode theory analysis for all azimuthal modes. We show that the equations that describe the low-order azimuthal modes are, to a very good approximation, the same as those for the one-dimensional (linear) distributed feedback laser. We examine the behavior of higher-order azimuthal modes by numerically solving the exact coupled-mode equations. We find that while a significant amount of mode discrimination exists among radial (longitudinal) modes, as in the one-dimensional distributed feedback laser, there is a much smaller degree of discrimination among azimuthal modes, indicating probability of multimode operation. Despite the multimode behavior, we find that the frequency bandwidth associated with modes that do lase ought to be smaller than the spacing between Fabry-Perot modes of a typical semiconductor laser. This laser is an excellent candidate for a surface-emitting laser---it should have a superb quality output beam and is well-suited for array operation.
Analysis of vegetation distribution in relation to surface morphology
NASA Astrophysics Data System (ADS)
Savio, Francesca; Prosdocimi, Massimo; Tarolli, Paolo; Rulli, Cristina
2013-04-01
The scaling relationships between the curvature and local slope of a given point on the landscape and its drainage area reveal information about the dominant erosion processes over geomorphic time scales. Vegetation is known to influence erosion rates and landslide initiation, and it is in turn influenced by such processes and by climatic regimes. Understanding the influence of vegetation dynamics on landscape organization is a fundamental challenge in the Earth sciences. In this study we considered two headwater catchments with vegetation mostly characterized by grass species (high-altitude grassland), but shrubs (mainly Alnus viridis) and high forest (mainly Picea abies) are also common. We then analyzed the statistics relating vegetation distribution to different morphological patterns. High-resolution LiDAR data served as the basis for deriving Digital Terrain Models (DTMs) and mathematical attributes of landscape morphology, including slope gradient, drainage area, aspect, surface curvature, topographic wetness index, and slope-area and curvature-area log-log diagrams. The results reveal distinct differences in the curvature-area and slope-area relationships of each vegetation type. For a given drainage area, the mean landscape slope is generally found to increase with woody vegetation. A pronounced landsliding signature is detected in areas occupied by Alnus viridis, underlining the relation between this pioneer species and slope instability. This preliminary analysis suggests that, when high-resolution topography is available, it is possible to better characterize the vegetation distribution based on surface morphology, thus providing a useful tool for better understanding the processes and the role of vegetation in landscape evolution.
Continuous Probability Distribution (CUPID) Analysis of Potentials for Internal Rotations
NASA Astrophysics Data System (ADS)
Džakula, Željko; Westler, William M.; Markley, John L.
1996-05-01
Thecontinuousprobabilitydistribution (CUPID) approach for analyzing the rotamer populations from NMR spin-spin couplings and nuclear Overhauser enhancements [Ž. Džakula, W. M. Westler, A. S. Edison, and J. L. Markley,J. Amer. Chem. Soc.114, 6195 (1992)] can be expanded to allow computation of the rotational potential from the Fourier coefficients of the angular probability distribution. This approach provides a general solution to the nonnegativity problem, which appears when lack of data causes a serious truncation in the Fourier series that defines the probability distribution. In favorable cases, this approach also allows thermodynamic characterization of internal rotation. Use of this extension of the CUPID method is illustrated by the analysis of internal rotations in an amino acid, two peptides, and an oligosaccharide from published experimental data. Three strategies have been devised for dealing with cases where the experimental input data do not provide enough information for complete reconstruction of the potential: (1) two-dimensional grid search for the undetermined third-order Fourier coefficients of the potential, (2) transfer of these coefficients from related model compounds, and (3) restriction of the magnitudes of the Fourier coefficients as required by the assumption of fast-exchange averaging of the input parameters. In addition, equations for translating uncertainties in experimental NMR input data into errors in calculated continuous probability distributions of rotamers are presented. The dependence of errors on various features of the distributions has been studied systematically from simulations. The results show that, typically, the confidence intervals are ±30-40° for dihedral angles and ±0.2 for rotamer populations. For ?1rotamers of amino acids, the analysis is most sensitive to the uncertainties in C?-H?couplings. A critical reexamination of the use of Gaussian functions to reconstruct a probability distribution is presented. 
In particular, the simplifying assumption of identical widths for all Gaussian probability peaks has been justified by showing that it does not lead to large errors in other CUPID parameters. Finally, the angular dependences of cross-relaxation rates, their uncertainties, and the potential for their use in studying χ1 internal rotations in amino acids are discussed.
Data intensive high energy physics analysis in a distributed cloud
NASA Astrophysics Data System (ADS)
Charbonneau, A.; Agarwal, A.; Anderson, M.; Armstrong, P.; Fransham, K.; Gable, I.; Harris, D.; Impey, R.; Leavett-Brown, C.; Paterson, M.; Podaima, W.; Sobie, R. J.; Vliet, M.
2012-02-01
We show that distributed Infrastructure-as-a-Service (IaaS) compute clouds can be effectively used for the analysis of high energy physics data. We have designed a distributed cloud system that works with any application using large input data sets requiring a high throughput computing environment. The system uses IaaS-enabled science and commercial clusters in Canada and the United States. We describe the process in which a user prepares an analysis virtual machine (VM) and submits batch jobs to a central scheduler. The system boots the user-specific VM on one of the IaaS clouds, runs the jobs and returns the output to the user. The user application accesses a central database for calibration data during the execution of the application. Similarly, the data is located in a central location and streamed by the running application. The system can easily run one hundred simultaneous jobs in an efficient manner and should scale to many hundreds and possibly thousands of user jobs.
Distributed Data Analysis in the ATLAS Experiment: Challenges and Solutions
NASA Astrophysics Data System (ADS)
Elmsheuser, Johannes; van der Ster, Daniel
2012-12-01
The ATLAS experiment at the LHC at CERN is recording and simulating several tens of petabytes of data per year. To analyse these data the ATLAS experiment has developed and operates a mature and stable distributed analysis (DA) service on the Worldwide LHC Computing Grid. The service is actively used: more than 1400 users have submitted jobs in the year 2011 and more than 1 million jobs run every week. Users are provided with a suite of tools to submit Athena, ROOT or generic jobs to the Grid, and the PanDA workload management system is responsible for their execution. The reliability of the DA service is high and steadily improving; Grid sites are continually validated against a set of standard tests, and a dedicated team of expert shifters provides user support and communicates user problems to the sites. This paper will review the state of the DA tools and services, summarize the past year of distributed analysis activity, and present the directions for future improvements to the system.
Prediction of the Inert Strength Distribution of Si3N4 Diesel Valves
Andrews, M.J.; Breder, K.; Wereszczak, A.A.
1999-01-25
Censored Weibull strength distributions were generated with NT551 silicon nitride four-point flexure data using the ASTM C1161-B and 5.0 mm diameter cylindrical specimens. Utilizing finite element models and AlliedSignal's life prediction codes, the inert or fast-fracture strength failure probability of a ceramic diesel valve was estimated from these data sets. The failure probability predictions derived from each data set were found to be more conservative than the valve strength data. Fractographic analysis of the test specimens and valves showed that the cylindrical specimens failed from a different flaw population than the prismatic flexure bars and the valves. The study emphasizes the prerequisite of having coincident flaw populations homogeneously distributed in both the test specimen and the ceramic component. Lastly, it suggests that unless material homogeneity exists, any meaningful life prediction or reliability analysis of a component may not be possible.
Evaluation of Distribution Analysis Software for DER Applications
Staunton, RH
2003-01-23
The term ''Distributed energy resources'' or DER refers to a variety of compact, mostly self-contained power-generating technologies that can be combined with energy management and storage systems and used to improve the operation of the electricity distribution system, whether or not those technologies are connected to an electricity grid. Implementing DER can be as simple as installing a small electric generator to provide backup power at an electricity consumer's site. Or it can be a more complex system, highly integrated with the electricity grid and consisting of electricity generation, energy storage, and power management systems. DER devices provide opportunities for greater local control of electricity delivery and consumption. They also enable more efficient utilization of waste heat in combined cooling, heating and power (CHP) applications--boosting efficiency and lowering emissions. CHP systems can provide electricity, heat and hot water for industrial processes, space heating and cooling, refrigeration, and humidity control to improve indoor air quality. DER technologies are playing an increasingly important role in the nation's energy portfolio. They can be used to meet base load power, peaking power, backup power, remote power, power quality, as well as cooling and heating needs. DER systems, ranging in size and capacity from a few kilowatts up to 50 MW, can include a number of technologies (e.g., supply-side and demand-side) that can be located at or near the location where the energy is used. Information pertaining to DER technologies, application solutions, successful installations, etc., can be found at the U.S. Department of Energy's DER Internet site [1]. Market forces in the restructured electricity markets are making DER, both more common and more active in the distribution systems throughout the US [2]. If DER devices can be made even more competitive with central generation sources this trend will become unstoppable. 
In response, energy providers will be forced to both fully acknowledge the trend and plan for accommodating DER [3]. With bureaucratic barriers [4], lack of time/resources, tariffs, etc. still seen in certain regions of the country, changes still need to be made. Given continued technical advances in DER, the time is fast approaching when the industry, nation-wide, must not only accept DER freely but also provide or review in-depth technical assessments of how DER should be integrated into and managed throughout the distribution system. Characterization studies are needed to fully understand how both the utility system and DER devices themselves will respond to all reasonable events (e.g., grid disturbances, faults, rapid growth, diverse and multiple DER systems, large reactive loads). Some of this work has already begun as it relates to operation and control of DER [5] and microturbine performance characterization [6,7]. One of the most urgently needed tools that can provide these types of analyses is a distribution network analysis program in combination with models for various DER. Together, they can be used for (1) analyzing DER placement in distribution networks and (2) helping to ensure that adequate transmission reliability is maintained. Surveys of the market show products that represent a partial match to these needs; specifically, software that has been developed to plan electrical distribution systems and analyze reliability (in a near total absence of DER). The first part of this study (Sections 2 and 3 of the report) looks at a number of these software programs and provides both summary descriptions and comparisons. The second part of this study (Section 4 of the report) considers the suitability of these analysis tools for DER studies. It considers steady state modeling and assessment work performed by ORNL using one commercially available tool on feeder data provided by a southern utility. 
Appendix A provides a technical report on the results of this modeling effort.
Phylogenetic analysis on the soil bacteria distributed in karst forest.
Zhou, Junpei; Huang, Ying; Mo, Minghe
2009-10-01
Phylogenetic composition of the bacterial community in soil of a karst forest was analyzed by a culture-independent molecular approach. The bacterial 16S rRNA gene was amplified directly from soil DNA and cloned to generate a library. After screening the clone library by RFLP, 16S rRNA genes of representative clones were sequenced and the bacterial community was analyzed phylogenetically. The 16S rRNA gene inserts of 190 randomly selected clones were analyzed by RFLP and generated 126 different RFLP types. After sequencing, 126 non-chimeric sequences were obtained, generating 113 phylotypes. Phylogenetic analysis revealed that the bacteria distributed in soil of the karst forest included members assigned to Proteobacteria, Acidobacteria, Planctomycetes, Chloroflexi (green nonsulfur bacteria), Bacteroidetes, Verrucomicrobia, Nitrospirae, Actinobacteria (high G+C Gram-positive bacteria), Firmicutes (low G+C Gram-positive bacteria) and candidate divisions (including SPAM and GN08). PMID:24031430
Tissue analysis of N-methylformamide: organ distribution.
Matook, G M; Spremulli, E N; Birmingham, B K; Calabresi, P; Griffiths, W C
1984-01-01
This report describes a gas chromatographic procedure, utilizing a packed column and flame ionization detector, suitable for the quantitative measurement of N-methylformamide (N-MF) in tissue samples. N-MF is a polar solvent that induces the maturation of cancer cells in vitro and, in vivo, exhibits antitumor activity with human tumors xenografted in nude (athymic) mice. Therapeutic monitoring is essential as toxicology studies have shown this compound to be hepatotoxic. N-MF is currently undergoing phase 1 clinical trials as an anticancer drug. This method of tissue analysis was developed to aid in the understanding of N-MF disposition and distribution in a murine model. The data thus generated may help predict the clinical behavior of this drug. PMID:6738004
A meta-analysis of parton distribution functions
NASA Astrophysics Data System (ADS)
Gao, Jun; Nadolsky, Pavel
2014-07-01
A "meta-analysis" is a method for comparison and combination of nonperturbative parton distribution functions (PDFs) in a nucleon obtained with heterogeneous procedures and assumptions. Each input parton distribution set is converted into a "meta-parametrization" based on a common functional form. By analyzing parameters of the meta-parametrizations from all input PDF ensembles, a combined PDF ensemble can be produced that has a smaller total number of PDF member sets than the original ensembles. The meta-parametrizations simplify the computation of the PDF uncertainty in theoretical predictions and provide an alternative to the 2010 PDF4LHC convention for combination of PDF uncertainties. As a practical example, we construct a META ensemble for computation of QCD observables at the Large Hadron Collider using the next-to-next-to-leading order PDF sets from CTEQ, MSTW, and NNPDF groups as the input. The META ensemble includes a central set that reproduces the average of LHC predictions based on the three input PDF ensembles and Hessian eigenvector sets for computing the combined PDF+αs uncertainty at a common QCD coupling strength of 0.118.
Silk Fiber Mechanics from Multiscale Force Distribution Analysis
Cetinkaya, Murat; Xiao, Senbo; Markert, Bernd; Stacklies, Wolfram; Gräter, Frauke
2011-01-01
Here we decipher the molecular determinants for the extreme toughness of spider silk fibers. Our bottom-up computational approach incorporates molecular dynamics and finite element simulations. Therefore, the approach allows the analysis of the internal strain distribution and load-carrying motifs in silk fibers on scales of both molecular and continuum mechanics. We thereby dissect the contributions from the nanoscale building blocks, the soft amorphous and the strong crystalline subunits, to silk fiber mechanics. We identify the amorphous subunits not only to give rise to high elasticity, but to also ensure efficient stress homogenization through the friction between entangled chains, which also allows the crystals to withstand stresses as high as 2 GPa in the context of the amorphous matrix. We show that the maximal toughness of silk is achieved at 10–40% crystallinity depending on the distribution of crystals in the fiber. We also determined a serial arrangement of the crystalline and amorphous subunits in lamellae to outperform a random or a parallel arrangement, putting forward what we believe to be a new structural model for silk and other semicrystalline materials. The multiscale approach, not requiring any empirical parameters, is applicable to other partially ordered polymeric systems. Hence, it is an efficient tool for the design of artificial silk fibers. PMID:21354403
NASA Astrophysics Data System (ADS)
Alves, Nelson A.; Morero, Lucas D.; Rizzi, Leandro G.
2015-06-01
Microcanonical thermostatistics analysis has become an important tool to reveal essential aspects of phase transitions in complex systems. An efficient way to estimate the microcanonical inverse temperature β(E) and the microcanonical entropy S(E) is achieved with the statistical temperature weighted histogram analysis method (ST-WHAM). The strength of this method lies in its flexibility, as it can be used to analyse data produced by algorithms with generalised sampling weights. However, for any sampling weight, ST-WHAM requires the calculation of derivatives of energy histograms H(E), which leads to non-trivial and tedious binning tasks for models with a continuous energy spectrum such as those for biomolecular and colloidal systems. Here, we discuss two alternative methods that avoid the need for such energy binning to obtain continuous estimates for H(E) in order to evaluate β(E) by using ST-WHAM: (i) a series expansion to estimate probability densities from the empirical cumulative distribution function (CDF), and (ii) a Bayesian approach to model this CDF. Comparison with a simple linear regression method is also carried out. The performance of these approaches is evaluated considering coarse-grained protein models for folding and peptide aggregation.
An Open Architecture for Distributed Malware Collection and Analysis
NASA Astrophysics Data System (ADS)
Cavalca, Davide; Goldoni, Emanuele
Honeynets have become an important tool for researchers and network operators. However, the lack of a unified honeynet data model has impeded their effectiveness, resulting in multiple unrelated data sources, each with its own proprietary access method and format. Moreover, the deployment and management of a honeynet is a time-consuming activity and the interpretation of collected data is far from trivial. HIVE (Honeynet Infrastructure in Virtualized Environment) is a novel highly scalable automated data collection and analysis architecture we designed. Our infrastructure is based on top of proven FLOSS (Free, Libre and Open Source) solutions, which have been extended and integrated with new tools we developed. We use virtualization to ease honeypot management and deployment, combining both high-interaction and low-interaction sensors in a common infrastructure. We also address the need for rapid comprehension and detailed data analysis by harnessing the power of a relational database system, which provides centralized storage and access to the collected data while ensuring its constant integrity. This chapter presents our malware data collection architecture, offering some insight in the structure and benefits of a distributed virtualized honeynet and its development. Finally, we present some techniques for the active monitoring of centralized botnets we integrated in HIVE, which allow us to track the menaces evolution and timely deploy effective countermeasures.
On the Empirical-Statistical Modeling of SAR Images With Generalized Gamma Distribution
Heng-Chao Li; Wen Hong; Yi-Rong Wu; Ping-Zhi Fan
2011-01-01
In this paper, an efficient statistical model, called the generalized Gamma distribution (GΓD), for the empirical modeling of synthetic aperture radar (SAR) images is proposed. The GΓD forms a large variety of alternative distributions (especially including the Rayleigh, exponential, Nakagami, Gamma, Weibull, and log-normal distributions commonly used for the probability density function (pdf) of SAR images as special cases), and
Comparative analysis of aerosols elemental distribution in some Romanian regions
NASA Astrophysics Data System (ADS)
Amemiya, Susumu; Masuda, Toshio; Popa-Simil, Liviu; Mateescu, Liviu
1996-04-01
The study's main aim is obtaining the elemental distribution of aerosol particulates and mapping it for some Romanian regions, in order to obtain preliminary information regarding the concentrations of aerosol particles and networking strategy versus local conditions. For this we used a mobile sampling strategy, taking care of all local specific conditions and weather. In July 1993 we took about 8 samples over a rather large territory of SE Romania, which were analysed and mapped. The regions which showed interesting or doubtful behaviour, such as Bucharest and Dobrogea, were zoomed in on near the same period of 1994, to compare the new details with the global picture previously obtained. An attempt was made to infer the minimum necessary number of stations in a future monitoring network. A mobile sampler was used, having two polycarbonate filter posts of 8 and 0.4 μm. PIXE elemental analysis was performed on a 2.5 MV Van de Graaff accelerator using a proton beam. More than 15 elements were measured. Suggestive 2D and 3D representations were drawn, as well as histogram charts for the concentrations' distribution in the specific regions at the specified times. In spite of the qualitatively poor samples, the experiment surprised us by its good agreement with conditions in the terrain long known by other means, and highlighted the power of PIXE methods in terms of money and time. Conclusions were drawn about the link between industry, traffic, vegetation, weather, surface waters, soil composition, power plant exhaust and so on, on the one hand, and surface concentration distribution, on the other. But the method's weak points were also highlighted; these are weather dependencies (especially air-mass movement and precipitation), local relief, microclimate and vegetation, and of course the localisation of the sampling point relative to the pollution sources and their regime.
The paper contains a synthesis of the whole of the maps and graphs we made, intended in its turn to demonstrate the necessity of a national integrated network for monitoring aerosols.
Stability Analysis of Distributed Order Fractional Chen System
Aminikhah, H.; Refahi Sheikhani, A.; Rezazadeh, H.
2013-01-01
We first investigate sufficient and necessary conditions of stability of nonlinear distributed order fractional system and then we generalize the integer-order Chen system into the distributed order fractional domain. Based on the asymptotic stability theory of nonlinear distributed order fractional systems, the stability of distributed order fractional Chen system is discussed. In addition, we have found that chaos exists in the double fractional order Chen system. Numerical solutions are used to verify the analytical results. PMID:24489508
A mathematical analysis of the DCT coefficient distributions for images
Edmund Y. Lam; Joseph W. Goodman
2000-01-01
Over the past two decades, there have been various studies on the distributions of the DCT coefficients for images. However, they have concentrated only on fitting the empirical data from some standard pictures with a variety of well-known statistical distributions, and then comparing their goodness of fit. The Laplacian distribution is the dominant choice balancing simplicity of the model and
Statistical structural analysis of rotor impact ice shedding
NASA Technical Reports Server (NTRS)
Kellacky, C. J.; Chu, M. L.; Scavuzzo, R. J.
1991-01-01
The statistical characteristics of impact ice shear strength are analyzed, with emphasis placed on the most probable shear strength and statistical distribution of an ice deposit. Several distribution types are considered: the Weibull, two-parameter Weibull, and exponential distributions, as well as the Gumbel distribution of the smallest extreme and the Gumbel distribution of the largest extreme. It is concluded that the Weibull distribution yields the best results; however, the expected life, shape parameter, and scale parameter should be determined separately for each case of varying wind speed and droplet size. The theoretical predictions of shear stresses in a specific rotating ice shape are compared, and it is noted that when the effects of lift are added to the theoretical model and the interference is calculated with a new mean and standard deviation, the probability of ice shed is computed as 36.64 percent.
A distributed analysis of Human impact on global sediment dynamics
NASA Astrophysics Data System (ADS)
Cohen, S.; Kettner, A.; Syvitski, J. P.
2012-12-01
Understanding riverine sediment dynamics is an important undertaking both for socially relevant issues such as agriculture, water security and infrastructure management and for scientific analysis of landscapes, river ecology, oceanography and other disciplines. Providing good quantitative and predictive tools is therefore timely, particularly in light of predicted climate and land-use changes. Ever-increasing human activity during the Anthropocene has affected sediment dynamics in two major ways: (1) an increase in hillslope erosion due to agriculture, deforestation and landscape engineering, and (2) trapping of sediment in dams and other man-made reservoirs. The intensity and dynamics of these man-made factors vary widely across the globe and in time and are therefore hard to predict, so the use of sophisticated numerical models is warranted. Here we use a distributed global riverine sediment flux and water discharge model (WBMsed) to compare a pristine (without human input) and a disturbed (with human input) simulation. Using these 50 year simulations we will show and discuss the complex spatial and temporal patterns of human effects on riverine sediment flux and water discharge.
Novel physical interpretations of K-distributed reverberation
Douglas A. Abraham; Anthony P. Lyons
2002-01-01
Interest in describing and modeling envelope distributions of sea-floor backscatter has increased recently, particularly with regard to high-resolution active sonar systems. Sea-floor scattering that results in heavy-tailed matched-filter envelope probability distribution functions (i.e., non-Rayleigh distributions exemplified by the K, Weibull, Rayleigh mixture, or log-normal distributions) is often the limiting factor in the performance of these types of sonar systems and in this
Statistical distribution of mechanical properties for three graphite-epoxy material systems
NASA Technical Reports Server (NTRS)
Reese, C.; Sorem, J., Jr.
1981-01-01
Graphite-epoxy composites are playing an increasing role as viable alternative materials in structural applications, necessitating thorough investigation into the predictability and reproducibility of their material strength properties. This investigation was concerned with tension, compression, and short-beam shear coupon testing of large samples from three different material suppliers to determine their statistical strength behavior. Statistical results indicate that a two-parameter Weibull distribution model provides better overall characterization of material behavior for the graphite-epoxy systems tested than does the standard Normal distribution model that is employed for most design work. While either a Weibull or Normal distribution model provides adequate predictions for average strength values, the Weibull model provides better characterization in the lower tail region, where the predictions are of maximum design interest. Two sets of the same material were found to have essentially the same material properties, indicating that repeatability can be achieved.
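The two-parameter Weibull characterization described above can be sketched numerically. The following is a minimal illustration (not the authors' code) of fitting the shape (Weibull modulus) and scale parameters to strength data by median-rank regression, using synthetic data in place of the coupon measurements:

```python
import numpy as np

def weibull_fit_median_rank(strengths):
    """Fit a two-parameter Weibull distribution by median-rank regression.

    Linearizes F(x) = 1 - exp(-(x/b)**c) as
    ln(-ln(1 - F)) = c*ln(x) - c*ln(b),
    using Bernard's approximation F_i = (i - 0.3)/(n + 0.4) for the ranks.
    Returns (c, b): shape (Weibull modulus) and scale.
    """
    x = np.sort(np.asarray(strengths, dtype=float))
    n = len(x)
    F = (np.arange(1, n + 1) - 0.3) / (n + 0.4)  # median-rank estimates
    slope, intercept = np.polyfit(np.log(x), np.log(-np.log(1.0 - F)), 1)
    c = slope                     # shape (Weibull modulus)
    b = np.exp(-intercept / c)    # scale
    return c, b

# Synthetic strength data standing in for the coupon tests (hypothetical)
rng = np.random.default_rng(0)
data = 500.0 * rng.weibull(12.0, size=200)   # true shape 12, scale 500
c_hat, b_hat = weibull_fit_median_rank(data)
```

A high fitted modulus, as here, corresponds to the narrow lower tail that makes the Weibull model attractive for design work.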
Application of extreme learning machine for estimation of wind speed distribution
NASA Astrophysics Data System (ADS)
Shamshirband, Shahaboddin; Mohammadi, Kasra; Tong, Chong Wen; Petković, Dalibor; Porcu, Emilio; Mostafaeipour, Ali; Ch, Sudheer; Sedaghat, Ahmad
2015-06-01
The knowledge of the probabilistic wind speed distribution is of particular significance for reliable evaluation of the wind energy potential and effective adoption of site-specific wind turbines. Among all proposed probability density functions, the two-parameter Weibull function has been extensively endorsed and utilized to model wind speeds and express wind speed distributions in various locations. In this research work, an extreme learning machine (ELM) is employed to compute the shape (k) and scale (c) factors of the Weibull distribution function. The developed ELM model is trained and tested based upon two widely successful methods used to estimate the k and c parameters. The efficiency and accuracy of ELM are compared against support vector machine, artificial neural network and genetic programming for estimating the same Weibull parameters. The results reveal that the ELM approach attains higher precision in estimating both Weibull parameters than the other methods evaluated. Mean absolute percentage error, mean absolute bias error and root mean square error for k are 8.4600 %, 0.1783 and 0.2371, while for c they are 0.2143 %, 0.0118 and 0.0192 m/s, respectively. In conclusion, ELM appears particularly promising as an alternative method to estimate the Weibull k and c factors.
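As a rough illustration of the kind of conventional estimator the ELM model is trained against, the following sketch computes Weibull k and c from wind-speed data with the empirical (moment) method, one widely used approach; the data here are synthetic, not from the study:

```python
import numpy as np
from math import gamma

def weibull_k_c_empirical(speeds):
    """Estimate Weibull shape k and scale c (m/s) from wind speeds using
    the empirical/moment method: k = (sigma/mean)**(-1.086), valid for
    roughly 1 <= k <= 10, then c = mean / Gamma(1 + 1/k)."""
    v = np.asarray(speeds, dtype=float)
    mean, sigma = v.mean(), v.std(ddof=1)
    k = (sigma / mean) ** (-1.086)
    c = mean / gamma(1.0 + 1.0 / k)
    return k, c

# Synthetic wind record (hypothetical): true k = 2, c = 7 m/s
rng = np.random.default_rng(1)
wind = 7.0 * rng.weibull(2.0, size=5000)
k_hat, c_hat = weibull_k_c_empirical(wind)
```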
Plotting formula for pearson type III distribution considering historical information.
Van Nguyen, T V; In-Na, N
1992-12-01
Most of the existing plotting position formulas have been proposed for use in the analysis of systematic flood records, but little has been reported on plotting formulas for historical or non-systematic flood samples. In particular, no previous investigations have specifically examined probability plots for the Pearson type III (P3) distribution in the analysis of historical flood information. The present paper suggests a new plotting position formula for the P3 distribution for use with both systematic and historical flood records. The proposed formula has a simple structure, as do most existing formulas, but it is more flexible because it can explicitly take into account the skewness coefficient of the underlying distribution. Further, results of graphical and numerical comparisons have demonstrated that the suggested formula provides the least bias in flood quantile estimation as compared with many available plotting formulas, including the well-known Weibull formula. Finally, results of a numerical example using actual flood data have indicated the practical convenience of the proposed plotting formula. It can be concluded that the formula developed in this study is the most appropriate for the P3 distribution in the analysis of flood records considering historical information. PMID:24227096
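Plotting position formulas of the family referred to above can be written generically as p_i = (i - a)/(n + 1 - 2a), with the Weibull formula as the special case a = 0. The paper's skew-dependent P3 formula is not reproduced here; this is only a minimal sketch of the generic family:

```python
import numpy as np

def plotting_positions(n, a=0.0):
    """Generic plotting-position formula p_i = (i - a) / (n + 1 - 2a).

    a = 0.0 reproduces the classical Weibull formula i / (n + 1);
    a = 0.4 gives Cunnane's compromise formula. (The skew-dependent
    formula proposed in the paper is not reproduced here.)
    """
    i = np.arange(1, n + 1)
    return (i - a) / (n + 1 - 2 * a)

p_weibull = plotting_positions(9)          # 0.1, 0.2, ..., 0.9
p_cunnane = plotting_positions(9, a=0.4)   # shifted relative to Weibull
```

Plotting the ranked flood magnitudes against these probabilities on probability paper for the assumed distribution is what the formulas are used for in practice.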
Frequency distribution histograms for the rapid analysis of data
NASA Technical Reports Server (NTRS)
Burke, P. V.; Bullen, B. L.; Poff, K. L.
1988-01-01
The mean and standard error are good representations for the response of a population to an experimental parameter and are frequently used for this purpose. Frequency distribution histograms show, in addition, responses of individuals in the population. Both the statistics and a visual display of the distribution of the responses can be obtained easily using a microcomputer and available programs. The type of distribution shown by the histogram may suggest different mechanisms to be tested.
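The statistics and frequency-distribution histogram described above are straightforward to reproduce with any modern numerical package; here is a minimal sketch using hypothetical response data (the bin counts could be displayed by any plotting program):

```python
import numpy as np

# Hypothetical responses of 120 individuals to an experimental parameter
rng = np.random.default_rng(2)
responses = rng.normal(loc=50.0, scale=8.0, size=120)

mean = responses.mean()
sem = responses.std(ddof=1) / np.sqrt(len(responses))  # standard error of the mean

# Frequency-distribution histogram: counts of individuals per response bin
counts, edges = np.histogram(responses, bins=12)
```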
Causality and sensitivity analysis in distributed design simulation
Kim, Jaehyun, 1970-
2002-01-01
Numerous collaborative design frameworks have been developed to accelerate the product development, and recently environments for building distributed simulations have been proposed. For example, a simulation framework ...
Poli, Riccardo
sampling distribution of the PSO. In this paper we introduce a novel method, which allows one to exactly determine all the characteristics of a PSO's sampling distribution and explain how they change over any number of generations. We apply the analysis to the PSO with inertia weight, but the analysis is also valid for the PSO with constriction
ERIC Educational Resources Information Center
Hayton, James C.
2009-01-01
In the article "Exploring the Sensitivity of Horn's Parallel Analysis to the Distributional Form of Random Data," Dinno (this issue) provides strong evidence that the distribution of random data does not have a significant influence on the outcome of the analysis. Hayton appreciates the thorough approach to evaluating this assumption, and agrees…
SNDS: A Distributed Monitoring and Protocol Analysis System for Wireless Sensor Network
Xin Kuang; Jianhua Shen
2010-01-01
Monitoring a large-scale wireless sensor network (WSN) is very difficult, not only because it is large and complex, but also because of the lack of visual analysis tools. This paper describes SNDS (Sensor Network Distributed Sniffer), a distributed monitoring and protocol analysis system for large and complex sensor networks. SNDS is based on the use of sniffers co-deployed with
Distributions of Autocorrelated First-Order Kinetic Outcomes: Illness Severity
Englehardt, James D.
2015-01-01
Many complex systems produce outcomes having recurring, power law-like distributions over wide ranges. However, the form necessarily breaks down at extremes, whereas the Weibull distribution has been demonstrated over the full observed range. Here the Weibull distribution is derived as the asymptotic distribution of generalized first-order kinetic processes, with convergence driven by autocorrelation, and entropy maximization subject to finite positive mean, of the incremental compounding rates. Process increments represent multiplicative causes. In particular, illness severities are modeled as such, occurring in proportion to products of, e.g., chronic toxicant fractions passed by organs along a pathway, or rates of interacting oncogenic mutations. The Weibull form is also argued theoretically and by simulation to be robust to the onset of saturation kinetics. The Weibull exponential parameter is shown to indicate the number and widths of the first-order compounding increments, the extent of rate autocorrelation, and the degree to which process increments are exponentially distributed. In contrast with the Gaussian result in linear independent systems, the form is driven not by independence and multiplicity of process increments, but by increment autocorrelation and entropy. In some physical systems the form may be attracting, due to multiplicative evolution of outcome magnitudes towards extreme values potentially much larger and smaller than control mechanisms can contain. The Weibull distribution is demonstrated in preference to the lognormal and Pareto I for illness severities versus (a) toxicokinetic models, (b) biologically-based network models, (c) scholastic and psychological test score data for children with prenatal mercury exposure, and (d) time-to-tumor data of the ED01 study. PMID:26061263
Bivariate extreme value distributions
NASA Technical Reports Server (NTRS)
Elshamy, M.
1992-01-01
In certain engineering applications, such as those occurring in the analyses of ascent structural loads for the Space Transportation System (STS), some of the load variables have a lower bound of zero. Thus, the need for practical models of bivariate extreme value probability distribution functions with lower limits was identified. We discuss the Gumbel models and present practical forms of bivariate extreme probability distributions of Weibull and Frechet types with two parameters. Bivariate extreme value probability distribution functions can be expressed in terms of the marginal extremal distributions and a 'dependence' function subject to certain analytical conditions. Properties of such bivariate extreme distributions, sums and differences of paired extremals, as well as the corresponding forms of conditional distributions, are discussed. Practical estimation techniques are also given.
Analysis of Parallel Downloading for Large File Distribution
Simon G. M. Koo; Catherine Rosenberg; Dongyan Xu
2003-01-01
Recently, the scheme of parallel downloading (PD) has been adopted by a number of Internet file downloading applications. With the wide deployment of content distribution networks and peer-to-peer networks, PD is expected to be more commonly used for file distribution. There have been experiments showing that PD results in higher aggregated downloading throughput and therefore shorter downloading time experienced
HIERARCHICAL ANALYSIS OF SPECIES DISTRIBUTIONS AND ABUNDANCE ACROSS ENVIRONMENTAL GRADIENTS
Jeffrey M. Diez; H. Ronald Pulliam
2007-01-01
Abiotic and biotic processes operate at multiple spatial and temporal scales to shape many ecological processes, including species distributions and demography. Current debate about the relative roles of niche-based and stochastic processes in shaping species distributions and community composition reflects, in part, the challenge of understanding how these processes interact across scales. Traditional statistical models that ignore autocorrelation and spatial
Analysis of vegetation distribution in Interior Alaska and sensitivity to
McGuire, A. David
T. Scott Rupp; Herman H. Shugart
The study examines drivers of the distribution of four major vegetation types in Interior Alaska (tundra, deciduous forest, black spruce forest, and white spruce forest) and their sensitivity to climate change, motivated by greenhouse gases released by human activities.
A global analysis of root distributions for terrestrial biomes
R. B. Jackson; J. Canadell; J. R. Ehleringer; H. A. Mooney; O. E. Sala; E. D. Schulze
1996-01-01
Understanding and predicting ecosystem functioning (e.g., carbon and water fluxes) and the role of soils in carbon storage requires an accurate assessment of plant rooting distributions. Here, in a comprehensive literature synthesis, we analyze rooting patterns for terrestrial biomes and compare distributions for various plant functional groups. We compiled a database of 250 root studies, subdividing suitable results into 11
Analysis of paging in distributed architectures for 4G systems
Rajeev Agrawal; Anand S. Bedekar; Suresh Kalyanasundaram
2007-01-01
As 3G wireless systems evolve towards 4G, various wireless network technology organizations looking at network architectures for 4G are considering a redesign of the network away from the traditional centralized, hierarchical design towards a more distributed operation of network functions. In this paper, we present some mechanisms for distributed operation of paging in 4G systems. Our focus is on highlighting
Structural Vulnerability Analysis of Electric Power Distribution Grids
Koc, Yakup; Warnier, Martijn; Kumar, Tarun
2015-01-01
Power grid outages cause huge economic and societal costs. Disruptions in the power distribution grid are responsible for a significant fraction of electric power unavailability to customers. Extreme weather conditions, continuously increasing demand, and the over-ageing of assets in the grid will further deteriorate the safety of electric power delivery in the near future. It is this dependence on electric power that necessitates further research into power distribution grid security assessment. Thus measures to analyze the robustness characteristics and to identify vulnerabilities as they exist in the grid are of utmost importance. This research investigates exactly those concepts: the vulnerability and robustness of power distribution grids from a topological point of view, and proposes a metric to quantify them with respect to assets in a distribution grid. Real-world data is used to demonstrate the applicability of the proposed metric as a tool to assess the criticality of assets in a distribution...
Dynamic load Variation and Stability Analysis in Distribution Networks with Distributed Generators
Pota, Himanshu Roy
Keywords: voltage profile, network reconfigurations. I. INTRODUCTION: Power systems are traditionally designed to transport power from generation units through transmission and distribution networks to consumers; renewable energy sources (small hydro, modern biomass, wind, solar, geothermal, and biofuels) show annual growth
Distributed processing and analysis of ATLAS experimental data
Barberis, D; The ATLAS collaboration
2011-01-01
The ATLAS experiment has been taking data steadily since Autumn 2009, collecting close to 1 fb-1 of data (several petabytes of raw and reconstructed data per year of data-taking). Data are calibrated, reconstructed, distributed and analysed at over 100 different sites using the World-wide LHC Computing Grid and the tools produced by the ATLAS Distributed Computing project. In addition to event data, ATLAS produces a wealth of information on detector status, luminosity, calibrations, alignments, and data processing conditions. This information is stored in relational databases, online and offline, and made transparently available to analysers of ATLAS data world-wide through an infrastructure consisting of distributed database replicas and web servers that exploit caching technologies. This paper reports on the experience of using this distributed computing infrastructure with real data and in real time, on the evolution of the computing model driven by this experience, and on the system performance during the first...
Distributed processing and analysis of ATLAS experimental data
Barberis, D; The ATLAS collaboration
2011-01-01
The ATLAS experiment has been taking data steadily since Autumn 2009, and has so far collected over 5 fb-1 of data (several petabytes of raw and reconstructed data per year of data-taking). Data are calibrated, reconstructed, distributed and analysed at over 100 different sites using the World-wide LHC Computing Grid and the tools produced by the ATLAS Distributed Computing project. In addition to event data, ATLAS produces a wealth of information on detector status, luminosity, calibrations, alignments, and data processing conditions. This information is stored in relational databases, online and offline, and made transparently available to analysers of ATLAS data world-wide through an infrastructure consisting of distributed database replicas and web servers that exploit caching technologies. This paper reports on the experience of using this distributed computing infrastructure with real data and in real time, on the evolution of the computing model driven by this experience, and on the system performance during the...
Numerical Analysis of a Cold Air Distribution System
Zhu, L.; Li, R.; Yuan, D.
2006-01-01
Cold air distribution systems may reduce the operating energy consumption of air-conditioning supply systems and improve outside air volume percentages and indoor air quality. However, indoor temperature patterns and velocity fields are easily...
Determination analysis of energy conservation standards for distribution transformers
Barnes, P.R.; Van Dyke, J.W.; McConnell, B.W.; Das, S.
1996-07-01
This report contains information for the US DOE to use in making a determination on proposing energy conservation standards for distribution transformers as required by the Energy Policy Act of 1992. The potential for saving energy with more efficient liquid-immersed and dry-type distribution transformers could be significant because these transformers account for an estimated 140 billion kWh of the annual energy lost in the delivery of electricity. The objective was to determine whether energy conservation standards for distribution transformers would have the potential for significant energy savings, be technically feasible, and be economically justified from a national perspective. It was found that energy conservation standards for distribution transformers would be technically and economically feasible. Based on the energy conservation options analyzed, 3.6-13.7 quads of energy could be saved from 2000 to 2030.
Economic Analysis of Trickle Distribution System Texas High Plains.
Osborn, James E.; Young, Alan M.; Wilke, Otto C.; Wendt, Charles
1977-01-01
Contents: Economic Analysis; Cost-Return Budgets; Break-Even Analysis; Results; Sorghum; Break-Even Prices Per Unit of Output; Solid Cotton; Double...
Analysis of Fermi gamma-ray burst duration distribution
Mariusz Tarnopolski
2015-07-07
Two classes of GRBs, short and long, have been determined without any doubt, and are usually ascribed to different physical scenarios. A third class, intermediate in $T_{90}$ durations, has been reported to be present in the datasets of BATSE, Swift, RHESSI and possibly BeppoSAX. The latest release of $>1500$ GRBs observed by Fermi gives an opportunity to further investigate the duration distribution. The aim of this paper is to investigate whether a third class is present in the $\log T_{90}$ distribution, or whether it is described by a bimodal distribution. A standard $\chi^2$ fitting of a mixture of Gaussians is applied to 25 histograms with different binnings. Different binnings give various values of the fitting parameters, as well as the shape of the fitted curve. Among five statistically significant fits none is trimodal. Locations of the Gaussian components are in agreement with previous works. However, a trimodal distribution, understood in the sense of having three separate peaks, is not found for any binning. It is concluded that the duration distribution in Fermi data is well described by a mixture of three log-normal distributions, but it is intrinsically bimodal, hence no third class is present in the $T_{90}$ data of Fermi. It is suggested that the log-normal fit may not be an adequate model.
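The mixture-of-Gaussians histogram fit described above can be sketched with a least-squares fit to a histogram of log10(T90). The durations below are synthetic stand-ins with assumed means, widths, and counts, not Fermi data:

```python
import numpy as np
from scipy.optimize import curve_fit

# Hedged sketch: fit a two-component Gaussian mixture to a log10(T90)
# histogram by least squares. All numbers are illustrative assumptions.
rng = np.random.default_rng(1)
log_t90 = np.concatenate([rng.normal(-0.3, 0.45, 400),    # "short" bursts
                          rng.normal(1.4, 0.45, 1000)])   # "long" bursts

def two_gauss(x, w1, m1, s1, w2, m2, s2):
    g = lambda m, s: np.exp(-0.5 * ((x - m) / s) ** 2) / (s * np.sqrt(2 * np.pi))
    return w1 * g(m1, s1) + w2 * g(m2, s2)

heights, edges = np.histogram(log_t90, bins=25, density=True)
centers = 0.5 * (edges[:-1] + edges[1:])
popt, _ = curve_fit(two_gauss, centers, heights,
                    p0=[0.3, -0.5, 0.5, 0.7, 1.5, 0.5])  # rough initial guesses
means = sorted([popt[1], popt[4]])
print(f"fitted component means: {means[0]:.2f}, {means[1]:.2f}")
```

As the abstract notes, both the number of components (two vs. three) and the choice of binning affect the fitted parameters in this kind of analysis.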
Analysis of Fermi gamma-ray burst duration distribution
NASA Astrophysics Data System (ADS)
Tarnopolski, M.
2015-09-01
Context. Two classes of gamma-ray bursts (GRBs), short and long, have been determined without any doubt, and are usually ascribed to different physical scenarios. A third class, intermediate in T90 durations, has been reported in the datasets of BATSE, Swift, RHESSI, and possibly BeppoSAX. The latest release of >1500 GRBs observed by Fermi gives an opportunity to further investigate the duration distribution. Aims: The aim of this paper is to investigate whether a third class is present in the log T90 distribution, or whether it is described by a bimodal distribution. Methods: A standard χ2 fitting of a mixture of Gaussians was applied to 25 histograms with different binnings. Results: Different binnings give various values of the fitting parameters, as well as the shape of the fitted curve. Among five statistically significant fits, none is trimodal. Conclusions: Locations of the Gaussian components are in agreement with previous works. However, a trimodal distribution, understood in the sense of having three distinct peaks, is not found for any binning. It is concluded that the duration distribution in the Fermi data is well described by a mixture of three log-normal distributions, but it is intrinsically bimodal, hence no third class is present in the T90 data of Fermi. It is suggested that the log-normal fit may not be an adequate model.
Performance Analysis of Distributed Object-Oriented Applications
NASA Technical Reports Server (NTRS)
Schoeffler, James D.
1998-01-01
The purpose of this research was to evaluate the efficiency of a distributed simulation architecture which creates individual modules which are made self-scheduling through the use of a message-based communication system used for requesting input data from another module which is the source of that data. To make the architecture as general as possible, the message-based communication architecture was implemented using standard remote object architectures (Common Object Request Broker Architecture (CORBA) and/or Distributed Component Object Model (DCOM)). A series of experiments was run in which different systems were distributed in a variety of ways across multiple computers and the performance evaluated. The experiments were duplicated in each case so that the overhead due to message communication and data transmission could be separated from the time required to actually perform the computational update of a module each iteration. The software used to distribute the modules across multiple computers was developed in the first year of the current grant and was modified considerably to add a message-based communication scheme supported by the DCOM distributed object architecture. The resulting performance was analyzed using a model created during the first year of this grant which predicts the overhead due to CORBA and DCOM remote procedure calls and includes the effects of data passed to and from the remote objects. A report covering the distributed simulation software and the results of the performance experiments has been submitted separately. The above report also discusses possible future work to apply the methodology to dynamically distribute the simulation modules so as to minimize overall computation time.
Pérez, Isidro A; Sánchez, M Luisa; García, M Ángeles; Pardo, Nuria
2013-07-01
CO₂ concentrations recorded for two years using a Picarro G1301 analyser at a rural site were studied applying two procedures. Firstly, the smoothing kernel method, which to date has been used with one linear and one circular variable, was used with pairs of circular variables: wind direction, time of day, and time of year, revealing that the daily cycle was the prevailing cyclical evolution and that the highest concentrations were explained by the influence of one nearby city source, which was only revealed by directional analysis. Secondly, histograms were obtained, and these revealed most observations to be located between 380 and 410 ppm, and that there was a sharp contrast during the year. Finally, histograms were fitted to 14 distributions, the best known using analytical procedures, and the remainder using numerical procedures. RMSE was used as the goodness-of-fit indicator to compare and select distributions. Most functions provided similar RMSE values. However, the best fits were obtained using numerical procedures due to their greater flexibility, the triangular distribution being the simplest function of this kind. This distribution allowed us to identify directions and months of noticeable CO₂ input (SSE and April-May, respectively) as well as the daily cycle of the distribution symmetry. Among the functions whose parameters were calculated using an analytical expression, Erlang distributions provided satisfactory fits for monthly analysis, and gamma for the rest. By contrast, the Rayleigh and Weibull distributions gave the worst RMSE values. PMID:23602977
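The RMSE-based ranking of candidate distributions can be sketched as follows; synthetic gamma-distributed data with assumed parameters stands in for the CO₂ concentrations:

```python
import numpy as np
from scipy import stats

# Hedged sketch: fit candidate distributions and rank them by the RMSE
# between each fitted density and the empirical histogram. The data,
# bin count, and candidate families are illustrative assumptions.
rng = np.random.default_rng(2)
data = rng.gamma(shape=5.0, scale=10.0, size=5000)

heights, edges = np.histogram(data, bins=30, density=True)
centers = 0.5 * (edges[:-1] + edges[1:])

def hist_rmse(dist, **fit_kw):
    """RMSE between a fitted density and the histogram heights."""
    params = dist.fit(data, **fit_kw)
    return np.sqrt(np.mean((dist.pdf(centers, *params) - heights) ** 2))

rmse_gamma = hist_rmse(stats.gamma, floc=0)
rmse_rayleigh = hist_rmse(stats.rayleigh)
print(f"gamma RMSE: {rmse_gamma:.5f}, Rayleigh RMSE: {rmse_rayleigh:.5f}")
```

As in the abstract, the lower RMSE identifies the better-fitting family; here the gamma family (the generating distribution) beats the Rayleigh, mirroring the paper's ranking.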
Income distribution dependence of poverty measure: A theoretical analysis
NASA Astrophysics Data System (ADS)
Chattopadhyay, Amit K.; Mallick, Sushanta K.
2007-04-01
Using a modified deprivation (or poverty) function, in this paper we theoretically study the changes in poverty with respect to the ‘global’ mean and variance of the income distribution using Indian survey data. We show that when income obeys a log-normal distribution, a rising mean income generally indicates a reduction in poverty, while an increase in the variance of the income distribution increases poverty. This optimistic view for a developing economy, however, is no longer tenable once the poverty index is found to follow a Pareto distribution. Here, although a rising mean income indicates a reduction in poverty, due to the presence of an inflexion point in the poverty function there is a critical value of the variance below which poverty decreases with increasing variance, while beyond this value poverty undergoes a steep increase followed by a decrease with respect to higher variance. Identifying this inflexion point as the poverty line, we show that the Pareto poverty function satisfies all three standard axioms of a poverty index [N.C. Kakwani, Econometrica 43 (1980) 437; A.K. Sen, Econometrica 44 (1976) 219] whereas the log-normal distribution falls short of this requisite. Following these results, we make quantitative predictions to correlate a developing with a developed economy.
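The log-normal result can be checked directly: the headcount measure P(income < z) = Φ((ln z − μ)/σ) falls as the mean rises and, for a poverty line below the median, rises with the variance. The parameter values below are assumptions for illustration, not estimates from the Indian survey data:

```python
import math

# Hedged sketch of the log-normal headcount poverty measure.
# mu, sigma, and the poverty line z are illustrative assumptions.
def poverty_headcount(z, mu, sigma):
    """P(X < z) for log-normal income X with log-mean mu and log-sd sigma."""
    return 0.5 * (1.0 + math.erf((math.log(z) - mu) / (sigma * math.sqrt(2.0))))

z = 1.0                                                  # poverty line, below median e^mu
base = poverty_headcount(z, mu=1.0, sigma=0.5)
richer = poverty_headcount(z, mu=1.2, sigma=0.5)         # higher mean income
more_unequal = poverty_headcount(z, mu=1.0, sigma=0.8)   # higher variance
print(base, richer, more_unequal)                        # richer < base < more_unequal
```

The comparison reproduces the abstract's log-normal claim: raising μ lowers the headcount, while raising σ (with the poverty line below the median) increases it.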
Comparative hypsometric analysis of both Earth and Venus topographic distributions
NASA Technical Reports Server (NTRS)
Rosenblatt, P.; Pinet, P. C.; Thouvenot, E.
1993-01-01
Previous studies have compared the global topographic distribution of both planets by means of differential hypsometric curves. For the purpose of comparison, the terrestrial oceanic load was removed, and a reference base level was acquired. It was chosen on the basis of geometric considerations and reflected the geometric shape of the mean dynamical equilibrium figure of the planetary surface in both cases. This reference level corresponds to the well-known sea level for the Earth; for Venus, given its slow rate of rotation, a sphere with radius close to the mean, median, and modal values of the planetary radii distribution was considered, and the radius value of 6051 km arbitrarily taken. These studies were based on the low resolution (100 x 100 sq km) coverage of Venus obtained by the Pioneer Venus altimeter and on the 1 deg x 1 deg terrestrial topography. But, apart from revealing the distinct contrast existing between the Earth's bimodal and Venus' strong unimodal topographic distribution, the choice of such a reference level is inadequate and even misleading for the comparative geophysical understanding of the planetary relief distribution. The present work reinvestigates the comparison between Earth and Venus hypsometric distribution on the basis of the high-resolution data provided, on one hand, by the recent Magellan global topographic coverage of Venus' surface, and on the other hand, by the detailed NCAR 5 x 5 ft. grid topographic database currently available for the Earth's surface.
NASA Astrophysics Data System (ADS)
Taravat, A.; Del Frate, F.
2013-09-01
As a major aspect of marine pollution, oil release into the sea has serious biological and environmental impacts. Among remote sensing systems, which offer non-destructive investigation methods, synthetic aperture radar (SAR) can provide valuable synoptic information about the position and size of an oil spill due to its wide area coverage and day/night, all-weather capabilities. In this paper we present a new automated method for oil-spill monitoring. The approach is based on the combination of the Weibull Multiplicative Model and machine learning techniques to differentiate between dark spots and the background. First, the filter created based on the Weibull Multiplicative Model is applied to each sub-image. Second, the sub-image is segmented by two different neural network techniques (Pulse-Coupled Neural Networks and Multilayer Perceptron Neural Networks). As the last step, a very simple filtering process is used to eliminate the false targets. The proposed approaches were tested on 20 ENVISAT and ERS2 images which contained dark spots. The same parameters were used in all tests. For the overall dataset, average accuracies of 94.05% and 95.20% were obtained for the PCNN and MLP methods, respectively. The average computational time for dark-spot detection of a 256 × 256 image is about 4 s for PCNN segmentation using IDL software, which is the fastest in this field at present. Our experimental results demonstrate that the proposed approach is very fast, robust and effective. The proposed approach can be applied to future spaceborne SAR images.
CARES/LIFE Ceramics Analysis and Reliability Evaluation of Structures Life Prediction Program
NASA Technical Reports Server (NTRS)
Nemeth, Noel N.; Powers, Lynn M.; Janosik, Lesley A.; Gyekenyesi, John P.
2003-01-01
This manual describes the Ceramics Analysis and Reliability Evaluation of Structures Life Prediction (CARES/LIFE) computer program. The program calculates the time-dependent reliability of monolithic ceramic components subjected to thermomechanical and/or proof test loading. CARES/LIFE is an extension of the CARES (Ceramic Analysis and Reliability Evaluation of Structures) computer program. The program uses results from MSC/NASTRAN, ABAQUS, and ANSYS finite element analysis programs to evaluate component reliability due to inherent surface and/or volume type flaws. CARES/LIFE accounts for the phenomenon of subcritical crack growth (SCG) by utilizing the power law, Paris law, or Walker law. The two-parameter Weibull cumulative distribution function is used to characterize the variation in component strength. The effects of multiaxial stresses are modeled by using either the principle of independent action (PIA), the Weibull normal stress averaging method (NSA), or the Batdorf theory. Inert strength and fatigue parameters are estimated from rupture strength data of naturally flawed specimens loaded in static, dynamic, or cyclic fatigue. The probabilistic time-dependent theories used in CARES/LIFE, along with the input and output for CARES/LIFE, are described. Example problems to demonstrate various features of the program are also included.
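The two-parameter Weibull strength model mentioned above takes the form Pf = 1 − exp(−(σ/σ0)^m). A minimal sketch follows, with an assumed Weibull modulus m and characteristic strength σ0 rather than CARES/LIFE outputs:

```python
import math

# Hedged sketch of a two-parameter Weibull cumulative failure probability.
# The modulus m and characteristic strength sigma0 are assumed values,
# not parameters estimated by CARES/LIFE.
def failure_probability(stress_mpa, m=10.0, sigma0=400.0):
    """Pf = 1 - exp(-(sigma/sigma0)^m) at an applied stress in MPa."""
    return 1.0 - math.exp(-((stress_mpa / sigma0) ** m))

pf_char = failure_probability(400.0)  # at sigma0, Pf = 1 - 1/e
pf_low = failure_probability(300.0)   # well below sigma0, Pf is small
print(f"Pf(400 MPa) = {pf_char:.3f}, Pf(300 MPa) = {pf_low:.3f}")
```

A high modulus m means strength scatter is small: failure probability rises sharply near the characteristic strength, which is why the modulus is used as an indicator of the sharpness of the strength distribution.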
Inductance and Current Distribution Analysis of a Prototype HTS Cable
NASA Astrophysics Data System (ADS)
Zhu, Jiahui; Zhang, Zhenyu; Zhang, Huiming; Zhang, Min; Qiu, Ming; Yuan, Weijia
2014-05-01
This project is partly supported by NSFC Grant 51207146, the RAEng Research Exchange scheme of the UK, and EPSRC EP/K01496X/1. Superconducting cable is an emerging technology for electric power transmission. Since high-capacity HTS transmission cables are manufactured using a multi-layer conductor structure, the current distribution among the layers would be nonuniform without proper optimization and hence lead to large transmission losses. Therefore a novel optimization method has been developed to achieve evenly distributed current among the different layers, considering the HTS cable structure parameters: radius, pitch angle and winding direction, which determine the self and mutual inductance. A prototype HTS cable has been built using BSCCO tape and tested to validate the optimal design method. A superconductor characterization system has been developed using LabVIEW and an NI data acquisition system. It can be used to measure the AC loss and current distribution of short HTS cables.
Analysis and machine mapping of the distribution of band recoveries
Cowardin, L.M.
1977-01-01
A method of calculating distance and bearing from banding site to recovery location based on the solution of a spherical triangle is presented. X and Y distances on an ordinate grid were applied to computer plotting of recoveries on a map. The advantages and disadvantages of tables of recoveries by State or degree block, axial lines, and distance of recovery from banding site for presentation and comparison of the spatial distribution of band recoveries are discussed. A special web-shaped partition formed by concentric circles about the point of banding and great circles at 30-degree intervals through the point of banding has certain advantages over other methods. Comparison of distributions by means of a χ² contingency test is illustrated. The statistic V = χ²/N can be used as a measure of difference between two distributions of band recoveries and its possible use is illustrated as a measure of the degree of migrational homing.
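The spherical-triangle computation of distance and bearing can be sketched with the standard haversine and initial-bearing formulas; the Earth radius and the sample coordinates are illustrative assumptions:

```python
import math

# Hedged sketch: great-circle distance (haversine) and initial bearing
# from a banding site to a recovery location. A mean Earth radius is
# assumed; coordinates are in decimal degrees.
R_EARTH_KM = 6371.0

def distance_and_bearing(lat1, lon1, lat2, lon2):
    """Return (distance in km, initial bearing in degrees clockwise from north)."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    # Haversine formula for the central angle
    a = (math.sin((p2 - p1) / 2.0) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dlon / 2.0) ** 2)
    dist = 2.0 * R_EARTH_KM * math.asin(math.sqrt(a))
    # Initial bearing from the spherical triangle
    y = math.sin(dlon) * math.cos(p2)
    x = math.cos(p1) * math.sin(p2) - math.sin(p1) * math.cos(p2) * math.cos(dlon)
    return dist, math.degrees(math.atan2(y, x)) % 360.0

d, b = distance_and_bearing(0.0, 0.0, 0.0, 90.0)  # a quarter of the equator
print(f"{d:.1f} km at bearing {b:.0f} deg")
```

Each (distance, bearing) pair maps directly onto the web-shaped partition described above: the bearing selects one of the 30-degree sectors and the distance selects the annulus.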
DATA BASE DEVELOPMENT AND ANALYSIS FOR WATER DISTRIBUTION SYSTEMS
There have been a number of studies estimating the costs of repairing, replacing, or renovating aging distribution networks. Although useful in general terms, characterizing a system by age and/or breaks per mile of mains is of limited use to a utility trying to manag...
Geostatistical analysis of the spatial distribution of soil salinity
Barbara Cafarelli; Alessio Pollice
A correct evaluation of the causes and amount of salinity in a soil has an agronomical as well as environmental relevance and is dealt with in the context of precision agriculture. In this paper a geoadditive model is used to analyse the spatial distribution of an indicator of soil salinity and its nonlinear relations with soil physical, chemical and hydraulic
A preliminary analysis and model of prostate injection distributions
Scott L. Chowning; Robert C. Susil; Axel Krieger; Gabor Fichtinger; Louis L. Whitcomb; Ergin Atalar
2006-01-01
PURPOSE. Understanding the internal dynamics of prostate injections, particularly injection pattern distribution, is a key step in developing new therapies for prostate disease that may be best served by a direct injection approach. Due to the excellent properties of liquid contrast agents, MRI can be used for targeting and monitoring of injections into organs and tissues. MATERIALS AND METHODS. Eleven intraprostatic injections were
Preliminary distributional analysis of US endangered bird species
MANDALINE E. GODOWN; A. TOWNSEND PETERSON
2000-01-01
A first exploration of applications of ecological niche modeling and geographic distributional prediction to endangered species protection is developed. Foci of richness of endangered bird species are identified in coastal California and along the southern fringe of the United States. Species included on the Endangered Species List on the basis of peripheral populations inflate these concentrations considerably. Species without protection
Introduction to special section on Analysis of Zooplankton Distributions Using
demonstrate that the OPC and the new laser OPC (LOPC) are useful tools for mapping fine-scale distributions of zooplankton over broad expanses of space and for examining patterns in the size structure of zooplankton, there are many lake and coastal situations where zooplankton are so abundant as to cause coincidence problems
THE EPANET PROGRAMMER'S TOOLKIT FOR ANALYSIS OF WATER DISTRIBUTION SYSTEMS
The EPANET Programmer's Toolkit is a collection of functions that helps simplify computer programming of water distribution network analyses. the functions can be used to read in a pipe network description file, modify selected component properties, run multiple hydraulic and wa...
Analysis of dynamic foot pressure distribution and ground reaction forces
NASA Astrophysics Data System (ADS)
Ong, F. R.; Wong, T. S.
2005-04-01
The purpose of this study was to assess the relationship between forces derived from in-shoe pressure distribution and ground reaction forces (GRFs) during normal gait. The relationship served to demonstrate the accuracy and reliability of the in-shoe pressure sensor. The in-shoe pressure distribution from the Tekscan F-Scan system outputs vertical forces and Centre of Force (COF), while the Kistler force plate gives GRFs in terms of Fz, Fx and Fy, as well as vertical torque, Tz. The two systems were synchronized for pressure and GRF measurements. Data were collected from four volunteers through three trials for both left and right foot under barefoot condition with the in-shoe sensor. The forces derived from pressure distribution correlated well with the vertical GRFs, and the correlation coefficient (r²) was in the range of 0.93 to 0.99. This is a result of extended calibration, which improves pressure measurement to give better accuracy and reliability. The COF from the in-shoe sensor generally matched well with the force plate COP. As for the maximum vertical torque at the forefoot during toe-off, there was no relationship with the pressure distribution. However, the maximum torque was shown to give an indication of the rotational angle of the foot.
Analysis of aerosol vertical distribution and variability in Hong Kong
Qianshan He; Chengcai Li; Jietai Mao; Alexis Kai-Hon Lau; D. A. Chu
2008-01-01
Aerosol vertical distribution is an important piece of information to improve aerosol retrieval from satellite remote sensing. Aerosol extinction coefficient profile and its integral form, aerosol optical depth (AOD), as well as atmospheric boundary layer (ABL) height and haze layer height can be derived using lidar measurements. In this paper, we used micropulse lidar measurements acquired from May 2003 to
High Resolution PV Power Modeling for Distribution Circuit Analysis
Norris, B. L.; Dise, J. H.
2013-09-01
NREL has contracted with Clean Power Research to provide 1-minute simulation datasets of PV systems located at three high penetration distribution feeders in the service territory of Southern California Edison (SCE): Porterville, Palmdale, and Fontana, California. The resulting PV simulations will be used to separately model the electrical circuits to determine the impacts of PV on circuit operations.
Income Distribution and Redistribution: A Microdata Analysis for Seven Countries
Michael OHiggins; Guenther Schmaus; Geoffrey Stephenson
1989-01-01
This paper reports the detailed results of a comparison of the distribution and redistribution of income in seven countries using the Luxembourg Income Study (LIS) database. Use of LIS facilitates comparisons of inequality in respect to similarly-defined variables, permits methodological alternatives to be used, and allows the countries to be compared on aspects of income ranking and policy equity in
Conflict classification and analysis of distributed firewall policies
Ehab Al-shaer; Hazem Hamed; Raouf Boutaba; Masum Hasan
2005-01-01
Firewalls are core elements in network security. However, managing firewall rules, particularly, in multifirewall enterprise networks, has become a complex and error-prone task. Firewall filtering rules have to be written, ordered, and distributed carefully in order to avoid firewall policy anomalies that might cause network vulnerability. Therefore, inserting or modifying filtering rules in any firewall requires thorough intrafirewall and interfirewall
Analysis of Distributed Intrusion Detection Systems Using Mobile Agents
Nita Patil; Chhaya Das; Shreya Patankar; Kshitija Pol
2008-01-01
The goal of an IDS is to analyze events on the network and identify attacks. The increasing number of network security related incidents makes it necessary for organizations to actively protect their sensitive data with the installation of intrusion detection systems (IDS). Detecting intrusion in a distributed network, from outside the network segment as well as from inside, is a difficult problem. Intrusion
NASA Astrophysics Data System (ADS)
Sharma, V. K.; Patil, R. S.
In situ measurements of mass concentration of size-distributed aerosols were made using a quartz crystal microbalance cascade impactor. Aerosol samples were also collected by the conventional high-volume sampler for comparison and analysed for size distribution using a centrifugal analyser system and an image analyser system. The number concentrations were calculated for different sizes and these were subjected to factor analysis which gave four factors representing various source types of particulates. A power-function fit was applied to the size-distribution curves for the four size ranges grouped by factor analysis. Generally, size-distribution ranges are either selected according to the change of the slope of the curves or depending upon the measurement size ranges. The use of factor analysis makes the size-distribution groupings source dependent and also avoids the possible errors arising from averaging negative and positive slopes.
ERIC Educational Resources Information Center
Campos, Jose Alejandro Gonzalez; Moraga, Paulina Saavedra; Del Pozo, Manuel Freire
2013-01-01
This paper introduces the generalized beta (GB) model as a new modeling tool in the educational assessment area and evaluation analysis, specifically. Unlike the normal model, the GB model allows us to capture some real characteristics of the data, and it is an important tool for understanding the phenomenon of learning. This paper develops a contrast with the…
A network analysis of food flows within the United States of America.
Lin, Xiaowen; Dang, Qian; Konar, Megan
2014-05-20
The world food system is globalized and interconnected, in which trade plays an increasingly important role in facilitating food availability. We present a novel application of network analysis to domestic food flows within the USA, a country with global importance as a major agricultural producer and trade power. We find normal node degree distributions and Weibull node strength and betweenness centrality distributions. An unassortative network structure with high clustering coefficients exists. These network properties indicate that the USA food flow network is highly social and well-mixed. However, a power law relationship between node betweenness centrality and node degree indicates potential network vulnerability to the disturbance of key nodes. We perform an equality analysis which serves as a benchmark for global food trade, where the Gini coefficient = 0.579, Lorenz asymmetry coefficient = 0.966, and Hoover index = 0.442. These findings provide insight into trade network scaling and serve as proxies for free-trade and equitable network architectures. PMID:24773310
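The Gini coefficient and Hoover index reported in this abstract are standard inequality measures that can be computed directly from the node-strength values; a minimal stdlib sketch (illustrative, not the authors' analysis pipeline):

```python
def gini(values):
    """Gini coefficient of a non-negative sample (0 = perfect equality,
    values close to 1 = extreme concentration)."""
    xs = sorted(values)
    n = len(xs)
    total = sum(xs)
    # Rank-weighted formula: G = 2 * sum(i * x_i) / (n * total) - (n + 1) / n
    weighted = sum(i * x for i, x in enumerate(xs, start=1))
    return 2.0 * weighted / (n * total) - (n + 1.0) / n

def hoover(values):
    """Hoover index: the share of the total that would have to be
    redistributed to achieve perfect equality."""
    n = len(values)
    total = sum(values)
    mean = total / n
    return 0.5 * sum(abs(x - mean) for x in values) / total
```

For example, `gini([0, 0, 0, 4])` gives 0.75, reflecting that one node carries the entire flow.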
Fighting their last stand? A global analysis of the distribution and conservation status
Butler, David R. - Department of Geography, Texas State University
…conservation status at higher latitudes and elevations. The aim of our synthesis was to test these statements by investigating
Category induction via distributional analysis: Evidence from a serial reaction time task
Makous, Walter
Artificial grammars generated corpora of input strings in a serial reaction time task. Available online 5 December 2009. Keywords: statistical learning, category induction, serial reaction time.
Hunter, David J.
To identify genetic loci influencing central obesity and fat distribution, we performed a meta-analysis of 16 genome-wide association studies (GWAS, N = 38,580) informative for adult waist circumference (WC) and waist–hip ...
Nonlinear structural analysis on distributed-memory computers
NASA Technical Reports Server (NTRS)
Watson, Brian C.; Noor, Ahmed K.
1995-01-01
A computational strategy is presented for the nonlinear static and postbuckling analyses of large complex structures on massively parallel computers. The strategy is designed for distributed-memory, message-passing parallel computer systems. The key elements of the proposed strategy are: (1) a multiple-parameter reduced basis technique; (2) a nested dissection (or multilevel substructuring) ordering scheme; (3) parallel assembly of global matrices; and (4) a parallel sparse equation solver. The effectiveness of the strategy is assessed by applying it to thermo-mechanical postbuckling analyses of stiffened composite panels with cutouts, and nonlinear large-deflection analyses of HSCT models on Intel Paragon XP/S computers. The numerical studies presented demonstrate the advantages of nested dissection-based solvers over traditional skyline-based solvers on distributed memory machines.
Network analysis of ground currents in a residential distribution system
Mader, D.L. (Ontario Hydro, Toronto (Canada)); Zaffanella, L.E. (Electric Power Research Inst., Lenox, MA (United States))
1993-01-01
Reports of an association between cancer and high capacity distribution lines have prompted concerns about 60-Hz magnetic fields. In a study on sources of fields, ground currents on distribution system wires and metal water pipes have been measured at a residential magnetic-field research facility. An analytical method with good simulation capability is presented in terms of an electrical network model and formulas to calculate resistances and self-inductances for all wires and pipes and mutual inductances for pairs of conductors. At this particular site, resistances of joints between pipes in the water main were found to be significant. When joint resistances were included in the model, the difference between measured and calculated ground currents decreased from 89% to 20%. The authors also show the need to include mutual inductance.
Exploring Vector Fields with Distribution-based Streamline Analysis
Lu, Kewei; Chaudhuri, Abon; Lee, Teng-Yok; Shen, Han-Wei; Wong, Pak C.
2013-02-26
Streamline-based techniques are designed based on the idea that properties of streamlines are indicative of features in the underlying field. In this paper, we show that statistical distributions of measurements along the trajectory of a streamline can be used as a robust and effective descriptor to measure the similarity between streamlines. With the distribution-based approach, we present a framework for interactive exploration of 3D vector fields with streamline query and clustering. Streamline queries allow us to rapidly identify streamlines that share similar geometric features with the target streamline. Streamline clustering allows us to group together streamlines of similar shapes. Based on the user's selection, different clusters with different features at different levels of detail can be visualized to highlight features in 3D flow fields. We demonstrate the utility of our framework with simulation data sets of varying nature and size.
Analysis of phase distribution phenomena in microgravity environments
NASA Technical Reports Server (NTRS)
Lahey, Richard T., Jr.; Bonetto, F.
1994-01-01
The purpose of the research presented in this paper is to demonstrate the ability of multidimensional two-fluid models for bubbly two-phase flow to accurately predict lateral phase distribution phenomena in microgravity environments. If successful, this research should provide NASA with mechanistically-based analytical methods which can be used for multiphase space system design and evaluation, and should be the basis for future shuttle experiments for model verification.
Periodic analysis of total ozone and its vertical distribution
NASA Technical Reports Server (NTRS)
Wilcox, R. W.; Nastrom, G. D.; Belmont, A. D.
1975-01-01
Both total ozone and vertical distribution ozone data from the period 1957 to 1972 are analyzed. For total ozone, improved monthly zonal means for both hemispheres are computed by weighting individual station monthly means by a factor which compensates for the close grouping of stations in certain regions of latitude bands. Longitudinal variability shows maxima in summer in both hemispheres but, in winter, only in the Northern Hemisphere. The geographical distributions of the long-term mean, and the annual, quasibiennial and semiannual waves in total ozone over the Northern Hemisphere are presented. The extratropical amplitude of the annual wave is by far the largest of the three, as much as 120 m atm cm over northern Siberia. There is a tendency for all three waves to have maxima in high latitudes. Monthly means of the vertical distribution of ozone determined from 3 to 8 years of ozonesonde data over North America are presented. Number density is highest in the Arctic near 18 km. The region of maximum number density slopes upward toward 10° N, where the long-term mean is 45 × 10^11 molecules cm^-3 near 26 km.
Agent-based reasoning for distributed multi-INT analysis
NASA Astrophysics Data System (ADS)
Inchiosa, Mario E.; Parker, Miles T.; Perline, Richard
2006-05-01
Fully exploiting the intelligence community's exponentially growing data resources will require computational approaches differing radically from those currently available. Intelligence data is massive, distributed, and heterogeneous. Conventional approaches requiring highly structured and centralized data will not meet this challenge. We report on a new approach, Agent-Based Reasoning (ABR). In NIST evaluations, the use of ABR software tripled analysts' solution speed, doubled accuracy, and halved perceived difficulty. ABR makes use of populations of fine-grained, locally interacting agents that collectively reason about intelligence scenarios in a self-organizing, "bottom-up" process akin to those found in biological and other complex systems. Reproduction rules allow agents to make inferences from multi-INT data, while movement rules organize information and optimize reasoning. Complementary deterministic and stochastic agent behaviors enhance reasoning power and flexibility. Agent interaction via small-world networks - such as are found in nervous systems, social networks, and power distribution grids - dramatically increases the rate of discovering intelligence fragments that usefully connect to yield new inferences. Small-world networks also support the distributed processing necessary to address intelligence community data challenges. In addition, we have found that ABR pre-processing can boost the performance of commercial text clustering software. Finally, we have demonstrated interoperability with Knowledge Engineering systems and seen that reasoning across diverse data sources can be a rich source of inferences.
Analysis of Security Vulnerabilities in the Movie Production and Distribution Process
McDaniel, Patrick Drew
Simon Byers et al. (AT&T Research), July 15, 2003. Unauthorized copying of movies is a major concern for the motion picture industry. While unauthorized copies of movies have been distributed via video cassette and DVD
Suitability of Sample Size for Identifying Distribution Function in Regional Frequency Analysis
Binaya Kumar MISHRA; Yasuto TACHIKAWA; Kaoru TAKARA
2007-01-01
Synopsis: Estimation of environmental extremes using regional frequency analysis requires fitting an appropriate frequency distribution function representing a homogeneous region. The best-fit frequency distribution depends on moment coefficients computed using either the conventional method of moments or the more recently developed method of L-moments. These moment coefficients depend on the length of the observed data record. This paper examines the deviation of
Grain size, size-distribution and dislocation structure from diffraction peak profile analysis
Gubicza, Jenõ
Diffraction peak profile analysis has been developed to such an extent that it can be applied as a powerful method for the characterization of grain size, size distribution and dislocation structure, including the parameters of the log-normal size distribution function and U and M, the density and the arrangement parameter of dislocations.
Dynamic Analysis of Multi-Stepped, Distributed Parameter Rotor-Bearing Systems
S.-W. HONG; J.-H. PARK
1999-01-01
Exact solutions for a distributed parameter system are of great use for the physical understanding of the system or the sensitivity analysis and design of the system. However, exact or closed-form solutions for multi-stepped rotor-bearing systems with distributed parameters have been rarely investigated. The present paper proposes a modelling and analysis method to obtain exact solutions for multi-stepped rotor-bearing systems
NASA Astrophysics Data System (ADS)
Takada, Hiroyuki; Takahashi, N.; Kawaguchi, S.
2003-07-01
We analyze the arrival times of air showers using the Hirosaki AS Array. This array consists of 5 scintillation detectors with GPS antennas for arrival-time measurement. We use two analysis methods. In one, the number of air showers observed within short time windows is analyzed using the Poisson distribution; in the other, the arrival-time differences of k serial air showers are analyzed using the Erlang distribution. We report the results of analysis by these two algorithms.
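The two distributions named in the abstract have simple closed forms: the Poisson pmf for counts in a window, and the Erlang density for the waiting time until the k-th event of a Poisson process. A small stdlib sketch (illustrative; the rate parameter below is an assumption, not an array measurement):

```python
import math

def poisson_pmf(k, lam):
    """Probability of observing k showers in a window with mean count lam."""
    return math.exp(-lam) * lam ** k / math.factorial(k)

def erlang_pdf(t, k, lam):
    """Density of the waiting time until the k-th event of a Poisson
    process with rate lam -- the model for arrival-time differences
    of k serial air showers."""
    return lam ** k * t ** (k - 1) * math.exp(-lam * t) / math.factorial(k - 1)
```

Note that for k = 1 the Erlang density reduces to the exponential density of single inter-arrival times, which is a quick sanity check on an implementation.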
D. M. Moore; B. G. Lees; S. M. Davey
1991-01-01
Decision tree analysis was used to predict the distribution of forest communities in an area on the south coast of New South Wales, Australia. The analysis was carried out using a geographical information system environmental data base of those topographic and geological variables thought to influence the distribution of vegetation and derived from cartographic sources. The resulting maps of forest
Distributed sound for volumes: data analysis using distributed visualization and sonification
NASA Astrophysics Data System (ADS)
Minghim, Rosane; Salvador, Veridiana C.; Sousa Freitas, Bruno; Ferreira de Oliveira, Maria C.; Gustavo Nonato, Luis
2002-03-01
A number of different resources and a body of new technology have been empowering visualization applications. At the same time, supportive and mostly experimental techniques aimed at increasing the representation power and interpretability of complex data, such as sonification, are beginning to establish a foundation that can be used in real applications. This work presents an architecture and a corresponding prototype implementation of a visualization system that incorporates some of these research and technological aspects, such as visualization on the web, distributed visualization, and sonification. The current development of the prototype is presented, as well as its implications and planned improvements.
NASA Technical Reports Server (NTRS)
Schmeckpeper, K. R.
1987-01-01
The results of the Independent Orbiter Assessment (IOA) of the Failure Modes and Effects Analysis (FMEA) and Critical Items List (CIL) are presented. The IOA approach features a top-down analysis of the hardware to determine failure modes, criticality, and potential critical items. To preserve independence, this analysis was accomplished without reliance upon the results contained within the NASA FMEA/CIL documentation. This report documents the independent analysis results corresponding to the Orbiter Electrical Power Distribution and Control (EPD and C) hardware. The EPD and C hardware performs the functions of distributing, sensing, and controlling 28 volt DC power and of inverting, distributing, sensing, and controlling 117 volt 400 Hz AC power to all Orbiter subsystems from the three fuel cells in the Electrical Power Generation (EPG) subsystem. Volume 2 continues the presentation of IOA analysis worksheets and contains the potential critical items list.
Complexity analysis of pipeline mapping problems in distributed heterogeneous networks
Lin, Ying; Wu, Qishi; Zhu, Mengxia; Rao, Nageswara S
2009-04-01
Large-scale scientific applications require using various system resources to execute complex computing pipelines in distributed networks to support collaborative research. System resources are typically shared in the Internet or over dedicated connections based on their location, availability, capability, and capacity. Optimizing the network performance of computing pipelines in such distributed environments is critical to the success of these applications. We consider two types of large-scale distributed applications: (1) interactive applications where a single dataset is sequentially processed along a pipeline; and (2) streaming applications where a series of datasets continuously flow through a pipeline. The computing pipelines of these applications consist of a number of modules executed in a linear order in network environments with heterogeneous resources under different constraints. Our goal is to find an efficient mapping scheme that allocates the modules of a pipeline to network nodes for minimum end-to-end delay or maximum frame rate. We formulate the pipeline mappings in distributed environments as optimization problems and categorize them into six classes with different optimization goals and mapping constraints: (1) Minimum End-to-end Delay with No Node Reuse (MEDNNR), (2) Minimum End-to-end Delay with Contiguous Node Reuse (MEDCNR), (3) Minimum End-to-end Delay with Arbitrary Node Reuse (MEDANR), (4) Maximum Frame Rate with No Node Reuse or Share (MFRNNRS), (5) Maximum Frame Rate with Contiguous Node Reuse and Share (MFRCNRS), and (6) Maximum Frame Rate with Arbitrary Node Reuse and Share (MFRANRS). Here, 'contiguous node reuse' means that multiple contiguous modules along the pipeline may run on the same node and 'arbitrary node reuse' imposes no restriction on node reuse. Note that in interactive applications, a node can be reused but its resource is not shared. We prove that MEDANR is polynomially solvable and the rest are NP-complete.
MEDANR, where either contiguous or noncontiguous modules in the pipeline can be mapped onto the same node, is essentially the Maximum n-hop Shortest Path problem, and can be solved using a dynamic programming method. In MEDNNR and MFRNNRS, any network node can be used only once, which requires selecting the same number of nodes for one-to-one onto mapping. We show its NP-completeness by reducing from the Hamiltonian Path problem. Node reuse is allowed in MEDCNR, MFRCNRS and MFRANRS, which are similar to the Maximum n-hop Shortest Path problem that considers resource sharing. We prove their NP-completeness by reducing from the Disjoint-Connecting-Path problem and the Widest Path with Linear Capacity Constraints problem, respectively.
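The dynamic-programming idea behind the polynomial MEDANR case can be sketched as follows. This is a hedged illustration under a simplified cost model (additive computation and communication delays, with hypothetical `comp`/`comm` tables), not the authors' exact formulation:

```python
def min_end_to_end_delay(comp, comm, source):
    """DP sketch of MEDANR-style pipeline mapping: assign each of m modules
    to a node so that total computation plus inter-node communication delay
    is minimized. Nodes may be reused arbitrarily, which is what makes the
    problem polynomially solvable (O(m * |V|^2)).

    comp[i][v] -- execution time of module i on node v
    comm[u][v] -- transfer delay from node u to node v (0 on the diagonal)
    source     -- node where the input dataset initially resides
    """
    m = len(comp)
    nodes = list(comm.keys())
    # best[v] = minimum delay to finish modules 0..i with module i on node v
    best = {v: comm[source][v] + comp[0][v] for v in nodes}
    for i in range(1, m):
        best = {v: min(best[u] + comm[u][v] for u in nodes) + comp[i][v]
                for v in nodes}
    return min(best.values())
```

With no-reuse constraints (MEDNNR) this recurrence breaks down, because the choice at stage i depends on the whole set of nodes already used, which is exactly where the Hamiltonian Path reduction bites.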
A data analysis expert system for large established distributed databases
NASA Technical Reports Server (NTRS)
Gnacek, Anne-Marie; An, Y. Kim; Ryan, J. Patrick
1987-01-01
A design for a natural language database interface system, called the Deductively Augmented NASA Management Decision support System (DANMDS), is presented. The DANMDS system components have been chosen on the basis of the following considerations: maximal employment of the existing NASA IBM-PC computers and supporting software; local structuring and storing of external data via the entity-relationship model; a natural, easy-to-use, error-free database query language; user ability to alter the query language vocabulary and data analysis heuristics; and significant artificial intelligence data analysis heuristic techniques that allow the system to become progressively and automatically more useful.
An empirical analysis of the distribution of overshoots in a stationary Gaussian stochastic process
NASA Technical Reports Server (NTRS)
Carter, M. C.; Madison, M. W.
1973-01-01
The frequency distribution of overshoots in a stationary Gaussian stochastic process is analyzed. The primary processes involved in this analysis are computer simulation and statistical estimation. Computer simulation is used to simulate stationary Gaussian stochastic processes that have selected autocorrelation functions. An analysis of the simulation results reveals a frequency distribution for overshoots with a functional dependence on the mean and variance of the process. Statistical estimation is then used to estimate the mean and variance of a process. It is shown that, given an autocorrelation function and estimates of the mean and variance of the process, a frequency distribution for overshoots can be estimated.
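The simulate-then-count approach described here can be illustrated with a first-order autoregressive process as one simple choice of stationary Gaussian process with a selected autocorrelation (a sketch under that assumption; the paper's processes and parameters may differ):

```python
import random

def simulate_ar1(n, phi, sigma, seed=0):
    """Simulate a stationary AR(1) Gaussian process
    x[t] = phi * x[t-1] + e[t], whose autocorrelation decays as phi**lag."""
    rng = random.Random(seed)
    # draw the initial state from the stationary marginal distribution
    x = rng.gauss(0.0, sigma / (1.0 - phi ** 2) ** 0.5)
    out = []
    for _ in range(n):
        out.append(x)
        x = phi * x + rng.gauss(0.0, sigma)
    return out

def count_overshoots(series, level):
    """Count upcrossings of `level`: samples where the process moves
    from at-or-below the level to above it."""
    return sum(1 for a, b in zip(series, series[1:]) if a <= level < b)
```

Repeating the simulation over many seeds and levels yields the empirical frequency distribution of overshoots as a function of the process mean and variance.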
Visualization and analysis of lipopolysaccharide distribution in binary phospholipid bilayers
Henning, Maria Florencia; Sanchez, Susana; Bakas, Laura; Departamento de Ciencias Biologicas, Facultad de Ciencias Exactas, UNLP, Calles 47 y 115, 1900 La Plata
2009-05-22
Lipopolysaccharide (LPS) is an endotoxin released from the outer membrane of Gram-negative bacteria during infections. It has been reported that LPS may play a role in the outer membrane of bacteria similar to that of cholesterol in eukaryotic plasma membranes. In this article we compare the effect of introducing LPS or cholesterol in liposomes made of dipalmitoylphosphatidylcholine/dioleoylphosphatidylcholine on the solubilization process by Triton X-100. The results show that liposomes containing LPS or cholesterol are more resistant to solubilization by Triton X-100 than the binary phospholipid mixtures at 4 °C. The LPS distribution was analyzed on GUVs of DPPC:DOPC using FITC-LPS. Solid and liquid-crystalline domains were visualized by labeling the GUVs with LAURDAN, and GP images were acquired using a two-photon microscope. The images show a selective distribution of LPS in gel domains. Our results support the hypothesis that LPS could aggregate and concentrate selectively in biological membranes, providing a mechanism to bring together several components of the LPS-sensing machinery.
Studying bubble-particle interactions by zeta potential distribution analysis.
Wu, Chendi; Wang, Louxiang; Harbottle, David; Masliyah, Jacob; Xu, Zhenghe
2015-07-01
Over a decade ago, Xu and Masliyah pioneered an approach to characterize the interactions between particles in dynamic environments of multicomponent systems by measuring zeta potential distributions of individual components and their mixtures. Using a Zetaphoremeter, the measured zeta potential distributions of individual components and their mixtures were used to determine the conditions of preferential attachment in multicomponent particle suspensions. The technique has been applied to study the attachment of nano-sized silica and alumina particles to sub-micron size bubbles in solutions with and without the addition of surface active agents (SDS, DAH and DF250). The degree of attachment between gas bubbles and particles is shown to be a function of the interaction energy governed by the dispersion, electrostatic double layer and hydrophobic forces. Under certain chemical conditions, the attachment of nano-particles to sub-micron size bubbles is shown to be enhanced by in-situ gas nucleation induced by hydrodynamic cavitation for the weakly interacting systems, where mixing of the two individual components results in negligible attachment. Preferential interaction in complex tertiary particle systems demonstrated strong attachment between micron-sized alumina and gas bubbles, with little attachment between micron-sized alumina and silica, possibly due to instability of the aggregates in the shear flow environment. PMID:25731913
LARGE-SCALE NORMAL COORDINATE ANALYSIS ON DISTRIBUTED
Raghavan, Padma
Molecular vibrations at low temperature can be used to characterize structures at the atomic level. Applications of NCA include characterizing the thermal stability of polymer materials (Fukui et al. 200) and assessing
Distributed representation as a principle for the analysis of cockpit information displays.
Zhang, J
1997-01-01
This article examines the representational properties of cockpit information displays from the perspective of distributed representations (Zhang & Norman, 1994). The basic idea is that the information needed for many tasks in a cockpit is distributed across the external information displays in the cockpit and the internal minds of the pilots. It is proposed that the relative distribution of internal and external information is the major factor of a display's representational efficiency. Several functionally equivalent but representationally different navigation displays are selected to illustrate how the principle of distributed representations is applied to the analysis of the representational efficiencies of cockpit information displays. PMID:11541073
Analysis of the tropospheric water distribution during FIRE 2
NASA Technical Reports Server (NTRS)
Westphal, Douglas L.
1993-01-01
The Penn State/NCAR mesoscale model, as adapted for use at ARC, was used as a testbed for the development and validation of cloud models for use in General Circulation Models (GCM's). This modeling approach also allows us to intercompare the predictions of the various cloud schemes within the same dynamical framework. The use of the PSU/NCAR mesoscale model also allows us to compare our results with FIRE-II (First International Satellite Cloud Climatology Project Regional Experiment) observations, instead of climate statistics. Though the approach is promising, our work to date has revealed several difficulties. First, the model by design is limited in spatial coverage and is only run for 12 to 48 hours at a time. Hence the quality of the simulation will depend heavily on the initial conditions. The poor quality of upper-tropospheric measurements of water vapor is well known, and the situation is particularly bad for mid-latitude winter since the coupling with the surface is less direct than in summer, so relying on the model to spin up a reasonable moisture field is not always successful. Though one of the most common atmospheric constituents, water vapor is relatively difficult to measure accurately, especially operationally over large areas. The standard NWS sondes have little sensitivity at the low temperatures where cirrus form, and the data from the GOES 6.7 micron channel is difficult to quantify. For this reason, the goals of FIRE Cirrus II included characterizing the three-dimensional distribution of water vapor and clouds. In studying the data from FIRE Cirrus II, it was found that no single special observation technique provides accurate regional distributions of water vapor. The Raman lidar provides accurate measurements, but only at the Hub, for levels up to 10 km, and during nighttime hours.
The CLASS sondes are more sensitive to moisture at low temperatures than are the NWS sondes, but the four stations only cover an area of two hundred kilometers on a side. The aircraft give the most accurate measurements of water vapor, but are limited in spatial and temporal coverage. This problem is partly alleviated by the use of the MAPS analyses, a four-dimensional data assimilation system that combines the previous 3-hour forecast with the available observations, but its upper-level moisture analyses are sometimes deficient because of the vapor measurement problem. An attempt was made to create a consistent four-dimensional description of the water vapor distribution during the second IFO by subjectively combining data from a variety of sources, including MAPS analyses, CLASS sondes, SPECTRE sondes, NWS sondes, GOES satellite analyses, radars, lidars, and microwave radiometers.
Southern Arizona riparian habitat: Spatial distribution and analysis
NASA Technical Reports Server (NTRS)
Lacey, J. R.; Ogden, P. R.; Foster, K. E.
1975-01-01
The objectives of this study were centered on the demonstration of remote sensing as an inventory tool and researching the multiple uses of riparian vegetation. Specific study objectives were to: (1) map riparian vegetation along the Gila River, San Simon Creek, San Pedro River, and Pantano Wash, (2) determine the feasibility of automated mapping using LANDSAT-1 computer compatible tapes, (3) locate and summarize existing maps delineating riparian vegetation, (4) summarize data relevant to Southern Arizona's riparian products and uses, (5) document recent riparian vegetation changes along a selected portion of the San Pedro River, (6) summarize historical changes in composition and distribution of riparian vegetation, and (7) summarize sources of available photography pertinent to Southern Arizona.
Quantitative analysis of inclusion distributions in hot pressed silicon carbide
Michael Paul Bakas
2012-12-01
Depth of penetration measurements in hot pressed SiC have exhibited significant variability that may be influenced by microstructural defects. To obtain a better understanding of the role of microstructural defects under highly dynamic conditions, fragments of hot pressed SiC plates subjected to impact tests were examined. Two types of inclusion defects were identified: carbonaceous and an aluminum-iron-oxide phase. A disproportionate number of large inclusions were found on the rubble, indicating that the inclusion defects were a part of the fragmentation process. Distribution functions were plotted to compare the inclusion populations. Fragments from the superior performing sample had an inclusion population consisting of more numerous but smaller inclusions. One possible explanation for this result is that the superior sample withstood a greater stress before failure, causing a greater number of smaller inclusions to participate in fragmentation than in the weaker sample.
Completion report harmonic analysis of electrical distribution systems
Tolbert, L.M.
1996-03-01
Harmonic currents have increased dramatically in electrical distribution systems in the last few years due to the growth in non-linear loads found in most electronic devices. Because electrical systems have been designed for linear (i.e., nearly sinusoidal) voltage and current waveforms, non-linear loads can cause serious problems such as overheating conductors or transformers, capacitor failures, inadvertent circuit breaker tripping, or malfunction of electronic equipment. The U.S. Army Center for Public Works has proposed a study to determine what devices are best for reducing or eliminating the effects of harmonics on power systems typical of those existing in their Command, Control, Communication and Intelligence (C3I) sites.
Finite key analysis for symmetric attacks in quantum key distribution
Meyer, Tim; Kampermann, Hermann; Kleinmann, Matthias; Bruss, Dagmar
2006-10-15
We introduce a constructive method to calculate the achievable secret key rate for a generic class of quantum key distribution protocols, when only a finite number n of signals is given. Our approach is applicable to all scenarios in which the quantum state shared by Alice and Bob is known. In particular, we consider the six state protocol with symmetric eavesdropping attacks, and show that for a small number of signals, i.e., below n ≈ 10⁴, the finite key rate differs significantly from the asymptotic value for n → ∞. However, for larger n, a good approximation of the asymptotic value is found. We also study secret key rates for protocols using higher-dimensional quantum systems.
An analysis of the Seasat Satellite Data Distribution System
NASA Technical Reports Server (NTRS)
Ferrari, A. J.; Renfrow, J. T.
1980-01-01
A computerized data distribution network for remote accessing of Seasat generated data is described. The service is intended as an experiment to determine user needs and operational abilities for utilizing on-line satellite generated oceanographic data. Synoptic weather observations are input to the U.S. Fleet Numerical Oceanographic Central for preparation and transfer to a PDP 11/60 central computer, from which all access trunks originate. The data available includes meteorological and sea-state information in the form of analyses and forecasts, and users are being monitored for reactions to the system design, data products, system operation, and performance evaluation. The system provides data on sea level and upper atmospheric pressure, sea surface temperature, wind magnitude and direction, significant wave heights, direction, and periods, and spectral wave data. Transmissions have a maximum rate of 1.1 kbit/sec over the telephone line.
Advanced analysis of metal distributions in human hair
Kempson, Ivan M.; Skinner, William M. (U. South Australia)
2008-06-09
A variety of techniques (secondary electron microscopy with energy dispersive X-ray analysis, time-of-flight-secondary ion mass spectrometry, and synchrotron X-ray fluorescence) were utilized to distinguish metal contamination occurring in hair arising from endogenous uptake from an individual exposed to a polluted environment, in this case a lead smelter. Evidence was sought for elements less affected by contamination and potentially indicative of biogenic activity. The unique combination of surface sensitivity, spatial resolution, and detection limits used here has provided new insight regarding hair analysis. Metals such as Ca, Fe, and Pb appeared to have little representative value of endogenous uptake and were mainly due to contamination. Cu and Zn, however, demonstrate behaviors worthy of further investigation into relating hair concentrations to endogenous function.
Managing large-scale multi-voltage distribution system analysis
Walton, C.M.
1994-12-31
The challenge facing electricity utilities in the 1990s, to deliver ever more reliable service at reduced cost and with fewer technical staff, is driving the next generation of automated network analysis tools. The paper discusses the application of an Automatic Loss Minimiser (ALM) and Fault Study Package (FSP), under the control of a Sequence Processor, to an existing high-resolution graphics network analysis package. Automated, sorted management summaries enable limited resources and system automation to be directed at those networks with the poorest performance and/or the largest potential savings from reduced system losses. The impact regular automatic monitoring has on the quality of the database, and the scope for integrating the modules with other System Automation initiatives, are also considered.
Photoelastic analysis of stress distribution with different implant systems.
Pellizzer, Eduardo Piza; Carli, Rafael Imai; Falcón-Antenucci, Rosse Mary; Verri, Fellippo Ramos; Goiato, Marcelo Coelho; Villa, Luiz Marcelo Ribeiro
2014-04-01
The aim of this study was to evaluate stress distribution with different implant systems through photoelasticity. Five models were fabricated with photoelastic resin PL-2. Each model was composed of a block of photoelastic resin (10 × 40 × 45 mm) with an implant and a healing abutment: model 1, internal hexagon implant (4.0 × 10 mm; Conect AR, Conexão, São Paulo, Brazil); model 2, Morse taper/internal octagon implant (4.1 × 10 mm; Standard, Straumann ITI, Andover, Mass); model 3, Morse taper implant (4.0 × 10 mm; AR Morse, Conexão); model 4, locking taper implant (4.0 × 11 mm; Bicon, Boston, Mass); model 5, external hexagon implant (4.0 × 10 mm; Master Screw, Conexão). Axial and oblique (45°) loads of 150 N were applied by a universal testing machine (EMIC-DL 3000), and a circular polariscope was used to visualize the stress. The results were photographed and analyzed qualitatively using Adobe Photoshop software. For the axial load, the greatest stress concentration was exhibited in the cervical and apical thirds. For the oblique load, however, the highest number of isochromatic fringes was observed in the implant apex and in the cervical region adjacent to the load direction in all models. Model 2 (Morse taper, internal octagon, Straumann ITI) presented the lowest stress concentration, while model 5 (external hexagon, Master Screw, Conexão) exhibited the greatest stress. It was concluded that Morse taper implants presented a more favorable stress distribution among the test groups. The external hexagon implant showed the highest stress concentration. Oblique load generated the highest stress in all models analyzed. PMID:22208909
Unraveling the distributed neural code of facial identity through spatiotemporal pattern analysis
Plaut, David C.
The neural mechanisms subserving this feat appear to elude traditional approaches to functional brain data analysis. The present study investigates the neural code of facial identity perception with the aim of ascertaining
Semi-Markov PEPA: Compositional Modelling and Analysis with Generally Distributed Actions
Bradley, Jeremy
Just as a PEPA model generates a Markov chain for analysis purposes, so semi-Markov PEPA produces a semi-Markov chain. We discuss how semi-Markov PEPA models are analysed through Knottenbelt's semi-Markov DNAmaca
Optimal multiparameter analysis of source water distributions in the Southern Drake Passage
Griesel, Alexa
The water composition derived from the OMP analysis is consistent with a scenario in which iron-rich shelf waters drive a heterogeneous distribution, with regions of high phytoplankton biomass occurring primarily in shallow waters
The Computer-Based Analysis of Narrative and Multimodality
Salway, Andrew
This study advocates a novel computer-based approach to the analysis of narrative and multimodality in collections of multimodal stories. Crucially, these patterns are to be extracted without the cost
An Analysis of Security Vulnerabilities in the Movie Production and Distribution Process
McDaniel, Patrick Drew
Telecommunications Policy. Unauthorized copying of movies is a major concern
Buyya, Rajkumar
Composition and On Demand Deployment of Distributed Brain Activity Analysis Application on Global Grids
Application domains include brain science and high-energy physics. The analysis of brain activity data gathered from the MEG helps to understand and analyze brain functions and requires access to large-scale computational resources. The potential platform
Walter Mächtle
1999-01-01
Sedimentation velocity is a powerful tool for the analysis of complex solutions of macromolecules. However, sample turbidity imposes an upper limit to the size of molecular complexes currently amenable to such analysis. Furthermore, the breadth of the particle size distribution, combined with possible variations in the density of different particles, makes it difficult to analyze extremely complex mixtures. These same
Rapid Spatial Distribution Seismic Loss Analysis for Multistory Buildings
Deshmukh, Pankaj Bhagvatrao
2012-07-16
[List-of-figures fragments: earthquake intensity measures representative of the 50th and 68th percentile MCE; hysteresis loops generated at the base of column C5 from the selected critical earthquakes; variabilities in the loss model; SAC-LA3 ductile structure maximum loss model showing (a) hazard recurrence relation, (b-1) drifts obtained from structural analysis, and (b-2) drifts with combined uncertainty of modeling.]
Power systems analysis for direct current (dc) distribution systems
Fleischer, K.; Munnings, R.S.
1996-09-01
Many standards, guidelines, etc., currently exist which provide guidance for dc power systems analysis. These documents are scattered throughout the industry (i.e., IEEE, UL, NEMA, GE, etc.), and primarily treat the subject as though hand calculations are being performed. It is the intent of this paper to provide guidance for performing computer-aided dc power system analyses. This paper will cover load flow/voltage drop and short circuit calculations.
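The hand calculations these standards describe reduce, in the simplest radial case, to Ohm's-law arithmetic. As a minimal illustrative sketch (not the paper's method; the bus voltage, loop resistance, and load current below are invented values):

```python
# Minimal dc feeder calculations for a single radial circuit.
# All numbers are illustrative, not taken from the paper.

def voltage_drop(v_source, i_load, r_loop):
    """Load-end voltage of a dc feeder: V_load = V_source - I * R_loop."""
    return v_source - i_load * r_loop

def bolted_fault_current(v_source, r_source, r_loop):
    """Short-circuit current for a bolted fault at the feeder end,
    limited only by source and conductor resistance."""
    return v_source / (r_source + r_loop)

v_load = voltage_drop(125.0, 40.0, 0.05)            # 125 V bus, 40 A, 50 mOhm loop
i_fault = bolted_fault_current(125.0, 0.01, 0.05)   # 10 mOhm source resistance
print(v_load, round(i_fault, 1))                    # 123.0 2083.3
```

A computer-aided tool essentially iterates calculations like these over every node and branch of the network model.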
Distributed finite element analysis using a transputer network
NASA Technical Reports Server (NTRS)
Watson, James; Favenesi, James; Danial, Albert; Tombrello, Joseph; Yang, Dabby; Reynolds, Brian; Turrentine, Ronald; Shephard, Mark; Baehmann, Peggy
1989-01-01
The principal objective of this research effort was to demonstrate the extraordinarily cost effective acceleration of finite element structural analysis problems using a transputer-based parallel processing network. This objective was accomplished in the form of a commercially viable parallel processing workstation. The workstation is a desktop size, low-maintenance computing unit capable of supercomputer performance yet costs two orders of magnitude less. To achieve the principal research objective, a transputer based structural analysis workstation termed XPFEM was implemented with linear static structural analysis capabilities resembling commercially available NASTRAN. Finite element model files, generated using the on-line preprocessing module or external preprocessing packages, are downloaded to a network of 32 transputers for accelerated solution. The system currently executes at about one third Cray X-MP24 speed but additional acceleration appears likely. For the NASA selected demonstration problem of a Space Shuttle main engine turbine blade model with about 1500 nodes and 4500 independent degrees of freedom, the Cray X-MP24 required 23.9 seconds to obtain a solution while the transputer network, operated from an IBM PC-AT compatible host computer, required 71.7 seconds. Consequently, the $80,000 transputer network demonstrated a cost-performance ratio about 60 times better than the $15,000,000 Cray X-MP24 system.
NASA Technical Reports Server (NTRS)
Gyekenyesi, J. P.
1985-01-01
A computer program was developed for calculating the statistical fast fracture reliability and failure probability of ceramic components. The program includes the two-parameter Weibull material fracture strength distribution model, using the principle of independent action for polyaxial stress states and Batdorf's shear-sensitive as well as shear-insensitive crack theories, all for volume distributed flaws in macroscopically isotropic solids. Both penny-shaped cracks and Griffith cracks are included in the Batdorf shear-sensitive crack response calculations, using Griffith's maximum tensile stress or critical coplanar strain energy release rate criteria to predict mixed mode fracture. Weibull material parameters can also be calculated from modulus of rupture bar tests, using the least squares method with known specimen geometry and fracture data. The reliability prediction analysis uses MSC/NASTRAN stress, temperature and volume output, obtained from the use of three-dimensional, quadratic, isoparametric, or axisymmetric finite elements. The statistical fast fracture theories employed, along with selected input and output formats and options, are summarized. An example problem to demonstrate various features of the program is included.
Rice, Stephen B; Chan, Christopher; Brown, Scott C; Eschbach, Peter; Han, Li; Ensor, David S; Stefaniak, Aleksandr B; Bonevich, John; Vladár, András E; Hight Walker, Angela R; Zheng, Jiwen; Starnes, Catherine; Stromberg, Arnold; Ye, Jia; Grulke, Eric A
2015-01-01
This paper reports an interlaboratory comparison that evaluated a protocol for measuring and analysing the particle size distribution of discrete, metallic, spheroidal nanoparticles using transmission electron microscopy (TEM). The study was focused on automated image capture and automated particle analysis. NIST RM8012 gold nanoparticles (30 nm nominal diameter) were measured for area-equivalent diameter distributions by eight laboratories. Statistical analysis was used to (1) assess the data quality without using size distribution reference models, (2) determine reference model parameters for different size distribution reference models and non-linear regression fitting methods and (3) assess the measurement uncertainty of a size distribution parameter by using its coefficient of variation. The interlaboratory area-equivalent diameter mean, 27.6 nm ± 2.4 nm (computed based on a normal distribution), was quite similar to the area-equivalent diameter, 27.6 nm, assigned to NIST RM8012. The lognormal reference model was the preferred choice for these particle size distributions as, for all laboratories, its parameters had lower relative standard errors (RSEs) than the other size distribution reference models tested (normal, Weibull and Rosin–Rammler–Bennett). The RSEs for the fitted standard deviations were two orders of magnitude higher than those for the fitted means, suggesting that most of the parameter estimate errors were associated with estimating the breadth of the distributions. The coefficients of variation for the interlaboratory statistics also confirmed the lognormal reference model as the preferred choice. From quasi-linear plots, the typical range for good fits between the model and cumulative number-based distributions was 1.9 fitted standard deviations less than the mean to 2.3 fitted standard deviations above the mean. 
Automated image capture, automated particle analysis and statistical evaluation of the data and fitting coefficients provide a framework for assessing nanoparticle size distributions using TEM for image acquisition.
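For a concrete sense of the lognormal fitting step, here is a hedged sketch: synthetic diameters stand in for the TEM measurements (the 27.6 nm geometric mean and the spread are invented values that only loosely echo the interlaboratory results above), and the lognormal is fit by maximum likelihood on the log scale.

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic area-equivalent diameters (nm); parameters are illustrative,
# loosely echoing the ~27.6 nm interlaboratory mean reported above.
diam = rng.lognormal(mean=np.log(27.6), sigma=0.09, size=500)

# Lognormal MLE is just the sample mean/std of the log-diameters.
mu = np.log(diam).mean()
sigma = np.log(diam).std(ddof=1)
geo_mean = np.exp(mu)                      # geometric mean diameter (nm)

# Coefficient of variation implied by the fitted lognormal.
cv = np.sqrt(np.exp(sigma**2) - 1.0)
print(round(geo_mean, 1), round(cv, 3))
```

The relative standard errors the paper compares across reference models come from exactly this kind of parameter estimation, repeated per laboratory.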
Sensitivity Analysis of Distributed Soil Moisture Profiles by Active Distributed Temperature Sensing
NASA Astrophysics Data System (ADS)
Ciocca, F.; Van De Giesen, N.; Assouline, S.; Huwald, H.; Lunati, I.
2012-12-01
Monitoring and measuring the fluctuations of soil moisture at large scales in the field remains a challenge. Although sensors based on the measurement of dielectric properties, such as Time Domain Reflectometers (TDR) and capacity-based probes, can guarantee reasonable responses, they always operate over limited spatial ranges. On the other hand, optical fibers attached to a Distributed Temperature Sensing (DTS) system allow for high-precision soil temperature measurements over distances of kilometers. A recently developed technique called Active DTS (ADTS), consisting of a heat pulse of a certain duration and power applied along the metal sheath covering the optical fiber buried in the soil, has proven a promising alternative to spatially limited probes. Two approaches have been investigated to infer distributed soil moisture profiles in the region surrounding the fiber optic cable by analyzing the temperature variations during the heating and cooling phases. One directly relates the change of temperature to the soil moisture (independently measured) to develop a specific calibration curve for the soil used; the other requires inferring the thermal properties and then obtaining the soil moisture by inversion of known relationships. To test and compare the two approaches over a broad range of saturation conditions, a large lysimeter was homogeneously filled with loamy soil, and 52 meters of fiber optic cable were buried in the shallowest 0.8 meters in a rigid double-coil structure of 15 loops, along with a series of capacity-based sensors (calibrated for the soil used) to provide independent soil moisture measurements at the same depths as the optical fiber. Thermocouples were also wrapped around the fiber to investigate the effects of the insulating cover surrounding the cable, and placed between each layer to monitor heat diffusion at several centimeters. A high-performance DTS was used to measure the temperature along the fiber optic cable.
Several soil moisture profiles have been generated in the lysimeter either varying the water table height or by wetting the soil from the top. The sensitivity of the ADTS method for heat pulses of different duration and power and ranges of spatial and temporal resolution are presented.
Determining fertilizer-induced NO emission ratio from soils by a statistical distribution model
Xiaoyuan Yan; Kunio Shimizu; Hajime Akimoto; Toshimasa Ohara
2003-01-01
To reduce uncertainties in the highly variable estimates of NO emission from N fertilizer, we compiled and analyzed available reports of field measurements on fertilizer-induced NO emission. Three statistical distribution models, lognormal, gamma and Weibull, were used to fit the observation data. Results show that while all three models fit the observation data statistically, the lognormal model is not applicable
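The model-comparison step described above can be sketched with SciPy; the data here are a synthetic stand-in for the compiled emission measurements, and fixing the distribution location at zero is an assumption made because emission factors are positive.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
# Synthetic positive "emission factors"; real field data would replace this.
data = rng.lognormal(mean=0.0, sigma=0.8, size=500)

candidates = {
    "lognormal": stats.lognorm,
    "gamma": stats.gamma,
    "weibull": stats.weibull_min,
}
loglik = {}
for name, dist in candidates.items():
    params = dist.fit(data, floc=0)              # location fixed at zero
    loglik[name] = np.sum(dist.logpdf(data, *params))

# Higher log-likelihood = better fit (same number of free parameters here).
best = max(loglik, key=loglik.get)
print(best)
```

Since the synthetic sample is drawn from a lognormal law, the lognormal fit wins here; on real field data the ranking is an empirical question, which is the point of the paper's comparison.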
Development of distribution system reliability and risk analysis models
NASA Astrophysics Data System (ADS)
Vismor, T. D.; Northcote-Green, J. E. D.; Kostyal, S. J.; Brooks, C. L.
1981-08-01
The two reliability models, their testing, and the modifications of a unified distribution planning model to calculate reliability indices are described. The historical reliability assessment model HISRAM is designed to suit most utilities. Four implementation levels with different input data requirements and output capabilities permit a utility to select a level appropriate to its needs. User-defined divisions, causes and output options further add to the program flexibility. A unique feature of HISRAM is the program generation of the appropriate outage reporting form following level selection and initialization. This allows the engineer to review data input requirements before field implementation. It has the capability of estimating component failure rates and restoration times upon provision of suitable input data. The predictive reliability assessment model PRAM uses continuity criteria, together with component failure rates and restoration times to calculate load point indices. System indices similar to those produced by HISRAM are also calculated. Varying degrees of detail for representing the protection system are available through three user-selected models. Both models were tested through application. The conclusions and recommendations of the entire project are included.
Motion synthesis and force distribution analysis for a biped robot.
Trojnacki, Maciej T; Zielińska, Teresa
2011-01-01
In this paper, the method of generating biped robot motion using recorded human gait is presented. The recorded data were modified taking into account the velocity available for the robot drives. The data include only selected joint angles, so the missing values were obtained considering the dynamic postural stability of the robot, which means obtaining an adequate motion trajectory of the so-called Zero Moment Point (ZMP). Also described is the method, developed by the authors, of determining the distribution of ground reaction forces during the biped robot's dynamically stable walk. Following the description of the equations characterizing the dynamics of the robot's motion, the values of the components of the ground reaction forces were symbolically determined, as well as the coordinates of the points of the robot's feet in contact with the ground. The theoretical considerations have been supported by computer simulation and animation of the robot's motion, carried out using the Matlab/Simulink package and the Simulink 3D Animation Toolbox, which validated the proposed method. PMID:21761810
Alves, Nelson A; Rizzi, Leandro G
2015-01-01
Microcanonical thermostatistics analysis has become an important tool to reveal essential aspects of phase transitions in complex systems. An efficient way to estimate the microcanonical inverse temperature $\beta(E)$ and the microcanonical entropy $S(E)$ is achieved with the statistical temperature weighted histogram analysis method (ST-WHAM). The strength of this method lies in its flexibility, as it can be used to analyse data produced by algorithms with generalised sampling weights. However, for any sampling weight, ST-WHAM requires the calculation of derivatives of energy histograms $H(E)$, which leads to non-trivial and tedious binning tasks for models with a continuous energy spectrum, such as those for biomolecular and colloidal systems. Here, we discuss two alternative methods that avoid the need for such energy binning to obtain continuous estimates for $H(E)$ in order to evaluate $\beta(E)$ by using ST-WHAM: (i) a series expansion to estimate probability densities from the empirical cumulative distrib...
NASA Technical Reports Server (NTRS)
Gurgiolo, Chris; Vinas, Adolfo F.
2009-01-01
This paper presents a spherical harmonic analysis of the plasma velocity distribution function using high-angular, energy, and time resolution Cluster data obtained from the PEACE spectrometer instrument to demonstrate how this analysis models the particle distribution function and its moments and anisotropies. The results show that spherical harmonic analysis produced a robust physical representation model of the velocity distribution function, resolving the main features of the measured distributions. From the spherical harmonic analysis, a minimum set of nine spectral coefficients was obtained from which the moment (up to the heat flux), anisotropy, and asymmetry calculations of the velocity distribution function were obtained. The spherical harmonic method provides a potentially effective "compression" technique that can be easily carried out onboard a spacecraft to determine the moments and anisotropies of the particle velocity distribution function for any species. These calculations were implemented using three different approaches, namely, the standard traditional integration, the spherical harmonic (SPH) spectral coefficients integration, and the singular value decomposition (SVD) on the spherical harmonic methods. A comparison among the various methods shows that both SPH and SVD approaches provide remarkable agreement with the standard moment integration method.
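The "standard traditional integration" of moments mentioned above amounts to numerically integrating the measured distribution function. A minimal one-dimensional sketch (a drifting Maxwellian with invented drift and thermal speeds, not PEACE data, and without the spherical harmonic machinery):

```python
import numpy as np

# Sampled 1D drifting Maxwellian f(v); drift/thermal speeds are illustrative.
v = np.linspace(-10.0, 10.0, 2001)
dv = v[1] - v[0]
n0, v_drift, v_th = 1.0, 1.5, 2.0
f = n0 / (np.sqrt(np.pi) * v_th) * np.exp(-((v - v_drift) / v_th) ** 2)

# Zeroth moment: number density.  First moment: bulk velocity.
density = np.sum(f) * dv
bulk_v = np.sum(v * f) * dv / density
print(round(density, 3), round(bulk_v, 3))      # 1.0 1.5
```

The paper's contribution is to recover these same moments (and anisotropies) from a small set of spherical harmonic coefficients instead of integrating the full 3D measured distribution on board.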
Barua, N U; Bienemann, A S; Woolley, M; Wyatt, M J; Johnson, D; Lewis, O; Irving, C; Pritchard, G; Gill, S
2015-10-15
Mesencephalic astrocyte-derived neurotrophic factor (MANF) is a 20kDa human protein which has both neuroprotective and neurorestorative activity on dopaminergic neurons and therefore may have application for the treatment of Parkinson's Disease. The aims of this study were to determine the translational potential of convection-enhanced delivery (CED) of MANF for the treatment of PD by studying its distribution in porcine putamen and substantia nigra and to correlate histological distribution with co-infused gadolinium-DTPA using real-time magnetic resonance imaging. We describe the distribution of MANF in porcine putamen and substantia nigra using an implantable CED catheter system using co-infused gadolinium-DTPA to allow real-time MRI tracking of infusate distribution. The distribution of gadolinium-DTPA on MRI correlated well with immunohistochemical analysis of MANF distribution. Volumetric analysis of MANF IHC staining indicated a volume of infusion (Vi) to volume of distribution (Vd) ratio of 3 in putamen and 2 in substantia nigra. This study confirms the translational potential of CED of MANF as a novel treatment strategy in PD and also supports the co-infusion of gadolinium as a proxy measure of MANF distribution in future clinical studies. Further study is required to determine the optimum infusion regime, flow rate and frequency of infusions in human trials. PMID:26276514
Environmental distribution, analysis, and toxicity of organometal(loid) compounds.
Dopp, E; Hartmann, L M; Florea, A M; Rettenmeier, A W; Hirner, A V
2004-01-01
The biochemical modification of the metals and metalloids mercury, tin, arsenic, antimony, bismuth, selenium, and tellurium via formation of volatile metal hydrides and alkylated species (volatile and involatile) performs a fundamental role in determining the environmental processing of these elements. In most instances, the formation of such species increases the environmental mobility of the element, and can result in bioaccumulation in lipophilic environments. While inorganic forms of most of these compounds are well characterized (e.g., arsenic, mercury) and some of them exhibit low toxicity (e.g., tin, bismuth), the more lipid-soluble organometals can be highly toxic. Methylmercury poisoning (e.g., Minamata disease) and tumor development in rats after exposure to dimethylarsinic acid or tributyltin oxide are just some examples. Data on the genotoxicity (and the neurotoxicity) as well as the mechanisms of cellular action of organometal(loid) compounds are, however, scarce. Many studies have shown that the production of such organometal(loid) species is possible and likely whenever anaerobic conditions (at least on a microscale) are combined with available metal(loid)s and methyl donors in the presence of suitable organisms. Such anaerobic conditions can exist within natural environments (e.g., wetlands, pond sediments) as well as within anthropogenic environmental systems (e.g., waste disposal sites and sewage treatments plants). Some methylation can also take place under aerobic conditions. This article gives an overview about the environmental distribution of organometal(loid) compounds and the potential hazardous effects on animal and human health. Genotoxic effects in vivo and in vitro in particular are discussed. PMID:15239389
Differentiating cerebral lymphomas and GBMs featuring luminance distribution analysis
NASA Astrophysics Data System (ADS)
Yamasaki, Toshihiko; Chen, Tsuhan; Hirai, Toshinori; Murakami, Ryuji
2013-02-01
Differentiating lymphomas and glioblastoma multiformes (GBMs) is important for proper treatment planning. A number of works have been proposed but there are still some problems. For example, many works depend on thresholding a single feature value, which is susceptible to noise. Non-typical cases that do not get along with such simple thresholding can be found easily. In other cases, experienced observers are required to extract the feature values or to provide some interactions to the system, which is costly. Even if experts are involved, inter-observer variance becomes another problem. In addition, most of the works use only one or a few slice(s) because 3D tumor segmentation is difficult and time-consuming. In this paper, we propose a tumor classification system that analyzes the luminance distribution of the whole tumor region. The 3D MRIs are segmented within a few tens of seconds by using our fast 3D segmentation algorithm. Then, the luminance histogram of the whole tumor region is generated. The typical cases are classified by the histogram range thresholding and the apparent diffusion coefficients (ADC) thresholding. The non-typical cases are learned and classified by a support vector machine (SVM). Most of the processing elements are semi-automatic except for the ADC value extraction. Therefore, even novice users can use the system easily and get almost the same results as experts. The experiments were conducted using 40 MRI datasets (20 lymphomas and 20 GBMs) with non-typical cases. The classification accuracy of the proposed method was 91.1% without the ADC thresholding and 95.4% with the ADC thresholding. On the other hand, the baseline method, the conventional ADC thresholding, yielded only 67.5% accuracy.
Distribution of Modelling Spatial Processes Using Geostatistical Analysis
NASA Astrophysics Data System (ADS)
Grynyshyna-Poliuga, Oksana; Stanislawska, Iwona; Swiatek, Anna
The Geostatistical Analyst uses sample points taken at different locations in a landscape and creates (interpolates) a continuous surface. The Geostatistical Analyst provides two groups of interpolation techniques: deterministic and geostatistical. All methods rely on the similarity of nearby sample points to create the surface. Deterministic techniques use mathematical functions for interpolation. Geostatistics relies on both statistical and mathematical methods, which can be used to create surfaces and assess the uncertainty of the predictions. The first step in geostatistical analysis is variography: computing and modelling a semivariogram. A semivariogram is one of the significant functions indicating spatial correlation in observations measured at sample locations. It is commonly represented as a graph showing the variance of the measure against the distance between all pairs of sampled locations. Such a graph is helpful for building a mathematical model that describes the variability of the measure with location. Modelling the relationship among sample locations, i.e., the variability of the measure with separation distance, is called semivariogram modelling. It is applied in applications involving estimation of the value of a measure at a new location. Our work presents the analysis of the data following the steps given below: identification of data set periods, constructing and modelling the empirical semivariogram for a single location, and using the Kriging mapping function to model TEC maps at mid-latitudes during disturbed and quiet days. Based on the semivariogram, weights for the kriging interpolation are estimated. Additional observations do not, in general, provide relevant extra information for the interpolation, because the spatial correlation is well described by the semivariogram. Keywords: Semivariogram, Kriging, modelling, Geostatistics, TEC
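As a concrete sketch of the variography step, the classical (Matheron) empirical semivariogram can be computed in a few lines; the smooth synthetic field below is an assumption for illustration, not TEC data.

```python
import numpy as np

def empirical_semivariogram(coords, values, bin_edges):
    """Matheron estimator: gamma(h) = mean of 0.5*(z_i - z_j)^2 over all
    pairs whose separation distance falls in the lag bin [h_k, h_{k+1})."""
    dists = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    half_sq = 0.5 * (values[:, None] - values[None, :]) ** 2
    iu = np.triu_indices(len(values), k=1)       # count each pair once
    dists, half_sq = dists[iu], half_sq[iu]
    gam = np.full(len(bin_edges) - 1, np.nan)
    for k in range(len(bin_edges) - 1):
        in_bin = (dists >= bin_edges[k]) & (dists < bin_edges[k + 1])
        if in_bin.any():
            gam[k] = half_sq[in_bin].mean()
    return gam

rng = np.random.default_rng(1)
coords = rng.uniform(0.0, 10.0, size=(200, 2))
# Smooth synthetic field + small nugget: nearby samples are similar,
# so the semivariogram rises with lag distance.
values = np.sin(coords[:, 0]) + 0.1 * rng.standard_normal(200)
gamma = empirical_semivariogram(coords, values, np.linspace(0.0, 5.0, 6))
print(np.round(gamma, 3))
```

A variogram model fitted to these binned values then supplies the weights for kriging interpolation, as described above.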
Taravat, Alireza; Oppelt, Natascha
2014-01-01
Oil spills represent a major threat to ocean ecosystems and their environmental status. Previous studies have shown that Synthetic Aperture Radar (SAR), as its recording is independent of clouds and weather, can be effectively used for the detection and classification of oil spills. Dark formation detection is the first and critical stage in oil-spill detection procedures. In this paper, a novel approach for automated dark-spot detection in SAR imagery is presented. A new approach from the combination of adaptive Weibull Multiplicative Model (WMM) and MultiLayer Perceptron (MLP) neural networks is proposed to differentiate between dark spots and the background. The results have been compared with the results of a model combining non-adaptive WMM and pulse coupled neural networks. The presented approach overcomes the non-adaptive WMM filter setting parameters by developing an adaptive WMM model which is a step ahead towards a full automatic dark spot detection. The proposed approach was tested on 60 ENVISAT and ERS2 images which contained dark spots. For the overall dataset, an average accuracy of 94.65% was obtained. Our experimental results demonstrate that the proposed approach is very robust and effective where the non-adaptive WMM & pulse coupled neural network (PCNN) model generates poor accuracies. PMID:25474376
NASA Astrophysics Data System (ADS)
Chi, Se-Hwan
2013-05-01
The effects of specimen size on the compressive strength and Weibull modulus were investigated for nuclear graphite of different coke particle sizes: IG-110 and NBG-18 (average coke particle size for IG-110: 25 μm; NBG-18: 300 μm). Two types of cylindrical specimens, i.e., where the diameter to length ratio was 1:2 (ASTM C 695-91 type specimen, 1:2 specimen) or 1:1 (1:1 specimen), were prepared for six diameters (3, 4, 5, 10, 15, and 20 mm) and tested at room temperature (compressive strain rate: 2.08 × 10⁻⁴ s⁻¹). Anisotropy was considered during specimen preparation for NBG-18. The results showed that the effects of specimen size appeared negligible for the compressive strength, but grade-dependent for the Weibull modulus. In view of specimen miniaturization, deviations from the ASTM C 695-91 specimen size requirements require an investigation into the effects of size for the grade of graphite of interest, and the specimen size effects should be considered for Weibull modulus determination.
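The Weibull modulus reported above is conventionally estimated by linearizing the Weibull CDF, F = 1 − exp(−(σ/σ₀)^m), so that ln(−ln(1−F)) plotted against ln σ has slope m. A sketch under common assumptions (median-rank probability estimator, least-squares fit; the function name and synthetic strength data are illustrative, not from the paper):

```python
import numpy as np

def weibull_modulus(strengths):
    """Estimate Weibull modulus m and characteristic strength sigma0 by
    linear regression of ln(-ln(1-F_i)) on ln(sigma_i), using the
    median-rank probability estimator F_i = (i - 0.3) / (n + 0.4)."""
    s = np.sort(np.asarray(strengths, dtype=float))
    n = len(s)
    ranks = np.arange(1, n + 1)
    F = (ranks - 0.3) / (n + 0.4)          # median-rank plotting positions
    y = np.log(-np.log(1.0 - F))
    x = np.log(s)
    m, intercept = np.polyfit(x, y, 1)     # slope = Weibull modulus
    sigma0 = np.exp(-intercept / m)        # characteristic strength
    return m, sigma0

# synthetic strengths from a Weibull with modulus 10 and scale 300 (MPa)
rng = np.random.default_rng(1)
strengths = 300.0 * rng.weibull(10.0, size=2000)
m, sigma0 = weibull_modulus(strengths)
```

With real fracture data the sample size is far smaller than here, which is exactly why the specimen-size effects on m discussed in the abstract matter.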
Extending the LWS Data Environment: Distributed Data Processing and Analysis
NASA Technical Reports Server (NTRS)
Narock, Thomas
2005-01-01
The final stages of this work saw changes to the original framework, as well as the completion and integration of several data processing services. Initially, it was thought that a peer-to-peer architecture was necessary to make this work possible. The peer-to-peer architecture provided many benefits, including the dynamic discovery of new services as they were continually added. A prototype example was built and, while it showed promise, a major disadvantage was that it was not easily integrated into the existing data environment. While the peer-to-peer system worked well for finding and accessing distributed data processing services, its use was limited by the difficulty in calling it from existing tools and services. After collaborations with members of the data community, it was determined that our data processing system was of high value and that a new interface should be pursued in order for the community to take full advantage of it. As such, the framework was modified from a peer-to-peer architecture to a more traditional web service approach. Following this change, multiple data processing services were added, including coordinate transformations and subsetting of data. A collaboration with the Virtual Heliospheric Observatory (VHO) assisted with integrating the new architecture into the VHO. This allows anyone using the VHO to search for data and then pass that data through our processing services prior to downloading it. As a second demonstration of the new system, a collaboration was established with the Collaborative Sun Earth Connector (CoSEC) group at Lockheed Martin. This group is working on a graphical user interface to the Virtual Observatories and data processing software. The intent is to provide a high-level, easy-to-use graphical interface that allows access to the existing Virtual Observatories and data processing services from one convenient application.
Working with the CoSEC group we provided access to our data processing tools from within their software. This now allows the CoSEC community to take advantage of our services and also demonstrates another means of accessing our system.
Biomechanical Analysis of Force Distribution in Human Finger Extensor Mechanisms
Hu, Dan; Ren, Lei; Howard, David; Zong, Changfu
2014-01-01
The complexities of the function and structure of human fingers have long been recognised. The in vivo forces in the human finger tendon network during different activities are critical information for clinical diagnosis, surgical treatment, prosthetic finger design, and biomimetic hand development. In this study, we propose a novel method for in vivo force estimation for the finger tendon network by combining a three-dimensional motion analysis technique and a novel biomechanical tendon network model. The extensor mechanism of a human index finger is represented by an interconnected tendinous network moving around the phalanx's dorsum. A novel analytical approach based on the “Principle of Minimum Total Potential Energy” is used to calculate the forces and deformations throughout the tendon network of the extensor mechanism when subjected to an external load and with the finger posture defined by measurement data. The predicted deformations and forces in the tendon network are in broad agreement with the results obtained by previous experimental in vitro studies. The proposed methodology provides a promising tool for investigating the biomechanical function of complex interconnected tendon networks in vivo. PMID:25126576
Distributional Benefit Analysis of a National Air Quality Rule
Post, Ellen S.; Belova, Anna; Huang, Jin
2011-01-01
Under Executive Order 12898, the U.S. Environmental Protection Agency (EPA) must perform environmental justice (EJ) reviews of its rules and regulations. EJ analyses address the hypothesis that environmental disamenities are experienced disproportionately by poor and/or minority subgroups. Such analyses typically use communities as the unit of analysis. While community-based approaches make sense when considering where polluting sources locate, they are less appropriate for national air quality rules affecting many sources and pollutants that can travel thousands of miles. We compare exposures and health risks of EJ-identified individuals rather than communities to analyze EPA’s Heavy Duty Diesel (HDD) rule as an example national air quality rule. Air pollutant exposures are estimated within grid cells by air quality models; all individuals in the same grid cell are assigned the same exposure. Using an inequality index, we find that inequality within racial/ethnic subgroups far outweighs inequality between them. We find, moreover, that the HDD rule leaves between-subgroup inequality essentially unchanged. Changes in health risks depend also on subgroups’ baseline incidence rates, which differ across subgroups. Thus, health risk reductions may not follow the same pattern as reductions in exposure. These results are likely representative of other national air quality rules as well. PMID:21776207
Distributional benefit analysis of a national air quality rule.
Post, Ellen S; Belova, Anna; Huang, Jin
2011-06-01
Under Executive Order 12898, the U.S. Environmental Protection Agency (EPA) must perform environmental justice (EJ) reviews of its rules and regulations. EJ analyses address the hypothesis that environmental disamenities are experienced disproportionately by poor and/or minority subgroups. Such analyses typically use communities as the unit of analysis. While community-based approaches make sense when considering where polluting sources locate, they are less appropriate for national air quality rules affecting many sources and pollutants that can travel thousands of miles. We compare exposures and health risks of EJ-identified individuals rather than communities to analyze EPA's Heavy Duty Diesel (HDD) rule as an example national air quality rule. Air pollutant exposures are estimated within grid cells by air quality models; all individuals in the same grid cell are assigned the same exposure. Using an inequality index, we find that inequality within racial/ethnic subgroups far outweighs inequality between them. We find, moreover, that the HDD rule leaves between-subgroup inequality essentially unchanged. Changes in health risks depend also on subgroups' baseline incidence rates, which differ across subgroups. Thus, health risk reductions may not follow the same pattern as reductions in exposure. These results are likely representative of other national air quality rules as well. PMID:21776207
Advancing Collaborative Climate Studies through Globally Distributed Geospatial Analysis
NASA Astrophysics Data System (ADS)
Singh, R.; Percivall, G.
2009-12-01
(note: acronym glossary at end of abstract) For scientists to have confidence in the veracity of data sets and computational processes not under their control, operational transparency must be much greater than previously required. Having a universally understood and machine-readable language for describing such things as the completeness of metadata, data provenance and uncertainty, and the discrete computational steps in a complex process takes on increased importance. OGC has been involved with technological issues associated with climate change since 2005 when we, along with the IEEE Committee on Earth Observation, began a close working relationship with GEO and GEOSS (http://earthobservations.org). GEO/GEOSS provide the technology platform to GCOS, which in turn represents the earth observation community to the UNFCCC. OGC and IEEE are the organizers of the GEO/GEOSS Architecture Implementation Pilot (see http://www.ogcnetwork.net/AIpilot). This continuing work involves close collaboration with GOOS (Global Ocean Observing System) and WMO (World Meteorological Organization). This session reports on the findings of recent work within the OGC’s community of software developers and users to apply geospatial web services to the climate studies domain. The value of this work is to evolve OGC web services, moving from data access and query to geo-processing and workflows. Two projects will be described, the GEOSS AIP-2 and the CCIP. AIP is a task of the GEOSS Architecture and Data Committee. During its duration, two GEO Tasks defined the project: AIP-2 began as GEO Task AR-07-02, to lead the incorporation of contributed components consistent with the GEOSS Architecture using a GEO Web Portal and a Clearinghouse search facility to access services through GEOSS Interoperability Arrangements in support of the GEOSS Societal Benefit Areas.
AIP-2 concluded as GEO Task AR-09-01b, to develop and pilot new process and infrastructure components for the GEOSS Common Infrastructure and the broader GEOSS architecture. Of specific interest to this session is the work on geospatial workflows and geo-processing and data discovery and access. CCIP demonstrates standards-based interoperability between geospatial applications in the service of Climate Change analysis. CCIP is planned to be a yearly exercise. It consists of a network of online data services (WCS, WFS, SOS), analysis services (WPS, WCPS, WMS), and clients that exercise those services. In 2009, CCIP focuses on Australia, and the initial application of existing OGC services to climate studies. The results of the 2009 CCIP will serve as requirements for more complex geo-processing services to be developed for CCIP 2010. The benefits of CCIP include accelerating the implementation of the GCOS, and building confidence that implementations using multi-vendor interoperable technologies can help resolve vexing climate change questions. AIP-2: Architecture Implementation Pilot, Phase 2 CCIP: Climate Challenge Integration Plugfest GEO: Group on Earth Observations GEOSS: Global Earth Observing System of Systems GCOS: Global Climate Observing System OGC: Open Geospatial Consortium SOS: Sensor Observation Service WCS: Web Coverage Service WCPS: Web Coverage Processing Service WFS: Web Feature Service WMS: Web Mapping Service
Hyperdimensional Analysis of Amino Acid Pair Distributions in Proteins
Henriksen, Svend B.; Arnason, Omar; Söring, Jón; Petersen, Steffen B.
2011-01-01
Our manuscript presents a novel approach to protein structure analyses. We have organized an 8-dimensional data cube with protein 3D-structural information from 8706 high-resolution non-redundant protein-chains with the aim of identifying packing rules at the amino acid pair level. The cube contains information about amino acid type, solvent accessibility, spatial and sequence distance, secondary structure and sequence length. We are able to pose structural queries to the data cube using the program ProPack. The response is a 1, 2 or 3D graph. Whereas the response is of a statistical nature, the user can obtain an instant list of all PDB-structures where such a pair is found. The user may select a particular structure, which is displayed highlighting the pair in question. The user may pose millions of different queries and for each one he will receive the answer in a few seconds. In order to demonstrate the capabilities of the data cube as well as the programs, we have selected well known structural features, disulphide bridges and salt bridges, where we illustrate how the queries are posed, and how answers are given. Motifs involving cysteines such as disulphide bridges, zinc-fingers and iron-sulfur clusters are clearly identified and differentiated. ProPack also reveals that whereas pairs of Lys residues virtually never appear in close spatial proximity, pairs of Arg are abundant and appear at close spatial distance, contrasting the belief that electrostatic repulsion would prevent this juxtaposition and that Arg-Lys is perceived as a conservative mutation. The presented programs can find and visualize novel packing preferences in protein structures, allowing the user to unravel correlations between pairs of amino acids. The new tools allow the user to view statistical information and visualize instantly the structures that underpin the statistical information, which is far from trivial with most other software tools for protein structure analysis. PMID:22174733
Evaluating Domestic Hot Water Distribution System Options with Validated Analysis Models
E. Weitzel and M. Hoeschele
2014-09-01
A developing body of work is forming that collects data on domestic hot water consumption, water use behaviors, and energy efficiency of various distribution systems. A full distribution system model developed in the Transient System Simulation Tool (TRNSYS) has been validated using field monitoring data and then exercised in a number of climates to understand climate impact on performance. In this study, the Building America team built upon previous analysis modeling work to evaluate differing distribution systems and the sensitivities of water heating energy and water use efficiency to variations in climate, load, distribution type, insulation, and compact plumbing practices. Overall, 124 different TRNSYS models were simulated. The results of this work are useful in informing future development of water heating best practices guides as well as more accurate (and simulation-time-efficient) distribution models for annual whole house simulation programs.
Rees, T.F.
1990-01-01
Photon correlation spectroscopy (PCS) utilizes the Doppler frequency shift of photons scattered off particles undergoing Brownian motion to determine the size of colloids suspended in water. Photosedimentation analysis (PSA) measures the time-dependent change in optical density of a suspension of colloidal particles undergoing centrifugation. A description of both techniques, important underlying assumptions, and limitations are given. Results for a series of river water samples show that the colloid-size distribution means are statistically identical as determined by both techniques. This also is true of the mass median diameter (MMD), even though MMD values determined by PSA are consistently smaller than those determined by PCS. Because of this small negative bias, the skew parameters for the distributions are generally smaller for the PCS-determined distributions than for the PSA-determined distributions. Smaller polydispersity indices for the distributions are also determined by PCS. -from Author
Abanto-Valle, C. A.; Bandyopadhyay, D.; Lachos, V. H.; Enriquez, I.
2009-01-01
A Bayesian analysis of stochastic volatility (SV) models using the class of symmetric scale mixtures of normal (SMN) distributions is considered. In the face of non-normality, this provides an appealing robust alternative to the routine use of the normal distribution. Specific distributions examined include the normal, Student-t, slash and the variance gamma distributions. Using a Bayesian paradigm, an efficient Markov chain Monte Carlo (MCMC) algorithm is introduced for parameter estimation. Moreover, the mixing parameters obtained as a by-product of the scale mixture representation can be used to identify outliers. The methods developed are applied to analyze daily stock returns data on the S&P500 index. Bayesian model selection criteria as well as out-of-sample forecasting results reveal that the SV models based on heavy-tailed SMN distributions provide significant improvement in model fit as well as prediction to the S&P500 index data over the usual normal model. PMID:20730043
Analysis of the 3D distribution of stacked self-assembled quantum dots by electron tomography
2012-01-01
The 3D distribution of self-assembled stacked quantum dots (QDs) is a key parameter to obtain the highest performance in a variety of optoelectronic devices. In this work, we have measured this distribution in 3D using a combined procedure of needle-shaped specimen preparation and electron tomography. We show that conventional 2D measurements of the distribution of QDs are not reliable, and only 3D analysis allows an accurate correlation between the growth design and the structural characteristics. PMID:23249477
Analysis and synthesis of distributed-lumped-active networks by digital computer
NASA Technical Reports Server (NTRS)
1973-01-01
The use of digital computational techniques in the analysis and synthesis of DLA (distributed lumped active) networks is considered. This class of networks consists of three distinct types of elements, namely, distributed elements (modeled by partial differential equations), lumped elements (modeled by algebraic relations and ordinary differential equations), and active elements (modeled by algebraic relations). Such a characterization is applicable to a broad class of circuits, especially including those usually referred to as linear integrated circuits, since the fabrication techniques for such circuits readily produce elements which may be modeled as distributed, as well as the more conventional lumped and active ones.
Powerlaw: a Python package for analysis of heavy-tailed distributions.
Alstott, Jeff; Bullmore, Ed; Plenz, Dietmar
2014-01-01
Power laws are theoretically interesting probability distributions that are also frequently used to describe empirical data. In recent years, effective statistical methods for fitting power laws have been developed, but appropriate use of these techniques requires significant programming and statistical insight. In order to greatly decrease the barriers to using good statistical methods for fitting power law distributions, we developed the powerlaw Python package. This software package provides easy commands for basic fitting and statistical analysis of distributions. Notably, it also seeks to support a variety of user needs by being exhaustive in the options available to the user. The source code is publicly available and easily extensible. PMID:24489671
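The powerlaw package described above exposes this kind of fit through `powerlaw.Fit(data)`; as a dependency-free sketch of the core estimator such a package automates, the continuous maximum-likelihood exponent of Clauset et al. can be computed directly (the helper name and the synthetic Pareto sample below are our own illustration):

```python
import numpy as np

def fit_power_law_alpha(x, xmin):
    """Continuous MLE for the power-law exponent:
    alpha = 1 + n / sum(ln(x_i / xmin)), over the tail x >= xmin,
    with standard error (alpha - 1) / sqrt(n)."""
    tail = x[x >= xmin]
    n = len(tail)
    alpha = 1.0 + n / np.sum(np.log(tail / xmin))
    sigma = (alpha - 1.0) / np.sqrt(n)
    return alpha, sigma

# synthetic Pareto(alpha = 2.5) sample via inverse-CDF sampling
rng = np.random.default_rng(42)
xmin = 1.0
u = rng.uniform(size=50_000)
x = xmin * u ** (-1.0 / (2.5 - 1.0))
alpha, sigma = fit_power_law_alpha(x, xmin)
```

The package additionally estimates `xmin` itself and compares candidate distributions via likelihood ratios, which is where most of the "statistical insight" the abstract mentions is encoded.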
Particle size distribution analysis for the rapid detection of microbial infection of urine.
Dow, C S; France, A D; Khan, M S; Johnson, T
1979-01-01
The accuracy and practicality of particle size distribution analysis for rapid screening of urine specimens are assessed. Six hundred urines were subjected simultaneously to routine bacteriological examinations and particle size distribution analysis using a Coulter Counter (ZBI) linked to a C1000 Channelyzer. There was complete agreement in the results of 593 (98.8%) specimens. Characteristic profiles of various bacterial species in infected specimens were consistently obtained. This system can easily be linked to any existing computer reporting in a district hospital laboratory, and the results of negative specimens (70–80%) can be obtained within 5–10 minutes. PMID:376561
NASA Astrophysics Data System (ADS)
Ogawa, Yutaro; Ikeda, Akira; Kotani, Kiyoshi; Jimbo, Yasuhiko
In this study, we propose an EEG phase synchronization analysis that includes not only the average strength of synchronization but also its distribution and directions, under conditions in which emotion is evoked by musical stimuli. The experiment is performed with two different musical stimuli that evoke happiness or sadness for 150 seconds. It is found that while the average strength of synchronization indicates no difference between the right and left sides of the frontal lobe during the happy stimulus, the distribution and directions indicate significant differences. Therefore, the proposed analysis is useful for detecting emotional condition because it provides information that cannot be obtained from the average strength of synchronization alone.
Shabani, Farzin; Kumar, Lalit
2014-01-01
Using CLIMEX and the Taguchi Method, a process-based niche model was developed to estimate potential distributions of Phoenix dactylifera L. (date palm), an economically important crop in many countries. Development of the model was based on both its native and invasive distribution and validation was carried out in terms of its extensive distribution in Iran. To identify model parameters having greatest influence on distribution of date palm, a sensitivity analysis was carried out. Changes in suitability were established by mapping of regions where the estimated distribution changed with parameter alterations. This facilitated the assessment of certain areas in Iran where parameter modifications impacted the most, particularly in relation to suitable and highly suitable locations. Parameter sensitivities were also evaluated by the calculation of area changes within the suitable and highly suitable categories. The low temperature limit (DV2), high temperature limit (DV3), upper optimal temperature (SM2) and high soil moisture limit (SM3) had the greatest impact on sensitivity, while other parameters showed relatively less sensitivity or were insensitive to change. For an accurate fit in species distribution models, highly sensitive parameters require more extensive research and data collection methods. Results of this study demonstrate a more cost effective method for developing date palm distribution models, an integral element in species management, and may prove useful for streamlining requirements for data collection in potential distribution modeling for other species as well. PMID:24722140
An analysis of the size distribution of Italian firms by age
NASA Astrophysics Data System (ADS)
Cirillo, Pasquale
2010-02-01
In this paper we analyze the size distribution of Italian firms by age. In other words, we want to establish whether the way that the size of firms is distributed varies as firms become old. As a proxy of size we use capital. In [L.M.B. Cabral, J. Mata, On the evolution of the firm size distribution: Facts and theory, American Economic Review 93 (2003) 1075-1090], the authors study the distribution of Portuguese firms and find that, while the size distribution of all firms is fairly stable over time, the distributions of firms by age groups are appreciably different. In particular, as the age of the firms increases, their size distribution on the log scale shifts to the right, the left tail becomes thinner and the right tail thicker, with a clear decrease of the skewness. In this paper, we perform a similar analysis with Italian firms using the CEBI database, also considering firms’ growth rates. Although there are several papers dealing with Italian firms and their size distribution, to our knowledge a similar study concerning size and age has not yet been performed for Italy, especially with such a large panel.
NASA Astrophysics Data System (ADS)
Kaźmierczak, Bartosz; Kotowski, Andrzej
2015-06-01
The paper is a methodological extension of the current description of maximum precipitation amounts on the basis of gamma, Gumbel, lognormal or Weibull distributions to newly developed theoretical distributions, namely, the two-parameter (GED2) and three-parameter (GED3) generalized exponential distributions. The verification is carried out on the basis of meteorological data from the Wroclaw-Strachowice meteorological station of the Institute of Meteorology and Water Management from the years 1960-2009.
Doris, E.; Krasko, V.A.
2012-10-01
State and local policymakers show increasing interest in spurring the development of customer-sited distributed generation (DG), in particular solar photovoltaic (PV) markets. Prompted by that interest, this analysis examines the use of state policy as a tool to support the development of a robust private investment market. This analysis builds on previous studies that focus on government subsidies to reduce installation costs of individual projects and provides an evaluation of the impacts of policies on stimulating private market development.
Fuzzy Cluster Analysis of Regional City Multilevel Logistics Distribution Center Location Plan
Yong-chang Ren; Tao Xing; Qiang Quan; Guo-qiang Zhao
Establishing multi-level logistics distribution centers in the cities of a region plays an important role in improving flow efficiency and economic benefits, enhancing regional competitiveness, and promoting the rational allocation and effective use of regional resources. This article uses the fuzzy clustering analysis method for this study. It first discusses the preparations for fuzzy clustering analysis, including the construction of an indicator system of influencing factors,
Pagonis, V; Kitis, G
2002-01-01
This paper explores the possibility of using commercial software for thermoluminescence glow curve deconvolution (GCD) analysis. The program PEAKFIT has been used to perform GCD analysis of complex glow curves of quartz and dosimetric materials. First-order TL peaks were represented successfully using the Weibull distribution function. Second-order and general-order TL peaks were represented accurately by using the Logistic asymmetric functions with varying symmetry parameters. Analytical expressions were derived for determining the energy E from the parameters of the Logistic asymmetric functions. The accuracy of these analytical expressions for E was tested for a wide variety of kinetic parameters and was found to be comparable to the commonly used expressions in the TL literature. The effectiveness of fit of the analytical functions used here was tested using the figure of merit (FOM) and was found to be comparable to the accuracy of recently published GCD expressions for first- and general-order kinetics. PMID:12382713
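The figure of merit (FOM) used above to judge glow-curve deconvolution quality is conventionally the sum of absolute residuals relative to the fitted glow-curve area. A minimal sketch (the function name and the Gaussian toy peak standing in for a measured TL peak are our own illustration):

```python
import numpy as np

def figure_of_merit(y_obs, y_fit):
    """FOM (%) for glow-curve deconvolution: sum of absolute residuals
    divided by the area of the fitted curve."""
    y_obs = np.asarray(y_obs, dtype=float)
    y_fit = np.asarray(y_fit, dtype=float)
    return 100.0 * np.abs(y_obs - y_fit).sum() / y_fit.sum()

# toy glow peak over a temperature axis (K)
T = np.linspace(300, 500, 401)
y = np.exp(-((T - 400.0) / 25.0) ** 2)
fom_exact = figure_of_merit(y, y)         # perfect fit
fom_off = figure_of_merit(y, 1.02 * y)    # fit uniformly 2% high
```

Values of a few percent or less are typically reported as acceptable fits in the TL literature.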
When the Tail Counts: The Advantage of Bilingualism Through the Ex-Gaussian Distribution Analysis
Calabria, Marco; Hernández, Mireia; Martin, Clara D.; Costa, Albert
2011-01-01
Several studies have documented the advantage of bilingualism with respect to the development of the executive control (EC) system. Two effects of bilingualism have been described in conflict resolution tasks: (a) bilinguals tend to perform the tasks faster overall, and (b) bilinguals tend to experience less interference from conflicting information, compared to monolinguals. The precise way in which the bilingual advantage relies on different EC mechanisms is still not well understood. The goal of the present article is to further explore how bilingualism impacts the EC system by performing a new analysis (Ex-Gaussian) of already reported data in which bilinguals and monolinguals performed a flanker task. Ex-Gaussian distribution analysis allows us to partial out the contribution of the normal and the exponential components of the RT distribution of the two groups. The fit of the raw data to the ex-Gaussian distribution showed two main results. First, we found that the bilingualism advantage in the overall speed of processing is captured by group differences in the normal (μ) and the exponential (τ) components of the distribution. Second, the bilingual advantage in the magnitude of the conflict effect is captured by group differences only in the exponential component. The results are discussed in terms of: (a) usefulness of the ex-Gaussian analysis as a tool to better describe the RT distribution, and (b) a new approach to explore the cognitive processes purportedly involved in instantiating the bilingualism advantage with respect to EC. PMID:22007182
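The ex-Gaussian used above is the sum of a normal (μ, σ) and an exponential (τ) component, so its first three moments recover all three parameters: mean = μ + τ, variance = σ² + τ², skewness = 2τ³/(σ² + τ²)^{3/2}. A method-of-moments sketch (the function name and synthetic reaction times are our own illustration; the paper itself fits by other means):

```python
import numpy as np

def exgauss_moments(rt):
    """Method-of-moments ex-Gaussian fit: recover mu, sigma (normal part)
    and tau (exponential part) from mean, variance and skewness."""
    m = rt.mean()
    v = rt.var()
    skew = ((rt - m) ** 3).mean() / v ** 1.5
    tau = np.sqrt(v) * (skew / 2.0) ** (1.0 / 3.0)
    mu = m - tau
    sigma = np.sqrt(max(v - tau ** 2, 0.0))
    return mu, sigma, tau

# synthetic reaction times (ms): normal(400, 40) plus exponential(tau = 100)
rng = np.random.default_rng(7)
rt = rng.normal(400.0, 40.0, 100_000) + rng.exponential(100.0, 100_000)
mu, sigma, tau = exgauss_moments(rt)
```

Maximum-likelihood fitting is preferred for small samples, but the moment solution makes clear why the exponential tail (τ) can carry group differences that the overall mean hides.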
Energy Consumption in Data Analysis for On-board and Distributed Applications
Kargupta, Hillol
Energy consumption is an important issue in the growing number of on-board and distributed data mining and machine learning applications. To our knowledge, there does not exist a study on the energy requirements of data mining algorithms.
Models for HLI analysis of power system with offshore wind farms and distributed generation
Bak-Jensen, Birgitte
2008
In Proc. of 7th International Workshop on Large-Scale
Shlyakhter, Ilya
Uncertainty Analysis of Multiple Epidemiological Studies Using Frequency Distributions of Relative Risk
An approach to the uncertainty analysis of multiple epidemiologic studies of the same outcome is suggested. A set of 95% confidence intervals for relative risk (RR) is used to characterize the evidence of elevated risk in observational studies.
GenoMap: A Distributed System for Unifying Genotyping and Genetic Linkage Analysis
Casavant, Tom
Scheetz, Todd E. (Dept. of Electrical and Computer Engineering and Dept. of Genetics, University of Iowa)
The data to be managed include: informative sets of polymorphic markers; and databases of patient demographic and pedigree information.
Technology Transfer Automated Retrieval System (TEKTRAN)
Starch was isolated from flour of four wheats representing hard red winter (Karl), hard red spring (Gunner), durum (Belfield 3), and spelt (WK 86035-8) wheat classes. Digital image analysis (IA) coupled to light microscopy was used to determine starch size distributions where the volume of granules...
An investigation on the intra-sample distribution of cotton color by using image analysis
Technology Transfer Automated Retrieval System (TEKTRAN)
The colorimeter principle is widely used to measure cotton color. This method provides the sample’s color grade; but the result does not include information about the color distribution and any variation within the sample. We conducted an investigation that used image analysis method to study the ...
Systematic analysis of transverse momentum distribution and non-extensive thermodynamics theory
Sena, I.; Deppman, A.
2013-03-25
A systematic analysis of transverse momentum distribution of hadrons produced in ultrarelativistic p+p and A+A collisions is presented. We investigate the effective temperature and the entropic parameter from the non-extensive thermodynamic theory of strong interaction. We conclude that the existence of a limiting effective temperature and of a limiting entropic parameter is in accordance with experimental data.
A Study of Optimal Mean Photon Number Analysis in Quantum Key Distribution (QKD)
L. I. A. Ghazali; W. A. W. Adnan; M. Mokhtar; A. F. Abas; M. A. Mahdi
2008-01-01
This paper presents a study of optimal mean photon number (mu) analysis in quantum key distribution (QKD) experiments. The rationale of this study is to analyze the eavesdropping technology assumptions regarding the optimal mean photon number. This work limits its scope only on the fiber-based QKD implementation.
A Convergence Analysis of Distributed Dictionary Learning Based on the K-SVD Algorithm
Bajwa, Waheed U.
Abstract: This paper provides a convergence analysis of a recent distributed algorithm for dictionary learning, termed cloud K-SVD [5]. As part of the main contribution, note that cloud K-SVD relies on the power method for computing dominant singular vectors.
Analysis of on-chip inductance effects for distributed RLC interconnects
Kaustav Banerjee; Amit Mehrotra
2002-01-01
This paper introduces an accurate analysis of on-chip inductance effects for distributed RLC interconnects that takes the effect of both the series resistance and the output parasitic capacitance of the driver into account. Using rigorous first principle calculations, accurate expressions for the transfer function of these lines and their time-domain response have been presented for the first time. Using these,
Short, Daniel
2007-01-01
Particle size and composition distribution analysis of automotive brake abrasion dusts for the evaluation of antimony sources of airborne particulate matter. Atmospheric Environment 41 (2007) 4908-4919. Akihiro ..., ... University, 1-13-27 Kasuga, Bunkyo-ku, Tokyo 112-8551, Japan; Akebono Brake Industry Co., Ltd.
Pike, Linda J.
Oligomerization of the EGF Receptor Investigated by Live Cell Fluorescence Intensity Distribution Analysis. Saveez Saffarian, Yu Li, Elliot L. Elson, and Linda J. Pike, Department of Cell Biology. GFP was stably expressed in CHO cells and studied using fluorescence correlation spectroscopy and fluorescence intensity distribution analysis.
Data Reduction for the Scalable Automated Analysis of Distributed Darknet Traffic
Jahanian, Farnam
Data Reduction for the Scalable Automated Analysis of Distributed Darknet Traffic. Michael Bailey. Unused address blocks (or darknets) are combined with forensic honeypots (or honeyfarms). In this paper we examine such hybrid systems. We show that individual darknets are dominated by a small number of sources repeating ...
Distributed Sensor Analysis for Fault Detection in Tightly-Coupled Multi-Robot Team Tasks
Parker, Lynne E.
Distributed Sensor Analysis for Fault Detection in Tightly-Coupled Multi-Robot Team Tasks Xingyan Li and Lynne E. Parker Proc. of IEEE International Conference on Robotics and Automation, Kobe, Japan multi-robot team tasks. While the centralized version of SAFDetection was shown to be successful
Analysis of steep-fronted voltage distribution and turn insulation failure in inverter fed AC motor
Yifan Tang
1997-01-01
As a potential source of turn insulation failure, the steep-fronted voltage at the AC motor terminal generated by PWM drive switching or cable ringing is not uniformly distributed among the coils of the windings and among the turns of the coils. This paper presents a detailed electromagnetic analysis of the parameters of a transient model of coil and coil group
Mechanical Response of Silk Crystalline Units from Force-Distribution Analysis
Gräter, Frauke
Senbo Xiao, Wolfram .... The strength of silk fibers is thought to be caused by embedded crystalline units acting as cross-links of silk proteins; force-distribution analysis of these crystals opens up the road to predicting full fiber mechanics.
M. Prado; J. Reina-Tosina; L. Roa
2002-01-01
A novel approach for the detection of falls and the analysis of body postures, mobility, and metabolic energy expenditure of elderly people has been developed. It is based on a distributed intelligence architecture, supported by a wireless personal area network (WPAN), which allows full 24-hour supervision of the user, both inside and outside the home. An open design methodology lets the
MAINE GAP ANALYSIS VERTEBRATE DATA -PART II: DISTRIBUTION, HABITAT RELATIONS, AND STATUS OF
Boone, Randall B.
MAINE GAP ANALYSIS VERTEBRATE DATA - PART II: DISTRIBUTION, HABITAT RELATIONS, AND STATUS OF BREEDING BIRDS IN MAINE Randall B. Boonea Department of Wildlife Ecology and Maine Cooperative Fish and Wildlife Research Unit University of Maine, Orono, ME 04469-5755 and William B. Krohn USGS Biological
MAINE GAP ANALYSIS VERTEBRATE DATA -PART I: DISTRIBUTION, HABITAT RELATIONS, AND STATUS OF
Boone, Randall B.
MAINE GAP ANALYSIS VERTEBRATE DATA - PART I: DISTRIBUTION, HABITAT RELATIONS, AND STATUS OF AMPHIBIANS, REPTILES AND MAMMALS IN MAINE Randall B. Boonea Department of Wildlife Ecology and Maine Cooperative Fish and Wildlife Research Unit University of Maine, Orono, ME 04469-5755 and William B. Krohn
Analysis of the Arrival Time of Successive Air Showers by Using the Erlang Distribution
NASA Astrophysics Data System (ADS)
Kudo, M.; Takahashi, N.
We analyze the arrival times of air showers using the Hirosaki AS Arrays. This array consists of 5 scintillation detectors with GPS antennas for arrival times. We use the Erlang distribution: the number of air showers observed within short time windows is analyzed by using the arrival-time difference of k successive air showers. We report the results of the analysis.
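The Erlang connection used above can be checked with a short simulation: for a Poisson stream, the waiting time spanning k successive events is Erlang(k, λ) distributed. This is an illustrative sketch, not the authors' code; the rate, k, and sample size are arbitrary choices.

```python
import math
import random

def erlang_pdf(t, k, lam):
    """Erlang(k, lam) density: waiting time for the k-th event of a
    Poisson process with rate lam."""
    return lam ** k * t ** (k - 1) * math.exp(-lam * t) / math.factorial(k - 1)

def k_event_gaps(arrivals, k):
    """Arrival-time differences spanning k successive events."""
    return [arrivals[i + k] - arrivals[i] for i in range(len(arrivals) - k)]

# Simulate a Poisson stream (exponential inter-arrival times).
random.seed(42)
lam, k, n = 2.0, 3, 200_000
t, arrivals = 0.0, []
for _ in range(n):
    t += random.expovariate(lam)
    arrivals.append(t)

gaps = k_event_gaps(arrivals, k)
mean_gap = sum(gaps) / len(gaps)   # Erlang(k, lam) mean is k / lam
```

The simulated mean gap should sit close to k/λ, which is the signature one would look for in the observed air-shower time differences.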
Basic Block Distribution Analysis to Find Periodic Behavior and Simulation Points in Applications
Timothy Sherwood; Erez Perelman; Brad Calder
2001-01-01
Abstract: Modern architecture research relies heavily on detailed pipeline simulation. Simulating the full execution of an industry standard benchmark can take weeks to months to complete. To overcome this problem researchers choose a very small portion of a program's execution to evaluate their results, rather than simulating the entire program. In this paper we propose Basic Block Distribution Analysis as
Holistic scheduling and analysis of mixed time/event-triggered distributed embedded systems
Traian Pop; Petru Eles; Zebo Peng
2002-01-01
This paper deals with specific issues related to the design of distributed embedded systems implemented with mixed, event-triggered and time-triggered task sets, which communicate over bus protocols consisting of both static and dynamic phases. Such systems are emerging as the new standard for automotive applications. We have developed a holistic timing analysis and scheduling approach for this category of systems.
SPE 167844. Geographically-Distributed Databases: A Big Data Technology for Production Analysis. This paper brings advances in the scientific field of "big data" to the world of the Oil & Gas upstream industry, complementing the off-the-shelf IT technologies currently employed in the data management of Oil & Gas production operations. Most current
Exploratory Data Analysis to Identify Factors Influencing Spatial Distributions of Weed Seed Banks
Technology Transfer Automated Retrieval System (TEKTRAN)
Comparing distributions of different species in multiple fields will help us understand the spatial dynamics of weed seed banks, but analyzing observational data requires non-traditional statistical methods. We used classification and regression tree analysis (CART) to investigate factors that influ...
Stone, J. V.
Statistical analysis of the distribution of gold particles over antigen sites after immunogold labelling. C. A. Glasbey, Biomathematics and Statistics Scotland, JCMB, King's Buildings, Edinburgh, EH9 3JZ. The antigen sites to which antibodies are attached are assumed to follow a Poisson process, and one gold particle sits on top of each.
Complex Wishart Distribution Based Analysis of Polarimetric Synthetic Aperture Radar Data
Allan A. ..., Denmark. Email: kc@imm.dtu.dk. Abstract: Being independent of cloud cover, multi-look polarimetric synthetic aperture radar (SAR) data hold a strong potential for, e.g., change detection.
MIXTURE PRINCIPAL COMPONENT ANALYSIS FOR DISTRIBUTION VOLUME PARAMETRIC IMAGING IN BRAIN PET STUDIES
Liu, K. J. Ray
A mixture principal component analysis is applied to positron emission tomography (PET) data in brain studies. The parameters of the probabilistic mixture model are determined from the data and the input function. The efficiency and superiority of the proposed scheme are demonstrated on real brain PET data.
The Asymptotic Normal Distribution of Estimators in Factor Analysis under General Conditions
T. W. Anderson; Yasuo Amemiya
1988-01-01
Asymptotic properties of estimators for the confirmatory factor analysis model are discussed. The model is identified by restrictions on the elements of the factor loading matrix; the number of restrictions may exceed that required for identification. It is shown that a particular centering of the maximum likelihood estimator derived under assumed normality of observations yields an asymptotic normal distribution that
Estimating Flexible Distributions of Ideal-Points with External Analysis of Preferences.
ERIC Educational Resources Information Center
Kamakura, Wagner A.
1991-01-01
Two ideal-point probabilistic choice models from the external analysis of preferences are presented that allow for more flexible distributions of ideal-points. The first extends the probit model of W. Kamakura and R. Srivastava. The second is based on simplifying assumptions that lead to a multidimensional histogram of ideal-points. (SLD)
Postmortem Analysis of Neuron Distributions in the Locus Coeruleus of Alcoholics and Suicidal Subjects. Neuron distributions are analysed in the human locus coeruleus (LC) in four groups of subjects: controls, suicidal non-alcoholics, non-suicidal alcoholics, and suicidal alcoholics. The data consist of postmortem neuron measurements, counts, and spatial locations.
Automated local bright feature image analysis of nuclear protein distribution identifies
Knowles, David William
(March 3, 2005.) The organization of nuclear proteins is linked to cell and tissue phenotypes; the cell nucleus appears to play a central role in directing nuclear functions necessary for cell function. When cells
Peter Schuck
2000-01-01
A new method for the size-distribution analysis of polymers by sedimentation velocity analytical ultracentrifugation is described. It exploits the ability of Lamm equation modeling to discriminate between the spreading of the sedimentation boundary arising from sample heterogeneity and from diffusion. Finite element solutions of the Lamm equation for a large number of discrete noninteracting species are combined with maximum entropy
Analysis of multiconfigurational wave functions in terms of hole-particle distributions
A. V. ... Received 2005; accepted 19 April 2006; published online 14 June 2006. A detailed study of hole-particle distributions is presented for a system of holes and particles. Particular attention is focused on the description of mixed hole-particle distributions.
NASA Technical Reports Server (NTRS)
Handley, Thomas H., Jr.; Collins, Donald J.; Doyle, Richard J.; Jacobson, Allan S.
1991-01-01
Viewgraphs on DataHub knowledge-based assistance for science visualization and analysis using large distributed databases. Topics covered include: DataHub functional architecture; data representation; logical access methods; preliminary software architecture; LinkWinds; data knowledge issues; expert systems; and data management.
Mächtle, W
1999-02-01
Sedimentation velocity is a powerful tool for the analysis of complex solutions of macromolecules. However, sample turbidity imposes an upper limit to the size of molecular complexes currently amenable to such analysis. Furthermore, the breadth of the particle size distribution, combined with possible variations in the density of different particles, makes it difficult to analyze extremely complex mixtures. These same problems are faced in the polymer industry, where dispersions of latices, pigments, lacquers, and emulsions must be characterized. There is a rich history of methods developed for the polymer industry finding use in the biochemical sciences. Two such methods are presented. These use analytical ultracentrifugation to determine the density and size distributions for submicron-sized particles. Both methods rely on Stokes' equations to estimate particle size and density, whereas turbidity, corrected using Mie's theory, provides the concentration measurement. The first method uses the sedimentation time in dispersion media of different densities to evaluate the particle density and size distribution. This method works provided the sample is chemically homogeneous. The second method splices together data gathered at different sample concentrations, thus permitting the high-resolution determination of the size distribution of particle diameters ranging from 10 to 3000 nm. By increasing the rotor speed exponentially from 0 to 40,000 rpm over a 1-h period, size distributions may be measured for extremely broadly distributed dispersions. Presented here is a short history of particle size distribution analysis using the ultracentrifuge, along with a description of the newest experimental methods. Several applications of the methods are provided that demonstrate the breadth of their utility, including extensions to samples containing nonspherical and chromophoric particles. PMID:9916040
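The Stokes'-equation step mentioned above can be made concrete with a small calculation relating sedimentation velocity in a centrifugal field to particle diameter. This is only the textbook relation, not the paper's full procedure; every numeric value below is invented for illustration.

```python
import math

def stokes_diameter(v, eta, rho_p, rho_f, omega, r):
    """Spherical-particle diameter (m) from sedimentation velocity in a
    centrifuge, via Stokes' law: v = d^2 (rho_p - rho_f) omega^2 r / (18 eta)."""
    return math.sqrt(18.0 * eta * v / ((rho_p - rho_f) * omega ** 2 * r))

# Illustrative values (not from the paper): a latex particle in water.
rpm = 40_000.0
omega = rpm * 2.0 * math.pi / 60.0   # angular speed, rad/s
eta = 1.0e-3                         # viscosity of water, Pa*s (~20 C)
rho_p, rho_f = 1050.0, 998.0         # particle / fluid density, kg/m^3
r = 0.065                            # radial position in the cell, m
v = 1.0e-6                           # observed boundary velocity, m/s

d = stokes_diameter(v, eta, rho_p, rho_f, omega, r)   # diameter, metres
```

For these assumed values the diameter comes out in the tens of nanometres, i.e. inside the 10-3000 nm window quoted in the abstract.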
Spatial sensitivity analysis of snow cover data in a distributed rainfall-runoff model
NASA Astrophysics Data System (ADS)
Berezowski, T.; Nossent, J.; Chormański, J.; Batelaan, O.
2015-04-01
As the availability of spatially distributed data sets for distributed rainfall-runoff modelling is strongly increasing, more attention should be paid to the influence of the quality of the data on the calibration. While a lot of progress has been made on using distributed data in simulations of hydrological models, sensitivity of spatial data with respect to model results is not well understood. In this paper we develop a spatial sensitivity analysis method for spatial input data (snow cover fraction - SCF) for a distributed rainfall-runoff model to investigate whether the model is differently subjected to SCF uncertainty in different zones of the model. The analysis was focussed on the relation between the SCF sensitivity and the physical and spatial parameters and processes of a distributed rainfall-runoff model. The methodology is tested for the Biebrza River catchment, Poland, for which a distributed WetSpa model is set up to simulate 2 years of daily runoff. The sensitivity analysis uses the Latin-Hypercube One-factor-At-a-Time (LH-OAT) algorithm, which employs different response functions for each spatial parameter representing a 4 × 4 km snow zone. The results show that the spatial patterns of sensitivity can be easily interpreted by co-occurrence of different environmental factors such as geomorphology, soil texture, land use, precipitation and temperature. Moreover, the spatial pattern of sensitivity under different response functions is related to different spatial parameters and physical processes. The results clearly show that the LH-OAT algorithm is suitable for our spatial sensitivity analysis approach and that the SCF is spatially sensitive in the WetSpa model.
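The LH-OAT procedure described above can be sketched in a few lines: Latin-Hypercube sampling supplies stratified base points, and a one-factor-at-a-time loop measures a relative effect around each of them. This is a minimal illustration on a toy linear model, not the WetSpa setup; the response function, perturbation fraction, and sample counts are arbitrary choices.

```python
import random

def latin_hypercube(n, d, rng):
    """n stratified samples in [0,1]^d: one point per stratum per dimension."""
    cols = []
    for _ in range(d):
        col = [(i + rng.random()) / n for i in range(n)]
        rng.shuffle(col)
        cols.append(col)
    return [[cols[j][i] for j in range(d)] for i in range(n)]

def lh_oat(model, n, d, frac, rng):
    """Latin-Hypercube One-factor-At-a-Time sensitivities: at each base
    point, perturb one parameter by `frac`, record the relative change in
    model output, and average the absolute effects per parameter."""
    base_pts = latin_hypercube(n, d, rng)
    effects = [0.0] * d
    for p in base_pts:
        y0 = model(p)
        for j in range(d):
            q = list(p)
            q[j] *= 1.0 + frac
            y1 = model(q)
            denom = 0.5 * (abs(y0) + abs(y1)) or 1.0
            effects[j] += abs(y1 - y0) / denom / frac
    return [e / n for e in effects]

# Toy "runoff" model whose output is dominated by the first factor.
rng = random.Random(0)
model = lambda p: 10.0 * p[0] + 1.0 * p[1] + 0.1 * p[2]
sens = lh_oat(model, n=50, d=3, frac=0.05, rng=rng)
```

In the paper each parameter corresponds to one snow zone; here the averaged effects simply recover the factor ordering built into the toy model.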
[Computer-analysis of volume distribution curves of erythrocytes (author's transl)].
Schäfer, A; Schäfer, W
1980-02-01
The erythrocyte diameter curve of Price-Jones exhibits a symmetrical, normal distribution. In contrast, the volume distribution curve of erythrocytes shows an asymmetrical course with some skewness to the right, which may vary with different haematological diseases. With respect to differential diagnosis as well as to therapy, it is important to have available an objective means of comparing curves with different shapes. We therefore developed a mathematical description of the volume distribution curves, derived from the overlapping of two Gaussian normal distribution curves. Volume distribution curves of erythrocytes were determined with an electronic particle counter (Coulter counter) in 271 healthy and haematologically affected children as well as in 3 adults. With only one exception, all volume distribution curves of erythrocytes could be fitted using computer analysis, and the constants m1, m2, s1 and s2 were calculated. Characterisation of the curves with only these 4 constants allows a simple qualitative as well as quantitative comparison of different volume distribution curves. PMID:7366129
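A minimal sketch of the two-Gaussian overlap model described above, using the paper's constants m1, s1, m2, s2 plus a mixing weight that the abstract does not name and is added here for illustration; the parameter values are invented, not fitted data.

```python
import math

def gauss(v, m, s):
    """Normal density with mean m and standard deviation s."""
    return math.exp(-0.5 * ((v - m) / s) ** 2) / (s * math.sqrt(2.0 * math.pi))

def volume_curve(v, a, m1, s1, m2, s2):
    """Two-component model of the erythrocyte volume distribution:
    a weighted overlap of two Gaussians (weight a on the first)."""
    return a * gauss(v, m1, s1) + (1.0 - a) * gauss(v, m2, s2)

# Illustrative parameters (fl): a narrow main population plus a broader,
# right-shifted one reproduces the right-skewed shape described above.
a, m1, s1, m2, s2 = 0.7, 90.0, 8.0, 110.0, 18.0

curve = [volume_curve(v, a, m1, s1, m2, s2) for v in range(40, 181)]
total = sum(curve)   # ~1.0 on a 1-fl grid: each component integrates to 1
```

The mixture peaks near the narrow component's mean while its overall mean sits to the right of the peak, which is exactly the right-skewness the abstract attributes to the volume curve.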
Efficient, Distributed and Interactive Neuroimaging Data Analysis Using the LONI Pipeline
Dinov, Ivo D.; Van Horn, John D.; Lozev, Kamen M.; Magsipoc, Rico; Petrosyan, Petros; Liu, Zhizhong; MacKenzie-Graham, Allan; Eggert, Paul; Parker, Douglas S.; Toga, Arthur W.
2009-01-01
The LONI Pipeline is a graphical environment for construction, validation and execution of advanced neuroimaging data analysis protocols (Rex et al., 2003). It enables automated data format conversion, allows Grid utilization, facilitates data provenance, and provides a significant library of computational tools. There are two main advantages of the LONI Pipeline over other graphical analysis workflow architectures. It is built as a distributed Grid computing environment and permits efficient tool integration, protocol validation and broad resource distribution. To integrate existing data and computational tools within the LONI Pipeline environment, no modification of the resources themselves is required. The LONI Pipeline provides several types of process submissions based on the underlying server hardware infrastructure. Only workflow instructions and references to data, executable scripts and binary instructions are stored within the LONI Pipeline environment. This makes it portable, computationally efficient, distributed and independent of the individual binary processes involved in pipeline data-analysis workflows. We have expanded the LONI Pipeline (V.4.2) to include server-to-server (peer-to-peer) communication and a 3-tier failover infrastructure (Grid hardware, Sun Grid Engine/Distributed Resource Management Application API middleware, and the Pipeline server). Additionally, the LONI Pipeline provides three layers of background-server executions for all users/sites/systems. These new LONI Pipeline features facilitate resource-interoperability, decentralized computing, construction and validation of efficient and robust neuroimaging data-analysis workflows. Using brain imaging data from the Alzheimer's Disease Neuroimaging Initiative (Mueller et al., 2005), we demonstrate integration of disparate resources, graphical construction of complex neuroimaging analysis protocols and distributed parallel computing. 
The LONI Pipeline, its features, specifications, documentation and usage are available online (http://Pipeline.loni.ucla.edu). PMID:19649168
Study of Solid State Drives performance in PROOF distributed analysis system
NASA Astrophysics Data System (ADS)
Panitkin, S. Y.; Ernst, M.; Petkus, R.; Rind, O.; Wenaus, T.
2010-04-01
Solid State Drives (SSDs) are a promising storage technology for High Energy Physics parallel analysis farms. Their combination of low random access time and relatively high read speed is very well suited for situations where multiple jobs concurrently access data located on the same drive. They also have lower energy consumption and higher vibration tolerance than Hard Disk Drives (HDDs), which makes them an attractive choice in many applications ranging from personal laptops to large analysis farms. The Parallel ROOT Facility (PROOF) is a distributed analysis system which makes it possible to exploit the inherent event-level parallelism of high energy physics data. PROOF is especially efficient together with distributed local storage systems like Xrootd, when data are distributed over computing nodes. In such an architecture the local disk subsystem I/O performance becomes a critical factor, especially when computing nodes use multi-core CPUs. We will discuss our experience with SSDs in the PROOF environment. We will compare the performance of HDDs with SSDs in I/O intensive analysis scenarios. In particular we will discuss PROOF system performance scaling with the number of simultaneously running analysis jobs.
Fractal analysis of the dark matter and gas distributions in the Mare-Nostrum universe
Gaite, José
2010-03-01
We develop a method of multifractal analysis of N-body cosmological simulations that improves on the customary counts-in-cells method by taking special care of the effects of discreteness and large scale homogeneity. The analysis of the Mare-Nostrum simulation with our method provides strong evidence of self-similar multifractal distributions of dark matter and gas, with a halo mass function that is of Press-Schechter type but has a power-law exponent -2, as corresponds to a multifractal. Furthermore, our analysis shows that the dark matter and gas distributions are indistinguishable as multifractals. To determine if there is any gas biasing, we calculate the cross-correlation coefficient, with negative but inconclusive results. Hence, we develop an effective Bayesian analysis connected with information theory, which clearly demonstrates that the gas is biased in a long range of scales, up to the scale of homogeneity. However, entropic measures related to the Bayesian analysis show that this gas bias is small (in a precise sense) and is such that the fractal singularities of both distributions coincide and are identical. We conclude that this common multifractal cosmic web structure is determined by the dynamics and is independent of the initial conditions.
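The cross-correlation step mentioned above can be illustrated in miniature: over a grid of cells, the cross-correlation coefficient of two counts-in-cells fields is an ordinary Pearson correlation. This sketch uses synthetic cell counts for two tracers of one underlying field, not the Mare-Nostrum data, and the noise levels are arbitrary.

```python
import math
import random

def pearson(x, y):
    """Cross-correlation coefficient of two counts-in-cells lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Synthetic counts-in-cells: both tracers sample the same underlying
# "density" per cell, plus independent measurement noise.
rng = random.Random(1)
density = [rng.random() * 100.0 for _ in range(5000)]
dm = [d + rng.gauss(0.0, 5.0) for d in density]    # "dark matter" counts
gas = [d + rng.gauss(0.0, 5.0) for d in density]   # "gas" counts
r = pearson(dm, gas)   # close to, but below, 1 because of the noise
```

As the abstract notes, a coefficient near 1 is inconclusive about bias on its own, which is why the authors turn to a Bayesian, information-theoretic comparison.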
Spatial Latent Class Analysis Model for Spatially Distributed Multivariate Binary Data
Wall, Melanie M.; Liu, Xuan
2009-01-01
A spatial latent class analysis model that extends the classic latent class analysis model by adding spatial structure to the latent class distribution through the use of the multinomial probit model is introduced. Linear combinations of independent Gaussian spatial processes are used to develop multivariate spatial processes that are underlying the categorical latent classes. This allows the latent class membership to be correlated across spatially distributed sites and it allows correlation between the probabilities of particular types of classes at any one site. The number of latent classes is assumed fixed but is chosen by model comparison via cross-validation. An application of the spatial latent class analysis model is shown using soil pollution samples where 8 heavy metals were measured to be above or below government pollution limits across a 25 square kilometer region. Estimation is performed within a Bayesian framework using MCMC and is implemented using the OpenBUGS software. PMID:20161235
Liu, D L; Waag, R C
1997-02-01
The amplitude characteristics of ultrasonic wavefront distortion produced by transmission through the abdominal wall and breast are described. Ultrasonic pulses were recorded in a two-dimensional aperture after transmission through specimens of abdominal wall or breast. After the pulse arrival times were corrected for geometric path differences, the pulses were temporally Fourier transformed and two-dimensional maps of harmonic amplitudes in the measurement aperture were computed. The results indicate that, as the temporal frequency increases, the fluctuation in harmonic amplitudes increases but the spatial scale of the fluctuation decreases. The normalized second-order and third-order moments of the amplitude distribution also increase with temporal frequency. The wide range of variation of these distribution characteristics could not be covered by the Rayleigh, Rician, or K-distribution because of their limited flexibility. However, the Weibull distribution and especially the generalized K-distribution provide better fits to the data. In the fit of the generalized K-distribution, a decrease of its parameter alpha with increasing temporal frequency was observed, as predicted by analysis based on a phase screen model. PMID:9035403
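One simple way to fit a Weibull shape parameter to amplitude data is the moment (coefficient-of-variation) method, sketched below; the paper does not state its fitting procedure, so this is an illustrative stand-in on simulated data, with all numeric choices arbitrary.

```python
import math
import random

def weibull_cv(c):
    """Coefficient of variation of a Weibull with shape c (scale drops out)."""
    g1 = math.gamma(1.0 + 1.0 / c)
    g2 = math.gamma(1.0 + 2.0 / c)
    return math.sqrt(g2 / g1 ** 2 - 1.0)

def shape_from_cv(cv, lo=0.1, hi=50.0, iters=80):
    """Invert weibull_cv by bisection (the CV decreases as c grows)."""
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if weibull_cv(mid) > cv:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Simulated "amplitude" samples from a Weibull(shape=2.2, scale=1.0),
# drawn by inverting the Weibull CDF.
random.seed(7)
c_true, b = 2.2, 1.0
x = [b * (-math.log(1.0 - random.random())) ** (1.0 / c_true)
     for _ in range(50_000)]
m = sum(x) / len(x)
s = math.sqrt(sum((v - m) ** 2 for v in x) / (len(x) - 1))
c_hat = shape_from_cv(s / m)   # moment-style estimate of the shape
```

Matching the sample CV pins down the shape because, for a Weibull, the CV depends on the shape alone; the scale is then recoverable from the mean.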
Evolution of the ATLAS PanDA Production and Distributed Analysis System
NASA Astrophysics Data System (ADS)
Maeno, T.; De, K.; Wenaus, T.; Nilsson, P.; Walker, R.; Stradling, A.; Fine, V.; Potekhin, M.; Panitkin, S.; Compostella, G.
2012-12-01
The PanDA (Production and Distributed Analysis) system has been developed to meet ATLAS production and analysis requirements for a data-driven workload management system capable of operating at LHC data processing scale. PanDA has performed well with high reliability and robustness during the two years of LHC data-taking, while being actively evolved to meet the rapidly changing requirements for analysis use cases. We will present an overview of system evolution including automatic rebrokerage and reattempt for analysis jobs, adaptation for the CernVM File System, support for the multi-cloud model through which Tier-2 sites act as members of multiple clouds, pledged resource management and preferential brokerage, and monitoring improvements. We will also describe results from the analysis of two years of PanDA usage statistics, current issues, and plans for the future.
Water distribution system vulnerability analysis using weighted and directed network models
NASA Astrophysics Data System (ADS)
Yazdani, Alireza; Jeffrey, Paul
2012-06-01
The reliability and robustness against failures of networked water distribution systems are central tenets of water supply system design and operation. The ability of such networks to continue to supply water when components are damaged or fail is dependent on the connectivity of the network and the role and location of the individual components. This paper employs a set of advanced network analysis techniques to study the connectivity of water distribution systems, its relationship with system robustness, and susceptibility to damage. Water distribution systems are modeled as weighted and directed networks by using the physical and hydraulic attributes of system components. A selection of descriptive measurements is utilized to quantify the structural properties of benchmark systems at both local (component) and global (network) scales. Moreover, a novel measure of component criticality, the demand-adjusted entropic degree, is proposed to support identification of critical nodes and their ranking according to failure impacts. The application and value of this metric is demonstrated through two case study networks in the USA and UK. Discussion focuses on the potential for gradual evolution of abstract graph-based tools and techniques to more practical network analysis methods, where a theoretical framework for the analysis of robustness and vulnerability of water distribution networks to better support planning and management decisions is presented.
Spatial sensitivity analysis of snow cover data in a distributed rainfall-runoff model
NASA Astrophysics Data System (ADS)
Berezowski, T.; Nossent, J.; Chormański, J.; Batelaan, O.
2014-10-01
As the availability of spatially distributed data sets for distributed rainfall-runoff modelling is growing strongly, more attention should be paid to the influence of the quality of the data on the calibration. While a lot of progress has been made on using distributed data in simulations of hydrological models, sensitivity of spatial data with respect to model results is not well understood. In this paper we develop a spatial sensitivity analysis (SA) method for snow cover fraction input data (SCF) for a distributed rainfall-runoff model to investigate if the model is differently subjected to SCF uncertainty in different zones of the model. The analysis was focused on the relation between the SCF sensitivity and the physical and spatial parameters and processes of a distributed rainfall-runoff model. The methodology is tested for the Biebrza River catchment, Poland, for which a distributed WetSpa model is set up to simulate two years of daily runoff. The SA uses the Latin-Hypercube One-factor-At-a-Time (LH-OAT) algorithm, which uses different response functions for each 4 km × 4 km snow zone. The results show that the spatial patterns of sensitivity can be easily interpreted by co-occurrence of different environmental factors such as geomorphology, soil texture, land use, precipitation and temperature. Moreover, the spatial pattern of sensitivity under different response functions is related to different spatial parameters and physical processes. The results clearly show that the LH-OAT algorithm is suitable for the spatial sensitivity analysis approach and that the SCF is spatially sensitive in the WetSpa model.
Abstract: Distribution factors play a key role in many system security analysis and market applications. The injection shift factors (ISFs) are the basic factors that serve as building blocks of the other distribution factors. The line outage distribution factors (LODFs) may be computed using the ISFs.
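The relationship stated above, LODFs built from ISFs, can be made concrete on a toy DC power-flow network. This is the standard textbook construction, sketched here for an assumed 3-bus triangle with unit line susceptances and bus 0 as slack; it is not taken from the paper.

```python
def inv2(m):
    """Inverse of a 2x2 matrix."""
    (a, b), (c, d) = m
    det = a * d - b * c
    return [[d / det, -b / det], [-c / det, a / det]]

# 3-bus triangle, unit susceptance on every line, bus 0 as slack.
lines = [(0, 1), (0, 2), (1, 2)]
b = 1.0
# Reduced susceptance matrix over the non-slack buses 1 and 2.
B = [[2.0, -1.0], [-1.0, 2.0]]
X = inv2(B)

def theta(k):
    """Bus angles for 1 MW injected at bus k, withdrawn at the slack."""
    if k == 0:
        return [0.0, 0.0, 0.0]
    return [0.0, X[0][k - 1], X[1][k - 1]]

def isf(line, k):
    """Injection shift factor: flow on `line` per MW injected at bus k."""
    i, j = line
    th = theta(k)
    return b * (th[i] - th[j])

def ptdf(line, i, j):
    """Flow change on `line` per MW transferred from bus i to bus j,
    assembled from two ISFs."""
    return isf(line, i) - isf(line, j)

def lodf(line_l, line_k):
    """Line outage distribution factor of monitored line_l for the outage
    of line_k, built from the PTDFs (and hence from the ISFs)."""
    i, j = line_k
    return ptdf(line_l, i, j) / (1.0 - ptdf(line_k, i, j))
```

For this symmetric triangle, losing line (0, 1) reroutes its entire pre-outage flow over the path through bus 2, so the LODF of line (0, 2) with respect to that outage is 1.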
Holte, Robert
Empirical analysis of the rank distribution of relevant documents in web search. Shen Jiang, Sandra .... The paper studies the rank distribution of relevant documents in web search. From a methodological point of view, a new approach is presented, with conclusions about the actual rank distribution of relevant documents in web search and several consequences.
Analysis of large-scale distributed knowledge sources via autonomous cooperative graph mining
NASA Astrophysics Data System (ADS)
Levchuk, Georgiy; Ortiz, Andres; Yan, Xifeng
2014-05-01
In this paper, we present a model for processing distributed relational data across multiple autonomous heterogeneous computing resources in environments with limited control, resource failures, and communication bottlenecks. Our model exploits dependencies in the data to enable collaborative distributed querying in noisy data. The collaboration policy for computational resources is efficiently constructed from the belief propagation algorithm. To scale to large data sizes, we employ a combination of priority-based filtering, incremental processing, and communication compression techniques. Our solution achieved high accuracy of analysis results and orders of magnitude improvements in computation time compared to the centralized graph matching solution.
NASA Astrophysics Data System (ADS)
Piegari, E.; Di Maio, R.; Avella, A.
2013-12-01
Reasonable prediction of landslide occurrences in a given area requires the choice of an appropriate probability distribution of recurrence time intervals. Although landslides are widespread and frequent in many parts of the world, complete databases of landslide occurrences over large periods are missing and often such natural disasters are treated as processes uncorrelated in time and, therefore, Poisson distributed. In this paper, we examine the recurrence time statistics of landslide events simulated by a cellular automaton model that reproduces well the actual frequency-size statistics of landslide catalogues. The complex time series are analysed by varying both the threshold above which the time between events is recorded and the values of the key model parameters. The synthetic recurrence time probability distribution is shown to be strongly dependent on the rate at which instability is approached, providing a smooth crossover from a power-law regime to a Weibull regime. Moreover, a Fano factor analysis shows a clear indication of different degrees of correlation in landslide time series. Such a finding supports, at least in part, a recent analysis performed for the first time of an historical landslide time series over a time window of fifty years.
Evaluating Domestic Hot Water Distribution System Options With Validated Analysis Models
Weitzel, E.; Hoeschele, M.
2014-09-01
A developing body of work is forming that collects data on domestic hot water consumption, water use behaviors, and energy efficiency of various distribution systems. A full distribution system developed in TRNSYS has been validated using field monitoring data and then exercised in a number of climates to understand climate impact on performance. This study builds upon previous analysis modelling work to evaluate differing distribution systems and the sensitivities of water heating energy and water use efficiency to variations of climate, load, distribution type, insulation and compact plumbing practices. Overall, 124 different TRNSYS models were simulated. Of the configurations evaluated, distribution losses account for 13-29% of the total water heating energy use, and water use efficiency ranges from 11% to 22%. The base case, an uninsulated trunk-and-branch system, sees the most improvement in energy consumption from insulating and locating the water heater central to all fixtures. Demand recirculation systems are not projected to provide significant energy savings and in some cases increase energy consumption. Water use is most efficient with demand recirculation systems, followed by the insulated trunk-and-branch system with a central water heater. Compact plumbing practices and insulation have the most impact on energy consumption (2-6% for insulation and 3-4% per 10 gallons of enclosed volume reduced). The results of this work are useful in informing future development of water heating best practices guides as well as more accurate (and simulation time efficient) distribution models for annual whole house simulation programs.
Chamieh, Joseph; Martin, Michel; Cottet, Hervé
2015-01-20
Quantitative analysis in capillary electrophoresis based on time-scale electropherograms generally uses time-corrected peak areas to account for the differences in apparent velocities between solutes. However, it could be convenient and much more relevant to change the time-scale electropherograms into mass relative distribution of the effective mobility or any other characteristic parameter (molar mass, chemical composition, charge density, ...). In this study, the theoretical background required to perform the variable change on the electropherogram was developed with an emphasis on the fact that both x and y axes should be changed when the time scale electropherograms are modified to get the distributions. Applications to the characterization of polymers and copolymers by different modes of capillary electrophoresis (CE) are presented, including the molar mass distribution of poly-L-lysine oligomers by capillary gel electrophoresis (CGE), molar mass distribution of end-charged poly-L-alanine by free solution CE, molar mass distribution of evenly charged polyelectrolytes by CGE, and charge density distribution of variously charged polyelectrolytes by free solution CE. PMID:25569334
NASA Technical Reports Server (NTRS)
Lefebvre, D. R.; Sanderson, A. C.
1994-01-01
Robot coordination and control systems for remote teleoperation applications are by necessity implemented on distributed computers. Modeling and performance analysis of these distributed robotic systems is difficult, but important for economic system design. Performance analysis methods originally developed for conventional distributed computer systems are often unsatisfactory for evaluating real-time systems. The paper introduces a formal model of distributed robotic control systems and a performance analysis method, based on scheduling theory, that can handle concurrent hard-real-time response specifications. Use of the method is illustrated by a case of remote teleoperation which assesses the effect of communication delays and the allocation of robot control functions on control system hardware requirements.
SpatTrack: an imaging toolbox for analysis of vesicle motility and distribution in living cells.
Lund, Frederik W; Jensen, Maria Louise V; Christensen, Tanja; Nielsen, Gitte K; Heegaard, Christian W; Wüstner, Daniel
2014-12-01
The endocytic pathway is a complex network of highly dynamic organelles, which has been traditionally studied by quantitative fluorescence microscopy. The data generated by this method can be overwhelming and its analysis, even for the skilled microscopist, is tedious and error-prone. We developed SpatTrack, an open source, platform-independent program collecting a variety of methods for analysis of vesicle dynamics and distribution in living cells. SpatTrack performs 2D particle tracking, trajectory analysis and fitting of diffusion models to the calculated mean square displacement. It allows for spatial analysis of detected vesicle patterns including calculation of the radial distribution function and particle-based colocalization. Importantly, all analysis tools are supported by Monte Carlo simulations of synthetic images. This allows the user to assess the reliability of the analysis and to study alternative scenarios. We demonstrate the functionality of SpatTrack by performing a detailed imaging study of internalized fluorescence-tagged Niemann Pick C2 (NPC2) protein in human disease fibroblasts. Using SpatTrack, we show that NPC2 rescued the cholesterol-storage phenotype from a subpopulation of late endosomes/lysosomes (LE/LYSs). This was paralleled by repositioning and active transport of NPC2-containing vesicles to the cell surface. The potential of SpatTrack for other applications in intracellular transport studies will be discussed. PMID:25243614
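One of the trajectory analysis steps the abstract describes, fitting a diffusion model to the calculated mean square displacement (MSD), can be illustrated independently of the toolbox. A hedged sketch assuming free 2D diffusion, where MSD(t) = 4Dt; the function names and the synthetic random-walk trajectory are ours, not SpatTrack's:

```python
import numpy as np

def mean_square_displacement(xy, max_lag):
    """Time-averaged MSD of a single 2D trajectory (N x 2 array)."""
    xy = np.asarray(xy, dtype=float)
    return np.array([np.mean(np.sum((xy[lag:] - xy[:-lag])**2, axis=1))
                     for lag in range(1, max_lag + 1)])

def fit_free_diffusion(msd, dt):
    """Least-squares fit of MSD(t) = 4*D*t, the free 2D diffusion model."""
    lags = np.arange(1, len(msd) + 1) * dt
    return np.sum(lags * msd) / (4.0 * np.sum(lags**2))

# Synthetic Brownian trajectory with known diffusion coefficient D_true.
rng = np.random.default_rng(1)
D_true, dt = 0.5, 0.1
steps = rng.normal(0.0, np.sqrt(2 * D_true * dt), size=(100000, 2))
track = np.cumsum(steps, axis=0)
D_est = fit_free_diffusion(mean_square_displacement(track, 10), dt)
print(D_est)  # should recover a value close to D_true
```

Anomalous or confined diffusion would be handled the same way, by swapping in a different MSD model for the fit.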
Validation results of the IAG Dancer project for distributed GPS analysis
NASA Astrophysics Data System (ADS)
Boomkamp, H.
2012-12-01
The number of permanent GPS stations in the world has grown far too large to allow processing of all this data at analysis centers. The majority of these GPS sites do not even make their observation data available to the analysis centers, for various valid reasons. The current ITRF solution is still based on centralized analysis by the IGS, and subsequent densification of the reference frame via regional network solutions. Minor inconsistencies in analysis methods, software systems and data quality imply that this centralized approach is unlikely to ever reach the ambitious accuracy objectives of GGOS. The dependence on published data also makes it clear that a centralized approach will never provide a true global ITRF solution for all GNSS receivers in the world. If the data does not come to the analysis, the only alternative is to bring the analysis to the data. The IAG Dancer project has implemented a distributed GNSS analysis system on the internet in which each receiver can have its own analysis center in the form of a freely distributed JAVA peer-to-peer application. Global parameters for satellite orbits, clocks and polar motion are solved via a distributed least squares solution among all participating receivers. A Dancer instance can run on any computer that has simultaneous access to the receiver data and to the public internet. In the future, such a process may be embedded in the receiver firmware directly. GPS network operators can join the Dancer ITRF realization without having to publish their observation data or estimation products. GPS users can run a Dancer process without contributing to the global solution, to have direct access to the ITRF in near real-time. The Dancer software has been tested on-line since late 2011. A global network of processes has gradually evolved to allow stabilization and tuning of the software in order to reach a fully operational system. 
This presentation reports on the current performance of the Dancer system, and demonstrates the obvious benefits of distributed analysis of geodetic data in general.
Sub-population analysis of deformability distribution in heterogeneous red blood cell population.
Lee, Dong Woo; Doh, Il; Kuypers, Frans A; Cho, Young-Ho
2015-12-01
We present a method for sub-population analysis of deformability distribution using single-cell microchamber array (SiCMA) technology. It is a unique method allowing the correlation of overall cellular characteristics with surface and cytosolic characteristics to define the distribution of individual cellular characteristics in heterogeneous cell populations. As a proof of principle, reticulocytes, the immature sub-population of red blood cells (RBC), were recognized from RBC population by a surface marker and different characteristics on deformability between these populations were characterized. The proposed technology can be used in a variety of applications that would benefit from the ability to measure the distribution of cellular characteristics in complex populations, especially important to define hematologic disorders. PMID:26383009
NASA Astrophysics Data System (ADS)
Schreiner, L. J.; Holmes, O.; Salomons, G.
2013-06-01
One component of clinical treatment validation, for example in the commissioning of new radiotherapy techniques or in patient specific quality assurance, is the evaluation and verification of planned and delivered dose distributions. Gamma and related tests (such as the chi evaluation) have become standard clinical tools for such work. Both functions provide quantitative comparisons between dose distributions, combining dose difference and distance to agreement criteria. However, there are some practical considerations in their utilization that can compromise the integrity of the tests, and these are occasionally overlooked especially when the tests are too readily adopted from commercial software. In this paper we review the evaluation tools and describe some practical concerns. The intent is to provide users with some guidance so that their use of these evaluations will provide valid rapid analysis and visualization of the agreement between planned and delivered dose distributions.
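The gamma test combines the two criteria the abstract names by taking, for each reference point, the minimum of a combined dose-difference/distance-to-agreement metric over the evaluated distribution; a point passes when gamma is at most 1. A minimal 1D sketch of this standard construction (the profiles, tolerances, and units below are illustrative; clinical implementations work on interpolated 2D/3D grids):

```python
import numpy as np

def gamma_index(ref_dose, eval_dose, positions, dose_tol, dist_tol):
    """Global 1D gamma evaluation: for each reference point, the minimum
    over evaluated points of sqrt((dose diff/tol)^2 + (distance/DTA)^2)."""
    ref = np.asarray(ref_dose, float)
    ev = np.asarray(eval_dose, float)
    x = np.asarray(positions, float)
    gammas = np.empty_like(ref)
    for i in range(len(ref)):
        dd = (ev - ref[i]) / dose_tol          # normalized dose difference
        dx = (x - x[i]) / dist_tol             # normalized distance
        gammas[i] = np.sqrt(dd**2 + dx**2).min()
    return gammas

x = np.linspace(0, 10, 101)                    # position, illustrative mm
planned = np.exp(-(x - 5)**2 / 4)              # planned dose profile
delivered = np.exp(-(x - 5.05)**2 / 4)         # delivered, slightly shifted
g = gamma_index(planned, delivered, x, dose_tol=0.03, dist_tol=3.0)
print(f"pass rate: {np.mean(g <= 1.0):.2f}")
```

A brute-force minimum over all points, as here, is exactly the kind of implementation detail (search range, interpolation, normalization) that the paper warns can silently differ between commercial tools.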
Principal Effects of Axial Load on Moment-Distribution Analysis of Rigid Structures
NASA Technical Reports Server (NTRS)
James, Benjamin Wylie
1935-01-01
This thesis presents the method of moment distribution modified to include the effect of axial load upon the bending moments. This modification makes it possible to analyze accurately complex structures, such as rigid fuselage trusses, that heretofore had to be analyzed by approximate formulas and empirical rules. The method is simple enough to be practicable even for complex structures, and it gives a means of analysis for continuous beams that is simpler than the extended three-moment equation now in common use. When the effect of axial load is included, it is found that the basic principles of moment distribution remain unchanged, the only difference being that the factors used, instead of being constants for a given member, become functions of the axial load. Formulas have been developed for these factors, and curves plotted so that their applications requires no more work than moment distribution without axial load. Simple problems have been included to illustrate the use of the curves.
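The base method being modified can be sketched for the simplest case. The toy below runs classic moment distribution, with constant distribution and carry-over factors, on a two-span beam; the thesis's contribution is precisely that under axial load these constants become functions of the load. All numbers and names are illustrative, not taken from the thesis:

```python
# Two-span beam A-B-C, both far ends fixed, equal spans and EI,
# uniform load w on span AB only. Without axial load the factors are
# constants: distribution factor 0.5 each side at joint B (equal
# stiffnesses 4EI/L), carry-over factor 0.5 to the far end.
w, L = 12.0, 4.0
M = {"AB": -w * L**2 / 12, "BA": +w * L**2 / 12,   # fixed-end moments
     "BC": 0.0, "CB": 0.0}
DF = {"BA": 0.5, "BC": 0.5}

for _ in range(10):                 # balance joint B, carry over, repeat
    unbalanced = M["BA"] + M["BC"]
    if abs(unbalanced) < 1e-12:
        break                       # joint B is in equilibrium
    for end, far in (("BA", "AB"), ("BC", "CB")):
        corr = -DF[end] * unbalanced
        M[end] += corr              # balancing moment at joint B
        M[far] += 0.5 * corr        # carry over half to the fixed far end

print({k: round(v, 3) for k, v in M.items()})
```

With both far ends fixed the process converges after a single balancing cycle; with axial load, the stiffness, carry-over, and fixed-end-moment factors would be replaced by the load-dependent functions developed in the thesis.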
Intrinsic charm parton distribution functions from CTEQ-TEA global analysis
NASA Astrophysics Data System (ADS)
Dulat, Sayipjamal; Hou, Tie-Jiun; Gao, Jun; Huston, Joey; Pumplin, Jon; Schmidt, Carl; Stump, Daniel; Yuan, C.-P.
2014-04-01
We study the possibility of intrinsic (nonperturbative) charm in parton distribution functions (PDF) of the proton, within the context of the CT10 next-to-next-to-leading order global analysis. Three models for the intrinsic charm (IC) quark content are compared: (i) c^(x)=0 (zero-IC model); (ii) c^(x) is parametrized by a valence-like parton distribution (BHPS model); (iii) c^(x) is parametrized by a sea-like parton distribution (SEA model). In these models, the intrinsic charm content, c^(x), is included in the charm PDF at the matching scale Qc=mc=1.3 GeV. The best fits to data are constructed and compared. Correlations between the value of mc and the amount of IC are also considered.
Lanza, L G; Stagi, L
2012-01-01
The analysis of counting and catching errors of both catching and non-catching types of rain intensity (RI) gauges was recently made possible over a wide variety of measuring principles and instrument design solutions, based on the work performed during the recent Field Intercomparison of Rainfall Intensity Gauges promoted by the World Meteorological Organization (WMO). The analysis reported here concerns the assessment of accuracy and precision of various types of instruments based on extensive calibration tests performed in the laboratory during the first phase of this WMO Intercomparison. The non-parametric analysis of relative errors allowed us to conclude that the accuracy of the investigated RI gauges is generally high, under the assumption that it should at least be contained within the limits set forth by WMO in this respect. The measuring principle exploited by the instrument is generally not very decisive in obtaining such good results in the laboratory. Rather, the attention paid by the manufacturer to suitably accounting for and correcting systematic errors and time-constant-related effects was demonstrated to be influential. The analysis of precision showed that the observed frequency distribution of relative errors around their mean value is not indicative of an underlying Gaussian population, being in most cases much more peaked than would be expected of samples extracted from a Gaussian distribution. The analysis of variance (one-way ANOVA), assuming the instrument model as the only potentially affecting factor, does not confirm the hypothesis of a single common underlying distribution for all instruments. Pair-wise multiple comparison analysis revealed cases in which significant differences could be observed. PMID:22546787
Development of a Web Service for Analysis in a Distributed Network
Jiang, Xiaoqian; Wu, Yuan; Marsolo, Keith; Ohno-Machado, Lucila
2014-01-01
Objective: We describe functional specifications and practicalities in the software development process for a web service that allows the construction of the multivariate logistic regression model, Grid Logistic Regression (GLORE), by aggregating partial estimates from distributed sites, with no exchange of patient-level data. Background: We recently developed and published a web service for model construction and data analysis in a distributed environment. This recent paper provided an overview of the system that is useful for users, but included very few details that are relevant for biomedical informatics developers or network security personnel who may be interested in implementing this or similar systems. We focus here on how the system was conceived and implemented. Methods: We followed a two-stage development approach by first implementing the backbone system and incrementally improving the user experience through interactions with potential users during the development. Our system went through various stages such as concept proof, algorithm validation, user interface development, and system testing. We used the Zoho Project management system to track tasks and milestones. We leveraged Google Code and Apache Subversion to share code among team members, and developed an applet-servlet architecture to support the cross platform deployment. Discussion: During the development process, we encountered challenges such as Information Technology (IT) infrastructure gaps and limited team experience in user-interface design. We figured out solutions as well as enabling factors to support the translation of an innovative privacy-preserving, distributed modeling technology into a working prototype. 
Conclusion: Using GLORE (a distributed model that we developed earlier) as a pilot example, we demonstrated the feasibility of building and integrating distributed modeling technology into a usable framework that can support privacy-preserving, distributed data analysis among researchers at geographically dispersed institutes. PMID:25848586
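The aggregation idea behind GLORE, in which each site shares only aggregate statistics of its local log-likelihood so that a central Newton step reproduces the pooled fit without exchanging patient-level data, can be sketched as follows. This is a schematic of the general technique, not the GLORE codebase; all names and the synthetic data are ours:

```python
import numpy as np

def site_contribution(X, y, beta):
    """Per-site gradient and Hessian of the logistic log-likelihood.
    Only these aggregates leave the site, never patient-level rows."""
    p = 1.0 / (1.0 + np.exp(-X @ beta))
    grad = X.T @ (y - p)
    hess = -(X.T * (p * (1 - p))) @ X
    return grad, hess

def distributed_newton(sites, n_features, iters=25):
    """Central server sums the per-site aggregates and takes Newton steps."""
    beta = np.zeros(n_features)
    for _ in range(iters):
        grads, hessians = zip(*(site_contribution(X, y, beta) for X, y in sites))
        beta = beta - np.linalg.solve(sum(hessians), sum(grads))
    return beta

# Synthetic cohort split across three "sites".
rng = np.random.default_rng(2)
true_beta = np.array([1.0, -2.0])
X = np.column_stack([np.ones(6000), rng.normal(size=6000)])
y = (rng.random(6000) < 1 / (1 + np.exp(-X @ true_beta))).astype(float)
sites = [(X[:2000], y[:2000]), (X[2000:4000], y[2000:4000]), (X[4000:], y[4000:])]
beta_hat = distributed_newton(sites, 2)
print(beta_hat)
```

Because gradients and Hessians are additive across sites, the distributed fit is mathematically identical to the fit on pooled data, which is the key property such systems rely on.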
Passive-scheme analysis for solving the untrusted source problem in quantum key distribution
Peng Xiang; Xu Bingjie; Guo Hong
2010-04-15
As a practical method, the passive scheme is useful to monitor the photon statistics of an untrusted source in a 'Plug and Play' quantum key distribution (QKD) system. In a passive scheme, three kinds of monitor mode can be adopted: average photon number (APN) monitor, photon number analyzer (PNA), and photon number distribution (PND) monitor. In this paper, the security analysis is rigorously given for the APN monitor, while for the PNA, the analysis, including statistical fluctuation and random noise, is addressed with a confidence level. The results show that the PNA can achieve better performance than the APN monitor and can asymptotically approach the theoretical limit of the PND monitor. Also, the passive scheme with the PNA works efficiently when the signal-to-noise ratio (R^(SN)) is not too low and so is highly applicable to solve the untrusted source problem in the QKD system.
Risk analysis of highly combustible gas storage, supply, and distribution systems in PWR plants
Simion, G.P.; VanHorn, R.L.; Smith, C.L.; Bickel, J.H.; Sattison, M.B.; Bulmahn, K.D.
1993-06-01
This report presents the evaluation of the potential safety concerns for pressurized water reactors (PWRs) identified in Generic Safety Issue 106, Piping and the Use of Highly Combustible Gases in Vital Areas. A Westinghouse four-loop PWR plant was analyzed for the risk due to the use of combustible gases (predominantly hydrogen) within the plant. The analysis evaluated an actual hydrogen distribution configuration and conducted several sensitivity studies to determine the potential variability among PWRs. The sensitivity studies were based on hydrogen and safety-related equipment configurations observed at other PWRs within the United States. Several options for improving the hydrogen distribution system design were identified and evaluated for their effect on risk and core damage frequency. A cost/benefit analysis was performed to determine whether alternatives considered were justifiable based on the safety improvement and economics of each possible improvement.
Quinlan, D; Barany, G; Panas, T
2007-08-30
Many forms of security analysis on large scale applications can be substantially automated but the size and complexity can exceed the time and memory available on conventional desktop computers. Most commercial tools are understandably focused on such conventional desktop resources. This paper presents research work on the parallelization of security analysis of both source code and binaries within our Compass tool, which is implemented using the ROSE source-to-source open compiler infrastructure. We have focused on both shared and distributed memory parallelization of the evaluation of rules implemented as checkers for a wide range of secure programming rules, applicable to desktop machines, networks of workstations and dedicated clusters. While Compass as a tool focuses on source code analysis and reports violations of an extensible set of rules, the binary analysis work uses the exact same infrastructure but is less well developed into an equivalent final tool.
A new parallel-vector finite element analysis software on distributed-memory computers
NASA Technical Reports Server (NTRS)
Qin, Jiangning; Nguyen, Duc T.
1993-01-01
A new parallel-vector finite element analysis software package MPFEA (Massively Parallel-vector Finite Element Analysis) is developed for large-scale structural analysis on massively parallel computers with distributed-memory. MPFEA is designed for parallel generation and assembly of the global finite element stiffness matrices as well as parallel solution of the simultaneous linear equations, since these are often the major time-consuming parts of a finite element analysis. Block-skyline storage scheme along with vector-unrolling techniques are used to enhance the vector performance. Communications among processors are carried out concurrently with arithmetic operations to reduce the total execution time. Numerical results on the Intel iPSC/860 computers (such as the Intel Gamma with 128 processors and the Intel Touchstone Delta with 512 processors) are presented, including an aircraft structure and some very large truss structures, to demonstrate the efficiency and accuracy of MPFEA.
Anami, Lilian Costa; da Costa Lima, Júlia Magalhães; Takahashi, Fernando Eidi; Neisser, Maximiliano Piero; Noritomi, Pedro Yoshito; Bottino, Marco Antonio
2015-04-01
The goal of this study was to evaluate the distribution of stresses generated around implants with different internal-cone abutments by photoelastic analysis (PA) and finite element analysis (FEA). For FEA, implant and abutments with different internal-cone connections (H, hexagonal; S, solid) were scanned, 3D meshes were modeled and objects were loaded with computer software. Trabecular and cortical bone and photoelastic resin blocks were simulated. The PA was performed with photoelastic resin blocks in which implants were embedded and the different abutments were bolted. Specimens were observed in the circular polariscope with the load application device attached, and loads were applied under the same conditions as in FEA. FEA images showed very similar stress distributions between the two models with different abutments. Differences were observed between the stress distributions in bone and resin blocks; PA images resembled those obtained in the resin-block FEA. PA images were also quantitatively analyzed by comparing the values assigned to fringes. It was observed that the S abutment distributes loads more evenly to the bone adjacent to an implant than the H abutment, for both analysis methods used. It was observed that PA generated results very similar to those obtained by FEA with the resin block. PMID:23750560
Mahmoud, Seedahmed; Fang, Qiang; Cosic, Irena; Hussain, Zahir
2005-01-01
Recently we studied the effects of extremely low frequency pulsed electromagnetic fields (ELF-PEMF) on human biosignals. The electrocardiogram (ECG) and electroencephalogram (EEG) of seventeen healthy volunteers were recorded and analyzed before and after electromagnetic field (EMF) exposure. The root mean square (RMS) values of the recorded data were used as comparison criteria. The EEG results showed small variations in brain electrical activity before and after exposure. The ECG power level increased by up to 1% for most of the subjects. In this paper, we further investigate the effects of ELF-PEMF on the ECG signal using the hyperbolic T-distribution (HTD). This distribution has been shown to be suitable for efficient amplitude and instantaneous frequency (IF) estimation of mono- and multicomponent FM signals. In this work, we introduce this distribution to the analysis of ECG signals. The simulation results show that the HTD performs well in the analysis of ECG signals compared with the Choi-Williams distribution (CWD). Moreover, the results show a small shift in the frequency-domain signal before and after EMF exposure. PMID:17282314
A landscape analysis of cougar distribution and abundance in Montana, USA.
Riley, S J; Malecki, R A
2001-09-01
Recent growth in the distribution and abundance of cougars (Puma concolor) throughout western North America has created opportunities, challenges, and problems for wildlife managers and raises questions about what factors affect cougar populations. We present an analysis of factors thought to affect cougar distribution and abundance across the broad geographical scales on which most population management decisions are made. Our objectives were to: (1) identify and evaluate landscape parameters that can be used to predict the capability of habitats to support cougars, and (2) evaluate factors that may account for the recent expansion in cougar numbers. Habitat values based on terrain ruggedness and forested cover explained 73% of the variation in a cougar abundance index. Indices of cougar abundance also were spatially and temporally correlated with ungulate abundance. An increase in the number and total biomass of ungulate prey species is hypothesized to account for recent increases in cougars. Cougar populations in Montana are coping with land development by humans when other components of habitat and prey populations are sufficient. Our analysis provides a better understanding of what may have influenced recent growth in cougar distribution and abundance in Montana and, when combined with insights about stakeholder acceptance capacity, offers a basis for cougar management at broad scales. Long-term conservation of cougars necessitates a better understanding of ecosystem functions that affect prey distribution and abundance, more accurate estimates of cougar populations, and management abilities to integrate these components with human values. PMID:11531235
Analysis of spatial distribution of mining tremors occurring in Rudna copper mine (Poland)
NASA Astrophysics Data System (ADS)
Kozłowska, Maria
2013-10-01
The distribution of mining tremors is strictly related to the progress of mining works and, consequently, to the local stress field. If the distribution is known, it is possible to determine the future area of intense seismicity in an exploited mining panel. In this paper, an analysis of working-face-to-tremor distance for the Rudna copper mine in Poland is presented. In order to develop a spatial model of tremor occurrence in the exploited mine, the seismicity of four mining sections over a five-month period was investigated and the tremor distribution was obtained. It was compared with the spatial distribution of tremors in coal mines found in the literature. The results show that the places where tremors mostly occur, in the vicinity of and in front of the face, coincide with the high-stress area predicted by literature models. The obtained results help to predict the future seismic zone connected with a planned mining section, which can be used in seismic hazard analysis.
Method of analysis of the spatial galaxy distribution at gigaparsec scales. I. Initial principles
Nabokov, Nikita
2010-01-01
Initial principles of a method of analysis of the spatial distribution of luminous matter on scales of about a thousand Mpc are presented. The method is based on an analysis of the photometric redshift distribution N(z) in deep fields with large redshift bins Δz = 0.1-0.3. Number density fluctuations in the bins are conditioned by Poisson noise, correlated structures and systematic errors of the photo-z determination. The method includes covering a sufficiently large region of the sky by a net of deep multiband surveys with cell size about 10° x 10°, where individual deep fields have angular size about 10' x 10' and may be observed with telescopes of 3-10 meter diameter. The distributions of photo-z within each deep field will give information about the radial extension of the super-large structures, while a comparison of the individual radial distributions of the net of deep fields will give information on their tangential extension. ...
GRID Processing and analysis of ALICE data at distributed Russian Tier2 centre - RDIG
NASA Astrophysics Data System (ADS)
Bogdanov, A.; Jancurova, L.; Kiryanov, A.; Kotlyar, V.; Mitsyn, V.; Lyublev, Y.; Ryabinkin, E.; Shabratova, G.; Stepanova, L.; Tikhomirov, V.; Trofimov, V.; Urazmetov, W.; Utkin, D.; Zarochentsev, A.; Zotkin, S.
2010-04-01
The major subject of this paper is a status report on distributed computing for the ALICE experiment at Russian sites just before data taking at the Large Hadron Collider at CERN. We present the usage of the ALICE application software, AliEn [1], on top of the modern EGEE middleware gLite, for simulation and data analysis in the experiment at the Russian Tier2, in accordance with the ALICE computing model [2]. We outline the results of CPU and disk space usage at RDIG sites for the simulation and analysis of the first LHC data from the exposure of the ALICE detector.
Some physics and system issues in the security analysis of quantum key distribution protocols
NASA Astrophysics Data System (ADS)
Yuen, Horace P.
2014-10-01
In this paper, we review a number of issues on the security of quantum key distribution (QKD) protocols that bear directly on the relevant physics or mathematical representation of the QKD cryptosystem. It is shown that the cryptosystem representation itself may miss out many possible attacks, which are not accounted for in the security analysis and proofs. Hence, the final security claims drawn from such analysis are not reliable, apart from foundational issues about the security criteria that are discussed elsewhere. The cases of continuous-variable QKD and multi-photon sources are elaborated upon.
Higher-security thresholds for quantum key distribution by improved analysis of dark counts
NASA Astrophysics Data System (ADS)
Boileau, J.-C.; Batuwantudawe, J.; Laflamme, R.
2005-09-01
We discuss the potential of quantum key distribution (QKD) for long-distance communication by proposing an analysis of the errors caused by dark counts. We give sufficient conditions for a considerable improvement of the key generation rates and the security thresholds of well-known QKD protocols such as the Bennett-Brassard 1984, Phoenix-Barnett-Chefles 2000, and six-state protocols. This analysis is applicable to other QKD protocols like the Bennett 1992 protocol. We examine two scenarios: a sender using a perfect single-photon source and a sender using a Poissonian source.
NASA Astrophysics Data System (ADS)
Terres, Maria A.; Gelfand, Alan E.
2015-07-01
Typical ecological gradient analyses consider variation in the response of plants along a gradient of covariate values, but generally constrain themselves to predetermined response curves and ignore spatial autocorrelation. In this paper, we develop a formal spatial gradient analysis. We adopt the mathematical definition of gradients as directional rates of change with regard to a spatial surface. We view both the response and the covariate as spatial surfaces over a region of interest with respective gradient behavior. The gradient analysis we propose enables local comparison of these gradients. At any spatial location, we compare the behavior of the response surface with the behavior of the covariate surface to provide a novel form of sensitivity analysis. More precisely, we first fit a joint hierarchical Bayesian spatial model for a response variable and an environmental covariate. Then, after model fitting, at a given location, for each variable, we can obtain the posterior distribution of the derivative in any direction. We use these distributions to compute spatial sensitivities and angular discrepancies enabling a more detailed picture of the spatial nature of the response-covariate relationship. This methodology is illustrated using species presence probability as a response to elevation for two species of South African protea. We also offer a comparison with sensitivity analysis using geographically weighted regression. We show that the spatial gradient analysis allows for more extensive inference and provides a much richer description of the spatially varying relationships.
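The paper's notion of a gradient as a directional rate of change of a surface, and of the angular discrepancy between response and covariate gradients at a point, can be illustrated with deterministic toy surfaces. In the paper these quantities come from posterior draws of a fitted Bayesian spatial model; everything below is an illustrative finite-difference stand-in with invented names:

```python
import numpy as np

def directional_derivative(f, x, y, u, h=1e-6):
    """Central-difference rate of change of surface f at (x, y) along
    the direction u (normalized), i.e. the gradient in a given direction."""
    u = np.asarray(u, float) / np.linalg.norm(u)
    return (f(x + h * u[0], y + h * u[1])
            - f(x - h * u[0], y - h * u[1])) / (2 * h)

def angular_discrepancy(f, g, x, y):
    """Angle (degrees) between the gradient directions of two surfaces."""
    gf = np.array([directional_derivative(f, x, y, (1, 0)),
                   directional_derivative(f, x, y, (0, 1))])
    gg = np.array([directional_derivative(g, x, y, (1, 0)),
                   directional_derivative(g, x, y, (0, 1))])
    cosang = gf @ gg / (np.linalg.norm(gf) * np.linalg.norm(gg))
    return np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0)))

resp = lambda x, y: x**2 + y      # toy "response" surface
cov = lambda x, y: x + y          # toy "covariate" surface
print(angular_discrepancy(resp, cov, 1.0, 0.0))
```

A small angular discrepancy means the response surface climbs in roughly the same direction as the covariate surface at that location, which is the local sensitivity interpretation the paper develops.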
Time-cost analysis of a quantum key distribution system clocked at 100 MHz
Xiaofan Mo; Itzel Lucio Martinez; Philip Chan; Chris Healey; Steve Hosier; Wolfgang Tittel
2011-05-18
We describe the realization of a quantum key distribution (QKD) system clocked at 100 MHz. The system includes classical postprocessing implemented via software, and is operated over a 12 km standard telecommunication dark fiber in a real-world environment. A time-cost analysis of the sifted, error-corrected, and secret key rates relative to the raw key rate is presented, and the scalability of our implementation with respect to higher secret key rates is discussed.
Swales, John G; Tucker, James W; Spreadborough, Michael J; Iverson, Suzanne L; Clench, Malcolm R; Webborn, Peter J H; Goodwin, Richard J A
2015-10-01
Liquid extraction surface analysis mass spectrometry (LESA-MS) is a surface sampling technique that incorporates liquid extraction from the surface of tissue sections with nanoelectrospray mass spectrometry. Traditional tissue analysis techniques usually require homogenization of the sample prior to analysis via high-performance liquid chromatography mass spectrometry (HPLC-MS), but an intrinsic weakness of this is a loss of all spatial information and the inability of the technique to distinguish between actual tissue penetration and response caused by residual blood contamination. LESA-MS, in contrast, has the ability to spatially resolve drug distributions and has historically been used to profile discrete spots on the surface of tissue sections. Here, we use the technique as a mass spectrometry imaging (MSI) tool, extracting points at 1 mm spatial resolution across tissue sections to build an image of xenobiotic and endogenous compound distribution to assess drug blood-brain barrier penetration into brain tissue. A selection of penetrant and "nonpenetrant" drugs were dosed to rats via oral and intravenous administration. Whole brains were snap-frozen at necropsy and were subsequently sectioned prior to analysis by matrix-assisted laser desorption ionization mass spectrometry imaging (MALDI-MSI) and LESA-MSI. MALDI-MSI, as expected, was shown to effectively map the distribution of brain penetrative compounds but lacked sufficient sensitivity when compounds were marginally penetrative. LESA-MSI was used to effectively map the distribution of these poorly penetrative compounds, highlighting its value as a complementary technique to MALDI-MSI. The technique also showed benefits when compared to traditional homogenization, particularly for drugs that were considered nonpenetrant by homogenization but were shown to have a measurable penetration using LESA-MSI. PMID:26350423
Three-dimensional radiation dose distribution analysis for boron neutron capture therapy
F. J. Wheeler; D. W. Nigg
1992-01-01
This paper reports that calculation of physically realistic radiation dose distributions for boron neutron capture therapy (BNCT) is a complex, three-dimensional problem. Traditional one-dimensional (slab) and two-dimensional (cylindrical) models, while useful for neutron beam design and performance analysis, do not provide sufficient accuracy for actual clinical use because the assumed symmetries inherent in such models do not ordinarily exist in
An exploratory spatial analysis of soil organic carbon distribution in Canadian eco-regions
NASA Astrophysics Data System (ADS)
Tan, S.-Y.; Li, J.
2014-11-01
As the largest carbon reservoir in ecosystems, soil accounts for more than twice as much carbon storage as that of vegetation biomass or the atmosphere. This paper examines spatial patterns of soil organic carbon (SOC) in Canadian forest areas at an eco-region scale of analysis. The goal is to explore the relationship of SOC levels with various climatological variables, including temperature and precipitation. The first Canadian forest soil database published in 1997 by the Canada Forest Service was analyzed along with other long-term eco-climatic data (1961 to 1991) including precipitation, air temperature, slope, aspect, elevation, and Normalized Difference Vegetation Index (NDVI) derived from remote sensing imagery. In addition, the existing eco-region framework established by Environment Canada was evaluated for mapping SOC distribution. Exploratory spatial data analysis techniques, including spatial autocorrelation analysis, were employed to examine how forest SOC is spatially distributed in Canada. Correlation analysis and spatial regression modelling were applied to determine the dominant ecological factors influencing SOC patterns at the eco-region level. At the national scale, a spatial error regression model was developed to account for spatial dependency and to estimate SOC patterns based on ecological and ecosystem factors. Based on the significant variables derived from the spatial error model, a predictive SOC map in Canadian forest areas was generated. Although overall SOC distribution is influenced by climatic and topographic variables, distribution patterns are shown to differ significantly between eco-regions. These findings help to validate the eco-region classification framework for SOC zonation mapping in Canada.
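Spatial autocorrelation analysis of the kind applied to the SOC maps is often summarized by global Moran's I. A minimal sketch with a toy four-site transect and a rook-style weight matrix (none of these values come from the Canadian soil database):

```python
import numpy as np

def morans_i(values, weights):
    """Global Moran's I: spatial autocorrelation of `values` (length n)
    under an n x n spatial weights matrix with a zero diagonal."""
    z = values - values.mean()
    n = len(values)
    num = n * np.sum(weights * np.outer(z, z))
    den = weights.sum() * np.sum(z ** 2)
    return num / den

# Toy data: 4 sites along a transect with smoothly increasing SOC-like values.
vals = np.array([1.0, 2.0, 3.0, 4.0])
w = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
I = morans_i(vals, w)  # positive: neighbouring sites hold similar values
```

A significantly positive I, as here, is the kind of spatial dependence that motivates the spatial error regression model over ordinary least squares.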
NASA Astrophysics Data System (ADS)
Xiaohong, C.
2014-12-01
Many probability distributions have been proposed for flood frequency analysis, and several criteria have been used for selecting the distribution best fitted to an observed or generated data set. The upper tail of the flood frequency distribution is of particular concern for flood control, yet different model selection criteria often identify different optimal distributions when the focus is on the upper tail. In this study, with emphasis on upper-tail behavior, 5 distribution selection criteria, including 2 hypothesis tests and 3 information-based criteria, are evaluated in selecting the best fitted distribution from 8 widely used distributions (Pearson 3, Log-Pearson 3, two-parameter lognormal, three-parameter lognormal, Gumbel, Weibull, generalized extreme value, and generalized logistic) using datasets from the Thames River (UK), Wabash River (USA), and Beijiang and Huai Rivers (China), all of which lie between latitudes 23.5 and 66.5 degrees north. The performance of the 5 selection criteria is verified using a composite criterion, defined in this study, that focuses on upper-tail events. The paper thus provides an approach for selecting suitable flood frequency distributions for different river basins. Results illustrate that (1) hypothesis tests and information-based criteria select different distributions for each river; (2) the information-based criteria perform better than hypothesis tests in most cases when the goal is good prediction of extreme upper-tail events; and (3) to decide on a particular distribution for fitting high flows, it is better to combine criteria, using the information-based criteria first to rank the models and then inspecting the results with hypothesis testing methods. In addition, if the information-based criteria and hypothesis tests disagree, the composite criterion is used for the final decision.
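The information-based side of such a comparison can be sketched with scipy: fit several candidate families by maximum likelihood and rank them by AIC. The synthetic flow record and its GEV parent below are assumptions for illustration, not the study's river datasets:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
# Synthetic annual-maximum flows; a GEV parent is assumed purely for illustration.
flows = stats.genextreme.rvs(c=-0.1, loc=100, scale=30, size=200, random_state=rng)

# Candidate families (a subset of the eight used in the study).
candidates = {
    "Gumbel": stats.gumbel_r,
    "Weibull": stats.weibull_min,
    "GEV": stats.genextreme,
    "Lognormal": stats.lognorm,
}

aic = {}
for name, dist in candidates.items():
    params = dist.fit(flows)               # maximum likelihood fit
    loglik = np.sum(dist.logpdf(flows, *params))
    k = len(params)
    aic[name] = 2 * k - 2 * loglik         # information-based criterion

best = min(aic, key=aic.get)               # lowest AIC wins the ranking
```

In the combined procedure the paper recommends, this AIC ranking would then be inspected with goodness-of-fit hypothesis tests before committing to a distribution.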
NASA Technical Reports Server (NTRS)
Leybold, H. A.
1971-01-01
Random numbers were generated with the aid of a digital computer and transformed such that the probability density function of a discrete random load history composed of these random numbers had one of the following non-Gaussian distributions: Poisson, binomial, log-normal, Weibull, and exponential. The resulting random load histories were analyzed to determine their peak statistics and were compared with cumulative peak maneuver-load distributions for fighter and transport aircraft in flight.
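The transform-then-count-peaks procedure can be sketched for the Weibull case; the shape and scale values and the history length are illustrative, not those of the original load spectra:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000

# Transform uniform random numbers into a Weibull-distributed load history,
# one of the non-Gaussian distributions considered (parameters are invented).
shape, scale = 2.0, 10.0
loads = scale * rng.weibull(shape, size=n)

# Peak statistics: a peak is a sample larger than both of its neighbours.
interior = loads[1:-1]
peaks = interior[(interior > loads[:-2]) & (interior > loads[2:])]

# Cumulative peak exceedance counts, analogous to the cumulative
# maneuver-load distributions used for the aircraft comparison.
thresholds = np.linspace(0, loads.max(), 25)
exceedances = np.array([(peaks > t).sum() for t in thresholds])
```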
Strength distributions of adhesive bonded and adhesive/rivet combined joints
NASA Astrophysics Data System (ADS)
Imanaka, Makoto; Haraga, Kosuke; Nishikawa, Tetsuya
1992-11-01
The tensile and shear strengths of adhesive and adhesive/rivet combined joints are statistically evaluated, and the probability of failure is calculated for these two types of joints. Attention is given to the effects of the adhesive/rivet combination on mean tensile shear strength and coefficient of variation. The adhesive joint's strength distribution was well approximated by Weibull or doubly-exponential distribution function; tensile shear strength is significantly improved by the combination with rivets.
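The Weibull half of such a strength evaluation can be sketched as follows; the strength values are invented for illustration, not the measured joint data:

```python
import numpy as np
from scipy import stats

# Hypothetical tensile-shear strengths (MPa) for an adhesive-only joint.
strengths = np.array([18.2, 19.5, 20.1, 21.0, 21.4, 22.3, 22.9, 23.6,
                      24.2, 24.8, 25.5, 26.1, 27.0, 27.9, 29.3])

# Two-parameter Weibull fit (location fixed at zero).
c, loc, scale = stats.weibull_min.fit(strengths, floc=0)

def prob_failure(s):
    """Probability of failure at applied stress s: F(s) = 1 - exp(-(s/scale)**c)."""
    return 1.0 - np.exp(-(s / scale) ** c)

p_low = prob_failure(15.0)    # stress well below the scatter band
p_high = prob_failure(30.0)   # stress above the strongest specimen
```

The fitted shape parameter `c` plays the role of the Weibull modulus: larger values mean a tighter strength distribution and a steeper rise in failure probability.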
NASA Astrophysics Data System (ADS)
Tang, Nan; Marshall, Wallace F.
2013-02-01
Investigating the spatial organization of cellular processes in tissues during mouse embryo development is a major technical challenge in developmental biology. Many imaging methods remain limited in the tissue volumes they can probe because of tissue opacity, light scattering, and the availability of advanced imaging tools. To analyze the mitotic spindle angle distribution in developing mouse airway epithelium, we determined spindle angles in mitotic epithelial cells on serial sections of the whole airway of embryonic mouse lungs. We then developed a computational image analysis to obtain the spindle angle distribution in the three-dimensional airway reconstructed from all serial sections. From this study, we were able to understand how mitotic spindle angles are distributed in a whole airway tube. This analysis provides a potentially fast, simple, and inexpensive alternative for quantitatively analyzing cellular processes at subcellular resolution. Furthermore, the analysis is not limited by tissue size, which allows three-dimensional, high-resolution information on cellular processes to be obtained from cell populations deep inside intact organs.
NASA Astrophysics Data System (ADS)
Ying, Shen; Li, Lin; Gao, Yurong
2009-10-01
Spatial visibility analysis is an important approach to pedestrian behavior because visual perception of space is the most direct way people acquire environmental information and navigate. Based on agent modeling and a top-down method, this paper develops a framework for analyzing visibility-dependent pedestrian flow. We use viewsheds in the visibility analysis and impose the resulting parameters on an agent simulation to direct agent motion in urban space. Pedestrian behavior is analyzed at both the micro- and macro-scale of urban open space: individual agents use visual affordances to choose their direction of motion in micro-scale streets and districts, while at the macro-scale we compare the distribution of pedestrian flow with the spatial configuration and mine the relationship between pedestrian flow and the distribution of urban facilities and functions. The paper first computes visibility at vantage points in urban open space, such as the street network, and quantifies the visibility parameters. Multiple agents then use these parameters to decide their direction of motion, and the pedestrian flow reaches a stable state through multi-agent simulation. Finally, we compare the morphology of the visibility parameters and the pedestrian distribution with the layout of urban functions and facilities to confirm their consistency, which can support decision-making in urban design.
A distributed analysis and visualization system for model and observational data
NASA Technical Reports Server (NTRS)
Wilhelmson, Robert B.
1994-01-01
Software was developed with NASA support to aid in the analysis and display of the massive amounts of data generated from satellites, observational field programs, and model simulations. This software was developed in the context of the PATHFINDER (Probing ATmospHeric Flows in an Interactive and Distributed EnviRonment) Project, whose overall aim is to create a flexible, modular, and distributed environment for data handling, modeling simulations, data analysis, and visualization of atmospheric and fluid flows. Software completed with NASA support includes GEMPAK analysis, data handling, and display modules, for which collaborators at NASA had primary responsibility, and prototype software modules for three-dimensional interactive and distributed control, display, and data handling, for which NCSA was responsible. Overall process control was handled through the Iris Explorer, a scientific and visualization application builder from Silicon Graphics. In addition, the GEMPAK-related work (GEMVIS) was also ported to the Advanced Visualization System (AVS) application builder. Many modules were developed to enhance those already available in Iris Explorer, including HDF file support, improved visualization and display, simple lattice math, and metadata handling through the development of a new grid datatype. Complete source and runtime binaries, along with on-line documentation, are available via the World Wide Web at: http://redrock.ncsa.uiuc.edu/PATHFINDER/pathre12/top/top.html.
Extracted Fragment Ion Mobility Distributions: A New Method for Complex Mixture Analysis
Lee, Sunyoung; Li, Zhiyu; Valentine, Stephen J.; Zucker, Steven M.; Webber, Nathaniel; Reilly, James P.; Clemmer, David E.
2011-01-01
A new method is presented for constructing ion mobility distributions of precursor ions based upon the extraction of drift time distributions that are monitored for selected fragment ions. The approach is demonstrated with a recently designed instrument that combines ion mobility spectrometry (IMS) with ion trap mass spectrometry (MS) and ion fragmentation, as shown in a recent publication [J. Am. Soc. Mass Spectrom. 22 (2011) 1477–1485]. Here, we illustrate the method by examining selected charge states of electrosprayed ubiquitin ions, an extract from diesel fuel, and a mixture of phosphorylated peptide isomers. For ubiquitin ions, extraction of all drift times over small mass-to-charge (m/z) ranges corresponding to unique fragments of a given charge state allows the determination of precursor ion mobility distributions. A second example of the utility of the approach includes the distinguishing of precursor ion mobility distributions for isobaric, basic components from commercially available diesel fuel. Extraction of data for a single fragment ion is sufficient to distinguish the precursor ion mobility distribution of cycloalkyl-pyridine derivatives from pyrindan derivatives. Finally, the method is applied for the analysis of phosphopeptide isomers (LFpTGHPESLER and LFTGHPEpSLER) in a mixture. The approach alleviates several analytical challenges that include separation and characterization of species having similar (or identical) m/z values within complex mixtures. PMID:22518092
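The extraction step, keeping drift times only for events whose fragment m/z falls in a narrow window, can be sketched on synthetic events. All m/z values, drift centroids, and tolerances below are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5000

# Two precursors that would overlap in precursor m/z but differ in mobility;
# each produces one unique fragment ion.
species = rng.integers(0, 2, size=n)
drift = np.where(species == 0,
                 rng.normal(8.0, 0.3, n),      # drift time (ms), species A
                 rng.normal(10.5, 0.4, n))     # drift time (ms), species B
frag_mz = np.where(species == 0,
                   rng.normal(120.1, 0.05, n),  # unique fragment of A
                   rng.normal(146.2, 0.05, n))  # unique fragment of B

def extracted_drift(target_mz, tol=0.2):
    """Drift times of events whose fragment m/z lies within +/- tol of target."""
    return drift[np.abs(frag_mz - target_mz) < tol]

# The fragment-selected drift distributions recover the precursor mobilities.
mean_a = extracted_drift(120.1).mean()
mean_b = extracted_drift(146.2).mean()
```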
NASA Astrophysics Data System (ADS)
Candela, A.; Brigandí, G.; Aronica, G. T.
2014-01-01
In this paper a procedure is presented to derive flood design hydrographs (FDH) using a bivariate representation of rainfall forcing (rainfall duration and intensity) via copulas, which describe and model the correlation between these two variables independently of the marginal laws involved, coupled with a distributed rainfall-runoff model. Rainfall-runoff modelling for estimating the hydrological response at the outlet of a watershed used a conceptual, fully distributed procedure based on the Soil Conservation Service curve number method as the excess rainfall model and a distributed unit hydrograph with climatic dependencies for the flow routing. Travel time computation, based on the definition of a distributed unit hydrograph, was performed using flow paths determined from a digital elevation model (DEM) and roughness parameters obtained from distributed geographical information. To estimate the return period of the FDH, which gives the probability of occurrence of a hydrograph, the flood peaks and flow volumes obtained through rainfall-runoff modelling were statistically treated via copulas. The shape of the hydrograph was generated from modelled flood events via cluster analysis. The procedure described above was applied to a case study of the Imera catchment in Sicily, Italy. The methodology allows a reliable estimation of the design flood hydrograph and can be used for all flood risk applications, i.e. evaluation, management, mitigation, etc.
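The bivariate rainfall forcing step can be sketched with a Gaussian copula; note the paper does not specify this copula family here, and the marginals (gamma for duration, Weibull for intensity) and all parameter values are assumptions:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)

# Gaussian copula: correlated normals -> uniforms with the same dependence.
rho = 0.6
z = rng.multivariate_normal([0.0, 0.0], [[1.0, rho], [rho, 1.0]], size=5000)
u = stats.norm.cdf(z)

# Marginal laws chosen independently of the dependence structure.
duration = stats.gamma.ppf(u[:, 0], a=2.0, scale=6.0)          # hours
intensity = stats.weibull_min.ppf(u[:, 1], c=1.5, scale=8.0)   # mm/h

# Rank correlation of the simulated pairs reflects the copula, not the marginals.
r = stats.spearmanr(duration, intensity)[0]
```

Each simulated (duration, intensity) pair would then drive the distributed rainfall-runoff model to produce one candidate hydrograph.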
Mapping of aerosols' elemental distribution in two zones in Romania by PIXE analysis
NASA Astrophysics Data System (ADS)
Amemiya, Susumu; Masuda, Toshio; Popa-Simil, Liviu; Mateescu, Liviu
1996-09-01
In the summer of 1994 aerosol particles were collected from different places using a portable stacked filter unit with filters of 8 and 0.4 µm. Sampling was performed in order to obtain the spatial distribution of elemental concentrations of aerosols. The Van de Graaff machine at Nagoya University was used for PIXE analysis of the samples, and the results were processed both in Bucharest and in Nagoya. Iso-level maps of the concentration of each element of interest were drawn, and correlations were made between industry, vegetation, weather, local geography, and these concentrations. Major industrial pollution sources were identified. For example, the Si distribution in Bucharest and the Dobrogea region turned out to be closely linked to the vegetation and surface water distribution. The ratio between coarse (8 µm) and fine (0.4 µm) particles is related to human activity (traffic, mining, construction). Sulphur, in its turn, follows the territorial distribution of thermal power plants and refineries (fine particles), while its coarse particles seem to concentrate in high-traffic areas (diesel engines). Pb concentrations also follow the traffic density distribution. More than 15 elements were mapped, and interesting observations could be made.
Nearest-neighbor analysis and the distribution of sinkholes: an introduction to spatial statistics
NSDL National Science Digital Library
Rick Ford
This is an exercise I use in an upper-division geomorphology course to introduce students to nearest-neighbor analysis, a basic technique in spatial statistics. Nearest-neighbor analysis is a method of comparing the observed average distance between points and their nearest neighbors to the expected average nearest-neighbor distance in a random pattern of points. The pattern of points on a map or 2-D graph can be classified into three categories: CLUSTERED, RANDOM, REGULAR. Nearest-neighbor analysis provides an objective method for distinguishing among these possible spatial distributions. The technique also produces a population statistic, the nearest-neighbor index, which can be compared from area to area. In general, nearest-neighbor analysis can be applied to any geoscience phenomenon or feature whose spatial distribution can be categorized as a point pattern. The basic distance data can come from topographic maps, aerial photographs, or field measurements. The exercise presented here applies this technique to the study of karst landforms on topographic maps, specifically the spatial distribution of sinkholes. The advantages of introducing nearest-neighbor analysis in an undergraduate lab are that: (1) it reinforces important concepts related to data collection (e.g., significant figures), map use (e.g., scale and the UTM grid), and basic statistics (e.g., hypothesis testing); (2) the necessary calculations are easily handled by most students; and (3) once learned, the technique can be widely applied in geoscience problem-solving. Designed for a geomorphology course. Addresses student fear of quantitative aspects and/or inadequate quantitative skills.
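The nearest-neighbor index itself is compact enough to sketch directly. The sinkhole coordinates below are synthetic, and edge effects (which inflate the index for small maps) are ignored:

```python
import numpy as np

def nearest_neighbor_index(points, area):
    """Clark-Evans nearest-neighbor index: R = observed mean NN distance /
    expected mean NN distance for a random (Poisson) pattern of equal density.
    R < 1 suggests clustering, R near 1 randomness, R > 1 regularity."""
    pts = np.asarray(points, dtype=float)
    n = len(pts)
    d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)            # exclude each point's self-distance
    observed = d.min(axis=1).mean()
    expected = 0.5 / np.sqrt(n / area)     # 1 / (2 * sqrt(density))
    return observed / expected

# Regular 5 x 5 grid of "sinkholes" over a 4 x 4 map-unit area: regular pattern.
grid = np.array([(i, j) for i in range(5) for j in range(5)], dtype=float)
R_regular = nearest_neighbor_index(grid, area=16.0)

# Clustered pattern: all points crowded near one spot on the same map area.
rng = np.random.default_rng(3)
cluster = rng.normal(0.0, 0.05, size=(25, 2))
R_cluster = nearest_neighbor_index(cluster, area=16.0)
```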
A quantitative quantum-chemical analysis tool for the distribution of mechanical force in molecules
Stauch, Tim; Dreuw, Andreas, E-mail: dreuw@uni-heidelberg.de [Interdisciplinary Center for Scientific Computing, University of Heidelberg, Im Neuenheimer Feld 368, 69120 Heidelberg (Germany)]
2014-04-07
The promising field of mechanochemistry suffers from a general lack of understanding of the distribution and propagation of force in a stretched molecule, which limits its applicability up to the present day. In this article, we introduce the JEDI (Judgement of Energy DIstribution) analysis, which is the first quantum chemical method that provides a quantitative understanding of the distribution of mechanical stress energy among all degrees of freedom in a molecule. The method is carried out on the basis of static or dynamic calculations under the influence of an external force and makes use of a Hessian matrix in redundant internal coordinates (bond lengths, bond angles, and dihedral angles), so that all relevant degrees of freedom of a molecule are included and mechanochemical processes can be interpreted in a chemically intuitive way. The JEDI method is characterized by its modest computational effort, with the calculation of the Hessian being the rate-determining step, and delivers, except for the harmonic approximation, exact ab initio results. We apply the JEDI analysis to several example molecules in both static quantum chemical calculations and Born-Oppenheimer Molecular Dynamics simulations in which molecules are subject to an external force, thus studying not only the distribution and the propagation of strain in mechanically deformed systems, but also gaining valuable insights into the mechanochemically induced isomerization of trans-3,4-dimethylcyclobutene to trans,trans-2,4-hexadiene. The JEDI analysis can potentially be used in the discussion of sonochemical reactions, molecular motors, mechanophores, and photoswitches as well as in the development of molecular force probes.
NASA Astrophysics Data System (ADS)
Alfeld, Matthias; Wahabzada, Mirwaes; Bauckhage, Christian; Kersting, Kristian; Wellenreuther, Gerd; Falkenberg, Gerald
2014-04-01
Stacks of elemental distribution images acquired by XRF can be difficult to interpret if they contain high degrees of redundancy and components differing in their quantitative but not qualitative elemental composition. Factor analysis, mainly in the form of Principal Component Analysis (PCA), has been used to reduce the level of redundancy and highlight correlations. PCA, however, does not yield physically meaningful representations, as they often contain negative values. This limitation can be overcome by employing factor analysis restricted to non-negativity. In this paper we present the first application of the Python Matrix Factorization Module (pymf) to XRF data. This is done in a case study on the painting Saul and David from the studio of Rembrandt van Rijn. We show how Non-Negative Matrix Factorization (NMF) supports the discrimination between two different Co containing compounds with minimum user intervention and a priori knowledge.
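In the same spirit, though using scikit-learn's NMF rather than pymf, a non-negative factorization of a synthetic elemental-map stack might look like this; the two latent "compounds" and three "elements" are invented for illustration:

```python
import numpy as np
from sklearn.decomposition import NMF

rng = np.random.default_rng(0)

# Synthetic stack: 3 elemental maps over a 20 x 20 scan (400 pixels),
# generated from 2 latent compounds with distinct elemental signatures.
h_true = np.array([[1.0, 0.2, 0.0],    # compound 1: dominated by element 1
                   [0.1, 0.8, 0.9]])   # compound 2: elements 2 and 3
w_true = rng.random((400, 2))          # per-pixel compound abundances
data = w_true @ h_true + 0.01 * rng.random((400, 3))  # small positive noise

model = NMF(n_components=2, init="nndsvda", random_state=0, max_iter=1000)
W = model.fit_transform(data)          # per-pixel scores, non-negative
H = model.components_                  # elemental signatures, non-negative

recon_err = np.linalg.norm(data - W @ H) / np.linalg.norm(data)
```

Because both factors are constrained to be non-negative, the rows of `H` can be read as physically plausible elemental compositions, which is exactly the advantage over PCA noted above.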
Pokhrel, Keshav P; Vovoras, Dimitrios; Tsokos, Chris P
2012-09-01
The examination of brain tumor growth and its variability among cancer patients is an important aspect of epidemiologic and medical data. While several earlier studies of brain tumors interpreted descriptive data, in this study we perform inference to the extent possible, suggesting possible explanations for the differentiation in survival rates apparent in the epidemiologic data. Population-based information from nine registries in the USA is classified with respect to age, gender, race, and tumor histology to study tumor size variation. The Weibull and Dagum distributions are fitted to the highly skewed tumor size distributions; the parametric analysis of tumor sizes shows significant differentiation between sexes, increased skewness for both the male and female populations, and decreased kurtosis for the black female population. The effect of population characteristics on the distribution of tumor sizes is estimated by a quantile regression model and then compared with ordinary least squares results. The higher quantiles of the distribution of tumor sizes for whites are significantly higher than those of other races. Our model predicts that the effect of age on the lower quantiles of the tumor size distribution is negative given the variables race and sex. We apply probability and regression models to explore the effects of demographics and histology types and observe significant racial and gender differences in the form of the distributions. Efforts are made to link tumor size data with available survival rates in relation to other prognostic variables. PMID:23675268
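The Weibull half of such a skewed-size analysis can be sketched with scipy; the synthetic "tumor sizes" and their parameters are illustrative, not values from the registry data:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)

# Synthetic right-skewed size data (mm); a Weibull parent is assumed here.
sizes = stats.weibull_min.rvs(c=1.4, scale=25.0, size=500, random_state=rng)

# Two-parameter maximum likelihood fit with the location fixed at zero.
c, loc, scale = stats.weibull_min.fit(sizes, floc=0)

# Tail summaries of the kind compared across demographic groups:
skew = stats.skew(sizes)                                 # sample skewness
q50 = stats.weibull_min.ppf(0.50, c, loc=0, scale=scale)  # median size
q90 = stats.weibull_min.ppf(0.90, c, loc=0, scale=scale)  # upper quantile
```

Comparing fitted upper quantiles such as `q90` across subgroups mirrors the quantile-level contrasts (e.g., by race) reported in the abstract, without assuming the effect is constant across the distribution as OLS does.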
Godin, Antoine G; Rappaz, Benjamin; Potvin-Trottier, Laurent; Kennedy, Timothy E; De Koninck, Yves; Wiseman, Paul W
2015-08-18
Knowledge of membrane receptor organization is essential for understanding the initial steps in cell signaling and trafficking mechanisms, but quantitative analysis of receptor interactions at the single-cell level and in different cellular compartments has remained highly challenging. To achieve this, we apply a quantitative image analysis technique, spatial intensity distribution analysis (SpIDA), that can measure fluorescent particle concentrations and oligomerization states within different subcellular compartments in live cells. An important technical challenge faced by fluorescence microscopy-based measurement of oligomerization is the fidelity of receptor labeling. In practice, imperfect labeling biases the distribution of oligomeric states measured within an aggregated system. We extend SpIDA to enable analysis of high-order oligomers from fluorescence microscopy images by including a probability-weighted correction algorithm for nonemitting labels. We demonstrated that this fraction of nonemitting probes could be estimated in single cells using SpIDA measurements on model systems with known oligomerization state. Previously, this artifact was measured using single-step photobleaching. This approach was validated using computer-simulated data, and the imperfect labeling was quantified in cells with ion channels of known oligomer subunit count. It was then applied to quantify the oligomerization states in different cell compartments of the proteolipid protein (PLP) expressed in COS-7 cells. Expression of a mutant PLP linked to impaired trafficking resulted in the detection of PLP tetramers that persist in the endoplasmic reticulum, while no difference was measured at the membrane between the distributions of wild-type and mutated PLPs.
Our results demonstrate that SpIDA allows measurement of protein oligomerization in different compartments of intact cells, even when fractional mislabeling occurs as well as photobleaching during the imaging process, and reveals insights into the mechanism underlying impaired trafficking of PLP. PMID:26287623
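The probability-weighted correction for nonemitting labels rests on a binomial thinning argument, sketched here with an illustrative nonemitting fraction (not a value from the paper):

```python
import numpy as np
from scipy import stats

# If a fraction f of fluorescent labels is nonemitting, an n-mer presents
# k emitting labels with Binomial(n, 1 - f) probabilities.
f = 0.3            # nonemitting fraction (illustrative)
p_emit = 1.0 - f
n = 4              # true oligomer: a tetramer

k = np.arange(n + 1)
apparent = stats.binom.pmf(k, n, p_emit)   # apparent label counts 0..4

# The mean apparent brightness is shrunk by exactly (1 - f)...
mean_apparent = np.sum(k * apparent)
# ...so inverting that bias recovers the true subunit count.
n_corrected = mean_apparent / p_emit
```

A real correction works on the full intensity distribution rather than only its mean, but the same binomial reweighting is what lets SpIDA report the true oligomer distribution once `f` has been calibrated on a monomer control.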
NASA Technical Reports Server (NTRS)
Schmeckpeper, K. R.
1987-01-01
The results of the Independent Orbiter Assessment (IOA) of the Failure Modes and Effects Analysis (FMEA) and Critical Items List (CIL) are presented. The IOA approach features a top-down analysis of the hardware to determine failure modes, criticality, and potential critical items. To preserve independence, this analysis was accomplished without reliance upon the results contained within the NASA FMEA/CIL documentation. This report documents the independent analysis results corresponding to the Orbiter Electrical Power Distribution and Control (EPD and C) hardware. The EPD and C hardware performs the functions of distributing, sensing, and controlling 28 volt DC power and of inverting, distributing, sensing, and controlling 117 volt 400 Hz AC power to all Orbiter subsystems from the three fuel cells in the Electrical Power Generation (EPG) subsystem. Each level of hardware was evaluated and analyzed for possible failure modes and effects. Criticality was assigned based upon the severity of the effect for each failure mode. Of the 1671 failure modes analyzed, 9 single failures were determined to result in loss of crew or vehicle. Three single failures unique to intact abort were determined to result in possible loss of the crew or vehicle. A possible loss of mission could result if any of 136 single failures occurred. Six of the criticality 1/1 failures are in two rotary and two pushbutton switches that control External Tank and Solid Rocket Booster separation. The other 6 criticality 1/1 failures are fuses, one each per Aft Power Control Assembly (APCA) 4, 5, and 6 and one each per Forward Power Control Assembly (FPCA) 1, 2, and 3, that supply power to certain Main Propulsion System (MPS) valves and Forward Reaction Control System (RCS) circuits.
NASA Astrophysics Data System (ADS)
Berezowski, Tomasz; Chormański, Jarosław; Nossent, Jiri; Batelaan, Okke
2014-05-01
Distributed hydrological models enhance the analysis and explanation of environmental processes. As more spatial input data and time series become available, more analysis of the sensitivity of simulations to these data is required. Most research so far has focused on the sensitivity of distributed hydrological models to precipitation data. However, such results cannot be compared until a universal approach to quantifying the sensitivity of a model to spatial data is available. Snow cover is among the most frequently tested and used remote sensing data for distributed models. Snow cover fraction (SCF) remote sensing products are readily available, e.g. the MODIS snow cover product MOD10A1 (daily snow cover fraction at 500 m spatial resolution). In this work a spatial sensitivity analysis (SA) of remotely sensed SCF from MOD10A1 was conducted with the distributed WetSpa model. The aim is to investigate whether the WetSpa model is subject to SCF uncertainty differently in different areas of the model domain. The analysis was extended beyond the SA quantities themselves to relate them to the physical parameters and processes in the study area. The study area is the Biebrza River catchment, Poland, a semi-natural catchment with a spring snowmelt regime. Hydrological simulations are performed with the distributed WetSpa model over a simulation period of two hydrological years. For the SA, the Latin-Hypercube One-factor-At-a-Time (LH-OAT) algorithm is used, with a set of different response functions on a regular 4 x 4 km grid. The results show that the spatial patterns of sensitivity can be readily interpreted through the co-occurrence of different landscape features. Moreover, the spatial patterns of the SA results are related to the WetSpa spatial parameters and to different physical processes. Based on these results, it is clear that a spatial approach to SA can be performed with the proposed algorithm and that the MOD10A1 SCF is spatially sensitive in the WetSpa model.
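The LH-OAT algorithm named above combines Latin hypercube sampling of base points with one-factor-at-a-time perturbations around each point. A minimal pure-Python sketch of the idea follows; the toy model, parameter count, and perturbation fraction are illustrative assumptions, not WetSpa specifics:

```python
import random

def lh_oat(model, n_params, n_points, frac=0.05, seed=1):
    """Latin-Hypercube One-factor-At-a-Time sensitivity sketch.
    `model` maps a parameter vector in (0, 1)^n_params to a scalar
    response (assumed nonzero). Returns, per parameter, the mean
    absolute relative partial effect over the hypercube points."""
    rng = random.Random(seed)
    # Latin hypercube: one stratified sample per interval, per parameter,
    # then shuffle each parameter's column to decorrelate them
    strata = [[(j + rng.random()) / n_points for j in range(n_points)]
              for _ in range(n_params)]
    for col in strata:
        rng.shuffle(col)
    effects = [0.0] * n_params
    for j in range(n_points):
        x = [strata[i][j] for i in range(n_params)]
        base = model(x)
        for i in range(n_params):
            xp = list(x)
            xp[i] *= 1.0 + frac          # perturb one factor at a time
            effects[i] += abs((model(xp) - base) / (base * frac))
    return [e / n_points for e in effects]
```

In the spatial setting of the abstract, each "parameter" would be the SCF value of one grid cell, and the response function a hydrograph-based measure, so the returned effects map directly onto the 4 x 4 km grid.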
Barbosa, F G; Schneck, F; Melo, A S
2012-11-01
We conducted a scientometric analysis to determine the main trends and gaps of studies on the use of ecological niche models (ENMs) to predict the distribution of invasive species. We used the database of the Thomson Institute for Scientific Information (ISI). We found 190 papers published between 1991 and 2010 in 82 journals. The number of papers was low in the 1990s, but began to increase after 2003. One-third of the papers were published by researchers from the United States of America, and consequently, the USA was also the most studied region. The majority of studies were carried out in terrestrial environments, while only a few investigated aquatic systems, probably because important aquatic predictor variables are scarce or unavailable for most regions in the world. Species-occurrence records were mainly composed of presence-only records, and almost 70% of the studies were carried out with plants and insects. Twenty-three different distribution modelling methods were used. The Genetic Algorithm for Rule-set Production (GARP) was used most often. Our scientometric analysis showed a growing interest in the use of ENMs to predict the distribution of invasive species, especially in the last decade, which is probably related to the increase in species introductions worldwide. Among some important gaps that need to be filled, the relatively small number of studies conducted in developing countries and in aquatic environments deserves careful attention. PMID:23295510
Statistical analysis of secondary particle distributions in relativistic nucleus-nucleus collisions
NASA Technical Reports Server (NTRS)
Mcguire, Stephen C.
1987-01-01
The use is described of several statistical techniques to characterize structure in the angular distributions of secondary particles from nucleus-nucleus collisions in the energy range 24 to 61 GeV/nucleon. The objective of this work was to determine whether there are correlations between emitted particle intensity and angle that may be used to support the existence of the quark gluon plasma. The techniques include chi-square null hypothesis tests, the method of discrete Fourier transform analysis, and fluctuation analysis. We have also used the method of composite unit vectors to test for azimuthal asymmetry in a data set of 63 JACEE-3 events. Each method is presented in a manner that provides the reader with some practical detail regarding its application. Of those events with relatively high statistics, Fe approaches 0 at 55 GeV/nucleon was found to possess an azimuthal distribution with a highly non-random structure. No evidence of non-statistical fluctuations was found in the pseudo-rapidity distributions of the events studied. It is seen that the most effective application of these methods relies upon the availability of many events or single events that possess very high multiplicities.
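The composite-unit-vector test for azimuthal asymmetry mentioned above can be illustrated with the mean resultant length of the emission angles: near zero for an azimuthally symmetric (uniform) distribution, approaching one for strongly aligned emission. This is a generic sketch of the statistic, not the JACEE analysis code:

```python
import math

def resultant_length(phis):
    """Mean resultant length of a set of azimuthal angles (radians):
    the magnitude of the average of the unit vectors (cos phi, sin phi).
    ~0 under azimuthal symmetry, -> 1 for fully aligned angles."""
    c = sum(math.cos(p) for p in phis) / len(phis)
    s = sum(math.sin(p) for p in phis) / len(phis)
    return math.hypot(c, s)
```

A significance level would then come from comparing the observed value against the distribution of the statistic for uniform angles (e.g. the Rayleigh test), which is the kind of null-hypothesis comparison the abstract describes.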
Cipelletti, Luca; Biron, Jean-Philippe; Martin, Michel; Cottet, Hervé
2015-08-18
Taylor dispersion analysis is an absolute and straightforward characterization method for determining the diffusion coefficient, or equivalently the hydrodynamic radius, over sizes ranging from angstroms to the submicron range. In this work, we investigated the use of the Constrained Regularized Linear Inversion approach as a new data processing method to extract the probability density functions of the diffusion coefficient (or hydrodynamic radius) from experimental taylorgrams. This new approach can be applied to arbitrary polydisperse samples and gives access to the whole diffusion coefficient distribution, thereby significantly enhancing the potential of Taylor dispersion analysis. The method was successfully applied to both simulated and real experimental data for solutions of moderately polydisperse polymers and their binary and ternary mixtures. Distributions of diffusion coefficients obtained by this method compared favorably with those derived from size exclusion chromatography. The influence of noise in the simulated taylorgrams on the data processing is discussed. Finally, we discuss the ability of the method to correctly resolve bimodal distributions as a function of the relative separation between the two constituent species. PMID:26243023
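The general idea behind constrained regularized linear inversion (least squares plus a Tikhonov penalty, under a non-negativity constraint on the recovered distribution) can be sketched with projected gradient descent. In a real application each column of A would be a model taylorgram computed for one candidate diffusion coefficient; the version below is a toy-scale illustration of the technique, not the authors' implementation:

```python
def crli(A, y, lam=1e-4, iters=5000):
    """Solve min ||A p - y||^2 + lam * ||p||^2 subject to p >= 0
    by projected gradient descent (pure Python, toy scale).
    Each column of A is the model signal of one candidate species;
    p is the recovered discretized distribution over candidates."""
    m, n = len(A), len(A[0])
    p = [0.0] * n
    # conservative step size from a crude Lipschitz bound on the gradient
    step = 0.5 / (sum(a * a for row in A for a in row) + lam)
    for _ in range(iters):
        r = [sum(A[i][k] * p[k] for k in range(n)) - y[i] for i in range(m)]
        g = [2.0 * (sum(A[i][k] * r[i] for i in range(m)) + lam * p[k])
             for k in range(n)]
        # gradient step followed by projection onto the non-negative orthant
        p = [max(0.0, p[k] - step * g[k]) for k in range(n)]
    return p
```

The regularization weight trades resolution against noise amplification, which is why the abstract studies the influence of taylorgram noise on the recovered distributions.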
Application of digital image analysis for size distribution measurements of microbubbles
Burns, S.E.; Yiacoumi, S.; Frost, D.; Tsouris, C.
1997-03-01
This work employs digital image analysis to measure the size distribution of microbubbles generated by electroflotation for use in solid/liquid separation processes. Microbubbles are used for separations in the mineral processing industry and in the treatment of potable water and wastewater. As the bubbles move upward in a solid/liquid column due to buoyancy, particles collide with and attach to the bubbles and are carried to the surface of the column, where they are removed by skimming. The removal efficiency of solids is strongly affected by bubble size; in general, smaller bubbles achieve higher separation. The primary focus of this study was to characterize the size and size distribution of bubbles generated in electroflotation using image analysis. The study found that bubble diameter increased slightly as the current density applied to the system was increased. Additionally, electroflotation produces a uniform bubble size with a narrow distribution, which optimizes the removal of fine particles from solution.
Rural tourism spatial distribution based on multi-criteria decision analysis and GIS
NASA Astrophysics Data System (ADS)
Zhang, Hongxian; Yang, Qingsheng
2008-10-01
Studying the spatial distribution of rural tourism can provide a scientific basis for decisions on developing rural economies. Traditional approaches to tourism spatial distribution have limitations in quantifying priority locations for tourism development at the level of small units: they can only produce overall distribution locations and a simple judgment of whether a location is suitable for tourism development, whereas the development ranking under different decision objectives should also be considered. This paper presents a way to rank locations for rural tourism development spatially by integrating multi-criteria decision analysis (MCDA) and geographic information systems (GIS). To develop rural economies in areas with inconvenient transportation, undeveloped economies and good tourism resources, such locations should be the first to develop rural tourism. Based on this objective, the tourism development priority utility of each town is calculated with MCDA and GIS, and towns with higher utility can be selected for rural tourism development first. The method was used successfully to rank locations for rural tourism in Ningbo City. The result shows that MCDA is an effective way to distribute rural tourism spatially according to specific decision objectives, and that rural tourism can promote economic development.
NASA Astrophysics Data System (ADS)
Baitimirova, M.; Osite, A.; Katkevics, J.; Viksna, A.
2012-08-01
Burning candles generates fine particulate matter that degrades indoor air quality and may therefore harm human health. In this study, solid aerosol particles from the burning of candles of different compositions and from kerosene combustion were collected in a closed laboratory system. The present work describes particulate matter collection for structure analysis and the relationship between the source and the size distribution of the particulate matter. The formation mechanism of the particulate matter and its tendency to agglomerate are also described. Particles obtained from kerosene combustion have a normal size distribution, whereas particles generated from burning stearin candles have a distribution shifted towards finer particle sizes. When stearin is added to a paraffin candle, the particle size distribution likewise shifts towards finer particles. Particles obtained from kerosene combustion show a tendency to form agglomerates within a short time, while particles obtained from burning candles of different compositions show no such tendency. Particles from candles and kerosene combustion are Aitken- and accumulation-mode particles.
NASA Astrophysics Data System (ADS)
Kundel, Harold L.; Polansky, Marcia
1998-04-01
Mixture distribution analysis (MDA) is proposed as a statistical methodology for comparing observer readings on different imaging modalities when the image findings cannot be independently verified. The study utilized a data set consisting of independent, blinded readings by 4 radiologists of a stratified sample of 95 bedside chest images obtained using computed radiography. Each case was read on both hard and soft copy. The area under the ROC curve (AUC) was calculated using ROCFIT, and the relative percent correct (RPC) was calculated from point distributions estimated by the MDA. The expectation-maximization algorithm was used to perform a maximum likelihood estimation of the fit to either 3, 4 or 5 point distributions. There was agreement between the AUC and the RPC, whether based upon 3 point distributions (easy normals, hard normals and abnormals, easy abnormals) or upon 4 point distributions (easy normals, hard normals, hard abnormals and easy abnormals). We conclude that the MDA may be a viable alternative to the ROC for evaluating observer performance on imaging modalities in clinical settings where image verification is either difficult or impossible.
NASA Astrophysics Data System (ADS)
Doebrich, Marcus; Markstaller, Klaus; Karmrodt, Jens; Kauczor, Hans-Ulrich; Eberle, Balthasar; Weiler, Norbert; Thelen, Manfred; Schreiber, Wolfgang G.
2005-04-01
In this study, an algorithm was developed to measure the distribution of pulmonary time constants (TCs) from dynamic computed tomography (CT) data sets during a sudden airway pressure step-up. Simulations with synthetic data were performed to test the methodology as well as the influence of experimental noise. Furthermore, the algorithm was applied to in vivo data. In five pigs, sudden changes in airway pressure were imposed during dynamic CT acquisition in healthy lungs and in a saline lavage ARDS model. The fractional gas content in the imaged slice (FGC) was calculated by density measurements for each CT image. Temporal variations of the FGC were analysed assuming a model with a continuous distribution of exponentially decaying time constants. The simulations proved the feasibility of the method, and the influence of experimental noise could be evaluated well. Analysis of the in vivo data showed that ventilation processes in healthy lungs are more readily characterized by discrete TCs, whereas in ARDS lungs continuous distributions of TCs are observed. The temporal behaviour of lung inflation and deflation can be characterized objectively using the new methodology described. This study indicates that continuous distributions of TCs reflect lung ventilation mechanics more accurately than discrete TCs.
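The forward model implied by the abstract (a step response built from exponentially decaying time constants) can be written down directly; a minimal sketch, where the baseline and step-amplitude values are illustrative placeholders rather than measured quantities:

```python
import math

def fgc_response(t, weights, taus, fgc0=0.2, dfgc=0.3):
    """Fractional gas content after a sudden airway-pressure step-up,
    modeled as a weighted sum of first-order exponential responses:
    FGC(t) = FGC0 + dFGC * sum_k w_k * (1 - exp(-t / tau_k)).
    `weights` should sum to 1; a single entry gives the discrete-TC
    case, while many entries approximate a continuous TC distribution."""
    s = sum(w * (1.0 - math.exp(-t / tau)) for w, tau in zip(weights, taus))
    return fgc0 + dfgc * s
```

Fitting the weights over a fixed grid of candidate time constants then turns the analysis of measured FGC(t) curves into a linear inversion problem, which is the kind of TC-distribution estimate the abstract describes.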
NASA Astrophysics Data System (ADS)
Alves, L. G. A.; Ribeiro, H. V.; Lenzi, E. K.; Mendes, R. S.
2014-09-01
We report on the connection between power-law distributions and allometries. As first reported in Gomez-Lievano et al. (2012) for the relationship between homicides and population, when urban indicators present asymptotic power-law distributions, they can also display specific allometries among themselves. Here, we present an extensive characterization of this connection for all possible pairs of relationships among twelve urban indicators of Brazilian cities (such as child labor, illiteracy, income, sanitation and unemployment). Our analysis reveals that all these urban indicators are asymptotically distributed as power laws and that the proposed connection also holds for our data when the allometric relationship displays sufficient correlation. We also found that not all allometric relationships are independent; they can be understood as a consequence of the allometric relationship between each urban indicator and the population size. We further show that the residual fluctuations surrounding the allometries are characterized by an almost constant variance and log-normal distributions.
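The central observation above (indicator-population allometries inducing indicator-indicator allometries) is easy to check on synthetic data: if Y1 ~ N^b1 and Y2 ~ N^b2, then Y1 ~ Y2^(b1/b2). A minimal log-log least-squares estimator, illustrative rather than the authors' pipeline:

```python
import math

def allometric_exponent(x, y):
    """Estimate beta in y = a * x**beta by ordinary least squares on
    the log-log scale (slope of log y regressed against log x)."""
    lx = [math.log(v) for v in x]
    ly = [math.log(v) for v in y]
    mx, my = sum(lx) / len(lx), sum(ly) / len(ly)
    sxx = sum((u - mx) ** 2 for u in lx)
    sxy = sum((u - mx) * (v - my) for u, v in zip(lx, ly))
    return sxy / sxx
```

On noiseless synthetic "cities" with exponents 1.2 and 0.8 against population, the induced indicator-indicator exponent comes out as their ratio 1.5, which is the dependence structure the abstract attributes to the shared population variable.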
Analysis of the Effects of Streamwise Lift Distribution on Sonic Boom Signature
NASA Technical Reports Server (NTRS)
Yoo, Paul
2013-01-01
Investigation of sonic boom has been one of the major areas of study in aeronautics due to the benefits a low-boom aircraft has in both civilian and military applications. This work conducts a numerical analysis of the effects of streamwise lift distribution on shock coalescence characteristics. A simple wing-canard-stabilator body model is used in the numerical simulation. The streamwise lift distribution is varied by fixing the canard at a deflection angle while trimming the aircraft with the wing and the stabilator at the desired lift coefficient. The lift and pitching moment coefficients are computed using Missile DATCOM v. 707. The flow field around the wing-canard-stabilator body model is resolved using the OVERFLOW-2 flow solver. Overset/chimera grid topology is used to simplify the grid generation of the various configurations representing different streamwise lift distributions. The numerical simulations are performed without viscosity unless it is required for numerical stability. All configurations are simulated at Mach 1.4, an angle of attack of 1.5 degrees, a lift coefficient of 0.05, and a pitching moment coefficient of approximately 0. Four streamwise lift distribution configurations were tested.
Pore space analysis of NAPL distribution in sand-clay media
Matmon, D.; Hayden, N.J.
2003-01-01
This paper introduces a conceptual model of clays and non-aqueous phase liquids (NAPLs) at the pore scale, developed from a mathematical unit cell model and from direct micromodel observation and measurement of clay-containing porous media. The mathematical model uses a unit cell concept with uniform spherical grains to simulate the sand in the sand-clay matrix (~10% clay). Micromodels made with glass slides and containing different clay-bearing porous media were used to investigate the distribution of the two clays (kaolinite and montmorillonite) and of NAPL within the pore space. The results were used to understand the distribution of NAPL advancing into initially saturated sand and sand-clay media, and provided a detailed analysis of the pore-scale geometry, pore size distribution, NAPL entry pressures, and the effect of clay on this geometry. Interesting NAPL saturation profiles were observed as a result of the complexity of the pore space geometry, the different packing angles and the presence of clays. The unit cell approach has applications for enhancing the mechanistic understanding and conceptualization, both visually and mathematically, of pore-scale processes such as NAPL and clay distribution.
Lee, Eun Gyung; Kim, Seung Won; Feigley, Charles E; Harper, Martin
2013-01-01
This study introduces two semi-quantitative methods, Structured Subjective Assessment (SSA) and Control of Substances Hazardous to Health (COSHH) Essentials, in conjunction with two-dimensional Monte Carlo simulations for determining prior probabilities. Prior distribution using expert judgment was included for comparison. Practical applications of the proposed methods were demonstrated using personal exposure measurements of isoamyl acetate in an electronics manufacturing facility and of isopropanol in a printing shop. Applicability of these methods in real workplaces was discussed based on the advantages and disadvantages of each method. Although these methods could not be completely independent of expert judgments, this study demonstrated a methodological improvement in the estimation of the prior distribution for the Bayesian decision analysis tool. The proposed methods provide a logical basis for the decision process by considering determinants of worker exposure. PMID:23252451
Phylogenetic Analysis and Comparative Genomics of Purine Riboswitch Distribution in Prokaryotes
Singh, Payal; Sengupta, Supratim
2012-01-01
Riboswitches are regulatory RNA that control gene expression by undergoing conformational changes on ligand binding. Using phylogenetic analysis and comparative genomics we have been able to identify the class of genes/operons regulated by the purine riboswitch and obtain a high-resolution map of purine riboswitch distribution across all bacterial groups. In the process, we are able to explain the absence of purine riboswitches upstream to specific genes in certain genomes. We also identify the point of origin of various purine riboswitches and argue that not all purine riboswitches are of primordial origin, and that some purine riboswitches must have originated after the divergence of certain Firmicute orders in the course of evolution. Our study also reveals the role of horizontal transfer events in accounting for the presence of purine riboswitches in some gammaproteobacterial species. Our work provides significant insights into the origin, distribution and regulatory role of purine riboswitches in prokaryotes. PMID:23170063
NASA Technical Reports Server (NTRS)
Costello, Thomas A.; Brandt, C. Maite
1989-01-01
Simulation and analysis results are described for a wideband fiber optic intermediate frequency distribution channel for a frequency division multiple access (FDMA) system in which the antenna equipment is remotely located from the signal processing equipment. The fiber optic distribution channel accommodates multiple signals received from a single antenna with differing power levels. The performance parameters addressed are intermodulation degradations, laser noise, and adjacent channel interference, as they impact the overall system design. Simulation results showed that the laser diode modulation level can be allowed to reach 100 percent without considerable degradation. The laser noise must be controlled so as to provide a noise floor of less than -90 dBW/Hz. The fiber optic link increases the degradation due to power imbalance yet diminishes the effects of the transmit amplifier nonlinearity. Overall, operating conditions can be found that yield a degradation level of about 0.1 dB caused by the fiber optic link.
Analysis of radial distribution of plasma parameters in a coaxial-line microwave discharge tube
NASA Astrophysics Data System (ADS)
Kato, Isamu; Hara, Shinji; Wakana, Shin-ichi
1983-09-01
The discharge mechanism in a coaxial-line microwave discharge tube that produces uniform plasma along the circumference of the tube has been analyzed. The analysis includes the skin effect of microwave penetration into the plasma and the equilibrium between generation and loss of the charged particles. It is shown that the radial distributions of electron density and electron temperature can be calculated from measured values of the radial distribution of light intensity. The electron density is about 10^12 cm^-3 near the center axis of the plasma column, with a profile close to trapezoidal in shape. The electron temperature is about 4×10^4 K near the tube wall and decreases monotonically toward the center of the plasma column.
Electron-beam radial distribution analysis of irradiation-induced amorphous SiC
NASA Astrophysics Data System (ADS)
Ishimaru, Manabu
2006-09-01
Advanced electron microscopy techniques have been employed to examine atomistic structures of ion-beam-induced amorphous silicon carbide (SiC). Single crystals of 4H-SiC were irradiated at a cryogenic temperature (120 K) with 300 keV Xe ions to a fluence of 10^15 cm^-2. The continuous amorphous layer formed on the topmost layer of the SiC substrate was characterized by energy-filtering transmission electron microscopy in combination with imaging plate techniques. Atomic pair-distribution functions obtained by a quantitative analysis of energy-filtered electron diffraction patterns revealed that amorphous SiC networks contain heteronuclear Si-C bonds, as well as homonuclear Si-Si and C-C bonds, within the first coordination shell. The effects of inelastically scattered electrons on atomic pair-distribution functions are discussed.
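The pair-distribution-function step (transforming the reduced structure factor F(Q) = Q[S(Q) - 1] extracted from a diffraction pattern into real space via G(r) = (2/pi) * integral of F(Q) sin(Qr) dQ) can be sketched as a numerical sine Fourier transform. This is a generic illustration of the standard transform, not the authors' reduction code:

```python
import math

def reduced_pdf(q, fq, r_values):
    """Reduced pair-distribution function G(r) from sampled F(Q),
    G(r) = (2/pi) * integral F(Q) sin(Q r) dQ, by the trapezoidal rule.
    `q` and `fq` are equal-length samples of Q and F(Q) = Q[S(Q) - 1]."""
    out = []
    for r in r_values:
        acc = 0.0
        for i in range(1, len(q)):
            f1 = fq[i - 1] * math.sin(q[i - 1] * r)
            f2 = fq[i] * math.sin(q[i] * r)
            acc += 0.5 * (f1 + f2) * (q[i] - q[i - 1])
        out.append(2.0 / math.pi * acc)
    return out
```

Peaks of G(r) at the Si-C, Si-Si and C-C bond lengths are what distinguish the hetero- and homonuclear bonds within the first coordination shell, as reported above; the finite experimental Q range limits the attainable r resolution.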
EXERGY ANALYSIS OF THE CRYOGENIC HELIUM DISTRIBUTION SYSTEM FOR THE LARGE HADRON COLLIDER (LHC)
Claudet, S.; Lebrun, Ph.; Tavian, L.; Wagner, U.
2010-04-09
The Large Hadron Collider (LHC) at CERN features the world's largest helium cryogenic system, spreading over the 26.7 km circumference of the superconducting accelerator. With a total equivalent capacity of 145 kW at 4.5 K, including 18 kW at 1.8 K, the LHC refrigerators produce an unprecedented exergetic load, which must be distributed efficiently to the magnets in the tunnel over the 3.3 km length of each of the eight independent sectors of the machine. We recall the main features of the LHC cryogenic helium distribution system at its different temperature levels and present its exergy analysis, enabling the second-principle efficiency to be quantified and the main remaining sources of irreversibility to be identified.
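The exergetic value of a refrigeration load is the minimum (Carnot) work needed to extract heat at the cold temperature while rejecting it at ambient, Ex = Q(T0/Tc - 1). A minimal sketch, with an assumed ambient of 300 K (the temperatures below are the abstract's figures used purely as inputs):

```python
def exergy_of_cooling(q_watts, t_cold, t_ambient=300.0):
    """Minimum (Carnot) work rate, in watts, to absorb q_watts at
    t_cold and reject the heat at t_ambient:
    Ex = Q * (T_ambient / T_cold - 1)."""
    return q_watts * (t_ambient / t_cold - 1.0)
```

At 4.5 K each watt of cooling is worth roughly 66 W of ideal work, and at 1.8 K roughly 166 W, which is why the 1.8 K stage dominates the exergetic load per watt; the second-principle efficiency is this ideal work divided by the actual electrical input.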
Quantum-dot size-distribution analysis and precipitation stages in semiconductor doped glasses
NASA Astrophysics Data System (ADS)
Liu, Li-Chi; Risbud, Subhash H.
1990-07-01
The sequence of stages during precipitation of semiconductor (e.g., CdS, CdSe) clusters from supersaturated glasses exhibiting quantum-confinement effects was investigated. The rate of formation of nanometer-size "quantum dots" distributed in a continuous glass matrix is critically determined by the time and temperature of the heat treatment given to the quenched glasses. The entire precipitation process was analyzed in terms of several decomposition stages: nucleation, normal growth, coalescence of quantum dots, and devitrification of the glass matrix itself. Experimental data obtained by differential thermal analysis were utilized to identify the heat-treatment temperature range associated with the precipitation stages. The size distribution of CdSe quantum-dot clusters was analyzed using our transmission electron microscopy data. The data of Ekimov et al. [Solid State Commun. 56, 921 (1975)] were reduced to time-temperature master plots useful for precipitating quantum dots of a given size in glasses.
Streekstra, G. J.; Atasever, B.; van Zijderveld, R.; Ince, C.
2008-01-01
This study describes a new method for analyzing microcirculatory videos. It introduces algorithms for quantitative assessment of vessel length, diameter, the functional microcirculatory density distribution and red blood-cell (RBC) velocity in individual vessels as well as its distribution. The technique was validated and compared to commercial software. The method was applied to the sublingual microcirculation in a healthy volunteer and in a patient during cardiac surgery. Analysis time was reduced from hours to minutes compared to previous methods requiring manual vessel identification. Vessel diameter was detected with high accuracy (>80%, d > 3 pixels). Capillary length was estimated within 5 pixels accuracy. Velocity estimation was very accurate (>95%) in the range [2.5, 1,000] pixels/s. RBC velocity was reduced by 70% during the first 10 s of cardiac luxation. The present method has been shown to be fast and accurate and provides increased insight into the functional properties of the microcirculation. PMID:18427850
Analysis of temperature distribution in a pipe with inner mineral deposit
NASA Astrophysics Data System (ADS)
Joachimiak, Magda; Ciałkowski, Michał; Bartoszewicz, Jarosław
2014-06-01
The paper presents the results of calculations determining temperature distributions in the steel pipe of a heat exchanger, taking into account inner mineral deposits. Calculations have been carried out for silicate-based scale, which is characterized by a low heat transfer coefficient. Deposits with the lowest heat conduction coefficients have a particularly strong impact on the strength of thermally loaded elements. The analysis takes into account the location of the thermocouple and imperfections in its installation. The paper presents the influence of the accuracy with which the heat flux on the external pipe wall is determined on the temperature distribution. The influence of the magnitude of the heat flux disturbance on the deposit thickness has also been analyzed.
Preliminary analysis of the span-distributed-load concept for cargo aircraft design
NASA Technical Reports Server (NTRS)
Whitehead, A. H., Jr.
1975-01-01
A simplified computer analysis of the span-distributed-load airplane (in which payload is placed within the wing structure) has shown that the span-distributed-load concept has high potential for application to future air cargo transport design. Significant increases in payload fraction over current wide-bodied freighters are shown for gross weights in excess of 0.5 Gg (1,000,000 lb). A cruise-matching calculation shows that the trend toward higher aspect ratio improves overall efficiency; that is, less thrust and fuel are required. The optimal aspect ratio probably is not determined by structural limitations. Terminal-area constraints and increasing design-payload density, however, tend to limit aspect ratio.
Holländer, H; Wickelmaier, M; Pastor, W
1976-05-01
A coordinate recording microscope equipped with a rolling disc planimeter is described, and its application to the analysis of cell size distribution in histologic preparations is demonstrated. The microscope stage is moved in the x and y directions by two digital micrometer spindles. The planimeter is equipped with a rotary encoder. The spindles and the rotary encoder are connected to digital counters whose outputs feed a multiplexer. The multiplexer outputs the data to a teletype terminal both as hard copy and on paper tape. To avoid repeated measurements, a video-acoustic monitoring system is provided. Processing of the data is carried out off-line with an IBM 1130 computer, on which the measured cells are displayed as points or circles on an oscilloscope. The distribution of cells of any size can be studied topographically in any region of the field of measurement. PMID:967019
ImageJ analysis of dentin tubule distribution in human teeth.
Williams, Casia; Wu, Yiching; Bowers, Doria F
2015-08-01
Mapping the distribution of dentin tubules is vital to understanding the structure-function relationship of dentin, an important indicator of tooth stability. This study compared the distances between and density of tubules in the external dentin located in the crown region of an adult human incisor and molar, to determine whether the analysis could be conducted using light-level microscopy. Teeth were processed for routine histology and cut in cross-section, images were captured using the Advanced SPOT Program, and the microstructure was analyzed using ImageJ (NIH). Intratubular (peritubular) dentin with or without odontoblast processes was observed, and although incisor and molar images appeared visually similar, their plot profile graphs differed. Distance intervals between tubules in the incisor (5.45-7.67 µm) had an overall range of 2.22 µm and in the molar (7.43-8.42 µm) an overall range of 0.99 µm. While the molar tubule distribution displayed a tighter overall range, the distance between most incisor tubules was smaller. The average density observed in incisors was 15,500 tubules/mm^2, compared with 20,100 tubules/mm^2 in molars. ImageJ analysis of prepared histology slides provides researchers with a rapid, inexpensive assessment tool compared with advanced/ultrastructural methodologies. By combining routine histological processing and light microscopic observations with ImageJ analysis, tooth structure can be converted into numerical data and the technique easily mastered by laboratory personnel. PMID:26150311
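The density and spacing figures above are related by simple geometry. A minimal sketch; the field area and count below are hypothetical inputs, and the square-packing assumption behind the spacing estimate is only an approximation:

```python
import math

def tubule_density_mm2(count, field_area_um2):
    """Tubules per mm^2 from a raw count over a field of known
    area given in square micrometers (1 mm^2 = 1e6 um^2)."""
    return count * 1.0e6 / field_area_um2

def mean_spacing_um(density_mm2):
    """Approximate center-to-center tubule spacing in micrometers,
    assuming a roughly square arrangement of tubules."""
    return 1000.0 / math.sqrt(density_mm2)
```

A density of about 20,100 tubules/mm^2 implies a spacing of roughly 7 µm under this approximation, of the same order as the molar distance intervals reported above.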
Gerstl, S.A.W.; LaBauve, R.J.; Young, P.G.
1980-05-01
Using the example of General Atomic's well-documented Power Generating Fusion Reactor (PGFR) design, this report exercises a comprehensive neutron cross-section and secondary energy distribution (SED) uncertainty analysis. The LASL sensitivity and uncertainty analysis code SENSIT is used to calculate reaction cross-section sensitivity profiles and integral SED sensitivity coefficients. These are then folded with covariance matrices and integral SED uncertainties to obtain the resulting uncertainties of three calculated neutronics design parameters: two critical radiation damage rates and a nuclear heating rate. The report documents the first sensitivity-based data uncertainty analysis that incorporates a quantitative treatment of the effects of SED uncertainties. The results demonstrate quantitatively that the ENDF/B-V cross-section data files for C, H, and O, including their SED data, are fully adequate for this design application, while the data for Fe and Ni are at best marginally adequate because they give rise to response uncertainties of up to 25%. Much higher response uncertainties are caused by cross-section and SED data uncertainties in Cu (26 to 45%), tungsten (24 to 54%), and Cr (up to 98%). Specific recommendations are given for re-evaluations of certain reaction cross-sections, secondary energy distributions, and uncertainty estimates.
Barnes, P.R.; Das, S.; McConnell, B.W.; Van Dyke, J.W.
1997-09-01
This report contains additional information for use by the US Department of Energy in making a determination on proposing energy conservation standards for distribution transformers as required by the Energy Policy Act of 1992. An earlier determination study by the Oak Ridge National Laboratory found that cost-effective, technically feasible energy savings could be achieved by distribution transformer standards and that these savings are significant relative to other product conservation standards. This study was documented in a final report, "Determination Analysis of Energy Conservation Standards for Distribution Transformers" (ORNL-6847, July 1996). The energy conservation options analyzed in that study were estimated to save 5.2 to 13.7 quads from 2000-2030. The energy savings for the determination study cases have been revised downward for a number of reasons. The transformer market, both present and future, was overestimated in the previous study, particularly for dry-type transformers, which have the greatest energy-saving potential. Moreover, a downward revision of the effective annual loads for utility-owned transformers also results in lower energy savings. The present study assesses four of the five conservation cases from the earlier determination study, as well as the National Electrical Manufacturers Association energy efficiency standard NEMA TP 1-1996, using the updated data and a more accurate disaggregated analysis model. According to these new estimates, the savings ranged from 2.5 to 10.7 quads of primary energy for the 30-year period 2004 to 2034. For the TP-1 case, data were available to calculate the payback period required to recover the extra cost from the value of the energy saved. The average payback period, based on the average national cost of electricity, is 2.76 years. 15 figs., 23 tabs.
Three-dimensional radiation dose distribution analysis for boron neutron capture therapy
Wheeler, F.J.; Nigg, D.W. (Idaho National Engineering Lab., EG and G Idaho, Inc., Idaho Falls, ID (US))
1992-01-01
This paper reports that calculation of physically realistic radiation dose distributions for boron neutron capture therapy (BNCT) is a complex, three-dimensional problem. Traditional one-dimensional (slab) and two-dimensional (cylindrical) models, while useful for neutron beam design and performance analysis, do not provide sufficient accuracy for actual clinical use because the assumed symmetries inherent in such models do not ordinarily exist in the real world. Fortunately, however, it is no longer necessary to make these types of simplifying assumptions. Recent dramatic advances in computing technology have brought full three-dimensional dose distribution calculations for BNCT into the realm of practicality for a wide variety of routine applications. Once a geometric model and the appropriate material compositions have been determined, either stochastic (Monte Carlo) or deterministic calculations of all dose components of interest can now be performed rapidly and inexpensively for the true three-dimensional geometries typical of actual clinical applications of BNCT. Demonstrations of both Monte Carlo and deterministic techniques for performing three-dimensional dose distribution analysis for BNCT are provided. Calculated results are presented for a three-dimensional Lucite canine-head phantom irradiated in the epithermal neutron beam available at the Brookhaven Medical Research Reactor. The deterministic calculations are performed using the three-dimensional discrete ordinates method. The Monte Carlo calculations employ a novel method for obtaining spatially detailed radiation flux and dose distributions without the use of flux-at-a-point estimators. The calculated results are in good agreement with each other and with thermal neutron flux measurements taken using copper-gold flux wires placed at various locations in the phantom.
Equity in the distribution of CT and MRI in China: a panel analysis
2013-01-01
Introduction China is facing a daunting challenge to health equity in the context of rapid economic development. This study adds to the literature by examining equity in the distribution of high-technology medical equipment, such as CT and MRI, in China. Methods A panel analysis was conducted with information about four study sites in 2006 and 2009. The four provincial-level study sites included Shanghai, Zhejiang, Shaanxi, and Hunan, representing different geographical, economic, and medical technology levels in China. A random sample of 71 hospitals was selected from the four sites. Data were collected through questionnaire surveys. Equity status was assessed in terms of CT and MRI numbers, characteristics of the machines, and financing sources. The assessment was conducted at multiple levels, including international, provincial, city, and hospital level. In addition to comparison among the study sites, the sample was compared with OECD countries in CT and MRI distributions. Results China had lower numbers of CTs and MRIs per million population in 2009 than most of the selected OECD countries, while the increases in its CT and MRI numbers from 2006 to 2009 were higher than in most of the OECD countries. The equity status of CT distribution remained at a low inequality level in both 2006 and 2009, while the equity status of MRI distribution improved from high inequality in 2006 to moderate inequality in 2009. Despite the equity improvement, the distributions of CTs and MRIs were significantly positively correlated with economic development level across all cities in the four study sites in both 2006 and 2009. Our analysis also revealed that Shanghai, the study site with the highest level of economic development, had more advanced CT and MRI machines, more imported CTs and MRIs, and higher government subsidies for these two types of equipment. Conclusions The number of CTs and MRIs increased considerably in China from 2006 to 2009.
The equity status of CTs was better than that of MRIs, although the equity status of MRI distribution improved from 2006 to 2009. Still, considerable inequality exists in terms of characteristics and financing of CTs and MRIs. PMID:23742755
Flow distribution analysis on the cooling tube network of ITER thermal shield
NASA Astrophysics Data System (ADS)
Nam, Kwanwoo; Chung, Wooho; Noh, Chang Hyun; Kang, Dong Kwon; Kang, Kyoung-O.; Ahn, Hee Jae; Lee, Hyeon Gon
2014-01-01
The thermal shield (TS) is to be installed between the vacuum vessel or the cryostat and the magnets in the ITER tokamak to reduce the thermal radiation load on the magnets operating at 4.2 K. The TS is cooled by pressurized helium gas at an inlet temperature of 80 K. The cooling tube is welded on the TS panel surface, and the resulting flow network of TS cooling tubes is complex. The flow rate in each panel should be matched to the thermal design value for effective radiation shielding. This paper presents a one-dimensional analysis of the flow distribution in the cooling tube network of the ITER TS. The hydraulic cooling tube network is modeled by an electrical analogy. Only the cooling tube on the TS surface and its connecting pipe from the manifold are considered in the analysis model. Considering the friction factor and the local losses in the cooling tube, the hydraulic resistance is expressed as a linear function of mass flow rate. Sub-circuits in the TS are analyzed separately because each circuit is controlled independently by its own control valve. It is found that the flow rates in some panels are insufficient compared with the design values. In order to improve the flow distribution, two design modifications are proposed. The first is to connect the tubes of adjacent panels, which increases the resistance of the tube on the panel where the flow rate is excessive. The second is to install an orifice at the exit of the tube routing where the flow rate is to be reduced. The analysis of these design suggestions shows that the flow maldistribution is improved significantly.
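The electrical analogy above can be sketched for parallel branches fed by one manifold: pressure drop plays the role of voltage and mass flow that of current, and, per the abstract, each branch resistance is a linear function of mass flow, R_i(m) = a_i + b_i m. The coefficients below are illustrative assumptions, not ITER TS values.

```python
import numpy as np

def solve_parallel_flow(a, b, m_total, iters=200):
    """Split m_total among parallel branches so every branch sees the same
    pressure drop, with branch resistance R_i(m) = a_i + b_i * m."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    m = np.full(len(a), m_total / len(a))   # start from an even split
    for _ in range(iters):                  # fixed-point iteration
        G = 1.0 / (a + b * m)               # conductance at the current flow
        dp = m_total / G.sum()              # common pressure drop (V = I / G_total)
        m = G * dp                          # branch flows (I_i = G_i * V)
    return m, dp

# Three branches with increasing resistance share a total flow of 3.0
flows, dp = solve_parallel_flow(a=[1.0, 2.0, 4.0], b=[0.5, 0.5, 0.5], m_total=3.0)
```

The branch with the smallest resistance receives the most flow; adding an orifice (an extra local loss that raises a_i) throttles an over-supplied branch, which is the effect exploited by the proposed design fix.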
Knowledge-based assistance for science visualization and analysis using large distributed databases
NASA Technical Reports Server (NTRS)
Handley, Thomas H., Jr.; Jacobson, Allan S.; Doyle, Richard J.; Collins, Donald J.
1992-01-01
Within this decade, the growth in complexity of exploratory data analysis and the sheer volume of space data require new and innovative approaches to support science investigators in achieving their research objectives. To date, there have been numerous efforts addressing the individual issues involved in inter-disciplinary, multi-instrument investigations. However, while successful in small scale, these efforts have not proven to be open and scaleable. This proposal addresses four areas of significant need: scientific visualization and analysis; science data management; interactions in a distributed, heterogeneous environment; and knowledge-based assistance for these functions. The fundamental innovation embedded within this proposal is the integration of three automation technologies, namely, knowledge-based expert systems, science visualization and science data management. This integration is based on the concept called the Data Hub. With the Data Hub concept, NASA will be able to apply a more complete solution to all nodes of a distributed system. Both computation nodes and interactive nodes will be able to effectively and efficiently use the data services (access, retrieval, update, etc.) with a distributed, interdisciplinary information system in a uniform and standard way. This will allow the science investigators to concentrate on their scientific endeavors, rather than to involve themselves in the intricate technical details of the systems and tools required to accomplish their work. Thus, science investigators need not be programmers. The emphasis will be on the definition and prototyping of system elements with sufficient detail to enable data analysis and interpretation leading to publishable scientific results. In addition, the proposed work includes all the required end-to-end components and interfaces to demonstrate the completed concept.
Knowledge-based assistance for science visualization and analysis using large distributed databases
NASA Technical Reports Server (NTRS)
Handley, Thomas H., Jr.; Jacobson, Allan S.; Doyle, Richard J.; Collins, Donald J.
1993-01-01
Within this decade, the growth in complexity of exploratory data analysis and the sheer volume of space data require new and innovative approaches to support science investigators in achieving their research objectives. To date, there have been numerous efforts addressing the individual issues involved in inter-disciplinary, multi-instrument investigations. However, while successful in small scale, these efforts have not proven to be open and scalable. This proposal addresses four areas of significant need: scientific visualization and analysis; science data management; interactions in a distributed, heterogeneous environment; and knowledge-based assistance for these functions. The fundamental innovation embedded within this proposal is the integration of three automation technologies, namely, knowledge-based expert systems, science visualization and science data management. This integration is based on a concept called the DataHub. With the DataHub concept, NASA will be able to apply a more complete solution to all nodes of a distributed system. Both computation nodes and interactive nodes will be able to effectively and efficiently use the data services (access, retrieval, update, etc.) with a distributed, interdisciplinary information system in a uniform and standard way. This will allow the science investigators to concentrate on their scientific endeavors, rather than to involve themselves in the intricate technical details of the systems and tools required to accomplish their work. Thus, science investigators need not be programmers. The emphasis will be on the definition and prototyping of system elements with sufficient detail to enable data analysis and interpretation leading to publishable scientific results. In addition, the proposed work includes all the required end-to-end components and interfaces to demonstrate the completed concept.
Damos, Petros; Soulopoulou, Polyxeni
2015-01-01
Temperature implies contrasting biological causes of demographic aging in poikilotherms. In this work, we used reliability theory to describe the consistency of mortality with age in moth populations and to show that differentiation in hazard rates is related to extrinsic environmental causes such as temperature. Moreover, experiments that manipulate extrinsic mortality were used to distinguish temperature-related death rates and to assess the pertinence of the Weibull aging model. The Newton-Raphson optimization method was applied to calculate parameters for small samples of ages at death by estimating the maximum likelihood surfaces using scored gradient vectors and the Hessian matrix. The study reveals for the first time that the Weibull function is able to describe contrasting biological causes of demographic aging for moth populations maintained at different temperature regimes. We demonstrate that under favourable conditions the insect death rate accelerates as age advances, in contrast to the extreme temperatures, at which each individual drifts toward death in a linear fashion and has a constant chance of dying. Moreover, the slope of the hazard rates shifts towards a constant initial rate, a pattern demonstrated by systems which are not wearing out (i.e. non-aging), since failure, or death, is a random event independent of time. This finding may appear surprising because, traditionally, it was widely assumed that in an aging population the force of mortality increases exponentially until all individuals have died. Moreover, in relation to other studies, we have not observed any typical decelerating aging patterns at late life (mortality leveling off), but rather accelerated hazard rates at optimum temperatures and a stabilized increase at the extremes. In most cases, the increase in aging-related mortality was simulated reasonably well by the Weibull survivorship model that was applied.
Moreover, semi-log probability hazard rate plots and maximum likelihoods may be useful in defining periods of mortality leveling off, and provide clear evidence that environmental variability may affect parameter estimates and insect population failure rates. From a reliability theory standpoint, failure rates vary according to a linear function of age at the extremes, indicating that the life system (i.e., the population) is able to eliminate earlier failure and/or to keep later failure rates constant. The applied model was able to identify the major correlates of extended longevity and to suggest new ideas for using demographic concepts in both basic and applied population biology and aging. PMID:26317217
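The Newton-Raphson estimation described above can be sketched for the two-parameter Weibull: iterating on the score equation for the shape k (the scale then follows in closed form), and evaluating the hazard h(t), which rises with age for k > 1 and is constant for k = 1, the "non-aging" pattern the abstract reports at temperature extremes. This is an illustrative sketch, not the authors' exact implementation.

```python
import numpy as np

def weibull_mle(t, k0=1.0, tol=1e-10, max_iter=100):
    """Newton-Raphson MLE of Weibull shape k and scale lam from ages at death t."""
    t = np.asarray(t, float)
    logt = np.log(t)
    k = k0
    for _ in range(max_iter):
        tk = t ** k
        A, B, C = tk.sum(), (tk * logt).sum(), (tk * logt ** 2).sum()
        f = B / A - 1.0 / k - logt.mean()             # score equation for k
        fp = (C * A - B * B) / A ** 2 + 1.0 / k ** 2  # its derivative
        step = f / fp
        k -= step
        if abs(step) < tol:
            break
    lam = np.mean(t ** k) ** (1.0 / k)                # closed-form scale given k
    return k, lam

def weibull_hazard(t, k, lam):
    """h(t) = (k/lam)*(t/lam)**(k-1): accelerating for k > 1, constant for k = 1."""
    return (k / lam) * (t / lam) ** (k - 1)
```

Fitting simulated ages at death recovers the generating parameters, and setting k = 1 reproduces the age-independent death rate observed at extreme temperatures.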
Development of a Database to Support a Multi-Scale Analysis of the Distribution of Westslope Cutthroat Trout (WCT). A petition was filed with the U.S. Fish and Wildlife Service to list WCT as threatened under the Endangered Species Act.
Characterizing the distribution of an endangered salmonid using environmental DNA analysis
Laramie, Matthew; Pilliod, David S.; Goldberg, Caren S.
2015-01-01
Determining species distributions accurately is crucial to developing conservation and management strategies for imperiled species, but a challenging task for small populations. We evaluated the efficacy of environmental DNA (eDNA) analysis for improving detection and thus potentially refining the known distribution of Chinook salmon (Oncorhynchus tshawytscha) in the Methow and Okanogan Subbasins of the Upper Columbia River, which span the border between Washington, USA and British Columbia, Canada. We developed an assay to target a 90 base pair sequence of Chinook DNA and used quantitative polymerase chain reaction (qPCR) to quantify the amount of Chinook eDNA in triplicate 1-L water samples collected at 48 stream locations in June and again in August 2012. The overall probability of detecting Chinook with our eDNA method in areas within the known distribution was 0.77 (±0.05 SE). Detection probability was lower in June (0.62, ±0.08 SE) during high flows and at the beginning of spring Chinook migration than during base flows in August (0.93, ±0.04 SE). In the Methow subbasin, mean eDNA concentration was higher in August compared to June, especially in smaller tributaries, probably resulting from the arrival of spring Chinook adults, reduced discharge, or both. Chinook eDNA concentrations did not appear to change in the Okanogan subbasin from June to August. Contrary to our expectations about downstream eDNA accumulation, Chinook eDNA did not decrease in concentration in upstream reaches (0–120 km). Further examination of factors influencing spatial distribution of eDNA in lotic systems may allow for greater inference of local population densities along stream networks or watersheds. These results demonstrate the potential effectiveness of eDNA detection methods for determining landscape-level distribution of anadromous salmonids in large river systems.
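The triplicate sampling design above lends itself to a simple calculation: if each 1-L replicate at a site detects the target independently with probability p, the site-level detection probability is 1 - (1 - p)^3. The independence assumption is ours (the study estimates detection probability with a formal statistical model); the numbers reuse the reported site-level values to back out an implied per-replicate probability.

```python
def site_detection(p_replicate, n=3):
    """Probability that at least one of n replicates detects the target."""
    return 1.0 - (1.0 - p_replicate) ** n

def replicate_from_site(p_site, n=3):
    """Per-replicate detection probability implied by a site-level probability."""
    return 1.0 - (1.0 - p_site) ** (1.0 / n)

# Implied per-replicate probabilities for the reported site-level values:
p_june = replicate_from_site(0.62)    # June, high flows
p_august = replicate_from_site(0.93)  # August, base flows
```

The calculation makes the seasonal contrast concrete: the August site-level value of 0.93 implies a much higher per-replicate detection probability than June's 0.62, consistent with more adults present and lower discharge.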
Analysis and improvement of data-set level file distribution in Disk Pool Manager
NASA Astrophysics Data System (ADS)
Cadellin Skipsey, Samuel; Purdie, Stuart; Britton, David; Mitchell, Mark; Bhimji, Wahid; Smith, David
2014-06-01
Of the three most widely used implementations of the WLCG Storage Element specification, Disk Pool Manager [1, 2] (DPM) has the simplest implementation of file placement balancing (StoRM doesn't attempt this, leaving it up to the underlying filesystem, which can be very sophisticated in itself). DPM uses a round-robin algorithm (with optional filesystem weighting) for placing files across filesystems and servers. This does a reasonable job of evenly distributing files across the storage array provided to it. However, it does not offer any guarantees of the evenness of distribution of the subset of files associated with a given "dataset" (which often maps onto a "directory" in the DPM namespace (DPNS)). It is useful to consider a concept of "balance", where an optimally balanced set of files is distributed evenly across all of the pool nodes. In the best case the round-robin algorithm maintains balance; it has no mechanism to improve balance. In the past year or more, larger DPM sites have noticed load spikes on individual disk servers, and suspected that these were exacerbated by excesses of files from popular datasets on those servers. We present here a software tool which analyses file distribution for all datasets in a DPM SE, providing a measure of how poorly files are placed in this context. Further, the tool provides a list of file movement actions which will improve dataset-level file distribution, and can perform those file movements itself. We present results of such an analysis on the UKI-SCOTGRID-GLASGOW Production DPM.
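A dataset-level balance measure of the kind the tool computes can be sketched as follows: for one dataset, compare the most-loaded server's file count with the count a perfectly even spread would give. The metric definition and example counts are our own illustration, not DPM internals.

```python
from collections import Counter

def imbalance(files_per_server, n_servers):
    """0.0 for a perfectly even spread of one dataset's files across the pool;
    grows as the dataset's files pile onto few servers."""
    ideal = sum(files_per_server.values()) / n_servers
    worst = max(files_per_server.get(s, 0) for s in range(n_servers))
    return worst / ideal - 1.0

even = Counter({0: 5, 1: 5, 2: 5})     # round-robin best case: balanced
skewed = Counter({0: 13, 1: 1, 2: 1})  # popular dataset concentrated on one server
```

A tool like the one described would rank datasets by this measure and emit file movements (from the most-loaded to the least-loaded server) until the measure falls below a threshold.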
Complete Distributed Hyper-Entangled-Bell-State Analysis and Quantum Super Dense Coding
NASA Astrophysics Data System (ADS)
Zheng, Chunhong; Gu, Yongjian; Li, Wendong; Wang, Zhaoming; Zhang, Jiying
2015-07-01
We propose a protocol to implement the distributed hyper-entangled-Bell-state analysis (HBSA) for photonic qubits with weak cross-Kerr nonlinearities, QND photon-number-resolving detection, and some linear optical elements. The distinct feature of our scheme is that the BSA for two different degrees of freedom can be implemented deterministically and nondestructively. Based on the present HBSA, we achieve quantum super dense coding with double information capacity, which makes our scheme more significant for long-distance quantum communication.
Oliver García, Elena; González de la Hoz, Santiago
The first study of jet substructure on LHC data was performed by the ATLAS experiment. The jet algorithm chosen was AntiKt with R-parameter = 1.0. This study has been important for checking the behaviour of the substructure variables, which allow boosted objects to be distinguished from background. The computing part of this study has been of great importance because of the work done within the ATLAS Spanish Tier-2 federation on understanding its performance and operations. This made it possible to access hundreds of millions of events using Grid technologies for Distributed Analysis. This activity also helped other physics studies of the ATLAS experiment.
A Microscopic Analysis of the Carrier-Velocity Distribution and the Noise in FET Devices
NASA Astrophysics Data System (ADS)
Houlet, P.; Ueno, H.; Hamaguchi, C.; Vaissiere, J. C.; Nougier, J. P.; Varani, L.
1997-11-01
We present a comparative microscopic analysis of the transport and noise performance of a submicron GaAs MESFET and a GaAs/AlGaAs HEMT. The calculations are performed using Monte Carlo modeling including real-space transfer and quantized electrons in the HEMT. We find a much lower drain-current variance in the HEMT, which we link to a very sharp carrier-velocity distribution at the beginning of the channel and a shorter average carrier transit time. Moreover, for both devices the average carrier transit time exhibits a minimum just at the onset of the saturation regime.
Analysis of counterfactual quantum key distribution using error-correcting theory
NASA Astrophysics Data System (ADS)
Li, Yan-Bing
2014-10-01
Counterfactual quantum key distribution is an interesting direction in quantum cryptography and has been realized by some researchers. However, it has been pointed out that it is insecure in an information-theoretic sense when used over a highly lossy channel. In this paper, we revisit its security from an error-correcting theory point of view. The analysis indicates that the security flaw arises because the error rate in the users' raw key pair is as high as that under Eve's attack when the loss rate exceeds 50%.
Energy distribution analysis of the wavepacket simulations of CH4 and CD4 scattering
Milot, R
2000-01-01
The isotope effect in the scattering of methane is studied by wavepacket simulations of oriented CH4 and CD4 molecules from a flat surface, including all nine internal vibrations. At translational energies up to 96 kJ/mol we find that the scattering is still predominantly elastic, but less so for CD4. Energy distribution analyses of the kinetic energy per mode and of the potential energy surface terms when the molecule hits the surface are used in combination with the vibrational excitations and the corresponding deformations. They indicate that the orientation with three bonds pointing towards the surface is mostly responsible for the isotope effect in methane dissociation.
Gibbs distribution analysis of temporal correlations structure in retina ganglion cells
Vasquez, J C; Palacios, A G; Berry, M J; Cessac, B
2011-01-01
We present a method to estimate Gibbs distributions with spatio-temporal constraints on spike train statistics. We apply this method to spike trains recorded from ganglion cells of the salamander retina, in response to natural movies. Our analysis, restricted to a few neurons, performs more accurately than pairwise synchronization models (Ising) or 1-time-step Markov models (Marre et al., 2009) in describing the statistics of spatio-temporal spike patterns, and emphasizes the role of higher-order spatio-temporal interactions.
Gibbs distribution analysis of temporal correlations structure in retina ganglion cells.
Vasquez, J C; Marre, O; Palacios, A G; Berry, M J; Cessac, B
2012-01-01
We present a method to estimate Gibbs distributions with spatio-temporal constraints on spike trains statistics. We apply this method to spike trains recorded from ganglion cells of the salamander retina, in response to natural movies. Our analysis, restricted to a few neurons, performs more accurately than pairwise synchronization models (Ising) or the 1-time step Markov models (Marre et al., 2009) to describe the statistics of spatio-temporal spike patterns and emphasizes the role of higher order spatio-temporal interactions. PMID:22115900
Human and climate impact on global riverine water and sediment fluxes - a distributed analysis
NASA Astrophysics Data System (ADS)
Cohen, S.; Kettner, A.; Syvitski, J. P.
2013-05-01
Understanding riverine water and sediment dynamics is an important undertaking both for socially relevant issues such as agriculture, water security, and infrastructure management, and for scientific analysis of climate, landscapes, river ecology, oceanography, and other disciplines. Providing good quantitative and predictive tools is therefore timely, particularly in light of predicted climate and land-use changes. The intensity of, and dynamics between, man-made and climatic factors vary widely across the globe and are therefore hard to predict, which warrants the use of sophisticated numerical models. Here we use a distributed global riverine sediment and water discharge model (WBMsed) to simulate human and climate effects on our planet's large rivers.
Torsten Gogolla; Katerina Krebber
2000-01-01
We present a method for distributed measurement of beat length, differential group delay, strain, and temperature in long lengths of single-mode optical fiber. Toward this aim, we employ the polarization-state-sensitive effect of stimulated Brillouin scattering (SBS). The distributed measurement is realized by applying frequency-domain analysis. We present the analytical relationships between the Brillouin interaction of two counterpropagating waves in
Morris, David Gordon
1966-01-01
AN ANALYSIS OF THE DISTRIBUTION OF RAINFALL AND SOME RAINFALL ASSOCIATIONS FOR SELECTED STATIONS IN WESTERN COLOMBIA. A Thesis by David Gordon Morris, submitted to the Graduate College of Texas A&M University in partial fulfillment of the requirements for the degree of Master of Science, May 1966. Major Subject: Meteorology.
Boetker, Johan P.; Koradia, Vishal; Rades, Thomas; Rantanen, Jukka; Savolainen, Marja
2012-01-01
Amlodipine besilate, a calcium channel antagonist, exists in several solid forms. Processing of anhydrate and dihydrate forms of this drug may lead to solid state changes, and is therefore the focus of this study. Milling was performed for the anhydrate form, whereas the dihydrate form was subjected to quench cooling thereby creating an amorphous form of the drug from both starting materials. The milled and quench cooled samples were, together with the crystalline starting materials, analyzed with X-ray powder diffraction (XRPD), Raman spectroscopy and atomic pair-wise distribution function (PDF) analysis of the XRPD pattern. When compared to XRPD and Raman spectroscopy, the PDF analysis was superior in displaying the difference between the amorphous samples prepared by milling and quench cooling approaches of the two starting materials. PMID:24300182
NASA Astrophysics Data System (ADS)
Koizumi, Akira; Hayashi, Hiroaki; Arai, Yasuhiro; Inakazu, Toyono; Tamura, Satoshi; Ashida, Hiroshi
In this study, the general corrosion of water distribution pipes, which appears as spreading external corrosion, was analyzed using field survey data collected by the Tokyo Waterworks Bureau. First, a factor-relevance structure map was constructed using correlation analysis. Installation period and the polyethylene sleeve were important factors for general corrosion. In addition, a correlation between survey data on the soil environment, such as pH, and general corrosion was confirmed. Secondly, a diagnostic model for assessing general corrosion without digging was constructed using quantification theory. As a result, diagnosis of general corrosion was made possible from survey data on installation period and polyethylene sleeve. Finally, using regression analysis, the transition of general corrosion over time was compared according to the condition of the polyethylene sleeve and the soil environment. Additionally, the uncertainty of future general corrosion was estimated by extended prediction.
Finite element analysis of stress distribution on modified retentive tips of bar clasp.
Oyar, P; Soyarslan, C; Can, G; Demirci, E
2012-01-01
This study used finite element analysis to evaluate the retentive tips of bar clasps made from different alloys and using different designs in order to determine whether or not different materials and tip forms are suitable for bar clasp applications. Co-Cr, Ti and Type IV Au alloys were selected based on their physical and mechanical properties. The 3D finite element models of three different bar clasp retentive tip geometries prepared from Co-Cr, Ti and Type IV Au alloys were constructed using the finite element software package MSC.Marc. Analysis of a concentrated load of 5 N applied to the removable partial denture approach arms in an occlusal direction was performed. Although stress distribution and localisation within bar clasps with different retentive tips were observed to be similar and were concentrated in the approach arm, stress intensities differed in all models. PMID:21347911
Dai, Mi
2015-01-01
In order to obtain robust cosmological constraints from Type Ia supernova (SN Ia) data, we have applied Markov Chain Monte Carlo (MCMC) to SN Ia lightcurve fitting. We develop a method for sampling the resultant probability density distributions (pdfs) of the SN Ia lightcurve parameters in the MCMC likelihood analysis to constrain cosmological parameters. Applying this method to the Joint Lightcurve Analysis (JLA) data set of SNe Ia, we find that sampling the SN Ia lightcurve parameter pdfs leads to cosmological parameters closer to those of a flat Universe with a cosmological constant, compared to the usual practice of using only the best-fit values of the SN Ia lightcurve parameters. Our method will be useful in the use of SN Ia data for precision cosmology.
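The sampling idea can be illustrated with a minimal random-walk Metropolis sampler. The target here is a toy one-dimensional Gaussian "lightcurve-parameter" posterior (mean 1.2, standard deviation 0.3) of our own choosing, not the JLA likelihood; the point is that the chain characterizes the whole pdf rather than only its best-fit value.

```python
import numpy as np

def metropolis(logpdf, x0, n_steps, step=0.5, seed=0):
    """Random-walk Metropolis: draw samples whose histogram approaches exp(logpdf)."""
    rng = np.random.default_rng(seed)
    x, lp = x0, logpdf(x0)
    chain = np.empty(n_steps)
    for i in range(n_steps):
        prop = x + step * rng.standard_normal()  # symmetric Gaussian proposal
        lp_prop = logpdf(prop)
        if np.log(rng.random()) < lp_prop - lp:  # accept with prob min(1, ratio)
            x, lp = prop, lp_prop
        chain[i] = x                             # rejected proposals repeat x
    return chain

# Toy posterior: Gaussian with mean 1.2 and sd 0.3 (illustrative values)
chain = metropolis(lambda x: -0.5 * ((x - 1.2) / 0.3) ** 2, x0=0.0, n_steps=20000)
```

After discarding burn-in, the chain's mean and spread recover the target's; in the paper's setting, downstream cosmological fits would be run over such samples instead of a single best-fit point.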
NASA Astrophysics Data System (ADS)
Alatorre-Ibarguengoitia, M. A.; Kueppers, U.; Delgado-Granados, H.; Dingwell, D. B.
2006-12-01
Dealing with hazards at active volcanoes requires detailed knowledge of eruptive history and a good understanding of pre- and syn-eruptive processes. Despite improvements in monitoring systems, such an understanding cannot be based on direct field observations alone. Experimental and theoretical modelling are two essential components of modern volcanic hazard analysis. Volcanic ballistic projectiles (VBP) are a major hazard related to volcanic explosions. They may affect people, ecology, infrastructure, and aircraft. In order to determine the potential areas of VBP fall, it is necessary to estimate maximum ranges under different explosive scenarios. Each scenario is defined by the kinetic energy, calculated from the impact location and its dimension, and the physical characteristics of the projectiles (e.g. density, drag coefficient). The kinetic energy derives from the excess pressure in the expanding volatile phase driving the explosion. The design and development of "fragmentation bomb" technology has provided volcanology with the capability of controlled and systematic analysis of the fragmentation behavior of magma upon rapid decompression. Study of samples from several volcanoes has demonstrated a close relationship between open porosity and the overpressure required for complete fragmentation of samples (the fragmentation threshold). Fractal analysis of the experimentally generated pyroclasts shows that grain-size distribution is linearly dependent on open porosity and PEF (potential energy for fragmentation). Combining these two approaches (kinetic energy from the distribution of VBP, and potential energy (PEF) from scaled experiments and investigation of experimental and natural pyroclasts), together with seismic monitoring, provides the potential for a significantly more refined hazard assessment of active, explosive volcanoes.
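The range estimate at the heart of such a scenario can be sketched with a point-mass ballistic model including quadratic air drag: the scenario's kinetic energy fixes the launch speed, and integration gives the distance travelled. All parameter values below (mass, drag coefficient, air density, launch angle) are illustrative assumptions, not values from this study.

```python
import math

def vbp_range(energy_j, mass_kg, angle_deg, diameter_m, cd=0.6, rho_air=1.0, dt=0.01):
    """Horizontal distance a spherical projectile travels before returning to
    launch height, under gravity and quadratic drag (flat terrain)."""
    v = math.sqrt(2.0 * energy_j / mass_kg)       # launch speed from kinetic energy
    theta = math.radians(angle_deg)
    vx, vy = v * math.cos(theta), v * math.sin(theta)
    # drag constant: 0.5 * rho * Cd * cross-section area / mass
    k = 0.5 * rho_air * cd * math.pi * (diameter_m / 2.0) ** 2 / mass_kg
    g, x, y = 9.81, 0.0, 0.0
    while y >= 0.0:                               # semi-implicit Euler steps
        speed = math.hypot(vx, vy)
        vx -= k * speed * vx * dt
        vy -= (g + k * speed * vy) * dt
        x += vx * dt
        y += vy * dt
    return x
```

With drag switched off (cd = 0) the result approaches the vacuum formula v^2 sin(2 theta) / g, which is a useful sanity check; with drag on, the range shrinks, and mapping it over a grid of launch angles and energies delineates the potential VBP fall area for a scenario.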
Chen Jiaqi; Andy Marvin; Ian Flintoft; John Dawson
2011-01-01
The statistics of the re-radiated spectrum from two correlated non-linear devices are investigated in a reverberation chamber. The distribution of the mean-value-normalized statistics is interpreted using a double-Weibull statistical model. Comparisons are made with the re-radiation spectrum of a single non-linear device, showing the statistical distributions to be different. Furthermore, experiments indicate the spatial correlation between the two
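The Weibull data treatment underlying such models can be illustrated with a stdlib-only sketch: draw samples from a single Weibull distribution by inverse-CDF sampling, then recover the shape and scale parameters by least squares on the linearized CDF (a Weibull probability plot). This is a generic Weibull fit under assumed parameters, not the double-Weibull model of the paper.

```python
import math
import random

def weibull_sample(shape, scale, n, rng):
    # Inverse-CDF sampling: F(x) = 1 - exp(-(x/scale)**shape)
    return [scale * (-math.log(1.0 - rng.random())) ** (1.0 / shape)
            for _ in range(n)]

def weibull_fit_lsq(data):
    """Estimate (shape, scale) from the Weibull probability plot:
    ln(-ln(1 - F_i)) = shape*ln(x_i) - shape*ln(scale)."""
    xs = sorted(data)
    n = len(xs)
    # Median-rank plotting positions F_i = (i - 0.3) / (n + 0.4)
    pts = [(math.log(x), math.log(-math.log(1.0 - (i - 0.3) / (n + 0.4))))
           for i, x in enumerate(xs, 1)]
    mx = sum(p[0] for p in pts) / n
    my = sum(p[1] for p in pts) / n
    sxy = sum((px - mx) * (py - my) for px, py in pts)
    sxx = sum((px - mx) ** 2 for px, _ in pts)
    shape = sxy / sxx                       # regression slope
    scale = math.exp(mx - my / shape)       # from the intercept
    return shape, scale

rng = random.Random(42)
data = weibull_sample(2.0, 1.0, 5000, rng)
shape, scale = weibull_fit_lsq(data)
```

The slope of the probability plot is the Weibull modulus (shape parameter), which is why it serves as a sharpness-of-distribution indicator in the failure-analysis literature.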
An Agent-Based Dynamic Model for Analysis of Distributed Space Exploration Architectures
NASA Astrophysics Data System (ADS)
Sindiy, Oleg V.; DeLaurentis, Daniel A.; Stein, William B.
2009-07-01
A range of complex challenges, but also potentially unique rewards, underlie the development of exploration architectures that use a distributed, dynamic network of resources across the solar system. From a methodological perspective, the prime challenge is to systematically model the evolution (and quantify comparative performance) of such architectures, under uncertainty, to effectively direct further study of specialized trajectories, spacecraft technologies, concept of operations, and resource allocation. A process model for System-of-Systems Engineering is used to define time-varying performance measures for comparative architecture analysis and identification of distinguishing patterns among interoperating systems. Agent-based modeling serves as the means to create a discrete-time simulation that generates dynamics for the study of architecture evolution. A Solar System Mobility Network proof-of-concept problem is introduced representing a set of longer-term, distributed exploration architectures. Options within this set revolve around deployment of human and robotic exploration and infrastructure assets, their organization, interoperability, and evolution, i.e., a system-of-systems. Agent-based simulations quantify relative payoffs for a fully distributed architecture (which can be significant over the long term), the latency period before they are manifest, and the up-front investment (which can be substantial compared to alternatives). Verification and sensitivity results provide further insight on development paths and indicate that the framework and simulation modeling approach may be useful in architectural design of other space exploration mass, energy, and information exchange settings.
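The agent-based, discrete-time simulation approach described above can be sketched minimally: agents representing exploration assets exchange a generic resource under a simple local rule, and the simulation steps them forward in time. The agent names, stock levels, and exchange rule are illustrative assumptions, not the paper's Solar System Mobility Network model.

```python
import random

class Agent:
    """Hypothetical exploration asset holding a generic resource stock."""
    def __init__(self, name, stock):
        self.name = name
        self.stock = stock

    def step(self, peers, rng):
        # Toy interoperability rule: transfer one unit to a random
        # peer holding less than this agent.
        needy = [p for p in peers if p.stock < self.stock]
        if needy and self.stock > 0:
            target = rng.choice(needy)
            self.stock -= 1
            target.stock += 1

def simulate(agents, steps, seed=0):
    """Discrete-time loop: every agent acts once per time step."""
    rng = random.Random(seed)
    for _ in range(steps):
        for a in agents:
            a.step([p for p in agents if p is not a], rng)
    return {a.name: a.stock for a in agents}

fleet = [Agent("depot", 100), Agent("rover", 0), Agent("lander", 10)]
result = simulate(fleet, 200)
```

Time-varying measures (here, the stock levels per step) are what allow comparative evaluation of architectures; a real study would track mass, energy, and information exchange rather than a single abstract resource.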
Lindgren, Cecilia M; Heid, Iris M; Randall, Joshua C; Lamina, Claudia; Steinthorsdottir, Valgerdur; Qi, Lu; Speliotes, Elizabeth K; Thorleifsson, Gudmar; Willer, Cristen J; Herrera, Blanca M; Jackson, Anne U; Lim, Noha; Scheet, Paul; Soranzo, Nicole; Amin, Najaf; Aulchenko, Yurii S; Chambers, John C; Drong, Alexander; Luan, Jian'an; Lyon, Helen N; Rivadeneira, Fernando; Sanna, Serena; Timpson, Nicholas J; Zillikens, M Carola; Zhao, Jing Hua; Almgren, Peter; Bandinelli, Stefania; Bennett, Amanda J; Bergman, Richard N; Bonnycastle, Lori L; Bumpstead, Suzannah J; Chanock, Stephen J; Cherkas, Lynn; Chines, Peter; Coin, Lachlan; Cooper, Cyrus; Crawford, Gabriel; Doering, Angela; Dominiczak, Anna; Doney, Alex S F; Ebrahim, Shah; Elliott, Paul; Erdos, Michael R; Estrada, Karol; Ferrucci, Luigi; Fischer, Guido; Forouhi, Nita G; Gieger, Christian; Grallert, Harald; Groves, Christopher J; Grundy, Scott; Guiducci, Candace; Hadley, David; Hamsten, Anders; Havulinna, Aki S; Hofman, Albert; Holle, Rolf; Holloway, John W; Illig, Thomas; Isomaa, Bo; Jacobs, Leonie C; Jameson, Karen; Jousilahti, Pekka; Karpe, Fredrik; Kuusisto, Johanna; Laitinen, Jaana; Lathrop, G Mark; Lawlor, Debbie A; Mangino, Massimo; McArdle, Wendy L; Meitinger, Thomas; Morken, Mario A; Morris, Andrew P; Munroe, Patricia; Narisu, Narisu; Nordström, Anna; Nordström, Peter; Oostra, Ben A; Palmer, Colin N A; Payne, Felicity; Peden, John F; Prokopenko, Inga; Renström, Frida; Ruokonen, Aimo; Salomaa, Veikko; Sandhu, Manjinder S; Scott, Laura J; Scuteri, Angelo; Silander, Kaisa; Song, Kijoung; Yuan, Xin; Stringham, Heather M; Swift, Amy J; Tuomi, Tiinamaija; Uda, Manuela; Vollenweider, Peter; Waeber, Gerard; Wallace, Chris; Walters, G Bragi; Weedon, Michael N; Witteman, Jacqueline C M; Zhang, Cuilin; Zhang, Weihua; Caulfield, Mark J; Collins, Francis S; Davey Smith, George; Day, Ian N M; Franks, Paul W; Hattersley, Andrew T; Hu, Frank B; Jarvelin, Marjo-Riitta; Kong, Augustine; Kooner, Jaspal S; Laakso, Markku; 
Lakatta, Edward; Mooser, Vincent; Morris, Andrew D; Peltonen, Leena; Samani, Nilesh J; Spector, Timothy D; Strachan, David P; Tanaka, Toshiko; Tuomilehto, Jaakko; Uitterlinden, André G; van Duijn, Cornelia M; Wareham, Nicholas J; Watkins, Hugh; Waterworth, Dawn M; Boehnke, Michael; Deloukas, Panos; Groop, Leif; Hunter, David J; Thorsteinsdottir, Unnur; Schlessinger, David; Wichmann, H-Erich; Frayling, Timothy M; Abecasis, Gonçalo R; Hirschhorn, Joel N; Loos, Ruth J F; Stefansson, Kari; Mohlke, Karen L; Barroso, Inês; McCarthy, Mark I
2009-06-01
To identify genetic loci influencing central obesity and fat distribution, we performed a meta-analysis of 16 genome-wide association studies (GWAS, N = 38,580) informative for adult waist circumference (WC) and waist-hip ratio (WHR). We selected 26 SNPs for follow-up, for which the evidence of association with measures of central adiposity (WC and/or WHR) was strong and disproportionate to that for overall adiposity or height. Follow-up studies in a maximum of 70,689 individuals identified two loci strongly associated with measures of central adiposity; these map near TFAP2B (WC, P = 1.9 × 10^-11) and MSRA (WC, P = 8.9 × 10^-9). A third locus, near LYPLAL1, was associated with WHR in women only (P = 2.6 × 10^-8). The variants near TFAP2B appear to influence central adiposity through an effect on overall obesity/fat-mass, whereas LYPLAL1 displays a strong female-only association with fat distribution. By focusing on anthropometric measures of central obesity and fat distribution, we have identified three loci implicated in the regulation of human adiposity. PMID:19557161