Science.gov

Sample records for weibull distribution analysis

  1. q-exponential, Weibull, and q-Weibull distributions: an empirical analysis

    NASA Astrophysics Data System (ADS)

    Picoli, S.; Mendes, R. S.; Malacarne, L. C.

    2003-06-01

    In a comparative study, the q-exponential and Weibull distributions are employed to investigate frequency distributions of basketball baskets, cyclone victims, brand-name drugs by retail sales, and highway length. In order to analyze the intermediate cases, a distribution that interpolates between the q-exponential and Weibull ones, the q-Weibull distribution, is introduced. It is verified that the basketball baskets distribution is well described by a q-exponential, whereas the cyclone victims and brand-name drugs by retail sales distributions are better fitted by a Weibull distribution. For highway length, on the other hand, neither the q-exponential nor the Weibull distribution gives a satisfactory fit, making it necessary to employ the q-Weibull distribution. Furthermore, the interpolating distribution sheds light on the controversy between the stretched exponential and the inverse power law (q-exponential with q>1).
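
    For orientation (standard definitions, not quoted from the record), the q-exponential and the interpolating q-Weibull density have the form

      e_q(x) = \left[ 1 + (1-q)\,x \right]^{1/(1-q)}, \qquad e_q(x) \to e^x \ \text{as} \ q \to 1,

      f_{q\text{-W}}(x) \;\propto\; \frac{k}{\lambda} \left( \frac{x}{\lambda} \right)^{k-1} e_q\!\left( -\left( \frac{x}{\lambda} \right)^{k} \right),

    which recovers the ordinary Weibull density as q approaches 1 and develops a power-law tail for q > 1 (normalization omitted).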

  2. Using Weibull Distribution Analysis to Evaluate ALARA Performance

    SciTech Connect

    Frome, E. L.; Watkins, J. P.; Hagemeyer, D. A.

    2009-10-01

    As Low as Reasonably Achievable (ALARA) is the underlying principle for protecting nuclear workers from potential health outcomes related to occupational radiation exposure. Radiation protection performance is currently evaluated by measures such as collective dose and average measurable dose, which do not indicate ALARA performance. The purpose of this work is to show how statistical modeling of individual doses using the Weibull distribution can provide objective supplemental performance indicators for comparing ALARA implementation among sites and for insights into ALARA practices within a site. Maximum likelihood methods were employed to estimate the Weibull shape and scale parameters used for performance indicators. The shape parameter reflects the effectiveness of maximizing the number of workers receiving lower doses and is represented as the slope of the fitted line on a Weibull probability plot. Additional performance indicators derived from the model parameters include the 99th percentile and the exceedance fraction. When grouping sites by collective total effective dose equivalent (TEDE) and ranking by 99th percentile with confidence intervals, differences in performance among sites can be readily identified. Applying this methodology will enable more efficient and complete evaluation of the effectiveness of ALARA implementation.
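
    As a rough illustration of the modeling step described above (a minimal sketch, not the authors' software; the dose data and the 2 mSv threshold are invented):

      import numpy as np
      from scipy import stats

      # Synthetic annual doses (mSv) standing in for monitored-worker data.
      rng = np.random.default_rng(0)
      doses = rng.weibull(0.8, size=500) * 1.5

      # Maximum-likelihood fit of a two-parameter Weibull (location fixed at 0).
      shape, loc, scale = stats.weibull_min.fit(doses, floc=0)

      # Indicators from the abstract: 99th percentile and exceedance fraction.
      p99 = stats.weibull_min.ppf(0.99, shape, scale=scale)
      exceed = stats.weibull_min.sf(2.0, shape, scale=scale)  # fraction above a hypothetical 2 mSv level
      print(f"shape={shape:.3f} scale={scale:.3f} p99={p99:.2f} exceedance={exceed:.3%}")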

  3. Reliability analysis of structural ceramic components using a three-parameter Weibull distribution

    NASA Technical Reports Server (NTRS)

    Duffy, Stephen F.; Powers, Lynn M.; Starlinger, Alois

    1992-01-01

    Described here are nonlinear regression estimators for the three-parameter Weibull distribution. Issues relating to the bias and invariance associated with these estimators are examined numerically using Monte Carlo simulation methods. The estimators were used to extract parameters from sintered silicon nitride failure data. A reliability analysis was performed on a turbopump blade utilizing the three-parameter Weibull distribution and the estimates from the sintered silicon nitride data.
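
    For a sense of what a three-parameter fit involves (a minimal sketch on invented data, not the paper's nonlinear regression estimators), scipy's generic MLE can estimate all three parameters, with the location acting as the threshold stress:

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(1)
      # Synthetic strengths (MPa) with a 200 MPa threshold.
      strengths = 200.0 + rng.weibull(2.5, size=60) * 150.0

      shape, threshold, scale = stats.weibull_min.fit(strengths)  # loc = threshold
      print(shape, threshold, scale)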

  5. Statistical analysis of censored motion sickness latency data using the two-parameter Weibull distribution

    NASA Technical Reports Server (NTRS)

    Park, Won J.; Crampton, George H.

    1988-01-01

    The suitability of the two-parameter Weibull distribution for describing highly censored cat motion sickness latency data was evaluated by estimating the parameters with the maximum likelihood method and testing for goodness of fit with the Kolmogorov-Smirnov statistic. A procedure for determining confidence levels and testing for significance of the difference between Weibull parameters is described. Computer programs for these procedures may be obtained from an archival source.
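
    In outline, the uncensored version of that procedure looks as follows (a sketch only; the paper's handling of heavy censoring is more involved, and the latency values here are invented):

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(2)
      latencies = rng.weibull(1.7, size=40) * 12.0  # synthetic latencies (min)

      # Maximum-likelihood estimates of the two Weibull parameters.
      c, loc, scale = stats.weibull_min.fit(latencies, floc=0)

      # Kolmogorov-Smirnov goodness-of-fit against the fitted distribution
      # (testing against fitted parameters makes the test approximate).
      stat, p = stats.kstest(latencies, 'weibull_min', args=(c, 0, scale))
      print(c, scale, stat, p)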

  6. Competing risk models in reliability systems, a Weibull distribution model with Bayesian analysis approach

    NASA Astrophysics Data System (ADS)

    Iskandar, Ismed; Satria Gondokaryono, Yudi

    2016-02-01

    In reliability theory, the most important problem is to determine the reliability of a complex system from the reliability of its components. The weakness of most reliability theories is that the systems are described and explained as simply functioning or failed. In many real situations, the failures may arise from many causes, depending upon the age and the environment of the system and its components. Another problem in reliability theory is estimating the parameters of the assumed failure models. The estimation may be based on data collected over censored or uncensored life tests. In many reliability problems, the failure data are simply quantitatively inadequate, especially in engineering design and maintenance systems. Bayesian analysis is more beneficial than the classical approach in such cases, since it allows us to combine past knowledge or experience, in the form of a prior distribution, with life test data to make inferences about the parameter of interest. In this paper, we investigate the application of Bayesian estimation to competing risk systems. The cases are limited to models with independent causes of failure, using the Weibull distribution as our model. A simulation is conducted for this distribution with the objectives of verifying the models and the estimators and investigating the performance of the estimators for varying sample size. The simulation data are analyzed using both Bayesian and maximum likelihood analyses. The simulation results show that a change in the true value of one parameter relative to another changes the standard deviation in the opposite direction. With perfect information on the prior distribution, the Bayesian estimation methods outperform maximum likelihood. The sensitivity analyses show some amount of sensitivity to shifts of the prior locations; they also show the robustness of the Bayesian analysis within the range

  7. Software for Statistical Analysis of Weibull Distributions with Application to Gear Fatigue Data: User Manual with Verification

    NASA Technical Reports Server (NTRS)

    Krantz, Timothy L.

    2002-01-01

    The Weibull distribution has been widely adopted for the statistical description and inference of fatigue data. This document provides user instructions, examples, and verification for software to analyze gear fatigue test data. The software was developed presuming the data are adequately modeled using a two-parameter Weibull distribution. The calculations are based on likelihood methods, and the approach taken is valid for data that include type I censoring. The software was verified by reproducing results published by others.
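
    The likelihood construction for type I censoring that the abstract alludes to can be sketched generically (not the NASA code; the data are invented):

      import numpy as np
      from scipy.optimize import minimize

      def negloglik(params, t, failed):
          k, lam = params
          if k <= 0 or lam <= 0:
              return np.inf
          z = (t / lam) ** k
          # Failures contribute the log-density; censored runouts the log-survival.
          ll = np.where(failed,
                        np.log(k / lam) + (k - 1) * np.log(t / lam) - z,
                        -z)
          return -ll.sum()

      t = np.array([1.2, 3.4, 5.6, 7.8, 9.0, 10.0, 10.0])   # lives (arbitrary units)
      failed = np.array([1, 1, 1, 1, 1, 0, 0], dtype=bool)  # last two suspended at 10
      res = minimize(negloglik, x0=[1.0, 5.0], args=(t, failed), method='Nelder-Mead')
      print(res.x)  # [shape, scale]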

  9. ZERODUR strength modeling with Weibull statistical distributions

    NASA Astrophysics Data System (ADS)

    Hartmann, Peter

    2016-07-01

    The decisive influence on the breakage strength of brittle materials such as the low-expansion glass ceramic ZERODUR is the surface condition. For polished or etched surfaces, it is essential to know whether micro cracks are present and how deep they are. Ground surfaces have many micro cracks caused by the generation process; here only the depths of the micro cracks are relevant. In any case, the presence and depths of micro cracks are statistical by nature. The Weibull distribution is the model traditionally used for the representation of such data sets. It is based on the weakest-link ansatz. Whether the two- or three-parameter Weibull distribution should be used for data representation and reliability prediction depends on the underlying crack generation mechanisms. Before choosing the model for a specific evaluation, some checks should be done. Is there only one mechanism present, or is it to be expected that an additional mechanism might contribute deviating results? For ground surfaces the main mechanism is the action of the diamond grains on the surface. However, grains breaking from their bonding might be moved by the tool across the surface, introducing a slightly deeper crack. These scratches cannot be expected to follow the same statistical distribution as the grinding process; hence, describing them with the same distribution parameters is not adequate, and a dedicated discussion should be performed before including them. If additional information is available that influences the selection of the model, for example the existence of a maximum crack depth, this should also be taken into account. Micro cracks introduced by small diamond grains on tools working with limited forces cannot be arbitrarily deep. For data obtained with such surfaces, the existence of a threshold breakage stress should be part of the hypothesis. This leads to the use of the three-parameter Weibull distribution. A differentiation based on the data set alone, without preexisting information, is possible but requires a

  10. Application of Weibull analysis to SSME hardware

    NASA Technical Reports Server (NTRS)

    Gray, L. A. B.

    1986-01-01

    Generally, it has been documented that the wearing of engine parts forms a failure distribution which can be approximated by a function developed by Weibull. The purpose here is to examine to what extent the Weibull distribution approximates failure data for designated engine parts of the Space Shuttle Main Engine (SSME). The current testing certification requirements will be examined in order to establish confidence levels. An examination of the failure history of SSME parts/assemblies (turbine blades, main combustion chamber, or high pressure fuel pump first stage impellers) which are limited in usage by time or starts will be done by using updated Weibull techniques. Efforts will be made by the investigator to predict failure trends by using Weibull techniques for SSME parts (turbine temperature sensors, chamber pressure transducers, actuators, and controllers) which are not severely limited by time or starts.

  11. Distributed Fuzzy CFAR Detection for Weibull Clutter

    NASA Astrophysics Data System (ADS)

    Zaimbashi, Amir; Taban, Mohammad Reza; Nayebi, Mohammad Mehdi

    In distributed detection systems, restricting the output of the local decision to one bit certainly implies a substantial information loss. In this paper, we consider fuzzy detection, which uses a function called a membership function to map the observation space of each local detector to a value between 0 and 1, indicating the degree of assurance about the presence or absence of a signal. In this setting, we examine the problem of distributed Maximum Likelihood (ML) and Order Statistic (OS) constant false alarm rate (CFAR) detection using fuzzy fusion rules such as “Algebraic Product” (AP), “Algebraic Sum” (AS), “Union” (Un) and “Intersection” (IS) in the fusion centre. For Weibull clutter, the expression of the membership function based on the ML or OS CFAR processors in the local detectors is also obtained. For comparison, we consider a binary distributed detector, which uses the Maximum Likelihood and Algebraic Product (MLAP) or Order Statistic and Algebraic Product (OSAP) CFAR processors as the local detectors. In homogeneous and non-homogeneous situations (multiple targets or clutter edges), the performances of the fuzzy and binary distributed detectors are analyzed and compared. The simulation results indicate the superior and robust performance of the distributed systems using fuzzy detection in both the homogeneous and non-homogeneous situations.

  12. Weibull model of multiplicity distribution in hadron-hadron collisions

    NASA Astrophysics Data System (ADS)

    Dash, Sadhana; Nandi, Basanta K.; Sett, Priyanka

    2016-06-01

    We introduce the use of the Weibull distribution as a simple parametrization of charged particle multiplicities in hadron-hadron collisions at all available energies, ranging from ISR energies to the most recent LHC energies. In statistics, the Weibull distribution has wide applicability in natural processes that involve fragmentation processes. This provides a natural connection to the available state-of-the-art models for multiparticle production in hadron-hadron collisions, which involve QCD parton fragmentation and hadronization. The Weibull distribution describes the multiplicity data at the most recent LHC energies better than the single negative binomial distribution.

  13. A Weibull distribution accrual failure detector for cloud computing

    PubMed Central

    Wu, Zhibo; Wu, Jin; Zhao, Yao; Wen, Dongxin

    2017-01-01

    Failure detectors are a fundamental component for building high-availability distributed systems. To meet the requirements of complicated large-scale distributed systems, accrual failure detectors that can adapt to multiple applications have been studied extensively. However, several implementations of accrual failure detectors do not adapt well to the cloud service environment. To solve this problem, a new accrual failure detector based on the Weibull distribution, called the Weibull Distribution Failure Detector, has been proposed specifically for cloud computing. It can adapt to the dynamic and unexpected network conditions in cloud computing. The performance of the Weibull Distribution Failure Detector is evaluated and compared based on public classical experiment data and cloud computing experiment data. The results show that the Weibull Distribution Failure Detector has better performance in terms of speed and accuracy in unstable scenarios, especially in cloud computing. PMID:28278229
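
    The accrual idea can be sketched as follows (a hypothetical minimal detector, not the paper's algorithm; the phi-style suspicion value is one common convention):

      import numpy as np
      from scipy import stats

      class WeibullAccrualDetector:
          """Tracks heartbeat inter-arrival times; suspicion grows with silence."""
          def __init__(self, window=1000):
              self.intervals, self.window, self.last = [], window, None

          def heartbeat(self, now):
              if self.last is not None:
                  self.intervals.append(now - self.last)
                  self.intervals = self.intervals[-self.window:]
              self.last = now

          def suspicion(self, now):
              if len(self.intervals) < 10:
                  return 0.0
              c, loc, scale = stats.weibull_min.fit(self.intervals, floc=0)
              # Probability the next heartbeat is still outstanding after this long.
              p_later = stats.weibull_min.sf(now - self.last, c, scale=scale)
              return -np.log10(max(p_later, 1e-12))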

  15. Program for Weibull Analysis of Fatigue Data

    NASA Technical Reports Server (NTRS)

    Krantz, Timothy L.

    2005-01-01

    A Fortran computer program has been written for performing statistical analyses of fatigue-test data that are assumed to be adequately represented by a two-parameter Weibull distribution. This program calculates the following: (1) Maximum-likelihood estimates of the Weibull distribution parameters; (2) Data for contour plots of relative likelihood for two parameters; (3) Data for contour plots of joint confidence regions; (4) Data for the profile likelihood of the Weibull-distribution parameters; (5) Data for the profile likelihood of any percentile of the distribution; and (6) Likelihood-based confidence intervals for parameters and/or percentiles of the distribution. The program can account for tests that are suspended without failure (the statistical term for such suspension of tests is "censoring"). The analytical approach followed in this software is valid for type-I censoring, which is the removal of unfailed units at pre-specified times. Confidence regions and intervals are calculated by use of the likelihood-ratio method.
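
    Item (6), the likelihood-ratio interval, can be sketched for the shape parameter (a generic method on synthetic, uncensored data; the program itself is Fortran and handles censoring):

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(3)
      t = rng.weibull(2.0, size=30) * 100.0  # synthetic lives

      def profile_loglik(k):
          # For fixed shape k, the MLE of the scale is closed-form.
          lam = np.mean(t ** k) ** (1.0 / k)
          z = (t / lam) ** k
          return np.sum(np.log(k / lam) + (k - 1) * np.log(t / lam) - z)

      ks = np.linspace(0.5, 5.0, 400)
      ll = np.array([profile_loglik(k) for k in ks])
      cutoff = ll.max() - stats.chi2.ppf(0.95, df=1) / 2.0  # likelihood-ratio threshold
      inside = ks[ll >= cutoff]
      print("approx. 95% CI for shape:", inside.min(), inside.max())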

  16. Weibull distribution based on maximum likelihood with interval inspection data

    NASA Technical Reports Server (NTRS)

    Rheinfurth, M. H.

    1985-01-01

    The two Weibull parameters based upon the method of maximum likelihood are determined. The test data used were failures observed at inspection intervals. The application was the reliability analysis of the SSME oxidizer turbine blades.

  17. Modified Goodness-of-Fit Tests for the Weibull Distribution

    DTIC Science & Technology

    1993-03-01

    comparison. 1.6 Summary Most of the books about statistics do not include enough information on how to choose distributions to model the system behavior and...the system modelled. If one does not have enough information about how to choose distributions or does not test the distribution chosen then the... systems. Because in the literature and in real life, when using the Weibull distribution as a model, analysts consider the minimum life of a product as zero

  18. Investigation of Weibull statistics in fracture analysis of cast aluminum

    NASA Technical Reports Server (NTRS)

    Holland, Frederic A., Jr.; Zaretsky, Erwin V.

    1989-01-01

    The fracture strengths of two large batches of A357-T6 cast aluminum coupon specimens were compared by using two-parameter Weibull analysis. The minimum number of these specimens necessary to find the fracture strength of the material was determined. The applicability of three-parameter Weibull analysis was also investigated. A design methodology based on the combination of elementary stress analysis and Weibull statistical analysis is advanced and applied to the design of a spherical pressure vessel shell. The results from this design methodology are compared with results from the applicable ASME pressure vessel code.

  20. Lower bound on reliability for Weibull distribution when shape parameter is not estimated accurately

    NASA Technical Reports Server (NTRS)

    Huang, Zhaofeng; Porter, Albert A.

    1990-01-01

    The mathematical relationships between the shape parameter Beta and estimates of reliability and a life limit lower bound for the two-parameter Weibull distribution are investigated. It is shown that under rather general conditions, both the reliability lower bound and the allowable life limit lower bound (often called a tolerance limit) have unique global minima over a range of Beta. Hence lower bound solutions can be obtained without assuming or estimating Beta. The existence and uniqueness of these lower bounds are proven. Some real data examples are given to show how these lower bounds can be easily established and to demonstrate their practicality. The method developed here has proven to be extremely useful when using the Weibull distribution in analysis of no-failure or few-failures data. The results are applicable not only in the aerospace industry but anywhere that system reliabilities are high.
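
    The flavor of the result can be sketched for the no-failure case (illustrative only; the paper's bound and proofs are more general, and all numbers here are invented):

      import numpy as np
      from scipy.stats import chi2

      t = np.full(10, 100.0)  # ten units, all suspended without failure at 100 h
      t0 = 50.0               # mission time of interest (hypothetical)
      conf = 0.95

      def r_lower(beta):
          # Transform to exponential: with zero failures, a one-sided lower bound
          # on theta = lambda**beta is 2*sum(t**beta) / chi2(conf, df=2).
          theta_low = 2.0 * np.sum(t ** beta) / chi2.ppf(conf, df=2)
          return np.exp(-(t0 ** beta) / theta_low)

      # Take the global minimum over a grid of shapes, so no beta estimate is needed.
      betas = np.linspace(0.5, 5.0, 200)
      print("global-minimum reliability lower bound:", min(r_lower(b) for b in betas))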

  2. Analysis of tensile bond strengths using Weibull statistics.

    PubMed

    Burrow, Michael F; Thomas, David; Swain, Mike V; Tyas, Martin J

    2004-09-01

    Tensile strength tests of restorative resins bonded to dentin, and the resultant strengths of interfaces between the two, exhibit wide variability. Many variables can affect test results, including specimen preparation and storage, test rig design and experimental technique. However, the more fundamental source of variability, that associated with the brittle nature of the materials, has received little attention. This paper analyzes results from micro-tensile tests on unfilled resins and adhesive bonds between restorative resin composite and dentin in terms of reliability using the Weibull probability of failure method. Results for the tensile strengths of Scotchbond Multipurpose Adhesive (3M) and Clearfil LB Bond (Kuraray) bonding resins showed Weibull moduli (m) of 6.17 (95% confidence interval, 5.25-7.19) and 5.01 (95% confidence interval, 4.23-5.8). Analysis of results for micro-tensile tests on bond strengths to dentin gave moduli between 1.81 (Clearfil Liner Bond 2V) and 4.99 (Gluma One Bond, Kulzer). Material systems with m in this range do not have a well-defined strength. The Weibull approach also enables the size dependence of the strength to be estimated. An example where the bonding area was changed from 3.1 to 1.1 mm diameter is shown. Weibull analysis provides a method for determining the reliability of strength measurements in the analysis of data from bond strength and tensile tests on dental restorative materials.

  3. Large-Scale Weibull Analysis of H-451 Nuclear-Grade Graphite Specimen Rupture Data

    NASA Technical Reports Server (NTRS)

    Nemeth, Noel N.; Walker, Andrew; Baker, Eric H.; Murthy, Pappu L.; Bratton, Robert L.

    2012-01-01

    A Weibull analysis was performed of the strength distribution and size effects for 2000 specimens of H-451 nuclear-grade graphite. The data, generated elsewhere, measured the tensile and four-point-flexure room-temperature rupture strength of specimens excised from a single extruded graphite log. Strength variation was compared with specimen location, size, and orientation relative to the parent body. In our study, data were progressively and extensively pooled into larger data sets to discriminate overall trends from local variations and to investigate the strength distribution. The CARES/Life and WeibPar codes were used to investigate issues regarding the size effect, Weibull parameter consistency, and nonlinear stress-strain response. Overall, the Weibull distribution described the behavior of the pooled data very well. However, the issue regarding the smaller-than-expected size effect remained. This exercise illustrated that a conservative approach using a two-parameter Weibull distribution is best for designing graphite components with low probability of failure for the in-core structures in the proposed Generation IV (Gen IV) high-temperature gas-cooled nuclear reactors. This exercise also demonstrated the continuing need to better understand the mechanisms driving stochastic strength response. Extensive appendixes are provided with this report to show all aspects of the rupture data and analytical results.

  4. A Weibull distribution with power-law tails that describes the first passage time processes of foreign currency exchanges

    NASA Astrophysics Data System (ADS)

    Sazuka, Naoya; Inoue, Jun-Ichi

    2007-03-01

    A Weibull distribution with power-law tails is confirmed as a good candidate to describe the first passage time process of foreign currency exchange rates. The Lorenz curve and the corresponding Gini coefficient for a Weibull distribution are derived analytically. We show that the coefficient is in good agreement with the same quantity calculated from the empirical data. We also calculate the average waiting time, which is an important measure to estimate the time customers wait until the next price change after they log in to their computer systems. By assuming that the first passage time distribution might change its shape from the Weibull to the power-law at some critical time, we evaluate the average waiting time by means of the renewal-reward theorem. We find that our correction of the tails of the distribution makes the average waiting time much closer to the value obtained from empirical data analysis. We also discuss the deviation from the estimated average waiting time by deriving the waiting time distribution directly. These results lead us to conclude that the first passage process of foreign currency exchange rates is well described by a Weibull distribution with power-law tails.
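
    For context, the standard two-parameter Weibull facts this derivation builds on (shape k, scale \lambda) are

      \mu = \lambda \, \Gamma(1 + 1/k), \qquad G = 1 - 2^{-1/k},

    so the Gini coefficient of a pure Weibull depends on the shape parameter alone; the paper's power-law tail correction then adjusts the waiting-time estimates.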

  5. A Monte Carlo study of Weibull reliability analysis for space shuttle main engine components

    NASA Technical Reports Server (NTRS)

    Abernethy, K.

    1986-01-01

    The incorporation of a number of additional capabilities into an existing Weibull analysis computer program and the results of a Monte Carlo computer simulation study to evaluate the usefulness of the Weibull methods with samples having a very small number of failures and extensive censoring are discussed. Since the censoring mechanism inherent in the Space Shuttle Main Engine (SSME) data is hard to analyze, it was decided to use a random censoring model, generating censoring times from a uniform probability distribution. Some of the statistical techniques and computer programs used in the SSME Weibull analysis are described. The methods previously documented were supplemented by adding computer calculations of approximate (using iterative methods) confidence intervals for several parameters of interest. These calculations are based on a likelihood ratio statistic which is asymptotically a chi-squared statistic with one degree of freedom. The assumptions built into the computer simulations are described, as are the simulation program and the techniques used in it. Simulation results are tabulated for various combinations of Weibull shape parameters and numbers of failures in the samples.

  6. Least Squares Best Fit Method for the Three Parameter Weibull Distribution: Analysis of Tensile and Bend Specimens with Volume or Surface Flaw Failure

    NASA Technical Reports Server (NTRS)

    Gross, Bernard

    1996-01-01

    Material characterization parameters obtained from naturally flawed specimens are necessary for reliability evaluation of non-deterministic advanced ceramic structural components. The least squares best fit method is applied to the three parameter uniaxial Weibull model to obtain the material parameters from experimental tests on volume or surface flawed specimens subjected to pure tension, pure bending, four point or three point loading. Several illustrative example problems are provided.

  7. Predictive Failure of Cylindrical Coatings Using Weibull Analysis

    NASA Technical Reports Server (NTRS)

    Vlcek, Brian L.; Hendricks, Robert C.; Zaretsky, Erwin V.

    2002-01-01

    Rotating, coated wiping rollers used in a high-speed printing application failed primarily from fatigue. Two coating materials were evaluated: a hard, cross-linked, plasticized polyvinyl chloride (PVC) and a softer, plasticized PVC. A total of 447 tests was conducted with these coatings in a production facility. The data were evaluated using Weibull analysis. The softer coating produced more than twice the life of the harder cross-linked coating and reduced the wiper replacement rate by two-thirds, resulting in minimum production interruption.

  8. Weibull wind speed distribution: Numerical considerations and use with sodar data

    NASA Astrophysics Data System (ADS)

    Pérez, Isidro A.; Sánchez, M. Luisa; García, M. Ángeles

    2007-10-01

    Two analyses have been performed of the use of the Weibull distribution to describe wind speed statistics. The first is a theoretical analysis, over a common domain of the c and k parameters, of some robust indicators of position, spread, skewness, and kurtosis. The second is a calculation of the Weibull parameters using three differing methods based on a 3-year sodar database. The modified maximum-likelihood method is direct, the method of weighted probability moments considers order statistics, and the method based on the minimum RMSE is iterative. As a result of the theoretical analyses, we propose some simple relationships, which may be applied in practice, involving the Weibull parameters and the range of a fraction of central data, the variation coefficient, and the Yule-Kendall index. The calculation of the Weibull parameters revealed a sharp contrast between day, where the fit was highly satisfactory, and night, mainly below 300 m. Moreover, a seasonal pattern was also observed. The comparison between the methods used also proved satisfactory, particularly during the day, whereas a slight disagreement was observed during the night for the method based on the minimum RMSE. Finally, randomly generated samples were used to check the accuracy of the Weibull parameters in the domain analyzed, resulting in small residuals and standard deviations of the calculated values.
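
    Two of the estimation routes mentioned above can be sketched on synthetic wind speeds (c is the scale and k the shape, as in the abstract; the Justus moment approximation stands in for the paper's exact methods):

      import numpy as np
      from math import gamma
      from scipy import stats

      rng = np.random.default_rng(4)
      v = rng.weibull(2.0, size=5000) * 6.0  # synthetic wind speeds (m/s)

      # Empirical moment-based approximation (Justus): k from the variation coefficient.
      k_mom = (v.std() / v.mean()) ** -1.086
      c_mom = v.mean() / gamma(1.0 + 1.0 / k_mom)

      # Maximum-likelihood fit with the location fixed at zero.
      k_ml, _, c_ml = stats.weibull_min.fit(v, floc=0)
      print(k_mom, c_mom, k_ml, c_ml)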

  9. An EOQ Model with Two-Parameter Weibull Distribution Deterioration and Price-Dependent Demand

    ERIC Educational Resources Information Center

    Mukhopadhyay, Sushanta; Mukherjee, R. N.; Chaudhuri, K. S.

    2005-01-01

    An inventory replenishment policy is developed for a deteriorating item and price-dependent demand. The rate of deterioration is taken to be time-proportional and the time to deterioration is assumed to follow a two-parameter Weibull distribution. A power law form of the price dependence of demand is considered. The model is solved analytically…

  10. Weibull-distributed dyke thickness reflects probabilistic character of host-rock strength

    PubMed Central

    Krumbholz, Michael; Hieronymus, Christoph F.; Burchardt, Steffi; Troll, Valentin R.; Tanner, David C.; Friese, Nadine

    2014-01-01

    Magmatic sheet intrusions (dykes) constitute the main form of magma transport in the Earth’s crust. The size distribution of dykes is a crucial parameter that controls volcanic surface deformation and eruption rates and is required to realistically model volcano deformation for eruption forecasting. Here we present statistical analyses of 3,676 dyke thickness measurements from different tectonic settings and show that dyke thickness consistently follows the Weibull distribution. Known from materials science, power law-distributed flaws in brittle materials lead to Weibull-distributed failure stress. We therefore propose a dynamic model in which dyke thickness is determined by variable magma pressure that exploits differently sized host-rock weaknesses. The observed dyke thickness distributions are thus site-specific because rock strength, rather than magma viscosity and composition, exerts the dominant control on dyke emplacement. Fundamentally, the strength of geomaterials is scale-dependent and should be approximated by a probability distribution. PMID:24513695

  12. Weibull statistical analysis of tensile strength of vascular bundle in inner layer of moso bamboo culm in molecular parasitology and vector biology.

    PubMed

    Le, Cui; Wanxi, Peng; Zhengjun, Sun; Lili, Shang; Guoning, Chen

    2014-07-01

    Bamboo is a composite material with radial gradient variation, but the vascular bundles in the inner layer are evenly distributed. The objective is to determine the regular size pattern of the vascular bundles in the inner layer of Moso bamboo and to perform a Weibull statistical analysis of their tensile strength. The size and shape of vascular bundles in the inner layer are similar, with an average area of about 0.1550 mm2. A statistical evaluation of the tensile strength of the vascular bundle was conducted by means of Weibull statistics; the results show that the Weibull modulus m is 6.1121, and an accurate reliability assessment of the vascular bundle is determined.
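
    The standard graphical route to a Weibull modulus m, sketched with invented strengths (the paper's data and its m = 6.1121 are not reproduced here):

      import numpy as np

      strengths = np.sort(np.array(
          [312., 340., 355., 371., 380., 392., 405., 421., 440., 468.]))  # MPa
      n = strengths.size
      F = (np.arange(1, n + 1) - 0.5) / n   # plotting positions

      # Weibull plot: ln(-ln(1-F)) vs ln(strength) is linear with slope m.
      x = np.log(strengths)
      y = np.log(-np.log(1.0 - F))
      m, intercept = np.polyfit(x, y, 1)
      sigma0 = np.exp(-intercept / m)       # characteristic strength
      print(f"m = {m:.2f}, sigma0 = {sigma0:.1f} MPa")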

  13. Bonus-Malus System with the Claim Frequency Distribution is Geometric and the Severity Distribution is Truncated Weibull

    NASA Astrophysics Data System (ADS)

    Santi, D. N.; Purnaba, I. G. P.; Mangku, I. W.

    2016-01-01

    A Bonus-Malus system is said to be optimal if it is financially balanced for insurance companies and fair for policyholders. Previous research on Bonus-Malus systems concerned the determination of the risk premium applied to all of the severity guaranteed by the insurance company. In fact, not all of the severity claimed by the policyholder may be covered by the insurance company. When the insurance company sets a maximum bound on the severity incurred, it becomes necessary to modify the severity distribution into a bounded severity distribution. In this paper, an optimal Bonus-Malus system whose claim frequency component has a geometric distribution and whose severity component has a truncated Weibull distribution is discussed. The number of claims is considered to follow a Poisson distribution, and the expected number λ is exponentially distributed, so the number of claims has a geometric distribution. The severity, given a parameter θ, is considered to have a truncated exponential distribution; modelling θ with the Levy distribution then gives the severity a truncated Weibull distribution.

  14. An attribute control chart for a Weibull distribution under accelerated hybrid censoring

    PubMed Central

    Aslam, Muhammad; Arif, Osama H.; Jun, Chi-Hyuck

    2017-01-01

    In this article, an attribute control chart has been proposed using the accelerated hybrid censoring logic for the monitoring of defective items whose life follows a Weibull distribution. The product can be tested by introducing the acceleration factor based on different pressurized conditions such as stress, load, strain, temperature, etc. The control limits are derived based on the binomial distribution, but the fraction defective is expressed only through the shape parameter, the acceleration factor and the test duration constant. Tables of the average run lengths have been generated for different process parameters to assess the performance of the proposed control chart. Simulation studies have been performed for the practical use, where the proposed chart is compared with the Shewhart np chart for demonstration of the detection power of a process shift. PMID:28257479

  15. A Bayesian estimation on right censored survival data with mixture and non-mixture cured fraction model based on Beta-Weibull distribution

    NASA Astrophysics Data System (ADS)

    Yusuf, Madaki Umar; Bakar, Mohd. Rizam B. Abu

    2016-06-01

    Models for survival data that include the proportion of individuals who are not subject to the event under study are known as cure fraction models, or simply long-term survival models. The two most common models used to estimate the cure fraction are the mixture model and the non-mixture model. In this work, we present mixture and non-mixture cure fraction models for survival data based on the beta-Weibull distribution. This four-parameter distribution has been proposed as an alternative extension of the Weibull distribution in the analysis of lifetime data. This approach allows the inclusion of covariates in the models, where the estimation of the parameters is obtained under a Bayesian approach using Gibbs sampling methods.

  16. Reliability Evaluation Method with Weibull Distribution for Temporary Overvoltages of Substation Equipment

    NASA Astrophysics Data System (ADS)

    Okabe, Shigemitsu; Tsuboi, Toshihiro; Takami, Jun

    The power-frequency withstand voltage tests for electric power equipment are regulated in JEC by evaluating the lifetime reliability with a Weibull distribution function. The evaluation method is still controversial in terms of considering a plural number of faults, and some alternative methods have been proposed on this subject. The present paper first discusses the physical meanings of the various evaluation methods and secondly examines their effects on the power-frequency withstand voltage tests. Further, an appropriate method is investigated for an oil-filled transformer and a gas-insulated switchgear, taking note of the dielectric breakdown or partial discharge mechanism under various insulating material and structure conditions; the tentative conclusion is that the conventional method would be most pertinent under the present conditions.

  17. Weibull mixture model for isoconversional kinetic analysis of biomass oxidative pyrolysis

    NASA Astrophysics Data System (ADS)

    Cai, J. M.; Chen, S. Y.

    2010-03-01

    In this work, the possibility of applying the weighted sum of three cumulative Weibull distribution functions for the fitting of the kinetic conversion data of biomass oxidative pyrolysis has been investigated. The kinetic conversion data of the thermal decomposition of olive oil solid waste in oxygen atmosphere for different heating rates have been analyzed. The results have shown that the experimental data can be perfectly reproduced by the general fitting function. Therefore, it is possible to obtain the corresponding conversion rate values of biomass oxidative pyrolysis by differentiating directly the fitted kinetic conversion data. Additionally, the logistic mixture model has been applied to the same experimental data. It can be found that the newly proposed function can provide a better fit of the data than the logistic mixture model. Based on the fitting of Weibull mixture model, the kinetic triples (E, A and f(α)) of oxidative pyrolysis of olive solid waste were obtained by means of Friedman's differential isoconversional method.
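
    A minimal sketch of fitting such a weighted sum of three Weibull CDFs to conversion data α(T) (synthetic data and parameter values; not the paper's olive-waste dataset):

      import numpy as np
      from scipy.optimize import curve_fit

      def weibull_cdf(T, k, lam, T0):
          # Cumulative Weibull with an onset temperature T0.
          z = np.clip((T - T0) / lam, 0.0, None)
          return 1.0 - np.exp(-z ** k)

      def mixture(T, w1, w2, k1, l1, T1, k2, l2, T2, k3, l3, T3):
          w3 = 1.0 - w1 - w2  # weights sum to one
          return (w1 * weibull_cdf(T, k1, l1, T1)
                  + w2 * weibull_cdf(T, k2, l2, T2)
                  + w3 * weibull_cdf(T, k3, l3, T3))

      T = np.linspace(450, 900, 200)  # temperature (K)
      alpha = mixture(T, 0.4, 0.35, 3, 60, 450, 4, 80, 550, 5, 70, 700)
      alpha += np.random.default_rng(5).normal(0, 0.005, T.size)  # noise

      p0 = [0.3, 0.3, 2, 50, 460, 3, 70, 560, 4, 60, 690]  # initial guess near truth
      popt, _ = curve_fit(mixture, T, alpha, p0=p0, maxfev=20000)
      print(popt)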

  18. Probabilistic Analysis for Comparing Fatigue Data Based on Johnson-Weibull Parameters

    NASA Technical Reports Server (NTRS)

    Hendricks, Robert C.; Zaretsky, Erwin V.; Vlcek, Brian L.

    2007-01-01

    Probabilistic failure analysis is essential when analysis of stress-life (S-N) curves is inconclusive in determining the relative ranking of two or more materials. In 1964, L. Johnson published a methodology for establishing the confidence that two populations of data are different. Simplified algebraic equations for confidence numbers were derived based on the original work of L. Johnson. Using the ratios of mean life, the resultant values of confidence numbers deviated less than one percent from those of Johnson. It is possible to rank the fatigue lives of different materials with a reasonable degree of statistical certainty based on combined confidence numbers. These equations were applied to rotating beam fatigue tests that were conducted on three aluminum alloys at three stress levels each. These alloys were AL 2024, AL 6061, and AL 7075. The results were analyzed and compared using ASTM Standard E739-91 and the Johnson-Weibull analysis. The ASTM method did not statistically distinguish between AL 6061 and AL 7075. Based on the Johnson-Weibull analysis confidence numbers greater than 99 percent, AL 2024 was found to have the longest fatigue life, followed by AL 7075, and then AL 6061. The ASTM Standard and the Johnson-Weibull analysis result in the same stress-life exponent p for each of the three aluminum alloys at the median or L(sub 50) lives.

  19. Statistical analysis of bivariate failure time data with Marshall-Olkin Weibull models.

    PubMed

    Li, Yang; Sun, Jianguo; Song, Shuguang

    2012-06-01

    This paper discusses parametric analysis of bivariate failure time data, which often occur in medical studies among others. For this, as in the case of univariate failure time data, exponential and Weibull models are probably the most commonly used ones. However, it is surprising that there seem no general estimation procedures available for fitting the bivariate Weibull model to bivariate right-censored failure time data except some methods for special situations. We present and investigate two general but simple estimation procedures, one being a graphical approach and the other being a marginal approach, for the problem. An extensive simulation study is conducted to assess the performances of the proposed approaches and shows that they work well for practical situations. An illustrative example is provided.

  20. USE OF WEIBULL FUNCTION FOR NON-LINEAR ANALYSIS OF EFFECTS OF LOW LEVELS OF SIMULATED HERBICIDE DRIFT ON PLANTS

    EPA Science Inventory

    We compared two regression models, which are based on the Weibull and probit functions, for the analysis of pesticide toxicity data from laboratory studies on Illinois crop and native plant species. Both mathematical models are continuous, differentiable, strictly positive, and...

  1. Probabilistic Analysis for Comparing Fatigue Data Based on Johnson-Weibull Parameters

    NASA Technical Reports Server (NTRS)

    Vlcek, Brian L.; Hendricks, Robert C.; Zaretsky, Erwin V.

    2013-01-01

    Leonard Johnson published a methodology for establishing the confidence that two populations of data are different. Johnson's methodology is dependent on limited combinations of test parameters (Weibull slope, mean life ratio, and degrees of freedom) and a set of complex mathematical equations. In this report, a simplified algebraic equation for confidence numbers is derived based on the original work of Johnson. The confidence numbers calculated with this equation are compared to those obtained graphically by Johnson. Using the ratios of mean life, the resultant values of confidence numbers at the 99 percent level deviate less than 1 percent from those of Johnson. At a 90 percent confidence level, the calculated values differ between +2 and 4 percent. The simplified equation is used to rank the experimental lives of three aluminum alloys (AL 2024, AL 6061, and AL 7075), each tested at three stress levels in rotating beam fatigue, analyzed using the Johnson-Weibull method, and compared to the ASTM Standard (E739-91) method of comparison. The ASTM Standard did not statistically distinguish between AL 6061 and AL 7075. However, it is possible to rank the fatigue lives of different materials with a reasonable degree of statistical certainty based on combined confidence numbers using the Johnson-Weibull analysis. AL 2024 was found to have the longest fatigue life, followed by AL 7075, and then AL 6061. The ASTM Standard and the Johnson-Weibull analysis result in the same stress-life exponent p for each of the three aluminum alloys at the median, or L(sub 50), lives.

  2. Accurate bearing remaining useful life prediction based on Weibull distribution and artificial neural network

    NASA Astrophysics Data System (ADS)

    Ben Ali, Jaouher; Chebel-Morello, Brigitte; Saidi, Lotfi; Malinowski, Simon; Fnaiech, Farhat

    2015-05-01

    Accurate remaining useful life (RUL) prediction of critical assets is an important challenge in condition-based maintenance to improve reliability and decrease machine breakdowns and maintenance costs. Bearings are among the most important components in industry that need to be monitored, and the user should predict their RUL. The challenge of this study is to propose an original feature able to evaluate the health state of bearings and to estimate their RUL by Prognostics and Health Management (PHM) techniques. In this paper, the proposed method is based on the data-driven prognostic approach. The combination of the Simplified Fuzzy Adaptive Resonance Theory Map (SFAM) neural network and the Weibull distribution (WD) is explored. WD is used just in the training phase, to fit measurements and to avoid areas of fluctuation in the time domain. The SFAM training process is based on fitted measurements at the present and previous inspection time points as input, whereas the SFAM testing process is based on real measurements at the present and previous inspections. Thanks to the fuzzy learning process, SFAM has a strong ability and good performance in learning nonlinear time series. As output, seven classes are defined: healthy bearing and six states of bearing degradation. In order to find the optimal RUL prediction, a smoothing phase is proposed in this paper. Experimental results show that the proposed method can reliably predict the RUL of rolling element bearings (REBs) based on vibration signals. The proposed prediction approach can be applied to the prognostics of various other mechanical assets.

  3. Average capacity for optical wireless communication systems over exponentiated Weibull distribution non-Kolmogorov turbulent channels.

    PubMed

    Cheng, Mingjian; Zhang, Yixin; Gao, Jie; Wang, Fei; Zhao, Fengsheng

    2014-06-20

    We model the average channel capacity of optical wireless communication systems for cases of weak to strong turbulence channels, using the exponentiated Weibull distribution model. The joint effects of beam wander and spread, pointing errors, atmospheric attenuation, and the spectral index of non-Kolmogorov turbulence on system performance are included. Our results show that the average capacity decreases steeply as the propagation length L changes from 0 to 200 m, and decreases slowly or tends to a stable value once the propagation length L is greater than 200 m. In the weak turbulence region, increasing the detection aperture improves the average channel capacity, and atmospheric visibility is an important issue affecting the average channel capacity. In the strong turbulence region, increasing the radius of the detection aperture cannot reduce the effects of atmospheric turbulence on the average channel capacity, and the effect of atmospheric visibility on the channel information capacity can be ignored. The effect of the spectral power exponent on the average channel capacity is larger in the strong turbulence region than in the weak turbulence region. Irrespective of the details determining the turbulent channel, we can say that pointing errors have a significant effect on the average channel capacity of optical wireless communication systems in turbulence channels.

  4. Mixture and non-mixture cure fraction models based on the generalized modified Weibull distribution with an application to gastric cancer data.

    PubMed

    Martinez, Edson Z; Achcar, Jorge A; Jácome, Alexandre A A; Santos, José S

    2013-12-01

    Cure fraction models are usually used to model lifetime data with long-term survivors. In the present article, we introduce a Bayesian analysis of the four-parameter generalized modified Weibull (GMW) distribution in the presence of cure fraction, censored data and covariates. In order to include the proportion of "cured" patients, mixture and non-mixture formulation models are considered. To demonstrate the ability of using this model in the analysis of real data, we consider an application to data from patients with gastric adenocarcinoma. Inferences are obtained by using MCMC (Markov Chain Monte Carlo) methods.

  5. A practical simulation method to calculate sample size of group sequential trials for time-to-event data under exponential and Weibull distribution.

    PubMed

    Jiang, Zhiwei; Wang, Ling; Li, Chanjuan; Xia, Jielai; Jia, Hongxia

    2012-01-01

    Group sequential design has been widely applied in clinical trials in the past few decades. The sample size estimation is a vital concern of sponsors and investigators, and in survival group sequential trials it is a thorny question because of the ambiguous distributional form, censored data and differing definitions of information time. A practical and easy-to-use simulation-based method is proposed for multi-stage two-arm survival group sequential design in this article, and its SAS program is available. Besides the exponential distribution, which is usually assumed for survival data, the Weibull distribution is considered here. The incorporation of the probability of discontinuation in the simulation leads to a more accurate estimate. The assessment indexes calculated in the simulation are helpful in determining the number and timing of the interim analyses. The use of the method in survival group sequential trials is illustrated, and the effects of a varied shape parameter on the sample size under the Weibull distribution are explored by means of an example. According to the simulation results, a method to estimate the shape parameter of the Weibull distribution is proposed based on the median survival time of the test drug and the hazard ratio, which are prespecified by the investigators and other participants. 10+ simulations are recommended to achieve a robust estimate of the sample size. Furthermore, the method is still applicable in adaptive design if the strategy of sample size scheme determination is adopted when designing, or if minor modifications to the program are made.
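
    The simulation logic can be caricatured as below (a much-simplified single-look sketch: no interim analyses, and a Mann-Whitney comparison on censored times stands in for a log-rank test; all parameters are invented):

      import numpy as np
      from scipy import stats

      def power(n_per_arm, shape, scale_ctrl, hazard_ratio, followup, nsim=2000, seed=6):
          rng = np.random.default_rng(seed)
          # Weibull proportional-hazards relation: scaling the scale by HR**(-1/k).
          scale_trt = scale_ctrl * hazard_ratio ** (-1.0 / shape)
          hits = 0
          for _ in range(nsim):
              t0 = rng.weibull(shape, n_per_arm) * scale_ctrl
              t1 = rng.weibull(shape, n_per_arm) * scale_trt
              u = stats.mannwhitneyu(np.minimum(t0, followup),
                                     np.minimum(t1, followup))
              hits += u.pvalue < 0.05
          return hits / nsim

      print(power(150, shape=1.2, scale_ctrl=20.0, hazard_ratio=0.7, followup=36))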

  6. Regional flood frequency analysis based on a Weibull model: Part 1. Estimation and asymptotic variances

    NASA Astrophysics Data System (ADS)

    Heo, Jun-Haeng; Boes, D. C.; Salas, J. D.

    2001-02-01

    Parameter estimation in a regional flood frequency setting, based on a Weibull model, is revisited. A two parameter Weibull distribution at each site, with common shape parameter over sites that is rationalized by a flood index assumption, and with independence in space and time, is assumed. The estimation techniques of method of moments and method of probability weighted moments are studied by proposing a family of estimators for each technique and deriving the asymptotic variance of each estimator. Then a single estimator and its asymptotic variance for each technique, suggested by trying to minimize the asymptotic variance over the family of estimators, is obtained. These asymptotic variances are compared to the Cramer-Rao Lower Bound, which is known to be the asymptotic variance of the maximum likelihood estimator. A companion paper considers the application of this model and these estimation techniques to a real data set. It includes a simulation study designed to indicate the sample size required for compatibility of the asymptotic results to fixed sample sizes.

  7. Improvement in mechanical properties of jute fibres through mild alkali treatment as demonstrated by utilisation of the Weibull distribution model.

    PubMed

    Roy, Aparna; Chakraborty, Sumit; Kundu, Sarada Prasad; Basak, Ratan Kumar; Majumder, Subhasish Basu; Adhikari, Basudam

    2012-03-01

    Chemically modified jute fibres are potentially useful as natural reinforcement in composite materials. Jute fibres were treated with 0.25%-1.0% sodium hydroxide (NaOH) solution for 0.5-48 h. The hydrophilicity, surface morphology, crystallinity index, and thermal and mechanical characteristics of untreated and alkali-treated fibres were studied. The two-parameter Weibull distribution model was applied to deal with the variation in mechanical properties of the natural fibres. Alkali treatment enhanced the tensile strength and elongation at break by 82% and 45%, respectively, but decreased the hydrophilicity by 50.5% and the diameter of the fibres by 37%.

  8. An incentive for coordination in a decentralised service chain with a Weibull lifetime distributed facility

    NASA Astrophysics Data System (ADS)

    Lin, Yi-Fang; Yang, Gino K.; Yang, Chyn-Yng; Chu, Tu-Bin

    2013-10-01

    This article deals with a decentralised service chain consisting of a service provider and a facility owner. The revenue allocation and service price are, respectively, determined by the service provider and the facility owner in a non-cooperative manner. To model this decentralised operation, a Stackelberg game between the two parties is formulated. In the mathematical framework, the service system is assumed to be driven by Poisson customer arrivals and exponential service times. The most common log-linear service demand and Weibull facility lifetime are also adopted. Under these analytical conditions, the decentralised decisions in this game are investigated and then a unique optimal equilibrium is derived. Finally, a coordination mechanism is proposed to improve the efficiency of this decentralised system.

  9. Weibull-Based Parts Failure Analysis Computer Program User’s Manual

    DTIC Science & Technology

    1989-01-25

    Program menu excerpt (the remainder of the indexed excerpt is a table of tabulated values, omitted here): 1. Weibull parameter calculation and save input data (PWA CO); 2. Weibull parameter calculation with maximum-likelihood values (PWA); 3. Characteristic life calculation...

  10. A practical and systematic review of Weibull statistics for reporting strengths of dental materials

    PubMed Central

    Quinn, George D.; Quinn, Janet B.

    2011-01-01

    Objectives: To review the history, theory and current applications of Weibull analyses sufficient to make informed decisions regarding practical use of the analysis in dental material strength testing. Data: References are made to examples in the engineering and dental literature; this paper also includes illustrative analyses of Weibull plots, fractographic interpretations, and Weibull distribution parameters obtained for a dense alumina, two feldspathic porcelains, and a zirconia. Sources: Informational sources include Weibull's original articles, later articles specific to applications and theoretical foundations of Weibull analysis, texts on statistics and fracture mechanics, and the international standards literature. Study Selection: The chosen Weibull analyses are used to illustrate technique, the importance of flaw size distributions, the physical meaning of Weibull parameters, and the concept of “equivalent volumes” to compare measured strengths obtained from different test configurations. Conclusions: Weibull analysis has a strong theoretical basis and can be of particular value in dental applications, primarily because of test specimen size limitations and the use of different test configurations. Also endemic to dental materials, however, is increased difficulty in satisfying application requirements, such as confirming fracture origin type and diligence in obtaining quality strength data. PMID:19945745

  11. A comparative study of mixed exponential and Weibull distributions in a stochastic model replicating a tropical rainfall process

    NASA Astrophysics Data System (ADS)

    Abas, Norzaida; Daud, Zalina M.; Yusof, Fadhilah

    2014-11-01

    A stochastic rainfall model is presented for the generation of hourly rainfall data in an urban area in Malaysia. In view of the high temporal and spatial variability of rainfall within the tropical rain belt, the Spatial-Temporal Neyman-Scott Rectangular Pulse model was used. The model, which is governed by the Neyman-Scott process, employs a reasonable number of parameters to represent the physical attributes of rainfall. A common approach is to attach each attribute to a mathematical distribution. With respect to rain cell intensity, this study proposes the use of a mixed exponential distribution. The performance of the proposed model was compared to a model that employs the Weibull distribution. Hourly and daily rainfall data from four stations in the Damansara River basin in Malaysia were used as input to the models, and simulations of hourly series were performed for an independent site within the basin. The performance of the models was assessed based on how closely the statistical characteristics of the simulated series resembled the statistics of the observed series. Graphical comparison revealed that the statistical characteristics of the simulated series for both models agreed reasonably well with those of the observed series. However, a further assessment using the AIC, BIC and RMSE showed that the proposed model yields better results. The results of this study indicate that for tropical climates, the proposed model, using a mixed exponential distribution, is the better choice for generation of synthetic data for ungauged sites or for sites with insufficient data within the limit of the fitted region.
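
    Since the comparison above rests on AIC-type criteria, a minimal Python sketch of such a comparison on a single intensity sample may help; the data here are synthetic stand-ins, and the two-component mixed exponential is fitted with a simple EM loop rather than the authors' procedure:

        import numpy as np
        from scipy import stats

        def aic(loglik, n_par):
            return 2 * n_par - 2 * loglik

        x = np.random.weibull(0.8, 2000) * 3.0  # stand-in for rain-cell intensities

        # Two-parameter Weibull fit (location fixed at zero)
        c, _, scale = stats.weibull_min.fit(x, floc=0)
        ll_w = stats.weibull_min.logpdf(x, c, 0, scale).sum()

        # Two-component mixed exponential fit via EM (3 free parameters)
        w, m1, m2 = 0.5, x.mean() / 2, x.mean() * 2
        for _ in range(200):
            p1 = w * stats.expon.pdf(x, scale=m1)
            p2 = (1 - w) * stats.expon.pdf(x, scale=m2)
            r = p1 / (p1 + p2)                      # responsibilities
            w = r.mean()
            m1 = (r * x).sum() / r.sum()
            m2 = ((1 - r) * x).sum() / (1 - r).sum()
        mix = w * stats.expon.pdf(x, scale=m1) + (1 - w) * stats.expon.pdf(x, scale=m2)
        ll_m = np.log(mix).sum()

        print("AIC Weibull:", aic(ll_w, 2), "AIC mixed exponential:", aic(ll_m, 3))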

  12. Reliability growth modeling analysis of the space shuttle main engines based upon the Weibull process

    NASA Technical Reports Server (NTRS)

    Wheeler, J. T.

    1990-01-01

    The Weibull process, identified as the inhomogeneous Poisson process with the Weibull intensity function, is used to model the reliability growth assessment of the space shuttle main engine test and flight failure data. Additional tables of percentage-point probabilities for several different values of the confidence coefficient have been generated for setting (1-alpha)100-percent two-sided confidence interval estimates on the mean time between failures. The tabled data pertain to two cases: (1) time-terminated testing, and (2) failure-terminated testing. The critical values of the three test statistics, namely Cramer-von Mises, Kolmogorov-Smirnov, and chi-square, were calculated and tabled for use in the goodness-of-fit tests for the engine reliability data. Numerical results are presented for five different groupings of the engine data that reflect the actual response to the failures.
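
    For reference, the point estimates of the Weibull (power-law) intensity model are simple to compute; a minimal Python sketch for the time-terminated case follows (the paper's contribution, the confidence-interval tables, is not reproduced here):

        import numpy as np

        def weibull_process_mle(times, T):
            # MLE for the intensity u(t) = lam * beta * t**(beta - 1),
            # time-terminated test on (0, T]; `times` holds the
            # cumulative failure times.
            t = np.asarray(times, dtype=float)
            n = len(t)
            beta = n / np.log(T / t).sum()
            lam = n / T**beta
            mtbf = 1.0 / (lam * beta * T**(beta - 1))  # instantaneous MTBF at T
            return beta, lam, mtbf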

  13. Transformation and Self-Similarity Properties of Gamma and Weibull Fragment Size Distributions

    DTIC Science & Technology

    2015-12-01

  14. Prediction of the curing time to achieve maturity of the nano-cement based concrete using the Weibull distribution model: A complementary data set.

    PubMed

    Jo, Byung Wan; Chakraborty, Sumit; Kim, Heon

    2015-09-01

    This data article provides comparison data for nano-cement based concrete (NCC) and ordinary Portland cement based concrete (OPCC). Concrete samples (OPCC) were fabricated using ten different mix designs, and their characterization data are provided here. The curing time was optimized using the Weibull distribution model by analyzing the rate of change of the compressive strength of the OPCC. Initially, the compressive strength of the OPCC samples was measured after each of four curing times. The curing time required to achieve a particular rate of change of the compressive strength was then predicted from the relationship between this rate of change and the curing time, and the curing time was optimized (at the 99.99% confidence level) using the Weibull distribution model. This data article complements the research article entitled "Prediction of the curing time to achieve maturity of the nano-cement based concrete using the Weibull distribution model" [1].

  15. Statistical analysis of the strength of ultra-oriented ultra-high-molecular-weight polyethylene film filaments in the framework of the Weibull model

    NASA Astrophysics Data System (ADS)

    Boiko, Yu. M.; Marikhin, V. A.; Myasnikova, L. P.; Moskalyuk, O. A.; Radovanova, E. I.

    2016-10-01

    A statistical analysis of the distribution of the tensile strength σ of ultra-oriented ultra-high-molecular-weight polyethylene (UHMWPE) film filaments has been performed in the framework of the Weibull model using the results of a large number (50) of measurements. The UHMWPE film filaments were produced by means of high-temperature multistage zone drawing of xerogels prepared from 1.5% UHMWPE solutions in decalin. The Weibull modulus has been determined for this type of material. It has been shown that, for the ultimate draw ratio λ = 120, the average tensile strength is equal to 4.7 GPa, which is significantly higher than the tensile strength σ = 3.5 GPa of commercial gel-spun UHMWPE fibers manufactured by the DSM Company (The Netherlands) and Honeywell International Inc. (United States). It has been demonstrated that, for 20% of the specimens thus prepared, the tensile strength reaches record-high values of σ = 5.2-5.9 GPa.
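
    The Weibull modulus in studies like this is commonly read off a probability plot; a minimal Python sketch with median-rank plotting positions (an assumption here, since the record does not state the plotting rule used) is:

        import numpy as np

        def weibull_modulus(strengths):
            # Linear regression on a Weibull probability plot:
            # ln(-ln(1 - F)) = m * ln(sigma) - m * ln(sigma0).
            s = np.sort(np.asarray(strengths, dtype=float))
            n = len(s)
            F = (np.arange(1, n + 1) - 0.3) / (n + 0.4)  # Bernard's median ranks
            m, b = np.polyfit(np.log(s), np.log(-np.log(1.0 - F)), 1)
            sigma0 = np.exp(-b / m)  # characteristic strength
            return m, sigma0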

  16. Quality-Related Monitoring and Grading of Granulated Products by Weibull-Distribution Modeling of Visual Images with Semi-Supervised Learning

    PubMed Central

    Liu, Jinping; Tang, Zhaohui; Xu, Pengfei; Liu, Wenzhong; Zhang, Jin; Zhu, Jianyong

    2016-01-01

    The topic of online product quality inspection (OPQI) with smart visual sensors is attracting increasing interest in both the academic and industrial communities, on account of the natural connection between the visual appearance of products and their underlying qualities. Visual images captured from granulated products (GPs), e.g., cereal products or fabric textiles, comprise a large number of independent particles or stochastically stacked, locally homogeneous fragments, whose analysis and understanding remain challenging. A method of image-statistical-modeling-based OPQI for GP quality grading and monitoring is presented, combining a Weibull distribution (WD) model with a semi-supervised learning classifier. WD-model parameters (WD-MPs) of the spatial structures of GP images, obtained with omnidirectional Gaussian derivative filtering (OGDF) and demonstrated theoretically to obey a specific WD model of integral form, were extracted as the visual features. Then a co-training-style semi-supervised classifier algorithm, named COSC-Boosting, was exploited for semi-supervised GP quality grading, integrating two independent classifiers with complementary natures in the face of scarce labeled samples. The effectiveness of the proposed OPQI method was verified in automated rice quality grading, where it showed performance superior to commonly used methods, laying a foundation for the quality control of GPs on assembly lines. PMID:27367703
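
    As a simplified stand-in for the filtering-and-fitting step (the omnidirectional filter bank and the integral-form WD model of the paper are not reproduced), a minimal Python sketch fits Weibull parameters to Gaussian-derivative gradient magnitudes and uses them as texture features:

        import numpy as np
        from scipy import ndimage, stats

        def weibull_texture_features(img, sigma=1.0):
            # Gaussian first-derivative responses along each axis,
            # combined into a gradient magnitude, then fitted by a
            # two-parameter Weibull; (shape, scale) is the descriptor.
            img = np.asarray(img, dtype=float)
            gx = ndimage.gaussian_filter(img, sigma, order=(0, 1))
            gy = ndimage.gaussian_filter(img, sigma, order=(1, 0))
            mag = np.hypot(gx, gy).ravel()
            mag = mag[mag > 0]
            shape, _, scale = stats.weibull_min.fit(mag, floc=0)
            return shape, scale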

  17. Empirical model building based on Weibull distribution to describe the joint effect of pH and temperature on the thermal resistance of Bacillus cereus in vegetable substrate.

    PubMed

    Fernández, A; Collado, J; Cunha, L M; Ocio, M J; Martínez, A

    2002-07-25

    A mathematical model based on Weibull parameters was built to describe the joint effect of temperature and pH on thermal inactivation of Bacillus cereus spores (strain INRA TZ415). The effect of these factors on Weibull model parameters (beta, 1/alpha) was also studied. Heat inactivation tests were carried out in acidified carrot broth as vegetable substrate, following a full factorial design at four levels for temperature (80, 85, 90 and 95 degrees C) and pH (6.2, 5.8, 5.2 and 4.7). The Weibull distribution model provided good individual fits for the different combinations of temperature-pH tested, with discrepancy factors, Df, coming close to 25% for most cases. The temperature and pH did not have a significant effect on the shape parameter (beta), which yielded a mean value of 0.88. The scale parameter (alpha) decreased with pH, and its inverse (1/alpha) followed an Arrhenius-type relationship with temperature. A global model was built, including the dependence of the alpha parameter on temperature and pH, and the model parameters were estimated by using a one-step nonlinear least-squares regression to improve the precision of the estimates. Results indicated that the global model provides a satisfactory description of the thermal inactivation of B. cereus spores, with R2 equal to 0.983.
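
    The Weibull inactivation model used here is easy to fit directly; a minimal Python sketch with hypothetical survival data (the numbers below are illustrative, not the paper's measurements) is:

        import numpy as np
        from scipy.optimize import curve_fit

        def weibull_log_survival(t, alpha, beta):
            # log10(N/N0) under the Weibull inactivation model
            return -(t / alpha) ** beta

        t = np.array([0.5, 1, 2, 4, 6, 8, 10])  # heating time, min
        logS = np.array([-0.3, -0.6, -1.1, -1.9, -2.6, -3.1, -3.6])

        (alpha, beta), _ = curve_fit(weibull_log_survival, t, logS, p0=(2.0, 1.0))
        print(f"scale alpha = {alpha:.2f} min, shape beta = {beta:.2f}")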

  18. Estimating Age Distributions of Base Flow in Watersheds Underlain by Single and Dual Porosity Formations Using Groundwater Transport Simulation and Weighted Weibull Functions

    NASA Astrophysics Data System (ADS)

    Sanford, W. E.

    2015-12-01

    Age distributions of base flow to streams are important to estimate for predicting the timing of water-quality responses to changes in distributed inputs of nutrients or pollutants at the land surface. Simple models of shallow aquifers will predict exponential age distributions, but more realistic 3-D stream-aquifer geometries will cause deviations from an exponential curve. In addition, in fractured rock terrains the dual nature of the effective and total porosity of the system complicates the age distribution further. In this study shallow groundwater flow and advective transport were simulated in two regions in the Eastern United States—the Delmarva Peninsula and the upper Potomac River basin. The former is underlain by layers of unconsolidated sediment, while the latter consists of folded and fractured sedimentary rocks. Transport of groundwater to streams was simulated using the USGS code MODPATH within 175 and 275 watersheds, respectively. For the fractured rock terrain, calculations were also performed along flow pathlines to account for exchange between mobile and immobile flow zones. Porosities at both sites were calibrated using environmental tracer data (3H, 3He, CFCs and SF6) in wells and springs, and with a 30-year tritium record from the Potomac River. Carbonate and siliciclastic rocks were calibrated to have mobile porosity values of one and six percent, and immobile porosity values of 18 and 12 percent, respectively. The age distributions were fitted to Weibull functions. Whereas an exponential function has one parameter that controls the median age of the distribution, a Weibull function has an extra parameter that controls the slope of the curve. A weighted Weibull function was also developed that potentially allows for four parameters, two that control the median age and two that control the slope, one of each weighted toward early or late arrival times. For both systems the two-parameter Weibull function nearly always produced a substantially

  19. Modeling the reliability and maintenance costs of wind turbines using Weibull analysis

    SciTech Connect

    Vachon, W.A.

    1996-12-31

    A general description is provided of the basic mathematics and use of Weibull statistical models for modeling component failures and maintenance costs as a function of time. The applicability of the model to wind turbine components and subsystems is discussed with illustrative examples of typical component reliabilities drawn from actual field experiences. Example results indicate the dominant role of key subsystems based on a combination of their failure frequency and repair/replacement costs. The value of the model is discussed as a means of defining (1) maintenance practices, (2) areas in which to focus product improvements, (3) spare parts inventory, and (4) long-term trends in maintenance costs as an important element in project cash flow projections used by developers, investors, and lenders. 6 refs., 8 figs., 3 tabs.

  20. SER performance analysis of MPPM FSO system with three decision thresholds over exponentiated Weibull fading channels

    NASA Astrophysics Data System (ADS)

    Wang, Ping; Yang, Bensheng; Guo, Lixin; Shang, Tao

    2015-11-01

    In this work, the symbol error rate (SER) performance of the multiple pulse position modulation (MPPM) based free-space optical communication (FSO) system with three different decision thresholds, fixed decision threshold (FDT), optimized decision threshold (ODT) and dynamic decision threshold (DDT) over exponentiated Weibull (EW) fading channels has been investigated in detail. The effects of aperture averaging on each decision threshold under weak-to-strong turbulence conditions are further studied and compared. The closed-form SER expressions for three thresholds derived with the help of generalized Gauss-Laguerre quadrature rule are verified by the Monte Carlo simulations. This work is helpful for the design of receivers for FSO communication systems.

  1. Evaluation of uncertainty in experimental active buckling control of a slender beam-column with disturbance forces using Weibull analysis

    NASA Astrophysics Data System (ADS)

    Enss, Georg C.; Platz, Roland

    2016-10-01

    Buckling of slender load-bearing beam-columns is a crucial failure scenario in light-weight structures, as it may result in the collapse of the entire structure. If axial load and load capacity are unknown, stability becomes uncertain. To compensate for this uncertainty, the authors developed and evaluated an approach for active buckling control of a slender beam-column, clamped at the base and pinned at the upper end. Active lateral forces are applied with two piezoelectric stack actuators in opposing directions near the beam-column's clamped base to prevent buckling. A Linear Quadratic Regulator is designed and implemented on the experimental demonstrator, and statistical tests are conducted to prove the effectiveness of the active approach. The load capacity of the beam-column could be increased by 40%, and the scatter of buckling occurrences for increasing axial loads is reduced. Weibull analysis is used to evaluate the increase of the load capacity and its related uncertainty compensation.

  2. Weibull-Based Design Methodology for Rotating Aircraft Engine Structures

    NASA Technical Reports Server (NTRS)

    Zaretsky, Erwin; Hendricks, Robert C.; Soditus, Sherry

    2002-01-01

    The NASA Energy Efficient Engine (E(sup 3)-Engine) is used as the basis of a Weibull-based life and reliability analysis. Each component's life, and thus the engine's life, is defined by high-cycle fatigue (HCF) or low-cycle fatigue (LCF). Knowing the cumulative life distribution of each of the components making up the engine, as represented by a Weibull slope, is a prerequisite to predicting the life and reliability of the entire engine. As the engine Weibull slope increases, the predicted lives decrease. The predicted engine lives L(sub 5) (95% probability of survival) of approximately 17,000 and 32,000 hr do correlate with current engine maintenance practices without and with refurbishment, respectively. The individual high pressure turbine (HPT) blade lives necessary to obtain a blade system life L(sub 0.1) (99.9% probability of survival) of 9000 hr for Weibull slopes of 3, 6, and 9 are 47,391, 20,652, and 15,658 hr, respectively. For a design life of the HPT disks having probable points of failure equal to or greater than 36,000 hr at a probability of survival of 99.9%, the predicted disk system life L(sub 0.1) can vary from 9,408 to 24,911 hr.
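
    The arithmetic connecting component and system lives can be sketched as follows, assuming a series system of identical Weibull components (the blade count and the numbers in the example are hypothetical, not the paper's):

        import numpy as np

        def component_eta(L_sys, S_sys, n_comp, slope):
            # Characteristic life eta that each of n_comp identical
            # Weibull components needs so the series system reaches
            # survival S_sys at life L_sys.
            S_comp = S_sys ** (1.0 / n_comp)  # series-system reliability
            return L_sys / (-np.log(S_comp)) ** (1.0 / slope)

        # e.g. a hypothetical 100-blade row, system L(sub 0.1) = 9000 hr, slope 6
        print(component_eta(9000.0, 0.999, 100, 6.0))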

  3. Weibull Wind-Speed Distribution Parameters Derived from a Combination of Wind-Lidar and Tall-Mast Measurements Over Land, Coastal and Marine Sites

    NASA Astrophysics Data System (ADS)

    Gryning, Sven-Erik; Floors, Rogier; Peña, Alfredo; Batchvarova, Ekaterina; Brümmer, Burghard

    2016-05-01

    Wind-speed observations from tall towers are used in combination with observations up to 600 m in altitude from a Doppler wind lidar to study the long-term conditions over suburban (Hamburg), rural coastal (Høvsøre) and marine (FINO3) sites. The variability in the wind field among the sites is expressed in terms of mean wind speed and Weibull distribution shape-parameter profiles. The consequences of the choice of the carrier-to-noise ratio (CNR) threshold value for the wind-lidar observations are as follows. When the wind-lidar CNR is lower than a prescribed threshold value, the observations are often filtered out, as the uncertainty in the wind-speed measurements increases. For a pulsed heterodyne Doppler lidar, use of the traditional -22 dB CNR threshold value at all measuring levels up to 600 m results in a ≈ 7 % overestimation in the long-term mean wind speed over land, and a ≈ 12 % overestimation in coastal and marine environments. In addition, the height of the profile maximum of the shape parameter of the Weibull distribution (the so-called reversal height) is found to depend on the applied CNR threshold; it is lower at small CNR threshold values. The reversal height is greater in the suburban (high roughness) than in the rural (low roughness) area. In coastal areas the reversal height is lower than that over land and relates to the internal boundary layer that develops downwind from the coastline. Over the sea the shape parameter increases towards the sea surface. A parametrization of the vertical profile of the shape parameter fits well with observations over land, coastal regions and over the sea. An applied model for the dependence of the reversal height on the surface roughness is in good agreement with the observations over land.

  4. Multiaxial loading fracture of Al/sub 2/O/sub 3/ tubes: II. Weibull theory and analysis

    SciTech Connect

    Petrovic, J.J.; Stout, M.G.

    1984-01-01

    The Weibull statistical fracture theory for multiaxial loading was developed for thick- and thin-walled tube geometries subjected to multiaxial loading in tension-internal pressure, compression-internal pressure, and pure torsion. As compared to uniaxial tension, lower strengths are predicted for tension-tension stress states and higher strengths for tension-compression stress states. Comparison to experimental results for Al/sub 2/O/sub 3/ tubes indicates reasonable agreement with Weibull theory predictions for tension-internal pressure and compression-internal pressure conditions, but an underestimation of stress state effects in pure torsion. Results indicate a weakening effect of in-flaw-plane tensile stresses, with no observed influence of in-plane compressive stresses.

  5. Weibull Analysis Handbook

    DTIC Science & Technology

    1983-11-01

    The model makes a predesignated number of passes through the life cycle (20 years) and reports the average of the passes by report period. The number ... number of failures expected in the year ahead. Failure times are generated for each member of the fleet using random numbers and the input Weibull ...
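
    A minimal Python sketch of the kind of Monte Carlo pass the handbook describes (the fleet ages and Weibull inputs below are hypothetical, and this is not the handbook's program):

        import numpy as np

        rng = np.random.default_rng(1)

        def failures_next_year(ages, eta, beta, n_pass=1000):
            # Expected fleet failures in the year ahead: draw each
            # unit's failure time from a Weibull conditioned on
            # survival to its current age (inverse-CDF method).
            ages = np.asarray(ages, dtype=float)
            total = 0.0
            for _ in range(n_pass):
                e = rng.exponential(size=ages.size)  # unit-rate exponentials
                t = eta * ((ages / eta) ** beta + e) ** (1.0 / beta)
                total += np.sum(t - ages <= 1.0)
            return total / n_pass

        print(failures_next_year([0.5, 2.0, 3.5, 7.0], eta=8.0, beta=2.5))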

  6. EVALUATION OF SPRING OPERATED RELIEF VALVE MAINTENANCE INTERVALS AND EXTENSION OF MAINTENANCE TIMES USING A WEIBULL ANALYSIS WITH MODIFIED BAYESIAN UPDATING

    SciTech Connect

    Harris, S.; Gross, R.; Mitchell, E.

    2011-01-18

    The Savannah River Site (SRS) spring operated pressure relief valve (SORV) maintenance intervals were evaluated using an approach provided by the American Petroleum Institute (API RP 581) for risk-based inspection technology (RBI). In addition, the impact of extending the inspection schedule was evaluated using Monte Carlo Simulation (MCS). The API RP 581 approach is characterized as a Weibull analysis with modified Bayesian updating provided by SRS SORV proof testing experience. Initial Weibull parameter estimates were updated as per SRS's historical proof test records contained in the Center for Chemical Process Safety (CCPS) Process Equipment Reliability Database (PERD). The API RP 581 methodology was used to estimate the SORV's probability of failing on demand (PFD), and the annual expected risk. The API RP 581 methodology indicates that the current SRS maintenance plan is conservative. Cost savings may be attained in certain mild service applications that present low PFD and overall risk. Current practices are reviewed and recommendations are made for extending inspection intervals. The paper gives an illustration of the inspection costs versus the associated risks by using API RP 581 Risk Based Inspection (RBI) Technology. A cost effective maintenance frequency balancing both financial risk and inspection cost is demonstrated.

  7. A Weibull characterization for tensile fracture of multicomponent brittle fibers

    NASA Technical Reports Server (NTRS)

    Barrows, R. G.

    1977-01-01

    A statistical characterization for multicomponent brittle fibers is presented. The method, which is an extension of usual Weibull distribution procedures, statistically considers the components making up a fiber (e.g., substrate, sheath, and surface) as separate entities and taken together as in a fiber. Tensile data for silicon carbide fiber and for an experimental carbon-boron alloy fiber are evaluated in terms of the proposed multicomponent Weibull characterization.

  8. Finite-size effects on return interval distributions for weakest-link-scaling systems

    NASA Astrophysics Data System (ADS)

    Hristopulos, Dionissios T.; Petrakis, Manolis P.; Kaniadakis, Giorgio

    2014-05-01

    The Weibull distribution is a commonly used model for the strength of brittle materials and earthquake return intervals. Deviations from Weibull scaling, however, have been observed in earthquake return intervals and the fracture strength of quasibrittle materials. We investigate weakest-link scaling in finite-size systems and deviations of empirical return interval distributions from the Weibull distribution function. Our analysis employs the ansatz that the survival probability function of a system with complex interactions among its units can be expressed as the product of the survival probability functions for an ensemble of representative volume elements (RVEs). We show that if the system comprises a finite number of RVEs, it obeys the κ-Weibull distribution. The upper tail of the κ-Weibull distribution declines as a power law, in contrast with Weibull scaling. The hazard rate function of the κ-Weibull distribution decreases linearly after a waiting time τ_c ∝ n^(1/m), where m is the Weibull modulus and n is the system size in terms of representative volume elements. We conduct statistical analysis of experimental data and simulations which show that the κ-Weibull provides competitive fits to the return interval distributions of seismic data and of avalanches in a fiber bundle model. In conclusion, using theoretical and statistical analysis of real and simulated data, we demonstrate that the κ-Weibull distribution is a useful model for extreme-event return intervals in finite-size systems.
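
    The κ-Weibull survival function itself is compact; a minimal Python sketch (parameter values below are arbitrary) shows the Weibull-like body with a power-law upper tail of exponent m/κ, reducing to the ordinary Weibull as κ → 0:

        import numpy as np

        def kappa_exp(x, kappa):
            # kappa-deformed exponential; tends to exp(x) as kappa -> 0
            return (np.sqrt(1.0 + kappa**2 * x**2) + kappa * x) ** (1.0 / kappa)

        def kappa_weibull_sf(t, tau, m, kappa):
            # Survival function: Weibull body, power-law tail ~ t**(-m/kappa)
            return kappa_exp(-(t / tau) ** m, kappa)

        t = np.logspace(-1, 3, 5)
        print(kappa_weibull_sf(t, tau=1.0, m=1.5, kappa=0.2))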

  9. On Weibull's Spectrum of Non-relativistic Energetic Particles at IP Shocks: Observations and Theoretical Interpretation

    NASA Astrophysics Data System (ADS)

    Pallocchia, G.; Laurenza, M.; Consolini, G.

    2017-03-01

    Some interplanetary shocks are associated with short-term and sharp particle flux enhancements near the shock front. Such intensity enhancements, known as shock-spike events (SSEs), represent a class of relatively energetic phenomena as they may extend to energies of some tens of MeV or even beyond. Here we present an SSE case study in order to shed light on the nature of the particle acceleration involved in this kind of event. Our observations refer to an SSE registered on 2011 October 3 at 22:23 UT by STEREO B instrumentation when, at a heliocentric distance of 1.08 au, the spacecraft was swept by a perpendicular shock moving away from the Sun. The main finding from the data analysis is that a Weibull distribution represents a good fitting function to the measured particle spectrum over the energy range from 0.1 to 30 MeV. To interpret this observational result, we provide a theoretical derivation of the Weibull spectrum in the framework of acceleration by “killed” stochastic processes exhibiting power-law growth in time of the velocity expectation, such as the classical Fermi process. We find an overall coherence between the experimental values of the Weibull spectrum parameters and their physical meaning within the above scenario. Hence, our approach based on the Weibull distribution proves to be useful for understanding SSEs. With regard to the present event, we also provide an alternative explanation of the Weibull spectrum in terms of shock-surfing acceleration.

  10. Improved model based on the Weibull distribution to describe the combined effect of pH and temperature on the heat resistance of Bacillus cereus in carrot juice.

    PubMed

    Collado, J; Fernández, A; Cunha, L M; Ocio, M J; Martínez, A

    2003-06-01

    The effect of pH and temperature on the thermal inactivation of different strains of Bacillus cereus was modeled. Inactivation tests were carried out in carrot broth, following a full factorial design at four levels for temperature (from 90 to 105 degrees C, depending on the strain) and pH (6.2, 5.8, 5.2, and 4.7). Individual inactivation curves were analyzed by applying the Weibull model function (with percent discrepancy close to 20% for most cases), and the effects of pH and temperature on the scale parameter (designated D(beta)) and the shape parameter (beta) were also studied. Temperature and pH did not have a significant effect on the shape parameter (beta). The effect of temperature on the scale parameter was modeled by the zeta concept. The scale parameter decreased with pH, although the behavior of the strains was not homogeneous. Two global models with a small number of parameters were developed, providing a satisfactory description of the thermal inactivation of B. cereus, with percent discrepancy ranging from 18 to 25%.

  11. A Weibull statistics-based lignocellulose saccharification model and a built-in parameter accurately predict lignocellulose hydrolysis performance.

    PubMed

    Wang, Mingyu; Han, Lijuan; Liu, Shasha; Zhao, Xuebing; Yang, Jinghua; Loh, Soh Kheang; Sun, Xiaomin; Zhang, Chenxi; Fang, Xu

    2015-09-01

    Renewable energy from lignocellulosic biomass has been deemed an alternative to depleting fossil fuels. In order to improve this technology, we aim to develop robust mathematical models for the enzymatic lignocellulose degradation process. By analyzing 96 groups of previously published and newly obtained lignocellulose saccharification results and fitting them to the Weibull distribution, we discovered that Weibull statistics can accurately predict lignocellulose saccharification data, regardless of the type of substrates, enzymes and saccharification conditions. A mathematical model for enzymatic lignocellulose degradation was subsequently constructed based on Weibull statistics. Further analysis of the mathematical structure of the model and of experimental saccharification data showed the significance of the two parameters in this model. In particular, the λ value, defined as the characteristic time, represents the overall performance of the saccharification system. This suggestion was further supported by statistical analysis of experimental saccharification data and by analysis of the glucose production levels when the λ and n values change. In conclusion, the constructed Weibull statistics-based model can accurately predict lignocellulose hydrolysis behavior, and the λ parameter can be used to assess the overall performance of enzymatic lignocellulose degradation. Advantages and potential applications of the model and the λ value in saccharification performance assessment are discussed.
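
    The model described reduces to a saturating Weibull time course; a minimal Python fitting sketch with hypothetical yield data (the numbers are illustrative, not from the 96 data sets) is:

        import numpy as np
        from scipy.optimize import curve_fit

        def weibull_yield(t, ymax, lam, n):
            # hydrolysis yield rising to ymax; lam is the characteristic time
            return ymax * (1.0 - np.exp(-(t / lam) ** n))

        t = np.array([2, 6, 12, 24, 48, 72, 96])   # hydrolysis time, h
        y = np.array([8, 20, 33, 50, 66, 73, 76])  # glucose yield, %

        (ymax, lam, n), _ = curve_fit(weibull_yield, t, y, p0=(80.0, 30.0, 1.0))
        print(f"ymax = {ymax:.1f}%, lambda = {lam:.1f} h, n = {n:.2f}")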

  12. A Weibull characterization for tensile fracture of multicomponent brittle fibers

    NASA Technical Reports Server (NTRS)

    Barrows, R. G.

    1977-01-01

    Necessary to the development and understanding of brittle fiber reinforced composites is a means to statistically describe fiber strength and strain-to-failure behavior. A statistical characterization for multicomponent brittle fibers is presented. The method, which is an extension of usual Weibull distribution procedures, statistically considers the components making up a fiber (e.g., substrate, sheath, and surface) as separate entities and taken together as in a fiber. Tensile data for silicon carbide fiber and for an experimental carbon-boron alloy fiber are evaluated in terms of the proposed multicomponent Weibull characterization.

  13. Empirical model based on Weibull distribution describing the destruction kinetics of natural microbiota in pineapple (Ananas comosus L.) puree during high-pressure processing.

    PubMed

    Chakraborty, Snehasis; Rao, Pavuluri Srinivasa; Mishra, Hari Niwas

    2015-10-15

    High pressure inactivation of natural microbiota viz. aerobic mesophiles (AM), psychrotrophs (PC), yeasts and molds (YM), total coliforms (TC) and lactic acid bacteria (LAB) in pineapple puree was studied within the experimental domain of 0.1-600 MPa and 30-50 °C with a treatment time up to 20 min. A complete destruction of yeasts and molds was obtained at 500 MPa/50 °C/15 min; whereas no counts were detected for TC and LAB at 300 MPa/30 °C/15 min. A maximum of two log cycle reductions was obtained for YM during pulse pressurization at the severe process intensity of 600 MPa/50 °C/20 min. The Weibull model clearly described the non-linearity of the survival curves during the isobaric period. The tailing effect, as confirmed by the shape parameter (β) of the survival curve, was obtained in case of YM (β<1); whereas a shouldering effect (β>1) was observed for the other microbial groups. Analogous to thermal death kinetics, the activation energy (Ea, kJ·mol(-1)) and the activation volume (Va, mL·mol(-1)) values were computed further to describe the temperature and pressure dependencies of the scale parameter (δ, min), respectively. A higher δ value was obtained for each microbe at a lower temperature and it decreased with an increase in pressure. A secondary kinetic model was developed describing the inactivation rate (k, min(-1)) as a function of pressure (P, MPa) and temperature (T, K) including the dependencies of Ea and Va on P and T, respectively.

  14. An Extension to the Weibull Process Model

    DTIC Science & Technology

    1981-11-01

    An Extension to the Weibull Process Model. [Report documentation page fragment] ... indicating its importance to applications. 1. INTRODUCTION: Recent papers by Bain and Engelhardt (1980) and Crow ...

  15. Calculation of Weibull strength parameters and Batdorf flaw-density constants for volume- and surface-flaw-induced fracture in ceramics

    NASA Technical Reports Server (NTRS)

    Pai, Shantaram S.; Gyekenyesi, John P.

    1988-01-01

    The calculation of shape and scale parameters of the two-parameter Weibull distribution is described using the least-squares analysis and maximum likelihood methods for volume- and surface-flaw-induced fracture in ceramics with complete and censored samples. Detailed procedures are given for evaluating 90 percent confidence intervals for maximum likelihood estimates of shape and scale parameters, the unbiased estimates of the shape parameters, and the Weibull mean values and corresponding standard deviations. Furthermore, the necessary steps are described for detecting outliers and for calculating the Kolmogorov-Smirnov and the Anderson-Darling goodness-of-fit statistics and 90 percent confidence bands about the Weibull distribution. It is also shown how to calculate the Batdorf flaw-density constants by using the Weibull distribution statistical parameters. The techniques described were verified with several example problems from the open literature, and were coded in the Structural Ceramics Analysis and Reliability Evaluation (SCARE) design program.
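
    For the maximum likelihood part, the profiled shape equation for a complete two-parameter Weibull sample reduces to a one-dimensional root-finding problem; a minimal Python sketch (not the SCARE implementation, and without the censoring and confidence-interval machinery) is:

        import numpy as np
        from scipy.optimize import brentq

        def weibull_mle(x):
            # Solve dlogL/dm = 0 with the scale profiled out:
            # sum(x^m ln x)/sum(x^m) - 1/m = mean(ln x)
            x = np.asarray(x, dtype=float)
            lx = np.log(x)

            def g(m):
                w = x ** m
                return (w * lx).sum() / w.sum() - 1.0 / m - lx.mean()

            m = brentq(g, 0.05, 50.0)         # shape
            c = (x ** m).mean() ** (1.0 / m)  # scale (characteristic value)
            return m, c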

  16. Shallow Flaws Under Biaxial Loading Conditions, Part II: Application of a Weibull Stress Analysis of the Cruciform Bend Specimen Using a Hydrostatic Stress Criterion

    SciTech Connect

    Bass, B.R.; McAfee, W.J.; Williams, P.T.

    1999-08-01

    Cruciform beam fracture mechanics specimens have been developed in the Heavy Section Steel Technology (HSST) Program at Oak Ridge National Laboratory (ORNL) to introduce a prototypic, far-field, out-of-plane biaxial bending stress component in the test section that approximates the nonlinear biaxial stresses resulting from pressurized-thermal-shock or pressure-temperature loading of a nuclear reactor pressure vessel (RPV). Matrices of cruciform beam tests were developed to investigate and quantify the effects of temperature, biaxial loading, and specimen size on fracture initiation toughness of two-dimensional (constant depth), shallow, surface flaws. Tests were conducted under biaxial load ratios ranging from uniaxial to equibiaxial. These tests demonstrated that biaxial loading can have a pronounced effect on shallow-flaw fracture toughness in the lower transition temperature region for RPV materials. Two- and three-parameter Weibull models have been calibrated using a new scheme (developed at the University of Illinois) that maps toughness data from test specimens with distinctly different levels of crack-tip constraint to a small-scale-yielding (SSY) Weibull stress space. These models, using the new hydrostatic stress criterion in place of the more commonly used maximum principal stress in the kernel of the σW integral definition, have been shown to correlate the experimentally observed biaxial effect in cruciform specimens, thereby providing a scaling mechanism between uniaxial and biaxial loading states.

  17. A Modified Cramer-von Mises and Anderson-Darling Test for the Weibull Distribution with Unknown Location and Scale Parameters.

    DTIC Science & Technology

    1981-12-01

    [List-of-figures fragment: plotting positions versus A² or W² statistics; shape versus W² critical values at level 0.20 for n = 20, 25, and 30.] The formula to calculate W² is given by Eq (4). Letting x(1), x(2), ..., x(n) be the n order statistics and letting Ui = F(xi), the cumulative distribution ...

  18. Calculation of Weibull strength parameters, Batdorf flaw density constants and related statistical quantities using PC-CARES

    NASA Technical Reports Server (NTRS)

    Szatmary, Steven A.; Gyekenyesi, John P.; Nemeth, Noel N.

    1990-01-01

    This manual describes the operation and theory of the PC-CARES (Personal Computer-Ceramic Analysis and Reliability Evaluation of Structures) computer program for the IBM PC and compatibles running the PC-DOS/MS-DOS or IBM/MS OS/2 (version 1.1 or higher) operating systems. The primary purpose of this code is to estimate Weibull material strength parameters, the Batdorf crack density coefficient, and other related statistical quantities. Included in the manual is the description of the calculation of shape and scale parameters of the two-parameter Weibull distribution using the least-squares analysis and maximum likelihood methods for volume- and surface-flaw-induced fracture in ceramics with complete and censored samples. The methods for detecting outliers and for calculating the Kolmogorov-Smirnov and the Anderson-Darling goodness-of-fit statistics and 90 percent confidence bands about the Weibull line, as well as the techniques for calculating the Batdorf flaw-density constants, are also described.

  19. Bias in the Weibull Strength Estimation of a SiC Fiber for the Small Gauge Length Case

    NASA Astrophysics Data System (ADS)

    Morimoto, Tetsuya; Nakagawa, Satoshi; Ogihara, Shinji

    It is known that the single-modal Weibull model describes well the size effect on brittle fiber tensile strength. However, for some ceramic fibers the single-modal Weibull model has been reported to give biased estimates of the gauge-length dependence. One hypothesis for the bias is that the density of critical defects is very small, so that the fracture probability of small gauge length samples is distributed in a discrete manner, which makes the Weibull parameters dependent on the gauge length. Tyranno ZMI Si-Zr-C-O fiber was selected as an example fiber. Tensile tests were performed at several gauge lengths. The derived Weibull parameters showed a dependence on the gauge length. Fracture surfaces were observed with SEM and classified into characteristic fracture patterns. The percentage of each fracture pattern was also found to be dependent on the gauge length. This may be an important factor in the dependence of the Weibull parameters on the gauge length.

  20. Transmission overhaul and replacement predictions using Weibull and renewal theory

    NASA Technical Reports Server (NTRS)

    Savage, M.; Lewicki, D. G.

    1989-01-01

    A method to estimate the frequency of transmission overhauls is presented. This method is based on the two-parameter Weibull statistical distribution for component life. A second method is presented to estimate the number of replacement components needed to support the transmission overhaul pattern. The second method is based on renewal theory. Confidence statistics are applied with both methods to improve the statistical estimate of sample behavior. A transmission example is also presented to illustrate the use of the methods. Transmission overhaul frequency and component replacement calculations are included in the example.
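
    The renewal-theory step can be approximated by simulation; a minimal Python sketch of the expected number of replacements per socket over an overhaul horizon, assuming hypothetical Weibull parameters, is:

        import numpy as np

        rng = np.random.default_rng(7)

        def expected_renewals(eta, beta, horizon, n_sim=20000):
            # Monte Carlo renewal function for a Weibull component:
            # count replacements until the horizon is passed.
            counts = np.zeros(n_sim)
            for i in range(n_sim):
                t, k = 0.0, 0
                while True:
                    t += eta * rng.weibull(beta)  # next life draw
                    if t > horizon:
                        break
                    k += 1
                counts[i] = k
            return counts.mean()

        print(expected_renewals(eta=3000.0, beta=2.0, horizon=10000.0))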

  1. Outage performance of multihop free-space optical communication system over exponentiated Weibull fading channels with nonzero boresight pointing errors

    NASA Astrophysics Data System (ADS)

    Liu, Xiao-xia; Wang, Ping; Cao, Tian

    2016-09-01

    The outage performance of the multihop free-space optical (FSO) communication system with decode-and-forward (DF) protocol is studied by considering the joint effects of nonzero boresight pointing errors and atmospheric turbulence modeled by exponentiated Weibull (EW) distribution. The closed-form analytical expression of outage probability is derived, and the results are validated through Monte Carlo simulation. Furthermore, the detailed analysis is provided to evaluate the impacts of turbulence strength, receiver aperture size, boresight displacement, beamwidth and number of relays on the outage performance for the studied system.

  2. Regional flood quantile estimation for a Weibull Model

    NASA Astrophysics Data System (ADS)

    Boes, Duane C.; Heo, Jun-Haeng; Salas, Jose D.

    1989-05-01

    Estimation of annual flood quantiles at a given site, based on a regional Weibull model with independence in space and time, is considered. A common shape parameter over sites, motivated by an index flood assumption, was assumed. An exact simple formula for the Cramer-Rao lower bound for the variance of unbiased estimators of the quantile is obtained, and the gain of regional flood frequency analysis over single-site analysis can be quantified via this formula. The estimation techniques of the method of moments, the method of probability-weighted moments, and the method of maximum likelihood are compared.

  3. Probability density function characterization for aggregated large-scale wind power based on Weibull mixtures

    DOE PAGES

    Gomez-Lazaro, Emilio; Bueso, Maria C.; Kessler, Mathieu; ...

    2016-02-02

    Here, the Weibull probability distribution has been widely applied to characterize wind speeds for wind energy resources. Wind power generation modeling is different, however, due in particular to power curve limitations, wind turbine control methods, and transmission system operation requirements. These differences are even greater for aggregated wind power generation in power systems with high wind penetration. Consequently, models based on one-Weibull component can provide poor characterizations for aggregated wind power generation. With this aim, the present paper focuses on discussing Weibull mixtures to characterize the probability density function (PDF) for aggregated wind power generation. PDFs of wind power data are firstly classified attending to hourly and seasonal patterns. The selection of the number of components in the mixture is analyzed through two well-known different criteria: the Akaike information criterion (AIC) and the Bayesian information criterion (BIC). Finally, the optimal number of Weibull components for maximum likelihood is explored for the defined patterns, including the estimated weight, scale, and shape parameters. Results show that multi-Weibull models are more suitable to characterize aggregated wind power data due to the impact of distributed generation, variety of wind speed values and wind power curtailment.

  4. Probability density function characterization for aggregated large-scale wind power based on Weibull mixtures

    SciTech Connect

    Gomez-Lazaro, Emilio; Bueso, Maria C.; Kessler, Mathieu; Martin-Martinez, Sergio; Zhang, Jie; Hodge, Bri -Mathias; Molina-Garcia, Angel

    2016-02-02

    Here, the Weibull probability distribution has been widely applied to characterize wind speeds for wind energy resources. Wind power generation modeling is different, however, due in particular to power curve limitations, wind turbine control methods, and transmission system operation requirements. These differences are even greater for aggregated wind power generation in power systems with high wind penetration. Consequently, models based on one-Weibull component can provide poor characterizations for aggregated wind power generation. With this aim, the present paper focuses on discussing Weibull mixtures to characterize the probability density function (PDF) for aggregated wind power generation. PDFs of wind power data are firstly classified attending to hourly and seasonal patterns. The selection of the number of components in the mixture is analyzed through two well-known different criteria: the Akaike information criterion (AIC) and the Bayesian information criterion (BIC). Finally, the optimal number of Weibull components for maximum likelihood is explored for the defined patterns, including the estimated weight, scale, and shape parameters. Results show that multi-Weibull models are more suitable to characterize aggregated wind power data due to the impact of distributed generation, variety of wind speed values and wind power curtailment.
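
    One way to fit such a Weibull mixture is an EM loop whose M-step maximizes each component's responsibility-weighted Weibull likelihood numerically; the Python sketch below (not the authors' procedure; model selection by AIC/BIC would wrap calls with different n_comp) works in log-parameters for stability:

        import numpy as np
        from scipy import stats
        from scipy.optimize import minimize

        def fit_weibull_mixture(x, n_comp=2, n_iter=100):
            x = np.asarray(x, dtype=float)
            starts = np.quantile(x, np.linspace(0.2, 0.8, n_comp))
            params = [(1.5, s) for s in starts]   # (shape, scale) starts
            w = np.full(n_comp, 1.0 / n_comp)     # mixture weights
            for _ in range(n_iter):
                dens = np.array([wj * stats.weibull_min.pdf(x, c, 0, s)
                                 for wj, (c, s) in zip(w, params)])
                r = dens / dens.sum(axis=0)       # responsibilities
                w = r.mean(axis=1)
                new_params = []
                for j in range(n_comp):
                    def nll(p, rj=r[j]):
                        c, s = np.exp(p)          # keep parameters positive
                        return -(rj * stats.weibull_min.logpdf(x, c, 0, s)).sum()
                    res = minimize(nll, np.log(params[j]), method="Nelder-Mead")
                    new_params.append(tuple(np.exp(res.x)))
                params = new_params
            return w, params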

  5. Distributed analysis at LHCb

    NASA Astrophysics Data System (ADS)

    Williams, Mike; Egede, Ulrik; Paterson, Stuart; LHCb Collaboration

    2011-12-01

    The distributed analysis experience to date at LHCb has been positive: job success rates are high and wait times for high-priority jobs are low. LHCb users access the grid using the GANGA job-management package, while the LHCb virtual organization manages its resources using the DIRAC package. This clear division of labor has benefitted LHCb and its users greatly; it is a major reason why distributed analysis at LHCb has been so successful. The newly formed LHCb distributed analysis support team has also proved to be a success.

  6. Evaluation of prognostic factors effect on survival time in patients with colorectal cancer, based on Weibull Competing-Risks Model

    PubMed Central

    Moamer, Soraya; Baghestani, Ahmadreza; Pourhoseingholi, Mohamad Amin; Hajizadeh, Nastaran; Ahmadi, Farzaneh; Norouzinia, Mohsen

    2017-01-01

    Aim: The aim of this study was to assess the association between survival of patients with colorectal cancer and prognostic factors in a competing risks parametric model using the Weibull distribution. Background: The prognosis of colorectal cancer is relatively good in terms of survival time. In many prognostic studies, patients may be exposed to several types of competing events. These different causes of death are called competing risks. Methods: Data were recorded from 372 patients with colorectal cancer who registered at the Institute for Gastroenterology and Liver Diseases, Shahid Beheshti University of Medical Sciences (Tehran, Iran) from 2004 to 2015 in a retrospective study. Analysis was performed using a competing risks model and the Weibull distribution. The software used for data analysis was R, and the significance level was set at 0.05. Results: The results indicated that, at the end of follow-up, 111 (29.8%) deaths were from colorectal cancer and 14 (3.8%) deaths were due to other diseases. The average body mass index (BMI) was 24.61 (SD 3.98). The mean survival time of the 372 patients was 62.05 (SD 48.78) months, with a median of 48 months. According to the competing-risks method, only stage III (HR 1.69; 95% CI 1.246-2.315), stage IV (HR 4.51; 95% CI 2.91-6.99) and BMI (HR 0.96; 95% CI 0.96-0.975) had a significant effect on patients' survival time. Conclusion: This study identified pathologic stage (III, IV) and BMI as prognostic factors, using a Weibull model with competing-risks analysis; models that ignore the competing events may yield additional significant predictors due to overestimation.

  7. Distributed analysis in ATLAS

    NASA Astrophysics Data System (ADS)

    Dewhurst, A.; Legger, F.

    2015-12-01

    The ATLAS experiment accumulated more than 140 PB of data during the first run of the Large Hadron Collider (LHC) at CERN. The analysis of such an amount of data is a challenging task for the distributed physics community. The Distributed Analysis (DA) system of the ATLAS experiment is an established and stable component of the ATLAS distributed computing operations. About half a million user jobs are running daily on DA resources, submitted by more than 1500 ATLAS physicists. The reliability of the DA system during the first run of the LHC and the following shutdown period has been high thanks to the continuous automatic validation of the distributed analysis sites and the user support provided by a dedicated team of expert shifters. During the LHC shutdown, the ATLAS computing model has undergone several changes to improve the analysis workflows, including the re-design of the production system, a new analysis data format and event model, and the development of common reduction and analysis frameworks. We report on the impact such changes have on the DA infrastructure, describe the new DA components, and include recent performance measurements.

  8. Free-space communications over exponentiated Weibull turbulence channels with nonzero boresight pointing errors.

    PubMed

    Yi, Xiang; Yao, Mingwu

    2015-02-09

    In this paper, we present analytical expressions for the performance of urban free-space optical (FSO) communication systems under the combined influence of atmospheric turbulence- and misalignment-induced fading (pointing errors). The atmospheric turbulence channel is modeled by the exponentiated Weibull (EW) distribution that can accurately describe the probability density function (PDF) of the irradiance fluctuations associated with a transmitted Gaussian-beam wave and a finite-sized receiving aperture. The nonzero boresight pointing error PDF model, which is recently proposed for considering the effects of both boresight and jitter, is adopted in analysis. We derive a novel expression for the composite PDF in terms of a convergent double series involving a Meijer's G-function. Based on the statistical results mentioned above, exact expressions for the average bit error rate of on-off keying modulation scheme and the outage probability are developed. To provide more insight, we also perform an asymptotic error rate analysis at high average signal-to-noise ratio. Our analytical results indicate that the diversity gain for the zero boresight case is determined only by the ratio between the equivalent beamwidth at the receiver and the jitter standard deviation, while for the nonzero boresight case, the diversity gain is related to the ratio of the equivalent beamwidth to the jitter variance as well as the parameter of the EW distribution.

  9. Distributions of personal VOC exposures: a population-based analysis.

    PubMed

    Jia, Chunrong; D'Souza, Jennifer; Batterman, Stuart

    2008-10-01

    Information regarding the distribution of volatile organic compound (VOC) concentrations and exposures is scarce, and there have been few, if any, studies using population-based samples from which representative estimates can be derived. This study characterizes distributions of personal exposures to ten different VOCs in the U.S. measured in the 1999-2000 National Health and Nutrition Examination Survey (NHANES). Personal VOC exposures were collected for 669 individuals over 2-3 days, and measurements were weighted to derive national-level statistics. Four common exposure sources were identified using factor analyses: gasoline vapor and vehicle exhaust, methyl tert-butyl ether (MTBE) as a gasoline additive, tap water disinfection products, and household cleaning products. Benzene, toluene, ethyl benzene, xylenes, chloroform, and tetrachloroethene were fit to log-normal distributions with reasonably good agreement to observations. 1,4-Dichlorobenzene and trichloroethene were fit to Pareto distributions, and MTBE to a Weibull distribution, but agreement was poor. However, distributions that attempt to match all of the VOC exposure data can lead to incorrect conclusions regarding the level and frequency of the higher exposures. Maximum Gumbel distributions gave generally good fits to extrema; however, they could not fully represent the highest exposures of the NHANES measurements. The analysis suggests that complete models for the distribution of VOC exposures require an approach that combines standard and extreme value distributions, and that carefully identifies outliers. This is the first study to provide national-level and representative statistics regarding VOC exposures, and its results have important implications for risk assessment and probabilistic analyses.

  10. Aerospace Applications of Weibull and Monte Carlo Simulation with Importance Sampling

    NASA Technical Reports Server (NTRS)

    Bavuso, Salvatore J.

    1998-01-01

    Recent developments in reliability modeling and computer technology have made it practical to use the Weibull time to failure distribution to model the system reliability of complex fault-tolerant computer-based systems. These system models are becoming increasingly popular in space systems applications as a result of mounting data that support the decreasing Weibull failure distribution and the expectation of increased system reliability. This presentation introduces the new reliability modeling developments and demonstrates their application to a novel space system application. The application is a proposed guidance, navigation, and control (GN&C) system for use in a long duration manned spacecraft for a possible Mars mission. Comparisons to the constant failure rate model are presented and the ramifications of doing so are discussed.

  11. The distribution of first-passage times and durations in FOREX and future markets

    NASA Astrophysics Data System (ADS)

    Sazuka, Naoya; Inoue, Jun-ichi; Scalas, Enrico

    2009-07-01

    Possible distributions are discussed for intertrade durations and first-passage processes in financial markets. The view-point of renewal theory is assumed. In order to represent market data with relatively long durations, two types of distributions are used, namely a distribution derived from the Mittag-Leffler survival function and the Weibull distribution. For the Mittag-Leffler type distribution, the average waiting time (residual life time) is strongly dependent on the choice of a cut-off parameter tmax, whereas the results based on the Weibull distribution do not depend on such a cut-off. Therefore, a Weibull distribution is more convenient than a Mittag-Leffler type if one wishes to evaluate relevant statistics such as average waiting time in financial markets with long durations. On the other hand, we find that the Gini index is rather independent of the cut-off parameter. Based on the above considerations, we propose a good candidate for describing the distribution of first-passage time in a market: The Weibull distribution with a power-law tail. This distribution compensates the gap between theoretical and empirical results more efficiently than a simple Weibull distribution. It should be stressed that a Weibull distribution with a power-law tail is more flexible than the Mittag-Leffler distribution, which itself can be approximated by a Weibull distribution and a power-law. Indeed, the key point is that in the former case there is freedom of choice for the exponent of the power-law attached to the Weibull distribution, which can exceed 1 in order to reproduce decays faster than possible with a Mittag-Leffler distribution. We also give a useful formula to determine an optimal crossover point minimizing the difference between the empirical average waiting time and the one predicted from renewal theory. Moreover, we discuss the limitation of our distributions by applying our distribution to the analysis of the BTP future and calculating the average waiting

  12. An EWMA chart for sample range of Weibull data using weighted variance method

    NASA Astrophysics Data System (ADS)

    Atta, Abdu. M. A.; Yahaya, Sharipah Soaad Syed; Zain, Zakiyah; Ali, Hazlina

    2016-10-01

    This article proposes a new EWMA chart for monitoring process standard deviation, or dispersion, based on the sample range of Weibull data using the weighted variance (WV) method; the chart is hereafter called the weighted variance EWMA sample range (WV-EWMASR) chart. The proposed WV-EWMASR chart is compared with the standard EWMASR chart of [7], the skewness correction R chart (SC-R) suggested by [3], and the weighted variance R chart (WV-R) proposed by [2], in terms of Type I and Type II errors when the data are generated from a Weibull distribution. Optimal parameters λ and k of the proposed WV-EWMASR and standard EWMASR charts are obtained via simulation using SAS 9.4. The proposed WV-EWMASR control chart reduces to the standard EWMASR control chart of [7] when the process follows a symmetric distribution. For Weibull data, the proposed WV-EWMASR control chart has a smaller Type I error than the standard EWMASR, SC-R, and WV-R control charts. In terms of Type II error, the proposed WV-EWMASR control chart is closer to the EWMA chart with exact limits than the standard EWMASR chart of [7].
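
    The EWMA recursion underlying such a chart is simple; the control limits, which the weighted variance method splits asymmetrically around the mean to account for skewness, are paper-specific and omitted here. A minimal sketch with an assumed smoothing constant:

        import numpy as np

        def ewma_sample_range(subgroups, lam=0.2):
            """EWMA of subgroup sample ranges: z_i = lam*R_i + (1 - lam)*z_{i-1},
            with z_0 set to the mean range, a common start-up choice."""
            R = np.ptp(subgroups, axis=1)      # sample range of each subgroup
            z = np.empty(len(R), dtype=float)
            prev = R.mean()
            for i, r in enumerate(R):
                prev = lam * r + (1.0 - lam) * prev
                z[i] = prev
            return z

        # Points are plotted against upper/lower limits; the weighted variance
        # method sets those limits asymmetrically, using the probability that an
        # observation falls below the process mean, to accommodate skewed data.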

  14. Biological implications of the Weibull and Gompertz models of aging.

    PubMed

    Ricklefs, Robert E; Scheuerlein, Alex

    2002-02-01

    Gompertz and Weibull functions imply contrasting biological causes of demographic aging. The terms describing increasing mortality with age are multiplicative and additive, respectively, which could result from an increase in the vulnerability of individuals to extrinsic causes in the Gompertz model and the predominance of intrinsic causes at older ages in the Weibull model. Experiments that manipulate extrinsic mortality can distinguish these biological models. To facilitate analyses of experimental data, we defined a single index for the rate of aging (omega) for the Weibull and Gompertz functions. Each function described the increase in aging-related mortality in simulated ages at death reasonably well. However, in contrast to the Weibull omega(W), the Gompertz omega(G) was sensitive to variation in the initial mortality rate independently of aging-related mortality. Comparisons between wild and captive populations appear to support the intrinsic-causes model for birds, but give mixed support for both models in mammals.
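
    The multiplicative/additive contrast can be made explicit. These are the standard Gompertz and Weibull aging-mortality forms, with notation assumed here: an initial mortality rate m_0, and an aging term that scales m_0 in one case and adds to it in the other.

        m_G(t) = m_0 \, e^{\gamma t}        % Gompertz: aging multiplies m_0
        m_W(t) = m_0 + \alpha t^{\beta}     % Weibull: aging adds to m_0, independently of it

    This is why the Gompertz rate-of-aging index is sensitive to the initial mortality rate, while the Weibull index is not.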

  15. Lifetime assessment by intermittent inspection under the mixture Weibull power law model with application to XLPE cables.

    PubMed

    Hirose, H

    1997-01-01

    This paper proposes a new treatment of electrical insulation degradation. Insulation that has been used under various circumstances is considered to degrade at rates that depend on its stress conditions. The cross-linked polyethylene (XLPE) insulated cables inspected by major Japanese electric companies clearly indicate such phenomena. By assuming that each inspected specimen is sampled from one of several clustered groups, a mixed degradation model can be constructed. Since the degradation of insulation under common circumstances is considered to follow a Weibull distribution, a mixture model and a Weibull power law can be combined; this is called the mixture Weibull power law model. Applying maximum likelihood estimation of the proposed model to Japanese 22 and 33 kV insulation class cables, the cables are clustered into a certain number of groups using the AIC and the generalized likelihood ratio test method. The reliability of the cables at specified years is then assessed.

  16. Comparison of Weibull strength parameters from flexure and spin tests of brittle materials

    NASA Technical Reports Server (NTRS)

    Holland, Frederic A., Jr.; Zaretsky, Erwin V.

    1991-01-01

    Fracture data from five series of four-point bend tests of beams and spin tests of flat annular disks were reanalyzed. Silicon nitride and graphite were the test materials. The experimental fracture strengths of the disks were compared with the predicted strengths based on both volume flaw and surface flaw analyses of four-point bend data. Volume flaw analysis resulted in a better correlation between disks and beams in three of the five test series than did surface flaw analysis. The Weibull moduli and characteristic gage strengths for the disks and beams were also compared. Differences in the experimental Weibull slopes were not statistically significant. It was shown that results from the beam tests can predict the fracture strength of rotating disks.

  17. Effect of Individual Component Life Distribution on Engine Life Prediction

    NASA Technical Reports Server (NTRS)

    Zaretsky, Erwin V.; Hendricks, Robert C.; Soditus, Sherry M.

    2003-01-01

    The effect of individual engine component life distributions on engine life prediction was determined. A Weibull-based life and reliability analysis of the NASA Energy Efficient Engine was conducted. The engine's life at 95 and 99.9 percent probabilities of survival was determined based upon the engine manufacturer's original life calculations and assumed values of each component's cumulative life distribution as represented by a Weibull slope. The lives of the high-pressure turbine (HPT) disks and blades were also evaluated individually and as a system in a similar manner. Knowing the statistical cumulative distribution of each engine component with reasonable engineering certainty is a condition precedent to predicting the life and reliability of an entire engine. The life of a system at a given reliability will be less than the life of the lowest-lived component in the system at the same reliability (probability of survival). Where the Weibull slopes of all the engine components are equal, the Weibull slope had a minimal effect on engine L0.1 life prediction. However, at a probability of survival of 95 percent (L5 life), life decreased with increasing Weibull slope.
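
    For a series system of independent Weibull components, the system life at a given reliability follows from multiplying the component survival probabilities. A minimal sketch under that assumption (function and parameter names are ours, not from the report):

        import numpy as np

        def system_life(etas, betas, reliability):
            """Time at which a series system of independent Weibull components
            reaches the given probability of survival, i.e. the root of
            prod_i exp(-(t/eta_i)**beta_i) = reliability."""
            etas = np.asarray(etas, dtype=float)
            betas = np.asarray(betas, dtype=float)
            target = -np.log(reliability)
            f = lambda t: np.sum((t / etas) ** betas) - target
            lo, hi = 1e-9, float(etas.max())
            while f(hi) < 0:           # expand until the root is bracketed
                hi *= 2.0
            for _ in range(100):       # bisection
                mid = 0.5 * (lo + hi)
                lo, hi = (mid, hi) if f(mid) < 0 else (lo, mid)
            return 0.5 * (lo + hi)

        # e.g. the L5 life (95 percent survival) of a three-component system:
        # system_life([1.0e4, 2.0e4, 1.5e4], [1.5, 2.0, 3.0], 0.95)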

  18. Incorporating finite element analysis into component life and reliability

    NASA Technical Reports Server (NTRS)

    August, Richard; Zaretsky, Erwin V.

    1991-01-01

    A method for calculating a component's design survivability by incorporating finite element analysis and probabilistic material properties was developed. The method evaluates design parameters through direct comparisons of component survivability expressed in terms of Weibull parameters. The analysis was applied to a rotating disk with mounting bolt holes. The highest probability of failure occurred at, or near, the maximum shear stress region of the bolt holes. Distribution of failure as a function of Weibull slope affects the probability of survival. Where Weibull parameters are unknown for a rotating disk, it may be permissible to assume Weibull parameters, as well as the stress-life exponent, in order to determine the disk speed where the probability of survival is highest.

  19. On the q-type distributions

    NASA Astrophysics Data System (ADS)

    Nadarajah, Saralees; Kotz, Samuel

    2007-04-01

    Various q-type distributions have appeared in the physics literature in recent years; see, e.g., L.C. Malacarne, R.S. Mendes, E.K. Lenzi, q-exponential distribution in urban agglomeration, Phys. Rev. E 65 (2002) 017106; S.M.D. Queiros, On a possible dynamical scenario leading to a generalised Gamma distribution, xxx.lanl.gov-physics/0411111; U.M.S. Costa, V.N. Freire, L.C. Malacarne, R.S. Mendes, S. Picoli Jr., E.A. de Vasconcelos, E.F. da Silva Jr., An improved description of the dielectric breakdown in oxides based on a generalized Weibull distribution, Physica A 361 (2006) 215; S. Picoli Jr., R.S. Mendes, L.C. Malacarne, q-exponential, Weibull, and q-Weibull distributions: an empirical analysis, Physica A 324 (2003) 678-688; A.M.C. de Souza, C. Tsallis, Student's t- and r-distributions: unified derivation from an entropic variational principle, Physica A 236 (1997) 52-57. It is pointed out in the paper that many of these are the same as, or particular cases of, distributions long known in the statistics literature. Several of these statistical distributions are discussed and references provided. We feel that this paper could be of assistance for modeling problems of the type considered in the works cited above.

  20. Weibull Statistics for Upper Ocean Currents with the Fokker-Planck Equation

    NASA Astrophysics Data System (ADS)

    Chu, P. C.

    2012-12-01

    The upper ocean typically exhibits a surface mixed layer with a thickness of a few to several hundred meters. This mixed layer is a key component in studies of climate, biological productivity, and marine pollution. It is the link between the atmosphere and the deep ocean and directly affects the air-sea exchange of heat, momentum, and gases. Vertically averaged horizontal currents across the mixed layer are driven by the residual between the Ekman transport and surface wind stress, and damped by Rayleigh friction. A set of stochastic differential equations is derived for the two components of the current vector (u, v). The joint probability distribution function of (u, v) satisfies the Fokker-Planck equation (Chu, 2008, 2009), with the Weibull distribution as the solution for the current speed. To test this, the PDF of upper (0-50 m) tropical Pacific current speeds (w) was calculated from hourly ADCP data (1990-2007) at six stations of the Tropical Atmosphere Ocean project. It satisfies the two-parameter Weibull distribution reasonably well, with different characteristics between El Nino and La Nina events: in the western Pacific, the PDF of w has a larger peakedness during La Nina events than during El Nino events, and vice versa in the eastern Pacific. However, the PDF of w for the lower layer (100-200 m) does not fit the Weibull distribution as well as that of the upper layer. This is due to the different stochastic differential equations between the upper and lower layers in the tropical Pacific. For the upper layer, the stochastic differential equations, established on the basis of Ekman dynamics, have an analytical solution, the Rayleigh distribution (the simplest form of the Weibull distribution), for constant eddy viscosity K. Knowledge of the PDF of w during El Nino and La Nina events will improve ensemble horizontal flux calculations, which contribute to climate studies. Besides, the Weibull distribution is also identified from the

  1. The effect of wall thickness distribution on mechanical reliability and strength in unidirectional porous ceramics.

    PubMed

    Seuba, Jordi; Deville, Sylvain; Guizard, Christian; Stevenson, Adam J

    2016-01-01

    Macroporous ceramics exhibit an intrinsic strength variability caused by the random distribution of defects in their structure. However, the precise role of microstructural features, other than pore volume, on reliability is still unknown. Here, we analyze the applicability of the Weibull analysis to unidirectional macroporous yttria-stabilized-zirconia (YSZ) prepared by ice-templating. First, we performed crush tests on samples with controlled microstructural features with the loading direction parallel to the porosity. The compressive strength data were fitted using two different fitting techniques, ordinary least squares and Bayesian Markov Chain Monte Carlo, to evaluate whether Weibull statistics are an adequate descriptor of the strength distribution. The statistical descriptors indicated that the strength data are well described by the Weibull statistical approach, for both fitting methods used. Furthermore, we assess the effect of different microstructural features (volume, size, densification of the walls, and morphology) on Weibull modulus and strength. We found that the key microstructural parameter controlling reliability is wall thickness. In contrast, pore volume is the main parameter controlling the strength. The highest Weibull modulus (m=13.2) and mean strength (198.2 MPa) were obtained for the samples with the smallest and narrowest wall thickness distribution (3.1 μm) and lower pore volume (54.5%).

  2. The effect of wall thickness distribution on mechanical reliability and strength in unidirectional porous ceramics

    PubMed Central

    Seuba, Jordi; Deville, Sylvain; Guizard, Christian; Stevenson, Adam J.

    2016-01-01

    Macroporous ceramics exhibit an intrinsic strength variability caused by the random distribution of defects in their structure. However, the precise role of microstructural features, other than pore volume, on reliability is still unknown. Here, we analyze the applicability of the Weibull analysis to unidirectional macroporous yttria-stabilized-zirconia (YSZ) prepared by ice-templating. First, we performed crush tests on samples with controlled microstructural features with the loading direction parallel to the porosity. The compressive strength data were fitted using two different fitting techniques, ordinary least squares and Bayesian Markov Chain Monte Carlo, to evaluate whether Weibull statistics are an adequate descriptor of the strength distribution. The statistical descriptors indicated that the strength data are well described by the Weibull statistical approach, for both fitting methods used. Furthermore, we assess the effect of different microstructural features (volume, size, densification of the walls, and morphology) on Weibull modulus and strength. We found that the key microstructural parameter controlling reliability is wall thickness. In contrast, pore volume is the main parameter controlling the strength. The highest Weibull modulus (m=13.2) and mean strength (198.2 MPa) were obtained for the samples with the smallest and narrowest wall thickness distribution (3.1 μm) and lower pore volume (54.5%). PMID:27877864

  3. Distributed data analysis in ATLAS

    NASA Astrophysics Data System (ADS)

    Nilsson, Paul; Atlas Collaboration

    2012-12-01

    Data analysis using grid resources is one of the fundamental challenges to be addressed before the start of LHC data taking. The ATLAS detector will produce petabytes of data per year, and roughly one thousand users will need to run physics analyses on this data. Appropriate user interfaces and helper applications have been made available to ensure that the grid resources can be used without requiring expertise in grid technology. These tools enlarge the number of grid users from a few production administrators to potentially all participating physicists. ATLAS makes use of three grid infrastructures for the distributed analysis: the EGEE sites, the Open Science Grid, and NorduGrid. These grids are managed by the gLite workload management system, the PanDA workload management system, and ARC middleware; many sites can be accessed via both the gLite WMS and PanDA. Users can choose between two front-end tools to access the distributed resources. Ganga is a tool co-developed with LHCb to provide a common interface to the multitude of execution backends (local, batch, and grid). The PanDA workload management system provides a set of utilities called PanDA Client; with these tools users can easily submit Athena analysis jobs to the PanDA-managed resources. Distributed data is managed by Don Quixote 2, a system developed by ATLAS; DQ2 is used to replicate datasets according to the data distribution policies and maintains a central catalog of file locations. The operation of the grid resources is continually monitored by the GangaRobot functional testing system, and infrequent site stress tests are performed using the HammerCloud system. In addition, the DAST shift team is a group of power users who take shifts to provide distributed analysis user support; this team has effectively relieved the burden of support from the developers.

  4. Spatial and Temporal Patterns of Global Onshore Wind Speed Distribution

    SciTech Connect

    Zhou, Yuyu; Smith, Steven J.

    2013-09-09

    Wind power, a renewable energy source, can play an important role in electrical energy generation. Information regarding wind energy potential is important both for energy related modeling and for decision-making in the policy community. While wind speed datasets with high spatial and temporal resolution are often ultimately used for detailed planning, simpler assumptions are often used in analysis work. An accurate representation of the wind speed frequency distribution is needed in order to properly characterize wind energy potential. Using a power density method, this study estimated global variation in wind parameters as fitted to a Weibull density function using NCEP/CFSR reanalysis data. The estimated Weibull distribution performs well in fitting the time series wind speed data at the global level according to R2, root mean square error, and power density error. The spatial, decadal, and seasonal patterns of wind speed distribution were then evaluated. We also analyzed the potential error in wind power estimation when a commonly assumed Rayleigh distribution (Weibull k = 2) is used. We find that the assumption of the same Weibull parameter across large regions can result in substantial errors. While large-scale wind speed data is often presented in the form of average wind speeds, these results highlight the need to also provide information on the wind speed distribution.
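
    The sensitivity to the Weibull shape parameter can be made concrete: mean wind power density scales with E[v^3] = c^3 Γ(1+3/k) for a Weibull(k, c) speed distribution. The sketch below, with assumed parameter values, estimates the relative error from imposing a Rayleigh shape (k = 2) while matching the observed mean speed:

        from math import gamma

        def power_density(k, c, rho=1.225):
            """Mean wind power density (W/m^2) for a Weibull(k, c) speed
            distribution: 0.5 * rho * E[v^3], with E[v^3] = c^3 * Gamma(1 + 3/k)."""
            return 0.5 * rho * c**3 * gamma(1.0 + 3.0 / k)

        def rayleigh_error(k, c):
            """Relative power density error from assuming Rayleigh (k = 2)
            while matching the mean speed v_bar = c * Gamma(1 + 1/k)."""
            v_bar = c * gamma(1.0 + 1.0 / k)
            c_ray = v_bar / gamma(1.5)   # scale of the mean-matched Rayleigh
            return power_density(2.0, c_ray) / power_density(k, c) - 1.0

        # e.g. rayleigh_error(1.6, 7.0) is roughly -0.23: assuming a Rayleigh
        # shape at such a site underestimates power density by about 23%.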

  5. Distribution-free discriminant analysis

    SciTech Connect

    Burr, T.; Doak, J.

    1997-05-01

    This report describes our experience in implementing a non-parametric (distribution-free) discriminant analysis module for use in a wide range of pattern recognition problems. Issues discussed include performance results on both real and simulated data sets, comparisons to other methods, and the computational environment. In some cases, this module performs better than other existing methods. Nearly all cases can benefit from the application of multiple methods.

  6. Weibull Effective Area for Hertzian Ring Crack Initiation Stress

    SciTech Connect

    Jadaan, Osama M.; Wereszczak, Andrew A; Johanns, Kurt E

    2011-01-01

    Spherical or Hertzian indentation is used to characterize and guide the development of engineered ceramics under consideration for diverse applications involving contact, wear, rolling fatigue, and impact. Ring crack initiation can be one important damage mechanism of Hertzian indentation. It is caused by sufficiently-high, surface-located, radial tensile stresses in an annular ring located adjacent to and outside of the Hertzian contact circle. While the maximum radial tensile stress is known to be dependent on the elastic properties of the sphere and target, the diameter of the sphere, the applied compressive force, and the coefficient of friction, the Weibull effective area too will be affected by those parameters. However, the estimations of a maximum radial tensile stress and Weibull effective area are difficult to obtain because the coefficient of friction during Hertzian indentation is complex, likely intractable, and not known a priori. Circumventing this, the Weibull effective area expressions are derived here for the two extremes that bracket all coefficients of friction; namely, (1) the classical, frictionless, Hertzian case where only complete slip occurs, and (2) the case where no slip occurs or where the coefficient of friction is infinite.
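
    The Weibull effective area invoked above has the standard definition below (notation assumed here: sigma_max is the maximum surface radial stress and m the Weibull modulus); the two limiting-friction results are closed-form evaluations of this integral for the corresponding stress fields:

        A_e = \int_A \left( \frac{\sigma(x,y)}{\sigma_{\max}} \right)^{m} dA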

  7. A biology-driven receptor model for daily pollen allergy risk in Korea based on Weibull probability density function

    NASA Astrophysics Data System (ADS)

    Kim, Kyu Rang; Kim, Mijin; Choe, Ho-Seong; Han, Mae Ja; Lee, Hye-Rim; Oh, Jae-Won; Kim, Baek-Jo

    2016-07-01

    Pollen is an important cause of respiratory allergic reactions. As individual sanitation has improved, allergy risk has increased, and this trend is expected to continue due to climate change. Atmospheric pollen concentration is highly influenced by weather conditions. Regression analysis and modeling of the relationships between airborne pollen concentrations and weather conditions were performed to analyze and forecast pollen conditions. Traditionally, daily pollen concentration has been estimated using regression models that describe the relationships between observed pollen concentrations and weather conditions. These models were able to forecast daily concentrations at the sites of observation, but lacked broader spatial applicability beyond those sites. To overcome this limitation, an integrated modeling scheme was developed that is designed to represent the underlying processes of pollen production and distribution. A maximum potential for airborne pollen is first determined using the Weibull probability density function. Then, daily pollen concentration is estimated using multiple regression models. Daily risk grade levels are determined based on the risk criteria used in Korea. The mean percentages of agreement between the observed and estimated levels were 81.4-88.2 % and 92.5-98.5 % for oak and Japanese hop pollens, respectively. The new models estimated daily pollen risk more accurately than the original statistical models because of the newly integrated biological response curves. Although they overestimated seasonal mean concentration, they did not simulate all of the peak concentrations. This issue would be resolved by adding more variables that affect the prevalence and internal maturity of pollens.

  8. A biology-driven receptor model for daily pollen allergy risk in Korea based on Weibull probability density function

    NASA Astrophysics Data System (ADS)

    Kim, Kyu Rang; Kim, Mijin; Choe, Ho-Seong; Han, Mae Ja; Lee, Hye-Rim; Oh, Jae-Won; Kim, Baek-Jo

    2017-02-01

    Pollen is an important cause of respiratory allergic reactions. As individual sanitation has improved, allergy risk has increased, and this trend is expected to continue due to climate change. Atmospheric pollen concentration is highly influenced by weather conditions. Regression analysis and modeling of the relationships between airborne pollen concentrations and weather conditions were performed to analyze and forecast pollen conditions. Traditionally, daily pollen concentration has been estimated using regression models that describe the relationships between observed pollen concentrations and weather conditions. These models were able to forecast daily concentrations at the sites of observation, but lacked broader spatial applicability beyond those sites. To overcome this limitation, an integrated modeling scheme was developed that is designed to represent the underlying processes of pollen production and distribution. A maximum potential for airborne pollen is first determined using the Weibull probability density function. Then, daily pollen concentration is estimated using multiple regression models. Daily risk grade levels are determined based on the risk criteria used in Korea. The mean percentages of agreement between the observed and estimated levels were 81.4-88.2 % and 92.5-98.5 % for oak and Japanese hop pollens, respectively. The new models estimated daily pollen risk more accurately than the original statistical models because of the newly integrated biological response curves. Although they overestimated seasonal mean concentration, they did not simulate all of the peak concentrations. This issue would be resolved by adding more variables that affect the prevalence and internal maturity of pollens.

  9. Distribution analysis of airborne nicotine concentrations in hospitality facilities.

    PubMed

    Schorp, Matthias K; Leyden, Donald E

    2002-02-01

    A number of publications report statistical summaries for environmental tobacco smoke (ETS) concentrations. Despite compelling evidence for the data not being normally distributed, these publications typically report the arithmetic mean and standard deviation of the data, thereby losing important information related to the distribution of values contained in the original data. We were interested in the frequency distributions of reported nicotine concentrations in hospitality environments and subjected available data to distribution analyses. The distribution of experimental indoor airborne nicotine concentration data taken from hospitality facilities worldwide was fit to lognormal, Weibull, exponential, Pearson (Type V), logistic, and loglogistic distribution models. Comparison of goodness of fit (GOF) parameters and indications from the literature verified the selection of a lognormal distribution as the overall best model. When individual data were not reported in the literature, statistical summaries of results were used to model sets of lognormally distributed data that are intended to mimic the original data distribution. Grouping the data into various categories led to 31 frequency distributions that were further interpreted. The median values in nonsmoking environments are about half of the median values in smoking sections. When different continents are compared, Asian, European, and North American median values in restaurants are about a factor of three below levels encountered in other hospitality facilities. On a comparison of nicotine concentrations in North American smoking sections and nonsmoking sections, median values are about one-third of the European levels. The results obtained may be used to address issues related to exposure to ETS in the hospitality sector.
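
    When only arithmetic summaries are published, lognormally distributed data that mimic the original distribution can be simulated by moment matching. A minimal sketch of that standard device (the function name is ours, and this is not the authors' code):

        import numpy as np

        def lognormal_from_summary(mean, sd, size, seed=0):
            """Simulate lognormal data matching a reported arithmetic mean and
            standard deviation, via the moment-matching identities
            sigma^2 = ln(1 + (sd/mean)^2) and mu = ln(mean) - sigma^2 / 2."""
            sigma2 = np.log(1.0 + (sd / mean) ** 2)
            mu = np.log(mean) - 0.5 * sigma2
            rng = np.random.default_rng(seed)
            return rng.lognormal(mean=mu, sigma=np.sqrt(sigma2), size=size)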

  10. Statistical modeling of tornado intensity distributions

    NASA Astrophysics Data System (ADS)

    Dotzek, Nikolai; Grieser, Jürgen; Brooks, Harold E.

    We address the issue of determining an appropriate general functional shape of observed tornado intensity distributions. Recently, it was suggested that in the limit of long and large tornado records, exponential distributions over all positive Fujita or TORRO scale classes would result. Yet, our analysis shows that even for large databases, observations contradict the validity of exponential distributions for weak (F0) and violent (F5) tornadoes. We show that observed tornado intensities can be much better described by Weibull distributions, for which the exponential remains a special case. Weibull fits in either the v or F scale reproduce the observations significantly better than exponentials. In addition, we suggest applying the original definition of negative intensity scales down to F-2 and T-4 (corresponding to v=0 m s-1), at least for climatological analyses. Weibull distributions allow for an improved risk assessment of violent tornadoes up to F6, and better estimates of total tornado occurrence, degree of underreporting, and existence of subcritical tornadic circulations below damaging intensity. Therefore, our results are relevant for climatologists and risk assessment managers alike.

  11. Test Population Selection from Weibull-Based, Monte Carlo Simulations of Fatigue Life

    NASA Technical Reports Server (NTRS)

    Vlcek, Brian L.; Zaretsky, Erwin V.; Hendricks, Robert C.

    2012-01-01

    Fatigue life is probabilistic and not deterministic. Experimentally establishing the fatigue life of materials, components, and systems is both time consuming and costly. As a result, conclusions regarding fatigue life are often inferred from a statistically insufficient number of physical tests. A proposed methodology for comparing life results as a function of variability due to Weibull parameters, variability between successive trials, and variability due to size of the experimental population is presented. Using Monte Carlo simulation of randomly selected lives from a large Weibull distribution, the variation in the L10 fatigue life of aluminum alloy AL6061 rotating rod fatigue tests was determined as a function of population size. These results were compared to the L10 fatigue lives of small (10 each) populations from AL2024, AL7075 and AL6061. For aluminum alloy AL6061, a simple algebraic relationship was established for the upper and lower L10 fatigue life limits as a function of the number of specimens failed. For most engineering applications where less than 30 percent variability can be tolerated in the maximum and minimum values, at least 30 to 35 test samples are necessary. The variability of test results based on small sample sizes can be greater than the actual differences, if any, that exist between materials and can result in erroneous conclusions. The fatigue life of AL2024 is statistically longer than that of AL6061 and AL7075. However, there is no statistical difference between the fatigue lives of AL6061 and AL7075, even though AL7075 had a fatigue life 30 percent greater than AL6061.
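
    Monte Carlo estimation of the spread in L10 life as a function of population size, as described above, can be sketched in a few lines. This is an illustration under assumed Weibull parameters, not the study's code:

        import numpy as np

        def l10_spread(beta, eta, n_specimens, n_trials=10_000, seed=0):
            """Monte Carlo spread of the L10 life (10th percentile) estimated
            from repeated samples of n_specimens lives drawn from a
            Weibull(beta, eta) population; returns the 5th, 50th and 95th
            percentiles of the L10 estimate across trials."""
            rng = np.random.default_rng(seed)
            lives = eta * rng.weibull(beta, size=(n_trials, n_specimens))
            l10 = np.quantile(lives, 0.10, axis=1)
            return np.quantile(l10, [0.05, 0.50, 0.95])

        # e.g. comparing l10_spread(2.0, 1.0e6, 10) with l10_spread(2.0, 1.0e6, 35)
        # shows how the band of L10 estimates narrows as the population grows.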

  12. Test Population Selection from Weibull-Based, Monte Carlo Simulations of Fatigue Life

    NASA Technical Reports Server (NTRS)

    Vlcek, Brian L.; Zaretsky, Erwin V.; Hendricks, Robert C.

    2008-01-01

    Fatigue life is probabilistic and not deterministic. Experimentally establishing the fatigue life of materials, components, and systems is both time consuming and costly. As a result, conclusions regarding fatigue life are often inferred from a statistically insufficient number of physical tests. A proposed methodology for comparing life results as a function of variability due to Weibull parameters, variability between successive trials, and variability due to size of the experimental population is presented. Using Monte Carlo simulation of randomly selected lives from a large Weibull distribution, the variation in the L10 fatigue life of aluminum alloy AL6061 rotating rod fatigue tests was determined as a function of population size. These results were compared to the L10 fatigue lives of small (10 each) populations from AL2024, AL7075 and AL6061. For aluminum alloy AL6061, a simple algebraic relationship was established for the upper and lower L10 fatigue life limits as a function of the number of specimens failed. For most engineering applications where less than 30 percent variability can be tolerated in the maximum and minimum values, at least 30 to 35 test samples are necessary. The variability of test results based on small sample sizes can be greater than the actual differences, if any, that exist between materials and can result in erroneous conclusions. The fatigue life of AL2024 is statistically longer than that of AL6061 and AL7075. However, there is no statistical difference between the fatigue lives of AL6061 and AL7075, even though AL7075 had a fatigue life 30 percent greater than AL6061.

  13. Distributed computing and nuclear reactor analysis

    SciTech Connect

    Brown, F.B.; Derstine, K.L.; Blomquist, R.N.

    1994-03-01

    Large-scale scientific and engineering calculations for nuclear reactor analysis can now be carried out effectively in a distributed computing environment, at costs far lower than for traditional mainframes. The distributed computing environment must include support for traditional system services, such as a queuing system for batch work, reliable filesystem backups, and parallel processing capabilities for large jobs. All ANL computer codes for reactor analysis have been adapted successfully to a distributed system based on workstations and X-terminals. Distributed parallel processing has been demonstrated to be effective for long-running Monte Carlo calculations.

  14. Assessing distributions for monthly mean wind speed data

    NASA Astrophysics Data System (ADS)

    Kamil, Mira Syahirah; Razali, Ahmad Mahir

    2016-11-01

    Analysis of wind speed behavior contributes vital information on wind energy potential and its development. Hence, this study focuses on fitting several distributions to determine the most appropriate probability distribution to describe the wind pattern in Kuala Terengganu and Mersing. Four different statistical distributions were fitted to the monthly mean wind speeds from eight different directions. The two stations, Kuala Terengganu and Mersing, were observed for the period 2000 to 2008. The distributions were tested using the Kolmogorov-Smirnov statistic to find the best fit for describing the observed data. The Weibull distribution shows a clear fit for all wind speed directions in both locations.

  15. Expectation maximization-based likelihood inference for flexible cure rate models with Weibull lifetimes.

    PubMed

    Balakrishnan, Narayanaswamy; Pal, Suvra

    2016-08-01

    Recently, a flexible cure rate survival model has been developed by assuming the number of competing causes of the event of interest to follow the Conway-Maxwell-Poisson distribution. This model includes some of the well-known cure rate models discussed in the literature as special cases. Data obtained from cancer clinical trials are often right censored, and the expectation maximization algorithm can be used in this case to efficiently estimate the model parameters based on right censored data. In this paper, we consider the competing cause scenario and, assuming the time-to-event to follow the Weibull distribution, we derive the necessary steps of the expectation maximization algorithm for estimating the parameters of different cure rate survival models. The standard errors of the maximum likelihood estimates are obtained by inverting the observed information matrix. The method of inference developed here is examined by means of an extensive Monte Carlo simulation study. Finally, we illustrate the proposed methodology with real data on cancer recurrence.
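
    For orientation, the Weibull building block of such models is the censored likelihood: observed events contribute the log density and censored times the log survival. The sketch below fits a plain censored Weibull by direct maximization; it is a simplified stand-in, not the paper's EM steps for the cure-rate model:

        import numpy as np
        from scipy import optimize, stats

        def weibull_mle_censored(t, event):
            """ML fit of a Weibull (shape, scale) to right-censored data:
            failures (event=True) contribute log f(t), censored times log S(t).
            Plain MLE shown for brevity; the paper's EM handles the cure fraction."""
            t = np.asarray(t, dtype=float)
            event = np.asarray(event, dtype=bool)
            def nll(params):
                logk, logc = params            # log-parameters keep k, c positive
                k, c = np.exp(logk), np.exp(logc)
                ll = np.where(event,
                              stats.weibull_min.logpdf(t, k, scale=c),
                              stats.weibull_min.logsf(t, k, scale=c))
                return -ll.sum()
            res = optimize.minimize(nll, x0=[0.0, np.log(t.mean())])
            return np.exp(res.x)               # (shape, scale)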

  16. Parametric regression model for survival data: Weibull regression model as an example

    PubMed Central

    2016-01-01

    The Weibull regression model is one of the most popular parametric regression models; it provides an estimate of the baseline hazard function as well as coefficients for covariates. Because of technical difficulties, the Weibull regression model is seldom used in the medical literature compared with the semi-parametric proportional hazards model. To make clinical investigators familiar with the Weibull regression model, this article introduces some basic knowledge of the model and then illustrates how to fit it with the R software. The SurvRegCensCov package is useful in converting estimated coefficients to clinically relevant statistics such as the hazard ratio (HR) and event time ratio (ETR). Model adequacy can be assessed by inspecting Kaplan-Meier curves stratified by categorical variables. The eha package provides an alternative method to fit the Weibull regression model, and its check.dist() function helps to assess the goodness of fit of the model. Variable selection is based on the importance of a covariate, which can be tested using the anova() function. Alternatively, backward elimination starting from a full model is an efficient way of model development. Visualizing the Weibull regression model after model development provides another way to report the findings. PMID:28149846

  17. Towards Distributed Memory Parallel Program Analysis

    SciTech Connect

    Quinlan, D; Barany, G; Panas, T

    2008-06-17

    This paper presents a parallel attribute evaluation for distributed memory parallel computer architectures, where previously only shared memory parallel support for this technique had been developed. Attribute evaluation is part of how attribute grammars are used for program analysis within modern compilers. Within this work, we have extended ROSE, an open compiler infrastructure, with a distributed memory parallel attribute evaluation mechanism to support user-defined global program analysis required for some forms of security analysis, which cannot be addressed by a file-by-file view of large-scale applications. As a result, user-defined security analyses may now run in parallel without the user having to specify the way data is communicated between processors. The automation of communication enables an extensible open-source parallel program analysis infrastructure.

  18. CRAB: Distributed analysis tool for CMS

    NASA Astrophysics Data System (ADS)

    Sala, Leonardo; CMS Collaboration

    2012-12-01

    CMS has a distributed computing model, based on a hierarchy of tiered regional computing centers, and adopts a data-driven model for end user analysis. This model foresees that jobs are submitted to the analysis resources where data are hosted. The increasing complexity of the whole computing infrastructure makes the simple analysis workflow more and more complicated for the end user. CMS has developed and deployed a dedicated tool named CRAB (CMS Remote Analysis Builder) in order to guarantee physicists efficient access to the distributed data whilst hiding the underlying complexity. This tool is used by CMS to enable the running of physics analysis jobs in a transparent manner over data distributed across sites. It factorizes out the interaction with the underlying batch farms, grid infrastructure and CMS data management tools, allowing the user to deal only with a simple and intuitive interface. We present the CRAB architecture, as well as the current status and lessons learnt in deploying this tool for use by the CMS collaboration. We also present the future development of the CRAB system.

  19. The Weibull functional form for the energetic particle spectrum at interplanetary shock waves

    NASA Astrophysics Data System (ADS)

    Laurenza, M.; Consolini, G.; Storini, M.; Pallocchia, G.; Damiani, A.

    2016-11-01

    Transient interplanetary shock waves are often associated with high energy particle enhancements, which are called energetic storm particle (ESP) events. Here we present a case study of an ESP event, recorded by the SEPT, LET and HET instruments onboard the STEREO B spacecraft on 3 October 2011, in a wide energy range from 0.1 MeV to ∼ 30 MeV. The obtained particle spectrum is found to be reproduced by a Weibull-like shape. Moreover, we show that the Weibull spectrum can be theoretically derived as the asymptotic steady-state solution of the diffusion loss equation by assuming anomalous diffusion for particle velocity. The evaluation of the Weibull parameters obtained from the particle observations, together with the power spectral density of the turbulent fluctuations in the shock region, supports this scenario and suggests that stochastic acceleration can contribute significantly to the acceleration of highly energetic particles at collisionless shock waves.

  20. Weibull approximation of LiDAR waveforms for estimating the beam attenuation coefficient.

    PubMed

    Montes-Hugo, Martin A; Vuorenkoski, Anni K; Dalgleish, Fraser R; Ouyang, Bing

    2016-10-03

    Tank experiments were performed at different water turbidities to examine relationships between the beam attenuation coefficient (c) and Weibull shape parameters derived from LiDAR waveforms measured with the Fine Structure Underwater LiDAR (FSUIL). Optical inversions were made at 532 nm, within a c range of 0.045-1.52 m-1, and based on a LiDAR system having two fields of view (15 and 75.7 mrad) and two linear polarizations. Consistently, the Weibull scale parameter, or P2, showed the strongest covariation with c and was the more accurate proxy for the LiDAR attenuation coefficient.

  1. Complexity Analysis of Peat Soil Density Distribution

    NASA Astrophysics Data System (ADS)

    Sampurno, Joko; Diah Faryuni, Irfana; Dzar Eljabbar Latief, Fourier; Srigutomo, Wahyu

    2016-08-01

    The distributions of peat soil density have been identified using a fractal analysis method. The study was conducted on 5 peat soil samples taken from a ground field in Pontianak, West Kalimantan, at the coordinates (0°4'2.27"S, 109°18'48.59"E). In this study, we used micro computerized tomography (micro-CT scanning) at 9.41 micrometers per pixel resolution on the peat soil samples to provide 2-D high-resolution images L1-L5 (200 x 200 pixels) that were used to detect the distribution of peat soil density. The fractal dimension and intercept were determined using the 2-D Fourier analysis method, which yields a log-log plot of magnitude against frequency. The fractal dimension was obtained from the straight regression line that interpolated the points in the interval with the largest coefficient of determination, and the intercept is defined by the point of intersection on the vertical axis. The conclusion was that the distributions of peat soil density show fractal behaviour, with the heterogeneity of the samples from highest to lowest being L5, L1, L4, L3 and L2. Meanwhile, the range of density values of the samples from highest to lowest was L3, L2, L4, L5 and L1. The study also concluded that the distribution of peat soil density was weakly anisotropic.

  2. Analysis and Modelling of Extreme Wind Speed Distributions in Complex Mountainous Regions

    NASA Astrophysics Data System (ADS)

    Laib, Mohamed; Kanevski, Mikhail

    2016-04-01

    Modelling of wind speed distributions in complex mountainous regions is an important and challenging problem of interest to scientists from several fields. In the present research, high frequency (10 min) Swiss wind speed monitoring data (IDAWEB service, Meteosuisse) are analysed and modelled with different parametric distributions (Weibull, GEV, Gamma, etc.) using the maximum likelihood method. In total, 111 stations placed in different geomorphological units and at different altitudes (from 203 to 3580 meters) are studied. This information is then used to train machine learning algorithms (Extreme Learning Machines, Support Vector Machines) to predict the distribution at new places, potentially useful for aeolian energy generation. An important part of the research deals with the construction and application of a high dimensional input feature space generated from a digital elevation model. A comprehensive study was carried out using a feature selection approach to obtain the best model for the prediction. The main results are presented as spatial patterns of the distributions' parameters.

  3. Distributed analysis in ATLAS using GANGA

    NASA Astrophysics Data System (ADS)

    Elmsheuser, Johannes; Brochu, Frederic; Cowan, Greig; Egede, Ulrik; Gaidioz, Benjamin; Lee, Hurng-Chun; Maier, Andrew; Móscicki, Jakub; Pajchel, Katarina; Reece, Will; Samset, Bjorn; Slater, Mark; Soroko, Alexander; Vanderster, Daniel; Williams, Michael

    2010-04-01

    Distributed data analysis using Grid resources is one of the fundamental applications in high energy physics to be addressed and realized before the start of LHC data taking. The demands on resource management are very high: in every experiment, up to a thousand physicists will be submitting analysis jobs to the Grid. Appropriate user interfaces and helper applications have to be made available to assure that all users can use the Grid without expertise in Grid technology. These tools enlarge the number of Grid users from a few production administrators to potentially all participating physicists. The GANGA job management system (http://cern.ch/ganga), developed as a common project between the ATLAS and LHCb experiments, provides and integrates these kinds of tools. GANGA provides a simple and consistent way of preparing, organizing and executing analysis tasks within the experiment analysis framework, implemented through a plug-in system. It allows trivial switching between running test jobs on a local batch system and running large-scale analyses on the Grid, hiding Grid technicalities. We report on the plug-ins and our experience of distributed data analysis using GANGA within the ATLAS experiment. Support for all Grids presently used by ATLAS, namely the LCG/EGEE, NDGF/NorduGrid, and OSG/PanDA, is provided. The integration and interaction of the ATLAS data management system DQ2 with GANGA is a key functionality. Intelligent job brokering is set up by using the job splitting mechanism together with data-set and file location knowledge. The brokering is aided by an automated system that regularly processes test analysis jobs at all ATLAS DQ2-supported sites. Large numbers of analysis jobs can be sent to the locations of data following the ATLAS computing model. GANGA supports, amongst other things, tasks of user analysis with reconstructed data and small-scale production of Monte Carlo data.

  4. Performance optimisations for distributed analysis in ALICE

    NASA Astrophysics Data System (ADS)

    Betev, L.; Gheata, A.; Gheata, M.; Grigoras, C.; Hristov, P.

    2014-06-01

    Performance is a critical issue in a production system accommodating hundreds of analysis users. Compared to a local session, distributed analysis is exposed to services and network latencies, remote data access and heterogeneous computing infrastructure, creating a more complex performance and efficiency optimization matrix. During the last 2 years, ALICE analysis shifted from a fast development phase to more mature and stable code. At the same time, the frameworks and tools for deployment, monitoring and management of large productions have evolved considerably too. The ALICE Grid production system is currently used by a fair share of organized and individual user analysis, consuming up to 30% of the available resources and ranging from fully I/O-bound analysis code to CPU-intensive correlation or resonance studies. While the intrinsic analysis performance is unlikely to improve by a large factor during the LHC long shutdown (LS1), the overall efficiency of the system still has to be improved by an important factor to satisfy the analysis needs. We have instrumented all analysis jobs with "sensors" collecting comprehensive monitoring information on the job running conditions and performance in order to identify bottlenecks in the data processing flow. These data are collected by the MonALISA-based ALICE Grid monitoring system and are used to steer and improve the job submission and management policy, to identify operational problems in real time and to perform automatic corrective actions. In parallel with an upgrade of our production system, we are aiming for low-level improvements related to data format, data management and merging of results to allow for better-performing ALICE analysis.

  5. Analysis of Jingdong Mall Logistics Distribution Model

    NASA Astrophysics Data System (ADS)

    Shao, Kang; Cheng, Feng

    In recent years, the development of electronic commerce in China has accelerated. The role of logistics has been highlighted, and more and more e-commerce enterprises are beginning to realize the importance of logistics to the success or failure of the enterprise. In this paper, the author takes Jingdong Mall as an example, performs a SWOT analysis of the current state of its self-built logistics system, identifies the problems existing in the current Jingdong Mall logistics distribution, and gives appropriate recommendations.

  6. Analysis and control of distributed cooperative systems.

    SciTech Connect

    Feddema, John Todd; Parker, Eric Paul; Wagner, John S.; Schoenwald, David Alan

    2004-09-01

    As part of the DARPA Information Processing Technology Office (IPTO) Software for Distributed Robotics (SDR) Program, Sandia National Laboratories has developed analysis and control software for coordinating tens to thousands of autonomous cooperative robotic agents (primarily unmanned ground vehicles) performing military operations such as reconnaissance, surveillance and target acquisition; countermine and explosive ordnance disposal; force protection and physical security; and logistics support. Due to the nature of these applications, the control techniques must be distributed, and they must not rely on high bandwidth communication between agents. At the same time, a single soldier must be able to easily direct these large-scale systems. Finally, the control techniques must be provably convergent so as not to cause undue harm to civilians. In this project, provably convergent, moderate communication bandwidth, distributed control algorithms have been developed that can be regulated by a single soldier. We have simulated in great detail the control of low numbers of vehicles (up to 20) navigating throughout a building, and we have simulated in lesser detail the control of larger numbers of vehicles (up to 1000) trying to locate several targets in a large outdoor facility. Finally, we have experimentally validated the resulting control algorithms on smaller numbers of autonomous vehicles.

  7. Ceramics Analysis and Reliability Evaluation of Structures (CARES). Users and programmers manual

    NASA Technical Reports Server (NTRS)

    Nemeth, Noel N.; Manderscheid, Jane M.; Gyekenyesi, John P.

    1990-01-01

    This manual describes how to use the Ceramics Analysis and Reliability Evaluation of Structures (CARES) computer program. The primary function of the code is to calculate the fast fracture reliability or failure probability of macroscopically isotropic ceramic components. These components may be subjected to complex thermomechanical loadings, such as those found in heat engine applications. The program uses results from MSC/NASTRAN or ANSYS finite element analysis programs to evaluate component reliability due to inherent surface and/or volume type flaws. CARES utilizes the Batdorf model and the two-parameter Weibull cumulative distribution function to describe the effect of multiaxial stress states on material strength. The principle of independent action (PIA) and the Weibull normal stress averaging models are also included. Weibull material strength parameters, the Batdorf crack density coefficient, and other related statistical quantities are estimated from four-point bend bar or uniform uniaxial tensile specimen fracture strength data. Parameter estimation can be performed for single or multiple failure modes by using least-squares analysis or the maximum likelihood method. Kolmogorov-Smirnov and Anderson-Darling goodness-of-fit tests, ninety percent confidence intervals on the Weibull parameters, and Kanofsky-Srinivasan ninety percent confidence band values are also provided. The probabilistic fast-fracture theories used in CARES, along with the input and output for CARES, are described. Example problems to demonstrate various features of the program are also included. This manual describes the MSC/NASTRAN version of the CARES program.
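
    For orientation, the two-parameter Weibull form for volume flaws that underlies such codes is the size-scaled failure probability below; this is the standard textbook expression with notation assumed here, not a transcription from the CARES manual:

        P_f = 1 - \exp\left[ -\int_V \left( \frac{\sigma(\mathbf{x})}{\sigma_0} \right)^{m} dV \right]

    Here m is the Weibull modulus, \sigma_0 the scale parameter, and the integral runs over the stressed volume; the Batdorf and PIA models differ in how the multiaxial stress state enters the integrand.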

  8. CMS distributed data analysis with CRAB3

    NASA Astrophysics Data System (ADS)

    Mascheroni, M.; Balcas, J.; Belforte, S.; Bockelman, B. P.; Hernandez, J. M.; Ciangottini, D.; Konstantinov, P. B.; Silva, J. M. D.; Ali, M. A. B. M.; Melo, A. M.; Riahi, H.; Tanasijczuk, A. J.; Yusli, M. N. B.; Wolf, M.; Woodard, A. E.; Vaandering, E.

    2015-12-01

    The CMS Remote Analysis Builder (CRAB) is a distributed workflow management tool which facilitates analysis tasks by isolating users from the technical details of the Grid infrastructure. Throughout LHC Run 1, CRAB has been successfully employed by an average of 350 distinct users each week executing about 200,000 jobs per day. CRAB has been significantly upgraded in order to face the new challenges posed by LHC Run 2. Components of the new system include 1) a lightweight client, 2) a central primary server which communicates with the clients through a REST interface, 3) secondary servers which manage user analysis tasks and submit jobs to the CMS resource provisioning system, and 4) a central service to asynchronously move user data from temporary storage in the execution site to the desired storage location. The new system improves the robustness, scalability and sustainability of the service. Here we provide an overview of the new system, operation, and user support, report on its current status, and identify lessons learned from the commissioning phase and production roll-out.

  9. CMS distributed data analysis with CRAB3

    SciTech Connect

    Mascheroni, M.; Balcas, J.; Belforte, S.; Bockelman, B. P.; Hernandez, J. M.; Ciangottini, D.; Konstantinov, P. B.; Silva, J. M. D.; Ali, M. A. B. M.; Melo, A. M.; Riahi, H.; Tanasijczuk, A. J.; Yusli, M. N. B.; Wolf, M.; Woodard, A. E.; Vaandering, E.

    2015-12-23

    The CMS Remote Analysis Builder (CRAB) is a distributed workflow management tool which facilitates analysis tasks by isolating users from the technical details of the Grid infrastructure. Throughout LHC Run 1, CRAB has been successfully employed by an average of 350 distinct users each week executing about 200,000 jobs per day. CRAB has been significantly upgraded in order to face the new challenges posed by LHC Run 2. Components of the new system include 1) a lightweight client, 2) a central primary server which communicates with the clients through a REST interface, 3) secondary servers which manage user analysis tasks and submit jobs to the CMS resource provisioning system, and 4) a central service to asynchronously move user data from temporary storage in the execution site to the desired storage location. Furthermore, the new system improves the robustness, scalability and sustainability of the service. Here we provide an overview of the new system, operation, and user support, report on its current status, and identify lessons learned from the commissioning phase and production roll-out.

  10. CMS distributed data analysis with CRAB3

    DOE PAGES

    Mascheroni, M.; Balcas, J.; Belforte, S.; ...

    2015-12-23

    The CMS Remote Analysis Builder (CRAB) is a distributed workflow management tool which facilitates analysis tasks by isolating users from the technical details of the Grid infrastructure. Throughout LHC Run 1, CRAB has been successfully employed by an average of 350 distinct users each week executing about 200,000 jobs per day. CRAB has been significantly upgraded in order to face the new challenges posed by LHC Run 2. Components of the new system include 1) a lightweight client, 2) a central primary server which communicates with the clients through a REST interface, 3) secondary servers which manage user analysis tasks and submit jobs to the CMS resource provisioning system, and 4) a central service to asynchronously move user data from temporary storage in the execution site to the desired storage location. Furthermore, the new system improves the robustness, scalability and sustainability of the service. Here we provide an overview of the new system, operation, and user support, report on its current status, and identify lessons learned from the commissioning phase and production roll-out.

  11. Further Development of New Methods for Estimating Tail Probabilities and Extreme Value Distributions.

    DTIC Science & Technology

    1981-01-01

    confidence interval procedures were studied, both analytically and by means of an extensive Monte Carlo experiment. The experiment involved three sample sizes (100, 200, 400) and twenty underlying distributions (five Weibulls, five mixed Weibulls, five lognormals, five mixed lognormals). The Monte Carlo results show that all three procedures studied work quite well and point the way to further improvement.

  12. Noise analysis in power distribution systems

    NASA Astrophysics Data System (ADS)

    Danisor, Alin

    2016-12-01

    This paper proposes an analysis, especially in the time domain, of the electrical noise present on power distribution lines. This study is important for the use of power lines as a channel for information transmission, whether of analog or digital signals. The main problem addressed in this paper consists of characterizing the background noise and establishing its statistical properties. It is very important to know whether the noise induced in the transmission channel is stationary, or even ergodic. The main parameters, such as the mean value and the mean square value, were determined in this paper, and the approximation of the probability density function of each statistical parameter was studied. The pulses induced in the transmission channel by transient phenomena in the electrical power systems were considered deterministic signals, and their contributions were not included in this study.

  13. On the gap between an empirical distribution and an exponential distribution of waiting times for price changes in a financial market

    NASA Astrophysics Data System (ADS)

    Sazuka, Naoya

    2007-03-01

    We analyze waiting times for price changes in a foreign currency exchange rate. Recent empirical studies of high-frequency financial data support that trades in financial markets do not follow a Poisson process and the waiting times between trades are not exponentially distributed. Here we show that our data is well approximated by a Weibull distribution rather than an exponential distribution in the non-asymptotic regime. Moreover, we quantitatively evaluate how far the empirical data are from an exponential distribution using a Weibull fit. Finally, we discuss a transition between a Weibull-law and a power-law in the long time asymptotic regime.
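
    As a concrete illustration of the comparison described above, the following Python sketch (not the author's code; the synthetic data stand in for the exchange-rate records) fits both an exponential and a Weibull model to waiting-time data and compares their log-likelihoods.

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(0)
      waits = rng.weibull(0.6, size=5000) * 50.0   # synthetic waiting times

      # Fit both candidate models; floc=0 pins the location parameter at zero.
      k, _, lam = stats.weibull_min.fit(waits, floc=0)
      _, scale_exp = stats.expon.fit(waits, floc=0)

      # Compare the fits by log-likelihood (higher is better).
      ll_wei = stats.weibull_min.logpdf(waits, k, 0, lam).sum()
      ll_exp = stats.expon.logpdf(waits, 0, scale_exp).sum()
      print(f"Weibull: shape={k:.3f}, scale={lam:.3f}, logL={ll_wei:.1f}")
      print(f"Exponential: scale={scale_exp:.3f}, logL={ll_exp:.1f}")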

  14. A Mixed Kijima Model Using the Weibull-Based Generalized Renewal Processes

    PubMed Central

    2015-01-01

    Generalized Renewal Processes are useful for approaching the rejuvenation of dynamical systems resulting from planned or unplanned interventions. We present new perspectives for the Generalized Renewal Processes in general and for the Weibull-based Generalized Renewal Processes in particular. In a departure from the existing literature, we present a mixed Generalized Renewal Processes approach involving Kijima Type I and II models, allowing one to infer the impact of distinct interventions on the performance of the system under study. The first and second theoretical moments of this model are introduced, as well as its maximum likelihood estimation and random sampling approaches. In order to illustrate the usefulness of the proposed Weibull-based Generalized Renewal Processes model, some real data sets involving improving, stable, and deteriorating systems are used. PMID:26197222
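
    The Kijima recursions combined by the mixed model admit a compact illustration. The Python sketch below (a toy under assumed inputs, not the paper's estimator) computes virtual ages for a sequence of sojourn times and a rejuvenation parameter q: Type I discounts only the latest sojourn, Type II discounts the whole accumulated age.

      def virtual_ages(sojourns, q, kijima_type=1):
          """sojourns: times between successive interventions; q in [0, 1]."""
          v, ages = 0.0, []
          for x in sojourns:
              if kijima_type == 1:
                  v = v + q * x        # Kijima Type I
              else:
                  v = q * (v + x)      # Kijima Type II
              ages.append(v)
          return ages

      print(virtual_ages([10, 8, 12], q=0.5, kijima_type=1))  # [5.0, 9.0, 15.0]
      print(virtual_ages([10, 8, 12], q=0.5, kijima_type=2))  # [5.0, 6.5, 9.25]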

  15. Global resilience analysis of water distribution systems.

    PubMed

    Diao, Kegong; Sweetapple, Chris; Farmani, Raziyeh; Fu, Guangtao; Ward, Sarah; Butler, David

    2016-12-01

    Evaluating and enhancing resilience in water infrastructure is a crucial step towards more sustainable urban water management. As a prerequisite to enhancing resilience, a detailed understanding is required of the inherent resilience of the underlying system. Differing from traditional risk analysis, here we propose a global resilience analysis (GRA) approach that shifts the objective from analysing multiple and unknown threats to analysing the more identifiable and measurable system responses to extreme conditions, i.e. potential failure modes. GRA aims to evaluate a system's resilience to a possible failure mode regardless of the causal threat(s) (known or unknown, external or internal). The method is applied to test the resilience of four water distribution systems (WDSs) with various features to three typical failure modes (pipe failure, excess demand, and substance intrusion). The study reveals that GRA provides an overview of a water system's resilience to various failure modes. For each failure mode, it identifies the range of corresponding failure impacts and reveals extreme scenarios (e.g. the complete loss of water supply with only 5% pipe failure, or still meeting 80% of demand despite over 70% of pipes failing). GRA also reveals that increased resilience to one failure mode may decrease resilience to another, and that increasing system capacity may delay the system's recovery in some situations. It is also shown that selecting an appropriate level of detail for hydraulic models is of great importance in resilience analysis. The method can be used as a comprehensive diagnostic framework to evaluate a range of interventions for improving system resilience in future studies.

  16. Likelihood analysis of earthquake focal mechanism distributions

    NASA Astrophysics Data System (ADS)

    Kagan, Yan Y.; Jackson, David D.

    2015-06-01

    In our paper published earlier we discussed forecasts of earthquake focal mechanism and ways to test the forecast efficiency. Several verification methods were proposed, but they were based on ad hoc, empirical assumptions, thus their performance is questionable. We apply a conventional likelihood method to measure the skill of earthquake focal mechanism orientation forecasts. The advantage of such an approach is that earthquake rate prediction can be adequately combined with focal mechanism forecast, if both are based on the likelihood scores, resulting in a general forecast optimization. We measure the difference between two double-couple sources as the minimum rotation angle that transforms one into the other. We measure the uncertainty of a focal mechanism forecast (the variability), and the difference between observed and forecasted orientations (the prediction error), in terms of these minimum rotation angles. To calculate the likelihood score we need to compare actual forecasts or occurrences of predicted events with the null hypothesis that the mechanism's 3-D orientation is random (or equally probable). For 3-D rotation the random rotation angle distribution is not uniform. To better understand the resulting complexities, we calculate the information (likelihood) score for two theoretical rotational distributions (Cauchy and von Mises-Fisher), which are used to approximate earthquake source orientation pattern. We then calculate the likelihood score for earthquake source forecasts and for their validation by future seismicity data. Several issues need to be explored when analyzing observational results: their dependence on forecast and data resolution, internal dependence of scores on forecasted angle and random variability of likelihood scores. Here, we propose a simple tentative solution but extensive theoretical and statistical analysis is needed.

  17. Sensitivity analysis of distributed volcanic source inversion

    NASA Astrophysics Data System (ADS)

    Cannavò, Flavio; Camacho, Antonio G.; González, Pablo J.; Puglisi, Giuseppe; Fernández, José

    2016-04-01

    A recently proposed algorithm (Camacho et al., 2011) claims to rapidly estimate magmatic sources from surface geodetic data without any a priori assumption about source geometry. The algorithm takes advantage of the fast calculation afforded by analytical models and adds the capability to model free-shape distributed sources. Assuming homogeneous elastic conditions, the approach can determine general geometrical configurations of pressure and/or density sources and/or sliding structures corresponding to prescribed values of anomalous density, pressure and slip. These source bodies are described as aggregations of elemental point sources for pressure, density and slip, and they fit the whole data set (keeping some 3D regularity conditions). Although some examples and applications have already been presented to demonstrate the ability of the algorithm to reconstruct a magma pressure source (e.g. Camacho et al., 2011; Cannavò et al., 2015), a systematic analysis of the sensitivity and reliability of the algorithm is still lacking. In this explorative work we present results from a large statistical test designed to evaluate the advantages and limitations of the methodology by assessing its sensitivity to the free and constrained parameters involved in inversions. In particular, besides the source parameters, we focused on the ground deformation network topology and noise in the measurements. The proposed analysis can be used for a better interpretation of the algorithm's results in real-case applications. Camacho, A. G., González, P. J., Fernández, J. & Berrino, G. (2011) Simultaneous inversion of surface deformation and gravity changes by means of extended bodies with a free geometry: Application to deforming calderas. J. Geophys. Res. 116. Cannavò F., Camacho A.G., González P.J., Mattia M., Puglisi G., Fernández J. (2015) Real Time Tracking of Magmatic Intrusions by means of Ground Deformation Modeling during Volcanic Crises, Scientific Reports, 5 (10970) doi:10.1038/srep

  18. Distributed Design and Analysis of Computer Experiments

    SciTech Connect

    Doak, Justin

    2002-11-11

    DDACE is a C++ object-oriented software library for the design and analysis of computer experiments. DDACE can be used to generate samples from a variety of sampling techniques. These samples may be used as input to an application code. DDACE also contains statistical tools, such as response surface models and correlation coefficients, to analyze input/output relationships between variables in an application code. DDACE can generate input values for uncertain variables within a user's application. For example, a user might like to vary a temperature variable as well as some material variables in a series of simulations. Through the series of simulations the user might be looking for optimal settings of parameters based on some user criteria, or the user may be interested in the sensitivity to input variability shown by an output variable. In either case, the user may provide information about the suspected ranges and distributions of a set of input variables, along with a sampling scheme, and DDACE will generate input points based on these specifications. The input values generated by DDACE and the one or more outputs computed through the user's application code can be analyzed with a variety of statistical methods. This can lead to a wealth of information about the relationships between the variables in the problem. While statistical and mathematical packages may be employed to carry out the analysis on the input/output relationships, DDACE also contains some tools for analyzing the simulation data. DDACE incorporates a software package called MARS (Multivariate Adaptive Regression Splines), developed by Jerome Friedman. MARS is used for generating a spline surface fit of the data. With MARS, a model simplification may be calculated using the input and corresponding output values for the user's application problem. The MARS grid data may be used for generating 3-dimensional response surface plots of the simulation data. DDACE also contains an implementation of an

  19. Survival Analysis of Patients with End Stage Renal Disease

    NASA Astrophysics Data System (ADS)

    Urrutia, J. D.; Gayo, W. S.; Bautista, L. A.; Baccay, E. B.

    2015-06-01

    This paper provides a survival analysis of End Stage Renal Disease (ESRD) using Kaplan-Meier estimates and the Weibull distribution. The data were obtained from the records of V. L. Makabali Memorial Hospital with respect to time t (patient's age), covariates such as developed secondary disease (pulmonary congestion and cardiovascular disease), gender, and the event of interest: the death of ESRD patients. Survival and hazard rates were estimated using NCSS for the Weibull distribution and SPSS for the Kaplan-Meier estimates. These lead to the same conclusion: the hazard rate increases and the survival rate decreases with respect to time for ESRD patients diagnosed with pulmonary congestion, cardiovascular disease, or both diseases. It also shows that female patients have a greater risk of death compared to males. The probability risk was given by the equation R = 1 - e^(-H(t)), where e^(-H(t)) is the survival function and H(t) the cumulative hazard function, which was created using Cox regression.
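
    The quantities above follow directly from a Weibull survival model. The Python sketch below uses hypothetical shape and scale parameters (not the fitted NCSS/SPSS values) to evaluate the cumulative hazard H(t), the survival function S(t) = e^(-H(t)), and the probability risk R(t) = 1 - e^(-H(t)).

      import numpy as np

      shape, scale = 1.8, 60.0   # hypothetical Weibull parameters (t in years)

      def cumulative_hazard(t):
          return (np.asarray(t) / scale) ** shape   # H(t) for a Weibull model

      def survival(t):
          return np.exp(-cumulative_hazard(t))      # S(t) = e^(-H(t))

      def risk(t):
          return 1.0 - survival(t)                  # R(t) = 1 - e^(-H(t))

      for t in (40, 60, 80):
          print(f"t={t}: H={cumulative_hazard(t):.3f}, S={survival(t):.3f}, R={risk(t):.3f}")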

  20. Distribution entropy analysis of epileptic EEG signals.

    PubMed

    Li, Peng; Yan, Chang; Karmakar, Chandan; Liu, Changchun

    2015-01-01

    It is an open-ended challenge to accurately detect epileptic seizures through electroencephalogram (EEG) signals. Recently published studies have made elaborate attempts to distinguish between normal and epileptic EEG signals by advanced nonlinear entropy methods, such as the approximate entropy, sample entropy, fuzzy entropy, and permutation entropy, etc. Most recently, a novel distribution entropy (DistEn) has been reported to have superior performance compared with the conventional entropy methods, especially for short-length data. We thus aimed, in the present study, to show the potential of DistEn in the analysis of epileptic EEG signals. The publicly accessible Bonn database, which consists of normal, interictal, and ictal EEG signals, was used in this study. Three different measurement protocols were set for better understanding the performance of DistEn: i) calculate the DistEn of a specific EEG signal using the full recording; ii) calculate the DistEn by averaging the results for all its possible non-overlapped 5 second segments; and iii) calculate it by averaging the DistEn values for all the possible non-overlapped segments of 1 second length, respectively. Results for all three protocols indicated a statistically significantly increased DistEn for the ictal class compared with both the normal and interictal classes. Besides, the results obtained under the third protocol, which only used very short segments (1 s) of EEG recordings, showed a significantly (p < 0.05) increased DistEn for the interictal class in comparison with the normal class, whereas both analyses using relatively long EEG signals failed in tracking this difference between them, which may be due to a nonstationarity effect on the entropy algorithm. The capability of discriminating between the normal and interictal EEG signals is of great clinical relevance since it may provide helpful tools for the detection of a seizure onset. Therefore, our study suggests that the DistEn

  1. A Wigner Distribution Analysis of Scattering Dynamics

    NASA Astrophysics Data System (ADS)

    Weeks, David; Lacy, Brent

    2009-04-01

    Using the time dependent Channel Packet Method (CPM) [D. E. Weeks, T. A. Niday, S. H. Yang, J. Chem. Phys. 125, 164301 (2006)], a Fourier transformation of the correlation function between evolving wave packets is used to compute scattering matrix elements. The correlation function can also be used to compute a Wigner distribution as a function of time and energy. This scattering Wigner distribution is then used to investigate times at which various energetic contributions to the scattering matrix are made during a molecular collision. We compute scattering Wigner distributions for a variety of molecular systems and use them to characterize the associated molecular dynamics. In particular, the square well provides a simple and easily modified potential to study the relationship between the scattering Wigner distribution and wave packet dynamics. Additional systems that are being studied include the collinear H + H2 molecular reaction, and the non-adiabatic B + H2 molecular collision.

  2. Introduction to Pair Distribution Function Analysis

    SciTech Connect

    King, Graham Missell

    2015-02-17

    By collecting a total scattering pattern, subtracting the non-sample background, applying corrections, and taking the Fourier transform, the real space pair distribution function can be obtained. A PDF gives the distribution of inter-atomic distances in a material and is an excellent probe of short and intermediate range structure. RMC refinements using multiple data types are an excellent method for multi-scale modeling, including the mesoscale range.

  3. A Weibull-PBPK model for assessing risk of arsenic-induced skin lesions in children.

    PubMed

    Liao, Chung-Min; Lin, Tzu-Ling; Chen, Szu-Chieh

    2008-03-25

    Chronic arsenic exposure and skin lesions (keratosis and hyperpigmentation) are inextricably linked. The aim of this paper was to quantify children's skin lesion risks and to recommend a safe drinking water arsenic standard based on reported arsenic epidemiological data. We linked the Weibull dose-response function and a physiologically based pharmacokinetic (PBPK) model to estimate safe drinking water arsenic concentrations and to perform the risk characterization. We calculated odds ratios (ORs) to assess the relative magnitude of the effect of arsenic exposure on the likelihood of the prevalence of children's skin lesions, by calculating the proposed Weibull-based prevalence ratios of exposed to control groups associated with the age group-specific PBPK model predicted dimethylarsinite (MMA(III)) levels in urine. Positive relationships between arsenic exposures and cumulative prevalence ratios of skin lesions were found using the Weibull dose-response model (r^2 = 0.91-0.96). We reported that the recommended safe drinking water arsenic standards were 2.2 and 1 microg/L for males and 6 and 2.8 microg/L for females in the 0-6 and 7-18 years age groups, respectively, based on hyperpigmentation with an excess risk of 10^-3 for a 75-year lifetime exposure. Risk predictions indicate that estimated ORs have 95% confidence intervals of 1.33-5.12, 1.74-19.15, and 2.81-19.27 based on mean drinking water arsenic contents of 283.19, 282.65, and 468.81 microg/L in West Bengal, India, Bangladesh, and southwestern Taiwan, respectively. Our findings also suggest that increasing urinary monomethylarsonic acid (MMA) levels are associated with an increase in the risk of arsenic-induced children's skin lesions.
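
    A Weibull dose-response function of the kind linked to the PBPK model above can be sketched as follows in Python; the functional form P(d) = 1 - exp(-(d/lam)**k) is a standard Weibull dose-response assumption, and the shape and scale values are illustrative placeholders, not the values fitted in the paper.

      import numpy as np

      def weibull_prevalence(dose, k, lam):
          # Cumulative prevalence P(d) = 1 - exp(-(d / lam)**k)
          return 1.0 - np.exp(-(np.asarray(dose) / lam) ** k)

      doses = np.array([1.0, 10.0, 50.0, 100.0, 300.0])   # hypothetical ug/L
      print(weibull_prevalence(doses, k=1.2, lam=500.0))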

  4. Analysis of Temperature Distributions in Nighttime Inversions

    NASA Astrophysics Data System (ADS)

    Telyak, Oksana; Krasouski, Aliaksandr; Svetashev, Alexander; Turishev, Leonid; Barodka, Siarhei

    2015-04-01

    Adequate prediction of temperature inversions in the atmospheric boundary layer is one of the prerequisites for successful forecasting of meteorological parameters and severe weather events. Examples include surface air temperature and precipitation forecasting as well as prediction of fog, frosts and smog with hazardous levels of atmospheric pollution. At the same time, reliable forecasting of temperature inversions remains an unsolved problem. For prediction of nighttime inversions over a specific territory, it is important to study the characteristic features of local circulation cell formation and to properly take local factors into account when developing custom modeling techniques for operational use. The present study aims to investigate and analyze vertical temperature distributions in tropospheric inversions (isotherms) over the territory of Belarus. We study several specific cases of formation, evolution and decay of deep nighttime temperature inversions in Belarus by means of mesoscale numerical simulations with the WRF model, considering basic mechanisms of isothermal and inverse temperature layer formation in the troposphere and the impact of these layers on local circulation cells. Our primary goal is to assess the feasibility of advance prediction of inversion formation with WRF. Modeling results reveal that all cases under consideration have characteristic features of radiative inversions (e.g., their formation times, development phases, inversion intensities, etc.). Regions of "blocking" layer formation are extensive and often spread over the entire territory of Belarus. Inversion decay starts from the lowermost (near surface) layer (altitudes of 5 to 50 m). In all cases, one can observe the formation of temperature gradients that substantially differ from the basic inversion gradient, i.e. the layer splits into smaller layers, each having a different temperature stratification (isothermal, adiabatic, etc.). As opposed to various empirical techniques as well as

  5. Analysis of the chaotic maps generating different statistical distributions

    NASA Astrophysics Data System (ADS)

    Lawnik, M.

    2015-09-01

    An analysis of chaotic maps enabling the derivation of numbers from given statistical distributions is presented. The analyzed chaotic maps have the form x_{k+1} = F^(-1)(U(F(x_k))), where F is the cumulative distribution function, U is the skew tent map, and F^(-1) is the inverse function of F. The analysis is presented using the example of a chaotic map with the standard normal distribution, in view of its computational efficiency and accuracy. On the grounds of the conducted analysis, it should be noted that the method does not always generate values from the given distribution.
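
    The construction is straightforward to implement. The Python sketch below applies the map with F the standard normal CDF and U a skew tent map whose break point p is chosen arbitrarily; it illustrates the method rather than reproducing the paper's code.

      from scipy.stats import norm

      def skew_tent(u, p=0.7):
          return u / p if u < p else (1.0 - u) / (1.0 - p)

      def chaotic_normal(x0=0.1, n=10000, p=0.7):
          x, out = x0, []
          for _ in range(n):
              u = skew_tent(norm.cdf(x), p)
              u = min(max(u, 1e-12), 1.0 - 1e-12)  # guard against the endpoints
              x = norm.ppf(u)
              out.append(x)
          return out

      samples = chaotic_normal()
      # If the map mixes well, samples should be approximately standard normal.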

  6. Development of an Analysis System for Low Voltage Distribution System

    NASA Astrophysics Data System (ADS)

    Matsuda, Katsuhiro; Wada, Masaru; Hirano, Shinichiro; Hirai, Yoshihiro; Tsuboe, Yasuhiro; Watanabe, Masahiro; Furukawa, Toshiyuki

    In recent years, distributed resources such as photovoltaic power generation systems and wind-turbine generator systems have increased, and hence the number of distributed resources connected to distribution networks is growing gradually. Under this situation there are several problems to consider, such as larger voltage fluctuations, increased short-circuit currents, and increased harmonics, and these problems make it difficult to examine the effect of interconnection and to design the distribution system. However, an analysis support system to evaluate the influence of connecting distributed resources to low voltage distribution systems had not been developed. We have therefore developed an analysis system for low voltage distribution systems. Using this analysis system, we can evaluate the influence of distributed resources accurately, examine interconnections, and design the configuration of distribution networks. In this paper, we describe the concept of the analysis system, the load flow method for unbalanced V-connection 3-phase 4-line distribution systems, and the calculation method for the connectable capacity of distributed resources. An outline of the man/machine interface and examples of calculation results for a sample network are also described.

  7. Harmonic analysis of electrical distribution systems

    SciTech Connect

    1996-03-01

    This report presents data pertaining to research on harmonics of electric power distribution systems. Harmonic data are presented on RMS and average measurements for determination of harmonics in buildings; fluorescent ballasts; variable frequency drives; Georator Geosine harmonic data; uninterruptible power supplies; delta-wye transformers; Westinghouse Suresine; Liebert Datawave; and active injection mode filter data.

  8. A mixture Weibull proportional hazard model for mechanical system failure prediction utilising lifetime and monitoring data

    NASA Astrophysics Data System (ADS)

    Zhang, Qing; Hua, Cheng; Xu, Guanghua

    2014-02-01

    As mechanical systems increase in complexity, it is becoming more and more common to observe multiple failure modes. The system failure can be regarded as the result of interaction and competition between different failure modes. It is therefore necessary to combine multiple failure modes when analysing the failure of an overall system. In this paper, a mixture Weibull proportional hazard model (MWPHM) is proposed to predict the failure of a mechanical system with multiple failure modes. The mixed model parameters are estimated by combining historical lifetime and monitoring data of all failure modes. In addition, the system failure probability density is obtained by proportionally mixing the failure probability density of multiple failure modes. Monitoring data are input into the MWPHM to estimate the system reliability and predict the system failure time. A simulated sample set is used to verify the ability of the MWPHM to model multiple failure modes. Finally, the MWPHM and the traditional Weibull proportional hazard model (WPHM) are applied to a high-pressure water descaling pump, which has two failure modes: sealing ring wear and thrust bearing damage. Results show that the MWPHM is greatly superior in system failure prediction to the WPHM.
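
    The proportional mixing of failure-mode densities can be sketched as follows in Python; the weights, shapes, and scales are hypothetical stand-ins for the two failure modes of the pump, and the system hazard follows as h(t) = f(t)/S(t).

      import numpy as np
      from scipy.stats import weibull_min

      modes = [  # (weight, shape, scale) per failure mode -- hypothetical values
          (0.6, 2.5, 1200.0),   # e.g. sealing ring wear
          (0.4, 1.4,  900.0),   # e.g. thrust bearing damage
      ]

      def mixture_pdf(t):
          return sum(w * weibull_min.pdf(t, c, scale=s) for w, c, s in modes)

      def mixture_sf(t):
          return sum(w * weibull_min.sf(t, c, scale=s) for w, c, s in modes)

      def mixture_hazard(t):
          return mixture_pdf(t) / mixture_sf(t)

      print(mixture_hazard(np.linspace(10.0, 2000.0, 5)))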

  9. Progressive failure site generation in AlGaN/GaN high electron mobility transistors under OFF-state stress: Weibull statistics and temperature dependence

    SciTech Connect

    Sun, Huarui; Bajo, Miguel Montes; Uren, Michael J.; Kuball, Martin

    2015-01-26

    Gate leakage degradation of AlGaN/GaN high electron mobility transistors under OFF-state stress is investigated using a combination of electrical, optical, and surface morphology characterizations. The generation of leakage “hot spots” at the edge of the gate is found to be strongly temperature accelerated. The time for the formation of each failure site follows a Weibull distribution with a shape parameter in the range of 0.7–0.9 from room temperature up to 120 °C. The average leakage per failure site is only weakly temperature dependent. The stress-induced structural degradation at the leakage sites exhibits a temperature dependence in the surface morphology, which is consistent with a surface defect generation process involving temperature-associated changes in the breakdown sites.
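
    Extracting a Weibull shape parameter from a set of failure times is a standard maximum likelihood fit. The Python sketch below does this on synthetic data; it illustrates the statistics involved and is not the authors' analysis.

      import numpy as np
      from scipy.stats import weibull_min

      rng = np.random.default_rng(1)
      times = weibull_min.rvs(0.8, scale=100.0, size=200, random_state=rng)

      shape, loc, scale = weibull_min.fit(times, floc=0)  # pin location at 0
      print(f"estimated shape = {shape:.2f} (true 0.8), scale = {scale:.1f}")
      # shape < 1 indicates a decreasing rate of new failure-site formation.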

  10. Integer sparse distributed memory: analysis and results.

    PubMed

    Snaider, Javier; Franklin, Stan; Strain, Steve; George, E Olusegun

    2013-10-01

    Sparse distributed memory is an auto-associative memory system that stores high dimensional Boolean vectors. Here we present an extension of the original SDM, the Integer SDM that uses modular arithmetic integer vectors rather than binary vectors. This extension preserves many of the desirable properties of the original SDM: auto-associativity, content addressability, distributed storage, and robustness over noisy inputs. In addition, it improves the representation capabilities of the memory and is more robust over normalization. It can also be extended to support forgetting and reliable sequence storage. We performed several simulations that test the noise robustness property and capacity of the memory. Theoretical analyses of the memory's fidelity and capacity are also presented.

  11. Economic analysis of efficient distribution transformer trends

    SciTech Connect

    Downing, D.J.; McConnell, B.W.; Barnes, P.R.; Hadley, S.W.; Van Dyke, J.W.

    1998-03-01

    This report outlines an approach that will account for uncertainty in the development of evaluation factors used to identify transformer designs with the lowest total owning cost (TOC). The TOC methodology is described and the most highly variable parameters are discussed. The model is developed to account for uncertainties as well as statistical distributions for the important parameters. Sample calculations are presented. The TOC methodology is applied to data provided by two utilities in order to test its validity.

  12. Distributional Analysis of Inventory Demand over Leadtime.

    DTIC Science & Technology

    1982-06-01

    is the sample standard deviation of the natural log of all non-zero demand observations. An alternate method for estimating the mean and standard deviation of the lognormal distribution is to use the method of moments. The procedure, in effect, transforms the mean and standard deviation obtained from the untransformed sample data rather than transforming the data first and then computing the sample mean and standard deviation. This method involves

  13. Microbubble Size Distributions Data Collection and Analysis

    DTIC Science & Technology

    2016-06-13

    Properties of micron-sized bubble aggregates in sea water were investigated to determine their influence on the ... problem during this study. This paper will discuss bubble size and size distribution measurements in sea water while underway. A technique to detect ... plugged in. The internal gear mechanism cycles the strobe and film advance approximately every 5 seconds. The camera continually sampled until the

  14. A Comparison of Distribution Free and Non-Distribution Free Factor Analysis Methods

    ERIC Educational Resources Information Center

    Ritter, Nicola L.

    2012-01-01

    Many researchers recognize that factor analysis can be conducted on both correlation matrices and variance-covariance matrices. Although most researchers extract factors from non-distribution free or parametric methods, researchers can also extract factors from distribution free or non-parametric methods. The nature of the data dictates the method…

  15. Intensity distribution analysis of cathodoluminescence using the energy loss distribution of electrons.

    PubMed

    Fukuta, Masahiro; Inami, Wataru; Ono, Atsushi; Kawata, Yoshimasa

    2016-01-01

    We present an intensity distribution analysis of cathodoluminescence (CL) excited with a focused electron beam in a luminescent thin film. The energy loss distribution is applied in the developed analysis method in order to determine the arrangement of dipole locations along the path of the electron traveling in the film. Propagating light emitted from each dipole is analyzed with the finite-difference time-domain (FDTD) method. The CL distribution near the film surface is evaluated as a nanometric light source. It is found that a light source with a width of 30 nm is generated in the film by the focused electron beam. We also discuss the accuracy of the developed analysis method by comparison with experimental results. The analysis results are brought into good agreement with the experimental results by introducing the energy loss distribution.

  16. Simulation of probability distributions commonly used in hydrological frequency analysis

    NASA Astrophysics Data System (ADS)

    Cheng, Ke-Sheng; Chiang, Jie-Lun; Hsu, Chieh-Wei

    2007-01-01

    Random variable simulation has been applied to many applications in hydrological modelling, flood risk analysis, environmental impact assessment, etc. However, computer codes for simulation of distributions commonly used in hydrological frequency analysis are not available in most software libraries. This paper presents a frequency-factor-based method for random number generation for five distributions (normal, log-normal, extreme-value type I, Pearson type III and log-Pearson type III) commonly used in hydrological frequency analysis. The proposed method is shown to produce random numbers of the desired distributions through three means of validation: (1) graphical comparison of cumulative distribution functions (CDFs) and empirical CDFs derived from generated data; (2) properties of estimated parameters; (3) type I error of a goodness-of-fit test. An advantage of the method is that it does not require CDF inversion, and the frequency factors of the five commonly used distributions involve only the standard normal deviate.
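
    For the extreme-value type I (Gumbel) case, the frequency-factor construction can be sketched as below in Python; the EV1 frequency factor K follows from standardizing the Gumbel quantile, and the mean and standard deviation are assumed inputs.

      import numpy as np

      def gumbel_frequency_factor(p):
          """Frequency factor K for non-exceedance probability p (EV1)."""
          return -(np.sqrt(6.0) / np.pi) * (0.5772 + np.log(-np.log(p)))

      def simulate_ev1(mean, std, n, seed=0):
          rng = np.random.default_rng(seed)
          p = rng.uniform(1e-9, 1.0 - 1e-9, size=n)
          return mean + gumbel_frequency_factor(p) * std

      x = simulate_ev1(mean=100.0, std=25.0, n=100000)
      print(x.mean(), x.std())   # should be close to 100 and 25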

  17. Time-dependent reliability analysis of ceramic engine components

    NASA Technical Reports Server (NTRS)

    Nemeth, Noel N.

    1993-01-01

    The computer program CARES/LIFE calculates the time-dependent reliability of monolithic ceramic components subjected to thermomechanical and/or proof test loading. This program is an extension of the CARES (Ceramics Analysis and Reliability Evaluation of Structures) computer program. CARES/LIFE accounts for the phenomenon of subcritical crack growth (SCG) by utilizing either the power or Paris law relations. The two-parameter Weibull cumulative distribution function is used to characterize the variation in component strength. The effects of multiaxial stresses are modeled using either the principle of independent action (PIA), the Weibull normal stress averaging method (NSA), or the Batdorf theory. Inert strength and fatigue parameters are estimated from rupture strength data of naturally flawed specimens loaded in static, dynamic, or cyclic fatigue. Two example problems demonstrating proof testing and fatigue parameter estimation are given.
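
    The two-parameter Weibull strength model at the core of this approach gives the failure probability at a uniaxial stress sigma as P_f = 1 - exp(-(sigma/sigma_0)^m). The Python sketch below evaluates this with an illustrative Weibull modulus m and characteristic strength sigma_0, not parameters estimated by CARES/LIFE.

      import numpy as np

      m, sigma0 = 10.0, 450.0    # hypothetical Weibull modulus and scale (MPa)

      def failure_probability(sigma):
          return 1.0 - np.exp(-(np.asarray(sigma) / sigma0) ** m)

      print(failure_probability([300.0, 400.0, 450.0, 500.0]))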

  18. Performance analysis of static locking in replicated distributed database systems

    NASA Technical Reports Server (NTRS)

    Kuang, Yinghong; Mukkamala, Ravi

    1991-01-01

    Data replication and transaction deadlocks can severely affect the performance of distributed database systems. Many current evaluation techniques ignore these aspects, because they are difficult to evaluate through analysis and time consuming to evaluate through simulation. A technique is used that combines simulation and analysis to closely illustrate the impact of deadlock and to evaluate the performance of replicated distributed databases with both shared and exclusive locks.

  19. Performance analysis of static locking in replicated distributed database systems

    NASA Technical Reports Server (NTRS)

    Kuang, Yinghong; Mukkamala, Ravi

    1991-01-01

    Data replication and transaction deadlocks can severely affect the performance of distributed database systems. Many current evaluation techniques ignore these aspects, because they are difficult to evaluate through analysis and time consuming to evaluate through simulation. Here, a technique is discussed that combines simulation and analysis to closely illustrate the impact of deadlock and to evaluate the performance of replicated distributed databases with both shared and exclusive locks.

  20. Effect of Porosity on Strength Distribution of Microcrystalline Cellulose.

    PubMed

    Keleṣ, Özgür; Barcenas, Nicholas P; Sprys, Daniel H; Bowman, Keith J

    2015-12-01

    The fracture strength of pharmaceutical compacts varies even for nominally identical samples, which directly affects compaction, comminution, and tablet dosage forms. However, the relationships between porosity and the mechanical behavior of compacts are not clear. Here, the effects of porosity on fracture strength and fracture statistics of microcrystalline cellulose compacts were investigated through diametral compression tests. The Weibull modulus, a key parameter in Weibull statistics, was observed to decrease with increasing porosity from 17 to 56 vol.%, based on eight sets of compacts at different porosity levels, each set containing ∼50 samples, for a total of 407 tests. The normal distribution fits the fracture data better for porosity less than 20 vol.%, whereas the Weibull distribution is a better fit in the limit of highest porosity. Weibull moduli from 840 unique finite element simulations of isotropic porous materials were compared to the experimental Weibull moduli from this research and to results on various pharmaceutical materials. Deviations from Weibull statistics are observed. The effect of porosity on fracture strength can be described by a recently proposed micromechanics-based formula.
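
    A common way to estimate the Weibull modulus from a set of strengths, sketched below in Python on synthetic data, is linear regression on the Weibull plot with median-rank plotting positions F_i = (i - 0.3)/(n + 0.4); this illustrates the statistic discussed above rather than reproducing the paper's analysis.

      import numpy as np

      rng = np.random.default_rng(2)
      strengths = np.sort(rng.weibull(8.0, size=50) * 5.0)   # synthetic (MPa)

      n = len(strengths)
      ranks = (np.arange(1, n + 1) - 0.3) / (n + 0.4)        # median ranks
      x = np.log(strengths)
      y = np.log(-np.log(1.0 - ranks))

      m, intercept = np.polyfit(x, y, 1)   # slope estimates the Weibull modulus
      print(f"estimated Weibull modulus m = {m:.2f} (true 8.0)")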

  1. Modeling and analysis of solar distributed generation

    NASA Astrophysics Data System (ADS)

    Ortiz Rivera, Eduardo Ivan

    power point tracking algorithms. Finally, several PV power applications will be presented, such as circuit analysis for a load connected to two different PV arrays, speed control for a dc motor connected to a PVM, and a novel single-phase photovoltaic inverter system using the Z-source converter.

  2. Adaptive walks and distribution of beneficial fitness effects.

    PubMed

    Seetharaman, Sarada; Jain, Kavita

    2014-04-01

    We study the adaptation dynamics of a maladapted asexual population on rugged fitness landscapes with many local fitness peaks. The distribution of beneficial fitness effects is assumed to belong to one of the three extreme value domains, viz. Weibull, Gumbel, and Fréchet. We work in the strong selection-weak mutation regime in which beneficial mutations fix sequentially, and the population performs an uphill walk on the fitness landscape until a local fitness peak is reached. A striking prediction of our analysis is that the fitness difference between successive steps follows a pattern of diminishing returns in the Weibull domain and accelerating returns in the Fréchet domain, as the initial fitness of the population is increased. These trends are found to be robust with respect to fitness correlations. We believe that this result can be exploited in experiments to determine the extreme value domain of the distribution of beneficial fitness effects. Our work here differs significantly from the previous ones that assume the selection coefficient to be small. On taking large effect mutations into account, we find that the length of the walk shows different qualitative trends from those derived using small selection coefficient approximation.

  3. ANALYSIS OF DISTRIBUTION FEEDER LOSSES DUE TO ADDITION OF DISTRIBUTED PHOTOVOLTAIC GENERATORS

    SciTech Connect

    Tuffner, Francis K.; Singh, Ruchi

    2011-08-09

    Distributed generators (DG) are small scale power supplying sources owned by customers or utilities and scattered throughout the power system distribution network. Distributed generation can be both renewable and non-renewable. Addition of distributed generation is primarily to increase feeder capacity and to provide peak load reduction. However, this addition comes with several impacts on the distribution feeder. Several studies have shown that addition of DG leads to reduction of feeder loss. However, most of these studies have considered lumped load and distributed load models to analyze the effects on system losses, where the dynamic variation of load due to seasonal changes is ignored. It is very important for utilities to minimize the losses under all scenarios to decrease revenue losses, promote efficient asset utilization, and therefore, increase feeder capacity. This paper will investigate an IEEE 13-node feeder populated with photovoltaic generators on detailed residential houses with water heater, Heating Ventilation and Air conditioning (HVAC) units, lights, and other plug and convenience loads. An analysis of losses for different power system components, such as transformers, underground and overhead lines, and triplex lines, will be performed. The analysis will utilize different seasons and different solar penetration levels (15%, 30%).

  4. First Experiences with LHC Grid Computing and Distributed Analysis

    SciTech Connect

    Fisk, Ian

    2010-12-01

    This presentation described the experiences of the LHC experiments using grid computing, with a focus on distributed analysis. After many years of development, preparation, exercises, and validation, the LHC (Large Hadron Collider) experiments are in operation. The computing infrastructure has been heavily utilized in the first 6 months of data collection. The general experience of exploiting the grid infrastructure for organized processing and preparation is described, as well as the successes in employing the infrastructure for distributed analysis. At the end, the expected evolution and future plans are outlined.

  5. Discriminating topology in galaxy distributions using network analysis

    NASA Astrophysics Data System (ADS)

    Hong, Sungryong; Coutinho, Bruno C.; Dey, Arjun; Barabási, Albert-L.; Vogelsberger, Mark; Hernquist, Lars; Gebhardt, Karl

    2016-07-01

    The large-scale distribution of galaxies is generally analysed using the two-point correlation function. However, this statistic does not capture the topology of the distribution, and it is necessary to resort to higher order correlations to break degeneracies. We demonstrate that an alternate approach using network analysis can discriminate between topologically different distributions that have similar two-point correlations. We investigate two galaxy point distributions, one produced by a cosmological simulation and the other by a Lévy walk. For the cosmological simulation, we adopt the redshift z = 0.58 slice from Illustris and select galaxies with stellar masses greater than 10^8 M⊙. The two-point correlation function of these simulated galaxies follows a single power law, ξ(r) ~ r^-1.5. Then, we generate Lévy walks matching the correlation function and abundance of the simulated galaxies. We find that, while the two simulated galaxy point distributions have the same abundance and two-point correlation function, their spatial distributions are very different; most prominently, the filamentary structures are absent in the Lévy fractals. To quantify these missing topologies, we adopt network analysis tools and measure the diameter, giant component, and transitivity of networks built by a conventional friends-of-friends recipe with various linking lengths. Unlike the abundance and two-point correlation function, these network quantities reveal a clear separation between the two simulated distributions; therefore, the galaxy distribution simulated by Illustris is quantitatively not a Lévy fractal. We find that the described network quantities offer an efficient tool for discriminating topologies and for comparing observed and theoretical distributions.
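
    The friends-of-friends network recipe can be sketched in Python with a k-d tree and networkx, as below; the random point set, box size, and linking length are toy assumptions, not the Illustris data.

      import numpy as np
      import networkx as nx
      from scipy.spatial import cKDTree

      rng = np.random.default_rng(3)
      points = rng.uniform(0.0, 100.0, size=(500, 3))   # toy 3-D positions

      # Link every pair of points closer than the chosen linking length.
      G = nx.Graph()
      G.add_nodes_from(range(len(points)))
      G.add_edges_from(cKDTree(points).query_pairs(r=12.0))

      giant = max(nx.connected_components(G), key=len)
      print("transitivity:", nx.transitivity(G))
      print("giant component fraction:", len(giant) / len(points))
      print("diameter of giant component:", nx.diameter(G.subgraph(giant)))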

  6. A description of the failure distributions of selected Minuteman 3 guidance system electronic cards

    NASA Astrophysics Data System (ADS)

    Sisk, Albert E.

    1986-09-01

    This research described the failure distributions of selected Minuteman 3 guidance electronic cards and was the first attempt to use the Total Time on Test graphical technique to detect failure patterns. The data analysis was performed using a Zenith 100 computer program that performed the Total Time on Test calculations and the Reliability Data Acquisition and Analysis Techniques software package. The objectives of the research were to: (1) describe the failure distributions of selected Minuteman 3 electronic cards, (2) determine whether the corresponding hazard function demonstrated infant mortality, useful life, or wearout, and (3) suggest management strategies to deal with wearout or infant mortality. Five individual cards were selected and the first three lifetimes of each card were examined. Nine of the fifteen cards indicated an exponential failure distribution; the other six were identified as either a Weibull or a normal failure distribution.

  7. Distribution-free mediation analysis for nonlinear models with confounding.

    PubMed

    Albert, Jeffrey M

    2012-11-01

    Recently, researchers have used a potential-outcome framework to estimate causally interpretable direct and indirect effects of an intervention or exposure on an outcome. One approach to causal-mediation analysis uses the so-called mediation formula to estimate the natural direct and indirect effects. This approach generalizes the classical mediation estimators and allows for arbitrary distributions for the outcome variable and mediator. A limitation of the standard (parametric) mediation formula approach is that it requires a specified mediator regression model and distribution; such a model may be difficult to construct and may not be of primary interest. To address this limitation, we propose a new method for causal-mediation analysis that uses the empirical distribution function, thereby avoiding parametric distribution assumptions for the mediator. To adjust for confounders of the exposure-mediator and exposure-outcome relationships, inverse-probability weighting is incorporated based on a supplementary model of the probability of exposure. This method, which yields the estimates of the natural direct and indirect effects for a specified reference group, is applied to data from a cohort study of dental caries in very-low-birth-weight adolescents to investigate the oral-hygiene index as a possible mediator. Simulation studies show low bias in the estimation of direct and indirect effects in a variety of distribution scenarios, whereas the standard mediation formula approach can be considerably biased when the distribution of the mediator is incorrectly specified.

  8. Bayesian analysis of a disability model for lung cancer survival.

    PubMed

    Armero, C; Cabras, S; Castellanos, M E; Perra, S; Quirós, A; Oruezábal, M J; Sánchez-Rubio, J

    2016-02-01

    Bayesian reasoning, survival analysis and multi-state models are used to assess survival times for Stage IV non-small-cell lung cancer patients and the evolution of the disease over time. Bayesian estimation is done using minimum informative priors for the Weibull regression survival model, leading to an automatic inferential procedure. Markov chain Monte Carlo methods have been used for approximating posterior distributions and the Bayesian information criterion has been considered for covariate selection. In particular, the posterior distribution of the transition probabilities, resulting from the multi-state model, constitutes a very interesting tool which could be useful to help oncologists and patients make efficient and effective decisions.

  9. Global NLO Analysis of Nuclear Parton Distribution Functions

    SciTech Connect

    Hirai, M.; Kumano, S.; Nagai, T.-H.

    2008-02-21

    Nuclear parton distribution functions (NPDFs) are determined by a global analysis of experimental measurements on structure-function ratios F_2^A/F_2^A' and Drell-Yan cross section ratios σ_DY^A/σ_DY^A', and their uncertainties are estimated by the Hessian method. The NPDFs are obtained in both leading order (LO) and next-to-leading order (NLO) of α_s. As a result, valence-quark distributions are relatively well determined, whereas antiquark distributions at x > 0.2 and gluon distributions in the whole x region have large uncertainties. The NLO uncertainties are slightly smaller than the LO ones; however, the NLO improvement is not as significant as in the nucleonic case.

  10. Molecular Isotopic Distribution Analysis (MIDAs) with Adjustable Mass Accuracy

    NASA Astrophysics Data System (ADS)

    Alves, Gelio; Ogurtsov, Aleksey Y.; Yu, Yi-Kuo

    2014-01-01

    In this paper, we present Molecular Isotopic Distribution Analysis (MIDAs), a new software tool designed to compute molecular isotopic distributions with adjustable accuracies. MIDAs offers two algorithms, one polynomial-based and one Fourier-transform-based, both of which compute molecular isotopic distributions accurately and efficiently. The polynomial-based algorithm contains a few novel aspects, whereas the Fourier-transform-based algorithm consists mainly of improvements to other existing Fourier-transform-based algorithms. We have benchmarked the performance of the two algorithms implemented in MIDAs against that of eight software packages (BRAIN, Emass, Mercury, Mercury5, NeutronCluster, Qmass, JFC, IC) using a consensus set of benchmark molecules. Under the proposed evaluation criteria, MIDAs's algorithms, JFC, and Emass compute the coarse-grained (low-resolution) isotopic distributions with comparable accuracy and are more accurate than the other software packages. For fine-grained isotopic distributions, we compared IC, MIDAs's polynomial algorithm, and MIDAs's Fourier transform algorithm. Among the three, IC and MIDAs's polynomial algorithm compute isotopic distributions that better resemble their corresponding exact fine-grained (high-resolution) isotopic distributions. MIDAs can be accessed freely through a user-friendly web interface at http://www.ncbi.nlm.nih.gov/CBBresearch/Yu/midas/index.html.
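
    At coarse (nominal mass) resolution, the polynomial view of isotopic distributions reduces to repeated convolution of per-element isotope abundance vectors. The Python sketch below illustrates that idea on benzene; it is a simplified stand-in for MIDAs's algorithms, and the abundances are approximate.

      import numpy as np

      # Isotope abundance vectors indexed by extra neutrons (+0, +1, ...).
      CARBON   = np.array([0.9893, 0.0107])
      HYDROGEN = np.array([0.99988, 0.00012])

      def molecule_distribution(composition):
          """composition: list of (abundance_vector, atom_count) pairs."""
          dist = np.array([1.0])
          for abund, count in composition:
              for _ in range(count):
                  dist = np.convolve(dist, abund)
          # A fast implementation would use repeated squaring or FFTs instead.
          return dist

      # Benzene, C6H6, coarse-grained to nominal mass shifts:
      print(molecule_distribution([(CARBON, 6), (HYDROGEN, 6)])[:4])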

  11. Analyzing Distributed Functions in an Integrated Hazard Analysis

    NASA Technical Reports Server (NTRS)

    Morris, A. Terry; Massie, Michael J.

    2010-01-01

    Large scale integration of today's aerospace systems is achievable through the use of distributed systems. Validating the safety of distributed systems is significantly more difficult as compared to centralized systems because of the complexity of the interactions between simultaneously active components. Integrated hazard analysis (IHA), a process used to identify unacceptable risks and to provide a means of controlling them, can be applied to either centralized or distributed systems. IHA, though, must be tailored to fit the particular system being analyzed. Distributed systems, for instance, must be analyzed for hazards in terms of the functions that rely on them. This paper will describe systems-oriented IHA techniques (as opposed to traditional failure-event or reliability techniques) that should be employed for distributed systems in aerospace environments. Special considerations will be addressed when dealing with specific distributed systems such as active thermal control, electrical power, command and data handling, and software systems (including the interaction with fault management systems). Because of the significance of second-order effects in large scale distributed systems, the paper will also describe how to analyze secondary functions to secondary functions through the use of channelization.

  12. Energy loss analysis of an integrated space power distribution system

    NASA Technical Reports Server (NTRS)

    Kankam, M. D.; Ribeiro, P. F.

    1992-01-01

    The results of studies related to conceptual topologies of an integrated utility-like space power system are described. The system topologies are comparatively analyzed by considering their transmission energy losses as functions of, mainly, distribution voltage level and load composition. The analysis is expedited by use of the Distribution System Analysis and Simulation (DSAS) software. This computer program, recently developed by the Electric Power Research Institute (EPRI), uses improved load models to solve the power flow within the system. However, present shortcomings of the software with regard to space applications, and incompletely defined characteristics of a space power system, make the results applicable only to the fundamental trends of energy losses of the topologies studied. The included accounting for the effects of the various parameters on system performance can constitute part of a planning tool for a space power distribution system.

  13. GIS-based poverty and population distribution analysis in China

    NASA Astrophysics Data System (ADS)

    Cui, Jing; Wang, Yingjie; Yan, Hong

    2009-07-01

    Geographically, poverty status is not only related to socio-economic factors but is also strongly affected by the geographical environment. In this paper, a GIS-based poverty and population distribution analysis method is introduced to reveal their regional differences. More than 100,000 poor villages and 592 national key poor counties are chosen for the analysis. The results show that poverty distribution tends to concentrate in most of west China and the mountainous rural areas of mid China. Furthermore, the fifth census data are overlaid on those poor areas in order to capture their internal diversity of socio-economic characteristics. By overlaying poverty-related socio-economic parameters, such as sex ratio, illiteracy, education level, percentage of ethnic minorities, and family composition, the findings show that poverty distribution is strongly correlated with high illiteracy rates, a high percentage of minorities, and larger family sizes.

  14. Assessing tephra total grain-size distribution: Insights from field data analysis

    NASA Astrophysics Data System (ADS)

    Costa, A.; Pioli, L.; Bonadonna, C.

    2016-06-01

    The Total Grain-Size Distribution (TGSD) of tephra deposits is crucial for hazard assessment and provides fundamental insights into eruption dynamics. It controls both the mass distribution within the eruptive plume and the sedimentation processes and can provide essential information on the fragmentation mechanisms. TGSD is typically calculated by integrating deposit grain-size at different locations. The result of such integration is affected not only by the number, but also by the spatial distribution and distance from the vent of the sampling sites. In order to evaluate the reliability of TGSDs, we assessed representative sampling distances for pyroclasts of different sizes through dedicated numerical simulations of tephra dispersal. Results reveal that, depending on wind conditions, a representative grain-size distribution of tephra deposits down to ∼100 μm can be obtained by integrating samples collected at distances from less than one tenth up to a few tens of the column height. The statistical properties of TGSDs representative of a range of eruption styles were calculated by fitting the data with a few general distributions given by the sum of two log-normal distributions (bi-Gaussian in Φ-units), the sum of two Weibull distributions, and a generalized log-logistic distribution for the cumulative number distributions. The main parameters of the bi-lognormal fitting correlate with height of the eruptive columns and magma viscosity, allowing general relationships to be used for estimating TGSD generated in a variety of eruptive styles and for different magma compositions. Fitting results of the cumulative number distribution show two different power law trends for coarse and fine fractions of tephra particles, respectively. Our results shed light on the complex processes that control the size of particles being injected into the atmosphere during volcanic explosive eruptions and represent the first attempt to assess TGSD on the basis of pivotal physical

  15. WATER DISTRIBUTION SYSTEM ANALYSIS: FIELD STUDIES, MODELING AND MANAGEMENT

    EPA Science Inventory

    The user's guide entitled “Water Distribution System Analysis: Field Studies, Modeling and Management” is a reference guide for water utilities and an extensive summarization of information designed to provide drinking water utility personnel (and related consultants and research...

  16. Data synthesis and display programs for wave distribution function analysis

    NASA Technical Reports Server (NTRS)

    Storey, L. R. O.; Yeh, K. J.

    1992-01-01

    At the National Space Science Data Center (NSSDC) software was written to synthesize and display artificial data for use in developing the methodology of wave distribution analysis. The software comprises two separate interactive programs, one for data synthesis and the other for data display.

  17. Distributed intelligent data analysis in diabetic patient management.

    PubMed Central

    Bellazzi, R.; Larizza, C.; Riva, A.; Mira, A.; Fiocchi, S.; Stefanelli, M.

    1996-01-01

    This paper outlines the methodologies that can be used to perform an intelligent analysis of diabetic patients' data, realized in a distributed management context. We present a decision-support system architecture based on two modules, a Patient Unit and a Medical Unit, connected by telecommunication services. We stress the necessity to resort to temporal abstraction techniques, combined with time series analysis, in order to provide useful advice to patients; finally, we outline how data analysis and interpretation can be cooperatively performed by the two modules. PMID:8947655

  18. Neuronal model with distributed delay: analysis and simulation study for gamma distribution memory kernel.

    PubMed

    Karmeshu; Gupta, Varun; Kadambari, K V

    2011-06-01

    A single neuronal model incorporating distributed delay (memory) is proposed. The stochastic model has been formulated as a Stochastic Integro-Differential Equation (SIDE), which results in the underlying process being non-Markovian. A detailed analysis of the model when the distributed delay kernel has exponential form (weak delay) has been carried out. The selection of an exponential kernel has enabled the transformation of the non-Markovian model to a Markovian model in an extended state space. For the study of First Passage Time (FPT) with an exponential delay kernel, the model has been transformed to a system of coupled Stochastic Differential Equations (SDEs) in a two-dimensional state space. Simulation studies of the SDEs provide insight into the effect of the weak delay kernel on the Inter-Spike Interval (ISI) distribution. A measure based on the Jensen-Shannon divergence is proposed which can be used to make a choice between two competing models, viz. the distributed delay model vis-à-vis the LIF model. An interesting feature of the model is that the behavior of the coefficient of variation CV_ISI(t) of the ISI distribution with respect to the memory kernel time constant parameter η reveals that the neuron can switch from a bursting state to a non-bursting state as the noise intensity parameter changes. The membrane potential exhibits a decaying auto-correlation structure with or without damped oscillatory behavior, depending on the choice of parameters. This behavior is in agreement with empirically observed patterns of spike counts in a fixed time window. The power spectral density derived from the auto-correlation function is found to exhibit single and double peaks. The model is also examined for the case of strong delay, with the memory kernel having the form of a Gamma distribution. In contrast to the fast decay of damped oscillations of the ISI distribution for the model with the weak delay kernel, the decay of damped oscillations is found to be slower for the model with the strong delay kernel.

  19. Spatial Distribution Characteristics of Healthcare Facilities in Nanjing: Network Point Pattern Analysis and Correlation Analysis

    PubMed Central

    Ni, Jianhua; Qian, Tianlu; Xi, Changbai; Rui, Yikang; Wang, Jiechen

    2016-01-01

    The spatial distribution of urban service facilities is largely constrained by the road network. In this study, network point pattern analysis and correlation analysis were used to analyze the relationship between road network and healthcare facility distribution. The weighted network kernel density estimation method proposed in this study identifies significant differences between the outside and inside areas of the Ming city wall. The results of network K-function analysis show that private hospitals are more evenly distributed than public hospitals, and pharmacy stores tend to cluster around hospitals along the road network. After computing the correlation analysis between different categorized hospitals and street centrality, we find that the distribution of these hospitals correlates highly with the street centralities, and that the correlations are higher with private and small hospitals than with public and large hospitals. The comprehensive analysis results could help examine the reasonability of existing urban healthcare facility distribution and optimize the location of new healthcare facilities. PMID:27548197

  20. Spatial Distribution Characteristics of Healthcare Facilities in Nanjing: Network Point Pattern Analysis and Correlation Analysis.

    PubMed

    Ni, Jianhua; Qian, Tianlu; Xi, Changbai; Rui, Yikang; Wang, Jiechen

    2016-08-18

    The spatial distribution of urban service facilities is largely constrained by the road network. In this study, network point pattern analysis and correlation analysis were used to analyze the relationship between the road network and healthcare facility distribution. The weighted network kernel density estimation method proposed in this study identifies significant differences between the areas outside and inside the Ming city wall. The results of network K-function analysis show that private hospitals are more evenly distributed than public hospitals, and pharmacy stores tend to cluster around hospitals along the road network. After computing the correlation between the different categories of hospitals and street centrality, we find that the distribution of these hospitals correlates highly with street centralities, and that the correlations are higher with private and small hospitals than with public and large hospitals. The comprehensive analysis results could help in examining the reasonableness of the existing urban healthcare facility distribution and in optimizing the locations of new healthcare facilities.

  1. Generalized Exponential Distribution in Flood Frequency Analysis for Polish Rivers.

    PubMed

    Markiewicz, Iwona; Strupczewski, Witold G; Bogdanowicz, Ewa; Kochanek, Krzysztof

    2015-01-01

    Many distributions have been used in flood frequency analysis (FFA) for fitting flood extremes data. However, as shown in the paper, the scatter of Polish data plotted on the moment ratio diagram shows that there is still room for a new model. In the paper, we study the usefulness of the generalized exponential (GE) distribution in flood frequency analysis for Polish rivers. We investigate the fit of the GE distribution to Polish data on maximum flows in comparison with the inverse Gaussian (IG) distribution, which in our previous studies showed the best fit among several models commonly used in FFA. Since the use of a discrimination procedure without knowledge of its performance for the considered probability density functions may lead to erroneous conclusions, we compare the probability of correct selection for the GE and IG distributions along with an analysis of the asymptotic model error with respect to the upper quantile values. As an application, both the GE and IG distributions are alternatively assumed for describing the annual peak flows at several gauging stations on Polish rivers. To find the best fitting model, four discrimination procedures are used. In turn, they are based on the maximized logarithm of the likelihood function (K procedure), on the density function of the scale transformation maximal invariant (QK procedure), on the Kolmogorov-Smirnov statistic (KS procedure), and on the differences between the ML estimate of the 1% quantile and its value assessed by the method of moments and linear moments, in sequence (R procedure). Due to the uncertainty of choosing the best model, the method of aggregation is applied to the estimation of the maximum flow quantiles.
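
    Since the GE (exponentiated exponential) distribution, F(x) = (1 - exp(-lam*x))**alpha, is not a standard named model in scipy.stats, the following minimal Python sketch fits it by direct maximum likelihood; the synthetic flow sample and starting values are assumptions for illustration only.

        import numpy as np
        from scipy.optimize import minimize

        # Maximum-likelihood fit of the generalized exponential (GE)
        # distribution to a stand-in peak-flow sample.
        def ge_negloglik(params, x):
            alpha, lam = params
            if alpha <= 0 or lam <= 0:
                return np.inf
            z = -lam * x
            # log f(x) = log(alpha) + log(lam) - lam*x + (alpha-1)*log(1-exp(-lam*x))
            return -np.sum(np.log(alpha) + np.log(lam) + z
                           + (alpha - 1) * np.log1p(-np.exp(z)))

        rng = np.random.default_rng(1)
        flows = rng.gamma(shape=2.5, scale=120.0, size=60)   # synthetic sample

        res = minimize(ge_negloglik, x0=[1.0, 1.0 / flows.mean()],
                       args=(flows,), method="Nelder-Mead")
        alpha_hat, lam_hat = res.x
        print(f"alpha = {alpha_hat:.3f}, lambda = {lam_hat:.5f}")

        # Upper quantile (e.g. the 1% exceedance flow) from the inverse CDF:
        p = 0.99
        q99 = -np.log(1.0 - p ** (1.0 / alpha_hat)) / lam_hat
        print(f"1% exceedance quantile: {q99:.1f}")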

  2. Integrating software architectures for distributed simulations and simulation analysis communities.

    SciTech Connect

    Goldsby, Michael E.; Fellig, Daniel; Linebarger, John Michael; Moore, Patrick Curtis; Sa, Timothy J.; Hawley, Marilyn F.

    2005-10-01

    The one-year Software Architecture LDRD (No.79819) was a cross-site effort between Sandia California and Sandia New Mexico. The purpose of this research was to further develop and demonstrate integrating software architecture frameworks for distributed simulation and distributed collaboration in the homeland security domain. The integrated frameworks were initially developed through the Weapons of Mass Destruction Decision Analysis Center (WMD-DAC), sited at SNL/CA, and the National Infrastructure Simulation & Analysis Center (NISAC), sited at SNL/NM. The primary deliverable was a demonstration of both a federation of distributed simulations and a federation of distributed collaborative simulation analysis communities in the context of the same integrated scenario, which was the release of smallpox in San Diego, California. To our knowledge this was the first time such a combination of federations under a single scenario has ever been demonstrated. A secondary deliverable was the creation of the standalone GroupMeld™ collaboration client, which uses the GroupMeld™ synchronous collaboration framework. In addition, a small pilot experiment that used both integrating frameworks allowed a greater range of crisis management options to be performed and evaluated than would have been possible without the use of the frameworks.

  3. Robust Parameter Estimation for the Mixed Weibull (Seven Parameter) Including the Method of Minimum Likelihood and the Method of Minimum Distance

    DTIC Science & Technology

    1997-03-01

    Only OCR fragments of this record's text survive: front matter identifying the work as AFIT/GOR/ENY/97M-1, a thesis submitted at the Air Force Institute of Technology, Wright-Patterson Air Force Base, Ohio, in partial fulfillment of degree requirements, on robust parameter estimation for the seven-parameter mixed Weibull distribution; and a reference fragment: Bergman, B., "Estimation of Weibull Parameters Using a Weight Function," Journal of Materials Science Letters.
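
    The title's seven parameters correspond to a mixing weight plus two three-parameter Weibull components (shape, location, scale each). As a hedged illustration of the minimum distance idea named in the title (not the report's own code), the Python sketch below estimates all seven parameters by minimizing the Kolmogorov-Smirnov distance to the empirical CDF of a synthetic sample.

        import numpy as np
        from scipy.optimize import differential_evolution
        from scipy.stats import weibull_min

        # Minimum-distance estimation for a seven-parameter mixed Weibull:
        # weight p plus two three-parameter Weibull components. Data and
        # bounds are illustrative assumptions.
        rng = np.random.default_rng(2)
        data = np.concatenate([
            weibull_min.rvs(1.2, loc=0.0, scale=50.0, size=120, random_state=rng),
            weibull_min.rvs(3.5, loc=80.0, scale=200.0, size=80, random_state=rng),
        ])
        data.sort()
        ecdf = (np.arange(1, data.size + 1) - 0.5) / data.size

        def ks_distance(theta):
            p, c1, loc1, s1, c2, loc2, s2 = theta
            cdf = (p * weibull_min.cdf(data, c1, loc=loc1, scale=s1)
                   + (1 - p) * weibull_min.cdf(data, c2, loc=loc2, scale=s2))
            return np.max(np.abs(cdf - ecdf))   # K-S distance to the ECDF

        bounds = [(0.05, 0.95), (0.3, 8), (0, 50), (1, 500),
                  (0.3, 8), (0, 200), (1, 500)]
        result = differential_evolution(ks_distance, bounds, seed=3)
        print("estimated parameters:", np.round(result.x, 3))
        print("min K-S distance:", round(result.fun, 4))

    A minimum-distance objective of this kind tends to be more robust to outliers and contaminated data than maximum likelihood, which is the usual motivation for the method.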

  4. Volumetric relief map for intracranial cerebrospinal fluid distribution analysis.

    PubMed

    Lebret, Alain; Kenmochi, Yukiko; Hodel, Jérôme; Rahmouni, Alain; Decq, Philippe; Petit, Éric

    2015-09-01

    Cerebrospinal fluid imaging plays a significant role in the clinical diagnosis of brain disorders, such as hydrocephalus and Alzheimer's disease. While three-dimensional images of cerebrospinal fluid are very detailed, the complex structures they contain can be time-consuming and laborious to interpret. This paper presents a simple technique that represents the intracranial cerebrospinal fluid distribution as a two-dimensional image in such a way that the total fluid volume is preserved. We call this a volumetric relief map, and show its effectiveness in a characterization and analysis of fluid distributions and networks in hydrocephalus patients and healthy adults.

  5. Influence Of Lateral Load Distributions On Pushover Analysis Effectiveness

    NASA Astrophysics Data System (ADS)

    Colajanni, P.; Potenzone, B.

    2008-07-01

    The effectiveness of two simple load distributions for pushover analysis recently proposed by the authors is investigated through a comparative study, involving static and dynamic analyses of seismic response of eccentrically braced frames. It is shown that in the upper floors only multimodal pushover procedures provide results close to the dynamic profile, while the proposed load patterns are always conservative in the lower floors. They over-estimate the seismic response less than the uniform distribution, representing a reliable alternative to the uniform or more sophisticated adaptive procedures proposed by seismic codes.

  6. Vibrational energy distribution analysis (VEDA): scopes and limitations.

    PubMed

    Jamróz, Michał H

    2013-10-01

    The principle of operation of the VEDA program, written by the author for Potential Energy Distribution (PED) analysis of theoretical vibrational spectra, is described. Nowadays, PED analysis is an indispensable tool in any serious analysis of vibrational spectra. To perform a PED analysis it is necessary to define 3N-6 linearly independent local mode coordinates. Even for 20-atom molecules this is a difficult task. The VEDA program reads the input data automatically from Gaussian program output files. VEDA then automatically proposes an introductory set of local mode coordinates. Next, more adequate coordinates are proposed by the program and optimized to obtain maximal elements in each column (internal coordinate) of the PED matrix (the EPM parameter). The possibility of automatic optimization of PED contributions is a unique feature of the VEDA program, absent in other programs performing PED analysis.

  7. Vibrational Energy Distribution Analysis (VEDA): Scopes and limitations

    NASA Astrophysics Data System (ADS)

    Jamróz, Michał H.

    2013-10-01

    The principle of operation of the VEDA program, written by the author for Potential Energy Distribution (PED) analysis of theoretical vibrational spectra, is described. Nowadays, PED analysis is an indispensable tool in any serious analysis of vibrational spectra. To perform a PED analysis it is necessary to define 3N-6 linearly independent local mode coordinates. Even for 20-atom molecules this is a difficult task. The VEDA program reads the input data automatically from Gaussian program output files. VEDA then automatically proposes an introductory set of local mode coordinates. Next, more adequate coordinates are proposed by the program and optimized to obtain maximal elements in each column (internal coordinate) of the PED matrix (the EPM parameter). The possibility of automatic optimization of PED contributions is a unique feature of the VEDA program, absent in other programs performing PED analysis.

  8. HammerCloud: A Stress Testing System for Distributed Analysis

    NASA Astrophysics Data System (ADS)

    van der Ster, Daniel C.; Elmsheuser, Johannes; Úbeda García, Mario; Paladin, Massimo

    2011-12-01

    Distributed analysis of LHC data is an I/O-intensive activity which places large demands on the internal network, storage, and local disks at remote computing facilities. Commissioning and maintaining a site to provide an efficient distributed analysis service is therefore a challenge which can be aided by tools to help evaluate a variety of infrastructure designs and configurations. HammerCloud is one such tool; it is a stress testing service which is used by central operations teams, regional coordinators, and local site admins to (a) submit an arbitrary number of analysis jobs to a number of sites, (b) maintain at a steady state a predefined number of jobs running at the sites under test, (c) produce web-based reports summarizing the efficiency and performance of the sites under test, and (d) present a web interface for historical test results to both evaluate progress and compare sites. HammerCloud was built around the distributed analysis framework Ganga, exploiting its API for grid job management. HammerCloud has been employed by the ATLAS experiment for continuous testing of many sites worldwide, and also during large-scale computing challenges such as STEP'09 and UAT'09, where the scale of the tests exceeded 10,000 concurrently running jobs and 1,000,000 total jobs over multi-day periods. In addition, HammerCloud is being adopted by the CMS experiment; the plugin structure of HammerCloud allows the execution of CMS jobs using their official tool (CRAB).

  9. Reliability Estimation and Failure Analysis of Multilayer Ceramic Chip Capacitors

    NASA Astrophysics Data System (ADS)

    Yang, Seok Jun; Kim, Jin Woo; Ryu, Dong Su; Kim, Myung Soo; Jang, Joong Soon

    This paper presents the failure analysis and reliability estimation of a multilayer ceramic chip capacitor (MLCC). For failed samples from an automobile engine control unit, failure analysis was performed to identify the root cause of failure, and it was shown that migration and avalanche breakdown were the dominant failure mechanisms. Next, an accelerated life test was designed to estimate the life of the MLCC. A Weibull lifetime distribution is assumed, together with the life-stress relationship proposed by Prokopowicz and Vaskas. The life-stress relationship and the acceleration factor are estimated by analyzing the accelerated life test data.
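
    For orientation, the Prokopowicz-Vaskas relationship scales MLCC life by a voltage power law combined with an Arrhenius temperature term. The Python sketch below computes an acceleration factor under that model; the voltage exponent, activation energy, and stress levels are assumed values for illustration, not the paper's estimates.

        import numpy as np

        # Acceleration factor under the Prokopowicz-Vaskas model:
        #   AF = (V_test/V_use)**n * exp((Ea/k) * (1/T_use - 1/T_test))
        # n and Ea are device-specific and must come from test data;
        # the values below are assumed for illustration.
        k = 8.617e-5                      # Boltzmann constant, eV/K
        n, Ea = 3.0, 1.1                  # assumed voltage exponent, Ea (eV)

        V_use, V_test = 5.0, 15.0         # use vs test voltage (V)
        T_use, T_test = 298.15, 398.15    # use vs test temperature (K)

        AF = (V_test / V_use) ** n * np.exp((Ea / k) * (1 / T_use - 1 / T_test))
        print(f"acceleration factor: {AF:.0f}")

        # A Weibull characteristic life measured at test conditions then
        # scales to use conditions as eta_use = AF * eta_test.
        eta_test_hours = 500.0
        print(f"projected use-condition characteristic life: "
              f"{AF * eta_test_hours:.2e} h")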

  10. Numerical analysis of the dynamics of distributed vortex configurations

    NASA Astrophysics Data System (ADS)

    Govorukhin, V. N.

    2016-08-01

    A numerical algorithm is proposed for analyzing the dynamics of distributed plane vortex configurations in an inviscid incompressible fluid. At every time step, the algorithm involves the computation of unsteady vortex flows, an analysis of the configuration structure with the help of heuristic criteria, the visualization of the distribution of marked particles and vorticity, the construction of streamlines of fluid particles, and the computation of the field of local Lyapunov exponents. The inviscid incompressible fluid dynamic equations are solved by applying a meshless vortex method. The algorithm is used to investigate the interaction of two and three identical distributed vortices with various initial positions in the flow region with and without the Coriolis force.

  11. Distribution System Reliability Analysis for Smart Grid Applications

    NASA Astrophysics Data System (ADS)

    Aljohani, Tawfiq Masad

    Reliability of power systems is a key aspect of modern power system planning, design, and operation. The ascendance of the smart grid concept has raised hopes of developing an intelligent network capable of self-healing, able to overcome the interruption problems that face utilities and cost them tens of millions of dollars in repairs and losses. To address these reliability concerns, power utilities and interested parties have spent an extensive amount of time and effort analyzing and studying the reliability of the generation and transmission sectors of the power grid. Only recently has attention shifted to improving the reliability of the distribution network, the connection between the power providers and the consumers, where most electricity problems occur. In this work, we examine the effect of smart grid applications on improving the reliability of power distribution networks. The test system used in this thesis is the IEEE 34-node test feeder, released in 2003 by the Distribution System Analysis Subcommittee of the IEEE Power Engineering Society. The objective is to analyze the feeder for the optimal placement of automatic switching devices and to quantify their proper installation based on the performance of the distribution system. The measures are the changes in the system reliability indices, including SAIDI, SAIFI, and EUE. A further goal is to design and simulate the effect of installing Distributed Generators (DGs) on the utility's distribution system and to measure the potential improvement in its reliability. The software used in this work is DISREL, an intelligent power distribution package developed by General Reliability Co.
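
    For reference, the indices named above are simple ratios over outage records. The Python sketch below computes them for a toy feeder; the record format and the numbers are illustrative assumptions.

        # SAIFI, SAIDI, and EUE computed from a toy table of outage
        # records; field layout and values are illustrative assumptions.
        outages = [
            # (customers_interrupted, duration_hours, unserved_kwh)
            (120, 1.5, 300.0),
            (40, 0.5, 25.0),
            (300, 3.0, 1800.0),
        ]
        customers_served = 1000   # total customers on the feeder

        saifi = sum(ci for ci, _, _ in outages) / customers_served
        saidi = sum(ci * d for ci, d, _ in outages) / customers_served
        eue = sum(e for _, _, e in outages)

        print(f"SAIFI = {saifi:.3f} interruptions/customer")
        print(f"SAIDI = {saidi:.3f} hours/customer")
        print(f"EUE   = {eue:.1f} kWh unserved")

    Placing an automatic switch or a DG changes which customers each outage interrupts and for how long, which is how the optimal-placement comparison above is scored.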

  12. GIS analysis of fluvial knickzone distribution in Japanese mountain watersheds

    NASA Astrophysics Data System (ADS)

    Hayakawa, Yuichi S.; Oguchi, Takashi

    2009-10-01

    Although a knickzone, a location where the stream gradient is locally large and intense erosion occurs, has been regarded as an important geomorphic feature in bedrock river morphology, the distribution of knickzones has not been well investigated, especially over broad areas. This study examines the distribution of fluvial knickzones along mountain rivers for the entire Japanese Archipelago. Whereas conventional manual methods of identifying knickzones based on map reading or field observation tend to be subjective and are impractical for broad-scale analysis, this study employs a semi-automated method of knickzone extraction using DEMs and GIS. In a recent study by the authors, this method was shown to enable efficient examination of knickzone distribution over a broad area. Investigations of major mountain rivers revealed that knickzones are generally abundant in steep upstream river reaches, suggesting hydraulic origins for the knickzones. The broad presence of such knickzones in the steep Japanese mountain rivers indicates that rivers subjected to active erosion show complex morphology induced by natural irregularities of water flow hydraulics as well as various environmental perturbations such as climatic changes. There also seems to be a characteristic frequency of knickzone distribution common to moderately steep to very steep bedrock reaches in Japan. Although volcanic products such as lavas and welded pyroclastic-flow deposits in valleys can cause distinct knickzones, substrate geology plays only a limited role in determining the distribution and form of knickzones.
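
    To make the extraction idea concrete, the following highly simplified Python sketch flags candidate knickzones along a synthetic longitudinal profile as reaches whose local gradient exceeds the profile's trend gradient; the real DEM/GIS workflow (drainage extraction, smoothing scales, calibrated thresholds) is considerably more involved, and the profile and threshold here are assumptions.

        import numpy as np

        # Flag locally steep reaches along a synthetic river long profile.
        distance = np.linspace(0, 20_000, 401)            # m downstream
        elevation = 1500 * np.exp(-distance / 12_000)     # smooth concave profile
        elevation[180:200] -= np.linspace(0, 40, 20)      # inject a steep reach
        elevation[200:] -= 40                             # keep profile monotone

        gradient = -np.gradient(elevation, distance)      # downhill slope (+)
        window = 41                                       # moving-average window
        trend = np.convolve(gradient, np.ones(window) / window, mode="same")

        knick = gradient > 1.5 * trend                    # relative-steepness rule
        knick[:window] = knick[-window:] = False          # ignore edge effects
        print(f"{np.flatnonzero(knick).size} nodes flagged as knickzone candidates")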

  13. Subjective Probability Distribution Elicitation in Cost Risk Analysis: A Review

    DTIC Science & Technology

    2007-01-01

    Only fragments of this record's text survive extraction. The recoverable content notes that elicited distributions may be adjusted, where reasonable, to counteract known biases in elicitation, and that for the triangle distribution the density is zero outside the endpoints and rises linearly from the lower endpoint to the most-likely value. A cited reference fragment: Wheeler, T. A., S. C. Hora, W. R. Cramond, and S. D. Unwin, Analysis of Core Damage Frequency from Internal Events: Expert Judgment Elicitation.

  14. Electrical Power Distribution and Control Modeling and Analysis

    NASA Technical Reports Server (NTRS)

    Fu, Johnny S.; Liffring, Mark; Mehdi, Ishaque S.

    2001-01-01

    This slide presentation reviews the use of Electrical Power Distribution and Control (EPD&C) Modeling and how modeling can support analysis. The presentation discusses using the EASY5 model to simulate and analyze the Space Shuttle Electric Auxiliary Power Unit. Diagrams of the model schematics are included, as well as graphs of the battery cell impedance, hydraulic load dynamics, and EPD&C response to hydraulic load variations.

  15. A Study of ATLAS Grid Performance for Distributed Analysis

    NASA Astrophysics Data System (ADS)

    Panitkin, Sergey; Fine, Valery; Wenaus, Torre

    2012-12-01

    In the past two years the ATLAS Collaboration at the LHC has collected a large volume of data and published a number of groundbreaking papers. The Grid-based ATLAS distributed computing infrastructure played a crucial role in enabling timely analysis of the data. We will present a study of the performance and usage of the ATLAS Grid as a platform for physics analysis in 2011. This includes studies of general properties as well as timing properties of user jobs (wait time, run time, etc.). These studies are based on mining of data archived by the PanDA workload management system.

  16. Automatic analysis of attack data from distributed honeypot network

    NASA Astrophysics Data System (ADS)

    Safarik, Jakub; Voznak, Miroslav; Rezac, Filip; Partila, Pavol; Tomala, Karel

    2013-05-01

    There are many ways of obtaining real data about malicious activity in a network. One of them relies on masquerading monitoring servers as production ones. These servers are called honeypots, and data about attacks on them yield valuable information about actual attacks and the techniques used by hackers. The article describes a distributed topology of honeypots, developed with a strong orientation toward monitoring IP telephony traffic. IP telephony servers can easily be exposed to various types of attacks, and without protection this situation can lead to loss of money and other unpleasant consequences. Using a distributed topology, with honeypots placed in different geographical locations and networks, provides more valuable and independent results. With an automatic system gathering information from all honeypots, it is possible to work with all the information at one centralized point. Communication between the honeypots and the centralized data store uses secure SSH tunnels, and the server communicates only with authorized honeypots. The centralized server also automatically analyzes the data from each honeypot. The results of this analysis, along with other statistical data about malicious activity, are easily accessible through a built-in web server. All statistical and analysis reports serve as the information basis for an algorithm which classifies the types of VoIP attacks used. The web interface then provides a tool for quick comparison and evaluation of current attacks across all monitored networks. The article describes both the honeypot nodes in the distributed architecture, which monitor suspicious activity, and the methods and algorithms used on the server side to analyze the gathered data.

  17. Human leptospirosis distribution pattern analysis in Hulu Langat, Selangor

    NASA Astrophysics Data System (ADS)

    Zulkifli, Zuhafiza; Shariff, Abdul Rashid Mohamed; Tarmidi, Zakri M.

    2016-06-01

    This paper discusses the distribution pattern of human leptospirosis in the Hulu Langat District, Selangor, Malaysia. The data used in this study are leptospirosis case reports and spatial boundaries. Leptospirosis case data were collected from the Health Office of Hulu Langat, and spatial boundaries, including lot and district boundaries, were collected from the Department of Mapping and Surveying Malaysia (JUPEM). A total of 599 leptospirosis cases were reported in 2013, and these were mapped based on the addresses provided in the case reports. This study uses three statistical methods to analyze the distribution pattern: Moran's I, average nearest neighbor (ANN), and kernel density estimation. The analysis was used to determine the spatial distribution and average spacing of leptospirosis cases and to locate hotspots. The Moran's I analysis indicated that the cases were randomly distributed, with a value of -0.202816 showing that negative spatial autocorrelation exists among the leptospirosis cases. The ANN analysis indicated that the cases follow a clustered pattern, with the reported average nearest neighbor ratio being -21.80. The results also show that hotspots have been identified and mapped in the Hulu Langat District.
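
    As a reference for the first of the three methods, the Python sketch below computes global Moran's I directly in numpy for synthetic point data; in practice a spatial statistics library (e.g., PySAL) and the study's actual case locations would be used, and significance would be judged against permutations.

        import numpy as np

        # Global Moran's I for point-referenced case counts; coordinates
        # and counts below are synthetic stand-ins.
        rng = np.random.default_rng(4)
        coords = rng.uniform(0, 10_000, size=(50, 2))     # site coordinates (m)
        y = rng.poisson(3, size=50).astype(float)         # cases per site

        # Row-standardized inverse-distance weights, zero diagonal
        d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
        w = np.where(d > 0, 1.0 / d, 0.0)
        w /= w.sum(axis=1, keepdims=True)

        z = y - y.mean()
        n = y.size
        moran_i = (n / w.sum()) * (z @ w @ z) / (z @ z)
        expected = -1.0 / (n - 1)      # E[I] under spatial randomness
        print(f"Moran's I = {moran_i:.4f} (expected {expected:.4f})")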

  18. Distributed and interactive visual analysis of omics data.

    PubMed

    Farag, Yehia; Berven, Frode S; Jonassen, Inge; Petersen, Kjell; Barsnes, Harald

    2015-11-03

    The amount of publicly shared proteomics data has grown exponentially over the last decade as the solutions for sharing and storing the data have improved. However, the use of the data is often limited by the manner in which it is made available. There are two main approaches: download and inspect the proteomics data locally, or interact with the data via one or more web pages. The first is limited by having to download the data and thus requires local computational skills and resources, while the latter is most often limited in terms of interactivity and the analysis options available. A solution is to develop web-based systems supporting distributed and fully interactive visual analysis of proteomics data. The use of a distributed architecture makes it possible to perform the computational analysis at the server, while the results of the analysis can be displayed via a web browser without the need to download the whole dataset. Here the challenges related to developing such systems for omics data are discussed, especially how this allows for multiple connected interactive visual displays of an omics dataset in a web-based setting, and the benefits this provides for computational analysis of proteomics data. This article is part of a Special Issue entitled: Computational Proteomics.

  19. Development of a site analysis tool for distributed wind projects

    SciTech Connect

    Shaw, Shawn

    2012-02-28

    The Cadmus Group, Inc., in collaboration with the National Renewable Energy Laboratory (NREL) and Encraft, was awarded a grant from the Department of Energy (DOE) to develop a site analysis tool for distributed wind technologies. As the principal investigator for this project, Mr. Shawn Shaw was responsible for overall project management, direction, and technical approach. The product resulting from this project is the Distributed Wind Site Analysis Tool (DSAT), a software tool for analyzing proposed sites for distributed wind technology (DWT) systems. This user-friendly tool supports the long-term growth and stability of the DWT market by providing reliable, realistic estimates of site and system energy output and feasibility. DSAT, which is accessible online and requires no purchase or download of software, is available in two account types. Standard: this free account allows the user to analyze a limited number of sites and to produce a system performance report for each. Professional: for a small annual fee, users can analyze an unlimited number of sites, produce system performance reports, and generate other customizable reports containing key information such as visual influence and wind resources. The tool's interactive maps allow users to create site models that incorporate the obstructions and terrain types present. Users can generate site reports immediately after entering the requisite site information. Ideally, this tool also educates users regarding good site selection and effective evaluation practices.

  20. Componential distribution analysis of food using near infrared ray image

    NASA Astrophysics Data System (ADS)

    Yamauchi, Hiroki; Kato, Kunihito; Yamamoto, Kazuhiko; Ogawa, Noriko; Ohba, Kimie

    2008-11-01

    The components of the food related to the "deliciousness" are usually evaluated by componential analysis. The component content and type of components in the food are determined by this analysis. However, componential analysis is not able to analyze measurements in detail, and the measurement is time consuming. We propose a method to measure the two-dimensional distribution of the component in food using a near infrared ray (IR) image. The advantage of our method is to be able to visualize the invisible components. Many components in food have characteristics such as absorption and reflection of light in the IR range. The component content is measured using subtraction between two wavelengths of near IR light. In this paper, we describe a method to measure the component of food using near IR image processing, and we show an application to visualize the saccharose in the pumpkin.
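
    The two-wavelength subtraction idea can be sketched in a few lines of Python: a component with an absorption band at one near-IR wavelength but not at a nearby reference wavelength shows up in the difference of the two absorbance images. The synthetic frames below stand in for camera images and are assumptions for illustration.

        import numpy as np

        # Synthetic reference-wavelength and absorption-band frames
        rng = np.random.default_rng(5)
        shape = (128, 128)
        reflect_ref = rng.uniform(0.7, 0.9, shape)       # reference frame
        component = np.zeros(shape)
        component[40:80, 50:90] = 0.3                    # component-rich region
        reflect_abs = reflect_ref * (1.0 - component)    # absorption-band frame

        # Convert reflectance to absorbance and subtract the two images
        absorb_ref = -np.log10(reflect_ref)
        absorb_abs = -np.log10(reflect_abs)
        component_map = absorb_abs - absorb_ref          # ~ component content

        mask = np.zeros(shape, dtype=bool)
        mask[40:80, 50:90] = True
        print(f"inside region:  {component_map[mask].mean():.3f}")
        print(f"outside region: {component_map[~mask].mean():.3f}")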

  1. Scalable Visual Reasoning: Supporting Collaboration through Distributed Analysis

    SciTech Connect

    Pike, William A.; May, Richard A.; Baddeley, Bob; Riensche, Roderick M.; Bruce, Joe; Younkin, Katarina

    2007-05-21

    We present a visualization environment called the Scalable Reasoning System (SRS) that provides a suite of tools for the collection, analysis, and dissemination of reasoning products. This environment is designed to function across multiple platforms, bringing the display of visual information and the capture of reasoning associated with that information to both mobile and desktop clients. The service-oriented architecture of SRS promotes collaboration and interaction between users regardless of their location or platform. Visualization services allow data processing to be centralized and analysis results collected from distributed clients in real time. We use the concept of “reasoning artifacts” to capture the analytic value attached to individual pieces of information and collections thereof, helping to fuse the foraging and sense-making loops in information analysis. Reasoning structures composed of these artifacts can be shared across platforms while maintaining references to the analytic activity (such as interactive visualization) that produced them.

  2. Spatial Distribution Balance Analysis of Hospitals in Wuhan

    PubMed Central

    Yang, Nai; Chen, Shiyi; Hu, Weilu; Wu, Zhongheng; Chao, Yi

    2016-01-01

    The spatial distribution pattern of hospitals in Wuhan shows a dense core in the central urban areas and a sparse distribution in the suburbs, particularly at the centers of suburbs. This study improves the gravity and Huff models to analyze healthcare accessibility and resources. Results indicate that healthcare accessibility in the central urban areas is better than in the suburbs, where it progressively worsens. A shortage of healthcare resources is observed in large-scale, high-class hospitals in the central urban areas, whereas the resources of some hospitals in the suburbs are redundant. This study proposes a multi-criteria evaluation (MCE) analysis model for location assessment when constructing new hospitals, which can effectively ameliorate healthcare accessibility in suburban areas. This study presents implications for the planning of urban healthcare facilities. PMID:27706069
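
    For orientation, the classic Huff model underlying this kind of accessibility analysis assigns a resident a probability of visiting each hospital proportional to an attractiveness/distance-decay score. The Python sketch below evaluates it for one resident; the hospital sizes, coordinates, and decay exponent are illustrative assumptions, not the study's calibrated values.

        import numpy as np

        # Huff model: P(i -> j) = (S_j / d_ij**beta) / sum_k (S_k / d_ik**beta)
        beta = 2.0                                   # distance-decay exponent
        hospital_xy = np.array([[2.0, 3.0], [8.0, 1.0], [5.0, 9.0]])   # km
        hospital_size = np.array([800.0, 250.0, 400.0])  # beds as attractiveness

        resident_xy = np.array([4.0, 4.0])
        d = np.linalg.norm(hospital_xy - resident_xy, axis=1)

        score = hospital_size / d ** beta
        p = score / score.sum()                      # choice probabilities
        for j, prob in enumerate(p):
            print(f"hospital {j}: distance {d[j]:.2f} km, probability {prob:.2f}")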

  3. Growing axons analysis by using Granulometric Size Distribution

    NASA Astrophysics Data System (ADS)

    Gonzalez, Mariela A.; Ballarin, Virginia L.; Rapacioli, Melina; Celín, A. R.; Sánchez, V.; Flores, V.

    2011-09-01

    Neurite growth (neuritogenesis) in vitro is a common methodology in the field of developmental neurobiology. Morphological analyses of growing neurites are usually difficult because their thinness and low contrast prevent clear observation of their shape, number, length, and spatial orientation. This paper presents the use of the granulometric size distribution to automatically obtain information about the shape, size, and spatial orientation of growing axons in tissue cultures. The results presented here show that the granulometric size distribution is a very useful morphological tool, since it allows the automatic detection of growing axons and the precise characterization of a parameter indicative of the spatial orientation of axonal growth, namely the angle of deviation of the growing direction. The developed algorithms automatically quantify this orientation, facilitating the analysis of these images, which is important given the large number of images that need to be processed in this type of study.
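
    A granulometric size distribution is built by applying morphological openings with structuring elements of increasing size and recording how much image "mass" each opening removes (the pattern spectrum). The Python sketch below illustrates the idea on a synthetic image; the image and sizes are assumptions, not the paper's data.

        import numpy as np
        from scipy import ndimage

        # Pattern spectrum from grayscale openings of increasing size.
        rng = np.random.default_rng(6)
        image = rng.uniform(0, 0.2, (256, 256))
        image[100:104, 20:200] = 1.0      # thin, elongated "axon-like" structure
        image[30:60, 30:60] = 0.8         # a blob (e.g., a soma)

        sizes = list(range(1, 15))
        mass = [ndimage.grey_opening(image, size=(s, s)).sum() for s in sizes]
        spectrum = -np.diff(mass)         # mass removed at each scale

        for s, m in zip(sizes[1:], spectrum):
            print(f"size {s:2d} -> removed mass {m:10.1f}")

    For orientation-sensitive measurements of the kind described above, line-shaped structuring elements at several angles would replace the square ones used here.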

  4. Principal Component Analysis for Normal-Distribution-Valued Symbolic Data.

    PubMed

    Wang, Huiwen; Chen, Meiling; Shi, Xiaojun; Li, Nan

    2016-02-01

    This paper puts forward a new approach to principal component analysis (PCA) for normal-distribution-valued symbolic data, which has a vast potential of applications in the economic and management fields. We derive a full set of numerical characteristics and the variance-covariance structure for such data, which forms the foundation for our analytical PCA approach. Our approach is able to use more of the variance information in the original data than the prevailing representative-type approach in the literature, which uses only centers, vertices, etc. The paper also provides an accurate approach to constructing the observations in a PC space based on the linear additivity property of the normal distribution. The effectiveness of the proposed method is illustrated by simulated numerical experiments. Finally, our method is applied to explain the puzzle of the risk-return tradeoff in China's stock market.

  5. Performance Analysis of an Actor-Based Distributed Simulation

    NASA Technical Reports Server (NTRS)

    Schoeffler, James D.

    1998-01-01

    Object-oriented design of simulation programs appears to be very attractive because of the natural association of components in the simulated system with objects. There is great potential in distributing the simulation across several computers for the purpose of parallel computation and its consequent handling of larger problems in less elapsed time. One approach to such a design is to use "actors", that is, active objects with their own thread of control. Because these objects execute concurrently, communication is via messages. This is in contrast to an object-oriented design using passive objects where communication between objects is via method calls (direct calls when they are in the same address space and remote procedure calls when they are in different address spaces or different machines). This paper describes a performance analysis program for the evaluation of a design for distributed simulations based upon actors.
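
    To make the distinction concrete, here is a minimal Python sketch of an actor: an object with its own thread of control whose only external interface is message passing into a mailbox, in contrast to passive objects invoked by direct method calls. The class and message format are illustrative assumptions, not the paper's framework.

        import threading
        import queue

        class Actor:
            """An active object: its own thread plus a message mailbox."""
            def __init__(self, name):
                self.name = name
                self.mailbox = queue.Queue()
                self.thread = threading.Thread(target=self._run, daemon=True)
                self.thread.start()

            def send(self, message):
                # The only way other actors interact with this one.
                self.mailbox.put(message)

            def _run(self):
                while True:
                    msg = self.mailbox.get()
                    if msg is None:        # poison pill terminates the actor
                        break
                    print(f"{self.name} processing: {msg}")

        # Two concurrently executing simulation components exchange messages
        a, b = Actor("integrator"), Actor("event-queue")
        a.send("advance to t=1.0")
        b.send("schedule event @ t=0.7")
        for actor in (a, b):
            actor.send(None)
            actor.thread.join()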

  6. Job optimization in ATLAS TAG-based distributed analysis

    NASA Astrophysics Data System (ADS)

    Mambelli, M.; Cranshaw, J.; Gardner, R.; Maeno, T.; Malon, D.; Novak, M.

    2010-04-01

    The ATLAS experiment is projected to collect over one billion events/year during the first few years of operation. The efficient selection of events for various physics analyses across all appropriate samples presents a significant technical challenge. ATLAS computing infrastructure leverages the Grid to tackle the analysis across large samples by organizing data into a hierarchical structure and exploiting distributed computing to churn through the computations. This includes events at different stages of processing: RAW, ESD (Event Summary Data), AOD (Analysis Object Data), DPD (Derived Physics Data). Event Level Metadata Tags (TAGs) contain information about each event stored using multiple technologies accessible by POOL and various web services. This allows users to apply selection cuts on quantities of interest across the entire sample to compile a subset of events that are appropriate for their analysis. This paper describes new methods for organizing jobs using the TAGs criteria to analyze ATLAS data. It further compares different access patterns to the event data and explores ways to partition the workload for event selection and analysis. Here analysis is defined as a broader set of event processing tasks including event selection and reduction operations ("skimming", "slimming" and "thinning") as well as DPD making. Specifically it compares analysis with direct access to the events (AOD and ESD data) to access mediated by different TAG-based event selections. We then compare different ways of splitting the processing to maximize performance.

  7. Time series power flow analysis for distribution connected PV generation.

    SciTech Connect

    Broderick, Robert Joseph; Quiroz, Jimmy Edward; Ellis, Abraham; Reno, Matthew J.; Smith, Jeff; Dugan, Roger

    2013-01-01

    Distributed photovoltaic (PV) projects must go through an interconnection study process before connecting to the distribution grid. These studies are intended to identify the likely impacts and mitigation alternatives. In the majority of the cases, system impacts can be ruled out or mitigation can be identified without an involved study, through a screening process or a simple supplemental review study. For some proposed projects, expensive and time-consuming interconnection studies are required. The challenges to performing the studies are twofold. First, every study scenario is potentially unique, as the studies are often highly specific to the amount of PV generation capacity that varies greatly from feeder to feeder and is often unevenly distributed along the same feeder. This can cause location-specific impacts and mitigations. The second challenge is the inherent variability in PV power output which can interact with feeder operation in complex ways, by affecting the operation of voltage regulation and protection devices. The typical simulation tools and methods in use today for distribution system planning are often not adequate to accurately assess these potential impacts. This report demonstrates how quasi-static time series (QSTS) simulation and high time-resolution data can be used to assess the potential impacts in a more comprehensive manner. The QSTS simulations are applied to a set of sample feeders with high PV deployment to illustrate the usefulness of the approach. The report describes methods that can help determine how PV affects distribution system operations. The simulation results are focused on enhancing the understanding of the underlying technical issues. The examples also highlight the steps needed to perform QSTS simulation and describe the data needed to drive the simulations. The goal of this report is to make the methodology of time series power flow analysis readily accessible to utilities and others responsible for evaluating
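
    The following minimal Python sketch illustrates the QSTS pattern the report advocates: step through high-resolution load and PV profiles, solve an independent steady-state power flow at each step, and accumulate time-based metrics. The solve_power_flow function is a hypothetical stand-in for a real distribution power-flow engine, and the profile shapes and coefficients are assumptions.

        import numpy as np

        # One simulated day at 1-minute resolution. solve_power_flow() is
        # a hypothetical placeholder; its linear voltage model is an
        # illustrative assumption only.
        def solve_power_flow(load_kw, pv_kw):
            return 1.0 - 2e-5 * load_kw + 8e-5 * pv_kw   # feeder-end V (pu)

        steps = 24 * 60
        t = np.arange(steps)
        load = 800 + 300 * np.sin(2 * np.pi * (t - 300) / steps)             # kW
        pv = np.clip(1000 * np.sin(2 * np.pi * (t - 360) / steps), 0, None)  # kW

        volts = np.array([solve_power_flow(l, p) for l, p in zip(load, pv)])
        print(f"min V = {volts.min():.4f} pu, max V = {volts.max():.4f} pu")
        print(f"minutes above 1.05 pu: {int((volts > 1.05).sum())}")

    In a real study, the per-step solve would also update voltage-regulator taps and capacitor states carried over from the previous step, which is precisely the interaction snapshot studies miss.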

  8. Multiobjective sensitivity analysis and optimization of distributed hydrologic model MOBIDIC

    NASA Astrophysics Data System (ADS)

    Yang, J.; Castelli, F.; Chen, Y.

    2014-10-01

    Calibration of distributed hydrologic models usually involves dealing with a large number of distributed parameters and optimization problems with multiple, often conflicting objectives that arise in a natural fashion. This study presents a multiobjective sensitivity and optimization approach to handle these problems for the MOBIDIC (MOdello di Bilancio Idrologico DIstribuito e Continuo) distributed hydrologic model, which combines two sensitivity analysis techniques (the Morris method and the state-dependent parameter (SDP) method) with the multiobjective optimization (MOO) approach ɛ-NSGAII (Non-dominated Sorting Genetic Algorithm II). This approach was implemented to calibrate MOBIDIC in its application to the Davidson watershed, North Carolina, with three objective functions, i.e., the standardized root mean square error (SRMSE) of logarithm-transformed discharge, the water balance index, and the mean absolute error of the logarithm-transformed flow duration curve, and its results were compared with those of a single-objective optimization (SOO) with the traditional Nelder-Mead simplex algorithm used in MOBIDIC, taking the objective function as the Euclidean norm of these three objectives. Results show that (1) the two sensitivity analysis techniques are effective and efficient for determining the sensitive processes and insensitive parameters: surface runoff and evaporation are very sensitive processes for all three objective functions, while groundwater recession and soil hydraulic conductivity are not sensitive and were excluded from the optimization. (2) Both MOO and SOO lead to acceptable simulations; e.g., for MOO, the average Nash-Sutcliffe value is 0.75 in the calibration period and 0.70 in the validation period. (3) Evaporation and surface runoff show similar importance for the watershed water balance, while the contribution of baseflow can be ignored. (4) Compared to SOO, which was dependent on the initial starting location, MOO provides more
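
    As a hedged illustration of the first screening step, the Python sketch below runs Morris elementary-effects analysis with the SALib package on a toy stand-in for a hydrologic model response; the parameter names, bounds, and toy model are assumptions, not MOBIDIC itself.

        import numpy as np
        from SALib.sample import morris as morris_sample
        from SALib.analyze import morris as morris_analyze

        # Morris screening on a toy response; mu* ranks parameter influence.
        problem = {
            "num_vars": 3,
            "names": ["soil_conductivity", "runoff_coeff", "evap_coeff"],
            "bounds": [[0.1, 10.0], [0.0, 1.0], [0.0, 1.0]],
        }

        def toy_model(x):
            ks, cr, ce = x
            return cr ** 2 * 100 + ce * 20 + 0.1 * np.log(ks)

        X = morris_sample.sample(problem, N=100, num_levels=4)
        Y = np.apply_along_axis(toy_model, 1, X)
        Si = morris_analyze.analyze(problem, X, Y, num_levels=4,
                                    print_to_console=False)

        for name, mu_star in zip(problem["names"], Si["mu_star"]):
            print(f"{name:18s} mu* = {mu_star:.3f}")

    Parameters with small mu* (here, soil_conductivity) would be fixed and excluded from the subsequent multiobjective optimization, mirroring the workflow described above.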

  9. Numerical analysis of decoy state quantum key distribution protocols

    SciTech Connect

    Harrington, Jim W; Rice, Patrick R

    2008-01-01

    Decoy state protocols are a useful tool for many quantum key distribution systems implemented with weak coherent pulses, allowing significantly better secret bit rates and longer maximum distances. In this paper we present a method to numerically find optimal three-level protocols, and we examine how the secret bit rate and the optimized parameters are dependent on various system properties, such as session length, transmission loss, and visibility. Additionally, we show how to modify the decoy state analysis to handle partially distinguishable decoy states as well as uncertainty in the prepared intensities.

  10. EST analysis pipeline: use of distributed computing resources.

    PubMed

    González, Francisco Javier; Vizcaíno, Juan Antonio

    2011-01-01

    This chapter describes how a pipeline for the analysis of expressed sequence tag (EST) data can be implemented, based on our previous experience generating ESTs from Trichoderma spp. We focus on key steps in the workflow, such as the processing of raw data from the sequencers, the clustering of ESTs, and the functional annotation of the sequences using BLAST, InterProScan, and BLAST2GO. Some of the steps require the use of intensive computing power. Since these resources are not available for small research groups or institutes without bioinformatics support, an alternative will be described: the use of distributed computing resources (local grids and Amazon EC2).

  11. Aeroelastic Analysis of a Distributed Electric Propulsion Wing

    NASA Technical Reports Server (NTRS)

    Massey, Steven J.; Stanford, Bret K.; Wieseman, Carol D.; Heeg, Jennifer

    2017-01-01

    An aeroelastic analysis of a prototype distributed electric propulsion wing is presented. Results using MSC Nastran® doublet lattice aerodynamics are compared to those based on FUN3D Reynolds-Averaged Navier-Stokes aerodynamics. Four levels of grid refinement were examined for the FUN3D solutions, and the solutions were seen to be well converged. It was found that no oscillatory instability existed, only divergence, which occurred in the first bending mode at a dynamic pressure of over three times the flutter clearance condition.

  12. Earthquake interevent time distribution in Kachchh, Northwestern India

    NASA Astrophysics Data System (ADS)

    Pasari, Sumanta; Dikshit, Onkar

    2015-08-01

    Statistical properties of earthquake interevent times have long been the topic of interest to seismologists and earthquake professionals, mainly for hazard-related concerns. In this paper, we present a comprehensive study on the temporal statistics of earthquake interoccurrence times of the seismically active Kachchh peninsula (western India) from thirteen probability distributions. Those distributions are exponential, gamma, lognormal, Weibull, Levy, Maxwell, Pareto, Rayleigh, inverse Gaussian (Brownian passage time), inverse Weibull (Frechet), exponentiated exponential, exponentiated Rayleigh (Burr type X), and exponentiated Weibull distributions. Statistical inferences of the scale and shape parameters of these distributions are discussed from the maximum likelihood estimations and the Fisher information matrices. The latter are used as a surrogate tool to appraise the parametric uncertainty in the estimation process. The results were found on the basis of two goodness-of-fit tests: the maximum likelihood criterion with its modification to Akaike information criterion (AIC) and the Kolmogorov-Smirnov (K-S) minimum distance criterion. These results reveal that (i) the exponential model provides the best fit, (ii) the gamma, lognormal, Weibull, inverse Gaussian, exponentiated exponential, exponentiated Rayleigh, and exponentiated Weibull models provide an intermediate fit, and (iii) the rest, namely Levy, Maxwell, Pareto, Rayleigh, and inverse Weibull, fit poorly to the earthquake catalog of Kachchh and its adjacent regions. This study also analyzes the present-day seismicity in terms of the estimated recurrence interval and conditional probability curves (hazard curves). The estimated cumulative probability and the conditional probability of a magnitude 5.0 or higher event reach 0.8-0.9 by 2027-2036 and 2034-2043, respectively. These values have significant implications in a variety of practical applications including earthquake insurance, seismic zonation
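
    As a hedged illustration of the model-ranking step described above, the Python sketch below fits a few of the named candidate distributions to a synthetic interevent-time sample by maximum likelihood and ranks them by AIC and the Kolmogorov-Smirnov statistic; the real study uses the Kachchh catalog and all thirteen models.

        import numpy as np
        from scipy import stats

        # Fit candidate interevent-time distributions and rank by AIC / K-S.
        rng = np.random.default_rng(7)
        interevent_days = rng.exponential(scale=180.0, size=200)  # stand-in

        candidates = {
            "exponential": stats.expon,
            "gamma": stats.gamma,
            "weibull": stats.weibull_min,
            "lognormal": stats.lognorm,
        }

        for name, dist in candidates.items():
            params = dist.fit(interevent_days, floc=0)   # fix location at 0
            loglik = np.sum(dist.logpdf(interevent_days, *params))
            k = len(params) - 1                          # free params (loc fixed)
            aic = 2 * k - 2 * loglik
            ks = stats.kstest(interevent_days, dist.cdf, args=params).statistic
            print(f"{name:12s} AIC = {aic:8.1f}  K-S = {ks:.4f}")

    On a sample actually drawn from an exponential, the exponential model should win on AIC, mirroring the study's finding (i) for the Kachchh catalog.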

  13. Efficient network meta-analysis: a confidence distribution approach*

    PubMed Central

    Yang, Guang; Liu, Dungang; Liu, Regina Y.; Xie, Minge; Hoaglin, David C.

    2014-01-01

    Summary Network meta-analysis synthesizes several studies of multiple treatment comparisons to simultaneously provide inference for all treatments in the network. It can often strengthen inference on pairwise comparisons by borrowing evidence from other comparisons in the network. Current network meta-analysis approaches are derived from either conventional pairwise meta-analysis or hierarchical Bayesian methods. This paper introduces a new approach for network meta-analysis by combining confidence distributions (CDs). Instead of combining point estimators from individual studies in the conventional approach, the new approach combines CDs which contain richer information than point estimators and thus achieves greater efficiency in its inference. The proposed CD approach can efficiently integrate all studies in the network and provide inference for all treatments even when individual studies contain only comparisons of subsets of the treatments. Through numerical studies with real and simulated data sets, the proposed approach is shown to outperform or at least equal the traditional pairwise meta-analysis and a commonly used Bayesian hierarchical model. Although the Bayesian approach may yield comparable results with a suitably chosen prior, it is highly sensitive to the choice of priors (especially the prior of the between-trial covariance structure), which is often subjective. The CD approach is a general frequentist approach and is prior-free. Moreover, it can always provide a proper inference for all the treatment effects regardless of the between-trial covariance structure. PMID:25067933

  14. Characterization of Flocs and Floc Size Distributions Using Image Analysis

    PubMed Central

    Sun, Siwei; Weber-Shirk, Monroe; Lion, Leonard W.

    2016-01-01

    Abstract A nonintrusive digital imaging process was developed to study particle size distributions created through flocculation and sedimentation. Quantification of particle size distributions under different operating conditions can be of use in the understanding of aggregation mechanisms. This process was calibrated by measuring standardized polystyrene particles of known size and was utilized to count and measure individual kaolin clay particles as well as aggregates formed by coagulation with polyaluminum chloride and flocculation. Identification of out-of-focus flocs was automated with LabVIEW and used to remove them from the database that was analyzed. The particle diameter of the test suspension of kaolinite clay was measured to be 7.7 ± 3.8 μm and a linear relationship was obtained between turbidity and the concentration of clay particles determined by imaging. The analysis technique was applied to characterize flocs and floc particle size distribution as a function of coagulant dose. Removal of flocs by sedimentation was characterized by imaging, and the negative logarithm of the fraction of turbidity remaining after settling had a linear relationship with the logarithm of aluminum dose. The maximum floc size observed in the settled water was less than 120 μm, which was in accordance with the value predicted by a model for the capture velocity of the experimental tube settler of 0.21 mm/s. PMID:26909006

  15. Characterization of Flocs and Floc Size Distributions Using Image Analysis.

    PubMed

    Sun, Siwei; Weber-Shirk, Monroe; Lion, Leonard W

    2016-01-01

    A nonintrusive digital imaging process was developed to study particle size distributions created through flocculation and sedimentation. Quantification of particle size distributions under different operating conditions can be of use in the understanding of aggregation mechanisms. This process was calibrated by measuring standardized polystyrene particles of known size and was utilized to count and measure individual kaolin clay particles as well as aggregates formed by coagulation with polyaluminum chloride and flocculation. Identification of out-of-focus flocs was automated with LabVIEW and used to remove them from the database that was analyzed. The particle diameter of the test suspension of kaolinite clay was measured to be 7.7 ± 3.8 μm and a linear relationship was obtained between turbidity and the concentration of clay particles determined by imaging. The analysis technique was applied to characterize flocs and floc particle size distribution as a function of coagulant dose. Removal of flocs by sedimentation was characterized by imaging, and the negative logarithm of the fraction of turbidity remaining after settling had a linear relationship with the logarithm of aluminum dose. The maximum floc size observed in the settled water was less than 120 μm, which was in accordance with the value predicted by a model for the capture velocity of the experimental tube settler of 0.21 mm/s.

  16. Three-parameter probability distribution density for statistical image analysis

    NASA Astrophysics Data System (ADS)

    Schau, H. C.

    1980-01-01

    Statistical analysis of 2-D image data or data gathered from a scanning radiometer requires that both the non-Gaussian nature and the finite sample size of the process be considered. To aid the statistical analysis of these data, a higher-moment description density function has been defined, and its parameters have been identified with the estimated moments of the data. It is shown that the first two moments may be computed from a knowledge of the Wiener spectrum, whereas all higher moments require the complex spatial frequency spectrum. Parameter identification is carried out for a three-parameter density function and applied to a scene in the IR region, 8-14 microns. Results indicate that a three-parameter distribution density generally provides different probabilities than does a two-parameter Gaussian description if maximum entropy (minimum bias) forms are sought.

  17. [Ordination analysis on relationship between bryophyte distribution and climatic factors].

    PubMed

    Cao, T; Guo, S; Gao, C

    2000-10-01

    Based on data on climate and bryoflora in 21 mountainous regions of China, 61 moss families, 23 genera of Dicranaceae, 17 species of the genus Campylopus, and 35 species of the genus Dicranum were analyzed by Canonical Correspondence Analysis (CCA) and Detrended Canonical Correspondence Analysis (DCCA) to reveal their distribution relationships with nine climatic factors, including annual average temperature, January average temperature, July average temperature, annual average rainfall, annual average fog days, annual average frost days, and annual average light hours. The similarity of geographical elements among nine mountains in China and their relationships with climatic factors were also analyzed. The methods of applying DCCA and CCA to analyze the relationships between bryophytes and climatic factors are thus introduced. The studies indicate that CCA and DCCA are applicable in florology and phytogeography.

  18. Conditions for transmission path analysis in energy distribution models

    NASA Astrophysics Data System (ADS)

    Aragonès, Àngels; Guasch, Oriol

    2016-02-01

    In this work, we explore under which conditions transmission path analysis (TPA), developed for statistical energy analysis (SEA), can be applied to the less restrictive energy distribution (ED) models. It is shown that TPA can be extended without problems to proper-SEA systems, whereas the situation is not so clear for quasi-SEA systems. In the general case, it has been found that a TPA can always be performed on an ED model if its inverse influence energy coefficient (EIC) matrix turns out to have negative off-diagonal entries. If this condition is satisfied, it can be shown that the inverse EIC matrix automatically becomes an M-matrix. An ED graph can then be defined for it, and use can be made of graph-theoretic path ranking algorithms, previously developed for SEA systems, to classify dominant paths in ED models. A small mechanical system consisting of connected plates is used to illustrate some of the exposed theoretical results.
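
    As a quick reference for the matrix condition invoked above, the Python sketch below checks the defining properties of a (nonsingular) M-matrix, namely nonpositive off-diagonal entries together with eigenvalues whose real parts are positive; the example matrix is illustrative, not an EIC matrix from the paper.

        import numpy as np

        def is_m_matrix(a, tol=1e-12):
            """Nonpositive off-diagonals and eigenvalues with Re > 0."""
            off_diag = a - np.diag(np.diag(a))
            if np.any(off_diag > tol):
                return False
            return bool(np.all(np.linalg.eigvals(a).real > tol))

        A = np.array([[ 2.0, -0.5, -0.3],
                      [-0.4,  1.5, -0.2],
                      [-0.1, -0.6,  1.8]])
        print("M-matrix:", is_m_matrix(A))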

  19. Distribution of Deformation on Cyprus, Inferences from Morphotectonic Analysis

    NASA Astrophysics Data System (ADS)

    Altinbas, Cevza; Yildirim, Cengiz; Tuysuz, Okan; Melnick, Daniel

    2016-04-01

    Cyprus is located on the subduction zone between the African and Anatolian Plates. The topography of the island is a result of distributed deformation associated with subduction-related processes south of the Central Anatolian Plateau. The Troodos and Kyrenia mountains are the major morphotectonic units that are integrally tied to plate boundary deformation. To elucidate the mode and pattern of active deformation and the possible effects of subduction-related processes on topography, we integrated morphometric and topographical analyses across the island. Our regional morphometric analysis relies on topographic swath profiles and topographic residuals to identify regional topographic anomalies, as well as on steepness and concavity values of longitudinal river profiles that may reflect ongoing uplift. Accordingly, our swath profiles indicate an asymmetric topography across the Troodos Massif and the Kyrenia Range. The south of the Troodos Massif shows relatively less dissected surfaces, partly associated with Quaternary marine terraces. Our topographic residual analysis also indicates strong relief asymmetry on the Troodos Massif, which might be related to the Arakapas Fault and the lithological contact between Neogene and pre-Neogene rocks. In the north of the island, the Kyrenia Range is a narrow, steep, and long range delimited by the Ovgos Fault in the south. Our swath profiles across the range also display strong southward asymmetry: the southern flank is steeper than the northern flank. The steepness index values of the rivers on the southern flank of the Kyrenia Range do not give a strong signal along the Ovgos Fault. Nevertheless, longitudinal profiles of rivers reveal evident deviations from degraded river profiles on the northern flank. Together with the presence of uplifted marine terraces along the northern flank, this might indicate the presence of onshore structure(s) responsible for coastal uplift or regional uplift of the island because of

  20. A theoretical analysis of basin-scale groundwater temperature distribution

    NASA Astrophysics Data System (ADS)

    An, Ran; Jiang, Xiao-Wei; Wang, Jun-Zhi; Wan, Li; Wang, Xu-Sheng; Li, Hailong

    2015-03-01

    The theory of regional groundwater flow is critical for explaining heat transport by moving groundwater in basins. Domenico and Palciauskas's (1973) pioneering study on convective heat transport in a simple basin assumed that convection has a small influence on redistributing groundwater temperature. Moreover, there has been no research focused on the temperature distribution around stagnation zones among flow systems. In this paper, the temperature distribution in the simple basin is reexamined and that in a complex basin with nested flow systems is explored. In both basins, compared to the temperature distribution due to conduction, convection leads to a lower temperature in most parts of the basin except for a small part near the discharge area. There is a high-temperature anomaly around the basin-bottom stagnation point where two flow systems converge, due to a low degree of convection and a long travel distance, but there is no anomaly around the basin-bottom stagnation point where two flow systems diverge. In the complex basin, there are also high-temperature anomalies around internal stagnation points. Temperature around internal stagnation points could be very high when they are close to the basin bottom, for example, due to a small permeability anisotropy ratio. The temperature distribution revealed in this study could be valuable when using heat as a tracer to identify the pattern of groundwater flow in large-scale basins. Reference: Domenico PA, Palciauskas VV (1973) Theoretical analysis of forced convective heat transfer in regional groundwater flow. Geological Society of America Bulletin 84:3803-3814.

  1. Two-dimensional fluorescence intensity distribution analysis: theory and applications.

    PubMed

    Kask, P; Palo, K; Fay, N; Brand, L; Mets, U; Ullmann, D; Jungmann, J; Pschorr, J; Gall, K

    2000-04-01

    A method of sample analysis is presented which is based on fitting a joint distribution of photon count numbers. In experiments, fluorescence from a microscopic volume containing a fluctuating number of molecules is monitored by two detectors, using a confocal microscope. The two detectors may have different polarizational or spectral responses. Concentrations of fluorescent species together with two specific brightness values per species are determined. The two-dimensional fluorescence intensity distribution analysis (2D-FIDA), if used with a polarization cube, is a tool that is able to distinguish fluorescent species with different specific polarization ratios. As an example of polarization studies by 2D-FIDA, binding of 5'-(6-carboxytetramethylrhodamine) (TAMRA)-labeled theophylline to an anti-theophylline antibody has been studied. Alternatively, if two-color equipment is used, 2D-FIDA can determine concentrations and specific brightness values of fluorescent species corresponding to individual labels alone and their complex. As an example of two-color 2D-FIDA, binding of TAMRA-labeled somatostatin-14 to the human type-2 high-affinity somatostatin receptors present in stained vesicles has been studied. The presented method is unusually accurate among fluorescence fluctuation methods. It is well suited for monitoring a variety of molecular interactions, including receptors and ligands or antibodies and antigens.

  2. Distributed Data Analysis in the ATLAS Experiment: Challenges and Solutions

    NASA Astrophysics Data System (ADS)

    Elmsheuser, Johannes; van der Ster, Daniel

    2012-12-01

    The ATLAS experiment at the LHC at CERN is recording and simulating several tens of petabytes of data per year. To analyse these data the ATLAS experiment has developed and operates a mature and stable distributed analysis (DA) service on the Worldwide LHC Computing Grid. The service is actively used: more than 1400 users submitted jobs in 2011, and a total of more than 1 million jobs run every week. Users are provided with a suite of tools to submit Athena, ROOT or generic jobs to the Grid, and the PanDA workload management system is responsible for their execution. The reliability of the DA service is high and steadily improving; Grid sites are continually validated against a set of standard tests, and a dedicated team of expert shifters provides user support and communicates user problems to the sites. This paper will review the state of the DA tools and services, summarize the past year of distributed analysis activity, and present the directions for future improvements to the system.

  3. Preliminary analysis of hub and spoke air freight distribution system

    NASA Technical Reports Server (NTRS)

    Whitehead, A. H., Jr.

    1978-01-01

    A brief analysis is made of the hub and spoke air freight distribution system, which would employ fewer than 15 hub centers worldwide with very large advanced distributed-load freighters providing the line-haul delivery between hubs. This system is compared to a more conventional network using conventionally designed long-haul freighters which travel between numerous major airports. The analysis calculates all of the transportation costs, including handling charges and pickup and delivery costs. The results show that the economics of the hub/spoke system are severely compromised by the extensive use of feeder aircraft to deliver cargo into and from the large freighter terminals. Not only are the higher costs for the smaller feeder airplanes disadvantageous, but their use implies an additional exchange of cargo between modes compared to truck delivery. The conventional system uses far fewer feeder airplanes, and in many cases, none at all. When feeder aircraft are eliminated from the hub/spoke system, however, that system is universally more economical than any conventional system employing smaller line-haul aircraft.

  4. Distributed Principal Component Analysis for Wireless Sensor Networks

    PubMed Central

    Le Borgne, Yann-Aël; Raybaud, Sylvain; Bontempi, Gianluca

    2008-01-01

    The Principal Component Analysis (PCA) is a data dimensionality reduction technique well-suited for processing data from sensor networks. It can be applied to tasks like compression, event detection, and event recognition. This technique is based on a linear transform where the sensor measurements are projected on a set of principal components. When sensor measurements are correlated, a small set of principal components can explain most of the measurement variability. This makes it possible to significantly decrease the amount of radio communication and energy consumption. In this paper, we show that the power iteration method can be distributed in a sensor network in order to compute an approximation of the principal components. The proposed implementation relies on an aggregation service, which has recently been shown to provide a suitable framework for distributing the computation of a linear transform within a sensor network. We also extend this previous work by providing a detailed analysis of the computational, memory, and communication costs involved. A compression experiment involving real data validates the algorithm and illustrates the tradeoffs between accuracy and communication costs. PMID:27873788
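
    The numerical core of the method above is the power iteration, which the paper distributes through an aggregation service. A minimal, centralized Python sketch of that step is given below; the function name, iteration count and tolerance are illustrative assumptions, and in the sensor-network setting each covariance-vector product would be assembled by the aggregation service rather than computed locally.

      import numpy as np

      def power_iteration(X, n_iter=100, tol=1e-9):
          # Approximate the first principal component of X (rows = samples,
          # columns = sensors) by power iteration on the sample covariance.
          Xc = X - X.mean(axis=0)            # center the measurements
          C = Xc.T @ Xc / (len(Xc) - 1)      # sample covariance matrix
          v = np.random.default_rng(0).normal(size=C.shape[1])
          v /= np.linalg.norm(v)
          for _ in range(n_iter):
              w = C @ v                      # one aggregation round
              w /= np.linalg.norm(w)
              if np.linalg.norm(w - v) < tol:
                  return w
              v = w
          return v

    Projecting the centered measurements on the returned component gives the compressed signal that would be transmitted over the radio.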

  5. Effects of specimen size on the flexural strength and Weibull modulus of nuclear graphite IG-110, NBG-18, and PCEA

    NASA Astrophysics Data System (ADS)

    Chi, Se-Hwan

    2015-09-01

    Changes in flexural strength and Weibull modulus due to specimen size were investigated for three nuclear graphite grades, IG-110, NBG-18, and PCEA, using four-point-1/3 point (4-1/3) loading with specimens of three different sizes: 3.18 (thickness) × 6.35 (width) × 50.8 (length), 6.50 (T) × 12.0 (W) × 52.0 (L), and 18.0 (T) × 16.0 (W) × 64 (L) mm (210 specimens in total). Results showed that some specimen size effects were grade dependent: while NBG-18 showed rather significant specimen size effects (37% difference between the 3T and 18T specimens), the differences in IG-110 and PCEA were 7.6-15%. The maximum differences in flexural strength due to specimen size were larger in PCEA and NBG-18, which have larger coke particles (medium grain size: >300 μm), than in IG-110 with its super-fine coke particle size (25 μm). The Weibull modulus showed a data-population dependency, in that it decreased with increasing numbers of data points used for modulus determination. A good correlation between fracture surface roughness and flexural strength was confirmed.
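
    The abstract does not state how the Weibull modulus was estimated; a common route, sketched here under that assumption, is least squares on a Weibull probability plot with median-rank probabilities (the function and variable names are illustrative):

      import numpy as np

      def weibull_modulus(strengths):
          # Two-parameter Weibull fit of flexural strengths by linear
          # regression on a Weibull plot: ln(-ln(1-F)) = m*ln(s) - m*ln(s0).
          s = np.sort(np.asarray(strengths, dtype=float))
          n = len(s)
          F = (np.arange(1, n + 1) - 0.3) / (n + 0.4)   # median-rank probabilities
          y = np.log(-np.log(1.0 - F))
          m, intercept = np.polyfit(np.log(s), y, 1)    # slope = Weibull modulus m
          s0 = np.exp(-intercept / m)                   # characteristic strength
          return m, s0

    Re-running such a fit on subsets of growing size reproduces the data-population dependency noted above: the estimated modulus drifts as more specimens are included.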

  6. Estimation of genetic parameters for functional longevity in the South African Holstein cattle using a piecewise Weibull proportional hazards model.

    PubMed

    Imbayarwo-Chikosi, V E; Ducrocq, V; Banga, C B; Halimani, T E; van Wyk, J B; Maiwashe, A; Dzama, K

    2017-03-14

    Non-genetic factors influencing functional longevity and the heritability of the trait were estimated in South African Holsteins using a piecewise Weibull proportional hazards model. Data consisted of records of 161,222 daughters of 2,051 sires calving between 1995 and 2013. The reference model included fixed time-independent age at first calving and time-dependent interactions involving lactation number, region, season and age at calving, within-herd class of milk production, fat and protein content, class of annual variation in herd size and the random herd-year effect. Random sire and maternal grandsire effects were added to the model to estimate genetic parameters. The within-lactation Weibull baseline hazards were assumed to change at 0, 270 and 380 days and at the drying date. Within-herd milk production class had the largest contribution to the relative risk of culling. The relative culling risk increased with lower protein and fat per cent production classes and late age at first calving. Cows in large shrinking herds also had a high relative risk of culling. The estimate of the sire genetic variance was 0.0472 ± 0.0017, giving a theoretical heritability estimate of 0.11 in the complete absence of censoring. Genetic trends indicated an overall decrease in functional longevity of 0.014 standard deviations from 1995 to 2007. There are opportunities for including the trait in the breeding objective for South African Holstein cattle.
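
    For readers unfamiliar with the model class, the sketch below illustrates the structure of a Weibull proportional hazards model: a Weibull baseline hazard scaled by a covariate-dependent relative risk. It is a schematic of the model form only, not of the authors' estimation software, and all parameter names are hypothetical.

      import numpy as np

      def weibull_hazard(t, lam, rho):
          # Baseline Weibull hazard h0(t) = lam * rho * t**(rho - 1)
          return lam * rho * t ** (rho - 1.0)

      def relative_risk(beta, x):
          # Proportional-hazards multiplier exp(x' beta) for a covariate
          # vector x (e.g., production class, age at first calving)
          return np.exp(np.dot(beta, x))

      # Full hazard for an animal with covariates x:
      #   h(t | x) = weibull_hazard(t, lam, rho) * relative_risk(beta, x)
      # In the piecewise model above, (lam, rho) change at 0, 270 and
      # 380 days in milk and at the drying date.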

  7. Global sensitivity analysis in wind energy assessment

    NASA Astrophysics Data System (ADS)

    Tsvetkova, O.; Ouarda, T. B.

    2012-12-01

    research show that the brute force method is best for wind assessment purposes, and SBSS outperforms other sampling strategies in the majority of cases. The results indicate that the Weibull scale parameter, turbine lifetime and Weibull shape parameter are the three most influential variables in the case study setting. The following conclusions can be drawn from these results: 1) SBSS should be recommended for use in Monte Carlo experiments, 2) the brute force method should be recommended for conducting sensitivity analysis in wind resource assessment, and 3) little variation in the Weibull scale causes significant variation in energy production. The presence of the two distribution parameters among the top three influential variables (the Weibull shape and scale) emphasizes the importance of accuracy in (a) choosing the distribution to model the wind regime at a site and (b) estimating the probability distribution parameters. This can be labeled as the most important conclusion of this research because it opens a field for further research, which the authors believe could change the wind energy field tremendously.
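
    The conclusion that small variations in the Weibull scale dominate the variation in energy production can be illustrated with a toy Monte Carlo experiment. The sketch below uses the closed form E[v^3] = c^3 * Gamma(1 + 3/k) for a Weibull wind regime as a proxy for energy yield; all input distributions and numbers are invented for illustration, not taken from the study.

      import numpy as np
      from scipy.special import gamma
      from scipy.stats import spearmanr

      rng = np.random.default_rng(42)
      n = 100_000

      c    = rng.normal(7.0, 0.5, n)     # Weibull scale (m/s), hypothetical
      k    = rng.normal(2.0, 0.15, n)    # Weibull shape, hypothetical
      life = rng.normal(20.0, 2.0, n)    # turbine lifetime (years), hypothetical

      # Mean cube of the wind speed under Weibull(k, c), proportional to power
      mean_v3 = c**3 * gamma(1.0 + 3.0 / k)
      energy = mean_v3 * life            # crude proxy for lifetime energy yield

      # Rank correlation as a cheap global sensitivity measure
      for name, x in (("scale c", c), ("shape k", k), ("lifetime", life)):
          print(name, round(spearmanr(x, energy).correlation, 3))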

  8. Phylogenetic analysis reveals a scattered distribution of autumn colours

    PubMed Central

    Archetti, Marco

    2009-01-01

    Background and Aims Leaf colour in autumn is rarely considered informative for taxonomy, but there is now growing interest in the evolution of autumn colours and different hypotheses are debated. Research efforts are hindered by the lack of basic information: the phylogenetic distribution of autumn colours. It is not known when and how autumn colours evolved. Methods Data are reported on the autumn colours of 2368 tree species belonging to 400 genera of the temperate regions of the world, and an analysis is made of their phylogenetic relationships in order to reconstruct the evolutionary origin of red and yellow in autumn leaves. Key Results Red autumn colours are present in at least 290 species (70 genera), and evolved independently at least 25 times. Yellow is present independently from red in at least 378 species (97 genera) and evolved at least 28 times. Conclusions The phylogenetic reconstruction suggests that autumn colours have been acquired and lost many times during evolution. This scattered distribution could be explained by hypotheses involving some kind of coevolutionary interaction or by hypotheses that rely on the need for photoprotection. PMID:19126636

  9. Silk Fiber Mechanics from Multiscale Force Distribution Analysis

    PubMed Central

    Cetinkaya, Murat; Xiao, Senbo; Markert, Bernd; Stacklies, Wolfram; Gräter, Frauke

    2011-01-01

    Here we decipher the molecular determinants for the extreme toughness of spider silk fibers. Our bottom-up computational approach incorporates molecular dynamics and finite element simulations. Therefore, the approach allows the analysis of the internal strain distribution and load-carrying motifs in silk fibers on scales of both molecular and continuum mechanics. We thereby dissect the contributions from the nanoscale building blocks, the soft amorphous and the strong crystalline subunits, to silk fiber mechanics. We identify the amorphous subunits not only to give rise to high elasticity, but to also ensure efficient stress homogenization through the friction between entangled chains, which also allows the crystals to withstand stresses as high as 2 GPa in the context of the amorphous matrix. We show that the maximal toughness of silk is achieved at 10–40% crystallinity depending on the distribution of crystals in the fiber. We also determined a serial arrangement of the crystalline and amorphous subunits in lamellae to outperform a random or a parallel arrangement, putting forward what we believe to be a new structural model for silk and other semicrystalline materials. The multiscale approach, not requiring any empirical parameters, is applicable to other partially ordered polymeric systems. Hence, it is an efficient tool for the design of artificial silk fibers. PMID:21354403

  10. A Distributed Flocking Approach for Information Stream Clustering Analysis

    SciTech Connect

    Cui, Xiaohui; Potok, Thomas E

    2006-01-01

    Intelligence analysts are currently overwhelmed with the amount of information streams generated every day. There is a lack of comprehensive tools that can analyze information streams in real time. Document clustering analysis plays an important role in improving the accuracy of information retrieval. However, most clustering technologies can only be applied to analyzing static document collections because they normally require a large amount of computational resources and a long time to obtain accurate results. It is very difficult to cluster dynamically changing text information streams on an individual computer. Our early research resulted in a dynamic reactive flock clustering algorithm which can continually refine the clustering result and quickly react to changes in document contents. This characteristic makes the algorithm suitable for cluster analysis of dynamically changing document information, such as text information streams. Because of the decentralized character of this algorithm, a distributed approach is a very natural way to increase the clustering speed of the algorithm. In this paper, we present a distributed multi-agent flocking approach for text information stream clustering and discuss the decentralized architectures and communication schemes for load balancing and status information synchronization in this approach.

  11. Sequential Testing of Hypotheses Concerning the Reliability of a System Modeled by a Two-Parameter Weibull Distribution.

    DTIC Science & Technology

    1981-12-01

    preparation of periodic reliability reports. Statistical testing, and in particular Monte Carlo sampling techniques, have proven quite useful when testing and...Reliability. Prepared for: U.S. Army Aviation Systems Command, R&M Division, Product Assurance Directorate (1977). 5. Callahan, J. C. Sequential Probability...Ohio: Air Force Institute of Technology, December 1976. 25. Technical Training, Reliability/Maintainability, Reliability. Study Guide and Workbook

  12. Analysis of scale-invariant slab avalanche size distributions

    NASA Astrophysics Data System (ADS)

    Faillettaz, J.; Louchet, F.; Grasso, J.-R.; Daudon, D.

    2003-04-01

    Scale invariance of snow avalanche sizes was reported for the first time in 2001 by Louchet et al. at the EGS conference, using both acoustic emission duration and the surface of the crown step left at the top of the starting zone, where the former parameter characterises the size of the total avalanche flow, and the latter that of the starting zone. The present paper focuses on parameters of the second type, which are more directly related to precise release mechanisms, viz. the crown crack length L, the crown crack or slab depth H, the crown step surface HxL, the volume HxL^2 of the snow involved in the starting zone, and LxH^2, which is a measure of the stress concentration at the basal crack tip at failure. The analysis is performed on two data sets, from the La Grande Plagne (GP) and Tignes (T) ski resorts. For both catalogs, cumulative distributions of L, H, HxL, HxL^2 and LxH^2 are shown to be roughly linear in a log-log plot, i.e. consistent with so-called scale-invariant (or power law) distributions for both triggered and natural avalanches. Plateaus are observed at small sizes, and roll-offs at large sizes. The power law exponents for each of these quantities are roughly independent of the ski resort (GP or T) they come from. In contrast, exponents for natural events are significantly smaller than those for artificial ones. We analyse the possible reasons for the scale invariance of these quantities, for the possible "universality" of the exponents corresponding to a given triggering mode, and for the difference in exponents between triggered and natural events. The physical meaning of the observed roll-offs and plateaus is also discussed. The power law distributions analysed here provide the occurrence probability of an avalanche of a given (starting) volume in a given time period on a given area. A possible use of this type of distribution for snow avalanche hazard assessment is contemplated, as it is for earthquakes or rockfalls.
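
    As a companion to the analysis above, the sketch below shows one simple way to estimate a power-law exponent from such data: a straight-line fit to the complementary cumulative distribution on log-log axes, with the small-size plateau trimmed by an xmin cutoff. The function name and cutoff are hypothetical, and maximum-likelihood (Hill-type) estimators are a more robust alternative to the log-log regression shown here.

      import numpy as np

      def powerlaw_exponent(sizes, xmin):
          # Slope of the empirical complementary CDF on a log-log plot,
          # ignoring sizes below xmin (the observed plateau region).
          s = np.sort(np.asarray(sizes, dtype=float))
          s = s[s >= xmin]
          n = len(s)
          ccdf = 1.0 - np.arange(n) / n          # P(X >= s_i), never zero
          slope, _ = np.polyfit(np.log(s), np.log(ccdf), 1)
          return -slope                          # power-law exponent of the CCDF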

  13. Reliability analysis of a structural ceramic combustion chamber

    NASA Technical Reports Server (NTRS)

    Salem, Jonathan A.; Manderscheid, Jane M.; Freedman, Marc R.; Gyekenyesi, John P.

    1991-01-01

    The Weibull modulus, fracture toughness and thermal properties of a silicon nitride material used to make a gas turbine combustor were experimentally measured. The location and nature of failure origins resulting from bend tests were determined with fractographic analysis. The measured Weibull parameters were used along with thermal and stress analysis to determine failure probabilities of the combustor with the CARES design code. The effect of data censoring, FEM mesh refinement, and fracture criterion were considered in the analysis.

  14. Reliability analysis of a structural ceramic combustion chamber

    NASA Technical Reports Server (NTRS)

    Salem, Jonathan A.; Manderscheid, Jane M.; Freedman, Marc R.; Gyekenyesi, John P.

    1990-01-01

    The Weibull modulus, fracture toughness and thermal properties of a silicon nitride material used to make a gas turbine combustor were experimentally measured. The location and nature of failure origins resulting from bend tests were determined with fractographic analysis. The measured Weibull parameters were used along with thermal and stress analysis to determine failure probabilities of the combustor with the CARES design code. The effect of data censoring, FEM mesh refinement, and fracture criterion were considered in the analysis.

  15. Bivariate extreme value distributions

    NASA Technical Reports Server (NTRS)

    Elshamy, M.

    1992-01-01

    In certain engineering applications, such as those occurring in the analyses of ascent structural loads for the Space Transportation System (STS), some of the load variables have a lower bound of zero. Thus, the need for practical models of bivariate extreme value probability distribution functions with lower limits was identified. We discuss the Gumbel models and present practical forms of bivariate extreme probability distributions of Weibull and Frechet types with two parameters. Bivariate extreme value probability distribution functions can be expressed in terms of the marginal extreme value distributions and a 'dependence' function subject to certain analytical conditions. Properties of such bivariate extreme distributions, sums and differences of paired extremals, as well as the corresponding forms of conditional distributions, are discussed. Practical estimation techniques are also given.

  16. An Open Architecture for Distributed Malware Collection and Analysis

    NASA Astrophysics Data System (ADS)

    Cavalca, Davide; Goldoni, Emanuele

    Honeynets have become an important tool for researchers and network operators. However, the lack of a unified honeynet data model has impeded their effectiveness, resulting in multiple unrelated data sources, each with its own proprietary access method and format. Moreover, the deployment and management of a honeynet is a time-consuming activity and the interpretation of collected data is far from trivial. HIVE (Honeynet Infrastructure in Virtualized Environment) is a novel highly scalable automated data collection and analysis architecture we designed. Our infrastructure is based on top of proven FLOSS (Free, Libre and Open Source) solutions, which have been extended and integrated with new tools we developed. We use virtualization to ease honeypot management and deployment, combining both high-interaction and low-interaction sensors in a common infrastructure. We also address the need for rapid comprehension and detailed data analysis by harnessing the power of a relational database system, which provides centralized storage and access to the collected data while ensuring its constant integrity. This chapter presents our malware data collection architecture, offering some insight in the structure and benefits of a distributed virtualized honeynet and its development. Finally, we present some techniques for the active monitoring of centralized botnets we integrated in HIVE, which allow us to track the menaces evolution and timely deploy effective countermeasures.

  17. Directional spatial frequency analysis of lipid distribution in atherosclerotic plaque

    NASA Astrophysics Data System (ADS)

    Korn, Clyde; Reese, Eric; Shi, Lingyan; Alfano, Robert; Russell, Stewart

    2016-04-01

    Atherosclerosis is characterized by the growth of fibrous plaques due to the retention of cholesterol and lipids within the artery wall, which can lead to vessel occlusion and cardiac events. One way to evaluate arterial disease is to quantify the amount of lipid present in these plaques, since a higher disease burden is characterized by a higher concentration of lipid. Although therapeutic stimulation of reverse cholesterol transport to reduce cholesterol deposits in plaque has not produced significant results, this may be due to current image analysis methods which use averaging techniques to calculate the total amount of lipid in the plaque without regard to spatial distribution, thereby discarding information that may have significance in marking response to therapy. Here we use Directional Fourier Spatial Frequency (DFSF) analysis to generate a characteristic spatial frequency spectrum for atherosclerotic plaques from C57 Black 6 mice both treated and untreated with a cholesterol scavenging nanoparticle. We then use the Cauchy product of these spectra to classify the images with a support vector machine (SVM). Our results indicate that treated plaque can be distinguished from untreated plaque using this method, where no difference is seen using the spatial averaging method. This work has the potential to increase the effectiveness of current in-vivo methods of plaque detection that also use averaging methods, such as laser speckle imaging and Raman spectroscopy.

  18. Statistical distribution of mechanical properties for three graphite-epoxy material systems

    NASA Technical Reports Server (NTRS)

    Reese, C.; Sorem, J., Jr.

    1981-01-01

    Graphite-epoxy composites are playing an increasing role as viable alternative materials in structural applications, necessitating thorough investigation into the predictability and reproducibility of their material strength properties. This investigation was concerned with tension, compression, and short-beam shear coupon testing of large samples from three different material suppliers to determine their statistical strength behavior. Statistical results indicate that a two-parameter Weibull distribution model provides better overall characterization of material behavior for the graphite-epoxy systems tested than does the standard normal distribution model that is employed for most design work. While either a Weibull or normal distribution model provides adequate predictions for average strength values, the Weibull model provides better characterization in the lower tail region, where the predictions are of maximum design interest. The two sets of the same material were found to have essentially the same material properties, indicating that repeatability can be achieved.
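
    The practical difference between the two models lies in the lower tail, as the abstract notes. A brief sketch of that comparison using SciPy; the data file name is a placeholder, and only the fitting and percentile calls matter:

      import numpy as np
      from scipy import stats

      strengths = np.loadtxt("coupon_strengths.txt")            # placeholder data

      kw, locw, cw = stats.weibull_min.fit(strengths, floc=0)   # 2-param Weibull
      mu, sigma = stats.norm.fit(strengths)

      # Compare the models where design allowables live: the lower tail
      p = 0.10
      print("Weibull 10th percentile:", stats.weibull_min.ppf(p, kw, locw, cw))
      print("Normal  10th percentile:", stats.norm.ppf(p, mu, sigma))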

  19. Application of extreme learning machine for estimation of wind speed distribution

    NASA Astrophysics Data System (ADS)

    Shamshirband, Shahaboddin; Mohammadi, Kasra; Tong, Chong Wen; Petković, Dalibor; Porcu, Emilio; Mostafaeipour, Ali; Ch, Sudheer; Sedaghat, Ahmad

    2016-03-01

    The knowledge of the probabilistic wind speed distribution is of particular significance for reliable evaluation of the wind energy potential and effective adoption of site-specific wind turbines. Among all proposed probability density functions, the two-parameter Weibull function has been extensively endorsed and utilized to model wind speeds and express wind speed distributions in various locations. In this research work, an extreme learning machine (ELM) is employed to compute the shape (k) and scale (c) factors of the Weibull distribution function. The developed ELM model is trained and tested based upon two widely successful methods used to estimate the k and c parameters. The efficiency and accuracy of ELM is compared against support vector machine, artificial neural network and genetic programming for estimating the same Weibull parameters. The results reveal that the ELM approach attains higher precision in the estimation of both Weibull parameters than the other methods evaluated. Mean absolute percentage error, mean absolute bias error and root mean square error for k are 8.4600%, 0.1783 and 0.2371, while for c they are 0.2143%, 0.0118 and 0.0192 m/s, respectively. In conclusion, the application of ELM is particularly promising as an alternative method to estimate the Weibull k and c factors.
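
    The abstract does not name the two estimation methods the ELM was trained against; one widely used empirical candidate is the moment (standard deviation) method of Justus et al., sketched here as an illustration (the function name and input format are assumptions):

      import numpy as np
      from scipy.special import gamma

      def weibull_k_c(wind_speeds):
          # Moment estimates: k from the coefficient of variation,
          # c from the mean and the Gamma function.
          v = np.asarray(wind_speeds, dtype=float)
          vbar, sd = v.mean(), v.std(ddof=1)
          k = (sd / vbar) ** -1.086          # Justus approximation, 1 <= k <= 10
          c = vbar / gamma(1.0 + 1.0 / k)
          return k, c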

  20. Particle shape analysis of volcanic clast samples with the Matlab tool MORPHEO

    NASA Astrophysics Data System (ADS)

    Charpentier, Isabelle; Sarocchi, Damiano; Rodriguez Sedano, Luis Angel

    2013-02-01

    This paper presents a modular Matlab tool, namely MORPHEO, devoted to the study of particle morphology by Fourier analysis. A benchmark made of four sample images with different features (digitized coins, a pebble chart, gears, digitized volcanic clasts) is then proposed to assess the abilities of the software. Attention is given to the Weibull distribution, which is introduced to enhance fine variations of particle morphology. Finally, as an example, samples from a lahar deposit located in La Lumbre ravine (Colima Volcano, Mexico) are analysed. MORPHEO and the benchmark are freely available for research purposes.

  1. Time domain analysis of the weighted distributed order rheological model

    NASA Astrophysics Data System (ADS)

    Cao, Lili; Pu, Hai; Li, Yan; Li, Ming

    2016-11-01

    This paper presents the fundamental solution and relevant properties of the weighted distributed order rheological model in the time domain. Based on the construction of the distributed order damper and the idea of distributed order element networks, this paper studies the weighted distributed order operator of the rheological model, a generalization of the distributed order linear rheological model. The inverse Laplace transform of the weighted distributed order operators of the rheological model has been obtained by cutting the complex plane and computing the complex path integral along the Hankel path, which leads to the asymptotic property and boundary discussions. The relaxation response of the weighted distributed order rheological model is analyzed, and it is closely related to many physical phenomena. A number of novel characteristics of the weighted distributed order rheological model, such as power-law decay and the intermediate phenomenon, have been discovered as well. Several illustrative examples play an important role in validating these results.

  2. Distributions of Autocorrelated First-Order Kinetic Outcomes: Illness Severity

    PubMed Central

    Englehardt, James D.

    2015-01-01

    Many complex systems produce outcomes having recurring, power law-like distributions over wide ranges. However, the form necessarily breaks down at extremes, whereas the Weibull distribution has been demonstrated over the full observed range. Here the Weibull distribution is derived as the asymptotic distribution of generalized first-order kinetic processes, with convergence driven by autocorrelation, and entropy maximization subject to finite positive mean, of the incremental compounding rates. Process increments represent multiplicative causes. In particular, illness severities are modeled as such, occurring in proportion to products of, e.g., chronic toxicant fractions passed by organs along a pathway, or rates of interacting oncogenic mutations. The Weibull form is also argued theoretically and by simulation to be robust to the onset of saturation kinetics. The Weibull exponential parameter is shown to indicate the number and widths of the first-order compounding increments, the extent of rate autocorrelation, and the degree to which process increments are distributed exponential. In contrast with the Gaussian result in linear independent systems, the form is driven not by independence and multiplicity of process increments, but by increment autocorrelation and entropy. In some physical systems the form may be attracting, due to multiplicative evolution of outcome magnitudes towards extreme values potentially much larger and smaller than control mechanisms can contain. The Weibull distribution is demonstrated in preference to the lognormal and Pareto I for illness severities versus (a) toxicokinetic models, (b) biologically-based network models, (c) scholastic and psychological test score data for children with prenatal mercury exposure, and (d) time-to-tumor data of the ED01 study. PMID:26061263
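
    The preference ranking reported above (Weibull over lognormal and Pareto I) can be reproduced for any severity sample by comparing maximized log-likelihoods. A sketch with SciPy, assuming the input array is strictly positive; since all three fits have two free parameters with the location pinned at zero, the raw log-likelihood ranks the models the same way AIC would:

      import numpy as np
      from scipy import stats

      def compare_fits(severities):
          # Maximized log-likelihood of Weibull, lognormal and Pareto I fits;
          # larger is better (equal parameter counts, so AIC ranks identically).
          x = np.asarray(severities, dtype=float)
          candidates = {
              "weibull":   stats.weibull_min,
              "lognormal": stats.lognorm,
              "pareto":    stats.pareto,
          }
          out = {}
          for name, dist in candidates.items():
              params = dist.fit(x, floc=0)           # location pinned at zero
              out[name] = dist.logpdf(x, *params).sum()
          return out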

  3. Weibull Multiplicative Model and Machine Learning Models for Full-Automatic Dark-Spot Detection from SAR Images

    NASA Astrophysics Data System (ADS)

    Taravat, A.; Del Frate, F.

    2013-09-01

    As a major aspect of marine pollution, oil release into the sea has serious biological and environmental impacts. Among remote sensing systems, which offer a non-destructive investigation method, synthetic aperture radar (SAR) can provide valuable synoptic information about the position and size of an oil spill due to its wide area coverage and day/night, all-weather capabilities. In this paper we present a new automated method for oil-spill monitoring. The approach is based on the combination of the Weibull Multiplicative Model and machine learning techniques to differentiate between dark spots and the background. First, a filter based on the Weibull Multiplicative Model is applied to each sub-image. Second, the sub-image is segmented by two different neural network techniques (Pulse-Coupled Neural Networks and Multilayer Perceptron Neural Networks). As the last step, a very simple filtering process is used to eliminate false targets. The proposed approaches were tested on 20 ENVISAT and ERS2 images which contained dark spots. The same parameters were used in all tests. For the overall dataset, average accuracies of 94.05% and 95.20% were obtained for the PCNN and MLP methods, respectively. The average computational time for dark-spot detection on a 256 × 256 image is about 4 s for PCNN segmentation using IDL software, which is currently the fastest in this field. Our experimental results demonstrate that the proposed approach is fast, robust and effective, and that it can be applied to future spaceborne SAR images.

  4. Rod internal pressure quantification and distribution analysis using Frapcon

    SciTech Connect

    Bratton, Ryan N; Jessee, Matthew Anderson; Wieselquist, William A

    2015-09-01

    This report documents work performed supporting the Department of Energy (DOE) Office of Nuclear Energy (NE) Fuel Cycle Technologies Used Fuel Disposition Campaign (UFDC) under work breakdown structure element 1.02.08.10, ST Analysis. In particular, this report fulfills the M4 milestone M4FT-15OR0810036, Quantify effects of power uncertainty on fuel assembly characteristics, within work package FT-15OR081003 ST Analysis-ORNL. This research was also supported by the Consortium for Advanced Simulation of Light Water Reactors (http://www.casl.gov), an Energy Innovation Hub (http://www.energy.gov/hubs) for Modeling and Simulation of Nuclear Reactors under U.S. Department of Energy Contract No. DE-AC05-00OR22725. The discharge rod internal pressure (RIP) and cladding hoop stress (CHS) distributions are quantified for Watts Bar Nuclear Unit 1 (WBN1) fuel rods by modeling core cycle design data, operation data (including modeling significant trips and downpowers), and as-built fuel enrichments and densities of each fuel rod in FRAPCON-3.5. A methodology is developed which tracks inter-cycle assembly movements and assembly batch fabrication information to build individual FRAPCON inputs for each evaluated WBN1 fuel rod. An alternate model for the amount of helium released from the zirconium diboride (ZrB2) integral fuel burnable absorber (IFBA) layer is derived and applied to FRAPCON output data to quantify the RIP and CHS for these types of fuel rods. SCALE/Polaris is used to quantify fuel rod-specific spectral quantities and the amount of gaseous fission products produced in the fuel for use in FRAPCON inputs. Fuel rods with ZrB2 IFBA layers (i.e., IFBA rods) are determined to have RIP predictions that are elevated when compared to fuel rods without IFBA layers (i.e., standard rods) despite the fact that IFBA rods often have reduced fill pressures and annular fuel pellets. The primary contributor to elevated RIP predictions at burnups less than and greater than 30 GWd

  5. A distributed analysis of Human impact on global sediment dynamics

    NASA Astrophysics Data System (ADS)

    Cohen, S.; Kettner, A.; Syvitski, J. P.

    2012-12-01

    Understanding riverine sediment dynamics is an important undertaking both for socially relevant issues such as agriculture, water security and infrastructure management and for scientific analysis of landscapes, river ecology, oceanography and other disciplines. Providing good quantitative and predictive tools is therefore timely, particularly in light of predicted climate and land-use changes. Ever-increasing human activity during the Anthropocene has affected sediment dynamics in two major ways: (1) an increase in hillslope erosion due to agriculture, deforestation and landscape engineering, and (2) trapping of sediment in dams and other man-made reservoirs. The intensity and dynamics of these man-made factors vary widely across the globe and in time and are therefore hard to predict. Using sophisticated numerical models is therefore warranted. Here we use a distributed global riverine sediment flux and water discharge model (WBMsed) to compare a pristine (without human input) and a disturbed (with human input) simulation. Using these 50-year simulations we will show and discuss the complex spatial and temporal patterns of human effects on riverine sediment flux and water discharge.

  6. Finite-key security analysis for multilevel quantum key distribution

    NASA Astrophysics Data System (ADS)

    Brádler, Kamil; Mirhosseini, Mohammad; Fickler, Robert; Broadbent, Anne; Boyd, Robert

    2016-07-01

    We present a detailed security analysis of a d-dimensional quantum key distribution protocol based on two and three mutually unbiased bases (MUBs) both in an asymptotic and finite-key-length scenario. The finite secret key rates (in bits per detected photon) are calculated as a function of the length of the sifted key by (i) generalizing the uncertainty relation-based insight from BB84 to any d-level 2-MUB QKD protocol and (ii) by adopting recent advances in the second-order asymptotics for finite block length quantum coding (for both d-level 2- and 3-MUB QKD protocols). Since the finite and asymptotic secret key rates increase with d and the number of MUBs (together with the tolerable threshold) such QKD schemes could in principle offer an important advantage over BB84. We discuss the possibility of an experimental realization of the 3-MUB QKD protocol with the orbital angular momentum degrees of freedom of photons.

  7. Fourier analysis of polar cap electric field and current distributions

    NASA Technical Reports Server (NTRS)

    Barbosa, D. D.

    1984-01-01

    A theoretical study of high-latitude electric fields and currents, using analytic Fourier analysis methods, is conducted. A two-dimensional planar model of the ionosphere with an enhanced conductivity auroral belt and field-aligned currents at the edges is employed. Two separate topics are treated. A field-aligned current element near the cusp region of the polar cap is included to investigate the modifications to the convection pattern by the east-west component of the interplanetary magnetic field. It is shown that a sizable one-cell structure is induced near the cusp which diverts equipotential contours to the dawnside or duskside, depending on the sign of the cusp current. This produces characteristic dawn-dusk asymmetries to the electric field that have been previously observed over the polar cap. The second topic is concerned with the electric field configuration obtained in the limit of perfect shielding, where the field is totally excluded equatorward of the auroral oval. When realistic field-aligned current distributions are used, the result is to produce severely distorted, crescent-shaped equipotential contours over the cap. Exact, analytic formulae applicable to this case are also provided.

  8. Distributions.

    ERIC Educational Resources Information Center

    Bowers, Wayne A.

    This monograph was written for the Conference of the New Instructional Materials in Physics, held at the University of Washington in summer, 1965. It is intended for students who have had an introductory college physics course. It seeks to provide an introduction to the idea of distributions in general, and to some aspects of the subject in…

  9. Analysis of gamma-ray burst duration distribution using mixtures of skewed distributions

    NASA Astrophysics Data System (ADS)

    Tarnopolski, M.

    2016-05-01

    Two classes of gamma-ray bursts (GRBs) have been confidently identified thus far and are ascribed to different physical scenarios: neutron star-neutron star or neutron star-black hole mergers, and collapse of massive stars, for short and long GRBs, respectively. A third class, intermediate in duration, was suggested to be present in previous catalogues, such as Burst Alert and Transient Source Explorer (BATSE) and Swift, based on statistical tests regarding a mixture of two or three lognormal distributions of T90. However, this might not be an adequate model. This paper investigates whether the distributions of log T90 from BATSE, Swift, and Fermi are described better by a mixture of skewed distributions rather than standard Gaussians. Mixtures of standard normal, skew-normal, sinh-arcsinh and alpha-skew-normal distributions are fitted using a maximum likelihood method. The preferred model is chosen based on the Akaike information criterion. It is found that mixtures of two skew-normal or two sinh-arcsinh distributions are more likely to describe the observed duration distribution of Fermi than a mixture of three standard Gaussians, and that mixtures of two sinh-arcsinh or two skew-normal distributions are models competing with the conventional three-Gaussian one in the case of BATSE and Swift. Based on statistical reasoning, it is shown that other phenomenological models may describe the observed Fermi, BATSE, and Swift duration distributions at least as well as a mixture of standard normal distributions, and the existence of a third (intermediate) class of GRBs in Fermi data is rejected.
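
    A mixture of two skew-normals of the kind preferred for the Fermi data can be fitted by direct maximum likelihood in a few lines; the sketch below (starting values and optimizer options are arbitrary choices, not the paper's) also computes the AIC used for model selection:

      import numpy as np
      from scipy import stats, optimize

      def neg_loglike(params, x):
          # Two-component skew-normal mixture: weight w plus
          # (location, scale, skewness) for each component.
          w, m1, s1, a1, m2, s2, a2 = params
          if not (0.0 < w < 1.0 and s1 > 0.0 and s2 > 0.0):
              return np.inf
          pdf = (w * stats.skewnorm.pdf(x, a1, loc=m1, scale=s1)
                 + (1.0 - w) * stats.skewnorm.pdf(x, a2, loc=m2, scale=s2))
          return -np.sum(np.log(pdf + 1e-300))

      def fit_two_skewnormals(log_t90):
          x = np.asarray(log_t90, dtype=float)
          start = [0.3, -0.5, 0.8, 2.0, 1.5, 0.8, -2.0]   # arbitrary initial guess
          res = optimize.minimize(neg_loglike, start, args=(x,),
                                  method="Nelder-Mead",
                                  options={"maxiter": 20000, "fatol": 1e-9})
          aic = 2 * len(start) + 2 * res.fun              # for model selection
          return res.x, aic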

  10. Frequency distribution histograms for the rapid analysis of data

    NASA Technical Reports Server (NTRS)

    Burke, P. V.; Bullen, B. L.; Poff, K. L.

    1988-01-01

    The mean and standard error are good representations for the response of a population to an experimental parameter and are frequently used for this purpose. Frequency distribution histograms show, in addition, responses of individuals in the population. Both the statistics and a visual display of the distribution of the responses can be obtained easily using a microcomputer and available programs. The type of distribution shown by the histogram may suggest different mechanisms to be tested.

  11. Bayesian Analysis for Binomial Models with Generalized Beta Prior Distributions.

    ERIC Educational Resources Information Center

    Chen, James J.; Novick, Melvin, R.

    1984-01-01

    The Libby-Novick class of three-parameter generalized beta distributions is shown to provide a rich class of prior distributions for the binomial model that removes some restrictions of the standard beta class. A numerical example indicates the desirability of using these wider classes of densities for binomial models. (Author/BW)

  12. Movement of Fuel Ashore: Storage, Capacity, Throughput, and Distribution Analysis

    DTIC Science & Technology

    2015-12-01

    Naval Postgraduate School, Monterey, California. Master's thesis; approved for public release, distribution is unlimited. The thesis analyzes the movement of fuel ashore (storage, capacity, throughput, and distribution) using only the ship-to-shore connectors available to the MEB. Subject terms: Marine Corps, fuel, energy.

  13. Analysis Model for Domestic Hot Water Distribution Systems: Preprint

    SciTech Connect

    Maguire, J.; Krarti, M.; Fang, X.

    2011-11-01

    A thermal model was developed to estimate the energy losses from prototypical domestic hot water (DHW) distribution systems for homes. The developed model, using the TRNSYS simulation software, allows researchers and designers to better evaluate the performance of hot water distribution systems in homes. Modeling results were compared with past experimental study results and showed good agreement.

  14. An analysis of confidence limit calculations used in AAPM Task Group No. 119

    SciTech Connect

    Knill, Cory; Snyder, Michael

    2011-04-15

    Purpose: The report issued by AAPM Task Group No. 119 outlined a procedure for evaluating the effectiveness of IMRT commissioning. The procedure involves measuring gamma pass-rate indices for IMRT plans of standard phantoms and determining if the results fall within a confidence limit set by assuming normally distributed data. As stated in the TG report, the assumption of normally distributed gamma pass rates is a convenient approximation for commissioning purposes, but may not accurately describe the data. Here the authors attempt to better describe gamma pass-rate data by fitting it to different distributions. The authors then calculate updated confidence limits using those distributions and compare them to those derived using TG No. 119 method. Methods: Gamma pass-rate data from 111 head and neck patients are fitted using the TG No. 119 normal distribution, a truncated normal distribution, and a Weibull distribution. Confidence limits to 95% are calculated for each and compared. A more general analysis of the expected differences between the TG No. 119 method of determining confidence limits and a more time-consuming curve fitting method is performed. Results: The TG No. 119 standard normal distribution does not fit the measured data. However, due to the small range of measured data points, the inaccuracy of the fit has only a small effect on the final value of the confidence limits. The confidence limits for the 111 patient plans are within 0.1% of each other for all distributions. The maximum expected difference in confidence limits, calculated using TG No. 119's approximation and a truncated distribution, is 1.2%. Conclusions: A three-parameter Weibull probability distribution more accurately fits the clinical gamma index pass-rate data than the normal distribution adopted by TG No. 119. However, the sensitivity of the confidence limit on distribution fit is low outside of exceptional circumstances.
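
    The two confidence-limit recipes compared in the paper can be sketched as follows. The pass-rate file name is a placeholder; the normal-theory limit is expressed here as a lower bound on the pass rate, which is equivalent to the TG No. 119 deviation form (100 - mean) + 1.96*sigma, and the Weibull alternative fits the pass-rate deficit (assumed strictly positive):

      import numpy as np
      from scipy import stats

      pass_rates = np.loadtxt("gamma_pass_rates.txt")    # percent, placeholder

      # Normal-theory limit: lower 95% bound on the pass rate
      mu, sd = pass_rates.mean(), pass_rates.std(ddof=1)
      lower_normal = mu - 1.96 * sd

      # Alternative: three-parameter Weibull on the deficit 100 - pass rate;
      # its 95th percentile plays the role of the action level
      deficit = 100.0 - pass_rates
      shape, loc, scale = stats.weibull_min.fit(deficit)
      lower_weibull = 100.0 - stats.weibull_min.ppf(0.95, shape, loc, scale)

      print(round(lower_normal, 2), round(lower_weibull, 2))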

  15. Probability distributions of bed load particle velocities, accelerations, hop distances, and travel times informed by Jaynes's principle of maximum entropy

    NASA Astrophysics Data System (ADS)

    Furbish, David Jon; Schmeeckle, Mark W.; Schumer, Rina; Fathel, Siobhan L.

    2016-07-01

    We describe the most likely forms of the probability distributions of bed load particle velocities, accelerations, hop distances, and travel times, in a manner that formally appeals to inferential statistics while honoring mechanical and kinematic constraints imposed by equilibrium transport conditions. The analysis is based on E. Jaynes's elaboration of the implications of the similarity between the Gibbs entropy in statistical mechanics and the Shannon entropy in information theory. By maximizing the information entropy of a distribution subject to known constraints on its moments, our choice of the form of the distribution is unbiased. The analysis suggests that particle velocities and travel times are exponentially distributed and that particle accelerations follow a Laplace distribution with zero mean. Particle hop distances, viewed alone, ought to be distributed exponentially. However, the covariance between hop distances and travel times precludes this result. Instead, the covariance structure suggests that hop distances follow a Weibull distribution. These distributions are consistent with high-resolution measurements obtained from high-speed imaging of bed load particle motions. The analysis brings us closer to choosing distributions based on our mechanical insight.
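
    Checking the predicted forms against measured particle motions reduces to a pair of standard fits; a brief sketch under the assumption that hop distances and travel times are available as plain arrays (the file names are placeholders):

      import numpy as np
      from scipy import stats

      hops  = np.loadtxt("hop_distances.txt")    # placeholder tracking data
      times = np.loadtxt("travel_times.txt")

      # Maximum-entropy predictions: exponential travel times, Weibull hops
      rate = 1.0 / times.mean()                          # exponential MLE
      k, _, lam = stats.weibull_min.fit(hops, floc=0)    # Weibull shape and scale
      print(f"travel-time rate: {rate:.3g}  hop Weibull shape: {k:.3g}")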

  16. Probability distributions of bed load particle velocities, accelerations, hop distances, and travel times informed by Jaynes's principle of maximum entropy

    USGS Publications Warehouse

    Furbish, David J.; Schmeeckle, Mark; Schumer, Rina; Fathel, Siobhan L.

    2016-01-01

    We describe the most likely forms of the probability distributions of bed load particle velocities, accelerations, hop distances, and travel times, in a manner that formally appeals to inferential statistics while honoring mechanical and kinematic constraints imposed by equilibrium transport conditions. The analysis is based on E. Jaynes's elaboration of the implications of the similarity between the Gibbs entropy in statistical mechanics and the Shannon entropy in information theory. By maximizing the information entropy of a distribution subject to known constraints on its moments, our choice of the form of the distribution is unbiased. The analysis suggests that particle velocities and travel times are exponentially distributed and that particle accelerations follow a Laplace distribution with zero mean. Particle hop distances, viewed alone, ought to be distributed exponentially. However, the covariance between hop distances and travel times precludes this result. Instead, the covariance structure suggests that hop distances follow a Weibull distribution. These distributions are consistent with high-resolution measurements obtained from high-speed imaging of bed load particle motions. The analysis brings us closer to choosing distributions based on our mechanical insight.

  17. Analysis of DNS cache effects on query distribution.

    PubMed

    Wang, Zheng

    2013-01-01

    This paper studies the DNS cache effects that occur on query distribution at the CN top-level domain (TLD) server. We first filter out malformed DNS queries to purify the log data according to six categories. A model for DNS resolution, more specifically DNS caching, is presented. We demonstrate the presence and magnitude of DNS cache effects and cache sharing effects on the request distribution through an analytic model and simulation. CN TLD log data results are provided and analyzed based on the cache model. The approximate TTL distribution for domain names is inferred quantitatively.

  18. Analysis of temperature distribution in liquid-cooled turbine blades

    NASA Technical Reports Server (NTRS)

    Livingood, John N B; Brown, W Byron

    1952-01-01

    The temperature distribution in liquid-cooled turbine blades determines the amount of cooling required to reduce the blade temperature to permissible values at specified locations. This report presents analytical methods for computing temperature distributions in liquid-cooled turbine blades, or in simplified shapes used to approximate sections of the blade. The individual analyses are first presented in terms of their mathematical development. By means of numerical examples, comparisons are made between simplified and more complete solutions and the effects of several variables are examined. Nondimensional charts to simplify some temperature-distribution calculations are also given.

  19. Analysis of DNS Cache Effects on Query Distribution

    PubMed Central

    2013-01-01

    This paper studies the DNS cache effects that occur on query distribution at the CN top-level domain (TLD) server. We first filter out malformed DNS queries to purify the log data according to six categories. A model for DNS resolution, more specifically DNS caching, is presented. We demonstrate the presence and magnitude of DNS cache effects and cache sharing effects on the request distribution through an analytic model and simulation. CN TLD log data results are provided and analyzed based on the cache model. The approximate TTL distribution for domain names is inferred quantitatively. PMID:24396313

  20. Characteristics of service requests and service processes of fire and rescue service dispatch centers: analysis of real world data and the underlying probability distributions.

    PubMed

    Krueger, Ute; Schimmelpfeng, Katja

    2013-03-01

    A sufficient staffing level in fire and rescue dispatch centers is crucial for saving lives. Therefore, it is important to estimate the expected workload properly. For this purpose, we analyzed whether a dispatch center can be considered a call center. Current call center publications very often model call arrivals as a non-homogeneous Poisson process. This is based on the underlying assumption of each caller's independent decision to call or not to call. In case of an emergency, however, there are often calls from more than one person reporting the same incident and thus these calls are not independent. Therefore, this paper focuses on the dependency of calls in a fire and rescue dispatch center. We analyzed and evaluated several distributions in this setting. Results are illustrated using real-world data collected from a typical German dispatch center in Cottbus ("Leitstelle Lausitz"). We identified the Pólya distribution as being superior to the Poisson distribution in describing the call arrival rate, and the Weibull distribution to be more suitable than the exponential distribution for interarrival times and service times. However, the commonly used distributions offer acceptable approximations. This is important for estimating a sufficient staffing level in practice using, e.g., the Erlang-C model.
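
    The distributional comparison described above amounts to fitting the competing models to the same record and ranking them by AIC. A sketch for the interarrival times; the data file name is a placeholder, and the same pattern applies to service times:

      import numpy as np
      from scipy import stats

      gaps = np.loadtxt("interarrival_seconds.txt")   # placeholder call-gap data

      _, scale = stats.expon.fit(gaps, floc=0)        # exponential: 1 parameter
      ll_exp = stats.expon.logpdf(gaps, 0, scale).sum()
      k, _, lam = stats.weibull_min.fit(gaps, floc=0) # Weibull: 2 parameters
      ll_wbl = stats.weibull_min.logpdf(gaps, k, 0, lam).sum()

      print("AIC exponential:", 2 * 1 - 2 * ll_exp)
      print("AIC Weibull:    ", 2 * 2 - 2 * ll_wbl)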

  1. Flaw strength distributions and statistical parameters for ceramic fibers: The normal distribution

    NASA Astrophysics Data System (ADS)

    R'Mili, M.; Godin, N.; Lamon, J.

    2012-05-01

    The present paper investigates large sets of ceramic fibre failure strengths (500 to 1000 data points) produced using tensile tests on tows that contained either 500 or 1000 filaments. The probability density function was determined through acoustic emission monitoring, which allowed detection and counting of filament fractures. The statistical distribution of filament strengths was described using the normal distribution. The Weibull equation was then fitted to this normal distribution for estimation of statistical parameters. A perfect agreement between both distributions was obtained, and quite negligible scatter in statistical parameters was observed, as opposed to the wide variability that is reported in the literature. Thus it was concluded that flaw strengths are distributed normally and that the statistical parameters derived here are the true ones. In a second step, the conventional method of estimation of Weibull parameters was applied to these sets of data and then to subsets selected randomly. The influence of other factors involved in the conventional method of determination of statistical parameters is discussed. It is demonstrated that the selection of specimens, the sample size, and the method of construction of so-called Weibull plots are responsible for the variability in statistical parameters.
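
    The sensitivity of conventional Weibull estimates to sample size that the paper highlights is easy to reproduce: draw normally distributed flaw strengths and fit a Weibull to subsets of increasing size. All numbers below are invented for illustration:

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(0)
      strengths = rng.normal(2.4, 0.4, 1000)   # hypothetical normal strengths (GPa)

      for n in (30, 100, 1000):
          sub = rng.choice(strengths, size=n, replace=False)
          m, _, s0 = stats.weibull_min.fit(sub, floc=0)
          print(f"n={n:4d}  estimated Weibull modulus = {m:.1f}")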

  2. Determination analysis of energy conservation standards for distribution transformers

    SciTech Connect

    Barnes, P.R.; Van Dyke, J.W.; McConnell, B.W.; Das, S.

    1996-07-01

    This report contains information for US DOE to use in making a determination on proposing energy conservation standards for distribution transformers as required by the Energy Policy Act of 1992. Potential for saving energy with more efficient liquid-immersed and dry-type distribution transformers could be significant because these transformers account for an estimated 140 billion kWh of the annual energy lost in the delivery of electricity. Objective was to determine whether energy conservation standards for distribution transformers would have the potential for significant energy savings, be technically feasible, and be economically justified from a national perspective. It was found that energy conservation for distribution transformers would be technically and economically feasible. Based on the energy conservation options analyzed, 3.6-13.7 quads of energy could be saved from 2000 to 2030.

  3. A Guide for Analysis Using Advanced Distributed Simulation (ADS)

    DTIC Science & Technology

    1997-01-01

    within a broader analysis strategy, experimental design, exercise preparation and management, and post-exercise analysis. Because it is impossible to... Decisionmakers, such as program managers, who need to determine how ADS might support their analysis needs and how to interpret ADS analysis products... Management and System Acquisition.

  4. Stability analysis of linear fractional differential system with distributed delays

    NASA Astrophysics Data System (ADS)

    Veselinova, Magdalena; Kiskinov, Hristo; Zahariev, Andrey

    2015-11-01

    In the present work we study the Cauchy problem for a linear incommensurate fractional differential system with distributed delays. For the autonomous case with distributed delays and derivatives in the Riemann-Liouville or Caputo sense, we establish sufficient conditions under which the zero solution is globally asymptotically stable. The established conditions coincide with the conditions that guarantee the same result in the particular case of a system with constant delays, and also for a system without delays in the commensurate case.

  5. Thermal Analysis of Antenna Structures. Part 2: Panel Temperature Distribution

    NASA Technical Reports Server (NTRS)

    Schonfeld, D.; Lansing, F. L.

    1983-01-01

    This article is the second in a series that analyzes the temperature distribution in microwave antennas. An analytical solution in a series form is obtained for the temperature distribution in a flat plate analogous to an antenna surface panel under arbitrary temperature and boundary conditions. The solution includes the effects of radiation and air convection from the plate. Good agreement is obtained between the numerical and analytical solutions.

  6. Nanocrystal size distribution analysis from transmission electron microscopy images

    NASA Astrophysics Data System (ADS)

    van Sebille, Martijn; van der Maaten, Laurens J. P.; Xie, Ling; Jarolimek, Karol; Santbergen, Rudi; van Swaaij, René A. C. M. M.; Leifer, Klaus; Zeman, Miro

    2015-12-01

    We propose a method, with minimal bias caused by user input, to quickly detect and measure the nanocrystal size distribution from transmission electron microscopy (TEM) images using a combination of Laplacian of Gaussian filters and non-maximum suppression. We demonstrate the proposed method on bright-field TEM images of an a-SiC:H sample containing embedded silicon nanocrystals with varying magnifications and we compare the accuracy and speed with size distributions obtained by manual measurements, a thresholding method and PEBBLES. Finally, we analytically consider the error induced by slicing nanocrystals during TEM sample preparation on the measured nanocrystal size distribution and formulate an equation to correct this effect.
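
    An off-the-shelf analogue of the detection step is available in scikit-image: blob_log applies the Laplacian of Gaussian across scales and performs the non-maximum suppression internally. The sketch below is not the authors' implementation; the file name and thresholds are placeholders, and the inversion assumes nanocrystals appear dark in bright-field images:

      import numpy as np
      from skimage import io, feature

      img = io.imread("tem_bright_field.png", as_gray=True)   # placeholder image

      blobs = feature.blob_log(1.0 - img,       # invert: crystals darker than matrix
                               min_sigma=2, max_sigma=10, num_sigma=15,
                               threshold=0.05)
      radii = blobs[:, 2] * np.sqrt(2.0)        # LoG sigma -> approximate radius (px)
      print("detected:", len(blobs), "mean radius (px):", radii.mean())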

  7. Is There a DDOC in the House?: An Analysis of the Deployment Distribution Operations Center

    DTIC Science & Technology

    2009-05-04

    2005 analysis of the status of Department of Defense (DOD) supply distribution processes, the Government Accounting Office (GAO) summarized the... Government Accountability Office, Defense Logistics DOD Has Begun to Improve Supply Distribution Operations, but Further Actions Are Needed to Sustain...Defense Logistics Agency DOD – Department of Defense DPO – Distribution Process Owner GAO – Government Accounting Office GCC- Geographic

  8. Methods for Dynamic Analysis of Distribution Feeders with High Penetration of PV Generators

    SciTech Connect

    Nagarajan, Adarsh; Ayyanar, Raja

    2016-11-21

    An increase in the number of inverter-interfaced photovoltaic (PV) generators on existing distribution feeders affects the design, operation, and control of the distribution systems. Existing distribution system analysis tools are capable of supporting only snapshot and quasi-static analyses. Capturing the dynamic effects of PV generators during the variation in distribution system states is necessary when studying the effects of controller bandwidths, multiple voltage correction devices, and anti-islanding. This work explores the use of dynamic phasors and differential algebraic equations (DAE) for impact analysis of PV generators on the existing distribution feeders.

  9. Distributional Assumptions in Educational Assessments Analysis: Normal Distributions versus Generalized Beta Distribution in Modeling the Phenomenon of Learning

    ERIC Educational Resources Information Center

    Campos, Jose Alejandro Gonzalez; Moraga, Paulina Saavedra; Del Pozo, Manuel Freire

    2013-01-01

    This paper introduces the generalized beta (GB) model as a new modeling tool in the educational assessment area and, specifically, in evaluation analysis. Unlike the normal model, the GB model allows us to capture some real characteristics of the data, and it is an important tool for understanding the phenomenon of learning. This paper develops a contrast with the…

  10. Performance Analysis of Distributed Object-Oriented Applications

    NASA Technical Reports Server (NTRS)

    Schoeffler, James D.

    1998-01-01

    The purpose of this research was to evaluate the efficiency of a distributed simulation architecture which creates individual modules which are made self-scheduling through the use of a message-based communication system used for requesting input data from another module which is the source of that data. To make the architecture as general as possible, the message-based communication architecture was implemented using standard remote object architectures (Common Object Request Broker Architecture (CORBA) and/or Distributed Component Object Model (DCOM)). A series of experiments were run in which different systems are distributed in a variety of ways across multiple computers and the performance evaluated. The experiments were duplicated in each case so that the overhead due to message communication and data transmission can be separated from the time required to actually perform the computational update of a module each iteration. The software used to distribute the modules across multiple computers was developed in the first year of the current grant and was modified considerably to add a message-based communication scheme supported by the DCOM distributed object architecture. The resulting performance was analyzed using a model created during the first year of this grant which predicts the overhead due to CORBA and DCOM remote procedure calls and includes the effects of data passed to and from the remote objects. A report covering the distributed simulation software and the results of the performance experiments has been submitted separately. The above report also discusses possible future work to apply the methodology to dynamically distribute the simulation modules so as to minimize overall computation time.

  11. Electron microprobe analysis of elemental distribution in excavated human femurs

    SciTech Connect

    Lambert, J.B.; Simpson, S.V.; Buikstra, J.E.; Hanson, D.

    1983-12-01

    Elemental distributions have been determined for femur cross sections of eight individuals from the Gibson and Ledders Woodland sites. The analyses were obtained by x-ray fluorescence with a scanning electron microscope. Movement of an element from soil to bone should give rise to inhomogeneous distributions within the bone. We found that the distributions of zinc, strontium, and lead are homogeneous throughout the femur. In contrast, iron, aluminum, potassium, and manganese show clear buildup along the outer surface of the femur and sometimes along the inner (endosteal) surface, as the result of postmortem enrichment. The buildup penetrates 10-400 micron into the femur. The major elements calcium and sodium show homogeneous distributions, but considerable material could be lost by leaching (10-15%) without causing a palpable effect on the electron maps. Magnesium shows buildup on the outer edge of some samples. These results suggest that diagenetic contamination may exclude Fe, Al, K, Mn, and probably Mg from use as indicators of ancient diet. The homogeneous distributions of Zn, Sr, and Pb suggest that these elements are not altered appreciably and may serve as useful dietary indicators.

  12. Statistical analysis and modelling of small satellite reliability

    NASA Astrophysics Data System (ADS)

    Guo, Jian; Monas, Liora; Gill, Eberhard

    2014-05-01

    This paper attempts to characterize failure behaviour of small satellites through statistical analysis of actual in-orbit failures. A unique Small Satellite Anomalies Database comprising empirical failure data of 222 small satellites has been developed. A nonparametric analysis of the failure data has been implemented by means of a Kaplan-Meier estimation. An innovative modelling method, i.e. Bayesian theory in combination with Markov Chain Monte Carlo (MCMC) simulations, has been proposed to model the reliability of small satellites. An extensive parametric analysis using the Bayesian/MCMC method has been performed to fit a Weibull distribution to the data. The influence of several characteristics such as the design lifetime, mass, launch year, mission type and the type of satellite developers on the reliability has been analyzed. The results clearly show the infant mortality of small satellites. Compared with the classical maximum-likelihood estimation methods, the proposed Bayesian/MCMC method results in better fitting Weibull models and is especially suitable for reliability modelling where only very limited failures are observed.
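
    To make the Bayesian/MCMC approach concrete, here is a minimal Metropolis sketch for fitting a two-parameter Weibull reliability model; the failure times, flat priors, proposal widths, and burn-in are invented for illustration, and censoring (essential for real satellite data) is ignored:

    ```python
    # Metropolis sampling of the posterior over Weibull shape k and scale lam
    # given fully observed failure times, with flat priors on k, lam > 0.
    import numpy as np

    rng = np.random.default_rng(0)
    t = np.array([0.3, 0.8, 1.1, 2.5, 4.0, 7.2])  # hypothetical failure times (years)

    def log_post(k, lam):
        if k <= 0 or lam <= 0:
            return -np.inf                     # flat prior on the positive quadrant
        z = t / lam
        return np.sum(np.log(k / lam) + (k - 1) * np.log(z) - z**k)

    samples, (k, lam) = [], (1.0, 1.0)
    for _ in range(20000):
        k_p, lam_p = k + 0.1 * rng.normal(), lam + 0.3 * rng.normal()
        if np.log(rng.uniform()) < log_post(k_p, lam_p) - log_post(k, lam):
            k, lam = k_p, lam_p                # accept the proposal
        samples.append((k, lam))
    k_hat, lam_hat = np.mean(samples[5000:], axis=0)  # discard burn-in
    print(f"posterior means: shape {k_hat:.2f}, scale {lam_hat:.2f}")
    ```

    A posterior shape parameter below 1 corresponds to a decreasing hazard rate, which is how the infant-mortality finding reported above would show up in such a fit.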

  13. Analysis of Low Probability of Intercept (LPI) Radar Signals Using the Wigner Distribution

    DTIC Science & Technology

    2002-09-01

    Master's Thesis: Analysis of Low Probability of Intercept (LPI) Radar Signals Using the Wigner Distribution, by Jen-Yu Gau, September 2002; Thesis Advisor: Phillip E. Pace; Thesis Co-... Approved for public release; distribution is unlimited. From the abstract: The parameters of Low Probability of Intercept (LPI) radar signals are hard to identify by...
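
    For reference, the time-frequency tool named in the title is the Wigner (Wigner-Ville) distribution; its standard textbook definition, not quoted from the thesis, is:

    ```latex
    % Wigner-Ville distribution of a signal x(t): a quadratic time-frequency
    % representation that concentrates the energy of LPI waveforms.
    W_x(t, f) = \int_{-\infty}^{\infty} x\!\left(t + \tfrac{\tau}{2}\right)
                x^{*}\!\left(t - \tfrac{\tau}{2}\right) e^{-j 2\pi f \tau}\, \mathrm{d}\tau
    ```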

  14. Commentary on "Exploring the Sensitivity of Horn's Parallel Analysis to the Distributional Form of Random Data"

    ERIC Educational Resources Information Center

    Hayton, James C.

    2009-01-01

    In the article "Exploring the Sensitivity of Horn's Parallel Analysis to the Distributional Form of Random Data," Dinno (this issue) provides strong evidence that the distribution of random data does not have a significant influence on the outcome of the analysis. Hayton appreciates the thorough approach to evaluating this assumption, and agrees…

  15. A fractal approach to dynamic inference and distribution analysis.

    PubMed

    van Rooij, Marieke M J W; Nash, Bertha A; Rajaraman, Srinivasan; Holden, John G

    2013-01-01

    Event-distributions inform scientists about the variability and dispersion of repeated measurements. This dispersion can be understood from a complex systems perspective, and quantified in terms of fractal geometry. The key premise is that a distribution's shape reveals information about the governing dynamics of the system that gave rise to the distribution. Two categories of characteristic dynamics are distinguished: additive systems governed by component-dominant dynamics and multiplicative or interdependent systems governed by interaction-dominant dynamics. A logic by which systems governed by interaction-dominant dynamics are expected to yield mixtures of lognormal and inverse power-law samples is discussed. These mixtures are described by a so-called cocktail model of response times derived from human cognitive performances. The overarching goals of this article are twofold: First, to offer readers an introduction to this theoretical perspective and second, to offer an overview of the related statistical methods.

  16. Analysis and machine mapping of the distribution of band recoveries

    USGS Publications Warehouse

    Cowardin, L.M.

    1977-01-01

    A method of calculating distance and bearing from banding site to recovery location based on the solution of a spherical triangle is presented. X and Y distances on an ordinate grid were applied to computer plotting of recoveries on a map. The advantages and disadvantages of tables of recoveries by State or degree block, axial lines, and distance of recovery from banding site for presentation and comparison of the spatial distribution of band recoveries are discussed. A special web-shaped partition formed by concentric circles about the point of banding and great circles at 30-degree intervals through the point of banding has certain advantages over other methods. Comparison of distributions by means of a χ² contingency test is illustrated. The statistic V = χ²/N can be used as a measure of difference between two distributions of band recoveries and its possible use is illustrated as a measure of the degree of migrational homing.
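
    A minimal sketch of the spherical-triangle computation described above, using the standard haversine and initial-bearing formulas (the function name, argument order, and Earth radius are illustrative):

    ```python
    # Great-circle distance and initial bearing from a banding site
    # (lat1, lon1) to a recovery location (lat2, lon2), in degrees.
    import math

    def distance_and_bearing(lat1, lon1, lat2, lon2, radius_km=6371.0):
        p1, p2 = math.radians(lat1), math.radians(lat2)
        dlon = math.radians(lon2 - lon1)
        # Haversine great-circle distance.
        a = (math.sin((p2 - p1) / 2)**2
             + math.cos(p1) * math.cos(p2) * math.sin(dlon / 2)**2)
        dist = 2 * radius_km * math.asin(math.sqrt(a))
        # Initial bearing measured clockwise from north.
        y = math.sin(dlon) * math.cos(p2)
        x = math.cos(p1) * math.sin(p2) - math.sin(p1) * math.cos(p2) * math.cos(dlon)
        bearing = math.degrees(math.atan2(y, x)) % 360.0
        return dist, bearing
    ```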

  17. Analysis of Geographical Distribution Patterns in Plants Using Fractals

    NASA Astrophysics Data System (ADS)

    Bari, A.; Ayad, G.; Padulosi, S.; Hodgkin, T.; Martin, A.; Gonzalez-Andujar, J. L.; Brown, A. H. D.

    Geographical distribution patterns in plants have been observed since primeval times and have been used by plant explorers to trace the origin of plants species. These patterns embody the effects of fundamental law-like processes. Diversity in plants has also been found to be proportionate with the area, and this scaling behavior is also known as fractal behavior. In the present study, we use fractal geometry to analyze the distribution patterns of wild taxa of cowpea with the objective to locate where their diversity would be the highest to aid in the planning of targeted explorations and conservation measures.

  18. Metagenomic Analysis of Water Distribution System Bacterial Communities

    EPA Science Inventory

    The microbial quality of drinking water is assessed using culture-based methods that are highly selective and that tend to underestimate the densities and diversity of microbial populations inhabiting distribution systems. In order to better understand the effect of different dis...

  19. Analysis of the Relationship between Shared Leadership and Distributed Leadership

    ERIC Educational Resources Information Center

    Goksoy, Suleyman

    2016-01-01

    Problem Statement: The current study's purpose is: First, to examine the relationship between shared leadership and distributed leadership, which, despite having many similar aspects in theory and practice, are defined as separate concepts. Second, to compare the two approaches and dissipate the theoretical contradictions. In this sense, the main…

  20. Bayesian Analysis of the Mass Distribution of Neutron Stars

    NASA Astrophysics Data System (ADS)

    Valentim, Rodolfo; Horvath, Jorge E.; Rangel, Eraldo M.

    The distribution of masses for neutron stars is analyzed using Bayesian statistical inference, evaluating the likelihood of an a priori two-Gaussian-peak distribution by using fifty-five measured points obtained in a variety of systems. The results strongly suggest the existence of a bimodal distribution of the masses, with the first peak around 1.35M⊙ ± 0.06M⊙ and a much wider second peak at 1.73M⊙ ± 0.36M⊙. We compared the two-Gaussian model centered at 1.35M⊙ and 1.55M⊙ against a "single Gaussian" model with 1.50M⊙ ± 0.11M⊙ using 3σ, which provided a wide peak covering the full range of observed masses. In order to compare models, the BIC (Bayesian Information Criterion) can be used, and strong evidence for the two-distribution model against the one-peak model was found. The results support earlier views relating the first two peaks to the different evolutionary histories of their members, which produces a natural separation (in spite of the fact that no attempt to "label" the systems has been made). However, the recently claimed low-mass group, possibly related to O-Mg-Ne core collapse events, has a monotonically decreasing likelihood and has not been identified within this sample.
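
    The model comparison described above can be sketched as follows: fit one- and two-component Gaussian mixtures and compare BIC scores. The mass sample below is synthetic, drawn to resemble the quoted peaks, not the paper's fifty-five measurements:

    ```python
    # Compare one- vs two-component Gaussian mixtures by BIC (lower is better).
    import numpy as np
    from sklearn.mixture import GaussianMixture

    rng = np.random.default_rng(1)
    masses = np.concatenate([rng.normal(1.35, 0.06, 30),     # synthetic first peak
                             rng.normal(1.73, 0.36, 25)])    # synthetic second peak
    masses = masses.reshape(-1, 1)

    for k in (1, 2):
        gm = GaussianMixture(n_components=k, random_state=0).fit(masses)
        print(f"{k} component(s): BIC = {gm.bic(masses):.1f}")
    ```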

  1. The Elementary and Secondary Education Act: A Distributional Analysis.

    ERIC Educational Resources Information Center

    Barkin, David; Hettich, Walter

    This study analyzes interstate redistribution of Federal tax money under Title One of the Elementary and Secondary Education Act of 1965. First, the consistency of the criteria used to distribute funds is studied to see if people of similar financial positions are treated equally. Results show that when compared with an alternative--the…

  2. Analysis of dynamic foot pressure distribution and ground reaction forces

    NASA Astrophysics Data System (ADS)

    Ong, F. R.; Wong, T. S.

    2005-04-01

    The purpose of this study was to assess the relationship between forces derived from in-shoe pressure distribution and ground reaction forces (GRFs) during normal gait. The relationship served to demonstrate the accuracy and reliability of the in-shoe pressure sensor. The Tekscan F-Scan in-shoe system outputs vertical forces and Centre of Force (COF), while the Kistler force plate gives GRFs in terms of Fz, Fx and Fy, as well as vertical torque, Tz. The two systems were synchronized for pressure and GRF measurements. Data were collected from four volunteers through three trials for both left and right foot under barefoot condition with the in-shoe sensor. The forces derived from pressure distribution correlated well with the vertical GRFs, with a correlation coefficient (r²) in the range of 0.93 to 0.99. This is a result of extended calibration, which improves pressure measurement to give better accuracy and reliability. The COF from the in-shoe sensor generally matched the force plate COP well. As for the maximum vertical torque at the forefoot during toe-off, there was no relationship with the pressure distribution. However, the maximum torque was shown to give an indication of the rotational angle of the foot.

  3. Optimizing Distributed Practice: Theoretical Analysis and Practical Implications

    ERIC Educational Resources Information Center

    Cepeda, Nicholas J.; Coburn, Noriko; Rohrer, Doug; Wixted, John T.; Mozer, Michael C,; Pashler, Harold

    2009-01-01

    More than a century of research shows that increasing the gap between study episodes using the same material can enhance retention, yet little is known about how this so-called distributed practice effect unfolds over nontrivial periods. In two three-session laboratory studies, we examined the effects of gap on retention of foreign vocabulary,…

  4. High Resolution PV Power Modeling for Distribution Circuit Analysis

    SciTech Connect

    Norris, B. L.; Dise, J. H.

    2013-09-01

    NREL has contracted with Clean Power Research to provide 1-minute simulation datasets of PV systems located at three high penetration distribution feeders in the service territory of Southern California Edison (SCE): Porterville, Palmdale, and Fontana, California. The resulting PV simulations will be used to separately model the electrical circuits to determine the impacts of PV on circuit operations.

  5. THE EPANET PROGRAMMER'S TOOLKIT FOR ANALYSIS OF WATER DISTRIBUTION SYSTEMS

    EPA Science Inventory

    The EPANET Programmer's Toolkit is a collection of functions that helps simplify computer programming of water distribution network analyses. the functions can be used to read in a pipe network description file, modify selected component properties, run multiple hydraulic and wa...

  6. Adaptive Weibull Multiplicative Model and Multilayer Perceptron neural networks for dark-spot detection from SAR imagery.

    PubMed

    Taravat, Alireza; Oppelt, Natascha

    2014-12-02

    Oil spills represent a major threat to ocean ecosystems and their environmental status. Previous studies have shown that Synthetic Aperture Radar (SAR), as its recording is independent of clouds and weather, can be effectively used for the detection and classification of oil spills. Dark formation detection is the first and critical stage in oil-spill detection procedures. In this paper, a novel approach for automated dark-spot detection in SAR imagery is presented. A new approach from the combination of adaptive Weibull Multiplicative Model (WMM) and MultiLayer Perceptron (MLP) neural networks is proposed to differentiate between dark spots and the background. The results have been compared with the results of a model combining non-adaptive WMM and pulse coupled neural networks. The presented approach overcomes the non-adaptive WMM filter setting parameters by developing an adaptive WMM model which is a step ahead towards a full automatic dark spot detection. The proposed approach was tested on 60 ENVISAT and ERS2 images which contained dark spots. For the overall dataset, an average accuracy of 94.65% was obtained. Our experimental results demonstrate that the proposed approach is very robust and effective where the non-adaptive WMM & pulse coupled neural network (PCNN) model generates poor accuracies.

  7. A two-scale Weibull approach to the failure of porous ceramic structures made by robocasting: possibilities and limits

    PubMed Central

    Genet, Martin; Houmard, Manuel; Eslava, Salvador; Saiz, Eduardo; Tomsia, Antoni P.

    2012-01-01

    This paper introduces our approach to modeling the mechanical behavior of cellular ceramics, through the example of calcium phosphate scaffolds made by robocasting for bone-tissue engineering. The Weibull theory is used to deal with the statistical failure of the scaffolds' constitutive rods, and the Sanchez-Palencia theory of periodic homogenization is used to link the rod and scaffold scales. Uniaxial compression of scaffolds and three-point bending of rods were performed to calibrate and validate the model. While calibration based on rod-scale data leads to over-conservative predictions of scaffold properties (as successive rod failures are not taken into account), we show that, for a given rod diameter, calibration based on scaffold-scale data leads to very satisfactory predictions for a wide range of rod spacings, i.e. of scaffold porosity, as well as for different loading conditions. This work establishes the proposed model as a reliable tool for understanding and optimizing the mechanical properties of cellular ceramics. PMID:23439936
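
    For context, the Weibull weakest-link statistics invoked here relate a rod's failure probability to its stress and stressed volume. In the standard generic form (m is the Weibull modulus, σ₀ and V₀ reference values; this is the textbook expression, not the paper's calibrated two-scale model):

    ```latex
    % Weakest-link (Weibull) failure probability under stress sigma for a
    % stressed volume V, relative to reference values sigma_0 and V_0:
    P_f(\sigma, V) = 1 - \exp\!\left[-\frac{V}{V_0}\left(\frac{\sigma}{\sigma_0}\right)^{m}\right]
    ```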

  8. Asynchronous Distributed Estimation of Topic Models for Document Analysis

    DTIC Science & Technology

    2010-03-01

    ...eigenvalue decomposition of the data covariance matrix [16]. Latent Semantic Analysis (LSA) can be viewed as the application of PCA to documents [17]: when the term-document count matrix is decomposed via singular value decomposition, documents can be mapped to a lower-dimensional space. Probabilistic Latent Semantic Analysis... [17] S. Deerwester, S. Dumais, G. Furnas, T. Landauer, R. Harshman, "Indexing by latent semantic analysis," Journal of the...
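
    The LSA construction in the fragment above can be illustrated compactly by a truncated SVD of a toy term-document count matrix; the matrix and latent dimensionality are arbitrary examples:

    ```python
    # LSA sketch: truncated SVD maps each document to a low-dimensional
    # latent vector; rows of `counts` are terms, columns are documents.
    import numpy as np

    counts = np.array([[2, 0, 1, 0],
                       [1, 1, 0, 0],
                       [0, 2, 0, 1],
                       [0, 0, 1, 2]], dtype=float)
    U, s, Vt = np.linalg.svd(counts, full_matrices=False)
    k = 2                                      # latent dimensionality
    doc_coords = (np.diag(s[:k]) @ Vt[:k]).T   # one k-dim vector per document
    print(doc_coords)
    ```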

  9. Specimen size effects on the compressive strength and Weibull modulus of nuclear graphite of different coke particle size: IG-110 and NBG-18

    NASA Astrophysics Data System (ADS)

    Chi, Se-Hwan

    2013-05-01

    The effects of specimen size on the compressive strength and Weibull modulus were investigated for nuclear graphite of different coke particle sizes: IG-110 and NBG-18 (average coke particle size for IG-110: 25 μm, NBG-18: 300 μm). Two types of cylindrical specimens, i.e., where the diameter to length ratio was 1:2 (ASTM C 695-91 type specimen, 1:2 specimen) or 1:1 (1:1 specimen), were prepared for six diameters (3, 4, 5, 10, 15, and 20 mm) and tested at room temperature (compressive strain rate: 2.08 × 10-4 s-1). Anisotropy was considered during specimen preparation for NBG-18. The results showed that the effects of specimen size appeared negligible for the compressive strength, but grade-dependent for the Weibull modulus. In view of specimen miniaturization, deviations from the ASTM C 695-91 specimen size requirements require an investigation into the effects of size for the grade of graphite of interest, and the specimen size effects should be considered for Weibull modulus determination.
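
    As a companion to the Weibull modulus determination discussed above, here is the usual linearized-CDF estimate from a set of measured strengths; the strength values and the median-rank probability estimator are illustrative, not the paper's data or procedure:

    ```python
    # Weibull modulus by linear regression on ln(-ln(1-F)) vs ln(strength).
    import numpy as np

    strengths = np.sort(np.array([71., 74., 78., 80., 83., 85., 88., 92.]))  # MPa
    n = len(strengths)
    F = (np.arange(1, n + 1) - 0.3) / (n + 0.4)   # median-rank estimator
    x = np.log(strengths)
    y = np.log(-np.log(1.0 - F))
    m, c = np.polyfit(x, y, 1)                    # slope m is the Weibull modulus
    sigma0 = np.exp(-c / m)                       # characteristic strength (63.2%)
    print(f"Weibull modulus m = {m:.1f}, characteristic strength = {sigma0:.1f} MPa")
    ```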

  10. Directional data analysis under the general projected normal distribution.

    PubMed

    Wang, Fangpo; Gelfand, Alan E

    2013-07-01

    The projected normal distribution is an under-utilized model for explaining directional data. In particular, the general version provides flexibility, e.g., asymmetry and possible bimodality along with convenient regression specification. Here, we clarify the properties of this general class. We also develop fully Bayesian hierarchical models for analyzing circular data using this class. We show how they can be fit using MCMC methods with suitable latent variables. We show how posterior inference for distributional features such as the angular mean direction and concentration can be implemented as well as how prediction within the regression setting can be handled. With regard to model comparison, we argue for an out-of-sample approach using both a predictive likelihood scoring loss criterion and a cumulative rank probability score criterion.
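
    A minimal sketch of how the general projected normal generates directional data: draw bivariate normal vectors and keep only their direction. The mean and covariance below are arbitrary illustrations; unequal variances and nonzero correlation produce the asymmetry and possible bimodality mentioned above:

    ```python
    # Sample angles from a general projected normal: project bivariate
    # normal draws onto the unit circle and record the angle.
    import numpy as np

    rng = np.random.default_rng(2)
    mu = np.array([1.0, 0.5])
    cov = np.array([[1.0, 0.4],
                    [0.4, 2.0]])
    xy = rng.multivariate_normal(mu, cov, size=10_000)
    theta = np.arctan2(xy[:, 1], xy[:, 0])        # angles in (-pi, pi]
    # Angular mean direction via the resultant vector.
    mean_dir = np.arctan2(np.sin(theta).mean(), np.cos(theta).mean())
    print(f"sample mean direction: {np.degrees(mean_dir):.1f} degrees")
    ```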

  11. Distributed Tracking Fidelity-Metric Performance Analysis Using Confusion Matrices

    DTIC Science & Technology

    2012-07-01

    ...evaluation [3, 4], T2T developments [5, 6], and simultaneous tracking and ID (STID) approaches [7, 8, 9, 10], we seek a method for distributed tracking... While there is an interest in processing all the data in signal-level fusion, such as image fusion [31], the transmission of the data is limited by... combination. Section 2 describes the tracking metrics. Section 3 overviews the JBPDAF. Section 4 describes the CM DLF. Section 5 shows a performance...

  12. A Distributed Datacube Analysis Service for Radio Telescopes

    NASA Astrophysics Data System (ADS)

    Mahadevan, V.; Rosolowsky, E.

    2011-07-01

    Current- and next-generation radio telescopes are poised to produce data at an unprecedented rate. We are developing the cyberinfrastructure to enable distributed processing and storage of FITS data cubes from these telescopes. In this contribution, we will present the data storage and network infrastructure that enables efficient searching, extraction and transfer of FITS datacubes. The infrastructure combines the iRODS distributed data management with a custom spatially-enabled PostgreSQL database. The data management system ingests FITS cubes, automatically populating the metadata database using FITS header data. Queries to the metadata service return matching records using VOTable format. The iRODS system allows for a distributed network of fileservers to store large data sets redundantly with a minimum of upkeep. Transfers between iRODS data sites use parallel I/O streams for maximum speed. Files are staged to the optimal host for download by an end user. The service can automatically extract subregions of individual or adjacent cubes registered to user-defined astrometric grids using the Montage package. The data system can query multiple surveys and return spatially registered data cubes to the user. Future development will allow the data system to utilize distributed processing environment to analyze datasets, returning only the calculation results to the end user. This cyberinfrastructure project combines many existing, open-source packages into a single deployment of a data system. The codebase can also function on two-dimensional images. The project is funded by CANARIE under the Network-Enabled Platforms 2 program.

  13. Confluence Analysis for Distributed Programs: A Model-Theoretic Approach

    DTIC Science & Technology

    2011-12-18

    ...Dedalus language, we will refer to two running examples for the remainder of the paper. Example 7: a simple asynchronous marriage ceremony... distributed commit protocols and marriage ceremonies [22]. For simplicity (and felicity), Example 7 presents a simple asynchronous voting program with a fixed set of members: a bride and a groom. The marriage is off (runaway() is true) if one party says "I do" and the other does not. However, the Dedalus...

  14. Analysis of phase distribution phenomena in microgravity environments

    NASA Technical Reports Server (NTRS)

    Lahey, Richard T., Jr.; Bonetto, F.

    1994-01-01

    The purpose of the research presented in this paper is to demonstrate the ability of multidimensional two-fluid models for bubbly two-phase flow to accurately predict lateral phase distribution phenomena in microgravity environments. If successful, this research should provide NASA with mechanistically-based analytical methods which can be used for multiphase space system design and evaluation, and should be the basis for future shuttle experiments for model verification.

  15. Agent-based reasoning for distributed multi-INT analysis

    NASA Astrophysics Data System (ADS)

    Inchiosa, Mario E.; Parker, Miles T.; Perline, Richard

    2006-05-01

    Fully exploiting the intelligence community's exponentially growing data resources will require computational approaches differing radically from those currently available. Intelligence data is massive, distributed, and heterogeneous. Conventional approaches requiring highly structured and centralized data will not meet this challenge. We report on a new approach, Agent-Based Reasoning (ABR). In NIST evaluations, the use of ABR software tripled analysts' solution speed, doubled accuracy, and halved perceived difficulty. ABR makes use of populations of fine-grained, locally interacting agents that collectively reason about intelligence scenarios in a self-organizing, "bottom-up" process akin to those found in biological and other complex systems. Reproduction rules allow agents to make inferences from multi-INT data, while movement rules organize information and optimize reasoning. Complementary deterministic and stochastic agent behaviors enhance reasoning power and flexibility. Agent interaction via small-world networks - such as are found in nervous systems, social networks, and power distribution grids - dramatically increases the rate of discovering intelligence fragments that usefully connect to yield new inferences. Small-world networks also support the distributed processing necessary to address intelligence community data challenges. In addition, we have found that ABR pre-processing can boost the performance of commercial text clustering software. Finally, we have demonstrated interoperability with Knowledge Engineering systems and seen that reasoning across diverse data sources can be a rich source of inferences.

  16. ROD INTERNAL PRESSURE QUANTIFICATION AND DISTRIBUTION ANALYSIS USING FRAPCON

    SciTech Connect

    Ivanov, Kostadin; Jessee, Matthew Anderson

    2016-01-01

    The discharge rod internal pressure (RIP) and cladding hoop stress (CHS) distributions are quantified for Watts Bar Nuclear Unit 1 (WBN1) fuel rods by modeling core cycle design data, intercycle assembly movements, operation data (including modeling significant trips and downpowers), and as-built fuel enrichments and densities of each fuel rod in FRAPCON-3.5. An alternate model for the amount of helium released from zirconium diboride (ZrB2) integral fuel burnable absorber (IFBA) layers is derived and applied to FRAPCON output data to quantify the RIP and CHS for these fuel rods. SCALE/Polaris is used to quantify fuel rod-specific spectral quantities and the amount of gaseous fission products produced in the fuel for use in FRAPCON inputs. Fuel rods with ZrB2 IFBA layers (i.e., IFBA rods) are determined to have RIP predictions that are elevated when compared to fuel rods without IFBA layers (i.e., standard rods), despite the fact that IFBA rods often have reduced fill pressures and annular fuel blankets. Cumulative distribution functions (CDFs) are prepared from the distribution of RIP predictions for all standard and IFBA rods. The provided CDFs allow for the determination of the portion of WBN1 fuel rods that exceed a specified RIP limit. Lastly, improvements to the computational methodology of FRAPCON are proposed.

  17. Periodic analysis of total ozone and its vertical distribution

    NASA Technical Reports Server (NTRS)

    Wilcox, R. W.; Nastrom, G. D.; Belmont, A. D.

    1975-01-01

    Both total ozone and vertical distribution ozone data from the period 1957 to 1972 are analyzed. For total ozone, improved monthly zonal means for both hemispheres are computed by weighting individual station monthly means by a factor which compensates for the close grouping of stations in certain regions of latitude bands. Longitudinal variability shows maxima in summer in both hemispheres but, in winter, only in the Northern Hemisphere. The geographical distributions of the long-term mean, and the annual, quasibiennial and semiannual waves in total ozone over the Northern Hemisphere are presented. The extratropical amplitude of the annual wave is by far the largest of the three, as much as 120 m atm cm over northern Siberia. There is a tendency for all three waves to have maxima in high latitudes. Monthly means of the vertical distribution of ozone determined from 3 to 8 years of ozonesonde data over North America are presented. Number density is highest in the Arctic near 18 km. The region of maximum number density slopes upward toward 10° N, where the long-term mean is 45 × 10^11 molecules/cm^3 near 26 km.

  18. Archiving, Distribution and Analysis of Solar-B Data

    NASA Astrophysics Data System (ADS)

    Shimojo, M.

    2007-10-01

    The Solar-B Mission Operation and Data Analysis (MODA) working group has been discussing the data analysis system for Solar-B data since 2001. In this paper, based on the Solar-B MODA document and recent work in Japan, we introduce the dataflow from Solar-B to scientists, the data format and data levels of Solar-B data, and the data searching/providing system.

  19. Probability distribution analysis of force induced unzipping of DNA

    NASA Astrophysics Data System (ADS)

    Kumar, Sanjay; Giri, Debaprasad

    2006-07-01

    We present a semimicroscopic model of dsDNA by incorporating the directional nature of the hydrogen bond to describe the force-induced unzipping transition. Using an exact enumeration technique, we obtain the force-temperature and force-extension curves and compare our results with other models of dsDNA. The model proposed by us is rich enough to describe the basic mechanism of dsDNA unzipping and predicts the existence of an "eye phase." We show oscillations in the probability distribution function during unzipping. Effects of stacking energies on the melting profile have also been studied.

  20. Statistical distributions of potential interest in ultrasound speckle analysis.

    PubMed

    Nadarajah, Saralees

    2007-05-21

    Compound statistical modelling of the uncompressed envelope of the backscattered signal has received much interest recently. In this note, a comprehensive collection of models is derived for the uncompressed envelope of the backscattered signal by compounding the Nakagami distribution with 13 flexible families. The corresponding estimation procedures are derived by the method of moments and the method of maximum likelihood. The sensitivity of the models to their various parameters is examined. It is expected that this work could serve as a useful reference and lead to improved modelling of the uncompressed envelope of the backscattered signal.

  1. Analysis of stress-rupture data from S-glass composites.

    NASA Technical Reports Server (NTRS)

    Robinson, E. Y.; Chiao, T. T.

    1972-01-01

    Both composite and free strands have been tested to study matrix effectiveness. Glass from different lots was used, and processing parameters were varied to demonstrate the effects of material variability. The statistical patterns that emerge from these experiments show that lifetime distributions are characteristically skewed. As a class, the lifetime distributions are well approximated by the reduced form of the Weibull distribution. The strength retention data display little early degradation. The retained-strength distributions can also be classed as Weibull distributions. A statistical model applied to the data gives predictions of progressive failure, strength retention path, and expected life for two basic types of internal fracture processes.

  2. Studying bubble-particle interactions by zeta potential distribution analysis.

    PubMed

    Wu, Chendi; Wang, Louxiang; Harbottle, David; Masliyah, Jacob; Xu, Zhenghe

    2015-07-01

    Over a decade ago, Xu and Masliyah pioneered an approach to characterize the interactions between particles in dynamic environments of multicomponent systems by measuring zeta potential distributions of individual components and their mixtures. Using a Zetaphoremeter, the measured zeta potential distributions of individual components and their mixtures were used to determine the conditions of preferential attachment in multicomponent particle suspensions. The technique has been applied to study the attachment of nano-sized silica and alumina particles to sub-micron size bubbles in solutions with and without the addition of surface active agents (SDS, DAH and DF250). The degree of attachment between gas bubbles and particles is shown to be a function of the interaction energy governed by the dispersion, electrostatic double layer and hydrophobic forces. Under certain chemical conditions, the attachment of nano-particles to sub-micron size bubbles is shown to be enhanced by in-situ gas nucleation induced by hydrodynamic cavitation for the weakly interacting systems, where mixing of the two individual components results in negligible attachment. Preferential interaction in complex tertiary particle systems demonstrated strong attachment between micron-sized alumina and gas bubbles, with little attachment between micron-sized alumina and silica, possibly due to instability of the aggregates in the shear flow environment.

  3. Visualization and analysis of lipopolysaccharide distribution in binary phospholipid bilayers

    SciTech Connect

    Henning, Maria Florencia; Sanchez, Susana; Bakas, Laura

    2009-05-22

    Lipopolysaccharide (LPS) is an endotoxin released from the outer membrane of Gram-negative bacteria during infections. It has been reported that LPS may play a role in the outer membrane of bacteria similar to that of cholesterol in eukaryotic plasma membranes. In this article we compare the effect of introducing LPS or cholesterol in liposomes made of dipalmitoylphosphatidylcholine/dioleoylphosphatidylcholine on the solubilization process by Triton X-100. The results show that liposomes containing LPS or cholesterol are more resistant to solubilization by Triton X-100 than the binary phospholipid mixtures at 4 °C. The LPS distribution was analyzed on GUVs of DPPC:DOPC using FITC-LPS. Solid and liquid-crystalline domains were visualized by labeling the GUVs with LAURDAN, and GP images were acquired using a two-photon microscope. The images show a selective distribution of LPS in gel domains. Our results support the hypothesis that LPS could aggregate and concentrate selectively in biological membranes, providing a mechanism to bring together several components of the LPS-sensing machinery.

  4. Performance analysis of structured pedigree distributed fusion systems

    NASA Astrophysics Data System (ADS)

    Arambel, Pablo O.

    2009-05-01

    Structured pedigree is a way to compress pedigree information. When applied to distributed fusion systems, the approach avoids the well known problem of information double counting resulting from ignoring the cross-correlation among fused estimates. Other schemes that attempt to compute optimal fused estimates require the transmission of full pedigree information or raw data. This usually can not be implemented in practical systems because of the enormous requirements in communications bandwidth. The Structured Pedigree approach achieves data compression by maintaining multiple covariance matrices, one for each uncorrelated source in the network. These covariance matrices are transmitted by each node along with the state estimate. This represents a significant compression when compared to full pedigree schemes. The transmission of these covariance matrices (or a subset of these covariance matrices) allows for an efficient fusion of the estimates, while avoiding information double counting and guaranteeing consistency on the estimates. This is achieved by exploiting the additional partial knowledge on the correlation of the estimates. The approach uses a generalized version of the Split Covariance Intersection algorithm that applies to multiple estimates and multiple uncorrelated sources. In this paper we study the performance of the proposed distributed fusion system by analyzing a simple but instructive example.
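
    For background, the (non-split) covariance intersection rule that Split Covariance Intersection generalizes fuses two consistent estimates with unknown cross-correlation. In its standard form (ω ∈ [0, 1] is chosen, e.g., to minimize the trace of the fused covariance; the split variant additionally separates each covariance into correlated and independent parts):

    ```latex
    % Covariance intersection fusion of estimates (x_a, P_a) and (x_b, P_b):
    P_c^{-1} = \omega P_a^{-1} + (1-\omega) P_b^{-1}, \qquad
    P_c^{-1} \hat{x}_c = \omega P_a^{-1} \hat{x}_a + (1-\omega) P_b^{-1} \hat{x}_b
    ```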

  5. Structural reliability analysis of laminated CMC components

    NASA Technical Reports Server (NTRS)

    Duffy, Stephen F.; Palko, Joseph L.; Gyekenyesi, John P.

    1991-01-01

    For laminated ceramic matrix composite (CMC) materials to realize their full potential in aerospace applications, design methods and protocols are a necessity. This work focuses on the time-independent failure response of these materials and presents a reliability analysis associated with the initiation of matrix cracking. A public domain computer algorithm is highlighted that was coupled with the laminate analysis of a finite element code and which serves as a design aid for analyzing structural components made from laminated CMC materials. Issues relevant to the effect of component size are discussed, and a parameter estimation procedure is presented. The estimation procedure allows three parameters to be calculated from a failure population that has an underlying Weibull distribution.

  6. A data analysis expert system for large established distributed databases

    NASA Technical Reports Server (NTRS)

    Gnacek, Anne-Marie; An, Y. Kim; Ryan, J. Patrick

    1987-01-01

    A design for a natural language database interface system, called the Deductively Augmented NASA Management Decision support System (DANMDS), is presented. The DANMDS system components have been chosen on the basis of the following considerations: maximal employment of the existing NASA IBM-PC computers and supporting software; local structuring and storing of external data via the entity-relationship model; a natural, easy-to-use, error-free database query language; user ability to alter the query language vocabulary and data analysis heuristics; and significant artificial intelligence data analysis heuristic techniques that allow the system to become progressively and automatically more useful.

  7. Theoretical analysis of the particle gradient distribution in centrifugal field during solidification

    SciTech Connect

    Liu, Q.; Jiao, Y.; Yang, Y.; Hu, Z.

    1996-12-01

    A theoretical analysis is presented to obtain the gradient distribution of particles in a centrifugal field, by which the particle distribution in a gradient composite can be predicted. Particle movement in the liquid is described, and the gradient distribution of particles in the composite is calculated for solidification in a centrifugal field. The factors which affect the particle distribution and its gradient are discussed in detail. The theoretical analysis indicates that a composite zone and a blank zone exist in a gradient composite, which can be directed to the outside or inside of the tubular composite by the density difference between the particles and the liquid metal. A comparison of the SiC particle distribution in an Al matrix composite produced by centrifugal casting between the theoretical model and experiment indicates that the theoretical analysis is reasonable.

  8. Completion report harmonic analysis of electrical distribution systems

    SciTech Connect

    Tolbert, L.M.

    1996-03-01

    Harmonic currents have increased dramatically in electrical distribution systems in the last few years due to the growth in non-linear loads found in most electronic devices. Because electrical systems have been designed for linear voltage and current waveforms; (i.e. nearly sinusoidal), non-linear loads can cause serious problems such as overheating conductors or transformers, capacitor failures, inadvertent circuit breaker tripping, or malfunction of electronic equipment. The U.S. Army Center for Public Works has proposed a study to determine what devices are best for reducing or eliminating the effects of harmonics on power systems typical of those existing in their Command, Control, Communication and Intelligence (C3I) sites.
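
    For reference, the severity of the harmonic currents at issue is conventionally quantified by total harmonic distortion (THD); the standard definition, cited here for context rather than taken from the report, is:

    ```latex
    % Total harmonic distortion of a current waveform: rms of the harmonic
    % components I_h (h >= 2) relative to the fundamental I_1.
    \mathrm{THD}_I = \frac{\sqrt{\sum_{h=2}^{\infty} I_h^{2}}}{I_1}
    ```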

  9. Finite key analysis for symmetric attacks in quantum key distribution

    SciTech Connect

    Meyer, Tim; Kampermann, Hermann; Kleinmann, Matthias; Bruss, Dagmar

    2006-10-15

    We introduce a constructive method to calculate the achievable secret key rate for a generic class of quantum key distribution protocols, when only a finite number n of signals is given. Our approach is applicable to all scenarios in which the quantum state shared by Alice and Bob is known. In particular, we consider the six state protocol with symmetric eavesdropping attacks, and show that for a small number of signals, i.e., below n ≈ 10^4, the finite key rate differs significantly from the asymptotic value for n → ∞. However, for larger n, a good approximation of the asymptotic value is found. We also study secret key rates for protocols using higher-dimensional quantum systems.

  10. Southern Arizona riparian habitat: Spatial distribution and analysis

    NASA Technical Reports Server (NTRS)

    Lacey, J. R.; Ogden, P. R.; Foster, K. E.

    1975-01-01

    The objectives of this study were centered around the demonstration of remote sensing as an inventory tool and researching the multiple uses of riparian vegetation. Specific study objectives were to: (1) map riparian vegetation along the Gila River, San Simon Creek, San Pedro River, Pantano Wash, (2) determine the feasibility of automated mapping using LANDSAT-1 computer compatible tapes, (3) locate and summarize existing maps delineating riparian vegetation, (4) summarize data relevant to Southern Arizona's riparian products and uses, (5) document recent riparian vegetation changes along a selected portion of the San Pedro River, (6) summarize historical changes in composition and distribution of riparian vegetation, and (7) summarize sources of available photography pertinent to Southern Arizona.

  11. Parallelization of Finite Element Analysis Codes Using Heterogeneous Distributed Computing

    NASA Technical Reports Server (NTRS)

    Ozguner, Fusun

    1996-01-01

    Performance gains in computer design are quickly consumed as users seek to analyze larger problems to a higher degree of accuracy. Innovative computational methods, such as parallel and distributed computing, seek to multiply the power of existing hardware technology to satisfy the computational demands of large applications. In the early stages of this project, experiments were performed using two large, coarse-grained applications, CSTEM and METCAN. These applications were parallelized on an Intel iPSC/860 hypercube. It was found that the overall speedup was very low, due to large, inherently sequential code segments present in the applications. The overall execution time T_par of the application depends on these sequential segments. If these segments make up a significant fraction of the overall code, the application will have a poor speedup measure.
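
    The behavior observed here is the one captured by Amdahl's law (named here for context; the report itself does not state it): a sequential fraction s of the work bounds the speedup on p processors no matter how large p grows. For example, a 20% sequential fraction caps the speedup at 5 regardless of processor count.

    ```latex
    % Amdahl's law: speedup with sequential fraction s on p processors.
    S(p) = \frac{1}{s + (1 - s)/p}, \qquad \lim_{p \to \infty} S(p) = \frac{1}{s}
    ```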

  12. Legendre analysis of differential distributions in hadronic reactions

    NASA Astrophysics Data System (ADS)

    Azimov, Yakov I.; Strakovsky, Igor I.; Briscoe, William J.; Workman, Ron L.

    2017-02-01

    Modern experimental facilities have provided a tremendous volume of reaction data, often with wide energy and angular coverage, and with increasing precision. For reactions with two hadrons in the final state, these data are often presented as multiple sets of panels, with angular distributions at numerous specific energies. Such presentations have limited visual appeal, and their physical content is typically extracted through some model-dependent treatment. Instead, we explore the use of a Legendre series expansion with a relatively small number of essential coefficients. This approach has been applied in several recent experimental investigations. We present some general properties of the Legendre coefficients in the helicity framework and consider what physical information can be extracted without any model-dependent assumptions.
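
    Concretely, the expansion in question writes an angular distribution as a truncated Legendre series; in a generic notation assumed here for illustration (the paper's conventions may differ):

    ```latex
    % Truncated Legendre expansion of a differential cross section at
    % energy E; the A_L(E) are the relatively small number of essential
    % coefficients referred to above, and P_L are Legendre polynomials.
    \frac{\mathrm{d}\sigma}{\mathrm{d}\Omega}(E, \theta) = \sum_{L=0}^{L_{\max}} A_L(E)\, P_L(\cos\theta)
    ```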

  13. Analysis of phase distribution phenomena in microgravity environments

    NASA Technical Reports Server (NTRS)

    Lahey, Richard, Jr.; Bonetto, Fabian

    1994-01-01

    In the past, one of NASA's primary emphases has been on identifying single and multiphase flow experiments which can produce new discoveries that are not possible except in a microgravity environment. While such experiments are obviously of great scientific interest, they do not necessarily provide NASA with the ability to use multiphase processes for power production and/or utilization in space. The purpose of the research presented in this paper is to demonstrate the ability of multidimensional two-fluid models for bubbly two-phase flow to accurately predict lateral phase distribution phenomena in microgravity environments. If successful, this research should provide NASA with mechanistically-based analytical methods which can be used for multiphase space design and evaluation, and should be the basis for future shuttle experiments for model verification.

  14. Temporal Distributions of Problem Behavior Based on Scatter Plot Analysis.

    ERIC Educational Resources Information Center

    Kahng, SungWoo; Iwata, Brian A.; Fischer, Sonya M.; Page, Terry J.; Treadwell, Kimberli R. H.; Williams, Don E.; Smith, Richard G.

    1998-01-01

    A large-scale analysis was conducted of problem behavior by observing 20 individuals living in residential facilities. Data were converted into scatter plot formats. When the data were transformed into aggregate "control charts," 12 of 15 sets of data revealed 30-minute intervals during which problem behavior was more likely to occur.…

  15. Industry sector analysis, Indonesia: Electric power distribution equipment. Export trade information

    SciTech Connect

    Sihombing, P.

    1991-08-20

    The market survey covers the electric power distribution equipment market in Indonesia. The analysis contains statistical and narrative information on projected market demand, end-users; receptivity of Indonesian consumers to U.S. products; the competitive situation, and market access (tariffs, non-tariff barriers, standards, taxes, distribution channels). It also contains key contact information.

  16. Formal Methods for Quality of Service Analysis in Component-Based Distributed Computing

    DTIC Science & Technology

    2003-12-01

    Component-Based Software Architecture is a promising solution for distributed computing. To develop high-quality software, analysis of non-functional... based distributed computing is proposed and represented formally using Two-Level Grammar (TLG), an object-oriented formal specification language. TLG...

  17. Preliminary evaluation of diabatic heating distribution from FGGE level 3b analysis data

    NASA Technical Reports Server (NTRS)

    Kasahara, A.; Mizzi, A. P.

    1985-01-01

    A method is presented for calculating the global distribution of diabatic heating rate. Preliminary results of the global heating rate evaluated from the European Centre for Medium-Range Weather Forecasts Level IIIb analysis data are also presented.

  18. Advanced analysis of metal distributions in human hair

    SciTech Connect

    Kempson, Ivan M.; Skinner, William M.

    2008-06-09

    A variety of techniques (secondary electron microscopy with energy dispersive X-ray analysis, time-of-flight-secondary ion mass spectrometry, and synchrotron X-ray fluorescence) were utilized to distinguish metal contamination occurring in hair arising from endogenous uptake from an individual exposed to a polluted environment, in this case a lead smelter. Evidence was sought for elements less affected by contamination and potentially indicative of biogenic activity. The unique combination of surface sensitivity, spatial resolution, and detection limits used here has provided new insight regarding hair analysis. Metals such as Ca, Fe, and Pb appeared to have little representative value of endogenous uptake and were mainly due to contamination. Cu and Zn, however, demonstrate behaviors worthy of further investigation into relating hair concentrations to endogenous function.

  19. Managing large-scale multi-voltage distribution system analysis

    SciTech Connect

    Walton, C.M.

    1994-12-31

    The challenge for electricity utilities in the '90s, to deliver ever more reliable service at reduced cost and with fewer technical staff, is the driver towards the next generation of automated network analysis tools. The paper discusses the application of an Automatic Loss Minimiser (ALM) and Fault Study Package (FSP), under the control of a Sequence Processor, to an existing high-resolution graphics network analysis package. Automated, sorted management summaries enable limited resources and system automation to be directed at those networks with the poorest performance and/or the largest potential savings from reduced system losses. The impact regular automatic monitoring has on the quality of the database, and the scope for integration of the modules with other System Automation initiatives, are also considered.

  20. Theoretical analysis of the Ca2+ spark amplitude distribution.

    PubMed Central

    Izu, L T; Wier, W G; Balke, C W

    1998-01-01

    A difficulty of using confocal microscopy to study Ca2+ sparks is the uncertainty of the linescan position with respect to the source of Ca2+ release. Random placement of the linescan is expected to result in a broad distribution of measured Ca2+ spark amplitudes (a) even if all Ca2+ sparks were generated identically. Thus variations in Ca2+ spark amplitude due to positional differences between confocal linescans and Ca2+ release site are intertwined with variations due to intrinsic differences in Ca2+ release properties. To separate these two sources of variations on the Ca2+ spark amplitude, we determined the effect changes of channel current or channel open time--collectively called the source strength, α--had on the measured Ca2+ spark amplitude histogram, N(a). This was done by 1) simulating Ca2+ release, Ca2+ and fluo-3 diffusion, and Ca2+ binding reactions; 2) simulation of image formation of the Ca2+ spark by a confocal microscope; and 3) using a novel automatic Ca2+ spark detector. From these results we derived an integral equation relating the probability density function of source strengths, f_α(α), to N(a), which takes into account random positional variations between the source and linescan. In the special, but important, case that the spatial distribution of Ca(2+)-bound fluo-3 is Gaussian, we show the following: 1) variations of Ca2+ spark amplitude due to positional or intrinsic differences can be separated, and 2) f_α(α) can, in principle, be calculated from the Ca2+ spark amplitude histogram since N(a) is the sum of shifted hyperbolas, where the magnitudes of the shifts and weights depend on f_α(α). In particular, if all Ca2+ sparks were generated identically, then the plot of 1/N(a) against a will be a straight line. Multiple populations of channels carrying distinct currents are revealed by discontinuities in the 1/N(a) plot. 3) Although the inverse relationship between Ca2+ spark amplitude and decay time might be

  1. Evolution History of Asteroid Itokawa Based on Block Distribution Analysis

    NASA Astrophysics Data System (ADS)

    Mazrouei, Sara; Daly, Michael; Barnouin, Olivier; Ernst, Carolyn

    2013-04-01

    This work investigates trends in the global and regional distribution of blocks on asteroid 25143 Itokawa in order to discover new findings to better understand the history of this asteroid. Itokawa is a near-Earth object, and the first asteroid that was targeted for a sample return mission. Trends in block population provide new insights in regards to Itokawa's current appearance following the disruption of a possible parent body, and how its surface might have changed since then. Here blocks are defined as rocks or features with distinctive positive relief that are larger than a few meters in size. The size and distribution of blocks are measured by mapping the outline of the blocks using the Small Body Mapping Tool (SBMT) created by the Johns Hopkins University Applied Physics Laboratory [1]. The SBMT allows the user to overlap correctly geo-located Hayabusa images [2] onto the Itokawa shape model. This study provides additional inferences on the original disruption and subsequent re-accretion of Itokawa's "head" and "body" from block analyses. A new approach is taken by analyzing the population of blocks with respect to latitude for both Itokawa's current state, and a hypothetical elliptical body. Itokawa currently rotates approximately about its maximum moment of inertia, which is expected due to conservation of momentum and minimum energy arguments. After the possible disruption of the parent body of Itokawa, the "body" of Itokawa would have tended to a similar rotation. The shape of this body is made by removing the head of Itokawa and applying a semispherical cap. Using the method of [3] inertial properties of this object are calculated. With the assumption that this object had settled to its stable rotational axis, it is found that the pole axis could have been tilted about 13° away from the current axis in the direction opposite the head, equivalent to a 33 meter change in the center of mass. The results of this study provide means to test the hypothesis

  2. A numerical method for the stress analysis of stiffened-shell structures under nonuniform temperature distributions

    NASA Technical Reports Server (NTRS)

    Heldenfels, Richard R

    1951-01-01

    A numerical method is presented for the stress analysis of stiffened-shell structures of arbitrary cross section under nonuniform temperature distributions. The method is based on a previously published procedure that is extended to include temperature effects and multicell construction. The application of the method to practical problems is discussed and an illustrative analysis is presented of a two-cell box beam under the combined action of vertical loads and a nonuniform temperature distribution.

  3. Motion synthesis and force distribution analysis for a biped robot.

    PubMed

    Trojnacki, Maciej T; Zielińska, Teresa

    2011-01-01

    In this paper, the method of generating biped robot motion using recorded human gait is presented. The recorded data were modified taking into account the velocities available for the robot drives. The data include only selected joint angles, therefore the missing values were obtained considering the dynamic postural stability of the robot, which means obtaining an adequate motion trajectory of the so-called Zero Moment Point (ZMP). Also, the method of determining the distribution of ground reaction forces during the biped robot's dynamically stable walk is described. The method was developed by the authors. Following the description of equations characterizing the dynamics of the robot's motion, the values of the components of the ground reaction forces were symbolically determined, as well as the coordinates of the points of the robot's feet contact with the ground. The theoretical considerations have been supported by computer simulation and animation of the robot's motion. This was done using the Matlab/Simulink package and the Simulink 3D Animation Toolbox, and it has validated the proposed method.

  4. Development of distribution system reliability and risk analysis models

    NASA Astrophysics Data System (ADS)

    Vismor, T. D.; Northcote-Green, J. E. D.; Kostyal, S. J.; Brooks, C. L.

    1981-08-01

    The two reliability models, their testing, and the modifications of a unified distribution planning model to calculate reliability indices are described. The historical reliability assessment model HISRAM is designed to suit most utilities. Four implementation levels with different input data requirements and output capabilities permit a utility to select a level appropriate to its needs. User-defined divisions, causes and output options further add to the program flexibility. A unique feature of HISRAM is the program generation of the appropriate outage reporting form following level selection and initialization. This allows the engineer to review data input requirements before field implementation. It has the capability of estimating component failure rates and restoration times upon provision of suitable input data. The predictive reliability assessment model PRAM uses continuity criteria, together with component failure rates and restoration times to calculate load point indices. System indices similar to those produced by HISRAM are also calculated. Varying degrees of detail for representing the protection system are available through three user-selected models. Both models were tested through application. The conclusions and recommendations of the entire project are included.

  5. Distributed-sensor-system decision analysis using team strategies

    NASA Astrophysics Data System (ADS)

    Choe, Howard C.; Kazakos, Demetrios

    1992-11-01

    A distributed (or decentralized) multiple sensor system is considered under binary hypothesis environments. The system is deployed with a host sensor (HS) and multiple slave sensors (SSs). All sensors have their own independent decision makers which are capable of declaring local decisions based solely on their own observation of the environment. The communication between the HS and the SSs is conditional upon the HS's command. Each communication that takes place involves a communication cost which plays an important role in the approaches taken in this study. The conditional communication with the cost initiates the team strategy in making the final decisions at the HS. The objectives are not only to apply the team strategy method in the decision making process, but also to minimize the expected system cost (or the probability of error in making decisions) by optimizing thresholds in the HS. The analytical expression of the expected system cost (C) is numerically evaluated for Gaussian statistics over threshold locations in the HS to find an optimal threshold location for a given communication cost. The computer simulations of various sensor systems for Gaussian observations are also performed in order to understand the behavior of each system with respect to correct detections, false alarms, and target misses.

  6. Distributed sensor system decision analysis using team strategies

    NASA Astrophysics Data System (ADS)

    Choe, Howard C.; Kazakos, Dimitri

    1991-07-01

    A distributed (or decentralized) multiple sensor system is considered under binary hypothesis environments. The system is deployed with a host sensor and multiple slave sensors. All sensors have their own independent decision makers (DM) which are capable of declaring local decisions based only on their own observation of the environment. The communication between the host sensor (HS) and the slave sensors (SS) is conditional upon the host sensor's command. Each communication that takes place involves a communication cost which plays an important role in approaches taken in this study. The conditional communication with cost initiates the team strategy in making the final decisions at the host sensor. The objectives are not only to apply the team strategy method in the decision making process, but also to minimize the expected system cost (or the probability of error in making decisions) by optimizing thresholds in the host sensor. The analytical expression of the expected system cost is numerically evaluated for Gaussian statistics over threshold locations in the host sensor to find an optimal threshold location for a given communication cost. The computer simulations of various sensor systems for Gaussian observations are also performed to understand the behavior of each system with respect to correct detections, false alarms, and target misses.
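
    For intuition, the expected-cost minimization over the host-sensor threshold can be sketched for a single sensor observing Gaussian hypotheses; the communication-cost and team-strategy terms of the paper are omitted, and the priors and means below are illustrative assumptions.

        from scipy.stats import norm
        from scipy.optimize import minimize_scalar

        mu1, p0, p1 = 2.0, 0.5, 0.5    # H1 mean and priors (illustrative)

        def expected_cost(t):
            p_fa = norm.sf(t)                 # P(decide H1 | H0), H0 ~ N(0, 1)
            p_miss = norm.cdf(t, loc=mu1)     # P(decide H0 | H1), H1 ~ N(mu1, 1)
            return p0 * p_fa + p1 * p_miss

        res = minimize_scalar(expected_cost, bounds=(-5.0, 8.0), method="bounded")
        print(f"optimal threshold = {res.x:.3f}, expected cost = {res.fun:.4f}")

    For equal priors and unit variances the optimum lands midway between the two means, here at mu1/2.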

  7. Analysis of Length Distribution of Drainage Basin Perimeter

    NASA Astrophysics Data System (ADS)

    Werner, Christian

    1982-08-01

    To establish a theoretical base for the study of the length distribution of basin perimeters, the paper introduces a descriptive model of the topology of interlocking channel and ridge networks. Assuming topological randomness within and between both, the expected number of links of basin perimeters is derived; for large basin magnitudes n, it approximates a square root function in n. Observed link numbers of perimeters deviate significantly, showing a 0.69 regression exponent for their growth rate relative to the basin magnitude rather than the expected value of 0.5. The spatial constraint of possible perimeter/(area)½ proportions as defined by the circle is translated into a corresponding topological constraint but fails to provide a sufficient explanation. The paper then explores the possibility that the relatively large length of the perimeter reflects the basin elongation which, following Hack, might be linked to the length of the mainstream. Although basin perimeter, elongation, and mainstream length are highly correlated and the elongation axis is oriented to the outlet in two-thirds of the sample basins, the data indicate that the mainstream link number does not account for the basin elongation, nor does it account for the number of links of the basin perimeter.

  8. CARES/LIFE Ceramics Analysis and Reliability Evaluation of Structures Life Prediction Program

    NASA Technical Reports Server (NTRS)

    Nemeth, Noel N.; Powers, Lynn M.; Janosik, Lesley A.; Gyekenyesi, John P.

    2003-01-01

    This manual describes the Ceramics Analysis and Reliability Evaluation of Structures Life Prediction (CARES/LIFE) computer program. The program calculates the time-dependent reliability of monolithic ceramic components subjected to thermomechanical and/or proof test loading. CARES/LIFE is an extension of the CARES (Ceramic Analysis and Reliability Evaluation of Structures) computer program. The program uses results from MSC/NASTRAN, ABAQUS, and ANSYS finite element analysis programs to evaluate component reliability due to inherent surface and/or volume type flaws. CARES/LIFE accounts for the phenomenon of subcritical crack growth (SCG) by utilizing the power law, Paris law, or Walker law. The two-parameter Weibull cumulative distribution function is used to characterize the variation in component strength. The effects of multiaxial stresses are modeled by using either the principle of independent action (PIA), the Weibull normal stress averaging method (NSA), or the Batdorf theory. Inert strength and fatigue parameters are estimated from rupture strength data of naturally flawed specimens loaded in static, dynamic, or cyclic fatigue. The probabilistic time-dependent theories used in CARES/LIFE, along with the input and output for CARES/LIFE, are described. Example problems to demonstrate various features of the program are also included.

  9. CARES/LIFE Ceramics Analysis and Reliability Evaluation of Structures Life Prediction Program

    NASA Astrophysics Data System (ADS)

    Nemeth, Noel N.; Powers, Lynn M.; Janosik, Lesley A.; Gyekenyesi, John P.

    2003-02-01

    This manual describes the Ceramics Analysis and Reliability Evaluation of Structures Life Prediction (CARES/LIFE) computer program. The program calculates the time-dependent reliability of monolithic ceramic components subjected to thermomechanical and/or proof test loading. CARES/LIFE is an extension of the CARES (Ceramic Analysis and Reliability Evaluation of Structures) computer program. The program uses results from MSC/NASTRAN, ABAQUS, and ANSYS finite element analysis programs to evaluate component reliability due to inherent surface and/or volume type flaws. CARES/LIFE accounts for the phenomenon of subcritical crack growth (SCG) by utilizing the power law, Paris law, or Walker law. The two-parameter Weibull cumulative distribution function is used to characterize the variation in component strength. The effects of multiaxial stresses are modeled by using either the principle of independent action (PIA), the Weibull normal stress averaging method (NSA), or the Batdorf theory. Inert strength and fatigue parameters are estimated from rupture strength data of naturally flawed specimens loaded in static, dynamic, or cyclic fatigue. The probabilistic time-dependent theories used in CARES/LIFE, along with the input and output for CARES/LIFE, are described. Example problems to demonstrate various features of the program are also included.
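
    The core two-parameter Weibull computation with the principle of independent action (PIA) can be sketched compactly. The element volumes, principal stresses, and Weibull parameters below are hypothetical placeholders, not CARES/LIFE input or output; this is a minimal volume-flaw version of the idea, not the program's full implementation.

        import numpy as np

        m, s0 = 10.0, 400.0   # Weibull modulus and scale parameter (illustrative)

        volumes = np.array([2.0, 1.5, 3.0])            # element volumes
        principal = np.array([[180.0, 60.0, -20.0],    # principal stresses per element
                              [220.0, 10.0, 0.0],
                              [150.0, 90.0, 30.0]])

        tensile = np.clip(principal, 0.0, None)        # PIA: only tensile stresses act
        risk = np.sum(volumes * np.sum((tensile / s0) ** m, axis=1))
        pf = 1.0 - np.exp(-risk)
        print(f"failure probability = {pf:.3e}")

    Each tensile principal stress contributes independently to the risk of rupture, which is why PIA reduces to a simple sum over stress components and elements.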

  10. Particle size distributions by transmission electron microscopy: an interlaboratory comparison case study.

    PubMed

    Rice, Stephen B; Chan, Christopher; Brown, Scott C; Eschbach, Peter; Han, Li; Ensor, David S; Stefaniak, Aleksandr B; Bonevich, John; Vladár, András E; Hight Walker, Angela R; Zheng, Jiwen; Starnes, Catherine; Stromberg, Arnold; Ye, Jia; Grulke, Eric A

    2013-11-01

    This paper reports an interlaboratory comparison that evaluated a protocol for measuring and analysing the particle size distribution of discrete, metallic, spheroidal nanoparticles using transmission electron microscopy (TEM). The study was focused on automated image capture and automated particle analysis. NIST RM8012 gold nanoparticles (30 nm nominal diameter) were measured for area-equivalent diameter distributions by eight laboratories. Statistical analysis was used to (1) assess the data quality without using size distribution reference models, (2) determine reference model parameters for different size distribution reference models and non-linear regression fitting methods and (3) assess the measurement uncertainty of a size distribution parameter by using its coefficient of variation. The interlaboratory area-equivalent diameter mean, 27.6 nm ± 2.4 nm (computed based on a normal distribution), was quite similar to the area-equivalent diameter, 27.6 nm, assigned to NIST RM8012. The lognormal reference model was the preferred choice for these particle size distributions as, for all laboratories, its parameters had lower relative standard errors (RSEs) than the other size distribution reference models tested (normal, Weibull and Rosin-Rammler-Bennett). The RSEs for the fitted standard deviations were two orders of magnitude higher than those for the fitted means, suggesting that most of the parameter estimate errors were associated with estimating the breadth of the distributions. The coefficients of variation for the interlaboratory statistics also confirmed the lognormal reference model as the preferred choice. From quasi-linear plots, the typical range for good fits between the model and cumulative number-based distributions was 1.9 fitted standard deviations less than the mean to 2.3 fitted standard deviations above the mean. Automated image capture, automated particle analysis and statistical evaluation of the data and fitting coefficients provide a

  11. Particle size distributions by transmission electron microscopy: an interlaboratory comparison case study

    PubMed Central

    Rice, Stephen B; Chan, Christopher; Brown, Scott C; Eschbach, Peter; Han, Li; Ensor, David S; Stefaniak, Aleksandr B; Bonevich, John; Vladár, András E; Hight Walker, Angela R; Zheng, Jiwen; Starnes, Catherine; Stromberg, Arnold; Ye, Jia; Grulke, Eric A

    2015-01-01

    This paper reports an interlaboratory comparison that evaluated a protocol for measuring and analysing the particle size distribution of discrete, metallic, spheroidal nanoparticles using transmission electron microscopy (TEM). The study was focused on automated image capture and automated particle analysis. NIST RM8012 gold nanoparticles (30 nm nominal diameter) were measured for area-equivalent diameter distributions by eight laboratories. Statistical analysis was used to (1) assess the data quality without using size distribution reference models, (2) determine reference model parameters for different size distribution reference models and non-linear regression fitting methods and (3) assess the measurement uncertainty of a size distribution parameter by using its coefficient of variation. The interlaboratory area-equivalent diameter mean, 27.6 nm ± 2.4 nm (computed based on a normal distribution), was quite similar to the area-equivalent diameter, 27.6 nm, assigned to NIST RM8012. The lognormal reference model was the preferred choice for these particle size distributions as, for all laboratories, its parameters had lower relative standard errors (RSEs) than the other size distribution reference models tested (normal, Weibull and Rosin–Rammler–Bennett). The RSEs for the fitted standard deviations were two orders of magnitude higher than those for the fitted means, suggesting that most of the parameter estimate errors were associated with estimating the breadth of the distributions. The coefficients of variation for the interlaboratory statistics also confirmed the lognormal reference model as the preferred choice. From quasi-linear plots, the typical range for good fits between the model and cumulative number-based distributions was 1.9 fitted standard deviations less than the mean to 2.3 fitted standard deviations above the mean. Automated image capture, automated particle analysis and statistical evaluation of the data and fitting coefficients provide a
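
    A minimal version of step (2), fitting a lognormal reference model to a cumulative number-based size distribution by non-linear regression and reporting relative standard errors (RSEs), might look as follows; the synthetic diameters stand in for measured TEM data.

        import numpy as np
        from scipy.optimize import curve_fit
        from scipy.stats import norm

        rng = np.random.default_rng(1)
        diam = rng.lognormal(mean=np.log(27.6), sigma=0.09, size=500)  # nm, synthetic

        x = np.sort(diam)
        F = (np.arange(1, x.size + 1) - 0.5) / x.size   # empirical cumulative fractions

        def lognormal_cdf(d, mu, sigma):
            return norm.cdf((np.log(d) - mu) / sigma)

        popt, pcov = curve_fit(lognormal_cdf, x, F, p0=[np.log(30.0), 0.1])
        rse = 100 * np.sqrt(np.diag(pcov)) / np.abs(popt)   # relative standard errors, %
        print(f"mu = {popt[0]:.3f} ({rse[0]:.2f}%), sigma = {popt[1]:.3f} ({rse[1]:.1f}%)")

    As in the study, the RSE of the width parameter is typically far larger than that of the mean, since the breadth of the distribution is harder to pin down.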

  12. Correlation Spectroscopy of Minor Species: Signal Purification and Distribution Analysis

    SciTech Connect

    Laurence, T A; Kwon, Y; Yin, E; Hollars, C; Camarero, J A; Barsky, D

    2006-06-21

    We are performing experiments that use fluorescence resonance energy transfer (FRET) and fluorescence correlation spectroscopy (FCS) to monitor the movement of an individual donor-labeled sliding clamp protein molecule along acceptor-labeled DNA. In addition to the FRET signal sought from the sliding clamp-DNA complexes, the detection channel for FRET contains undesirable signal from free sliding clamp and free DNA. When multiple fluorescent species contribute to a correlation signal, it is difficult or impossible to distinguish between contributions from individual species. As a remedy, we introduce "purified FCS" (PFCS), which uses single molecule burst analysis to select a species of interest and extract the correlation signal for further analysis. We show that by expanding the correlation region around a burst, the correlated signal is retained and the functional forms of FCS fitting equations remain valid. We demonstrate the use of PFCS in experiments with DNA sliding clamps. We also introduce "single molecule FCS", which obtains diffusion time estimates for each burst using expanded correlation regions. By monitoring the detachment of weakly-bound 30-mer DNA oligomers from a single-stranded DNA plasmid, we show that single molecule FCS can distinguish between bursts from species that differ by a factor of 5 in diffusion constant.

  13. An empirical analysis of the distribution of overshoots in a stationary Gaussian stochastic process

    NASA Technical Reports Server (NTRS)

    Carter, M. C.; Madison, M. W.

    1973-01-01

    The frequency distribution of overshoots in a stationary Gaussian stochastic process is analyzed. The primary tools in this analysis are computer simulation and statistical estimation. Computer simulation is used to generate stationary Gaussian stochastic processes that have selected autocorrelation functions. An analysis of the simulation results reveals a frequency distribution for overshoots with a functional dependence on the mean and variance of the process. Statistical estimation is then used to estimate the mean and variance of a process. It is shown that, for a given autocorrelation function, the mean and the variance of the number of overshoots, and hence a frequency distribution for overshoots, can be estimated.
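
    A small sketch of the simulation half of this procedure: generate a stationary Gaussian process with a chosen autocorrelation (here a first-order autoregression, an assumption for illustration) and count overshoots of a level as upcrossings.

        import numpy as np

        rng = np.random.default_rng(0)
        phi, n = 0.9, 100_000          # autocorrelation r(k) = phi**k (AR(1) assumption)
        x = np.empty(n)
        x[0] = rng.normal()
        innov = rng.normal(scale=np.sqrt(1.0 - phi**2), size=n)
        for k in range(1, n):
            x[k] = phi * x[k - 1] + innov[k]

        level = 2.0                    # threshold, in units of the process std dev
        upcrossings = np.sum((x[:-1] < level) & (x[1:] >= level))
        print(f"upcrossings of {level}: {upcrossings}")

    Repeating this over many realizations yields the empirical frequency distribution of overshoot counts whose dependence on the process mean and variance the paper characterizes.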

  14. Independent Orbiter Assessment (IOA): Analysis of the Electrical Power Distribution and Control Subsystem, Volume 2

    NASA Technical Reports Server (NTRS)

    Schmeckpeper, K. R.

    1987-01-01

    The results of the Independent Orbiter Assessment (IOA) of the Failure Modes and Effects Analysis (FMEA) and Critical Items List (CIL) are presented. The IOA approach features a top-down analysis of the hardware to determine failure modes, criticality, and potential critical items. To preserve independence, this analysis was accomplished without reliance upon the results contained within the NASA FMEA/CIL documentation. This report documents the independent analysis results corresponding to the Orbiter Electrical Power Distribution and Control (EPD and C) hardware. The EPD and C hardware performs the functions of distributing, sensing, and controlling 28 volt DC power and of inverting, distributing, sensing, and controlling 117 volt 400 Hz AC power to all Orbiter subsystems from the three fuel cells in the Electrical Power Generation (EPG) subsystem. Volume 2 continues the presentation of IOA analysis worksheets and contains the potential critical items list.

  15. Parametric distribution approach for flow availability in small hydro potential analysis

    NASA Astrophysics Data System (ADS)

    Abdullah, Samizee; Basri, Mohd Juhari Mat; Jamaluddin, Zahrul Zamri; Azrulhisham, Engku Ahmad; Othman, Jamel

    2016-10-01

    Small hydro systems are one of the important sources of renewable energy and have been recognized worldwide as clean energy sources. The power produced by small hydropower generation systems, which use the potential energy in flowing water to produce electricity, is often questioned because it is inconsistent and intermittent. Potential analysis of a small hydro system, which depends mainly on the availability of water, requires knowledge of the water flow or stream flow distribution. This paper presents the possibility of applying the Pearson system for stream flow availability distribution approximation in a small hydro system. Considering the stochastic nature of stream flow, the Pearson parametric distribution approximation was computed based on the defining characteristic of the Pearson system: the direct relation between the first four statistical moments of the distribution. The advantage of applying the various statistical moments in small hydro potential analysis is the ability to analyze the varying shapes of the stream flow distribution.

  16. Adiyaman wind potential and statistical analysis, in Turkey

    NASA Astrophysics Data System (ADS)

    Sogukpinar, Haci; Bozkurt, Ismail

    2017-02-01

    In this study, the wind potential of Adiyaman, Turkey, is analyzed statistically and the installed wind capacity across Turkey is summarized. One year of experimental data was obtained for the district of Adiyaman. The data were taken from the two major stations of Sincik and Kahta, which determine the wind potential of Adiyaman. Measurements at 10 m height were used for the statistical analysis. From the data obtained, monthly average wind speeds were calculated and statistical analyses were performed using the Weibull, Gamma and Log-normal distributions. Data received from the Sincik station represent the windy part of Adiyaman, so the average wind speed there is higher; Kahta represents the calmer part of Adiyaman, and the average wind speed there is lower. This study shows that the measurements made are best fit by the Gamma distribution.
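
    The distribution comparison described here can be reproduced in outline with scipy: fit each candidate family by maximum likelihood and compare log-likelihoods. The synthetic wind speeds below are placeholders for the Sincik and Kahta measurements.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(2)
        v = rng.gamma(shape=2.0, scale=2.5, size=8760)   # synthetic hourly speeds, m/s

        candidates = {"Weibull": stats.weibull_min,
                      "Gamma": stats.gamma,
                      "Log-normal": stats.lognorm}
        for name, dist in candidates.items():
            params = dist.fit(v, floc=0)                 # fix location at zero
            loglik = np.sum(dist.logpdf(v, *params))
            print(f"{name:10s} log-likelihood = {loglik:.1f}")

    The family with the highest fitted log-likelihood is the preferred model; with these synthetic data that is the Gamma distribution, matching the study's conclusion for Adiyaman.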

  17. Extending the LWS Data Environment: Distributed Data Processing and Analysis

    NASA Technical Reports Server (NTRS)

    Narock, Thomas

    2005-01-01

    The final stages of this work saw changes to the original framework, as well as the completion and integration of several data processing services. Initially, it was thought that a peer-to-peer architecture was necessary to make this work possible. The peer-to-peer architecture provided many benefits, including the dynamic discovery of new services that would be continually added. A prototype example was built and, while it showed promise, a major disadvantage was seen in that it was not easily integrated into the existing data environment. While the peer-to-peer system worked well for finding and accessing distributed data processing services, it was found that its use was limited by the difficulty in calling it from existing tools and services. After collaborations with members of the data community, it was determined that our data processing system was of high value and that a new interface should be pursued in order for the community to take full advantage of it. As such, the framework was modified from a peer-to-peer architecture to a more traditional web service approach. Following this change, multiple data processing services were added, including coordinate transformations and subsetting of data. A collaboration with the Virtual Heliospheric Observatory (VHO) assisted with integrating the new architecture into the VHO. This allows anyone using the VHO to search for data and then pass that data through our processing services prior to downloading it. As a second demonstration of the new system, a collaboration was established with the Collaborative Sun Earth Connector (CoSEC) group at Lockheed Martin. This group is working on a graphical user interface to the Virtual Observatories and data processing software. The intent is to provide a high-level, easy-to-use graphical interface that will allow access to the existing Virtual Observatories and data processing services from one convenient application. Working with the CoSEC group we provided access to our data

  18. Wavelet analysis of baryon acoustic structures in the galaxy distribution

    NASA Astrophysics Data System (ADS)

    Arnalte-Mur, P.; Labatie, A.; Clerc, N.; Martínez, V. J.; Starck, J.-L.; Lachièze-Rey, M.; Saar, E.; Paredes, S.

    2012-06-01

    Context. Baryon acoustic oscillations (BAO) are imprinted in the density field by acoustic waves travelling in the plasma of the early universe. Their fixed scale can be used as a standard ruler to study the geometry of the universe. Aims: The BAO have been previously detected using correlation functions and power spectra of the galaxy distribution. We present a new method to detect the real-space structures associated with BAO. These baryon acoustic structures are spherical shells of relatively small density contrast, surrounding high density central regions. Methods: We design a specific wavelet adapted to search for shells, and exploit the physics of the process by making use of two different mass tracers, introducing a specific statistic to detect the BAO features. We show the effect of the BAO signal in this new statistic when applied to the Λ cold dark matter (ΛCDM) model, using an analytical approximation to the transfer function. We confirm the reliability and stability of our method by using cosmological N-body simulations from the MareNostrum Institut de Ciències de l'Espai (MICE). Results: We apply our method to the detection of BAO in a galaxy sample drawn from the Sloan Digital Sky Survey (SDSS). We use the "main" catalogue to trace the shells, and the luminous red galaxies (LRG) as tracers of the high density central regions. Using this new method, we detect, with a high significance, that the LRG in our sample are preferentially located close to the centres of shell-like structures in the density field, with characteristics similar to those expected from BAO. We show that, by stacking selected shells, we can find their characteristic density profile. Conclusions: We delineate a new feature of the cosmic web, the BAO shells. As these are real spatial structures, the BAO phenomenon can be studied in detail by examining those shells. Full Table 1 is only available at the CDS via anonymous ftp to cdsarc.u-strasbg.fr (130.79.128.5) or via http://cdsarc

  19. Characterizing fibrosis in UUO mice model using multiparametric analysis of phasor distribution from FLIM images.

    PubMed

    Ranjit, Suman; Dvornikov, Alexander; Levi, Moshe; Furgeson, Seth; Gratton, Enrico

    2016-09-01

    The phasor approach to fluorescence lifetime imaging microscopy (FLIM) is used to study the development of fibrosis in the unilateral ureteral obstruction (UUO) model of kidney in mice. Traditional phasor analysis has been modified to create a multiparametric analysis scheme that splits the phasor points into four equidistant segments based on the height of the peak of the phasor distribution and calculates six parameters, including the average phasor positions, the shape of each segment, the angle of the distribution and the number of points in each segment. These parameters are used to create a spectrum of twenty-four points specific to the phasor distribution of each sample. Comparisons of spectra from diseased and healthy tissues result in quantitative separation and calculation of statistical parameters including AUC values, positive predictive values and sensitivity. This is a new method in the evolving field of analyzing phasor distributions of FLIM data and provides further insights. Additionally, the progression of fibrosis with time is detected using this multiparametric approach to phasor analysis.
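
    The underlying phasor transform (prior to the paper's four-segment multiparametric scheme) maps each pixel's decay histogram to coordinates (g, s). A minimal sketch for a synthetic single-exponential decay follows; the repetition period and lifetime are assumed values for the demo.

        import numpy as np

        T = 12.5e-9                       # assumed laser repetition period (80 MHz)
        t = np.linspace(0.0, T, 256, endpoint=False)
        tau = 3.0e-9                      # single-exponential lifetime for the demo
        decay = np.exp(-t / tau)          # one pixel's decay histogram

        omega = 2.0 * np.pi / T
        g = np.sum(decay * np.cos(omega * t)) / np.sum(decay)
        s = np.sum(decay * np.sin(omega * t)) / np.sum(decay)
        # For a single-exponential decay these approach the universal semicircle:
        # g = 1/(1 + (omega*tau)**2), s = omega*tau/(1 + (omega*tau)**2).
        print(f"g = {g:.3f}, s = {s:.3f}")

    The paper's method then operates on the 2D histogram of many such (g, s) points rather than on individual pixels.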

  20. Radar signal analysis of ballistic missile with micro-motion based on time-frequency distribution

    NASA Astrophysics Data System (ADS)

    Wang, Jianming; Liu, Lihua; Yu, Hua

    2015-12-01

    The micro-motion of ballistic missile targets induces micro-Doppler modulation on the radar return signal, which is a unique feature for the warhead discrimination during flight. In order to extract the micro-Doppler feature of ballistic missile targets, time-frequency analysis is employed to process the micro-Doppler modulated time-varying radar signal. The images of time-frequency distribution (TFD) reveal the micro-Doppler modulation characteristic very well. However, there are many existing time-frequency analysis methods to generate the time-frequency distribution images, including the short-time Fourier transform (STFT), Wigner distribution (WD) and Cohen class distribution, etc. Under the background of ballistic missile defence, the paper aims at working out an effective time-frequency analysis method for ballistic missile warhead discrimination from the decoys.
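
    As a sketch of the STFT option, a sinusoidally phase-modulated return (a common micro-Doppler stand-in; the carrier, modulation rate, and depth below are illustrative) can be mapped to a time-frequency distribution image with scipy.

        import numpy as np
        from scipy import signal

        fs = 10_000                            # sampling rate, Hz
        t = np.arange(0.0, 1.0, 1.0 / fs)
        fm, beta = 5.0, 40.0                   # micro-motion rate (Hz) and depth
        x = np.exp(1j * (2 * np.pi * 500.0 * t + beta * np.sin(2 * np.pi * fm * t)))

        f, tt, Z = signal.stft(x, fs=fs, nperseg=256, return_onesided=False)
        tfd = np.abs(Z)                        # time-frequency distribution image
        print(tfd.shape)                       # (frequency bins, time frames)

    The sinusoidal frequency track visible in such an image is the micro-Doppler signature that distinguishes a precessing warhead from a tumbling decoy.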

  1. Numerical analysis of atomic density distribution in arc driven negative ion sources

    SciTech Connect

    Yamamoto, T.; Shibata, T.; Hatayama, A.; Kashiwagi, M.; Hanada, M.; Sawada, K.

    2014-02-15

    The purpose of this study is to calculate the atomic hydrogen (H⁰) density distribution in the JAEA 10 ampere negative ion source. A collisional radiative model is developed for the calculation of the H⁰ density distribution. The non-equilibrium feature of the electron energy distribution function (EEDF), which mainly determines the H⁰ production rate, is included by substituting the EEDF calculated from a 3D electron transport analysis. In this paper, the H⁰ production rate, the ionization rate, and the density distribution in the source chamber are calculated. In the region where high energy electrons exist, H⁰ production and ionization are enhanced. The calculated H⁰ density distribution without the effect of H⁰ transport is relatively small in the upper region. In the next step, this effect should be taken into account to obtain a more realistic H⁰ distribution.

  2. Progress in Using the Generalized Wigner Distribution in the Analysis of Terrace-Width Distributions of Vicinal Surfaces

    NASA Astrophysics Data System (ADS)

    Cohen, S. D.; Richards, Howard L.; Einstein, T. L.

    2000-03-01

    The so-called generalized Wigner distribution (GWD) may provide at least as good a description of terrace width distributions (TWDs) on vicinal surfaces as the standard Gaussian fit [T.L. Einstein and O. Pierre-Louis, Surface Sci. 424, L299 (1999)]. It works well for weak elastic repulsion strengths A between steps (where the latter fails), as illustrated explicitly [S.D. Cohen, H.L. Richards, T.L. Einstein, and M. Giesen, cond-mat/9911319] for vicinal Pt(110) [K. Swamy, E. Bertel, and I. Vilfan, Surface Sci. 425, L369 (1999)]. Application to vicinal copper surfaces confirms the general viability of the new analysis procedure [M. Giesen and T.L. Einstein, submitted to Surface Sci.]. For troublesome data, we can treat the GWD as a two-parameter fit that allows the terrace widths to be scaled by an optimal effective mean width. With Monte Carlo simulations we show that for physical values of A, the GWD provides a better overall estimate than the Gaussian models. We quantify how a GWD approaches a Gaussian for large A and present a convenient but accurate new expression relating the variance of the TWD to A. We also mention how discreteness of terrace widths impacts the standard continuum analysis.

  3. Statistical Scalability Analysis of Communication Operations in Distributed Applications

    SciTech Connect

    Vetter, J S; McCracken, M O

    2001-02-27

    Current trends in high performance computing suggest that users will soon have widespread access to clusters of multiprocessors with hundreds, if not thousands, of processors. This unprecedented degree of parallelism will undoubtedly expose scalability limitations in existing applications, where scalability is the ability of a parallel algorithm on a parallel architecture to effectively utilize an increasing number of processors. Users will need precise and automated techniques for detecting the cause of limited scalability. This paper addresses this dilemma. First, we argue that users face numerous challenges in understanding application scalability: managing substantial amounts of experiment data, extracting useful trends from this data, and reconciling performance information with their application's design. Second, we propose a solution to automate this data analysis problem by applying fundamental statistical techniques to scalability experiment data. Finally, we evaluate our operational prototype on several applications, and show that statistical techniques offer an effective strategy for assessing application scalability. In particular, we find that non-parametric correlation of the number of tasks to the ratio of the time for individual communication operations to overall communication time provides a reliable measure for identifying communication operations that scale poorly.

  4. Biomechanical analysis of force distribution in human finger extensor mechanisms.

    PubMed

    Hu, Dan; Ren, Lei; Howard, David; Zong, Changfu

    2014-01-01

    The complexities of the function and structure of human fingers have long been recognised. The in vivo forces in the human finger tendon network during different activities are critical information for clinical diagnosis, surgical treatment, prosthetic finger design, and biomimetic hand development. In this study, we propose a novel method for in vivo force estimation for the finger tendon network by combining a three-dimensional motion analysis technique and a novel biomechanical tendon network model. The extensor mechanism of a human index finger is represented by an interconnected tendinous network moving around the phalanx's dorsum. A novel analytical approach based on the "Principle of Minimum Total Potential Energy" is used to calculate the forces and deformations throughout the tendon network of the extensor mechanism when subjected to an external load and with the finger posture defined by measurement data. The predicted deformations and forces in the tendon network are in broad agreement with the results obtained by previous experimental in vitro studies. The proposed methodology provides a promising tool for investigating the biomechanical function of complex interconnected tendon networks in vivo.

  5. Advancing Collaborative Climate Studies through Globally Distributed Geospatial Analysis

    NASA Astrophysics Data System (ADS)

    Singh, R.; Percivall, G.

    2009-12-01

    Infrastructure and the broader GEOSS architecture. Of specific interest to this session is the work on geospatial workflows and geo-processing and data discovery and access. CCIP demonstrates standards-based interoperability between geospatial applications in the service of Climate Change analysis. CCIP is planned to be a yearly exercise. It consists of a network of online data services (WCS, WFS, SOS), analysis services (WPS, WCPS, WMS), and clients that exercise those services. In 2009, CCIP focuses on Australia, and the initial application of existing OGC services to climate studies. The results of the 2009 CCIP will serve as requirements for more complex geo-processing services to be developed for CCIP 2010. The benefits of CCIP include accelerating the implementation of the GCOS, and building confidence that implementations using multi-vendor interoperable technologies can help resolve vexing climate change questions. Acronyms: AIP-2, Architecture Implementation Pilot, Phase 2; CCIP, Climate Challenge Integration Plugfest; GEO, Group on Earth Observations; GEOSS, Global Earth Observing System of Systems; GCOS, Global Climate Observing System; OGC, Open Geospatial Consortium; SOS, Sensor Observation Service; WCS, Web Coverage Service; WCPS, Web Coverage Processing Service; WFS, Web Feature Service; WMS, Web Mapping Service.

  6. Hyperdimensional Analysis of Amino Acid Pair Distributions in Proteins

    PubMed Central

    Henriksen, Svend B.; Arnason, Omar; Söring, Jón; Petersen, Steffen B.

    2011-01-01

    Our manuscript presents a novel approach to protein structure analysis. We have organized an 8-dimensional data cube with protein 3D-structural information from 8706 high-resolution non-redundant protein chains with the aim of identifying packing rules at the amino acid pair level. The cube contains information about amino acid type, solvent accessibility, spatial and sequence distance, secondary structure and sequence length. We are able to pose structural queries to the data cube using the program ProPack. The response is a 1, 2 or 3D graph. Whereas the response is of a statistical nature, the user can obtain an instant list of all PDB structures where such a pair is found. The user may select a particular structure, which is displayed highlighting the pair in question. The user may pose millions of different queries and for each one will receive the answer in a few seconds. In order to demonstrate the capabilities of the data cube as well as the programs, we have selected well-known structural features, disulphide bridges and salt bridges, where we illustrate how the queries are posed and how answers are given. Motifs involving cysteines such as disulphide bridges, zinc fingers and iron-sulfur clusters are clearly identified and differentiated. ProPack also reveals that whereas pairs of Lys residues virtually never appear in close spatial proximity, pairs of Arg are abundant and appear at close spatial distance, contrasting the belief that electrostatic repulsion would prevent this juxtaposition and that Arg-Lys is perceived as a conservative mutation. The presented programs can find and visualize novel packing preferences in protein structures, allowing the user to unravel correlations between pairs of amino acids. The new tools allow the user to view statistical information and instantly visualize the structures that underpin it, which is far from trivial with most other software tools for protein structure analysis. PMID:22174733

  7. A network analysis of food flows within the United States of America.

    PubMed

    Lin, Xiaowen; Dang, Qian; Konar, Megan

    2014-05-20

    The world food system is globalized and interconnected, in which trade plays an increasingly important role in facilitating food availability. We present a novel application of network analysis to domestic food flows within the USA, a country with global importance as a major agricultural producer and trade power. We find normal node degree distributions and Weibull node strength and betweenness centrality distributions. An unassortative network structure with high clustering coefficients exists. These network properties indicate that the USA food flow network is highly social and well-mixed. However, a power law relationship between node betweenness centrality and node degree indicates potential network vulnerability to the disturbance of key nodes. We perform an equality analysis which serves as a benchmark for global food trade, where the Gini coefficient = 0.579, Lorenz asymmetry coefficient = 0.966, and Hoover index = 0.442. These findings shed insight into trade network scaling and proxy free trade and equitable network architectures.
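
    The equality metrics are straightforward to compute from node strengths. A minimal sketch of the Gini coefficient and a Weibull fit to node strengths on illustrative flow totals (not the paper's data):

        import numpy as np
        from scipy import stats

        def gini(w):
            # Gini coefficient of non-negative flow totals (sorted ascending).
            w = np.sort(np.asarray(w, dtype=float))
            n = w.size
            cum = np.cumsum(w)
            return (n + 1 - 2 * np.sum(cum) / cum[-1]) / n

        # Node strengths (total food flow per node); values are illustrative.
        strength = np.array([1.0, 2.0, 2.5, 4.0, 8.0, 20.0, 50.0])
        print(f"Gini = {gini(strength):.3f}")

        # Weibull fit to node strengths, echoing the distributional claim above.
        c, loc, scale = stats.weibull_min.fit(strength, floc=0)
        print(f"Weibull shape = {c:.2f}, scale = {scale:.2f}")

    A Gini of 0 would mean perfectly equal flows across nodes; values near the paper's 0.579 indicate substantial concentration of flow in a few hub states.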

  8. Clinical Application of Spatiotemporal Distributed Source Analysis in Presurgical Evaluation of Epilepsy

    PubMed Central

    Tanaka, Naoaki; Stufflebeam, Steven M.

    2014-01-01

    Magnetoencephalography (MEG), which acquires neuromagnetic fields in the brain, is a useful diagnostic tool in the presurgical evaluation of epilepsy. Previous studies have shown that MEG affects the planning of intracranial electroencephalography placement and correlates with surgical outcomes when using a single dipole model. Spatiotemporal source analysis using distributed source models is an advanced method for analyzing MEG that has recently been introduced for analyzing epileptic spikes. It has advantages over the conventional single dipole analysis for obtaining accurate sources and understanding the propagation of epileptic spikes. In this article, we review the source analysis methods, describe the techniques of the distributed source analysis and the interpretation of source distribution maps, and discuss the benefits and feasibility of this method in the evaluation of epilepsy. PMID:24574999

  9. Clinical application of spatiotemporal distributed source analysis in presurgical evaluation of epilepsy.

    PubMed

    Tanaka, Naoaki; Stufflebeam, Steven M

    2014-01-01

    Magnetoencephalography (MEG), which acquires neuromagnetic fields in the brain, is a useful diagnostic tool in the presurgical evaluation of epilepsy. Previous studies have shown that MEG affects the planning of intracranial electroencephalography placement and correlates with surgical outcomes when using a single dipole model. Spatiotemporal source analysis using distributed source models is an advanced method for analyzing MEG that has recently been introduced for analyzing epileptic spikes. It has advantages over the conventional single dipole analysis for obtaining accurate sources and understanding the propagation of epileptic spikes. In this article, we review the source analysis methods, describe the techniques of the distributed source analysis and the interpretation of source distribution maps, and discuss the benefits and feasibility of this method in the evaluation of epilepsy.

  10. Spherical Harmonic Analysis of Particle Velocity Distribution Function: Comparison of Moments and Anisotropies using Cluster Data

    NASA Technical Reports Server (NTRS)

    Gurgiolo, Chris; Vinas, Adolfo F.

    2009-01-01

    This paper presents a spherical harmonic analysis of the plasma velocity distribution function using high-angular, energy, and time resolution Cluster data obtained from the PEACE spectrometer instrument to demonstrate how this analysis models the particle distribution function and its moments and anisotropies. The results show that spherical harmonic analysis produced a robust physical representation model of the velocity distribution function, resolving the main features of the measured distributions. From the spherical harmonic analysis, a minimum set of nine spectral coefficients was obtained from which the moment (up to the heat flux), anisotropy, and asymmetry calculations of the velocity distribution function were obtained. The spherical harmonic method provides a potentially effective "compression" technique that can be easily carried out onboard a spacecraft to determine the moments and anisotropies of the particle velocity distribution function for any species. These calculations were implemented using three different approaches, namely, the standard traditional integration, the spherical harmonic (SPH) spectral coefficients integration, and the singular value decomposition (SVD) on the spherical harmonic methods. A comparison among the various methods shows that both SPH and SVD approaches provide remarkable agreement with the standard moment integration method.
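
    A least-squares (SVD-based) estimate of spherical harmonic coefficients from sampled directions can be sketched as below; expanding up to degree 2 gives nine coefficients, matching the minimum set mentioned above. The sampling and test pattern are synthetic, and this is a simplified stand-in for the paper's PEACE-data pipeline.

        import numpy as np
        from scipy.special import sph_harm

        rng = np.random.default_rng(7)
        theta = rng.uniform(0.0, 2.0 * np.pi, 500)      # azimuth
        phi = np.arccos(rng.uniform(-1.0, 1.0, 500))    # polar angle
        f = 1.0 + 0.5 * np.real(sph_harm(1, 2, theta, phi))  # synthetic pattern

        cols = []
        for n in range(3):                               # degrees 0..2 -> 9 terms
            for m in range(-n, n + 1):
                Y = sph_harm(m, n, theta, phi)
                cols.append(np.real(Y) if m >= 0 else np.imag(Y))
        A = np.column_stack(cols)

        coef, *_ = np.linalg.lstsq(A, f, rcond=None)     # SVD-based least squares
        print(np.round(coef, 3))

    Moments and anisotropies then follow from weighted integrals of the reconstructed expansion rather than from the raw measurements.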

  11. Just fracking: a distributive environmental justice analysis of unconventional gas development in Pennsylvania, USA

    NASA Astrophysics Data System (ADS)

    Clough, Emily; Bell, Derek

    2016-02-01

    This letter presents a distributive environmental justice analysis of unconventional gas development in the area of Pennsylvania lying over the Marcellus Shale, the largest shale gas formation in play in the United States. The extraction of shale gas using unconventional wells, which are hydraulically fractured (fracking), has increased dramatically since 2005. As the number of wells has grown, so have concerns about the potential public health effects on nearby communities. These concerns make shale gas development an environmental justice issue. This letter examines whether the hazards associated with proximity to wells and the economic benefits of shale gas production are fairly distributed. We distinguish two types of distributive environmental justice: traditional and benefit sharing. We ask the traditional question: are there a disproportionate number of minority or low-income residents in areas near to unconventional wells in Pennsylvania? However, we extend this analysis in two ways: we examine income distribution and level of education; and we compare before and after shale gas development. This contributes to discussions of benefit sharing by showing how the income distribution of the population has changed. We use a binary dasymetric technique to remap the data from the 2000 US Census and the 2009-2013 American Communities Survey and combine that data with a buffer containment analysis of unconventional wells to compare the characteristics of the population living nearer to unconventional wells with those further away before and after shale gas development. Our analysis indicates that there is no evidence of traditional distributive environmental injustice: there is not a disproportionate number of minority or low-income residents in areas near to unconventional wells. However, our analysis is consistent with the claim that there is benefit sharing distributive environmental injustice: the income distribution of the population nearer to shale gas wells

  12. Graphical tests for the assumption of gamma and inverse Gaussian frailty distributions.

    PubMed

    Economou, P; Caroni, C

    2005-12-01

    The common choices of frailty distribution in lifetime data models include the Gamma and Inverse Gaussian distributions. We present diagnostic plots for these distributions when frailty operates in a proportional hazards framework. Firstly, we present plots based on the form of the unconditional survival function when the baseline hazard is assumed to be Weibull. Secondly, we base a plot on a closure property that applies for any baseline hazard, namely, that the frailty distribution among survivors at time t has the same form as the original distribution, with the same shape parameter but different scale parameter. We estimate the shape parameter at different values of t and examine whether it is constant, that is, whether plotted values form a straight line parallel to the time axis. We provide simulation results assuming Weibull baseline hazard and an example to illustrate the methods.
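
    The gamma case is easy to verify numerically: with a Weibull baseline cumulative hazard H0(t) = (t/λ)^k and gamma frailty of unit mean and variance θ, the unconditional survival has the closed form S(t) = (1 + θH0(t))^(−1/θ). A simulation sketch with illustrative parameters:

        import numpy as np

        rng = np.random.default_rng(3)
        theta, k, lam = 0.5, 1.5, 2.0      # frailty variance; Weibull shape and scale

        z = rng.gamma(shape=1.0 / theta, scale=theta, size=100_000)  # frailty, mean 1
        u = rng.uniform(size=z.size)
        t_life = lam * (-np.log(u) / z) ** (1.0 / k)     # invert Z * H0(T) = -log U

        for t in (1.0, 2.0, 4.0):
            emp = np.mean(t_life > t)
            closed = (1.0 + theta * (t / lam) ** k) ** (-1.0 / theta)
            print(f"t = {t}: empirical {emp:.4f} vs closed form {closed:.4f}")

    The first diagnostic plot in the paper exploits exactly this unconditional survival form.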

  13. Can Data Recognize Its Parent Distribution?

    SciTech Connect

    Marshall, A. W.; Meza, J. C.; Olkin, I.

    1999-05-01

    This study is concerned with model selection of lifetime and survival distributions arising in engineering reliability or in the medical sciences. We compare various distributions, including the gamma, Weibull and lognormal, with a new distribution called geometric extreme exponential. Except for the lognormal distribution, the other three distributions all have the exponential distribution as special cases. A Monte Carlo simulation was performed to determine sample sizes for which survival distributions can distinguish data generated by their own families. Two methods for decision are by maximum likelihood and by Kolmogorov distance. Neither method is uniformly best. The probability of correct selection with more than one alternative shows some surprising results when the choices are close to the exponential distribution.
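
    The maximum-likelihood selection rule can be sketched directly: fit each candidate family to a sample, pick the family with the highest fitted log-likelihood, and estimate the probability of correct selection by Monte Carlo. The sample size, shape, and trial count below are illustrative, not the study's design.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(4)
        families = {"gamma": stats.gamma, "weibull": stats.weibull_min,
                    "lognormal": stats.lognorm}

        def select_family(sample):
            # Fit each family by maximum likelihood and return the best scorer.
            scores = {}
            for name, dist in families.items():
                params = dist.fit(sample, floc=0)
                scores[name] = np.sum(dist.logpdf(sample, *params))
            return max(scores, key=scores.get)

        trials = 200
        hits = sum(select_family(stats.weibull_min.rvs(1.5, size=50, random_state=rng))
                   == "weibull" for _ in range(trials))
        print(f"correct selection rate = {hits / trials:.2f}")

    Shapes close to 1 push all three families toward the exponential, which is where the study found selection probabilities to behave surprisingly.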

  14. SCARE: A post-processor program to MSC/NASTRAN for the reliability analysis of structural ceramic components

    NASA Technical Reports Server (NTRS)

    Gyekenyesi, J. P.

    1985-01-01

    A computer program was developed for calculating the statistical fast fracture reliability and failure probability of ceramic components. The program includes the two-parameter Weibull material fracture strength distribution model, using the principle of independent action for polyaxial stress states and Batdorf's shear-sensitive as well as shear-insensitive crack theories, all for volume distributed flaws in macroscopically isotropic solids. Both penny-shaped cracks and Griffith cracks are included in the Batdorf shear-sensitive crack response calculations, using Griffith's maximum tensile stress or critical coplanar strain energy release rate criteria to predict mixed mode fracture. Weibull material parameters can also be calculated from modulus of rupture bar tests, using the least squares method with known specimen geometry and fracture data. The reliability prediction analysis uses MSC/NASTRAN stress, temperature and volume output, obtained from the use of three-dimensional, quadratic, isoparametric, or axisymmetric finite elements. The statistical fast fracture theories employed, along with selected input and output formats and options, are summarized. An example problem to demonstrate various features of the program is included.
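
    The least-squares Weibull parameter estimation from modulus-of-rupture data mentioned above is commonly done on the linearized form ln(−ln(1−F)) = m ln σ − m ln σ0; a sketch with illustrative strengths (not SCARE's actual input format):

        import numpy as np

        s = np.sort(np.array([312.0, 335.0, 348.0, 360.0, 371.0,
                              383.0, 395.0, 410.0, 428.0, 455.0]))  # MPa, illustrative
        F = (np.arange(1, s.size + 1) - 0.5) / s.size               # rank estimator

        y = np.log(-np.log(1.0 - F))              # linearized Weibull ordinate
        m, c = np.polyfit(np.log(s), y, 1)        # slope = Weibull modulus m
        s0 = np.exp(-c / m)                       # characteristic strength
        print(f"Weibull modulus m = {m:.2f}, characteristic strength = {s0:.1f} MPa")

    The fitted modulus and characteristic strength then feed the volume-flaw reliability integration over the finite element stress output.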

  15. Monte Carlo models and analysis of galactic disk gamma-ray burst distributions

    NASA Technical Reports Server (NTRS)

    Hakkila, Jon

    1989-01-01

    Gamma-ray bursts are transient astronomical phenomena which have no quiescent counterparts in any region of the electromagnetic spectrum. Although temporal and spectral properties indicate that these events are likely energetic, their unknown spatial distribution complicates astrophysical interpretation. Monte Carlo samples of gamma-ray burst sources are created which belong to Galactic disk populations. Spatial analysis techniques are used to compare these samples to the observed distribution. From this, both quantitative and qualitative conclusions are drawn concerning allowed luminosity and spatial distributions of the actual sample. Although the Burst and Transient Source Experiment (BATSE) on the Gamma Ray Observatory (GRO) will significantly improve knowledge of the gamma-ray burst source spatial characteristics within only a few months of launch, the analysis techniques described herein will not be superseded. Rather, they may be used with BATSE results to obtain detailed information about both the luminosity and spatial distributions of the sources.

  16. BATSE analysis techniques for probing the GRB spatial and luminosity distributions

    NASA Technical Reports Server (NTRS)

    Hakkila, Jon; Meegan, Charles A.

    1992-01-01

    The Burst And Transient Source Experiment (BATSE) has measured homogeneity and isotropy parameters from an increasingly large sample of observed gamma-ray bursts (GRBs), while also maintaining a summary of the way in which the sky has been sampled. Measurement of both of these are necessary for any study of the BATSE data statistically, as they take into account the most serious observational selection effects known in the study of GRBs: beam-smearing and inhomogeneous, anisotropic sky sampling. Knowledge of these effects is important to analysis of GRB angular and intensity distributions. In addition to determining that the bursts are local, it is hoped that analysis of such distributions will allow boundaries to be placed on the true GRB spatial distribution and luminosity function. The technique for studying GRB spatial and luminosity distributions is direct. Results of BATSE analyses are compared to Monte Carlo models parameterized by a variety of spatial and luminosity characteristics.

  17. Comparison of photon correlation spectroscopy with photosedimentation analysis for the determination of aqueous colloid size distributions

    USGS Publications Warehouse

    Rees, T.F.

    1990-01-01

    Photon correlation spectroscopy (PCS) utilizes the Doppler frequency shift of photons scattered off particles undergoing Brownian motion to determine the size of colloids suspended in water. Photosedimentation analysis (PSA) measures the time-dependent change in optical density of a suspension of colloidal particles undergoing centrifugation. A description of both techniques, important underlying assumptions, and limitations are given. Results for a series of river water samples show that the colloid-size distribution means are statistically identical as determined by both techniques. This also is true of the mass median diameter (MMD), even though MMD values determined by PSA are consistently smaller than those determined by PCS. Because of this small negative bias, the skew parameters for the distributions are generally smaller for the PCS-determined distributions than for the PSA-determined distributions. Smaller polydispersity indices for the distributions are also determined by PCS.

  18. Evaluating Domestic Hot Water Distribution System Options with Validated Analysis Models

    SciTech Connect

    Weitzel, E.; Hoeschele, E.

    2014-09-01

    A developing body of work is forming that collects data on domestic hot water consumption, water use behaviors, and energy efficiency of various distribution systems. A full distribution system model was developed in the Transient System Simulation Tool (TRNSYS), validated using field monitoring data, and then exercised in a number of climates to understand the climate impact on performance. In this study, the Building America team built upon previous analysis modeling work to evaluate differing distribution systems and the sensitivities of water heating energy and water use efficiency to variations in climate, load, distribution type, insulation and compact plumbing practices. Overall, 124 different TRNSYS models were simulated. The results of this work are useful in informing future development of water heating best practices guides as well as more accurate (and simulation-time efficient) distribution models for annual whole-house simulation programs.

  19. Mean-square-displacement distribution in crystals and glasses: An analysis of the intrabasin dynamics.

    PubMed

    Flores-Ruiz, Hugo M; Naumis, Gerardo G

    2012-04-01

    In the energy landscape picture, the dynamics of glasses and crystals is usually decomposed into two separate contributions: interbasin and intrabasin dynamics. The intrabasin dynamics depends partially on the quadratic displacement distribution on a given metabasin. Here we show that such a distribution can be approximated by a Gamma function, with a mean that depends linearly on the temperature and on the inverse second moment of the density of vibrational states. The width of the distribution also depends on this last quantity, and thus the contribution of the boson peak in glasses is evident on the tail of the distribution function. It causes the distribution of the mean-square displacement to decay slower in glasses than in crystals. When a statistical analysis is performed under many energy basins, we obtain a Gaussian in which the width is regulated by the mean inverse second moment of the density of states. Simulations performed in binary glasses are in agreement with such a result.

  20. Robust Bayesian Analysis of Heavy-tailed Stochastic Volatility Models using Scale Mixtures of Normal Distributions

    PubMed Central

    Abanto-Valle, C. A.; Bandyopadhyay, D.; Lachos, V. H.; Enriquez, I.

    2009-01-01

    A Bayesian analysis of stochastic volatility (SV) models using the class of symmetric scale mixtures of normal (SMN) distributions is considered. In the face of non-normality, this provides an appealing robust alternative to the routine use of the normal distribution. Specific distributions examined include the normal, Student-t, slash and the variance gamma distributions. Using a Bayesian paradigm, an efficient Markov chain Monte Carlo (MCMC) algorithm is introduced for parameter estimation. Moreover, the mixing parameters obtained as a by-product of the scale mixture representation can be used to identify outliers. The methods developed are applied to analyze daily stock return data on the S&P500 index. Bayesian model selection criteria as well as out-of-sample forecasting results reveal that the SV models based on heavy-tailed SMN distributions provide significant improvement in model fit as well as prediction to the S&P500 index data over the usual normal model. PMID:20730043

  1. A computational approach for the analysis of changes in polygon distributions

    NASA Astrophysics Data System (ADS)

    Sadahiro, Yukio; Umemura, Mitsuru

    This paper develops a computational method for analyzing changes in polygon distributions. Unmovable polygons that change discontinuously without explicit functional linkage information are discussed. Six types of primitive events are used to describe the change: 1) generation, 2) disappearance, 3) expansion, 4) shrinkage, 5) union, and 6) division. The change of polygon distributions is decomposed into a combination of these events. A computational procedure for deducing a set of events from polygon distributions at two points in time is proposed. The method is applied to the analysis of the spatial competition between the major and small chains of convenience stores in Tokyo, Japan. Some empirical findings are shown.

  2. Analysis and synthesis of distributed-lumped-active networks by digital computer

    NASA Technical Reports Server (NTRS)

    1973-01-01

    The use of digital computational techniques in the analysis and synthesis of DLA (distributed lumped active) networks is considered. This class of networks consists of three distinct types of elements, namely, distributed elements (modeled by partial differential equations), lumped elements (modeled by algebraic relations and ordinary differential equations), and active elements (modeled by algebraic relations). Such a characterization is applicable to a broad class of circuits, especially including those usually referred to as linear integrated circuits, since the fabrication techniques for such circuits readily produce elements which may be modeled as distributed, as well as the more conventional lumped and active ones.

  3. Analysis of the 3D distribution of stacked self-assembled quantum dots by electron tomography

    PubMed Central

    2012-01-01

    The 3D distribution of self-assembled stacked quantum dots (QDs) is a key parameter to obtain the highest performance in a variety of optoelectronic devices. In this work, we have measured this distribution in 3D using a combined procedure of needle-shaped specimen preparation and electron tomography. We show that conventional 2D measurements of the distribution of QDs are not reliable, and only 3D analysis allows an accurate correlation between the growth design and the structural characteristics. PMID:23249477

  4. Analysis of temperature distribution during tension test of glass fiber reinforced plastic by fiber orientation variation.

    PubMed

    Kim, Jin-Woo; Kim, Hyoung-Seok; Lee, Dong-Gi

    2014-10-01

    In this paper, an analysis of the temperature distribution as a function of fiber orientation during tension tests is proposed, using an IR thermography camera. The lock-in method, an IR thermography technique for measuring minute changes in temperature, was utilized to monitor the temperature distribution and its change during crack propagation. A method to analyze the temperature distribution by fiber orientation variation during tension tests of GFRP via an IR thermography camera is suggested. At the maximum stress point, the temperature increased significantly. As shown previously, specimens with shorter fracture times showed an abrupt increase in temperature at the maximum stress point, while specimens with longer fracture times displayed the temperature increase after the maximum stress point.

  5. Geographic distribution of suicide and railway suicide in Belgium, 2008-2013: a principal component analysis.

    PubMed

    Strale, Mathieu; Krysinska, Karolina; Overmeiren, Gaëtan Van; Andriessen, Karl

    2016-04-20

    This study investigated the geographic distribution of suicide and railway suicide in Belgium over 2008-2013 at the local (i.e., district or arrondissement) level. There were differences in the regional distribution of suicide and railway suicides in Belgium over the study period. Principal component analysis identified three groups of correlations among population variables and socio-economic indicators, such as population density, unemployment, and age group distribution, on two components that helped explain the variance of railway suicide at a local (arrondissement) level. This information is of particular importance for preventing suicides in high-risk areas on the Belgian railway network.

  6. Powerlaw: a Python package for analysis of heavy-tailed distributions.

    PubMed

    Alstott, Jeff; Bullmore, Ed; Plenz, Dietmar

    2014-01-01

    Power laws are theoretically interesting probability distributions that are also frequently used to describe empirical data. In recent years, effective statistical methods for fitting power laws have been developed, but appropriate use of these techniques requires significant programming and statistical insight. In order to greatly decrease the barriers to using good statistical methods for fitting power law distributions, we developed the powerlaw Python package. This software package provides easy commands for basic fitting and statistical analysis of distributions. Notably, it also seeks to support a variety of user needs by being exhaustive in the options available to the user. The source code is publicly available and easily extensible.
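
    As a concrete illustration of the workflow the abstract describes, the following minimal sketch fits a power law with the powerlaw package and compares it against a log-normal alternative. The input array here is synthetic Pareto-distributed data, not data from the paper.

```python
import numpy as np
import powerlaw

# Synthetic heavy-tailed sample standing in for empirical data.
rng = np.random.default_rng(0)
data = rng.pareto(2.5, 10_000) + 1.0

# Fit estimates the scaling exponent alpha and the lower cutoff xmin.
fit = powerlaw.Fit(data)
print(f"alpha = {fit.power_law.alpha:.2f}, xmin = {fit.power_law.xmin:.2f}")

# Log-likelihood-ratio comparison: R > 0 favors the power law;
# p is the significance of that preference.
R, p = fit.distribution_compare('power_law', 'lognormal')
print(f"R = {R:.2f}, p = {p:.3f}")
```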

  7. Theoretical Analysis of Orientation Distribution Function Reconstruction of Textured Polycrystal by Parametric X-rays

    NASA Astrophysics Data System (ADS)

    Lobach, I.; Benediktovitch, A.

    2016-07-01

    The possibility of quantitative texture analysis by means of parametric x-ray radiation (PXR) from relativistic electrons (energies above 50 MeV) in a polycrystal is considered theoretically. In the case of a rather smooth orientation distribution function (ODF) and a large detector (θ_D >> 1/γ, with γ the electron Lorentz factor), a universal relation between the ODF and the intensity distribution is presented. It is shown that if the ODF is independent of one of the Euler angles, then the texture is fully determined by the angular intensity distribution. Application of the method to simulated data shows the stability of the proposed algorithm.

  8. powerlaw: A Python Package for Analysis of Heavy-Tailed Distributions

    PubMed Central

    Alstott, Jeff; Bullmore, Ed; Plenz, Dietmar

    2014-01-01

    Power laws are theoretically interesting probability distributions that are also frequently used to describe empirical data. In recent years, effective statistical methods for fitting power laws have been developed, but appropriate use of these techniques requires significant programming and statistical insight. In order to greatly decrease the barriers to using good statistical methods for fitting power law distributions, we developed the powerlaw Python package. This software package provides easy commands for basic fitting and statistical analysis of distributions. Notably, it also seeks to support a variety of user needs by being exhaustive in the options available to the user. The source code is publicly available and easily extensible. PMID:24489671

  9. An information flow analysis of a distributed information system for space medical support.

    PubMed

    Zhang, Tao; Aranzamendez, Gina; Rinkus, Susan; Gong, Yang; Rukab, Jamie; Johnson-Throop, Kathy A; Malin, Jane T; Zhang, Jiajie

    2004-01-01

    In this study, we applied the methodology grounded in human-centered distributed cognition principles to the information flow analysis of a highly intensive, distributed and complex environment--the Biomedical Engineer (BME) console system at NASA Johnson Space Center. This system contains disparate human and artificial agents and artifacts. Users and tasks of this system were analyzed. An ethnographic study and a detailed communication pattern analysis were conducted to gain deeper insight and better understanding of the information flow patterns and the organizational memory of the current BME console system. From this study, we identified some major problems and offered recommendations to improve the efficiency and effectiveness of this system. We believe that this analysis methodology can be used in other distributed information systems, such as a healthcare environment.

  10. Modal analysis of a cantilever beam by use of Brillouin based distributed dynamic strain measurements

    NASA Astrophysics Data System (ADS)

    Minardo, Aldo; Coscetta, Agnese; Pirozzi, Salvatore; Bernini, Romeo; Zeni, Luigi

    2012-12-01

    In this work we report an experimental modal analysis of a cantilever beam, carried out by use of a Brillouin optical time-domain analysis (BOTDA) setup operated at a fixed pump-probe frequency shift. The employed technique permitted us to carry out distributed strain measurements along the vibrating beam at a maximum acquisition rate of 108 Hz. The mode shapes of the first three bending modes (1.7, 10.8, 21.6 Hz) were measured for the structure under test. The good agreement between the experimental and numerical results based on a finite-element method (FEM) analysis demonstrates that Brillouin based distributed sensors are well suited to perform the modal analysis of a vibrating structure. This type of analysis may be useful for applications in structural health monitoring where changes in mode shapes are used as indicators of the damage to the structure.

  11. A Weibull model to describe antimicrobial kinetics of oregano and lemongrass essential oils against Salmonella Enteritidis in ground beef during refrigerated storage.

    PubMed

    de Oliveira, Thales Leandro Coutinho; Soares, Rodrigo de Araújo; Piccoli, Roberta Hilsdorf

    2013-03-01

    The antimicrobial effects of oregano (Origanum vulgare L.) and lemongrass (Cymbopogon citratus (DC.) Stapf.) essential oils (EOs) against Salmonella enterica serotype Enteritidis were evaluated in vitro and in inoculated ground bovine meat during refrigerated storage (4±2 °C) for 6 days. The Weibull model was fitted to the bacterial survival/inactivation curves (estimating the p and δ parameters). The minimum inhibitory concentration (MIC) value for both EOs on S. Enteritidis was 3.90 μl/ml. The EO concentrations applied in the ground beef were 3.90, 7.80 and 15.60 μl/g, based on the MIC levels and possible activity reduction by food constituents. Both EOs showed antimicrobial effects at all tested levels, with microbial populations decreasing (p≤0.05) over storage time. Based on the fit-quality parameters (RSS and RSE), the Weibull model is able to describe the inactivation curves of the EOs against S. Enteritidis. The application of EOs in processed meats can be used to control pathogens during refrigerated shelf-life.
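
    The Weibull survival model referred to here is commonly written as log10(N(t)/N0) = -(t/δ)^p, with δ the time to the first decimal reduction and p the curve-shape parameter. Below is a minimal sketch of fitting those two parameters with SciPy; the survival values are invented for illustration and are not the study's measurements.

```python
import numpy as np
from scipy.optimize import curve_fit

def weibull_inactivation(t, delta, p):
    """Weibull survival model: log10(N(t)/N0) = -(t/delta)**p."""
    return -(t / delta) ** p

# Hypothetical storage times (days) and log10 survival ratios.
t = np.array([0.0, 1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
log_survival = np.array([0.0, -0.4, -0.9, -1.3, -1.8, -2.1, -2.5])

(delta, p), _ = curve_fit(weibull_inactivation, t, log_survival,
                          p0=(2.0, 1.0), bounds=(1e-6, np.inf))
print(f"delta = {delta:.2f} days, p = {p:.2f}")
# p > 1 indicates a shoulder (convex curve); p < 1 indicates tailing (concave).
```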

  12. Sensitivity Analysis of CLIMEX Parameters in Modelling Potential Distribution of Lantana camara L.

    PubMed Central

    Taylor, Subhashni; Kumar, Lalit

    2012-01-01

    A process-based niche model of L. camara L. (lantana), a highly invasive shrub species, was developed to estimate its potential distribution using CLIMEX. Model development was carried out using its native and invasive distribution, and validation was carried out with the extensive Australian distribution. A good fit was observed, with 86.7% of herbarium specimens collected in Australia occurring within the suitable and highly suitable categories. A sensitivity analysis was conducted to identify the model parameters that had the most influence on lantana distribution. The changes in suitability were assessed by mapping the regions where the distribution changed with each parameter alteration. This allowed an assessment of where, within Australia, the modification of each parameter was having the most impact, particularly in terms of the suitable and highly suitable locations. The sensitivity of various parameters was also evaluated by calculating the changes in area within the suitable and highly suitable categories. The limiting low temperature (DV0), limiting high temperature (DV3) and limiting low soil moisture (SM0) showed the highest sensitivity to change. The other model parameters were relatively insensitive to change. Highly sensitive parameters require extensive research and data collection to be fitted accurately in species distribution models. The results from this study can inform more cost-effective development of species distribution models for lantana. Such models form an integral part of the management of invasive species and the results can be used to streamline data collection requirements for potential distribution modelling. PMID:22815881

  13. Sensitivity analysis of CLIMEX parameters in modelling potential distribution of Lantana camara L.

    PubMed

    Taylor, Subhashni; Kumar, Lalit

    2012-01-01

    A process-based niche model of L. camara L. (lantana), a highly invasive shrub species, was developed to estimate its potential distribution using CLIMEX. Model development was carried out using its native and invasive distribution, and validation was carried out with the extensive Australian distribution. A good fit was observed, with 86.7% of herbarium specimens collected in Australia occurring within the suitable and highly suitable categories. A sensitivity analysis was conducted to identify the model parameters that had the most influence on lantana distribution. The changes in suitability were assessed by mapping the regions where the distribution changed with each parameter alteration. This allowed an assessment of where, within Australia, the modification of each parameter was having the most impact, particularly in terms of the suitable and highly suitable locations. The sensitivity of various parameters was also evaluated by calculating the changes in area within the suitable and highly suitable categories. The limiting low temperature (DV0), limiting high temperature (DV3) and limiting low soil moisture (SM0) showed the highest sensitivity to change. The other model parameters were relatively insensitive to change. Highly sensitive parameters require extensive research and data collection to be fitted accurately in species distribution models. The results from this study can inform more cost-effective development of species distribution models for lantana. Such models form an integral part of the management of invasive species and the results can be used to streamline data collection requirements for potential distribution modelling.

  14. Bayesian analysis for nonlinear mixed-effects models under heavy-tailed distributions.

    PubMed

    De la Cruz, Rolando

    2014-01-01

    A common assumption in nonlinear mixed-effects models is the normality of both random effects and within-subject errors. However, such assumptions make inferences vulnerable to the presence of outliers. More flexible distributions are therefore necessary for modeling both sources of variability in this class of models. In the present paper, I consider an extension of the nonlinear mixed-effects models in which random effects and within-subject errors are assumed to be distributed according to a rich class of parametric models that are often used for robust inference. The class of distributions I consider is the scale mixture of multivariate normal distributions, which consists of a wide range of symmetric and continuous distributions and includes heavy-tailed multivariate distributions such as the Student's t, slash, and contaminated normal. With the scale mixture of multivariate normal distributions, robustification is achieved through the tail behavior of the different distributions. A Bayesian framework is adopted, and MCMC is used to carry out the posterior analysis. Model comparison using different criteria is considered. The procedures are illustrated using a real dataset from a pharmacokinetic study. I contrast results from the normal and robust models and show how the implementation can be used to detect outliers.

  15. The Analysis of the Strength, Distribution and Direction for the EEG Phase Synchronization by Musical Stimulus

    NASA Astrophysics Data System (ADS)

    Ogawa, Yutaro; Ikeda, Akira; Kotani, Kiyoshi; Jimbo, Yasuhiko

    In this study, we propose an EEG phase synchronization analysis that includes not only the average strength of synchronization but also its distribution and direction under emotions evoked by musical stimuli. The experiment is performed with two different musical stimuli, evoking happiness or sadness, each lasting 150 seconds. It is found that although the average strength of synchronization shows no difference between the right and left sides of the frontal lobe during the happy stimulus, the distribution and directions show significant differences. Therefore, the proposed analysis is useful for detecting emotional state because it provides information that cannot be obtained from the average strength of synchronization alone.

  16. Reliability and availability analysis of a 10 kW@20 K helium refrigerator

    NASA Astrophysics Data System (ADS)

    Li, J.; Xiong, L. Y.; Liu, L. Q.; Wang, H. R.; Wang, B. M.

    2017-02-01

    A 10 kW@20 K helium refrigerator has been established at the Technical Institute of Physics and Chemistry, Chinese Academy of Sciences. To evaluate and improve this refrigerator's reliability and availability, a reliability and availability analysis is performed. According to the mission profile of this refrigerator, a functional analysis is performed. The failure data of the refrigerator components are collected, and failure rate distributions are fitted using the software Weibull++ V10.0. A Failure Modes, Effects & Criticality Analysis (FMECA) is performed, and the critical components with higher risks are identified. The software BlockSim V9.0 is used to calculate the reliability and the availability of this refrigerator. The result indicates that the compressors, turbine and vacuum pump are the critical components and the key units of this refrigerator. Mitigation actions with respect to design, testing, maintenance and operation are proposed to reduce the major and medium risks.

  17. Extreme value statistics analysis of fracture strengths of a sintered silicon nitride failing from pores

    NASA Technical Reports Server (NTRS)

    Chao, Luen-Yuan; Shetty, Dinesh K.

    1992-01-01

    Statistical analysis and correlation between pore-size distribution and fracture strength distribution using the theory of extreme-value statistics is presented for a sintered silicon nitride. The pore-size distribution on a polished surface of this material was characterized using an automatic optical image analyzer. The distribution measured on the two-dimensional plane surface was transformed to a population (volume) distribution using the Schwartz-Saltykov diameter method. The population pore-size distribution and the distribution of the pore size at the fracture origin were correlated by extreme-value statistics. The fracture strength distribution was then predicted from the extreme-value pore-size distribution, using a linear elastic fracture mechanics model of an annular crack around a pore and the fracture toughness of the ceramic. The predicted strength distribution was in good agreement with strength measurements in bending. In particular, the extreme-value statistics analysis explained the nonlinear trend in the linearized Weibull plot of measured strengths without postulating a lower-bound strength.
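
    For readers unfamiliar with the linearized Weibull plot mentioned in the closing sentence, the sketch below builds one from a set of strengths: with F the median-rank failure probability, a two-parameter Weibull law gives the straight line ln(-ln(1-F)) = m ln(σ) - m ln(σ0). The strength values are hypothetical, not the paper's data.

```python
import numpy as np

# Hypothetical bend-strength data in MPa (not the paper's measurements).
strengths = np.sort(np.array([512., 547., 561., 589., 603.,
                              618., 634., 655., 671., 702.]))
n = strengths.size

# Median-rank estimate of the failure probability of the i-th ordered strength.
F = (np.arange(1, n + 1) - 0.3) / (n + 0.4)

# Linearized Weibull coordinates: ln(-ln(1-F)) versus ln(strength).
x = np.log(strengths)
y = np.log(-np.log(1.0 - F))

# The slope of a least-squares line estimates the Weibull modulus m;
# the intercept gives the characteristic strength sigma0 = exp(-c/m).
m, c = np.polyfit(x, y, 1)
print(f"Weibull modulus m ~ {m:.1f}, "
      f"characteristic strength ~ {np.exp(-c / m):.0f} MPa")
```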

  18. Consideration of tip speed limitations in preliminary analysis of minimum COE wind turbines

    NASA Astrophysics Data System (ADS)

    Cuerva-Tejero, A.; Yeow, T. S.; Lopez-Garcia, O.; Gallego-Castillo, C.

    2014-12-01

    A relation between the Cost Of Energy, COE, the maximum allowed tip speed, and the rated wind speed is obtained for wind turbines with a given goal rated power. The wind regime is characterised by the corresponding parameters of the probability density function of wind speed. The non-dimensional characteristics of the rotor: the number of blades and the blade radial distributions of local solidity, twist angle, and airfoil type, play the role of parameters in the mentioned relation. The COE is estimated using a cost model commonly used by designers. This cost model requires basic design data such as the rotor radius and the ratio between the hub height and the rotor radius. Certain design options, DO, related to the technology of the power plant, tower and blades are also required as inputs. The function obtained for the COE can be explored to find those values of rotor radius that give rise to minimum cost of energy for a given wind regime as the tip speed limitation changes. The analysis reveals that iso-COE lines evolve parallel to iso-radius lines for large values of the tip speed limit, but that this is not the case for small values of the tip speed limit. It is concluded that, as the tip speed limit decreases, the optimum decision for keeping minimum COE values can be: a) reducing the rotor radius for sites with a high Weibull scale parameter or b) increasing the rotor radius for sites with a low Weibull scale parameter.
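
    As a rough numerical companion to this analysis, the sketch below computes the capacity factor for a site whose wind regime is a Weibull probability density, by integrating an idealized power curve against the pdf. The power curve, the Weibull parameters (k = 2, c = 8 m/s), and the 2 MW rating are all illustrative assumptions, not values from the paper.

```python
import numpy as np

def weibull_pdf(v, k, c):
    """Weibull wind-speed density with shape k and scale c (m/s)."""
    return (k / c) * (v / c) ** (k - 1) * np.exp(-((v / c) ** k))

def power_curve(v, rated=2.0e6, v_in=3.0, v_rated=12.0, v_out=25.0):
    """Idealized power curve in watts: cubic ramp below rated, flat above."""
    ramp = rated * (v ** 3 - v_in ** 3) / (v_rated ** 3 - v_in ** 3)
    p = np.where((v >= v_in) & (v < v_rated), ramp, 0.0)
    return np.where((v >= v_rated) & (v <= v_out), rated, p)

# Numerical integration of power against the wind-speed density.
v = np.linspace(0.0, 30.0, 3001)
dv = v[1] - v[0]
mean_power = np.sum(power_curve(v) * weibull_pdf(v, k=2.0, c=8.0)) * dv
print(f"capacity factor ~ {mean_power / 2.0e6:.2f}, "
      f"AEP ~ {mean_power * 8760.0 / 1e6:.0f} MWh/yr")
```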

  19. Strategic Sequencing for State Distributed PV Policies: A Quantitative Analysis of Policy Impacts and Interactions

    SciTech Connect

    Doris, E.; Krasko, V.A.

    2012-10-01

    State and local policymakers show increasing interest in spurring the development of customer-sited distributed generation (DG), in particular solar photovoltaic (PV) markets. Prompted by that interest, this analysis examines the use of state policy as a tool to support the development of a robust private investment market. This analysis builds on previous studies that focus on government subsidies to reduce installation costs of individual projects and provides an evaluation of the impacts of policies on stimulating private market development.

  20. Scaling Analysis of Time Distribution between Successive Earthquakes in Aftershock Sequences

    NASA Astrophysics Data System (ADS)

    Marekova, Elisaveta

    2016-08-01

    The earthquake inter-event time distribution is studied using catalogs for different recent aftershock sequences. For aftershock sequences following the Modified Omori's Formula (MOF), it seems clear that the inter-event distribution is a power law. The parameters of this law are estimated and prove to be higher than the calculated value (2 - 1/p). Based on the analysis of the catalogs, it is determined that the probability densities of the inter-event time distribution collapse onto a single master curve when the data are rescaled with the instantaneous intensity, R(t; M_th), defined by the MOF. The curve is approximated by a gamma distribution. The collapse of the data provides a clear view of aftershock-occurrence self-similarity.
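
    The rescaling-and-collapse step can be imitated in a few lines: divide each sequence's inter-event times by its mean rate (a crude stand-in for the instantaneous intensity R(t; M_th)) and fit a gamma distribution to the rescaled values. The data here are simulated, purely to show the mechanics.

```python
import numpy as np
from scipy import stats

# Simulated inter-event times (hours) standing in for one catalog window.
rng = np.random.default_rng(1)
tau = rng.gamma(shape=0.7, scale=3.0, size=500)

# Rescale by the mean rate, a crude stand-in for R(t; M_th) from the MOF.
x = tau / tau.mean()

# Fit a gamma distribution to the rescaled times (location fixed at zero).
shape, loc, scale = stats.gamma.fit(x, floc=0.0)
print(f"gamma shape ~ {shape:.2f}, scale ~ {scale:.2f}")
```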

  1. EMD-WVD time-frequency distribution for analysis of multi-component signals

    NASA Astrophysics Data System (ADS)

    Chai, Yunzi; Zhang, Xudong

    2016-10-01

    A time-frequency distribution (TFD) is a two-dimensional function that indicates the time-varying frequency content of a one-dimensional signal. The Wigner-Ville distribution (WVD) is an important and effective time-frequency analysis method, and it efficiently shows the characteristics of a mono-component signal. However, a major drawback is the extra cross-terms that appear when multi-component signals are analyzed with the WVD. To eliminate the cross-terms, we first decompose the signal into single-frequency components - Intrinsic Mode Functions (IMFs) - using Empirical Mode Decomposition (EMD), and then use the WVD to analyze each IMF. In this paper, we define this new time-frequency distribution as EMD-WVD. Experimental results show that the proposed time-frequency method can solve the cross-term problem effectively and improve the accuracy of WVD time-frequency analysis.
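
    A minimal sketch of the EMD-WVD idea is given below. It assumes the PyEMD package (distributed on PyPI as EMD-signal) for the decomposition and uses a naive O(N²) pseudo-WVD written from scratch; a production implementation would handle the factor-of-two frequency scaling and windowing more carefully.

```python
import numpy as np
from scipy.signal import hilbert
from PyEMD import EMD  # PyPI package "EMD-signal"; an assumed dependency

def wvd(x):
    """Naive O(N^2) discrete Wigner-Ville distribution of a real signal."""
    z = hilbert(x)            # analytic signal suppresses negative frequencies
    N = len(z)
    W = np.zeros((N, N))
    for n in range(N):
        m = min(n, N - 1 - n)
        k = np.arange(-m, m + 1)
        acf = np.zeros(N, dtype=complex)
        acf[k % N] = z[n + k] * np.conj(z[n - k])  # instantaneous autocorrelation
        W[:, n] = np.fft.fft(acf).real             # FFT over lag -> frequency
    return W

# Two-component test signal: 50 Hz + 120 Hz tones sampled at 1 kHz.
t = np.arange(0.0, 0.5, 1e-3)
sig = np.sin(2 * np.pi * 50 * t) + np.sin(2 * np.pi * 120 * t)

imfs = EMD()(sig)                    # decompose into mono-component IMFs
tfd = sum(wvd(imf) for imf in imfs)  # EMD-WVD: sum per-IMF WVDs, no cross-terms
print(tfd.shape)
```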

  2. Analysis of energy disposal - Thermodynamic aspects of the entropy deficiency of a product state distribution

    NASA Technical Reports Server (NTRS)

    Levine, R. D.; Bernstein, R. B.

    1973-01-01

    A thermodynamic-like approach to the characterization of product state distributions is outlined. A moment analysis of the surprisal and the entropy deficiency is presented from a statistical mechanical viewpoint. The role of reactant state selection is discussed using the 'state function' property of the entropy.

  3. Systematic analysis of transverse momentum distribution and non-extensive thermodynamics theory

    SciTech Connect

    Sena, I.; Deppman, A.

    2013-03-25

    A systematic analysis of transverse momentum distribution of hadrons produced in ultrarelativistic p+p and A+A collisions is presented. We investigate the effective temperature and the entropic parameter from the non-extensive thermodynamic theory of strong interaction. We conclude that the existence of a limiting effective temperature and of a limiting entropic parameter is in accordance with experimental data.

  4. An Analysis of Variance Approach for the Estimation of Response Time Distributions in Tests

    ERIC Educational Resources Information Center

    Attali, Yigal

    2010-01-01

    Generalizability theory and analysis of variance methods are employed, together with the concept of objective time pressure, to estimate response time distributions and the degree of time pressure in timed tests. By estimating response time variance components due to person, item, and their interaction, and fixed effects due to item types and…

  5. An investigation on the intra-sample distribution of cotton color by using image analysis

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The colorimeter principle is widely used to measure cotton color. This method provides the sample’s color grade; but the result does not include information about the color distribution and any variation within the sample. We conducted an investigation that used image analysis method to study the ...

  6. Residence Time Distribution Measurement and Analysis of Pilot-Scale Pretreatment Reactors for Biofuels Production: Preprint

    SciTech Connect

    Sievers, D.; Kuhn, E.; Tucker, M.; Stickel, J.; Wolfrum, E.

    2013-06-01

    Measurement and analysis of residence time distribution (RTD) data is the focus of this study where data collection methods were developed specifically for the pretreatment reactor environment. Augmented physical sampling and automated online detection methods were developed and applied. Both the measurement techniques themselves and the produced RTD data are presented and discussed.

  7. The Robustness of 2SLS Estimation of a Non-normally Distributed Confirmatory Factor Analysis Model.

    ERIC Educational Resources Information Center

    Brown, R. L.

    1990-01-01

    A Monte Carlo study was conducted to assess the robustness of the limited information two-stage least squares (2SLS) estimation procedure on a confirmatory factor analysis model with nonnormal distributions. Full information maximum likelihood methods were used for comparison. One hundred model replications were used to generate data. (TJH)

  8. Strength distributions of adhesive bonded and adhesive/rivet combined joints

    NASA Astrophysics Data System (ADS)

    Imanaka, Makoto; Haraga, Kosuke; Nishikawa, Tetsuya

    1992-11-01

    The tensile and shear strengths of adhesive and adhesive/rivet combined joints are statistically evaluated, and the probability of failure is calculated for these two types of joints. Attention is given to the effects of the adhesive/rivet combination on the mean tensile shear strength and the coefficient of variation. The adhesive joint's strength distribution was well approximated by a Weibull or doubly-exponential distribution function; the tensile shear strength is significantly improved by the combination with rivets.

  9. Simulation of flight maneuver-load distributions by utilizing stationary, non-Gaussian random load histories

    NASA Technical Reports Server (NTRS)

    Leybold, H. A.

    1971-01-01

    Random numbers were generated with the aid of a digital computer and transformed such that the probability density function of a discrete random load history composed of these random numbers had one of the following non-Gaussian distributions: Poisson, binomial, log-normal, Weibull, and exponential. The resulting random load histories were analyzed to determine their peak statistics and were compared with cumulative peak maneuver-load distributions for fighter and transport aircraft in flight.
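
    The same kind of experiment is easy to reproduce today with NumPy's random generator: draw discrete load histories from each of the five named non-Gaussian distributions and extract their peaks. All parameter values below are illustrative, not those of the original study.

```python
import numpy as np

rng = np.random.default_rng(42)
N = 100_000  # length of each discrete random load history

# Load histories drawn from the five non-Gaussian distributions named above;
# all parameter values are illustrative.
histories = {
    "poisson":     rng.poisson(lam=5.0, size=N),
    "binomial":    rng.binomial(n=20, p=0.3, size=N),
    "log-normal":  rng.lognormal(mean=0.0, sigma=0.5, size=N),
    "weibull":     rng.weibull(a=1.5, size=N),
    "exponential": rng.exponential(scale=1.0, size=N),
}

for name, h in histories.items():
    # A peak is a sample strictly larger than both of its neighbours.
    mid = h[1:-1]
    peaks = mid[(mid > h[:-2]) & (mid > h[2:])]
    print(f"{name:12s} peaks: {peaks.size:6d}, mean peak = {peaks.mean():.2f}")
```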

  10. Age Dating Fluvial Sediment Storage Reservoirs to Construct Sediment Waiting Time Distributions

    NASA Astrophysics Data System (ADS)

    Skalak, K.; Pizzuto, J. E.; Benthem, A.; Karwan, D. L.; Mahan, S.

    2015-12-01

    Suspended sediment transport is an important geomorphic process that can often control the transport of nutrients and contaminants. The time a particle spends in storage remains a critical knowledge gap in understanding particle trajectories through landscapes. We dated floodplain deposits in South River, VA, using fallout radionuclides (Pb-210, Cs-137), optically stimulated luminescence (OSL), and radiocarbon dating to determine sediment ages and construct sediment waiting time distributions. We have a total of 14 age dates in two eroding banks. We combine these age dates with a well-constrained history of mercury concentrations on suspended sediment in the river from an industrial release. Ages from fallout radionuclides document sedimentation from the early 1900s to the present, and agree with the history of mercury contamination. OSL dates span approximately 200 to 17,000 years old. We performed a standard Weibull analysis of nonexceedance to construct a waiting time distribution of floodplain sediment for the South River. The mean waiting time for floodplain sediment is 2930 years, while the median is approximately 710 years. When the floodplain waiting time distribution is combined with the waiting time distribution for in-channel sediment storage (available from previous studies), the mean waiting time shifts to approximately 680 years, suggesting that quantifying sediment waiting times for both channel and floodplain storage is critical in advancing knowledge of particle trajectories through watersheds.
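
    A standard Weibull analysis of nonexceedance along these lines can be sketched as follows: order the ages, assign Weibull plotting positions i/(n+1), and fit a two-parameter Weibull distribution whose mean and median summarize the waiting time. The age values are hypothetical, not the South River dates.

```python
import numpy as np
from scipy import stats

# Hypothetical floodplain sediment ages in years (not the South River dates).
ages = np.sort(np.array([120., 310., 540., 700., 980., 1500., 2200., 3100.,
                         5400., 7800., 9500., 12000., 15000., 17000.]))

# Weibull plotting position: nonexceedance probability of the i-th ordered age.
F = np.arange(1, ages.size + 1) / (ages.size + 1.0)

# Fit a two-parameter Weibull (location fixed at zero) to the waiting times.
k, loc, lam = stats.weibull_min.fit(ages, floc=0.0)
print(f"shape k ~ {k:.2f}, scale ~ {lam:.0f} yr")
print(f"mean ~ {stats.weibull_min.mean(k, loc, lam):.0f} yr, "
      f"median ~ {stats.weibull_min.median(k, loc, lam):.0f} yr")

# Agreement between plotting positions and the fitted distribution.
print("max |F_empirical - F_fitted| =",
      np.abs(F - stats.weibull_min.cdf(ages, k, loc, lam)).max().round(3))
```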

  11. Long-term mechanical life testing of polymeric post insulators for distribution and a comparison to porcelain

    SciTech Connect

    Cherney, E.A.

    1988-07-01

    The paper presents the results and analyses of long-term cantilever strength tests on polymeric line post insulators. The time-to-failure data for static cantilever loads are represented by the Weibull distribution. The life distribution, obtained from the maximum likelihood estimates of the accelerated failure times, fits an exponential model. An extrapolation of the life distribution to normal loads provides an estimate of the strength rating and mechanical equivalence to porcelain line post insulators.

  12. Detailed Analysis of the Interoccurrence Time Statistics in Seismic Activity

    NASA Astrophysics Data System (ADS)

    Tanaka, Hiroki; Aizawa, Yoji

    2017-02-01

    The interoccurrence time statistics of seismicity is studied theoretically as well as numerically by taking into account the conditional probability and the correlations among many earthquakes in different magnitude levels. It is known that the interoccurrence time statistics is well approximated by the Weibull distribution, but more detailed information about the interoccurrence times can be obtained from the analysis of the conditional probability. Firstly, we propose the Embedding Equation Theory (EET), where the conditional probability is described by two kinds of correlation coefficients; one is the magnitude correlation and the other is the inter-event time correlation. Furthermore, the scaling law of each correlation coefficient is clearly determined from the numerical data analysis carried out with the Preliminary Determination of Epicenter (PDE) Catalog and the Japan Meteorological Agency (JMA) Catalog. Secondly, the EET is examined to derive the magnitude dependence of the interoccurrence time statistics, and the multi-fractal relation is successfully formulated. Theoretically we cannot prove the universality of the multi-fractal relation in seismic activity; nevertheless, the theoretical results reproduce all the numerical data in our analysis well, and several common features or invariant aspects are clearly observed. In particular, in the case of stationary ensembles the multi-fractal relation seems to obey an invariant curve; furthermore, in the case of non-stationary (moving-time) ensembles for the aftershock regime, the multi-fractal relation seems to satisfy a certain invariant curve at any moving time. It is emphasized that the multi-fractal relation plays an important role in unifying the statistical laws of seismicity: the Gutenberg-Richter law and the Weibull distribution are in fact unified in the multi-fractal relation. Some universality conjectures regarding seismicity are briefly discussed.

  13. Efficient, Distributed and Interactive Neuroimaging Data Analysis Using the LONI Pipeline

    PubMed Central

    Dinov, Ivo D.; Van Horn, John D.; Lozev, Kamen M.; Magsipoc, Rico; Petrosyan, Petros; Liu, Zhizhong; MacKenzie-Graham, Allan; Eggert, Paul; Parker, Douglas S.; Toga, Arthur W.

    2009-01-01

    The LONI Pipeline is a graphical environment for construction, validation and execution of advanced neuroimaging data analysis protocols (Rex et al., 2003). It enables automated data format conversion, allows Grid utilization, facilitates data provenance, and provides a significant library of computational tools. There are two main advantages of the LONI Pipeline over other graphical analysis workflow architectures. It is built as a distributed Grid computing environment and permits efficient tool integration, protocol validation and broad resource distribution. To integrate existing data and computational tools within the LONI Pipeline environment, no modification of the resources themselves is required. The LONI Pipeline provides several types of process submissions based on the underlying server hardware infrastructure. Only workflow instructions and references to data, executable scripts and binary instructions are stored within the LONI Pipeline environment. This makes it portable, computationally efficient, distributed and independent of the individual binary processes involved in pipeline data-analysis workflows. We have expanded the LONI Pipeline (V.4.2) to include server-to-server (peer-to-peer) communication and a 3-tier failover infrastructure (Grid hardware, Sun Grid Engine/Distributed Resource Management Application API middleware, and the Pipeline server). Additionally, the LONI Pipeline provides three layers of background-server executions for all users/sites/systems. These new LONI Pipeline features facilitate resource-interoperability, decentralized computing, construction and validation of efficient and robust neuroimaging data-analysis workflows. Using brain imaging data from the Alzheimer's Disease Neuroimaging Initiative (Mueller et al., 2005), we demonstrate integration of disparate resources, graphical construction of complex neuroimaging analysis protocols and distributed parallel computing. The LONI Pipeline, its features, specifications

  14. Fractal analysis of the dark matter and gas distributions in the Mare-Nostrum universe

    SciTech Connect

    Gaite, José

    2010-03-01

    We develop a method of multifractal analysis of N-body cosmological simulations that improves on the customary counts-in-cells method by taking special care of the effects of discreteness and large scale homogeneity. The analysis of the Mare-Nostrum simulation with our method provides strong evidence of self-similar multifractal distributions of dark matter and gas, with a halo mass function that is of Press-Schechter type but has a power-law exponent -2, as corresponds to a multifractal. Furthermore, our analysis shows that the dark matter and gas distributions are indistinguishable as multifractals. To determine if there is any gas biasing, we calculate the cross-correlation coefficient, with negative but inconclusive results. Hence, we develop an effective Bayesian analysis connected with information theory, which clearly demonstrates that the gas is biased in a long range of scales, up to the scale of homogeneity. However, entropic measures related to the Bayesian analysis show that this gas bias is small (in a precise sense) and is such that the fractal singularities of both distributions coincide and are identical. We conclude that this common multifractal cosmic web structure is determined by the dynamics and is independent of the initial conditions.

  15. Study of Solid State Drives performance in PROOF distributed analysis system

    NASA Astrophysics Data System (ADS)

    Panitkin, S. Y.; Ernst, M.; Petkus, R.; Rind, O.; Wenaus, T.

    2010-04-01

    Solid state drives (SSDs) are a promising storage technology for High Energy Physics parallel analysis farms. Their combination of low random access time and relatively high read speed is very well suited to situations where multiple jobs concurrently access data located on the same drive. They also have lower energy consumption and higher vibration tolerance than hard disk drives (HDDs), which makes them an attractive choice in many applications ranging from personal laptops to large analysis farms. The Parallel ROOT Facility (PROOF) is a distributed analysis system which makes it possible to exploit the inherent event-level parallelism of high energy physics data. PROOF is especially efficient together with distributed local storage systems like Xrootd, when data are distributed over computing nodes. In such an architecture the local disk subsystem I/O performance becomes a critical factor, especially when computing nodes use multi-core CPUs. We will discuss our experience with SSDs in the PROOF environment. We will compare the performance of HDDs with SSDs in I/O-intensive analysis scenarios. In particular we will discuss PROOF system performance scaling with the number of simultaneously running analysis jobs.

  16. Do Insect Populations Die at Constant Rates as They Become Older? Contrasting Demographic Failure Kinetics with Respect to Temperature According to the Weibull Model.

    PubMed

    Damos, Petros; Soulopoulou, Polyxeni

    2015-01-01

    Temperature implies contrasting biological causes of demographic aging in poikilotherms. In this work, we used reliability theory to describe the consistency of mortality with age in moth populations and to show that differentiation in hazard rates is related to extrinsic environmental causes such as temperature. Moreover, experiments that manipulate extrinsic mortality were used to distinguish temperature-related death rates and assess the pertinence of the Weibull aging model. The Newton-Raphson optimization method was applied to calculate parameters for small samples of ages at death by estimating the maximum likelihood surfaces using scored gradient vectors and the Hessian matrix. The study reveals for the first time that the Weibull function is able to describe contrasting biological causes of demographic aging for moth populations maintained at different temperature regimes. We demonstrate that under favourable conditions the insect death rate accelerates as age advances, in contrast to the extreme temperatures at which each individual drifts toward death in a linear fashion and has a constant chance of passing away. Moreover, the slope of the hazard rates shifts towards a constant initial rate, a pattern demonstrated by systems which are not wearing out (e.g. non-aging), since the failure, or death, is a random event independent of time. This finding may appear surprising because, traditionally, it was mostly thought as a rule that in an aging population the force of mortality increases exponentially until all individuals have died. Moreover, in relation to other studies, we have not observed any typical decelerating aging patterns at late life (mortality leveling-off), but rather accelerated hazard rates at optimum temperatures and a stabilized increase at the extremes. In most cases, the increase in aging-related mortality was simulated reasonably well by the Weibull survivorship model that is applied. Moreover, the semi-log probability hazard rate model
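
    The contrast the authors draw maps directly onto the Weibull shape parameter k of the hazard function h(t) = (k/λ)(t/λ)^(k-1): k > 1 gives a hazard that accelerates with age, while k = 1 reduces to a constant, age-independent risk. The short sketch below, with illustrative parameter values only, makes that behaviour explicit.

```python
import numpy as np

def weibull_hazard(t, k, lam):
    """Weibull hazard rate h(t) = (k/lam) * (t/lam)**(k-1)."""
    return (k / lam) * (t / lam) ** (k - 1)

t = np.linspace(1.0, 30.0, 5)   # ages in days, illustrative grid

# k > 1: hazard accelerates with age (favourable temperatures in the study);
# k = 1: constant hazard, i.e. death is a random, age-independent event
# (the pattern reported at the temperature extremes).
for k in (2.5, 1.0):
    print(f"k = {k}:", np.round(weibull_hazard(t, k, lam=20.0), 3))
```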

  17. Do Insect Populations Die at Constant Rates as They Become Older? Contrasting Demographic Failure Kinetics with Respect to Temperature According to the Weibull Model

    PubMed Central

    Damos, Petros; Soulopoulou, Polyxeni

    2015-01-01

    Temperature implies contrasting biological causes of demographic aging in poikilotherms. In this work, we used reliability theory to describe the consistency of mortality with age in moth populations and to show that differentiation in hazard rates is related to extrinsic environmental causes such as temperature. Moreover, experiments that manipulate extrinsic mortality were used to distinguish temperature-related death rates and assess the pertinence of the Weibull aging model. The Newton-Raphson optimization method was applied to calculate parameters for small samples of ages at death by estimating the maximum likelihood surfaces using scored gradient vectors and the Hessian matrix. The study reveals for the first time that the Weibull function is able to describe contrasting biological causes of demographic aging for moth populations maintained at different temperature regimes. We demonstrate that under favourable conditions the insect death rate accelerates as age advances, in contrast to the extreme temperatures at which each individual drifts toward death in a linear fashion and has a constant chance of passing away. Moreover, the slope of the hazard rates shifts towards a constant initial rate, a pattern demonstrated by systems which are not wearing out (e.g. non-aging), since the failure, or death, is a random event independent of time. This finding may appear surprising because, traditionally, it was mostly thought as a rule that in an aging population the force of mortality increases exponentially until all individuals have died. Moreover, in relation to other studies, we have not observed any typical decelerating aging patterns at late life (mortality leveling-off), but rather accelerated hazard rates at optimum temperatures and a stabilized increase at the extremes. In most cases, the increase in aging-related mortality was simulated reasonably well by the Weibull survivorship model that is applied. Moreover, the semi-log probability hazard rate model

  18. Radial distribution function imaging by STEM diffraction: Phase mapping and analysis of heterogeneous nanostructured glasses.

    PubMed

    Mu, Xiaoke; Wang, Di; Feng, Tao; Kübel, Christian

    2016-09-01

    Characterizing heterogeneous nanostructured amorphous materials is a challenging topic because of the difficulty of resolving disordered atomic arrangements at the nanometer scale. We developed a new transmission electron microscopy (TEM) method to enable phase analysis and mapping of heterogeneous amorphous structures, combining scanning TEM (STEM) diffraction mapping, radial distribution function (RDF) analysis, and hyperspectral analysis. This method was applied to an amorphous zirconium oxide and zirconium iron multilayer system and showed extreme sensitivity to small variations in atomic packing. This approach helps in understanding local structure variations in glassy composite materials and provides new insights for correlating the structure and properties of glasses.

  19. Bivariate Frequency Analysis with Nonstationary Gumbel/GEV Marginal Distributions for Rainfall Event

    NASA Astrophysics Data System (ADS)

    Joo, Kyungwon; Kim, Sunghun; Kim, Hanbeen; Ahn, Hyunjun; Heo, Jun-Haeng

    2016-04-01

    Multivariate frequency analysis of hydrological data has been developing recently. In particular, the copula model has been used as an effective method because it places no limitation on the choice of marginal distributions. Time-series rainfall data can be characterized as rainfall events using an inter-event time definition, and each rainfall event has a rainfall depth and duration. In addition, changes in rainfall depth have been studied recently due to climate change. Nonstationary (time-varying) Gumbel and Generalized Extreme Value (GEV) distributions have been developed, and their performances have been investigated in many studies. In the current study, bivariate frequency analysis was performed for rainfall depth and duration using an Archimedean copula on stationary and nonstationary hourly rainfall data to consider the effect of climate change. The parameter of the copula model is estimated by the inference function for margins (IFM) method, and stationary/nonstationary Gumbel and GEV distributions are used as marginal distributions. As a result, level curves of the copula model are obtained, and a goodness-of-fit test is performed to choose the appropriate marginal distribution among the applied stationary and nonstationary Gumbel and GEV distributions.

  20. A wavelet-based statistical analysis of FMRI data: I. motivation and data distribution modeling.

    PubMed

    Dinov, Ivo D; Boscardin, John W; Mega, Michael S; Sowell, Elizabeth L; Toga, Arthur W

    2005-01-01

    We propose a new method for statistical analysis of functional magnetic resonance imaging (fMRI) data. The discrete wavelet transformation is employed as a tool for efficient and robust signal representation. We use structural magnetic resonance imaging (MRI) and fMRI to empirically estimate the distribution of the wavelet coefficients of the data both across individuals and spatial locations. An anatomical subvolume probabilistic atlas is used to tessellate the structural and functional signals into smaller regions each of which is processed separately. A frequency-adaptive wavelet shrinkage scheme is employed to obtain essentially optimal estimations of the signals in the wavelet space. The empirical distributions of the signals on all the regions are computed in a compressed wavelet space. These are modeled by heavy-tail distributions because their histograms exhibit slower tail decay than the Gaussian. We discovered that the Cauchy, Bessel K Forms, and Pareto distributions provide the most accurate asymptotic models for the distribution of the wavelet coefficients of the data. Finally, we propose a new model for statistical analysis of functional MRI data using this atlas-based wavelet space representation. In the second part of our investigation, we will apply this technique to analyze a large fMRI dataset involving repeated presentation of sensory-motor response stimuli in young, elderly, and demented subjects.

  1. Analysis of variance of communication latencies in anesthesia: comparing means of multiple log-normal distributions.

    PubMed

    Ledolter, Johannes; Dexter, Franklin; Epstein, Richard H

    2011-10-01

    Anesthesiologists rely on communication over periods of minutes. The analysis of latencies between when messages are sent and responses are obtained is an essential component of practical and regulatory assessment of clinical and managerial decision-support systems. Latency data, including times for anesthesia providers to respond to messages, have moderate (n > 20) sample sizes, large coefficients of variation (e.g., 0.60 to 2.50), and heterogeneous coefficients of variation among groups. Highly inaccurate results are obtained both by performing analysis of variance (ANOVA) on the time scale and by performing it on the log scale and then taking the exponential of the result. To overcome these difficulties, one can calculate P values and confidence intervals for mean latencies based on log-normal distributions using generalized pivotal methods. In addition, fixed-effects 2-way ANOVAs can be extended to the comparison of means of log-normal distributions. Pivotal inference does not assume that the coefficients of variation of the studied log-normal distributions are the same, and can be used to assess the proportional effects of 2 factors and their interaction. Latency data can also include a human behavioral component (e.g., completing another activity first), resulting in a bimodal distribution in the log-domain (i.e., a mixture of distributions). An ANOVA can be performed on a homogeneous segment of the data, followed by a single-group analysis applied to all or portions of the data using a robust method insensitive to the probability distribution.
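
    The generalized pivotal approach for a log-normal mean can be sketched in a few lines: draw pivots for σ² from a chi-square and for μ from a normal, then transform to the mean exp(μ + σ²/2). The latency values below are invented for illustration; only the pivotal recipe reflects the method described.

```python
import numpy as np

def lognormal_mean_gpq(latencies, n_draws=100_000, seed=0):
    """Generalized pivotal quantities for the mean of a log-normal sample."""
    y = np.log(latencies)
    n, ybar, s2 = y.size, y.mean(), y.var(ddof=1)
    rng = np.random.default_rng(seed)
    # Pivot for sigma^2 from a chi-square; pivot for mu given sigma^2.
    g_sigma2 = (n - 1) * s2 / rng.chisquare(n - 1, n_draws)
    g_mu = ybar - rng.standard_normal(n_draws) * np.sqrt(g_sigma2 / n)
    return np.exp(g_mu + g_sigma2 / 2.0)   # pivotal draws for E[latency]

# Hypothetical response latencies in seconds (not the paper's data).
lat = np.array([12., 45., 30., 110., 22., 75., 60., 150., 35., 90.,
                18., 64., 40., 200., 55., 28., 81., 33., 47., 125.])
draws = lognormal_mean_gpq(lat)
lo, hi = np.percentile(draws, [2.5, 97.5])
print(f"mean latency ~ {np.median(draws):.0f} s, 95% CI ({lo:.0f}, {hi:.0f}) s")
```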

  2. Characterizing fibrosis in UUO mice model using multiparametric analysis of phasor distribution from FLIM images

    PubMed Central

    Ranjit, Suman; Dvornikov, Alexander; Levi, Moshe; Furgeson, Seth; Gratton, Enrico

    2016-01-01

    The phasor approach to fluorescence lifetime imaging microscopy (FLIM) is used to study the development of fibrosis in the unilateral ureteral obstruction (UUO) model of the kidney in mice. Traditional phasor analysis has been modified to create a multiparametric analysis scheme that splits the phasor points into four equidistant segments based on the height of the peak of the phasor distribution and calculates six parameters, including the average phasor positions, the shape of each segment, the angle of the distribution, and the number of points in each segment. These parameters are used to create a spectrum of twenty-four points specific to the phasor distribution of each sample. Comparisons of spectra from diseased and healthy tissues result in quantitative separation and calculation of statistical parameters including AUC values, positive predictive values, and sensitivity. This is a new method in the evolving field of analyzing phasor distributions of FLIM data and provides further insights. Additionally, the progression of fibrosis with time is detected using this multiparametric approach to phasor analysis. PMID:27699117

  3. Phosphorescence lifetime analysis with a quadratic programming algorithm for determining quencher distributions in heterogeneous systems.

    PubMed Central

    Vinogradov, S A; Wilson, D F

    1994-01-01

    A new method for the analysis of phosphorescence lifetime distributions in heterogeneous systems has been developed. This method is based on decomposition of the data vector into a linearly independent set of exponentials and uses quadratic programming principles for χ2 minimization. Solution of the resulting algorithm requires a finite number of calculations (it is not iterative) and is computationally fast and robust. The algorithm has been tested on various simulated decays and for the analysis of phosphorescence measurements of experimental systems with discrete distributions of lifetimes. A critical analysis of the effect of signal-to-noise ratio on the resolving capability of the algorithm is presented. This technique is recommended for resolving the distributions of quencher concentration in heterogeneous samples, of which oxygen distributions in tissue are an important example. Phosphors of practical importance for biological oxygen measurements, Pd-meso-tetra(4-carboxyphenyl)porphyrin (PdTCPP) and Pd-meso-porphyrin (PdMP), have been used to provide an experimental test of the algorithm. PMID:7858142
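
    The decomposition idea can be imitated with non-negative least squares, a special case of the quadratic programming formulation the paper uses: build a design matrix of exponentials on a fixed lifetime grid and solve for non-negative amplitudes. Everything below (lifetimes, amplitudes, noise level) is hypothetical.

```python
import numpy as np
from scipy.optimize import nnls

# Hypothetical two-lifetime phosphorescence decay with measurement noise.
t = np.linspace(0.0, 2.0e-3, 400)                   # time, seconds
decay = 0.7 * np.exp(-t / 2.0e-4) + 0.3 * np.exp(-t / 6.0e-4)
decay = decay + np.random.default_rng(3).normal(0.0, 5e-3, t.size)

# Fixed grid of candidate lifetimes; NNLS finds non-negative amplitudes,
# which enforces the same physical constraint as the QP formulation.
taus = np.geomspace(5e-5, 2e-3, 40)
A = np.exp(-t[:, None] / taus[None, :])
amps, resid = nnls(A, decay)

for tau, a in zip(taus, amps):
    if a > 0.05:
        print(f"lifetime {tau * 1e6:7.1f} us, amplitude {a:.2f}")
```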

  4. Distribution and genetic analysis of TTV and TTMV major phylogenetic groups in French blood donors.

    PubMed

    Biagini, Philippe; Gallian, Pierre; Cantaloube, Jean-François; Attoui, Houssam; de Micco, Philippe; de Lamballerie, Xavier

    2006-02-01

    TTV and TTMV (recently assigned to the floating genus Anellovirus) infect human populations (including healthy individuals) at high prevalence (>80%). They display notably high levels of genetic diversity, but very little is known regarding the distribution of Anellovirus genetic groups in human populations. We analyzed the distribution of the major genetic groups of TTV and TTMV in healthy voluntary blood donors using group-independent and group-specific PCR amplifications systems, combined with sequence determination and phylogenetic analysis. Analysis of Anellovirus groups revealed a non-random pattern of group distribution with a predominant prevalence of TTV phylogenetic groups 1, 3, and 5, and of TTMV group 1. Multiple co-infections were observed. In addition, TTMV sequences exhibiting a high genetic divergence with reference sequences were identified. This study provided the first picture of the genetic distribution of the major phylogenetic groups of members of the genus Anellovirus in a cohort of French voluntary blood donors. Obtaining such data from a reference population comprising healthy individuals was an essential step that will allow the subsequent comparative analysis of cohorts including patients with well-characterized diseases, in order to identify any possible relationship between Anellovirus infection and human diseases.

  5. Measurement of bubble and pellet size distributions: past and current image analysis technology.

    PubMed

    Junker, Beth

    2006-08-01

    Measurements of bubble and pellet size distributions are useful for biochemical process optimizations. The accuracy, representation, and simplicity of these measurements improve when the measurement is performed on-line and in situ rather than off-line using a sample. Historical and currently available measurement systems for photographic methods are summarized for bubble and pellet (morphology) measurement applications. Applications to cells, mycelia, and pellets measurements have driven key technological developments that have been applied for bubble measurements. Measurement trade-offs exist to maximize accuracy, extend range, and attain reasonable cycle times. Mathematical characterization of distributions using standard statistical techniques is straightforward, facilitating data presentation and analysis. For the specific application of bubble size distributions, selected bioreactor operating parameters and physicochemical conditions alter distributions. Empirical relationships have been established in some cases where sufficient data have been collected. In addition, parameters and conditions with substantial effects on bubble size distributions were identified and their relative effects quantified. This information was used to guide required accuracy and precision targets for bubble size distribution measurements from newly developed novel on-line and in situ bubble measurement devices.

  6. Evaluating Domestic Hot Water Distribution System Options With Validated Analysis Models

    SciTech Connect

    Weitzel, E.; Hoeschele, M.

    2014-09-01

    A developing body of work is forming that collects data on domestic hot water consumption, water use behaviors, and the energy efficiency of various distribution systems. A full distribution system developed in TRNSYS has been validated using field monitoring data and then exercised in a number of climates to understand the impact of climate on performance. This study builds upon previous analysis and modelling work to evaluate different distribution systems and the sensitivity of water heating energy and water use efficiency to variations in climate, load, distribution type, insulation, and compact plumbing practices. Overall, 124 different TRNSYS models were simulated. Of the configurations evaluated, distribution losses account for 13-29% of the total water heating energy use, and water use efficiency ranges from 11-22%. The base case, an uninsulated trunk-and-branch system, sees the most improvement in energy consumption from insulating the piping and locating the water heater centrally to all fixtures. Demand recirculation systems are not projected to provide significant energy savings and in some cases increase energy consumption. Water use is most efficient with demand recirculation systems, followed by the insulated trunk-and-branch system with a central water heater. Compact plumbing practices and insulation have the most impact on energy consumption (2-6% for insulation and 3-4% per 10 gallons of enclosed volume reduced). The results of this work are useful for informing future development of water heating best practices guides as well as more accurate (and simulation-time efficient) distribution models for annual whole-house simulation programs.

  7. Analysis of large-scale distributed knowledge sources via autonomous cooperative graph mining

    NASA Astrophysics Data System (ADS)

    Levchuk, Georgiy; Ortiz, Andres; Yan, Xifeng

    2014-05-01

    In this paper, we present a model for processing distributed relational data across multiple autonomous heterogeneous computing resources in environments with limited control, resource failures, and communication bottlenecks. Our model exploits dependencies in the data to enable collaborative distributed querying in noisy data. The collaboration policy for computational resources is efficiently constructed from the belief propagation algorithm. To scale to large data sizes, we employ a combination of priority-based filtering, incremental processing, and communication compression techniques. Our solution achieved high accuracy of analysis results and orders of magnitude improvements in computation time compared to the centralized graph matching solution.

  8. Distributed hot-wire anemometry based on Brillouin optical time-domain analysis.

    PubMed

    Wylie, Michael T V; Brown, Anthony W; Colpitts, Bruce G

    2012-07-02

    A distributed hot-wire anemometer based on Brillouin optical time-domain analysis is presented. The anemometer is created by passing a current through a stainless steel tube fibre bundle and monitoring Brillouin frequency changes in the presence of airflow. A wind tunnel is used to provide laminar airflow while the device response is calibrated against theoretical models. The sensitivity equation for this anemometer is derived and discussed. Airspeeds from 0 m/s to 10 m/s are examined, and the results show that a Brillouin scattering based distributed hot-wire anemometer is feasible.

  9. A distributed fiber optic sensor system for dike monitoring using Brillouin optical frequency domain analysis

    NASA Astrophysics Data System (ADS)

    Nöther, Nils; Wosniok, Aleksander; Krebber, Katerina; Thiele, Elke

    2008-03-01

    We report on the development of a complete system for spatially resolved detection of critical soil displacement in river embankments. The system uses Brillouin frequency domain analysis (BOFDA) for distributed measurement of strain in silica optical fibers. Our development consists of the measurement unit, an adequate coating for the optical fibers and a technique to integrate the coated optical fibers into geotextiles as they are commonly used in dike construction. We present several laboratory and field tests that prove the capability of the system to detect areas of soil displacement as small as 2 meters. These are the first tests of truly distributed strain measurements on optical fibers embedded into geosynthetics.

  10. Using Micro-Synchrophasor Data for Advanced Distribution Grid Planning and Operations Analysis

    SciTech Connect

    Stewart, Emma; Kiliccote, Sila; McParland, Charles; Roberts, Ciaran

    2014-07-01

    This report reviews the potential for distribution-grid phase-angle data that will be available from new micro-synchrophasors (µPMUs) to be utilized in existing distribution-grid planning and operations analysis. These data could augment the current diagnostic capabilities of grid analysis software, used in both planning and operations for applications such as fault location, and provide data for more accurate modeling of the distribution system. µPMUs are new distribution-grid sensors that will advance measurement and diagnostic capabilities and provide improved visibility of the distribution grid, enabling analysis of the grid’s increasingly complex loads that include features such as large volumes of distributed generation. Large volumes of DG lead to concerns about continued reliable operation of the grid, due to changing power-flow characteristics and active generation with its own protection and control capabilities. Using µPMU data on the change in voltage phase angle between two points, in conjunction with new and existing distribution-grid planning and operational tools, is expected to enable model validation, state estimation, fault location, and renewable resource/load characterization. Our findings include: data measurement is outstripping the processing capabilities of planning and operational tools; not every tool can visualize a voltage phase-angle measurement to the degree of accuracy measured by advanced sensors, and the degree of accuracy in measurement required for the distribution grid is not defined; solving methods cannot handle the high volumes of data generated by modern sensors, so new models and solving methods (such as graph trace analysis) are needed; standardization of sensor-data communications platforms in planning and applications tools would allow integration of different vendors’ sensors and advanced measurement devices. In addition, data from advanced sources such as µPMUs could be used to validate models to improve

  11. Estimating the probability distribution of the incubation period for rabies using data from the 1948-1954 rabies epidemic in Tokyo.

    PubMed

    Tojinbara, Kageaki; Sugiura, K; Yamada, A; Kakitani, I; Kwan, N C L

    2016-01-01

    Data of 98 rabies cases in dogs and cats from the 1948-1954 rabies epidemic in Tokyo were used to estimate the probability distribution of the incubation period. Lognormal, gamma and Weibull distributions were used to model the incubation period. The maximum likelihood estimates of the mean incubation period ranged from 27.30 to 28.56 days according to different distributions. The mean incubation period was shortest with the lognormal distribution (27.30 days), and longest with the Weibull distribution (28.56 days). The best distribution in terms of AIC value was the lognormal distribution with mean value of 27.30 (95% CI: 23.46-31.55) days and standard deviation of 20.20 (15.27-26.31) days. There were no significant differences between the incubation periods for dogs and cats, or between those for male and female dogs.
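
    The fitting-and-ranking step described above can be reproduced in outline with standard tools: fit each candidate distribution by maximum likelihood and compare AIC values. The sketch below uses synthetic data in place of the 98 case records.

    ```python
    # Fit lognormal, gamma, and Weibull models to incubation-period data by
    # maximum likelihood and rank them by AIC. Data here are synthetic.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    days = rng.lognormal(mean=3.1, sigma=0.65, size=98)  # stand-in sample

    candidates = {
        "lognormal": stats.lognorm,
        "gamma": stats.gamma,
        "weibull": stats.weibull_min,
    }
    for name, dist in candidates.items():
        params = dist.fit(days, floc=0)          # MLE with location fixed at 0
        k = len(params) - 1                      # free parameters (loc held fixed)
        loglik = np.sum(dist.logpdf(days, *params))
        aic = 2 * k - 2 * loglik
        print(f"{name:9s} mean={dist.mean(*params):6.2f} days  AIC={aic:8.2f}")
    ```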

  12. Evolution of the ATLAS PanDA Production and Distributed Analysis System

    NASA Astrophysics Data System (ADS)

    Maeno, T.; De, K.; Wenaus, T.; Nilsson, P.; Walker, R.; Stradling, A.; Fine, V.; Potekhin, M.; Panitkin, S.; Compostella, G.

    2012-12-01

    The PanDA (Production and Distributed Analysis) system has been developed to meet ATLAS production and analysis requirements for a data-driven workload management system capable of operating at LHC data processing scale. PanDA has performed well with high reliability and robustness during the two years of LHC data-taking, while being actively evolved to meet the rapidly changing requirements for analysis use cases. We will present an overview of system evolution including automatic rebrokerage and reattempt for analysis jobs, adaptation for the CernVM File System, support for the multi-cloud model through which Tier-2 sites act as members of multiple clouds, pledged resource management and preferential brokerage, and monitoring improvements. We will also describe results from the analysis of two years of PanDA usage statistics, current issues, and plans for the future.

  13. A Meta-Analysis of Distributed Leadership from 2002 to 2013: Theory Development, Empirical Evidence and Future Research Focus

    ERIC Educational Resources Information Center

    Tian, Meng; Risku, Mika; Collin, Kaija

    2016-01-01

    This article provides a meta-analysis of research conducted on distributed leadership from 2002 to 2013. It continues the review of distributed leadership commissioned by the English National College for School Leadership (NCSL) ("Distributed Leadership: A Desk Study," Bennett et al., 2003), which identified two gaps in the research…

  14. Analysis of electron energy distribution function in the Linac4 H{sup −} source

    SciTech Connect

    Mochizuki, S.; Nishida, K.; Hatayama, A.; Mattei, S.; Lettry, J.

    2016-02-15

    To understand the Electron Energy Distribution Function (EEDF) in Radio Frequency Inductively Coupled Plasmas (RF-ICPs) used in hydrogen negative ion sources, a detailed analysis of the EEDFs using numerical simulation and a theoretical approach based on the Boltzmann equation has been performed. It is shown that the EEDF of RF-ICPs consists of two parts: a low-energy part that obeys a Maxwellian distribution and a high-energy part that deviates from the Maxwellian distribution. These simulation results have been confirmed to be reasonable by the analytical approach. The results suggest that it is possible to enhance the dissociation of molecules and the resultant H{sup −} negative ion production by reducing the gas pressure.
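
    A minimal way to picture the reported two-part EEDF is a bi-Maxwellian model: a cool bulk population plus a small hot tail that dominates at high energy. The temperatures and the 5% tail fraction below are assumed for illustration only, not values from the record.

    ```python
    # Illustrative two-temperature EEDF: a bulk Maxwellian plus a hotter tail
    # that deviates from it at high energy. All parameters are assumed.
    import numpy as np

    def maxwellian_eedf(E, Te):
        """Maxwellian EEDF in energy space, normalized to unit integral (eV^-1)."""
        return 2.0 * np.sqrt(E / np.pi) * Te**-1.5 * np.exp(-E / Te)

    E = np.linspace(0.01, 50.0, 500)            # electron energy grid (eV)
    f_bulk = maxwellian_eedf(E, Te=2.0)         # low-energy bulk, Te ~ 2 eV
    f_tail = maxwellian_eedf(E, Te=10.0)        # hot tail, Te ~ 10 eV
    f_total = 0.95 * f_bulk + 0.05 * f_tail     # 5% hot fraction (assumed)

    # Ratio to a pure bulk Maxwellian highlights the high-energy deviation
    ratio = f_total / f_bulk
    print("deviation factor at 5, 20, 40 eV:",
          [round(r, 2) for r in np.interp([5, 20, 40], E, ratio)])
    ```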

  15. Analysis of electron energy distribution function in the Linac4 H- source

    NASA Astrophysics Data System (ADS)

    Mochizuki, S.; Mattei, S.; Nishida, K.; Hatayama, A.; Lettry, J.

    2016-02-01

    To understand the Electron Energy Distribution Function (EEDF) in Radio Frequency Inductively Coupled Plasmas (RF-ICPs) used in hydrogen negative ion sources, a detailed analysis of the EEDFs using numerical simulation and a theoretical approach based on the Boltzmann equation has been performed. It is shown that the EEDF of RF-ICPs consists of two parts: a low-energy part that obeys a Maxwellian distribution and a high-energy part that deviates from the Maxwellian distribution. These simulation results have been confirmed to be reasonable by the analytical approach. The results suggest that it is possible to enhance the dissociation of molecules and the resultant H- negative ion production by reducing the gas pressure.

  16. Ultrasound shear wave simulation based on nonlinear wave propagation and Wigner-Ville Distribution analysis

    NASA Astrophysics Data System (ADS)

    Bidari, Pooya Sobhe; Alirezaie, Javad; Tavakkoli, Jahan

    2017-03-01

    This paper presents a method for modeling and simulation of shear wave generation from a nonlinear Acoustic Radiation Force Impulse (ARFI), which is considered as a distributed force applied at the focal region of a HIFU transducer radiating in the nonlinear regime. The shear wave propagation is simulated by solving Navier's equation with the distributed nonlinear ARFI as the source of the shear wave. Then, the Wigner-Ville Distribution (WVD), a time-frequency analysis method, is used to detect the shear wave at different local points in the region of interest. The WVD yields an estimate of the shear wave's time of arrival, its mean frequency, and the local attenuation, which can be utilized to estimate the medium's shear modulus and shear viscosity using the Voigt model.
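
    For reference, a textbook discrete Wigner-Ville distribution of the kind used for the time-frequency step can be written in a few lines. This is a generic sketch, not the authors' implementation; it computes the WVD of an analytic signal and reads off the instantaneous frequency of a test chirp.

    ```python
    # Minimal discrete Wigner-Ville distribution of an analytic signal.
    import numpy as np
    from scipy.signal import hilbert

    def wigner_ville(x):
        """Return WVD[n, k] for a real signal x (time index n, frequency bin k)."""
        z = hilbert(x)                      # analytic signal suppresses aliasing
        N = len(z)
        W = np.zeros((N, N))
        for n in range(N):
            mmax = min(n, N - 1 - n)        # largest symmetric lag at time n
            m = np.arange(-mmax, mmax + 1)
            kernel = z[n + m] * np.conj(z[n - m])
            K = np.zeros(N, dtype=complex)
            K[m % N] = kernel               # place lags for the FFT
            W[n] = np.real(np.fft.fft(K))   # FFT over lag gives frequency axis
        return W

    fs = 1000.0
    t = np.arange(0, 0.5, 1 / fs)
    x = np.cos(2 * np.pi * (50 + 100 * t) * t)   # linear chirp test signal
    W = wigner_ville(x)
    peak_bin = W[len(t) // 2].argmax()
    # bin k maps to frequency k*fs/(2N) for the WVD
    print("instantaneous frequency mid-signal ≈", peak_bin * fs / (2 * len(t)), "Hz")
    ```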

  17. Analysis of electron energy distribution function in the Linac4 H⁻ source.

    PubMed

    Mochizuki, S; Mattei, S; Nishida, K; Hatayama, A; Lettry, J

    2016-02-01

    To understand the Electron Energy Distribution Function (EEDF) in Radio Frequency Inductively Coupled Plasmas (RF-ICPs) used in hydrogen negative ion sources, a detailed analysis of the EEDFs using numerical simulation and a theoretical approach based on the Boltzmann equation has been performed. It is shown that the EEDF of RF-ICPs consists of two parts: a low-energy part that obeys a Maxwellian distribution and a high-energy part that deviates from the Maxwellian distribution. These simulation results have been confirmed to be reasonable by the analytical approach. The results suggest that it is possible to enhance the dissociation of molecules and the resultant H(-) negative ion production by reducing the gas pressure.

  18. Regional distribution of lung compliance by image analysis of computed tomograms.

    PubMed

    Perchiazzi, Gaetano; Rylander, Christian; Derosa, Savino; Pellegrini, Mariangela; Pitagora, Loredana; Polieri, Debora; Vena, Antonio; Tannoia, Angela; Fiore, Tommaso; Hedenstierna, Göran

    2014-09-15

    Computed tomography (CT) can yield quantitative information about volume distribution in the lung. By combining information provided by CT and respiratory mechanics, this study aims at quantifying regional lung compliance (CL) and its distribution and homogeneity in mechanically ventilated pigs. The animals underwent inspiratory hold maneuvers at 12 lung volumes with simultaneous CT exposure at two end-expiratory pressure levels and before and after acute lung injury (ALI) by oleic acid administration. CL and the sum of positive voxel compliances from CT were linearly correlated; negative compliance areas were found. A remarkably heterogeneous distribution of voxel compliance was found in the injured lungs. As the lung inflation increased, the homogeneity increased in healthy lungs but decreased in injured lungs. Image analysis brought novel findings regarding spatial homogeneity of compliance, which increases in ALI but not in healthy lungs by applying PEEP after a recruitment maneuver.

  19. Principal Effects of Axial Load on Moment-Distribution Analysis of Rigid Structures

    NASA Technical Reports Server (NTRS)

    James, Benjamin Wylie

    1935-01-01

    This thesis presents the method of moment distribution modified to include the effect of axial load upon the bending moments. This modification makes it possible to analyze accurately complex structures, such as rigid fuselage trusses, that heretofore had to be analyzed by approximate formulas and empirical rules. The method is simple enough to be practicable even for complex structures, and it gives a means of analysis for continuous beams that is simpler than the extended three-moment equation now in common use. When the effect of axial load is included, it is found that the basic principles of moment distribution remain unchanged, the only difference being that the factors used, instead of being constants for a given member, become functions of the axial load. Formulas have been developed for these factors, and curves plotted, so that their application requires no more work than moment distribution without axial load. Simple problems have been included to illustrate the use of the curves.
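
    For readers unfamiliar with the baseline method, the sketch below runs one classical moment-distribution balancing step (no axial load) on an invented two-span beam. Under the report's modification, the 4EI/L stiffnesses and the 1/2 carry-over factor would instead become functions of the axial force.

    ```python
    # Classical moment-distribution step for a two-span continuous beam
    # (far ends fixed). Geometry and loading are invented for illustration.
    L1, L2 = 6.0, 4.0                  # span lengths (m)
    EI = 1.0                           # uniform flexural rigidity (relative units)
    w = 10.0                           # uniform load on span AB (kN/m)

    # Fixed-end moments at joint B (span BC unloaded): w L^2 / 12
    M = {("B", "A"): +w * L1**2 / 12, ("B", "C"): 0.0}

    # Distribution factors at joint B from member stiffnesses 4EI/L
    k1, k2 = 4 * EI / L1, 4 * EI / L2
    DF = {("B", "A"): k1 / (k1 + k2), ("B", "C"): k2 / (k1 + k2)}

    # Balance joint B: distribute the unbalanced moment in proportion to DF.
    # Half of each correction carries over to the fixed far ends A and C,
    # which do not rotate, so a single balancing cycle converges here.
    unbalance = M[("B", "A")] + M[("B", "C")]
    carry = {}
    for end in M:
        correction = -DF[end] * unbalance
        M[end] += correction
        carry[end[1]] = 0.5 * correction       # carry-over to the far end
    print("balanced moments at B:", {k: round(v, 2) for k, v in M.items()})
    print("carry-over moments at A, C:", {k: round(v, 2) for k, v in carry.items()})
    ```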

  20. Evolution of the ATLAS PanDA Production and Distributed Analysis System

    SciTech Connect

    Maeno, T.; De, K.; Wenaus, T.; Nilsson, P.; Walker, R.; Stradling, A.; Fine, V.; Potekhin, M.; Panitkin, S.; Compostella, G.

    2012-12-13

    The PanDA (Production and Distributed Analysis) system has been developed to meet ATLAS production and analysis requirements for a data-driven workload management system capable of operating at LHC data processing scale. PanDA has performed well with high reliability and robustness during the two years of LHC data-taking, while being actively evolved to meet the rapidly changing requirements for analysis use cases. We will present an overview of system evolution including automatic rebrokerage and reattempt for analysis jobs, adaptation for the CernVM File System, support for the multi-cloud model through which Tier-2 sites act as members of multiple clouds, pledged resource management and preferential brokerage, and monitoring improvements. We will also describe results from the analysis of two years of PanDA usage statistics, current issues, and plans for the future.

  1. Phenotype Clustering of Breast Epithelial Cells in Confocal Imagesbased on Nuclear Protein Distribution Analysis

    SciTech Connect

    Long, Fuhui; Peng, Hanchuan; Sudar, Damir; Lelievre, Sophie A.; Knowles, David W.

    2006-09-05

    Background: The distribution of chromatin-associated proteins plays a key role in directing nuclear function. Previously, we developed an image-based method to quantify the nuclear distributions of proteins and showed that these distributions depended on the phenotype of human mammary epithelial cells. Here we describe a method that creates a hierarchical tree of the given cell phenotypes and calculates the statistical significance between them, based on the clustering analysis of nuclear protein distributions. Results: Nuclear distributions of nuclear mitotic apparatus protein were previously obtained for non-neoplastic S1 and malignant T4-2 human mammary epithelial cells cultured for up to 12 days. Cell phenotype was defined as S1 or T4-2 and the number of days in culture. A probabilistic ensemble approach was used to define a set of consensus clusters from the results of multiple traditional cluster analysis techniques applied to the nuclear distribution data. Cluster histograms were constructed to show how cells in any one phenotype were distributed across the consensus clusters. Grouping various phenotypes allowed us to build phenotype trees and calculate the statistical difference between each group. The results showed that non-neoplastic S1 cells could be distinguished from malignant T4-2 cells with 94.19 percent accuracy; that proliferating S1 cells could be distinguished from differentiated S1 cells with 92.86 percent accuracy; and showed no significant difference between the various phenotypes of T4-2 cells corresponding to increasing tumor sizes. Conclusion: This work presents a cluster analysis method that can identify significant cell phenotypes, based on the nuclear distribution of specific proteins, with high accuracy.
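
    The ensemble idea can be sketched with generic tools: run several base clusterings, accumulate a co-association matrix, and cut it into consensus clusters. The data below are synthetic stand-ins for the nuclear-distribution features, and the specific estimators are illustrative choices, not those of the paper.

    ```python
    # Consensus clustering sketch: multiple base clusterings -> co-association
    # matrix -> hierarchical cut. Requires scikit-learn >= 1.2 for `metric=`.
    import numpy as np
    from sklearn.cluster import KMeans, AgglomerativeClustering

    rng = np.random.default_rng(1)
    X = np.vstack([rng.normal(0, 1, (30, 5)), rng.normal(4, 1, (30, 5))])

    # Co-association: fraction of base clusterings placing two cells together
    labels = [KMeans(n_clusters=2, n_init=10, random_state=s).fit_predict(X)
              for s in range(5)]
    C = np.mean([(l[:, None] == l[None, :]).astype(float) for l in labels], axis=0)

    # Consensus clusters: hierarchical clustering on 1 - C as a distance
    consensus = AgglomerativeClustering(
        n_clusters=2, metric="precomputed", linkage="average"
    ).fit_predict(1.0 - C)
    print("consensus cluster sizes:", np.bincount(consensus))
    ```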

  2. Development of a Web Service for Analysis in a Distributed Network

    PubMed Central

    Jiang, Xiaoqian; Wu, Yuan; Marsolo, Keith; Ohno-Machado, Lucila

    2014-01-01

    Objective: We describe functional specifications and practicalities in the software development process for a web service that allows the construction of a multivariate logistic regression model, Grid Logistic Regression (GLORE), by aggregating partial estimates from distributed sites, with no exchange of patient-level data. Background: We recently developed and published a web service for model construction and data analysis in a distributed environment. That paper provided an overview of the system that is useful for users, but included very few details that are relevant for biomedical informatics developers or network security personnel who may be interested in implementing this or similar systems. We focus here on how the system was conceived and implemented. Methods: We followed a two-stage development approach by first implementing the backbone system and then incrementally improving the user experience through interactions with potential users during development. Our system went through various stages such as proof of concept, algorithm validation, user interface development, and system testing. We used the Zoho Project management system to track tasks and milestones. We leveraged Google Code and Apache Subversion to share code among team members, and developed an applet-servlet architecture to support cross-platform deployment. Discussion: During the development process, we encountered challenges such as Information Technology (IT) infrastructure gaps and limited team experience in user-interface design. We identified solutions as well as enabling factors to support the translation of an innovative privacy-preserving, distributed modeling technology into a working prototype. Conclusion: Using GLORE (a distributed model that we developed earlier) as a pilot example, we demonstrated the feasibility of building and integrating distributed modeling technology into a usable framework that can support privacy-preserving, distributed data analysis among
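
    The statistical core of GLORE, aggregating per-site gradients and Hessians of the logistic log-likelihood for a global Newton-Raphson step, can be sketched as follows. The three simulated "sites" and all numeric values are invented for illustration; this is the general technique, not the project's actual code.

    ```python
    # Distributed logistic regression: each site shares only its gradient and
    # Hessian, never patient-level rows; a coordinator pools them per iteration.
    import numpy as np

    def site_contribution(X, y, beta):
        """Per-site gradient and Hessian of the logistic log-likelihood."""
        p = 1.0 / (1.0 + np.exp(-X @ beta))
        grad = X.T @ (y - p)
        W = p * (1.0 - p)
        hess = -(X * W[:, None]).T @ X
        return grad, hess

    rng = np.random.default_rng(0)
    beta_true = np.array([0.5, -1.0, 2.0])
    sites = []
    for _ in range(3):                           # three hospitals, say
        X = np.column_stack([np.ones(200), rng.normal(size=(200, 2))])
        y = rng.binomial(1, 1 / (1 + np.exp(-X @ beta_true)))
        sites.append((X, y))

    beta = np.zeros(3)
    for _ in range(25):                          # global Newton iterations
        parts = [site_contribution(X, y, beta) for X, y in sites]
        grad = sum(g for g, _ in parts)
        hess = sum(h for _, h in parts)
        beta = beta - np.linalg.solve(hess, grad)
    print("pooled estimate:", beta.round(3))
    ```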

  3. Validation results of the IAG Dancer project for distributed GPS analysis

    NASA Astrophysics Data System (ADS)

    Boomkamp, H.

    2012-12-01

    The number of permanent GPS stations in the world has grown far too large to allow processing of all this data at analysis centers. The majority of these GPS sites do not even make their observation data available to the analysis centers, for various valid reasons. The current ITRF solution is still based on centralized analysis by the IGS, and subsequent densification of the reference frame via regional network solutions. Minor inconsistencies in analysis methods, software systems and data quality imply that this centralized approach is unlikely to ever reach the ambitious accuracy objectives of GGOS. The dependence on published data also makes it clear that a centralized approach will never provide a true global ITRF solution for all GNSS receivers in the world. If the data does not come to the analysis, the only alternative is to bring the analysis to the data. The IAG Dancer project has implemented a distributed GNSS analysis system on the internet in which each receiver can have its own analysis center in the form of a freely distributed JAVA peer-to-peer application. Global parameters for satellite orbits, clocks and polar motion are solved via a distributed least squares solution among all participating receivers. A Dancer instance can run on any computer that has simultaneous access to the receiver data and to the public internet. In the future, such a process may be embedded in the receiver firmware directly. GPS network operators can join the Dancer ITRF realization without having to publish their observation data or estimation products. GPS users can run a Dancer process without contributing to the global solution, to have direct access to the ITRF in near real-time. The Dancer software has been tested on-line since late 2011. A global network of processes has gradually evolved to allow stabilization and tuning of the software in order to reach a fully operational system. This presentation reports on the current performance of the Dancer system, and

  4. A landscape analysis of cougar distribution and abundance in Montana, USA.

    PubMed

    Riley, S J; Malecki, R A

    2001-09-01

    Recent growth in the distribution and abundance of cougars (Puma concolor) throughout western North America has created opportunities, challenges, and problems for wildlife managers and raises questions about what factors affect cougar populations. We present an analysis of factors thought to affect cougar distribution and abundance across the broad geographical scales on which most population management decisions are made. Our objectives were to: (1) identify and evaluate landscape parameters that can be used to predict the capability of habitats to support cougars, and (2) evaluate factors that may account for the recent expansion in cougar numbers. Habitat values based on terrain ruggedness and forested cover explained 73% of the variation in a cougar abundance index. Indices of cougar abundance also were spatially and temporally correlated with ungulate abundance. An increase in the number and total biomass of ungulate prey species is hypothesized to account for recent increases in cougars. Cougar populations in Montana are coping with land development by humans when other components of habitat and prey populations are sufficient. Our analysis provides a better understanding of what may have influenced recent growth in cougar distribution and abundance in Montana and, when combined with insights about stakeholder acceptance capacity, offers a basis for cougar management at broad scales. Long-term conservation of cougars necessitates a better understanding of ecosystem functions that affect prey distribution and abundance, more accurate estimates of cougar populations, and management abilities to integrate these components with human values.

  5. A performance analysis method for distributed real-time robotic systems: A case study of remote teleoperation

    NASA Technical Reports Server (NTRS)

    Lefebvre, D. R.; Sanderson, A. C.

    1994-01-01

    Robot coordination and control systems for remote teleoperation applications are by necessity implemented on distributed computers. Modeling and performance analysis of these distributed robotic systems are difficult but important for economic system design. Performance analysis methods originally developed for conventional distributed computer systems are often unsatisfactory for evaluating real-time systems. The paper introduces a formal model of distributed robotic control systems and a performance analysis method, based on scheduling theory, that can handle concurrent hard real-time response specifications. Use of the method is illustrated by a remote teleoperation case study that assesses the effect of communication delays and the allocation of robot control functions on control system hardware requirements.
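
    As a flavor of the scheduling-theory machinery such a method builds on, the sketch below applies the classical Liu-Layland rate-monotonic utilization bound to an invented periodic task set; the paper's actual analysis is more general than this simple test.

    ```python
    # Liu-Layland rate-monotonic schedulability check for periodic hard
    # real-time tasks. Task parameters are invented, e.g. a control loop,
    # a telemetry task, and a video task in a teleoperation system.
    tasks = [(0.004, 0.020), (0.010, 0.050), (0.030, 0.100)]  # (C_i, T_i) in s

    n = len(tasks)
    U = sum(c / t for c, t in tasks)               # total processor utilization
    bound = n * (2 ** (1.0 / n) - 1.0)             # Liu-Layland bound for n tasks
    print(f"U = {U:.3f}, bound = {bound:.3f}")
    print("schedulable under RM" if U <= bound else
          "inconclusive: exact response-time analysis needed")
    ```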

  6. Risk analysis of highly combustible gas storage, supply, and distribution systems in PWR plants

    SciTech Connect

    Simion, G.P.; VanHorn, R.L.; Smith, C.L.; Bickel, J.H.; Sattison, M.B.; Bulmahn, K.D.

    1993-06-01

    This report presents the evaluation of the potential safety concerns for pressurized water reactors (PWRs) identified in Generic Safety Issue 106, Piping and the Use of Highly Combustible Gases in Vital Areas. A Westinghouse four-loop PWR plant was analyzed for the risk due to the use of combustible gases (predominantly hydrogen) within the plant. The analysis evaluated an actual hydrogen distribution configuration and conducted several sensitivity studies to determine the potential variability among PWRs. The sensitivity studies were based on hydrogen and safety-related equipment configurations observed at other PWRs within the United States. Several options for improving the hydrogen distribution system design were identified and evaluated for their effect on risk and core damage frequency. A cost/benefit analysis was performed to determine whether alternatives considered were justifiable based on the safety improvement and economics of each possible improvement.

  7. Identifying Synonymy between SNOMED Clinical Terms of Varying Length Using Distributional Analysis of Electronic Health Records

    PubMed Central

    Henriksson, Aron; Conway, Mike; Duneld, Martin; Chapman, Wendy W.

    2013-01-01

    Medical terminologies and ontologies are important tools for natural language processing of health record narratives. To account for the variability of language use, synonyms need to be stored in a semantic resource as textual instantiations of a concept. Developing such resources manually is, however, prohibitively expensive and likely to result in low coverage. To facilitate and expedite the process of lexical resource development, distributional analysis of large corpora provides a powerful data-driven means of (semi-)automatically identifying semantic relations, including synonymy, between terms. In this paper, we demonstrate how distributional analysis of a large corpus of electronic health records – the MIMIC-II database – can be employed to extract synonyms of SNOMED CT preferred terms. A distinctive feature of our method is its ability to identify synonymous relations between terms of varying length. PMID:24551362
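
    The underlying principle, that terms sharing contexts tend to be semantically related, can be shown with a toy co-occurrence model. The mini-corpus below is invented; the real study used the MIMIC-II corpus with far richer distributional models.

    ```python
    # Toy distributional similarity: co-occurrence vectors + cosine similarity.
    import numpy as np

    corpus = [
        "patient denies chest pain on exertion",
        "patient reports chest discomfort on exertion",
        "no chest pain reported at rest",
        "mild chest discomfort at rest",
    ]
    window = 2
    vocab = sorted({w for s in corpus for w in s.split()})
    idx = {w: i for i, w in enumerate(vocab)}
    vecs = {w: np.zeros(len(vocab)) for w in vocab}
    for s in corpus:
        toks = s.split()
        for i, w in enumerate(toks):
            for j in range(max(0, i - window), min(len(toks), i + window + 1)):
                if j != i:
                    vecs[w][idx[toks[j]]] += 1       # count context words

    def cos(a, b):
        return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

    print("sim(pain, discomfort) =", round(cos(vecs["pain"], vecs["discomfort"]), 3))
    print("sim(pain, exertion)   =", round(cos(vecs["pain"], vecs["exertion"]), 3))
    ```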

  8. Estimation of flood design hydrographs using bivariate analysis (copula) and distributed hydrological modelling

    NASA Astrophysics Data System (ADS)

    Candela, A.; Brigandí, G.; Aronica, G. T.

    2014-01-01

    In this paper a procedure to derive Flood Design Hydrographs (FDH) is presented, using a bivariate representation of rainfall forcing (rainfall duration and intensity) through copulas, which describe and model the correlation between these two variables independently of the marginal laws involved, coupled with a distributed rainfall-runoff model. Rainfall-runoff modelling for estimating the hydrological response at the outlet of a watershed used a conceptual fully distributed procedure based on the Soil Conservation Service curve number method as the excess rainfall model and a distributed unit hydrograph with climatic dependencies for the flow routing. Travel time computation, based on the definition of a distributed unit hydrograph, has been performed by implementing a procedure that uses flow paths determined from a digital elevation model (DEM) and roughness parameters obtained from distributed geographical information. In order to estimate the return period of the FDH, which gives the probability of occurrence of a hydrograph, flood peaks and flow volumes obtained through R-R modelling have been statistically treated via copulas. The shape of the hydrograph has been generated on the basis of modelled flood events, via cluster analysis. The procedure described above was applied to a case study of the Imera catchment in Sicily, Italy. The methodology allows a reliable estimation of the Design Flood Hydrograph and can be used for all flood risk applications, i.e. evaluation, management, mitigation, etc.
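
    The bivariate sampling step can be sketched with a copula that separates the marginals from the dependence structure. A Gaussian copula and gamma marginals are used below purely for illustration; the paper's copula family and fitted marginals may differ.

    ```python
    # Sample correlated (duration, intensity) pairs through a Gaussian copula.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(42)
    rho = -0.5                        # assumed duration-intensity dependence
    n = 1000

    # 1) correlated standard normals -> 2) uniforms via the normal CDF
    z = rng.multivariate_normal([0, 0], [[1, rho], [rho, 1]], size=n)
    u = stats.norm.cdf(z)

    # 3) invert assumed marginals: gamma for duration (h) and intensity (mm/h)
    duration = stats.gamma.ppf(u[:, 0], a=2.0, scale=3.0)
    intensity = stats.gamma.ppf(u[:, 1], a=1.5, scale=8.0)

    print("sample rank correlation:",
          round(stats.spearmanr(duration, intensity).correlation, 2))
    ```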

  9. Nonlinear mixed-effects models for pharmacokinetic data analysis: assessment of the random-effects distribution.

    PubMed

    Drikvandi, Reza

    2017-02-13

    Nonlinear mixed-effects models are frequently used for pharmacokinetic data analysis, and they account for inter-subject variability in pharmacokinetic parameters by incorporating subject-specific random effects into the model. The random effects are often assumed to follow a (multivariate) normal distribution. However, many articles have shown that misspecifying the random-effects distribution can introduce bias in the estimates of parameters and affect inferences about the random effects themselves, such as estimation of the inter-subject variability. Because random effects are unobservable latent variables, it is difficult to assess their distribution. In a recent paper we developed a diagnostic tool based on the so-called gradient function to assess the random-effects distribution in mixed models. There we evaluated the gradient function for generalized linear mixed models and in the presence of a single random effect. However, assessing the random-effects distribution in nonlinear mixed-effects models is more challenging, especially when multiple random effects are present, and therefore the results from linear and generalized linear mixed models may not be valid for such nonlinear models. In this paper, we further investigate the gradient function and evaluate its performance for such nonlinear mixed-effects models, which are common in pharmacokinetics and pharmacodynamics. We use simulations as well as real data from an intensive pharmacokinetic study to illustrate the proposed diagnostic tool.

  10. Mapping drug distribution in brain tissue using liquid extraction surface analysis mass spectrometry imaging.

    PubMed

    Swales, John G; Tucker, James W; Spreadborough, Michael J; Iverson, Suzanne L; Clench, Malcolm R; Webborn, Peter J H; Goodwin, Richard J A

    2015-10-06

    Liquid extraction surface analysis mass spectrometry (LESA-MS) is a surface sampling technique that incorporates liquid extraction from the surface of tissue sections with nanoelectrospray mass spectrometry. Traditional tissue analysis techniques usually require homogenization of the sample prior to analysis via high-performance liquid chromatography mass spectrometry (HPLC-MS), but an intrinsic weakness of this is a loss of all spatial information and the inability of the technique to distinguish between actual tissue penetration and response caused by residual blood contamination. LESA-MS, in contrast, has the ability to spatially resolve drug distributions and has historically been used to profile discrete spots on the surface of tissue sections. Here, we use the technique as a mass spectrometry imaging (MSI) tool, extracting points at 1 mm spatial resolution across tissue sections to build an image of xenobiotic and endogenous compound distribution to assess drug blood-brain barrier penetration into brain tissue. A selection of penetrant and "nonpenetrant" drugs were dosed to rats via oral and intravenous administration. Whole brains were snap-frozen at necropsy and were subsequently sectioned prior to analysis by matrix-assisted laser desorption ionization mass spectrometry imaging (MALDI-MSI) and LESA-MSI. MALDI-MSI, as expected, was shown to effectively map the distribution of brain penetrative compounds but lacked sufficient sensitivity when compounds were marginally penetrative. LESA-MSI was used to effectively map the distribution of these poorly penetrative compounds, highlighting its value as a complementary technique to MALDI-MSI. The technique also showed benefits when compared to traditional homogenization, particularly for drugs that were considered nonpenetrant by homogenization but were shown to have a measurable penetration using LESA-MSI.

  11. The role of Poisson's binomial distribution in the analysis of TEM images.

    PubMed

    Tejada, Arturo; den Dekker, Arnold J

    2011-11-01

    Frank's observation that a TEM bright-field image acquired under non-stationary conditions can be modeled by the time integral of the standard TEM image model [J. Frank, Nachweis von Objektbewegungen im lichtoptischen Diffraktogramm von elektronenmikroskopischen Aufnahmen, Optik 30 (2) (1969) 171-180] is re-derived here using counting statistics based on Poisson's binomial distribution. The approach yields a statistical image model that is suitable for image analysis and simulation.
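
    The Poisson binomial distribution referred to above, the law of a sum of independent Bernoulli trials with unequal success probabilities, has a simple exact recursion. The sketch below computes its pmf; the trial probabilities are illustrative (one could read them as per-sub-exposure detection probabilities in a pixel).

    ```python
    # Exact Poisson binomial pmf via dynamic programming.
    import numpy as np

    def poisson_binomial_pmf(probs):
        """P(K = k) for K = sum of independent Bernoulli(p_i)."""
        pmf = np.array([1.0])                    # distribution over 0 trials
        for p in probs:
            new = np.zeros(len(pmf) + 1)
            new[:-1] += pmf * (1.0 - p)          # this trial fails
            new[1:] += pmf * p                   # this trial succeeds
            pmf = new
        return pmf

    probs = [0.1, 0.4, 0.7, 0.9]
    pmf = poisson_binomial_pmf(probs)
    print("pmf:", pmf.round(4), " sum:", pmf.sum().round(6))
    print("mean:", (np.arange(len(pmf)) * pmf).sum(), "(= sum of p_i =", sum(probs), ")")
    ```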

  12. A method of analysis of distributions of local electric fields in composites

    NASA Astrophysics Data System (ADS)

    Kolesnikov, V. I.; Yakovlev, V. B.; Bardushkin, V. V.; Lavrov, I. V.; Sychev, A. P.; Yakovleva, E. N.

    2016-03-01

    A method of prediction of distributions of local electric fields in composite media based on analysis of the tensor operators of the concentration of intensity and induction is proposed. Both general expressions and the relations for calculating these operators are obtained in various approximations. The analytical expressions are presented for the operators of the concentration of electric fields in various types of inhomogeneous structures obtained in the generalized singular approximation.

  13. Single-Event Correlation Analysis of Quantum Key Distribution with Single-Photon Sources

    NASA Astrophysics Data System (ADS)

    Shangli Dong; Xiaobo Wang; Guofeng Zhang; Liantuan Xiao; Suotang Jia

    2010-04-01

    Multiple-photon emissions allow efficient eavesdropping strategies that threaten the security of quantum key distribution. In this paper, we theoretically discuss the photon correlations between authorized partners in the case of practical single-photon sources that include a multiple-photon background. To investigate the feasibility of intercept-resend attacks, the cross correlations and the maximum intercept-resend ratio caused by the background signal are determined using single-event correlation analysis based on single-event detection.

  14. Sensitivity analysis and parameter estimation for distributed hydrological modeling: potential of variational methods

    NASA Astrophysics Data System (ADS)

    Castaings, W.; Dartus, D.; Le Dimet, F.-X.; Saulnier, G.-M.

    2009-04-01

    Variational methods are widely used for the analysis and control of computationally intensive spatially distributed systems. In particular, the adjoint state method enables a very efficient calculation of the derivatives of an objective function (response function to be analysed or cost function to be optimised) with respect to model inputs. In this contribution, it is shown that variational methods hold considerable potential for distributed catchment-scale hydrology. A distributed flash flood model, coupling kinematic wave overland flow and Green-Ampt infiltration, is applied to a small catchment of the Thoré basin and used as a relatively simple (synthetic observations) but didactic application case. It is shown that forward and adjoint sensitivity analysis provide a local but extensive insight into the relation between the assigned model parameters and the simulated hydrological response. Spatially distributed parameter sensitivities can be obtained for a very modest calculation effort (~6 times the computing time of a single model run), and the singular value decomposition (SVD) of the Jacobian matrix provides an interesting perspective for the analysis of the rainfall-runoff relation. For the estimation of model parameters, adjoint-based derivatives were found exceedingly efficient in driving a bound-constrained quasi-Newton algorithm. The reference parameter set is retrieved independently of the optimization initial condition when the very common dimension reduction strategy (i.e. scalar multipliers) is adopted. Furthermore, the sensitivity analysis results suggest that most of the variability in this high-dimensional parameter space can be captured with a few orthogonal directions. A parametrization based on the SVD leading singular vectors was found very promising but should be combined with another regularization strategy in order to prevent overfitting.
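
    The key property of the adjoint state method, a full parameter gradient for the cost of one extra solve, can be demonstrated on a toy steady linear model. The operator, objective, and parameters below are stand-ins, not the flash-flood model.

    ```python
    # Adjoint gradient of J = c^T u subject to A(p) u = b: one adjoint solve
    # yields dJ/dp for all parameters, verified here by a finite difference.
    import numpy as np

    n = 50
    rng = np.random.default_rng(3)
    b = rng.normal(size=n)
    c = rng.normal(size=n)
    p = np.full(n, 2.0)                          # e.g. distributed conductivities

    def A(p):                                    # simple diffusion-like operator
        return np.diag(p) - 0.5 * np.eye(n, k=1) - 0.5 * np.eye(n, k=-1)

    u = np.linalg.solve(A(p), b)                 # forward solve
    lam = np.linalg.solve(A(p).T, c)             # single adjoint solve

    # dJ/dp_i = -lam^T (dA/dp_i) u ; here dA/dp_i = e_i e_i^T, so:
    grad_adjoint = -lam * u

    # check one component against a finite difference
    i, eps = 7, 1e-6
    p2 = p.copy(); p2[i] += eps
    fd = (c @ np.linalg.solve(A(p2), b) - c @ u) / eps
    print("adjoint:", grad_adjoint[i], " finite difference:", fd)
    ```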

  15. Comparative Analysis of HIV-1 and Murine Leukemia Virus Three-Dimensional Nuclear Distributions

    PubMed Central

    Quercioli, Valentina; Di Primio, Cristina; Casini, Antonio; Mulder, Lubbertus C. F.; Vranckx, Lenard S.; Borrenberghs, Doortje; Gijsbers, Rik; Debyser, Zeger

    2016-01-01

    Recent advances in fluorescence microscopy allow three-dimensional analysis of HIV-1 preintegration complexes in the nuclei of infected cells. To extend this investigation to gammaretroviruses, we engineered a fluorescent Moloney murine leukemia virus (MLV) system consisting of MLV-integrase fused to enhanced green fluorescent protein (MLV-IN-EGFP). A comparative analysis of lentiviral (HIV-1) and gammaretroviral (MLV) fluorescent complexes in the nuclei of infected cells revealed their different spatial distributions. This research tool has the potential to achieve new insight into the nuclear biology of these retroviruses. PMID:26962222

  16. Distributed data processing and analysis environment for neutron scattering experiments at CSNS

    NASA Astrophysics Data System (ADS)

    Tian, H. L.; Zhang, J. R.; Yan, L. L.; Tang, M.; Hu, L.; Zhao, D. X.; Qiu, Y. X.; Zhang, H. Y.; Zhuang, J.; Du, R.

    2016-10-01

    China Spallation Neutron Source (CSNS) is the first high-performance pulsed neutron source in China and will meet increasing demands for fundamental research and technical applications both domestically and overseas. A new distributed data processing and analysis environment has been developed, which provides generic functionality for neutron scattering experiments. The environment consists of three parts: an object-oriented data processing framework adopting a data-centered architecture, a communication and data caching system based on the C/S paradigm, and data analysis and visualization software providing 2D/3D experimental data display. This environment will be widely applied at CSNS for live data processing.

  17. Single-phase power distribution system power flow and fault analysis

    NASA Technical Reports Server (NTRS)

    Halpin, S. M.; Grigsby, L. L.

    1992-01-01

    Alternative methods for power flow and fault analysis of single-phase distribution systems are presented. The algorithms for both power flow and fault analysis utilize a generalized approach to network modeling. The generalized admittance matrix, formed using elements of linear graph theory, is an accurate network model for all possible single-phase network configurations. Unlike the standard nodal admittance matrix formulation algorithms, the generalized approach uses generalized component models for the transmission line and transformer. The standard assumption of a common node voltage reference point is not required to construct the generalized admittance matrix. Therefore, truly accurate simulation results can be obtained for networks that cannot be modeled using traditional techniques.
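
    A conventional nodal admittance matrix, the baseline the generalized formulation improves upon, can be assembled and solved as below. Branch impedances and the load currents are invented, and the common-reference assumption that the paper's generalized approach removes is retained here for simplicity.

    ```python
    # Assemble a nodal admittance matrix for a small single-phase network
    # and solve for node voltages with node 0 held at a fixed source voltage.
    import numpy as np

    # branches: (from_node, to_node, series impedance in ohms)
    branches = [(0, 1, 0.1 + 0.4j), (1, 2, 0.2 + 0.6j), (0, 2, 0.15 + 0.5j)]
    n = 3
    Y = np.zeros((n, n), dtype=complex)
    for f, t, z in branches:
        y = 1.0 / z
        Y[f, f] += y; Y[t, t] += y       # diagonal: sum of connected admittances
        Y[f, t] -= y; Y[t, f] -= y       # off-diagonal: negative branch admittance

    V0 = 1.0 + 0.0j                      # fixed source voltage at node 0 (pu)
    I_inj = np.array([0.0, -0.5 + 0.2j]) # assumed injected currents at nodes 1, 2
    Yrr = Y[1:, 1:]                      # reduced system with node 0 eliminated
    rhs = I_inj - Y[1:, 0] * V0
    V = np.linalg.solve(Yrr, rhs)
    print("node voltages:", np.round(np.concatenate([[V0], V]), 4))
    ```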

  18. Oxygen distribution in tumors: A qualitative analysis and modeling study providing a novel Monte Carlo approach

    SciTech Connect

    Lagerlöf, Jakob H.; Kindblom, Jon; Bernhardt, Peter

    2014-09-15

    lower end, due to anoxia, but smaller tumors showed undisturbed oxygen distributions. The six different models with correlated parameters generated three classes of oxygen distributions. The first was a hypothetical, negative covariance between vessel proximity and pO{sub 2} (VPO-C scenario); the second was a hypothetical positive covariance between vessel proximity and pO{sub 2} (VPO+C scenario); and the third was the hypothesis of no correlation between vessel proximity and pO{sub 2} (UP scenario). The VPO-C scenario produced a distinctly different oxygen distribution than the two other scenarios. The shape of the VPO-C scenario was similar to that of the nonvariable DOC model, and the larger the tumor, the greater the similarity between the two models. For all simulations, the mean oxygen tension decreased and the hypoxic fraction increased with tumor size. The absorbed dose required for definitive tumor control was highest for the VPO+C scenario, followed by the UP and VPO-C scenarios. Conclusions: A novel MC algorithm was presented which simulated oxygen distributions and radiation response for various biological parameter values. The analysis showed that the VPO-C scenario generated a clearly different oxygen distribution from the VPO+C scenario; the former exhibited a lower hypoxic fraction and higher radiosensitivity. In future studies, this modeling approach might be valuable for qualitative analyses of factors that affect oxygen distribution as well as analyses of specific experimental and clinical situations.

  19. Combining Different Tools for EEG Analysis to Study the Distributed Character of Language Processing

    PubMed Central

    da Rocha, Armando Freitas; Foz, Flávia Benevides; Pereira, Alfredo

    2015-01-01

    Recent studies on language processing indicate that language cognition is better understood if assumed to be supported by a distributed intelligent processing system enrolling neurons located all over the cortex, in contrast to reductionism that proposes to localize cognitive functions to specific cortical structures. Here, brain activity was recorded using electroencephalography while volunteers were listening to or reading small texts and had to select pictures that translate the meaning of these texts. Several techniques for EEG analysis were used to show this distributed character of neuronal enrollment associated with the comprehension of oral and written descriptive texts. Low Resolution Tomography identified the many different sets (si) of neurons activated in several distinct cortical areas by text understanding. Linear correlation was used to calculate the information H(ei) provided by each electrode of the 10/20 system about the identified si. Principal Component Analysis (PCA) of H(ei) was used to study the temporal and spatial activation of these sources si. This analysis evidenced 4 different patterns of H(ei) covariation that are generated by neurons located at different cortical locations. These results show that the distributed character of language processing is clearly evidenced by combining available EEG technologies. PMID:26713089

  20. Space station electrical power distribution analysis using a load flow approach

    NASA Technical Reports Server (NTRS)

    Emanuel, Ervin M.

    1987-01-01

    The space station's electrical power system will evolve and grow in a manner much like present terrestrial electrical power utilities. The initial baseline reference configuration will contain more than 50 nodes or busses, inverters, transformers, overcurrent protection devices, distribution lines, solar arrays, and/or solar dynamic power generating sources. The system is designed to manage and distribute 75 kW of power, single-phase or three-phase, at 20 kHz, and grow to a level of 300 kW steady state, and must be capable of operating at a peak of 450 kW for 5 to 10 min. In order to plan far into the future and keep pace with load growth, a load-flow power system analysis approach must be developed and utilized. This method is a well known energy assessment and management tool that is widely used throughout the electrical power utility industry. The results of a comprehensive evaluation and assessment of an Electrical Distribution System Analysis Program (EDSA) are discussed. Its potential use as an analysis and design tool for the 20 kHz space station electrical power system is addressed.
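
    A load-flow iteration of the kind such a tool performs can be sketched in Gauss-Seidel form on a three-bus system; the per-unit admittances and scheduled powers below are invented, not space-station data.

    ```python
    # Gauss-Seidel load flow on a 3-bus system: bus 0 is the slack bus,
    # buses 1 and 2 are PQ buses with scheduled complex power injections.
    import numpy as np

    Y = np.array([[ 20-50j, -10+25j, -10+25j],
                  [-10+25j,  20-50j, -10+25j],
                  [-10+25j, -10+25j,  20-50j]])    # assumed admittance matrix (pu)
    S = np.array([0.0, -0.6 - 0.3j, -0.8 - 0.4j])  # scheduled P+jQ at PQ buses
    V = np.array([1.05 + 0j, 1.0 + 0j, 1.0 + 0j])  # initial voltages, slack fixed

    for _ in range(100):
        for i in (1, 2):                           # update PQ buses only
            I = np.conj(S[i] / V[i])               # injected current from power
            V[i] = (I - Y[i, :i] @ V[:i] - Y[i, i+1:] @ V[i+1:]) / Y[i, i]
    print("bus voltage magnitudes:", np.abs(V).round(4))
    ```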

  1. Computational analysis of the spatial distribution of mitotic spindle angles in mouse developing airway

    NASA Astrophysics Data System (ADS)

    Tang, Nan; Marshall, Wallace F.

    2013-02-01

    Investigating the spatial information of cellular processes in tissues during mouse embryo development is one of the major technical challenges in developmental biology. Many imaging methods are still limited in the tissue volumes they can capture, owing to tissue opacity, light scattering, and the availability of advanced imaging tools. For analyzing the mitotic spindle angle distribution in developing mouse airway epithelium, we determined spindle angles in mitotic epithelial cells on serial sections of the whole airway of mouse embryonic lungs. We then developed a computational image analysis to obtain the spindle angle distribution in the three-dimensional airway reconstructed from the data obtained from all serial sections. From this study, we were able to understand how mitotic spindle angles are distributed in a whole airway tube. This analysis provides a potentially fast, simple, and inexpensive alternative method to quantitatively analyze cellular processes at subcellular resolution. Furthermore, this analysis is not limited by the size of tissues, which allows three-dimensional, high-resolution information on cellular processes to be obtained in cell populations deeper inside intact organs.

  2. Unraveling the distributed neural code of facial identity through spatiotemporal pattern analysis.

    PubMed

    Nestor, Adrian; Plaut, David C; Behrmann, Marlene

    2011-06-14

    Face individuation is one of the most impressive achievements of our visual system, and yet uncovering the neural mechanisms subserving this feat appears to elude traditional approaches to functional brain data analysis. The present study investigates the neural code of facial identity perception with the aim of ascertaining its distributed nature and informational basis. To this end, we use a sequence of multivariate pattern analyses applied to functional magnetic resonance imaging (fMRI) data. First, we combine information-based brain mapping and dynamic discrimination analysis to locate spatiotemporal patterns that support face classification at the individual level. This analysis reveals a network of fusiform and anterior temporal areas that carry information about facial identity and provides evidence that the fusiform face area responds with distinct patterns of activation to different face identities. Second, we assess the information structure of the network using recursive feature elimination. We find that diagnostic information is distributed evenly among anterior regions of the mapped network and that a right anterior region of the fusiform gyrus plays a central role within the information network mediating face individuation. These findings serve to map out and characterize a cortical system responsible for individuation. More generally, in the context of functionally defined networks, they provide an account of distributed processing grounded in information-based architectures.
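
    The recursive-feature-elimination step can be outlined with standard tools on synthetic trial-by-voxel data. The informative "voxels" are planted, and in a real analysis the selection would be nested inside the cross-validation loop rather than fit on all data as here.

    ```python
    # Recursive feature elimination over multivoxel patterns with a linear SVM.
    import numpy as np
    from sklearn.svm import SVC
    from sklearn.feature_selection import RFE
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(0)
    n_trials, n_voxels = 80, 200
    X = rng.normal(size=(n_trials, n_voxels))
    y = rng.integers(0, 2, n_trials)               # two face identities
    X[y == 1, :10] += 0.8                          # 10 informative "voxels"

    rfe = RFE(SVC(kernel="linear"), n_features_to_select=10, step=0.2)
    rfe.fit(X, y)
    print("recovered informative voxels:", np.where(rfe.support_)[0])
    score = cross_val_score(SVC(kernel="linear"), X[:, rfe.support_], y, cv=5).mean()
    print("cross-validated accuracy on selected voxels:", round(score, 3))
    ```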

  3. A distributed analysis and visualization system for model and observational data

    NASA Technical Reports Server (NTRS)

    Wilhelmson, Robert B.

    1994-01-01

    Software was developed with NASA support to aid in the analysis and display of the massive amounts of data generated from satellites, observational field programs, and from model simulations. This software was developed in the context of the PATHFINDER (Probing ATmospHeric Flows in an Interactive and Distributed EnviRonment) Project. The overall aim of this project is to create a flexible, modular, and distributed environment for data handling, modeling simulations, data analysis, and visualization of atmospheric and fluid flows. Software completed with NASA support includes GEMPAK analysis, data handling, and display modules for which collaborators at NASA had primary responsibility, and prototype software modules for three-dimensional interactive and distributed control and display as well as data handling, for which NCSA was responsible. Overall process control was handled through a scientific and visualization application builder from Silicon Graphics known as the Iris Explorer. In addition, the GEMPAK related work (GEMVIS) was also ported to the Advanced Visualization System (AVS) application builder. Many modules were developed to enhance those already available in Iris Explorer, including HDF file support, improved visualization and display, simple lattice math, and the handling of metadata through development of a new grid datatype. Complete source and runtime binaries along with on-line documentation are available via the World Wide Web at: http://redrock.ncsa.uiuc.edu/PATHFINDER/pathre12/top/top.html.

  4. Mechanical response of silk crystalline units from force-distribution analysis.

    PubMed

    Xiao, Senbo; Stacklies, Wolfram; Cetinkaya, Murat; Markert, Bernd; Gräter, Frauke

    2009-05-20

    The outstanding mechanical toughness of silk fibers is thought to be caused by embedded crystalline units acting as cross links of silk proteins in the fiber. Here, we examine the robustness of these highly ordered beta-sheet structures by molecular dynamics simulations and finite element analysis. Structural parameters and stress-strain relationships of four different models, from spider and Bombyx mori silk peptides, in antiparallel and parallel arrangement, were determined and found to be in good agreement with x-ray diffraction data. Rupture forces exceed those of any previously examined globular protein many times over, with spider silk (poly-alanine) slightly outperforming Bombyx mori silk ((Gly-Ala)(n)). All-atom force distribution analysis reveals both intrasheet hydrogen-bonding and intersheet side-chain interactions to contribute to stability to similar extent. In combination with finite element analysis of simplified beta-sheet skeletons, we could ascribe the distinct force distribution pattern of the antiparallel and parallel silk crystalline units to the difference in hydrogen-bond geometry, featuring an in-line or zigzag arrangement, respectively. Hydrogen-bond strength was higher in antiparallel models, and ultimately resulted in higher stiffness of the crystal, compensating the effect of the mechanically disadvantageous in-line hydrogen-bond geometry. Atomistic and coarse-grained force distribution patterns can thus explain differences in mechanical response of silk crystals, opening up the road to predict full fiber mechanics.

  5. Shared and Distributed Memory Parallel Security Analysis of Large-Scale Source Code and Binary Applications

    SciTech Connect

    Quinlan, D; Barany, G; Panas, T

    2007-08-30

    Many forms of security analysis on large scale applications can be substantially automated but the size and complexity can exceed the time and memory available on conventional desktop computers. Most commercial tools are understandably focused on such conventional desktop resources. This paper presents research work on the parallelization of security analysis of both source code and binaries within our Compass tool, which is implemented using the ROSE source-to-source open compiler infrastructure. We have focused on both shared and distributed memory parallelization of the evaluation of rules implemented as checkers for a wide range of secure programming rules, applicable to desktop machines, networks of workstations and dedicated clusters. While Compass as a tool focuses on source code analysis and reports violations of an extensible set of rules, the binary analysis work uses the exact same infrastructure but is less well developed into an equivalent final tool.

  6. Improving the Analysis, Storage and Sharing of Neuroimaging Data using Relational Databases and Distributed Computing

    PubMed Central

    Hasson, Uri; Skipper, Jeremy I.; Wilde, Michael J.; Nusbaum, Howard C.; Small, Steven L.

    2007-01-01

    The increasingly complex research questions addressed by neuroimaging research impose substantial demands on computational infrastructures. These infrastructures need to support management of massive amounts of data in a way that affords rapid and precise data analysis, to allow collaborative research, and to achieve these aims securely and with minimum management overhead. Here we present an approach that overcomes many current limitations in data analysis and data sharing. This approach is based on open source database management systems that support complex data queries as an integral part of data analysis, flexible data sharing, and parallel and distributed data processing using cluster computing and Grid computing resources. We assess the strengths of these approaches as compared to current frameworks based on storage of binary or text files. We then describe in detail the implementation of such a system and provide a concrete description of how it was used to enable a complex analysis of fMRI time series data. PMID:17964812

  7. A new parallel-vector finite element analysis software on distributed-memory computers

    NASA Technical Reports Server (NTRS)

    Qin, Jiangning; Nguyen, Duc T.

    1993-01-01

    A new parallel-vector finite element analysis software package MPFEA (Massively Parallel-vector Finite Element Analysis) is developed for large-scale structural analysis on massively parallel computers with distributed-memory. MPFEA is designed for parallel generation and assembly of the global finite element stiffness matrices as well as parallel solution of the simultaneous linear equations, since these are often the major time-consuming parts of a finite element analysis. Block-skyline storage scheme along with vector-unrolling techniques are used to enhance the vector performance. Communications among processors are carried out concurrently with arithmetic operations to reduce the total execution time. Numerical results on the Intel iPSC/860 computers (such as the Intel Gamma with 128 processors and the Intel Touchstone Delta with 512 processors) are presented, including an aircraft structure and some very large truss structures, to demonstrate the efficiency and accuracy of MPFEA.

  8. Improving the analysis, storage and sharing of neuroimaging data using relational databases and distributed computing.

    PubMed

    Hasson, Uri; Skipper, Jeremy I; Wilde, Michael J; Nusbaum, Howard C; Small, Steven L

    2008-01-15

    The increasingly complex research questions addressed by neuroimaging research impose substantial demands on computational infrastructures. These infrastructures need to support management of massive amounts of data in a way that affords rapid and precise data analysis, to allow collaborative research, and to achieve these aims securely and with minimum management overhead. Here we present an approach that overcomes many current limitations in data analysis and data sharing. This approach is based on open source database management systems that support complex data queries as an integral part of data analysis, flexible data sharing, and parallel and distributed data processing using cluster computing and Grid computing resources. We assess the strengths of these approaches as compared to current frameworks based on storage of binary or text files. We then describe in detail the implementation of such a system and provide a concrete description of how it was used to enable a complex analysis of fMRI time series data.

  9. DARK MATTER DISTRIBUTION IN GALAXY GROUPS FROM COMBINED STRONG LENSING AND DYNAMICS ANALYSIS

    SciTech Connect

    Thanjavur, Karun; Crampton, David; Willis, Jon

    2010-05-10

    Using a combined analysis of strong lensing and galaxy dynamics, we characterize the mass distributions and the mass-to-light (M/L) ratios of galaxy groups, virialized structures in the mass range of few x 10{sup 14} M{sub sun}, which form an important transition regime in the hierarchical assembly of mass in {Lambda}CDM cosmology. Our goals are to not only map the mass distributions, but to also test whether the underlying density distribution at this mass scale is dark matter dominated, Navarro-Frenk-White (NFW) like as hypothesized by the standard cosmogony, or isothermal as observed in baryon-rich massive field galaxies. We present details of our lensing + galaxy dynamics formalism built around three representative density profiles, the dark matter dominant NFW and Hernquist distributions, compared with the softened isothermal sphere which matches baryon-rich galaxy scale objects. By testing the effects on the characteristics of these distributions due to variations in their parameters, we show that mass measurements in the core of the group (r/r{sub vir} {approx} 0.2), determined jointly from a lens model and from differential velocity dispersion estimates, may effectively distinguish between these density distributions. We apply our method to multi-object spectroscopy observations of two groups, SL2SJ143000+554648 and SL2SJ143139+553323, drawn from our catalog of galaxy group scale lenses discovered in CFHTLS-Wide imaging. With the lensing and dynamical mass estimates from our observations along with a maximum likelihood estimator built around our model, we estimate the concentration index characterizing each density distribution and the corresponding virial mass of each group. Our likelihood estimation indicates that both groups are dark matter dominant and rejects the isothermal distribution at >>3{sigma} level. For both groups, the estimated i-band M/L ratios of {approx}260 M{sub sun} L{sub sun} {sup -1} are similar to other published values for groups
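
    The discriminating power of a core mass measurement can be illustrated by comparing enclosed-mass profiles of an NFW halo and a singular isothermal sphere normalized to the same virial mass; the concentration and radii below are assumed values, not the fitted group parameters.

    ```python
    # Enclosed mass at r/r_vir ~ 0.2 for NFW vs. singular isothermal sphere,
    # both normalized to the same virial mass. Parameter values are illustrative.
    import numpy as np

    G = 4.30091e-6                     # kpc (km/s)^2 / Msun

    def m_nfw(r, rs, rho0):
        """NFW enclosed mass: 4 pi rho0 rs^3 [ln(1+x) - x/(1+x)], x = r/rs."""
        x = r / rs
        return 4 * np.pi * rho0 * rs**3 * (np.log(1 + x) - x / (1 + x))

    def m_sis(r, sigma):
        """Singular isothermal sphere: M(r) = 2 sigma^2 r / G."""
        return 2 * sigma**2 * r / G

    r_vir, c = 1500.0, 6.0             # kpc; assumed concentration
    rs = r_vir / c
    rho0 = 1.0                         # arbitrary; cancels in the ratio below
    Mvir = m_nfw(r_vir, rs, rho0)
    sigma = np.sqrt(G * Mvir / (2 * r_vir))   # SIS matched to the same Mvir

    r = 0.2 * r_vir
    print("M_NFW(0.2 r_vir)/Mvir =", round(m_nfw(r, rs, rho0) / Mvir, 3))
    print("M_SIS(0.2 r_vir)/Mvir =", round(m_sis(r, sigma) / Mvir, 3))
    ```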

  10. A quantitative quantum-chemical analysis tool for the distribution of mechanical force in molecules

    SciTech Connect

    Stauch, Tim; Dreuw, Andreas

    2014-04-07

    The promising field of mechanochemistry suffers from a general lack of understanding of the distribution and propagation of force in a stretched molecule, which limits its applicability up to the present day. In this article, we introduce the JEDI (Judgement of Energy DIstribution) analysis, which is the first quantum chemical method that provides a quantitative understanding of the distribution of mechanical stress energy among all degrees of freedom in a molecule. The method is carried out on the basis of static or dynamic calculations under the influence of an external force and makes use of a Hessian matrix in redundant internal coordinates (bond lengths, bond angles, and dihedral angles), so that all relevant degrees of freedom of a molecule are included and mechanochemical processes can be interpreted in a chemically intuitive way. The JEDI method is characterized by its modest computational effort, with the calculation of the Hessian being the rate-determining step, and delivers, except for the harmonic approximation, exact ab initio results. We apply the JEDI analysis to several example molecules in both static quantum chemical calculations and Born-Oppenheimer Molecular Dynamics simulations in which molecules are subject to an external force, thus studying not only the distribution and the propagation of strain in mechanically deformed systems, but also gaining valuable insights into the mechanochemically induced isomerization of trans-3,4-dimethylcyclobutene to trans,trans-2,4-hexadiene. The JEDI analysis can potentially be used in the discussion of sonochemical reactions, molecular motors, mechanophores, and photoswitches as well as in the development of molecular force probes.
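
    The energy partition at the heart of a JEDI-type analysis can be sketched in the harmonic approximation: the strain energy 1/2 dq^T H dq in internal coordinates is split into per-coordinate contributions that sum exactly to the total. The Hessian and displacements below are random stand-ins for quantities a real analysis would take from ab initio calculations.

    ```python
    # Harmonic strain-energy partition over internal coordinates,
    # schematic of a JEDI-style analysis.
    import numpy as np

    rng = np.random.default_rng(7)
    n = 6                                    # internal coordinates (bonds/angles)
    A = rng.normal(size=(n, n))
    H = A @ A.T + n * np.eye(n)              # symmetric positive-definite "Hessian"
    dq = rng.normal(scale=0.05, size=n)      # displacements under external force

    E_total = 0.5 * dq @ H @ dq
    E_per_coord = 0.5 * dq * (H @ dq)        # per-coordinate contributions

    print("total strain energy:", round(E_total, 6))
    print("sum of contributions:", round(E_per_coord.sum(), 6))
    print("largest contributor: coordinate", int(np.argmax(E_per_coord)))
    ```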

  11. A quantitative quantum-chemical analysis tool for the distribution of mechanical force in molecules.

    PubMed

    Stauch, Tim; Dreuw, Andreas

    2014-04-07

    The promising field of mechanochemistry suffers from a general lack of understanding of the distribution and propagation of force in a stretched molecule, which limits its applicability up to the present day. In this article, we introduce the JEDI (Judgement of Energy DIstribution) analysis, which is the first quantum chemical method that provides a quantitative understanding of the distribution of mechanical stress energy among all degrees of freedom in a molecule. The method is carried out on the basis of static or dynamic calculations under the influence of an external force and makes use of a Hessian matrix in redundant internal coordinates (bond lengths, bond angles, and dihedral angles), so that all relevant degrees of freedom of a molecule are included and mechanochemical processes can be interpreted in a chemically intuitive way. The JEDI method is characterized by its modest computational effort, with the calculation of the Hessian being the rate-determining step, and delivers, except for the harmonic approximation, exact ab initio results. We apply the JEDI analysis to several example molecules in both static quantum chemical calculations and Born-Oppenheimer Molecular Dynamics simulations in which molecules are subject to an external force, thus studying not only the distribution and the propagation of strain in mechanically deformed systems, but also gaining valuable insights into the mechanochemically induced isomerization of trans-3,4-dimethylcyclobutene to trans,trans-2,4-hexadiene. The JEDI analysis can potentially be used in the discussion of sonochemical reactions, molecular motors, mechanophores, and photoswitches as well as in the development of molecular force probes.

  12. Nanomaterial size distribution analysis via liquid nebulization coupled with ion mobility spectrometry (LN-IMS).

    PubMed

    Jeon, Seongho; Oberreit, Derek R; Van Schooneveld, Gary; Hogan, Christopher J

    2016-02-21

    We apply liquid nebulization (LN) in series with ion mobility spectrometry (IMS, using a differential mobility analyzer coupled to a condensation particle counter) to measure the size distribution functions (the number concentration per unit log diameter) of gold nanospheres in the 5-30 nm range, 70 nm × 11.7 nm gold nanorods, and albumin proteins originally in aqueous suspensions. In prior studies, IMS measurements have only been carried out for colloidal nanoparticles in this size range using electrosprays for aerosolization, as traditional nebulizers produce supermicrometer droplets which leave residue particles from non-volatile species. Residue particles mask the size distribution of the particles of interest. Uniquely, the LN employed in this study uses both online dilution (with dilution factors of up to 10^4) with ultra-high purity water and a ball-impactor to remove droplets larger than 500 nm in diameter. This combination enables hydrosol-to-aerosol conversion preserving the size and morphology of particles, and also enables higher non-volatile residue tolerance than electrospray based aerosolization. Through LN-IMS measurements we show that the size distribution functions of narrowly distributed but similarly sized particles can be distinguished from one another, which is not possible with Nanoparticle Tracking Analysis in the sub-30 nm size range. Through comparison to electron microscopy measurements, we find that the size distribution functions inferred via LN-IMS measurements correspond to the particle sizes coated by surfactants, i.e. as they persist in colloidal suspensions. Finally, we show that the gas phase particle concentrations inferred from IMS size distribution functions are functions only of the liquid phase particle concentration, independent of particle size, shape, and chemical composition. Therefore LN-IMS enables characterization of the size, yield, and polydispersity of sub-30 nm particles.

  13. A Model Based on Environmental Factors for Diameter Distribution in Black Wattle in Brazil

    PubMed Central

    Sanquetta, Carlos Roberto; Behling, Alexandre; Dalla Corte, Ana Paula; Péllico Netto, Sylvio; Rodrigues, Aurelio Lourenço; Simon, Augusto Arlindo

    2014-01-01

    This article discusses the dynamics of the diameter distribution in stands of black wattle throughout the growth cycle using the Weibull probability density function. The parameters of this distribution were related to environmental variables derived from meteorological data and the surface soil horizon, with the aim of finding a diameter distribution model whose coefficients are related to the environmental variables. We found that the diameter distribution of the stand changes only slightly over time and that the estimators of the Weibull function are correlated with various environmental variables, with accumulated rainfall foremost among them. A model was thus obtained in which the estimators of the Weibull function depend on rainfall. Such a function can have important applications, such as simulating growth potential in regions where historical growth data are lacking, as well as the behavior of the stand under different environmental conditions. The model can also be used to project diameter growth based on the rainfall affecting the forest over a certain time period. PMID:24932909
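
    A short sketch of the core fitting step, under the assumption that a two-parameter Weibull is fit to diameters by maximum likelihood and that its scale is then regressed on accumulated rainfall; the data and regression coefficients below are invented, not the study's.

        # Fit a two-parameter Weibull to tree diameters (location fixed at 0)
        # and link its scale to rainfall via an assumed linear model.
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(0)
        diam_cm = stats.weibull_min.rvs(c=2.3, scale=12.0, size=500,
                                        random_state=rng)

        shape, loc, scale = stats.weibull_min.fit(diam_cm, floc=0)
        print(f"shape = {shape:.2f}, scale = {scale:.2f} cm")

        rainfall_mm = np.array([800.0, 1000.0, 1200.0])
        a0, a1 = 5.0, 0.006               # hypothetical coefficients
        print(a0 + a1 * rainfall_mm)      # rainfall-dependent scale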

  14. Empirical analysis on the connection between power-law distributions and allometries for urban indicators

    NASA Astrophysics Data System (ADS)

    Alves, L. G. A.; Ribeiro, H. V.; Lenzi, E. K.; Mendes, R. S.

    2014-09-01

    We report on the connection between power-law distributions and allometries. As first reported in Gomez-Lievano et al. (2012) for the relationship between homicides and population, when urban indicators present asymptotic power-law distributions, they can also display specific allometries among themselves. Here, we present an extensive characterization of this connection for all possible pairs of relationships among twelve urban indicators of Brazilian cities (such as child labor, illiteracy, income, sanitation and unemployment). Our analysis reveals that all of these urban indicators are asymptotically distributed as power laws and that the proposed connection also holds for our data when the allometric relationship displays sufficiently strong correlations. We have also found that not all allometric relationships are independent; they can be understood as consequences of the allometric relationship between the urban indicator and the population size. We further show that the residual fluctuations surrounding the allometries are characterized by an almost constant variance and log-normal distributions.
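
    The allometric check described here amounts to a log-log regression Y = A N^beta followed by an inspection of the residuals; a compact sketch on synthetic data (the exponent and noise level are invented):

        # Estimate an allometric exponent by least squares in log space and
        # inspect the residual spread (synthetic indicator vs. population).
        import numpy as np

        rng = np.random.default_rng(1)
        population = rng.lognormal(mean=11.0, sigma=1.0, size=300)
        indicator = 2e-4 * population**1.15 * rng.lognormal(0.0, 0.3, 300)

        logN, logY = np.log(population), np.log(indicator)
        beta, logA = np.polyfit(logN, logY, 1)
        resid = logY - (logA + beta * logN)
        print(f"beta = {beta:.2f}, residual std = {resid.std():.2f}")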

  15. Analysis of crater distribution in mare units on the lunar far side

    NASA Technical Reports Server (NTRS)

    Walker, A. S.; El-Baz, F.

    1982-01-01

    Mare material is asymmetrically distributed on the moon. The earth-facing hemisphere, where the crust is believed to be 26 km thinner than on the farside, contains substantially more basaltic mare material. Using Lunar Topographic Orthophoto Maps, the thicknesses of the mare material in three farside craters, Aitken (0.59 km), Isaev (1.0 km), and Tsiolkovskiy (1.75 km), were calculated. Crater frequency distributions in five farside mare units (Aitken, Isaev, Lacus Solitudinis, Langemak, and Tsiolkovskiy) and one light plains unit (in Mendeleev) were also studied. Nearly 10,000 farside craters were counted. Analysis of the crater frequency on the light plains unit gives an age of 4.3 billion yr. Crater frequency distributions on the mare units indicate ages of 3.7 and 3.8 billion yr, suggesting that the units are distributed over a narrow time period of approximately 100 million yr. Returned lunar samples from nearside maria give dates as young as 3.1 billion yr. The results of this study suggest that mare basalt emplacement on the far side ceased before it did on the near side.

  16. Nonlinear Trimodal Regression Analysis of Radiodensitometric Distributions to Quantify Sarcopenic and Sequelae Muscle Degeneration

    PubMed Central

    Árnadóttir, Í.; Gíslason, M. K.; Carraro, U.

    2016-01-01

    Muscle degeneration has been consistently identified as an independent risk factor for high mortality in both aging populations and individuals suffering from neuromuscular pathology or injury. While there is much extant literature on its quantification and correlation to comorbidities, a quantitative gold standard for analyses in this regard remains undefined. Herein, we hypothesize that rigorously quantifying entire radiodensitometric distributions elicits more muscle quality information than average values reported in extant methods. This study reports the development and utility of a nonlinear trimodal regression analysis method utilized on radiodensitometric distributions of upper leg muscles from CT scans of a healthy young adult, a healthy elderly subject, and a spinal cord injury patient. The method was then employed with a THA cohort to assess pre- and postsurgical differences in their healthy and operative legs. Results from the initial representative models elicited high degrees of correlation to HU distributions, and regression parameters highlighted physiologically evident differences between subjects. Furthermore, results from the THA cohort echoed physiological justification and indicated significant improvements in muscle quality in both legs following surgery. Altogether, these results highlight the utility of novel parameters from entire HU distributions that could provide insight into the optimal quantification of muscle degeneration. PMID:28115982
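
    A reduced version of the regression idea, assuming the trimodal model is a sum of three Gaussians fit to the HU histogram; the tissue means, spreads, and counts below are invented for illustration.

        # Fit a sum of three Gaussians to a synthetic HU histogram.
        import numpy as np
        from scipy.optimize import curve_fit

        def trimodal(x, a1, m1, s1, a2, m2, s2, a3, m3, s3):
            g = lambda a, m, s: a * np.exp(-0.5 * ((x - m) / s) ** 2)
            return g(a1, m1, s1) + g(a2, m2, s2) + g(a3, m3, s3)

        rng = np.random.default_rng(2)
        hu = np.concatenate([rng.normal(-180, 40, 2000),   # fat-like mode
                             rng.normal(-40, 30, 1500),    # connective
                             rng.normal(55, 20, 4000)])    # lean muscle
        counts, edges = np.histogram(hu, bins=120, range=(-300, 150))
        centers = 0.5 * (edges[:-1] + edges[1:])

        p0 = [75, -180, 40, 75, -40, 30, 300, 55, 20]      # rough guesses
        params, _ = curve_fit(trimodal, centers, counts, p0=p0,
                              maxfev=20000)
        print(np.round(params, 1))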

  17. Further Progress Applying the Generalized Wigner Distribution to Analysis of Vicinal Surfaces

    NASA Astrophysics Data System (ADS)

    Einstein, T. L.; Richards, Howard L.; Cohen, S. D.

    2001-03-01

    Terrace width distributions (TWDs) can be fit well by the generalized Wigner distribution (GWD), generally better than by conventional Gaussians, which offers a convenient way to estimate the dimensionless elastic repulsion strength Ã from σ^2, the TWD variance (T.L. Einstein and O. Pierre-Louis, Surface Sci. 424, L299 (1999)). The GWD σ^2 accurately reproduces values for the two exactly soluble cases, at small Ã and in the asymptotic limit. Taxing numerical simulations show that the GWD σ^2 interpolates well between these limits. Extensive applications have been made to experimental data, especially on Cu (M. Giesen and T.L. Einstein, Surface Sci. 449, 191 (2000)). Recommended analysis procedures are catalogued (H.L. Richards, S.D. Cohen, T.L. Einstein, and M. Giesen, Surf. Sci. 453, 59 (2000)). Extensions of the GWD to multistep distributions are tested, with good agreement for second-neighbor distributions, less good for third (T.L. Einstein, H.L. Richards, S.D. Cohen, and O. Pierre-Louis, Proc. ISSI-PDSC2000, cond-mat/0012xxxxx). Alternatively, step-step correlation functions, about which there is more theoretical information, should be measured.
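
    For reference, the GWD itself is simple to evaluate: P(s) = a s^ρ exp(-b s^2), with b and a fixed by unit mean and normalization; the mapping Ã = ρ(ρ-2)/4 used below is an assumption quoted from the GWD literature cited above, not derived here.

        # Evaluate the generalized Wigner distribution and its variance.
        import numpy as np
        from math import gamma

        def gwd(s, rho):
            b = (gamma((rho + 2) / 2) / gamma((rho + 1) / 2)) ** 2
            a = 2 * b ** ((rho + 1) / 2) / gamma((rho + 1) / 2)
            return a * s**rho * np.exp(-b * s**2)

        s = np.linspace(0.0, 3.0, 3001)
        ds = s[1] - s[0]
        for rho in (2.0, 4.0):   # free-fermion point, stronger repulsion
            p = gwd(s, rho)
            var = np.sum((s - 1.0) ** 2 * p) * ds  # mean is 1 by design
            print(f"rho={rho}: A={rho*(rho-2)/4:.1f}, sigma^2={var:.4f}")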

  18. Mode-distribution analysis of quasielastic neutron scattering and application to liquid water

    NASA Astrophysics Data System (ADS)

    Kikuchi, Tatsuya; Nakajima, Kenji; Ohira-Kawamura, Seiko; Inamura, Yasuhiro; Yamamuro, Osamu; Kofu, Maiko; Kawakita, Yukinobu; Suzuya, Kentaro; Nakamura, Mitsutaka; Arai, Masatoshi

    2013-06-01

    A quasielastic neutron scattering (QENS) experiment is a technique that probes the relationship between time and space in the diffusion dynamics of atoms and molecules. However, in most cases, analyses of QENS data are model dependent, which may distort attempts to elucidate the actual diffusion dynamics. We have developed a method for processing QENS data without a specific model, wherein all modes can be described as combinations of relaxations based on the exponential law. By this method, we can obtain a distribution function B(Q,Γ), which we call the mode-distribution function (MDF), representing the number of relaxation modes and the distributions of the relaxation times in the modes. The deduction of the MDF is based on the maximum entropy method and is very versatile in QENS data analysis. To verify this method, reproducibility was checked against several analytical models, such as a model with a mode of distributed relaxation times, one with two closely located modes, and one represented by the Kohlrausch-Williams-Watts function. We report the first application to experimental data of liquid water. In addition to the two known modes, the existence of a relaxation mode of water molecules with an intermediate time scale has been discovered. We propose that the fast mode might be assigned to an intermolecular motion and that the rotational motion of the water molecules might be assigned to the intermediate mode rather than to the fast mode.
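
    The decomposition at the heart of the MDF can be sketched in a few lines, assuming the usual picture of a QENS spectrum as a weighted superposition of Lorentzians of width Γ; the real analysis runs in the opposite direction, inferring B(Q,Γ) from the measured spectrum by maximum entropy, which is not attempted here.

        # Compose a toy QENS spectrum from a discrete mode distribution.
        import numpy as np

        omega = np.linspace(-2.0, 2.0, 2001)     # energy transfer (meV)
        gammas = [0.02, 0.2, 0.8]                # relaxation rates (meV)
        weights = [0.5, 0.3, 0.2]                # B(Q, Gamma) at one Q

        lorentz = lambda g: (g / np.pi) / (g**2 + omega**2)
        spectrum = sum(w * lorentz(g) for w, g in zip(weights, gammas))
        # ~0.93: unit-area modes minus the tails outside the window
        print(spectrum.sum() * (omega[1] - omega[0]))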

  19. Rural tourism spatial distribution based on multi-criteria decision analysis and GIS

    NASA Astrophysics Data System (ADS)

    Zhang, Hongxian; Yang, Qingsheng

    2008-10-01

    Studying the spatial distribution of rural tourism can provide a scientific basis for decisions on rural economic development. Traditional approaches to tourism spatial distribution have limitations in quantifying priority locations for tourism development on small spatial units: they can only produce overall tourism distribution locations and a simple judgment of whether a location is suitable for tourism development, whereas a development ranking under different decision objectives should also be considered. This paper presents a way to rank locations for rural tourism development spatially by integrating multi-criteria decision analysis (MCDA) and geographic information systems (GIS). Under the objective of developing rural economies in areas that combine inconvenient transportation and undeveloped economies with good tourism resources, such locations should be the first to develop rural tourism. Based on this objective, a tourism development priority utility is calculated for each town with MCDA and GIS, and the towns with the highest utilities are selected to develop rural tourism first. The method was used successfully to rank rural tourism locations in Ningbo City. The results show that MCDA is an effective way to distribute rural tourism spatially under specific decision objectives and that rural tourism can promote economic development.
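
    A weighted-sum utility is one common MCDA scoring rule; under that assumption, the town-ranking step can be sketched as below (criteria, weights, and scores are hypothetical, not the paper's).

        # Rank towns by a weighted-sum development-priority utility.
        import numpy as np

        towns = ["A", "B", "C", "D"]
        # columns: poor transport access, low economy, tourism resources
        scores = np.array([[0.8, 0.9, 0.7],
                           [0.3, 0.4, 0.9],
                           [0.6, 0.7, 0.5],
                           [0.9, 0.8, 0.8]])
        weights = np.array([0.3, 0.3, 0.4])   # assumed decision weights

        utility = scores @ weights
        for t, u in sorted(zip(towns, utility), key=lambda x: -x[1]):
            print(f"town {t}: priority utility {u:.2f}")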

  20. Pore space analysis of NAPL distribution in sand-clay media

    USGS Publications Warehouse

    Matmon, D.; Hayden, N.J.

    2003-01-01

    This paper introduces a conceptual model of clays and non-aqueous phase liquids (NAPLs) at the pore scale that has been developed from a mathematical unit cell model, and direct micromodel observation and measurement of clay-containing porous media. The mathematical model uses a unit cell concept with uniform spherical grains for simulating the sand in the sand-clay matrix (~10% clay). Micromodels made with glass slides and including different clay-containing porous media were used to investigate the two clays (kaolinite and montmorillonite) and NAPL distribution within the pore space. The results were used to understand the distribution of NAPL advancing into initially saturated sand and sand-clay media, and provided a detailed analysis of the pore-scale geometry, pore size distribution, NAPL entry pressures, and the effect of clay on this geometry. Interesting NAPL saturation profiles were observed as a result of the complexity of the pore space geometry with the different packing angles and the presence of clays. The unit cell approach has applications for enhancing the mechanistic understanding and conceptualization, both visually and mathematically, of pore-scale processes such as NAPL and clay distribution. © 2003 Elsevier Science Ltd. All rights reserved.

  1. Analysis of the Effects of Streamwise Lift Distribution on Sonic Boom Signature

    NASA Technical Reports Server (NTRS)

    Yoo, Paul

    2013-01-01

    Investigation of sonic boom has been one of the major areas of study in aeronautics due to the benefits a low-boom aircraft has in both civilian and military applications. This work conducts a numerical analysis of the effects of streamwise lift distribution on the shock coalescence characteristics. A simple wing-canard-stabilator body model is used in the numerical simulation. The streamwise lift distribution is varied by fixing the canard at a deflection angle while trimming the aircraft with the wing and the stabilator at the desired lift coefficient. The lift and the pitching moment coefficients are computed using Missile DATCOM v. 707. The flow field around the wing-canard-stabilator body model is resolved using the OVERFLOW-2 flow solver. Overset/chimera grid topology is used to simplify the grid generation of various configurations representing different streamwise lift distributions. The numerical simulations are performed without viscosity unless it is required for numerical stability. All configurations are simulated at Mach 1.4, an angle of attack of 1.5°, a lift coefficient of 0.05, and a pitching moment coefficient of approximately 0. Four streamwise lift distribution configurations were tested.

  2. Structure analysis and size distribution of particulate matter from candles and kerosene combustion in burning chamber

    NASA Astrophysics Data System (ADS)

    Baitimirova, M.; Osite, A.; Katkevics, J.; Viksna, A.

    2012-08-01

    Burning of candles generates particulate matter of fine dimensions that degrades indoor air quality and may therefore have a harmful impact on human health. In this study, solid aerosol particles from the burning of candles of different compositions and from kerosene combustion were collected in a closed laboratory system. The present work describes the collection of particulate matter for structure analysis and the relationship between the source and the size distribution of the particulate matter. The formation mechanism of the particles and their tendency to agglomerate are also described. Particles obtained from kerosene combustion have a normal size distribution, whereas particles generated from the burning of stearin candles have a distribution shifted towards the finer particle size range. When stearin is added to a paraffin candle, the particle size distribution likewise shifts towards finer particles. Particles obtained from kerosene combustion show a tendency to form agglomerates in a short time, while particles obtained from the burning of candles of different compositions show no such tendency. Particles from candles and kerosene combustion are Aitken- and accumulation-mode particles.

  3. Modeling Multi-Variate Gaussian Distributions and Analysis of Higgs Boson Couplings with the ATLAS Detector

    NASA Astrophysics Data System (ADS)

    Krohn, Olivia; Armbruster, Aaron; Gao, Yongsheng; Atlas Collaboration

    2017-01-01

    Software tools developed for the purpose of modeling CERN LHC pp collision data to aid in its interpretation are presented. Some measurements are not adequately described by a Gaussian distribution; thus an interpretation assuming Gaussian uncertainties will inevitably introduce bias, necessitating analytical tools to recreate and evaluate non-Gaussian features. One example is the measurements of Higgs boson production rates in different decay channels, and the interpretation of these measurements. The ratios of data to Standard Model expectations (μ) for five arbitrary signals were modeled by building five Poisson distributions with mixed signal contributions such that the measured values of μ are correlated. Algorithms were designed to recreate probability distribution functions of μ as multi-variate Gaussians, where the standard deviation (σ) and correlation coefficients (ρ) are parametrized. There was good success with modeling 1-D likelihood contours of μ, and the multi-dimensional distributions were well modeled within 1-σ, but the model began to diverge after 2-σ due to unmerited assumptions in developing ρ. Future plans to improve the algorithms and develop a user-friendly analysis package will also be discussed. NSF International Research Experiences for Students
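
    A stripped-down version of that parametrization, assuming the model is an N-dimensional Gaussian with per-channel σ and a correlation matrix ρ; the numbers are invented stand-ins for fitted values.

        # Evaluate a multivariate-Gaussian log-likelihood for signal
        # strengths mu, with parametrized sigma and rho (invented values).
        import numpy as np

        mu_hat = np.array([1.1, 0.9, 1.2])     # best-fit mu per channel
        sigma = np.array([0.20, 0.15, 0.30])   # parametrized widths
        rho = np.array([[1.0, 0.3, 0.1],
                        [0.3, 1.0, 0.2],
                        [0.1, 0.2, 1.0]])      # parametrized correlations
        cov = np.outer(sigma, sigma) * rho

        def log_like(mu):
            d = mu - mu_hat
            return -0.5 * (d @ np.linalg.solve(cov, d)
                           + np.log(np.linalg.det(2 * np.pi * cov)))

        print(log_like(np.ones(3)))            # Standard Model point mu=1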

  4. A spatial pattern analysis of the halophytic species distribution in an arid coastal environment.

    PubMed

    Badreldin, Nasem; Uria-Diez, J; Mateu, J; Youssef, Ali; Stal, Cornelis; El-Bana, Magdy; Magdy, Ahmed; Goossens, Rudi

    2015-05-01

    Obtaining information about the spatial distribution of desert plants is considered as a serious challenge for ecologists and environmental modeling due to the required intensive field work and infrastructures in harsh and remote arid environments. A new method was applied for assessing the spatial distribution of the halophytic species (HS) in an arid coastal environment. This method was based on the object-based image analysis for a high-resolution Google Earth satellite image. The integration of the image processing techniques and field work provided accurate information about the spatial distribution of HS. The extracted objects were based on assumptions that explained the plant-pixel relationship. Three different types of digital image processing techniques were implemented and validated to obtain an accurate HS spatial distribution. A total of 2703 individuals of the HS community were found in the case study, and approximately 82% were located above an elevation of 2 m. The micro-topography exhibited a significant negative relationship with pH and EC (r = -0.79 and -0.81, respectively, p < 0.001). The spatial structure was modeled using stochastic point processes, in particular a hybrid family of Gibbs processes. A new model is proposed that uses a hard-core structure at very short distances, together with a cluster structure in short-to-medium distances and a Poisson structure for larger distances. This model was found to fit the data perfectly well.

  5. Analysis of the temperature and stress distributions in ceramic window materials subjected to microwave heating

    SciTech Connect

    Ferber, M.K.; Kimrey, H.D.; Becher, P.F.

    1983-07-01

    The temperature and stress distributions generated in ceramic materials currently employed in microwave gyrotron tube windows were determined for a variety of operating conditions. Both edge- and face-cooled windows of either polycrystalline BeO or polycrystalline Al2O3 were considered. The analysis involved three steps. First, a computer program was used to determine the electric field distribution within the window at a given power level and frequency (TE02 wave propagation assumed). This program was capable of describing both the radial and axial dependence of the electric field; the effects of multiple internal reflections at the various dielectric interfaces were also accounted for. Second, the field distribution was used to derive an expression for the heat generated per unit volume per unit time within the window due to dielectric losses, and a generalized heat conduction computer code was then used to compute the temperature distribution based on the heat generation function. Third, the stresses were determined from the temperature profiles using analytical expressions or a finite-element computer program. Steady-state temperature and stress profiles were computed for the face-cooled and edge-cooled windows.
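
    Steps two and three can be miniaturized for a uniform field: the volumetric dielectric heating is q = ω ε0 ε'' E_rms^2, and for a slab with both faces held at the coolant temperature the steady 1-D conduction solution peaks at q(L/2)^2/(2k) at the mid-plane. All material and field values below are invented, not those of the study.

        # Dielectric heating and steady 1-D slab temperature rise
        # (toy values throughout).
        import numpy as np

        eps0 = 8.854e-12
        w = 2 * np.pi * 28e9        # rad/s, assumed gyrotron frequency
        eps_loss = 0.002            # assumed effective loss factor eps''
        E_rms = 1.0e5               # V/m, assumed rms field in the window
        q = w * eps0 * eps_loss * E_rms**2
        print(f"q = {q/1e6:.1f} MW/m^3")

        k, L = 40.0, 2e-3           # W/m/K (BeO-like), thickness (m)
        dT_peak = q * (L / 2) ** 2 / (2 * k)
        print(f"peak mid-plane temperature rise: {dT_peak:.2f} K")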

  6. Particle size distribution of brown and white rice during gastric digestion measured by image analysis.

    PubMed

    Bornhorst, Gail M; Kostlan, Kevin; Singh, R Paul

    2013-09-01

    The particle size distribution of foods during gastric digestion indicates the amount of physical breakdown that occurred due to the peristaltic movement of the stomach walls in addition to the breakdown that initially occurred during oral processing. The objective of this study was to present an image analysis technique that was rapid, simple, and could distinguish between food components (that is, rice kernel and bran layer in brown rice). The technique was used to quantify particle breakdown of brown and white rice during gastric digestion in growing pigs (used as a model for an adult human) over 480 min of digestion. The particle area distributions were fit to a Rosin-Rammler distribution function. Brown and white rice exhibited considerable breakdown as the number of particles per image decreased over time. The median particle area (x50) increased during digestion, suggesting a gastric sieving phenomenon, where small particles were emptied and larger particles were retained for additional breakdown. Brown rice breakdown was further quantified by an examination of the bran layer fragments and rice grain pieces. The percentage of total particle area composed of bran layer fragments was greater in the distal stomach than the proximal stomach in the first 120 min of digestion. The results of this study showed that image analysis may be used to quantify particle breakdown of a soft food product during gastric digestion, discriminate between different food components, and help to clarify the role of food structure and processing in food breakdown during gastric digestion.
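
    The Rosin-Rammler fit mentioned here is a one-liner once the empirical cumulative fraction is built; a sketch with synthetic areas (the form F(x) = 1 - exp(-(x/xp)^n) and the median relation are standard, everything else is invented):

        # Fit a Rosin-Rammler CDF to particle areas and report the median.
        import numpy as np
        from scipy.optimize import curve_fit

        rng = np.random.default_rng(3)
        areas = rng.weibull(1.4, 800) * 2.0        # particle areas, mm^2

        x = np.sort(areas)
        F = np.arange(1, x.size + 1) / x.size      # empirical cumulative

        rr = lambda x, xp, n: 1.0 - np.exp(-(x / xp) ** n)
        (xp, n), _ = curve_fit(rr, x, F, p0=[1.0, 1.0])
        x50 = xp * np.log(2) ** (1 / n)            # median particle area
        print(f"xp = {xp:.2f} mm^2, n = {n:.2f}, x50 = {x50:.2f} mm^2")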

  7. Statistical analysis of secondary particle distributions in relativistic nucleus-nucleus collisions

    NASA Technical Reports Server (NTRS)

    Mcguire, Stephen C.

    1987-01-01

    Several statistical techniques used to characterize structure in the angular distributions of secondary particles from nucleus-nucleus collisions in the energy range 24 to 61 GeV/nucleon are described. The objective of this work was to determine whether there are correlations between emitted particle intensity and angle that may be used to support the existence of the quark gluon plasma. The techniques include chi-square null hypothesis tests, the method of discrete Fourier transform analysis, and fluctuation analysis. We have also used the method of composite unit vectors to test for azimuthal asymmetry in a data set of 63 JACEE-3 events. Each method is presented in a manner that provides the reader with some practical detail regarding its application. Of those events with relatively high statistics, an Fe event at 55 GeV/nucleon was found to possess an azimuthal distribution with a highly non-random structure. No evidence of non-statistical fluctuations was found in the pseudo-rapidity distributions of the events studied. It is seen that the most effective application of these methods relies upon the availability of many events or single events that possess very high multiplicities.

  8. Spatio-Temporal Distribution Characteristics and Trajectory Similarity Analysis of Tuberculosis in Beijing, China

    PubMed Central

    Li, Lan; Xi, Yuliang; Ren, Fu

    2016-01-01

    Tuberculosis (TB) is an infectious disease with one of the highest reported incidences in China. The detection of the spatio-temporal distribution characteristics of TB is indicative of its prevention and control conditions. Trajectory similarity analysis detects variations and loopholes in prevention and provides urban public health officials and related decision makers more information for the allocation of public health resources and the formulation of prioritized health-related policies. This study analysed the spatio-temporal distribution characteristics of TB from 2009 to 2014 by utilizing spatial statistics, spatial autocorrelation analysis, and space-time scan statistics. Spatial statistics measured the TB incidence rate (TB patients per 100,000 residents) at the district level to determine its spatio-temporal distribution and to identify characteristics of change. Spatial autocorrelation analysis was used to detect global and local spatial autocorrelations across the study area. Purely spatial, purely temporal and space-time scan statistics were used to identify purely spatial, purely temporal and spatio-temporal clusters of TB at the district level. The other objective of this study was to compare the trajectory similarities between the incidence rates of TB and new smear-positive (NSP) TB patients in the resident population (NSPRP)/new smear-positive TB patients in the TB patient population (NSPTBP)/retreated smear-positive (RSP) TB patients in the resident population (RSPRP)/retreated smear-positive TB patients in the TB patient population (RSPTBP) to detect variations and loopholes in TB prevention and control among the districts in Beijing. The incidence rates in Beijing exhibited a gradual decrease from 2009 to 2014. Although global spatial autocorrelation was not detected overall across all of the districts of Beijing, individual districts did show evidence of local spatial autocorrelation: Chaoyang and Daxing were Low-Low districts over the six
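
    Global spatial autocorrelation of district incidence rates is typically summarized by Moran's I; a toy four-district computation with a row-standardized adjacency matrix (all values invented) looks like:

        # Global Moran's I for district-level incidence rates (toy data).
        import numpy as np

        rates = np.array([60.0, 55.0, 30.0, 28.0])   # per 100,000
        W = np.array([[0, 1, 0, 0],
                      [1, 0, 1, 0],
                      [0, 1, 0, 1],
                      [0, 0, 1, 0]], dtype=float)    # adjacency
        W /= W.sum(axis=1, keepdims=True)            # row-standardize

        z = rates - rates.mean()
        I = (z @ W @ z) / (z @ z) * (len(z) / W.sum())
        print(f"Moran's I = {I:.2f}")                # >0: clustering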

  9. Independent Orbiter Assessment (IOA): Analysis of the electrical power distribution and control subsystem, volume 1

    NASA Technical Reports Server (NTRS)

    Schmeckpeper, K. R.

    1987-01-01

    The results of the Independent Orbiter Assessment (IOA) of the Failure Modes and Effects Analysis (FMEA) and Critical Items List (CIL) are presented. The IOA approach features a top-down analysis of the hardware to determine failure modes, criticality, and potential critical items. To preserve independence, this analysis was accomplished without reliance upon the results contained within the NASA FMEA/CIL documentation. This report documents the independent analysis results corresponding to the Orbiter Electrical Power Distribution and Control (EPD and C) hardware. The EPD and C hardware performs the functions of distributing, sensing, and controlling 28 volt DC power and of inverting, distributing, sensing, and controlling 117 volt 400 Hz AC power to all Orbiter subsystems from the three fuel cells in the Electrical Power Generation (EPG) subsystem. Each level of hardware was evaluated and analyzed for possible failure modes and effects. Criticality was assigned based upon the severity of the effect for each failure mode. Of the 1671 failure modes analyzed, 9 single failures were determined to result in loss of crew or vehicle. Three single failures unique to intact abort were determined to result in possible loss of the crew or vehicle. A possible loss of mission could result if any of 136 single failures occurred. Six of the criticality 1/1 failures are in two rotary and two pushbutton switches that control External Tank and Solid Rocket Booster separation. The other 6 criticality 1/1 failures are fuses, one each per Aft Power Control Assembly (APCA) 4, 5, and 6 and one each per Forward Power Control Assembly (FPCA) 1, 2, and 3, that supply power to certain Main Propulsion System (MPS) valves and Forward Reaction Control System (RCS) circuits.

  10. Non-negative factor analysis supporting the interpretation of elemental distribution images acquired by XRF

    NASA Astrophysics Data System (ADS)

    Alfeld, Matthias; Wahabzada, Mirwaes; Bauckhage, Christian; Kersting, Kristian; Wellenreuther, Gerd; Falkenberg, Gerald

    2014-04-01

    Stacks of elemental distribution images acquired by XRF can be difficult to interpret if they contain high degrees of redundancy and components differing in their quantitative but not qualitative elemental composition. Factor analysis, mainly in the form of Principal Component Analysis (PCA), has been used to reduce the level of redundancy and highlight correlations. PCA, however, does not yield physically meaningful representations, as they often contain negative values. This limitation can be overcome by employing factor analysis that is restricted to non-negativity. In this paper we present the first application of the Python Matrix Factorization Module (pymf) to XRF data. This is done in a case study on the painting Saul and David from the studio of Rembrandt van Rijn. We show how the discrimination between two different Co-containing compounds with minimum user intervention and a priori knowledge is supported by Non-Negative Matrix Factorization (NMF).
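
    The factorization itself is compact; the sketch below uses scikit-learn's NMF as a generic stand-in for pymf (which the paper actually uses) on a synthetic two-compound stack of elemental maps.

        # Non-negative factorization of pixel-by-element XRF data.
        import numpy as np
        from sklearn.decomposition import NMF

        rng = np.random.default_rng(4)
        npix, nelem = 64 * 64, 6
        signatures = np.array([[5, 0, 2, 0, 1, 0],     # compound 1
                               [0, 4, 2, 3, 0, 1]],    # compound 2
                              dtype=float)
        abundance = rng.random((npix, 2))
        X = abundance @ signatures + 0.01 * rng.random((npix, nelem))

        model = NMF(n_components=2, init="nndsvda", max_iter=500,
                    random_state=0)
        W = model.fit_transform(X)      # per-pixel abundances
        H = model.components_           # recovered elemental signatures
        print(np.round(H, 2))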

  11. Iterative Monte Carlo analysis of spin-dependent parton distributions

    DOE PAGES

    Sato, Nobuo; Melnitchouk, Wally; Kuhn, Sebastian E.; ...

    2016-04-05

    We present a comprehensive new global QCD analysis of polarized inclusive deep-inelastic scattering, including the latest high-precision data on longitudinal and transverse polarization asymmetries from Jefferson Lab and elsewhere. The analysis is performed using a new iterative Monte Carlo fitting technique which generates stable fits to polarized parton distribution functions (PDFs) with statistically rigorous uncertainties. Inclusion of the Jefferson Lab data leads to a reduction in the PDF errors for the valence and sea quarks, as well as in the gluon polarization uncertainty at x ≳ 0.1. Furthermore, the study also provides the first determination of the flavor-separated twist-3 PDFs and the d2 moment of the nucleon within a global PDF analysis.

  12. Iterative Monte Carlo analysis of spin-dependent parton distributions

    SciTech Connect

    Sato, Nobuo; Melnitchouk, Wally; Kuhn, Sebastian E.; Ethier, Jacob J.; Accardi, Alberto

    2016-04-05

    We present a comprehensive new global QCD analysis of polarized inclusive deep-inelastic scattering, including the latest high-precision data on longitudinal and transverse polarization asymmetries from Jefferson Lab and elsewhere. The analysis is performed using a new iterative Monte Carlo fitting technique which generates stable fits to polarized parton distribution functions (PDFs) with statistically rigorous uncertainties. Inclusion of the Jefferson Lab data leads to a reduction in the PDF errors for the valence and sea quarks, as well as in the gluon polarization uncertainty at x ≳ 0.1. Furthermore, the study also provides the first determination of the flavor-separated twist-3 PDFs and the d2 moment of the nucleon within a global PDF analysis.
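
    A generic stand-in for the iterative Monte Carlo idea: refit many pseudo-data replicas and read the parameter uncertainty from the spread of fits (a toy one-parameter shape; the actual machinery of the analysis is far richer).

        # Bootstrap-style Monte Carlo fitting of a toy PDF-like shape.
        import numpy as np

        rng = np.random.default_rng(5)
        x = np.linspace(0.05, 0.9, 40)
        sigma = 0.02
        y_obs = 1.5 * x * (1 - x) ** 3 + rng.normal(0, sigma, x.size)

        ps = np.linspace(1, 5, 401)        # grid over the exponent p
        fits = []
        for _ in range(1000):
            y = y_obs + rng.normal(0, sigma, x.size)   # pseudo-data
            # model y = N * x * (1 - x)^p with N profiled analytically
            chi2 = [np.sum((y - (y @ f) / (f @ f) * f) ** 2)
                    for f in (x * (1 - x) ** p for p in ps)]
            fits.append(ps[int(np.argmin(chi2))])
        print(f"p = {np.mean(fits):.2f} +/- {np.std(fits):.2f}")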

  13. Quantitative analysis of virus-like particle size and distribution by field-flow fractionation.

    PubMed

    Chuan, Yap P; Fan, Yuan Y; Lua, Linda; Middelberg, Anton P J

    2008-04-15

    Asymmetric flow field-flow fractionation (AFFFF) coupled with multiple-angle light scattering (MALS) is a powerful technique showing potential for the analysis of pharmaceutically-relevant virus-like particles (VLPs). A lack of published methods, and concerns that membrane adsorption during sample fractionation may cause sample aggregation, have limited widespread acceptance. Here we report a reliable optimized method for VLP analysis using AFFFF-MALS, and benchmark it against dynamic light scattering (DLS) and transmission electron microscopy (TEM). By comparing chemically identical VLPs having very different quaternary structure, sourced from both bacteria and insect cells, we show that optimized AFFFF analysis does not cause significant aggregation, and that accurate size and distribution information can be obtained for heterogeneous samples in a way not possible with TEM and DLS. Optimized AFFFF thus provides a quantitative way to monitor batch consistency for new vaccine products, and rapidly provides unique information on the whole population of particles within a sample.

  14. Exergy Analysis of the Cryogenic Helium Distribution System for the Large Hadron Collider (lhc)

    NASA Astrophysics Data System (ADS)

    Claudet, S.; Lebrun, Ph.; Tavian, L.; Wagner, U.

    2010-04-01

    The Large Hadron Collider (LHC) at CERN features the world's largest helium cryogenic system, spreading over the 26.7 km circumference of the superconducting accelerator. With a total equivalent capacity of 145 kW at 4.5 K, including 18 kW at 1.8 K, the LHC refrigerators produce an unprecedented exergetic load, which must be distributed efficiently to the magnets in the tunnel over the 3.3 km length of each of the eight independent sectors of the machine. We recall the main features of the LHC cryogenic helium distribution system at its different temperature levels and present its exergy analysis, enabling us to assess second-law efficiency and identify the main remaining sources of irreversibility.
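
    The exergetic weight of such loads follows from the Carnot factor Ex = Q (T0/T - 1); with an assumed ambient reference of T0 = 300 K, each watt removed at 1.8 K costs roughly 2.5 times the exergy of a watt removed at 4.5 K.

        # Minimum reversible (exergetic) power per watt of cryogenic load.
        T0 = 300.0                                 # K, assumed ambient

        def exergy_per_watt(T_cold):
            return T0 / T_cold - 1.0               # Carnot factor

        for T in (4.5, 1.8):
            print(f"{T} K: {exergy_per_watt(T):.1f} W exergy per W load")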

  15. Distributed and/or grid-oriented approach to BTeV data analysis

    SciTech Connect

    Joel N. Butler

    2002-12-23

    The BTeV collaboration will record approximately 2 petabytes of raw data per year. It plans to analyze this data using the distributed resources of the collaboration as well as dedicated resources, primarily residing in the very large BTeV trigger farm, and resources accessible through the developing world-wide data grid. The data analysis system is being designed from the very start with this approach in mind. In particular, we plan a fully disk-based data storage system with multiple copies of the data distributed across the collaboration to provide redundancy and to optimize access. We will also position ourselves to take maximum advantage of shared systems, as well as dedicated systems, at our collaborating institutions.

  16. Analysis and modeling of information flow and distributed expertise in space-related operations.

    PubMed

    Caldwell, Barrett S

    2005-01-01

    Evolving space operations requirements and mission planning for long-duration expeditions require detailed examinations and evaluations of information flow dynamics, knowledge-sharing processes, and information technology use in distributed expert networks. This paper describes the work conducted with flight controllers in the Mission Control Center (MCC) of NASA's Johnson Space Center. This MCC work describes the behavior of experts in a distributed supervisory coordination framework, which extends supervisory control/command and control models of human task performance. Findings from this work are helping to develop analysis techniques, information architectures, and system simulation capabilities for knowledge sharing in an expert community. These findings are being applied to improve knowledge-sharing processes applied to a research program in advanced life support for long-duration space flight. Additional simulation work is being developed to create interoperating modules of information flow and novice/expert behavior patterns.

  17. Methods and apparatuses for information analysis on shared and distributed computing systems

    DOEpatents

    Bohn, Shawn J [Richland, WA; Krishnan, Manoj Kumar [Richland, WA; Cowley, Wendy E [Richland, WA; Nieplocha, Jarek [Richland, WA

    2011-02-22

    Apparatuses and computer-implemented methods for analyzing, on shared and distributed computing systems, information comprising one or more documents are disclosed according to some aspects. In one embodiment, information analysis can comprise distributing one or more distinct sets of documents among each of a plurality of processes, wherein each process performs operations on a distinct set of documents substantially in parallel with other processes. Operations by each process can further comprise computing term statistics for terms contained in each distinct set of documents, thereby generating a local set of term statistics for each distinct set of documents. Still further, operations by each process can comprise contributing the local sets of term statistics to a global set of term statistics, and participating in generating a major term set from an assigned portion of a global vocabulary.
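
    The local-to-global flow of term statistics can be mimicked with an ordinary process pool standing in for the shared or distributed system of the patent (the document sets and all other details are invented):

        # Per-process local term statistics merged into a global set.
        from collections import Counter
        from multiprocessing import Pool

        def local_stats(docs):
            c = Counter()
            for d in docs:
                c.update(d.lower().split())
            return c

        if __name__ == "__main__":
            doc_sets = [["the cat sat", "the dog ran"],
                        ["a cat ran", "the bird flew"]]
            with Pool(2) as pool:
                local = pool.map(local_stats, doc_sets)
            global_stats = sum(local, Counter())   # contribute to global
            print(global_stats.most_common(3))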

  18. Geographic Distribution of Leishmania Species in Ecuador Based on the Cytochrome B Gene Sequence Analysis.

    PubMed

    Kato, Hirotomo; Gomez, Eduardo A; Martini-Robles, Luiggi; Muzzio, Jenny; Velez, Lenin; Calvopiña, Manuel; Romero-Alvarez, Daniel; Mimori, Tatsuyuki; Uezato, Hiroshi; Hashiguchi, Yoshihisa

    2016-07-01

    A countrywide epidemiological study was performed to elucidate the current geographic distribution of causative species of cutaneous leishmaniasis (CL) in Ecuador by using FTA card-spotted samples and smear slides as DNA sources. Putative Leishmania in 165 samples collected from patients with CL in 16 provinces of Ecuador were examined at the species level based on the cytochrome b gene sequence analysis. Of these, 125 samples were successfully identified as Leishmania (Viannia) guyanensis, L. (V.) braziliensis, L. (V.) naiffi, L. (V.) lainsoni, and L. (Leishmania) mexicana. Two dominant species, L. (V.) guyanensis and L. (V.) braziliensis, were widely distributed in Pacific coast subtropical and Amazonian tropical areas, respectively. Recently reported L. (V.) naiffi and L. (V.) lainsoni were identified in Amazonian areas, and L. (L.) mexicana was identified in an Andean highland area. Importantly, the present study demonstrated that cases of L. (V.) braziliensis infection are increasing in Pacific coast areas.

  19. Geographic Distribution of Leishmania Species in Ecuador Based on the Cytochrome B Gene Sequence Analysis

    PubMed Central

    Kato, Hirotomo; Gomez, Eduardo A.; Martini-Robles, Luiggi; Muzzio, Jenny; Velez, Lenin; Calvopiña, Manuel; Romero-Alvarez, Daniel; Mimori, Tatsuyuki; Uezato, Hiroshi; Hashiguchi, Yoshihisa

    2016-01-01

    A countrywide epidemiological study was performed to elucidate the current geographic distribution of causative species of cutaneous leishmaniasis (CL) in Ecuador by using FTA card-spotted samples and smear slides as DNA sources. Putative Leishmania in 165 samples collected from patients with CL in 16 provinces of Ecuador were examined at the species level based on the cytochrome b gene sequence analysis. Of these, 125 samples were successfully identified as Leishmania (Viannia) guyanensis, L. (V.) braziliensis, L. (V.) naiffi, L. (V.) lainsoni, and L. (Leishmania) mexicana. Two dominant species, L. (V.) guyanensis and L. (V.) braziliensis, were widely distributed in Pacific coast subtropical and Amazonian tropical areas, respectively. Recently reported L. (V.) naiffi and L. (V.) lainsoni were identified in Amazonian areas, and L. (L.) mexicana was identified in an Andean highland area. Importantly, the present study demonstrated that cases of L. (V.) braziliensis infection are increasing in Pacific coast areas. PMID:27410039

  20. A distributed analysis and visualization system for model and observational data

    NASA Technical Reports Server (NTRS)

    Wilhelmson, Robert; Koch, Steven

    1992-01-01

    The objective of this proposal is to develop an integrated and distributed analysis and display software system which can be applied to all areas of the Earth System Science to study numerical model and earth observational data from storm to global scale. This system will be designed to be easy to use, portable, flexible and easily extensible and to adhere to current and emerging standards whenever possible. It will provide an environment for visualization of the massive amounts of data generated from satellites and other observational field measurements and from model simulations during or after their execution. Two- and three-dimensional animation will also be provided. This system will be based on a widely used software package from NASA called GEMPAK and prototype software for three dimensional interactive displays built at NCSA. The underlying foundation of the system will be a set of software libraries which can be distributed across a UNIX based supercomputer and workstations.

  1. A distributed analysis and visualization system for model and observational data

    NASA Technical Reports Server (NTRS)

    Wilhelmson, Robert; Koch, Steven

    1993-01-01

    The objective of this proposal is to develop an integrated and distributed analysis and display software system which can be applied to all areas of the Earth System Science to study numerical model and earth observational data from storm to global scale. This system will be designed to be easy to use, portable, flexible and easily extensible and designed to adhere to current and emerging standards whenever possible. It will provide an environment for visualization of the massive amounts of data generated from satellites and other observational field measurements and from model simulations during or after their execution. Two- and three-dimensional animation will also be provided. This system will be based on a widely used software package from NASA called GEMPAK and prototype software for three-dimensional interactive displays built at NCSA. The underlying foundation of the system will be a set of software libraries which can be distributed across a UNIX based supercomputer and workstations.

  2. MinMaxDM distributions for an analysis of the tensile strength of a unidirectional composite

    NASA Astrophysics Data System (ADS)

    Paramonov, Yu.; Andersons, J.; Kleinhofs, M.; Blumbergs, I.

    2010-09-01

    An analysis of the tensile strength of fiber or fiber-bundle specimens is presented. The specimens are modeled as chains of links consisting of longitudinal elements (LEs) with different cumulative distribution functions of strength, corresponding to the presence and absence of defects. Each link is considered as a system of parallel LEs, a fraction of which can have defects. In the simplest case, the strength of defective elements is assumed equal to zero. The strength of a link is determined by the maximum average stress the link can sustain under a growing load; to calculate this stress, the randomized Daniels model or the theory of Markov chains is used. The strength of a specimen is determined by the minimum strength of its links. The concept of a MinMaxDM family of distribution functions is introduced. A numerical example of processing experimental results for a monolayer of carbon bundles is presented.
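
    In the simplest zero-strength-defect case the model is easy to Monte Carlo: each link is an equal-load-sharing bundle whose strength is the maximum over k of (n-k) s_(k)/n, and the specimen takes the minimum over its links (all parameters below are invented).

        # Chain-of-bundles Monte Carlo with a random defective fraction.
        import numpy as np

        rng = np.random.default_rng(6)

        def link_strength(n=50, p_defect=0.1):
            s = rng.weibull(5.0, n) * 100.0     # element strengths (MPa)
            s[rng.random(n) < p_defect] = 0.0   # defective elements
            s.sort()
            # bundle sustains (n - k) surviving elements at stress s[k]
            return max((n - k) * s[k] / n for k in range(n))

        spec = [min(link_strength() for _ in range(10))
                for _ in range(200)]
        print(f"mean specimen strength: {np.mean(spec):.1f} MPa")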

  3. Reliability Analysis of Uniaxially Ground Brittle Materials

    NASA Technical Reports Server (NTRS)

    Salem, Jonathan A.; Nemeth, Noel N.; Powers, Lynn M.; Choi, Sung R.

    1995-01-01

    The fast fracture strength distribution of uniaxially ground, alpha silicon carbide was investigated as a function of grinding angle relative to the principal stress direction in flexure. Both as-ground and ground/annealed surfaces were investigated. The resulting flexural strength distributions were used to verify reliability models and predict the strength distribution of larger plate specimens tested in biaxial flexure. Complete fractography was done on the specimens. Failures occurred from agglomerates, machining cracks, or hybrid flaws that consisted of a machining crack located at a processing agglomerate. Annealing eliminated failures due to machining damage. Reliability analyses were performed using two and three parameter Weibull and Batdorf methodologies. The Weibull size effect was demonstrated for machining flaws. Mixed mode reliability models reasonably predicted the strength distributions of uniaxial flexure and biaxial plate specimens.
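
    The two-parameter part of such an analysis reduces to a Weibull fit plus the size-effect scaling sigma_2 = sigma_1 (V1/V2)^(1/m); a sketch with synthetic strengths (modulus, scale, and volumes invented):

        # Weibull MLE for flexural strengths and a size-effect prediction.
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(7)
        strengths = stats.weibull_min.rvs(c=12.0, scale=450.0, size=30,
                                          random_state=rng)   # MPa

        m, loc, s0 = stats.weibull_min.fit(strengths, floc=0)
        print(f"Weibull modulus m = {m:.1f}, "
              f"characteristic strength = {s0:.0f} MPa")

        V1, V2 = 1.0, 10.0          # relative effective volumes
        print(f"at 10x volume: {s0 * (V1 / V2) ** (1 / m):.0f} MPa")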

  4. A Grid-based solution for management and analysis of microarrays in distributed experiments

    PubMed Central

    Porro, Ivan; Torterolo, Livia; Corradi, Luca; Fato, Marco; Papadimitropoulos, Adam; Scaglione, Silvia; Schenone, Andrea; Viti, Federica

    2007-01-01

    Several systems have been presented in recent years in order to manage the complexity of large microarray experiments. Although good results have been achieved, most systems tend to be lacking in one or more areas. A Grid based approach may provide a shared, standardized and reliable solution for storage and analysis of biological data, in order to maximize the results of experimental efforts. A Grid framework has therefore been adopted due to the necessity of remotely accessing large amounts of distributed data as well as to scale computational performances for terabyte datasets. Two different biological studies have been planned in order to highlight the benefits that can emerge from our Grid based platform. The described environment relies on storage services and computational services provided by the gLite Grid middleware. The Grid environment is also able to exploit the added value of metadata in order to let users better classify and search experiments. A state-of-art Grid portal has been implemented in order to hide the complexity of framework from end users and to make them able to easily access available services and data. The functional architecture of the portal is described. As a first test of the system performances, a gene expression analysis has been performed on a dataset of Affymetrix GeneChip® Rat Expression Array RAE230A, from the ArrayExpress database. The sequence of analysis includes three steps: (i) group opening and image set uploading, (ii) normalization, and (iii) model based gene expression (based on PM/MM difference model). Two different Linux versions (sequential and parallel) of the dChip software have been developed to implement the analysis and have been tested on a cluster. From the results, it emerges that the parallelization of the analysis process and the execution of parallel jobs on distributed computational resources actually improve the performances. Moreover, the Grid environment has been tested both against the possibility of

  5. Statistical Distribution of Inflation on Lava Flows: Analysis of Flow Surfaces on Earth and Mars

    NASA Technical Reports Server (NTRS)

    Glazel, L. S.; Anderson, S. W.; Stofan, E. R.; Baloga, S.

    2003-01-01

    The surface morphology of a lava flow results from processes that take place during the emplacement of the flow. Certain types of features, such as tumuli, lava rises and lava rise pits, are indicators of flow inflation or endogenous growth of a lava flow. Tumuli in particular have been identified as possible indicators of tube location, indicating that their distribution on the surface of a lava flow is a function of the internal pathways of lava present during flow emplacement. However, the distribution of tumuli on lava flows has not been examined in a statistically thorough manner. In order to more rigorously examine the distribution of tumuli on a lava flow, we examined a discrete flow lobe with numerous lava rises and tumuli on the 1969-1974 Mauna Ulu flow at Kilauea, Hawaii. The lobe is located in the distal portion of the flow below Holei Pali, which is characterized by hummocky pahoehoe flows emplaced from tubes. We chose this flow due to its discrete nature allowing complete mapping of surface morphologies, well-defined boundaries, well-constrained emplacement parameters, and known flow thicknesses. In addition, tube locations for this Mauna Ulu flow were mapped by Holcomb (1976) during flow emplacement. We also examine the distribution of tumuli on the distal portion of the hummocky Thrainsskjoldur flow field provided by Rossi and Gudmundsson (1996). Analysis of the Mauna Ulu and Thrainsskjoldur flow lobes and the availability of high-resolution MOC images motivated us to look for possible tumuli-dominated flow lobes on the surface of Mars. We identified a MOC image of a lava flow south of Elysium Mons with features morphologically similar to tumuli. The flow is characterized by raised elliptical to circular mounds, some with axial cracks, that are similar in size to the tumuli measured on Earth. One potential avenue of determining whether they are tumuli is to look at the spatial distribution to see if any patterns similar to those of tumuli

  6. Using occlusal wear information and finite element analysis to investigate stress distributions in human molars.

    PubMed

    Benazzi, Stefano; Kullmer, Ottmar; Grosse, Ian R; Weber, Gerhard W

    2011-09-01

    Simulations based on finite element analysis (FEA) have attracted increasing interest in dentistry and dental anthropology for evaluating the stress and strain distribution in teeth under occlusal loading conditions. Nonetheless, FEA is usually applied without considering changes in contacts between antagonistic teeth during the occlusal power stroke. In this contribution we show how occlusal information can be used to investigate the stress distribution with 3D FEA in lower first molars (M1). The antagonistic crowns M1 and P2-M1 of two dried modern human skulls were scanned by μCT in maximum intercuspation (centric occlusion) contact. A virtual analysis of the occlusal power stroke between M1 and P2-M1 was carried out in the Occlusal Fingerprint Analyser (OFA) software, and the occlusal trajectory path was recorded, while contact areas per time-step were visualized and quantified. Stress distributions of the M1 in selected occlusal stages were analyzed in Strand7, considering occlusal information taken from OFA results for individual loading direction and loading area. Our FEA results show that the stress pattern changes considerably during the power stroke, suggesting that wear facets have a crucial influence on the distribution of stress on the whole tooth. Grooves and fissures on the occlusal surface are seen as critical locations, as tensile stresses are concentrated at these features. Properly accounting for the power stroke kinematics of occluding teeth results in quite different results (less tensile stresses in the crown) than usual loading scenarios based on parallel forces to the long axis of the tooth. This leads to the conclusion that functional studies considering kinematics of teeth are important to understand biomechanics and interpret morphological adaptation of teeth.

  7. Analysis of non-uniform current distribution effects in multistage cable-in-conduit conductors

    NASA Astrophysics Data System (ADS)

    Mitchell, N.

    1999-09-01

    An analysis procedure has been developed for non-uniform current effects in superconducting cables that contain many individual strands in limited electrical contact with each other. The procedure uses an approximation to the electrical diffusion equation to produce a lumped circuit model of the cable and a simplified zero dimensional heat balance equation to provide overall predictions of the cable stability to external disturbances. It is fast enough to be applied to the multistage cables (>1000 individual strands) that are required for large high field magnets. A model for the initial current distribution in such cables assumes that in steady state or slow ramp-up conditions the current in individual strands is limited to the critical value by the strand resistance. The distribution of current carrying strands is determined by the resistance distribution at the terminations, the cable transverse conductivity and variations in inductive coupling between individual strands. This model is applied to consider the effect of these parameters on the stability to short thermal disturbances. The particular case of the ITER Nb3Sn CS model coil 13 T, 40 kA cable-in-conduit conductor is analysed and it is shown that above a certain current level the cables can sometimes show a sharp drop in stability, qualitatively consistent with results observed on short sample tests. This stability drop is much more severe than that characterised by the conventional 'well-cooled to ill-cooled' transition and represents the limit of steady state or slow ramp-up operation. The stability cut-off current is shown to be a function of the copper fraction in the cable, the uniformity of the current carrying strand distribution and the cable transverse conductance distribution. NbTi conductors with a similar configuration show the same behaviour.
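
    The zero-dimensional heat balance can be caricatured in a few lines: dT/dt = (Q_Joule - hA (T - T_bath)) / (mc), with Joule heating switched on above an assumed current-sharing temperature. Every number below is invented, and a real stability analysis couples this balance to the non-uniform current distribution.

        # Zero-dimensional stability check for a disturbed cable segment.
        mc = 0.5                    # J/K per metre, assumed heat capacity
        hA = 10.0                   # W/K per metre, assumed cooling
        T_bath, T_cs = 4.5, 6.5     # bath / current-sharing temperatures

        def recovers(Q_pulse_J, Q_joule_W=25.0, dt=1e-4, t_end=0.5):
            T = T_bath + Q_pulse_J / mc          # instantaneous deposit
            for _ in range(int(t_end / dt)):
                joule = Q_joule_W if T > T_cs else 0.0
                T += dt * (joule - hA * (T - T_bath)) / mc
            return T < T_cs

        print(recovers(0.5), recovers(5.0))      # True, False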

  8. Characterizing the distribution of an endangered salmonid using environmental DNA analysis

    USGS Publications Warehouse

    Laramie, Matthew B.; Pilliod, David S.; Goldberg, Caren S.

    2015-01-01

    Determining species distributions accurately is crucial to developing conservation and management strategies for imperiled species, but a challenging task for small populations. We evaluated the efficacy of environmental DNA (eDNA) analysis for improving detection and thus potentially refining the known distribution of Chinook salmon (Oncorhynchus tshawytscha) in the Methow and Okanogan Subbasins of the Upper Columbia River, which span the border between Washington, USA and British Columbia, Canada. We developed an assay to target a 90 base pair sequence of Chinook DNA and used quantitative polymerase chain reaction (qPCR) to quantify the amount of Chinook eDNA in triplicate 1-L water samples collected at 48 stream locations in June and again in August 2012. The overall probability of detecting Chinook with our eDNA method in areas within the known distribution was 0.77 (±0.05 SE). Detection probability was lower in June (0.62, ±0.08 SE) during high flows and at the beginning of spring Chinook migration than during base flows in August (0.93, ±0.04 SE). In the Methow subbasin, mean eDNA concentration was higher in August compared to June, especially in smaller tributaries, probably resulting from the arrival of spring Chinook adults, reduced discharge, or both. Chinook eDNA concentrations did not appear to change in the Okanogan subbasin from June to August. Contrary to our expectations about downstream eDNA accumulation, Chinook eDNA did not decrease in concentration in upstream reaches (0–120 km). Further examination of factors influencing spatial distribution of eDNA in lotic systems may allow for greater inference of local population densities along stream networks or watersheds. These results demonstrate the potential effectiveness of eDNA detection methods for determining landscape-level distribution of anadromous salmonids in large river systems.
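
    The headline detection numbers follow a simple replicate model: if each 1-L sample detects with probability p, triplicates give 1 - (1-p)^3. The per-sample values below are back-calculated guesses that roughly reproduce the reported 0.62 and 0.93, not figures from the paper.

        # Site-level detection probability from triplicate samples.
        def site_detection(p_sample, replicates=3):
            return 1.0 - (1.0 - p_sample) ** replicates

        for p in (0.28, 0.59):   # hypothetical per-sample probabilities
            print(f"p_sample={p:.2f} -> p_site={site_detection(p):.2f}")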

  9. Analysis and improvement of data-set level file distribution in Disk Pool Manager

    NASA Astrophysics Data System (ADS)

    Cadellin Skipsey, Samuel; Purdie, Stuart; Britton, David; Mitchell, Mark; Bhimji, Wahid; Smith, David

    2014-06-01

    Of the three most widely used implementations of the WLCG Storage Element specification, Disk Pool Manager[1, 2] (DPM) has the simplest implementation of file placement balancing (StoRM doesn't attempt this, leaving it up to the underlying filesystem, which can be very sophisticated in itself). DPM uses a round-robin algorithm (with optional filesystem weighting) for placing files across filesystems and servers. This does a reasonable job of evenly distributing files across the storage array provided to it. However, it does not offer any guarantees of the evenness of distribution of that subset of files associated with a given "dataset" (which often maps onto a "directory" in the DPM namespace (DPNS)). It is useful to consider a concept of "balance", where an optimally balanced set of files indicates that the files are distributed evenly across all of the pool nodes. The best case performance of the round-robin algorithm is to maintain balance; it has no mechanism to improve balance. In the past year or more, larger DPM sites have noticed load spikes on individual disk servers, and suspected that these were exacerbated by excesses of files from popular datasets on those servers. We present here a software tool which analyses file distribution for all datasets in a DPM SE, providing a measure of the poorness of file location in this context. Further, the tool provides a list of file movement actions which will improve dataset-level file distribution, and can action those file movements itself. We present results of such an analysis on the UKI-SCOTGRID-GLASGOW Production DPM.
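
    One simple balance measure of the kind described is the coefficient of variation of per-server file counts for a dataset, with moves suggested from the fullest to the emptiest server; this is a naive stand-in, not the tool's actual algorithm.

        # Dataset-level imbalance metric and naive rebalancing moves.
        from collections import Counter
        from statistics import mean, pstdev

        placement = {"f1": "srv1", "f2": "srv1", "f3": "srv1",
                     "f4": "srv2", "f5": "srv3"}   # file -> server
        servers = ["srv1", "srv2", "srv3"]

        counts = Counter(placement.values())
        per_srv = [counts.get(s, 0) for s in servers]
        print(f"imbalance (CV): {pstdev(per_srv) / mean(per_srv):.2f}")

        while max(per_srv) - min(per_srv) > 1:     # suggest moves
            i = per_srv.index(max(per_srv))
            j = per_srv.index(min(per_srv))
            per_srv[i] -= 1; per_srv[j] += 1
            print(f"move one file: {servers[i]} -> {servers[j]}")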

  10. Equity in the distribution of CT and MRI in China: a panel analysis

    PubMed Central

    2013-01-01

    Introduction: China is facing a daunting challenge to health equity in the context of rapid economic development. This study adds to the literature by examining equity in the distribution of high-technology medical equipment, such as CT and MRI, in China. Methods: A panel analysis was conducted with information about four study sites in 2006 and 2009. The four provincial-level study sites included Shanghai, Zhejiang, Shaanxi, and Hunan, representing different geographical, economic, and medical technology levels in China. A random sample of 71 hospitals was selected from the four sites. Data were collected through questionnaire surveys. Equity status was assessed in terms of CT and MRI numbers, machine characteristics, and financing sources. The assessment was conducted at multiple levels, including the international, provincial, city, and hospital levels. In addition to comparisons among the study sites, the sample was compared with OECD countries in CT and MRI distributions. Results: China had fewer CTs and MRIs per million population in 2009 than most of the selected OECD countries, while the increases in its CT and MRI numbers from 2006 to 2009 were higher than in most of the OECD countries. The equity status of CT distribution remained at a low inequality level in both 2006 and 2009, while the equity status of MRI distribution improved from high inequality in 2006 to moderate inequality in 2009. Despite the equity improvement, the distributions of CTs and MRIs were significantly positively correlated with economic development level across all cities in the four study sites in both 2006 and 2009. Our analysis also revealed that Shanghai, the study site with the highest level of economic development, had more advanced CT and MRI machines, more imported CTs and MRIs, and higher government subsidies for these two types of equipment. Conclusions: The number of CTs and MRIs increased considerably in China from 2006 to 2009. The equity status of CTs was better than that of MRIs.
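    The excerpt does not name the inequality measure used; the Gini coefficient is a standard choice for this kind of equity assessment, and a minimal sketch over per-capita equipment counts follows. The city values are invented for illustration.

    ```python
    import numpy as np

    def gini(x):
        """Gini coefficient of a non-negative array: 0 = perfect equality,
        values near 1 = one unit holds almost all the equipment."""
        x = np.sort(np.asarray(x, dtype=float))
        n = x.size
        ranks = np.arange(1, n + 1)
        return (2.0 * np.sum(ranks * x)) / (n * np.sum(x)) - (n + 1.0) / n

    # Illustrative: MRIs per million population across five cities.
    mri_per_million = [0.4, 0.9, 1.1, 2.8, 6.5]
    print(f"Gini = {gini(mri_per_million):.2f}")
    ```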

  11. Poster — Thur Eve — 74: Distributed, asynchronous, reactive dosimetric and outcomes analysis using DICOMautomaton

    SciTech Connect

    Clark, Haley; Wu, Jonn; Moiseenko, Vitali; Thomas, Steven

    2014-08-15

    Many have speculated about the future of computational technology in clinical radiation oncology. It has been advocated that the next generation of computational infrastructure will improve on the current generation by incorporating richer aspects of automation, more heavily and seamlessly featuring distributed and parallel computation, and providing more flexibility toward aggregate data analysis. In this report we describe how DICOMautomaton, a recently created analysis framework that already exists today, incorporates these aspects. DICOMautomaton supports a variety of use cases but is especially suited for dosimetric outcome correlation analysis, investigation and comparison of radiotherapy treatment efficacy, and dose-volume computation. We describe: how it overcomes computational bottlenecks by distributing workload across a network of machines; how modern, asynchronous computational techniques are used to reduce blocking and avoid unnecessary computation; and how out-of-date data are handled using reactive programming techniques and data dependency chains. We describe the internal architecture of the software and give a detailed demonstration of how DICOMautomaton could be used to search for correlations between dosimetric and outcome data.
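    DICOMautomaton itself is written in C++; the Python sketch below only illustrates the reactive pattern the abstract describes, a data dependency chain in which changed inputs invalidate downstream results that are then recomputed lazily on demand, and is not the tool's actual internals.

    ```python
    class Node:
        """One computation in a dependency chain: invalidation propagates
        downstream, recomputation happens lazily, so stale results are
        never served and nothing is recomputed unnecessarily."""

        def __init__(self, compute=None, *inputs):
            self.compute = compute
            self.inputs = list(inputs)
            self.dependents = []
            self._value, self._dirty = None, True
            for node in self.inputs:
                node.dependents.append(self)

        def invalidate(self):
            if not self._dirty:
                self._dirty = True
                for d in self.dependents:
                    d.invalidate()

        def set(self, value):                 # for source nodes
            self._value, self._dirty = value, False
            for d in self.dependents:
                d.invalidate()

        def value(self):
            if self._dirty:
                self._value = self.compute(*[n.value() for n in self.inputs])
                self._dirty = False
            return self._value

    # Sketch: dose data -> sorted dose-volume curve -> summary statistic.
    # Editing the dose invalidates exactly the downstream nodes.
    dose = Node(); dose.set([1.0, 2.0, 3.0])
    dvh = Node(lambda d: sorted(d, reverse=True), dose)
    mean_dose = Node(lambda h: sum(h) / len(h), dvh)
    print(mean_dose.value())   # 2.0, computed on demand
    dose.set([2.0, 4.0, 6.0])  # marks dvh and mean_dose dirty
    print(mean_dose.value())   # 4.0, recomputed lazily
    ```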

  12. Multispectral UV imaging for surface analysis of MUPS tablets with special focus on the pellet distribution.

    PubMed

    Novikova, Anna; Carstensen, Jens M; Rades, Thomas; Leopold, Claudia S

    2016-12-30

    In the present study, the applicability of multispectral UV imaging combined with multivariate image analysis for surface evaluation of MUPS tablets was investigated, with respect to differentiating the API pellets from the excipient matrix, estimating the drug content and pellet distribution, and assessing the influence of the coating material and tablet thickness on the predictive model. Different formulations consisting of drug pellets coated with two polymers (Aquacoat® ECD and Eudragit® NE 30 D) at three coating levels each were compressed into MUPS tablets with various amounts of coated pellets and different tablet thicknesses. The coated drug pellets were clearly distinguishable from the excipient matrix using a partial least squares approach, regardless of the coating layer thickness and coating material used. Furthermore, the number of detected drug pellets on the tablet surface allowed an estimation of the true drug content of the respective MUPS tablet. In addition, the pellet distribution in the MUPS formulations could be estimated by UV image analysis of the tablet surface. In conclusion, this study revealed that UV imaging combined with multivariate image analysis is a promising approach for automatic quality control of MUPS tablets during the manufacturing process.
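    A minimal sketch of the pixel-wise partial least squares step with scikit-learn follows; the band count, intensities, and the 0.5 decision threshold are illustrative assumptions, not the study's actual calibration.

    ```python
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression

    # Stand-in multispectral UV data: per-pixel intensities at six UV
    # bands; labels mark pellet (1) vs excipient-matrix (0) pixels
    # from a training tablet.
    rng = np.random.default_rng(0)
    n_px, n_bands = 500, 6
    matrix_px = rng.normal(0.30, 0.05, (n_px, n_bands))
    pellet_px = rng.normal(0.55, 0.05, (n_px, n_bands))  # API absorbs more
    X = np.vstack([matrix_px, pellet_px])
    y = np.r_[np.zeros(n_px), np.ones(n_px)]

    pls = PLSRegression(n_components=2).fit(X, y)

    # Classify a new tablet surface (flattened to pixels x bands); the
    # pellet-pixel fraction then tracks the drug content.
    new_surface = rng.normal(0.55, 0.05, (100, n_bands))
    pellet_mask = pls.predict(new_surface).ravel() > 0.5
    print(f"pellet pixel fraction: {pellet_mask.mean():.2f}")
    ```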

  13. Comprehensive neutron cross-section and secondary energy distribution uncertainty analysis for a fusion reactor

    SciTech Connect

    Gerstl, S.A.W.; LaBauve, R.J.; Young, P.G.

    1980-05-01

    Using the example of General Atomic's well-documented Power Generating Fusion Reactor (PGFR) design, this report performs a comprehensive neutron cross-section and secondary energy distribution (SED) uncertainty analysis. The LASL sensitivity and uncertainty analysis code SENSIT is used to calculate reaction cross-section sensitivity profiles and integral SED sensitivity coefficients. These are then folded with covariance matrices and integral SED uncertainties to obtain the resulting uncertainties of three calculated neutronics design parameters: two critical radiation damage rates and a nuclear heating rate. The report documents the first sensitivity-based data uncertainty analysis that incorporates a quantitative treatment of the effects of SED uncertainties. The results demonstrate quantitatively that the ENDF/B-V cross-section data files for C, H, and O, including their SED data, are fully adequate for this design application, while the data for Fe and Ni are at best marginally adequate because they give rise to response uncertainties of up to 25%. Much higher response uncertainties are caused by cross-section and SED data uncertainties in Cu (26 to 45%), tungsten (24 to 54%), and Cr (up to 98%). Specific recommendations are given for re-evaluations of certain reaction cross-sections, secondary energy distributions, and uncertainty estimates.
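    The folding step has a compact form, often called the sandwich rule: the relative response variance is S^T C S, with S the sensitivity profile and C the relative covariance matrix. A minimal sketch, using an invented three-group example rather than the report's data:

    ```python
    import numpy as np

    def response_uncertainty(S, C):
        """Fold a sensitivity profile with a relative covariance matrix:
        var_rel(R) = S^T C S; returns the relative standard deviation."""
        S, C = np.asarray(S, float), np.asarray(C, float)
        return float(np.sqrt(S @ C @ S))

    # Illustrative 3-group case: sensitivities of a damage-rate response
    # to one cross section, 10-20% data uncertainties, adjacent groups
    # positively correlated.
    S = [0.8, 0.5, 0.2]
    C = [[0.04, 0.01, 0.00],
         [0.01, 0.02, 0.01],
         [0.00, 0.01, 0.01]]
    print(f"relative response uncertainty: {response_uncertainty(S, C):.1%}")
    ```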

  14. Assessment of Forest Conservation Value Using a Species Distribution Model and Object-based Image Analysis

    NASA Astrophysics Data System (ADS)

    Jin, Y.; Lee, D. K.; Jeong, S. G.

    2015-12-01

    The ecological and social values of forests have recently been highlighted. Assessments of the biodiversity of forests, as well as of their other ecological values, play an important role in regional and national conservation planning. The preservation of habitats is linked to the protection of biodiversity. For mapping habitats, species distribution models (SDMs) are used to predict the suitable habitat of significant species, and such distribution modeling is increasingly being used in conservation science. However, pixel-based analysis does not capture contextual or topological information. To provide more accurate habitat predictions, a continuous-field view that better reflects the real world is required. Here we analyze and compare, at different scales, habitats of the yellow-throated marten (Martes flavigula), a top predator and umbrella species in South Korea. Both the object scale, where an object is a group of pixels with similar spatial and spectral characteristics, and the pixel scale were used for the SDM. Our analysis at the two scales suggests that object-scale analysis provides a superior representation of continuous habitat, and thus will be useful in forest conservation planning as well as in species habitat monitoring.
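    The pixel-to-object step such an analysis implies can be sketched simply: pixel-scale suitability predictions are averaged over image objects (segments), so each object carries one value and salt-and-pepper noise in the pixel map is suppressed. The segmentation is assumed to come from elsewhere, and the data are toy values.

    ```python
    import numpy as np

    def object_scale_suitability(pixel_suitability, segments):
        """Replace each pixel's suitability with its segment's mean."""
        out = np.empty_like(pixel_suitability)
        for seg_id in np.unique(segments):
            mask = segments == seg_id
            out[mask] = pixel_suitability[mask].mean()
        return out

    suit = np.array([[0.9, 0.8, 0.2, 0.1],
                     [0.7, 0.9, 0.1, 0.3],
                     [0.8, 0.6, 0.2, 0.2]])
    segs = np.array([[0, 0, 1, 1],
                     [0, 0, 1, 1],
                     [0, 0, 1, 1]])
    print(object_scale_suitability(suit, segs))  # ~0.78 vs ~0.18 blocks
    ```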

  15. X-ray fluorescence analysis of iron and manganese distribution in primary dopaminergic neurons

    PubMed Central

    Dučić, Tanja; Barski, Elisabeth; Salome, Murielle; Koch, Jan C; Bähr, Mathias; Lingor, Paul

    2013-01-01

    Transition metals have been suggested to play a pivotal role in the pathogenesis of Parkinson's disease. X-ray microscopy combined with a cryogenic setup is a powerful method for elemental imaging at low concentrations and high resolution in intact cells, eliminating the need for fixation and sectioning of the specimen. Here, we performed an elemental distribution analysis in cultured primary midbrain neurons with a step size on the order of 300 nm and ∼0.1 ppm sensitivity under cryo conditions using X-ray fluorescence microscopy. We report elemental maps at the subcellular level in primary mouse dopaminergic (DAergic) and non-DAergic neurons after treatment with transition metals. Application of Fe²⁺ resulted in largely extracellular accumulation of iron without preference for the neuronal transmitter subtype. A quantification of the different Fe oxidation states was performed using X-ray absorption near edge structure analysis. After treatment with Mn²⁺, a cytoplasmic/paranuclear localization of Mn was observed preferentially in DAergic neurons, while no prominent signal was detectable after Mn³⁺ treatment. Immunocytochemical analysis correlated the preferential Mn uptake to increased expression of voltage-gated calcium channels in DAergic neurons. We discuss the implications of this differential elemental distribution for the selective vulnerability of DAergic neurons and Parkinson's disease pathogenesis. PMID:23106162

  16. Statistical analysis of factors affecting landslide distribution in the new Madrid seismic zone, Tennessee and Kentucky

    USGS Publications Warehouse

    Jibson, R.W.; Keefer, D.K.

    1989-01-01

    More than 220 large landslides along the bluffs bordering the Mississippi alluvial plain between Cairo, Ill., and Memphis, Tenn., are analyzed by discriminant analysis and multiple linear regression to determine the relative effects of slope height and steepness, stratigraphic variation, slope aspect, and proximity to the hypocenters of the 1811-12 New Madrid, Mo., earthquakes on the distribution of these landslides. Three types of landslides are analyzed: (1) old, coherent slumps and block slides, which have eroded and revegetated features and no active analogs in the area; (2) old earth flows, which are also eroded and revegetated; and (3) young rotational slumps, which are present only along near-river bluffs, and which are the only young, active landslides in the area. Discriminant analysis shows that only one characteristic differs significantly between bluffs with and without young rotational slumps: failed bluffs tend to have sand and clay at their base, which may render them more susceptible to fluvial erosion. Bluffs having old coherent slides are significantly higher, steeper, and closer to the hypocenters of the 1811-12 earthquakes than bluffs without these slides. Bluffs having old earth flows are likewise higher and closer to the earthquake hypocenters. Multiple regression analysis indicates that the distribution of young rotational slumps is affected most strongly by slope steepness: about one-third of the variation in the distribution is explained by variations in slope steepness. The distribution of old coherent slides and earth flows is affected most strongly by slope height, but the proximity to the hypocenters of the 1811-12 earthquakes also significantly affects the distribution. The results of the statistical analyses indicate that the only recently active landsliding in the area is along actively eroding river banks, where rotational slumps formed as bluffs are undercut by the river. The analyses further indicate that the old coherent slides
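    The two statistical steps named above are routine to reproduce in outline. A minimal scikit-learn sketch follows; the variables mirror the study's (height, steepness, hypocentral distance), but every value and effect size is synthetic and for illustration only.

    ```python
    import numpy as np
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from sklearn.linear_model import LinearRegression

    rng = np.random.default_rng(1)
    n = 200
    X = np.column_stack([rng.normal(30, 8, n),      # slope height (m)
                         rng.normal(25, 6, n),      # steepness (deg)
                         rng.uniform(10, 150, n)])  # distance to hypocenters (km)
    failed = (0.05 * X[:, 0] + 0.04 * X[:, 1] - 0.01 * X[:, 2]
              + rng.normal(0, 0.5, n)) > 1.5

    # Discriminant analysis: which attributes separate failed bluffs?
    lda = LinearDiscriminantAnalysis().fit(X, failed)
    print("LDA coefficients (height, steepness, distance):", lda.coef_)

    # Multiple linear regression of landslide density on the same attributes.
    density = 0.02 * X[:, 0] + rng.normal(0, 0.3, n)
    reg = LinearRegression().fit(X, density)
    print("variance explained (R^2):", round(reg.score(X, density), 2))
    ```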

  17. Tissue characterization of skin ulcer for bacterial infection by multiple statistical analysis of echo amplitude envelope

    NASA Astrophysics Data System (ADS)

    Omura, Masaaki; Yoshida, Kenji; Kohta, Masushi; Kubo, Takabumi; Ishiguro, Toshimichi; Kobayashi, Kazuto; Hozumi, Naohiro; Yamaguchi, Tadashi

    2016-07-01

    To characterize skin ulcers for bacterial infection, quantitative ultrasound (QUS) parameters were estimated by multiple statistical analysis of the echo amplitude envelope, based on both Weibull and generalized gamma distributions and on the ratio of the mean to the standard deviation of the echo amplitude envelope. The measurement objects were three rat models (noninfection, critical colonization, and infection). Ultrasound data were acquired using a modified ultrasonic diagnosis system with a center frequency of 11 MHz. In parallel, histopathological images and two-dimensional maps of the speed of sound (SoS) were obtained. Typical tissue characteristics such as infection could be detected by focusing on the relationships among the QUS parameters, and the observed differences were consistent with the scatterer structure. Additionally, the histopathological characteristics and SoS of noninfected and infected tissues matched the characteristics of the QUS parameters in each rat model.
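    Estimating such QUS parameters from an envelope sample is straightforward with SciPy; a minimal sketch follows. The synthetic envelope stands in for real RF envelope data, and the shape value is illustrative.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(2)

    # Stand-in echo amplitude envelope; fully developed speckle is
    # Rayleigh-like (Weibull shape ~2), so a lower shape mimics a
    # structurally altered tissue.
    envelope = stats.weibull_min.rvs(1.6, scale=1.0, size=5000,
                                     random_state=rng)

    # Weibull fit (location pinned to 0 for a non-negative envelope).
    w_shape, _, w_scale = stats.weibull_min.fit(envelope, floc=0)

    # Generalized gamma fit.
    gg_a, gg_c, _, gg_scale = stats.gengamma.fit(envelope, floc=0)

    # Mean-to-standard-deviation ratio of the envelope.
    msr = envelope.mean() / envelope.std()

    print(f"Weibull: shape={w_shape:.2f}, scale={w_scale:.2f}")
    print(f"gen. gamma: a={gg_a:.2f}, c={gg_c:.2f}, scale={gg_scale:.2f}")
    print(f"mean/SD ratio = {msr:.2f}")
    ```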

  18. Multiobjective sensitivity analysis and optimization of a distributed hydrologic model MOBIDIC

    NASA Astrophysics Data System (ADS)

    Yang, J.; Castelli, F.; Chen, Y.

    2014-03-01

    Calibration of distributed hydrologic models usually must contend with a large number of distributed parameters and with optimization problems whose multiple objectives naturally conflict. This study presents a multiobjective sensitivity and optimization approach to handle these problems for the distributed hydrologic model MOBIDIC, combining two sensitivity analysis techniques (the Morris method and the State Dependent Parameter method) with a multiobjective optimization (MOO) approach, ϵ-NSGAII. The approach was used to calibrate MOBIDIC for the Davidson watershed, North Carolina, with three objective functions: the standardized root mean square error of the logarithmically transformed discharge, a water balance index, and the mean absolute error of the logarithmically transformed flow duration curve. The results were compared with those from single objective optimization (SOO) using the traditional Nelder-Mead simplex algorithm in MOBIDIC, taking as the objective function the Euclidean norm of the three objectives. Results show: (1) the two sensitivity analysis techniques are effective and efficient in identifying the sensitive processes and insensitive parameters: surface runoff and evaporation are very sensitive processes for all three objective functions, while groundwater recession and soil hydraulic conductivity are insensitive and were excluded from the optimization; (2) both MOO and SOO lead to acceptable simulations; e.g., for MOO, the average Nash-Sutcliffe efficiency is 0.75 in the calibration period and 0.70 in the validation period; (3) evaporation and surface runoff are of similar importance to the watershed water balance, while the contribution of baseflow can be ignored; (4) compared to SOO, which depended on the initial starting location, MOO provides more insight into parameter sensitivity and into the conflicting character of the objective functions. Multiobjective sensitivity analysis and optimization
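    Of the techniques named, the Morris method is compact enough to sketch directly: parameters are screened by the mean absolute one-at-a-time ("elementary") effect over random base points, and parameters with small mu* can be fixed before optimization. A self-contained toy version follows, with a stand-in response function rather than a real MOBIDIC run.

    ```python
    import numpy as np

    def morris_mu_star(model, bounds, r=20, delta=0.1, seed=0):
        """mu* per parameter: mean |elementary effect| over r base points."""
        rng = np.random.default_rng(seed)
        bounds = np.asarray(bounds, dtype=float)
        k = len(bounds)
        lo, span = bounds[:, 0], bounds[:, 1] - bounds[:, 0]
        effects = np.zeros((r, k))
        for i in range(r):
            # Base point chosen so the +delta step stays inside bounds.
            x = lo + rng.uniform(0, 1 - delta, k) * span
            y0 = model(x)
            for j in range(k):
                xp = x.copy()
                xp[j] += delta * span[j]
                effects[i, j] = (model(xp) - y0) / delta
        return np.abs(effects).mean(axis=0)

    # Toy response (e.g., an error metric): dominated by the first two
    # parameters, nearly flat in the third.
    f = lambda p: 2.0 * p[0] + p[1] ** 2 + 0.01 * p[2]
    print("mu*:", morris_mu_star(f, [(0, 1), (0, 2), (0, 5)], r=50).round(2))
    ```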

  19. Knowledge-based assistance for science visualization and analysis using large distributed databases

    NASA Technical Reports Server (NTRS)

    Handley, Thomas H., Jr.; Jacobson, Allan S.; Doyle, Richard J.; Collins, Donald J.

    1992-01-01

    Within this decade, the growth in complexity of exploratory data analysis and the sheer volume of space data require new and innovative approaches to support science investigators in achieving their research objectives. To date, there have been numerous efforts addressing the individual issues involved in inter-disciplinary, multi-instrument investigations. However, while successful in small scale, these efforts have not proven to be open and scalable. This proposal addresses four areas of significant need: scientific visualization and analysis; science data management; interactions in a distributed, heterogeneous environment; and knowledge-based assistance for these functions. The fundamental innovation embedded within this proposal is the integration of three automation technologies, namely, knowledge-based expert systems, science visualization and science data management. This integration is based on the concept called the Data Hub. With the Data Hub concept, NASA will be able to apply a more complete solution to all nodes of a distributed system. Both computation nodes and interactive nodes will be able to effectively and efficiently use the data services (access, retrieval, update, etc.) with a distributed, interdisciplinary information system in a uniform and standard way. This will allow the science investigators to concentrate on their scientific endeavors, rather than to involve themselves in the intricate technical details of the systems and tools required to accomplish their work. Thus, science investigators need not be programmers. The emphasis will be on the definition and prototyping of system elements with sufficient detail to enable data analysis and interpretation leading to publishable scientific results. In addition, the proposed work includes all the required end-to-end components and interfaces to demonstrate the completed concept.

  20. Knowledge-based assistance for science visualization and analysis using large distributed databases

    NASA Technical Reports Server (NTRS)

    Handley, Thomas H., Jr.; Jacobson, Allan S.; Doyle, Richard J.; Collins, Donald J.

    1993-01-01

    Within this decade, the growth in complexity of exploratory data analysis and the sheer volume of space data require new and innovative approaches to support science investigators in achieving their research objectives. To date, there have been numerous efforts addressing the individual issues involved in inter-disciplinary, multi-instrument investigations. However, while successful in small scale, these efforts have not proven to be open and scalable. This proposal addresses four areas of significant need: scientific visualization and analysis; science data management; interactions in a distributed, heterogeneous environment; and knowledge-based assistance for these functions. The fundamental innovation embedded within this proposal is the integration of three automation technologies, namely, knowledge-based expert systems, science visualization and science data management. This integration is based on a concept called the DataHub. With the DataHub concept, NASA will be able to apply a more complete solution to all nodes of a distributed system. Both computation nodes and interactive nodes will be able to effectively and efficiently use the data services (access, retrieval, update, etc.) with a distributed, interdisciplinary information system in a uniform and standard way. This will allow the science investigators to concentrate on their scientific endeavors, rather than to involve themselves in the intricate technical details of the systems and tools required to accomplish their work. Thus, science investigators need not be programmers. The emphasis will be on the definition and prototyping of system elements with sufficient detail to enable data analysis and interpretation leading to publishable scientific results. In addition, the proposed work includes all the required end-to-end components and interfaces to demonstrate the completed concept.