Science.gov

Sample records for weibull distribution analysis

  1. q-exponential, Weibull, and q-Weibull distributions: an empirical analysis

    NASA Astrophysics Data System (ADS)

    Picoli, S.; Mendes, R. S.; Malacarne, L. C.

    2003-06-01

    In a comparative study, the q-exponential and Weibull distributions are employed to investigate frequency distributions of basketball baskets, cyclone victims, brand-name drugs by retail sales, and highway length. In order to analyze the intermediate cases, a distribution, the q-Weibull one, which interpolates between the q-exponential and Weibull ones, is introduced. It is verified that the basketball baskets distribution is well described by a q-exponential, whereas the cyclone victims and brand-name drugs by retail sales are better described by a Weibull distribution. On the other hand, for highway length the q-exponential and Weibull distributions do not give a satisfactory fit, making it necessary to employ the q-Weibull distribution. Furthermore, the introduction of this interpolating distribution sheds light on the stretched-exponential versus inverse-power-law (q-exponential with q > 1) controversy.
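
    To make the interpolation concrete, the sketch below implements one common convention for the q-exponential and the q-Weibull cumulative distribution P(>x) = e_q(-(x/λ)^α); the function and parameter names are this sketch's assumptions, not the authors' notation.

        import numpy as np

        def q_exp(x, q):
            """Tsallis q-exponential e_q(x); reduces to exp(x) as q -> 1."""
            if abs(q - 1.0) < 1e-12:
                return np.exp(x)
            base = 1.0 + (1.0 - q) * x
            return np.where(base > 0.0, np.abs(base) ** (1.0 / (1.0 - q)), 0.0)

        def q_weibull_sf(x, q, lam, alpha):
            """Cumulative distribution P(>x) = e_q(-(x/lam)**alpha): q -> 1 recovers
            the ordinary Weibull, and alpha = 1 recovers the q-exponential."""
            return q_exp(-(x / lam) ** alpha, q)

        x = np.linspace(0.0, 5.0, 6)
        print(q_weibull_sf(x, q=1.0, lam=1.0, alpha=2.0))   # ordinary Weibull limit
        print(q_weibull_sf(x, q=1.5, lam=1.0, alpha=2.0))   # power-law-like tail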

  2. Using Weibull Distribution Analysis to Evaluate ALARA Performance

    SciTech Connect

    Frome, E. L.; Watkins, J. P.; Hagemeyer, D. A.

    2009-10-01

    As Low as Reasonably Achievable (ALARA) is the underlying principle for protecting nuclear workers from potential health outcomes related to occupational radiation exposure. Radiation protection performance is currently evaluated by measures such as collective dose and average measurable dose, which do not indicate ALARA performance. The purpose of this work is to show how statistical modeling of individual doses using the Weibull distribution can provide objective supplemental performance indicators for comparing ALARA implementation among sites and for insights into ALARA practices within a site. Maximum likelihood methods were employed to estimate the Weibull shape and scale parameters used for performance indicators. The shape parameter reflects the effectiveness of maximizing the number of workers receiving lower doses and is represented as the slope of the fitted line on a Weibull probability plot. Additional performance indicators derived from the model parameters include the 99th percentile and the exceedance fraction. When grouping sites by collective total effective dose equivalent (TEDE) and ranking by 99th percentile with confidence intervals, differences in performance among sites can be readily identified. Applying this methodology will enable more efficient and complete evaluation of the effectiveness of ALARA implementation.
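
    As a concrete illustration of the indicators described above, the following sketch fits a two-parameter Weibull to individual doses by maximum likelihood and derives the 99th percentile and the exceedance fraction above a limit L; the synthetic data, the dose limit, and the use of scipy are assumptions of this sketch, not the authors' implementation.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(1)
        doses = rng.weibull(0.8, size=500) * 2.0    # synthetic measurable doses (mSv)

        # Maximum-likelihood fit of a two-parameter Weibull (location fixed at 0).
        shape, _, scale = stats.weibull_min.fit(doses, floc=0.0)

        p99 = scale * (-np.log(1.0 - 0.99)) ** (1.0 / shape)    # 99th percentile
        L = 5.0                                                 # example dose limit (mSv)
        exceedance = np.exp(-(L / scale) ** shape)              # P(dose > L)
        print(f"shape = {shape:.3f}, scale = {scale:.3f}, p99 = {p99:.2f}, P(>L) = {exceedance:.4f}")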

  3. Reliability analysis of structural ceramic components using a three-parameter Weibull distribution

    NASA Technical Reports Server (NTRS)

    Duffy, Stephen F.; Powers, Lynn M.; Starlinger, Alois

    1992-01-01

    Described here are nonlinear regression estimators for the three-parameter Weibull distribution. Issues relating to the bias and invariance associated with these estimators are examined numerically using Monte Carlo simulation methods. The estimators were used to extract parameters from sintered silicon nitride failure data. A reliability analysis was performed on a turbopump blade utilizing the three-parameter Weibull distribution and the estimates from the sintered silicon nitride data.
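
    For context, the three-parameter form adds a location (threshold) parameter below which the failure probability is zero, F(x) = 1 - exp(-((x - γ)/η)^β) for x ≥ γ. Below is a minimal fitting sketch using scipy's generic MLE rather than the report's nonlinear regression estimators; the strengths are synthetic and illustrative only.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(7)
        # Synthetic "fracture strengths" with a nonzero threshold of 300 MPa.
        strengths = 300.0 + rng.weibull(2.5, size=120) * 150.0

        # MLE of shape c, threshold loc, and scale:
        # F(x) = 1 - exp(-(((x - loc)/scale)**c)) for x >= loc.
        c, loc, scale = stats.weibull_min.fit(strengths)
        print(f"shape = {c:.2f}, threshold = {loc:.1f} MPa, scale = {scale:.1f} MPa")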

  4. Competing risk models in reliability systems: a Weibull distribution model with a Bayesian analysis approach

    NASA Astrophysics Data System (ADS)

    Iskandar, Ismed; Satria Gondokaryono, Yudi

    2016-02-01

    In reliability theory, the most important problem is to determine the reliability of a complex system from the reliability of its components. The weakness of most reliability theories is that systems are described as simply functioning or failed, whereas in many real situations failures may arise from many causes, depending on the age and the environment of the system and its components. Another problem in reliability theory is estimating the parameters of the assumed failure models. The estimation may be based on data collected over censored or uncensored life tests. In many reliability problems the failure data are simply quantitatively inadequate, especially in engineering design and maintenance systems, and Bayesian analyses are more beneficial than classical ones in such cases. Bayesian estimation allows us to combine past knowledge or experience, in the form of a prior distribution, with life test data to make inferences about the parameter of interest. In this paper, we investigate the application of Bayesian estimation to competing risk systems. The cases are limited to models with independent causes of failure, using the Weibull distribution as our model. A simulation is conducted for this distribution with the objectives of verifying the models and the estimators and investigating the performance of the estimators for varying sample sizes. The simulation data are analyzed using both Bayesian and maximum likelihood analyses. The simulation results show that a change in the true value of one parameter relative to another changes the standard deviation in the opposite direction. For perfect information on the prior distribution, the Bayesian estimation methods are better than those of maximum likelihood. The sensitivity analyses show some sensitivity to shifts of the prior locations; they also show the robustness of the Bayesian analysis within the range

  5. Software for Statistical Analysis of Weibull Distributions with Application to Gear Fatigue Data: User Manual with Verification

    NASA Technical Reports Server (NTRS)

    Krantz, Timothy L.

    2002-01-01

    The Weibull distribution has been widely adopted for the statistical description and inference of fatigue data. This document provides user instructions, examples, and verification for software to analyze gear fatigue test data. The software was developed presuming the data are adequately modeled using a two-parameter Weibull distribution. The calculations are based on likelihood methods, and the approach taken is valid for data that include type I censoring. The software was verified by reproducing results published by others.
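
    The likelihood construction for type I (time-censored) data places a density term on each observed failure and a survival term on each suspended test. Below is a minimal hand-rolled sketch under that standard formulation; the test times are invented, and this is not the NASA software itself.

        import numpy as np
        from scipy.optimize import minimize

        # Illustrative failure times plus four suspensions at the 500 h test cutoff.
        t_fail = np.array([112.0, 205.0, 260.0, 330.0, 415.0, 480.0])
        t_cens = np.array([500.0] * 4)

        def neg_log_lik(log_params):
            beta, eta = np.exp(log_params)      # log-parameterization keeps both positive
            z_f, z_c = t_fail / eta, t_cens / eta
            ll = np.sum(np.log(beta / eta) + (beta - 1.0) * np.log(z_f) - z_f ** beta)
            ll += np.sum(-z_c ** beta)          # each survivor contributes log S(t) = -(t/eta)^beta
            return -ll

        res = minimize(neg_log_lik, x0=np.log([1.5, 400.0]), method="Nelder-Mead")
        beta_hat, eta_hat = np.exp(res.x)
        print(f"beta = {beta_hat:.2f}, eta = {eta_hat:.0f} h")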

  6. Weibull Distribution From Interval Inspection Data

    NASA Technical Reports Server (NTRS)

    Rheinfurth, Mario H.

    1987-01-01

    The most likely failure sequence is assumed. This memorandum discusses the application of the Weibull distribution to the statistics of failures of turbopump blades. The Weibull distribution is a generalization of the well-known exponential random probability distribution and is useful in describing component-failure modes, including aging effects. Parameters are found from experimental data by the method of maximum likelihood.

  7. Application of Weibull analysis to SSME hardware

    NASA Technical Reports Server (NTRS)

    Gray, L. A. B.

    1986-01-01

    Generally, it has been documented that the wearing of engine parts forms a failure distribution which can be approximated by a function developed by Weibull. The purpose here is to examine to what extent the Weibull distribution approximates failure data for designated engine parts of the Space Shuttle Main Engine (SSME). The current testing certification requirements will be examined in order to establish confidence levels. An examination of the failure history of SSME parts/assemblies (turbine blades, main combustion chamber, or high pressure fuel pump first stage impellers) which are limited in usage by time or starts will be done by using updated Weibull techniques. Efforts will be made by the investigator to predict failure trends by using Weibull techniques for SSME parts (turbine temperature sensors, chamber pressure transducers, actuators, and controllers) which are not severely limited by time or starts.

  8. Packing fraction of particles with a Weibull size distribution.

    PubMed

    Brouwers, H J H

    2016-07-01

    This paper addresses the void fraction of polydisperse particles with a Weibull (or Rosin-Rammler) size distribution. It is demonstrated that the governing parameters of this distribution can be uniquely related to those of the lognormal distribution. Hence, an existing closed-form expression that predicts the void fraction of particles with a lognormal size distribution can be transformed into an expression for Weibull distributions. Both expressions contain the contraction coefficient β. Like the monosized void fraction φ1, it is a physical parameter which depends on the particles' shape and their state of compaction only. Based on a consideration of the scaled binary void contraction, a linear relation for (1 - φ1)β as a function of φ1 is proposed, with proportionality constant B, depending on the state of compaction only. This is validated using computational and experimental packing data concerning random close and random loose packing arrangements. Finally, using this β, the closed-form analytical expression governing the void fraction of Weibull distributions is thoroughly compared with empirical data reported in the literature, and good agreement is found. Furthermore, the present analysis yields an algebraic equation relating the void fraction of monosized particles at different compaction states. This expression appears to be in good agreement with a broad collection of random close and random loose packing data. PMID:27575204

  9. Packing fraction of particles with a Weibull size distribution

    NASA Astrophysics Data System (ADS)

    Brouwers, H. J. H.

    2016-07-01

    This paper addresses the void fraction of polydisperse particles with a Weibull (or Rosin-Rammler) size distribution. It is demonstrated that the governing parameters of this distribution can be uniquely related to those of the lognormal distribution. Hence, an existing closed-form expression that predicts the void fraction of particles with a lognormal size distribution can be transformed into an expression for Weibull distributions. Both expressions contain the contraction coefficient β. Like the monosized void fraction φ1, it is a physical parameter which depends on the particles' shape and their state of compaction only. Based on a consideration of the scaled binary void contraction, a linear relation for (1 - φ1)β as a function of φ1 is proposed, with proportionality constant B, depending on the state of compaction only. This is validated using computational and experimental packing data concerning random close and random loose packing arrangements. Finally, using this β, the closed-form analytical expression governing the void fraction of Weibull distributions is thoroughly compared with empirical data reported in the literature, and good agreement is found. Furthermore, the present analysis yields an algebraic equation relating the void fraction of monosized particles at different compaction states. This expression appears to be in good agreement with a broad collection of random close and random loose packing data.
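
    The paper's unique parameter relation is not reproduced here, but one standard way to connect the two distributions, matching the mean and variance of ln d (the logarithm of a Weibull variate is Gumbel-distributed), can be sketched as follows; treating this moment-matching as the link is this sketch's assumption.

        import numpy as np

        GAMMA_E = 0.5772156649015329   # Euler-Mascheroni constant

        def weibull_to_lognormal(n, d0):
            """Map Weibull/Rosin-Rammler (shape n, scale d0) to the lognormal
            (mu, sigma) matching the mean and variance of ln d: ln d is
            Gumbel-distributed with mean ln(d0) - GAMMA_E/n and variance
            pi**2 / (6*n**2)."""
            mu = np.log(d0) - GAMMA_E / n
            sigma = np.pi / (np.sqrt(6.0) * n)
            return mu, sigma

        print(weibull_to_lognormal(n=2.0, d0=100.0))   # -> (mu ~ 4.32, sigma ~ 0.64)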

  10. Independent Orbiter Assessment (IOA): Weibull analysis report

    NASA Technical Reports Server (NTRS)

    Raffaelli, Gary G.

    1987-01-01

    The Auxiliary Power Unit (APU) and Hydraulic Power Unit (HPU) Space Shuttle Subsystems were reviewed as candidates for demonstrating the Weibull analysis methodology. Three hardware components were identified as analysis candidates: the turbine wheel, the gearbox, and the gas generator. Detailed review of subsystem level wearout and failure history revealed the lack of actual component failure data. In addition, component wearout data were not readily available or would require a separate data accumulation effort by the vendor. Without adequate component history data being available, the Weibull analysis methodology application to the APU and HPU subsystem group was terminated.

  11. Weibull distribution based on maximum likelihood with interval inspection data

    NASA Technical Reports Server (NTRS)

    Rheinfurth, M. H.

    1985-01-01

    The two Weibull parameters are determined by the method of maximum likelihood. The test data used were failures observed at inspection intervals. The application was the reliability analysis of the SSME oxidizer turbine blades.

  12. Modeling observed animal performance using the Weibull distribution.

    PubMed

    Hagey, Travis J; Puthoff, Jonathan B; Crandell, Kristen E; Autumn, Kellar; Harmon, Luke J

    2016-06-01

    To understand how organisms adapt, researchers must link performance and microhabitat. However, measuring performance, especially maximum performance, can sometimes be difficult. Here, we describe an improvement over previous techniques that only consider the largest observed values as maxima. Instead, we model expected performance observations via the Weibull distribution, a statistical approach that reduces the impact of rare observations. After calculating group-level weighted averages and variances by treating individuals separately to reduce pseudoreplication, our approach resulted in high statistical power despite small sample sizes. We fitted lizard adhesive performance and bite force data to the Weibull distribution and found that it closely estimated maximum performance in both cases, illustrating the generality of our approach. Using the Weibull distribution to estimate observed performance greatly improves upon previous techniques by facilitating power analyses and error estimations around robustly estimated maximum values. PMID:26994180

  13. Investigation of Weibull statistics in fracture analysis of cast aluminum

    NASA Technical Reports Server (NTRS)

    Holland, Frederic A., Jr.; Zaretsky, Erwin V.

    1989-01-01

    The fracture strengths of two large batches of A357-T6 cast aluminum coupon specimens were compared by using two-parameter Weibull analysis. The minimum number of these specimens necessary to find the fracture strength of the material was determined. The applicability of three-parameter Weibull analysis was also investigated. A design methodology based on the combination of elementary stress analysis and Weibull statistical analysis is advanced and applied to the design of a spherical pressure vessel shell. The results from this design methodology are compared with results from the applicable ASME pressure vessel code.

  14. Discriminating between Weibull distributions and log-normal distributions emerging in branching processes

    NASA Astrophysics Data System (ADS)

    Goh, Segun; Kwon, H. W.; Choi, M. Y.

    2014-06-01

    We consider the Yule-type multiplicative growth and division process, and describe the ubiquitous emergence of Weibull and log-normal distributions in a single framework. With the help of the integral transform and series expansion, we show that both distributions serve as asymptotic solutions of the time evolution equation for the branching process. In particular, the maximum likelihood method is employed to discriminate between the emergence of the Weibull distribution and that of the log-normal distribution. Further, the detailed conditions for the distinguished emergence of the Weibull distribution are probed. It is observed that the emergence depends on the manner of the division process for the two different types of distribution. Numerical simulations are also carried out, confirming the results obtained analytically.
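
    A common implementation of likelihood-based discrimination is to fit both candidates by maximum likelihood and compare the attained log-likelihoods; the sketch below does this with scipy as an illustration and is not the authors' procedure.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(3)
        data = rng.lognormal(mean=1.0, sigma=0.6, size=400)    # ground truth: log-normal

        # MLE fits with the location fixed at zero so both models have two parameters.
        wb = stats.weibull_min.fit(data, floc=0.0)
        ln = stats.lognorm.fit(data, floc=0.0)

        ll_wb = np.sum(stats.weibull_min.logpdf(data, *wb))
        ll_ln = np.sum(stats.lognorm.logpdf(data, *ln))
        print("prefer log-normal" if ll_ln > ll_wb else "prefer Weibull",
              f"(delta logL = {ll_ln - ll_wb:.1f})")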

  15. A spatial scan statistic for survival data based on Weibull distribution.

    PubMed

    Bhatt, Vijaya; Tiwari, Neeraj

    2014-05-20

    The spatial scan statistic has been developed as a geographical cluster detection analysis tool for different types of data sets, such as Bernoulli, Poisson, ordinal, normal and exponential. We propose a scan statistic for survival data based on the Weibull distribution. It may also be used for other survival distributions, such as exponential, gamma, and log-normal. The proposed method is applied to the survival data of tuberculosis patients for the years 2004-2005 in the Nainital district of Uttarakhand, India. Simulation studies reveal that the proposed method performs well for different survival distribution functions.

  16. Large-Scale Weibull Analysis of H-451 Nuclear-Grade Graphite Specimen Rupture Data

    NASA Technical Reports Server (NTRS)

    Nemeth, Noel N.; Walker, Andrew; Baker, Eric H.; Murthy, Pappu L.; Bratton, Robert L.

    2012-01-01

    A Weibull analysis was performed of the strength distribution and size effects for 2000 specimens of H-451 nuclear-grade graphite. The data, generated elsewhere, measured the tensile and four-point-flexure room-temperature rupture strength of specimens excised from a single extruded graphite log. Strength variation was compared with specimen location, size, and orientation relative to the parent body. In our study, data were progressively and extensively pooled into larger data sets to discriminate overall trends from local variations and to investigate the strength distribution. The CARES/Life and WeibPar codes were used to investigate issues regarding the size effect, Weibull parameter consistency, and nonlinear stress-strain response. Overall, the Weibull distribution described the behavior of the pooled data very well. However, the issue regarding the smaller-than-expected size effect remained. This exercise illustrated that a conservative approach using a two-parameter Weibull distribution is best for designing graphite components with low probability of failure for the in-core structures in the proposed Generation IV (Gen IV) high-temperature gas-cooled nuclear reactors. This exercise also demonstrated the continuing need to better understand the mechanisms driving stochastic strength response. Extensive appendixes are provided with this report to show all aspects of the rupture data and analytical results.
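
    The size effect probed here follows from weakest-link scaling: for a two-parameter Weibull with modulus m, the characteristic strengths of two uniformly stressed volumes satisfy σ2/σ1 = (V1/V2)^(1/m). A one-function sketch with illustrative numbers (not the H-451 data):

        def weibull_size_scaled_strength(sigma1, v1, v2, m):
            """Weakest-link prediction of the characteristic strength of volume v2
            from the strength sigma1 measured on volume v1 (Weibull modulus m)."""
            return sigma1 * (v1 / v2) ** (1.0 / m)

        # A specimen 10x larger is predicted to be weaker; smaller m -> stronger size effect.
        print(weibull_size_scaled_strength(sigma1=20.0, v1=1.0, v2=10.0, m=8.0))   # ~15.0 MPa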

  17. Least Squares Best Fit Method for the Three Parameter Weibull Distribution: Analysis of Tensile and Bend Specimens with Volume or Surface Flaw Failure

    NASA Technical Reports Server (NTRS)

    Gross, Bernard

    1996-01-01

    Material characterization parameters obtained from naturally flawed specimens are necessary for reliability evaluation of non-deterministic advanced ceramic structural components. The least squares best fit method is applied to the three parameter uniaxial Weibull model to obtain the material parameters from experimental tests on volume or surface flawed specimens subjected to pure tension, pure bending, four point or three point loading. Several illustrative example problems are provided.

  1. Predictive Failure of Cylindrical Coatings Using Weibull Analysis

    NASA Technical Reports Server (NTRS)

    Vlcek, Brian L.; Hendricks, Robert C.; Zaretsky, Erwin V.

    2002-01-01

    Rotating, coated wiping rollers used in a high-speed printing application failed primarily from fatigue. Two coating materials were evaluated: a hard, cross-linked, plasticized polyvinyl chloride (PVC) and a softer, plasticized PVC. A total of 447 tests was conducted with these coatings in a production facility. The data were evaluated using Weibull analysis. The softer coating produced more than twice the life of the harder cross-linked coating and reduced the wiper replacement rate by two-thirds, resulting in minimum production interruption.

  2. Numerical approach for the evaluation of Weibull distribution parameters for hydrologic purposes

    NASA Astrophysics Data System (ADS)

    Pierleoni, A.; Di Francesco, S.; Biscarini, C.; Manciola, P.

    2016-06-01

    In hydrology, the statistical description of low-flow phenomena is very important for evaluating the available water resource, especially in a river. The related values can be considered random variables, so probability distributions dealing with extreme values (maxima and/or minima) of the variable play a fundamental role. Computational procedures for the estimation of the parameters featuring these distributions are very useful, especially when embedded into analysis software [1][2] or as standalone applications. In this paper a computational procedure for the evaluation of the Weibull [3] distribution is presented, focusing on the case when the lower limit of the distribution is not known or not set to a specific value a priori. The procedure takes advantage of the Gumbel [4] moment approach to the problem.
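
    The Gumbel moment approach itself is not reproduced here; as a simpler illustration of moment-based Weibull estimation for the two-parameter case, the shape can be solved from the sample coefficient of variation and the scale backed out from the mean, as sketched below with synthetic data (the method and names are this sketch's assumptions).

        import numpy as np
        from scipy.optimize import brentq
        from scipy.special import gamma

        def weibull_moment_fit(x):
            """Method-of-moments fit of a two-parameter Weibull: solve the shape k
            from the sample coefficient of variation, then back out the scale."""
            mean, std = np.mean(x), np.std(x, ddof=1)
            cv2 = (std / mean) ** 2
            f = lambda k: gamma(1.0 + 2.0 / k) / gamma(1.0 + 1.0 / k) ** 2 - 1.0 - cv2
            k = brentq(f, 0.1, 50.0)
            return k, mean / gamma(1.0 + 1.0 / k)

        rng = np.random.default_rng(11)
        flows = rng.weibull(1.8, 200) * 12.0    # synthetic low-flow sample, arbitrary units
        print(weibull_moment_fit(flows))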

  3. Suitability of Gamma, Chi-square, Weibull, and Beta distributions as synthetic unit hydrographs

    NASA Astrophysics Data System (ADS)

    Bhunya, P. K.; Berndtsson, R.; Ojha, C. S. P.; Mishra, S. K.

    2007-02-01

    Most available methods for synthetic unit hydrograph (SUH) derivation involve manual, subjective fitting of a hydrograph through a few data points. Because of this tedious procedure, the generated unit hydrograph is often left unadjusted for unit runoff volume. During recent decades, the use of probability distribution functions (pdfs) in developing SUHs has received much attention because of their similarity with unit hydrograph properties. In this study, the potential of four popular pdfs, i.e., the two-parameter Gamma, three-parameter Beta, two-parameter Weibull, and one-parameter Chi-square distributions, to derive SUHs has been explored. Simple formulae are derived using analytical and numerical schemes to compute the distribution parameters, and their validity is checked with simulation of field data. The Gamma and Chi-square distributions behave analogously, and the Beta distribution approximates a Gamma distribution in a limiting case. Application to field data shows that the Beta and Weibull distributions are more flexible in hydrograph prediction than the Gamma, Chi-square, Gray [Gray, D.M., 1961. Synthetic hydrographs for small drainage areas. In: Proceedings of the ASCE, 87, HY4, pp. 33-54], SCS [SCS, 1957. Use of Storm and Watershed Characteristics in Synthetic Hydrograph Analysis and Application: V. Mockus. US Dept. of Agriculture, Soil Conservation Service, Washington, DC], and Snyder [Snyder, F.F., 1938. Synthetic unit hydrographs. Trans. Am. Geophys. Union 19, 447-454] methods. A sensitivity analysis of pdf parameters on peak flow estimates of a UH indicated that the Gamma and Chi-square distributions overestimate the peak flow value for any overestimation in their parameter estimates, whereas for the Beta and Weibull distributions a reverse trend was observed. Both were found to behave similarly at higher α (ratio of time to base and time to peak of UH) values. Further, an analogous triangular hydrograph approach was used to express the mean and variance

  4. A Weibull distribution model for intradermal administration of ceftazidime.

    PubMed

    Bressolle, F; Laurelli, J M; Gomeni, R; Bechier, J G; Wynn, N R; Galtier, M; Eledjam, J J

    1993-11-01

    The pharmacokinetics of 1 g of ceftazidime administered intradermally was studied in seven healthy volunteers. The objective of the present study was to find the most appropriate mathematical model to describe the drug intake process. The concentration of ceftazidime in plasma was measured by HPLC. The disposition of the drug was described by a one-compartment pharmacokinetic model, with drug intake occurring by different processes: a zero-order process due to the administration and a first-order intake from the injection site to the systemic circulation. The Weibull model was considered as an approximation of the overall process. The mean Weibull parameters were td (time necessary to transfer 63% of the administered drug into the systemic circulation) of 2.75 +/- 0.75 h, and f (shape) of 1.04 +/- 0.15. The mean elimination half-life was 2.0 +/- 0.4 h. The area under the concentration versus time curve obtained in this study (139 +/- 46 mg.h/L) is very near to literature values reported after single intravenous doses of 1 g of ceftazidime, suggesting that the bioavailability of ceftazidime after intradermal administration may be approximately 100%. Moreover, the mean peak plasma concentration (37 +/- 16 mg/L) is in the same range as that reported in the literature after intramuscular administration of a single dose of 1 g. PMID:8289137
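
    The Weibull intake model referred to above has a convenient closed form: the cumulative fraction of the dose transferred is 1 - exp(-(t/td)^f), so td is exactly the time at which about 63% of the dose has entered the systemic circulation. A minimal sketch using the mean parameters quoted above:

        import numpy as np

        def weibull_fraction_absorbed(t, td, f):
            """Cumulative fraction of the dose transferred to the systemic
            circulation; at t = td the fraction is 1 - 1/e, i.e. about 63%."""
            return 1.0 - np.exp(-(t / td) ** f)

        # Mean parameters reported above: td = 2.75 h, f = 1.04.
        for t in (1.0, 2.75, 6.0):
            print(t, round(weibull_fraction_absorbed(t, td=2.75, f=1.04), 3))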

  5. Fracture Strength: Stress Concentration, Extreme Value Statistics, and the Fate of the Weibull Distribution

    NASA Astrophysics Data System (ADS)

    Bertalan, Zsolt; Shekhawat, Ashivni; Sethna, James P.; Zapperi, Stefano

    2014-09-01

    The statistical properties of fracture strength of brittle and quasibrittle materials are often described in terms of the Weibull distribution. However, the weakest-link hypothesis, commonly used to justify it, is expected to fail when fracture occurs after significant damage accumulation. Here we show that this implies that the Weibull distribution is unstable in a renormalization-group sense for a large class of quasibrittle materials. Our theoretical arguments are supported by numerical simulations of disordered fuse networks. We also find that for brittle materials such as ceramics, the common assumption that the strength distribution can be derived from the distribution of preexisting microcracks by using Griffith's criteria is invalid. We attribute this discrepancy to crack bridging. Our findings raise questions about the applicability of Weibull statistics to most practical cases.

  6. Weibull statistical analysis of Krouse type bending fatigue of nuclear materials

    NASA Astrophysics Data System (ADS)

    Haidyrah, Ahmed S.; Newkirk, Joseph W.; Castaño, Carlos H.

    2016-03-01

    A bending fatigue mini-specimen (Krouse-type) was used to study the fatigue properties of nuclear materials. The objective of this paper is to study fatigue of Grade 91 ferritic-martensitic steel using a mini-specimen (Krouse-type) suitable for reactor irradiation studies. These mini-specimens are similar in design (but smaller) to those described in the ASTM B593 standard. The mini-specimens were machined by waterjet and tested as-received. The bending fatigue machine was modified to test the mini-specimen with a specially designed adapter. The cyclic bending fatigue behavior of Grade 91 was studied under constant deflection; the S-N curve was created and the mean fatigue life was analyzed. In this study, the Weibull function was used to describe the failure probability from high to low stress levels of 563, 310, and 265 MPa. The commercial software Minitab 17 was used to calculate the distribution of fatigue life under different stress levels. Both two- and three-parameter Weibull analyses were used to describe the probability of failure, and the plots indicated that the three-parameter Weibull distribution fits the data well.

  7. An EOQ Model with Two-Parameter Weibull Distribution Deterioration and Price-Dependent Demand

    ERIC Educational Resources Information Center

    Mukhopadhyay, Sushanta; Mukherjee, R. N.; Chaudhuri, K. S.

    2005-01-01

    An inventory replenishment policy is developed for a deteriorating item and price-dependent demand. The rate of deterioration is taken to be time-proportional and the time to deterioration is assumed to follow a two-parameter Weibull distribution. A power law form of the price dependence of demand is considered. The model is solved analytically…

  8. Weibull-distributed dyke thickness reflects probabilistic character of host-rock strength.

    PubMed

    Krumbholz, Michael; Hieronymus, Christoph F; Burchardt, Steffi; Troll, Valentin R; Tanner, David C; Friese, Nadine

    2014-01-01

    Magmatic sheet intrusions (dykes) constitute the main form of magma transport in the Earth's crust. The size distribution of dykes is a crucial parameter that controls volcanic surface deformation and eruption rates and is required to realistically model volcano deformation for eruption forecasting. Here we present statistical analyses of 3,676 dyke thickness measurements from different tectonic settings and show that dyke thickness consistently follows the Weibull distribution. Known from materials science, power law-distributed flaws in brittle materials lead to Weibull-distributed failure stress. We therefore propose a dynamic model in which dyke thickness is determined by variable magma pressure that exploits differently sized host-rock weaknesses. The observed dyke thickness distributions are thus site-specific because rock strength, rather than magma viscosity and composition, exerts the dominant control on dyke emplacement. Fundamentally, the strength of geomaterials is scale-dependent and should be approximated by a probability distribution. PMID:24513695

  9. Weibull-distributed dyke thickness reflects probabilistic character of host-rock strength

    PubMed Central

    Krumbholz, Michael; Hieronymus, Christoph F.; Burchardt, Steffi; Troll, Valentin R.; Tanner, David C.; Friese, Nadine

    2014-01-01

    Magmatic sheet intrusions (dykes) constitute the main form of magma transport in the Earth’s crust. The size distribution of dykes is a crucial parameter that controls volcanic surface deformation and eruption rates and is required to realistically model volcano deformation for eruption forecasting. Here we present statistical analyses of 3,676 dyke thickness measurements from different tectonic settings and show that dyke thickness consistently follows the Weibull distribution. Known from materials science, power law-distributed flaws in brittle materials lead to Weibull-distributed failure stress. We therefore propose a dynamic model in which dyke thickness is determined by variable magma pressure that exploits differently sized host-rock weaknesses. The observed dyke thickness distributions are thus site-specific because rock strength, rather than magma viscosity and composition, exerts the dominant control on dyke emplacement. Fundamentally, the strength of geomaterials is scale-dependent and should be approximated by a probability distribution. PMID:24513695

  10. A Discussion on Prediction of Wind Conditions and Power Generation with the Weibull Distribution

    NASA Astrophysics Data System (ADS)

    Saito, Sumio; Sato, Kenichi; Sekizuka, Satoshi

    Assessment of profitability, based on the accurate measurement of the frequency distribution of wind speed over a certain period and the prediction of power generation under measured conditions, is normally a central consideration for the installation of wind turbines. The frequency distribution of wind speed is evaluated, in general, using the Weibull distribution. In order to predict the frequency distribution from the average wind speed, a formula based on the Rayleigh distribution is often used, in which a shape parameter equal to 2 is assumed. The shape parameter is also used with the Weibull distribution; however, its effect on the calculation of wind conditions and wind power has not been sufficiently clarified. This study reports on the evaluation of wind conditions and wind power generation as they are affected by the change of the shape parameter in the Weibull distribution, with regard to two wind turbine generator systems that have the same nominal rated power but different control methods. It further discusses the effect of the shape parameter for prototype wind turbines at a site with measured wind condition data.
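
    The influence of the shape parameter can be made concrete: the Rayleigh assumption is the Weibull with k = 2, and since the mean cubed wind speed is E[v^3] = c^3 Γ(1 + 3/k), two sites with the same mean speed but different k carry different power. A sketch with illustrative values (not the paper's measurements):

        from math import gamma

        def weibull_scale_from_mean(v_mean, k):
            """Scale c such that the Weibull mean c*Gamma(1 + 1/k) equals v_mean."""
            return v_mean / gamma(1.0 + 1.0 / k)

        def mean_cubed_speed(v_mean, k):
            """E[v^3] = c^3 * Gamma(1 + 3/k), proportional to mean power density."""
            c = weibull_scale_from_mean(v_mean, k)
            return c ** 3 * gamma(1.0 + 3.0 / k)

        for k in (1.6, 2.0, 2.4):              # k = 2 is the Rayleigh assumption
            print(k, round(mean_cubed_speed(6.0, k), 1))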

  11. Probabilistic Assessment of Earthquake Hazards: a Comparison among Gamma, Weibull, Generalized Exponential and Lognormal Distributions

    NASA Astrophysics Data System (ADS)

    Pasari, S.

    2013-05-01

    Earthquake recurrence interval is one of the important ingredients of probabilistic seismic hazard assessment (PSHA) for any location. Weibull, gamma, generalized exponential and lognormal distributions are well-established probability models for this recurrence interval estimation, and they share many important characteristics. In this paper, we aim to compare the effectiveness of these models in recurrence interval estimation and eventually in hazard analysis. To assess the appropriateness of these models, we use a complete and homogeneous earthquake catalogue of 20 events (M ≥ 7.0) spanning the period 1846 to 1995 from the North-East Himalayan region (20°-32° N and 87°-100° E). The model parameters are estimated using the modified maximum likelihood estimator (MMLE). No geological or geophysical evidence has been considered in this calculation. The estimated conditional probability becomes quite high after about a decade for an elapsed time of 17 years (i.e. 2012). Moreover, this study shows that the generalized exponential distribution fits the data more closely than the conventional models, and hence it is tentatively concluded that the generalized exponential distribution can be effectively considered in earthquake recurrence studies.
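
    The conditional probability referred to here is the standard quantity P(T ≤ t + Δt | T > t) = [F(t + Δt) - F(t)] / [1 - F(t)]. A sketch with an illustrative Weibull recurrence model (the parameters are invented, not the paper's MMLE fits):

        from scipy import stats

        # Illustrative Weibull recurrence model: shape 1.5, mean interval ~7.5 years.
        model = stats.weibull_min(c=1.5, scale=8.3)

        def conditional_prob(elapsed, window):
            """P(event within `window` years | quiet for `elapsed` years)."""
            F = model.cdf
            return (F(elapsed + window) - F(elapsed)) / (1.0 - F(elapsed))

        print(conditional_prob(elapsed=17.0, window=10.0))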

  12. Weibull statistical analysis of tensile strength of vascular bundle in inner layer of moso bamboo culm.

    PubMed

    Le, Cui; Wanxi, Peng; Zhengjun, Sun; Lili, Shang; Guoning, Chen

    2014-07-01

    Bamboo is a radial gradient variation composite material, but the vascular bundles in the inner layer are evenly distributed. The objective is to determine the regular size pattern of the vascular bundles in the inner layer of Moso bamboo and to perform a Weibull statistical analysis of their tensile strength. The size and shape of the vascular bundles in the inner layer are similar, with an average area of about 0.1550 mm2. A statistical evaluation of the tensile strength of the vascular bundles was conducted by means of Weibull statistics; the results show that the Weibull modulus m is 6.1121, and an accurate reliability assessment of the vascular bundles is obtained.

  13. Dyke thicknesses follow a Weibull distribution controlled by host-rock strength and magmatic overpressure

    NASA Astrophysics Data System (ADS)

    Krumbholz, M.; Hieronymus, C.; Burchardt, S.; Troll, V. R.; Tanner, D. C.; Friese, N.

    2012-04-01

    Dykes are the primary transport channels of magma through the crust and form large parts of volcanic edifices and the oceanic crust. Their dimensions are primary parameters that control magma transport rates and therefore influence, e.g. the size of fissure eruptions and crustal growth. Since the mechanics of dyke emplacement are essentially similar and independent of the tectonic setting, dyke properties should generally follow the same statistical laws. The measurement of dyke thicknesses is, of all parameters, least affected by censoring and truncation effects and therefore most accurately accessible. Nevertheless, dyke thicknesses have been ascribed to follow many different statistical distributions, such as negative exponential and power law. We tested large datasets of dyke thicknesses from different tectonic settings (mid-ocean ridge, oceanic intra-plate) for different statistical distributions (log-normal, exponential, power law (with fixed or variable lower cut-off), Rayleigh, Chi-square, and Weibull). For this purpose, we first converted the probability density functions of each dataset to cumulative distribution functions, thus avoiding arbitrariness in bin size. A non-linear, least-squares fit was then used to compute the parameter(s) of the distribution function. The goodness-of-fit was evaluated using three methods: (1) the residual sum of squares, (2) the Kolmogorov-Smirnov statistics, and (3) p-values using 10,000 synthetic datasets. The results show that, in general, dyke thickness is best described by a Weibull distribution. This suggests material strength is a function of the dimensions of included weaknesses (e.g. fractures), following the "weakest link of a chain" principle. Our datasets may be further subdivided according to dyke lithology (magma type) and type (regional dyke vs. inclined sheet), which leads to an increasingly better fit of the Weibull distribution. Weibull is hence the statistical distribution that universally describes dyke

  14. Weibull Information Fusion Analysis of Semiconductor Quality: a Key Technology for Manufacturing Execution System Reliability

    NASA Astrophysics Data System (ADS)

    Huang, Zhi-Hui; Tang, Ying-Chun; Dai, Kai

    2016-05-01

    The qualified rate of semiconductor materials and products is directly related to manufacturing costs and the survival of the enterprise. A dynamic reliability growth analysis method is applied to study manufacturing execution system reliability growth and thereby improve product quality. With reference to the classical Duane model assumptions and the TGP tracking-growth-forecast programming model, a Weibull distribution model is established from the failure data. Combining the median rank and average rank methods with linear regression and least squares estimation, Weibull information fusion reliability growth curves are fitted. This model overcomes a weakness of the Duane model, namely the low accuracy of its MTBF point estimates; analysis of the failure data shows that the method is basically consistent with the test and evaluation modeling process. The median rank method is used in statistics to determine the distribution function of a random variable and is a good way to address problems such as the limited sample sizes of complex systems. The method therefore has great engineering application value.

  15. Bonus-Malus System with the Claim Frequency Distribution is Geometric and the Severity Distribution is Truncated Weibull

    NASA Astrophysics Data System (ADS)

    Santi, D. N.; Purnaba, I. G. P.; Mangku, I. W.

    2016-01-01

    A Bonus-Malus system is said to be optimal if it is financially balanced for insurance companies and fair for policyholders. Previous research on Bonus-Malus systems concerned the determination of the risk premium applied to all of the severity guaranteed by the insurance company. In fact, not all of the severity claimed by the policyholder may be covered by the insurance company. When the insurance company sets a maximum bound on the severity incurred, it is necessary to modify the model of the severity distribution into a bounded severity distribution. In this paper, an optimal Bonus-Malus system whose claim frequency component has a geometric distribution and whose severity component has a truncated Weibull distribution is discussed. The number of claims is considered to follow a Poisson distribution with an exponentially distributed expected number λ, so the number of claims has a geometric distribution. The severity with a given parameter θ is considered to have a truncated exponential distribution, and θ is modelled using the Levy distribution, so the severity has a truncated Weibull distribution.

  16. A Bayesian estimation on right censored survival data with mixture and non-mixture cured fraction model based on Beta-Weibull distribution

    NASA Astrophysics Data System (ADS)

    Yusuf, Madaki Umar; Bakar, Mohd. Rizam B. Abu

    2016-06-01

    Models for survival data that include the proportion of individuals who are not subject to the event under study are known as cure fraction models, or simply long-term survival models. The two most common models used to estimate the cure fraction are the mixture model and the non-mixture model. In this work, we present mixture and non-mixture cure fraction models for survival data based on the beta-Weibull distribution. This four-parameter distribution has been proposed as an alternative extension of the Weibull distribution in the analysis of lifetime data. This approach allows the inclusion of covariates in the models, where the estimation of the parameters was obtained under a Bayesian approach using Gibbs sampling methods.

  17. Flexural strength of infrared-transmitting window materials: bimodal Weibull statistical analysis

    NASA Astrophysics Data System (ADS)

    Klein, Claude A.

    2011-02-01

    The results of flexural strength testing performed on brittle materials are usually interpreted in light of a "Weibull plot," i.e., by fitting the estimated cumulative failure probability (CFP) to a linearized semiempirical Weibull distribution. This procedure ignores the impact of the testing method on the measured stresses at fracture (specifically, the stressed area and the stress profile), thus resulting in inadequate characterization of the material under investigation. In a previous publication, the author reformulated Weibull's statistical theory of fracture in a manner that emphasizes how the stressed area and the stress profile control the failure probability distribution, which led to the concept of a characteristic strength, that is, the effective strength of a 1-cm2 uniformly stressed area. Fitting the CFP of IR-transmitting materials (AlON, fusion-cast CaF2, oxyfluoride glass, fused SiO2, CVD-ZnSe, and CVD-ZnS) was performed by means of nonlinear regressions but produced evidence of slight, systematic deviations. The purpose of this contribution is to demonstrate that upon extending the previously elaborated model to distributions involving two distinct types of defects (bimodal distributions), the fit agrees with estimated CFPs. Furthermore, the availability of two sets of statistical parameters (characteristic strength and shape parameter) can be taken advantage of to evaluate the failure-probability density, thus providing means of assessing the nature, the critical size, and the size distribution of surface/subsurface flaws.

  18. USE OF WEIBULL FUNCTION FOR NON-LINEAR ANALYSIS OF EFFECTS OF LOW LEVELS OF SIMULATED HERBICIDE DRIFT ON PLANTS

    EPA Science Inventory

    We compared two regression models, which are based on the Weibull and probit functions, for the analysis of pesticide toxicity data from laboratory studies on Illinois crop and native plant species. Both mathematical models are continuous, differentiable, strictly positive, and...

  19. Probabilistic Analysis for Comparing Fatigue Data Based on Johnson-Weibull Parameters

    NASA Technical Reports Server (NTRS)

    Vlcek, Brian L.; Hendricks, Robert C.; Zaretsky, Erwin V.

    2013-01-01

    Leonard Johnson published a methodology for establishing the confidence that two populations of data are different. Johnson's methodology is dependent on limited combinations of test parameters (Weibull slope, mean life ratio, and degrees of freedom) and a set of complex mathematical equations. In this report, a simplified algebraic equation for confidence numbers is derived based on the original work of Johnson. The confidence numbers calculated with this equation are compared to those obtained graphically by Johnson. Using the ratios of mean life, the resultant values of confidence numbers at the 99 percent level deviate less than 1 percent from those of Johnson. At a 90 percent confidence level, the calculated values differ between +2 and 4 percent. The simplified equation is used to rank the experimental lives of three aluminum alloys (AL 2024, AL 6061, and AL 7075), each tested at three stress levels in rotating beam fatigue, analyzed using the Johnson-Weibull method, and compared to the ASTM Standard (E739-91) method of comparison. The ASTM Standard did not statistically distinguish between AL 6061 and AL 7075. However, it is possible to rank the fatigue lives of different materials with a reasonable degree of statistical certainty based on combined confidence numbers using the Johnson-Weibull analysis. AL 2024 was found to have the longest fatigue life, followed by AL 7075, and then AL 6061. The ASTM Standard and the Johnson-Weibull analysis result in the same stress-life exponent p for each of the three aluminum alloys at the median, or L50, lives.

  20. Average capacity for optical wireless communication systems over exponentiated Weibull distribution non-Kolmogorov turbulent channels.

    PubMed

    Cheng, Mingjian; Zhang, Yixin; Gao, Jie; Wang, Fei; Zhao, Fengsheng

    2014-06-20

    We model the average channel capacity of optical wireless communication systems for cases of weak to strong turbulence channels, using the exponentiated Weibull distribution model. The joint effects of beam wander and spread, pointing errors, atmospheric attenuation, and the spectral index of non-Kolmogorov turbulence on system performance are included. Our results show that the average capacity decreases steeply as the propagation length L changes from 0 to 200 m, and decreases slowly or tends to a stable value when the propagation length L is greater than 200 m. In the weak turbulence region, increasing the detection aperture improves the average channel capacity, and atmospheric visibility is an important issue affecting the average channel capacity. In the strong turbulence region, increasing the radius of the detection aperture cannot reduce the effects of atmospheric turbulence on the average channel capacity, and the effect of atmospheric visibility on the channel information capacity can be ignored. The effect of the spectral power exponent on the average channel capacity is greater in the strong turbulence region than in the weak turbulence region. Irrespective of the details determining the turbulent channel, we can say that pointing errors have a significant effect on the average channel capacity of optical wireless communication systems in turbulence channels.

  1. Inferences on the lifetime performance index for Weibull distribution based on censored observations using the max p-value method

    NASA Astrophysics Data System (ADS)

    Lee, Wen-Chuan

    2011-06-01

    In the service (or manufacturing) industries, process capability indices (PCIs) are utilised to assess whether product quality meets the required level, and the lifetime performance index (or larger-the-better PCI) CL, where L is the lower specification limit, is frequently used as a means of measuring product performance. This study first uses the max p-value method to select the optimum value of the shape parameter β of the Weibull distribution, after which β is treated as given. Second, we construct the maximum likelihood estimator (MLE) of CL based on a type II right-censored sample from the Weibull distribution. The MLE of CL is then utilised to develop a novel hypothesis testing procedure, provided that L is known. Finally, we give one practical example to illustrate the use of the testing procedure under a given significance level α.

  2. Mixture and non-mixture cure fraction models based on the generalized modified Weibull distribution with an application to gastric cancer data.

    PubMed

    Martinez, Edson Z; Achcar, Jorge A; Jácome, Alexandre A A; Santos, José S

    2013-12-01

    Cure fraction models are usually used to model lifetime data with long-term survivors. In the present article, we introduce a Bayesian analysis of the four-parameter generalized modified Weibull (GMW) distribution in the presence of cure fraction, censored data and covariates. In order to include the proportion of "cured" patients, mixture and non-mixture formulations are considered. To demonstrate the ability of using this model in the analysis of real data, we consider an application to data from patients with gastric adenocarcinoma. Inferences are obtained by using MCMC (Markov Chain Monte Carlo) methods.

  3. Bayesian Weibull tree models for survival analysis of clinico-genomic data

    PubMed Central

    Clarke, Jennifer; West, Mike

    2008-01-01

    An important goal of research involving gene expression data for outcome prediction is to establish the ability of genomic data to define clinically relevant risk factors. Recent studies have demonstrated that microarray data can successfully cluster patients into low- and high-risk categories. However, the need exists for models which examine how genomic predictors interact with existing clinical factors and provide personalized outcome predictions. We have developed clinico-genomic tree models for survival outcomes which use recursive partitioning to subdivide the current data set into homogeneous subgroups of patients, each with a specific Weibull survival distribution. These trees can provide personalized predictive distributions of the probability of survival for individuals of interest. Our strategy is to fit multiple models; within each model we adopt a prior on the Weibull scale parameter and update this prior via Empirical Bayes whenever the sample is split at a given node. The decision to split is based on a Bayes factor criterion. The resulting trees are weighted according to their relative likelihood values and predictions are made by averaging over models. In a pilot study of survival in advanced stage ovarian cancer we demonstrate that clinical and genomic data are complementary sources of information relevant to survival, and we use the exploratory nature of the trees to identify potential genomic biomarkers worthy of further study. PMID:18618012

  4. Improvement in mechanical properties of jute fibres through mild alkali treatment as demonstrated by utilisation of the Weibull distribution model.

    PubMed

    Roy, Aparna; Chakraborty, Sumit; Kundu, Sarada Prasad; Basak, Ratan Kumar; Majumder, Subhasish Basu; Adhikari, Basudam

    2012-03-01

    Chemically modified jute fibres are potentially useful as natural reinforcement in composite materials. Jute fibres were treated with 0.25%-1.0% sodium hydroxide (NaOH) solution for 0.5-48 h. The hydrophilicity, surface morphology, crystallinity index, thermal and mechanical characteristics of untreated and alkali treated fibres were studied. The two-parameter Weibull distribution model was applied to deal with the variation in mechanical properties of the natural fibres. Alkali treatment enhanced the tensile strength and elongation at break by 82% and 45%, respectively, but decreased the hydrophilicity by 50.5% and the diameter of the fibres by 37%. PMID:22209134

  5. An incentive for coordination in a decentralised service chain with a Weibull lifetime distributed facility

    NASA Astrophysics Data System (ADS)

    Lin, Yi-Fang; Yang, Gino K.; Yang, Chyn-Yng; Chu, Tu-Bin

    2013-10-01

    This article deals with a decentralised service chain consisting of a service provider and a facility owner. The revenue allocation and service price are, respectively, determined by the service provider and the facility owner in a non-cooperative manner. To model this decentralised operation, a Stackelberg game between the two parties is formulated. In the mathematical framework, the service system is assumed to be driven by Poisson customer arrivals and exponential service times. The most common log-linear service demand and Weibull facility lifetime are also adopted. Under these analytical conditions, the decentralised decisions in this game are investigated and then a unique optimal equilibrium is derived. Finally, a coordination mechanism is proposed to improve the efficiency of this decentralised system.

  6. Weibull analysis of fracture test data on bovine cortical bone: influence of orientation.

    PubMed

    Khandaker, Morshed; Ekwaro-Osire, Stephen

    2013-01-01

    The fracture toughness, KIC, of cortical bone has been experimentally determined by several researchers. The variation of KIC values arises from the variation of specimen orientation, shape, and size during the experiment. The fracture toughness of cortical bone is governed by the severest flaw and, hence, may be analyzed using Weibull statistics. To the best of the authors' knowledge, however, no studies of this aspect have been published. The motivation of the study is the evaluation of Weibull parameters in the circumferential-longitudinal (CL) and longitudinal-circumferential (LC) directions. We hypothesized that the Weibull parameters vary depending on the bone microstructure. In the present work, a two-parameter Weibull statistical model was applied to calculate the plane-strain fracture toughness of bovine femoral cortical bone obtained using specimens extracted in the CL and LC directions of the bone. It was found that the Weibull modulus of fracture toughness was larger for CL specimens compared to LC specimens, but the opposite trend was seen for the characteristic fracture toughness. The reason for these trends is the difference in microstructural and extrinsic toughening mechanisms between the CL and LC directions of the bone. The Weibull parameters found in this study can be applied to develop a damage-mechanics model for bone. PMID:24385985

  7. How to do a Weibull statistical analysis of flexural strength data: application to AlON, diamond, zinc selenide, and zinc sulfide

    NASA Astrophysics Data System (ADS)

    Klein, Claude A.; Miller, Richard P.

    2001-09-01

    For the purpose of assessing the strength of engineering ceramics, it is common practice to interpret the measured stresses at fracture in the light of a semi-empirical expression derived from Weibull's theory of brittle fracture, i.e., ln[-ln(1 - P)] = -m ln(σN) + m ln(σ), where P is the cumulative failure probability, σ is the applied tensile stress, m is the Weibull modulus, and σN is the nominal strength. The strength σN, however, does not represent a true measure because it depends not only on the test method but also on the size of the volume or the surface subjected to tensile stresses. In this paper we intend to first clarify issues relating to the application of Weibull's theory of fracture and then make use of the theory to assess the results of equibiaxial flexure testing that was carried out on polycrystalline infrared-transmitting materials. These materials are brittle ceramics, which most frequently fail as a consequence of tensile stresses acting on surface flaws. Since equibiaxial flexure testing is the preferred method of measuring the strength of optical ceramics, we propose to formulate the failure-probability equation in terms of a characteristic strength σC for biaxial loadings, i.e., P = 1 - exp{-π(r0/cm)^2 [Γ(1 + 1/m)]^m (σ/σC)^m}, where r0 is the radius of the loading ring (in centimeters) and Γ(z) designates the gamma function. A Weibull statistical analysis of equibiaxial strength data thus amounts to obtaining the parameters m and σC, which is best done by directly fitting estimated Pi vs. i data to the failure-probability equation; this procedure avoids distorting the distribution through logarithmic linearization and can be implemented by performing a non-linear bivariate regression. Concentric-ring fracture testing performed on five sets of Raytran materials validates the procedure in the sense that the two-parameter model appears to describe the experimental failure
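
    The direct-fitting procedure advocated above can be sketched as follows: estimate Pi for the i-th ranked strength (the midpoint estimator (i - 0.5)/N is one common choice and an assumption here) and fit the biaxial failure-probability equation by nonlinear least squares; the strength values below are invented for illustration.

        import numpy as np
        from scipy.optimize import curve_fit
        from scipy.special import gamma

        R0 = 1.0   # loading-ring radius in cm (illustrative)

        def cfp(sigma, m, sigma_c):
            """P = 1 - exp(-pi * r0^2 * [Gamma(1 + 1/m)]^m * (sigma/sigma_c)^m)."""
            return 1.0 - np.exp(-np.pi * R0 ** 2 * gamma(1.0 + 1.0 / m) ** m
                                * (sigma / sigma_c) ** m)

        strengths = np.sort(np.array([142.0, 151.0, 158.0, 163.0, 170.0,
                                      174.0, 181.0, 186.0, 193.0, 204.0]))   # MPa, invented
        p_est = (np.arange(1, len(strengths) + 1) - 0.5) / len(strengths)

        (m_hat, sc_hat), _ = curve_fit(cfp, strengths, p_est, p0=[10.0, 180.0])
        print(f"m = {m_hat:.1f}, sigma_C = {sc_hat:.0f} MPa")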

  9. A practical and systematic review of Weibull statistics for reporting strengths of dental materials

    PubMed Central

    Quinn, George D.; Quinn, Janet B.

    2011-01-01

    Objectives To review the history, theory and current applications of Weibull analyses sufficient to make informed decisions regarding practical use of the analysis in dental material strength testing. Data References are made to examples in the engineering and dental literature, but this paper also includes illustrative analyses of Weibull plots, fractographic interpretations, and Weibull distribution parameters obtained for a dense alumina, two feldspathic porcelains, and a zirconia. Sources Informational sources include Weibull's original articles, later articles specific to applications and theoretical foundations of Weibull analysis, texts on statistics and fracture mechanics and the international standards literature. Study Selection The chosen Weibull analyses are used to illustrate technique, the importance of flaw size distributions, physical meaning of Weibull parameters and concepts of “equivalent volumes” to compare measured strengths obtained from different test configurations. Conclusions Weibull analysis has a strong theoretical basis and can be of particular value in dental applications, primarily because of test specimen size limitations and the use of different test configurations. Also endemic to dental materials, however, is increased difficulty in satisfying application requirements, such as confirming fracture origin type and diligence in obtaining quality strength data. PMID:19945745
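
    One concept the review leans on, "equivalent volumes", follows from weakest-link scaling: strengths measured at two effective volumes V_1 and V_2 should satisfy σ_1/σ_2 = (V_2/V_1)^(1/m). A minimal sketch with made-up numbers:

      # Weakest-link size scaling between two test configurations.
      def equivalent_strength(sigma_1, v_1, v_2, m):
          """Strength expected at effective volume v_2, given a strength
          sigma_1 measured at effective volume v_1 and Weibull modulus m."""
          return sigma_1 * (v_1 / v_2) ** (1.0 / m)

      # A 900 MPa bend strength at 0.1 mm^3 effective volume, rescaled to
      # 10 mm^3 (m = 10): larger volumes sample worse flaws, so strength drops.
      print(equivalent_strength(900.0, 0.1, 10.0, 10.0))  # ~568 MPa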

  10. Reliability growth modeling analysis of the space shuttle main engines based upon the Weibull process

    NASA Technical Reports Server (NTRS)

    Wheeler, J. T.

    1990-01-01

    The Weibull process, identified as the inhomogeneous Poisson process with the Weibull intensity function, is used to model the reliability growth assessment of the space shuttle main engine test and flight failure data. Additional tables of percentage-point probabilities for several different values of the confidence coefficient have been generated for setting (1-α)100-percent two-sided confidence interval estimates on the mean time between failures. The tabled data pertain to two cases: (1) time-terminated testing, and (2) failure-terminated testing. The critical values of the three test statistics, namely Cramer-von Mises, Kolmogorov-Smirnov, and chi-square, were calculated and tabled for use in the goodness-of-fit tests for the engine reliability data. Numerical results are presented for five different groupings of the engine data that reflect the actual response to the failures.
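
    For the power-law (Weibull-intensity) Poisson process, closed-form maximum-likelihood estimates are available; the sketch below applies the failure-terminated forms to invented failure times and is not the shuttle data or the report's tabled procedure.

      import numpy as np

      # Cumulative failure times (hr), failure-terminated test (illustrative).
      t = np.array([120., 340., 700., 1100., 1800., 2400., 3300., 4100.])
      n, T = t.size, t[-1]

      # MLEs for the Weibull intensity lam*beta*t^(beta-1); the i = n term
      # of the sum is ln(T/T) = 0, so it is dropped.
      beta = n / np.sum(np.log(T / t[:-1]))
      lam = n / T**beta

      # Instantaneous MTBF at the end of test = 1 / intensity at T.
      mtbf = 1.0 / (lam * beta * T**(beta - 1.0))
      print(f"beta = {beta:.2f} (beta < 1 indicates growth), MTBF = {mtbf:.0f} hr")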

  11. Prediction of the curing time to achieve maturity of the nano-cement based concrete using the Weibull distribution model: A complementary data set.

    PubMed

    Jo, Byung Wan; Chakraborty, Sumit; Kim, Heon

    2015-09-01

    This data article provides comparison data for nano-cement based concrete (NCC) and ordinary Portland cement based concrete (OPCC). Concrete samples (OPCC) were fabricated using ten different mix designs, and their characterization data are provided here. The curing time was optimized using the Weibull distribution model by analyzing the rate of change of the compressive strength of the OPCC. Initially, the compressive strength of the OPCC samples was measured after completion of four desired curing times. Thereafter, the curing time required to achieve a particular rate of change of the compressive strength was predicted using the equation derived from the variation of the rate of change of compressive strength with curing time, prior to the optimization of the curing time (at the 99.99% confidence level) using the Weibull distribution model. This data article complements the research article entitled "Prediction of the curing time to achieve maturity of the nano-cement based concrete using the Weibull distribution model" [1].
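
    As a hedged illustration of the idea (the article's exact model form is not reproduced here), a saturating Weibull curve can be fitted to strength-versus-age data and inverted for the curing time that reaches a target maturity fraction:

      import numpy as np
      from scipy.optimize import curve_fit

      days = np.array([3., 7., 14., 28.])   # curing times (illustrative)
      fc = np.array([21., 30., 36., 40.])   # compressive strength, MPa (illustrative)

      def weibull_growth(t, s_u, tau, beta):
          # S(t) = S_u * (1 - exp(-(t/tau)^beta)), a saturating Weibull curve
          return s_u * (1.0 - np.exp(-(t / tau) ** beta))

      (s_u, tau, beta), _ = curve_fit(weibull_growth, days, fc, p0=[45., 7., 1.])

      # Invert for the time reaching a fraction f of the ultimate strength.
      f = 0.95
      t_req = tau * (-np.log(1.0 - f)) ** (1.0 / beta)
      print(f"S_u = {s_u:.1f} MPa, curing time for 95% maturity = {t_req:.1f} d")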

  12. Quality-Related Monitoring and Grading of Granulated Products by Weibull-Distribution Modeling of Visual Images with Semi-Supervised Learning.

    PubMed

    Liu, Jinping; Tang, Zhaohui; Xu, Pengfei; Liu, Wenzhong; Zhang, Jin; Zhu, Jianyong

    2016-01-01

    The topic of online product quality inspection (OPQI) with smart visual sensors is attracting increasing interest in both the academic and industrial communities on account of the natural connection between the visual appearance of products and their underlying qualities. Visual images captured from granulated products (GPs), e.g., cereal products or fabric textiles, are composed of a large number of independent particles or stochastically stacking locally homogeneous fragments, whose analysis and understanding remains challenging. A method of image statistical modeling-based OPQI for GP quality grading and monitoring by a Weibull distribution (WD) model with a semi-supervised learning classifier is presented. WD-model parameters (WD-MPs) of GP images' spatial structures, obtained with omnidirectional Gaussian derivative filtering (OGDF), which were demonstrated theoretically to obey a specific WD model of integral form, were extracted as the visual features. Then, a co-training-style semi-supervised classifier algorithm, named COSC-Boosting, was exploited for semi-supervised GP quality grading, by integrating two independent classifiers with complementary nature in the face of scarce labeled samples. The effectiveness of the proposed OPQI method was verified in the field of automated rice quality grading, where it was compared with commonly used methods and showed superior performance, which lays a foundation for the quality control of GPs on assembly lines. PMID:27367703

  15. SER performance analysis of MPPM FSO system with three decision thresholds over exponentiated Weibull fading channels

    NASA Astrophysics Data System (ADS)

    Wang, Ping; Yang, Bensheng; Guo, Lixin; Shang, Tao

    2015-11-01

    In this work, the symbol error rate (SER) performance of a multiple pulse position modulation (MPPM)-based free-space optical (FSO) communication system is investigated in detail for three different decision thresholds: a fixed decision threshold (FDT), an optimized decision threshold (ODT) and a dynamic decision threshold (DDT), over exponentiated Weibull (EW) fading channels. The effects of aperture averaging on each decision threshold under weak-to-strong turbulence conditions are further studied and compared. The closed-form SER expressions for the three thresholds, derived with the help of the generalized Gauss-Laguerre quadrature rule, are verified by Monte Carlo simulations. This work is helpful for the design of receivers for FSO communication systems.

  16. Modeling the reliability and maintenance costs of wind turbines using Weibull analysis

    SciTech Connect

    Vachon, W.A.

    1996-12-31

    A general description is provided of the basic mathematics and use of Weibull statistical models for modeling component failures and maintenance costs as a function of time. The applicability of the model to wind turbine components and subsystems is discussed with illustrative examples of typical component reliabilities drawn from actual field experiences. Example results indicate the dominant role of key subsystems based on a combination of their failure frequency and repair/replacement costs. The value of the model is discussed as a means of defining (1) maintenance practices, (2) areas in which to focus product improvements, (3) spare parts inventory, and (4) long-term trends in maintenance costs as an important element in project cash flow projections used by developers, investors, and lenders. 6 refs., 8 figs., 3 tabs.
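
    A minimal sketch of this kind of component-level bookkeeping, using assumed (not field-derived) Weibull parameters and repair costs; the expected-cost line approximates each subsystem by its probability of at least one failure over the service life, ignoring renewals:

      import numpy as np

      # (shape beta, characteristic life eta in hr, repair cost in $); assumed.
      subsystems = {
          "gearbox":     (1.2, 80_000., 150_000.),
          "generator":   (1.1, 60_000., 60_000.),
          "blades":      (2.0, 120_000., 90_000.),
          "electronics": (0.9, 40_000., 8_000.),
      }

      hours = 20. * 8760.  # 20-year service life

      for name, (beta, eta, cost) in subsystems.items():
          f = 1.0 - np.exp(-(hours / eta) ** beta)  # P(failure by end of life)
          print(f"{name:12s} P(fail) = {f:.2f}  risk-weighted cost ~ ${f * cost:,.0f}")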

  17. Estimating Age Distributions of Base Flow in Watersheds Underlain by Single and Dual Porosity Formations Using Groundwater Transport Simulation and Weighted Weibull Functions

    NASA Astrophysics Data System (ADS)

    Sanford, W. E.

    2015-12-01

    Age distributions of base flow to streams are important to estimate for predicting the timing of water-quality responses to changes in distributed inputs of nutrients or pollutants at the land surface. Simple models of shallow aquifers will predict exponential age distributions, but more realistic 3-D stream-aquifer geometries will cause deviations from an exponential curve. In addition, in fractured rock terrains the dual nature of the effective and total porosity of the system complicates the age distribution further. In this study shallow groundwater flow and advective transport were simulated in two regions in the Eastern United States—the Delmarva Peninsula and the upper Potomac River basin. The former is underlain by layers of unconsolidated sediment, while the latter consists of folded and fractured sedimentary rocks. Transport of groundwater to streams was simulated using the USGS code MODPATH within 175 and 275 watersheds, respectively. For the fractured rock terrain, calculations were also performed along flow pathlines to account for exchange between mobile and immobile flow zones. Porosities at both sites were calibrated using environmental tracer data (3H, 3He, CFCs and SF6) in wells and springs, and with a 30-year tritium record from the Potomac River. Carbonate and siliciclastic rocks were calibrated to have mobile porosity values of one and six percent, and immobile porosity values of 18 and 12 percent, respectively. The age distributions were fitted to Weibull functions. Whereas an exponential function has one parameter that controls the median age of the distribution, a Weibull function has an extra parameter that controls the slope of the curve. A weighted Weibull function was also developed that potentially allows for four parameters, two that control the median age and two that control the slope, one of each weighted toward early or late arrival times. For both systems the two-parameter Weibull function nearly always produced a substantially

  18. Evaluation of uncertainty in experimental active buckling control of a slender beam-column with disturbance forces using Weibull analysis

    NASA Astrophysics Data System (ADS)

    Enss, Georg C.; Platz, Roland

    2016-10-01

    Buckling of slender load-bearing beam-columns is a crucial failure scenario in light-weight structures, as it may result in the collapse of the entire structure. If the axial load and the load capacity are unknown, stability becomes uncertain. To compensate for this uncertainty, the authors successfully developed and evaluated an approach for active buckling control of a slender beam-column, clamped at the base and pinned at the upper end. Active lateral forces are applied with two piezoelectric stack actuators in opposing directions near the beam-column's clamped base to prevent buckling. A Linear Quadratic Regulator is designed and implemented on the experimental demonstrator, and statistical tests are conducted to prove the effectiveness of the active approach. The load capacity of the beam-column was increased by 40%, and the scatter of buckling occurrences for increasing axial loads was reduced. Weibull analysis is used to evaluate the increase of the load capacity and its related uncertainty compensation.

  19. Kinetic modeling of native Cassava starch thermo-oxidative degradation using Weibull and Weibull-derived models.

    PubMed

    Janković, Bojan

    2014-01-01

    A new approach to kinetic modeling of the thermo-oxidative degradation of starch granules extracted from Cassava roots was developed. Based on the thermoanalytical measurements, three reaction stages were detected. Using Weibull and Weibull-derived (inverse) models, it was found that the first two reaction stages could be described by the dependence of the apparent activation energy (Ea) on the conversion fraction (α(T)) (using "model-free" analysis). It was found that the first reaction stage, which involves dehydration and evaporation of lower molecular mass fractions, can be described with an inverse Weibull model. This model, with its distribution of Ea values and derived distribution parameters, includes the occurrence of a three-dimensional diffusion mechanism. The second reaction stage is very complex; it was found to contain a system of simultaneous reactions (where depolymerization occurs) and can be described with the standard Weibull model. The identified statistical model, with its distribution of Ea values and derived distribution parameters, includes a kinetic model that gives variable reaction order values. Based on the established models, shelf-life studies for the first two stages were carried out. Shelf-life testing has shown that the optimal dehydration time is achieved by programmed heating at a medium heating rate, whereas the optimal degradation time is achieved at the highest heating rate. PMID:23640748

  1. Weibull-Based Design Methodology for Rotating Aircraft Engine Structures

    NASA Technical Reports Server (NTRS)

    Zaretsky, Erwin; Hendricks, Robert C.; Soditus, Sherry

    2002-01-01

    The NASA Energy Efficient Engine (E³-Engine) is used as the basis of a Weibull-based life and reliability analysis. Each component's life, and thus the engine's life, is defined by high-cycle fatigue (HCF) or low-cycle fatigue (LCF). Knowing the cumulative life distribution of each of the components making up the engine, as represented by a Weibull slope, is a prerequisite to predicting the life and reliability of the entire engine. As the engine Weibull slope increases, the predicted lives decrease. The predicted engine lives L5 (95% probability of survival) of approximately 17,000 and 32,000 hr correlate with current engine maintenance practices without and with refurbishment, respectively. The individual high-pressure turbine (HPT) blade lives necessary to obtain a blade system life L0.1 (99.9% probability of survival) of 9000 hr for Weibull slopes of 3, 6 and 9 are 47,391, 20,652 and 15,658 hr, respectively. For a design life of the HPT disks having probable points of failure equal to or greater than 36,000 hr at a probability of survival of 99.9%, the predicted disk system life L0.1 can vary from 9,408 to 24,911 hr.
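
    Treating the engine as a series system, in which the first component failure fails the engine, component Weibulls combine multiplicatively; the sketch below solves for a system L5 life using invented slopes and characteristic lives, not the E³-Engine values:

      import numpy as np
      from scipy.optimize import brentq

      # (Weibull slope, characteristic life in hr) per component; assumed.
      components = [(3.0, 60_000.), (6.0, 45_000.), (9.0, 40_000.)]

      def r_system(t):
          # Series system: survival is the product of component survivals.
          return np.exp(-sum((t / eta) ** beta for beta, eta in components))

      # L5 life: the time at which 95 percent of engines survive.
      l5 = brentq(lambda t: r_system(t) - 0.95, 1.0, 1e6)
      print(f"system L5 life ~ {l5:,.0f} hr")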

  2. Weibull Wind-Speed Distribution Parameters Derived from a Combination of Wind-Lidar and Tall-Mast Measurements Over Land, Coastal and Marine Sites

    NASA Astrophysics Data System (ADS)

    Gryning, Sven-Erik; Floors, Rogier; Peña, Alfredo; Batchvarova, Ekaterina; Brümmer, Burghard

    2016-05-01

    Wind-speed observations from tall towers are used in combination with observations up to 600 m in altitude from a Doppler wind lidar to study the long-term conditions over suburban (Hamburg), rural coastal (Høvsøre) and marine (FINO3) sites. The variability in the wind field among the sites is expressed in terms of mean wind speed and Weibull distribution shape-parameter profiles. The consequences of the carrier-to-noise-ratio (CNR) threshold-value choice on the wind-lidar observations are revealed as follows. When the wind-lidar CNR is lower than a prescribed threshold value, the observations are often filtered out as the uncertainty in the wind-speed measurements increases. For a pulsed heterodyne Doppler lidar, use of the traditional -22 dB CNR threshold value at all measuring levels up to 600 m results in a ≈7% overestimation of the long-term mean wind speed over land, and a ≈12% overestimation in coastal and marine environments. In addition, the height of the profile maximum of the shape parameter of the Weibull distribution (the so-called reversal height) is found to depend on the applied CNR threshold; it is found to be lower at small CNR threshold values. The reversal height is greater in the suburban (high roughness) than in the rural (low roughness) area. In coastal areas the reversal height is lower than that over land and relates to the internal boundary layer that develops downwind from the coastline. Over the sea the shape parameter increases towards the sea surface. A parametrization of the vertical profile of the shape parameter fits well with observations over land, coastal regions and over the sea. An applied model for the dependence of the reversal height on the surface roughness is in good agreement with the observations over land.

  3. Calculation of Weibull strength parameters and Batdorf flow-density constants for volume- and surface-flaw-induced fracture in ceramics

    NASA Technical Reports Server (NTRS)

    Pai, Shantaram S.; Gyekenyesi, John P.

    1989-01-01

    The calculation of the shape and scale parameters of the two-parameter Weibull distribution is described using the least-squares analysis and maximum likelihood methods for volume- and surface-flaw-induced fracture in ceramics with complete and censored samples. Detailed procedures are given for evaluating 90 percent confidence intervals for maximum likelihood estimates of shape and scale parameters, the unbiased estimates of the shape parameters, and the Weibull mean values and corresponding standard deviations. Furthermore, the necessary steps are described for detecting outliers and for calculating the Kolmogorov-Smirnov and the Anderson-Darling goodness-of-fit statistics and 90 percent confidence bands about the Weibull distribution. It is also shown how to calculate the Batdorf flaw-density constants by using the Weibull distribution statistical parameters. The techniques described were verified with several example problems from the open literature, and were coded in the Structural Ceramics Analysis and Reliability Evaluation (SCARE) design program.
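
    A short Python sketch of the two estimation routes the report describes, maximum likelihood and least squares on the linearized form, plus a Kolmogorov-Smirnov check; the strength values are illustrative:

      import numpy as np
      from scipy import stats

      s = np.sort(np.array([312., 355., 370., 389., 402., 419., 437., 466.]))
      n = s.size

      # Maximum-likelihood estimates (two-parameter form, location = 0).
      m_mle, _, s0_mle = stats.weibull_min.fit(s, floc=0)

      # Least-squares estimates from ln[-ln(1-P)] = m ln(s) - m ln(s0).
      p = (np.arange(1, n + 1) - 0.5) / n
      m_ls, c = np.polyfit(np.log(s), np.log(-np.log(1.0 - p)), 1)
      s0_ls = np.exp(-c / m_ls)

      # Kolmogorov-Smirnov goodness of fit for the MLE parameters.
      ks = stats.kstest(s, stats.weibull_min(m_mle, scale=s0_mle).cdf)
      print(m_mle, s0_mle, m_ls, s0_ls, ks.statistic)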

  4. EVALUATION OF SPRING OPERATED RELIEF VALVE MAINTENANCE INTERVALS AND EXTENSION OF MAINTENANCE TIMES USING A WEIBULL ANALYSIS WITH MODIFIED BAYESIAN UPDATING

    SciTech Connect

    Harris, S.; Gross, R.; Mitchell, E.

    2011-01-18

    The Savannah River Site (SRS) spring operated pressure relief valve (SORV) maintenance intervals were evaluated using an approach provided by the American Petroleum Institute (API RP 581) for risk-based inspection technology (RBI). In addition, the impact of extending the inspection schedule was evaluated using Monte Carlo Simulation (MCS). The API RP 581 approach is characterized as a Weibull analysis with modified Bayesian updating provided by SRS SORV proof testing experience. Initial Weibull parameter estimates were updated as per SRS's historical proof test records contained in the Center for Chemical Process Safety (CCPS) Process Equipment Reliability Database (PERD). The API RP 581 methodology was used to estimate the SORV's probability of failing on demand (PFD), and the annual expected risk. The API RP 581 methodology indicates that the current SRS maintenance plan is conservative. Cost savings may be attained in certain mild service applications that present low PFD and overall risk. Current practices are reviewed and recommendations are made for extending inspection intervals. The paper gives an illustration of the inspection costs versus the associated risks by using API RP 581 Risk Based Inspection (RBI) Technology. A cost effective maintenance frequency balancing both financial risk and inspection cost is demonstrated.
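
    The API RP 581 procedure itself is not reproduced here, but the flavor of a Weibull analysis in which the shape is held fixed and the scale is updated from proof-test experience (a Weibayes-style update) can be sketched with assumed numbers:

      import numpy as np

      beta_prior = 1.8                # assumed shape from generic valve data
      service_yr = np.array([2., 2., 3., 4., 4., 5.])  # time on each valve
      failures = 2                    # proof tests that failed on demand

      # Weibayes scale estimate with beta fixed: eta = (sum t^beta / r)^(1/beta)
      eta = (np.sum(service_yr ** beta_prior) / failures) ** (1.0 / beta_prior)

      # Probability of failure on demand at the end of a candidate interval.
      t = 6.0
      pfd = 1.0 - np.exp(-(t / eta) ** beta_prior)
      print(f"eta = {eta:.1f} yr, PFD at a {t:.0f}-yr interval = {pfd:.2f}")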

  5. A Weibull characterization for tensile fracture of multicomponent brittle fibers

    NASA Technical Reports Server (NTRS)

    Barrows, R. G.

    1977-01-01

    A statistical characterization for multicomponent brittle fibers is presented. The method, which is an extension of usual Weibull distribution procedures, statistically considers the components making up a fiber (e.g., substrate, sheath, and surface) as separate entities and taken together as in a fiber. Tensile data for silicon carbide fiber and for an experimental carbon-boron alloy fiber are evaluated in terms of the proposed multicomponent Weibull characterization.

  6. Finite-size effects on return interval distributions for weakest-link-scaling systems.

    PubMed

    Hristopulos, Dionissios T; Petrakis, Manolis P; Kaniadakis, Giorgio

    2014-05-01

    The Weibull distribution is a commonly used model for the strength of brittle materials and earthquake return intervals. Deviations from Weibull scaling, however, have been observed in earthquake return intervals and the fracture strength of quasibrittle materials. We investigate weakest-link scaling in finite-size systems and deviations of empirical return interval distributions from the Weibull distribution function. Our analysis employs the ansatz that the survival probability function of a system with complex interactions among its units can be expressed as the product of the survival probability functions for an ensemble of representative volume elements (RVEs). We show that if the system comprises a finite number of RVEs, it obeys the κ-Weibull distribution. The upper tail of the κ-Weibull distribution declines as a power law, in contrast with Weibull scaling. The hazard rate function of the κ-Weibull distribution decreases linearly after a waiting time τ_c ∝ n^(1/m), where m is the Weibull modulus and n is the system size in terms of representative volume elements. We conduct statistical analysis of experimental data and simulations which show that the κ-Weibull provides competitive fits to the return interval distributions of seismic data and of avalanches in a fiber bundle model. In conclusion, using theoretical and statistical analysis of real and simulated data, we demonstrate that the κ-Weibull distribution is a useful model for extreme-event return intervals in finite-size systems. PMID:25353774
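
    Using the Kaniadakis κ-exponential, exp_κ(x) = (sqrt(1 + κ²x²) + κx)^(1/κ), the κ-Weibull survival function can be compared numerically with the ordinary Weibull tail; the parameter values below are arbitrary:

      import numpy as np

      def exp_kappa(x, kappa):
          # Kaniadakis kappa-exponential; reduces to exp(x) as kappa -> 0.
          return (np.sqrt(1.0 + kappa**2 * x**2) + kappa * x) ** (1.0 / kappa)

      def kappa_weibull_survival(t, m, lam, kappa):
          # S(t) = exp_kappa(-(t/lam)^m); the upper tail decays as a power
          # law, unlike the stretched-exponential Weibull tail.
          return exp_kappa(-(t / lam) ** m, kappa)

      t = np.logspace(0, 3, 7)
      print(kappa_weibull_survival(t, m=1.5, lam=10.0, kappa=0.3))  # heavy tail
      print(np.exp(-(t / 10.0) ** 1.5))                             # Weibull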

  7. A meta-analysis of estimates of the AIDS incubation distribution.

    PubMed

    Cooley, P C; Myers, L E; Hamill, D N

    1996-06-01

    Information from 12 studies is combined to estimate the AIDS incubation distribution with greater precision than is possible from a single study. The analysis uses a hierarchy of parametric models based on a four-parameter generalized F distribution. This general model contains four standard two-parameter distributions as special cases: the Weibull, gamma, log-logistic, and lognormal distributions. These four special cases subsume three distinct asymptotic hazard behaviors. As time increases beyond the median of approximately 10 years, the hazard can increase to infinity (Weibull), can plateau at some constant level (gamma), or can decrease to zero (log-logistic and lognormal). The Weibull, gamma and log-logistic distributions, which represent the three distinct asymptotic hazard behaviors, all fit the data as well as the generalized F distribution at the 25 percent significance level. Hence, we conclude that incubation data are still too limited to ascertain the specific hazard assumption that should be utilized in studies of the AIDS epidemic. Accordingly, efforts to model the AIDS epidemic (e.g., back-calculation approaches) should allow the incubation distribution to take several forms to adequately represent HIV estimation uncertainty. It is recommended that, at a minimum, the specific Weibull, gamma and log-logistic distributions estimated in this meta-analysis should all be used in modeling the AIDS epidemic, to reflect this uncertainty.

  8. A Weibull statistics-based lignocellulose saccharification model and a built-in parameter accurately predict lignocellulose hydrolysis performance.

    PubMed

    Wang, Mingyu; Han, Lijuan; Liu, Shasha; Zhao, Xuebing; Yang, Jinghua; Loh, Soh Kheang; Sun, Xiaomin; Zhang, Chenxi; Fang, Xu

    2015-09-01

    Renewable energy from lignocellulosic biomass has been deemed an alternative to depleting fossil fuels. In order to improve this technology, we aim to develop robust mathematical models for the enzymatic lignocellulose degradation process. By analyzing 96 groups of previously published and newly obtained lignocellulose saccharification results and fitting them to the Weibull distribution, we discovered that Weibull statistics can accurately predict lignocellulose saccharification data, regardless of the type of substrates, enzymes and saccharification conditions. A mathematical model for enzymatic lignocellulose degradation was subsequently constructed based on Weibull statistics. Further analysis of the mathematical structure of the model and of experimental saccharification data showed the significance of the two parameters in this model. In particular, the λ value, defined as the characteristic time, represents the overall performance of the saccharification system. This suggestion was further supported by statistical analysis of experimental saccharification data and analysis of the glucose production levels when the λ and n values change. In conclusion, the constructed Weibull statistics-based model can accurately predict lignocellulose hydrolysis behavior, and the λ parameter can be used to assess the overall performance of enzymatic lignocellulose degradation. Advantages and potential applications of the model and the λ value in saccharification performance assessment are discussed. PMID:26121186
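
    A minimal sketch of fitting the Weibull saccharification form Y(t) = Y_max[1 - exp(-(t/λ)^n)] to a time course; the glucose values are invented, and λ is the characteristic time highlighted in the abstract:

      import numpy as np
      from scipy.optimize import curve_fit

      hours = np.array([2., 6., 12., 24., 48., 72.])
      glucose = np.array([4.1, 9.8, 15.2, 21.0, 25.1, 26.3])  # g/L, illustrative

      def weibull_sacch(t, y_max, lam, n):
          # Conversion follows 1 - exp(-(t/lam)^n); lam sets the time scale.
          return y_max * (1.0 - np.exp(-(t / lam) ** n))

      (y_max, lam, n), _ = curve_fit(weibull_sacch, hours, glucose,
                                     p0=[30., 20., 1.])
      print(f"lambda = {lam:.1f} h (characteristic time), n = {n:.2f}")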

  10. A Weibull characterization for tensile fracture of multicomponent brittle fibers

    NASA Technical Reports Server (NTRS)

    Barrows, R. G.

    1977-01-01

    Necessary to the development and understanding of brittle fiber reinforced composites is a means to statistically describe fiber strength and strain-to-failure behavior. A statistical characterization for multicomponent brittle fibers is presented. The method, which is an extension of usual Weibull distribution procedures, statistically considers the components making up a fiber (e.g., substrate, sheath, and surface) as separate entities and taken together as in a fiber. Tensile data for silicon carbide fiber and for an experimental carbon-boron alloy fiber are evaluated in terms of the proposed multicomponent Weibull characterization.

  11. Empirical model based on Weibull distribution describing the destruction kinetics of natural microbiota in pineapple (Ananas comosus L.) puree during high-pressure processing.

    PubMed

    Chakraborty, Snehasis; Rao, Pavuluri Srinivasa; Mishra, Hari Niwas

    2015-10-15

    High pressure inactivation of natural microbiota, viz. aerobic mesophiles (AM), psychrotrophs (PC), yeasts and molds (YM), total coliforms (TC) and lactic acid bacteria (LAB), in pineapple puree was studied within the experimental domain of 0.1-600 MPa and 30-50 °C with treatment times up to 20 min. Complete destruction of yeasts and molds was obtained at 500 MPa/50 °C/15 min, whereas no counts were detected for TC and LAB at 300 MPa/30 °C/15 min. A maximum of two log cycles of reduction was obtained for YM during pulse pressurization at the severe process intensity of 600 MPa/50 °C/20 min. The Weibull model clearly described the non-linearity of the survival curves during the isobaric period. A tailing effect, as confirmed by the shape parameter (β) of the survival curve, was obtained in the case of YM (β<1), whereas a shouldering effect (β>1) was observed for the other microbial groups. Analogous to thermal death kinetics, the activation energy (Ea, kJ·mol⁻¹) and the activation volume (Va, mL·mol⁻¹) values were computed further to describe the temperature and pressure dependencies of the scale parameter (δ, min), respectively. A higher δ value was obtained for each microbe at a lower temperature, and it decreased with an increase in pressure. A secondary kinetic model was developed describing the inactivation rate (k, min⁻¹) as a function of pressure (P, MPa) and temperature (T, K), including the dependencies of Ea and Va on P and T, respectively. PMID:26202323
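
    The Weibull survival model used here is often written, in the Mafart form, as log10(N/N0) = -(t/δ)^β, where δ is the time to the first decimal reduction and β produces tailing (β<1) or shouldering (β>1). A sketch with invented counts:

      import numpy as np
      from scipy.optimize import curve_fit

      minutes = np.array([0., 3., 6., 9., 12., 15.])
      log_s = np.array([0., -0.6, -1.5, -2.6, -3.4, -4.5])  # log10(N/N0), illustrative

      def weibull_log_survival(t, delta, beta):
          # log10(N/N0) = -(t/delta)^beta
          return -(t / delta) ** beta

      (delta, beta), _ = curve_fit(weibull_log_survival, minutes, log_s,
                                   p0=[3.0, 1.0])
      print(f"delta = {delta:.2f} min, beta = {beta:.2f} "
            f"({'tailing' if beta < 1 else 'shouldering'})")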

  13. Calculation of Weibull strength parameters and Batdorf flow-density constants for volume- and surface-flaw-induced fracture in ceramics

    NASA Technical Reports Server (NTRS)

    Pai, Shantaram S.; Gyekenyesi, John P.

    1988-01-01

    The calculation of the shape and scale parameters of the two-parameter Weibull distribution is described using the least-squares analysis and maximum likelihood methods for volume- and surface-flaw-induced fracture in ceramics with complete and censored samples. Detailed procedures are given for evaluating 90 percent confidence intervals for maximum likelihood estimates of shape and scale parameters, the unbiased estimates of the shape parameters, and the Weibull mean values and corresponding standard deviations. Furthermore, the necessary steps are described for detecting outliers and for calculating the Kolmogorov-Smirnov and the Anderson-Darling goodness-of-fit statistics and 90 percent confidence bands about the Weibull distribution. It is also shown how to calculate the Batdorf flaw-density constants by using the Weibull distribution statistical parameters. The techniques described were verified with several example problems from the open literature, and were coded in the Structural Ceramics Analysis and Reliability Evaluation (SCARE) design program.

  14. Performance of heterodyne differential phase-shift keying system over double Weibull free-space optical channel

    NASA Astrophysics Data System (ADS)

    Musa Hasan, Omar

    2015-06-01

    In this paper, an analysis of the bit error rate (BER), outage probability, and outage rate of a heterodyne differential phase-shift keying system over a double Weibull-distributed free-space optical (FSO) channel is presented. The channel statistics are modeled based on scintillation theory and derived as the product of two independent Weibull random variables. Novel closed-form expressions for evaluating the BER, outage probability, and outage rate are derived, taking into account the effects of turbulence strength and inner-scale turbulent cell size. Numerical results are provided to evaluate the FSO system performance for weak-to-strong turbulence channel conditions and inner-scale turbulent cell sizes. The BER, outage probability, and outage rate performance are displayed for different turbulence strength conditions, inner-scale values and signal-to-noise ratios.

  15. Calculation of Weibull strength parameters, Batdorf flaw density constants and related statistical quantities using PC-CARES

    NASA Technical Reports Server (NTRS)

    Szatmary, Steven A.; Gyekenyesi, John P.; Nemeth, Noel N.

    1990-01-01

    This manual describes the operation and theory of the PC-CARES (Personal Computer-Ceramic Analysis and Reliability Evaluation of Structures) computer program for the IBM PC and compatibles running the PC-DOS/MS-DOS or IBM/MS-OS/2 (version 1.1 or higher) operating systems. The primary purpose of this code is to estimate Weibull material strength parameters, the Batdorf crack density coefficient, and other related statistical quantities. Included in the manual is the description of the calculation of the shape and scale parameters of the two-parameter Weibull distribution using the least-squares analysis and maximum likelihood methods for volume- and surface-flaw-induced fracture in ceramics with complete and censored samples. The methods for detecting outliers and for calculating the Kolmogorov-Smirnov and the Anderson-Darling goodness-of-fit statistics and 90 percent confidence bands about the Weibull line, as well as the techniques for calculating the Batdorf flaw-density constants, are also described.

  16. Measuring the Weibull modulus of microscope slides

    NASA Technical Reports Server (NTRS)

    Sorensen, Carl D.

    1992-01-01

    The objectives are that students will understand why a three-point bending test is used for ceramic specimens, learn how Weibull statistics are used to measure the strength of brittle materials, and appreciate the amount of variation in the strength of brittle materials with low Weibull modulus. They will understand how the modulus of rupture is used to represent the strength of specimens in a three-point bend test. In addition, students will learn that a logarithmic transformation can be used to convert an exponent into the slope of a straight line. The experimental procedures are explained.
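
    The logarithmic transformation referred to above turns the Weibull CDF into a straight line whose slope is the modulus; a sketch of the classroom calculation with invented modulus-of-rupture data and Bernard's median-rank estimator:

      import numpy as np

      mor = np.sort(np.array([52., 61., 66., 70., 74., 78., 83., 91.]))  # MPa
      n = mor.size

      # Bernard's median-rank estimate of failure probability per specimen.
      p = (np.arange(1, n + 1) - 0.3) / (n + 0.4)

      # ln ln[1/(1-P)] versus ln(MOR) is linear with slope m for Weibull data.
      x, y = np.log(mor), np.log(np.log(1.0 / (1.0 - p)))
      m, c = np.polyfit(x, y, 1)
      print(f"Weibull modulus m ~ {m:.1f}, characteristic MOR ~ {np.exp(-c / m):.0f} MPa")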

  17. Transmission overhaul and replacement predictions using Weibull and renewal theory

    NASA Technical Reports Server (NTRS)

    Savage, M.; Lewicki, D. G.

    1989-01-01

    A method to estimate the frequency of transmission overhauls is presented. This method is based on the two-parameter Weibull statistical distribution for component life. A second method is presented to estimate the number of replacement components needed to support the transmission overhaul pattern. The second method is based on renewal theory. Confidence statistics are applied with both methods to improve the statistical estimate of sample behavior. A transmission example is also presented to illustrate the use of the methods. Transmission overhaul frequency and component replacement calculations are included in the example.

  18. Outage performance of multihop free-space optical communication system over exponentiated Weibull fading channels with nonzero boresight pointing errors

    NASA Astrophysics Data System (ADS)

    Liu, Xiao-xia; Wang, Ping; Cao, Tian

    2016-09-01

    The outage performance of the multihop free-space optical (FSO) communication system with decode-and-forward (DF) protocol is studied by considering the joint effects of nonzero boresight pointing errors and atmospheric turbulence modeled by exponentiated Weibull (EW) distribution. The closed-form analytical expression of outage probability is derived, and the results are validated through Monte Carlo simulation. Furthermore, the detailed analysis is provided to evaluate the impacts of turbulence strength, receiver aperture size, boresight displacement, beamwidth and number of relays on the outage performance for the studied system.

  19. Probability density function characterization for aggregated large-scale wind power based on Weibull mixtures

    DOE PAGES

    Gomez-Lazaro, Emilio; Bueso, Maria C.; Kessler, Mathieu; Martin-Martinez, Sergio; Zhang, Jie; Hodge, Bri -Mathias; Molina-Garcia, Angel

    2016-02-02

    Here, the Weibull probability distribution has been widely applied to characterize wind speeds for wind energy resources. Wind power generation modeling is different, however, due in particular to power curve limitations, wind turbine control methods, and transmission system operation requirements. These differences are even greater for aggregated wind power generation in power systems with high wind penetration. Consequently, models based on one Weibull component can provide poor characterizations for aggregated wind power generation. With this aim, the present paper focuses on discussing Weibull mixtures to characterize the probability density function (PDF) for aggregated wind power generation. PDFs of wind power data are first classified according to hourly and seasonal patterns. The selection of the number of components in the mixture is analyzed through two well-known criteria: the Akaike information criterion (AIC) and the Bayesian information criterion (BIC). Finally, the optimal number of Weibull components for maximum likelihood is explored for the defined patterns, including the estimated weight, scale, and shape parameters. Results show that multi-Weibull models are more suitable for characterizing aggregated wind power data due to the impact of distributed generation, the variety of wind speed values and wind power curtailment.
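
    A sketch of fitting one-, two- and three-component Weibull mixtures by direct likelihood maximization and ranking them by AIC/BIC, run on synthetic stand-in data (the paper's wind power datasets and fitting machinery are not reproduced):

      import numpy as np
      from scipy import stats
      from scipy.optimize import minimize

      rng = np.random.default_rng(0)
      power = np.concatenate([
          stats.weibull_min.rvs(1.6, scale=0.25, size=600, random_state=rng),
          stats.weibull_min.rvs(3.0, scale=0.70, size=400, random_state=rng),
      ])  # stand-in for normalized aggregated wind power

      def neg_loglik(theta, x, k):
          # theta holds k-1 weight logits, then (log-shape, log-scale) pairs.
          w = np.append(np.exp(theta[:k - 1]), 1.0)
          w /= w.sum()
          pdf = sum(w[i] * stats.weibull_min.pdf(x, np.exp(theta[k - 1 + 2 * i]),
                                                 scale=np.exp(theta[k + 2 * i]))
                    for i in range(k))
          return -np.sum(np.log(pdf + 1e-300))

      for k in (1, 2, 3):
          theta0 = np.zeros(k - 1 + 2 * k)
          res = minimize(neg_loglik, theta0, args=(power, k),
                         method="Nelder-Mead", options={"maxiter": 20000})
          n_par = theta0.size
          print(f"k={k}: AIC={2 * n_par + 2 * res.fun:.1f}  "
                f"BIC={n_par * np.log(power.size) + 2 * res.fun:.1f}")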

  20. Weibull-k Revisited: "Tall" Profiles and Height Variation of Wind Statistics

    NASA Astrophysics Data System (ADS)

    Kelly, Mark; Troen, Ib; Jørgensen, Hans E.

    2014-07-01

    The Weibull distribution is commonly used to describe climatological wind-speed distributions in the atmospheric boundary layer. While vertical profiles of mean wind speed in the atmospheric boundary layer have received significant attention, the variation of the shape of the wind distribution with height is less understood. Previously we derived a probabilistic model based on similarity theory for calculating the effects of stability and planetary boundary-layer depth upon long-term mean wind profiles. However, some applications (e.g. wind energy estimation) require the Weibull shape parameter (k), as well as mean wind speed. Towards the aim of improving predictions of the Weibull-k profile, we develop expressions for the profile of the long-term variance of wind speed, including a method extending our probabilistic wind-profile theory; together these two profiles lead to a profile of the Weibull shape parameter. Further, an alternate model for the vertical profile of the Weibull shape parameter is made, improving upon a basis set forth by Wieringa (Boundary-Layer Meteorol, 1989, Vol. 47, 85-110), and connecting with a newly-corrected corollary of the perturbed geostrophic-drag theory of Troen and Petersen (European Wind Atlas, 1989, Risø National Laboratory, Roskilde). Comparing the models for Weibull-k profiles, a new interpretation and explanation is given for the vertical variation of the shape of wind-speed distributions. Results of the modelling are shown for a number of sites, with a discussion of the models' efficacy and applicability. The latter includes a comparative evaluation of Wieringa-type empirical models and perturbed-geostrophic forms with regard to surface-layer behaviour, as well as for heights where climatological wind-speed variability is not dominated by surface effects.

  1. Distributed analysis in ATLAS

    NASA Astrophysics Data System (ADS)

    Dewhurst, A.; Legger, F.

    2015-12-01

    The ATLAS experiment accumulated more than 140 PB of data during the first run of the Large Hadron Collider (LHC) at CERN. The analysis of such an amount of data is a challenging task for the distributed physics community. The Distributed Analysis (DA) system of the ATLAS experiment is an established and stable component of the ATLAS distributed computing operations. About half a million user jobs are running daily on DA resources, submitted by more than 1500 ATLAS physicists. The reliability of the DA system during the first run of the LHC and the following shutdown period has been high thanks to the continuous automatic validation of the distributed analysis sites and the user support provided by a dedicated team of expert shifters. During the LHC shutdown, the ATLAS computing model has undergone several changes to improve the analysis workflows, including the re-design of the production system, a new analysis data format and event model, and the development of common reduction and analysis frameworks. We report on the impact such changes have on the DA infrastructure, describe the new DA components, and include recent performance measurements.

  2. Expected value of sample information for Weibull survival data.

    PubMed

    Brennan, Alan; Kharroubi, Samer A

    2007-11-01

    Expected value of sample information (EVSI) involves simulating data collection, Bayesian updating, and re-examining decisions. Bayesian updating in Weibull models typically requires Markov chain Monte Carlo (MCMC). We examine five methods for calculating posterior expected net benefits: two heuristic methods (data lumping and pseudo-normal); two Bayesian approximation methods (Tierney & Kadane, Brennan & Kharroubi); and the gold standard MCMC. A case study computes EVSI for 25 study options. We compare accuracy, computation time and trade-offs of EVSI versus study costs. Brennan & Kharroubi (B&K) approximates expected net benefits to within +/-1% of MCMC. The other methods, data lumping (+54%), pseudo-normal (-5%) and Tierney & Kadane (+11%), are less accurate. B&K also produces the most accurate EVSI approximation. Pseudo-normal is also reasonably accurate, whilst Tierney & Kadane consistently underestimates and data lumping exhibits large variance. B&K computation is 12 times faster than the MCMC method in our case study. Though not always faster, B&K provides the most computational efficiency when net benefits require appreciable computation time and when many MCMC samples are needed. The methods enable EVSI computation for economic models with Weibull survival parameters. The approach can generalize to complex multi-state models and to survival analyses using other smooth parametric distributions. PMID:17328046

  3. ATLAS reliability analysis

    SciTech Connect

    Bartsch, R.R.

    1995-09-01

    Key elements of the 36 MJ ATLAS capacitor bank have been evaluated for individual probabilities of failure. These have been combined to estimate system reliability which is to be greater than 95% on each experimental shot. This analysis utilizes Weibull or Weibull-like distributions with increasing probability of failure with the number of shots. For transmission line insulation, a minimum thickness is obtained and for the railgaps, a method for obtaining a maintenance interval from forthcoming life tests is suggested.

  4. Atlas Distributed Analysis Tools

    NASA Astrophysics Data System (ADS)

    de La Hoz, Santiago Gonzalez; Ruiz, Luis March; Liko, Dietrich

    2008-06-01

    The ATLAS production system has been successfully used to run production of simulation data at an unprecedented scale. Up to 10000 jobs were processed in one day. The experience obtained operating the system on several grid flavours was essential to perform a user analysis using grid resources. First tests of the distributed analysis system were then performed. In the preparation phase, data was registered in the LHC File Catalog (LFC) and replicated to external sites. For the main test, few resources were used. All these tests are only a first step towards the validation of the computing model. The ATLAS computing management board decided to integrate the collaboration's efforts in distributed analysis into a single project, GANGA. The goal is to test the reconstruction and analysis software in large-scale data production using grid flavours at several sites. GANGA allows trivial switching between running test jobs on a local batch system and running large-scale analyses on the Grid; it provides job splitting and merging, and includes automated job monitoring and output retrieval.

  5. Collective Weibull behavior of social atoms: Application of the rank-ordering statistics to historical extreme events

    NASA Astrophysics Data System (ADS)

    Chen, Chien-Chih; Tseng, Chih-Yuan; Telesca, Luciano; Chi, Sung-Ching; Sun, Li-Chung

    2012-02-01

    Analogous to crustal earthquakes in natural fault systems, we here consider the dynasty collapses as extreme events in human society. Duration data of ancient Chinese and Egyptian dynasties provides a good chance of exploring the collective behavior of the so-called social atoms. By means of the rank-ordering statistics, we demonstrate that the duration data of those ancient dynasties could be described with good accuracy by the Weibull distribution. It is thus amazing that the distribution of time to failure of human society, i.e. the disorder of a historical dynasty, follows the widely accepted Weibull process as natural material fails.

  6. Aerospace Applications of Weibull and Monte Carlo Simulation with Importance Sampling

    NASA Technical Reports Server (NTRS)

    Bavuso, Salvatore J.

    1998-01-01

    Recent developments in reliability modeling and computer technology have made it practical to use the Weibull time to failure distribution to model the system reliability of complex fault-tolerant computer-based systems. These system models are becoming increasingly popular in space systems applications as a result of mounting data that support the decreasing Weibull failure distribution and the expectation of increased system reliability. This presentation introduces the new reliability modeling developments and demonstrates their application to a novel space system application. The application is a proposed guidance, navigation, and control (GN&C) system for use in a long duration manned spacecraft for a possible Mars mission. Comparisons to the constant failure rate model are presented and the ramifications of doing so are discussed.

  7. The distribution of first-passage times and durations in FOREX and future markets

    NASA Astrophysics Data System (ADS)

    Sazuka, Naoya; Inoue, Jun-ichi; Scalas, Enrico

    2009-07-01

    Possible distributions are discussed for intertrade durations and first-passage processes in financial markets. The viewpoint of renewal theory is assumed. In order to represent market data with relatively long durations, two types of distributions are used, namely a distribution derived from the Mittag-Leffler survival function and the Weibull distribution. For the Mittag-Leffler type distribution, the average waiting time (residual life time) is strongly dependent on the choice of a cut-off parameter t_max, whereas the results based on the Weibull distribution do not depend on such a cut-off. Therefore, a Weibull distribution is more convenient than a Mittag-Leffler type if one wishes to evaluate relevant statistics such as the average waiting time in financial markets with long durations. On the other hand, we find that the Gini index is rather independent of the cut-off parameter. Based on the above considerations, we propose a good candidate for describing the distribution of first-passage times in a market: the Weibull distribution with a power-law tail. This distribution compensates the gap between theoretical and empirical results more efficiently than a simple Weibull distribution. It should be stressed that a Weibull distribution with a power-law tail is more flexible than the Mittag-Leffler distribution, which itself can be approximated by a Weibull distribution and a power-law. Indeed, the key point is that in the former case there is freedom of choice for the exponent of the power-law attached to the Weibull distribution, which can exceed 1 in order to reproduce decays faster than possible with a Mittag-Leffler distribution. We also give a useful formula to determine an optimal crossover point minimizing the difference between the empirical average waiting time and the one predicted from renewal theory. Moreover, we discuss the limitation of our distributions by applying our distribution to the analysis of the BTP future and calculating the average waiting

  8. A Weibull brittle material failure model for the ABAQUS computer program

    SciTech Connect

    Bennett, J.

    1991-08-01

    A statistical failure theory for brittle materials that traces its origins to the Weibull distribution function is developed for use in the general purpose ABAQUS finite element computer program. One of the fundamental assumptions for this development is that Mode 1 microfractures perpendicular to the direction of the principal stress contribute independently to the fast fracture. The theory is implemented by a user subroutine for ABAQUS. Example problems illustrating the capability and accuracy of the model are given. 24 refs., 12 figs.

  9. Biological implications of the Weibull and Gompertz models of aging.

    PubMed

    Ricklefs, Robert E; Scheuerlein, Alex

    2002-02-01

    Gompertz and Weibull functions imply contrasting biological causes of demographic aging. The terms describing increasing mortality with age are multiplicative and additive, respectively, which could result from an increase in the vulnerability of individuals to extrinsic causes in the Gompertz model and the predominance of intrinsic causes at older ages in the Weibull model. Experiments that manipulate extrinsic mortality can distinguish these biological models. To facilitate analyses of experimental data, we defined a single index for the rate of aging (ω) for the Weibull and Gompertz functions. Each function described the increase in aging-related mortality in simulated ages at death reasonably well. However, in contrast to the Weibull ω_W, the Gompertz ω_G was sensitive to variation in the initial mortality rate independently of aging-related mortality. Comparisons between wild and captive populations appear to support the intrinsic-causes model for birds, but give mixed support for both models in mammals.

  10. Time-dependent fiber bundles with local load sharing. II. General Weibull fibers.

    PubMed

    Phoenix, S Leigh; Newman, William I

    2009-12-01

    Fiber bundle models (FBMs) are useful tools in understanding failure processes in a variety of material systems. While the fibers and load sharing assumptions are easily described, FBM analysis is typically difficult. Monte Carlo methods are also hampered by the severe computational demands of large bundle sizes, which overwhelm just as behavior relevant to real materials starts to emerge. For large size scales, interest continues in idealized FBMs that assume either equal load sharing (ELS) or local load sharing (LLS) among fibers, rules that reflect features of real load redistribution in elastic lattices. The present work focuses on a one-dimensional bundle of N fibers under LLS where life consumption in a fiber follows a power law in its load, with exponent ρ, and integrated over time. This life consumption function is further embodied in a functional form resulting in a Weibull distribution for lifetime under constant fiber stress and with Weibull exponent, β. Thus the failure rate of a fiber depends on its past load history, except for β=1. We develop asymptotic results validated by Monte Carlo simulation using a computational algorithm developed in our previous work [Phys. Rev. E 63, 021507 (2001)] that greatly increases the size, N, of treatable bundles (e.g., 10^6 fibers in 10^3 realizations). In particular, our algorithm is O(N ln N) in contrast with former algorithms which were O(N^2), making this investigation possible. Regimes are found for (β,ρ) pairs that yield contrasting behavior for large N. For ρ>1 and large N, brittle weakest volume behavior emerges in terms of characteristic elements (groupings of fibers) derived from critical cluster formation, and the lifetime eventually goes to zero as N→∞, unlike ELS, which yields a finite limiting mean. For 1/2≤ρ≤1, however, LLS has remarkably similar behavior to ELS (appearing to be virtually identical for ρ=1) with an asymptotic Gaussian lifetime distribution and a

  11. Time-dependent fiber bundles with local load sharing. II. General Weibull fibers

    NASA Astrophysics Data System (ADS)

    Phoenix, S. Leigh; Newman, William I.

    2009-12-01

    Fiber bundle models (FBMs) are useful tools in understanding failure processes in a variety of material systems. While the fibers and load sharing assumptions are easily described, FBM analysis is typically difficult. Monte Carlo methods are also hampered by the severe computational demands of large bundle sizes, which overwhelm just as behavior relevant to real materials starts to emerge. For large size scales, interest continues in idealized FBMs that assume either equal load sharing (ELS) or local load sharing (LLS) among fibers, rules that reflect features of real load redistribution in elastic lattices. The present work focuses on a one-dimensional bundle of N fibers under LLS where life consumption in a fiber follows a power law in its load, with exponent ρ, and integrated over time. This life consumption function is further embodied in a functional form resulting in a Weibull distribution for lifetime under constant fiber stress and with Weibull exponent, β. Thus the failure rate of a fiber depends on its past load history, except for β=1. We develop asymptotic results validated by Monte Carlo simulation using a computational algorithm developed in our previous work [Phys. Rev. E 63, 021507 (2001)] that greatly increases the size, N, of treatable bundles (e.g., 10^6 fibers in 10^3 realizations). In particular, our algorithm is O(N ln N) in contrast with former algorithms which were O(N^2), making this investigation possible. Regimes are found for (β,ρ) pairs that yield contrasting behavior for large N. For ρ>1 and large N, brittle weakest volume behavior emerges in terms of characteristic elements (groupings of fibers) derived from critical cluster formation, and the lifetime eventually goes to zero as N→∞, unlike ELS, which yields a finite limiting mean. For 1/2≤ρ≤1, however, LLS has remarkably similar behavior to ELS (appearing to be virtually identical for ρ=1) with an asymptotic Gaussian lifetime distribution and a
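
    The authors' O(N ln N) LLS algorithm is not reproduced here; as a minimal stand-in, the sketch below simulates the simpler equal-load-sharing case of the same breakdown rule (power-law life consumption against Weibull exposure thresholds), with hypothetical parameters.

```python
import numpy as np

rng = np.random.default_rng(0)

def els_bundle_lifetime(n, beta, rho):
    """
    One realization of an equal-load-sharing bundle: fiber i fails once its
    accumulated exposure (time integral of load**rho) exceeds a Weibull(beta)
    threshold x_i with survival exp(-x**beta). After j failures each survivor
    carries load n/(n - j), so exposure accrues at rate (n/(n - j))**rho.
    """
    x = np.sort(rng.weibull(beta, size=n))      # ordered exposure thresholds
    gaps = np.diff(np.concatenate(([0.0], x)))  # exposure gaps between failures
    loads = n / (n - np.arange(n))              # per-fiber load after j failures
    return np.sum(gaps / loads ** rho)          # convert exposure gaps to time

# Under ELS the mean lifetime settles toward a finite limit as N grows,
# in contrast with the LLS large-N behavior quoted above.
for n in (100, 1000, 10000):
    t = [els_bundle_lifetime(n, beta=1.0, rho=2.0) for _ in range(200)]
    print(f"N={n:6d}  mean lifetime ≈ {np.mean(t):.4f}")
```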

  12. A Weibull multi-state model for the dependence of progression-free survival and overall survival.

    PubMed

    Li, Yimei; Zhang, Qiang

    2015-07-30

    In oncology clinical trials, overall survival, time to progression, and progression-free survival are three commonly used endpoints. Empirical correlations among them have been published for different cancers, but statistical models describing the dependence structures are limited. Recently, Fleischer et al. proposed a statistical model that is mathematically tractable and shows some flexibility to describe the dependencies in a realistic way, based on the assumption of exponential distributions. This paper aims to extend their model to the more flexible Weibull distribution. We derived theoretical correlations among different survival outcomes, as well as the distribution of overall survival induced by the model. Model parameters were estimated by the maximum likelihood method and the goodness of fit was assessed by plotting estimated versus observed survival curves for overall survival. We applied the method to three cancer clinical trials. In the non-small-cell lung cancer trial, both the exponential and the Weibull models provided an adequate fit to the data, and the estimated correlations were very similar under both models. In the prostate cancer trial and the laryngeal cancer trial, the Weibull model exhibited advantages over the exponential model and yielded larger estimated correlations. Simulations suggested that the proposed Weibull model is robust for data generated from a range of distributions.
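
    A hedged sketch of the general idea, using an illness-death structure with Weibull transition clocks and hypothetical shapes and scales; the authors' actual model and estimation procedure may differ.

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate_pfs_os(n, shape_p=1.2, scale_p=12.0, shape_d=1.5, scale_d=30.0,
                    shape_pd=1.0, scale_pd=10.0):
    """
    Illness-death sketch with Weibull transition times (months, hypothetical):
    progression and direct death compete; after progression, a third Weibull
    clock governs post-progression survival.
    """
    t_prog = scale_p * rng.weibull(shape_p, n)    # time to progression
    t_death = scale_d * rng.weibull(shape_d, n)   # time to death w/o progression
    progressed = t_prog < t_death
    pfs = np.where(progressed, t_prog, t_death)
    post = scale_pd * rng.weibull(shape_pd, n)    # post-progression survival
    os_ = np.where(progressed, t_prog + post, t_death)
    return pfs, os_

pfs, os_ = simulate_pfs_os(100_000)
r = np.corrcoef(pfs, os_)[0, 1]
print(f"Pearson correlation PFS vs OS ≈ {r:.3f}")
```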

  13. Comparison of Weibull strength parameters from flexure and spin tests of brittle materials

    NASA Technical Reports Server (NTRS)

    Holland, Frederic A., Jr.; Zaretsky, Erwin V.

    1991-01-01

    Fracture data from five series of four-point bend tests of beams and spin tests of flat annular disks were reanalyzed. Silicon nitride and graphite were the test materials. The experimental fracture strengths of the disks were compared with the predicted strengths based on both volume-flaw and surface-flaw analyses of the four-point bend data. Volume-flaw analysis resulted in a better correlation between disks and beams in three of the five test series than did surface-flaw analysis. The Weibull moduli and characteristic gage strengths for the disks and beams were also compared. Differences in the experimental Weibull slopes were not statistically significant. It was shown that results from the beam tests can predict the fracture strength of rotating disks.
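
    The volume-flaw prediction rests on the standard weakest-link size-scaling relation; a minimal sketch with illustrative numbers (not the paper's silicon nitride or graphite data):

```python
# Weakest-link size scaling: for volume flaws, characteristic strengths of two
# components obey sigma_2 / sigma_1 = (V_eff1 / V_eff2)**(1/m). All values
# below are illustrative.
m = 10.0             # Weibull modulus from four-point-bend tests
sigma_beam = 600.0   # characteristic beam strength, MPa
v_eff_beam = 2.0     # effective (stressed) volume of the beam, mm^3
v_eff_disk = 50.0    # effective volume of the spinning disk, mm^3

sigma_disk = sigma_beam * (v_eff_beam / v_eff_disk) ** (1.0 / m)
print(f"Predicted disk characteristic strength ≈ {sigma_disk:.1f} MPa")
```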

  14. Effect of Individual Component Life Distribution on Engine Life Prediction

    NASA Technical Reports Server (NTRS)

    Zaretsky, Erwin V.; Hendricks, Robert C.; Soditus, Sherry M.

    2003-01-01

    The effect of individual engine component life distributions on engine life prediction was determined. A Weibull-based life and reliability analysis of the NASA Energy Efficient Engine was conducted. The engine's life at a 95 and 99.9 percent probability of survival was determined based upon the engine manufacturer's original life calculations and assumed values of each component's cumulative life distribution as represented by a Weibull slope. The lives of the high-pressure turbine (HPT) disks and blades were also evaluated individually and as a system in a similar manner. Knowing the statistical cumulative distribution of each engine component with reasonable engineering certainty is a condition precedent to predicting the life and reliability of an entire engine. The life of a system at a given reliability will be less than the life of the lowest-lived component in the system at the same reliability (probability of survival). Where the Weibull slopes of all the engine components are equal, the Weibull slope had a minimal effect on the engine L0.1 life prediction. However, at a probability of survival of 95 percent (L5 life), life decreased with increasing Weibull slope.
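
    A minimal sketch of the series-system logic described above: system survival is the product of the component Weibull survivals, and the life at a target reliability follows by root finding. The component values are hypothetical, not the Energy Efficient Engine data.

```python
import numpy as np

def system_reliability(t, components):
    """Series system: survival is the product of component Weibull survivals."""
    s = 1.0
    for eta, beta in components:   # (characteristic life, Weibull slope)
        s *= np.exp(-(t / eta) ** beta)
    return s

def life_at_reliability(target, components, t_hi=1e6):
    """Bisect for the time at which system reliability falls to `target`."""
    lo, hi = 0.0, t_hi
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        if system_reliability(mid, components) > target:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Hypothetical engine modules as (eta [h], Weibull slope) pairs:
engine = [(30_000, 3.0), (45_000, 2.0), (60_000, 1.5)]
print(f"L5 life   ≈ {life_at_reliability(0.95, engine):,.0f} h")
print(f"L0.1 life ≈ {life_at_reliability(0.999, engine):,.0f} h")
```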

  15. Incorporating finite element analysis into component life and reliability

    NASA Technical Reports Server (NTRS)

    August, Richard; Zaretsky, Erwin V.

    1991-01-01

    A method for calculating a component's design survivability by incorporating finite element analysis and probabilistic material properties was developed. The method evaluates design parameters through direct comparisons of component survivability expressed in terms of Weibull parameters. The analysis was applied to a rotating disk with mounting bolt holes. The highest probability of failure occurred at, or near, the maximum shear stress region of the bolt holes. Distribution of failure as a function of Weibull slope affects the probability of survival. Where Weibull parameters are unknown for a rotating disk, it may be permissible to assume Weibull parameters, as well as the stress-life exponent, in order to determine the disk speed where the probability of survival is highest.

  16. Effect of thermocycling on flexural strength and weibull statistics of machinable glass-ceramic and composite resin.

    PubMed

    Peampring, Chaimongkon; Sanohkan, Sasiwimol

    2014-12-01

    To evaluate the durability of machinable dental restorative materials, this study examined the flexural strength and Weibull statistics of a machinable lithium disilicate glass-ceramic and a machinable composite resin after thermocycling. A total of 40 bar-shaped specimens were prepared with the dimensions 20 mm × 4 mm × 2 mm and divided into four groups of 10 specimens. Ten specimens of machinable lithium disilicate glass-ceramic (IPS e.max CAD, Ivoclar Vivadent, Liechtenstein) and 10 specimens of machinable composite resin (Paradigm MZ 100, 3M ESPE, USA) were subjected to a 3-point flexural strength test. The other 10 specimens of each material were thermocycled between water temperatures of 5 and 55 °C for 10,000 cycles and then tested in the same 3-point flexural strength test. Statistical analysis was performed using two-way analysis of variance and Tukey multiple comparisons, and Weibull analysis was performed to evaluate the reliability of the strength. Mean strengths (standard deviations) were: thermocycled IPS e.max CAD 389.10 (50.75); non-thermocycled IPS e.max CAD 349.96 (38.34); thermocycled Paradigm MZ 100 157.51 (12.85); non-thermocycled Paradigm MZ 100 153.33 (19.97). Within each material group, there was no significant difference in flexural strength between thermocycled and non-thermocycled specimens, and no statistically significant difference in Weibull modulus was found among the experimental groups. Within the limitations of this study, thermocycling had no significant effect on the flexural strength or Weibull modulus of either the machinable glass-ceramic or the machinable composite resin.
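
    A common way to obtain the Weibull modulus from such strength data is linear regression on a Weibull probability plot; a sketch with hypothetical strength values (the paper does not state its exact estimation routine):

```python
import numpy as np

def weibull_fit_strengths(strengths):
    """
    Linear-regression fit on a Weibull probability plot:
    ln ln(1/(1-F)) versus ln(sigma), with median-rank estimates for F.
    Returns (modulus m, characteristic strength sigma0).
    """
    s = np.sort(np.asarray(strengths, dtype=float))
    n = len(s)
    f = (np.arange(1, n + 1) - 0.3) / (n + 0.4)   # median (Bernard) ranks
    y = np.log(-np.log(1.0 - f))
    x = np.log(s)
    m, c = np.polyfit(x, y, 1)                    # slope = Weibull modulus
    return m, np.exp(-c / m)

# Hypothetical flexural strengths (MPa) standing in for one test group:
data = [312, 345, 358, 371, 389, 402, 415, 428, 441, 460]
m, s0 = weibull_fit_strengths(data)
print(f"Weibull modulus m ≈ {m:.1f}, characteristic strength ≈ {s0:.0f} MPa")
```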

  17. Sampling times influence the estimate of parameters in the Weibull dissolution model.

    PubMed

    Cupera, Jakub; Lansky, Petr; Sklubalova, Zdenka

    2015-10-12

    The aim is to determine how well the parameters of the Weibull model of dissolution can be estimated, depending on the times chosen for measuring the empirical data. The approach is based on the theory of Fisher information. We show that, in order to obtain the best estimates, the data should be collected at time instants when the tablets are actively dissolving, or in their close proximity. This is in sharp contrast with commonly used experimental protocols, in which sampling times are distributed rather uniformly.
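
    A sketch of the Fisher-information comparison underlying this argument, for a two-parameter Weibull dissolution profile with assumed parameter values and a numerically differenced Jacobian; the authors' exact model and optimality criterion may differ.

```python
import numpy as np

def weibull_dissolution(t, tau, beta):
    """Fraction dissolved at time t under a two-parameter Weibull model."""
    return 1.0 - np.exp(-(t / tau) ** beta)

def fisher_determinant(times, tau=2.0, beta=1.5, eps=1e-6):
    """
    det of the (unit-noise) Fisher information J^T J for (tau, beta), with the
    Jacobian J taken by central finite differences at each sampling time.
    A larger det means jointly better-estimable parameters (D-optimality).
    """
    t = np.asarray(times, dtype=float)
    d_tau = (weibull_dissolution(t, tau + eps, beta)
             - weibull_dissolution(t, tau - eps, beta)) / (2 * eps)
    d_beta = (weibull_dissolution(t, tau, beta + eps)
              - weibull_dissolution(t, tau, beta - eps)) / (2 * eps)
    j = np.column_stack([d_tau, d_beta])
    return np.linalg.det(j.T @ j)

uniform = np.linspace(0.5, 12.0, 8)   # evenly spread sampling schedule
active = np.array([0.5, 1.0, 1.5, 2.0, 2.5, 3.0, 3.5, 4.0])  # active phase
print(f"det FIM, uniform schedule:      {fisher_determinant(uniform):.3e}")
print(f"det FIM, active-phase schedule: {fisher_determinant(active):.3e}")
```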

  18. Weibull Statistics for Upper Ocean Currents with the Fokker-Planck Equation

    NASA Astrophysics Data System (ADS)

    Chu, P. C.

    2012-12-01

    Upper oceans typically exhibit a surface mixed layer with a thickness of a few to several hundred meters. This mixed layer is a key component in studies of climate, biological productivity and marine pollution. It is the link between the atmosphere and the deep ocean and directly affects the air-sea exchange of heat, momentum and gases. Vertically averaged horizontal currents across the mixed layer are driven by the residual between the Ekman transport and surface wind stress, and damped by Rayleigh friction. A set of stochastic differential equations is derived for the two components of the current vector (u, v). The joint probability distribution function of (u, v) satisfies the Fokker-Planck equation (Chu, 2008, 2009), with the Weibull distribution as the solution for the current speed. To test this, the PDF of the upper (0-50 m) tropical Pacific current speeds (w) was calculated from hourly ADCP data (1990-2007) at six stations of the Tropical Atmosphere Ocean project. It satisfies the two-parameter Weibull distribution reasonably well, with different characteristics between El Nino and La Nina events: in the western Pacific, the PDF of w has a larger peakedness during La Nina events than during El Nino events, and vice versa in the eastern Pacific. However, the PDF of w for the lower layer (100-200 m) does not fit the Weibull distribution as well as that of the upper layer. This is due to the different stochastic differential equations governing the upper and lower layers in the tropical Pacific. For the upper layer, the stochastic differential equations, established on the basis of Ekman dynamics, have an analytical solution, i.e., the Rayleigh distribution (the simplest form of the Weibull distribution), for constant eddy viscosity K. Knowledge of the PDF of w during El Nino and La Nina events will improve ensemble horizontal flux calculations, which contribute to climate studies. Besides, the Weibull distribution is also identified from the
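
    A minimal sketch of the two-parameter Weibull fit described above, using synthetic stand-in speeds in place of the ADCP records:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)

# Stand-in for hourly current speeds (m/s); real ADCP data would replace this.
speeds = 0.4 * rng.weibull(1.8, size=5000)

# Two-parameter fit: location pinned at zero, shape and scale by MLE.
shape, loc, scale = stats.weibull_min.fit(speeds, floc=0)
print(f"Weibull shape k ≈ {shape:.2f}, scale ≈ {scale:.3f} m/s")
# shape ≈ 2 would recover the Rayleigh special case mentioned in the abstract.
```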

  19. On the q-type distributions

    NASA Astrophysics Data System (ADS)

    Nadarajah, Saralees; Kotz, Samuel

    2007-04-01

    Various q-type distributions have appeared in the physics literature in recent years; see, e.g., L.C. Malacarne, R.S. Mendes, E.K. Lenzi, q-exponential distribution in urban agglomeration, Phys. Rev. E 65 (2002) 017106; S.M.D. Queiros, On a possible dynamical scenario leading to a generalised Gamma distribution, arXiv:physics/0411111; U.M.S. Costa, V.N. Freire, L.C. Malacarne, R.S. Mendes, S. Picoli Jr., E.A. de Vasconcelos, E.F. da Silva Jr., An improved description of the dielectric breakdown in oxides based on a generalized Weibull distribution, Physica A 361 (2006) 215; S. Picoli Jr., R.S. Mendes, L.C. Malacarne, q-exponential, Weibull, and q-Weibull distributions: an empirical analysis, Physica A 324 (2003) 678-688; A.M.C. de Souza, C. Tsallis, Student's t- and r-distributions: unified derivation from an entropic variational principle, Physica A 236 (1997) 52-57. It is pointed out in the paper that many of these are the same as, or particular cases of, distributions long known in the statistics literature. Several of these statistical distributions are discussed and references provided. We feel that this paper could be of assistance for modeling problems of the type considered in the works cited above.

  20. Statistical analysis of multilook polarimetric SAR data and terrain classification with adaptive distribution

    NASA Astrophysics Data System (ADS)

    Liu, Guoqing; Huang, ShunJi; Torre, Andrea; Rubertone, Franco S.

    1995-11-01

    This paper deals with the analysis of statistical properties of multi-look processed polarimetric SAR data. Based on the assumption that the multi-look polarimetric measurement is a product between a Gamma-distributed texture variable and a Wishart-distributed polarimetric speckle variable, it is shown that the multi-look polarimetric measurement from a nonhomogeneous region obeys a generalized K-distribution. In order to validate this statistical model, two of its derived versions, the multi-look intensity and amplitude K-distributions, are compared with histograms of the observed multi-look SAR data of three terrain types (ocean, forest-like and city regions) and with four empirical distribution models: Gaussian, log-normal, gamma and Weibull. A qualitative relation between the degree of nonhomogeneity of a textured scene and the best-fitting statistical model is then empirically established. Finally, a classifier with adaptive distributions, guided by the order parameter of the texture distribution estimated from local statistics, is introduced to perform terrain classification. Experimental results with both multi-look fully polarimetric data and multi-look single-channel intensity/amplitude data indicate its effectiveness.

  1. Spatial and Temporal Patterns of Global Onshore Wind Speed Distribution

    SciTech Connect

    Zhou, Yuyu; Smith, Steven J.

    2013-09-09

    Wind power, a renewable energy source, can play an important role in electrical energy generation. Information regarding wind energy potential is important both for energy related modeling and for decision-making in the policy community. While wind speed datasets with high spatial and temporal resolution are ultimately used for detailed planning, simpler assumptions are often used in analysis work. An accurate representation of the wind speed frequency distribution is needed in order to properly characterize wind energy potential. Using a power density method, this study estimated global variation in wind parameters as fitted to a Weibull density function using NCEP/CFSR reanalysis data. The estimated Weibull distribution performs well in fitting the time series wind speed data at the global level according to R², root-mean-square error, and power density error. The spatial, decadal, and seasonal patterns of wind speed distribution were then evaluated. We also analyzed the potential error in wind power estimation when a commonly assumed Rayleigh distribution (Weibull k = 2) is used. We find that the assumption of the same Weibull parameter across large regions can result in substantial errors. While large-scale wind speed data is often presented in the form of average wind speeds, these results highlight the need to also provide information on the wind speed distribution.
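
    The power density method turns Weibull parameters into mean wind power analytically; the sketch below also shows how much the answer moves when a Rayleigh shape (k = 2) is imposed at fixed mean speed. All values are illustrative.

```python
import math

def power_density(c, k, rho=1.225):
    """Mean wind power density (W/m^2) for a Weibull(k, c) speed distribution:
    P = 0.5 * rho * c**3 * Gamma(1 + 3/k)."""
    return 0.5 * rho * c ** 3 * math.gamma(1.0 + 3.0 / k)

# Same mean speed, different shape parameters (illustrative values):
mean_v = 7.0
for k in (1.6, 2.0, 2.4):
    c = mean_v / math.gamma(1.0 + 1.0 / k)   # scale giving the target mean
    print(f"k={k:.1f}  c={c:.2f} m/s  P={power_density(c, k):6.1f} W/m^2")
# The k=2.0 row is the Rayleigh assumption; the spread across rows shows the
# error incurred by imposing one shape parameter everywhere.
```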

  2. Distribution-free discriminant analysis

    SciTech Connect

    Burr, T.; Doak, J.

    1997-05-01

    This report describes our experience in implementing a non-parametric (distribution-free) discriminant analysis module for use in a wide range of pattern recognition problems. Issues discussed include performance results on both real and simulated data sets, comparisons to other methods, and the computational environment. In some cases, this module performs better than other existing methods. Nearly all cases can benefit from the application of multiple methods.

  3. Average BER of subcarrier intensity modulated free space optical systems over the exponentiated Weibull fading channels.

    PubMed

    Wang, Ping; Zhang, Lu; Guo, Lixin; Huang, Feng; Shang, Tao; Wang, Ranran; Yang, Yintang

    2014-08-25

    The average bit error rate (BER) for binary phase-shift keying (BPSK) modulation in free-space optical (FSO) links over a turbulent atmosphere modeled by the exponentiated Weibull (EW) distribution is investigated in detail. The effects of aperture averaging on the average BERs for BPSK modulation under weak-to-strong turbulence conditions are studied. The average BERs of the EW distribution are compared with the Lognormal (LN) and Gamma-Gamma (GG) distributions in weak and strong turbulence, respectively. The outage probability is also obtained for different turbulence strengths and receiver aperture sizes. The analytical results deduced by the generalized Gauss-Laguerre quadrature rule are verified by Monte Carlo simulation. This work is helpful for the design of receivers for FSO communication systems.
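
    The paper evaluates the BER with Gauss-Laguerre quadrature; the sketch below instead uses the Monte Carlo check it mentions, sampling the exponentiated Weibull by inverse CDF. The BER(h) mapping and all parameters are assumptions for illustration.

```python
import numpy as np
from scipy.special import erfc

rng = np.random.default_rng(3)

def ew_samples(alpha, beta, eta, n):
    """Exponentiated Weibull irradiance via inverse-CDF sampling:
    F(h) = [1 - exp(-(h/eta)**beta)]**alpha."""
    u = rng.uniform(size=n)
    return eta * (-np.log1p(-u ** (1.0 / alpha))) ** (1.0 / beta)

def avg_ber_bpsk(snr_db, alpha=2.0, beta=1.5, eta=1.0, n=1_000_000):
    """Monte Carlo average BER, assuming BER(h) = 0.5*erfc(h*sqrt(SNR/2))
    for BPSK with irradiance-proportional detected amplitude."""
    h = ew_samples(alpha, beta, eta, n)
    snr = 10.0 ** (snr_db / 10.0)
    return np.mean(0.5 * erfc(h * np.sqrt(snr / 2.0)))

for snr_db in (5, 10, 15):
    print(f"SNR={snr_db:2d} dB  <BER> ≈ {avg_ber_bpsk(snr_db):.3e}")
```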

  4. Weibull models of fracture strengths and fatigue behavior of dental resins in flexure and shear.

    PubMed

    Baran, G R; McCool, J I; Paul, D; Boberick, K; Wunder, S

    1998-01-01

    In estimating lifetimes of dental restorative materials, it is useful to have available data on the fatigue behavior of these materials. Current efforts at estimation include several untested assumptions related to the equivalence of flaw distributions sampled by shear, tensile, and compressive stresses. Environmental influences on material properties are not accounted for, and it is unclear if fatigue limits exist. In this study, the shear and flexural strengths of three resins used as matrices in dental restorative composite materials were characterized by Weibull parameters. It was found that shear strengths were lower than flexural strengths, liquid sorption had a profound effect on characteristic strengths, and the Weibull shape parameter obtained from shear data differed for some materials from that obtained in flexure. In shear and flexural fatigue, a power law relationship applied for up to 250,000 cycles; no fatigue limits were found, and the data thus imply only one flaw population is responsible for failure. Again, liquid sorption adversely affected strength levels in most materials (decreasing shear strengths and flexural strengths by factors of 2-3) and to a greater extent than did the degree of cure or material chemistry.

  5. A biology-driven receptor model for daily pollen allergy risk in Korea based on Weibull probability density function

    NASA Astrophysics Data System (ADS)

    Kim, Kyu Rang; Kim, Mijin; Choe, Ho-Seong; Han, Mae Ja; Lee, Hye-Rim; Oh, Jae-Won; Kim, Baek-Jo

    2016-07-01

    Pollen is an important cause of respiratory allergic reactions. As individual sanitation has improved, allergy risk has increased, and this trend is expected to continue due to climate change. Atmospheric pollen concentration is highly influenced by weather conditions. Regression analysis and modeling of the relationships between airborne pollen concentrations and weather conditions were performed to analyze and forecast pollen conditions. Traditionally, daily pollen concentration has been estimated using regression models that describe the relationships between observed pollen concentrations and weather conditions. These models were able to forecast daily concentrations at the sites of observation, but lacked broader spatial applicability beyond those sites. To overcome this limitation, an integrated modeling scheme was developed that is designed to represent the underlying processes of pollen production and distribution. A maximum potential for airborne pollen is first determined using the Weibull probability density function. Then, daily pollen concentration is estimated using multiple regression models. Daily risk grade levels are determined based on the risk criteria used in Korea. The mean percentages of agreement between the observed and estimated levels were 81.4-88.2 % and 92.5-98.5 % for oak and Japanese hop pollens, respectively. The new models estimated daily pollen risk more accurately than the original statistical models because of the newly integrated biological response curves. However, they overestimated the seasonal mean concentration and did not reproduce all of the peak concentrations. This issue could be addressed by adding more variables that affect the prevalence and internal maturity of pollen.

  6. Validity of using average diameter for determination of tensile strength and Weibull modulus of ceramic filaments

    SciTech Connect

    Petry, M.D.; Mah, T.I.; Kerans, R.J.

    1997-10-01

    Strengths and Weibull moduli for alumina/yttrium aluminum garnet eutectic (AYE) filaments and for Si-C-O (Nicalon) filaments were calculated using measured and average filament diameters. The strengths agreed closely. Thus an average filament diameter could be used instead of the measured filament diameter in calculating strengths. The Weibull modulus obtained from an average filament diameter approximates the Weibull modulus obtained using the measured filament diameter.

  7. Statistical modeling of tornado intensity distributions

    NASA Astrophysics Data System (ADS)

    Dotzek, Nikolai; Grieser, Jürgen; Brooks, Harold E.

    We address the issue of determining an appropriate general functional shape for observed tornado intensity distributions. Recently, it was suggested that in the limit of long and large tornado records, exponential distributions over all positive Fujita or TORRO scale classes would result. Yet our analysis shows that, even for large databases, the observations contradict the validity of exponential distributions for weak (F0) and violent (F5) tornadoes. We show that observed tornado intensities can be much better described by Weibull distributions, for which the exponential remains a special case. Weibull fits in either the v or F scale reproduce the observations significantly better than exponentials. In addition, we suggest applying the original definition of negative intensity scales down to F-2 and T-4 (corresponding to v=0 m s⁻¹), at least for climatological analyses. Weibull distributions allow for an improved risk assessment of violent tornadoes up to F6, and better estimates of total tornado occurrence, the degree of underreporting, and the existence of subcritical tornadic circulations below damaging intensity. Therefore, our results are relevant for climatologists and risk assessment managers alike.

  8. Test Population Selection from Weibull-Based, Monte Carlo Simulations of Fatigue Life

    NASA Technical Reports Server (NTRS)

    Vlcek, Brian L.; Zaretsky, Erwin V.; Hendricks, Robert C.

    2012-01-01

    Fatigue life is probabilistic and not deterministic. Experimentally establishing the fatigue life of materials, components, and systems is both time consuming and costly. As a result, conclusions regarding fatigue life are often inferred from a statistically insufficient number of physical tests. A proposed methodology for comparing life results as a function of variability due to Weibull parameters, variability between successive trials, and variability due to the size of the experimental population is presented. Using Monte Carlo simulation of randomly selected lives from a large Weibull distribution, the variation in the L10 fatigue life of aluminum alloy AL6061 rotating rod fatigue tests was determined as a function of population size. These results were compared to the L10 fatigue lives of small (10 each) populations from AL2024, AL7075 and AL6061. For aluminum alloy AL6061, a simple algebraic relationship was established for the upper and lower L10 fatigue life limits as a function of the number of specimens failed. For most engineering applications where less than 30 percent variability can be tolerated in the maximum and minimum values, at least 30 to 35 test samples are necessary. The variability of test results based on small sample sizes can be greater than any actual differences that exist between materials and can result in erroneous conclusions. The fatigue life of AL2024 is statistically longer than that of AL6061 and AL7075. However, there is no statistical difference between the fatigue lives of AL6061 and AL7075, even though AL7075 had a fatigue life 30 percent greater than AL6061.

  9. Test Population Selection from Weibull-Based, Monte Carlo Simulations of Fatigue Life

    NASA Technical Reports Server (NTRS)

    Vlcek, Brian L.; Zaretsky, Erwin V.; Hendricks, Robert C.

    2008-01-01

    Fatigue life is probabilistic and not deterministic. Experimentally establishing the fatigue life of materials, components, and systems is both time consuming and costly. As a result, conclusions regarding fatigue life are often inferred from a statistically insufficient number of physical tests. A proposed methodology for comparing life results as a function of variability due to Weibull parameters, variability between successive trials, and variability due to the size of the experimental population is presented. Using Monte Carlo simulation of randomly selected lives from a large Weibull distribution, the variation in the L10 fatigue life of aluminum alloy AL6061 rotating rod fatigue tests was determined as a function of population size. These results were compared to the L10 fatigue lives of small (10 each) populations from AL2024, AL7075 and AL6061. For aluminum alloy AL6061, a simple algebraic relationship was established for the upper and lower L10 fatigue life limits as a function of the number of specimens failed. For most engineering applications where less than 30 percent variability can be tolerated in the maximum and minimum values, at least 30 to 35 test samples are necessary. The variability of test results based on small sample sizes can be greater than any actual differences that exist between materials and can result in erroneous conclusions. The fatigue life of AL2024 is statistically longer than that of AL6061 and AL7075. However, there is no statistical difference between the fatigue lives of AL6061 and AL7075, even though AL7075 had a fatigue life 30 percent greater than AL6061.
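
    A minimal sketch of the methodology, drawing small samples from a parent Weibull life population and watching the spread of the estimated L10 narrow with sample size; the slope and scale are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(4)

def l10_life(lives):
    """L10: life at 90% probability of survival, i.e. the 10th percentile."""
    return np.percentile(lives, 10.0)

# Parent Weibull population of fatigue lives (hypothetical slope and scale):
beta, eta = 2.0, 1.0e6   # cycles

for n in (10, 30, 100, 1000):
    l10s = [l10_life(eta * rng.weibull(beta, n)) for _ in range(2000)]
    lo, hi = np.percentile(l10s, [5, 95])
    print(f"n={n:5d}  90% band of estimated L10: [{lo:,.0f}, {hi:,.0f}] cycles")
# The band tightens with n, mirroring the ~30-35 specimen recommendation above.
```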

  10. A Bayesian Weibull survival model for time to infection data measured with delay.

    PubMed

    Kostoulas, Polychronis; Nielsen, Søren S; Browne, William J; Leontides, Leonidas

    2010-05-01

    Survival analysis methods can be used to identify factors associated with the time to induction of infection. In the absence of a perfect test, detection of infection is generally delayed and depends on the duration of the latent infection period. We assess, via simulations, the impact of ignoring the delayed detection of infection on estimated survival times and propose a Bayesian Weibull regression model, which adjusts for the delayed detection of infection. The presence of non-differential detection delay seriously biased the baseline hazard and the shape of the hazard function. For differential detection delay, the associated regression coefficients were also biased. The extent of bias largely depended on the longevity of the delay. In all considered simulation scenarios our model led to corrected estimates. We utilized the proposed model in order to assess the age at natural infection with Mycobacterium avium subsp. paratuberculosis (MAP) in Danish dairy cattle from the analysis of available time to milk-seropositivity data that detected infection with delay. The proposed model captured the inverse relationship between the incidence rate of infection and that of seroconversion with time: susceptibility to infection decreases with time (shape parameter under the proposed model was ρ=0.56<1), while older animals had a higher probability of sero-converting (ρ=2.67>1, under standard Weibull regression). Cows infected earlier in their lives were more likely to subsequently shed detectable levels of MAP and, hence, be a liability to herd-mates. Our approach can be particularly useful in the case of chronic infections with a long latent infection period, which, if ignored, severely affects survival estimates.

  11. Power and Sample Size for Randomized Phase III Survival Trials under the Weibull Model

    PubMed Central

    Wu, Jianrong

    2015-01-01

    Two parametric tests are proposed for designing randomized two-arm phase III survival trials under the Weibull model. The properties of the two parametric tests are compared with the non-parametric log-rank test through simulation studies. Power and sample size formulas of the two parametric tests are derived. The impact on sample size under mis-specification of the Weibull shape parameter is also investigated. The study can be designed by planning the study duration and handling nonuniform entry and loss to follow-up under the Weibull model using either the proposed parametric tests or the well known non-parametric log-rank test. PMID:24895942

  12. Brain responses strongly correlate with Weibull image statistics when processing natural images.

    PubMed

    Scholte, H Steven; Ghebreab, Sennay; Waldorp, Lourens; Smeulders, Arnold W M; Lamme, Victor A F

    2009-01-01

    The visual appearance of natural scenes is governed by a surprisingly simple hidden structure. The distributions of contrast values in natural images generally follow a Weibull distribution, with β and γ as free parameters. β and γ seem to structure the space of natural images in an ecologically meaningful way, in particular with respect to the fragmentation and texture similarity within an image. Since it is often assumed that the brain exploits structural regularities in natural image statistics to efficiently encode and analyze visual input, we here ask ourselves whether the brain approximates the β and γ values underlying the contrast distributions of natural images. We present a model that shows that β and γ can be easily estimated from the outputs of X-cells and Y-cells. In addition, we covaried the EEG responses of subjects viewing natural images with the β and γ values of those images. We show that β and γ explain up to 71% of the variance of the early ERP signal, substantially outperforming other tested contrast measurements. This suggests that the brain is strongly tuned to the image's β and γ values, potentially providing the visual system with an efficient way to rapidly classify incoming images on the basis of omnipresent low-level natural image statistics. PMID:19757938
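
    A sketch of the fitting step, estimating Weibull parameters from gradient-magnitude (contrast) statistics of an image. The mapping of the paper's β and γ onto scale and shape is my assumption about their convention, and the image here is synthetic.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)

# Synthetic stand-in; in practice load a natural image as a 2-D float array.
img = rng.normal(size=(256, 256)).cumsum(axis=0).cumsum(axis=1)

# Local contrast via gradient magnitudes, a common proxy for contrast values.
gy, gx = np.gradient(img)
contrast = np.hypot(gx, gy).ravel()
contrast = contrast[contrast > 0]

# Two-parameter Weibull fit; here the scale plays the role of the paper's
# beta and the shape that of gamma (assumed convention).
shape_gamma, loc, scale_beta = stats.weibull_min.fit(contrast, floc=0)
print(f"gamma (shape) ≈ {shape_gamma:.2f}, beta (scale) ≈ {scale_beta:.2f}")
```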

  13. Effects of dislocation density and sample-size on plastic yielding at the nanoscale: a Weibull-like framework.

    PubMed

    Rinaldi, Antonio

    2011-11-01

    Micro-compression tests have demonstrated that plastic yielding in nanoscale pillars is the result of the fine interplay between the sample-size (chiefly the diameter D) and the density of bulk dislocations ρ. The power-law scaling typical of the nanoscale stems from a source-limited regime, which depends on both these sample parameters. Based on the experimental and theoretical results available in the literature, this paper offers a perspective on the joint effect of D and ρ on the yield stress in any plastic regime, and proposes a schematic graphical map of it. In the sample-size dependent regime, such dependence is cast mathematically into a first-order Weibull-type theory, where the power-law scaling exponent β and the modulus m of an approximate (unimodal) Weibull distribution of source-strengths can be related by a simple inverse proportionality. As a corollary, the scaling exponent β may not be a universal number, as speculated in the literature. In this context, the discussion opens the alternative possibility of more general (multimodal) source-strength distributions, which could produce more complex and realistic strengthening patterns than the single power-law usually assumed. The paper re-examines our own experimental data, as well as results of Bei et al. (2008) on Mo-alloy pillars, especially for the sake of emphasizing the significance of a sudden increase in sample response scatter as a warning signal of an incipient source-limited regime.

  14. Effects of dislocation density and sample-size on plastic yielding at the nanoscale: a Weibull-like framework

    NASA Astrophysics Data System (ADS)

    Rinaldi, Antonio

    2011-11-01

    Micro-compression tests have demonstrated that plastic yielding in nanoscale pillars is the result of the fine interplay between the sample-size (chiefly the diameter D) and the density of bulk dislocations ρ. The power-law scaling typical of the nanoscale stems from a source-limited regime, which depends on both these sample parameters. Based on the experimental and theoretical results available in the literature, this paper offers a perspective on the joint effect of D and ρ on the yield stress in any plastic regime, and proposes a schematic graphical map of it. In the sample-size dependent regime, such dependence is cast mathematically into a first-order Weibull-type theory, where the power-law scaling exponent β and the modulus m of an approximate (unimodal) Weibull distribution of source-strengths can be related by a simple inverse proportionality. As a corollary, the scaling exponent β may not be a universal number, as speculated in the literature. In this context, the discussion opens the alternative possibility of more general (multimodal) source-strength distributions, which could produce more complex and realistic strengthening patterns than the single power-law usually assumed. The paper re-examines our own experimental data, as well as results of Bei et al. (2008) on Mo-alloy pillars, especially for the sake of emphasizing the significance of a sudden increase in sample response scatter as a warning signal of an incipient source-limited regime.

  15. DASH---Distributed Analysis System Hierarchy

    NASA Astrophysics Data System (ADS)

    Yagi, M.; Mizumoto, Y.; Yoshida, M.; Kosugi, G.; Takata, T.; Ogasawara, R.; Ishihara, Y.; Morita, Y.; Nakamoto, H.; Watanabe, N.

    We developed the Distributed Analysis Software Hierarchy (DASH), an object-oriented data reduction and data analysis system for efficient processing of data from the SUBARU telescope. DASH consists of many objects (data management objects, reduction engines, GUIs, etc.) distributed on CORBA. We have also developed SASH, a stand-alone system which has the same interface as DASH, but which does not use some of the distributed services such as DA/DB; visiting astronomers can detach PROCube out of DASH and continue the analysis with SASH at their home institute. SASH will be used as a quick reduction tool at the summit.

  16. Group Sequential Design for Randomized Phase III Trials under the Weibull Model

    PubMed Central

    Wu, Jianrong; Xiong, Xiaoping

    2014-01-01

    In this paper, a parametric sequential test is proposed under the Weibull model. The proposed test is asymptotically normal with an independent increments structure. The sample size for fixed sample test is derived for the purpose of group sequential trial design. In addition, a multi-stage group sequential procedure is given under the Weibull model by applying the Brownian motion property of the test statistic and sequential conditional probability ratio test methodology. PMID:25322440

  17. On the Distribution of Earthquake Interevent Times and the Impact of Spatial Scale

    NASA Astrophysics Data System (ADS)

    Hristopulos, Dionissios

    2013-04-01

    The distribution of earthquake interevent times is a subject that has attracted much attention in the statistical physics literature [1-3]. A recent paper proposes that the distribution of earthquake interevent times follows from the interplay of the crustal strength distribution and the loading function (stress versus time) of the Earth's crust locally [4]. It was also shown that the Weibull distribution describes earthquake interevent times provided that the crustal strength also follows the Weibull distribution and that the loading function follows a power-law during the loading cycle. I will discuss the implications of this work and will present supporting evidence based on the analysis of data from seismic catalogs. I will also discuss the theoretical evidence in support of the Weibull distribution based on models of statistical physics [5]. Since other-than-Weibull interevent-time distributions are not excluded in [4], I will illustrate the use of the Kolmogorov-Smirnov test in order to determine which probability distributions are not rejected by the data. Finally, we propose a modification of the Weibull distribution if the size of the system under investigation (i.e., the area over which the earthquake activity occurs) is finite with respect to a critical link size. Keywords: hypothesis testing, modified Weibull, hazard rate, finite size. References [1] Corral, A., 2004. Long-term clustering, scaling, and universality in the temporal occurrence of earthquakes, Phys. Rev. Lett., 92(10), art. no. 108501. [2] Saichev, A., Sornette, D. 2007. Theory of earthquake recurrence times, J. Geophys. Res., Ser. B 112, B04313/1-26. [3] Touati, S., Naylor, M., Main, I.G., 2009. Origin and nonuniversality of the earthquake interevent time distribution Phys. Rev. Lett., 102 (16), art. no. 168501. [4] Hristopulos, D.T., 2003. Spartan Gibbs random field models for geostatistical applications, SIAM Jour. Sci. Comput., 24, 2125-2162. [5] I. Eliazar and J. Klafter, 2006
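
    A sketch of the Kolmogorov-Smirnov check mentioned above, on synthetic stand-in interevent times; note the caveat in the comments about fitting and testing on the same sample.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(6)

# Stand-in interevent times (hours); a real catalog would replace this draw.
waits = 12.0 * rng.weibull(0.8, size=400)

shape, loc, scale = stats.weibull_min.fit(waits, floc=0)
ks = stats.kstest(waits, "weibull_min", args=(shape, loc, scale))
print(f"fitted shape={shape:.2f}, scale={scale:.1f}  "
      f"KS D={ks.statistic:.3f}, p={ks.pvalue:.3f}")
# Caveat: fitting the parameters on the same sample makes this p-value
# optimistic; a parametric bootstrap (refitting on resampled data) gives
# honest rejection levels.
```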

  18. Analysis of distribution of critical current of bent-damaged Bi2223 composite tape

    NASA Astrophysics Data System (ADS)

    Ochiai, S.; Okuda, H.; Sugano, M.; Hojo, M.; Osamura, K.; Kuroda, T.; Kumakura, H.; Kitaguchi, H.; Itoh, K.; Wada, H.

    2011-10-01

    Distributions of the critical current of damaged Bi2223 tape specimens bent by 0.6, 0.8 and 1.0% were investigated analytically with a modelling approach based on the correlation of damage evolution with the distribution of critical current. It was revealed that the distribution of critical current is described by a three-parameter Weibull distribution function through the distribution of the tensile damage strain of the Bi2223 filaments, which determines the damage front in the bent composite tape. It was also shown that the measured distribution of critical current values can be reproduced successfully by a Monte Carlo simulation using the distributions of tensile damage strain of the filaments and the original critical current.

  19. Towards Distributed Memory Parallel Program Analysis

    SciTech Connect

    Quinlan, D; Barany, G; Panas, T

    2008-06-17

    This paper presents a parallel attribute evaluation for distributed memory parallel computer architectures, where previously only shared memory parallel support for this technique had been developed. Attribute evaluation is a part of how attribute grammars are used for program analysis within modern compilers. Within this work, we have extended ROSE, an open compiler infrastructure, with a distributed memory parallel attribute evaluation mechanism to support user-defined global program analysis required for some forms of security analysis, which cannot be addressed by a file-by-file view of large-scale applications. As a result, user-defined security analyses may now run in parallel without the user having to specify the way data is communicated between processors. The automation of communication enables an extensible open-source parallel program analysis infrastructure.

  20. Can Satellite Sampling of Offshore Wind Speeds Realistically Represent Wind Speed Distributions? Part II: Quantifying Uncertainties Associated with Distribution Fitting Methods.

    NASA Astrophysics Data System (ADS)

    Pryor, S. C.; Nielsen, M.; Barthelmie, R. J.; Mann, J.

    2004-05-01

    Remote sensing tools represent an attractive proposition for measuring wind speeds over the oceans because, in principle, they also offer a mechanism for determining the spatial variability of flow. Presented here is the continuation of research focused on the uncertainties and biases currently present in these data and on quantification of the number of independent observations (scenes) required to characterize various parameters of the probability distribution of wind speeds. Theoretical and empirical estimates are derived for the critical number of independent observations (wind speeds derived from analysis of remotely sensed scenes) required to obtain probability distribution parameters with an uncertainty of ±10% and a confidence level of 90% under the assumption of independent samples; approximately 250 independent observations are required to fit the Weibull distribution parameters. Also presented is an evaluation of Weibull fitting methods, in which the fitting method based on the first and third moments is found to exhibit the “best” performance for pure Weibull distributions. Further examined is the ability to generalize the parameter uncertainty bounds presented previously by Barthelmie and Pryor for distribution parameter estimates from sparse datasets; these were found to be robust and hence generally applicable to remotely sensed wind speed data series.
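
    The moment-based fit singled out above matches the first and third sample moments; a sketch, with a synthetic 250-scene sample standing in for the satellite data:

```python
import numpy as np
from math import gamma
from scipy.optimize import brentq

rng = np.random.default_rng(7)

def fit_weibull_m1_m3(speeds):
    """Moment-based fit: match the first and third sample moments. The ratio
    m3/m1**3 = Gamma(1+3/k) / Gamma(1+1/k)**3 pins down the shape k."""
    m1 = np.mean(speeds)
    m3 = np.mean(speeds ** 3)
    ratio = m3 / m1 ** 3
    g = lambda k: gamma(1 + 3.0 / k) / gamma(1 + 1.0 / k) ** 3 - ratio
    k = brentq(g, 0.3, 20.0)          # ratio is monotone in k on this bracket
    c = m1 / gamma(1 + 1.0 / k)       # scale from the first moment
    return k, c

# Stand-in for satellite-derived wind speeds at one location (m/s),
# using ~250 scenes as estimated in the abstract:
sample = 8.0 * rng.weibull(2.2, size=250)
k, c = fit_weibull_m1_m3(sample)
print(f"k ≈ {k:.2f}, c ≈ {c:.2f} m/s")
```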


  1. Shuttle Electrical Power Analysis Program (SEPAP) distribution circuit analysis report

    NASA Technical Reports Server (NTRS)

    Torina, E. M.

    1975-01-01

    An analysis and evaluation was made of the operating parameters of the shuttle electrical power distribution circuit under load conditions encountered during a normal Sortie 2 Mission, with emphasis on the main periods of liftoff and landing.

  2. CRAB: Distributed analysis tool for CMS

    NASA Astrophysics Data System (ADS)

    Sala, Leonardo; CMS Collaboration

    2012-12-01

    CMS has a distributed computing model, based on a hierarchy of tiered regional computing centers and adopts a data driven model for the end user analysis. This model foresees that jobs are submitted to the analysis resources where data are hosted. The increasing complexity of the whole computing infrastructure makes the simple analysis work flow more and more complicated for the end user. CMS has developed and deployed a dedicated tool named CRAB (CMS Remote Analysis Builder) in order to guarantee the physicists an efficient access to the distributed data whilst hiding the underlying complexity. This tool is used by CMS to enable the running of physics analysis jobs in a transparent manner over data distributed across sites. It factorizes out the interaction with the underlying batch farms, grid infrastructure and CMS data management tools, allowing the user to deal only with a simple and intuitive interface. We present the CRAB architecture, as well as the current status and lessons learnt in deploying this tool for use by the CMS collaboration. We also present the future development of the CRAB system.

  3. Two-parameter logistic and Weibull equations provide better fits to survival data from isogenic populations of Caenorhabditis elegans in axenic culture than does the Gompertz model.

    PubMed

    Vanfleteren, J R; De Vreese, A; Braeckman, B P

    1998-11-01

    We have fitted Gompertz, Weibull, and two- and three-parameter logistic equations to survival data obtained from 77 cohorts of Caenorhabditis elegans in axenic culture. Statistical analysis showed that the fitting ability was in the order: three-parameter logistic > two-parameter logistic = Weibull > Gompertz. Pooled data were better fit by the logistic equations, which tended to perform equally well as population size increased, suggesting that the third parameter is likely to be biologically irrelevant. Considering the constraints imposed by the small population sizes used, we simply conclude that the two-parameter logistic and Weibull mortality models for axenically grown C. elegans generally provided good fits to the data, whereas the Gompertz model was inappropriate in many cases. The survival curves of several short- and long-lived mutant strains could be predicted by adjusting only the logistic curve parameter that defines mean life span. We conclude that life expectancy is genetically determined; the life span-altering mutations reported in this study define a novel mean life span, but do not appear to fundamentally alter the aging process.

  4. Comparison of the Weibull characteristics of hydroxyapatite and strontium doped hydroxyapatite.

    PubMed

    Yatongchai, Chokchai; Wren, Anthony W; Curran, Declan J; Hornez, Jean-Christophe; Towler, Mark R

    2013-05-01

    The effects of two strontium (Sr) additions, 5% and 10% of the total calcium (Ca) content, on the phase assemblage and Weibull statistics of hydroxyapatite (HA) are investigated and compared to those of undoped HA. Sintering was carried out in the range of 900-1200 °C in steps of 100 °C in a conventional furnace. Sr content had little effect on the mean particulate size. Decomposition of the HA phase occurred with Sr incorporation, while β-TCP stabilization was shown to occur with 10% Sr additions. Porosity in both sets of doped samples was at a comparable level to that in the undoped HA samples; however, the 5% Sr-HA samples displayed the greatest reduction in porosity with increasing temperature, while the porosity of the 10% Sr-HA samples remained relatively constant over the full sintering temperature range. The undoped HA samples displayed the greatest Weibull strengths, and porosity was determined to be the major controlling factor. However, with the introduction of decompositional phases in the Sr-HA samples, the dependence of strength on porosity is reduced and the phase assemblage becomes the more dominant factor for Weibull strength. The Weibull modulus is relatively independent of the porosity in the undoped HA samples. The 5% Sr-HA samples experience a slight increase in Weibull modulus with porosity, indicating a possible relationship between the parameters. However, the 10% Sr-HA samples show the highest Weibull modulus, with a value of approximately 15 across all sintering temperatures. It is postulated that this is due to the increased amount of surface and lattice diffusion that these samples undergo, which effectively smooths out flaws in the microstructure, owing to a saturation of Sr content affecting grain boundary movement. PMID:23524073

  5. Complexity Analysis of Peat Soil Density Distribution

    NASA Astrophysics Data System (ADS)

    Sampurno, Joko; Diah Faryuni, Irfana; Dzar Eljabbar Latief, Fourier; Srigutomo, Wahyu

    2016-08-01

    The distributions of peat soil density have been identified using a fractal analysis method. The study was conducted on 5 peat soil samples taken from a ground field in Pontianak, West Kalimantan, at the coordinates (0°4'2.27"S, 109°18'48.59"E). In this study, we used micro-computed tomography (a μCT scanner at 9.41 micrometers per pixel resolution) on the peat soil samples to provide five 2-D high-resolution images, L1-L5 (200 × 200 pixels), that were used to detect the distribution of peat soil density. The fractal dimension and intercept were determined with the 2-D Fourier analysis method, which yields the log-log plot of magnitude versus frequency. The fractal dimension was obtained from the straight regression line that interpolated the points in the interval with the largest coefficient of determination, and the intercept was defined by the point of intersection with the vertical axis. The conclusion was that the distributions of peat soil density show fractal behaviour, with the heterogeneity of the samples ranked from highest to lowest as L5, L1, L4, L3 and L2. Meanwhile, the range of density values of the samples, from highest to lowest, was L3, L2, L4, L5 and L1. The study also concluded that the distribution of peat soil density was weakly anisotropic.

  6. Distributed analysis in ATLAS using GANGA

    NASA Astrophysics Data System (ADS)

    Elmsheuser, Johannes; Brochu, Frederic; Cowan, Greig; Egede, Ulrik; Gaidioz, Benjamin; Lee, Hurng-Chun; Maier, Andrew; Móscicki, Jakub; Pajchel, Katarina; Reece, Will; Samset, Bjorn; Slater, Mark; Soroko, Alexander; Vanderster, Daniel; Williams, Michael

    2010-04-01

    Distributed data analysis using Grid resources is one of the fundamental applications in high energy physics to be addressed and realized before the start of LHC data taking. The demands on resource management are very high. In every experiment up to a thousand physicists will be submitting analysis jobs to the Grid. Appropriate user interfaces and helper applications have to be made available to ensure that all users can use the Grid without expertise in Grid technology. These tools enlarge the number of Grid users from a few production administrators to potentially all participating physicists. The GANGA job management system (http://cern.ch/ganga), developed as a common project between the ATLAS and LHCb experiments, provides and integrates these kinds of tools. GANGA provides a simple and consistent way of preparing, organizing and executing analysis tasks within the experiment analysis framework, implemented through a plug-in system. It allows trivial switching between running test jobs on a local batch system and running large-scale analyses on the Grid, hiding Grid technicalities. We will be reporting on the plug-ins and our experiences of distributed data analysis using GANGA within the ATLAS experiment. Support for all Grids presently used by ATLAS, namely the LCG/EGEE, NDGF/NorduGrid, and OSG/PanDA, is provided. The integration and interaction with the ATLAS data management system DQ2 in GANGA is a key functionality. Intelligent job brokering is set up by using the job splitting mechanism together with data-set and file location knowledge. The brokering is aided by an automated system that regularly processes test analysis jobs at all ATLAS DQ2 supported sites. Large numbers of analysis jobs can be sent to the locations of data following the ATLAS computing model. GANGA supports, amongst other things, user analysis with reconstructed data and small-scale production of Monte Carlo data.

  7. Analysis and Modelling of Extreme Wind Speed Distributions in Complex Mountainous Regions

    NASA Astrophysics Data System (ADS)

    Laib, Mohamed; Kanevski, Mikhail

    2016-04-01

    Modelling of wind speed distributions in complex mountainous regions is an important and challenging problem which interests many scientists from several fields. In the present research, high frequency (10 min) Swiss wind speed monitoring data (IDAWEB service, Meteosuisse) are analysed and modelled with different parametric distributions (Weibull, GEV, Gamma, etc.) using the maximum likelihood method. In total, 111 stations placed in different geomorphological units and at different altitudes (from 203 to 3580 meters) are studied. This information is then used for training machine learning algorithms (extreme learning machines, support vector machines) to predict the distribution at new places, potentially useful for aeolian energy generation. An important part of the research deals with the construction and application of a high-dimensional input feature space generated from a digital elevation model. A comprehensive study was carried out using a feature selection approach to get the best model for the prediction. The main results are presented as spatial patterns of the distributions' parameters.

  8. Offshore wind resource assessment with Standard Wind Analysis Tool (SWAT): A Rhode Island case study

    NASA Astrophysics Data System (ADS)

    Crosby, Alexander Robert

    Motivated by the current Rhode Island Ocean SAMP (Special Area Management Plan) project and the growing need in the foreseeable future, analysis tools for wind resource assessment are assembled into a toolkit that can be accessed from a GIS. The analysis is demonstrated by application to the ongoing wind resource assessment of Rhode Island's offshore waters by the Ocean SAMP. The tool is called the Standard Wind Analysis Tool (SWAT). SWAT utilizes a method for integrating observations from the study area or numerical model outputs to assemble the spatial distribution of the offshore wind resource. Available power is inferred from direct measurements of wind speed, but the shape of the atmospheric boundary layer, or wind speed profile, must be parameterized in order to extrapolate measurements to heights other than that of the measurements. The vertical wind speed profile is modeled with the basic power law, assuming a 1/7 exponent parameter representing near-neutral or, more accurately, time-average conditions. As an alternative, an estimate from year-long multi-level observations at a meteorological tower is employed. The basis for the power analysis is the 2-parameter Weibull probability distribution, recognized as standard in modeling typical wind speed distributions. A Monte Carlo simulation of the Weibull probability density function provides the expected power densities at observation sites. Application to Rhode Island's coastal waters yields an estimated Weibull shape parameter of roughly 2 for the offshore environment and a Weibull scale parameter that increases with distance from the coast. Estimates of power in the SAMP study area range from 525 to 850 W/m² at an elevation of 80 meters, based on an observed profile in the SAMP study area. Like the Weibull scale parameter, annual mean wind power increases with distance offshore.
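
    A sketch of the two ingredients described above, the 1/7-exponent power-law profile and a Monte Carlo pass through the Weibull speed distribution; the parameter values are illustrative, not the SAMP tower estimates.

```python
import numpy as np

rng = np.random.default_rng(8)

def extrapolate(v_ref, z_ref=10.0, z=80.0, alpha=1.0 / 7.0):
    """Power-law wind profile: v(z) = v_ref * (z / z_ref)**alpha."""
    return v_ref * (z / z_ref) ** alpha

def mc_power_density(k, c, n=200_000, rho=1.225):
    """Monte Carlo mean power density from a Weibull(k, c) speed distribution."""
    v = c * rng.weibull(k, n)
    return np.mean(0.5 * rho * v ** 3)

# Illustrative offshore parameters: shape near 2, scale from a 10 m
# observation extrapolated to hub height.
k = 2.0
c_10m = 7.0
c_80m = extrapolate(c_10m)   # the scale transforms like the speed itself
print(f"c at 80 m ≈ {c_80m:.2f} m/s")
print(f"mean power density ≈ {mc_power_density(k, c_80m):.0f} W/m^2")
```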

  9. Performance optimisations for distributed analysis in ALICE

    NASA Astrophysics Data System (ADS)

    Betev, L.; Gheata, A.; Gheata, M.; Grigoras, C.; Hristov, P.

    2014-06-01

    Performance is a critical issue in a production system accommodating hundreds of analysis users. Compared to a local session, distributed analysis is exposed to services and network latencies, remote data access and heterogeneous computing infrastructure, creating a more complex performance and efficiency optimization matrix. During the last 2 years, ALICE analysis shifted from a fast development phase to more mature and stable code. At the same time, the frameworks and tools for deployment, monitoring and management of large productions have evolved considerably too. The ALICE Grid production system is currently used by a fair share of organized and individual user analysis, consuming up to 30% of the available resources and ranging from fully I/O-bound analysis code to CPU-intensive correlation or resonance studies. While the intrinsic analysis performance is unlikely to improve by a large factor during the LHC long shutdown (LS1), the overall efficiency of the system still has to be improved by a significant factor to satisfy the analysis needs. We have instrumented all analysis jobs with "sensors" collecting comprehensive monitoring information on the job running conditions and performance in order to identify bottlenecks in the data processing flow. These data are collected by the MonALISA-based ALICE Grid monitoring system and are used to steer and improve the job submission and management policy, to identify operational problems in real time and to perform automatic corrective actions. In parallel with an upgrade of our production system, we are aiming for low-level improvements related to data format, data management and merging of results to allow for better performing ALICE analysis.

  10. Time-resolved force distribution analysis

    PubMed Central

    2013-01-01

    Background Biomolecules or other complex macromolecules undergo conformational transitions upon exposure to an external perturbation such as ligand binding or mechanical force. To follow fluctuations in pairwise forces between atoms or residues during such conformational changes as observed in Molecular Dynamics (MD) simulations, we developed Time-Resolved Force Distribution Analysis (TRFDA). Results The implementation focuses on computational efficiency and low memory usage and, along with the wide range of output options, makes possible time-series analysis of pairwise force variation in long MD simulations and for large molecular systems. It also provides an exact decomposition of pairwise forces resulting from 3- and 4-body potentials and a unified treatment of pairwise forces between atoms or residues. As a proof of concept, we present a stress analysis during unfolding of ubiquitin in a force-clamp MD simulation. Conclusions TRFDA can be used, among other applications, in tracking signal propagation at the atomic level, in characterizing dynamical intermolecular interactions (e.g. protein-ligand during flexible docking), in the development of force fields, and in following stress distribution during conformational changes. PMID:24499624

  11. Analysis of Jingdong Mall Logistics Distribution Model

    NASA Astrophysics Data System (ADS)

    Shao, Kang; Cheng, Feng

    In recent years, the pace of e-commerce development in China has accelerated. The role of logistics has been highlighted, and more and more e-commerce enterprises are beginning to realize the importance of logistics to the success or failure of the enterprise. In this paper, the author takes Jingdong Mall as an example, performs a SWOT analysis of the current situation of its self-built logistics system, identifies the problems existing in the current Jingdong Mall logistics distribution, and gives appropriate recommendations.

  12. DASH--distributed analysis system hierarchy

    NASA Astrophysics Data System (ADS)

    Yagi, Masafumi; Yoshihiko, Mizumoto; Ogasawara, Ryusuke; Kosugi, George; Takata, Tadafumi; Ishihara, Yasuhide; Yokono, Yasunori; Morita, Yasuhiro; Nakamoto, Hiroyuki; Watanabe, Noboru; Ukawa, Kentaro

    2002-12-01

    We have developed and are operating an object-oriented data reduction and data analysis system, DASH (Distributed Analysis Software Hierarchy), for efficient data processing for the SUBARU telescope. In DASH, all information for reducing a set of data is packed into an abstracted object named a "Hierarchy". It contains rules for searching calibration data, the reduction procedure leading to the final result, and the reduction log. With Hierarchy, DASH works as an automated reduction pipeline platform in cooperation with STARS (Subaru Telescope ARchive System). DASH is implemented with CORBA and Java technology. The portability of these technologies enabled us to make a subset of the system for a small stand-alone system, SASH. SASH is compatible with DASH, and one can continuously reduce and analyze data between DASH and SASH.

  13. Analysis and control of distributed cooperative systems.

    SciTech Connect

    Feddema, John Todd; Parker, Eric Paul; Wagner, John S.; Schoenwald, David Alan

    2004-09-01

    As part of the DARPA Information Processing Technology Office (IPTO) Software for Distributed Robotics (SDR) Program, Sandia National Laboratories has developed analysis and control software for coordinating tens to thousands of autonomous cooperative robotic agents (primarily unmanned ground vehicles) performing military operations such as reconnaissance, surveillance and target acquisition; countermine and explosive ordnance disposal; force protection and physical security; and logistics support. Due to the nature of these applications, the control techniques must be distributed, and they must not rely on high bandwidth communication between agents. At the same time, a single soldier must be able to easily direct these large-scale systems. Finally, the control techniques must be provably convergent so as not to cause undue harm to civilians. In this project, provably convergent, moderate communication bandwidth, distributed control algorithms have been developed that can be regulated by a single soldier. We have simulated in great detail the control of low numbers of vehicles (up to 20) navigating throughout a building, and we have simulated in lesser detail the control of larger numbers of vehicles (up to 1000) trying to locate several targets in a large outdoor facility. Finally, we have experimentally validated the resulting control algorithms on smaller numbers of autonomous vehicles.

  14. Global analysis of nuclear parton distributions

    NASA Astrophysics Data System (ADS)

    de Florian, Daniel; Sassot, Rodolfo; Zurita, Pia; Stratmann, Marco

    2012-04-01

    We present a new global QCD analysis of nuclear parton distribution functions and their uncertainties. In addition to the most commonly analyzed data sets for the deep-inelastic scattering of charged leptons off nuclei and Drell-Yan dilepton production, we include also measurements for neutrino-nucleus scattering and inclusive pion production in deuteron-gold collisions. The analysis is performed at next-to-leading order accuracy in perturbative QCD in a general mass variable flavor number scheme, adopting a current set of free nucleon parton distribution functions, defined accordingly, as reference. The emerging picture is one of consistency, where universal nuclear modification factors for each parton flavor reproduce the main features of all data without any significant tension among the different sets. We use the Hessian method to estimate the uncertainties of the obtained nuclear modification factors and examine critically their range of validity in view of the sparse kinematic coverage of the present data. We briefly present several applications of our nuclear parton densities in hard nuclear reactions at BNL-RHIC, CERN-LHC, and a future electron-ion collider.

  15. Elemental distribution analysis of urinary crystals.

    PubMed

    Fazil Marickar, Y M; Lekshmi, P R; Varma, Luxmi; Koshy, Peter

    2009-10-01

    Various crystals are seen in human urine. Some of them, particularly calcium oxalate dihydrate, are seen normally. Pathological crystals indicate crystal formation initiating urinary stones. Unfortunately, many of the relevant crystals are not recognized in light microscopic analysis of the urinary deposit performed in most clinical laboratories. Many crystals are not clearly identifiable under ordinary light microscopy. The objective of the present study was to perform scanning electron microscopic (SEM) assessment of various urinary deposits and confirm their identity by elemental distribution analysis (EDAX). 50 samples of urinary deposits were collected from a urinary stone clinic. Deposits containing significant crystalluria (more than 10 per HPF) were collected under liquid paraffin in special containers and taken up for SEM studies. The deposited crystals were retrieved with appropriate Pasteur pipettes and placed on micropore filter paper discs. The fluid was absorbed by thicker layers of filter paper underneath, and the discs were fixed to brass studs. They were then gold sputtered to 100 Å and examined under SEM (Jeol JSM 35C microscope). When crystals were seen, their morphology was recorded by taking photographs at different angles. At appropriate magnification, the EDAX probe was pointed at the crystals under study and the wave patterns analyzed. Components of the crystals were recognized by utilizing the data. All the samples analyzed contained a significant number of crystals. All samples contained more than one type of crystal. The commonest crystals encountered included calcium oxalate monohydrate (whewellite 22%), calcium oxalate dihydrate (weddellite 32%), uric acid (10%), and calcium phosphates, namely apatite (4%), brushite (6%), struvite (6%) and octacalcium phosphate (2%). The morphological appearances of the urinary crystals described were correlated with the wavelengths obtained through elemental distribution analysis. Various urinary crystals that

  16. Ceramics Analysis and Reliability Evaluation of Structures (CARES). Users and programmers manual

    NASA Technical Reports Server (NTRS)

    Nemeth, Noel N.; Manderscheid, Jane M.; Gyekenyesi, John P.

    1990-01-01

    This manual describes how to use the Ceramics Analysis and Reliability Evaluation of Structures (CARES) computer program. The primary function of the code is to calculate the fast fracture reliability or failure probability of macroscopically isotropic ceramic components. These components may be subjected to complex thermomechanical loadings, such as those found in heat engine applications. The program uses results from MSC/NASTRAN or ANSYS finite element analysis programs to evaluate component reliability due to inherent surface and/or volume type flaws. CARES utilizes the Batdorf model and the two-parameter Weibull cumulative distribution function to describe the effect of multiaxial stress states on material strength. The principle of independent action (PIA) and the Weibull normal stress averaging models are also included. Weibull material strength parameters, the Batdorf crack density coefficient, and other related statistical quantities are estimated from four-point bend bar or uniform uniaxial tensile specimen fracture strength data. Parameter estimation can be performed for single or multiple failure modes by using least-squares analysis or the maximum likelihood method. Kolmogorov-Smirnov and Anderson-Darling goodness-of-fit tests, ninety percent confidence intervals on the Weibull parameters, and Kanofsky-Srinivasan ninety percent confidence band values are also provided. The probabilistic fast-fracture theories used in CARES, along with the input and output for CARES, are described. Example problems to demonstrate various features of the program are also included. This manual describes the MSC/NASTRAN version of the CARES program.
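
    As an illustrative sketch (not the CARES code itself), the maximum likelihood estimation of two-parameter Weibull strength parameters and the resulting failure probability can be written as follows; the fracture strengths are placeholders:

    ```python
    import numpy as np
    from scipy import stats

    strengths = np.array([412.0, 455.0, 478.0, 390.0, 505.0,
                          441.0, 468.0, 430.0, 497.0, 520.0])  # placeholder MPa

    m, _, sigma0 = stats.weibull_min.fit(strengths, floc=0)  # m: Weibull modulus

    def failure_probability(sigma):
        # two-parameter Weibull cumulative distribution of strength
        return 1.0 - np.exp(-((np.asarray(sigma) / sigma0) ** m))

    print(f"Weibull modulus m = {m:.1f}, characteristic strength = {sigma0:.0f} MPa")
    print(f"P_f at 400 MPa = {failure_probability(400.0):.3f}")
    ```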

  17. CMS distributed data analysis with CRAB3

    DOE PAGESBeta

    Mascheroni, M.; Balcas, J.; Belforte, S.; Bockelman, B. P.; Hernandez, J. M.; Ciangottini, D.; Konstantinov, P. B.; Silva, J. M. D.; Ali, M. A. B. M.; Melo, A. M.; et al

    2015-12-23

    The CMS Remote Analysis Builder (CRAB) is a distributed workflow management tool which facilitates analysis tasks by isolating users from the technical details of the Grid infrastructure. Throughout LHC Run 1, CRAB has been successfully employed by an average of 350 distinct users each week executing about 200,000 jobs per day. CRAB has been significantly upgraded in order to face the new challenges posed by LHC Run 2. Components of the new system include 1) a lightweight client, 2) a central primary server which communicates with the clients through a REST interface, 3) secondary servers which manage user analysis tasks and submit jobs to the CMS resource provisioning system, and 4) a central service to asynchronously move user data from temporary storage in the execution site to the desired storage location. Furthermore, the new system improves the robustness, scalability and sustainability of the service. Here we provide an overview of the new system, operation, and user support, report on its current status, and identify lessons learned from the commissioning phase and production roll-out.

  18. CMS distributed data analysis with CRAB3

    SciTech Connect

    Mascheroni, M.; Balcas, J.; Belforte, S.; Bockelman, B. P.; Hernandez, J. M.; Ciangottini, D.; Konstantinov, P. B.; Silva, J. M. D.; Ali, M. A. B. M.; Melo, A. M.; Riahi, H.; Tanasijczuk, A. J.; Yusli, M. N. B.; Wolf, M.; Woodard, A. E.; Vaandering, E.

    2015-12-23

    The CMS Remote Analysis Builder (CRAB) is a distributed workflow management tool which facilitates analysis tasks by isolating users from the technical details of the Grid infrastructure. Throughout LHC Run 1, CRAB has been successfully employed by an average of 350 distinct users each week executing about 200,000 jobs per day. CRAB has been significantly upgraded in order to face the new challenges posed by LHC Run 2. Components of the new system include 1) a lightweight client, 2) a central primary server which communicates with the clients through a REST interface, 3) secondary servers which manage user analysis tasks and submit jobs to the CMS resource provisioning system, and 4) a central service to asynchronously move user data from temporary storage in the execution site to the desired storage location. Furthermore, the new system improves the robustness, scalability and sustainability of the service. Here we provide an overview of the new system, operation, and user support, report on its current status, and identify lessons learned from the commissioning phase and production roll-out.

  19. CMS distributed data analysis with CRAB3

    NASA Astrophysics Data System (ADS)

    Mascheroni, M.; Balcas, J.; Belforte, S.; Bockelman, B. P.; Hernandez, J. M.; Ciangottini, D.; Konstantinov, P. B.; Silva, J. M. D.; Ali, M. A. B. M.; Melo, A. M.; Riahi, H.; Tanasijczuk, A. J.; Yusli, M. N. B.; Wolf, M.; Woodard, A. E.; Vaandering, E.

    2015-12-01

    The CMS Remote Analysis Builder (CRAB) is a distributed workflow management tool which facilitates analysis tasks by isolating users from the technical details of the Grid infrastructure. Throughout LHC Run 1, CRAB has been successfully employed by an average of 350 distinct users each week executing about 200,000 jobs per day. CRAB has been significantly upgraded in order to face the new challenges posed by LHC Run 2. Components of the new system include 1) a lightweight client, 2) a central primary server which communicates with the clients through a REST interface, 3) secondary servers which manage user analysis tasks and submit jobs to the CMS resource provisioning system, and 4) a central service to asynchronously move user data from temporary storage in the execution site to the desired storage location. The new system improves the robustness, scalability and sustainability of the service. Here we provide an overview of the new system, operation, and user support, report on its current status, and identify lessons learned from the commissioning phase and production roll-out.

  20. Analysis of stratocumulus cloud fields using LANDSAT imagery: Size distributions and spatial separations

    NASA Technical Reports Server (NTRS)

    Welch, R. M.; Sengupta, S. K.; Chen, D. W.

    1990-01-01

    Stratocumulus cloud fields in the FIRE IFO region are analyzed using LANDSAT Thematic Mapper imagery. Structural properties such as cloud cell size distribution, cell horizontal aspect ratio, fractional coverage and fractal dimension are determined. It is found that stratocumulus cloud number densities are represented by a power law. Cell horizontal aspect ratio has a tendency to increase at large cell sizes, and cells are bi-fractal in nature. Using LANDSAT Multispectral Scanner imagery for twelve selected stratocumulus scenes acquired during previous years, similar structural characteristics are obtained. Cloud field spatial organization also is analyzed. Nearest-neighbor spacings are fit with a number of functions, with Weibull and Gamma distributions providing the best fits. Poisson tests show that the spatial separations are not random. Second order statistics are used to examine clustering.
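
    The spacing analysis can be sketched briefly, with synthetic spacings standing in for the LANDSAT-derived nearest-neighbor separations; both fits are scored with a Kolmogorov-Smirnov test:

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(2)
    # placeholder nearest-neighbor cell spacings (km)
    spacings = stats.weibull_min.rvs(1.6, scale=2.0, size=400, random_state=rng)

    for name, dist in [("Weibull", stats.weibull_min), ("Gamma", stats.gamma)]:
        params = dist.fit(spacings, floc=0)
        ks = stats.kstest(spacings, dist.cdf, args=params)
        print(f"{name}: KS statistic = {ks.statistic:.3f}, p = {ks.pvalue:.2f}")
    ```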

  1. Buffered Communication Analysis in Distributed Multiparty Sessions

    NASA Astrophysics Data System (ADS)

    Deniélou, Pierre-Malo; Yoshida, Nobuko

    Many communication-centred systems today rely on asynchronous messaging among distributed peers to make efficient use of parallel execution and resource access. With such asynchrony, the communication buffers can grow excessively over time. This paper proposes a static verification methodology based on multiparty session types which can efficiently compute upper bounds on buffer sizes. Our analysis relies on a uniform causality audit of the entire collaboration pattern - an examination that is not always possible from each end-point type. We extend this method to design algorithms that allocate communication channels in order to optimise the memory requirements of session executions. From these analyses, we propose two refinement methods which respect buffer bounds: a global protocol refinement that automatically inserts confirmation messages to guarantee stipulated buffer sizes, and a local protocol refinement to optimise asynchronous messaging without buffer overflow. Finally, our work is applied to overcome a buffer overflow problem in the multi-buffering algorithm.

  2. A Mixed Kijima Model Using the Weibull-Based Generalized Renewal Processes

    PubMed Central

    2015-01-01

    Generalized Renewal Processes are useful for approaching the rejuvenation of dynamical systems resulting from planned or unplanned interventions. We present new perspectives for Generalized Renewal Processes in general and for Weibull-based Generalized Renewal Processes in particular. Departing from the literature, we present a mixed Generalized Renewal Processes approach involving Kijima Type I and II models, allowing one to infer the impact of distinct interventions on the performance of the system under study. The first and second theoretical moments of this model are introduced, as well as its maximum likelihood estimation and random sampling approaches. In order to illustrate the usefulness of the proposed Weibull-based Generalized Renewal Processes model, some real data sets involving improving, stable, and deteriorating systems are used. PMID:26197222
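
    The Kijima virtual-age recursions that underlie the mixed model can be stated in a few lines; this is a minimal sketch, with q in [0, 1] the intervention effectiveness and x_i the inter-failure times:

    ```python
    # Kijima Type I rejuvenates only the last sojourn; Type II rejuvenates
    # the whole accumulated virtual age.
    def kijima_virtual_ages(inter_failure_times, q, kind="I"):
        v, ages = 0.0, []
        for x in inter_failure_times:
            v = v + q * x if kind == "I" else q * (v + x)
            ages.append(v)
        return ages

    print(kijima_virtual_ages([10.0, 8.0, 6.0], q=0.3, kind="I"))   # ~[3.0, 5.4, 7.2]
    print(kijima_virtual_ages([10.0, 8.0, 6.0], q=0.3, kind="II"))  # ~[3.0, 3.3, 2.79]
    ```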

  3. Sensitivity analysis of distributed volcanic source inversion

    NASA Astrophysics Data System (ADS)

    Cannavo', Flavio; Camacho, Antonio G.; González, Pablo J.; Puglisi, Giuseppe; Fernández, José

    2016-04-01

    A recently proposed algorithm (Camacho et al., 2011) claims to rapidly estimate magmatic sources from surface geodetic data without any a priori assumption about source geometry. The algorithm takes advantage of the fast calculation from analytical models and adds the capability to model free-shape distributed sources. Assuming homogeneous elastic conditions, the approach can determine general geometrical configurations of pressure and/or density sources and/or sliding structures corresponding to prescribed values of anomalous density, pressure and slip. These source bodies are described as aggregations of elemental point sources for pressure, density and slip, and they fit the whole data set (keeping some 3D regularity conditions). Although some examples and applications have already been presented to demonstrate the ability of the algorithm to reconstruct a magma pressure source (e.g. Camacho et al., 2011; Cannavò et al., 2015), a systematic analysis of the sensitivity and reliability of the algorithm is still lacking. In this explorative work we present results from a large statistical test designed to evaluate the advantages and limitations of the methodology by assessing its sensitivity to the free and constrained parameters involved in inversions. In particular, besides the source parameters, we focused on the ground deformation network topology and noise in measurements. The proposed analysis can be used for a better interpretation of the algorithm results in real-case applications. Camacho, A. G., González, P. J., Fernández, J. & Berrino, G. (2011) Simultaneous inversion of surface deformation and gravity changes by means of extended bodies with a free geometry: Application to deforming calderas. J. Geophys. Res. 116. Cannavò F., Camacho A.G., González P.J., Mattia M., Puglisi G., Fernández J. (2015) Real Time Tracking of Magmatic Intrusions by means of Ground Deformation Modeling during Volcanic Crises, Scientific Reports, 5 (10970) doi:10.1038/srep

  4. Distributed Design and Analysis of Computer Experiments

    2002-11-11

    DDACE is a C++ object-oriented software library for the design and analysis of computer experiments. DDACE can be used to generate samples from a variety of sampling techniques. These samples may be used as input to an application code. DDACE also contains statistical tools such as response surface models and correlation coefficients to analyze input/output relationships between variables in an application code. DDACE can generate input values for uncertain variables within a user's application. For example, a user might like to vary a temperature variable as well as some material variables in a series of simulations. Through the series of simulations the user might be looking for optimal settings of parameters based on some user criteria, or the user may be interested in the sensitivity to input variability shown by an output variable. In either case, the user may provide information about the suspected ranges and distributions of a set of input variables, along with a sampling scheme, and DDACE will generate input points based on these specifications. The input values generated by DDACE and the one or more outputs computed through the user's application code can be analyzed with a variety of statistical methods. This can lead to a wealth of information about the relationships between the variables in the problem. While statistical and mathematical packages may be employed to carry out the analysis on the input/output relationships, DDACE also contains some tools for analyzing the simulation data. DDACE incorporates a software package called MARS (Multivariate Adaptive Regression Splines), developed by Jerome Friedman. MARS is used for generating a spline surface fit of the data. With MARS, a model simplification may be calculated using the input and corresponding output values for the user's application problem. The MARS grid data may be used for generating 3-dimensional response surface plots of the simulation data. DDACE also contains an implementation

  5. Periodic clustering of human disease-specific mortality distributions by shape and time position, and a new integer-based law of mortality.

    PubMed

    Juckett, D A; Rosenberg, B

    1990-09-01

    Human mortality distributions were analyzed for 29 disease-specific causes of death in male and female, White (U.S.A.), Black (U.S.A.) and Japanese (Japan) populations, constituting a total of 162 separate cohorts. For each cohort distribution, the curve moments and the parameter values for fits to model equations were determined. The differences between cohort distributions were characterized by two degrees of freedom, related to distribution position and shape, respectively. A form of the Weibull function was shown to contain two parameters that mapped to these two degrees of freedom. Parametric analysis of 136 best-fitting cohorts yielded periodic clustering in the set of values for both Weibull parameters, as quantitated using a Fourier transform method and an independent statistical method. This periodicity was unlikely to have occurred by chance (P less than 0.01). We have combined these results into a Law of Mortality, based on a Weibull function containing only integer parameters and constants, which is valid for all human age-related disease mortality. We show that the life expectancy differences between races and sexes are completely described by this formalism. We conclude that human mortality is controlled by discrete events, which are manifested in the appearance of only allowed mortality curve shapes and positions.

  6. Distribution entropy analysis of epileptic EEG signals.

    PubMed

    Li, Peng; Yan, Chang; Karmakar, Chandan; Liu, Changchun

    2015-01-01

    It is an open-ended challenge to accurately detect epileptic seizures through electroencephalogram (EEG) signals. Recently published studies have made elaborate attempts to distinguish between normal and epileptic EEG signals by advanced nonlinear entropy methods, such as the approximate entropy, sample entropy, fuzzy entropy, and permutation entropy. Most recently, a novel distribution entropy (DistEn) has been reported to have superior performance compared with the conventional entropy methods, especially for short-length data. We thus aimed, in the present study, to show the potential of DistEn in the analysis of epileptic EEG signals. The publicly accessible Bonn database, which consists of normal, interictal, and ictal EEG signals, was used in this study. Three different measurement protocols were set for better understanding the performance of DistEn: i) calculate the DistEn of a specific EEG signal using the full recording; ii) calculate the DistEn by averaging the results for all its possible non-overlapped 5 second segments; and iii) calculate it by averaging the DistEn values for all the possible non-overlapped segments of 1 second length. Results for all three protocols indicated a statistically significantly increased DistEn for the ictal class compared with both the normal and interictal classes. Besides, the results obtained under the third protocol, which only used very short segments (1 s) of EEG recordings, showed a significantly (p < 0.05) increased DistEn for the interictal class in comparison with the normal class, whereas both analyses using relatively long EEG signals failed in tracking this difference between them, which may be due to a nonstationarity effect on the entropy algorithm. The capability of discriminating between normal and interictal EEG signals is of great clinical relevance since it may provide helpful tools for the detection of a seizure onset. Therefore, our study suggests that the Dist

  7. Survival Analysis of Patients with End Stage Renal Disease

    NASA Astrophysics Data System (ADS)

    Urrutia, J. D.; Gayo, W. S.; Bautista, L. A.; Baccay, E. B.

    2015-06-01

    This paper provides a survival analysis of End Stage Renal Disease (ESRD) using Kaplan-Meier estimates and the Weibull distribution. The data were obtained from the records of V. L. Makabali Memorial Hospital with respect to time t (patient's age), covariates such as developed secondary disease (Pulmonary Congestion and Cardiovascular Disease), gender, and the event of interest: the death of ESRD patients. Survival and hazard rates were estimated using NCSS for the Weibull distribution and SPSS for the Kaplan-Meier estimates. These lead to the same conclusion: the hazard rate increases and the survival rate decreases over time for ESRD patients diagnosed with Pulmonary Congestion, Cardiovascular Disease, or both diseases. It also shows that female patients have a greater risk of death compared to males. The probability of risk was given by the equation R = 1 - e^(-H(t)), where e^(-H(t)) is the survival function and H(t) the cumulative hazard function, which was created using Cox regression.
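
    A small sketch of the quoted risk computation, assuming a Weibull cumulative hazard H(t) = (t/scale)^shape; the parameter values are purely illustrative:

    ```python
    import numpy as np

    def risk(t, shape=1.5, scale=70.0):
        H = (np.asarray(t) / scale) ** shape   # cumulative hazard H(t)
        survival = np.exp(-H)                  # S(t) = e^(-H(t))
        return 1.0 - survival                  # R = 1 - e^(-H(t))

    print(np.round(risk([40.0, 60.0, 80.0]), 3))
    ```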

  8. Dentin bonding performance using Weibull statistics and evaluation of acid-base resistant zone formation of recently introduced adhesives.

    PubMed

    Guan, Rui; Takagaki, Tomohiro; Matsui, Naoko; Sato, Takaaki; Burrow, Michael F; Palamara, Joseph; Nikaido, Toru; Tagami, Junji

    2016-07-30

    Dentin bonding durability of recently introduced dental adhesives: Clearfil SE Bond 2 (SE2), Optibond XTR (XTR), and Scotchbond Universal (SBU) was investigated using Weibull analysis as well as analysis of the micromorphological features of the acid-base resistant zone (ABRZ) created for the adhesives. The bonding procedures of SBU were divided into three subgroups: self-etch (SBS), phosphoric acid (PA) etching on moist (SBM) or dry dentin (SBD). All groups were thermocycled for 0, 5,000 and 10,000 cycles followed by microtensile bond strength testing. Acid-base challenge was undertaken before SEM and TEM observations of the adhesive interface. The etch-and-rinse method with SBU (SBM and SBD) created inferior interfaces on the dentin surface which resulted in reduced bond durability. ABRZ formation was detected with the self-etch adhesive systems; SE2, XTR and SBS. In the PA etching protocols of SBM and SBD, a thick hybrid layer but no ABRZ was detected, which might affect dentin bond durability. PMID:27335136
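
    The Weibull analysis of bond strengths is commonly done by median-rank regression on the linearized CDF; a hedged sketch follows, with placeholder microtensile bond strengths (MPa) rather than the study's data:

    ```python
    import numpy as np

    strengths = np.sort(np.array([28.1, 31.4, 25.6, 35.2, 29.8, 33.0, 27.3, 30.5]))
    n = strengths.size
    F = (np.arange(1, n + 1) - 0.3) / (n + 0.4)   # median-rank estimator

    x = np.log(strengths)
    y = np.log(-np.log(1.0 - F))                  # linearized Weibull CDF
    m, b = np.polyfit(x, y, 1)                    # slope m is the Weibull modulus
    sigma0 = np.exp(-b / m)                       # characteristic strength (F = 63.2%)
    print(f"m = {m:.2f}, sigma0 = {sigma0:.1f} MPa")
    ```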

  9. Analysis of Temperature Distributions in Nighttime Inversions

    NASA Astrophysics Data System (ADS)

    Telyak, Oksana; Krasouski, Aliaksandr; Svetashev, Alexander; Turishev, Leonid; Barodka, Siarhei

    2015-04-01

    Adequate prediction of temperature inversion in the atmospheric boundary layer is one of prerequisites for successful forecasting of meteorological parameters and severe weather events. Examples include surface air temperature and precipitation forecasting as well as prediction of fog, frosts and smog with hazardous levels of atmospheric pollution. At the same time, reliable forecasting of temperature inversions remains an unsolved problem. For prediction of nighttime inversions over some specific territory, it is important to study characteristic features of local circulation cells formation and to properly take local factors into account to develop custom modeling techniques for operational use. The present study aims to investigate and analyze vertical temperature distributions in tropospheric inversions (isotherms) over the territory of Belarus. We study several specific cases of formation, evolution and decay of deep nighttime temperature inversions in Belarus by means of mesoscale numerical simulations with WRF model, considering basic mechanisms of isothermal and inverse temperature layers formation in the troposphere and impact of these layers on local circulation cells. Our primary goal is to assess the feasibility of advance prediction of inversions formation with WRF. Modeling results reveal that all cases under consideration have characteristic features of radiative inversions (e.g., their formation times, development phases, inversion intensities, etc). Regions of "blocking" layers formation are extensive and often spread over the entire territory of Belarus. Inversions decay starts from the lowermost (near surface) layer (altitudes of 5 to 50 m). In all cases, one can observe formation of temperature gradients that substantially differ from the basic inversion gradient, i.e. the layer splits into smaller layers, each having a different temperature stratification (isothermal, adiabatic, etc). As opposed to various empirical techniques as well as

  10. Harmonic analysis of electrical distribution systems

    SciTech Connect

    1996-03-01

    This report presents data pertaining to research on harmonics of electric power distribution systems. Harmonic data are presented on RMS and average measurements for the determination of harmonics in buildings; fluorescent ballasts; variable frequency drives; Georator Geosine harmonic data; uninterruptible power supplies; delta-wye transformers; Westinghouse Suresine; Liebert Datawave; and active injection mode filter data.

  11. Integer sparse distributed memory: analysis and results.

    PubMed

    Snaider, Javier; Franklin, Stan; Strain, Steve; George, E Olusegun

    2013-10-01

    Sparse distributed memory is an auto-associative memory system that stores high dimensional Boolean vectors. Here we present an extension of the original SDM, the Integer SDM that uses modular arithmetic integer vectors rather than binary vectors. This extension preserves many of the desirable properties of the original SDM: auto-associativity, content addressability, distributed storage, and robustness over noisy inputs. In addition, it improves the representation capabilities of the memory and is more robust over normalization. It can also be extended to support forgetting and reliable sequence storage. We performed several simulations that test the noise robustness property and capacity of the memory. Theoretical analyses of the memory's fidelity and capacity are also presented. PMID:23747569
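
    A toy sketch of the Integer SDM idea, not the authors' implementation: hard locations hold one counter per (dimension, residue) pair, writes increment the counters at activated locations, and reads take the per-dimension argmax. All sizes are arbitrary:

    ```python
    import numpy as np

    class IntegerSDMSketch:
        def __init__(self, n_locations, dim, modulus, rng):
            self.m = modulus
            self.addresses = rng.integers(0, modulus, size=(n_locations, dim))
            self.counters = np.zeros((n_locations, dim, modulus), dtype=np.int64)

        def _active(self, x, radius):
            d = np.abs(self.addresses - x)
            lee = np.minimum(d, self.m - d).sum(axis=1)   # modular (Lee) distance
            return np.flatnonzero(lee <= radius)

        def write(self, x, radius):
            for loc in self._active(x, radius):
                self.counters[loc, np.arange(x.size), x] += 1

        def read(self, x, radius):
            return self.counters[self._active(x, radius)].sum(axis=0).argmax(axis=1)

    rng = np.random.default_rng(3)
    sdm = IntegerSDMSketch(n_locations=2000, dim=32, modulus=16, rng=rng)
    v = rng.integers(0, 16, size=32)
    sdm.write(v, radius=100)
    noisy = (v + rng.integers(-1, 2, size=32)) % 16       # perturbed cue
    print("recovered exactly:", np.array_equal(sdm.read(noisy, radius=100), v))
    ```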

  12. Economic analysis of efficient distribution transformer trends

    SciTech Connect

    Downing, D.J.; McConnell, B.W.; Barnes, P.R.; Hadley, S.W.; Van Dyke, J.W.

    1998-03-01

    This report outlines an approach that will account for uncertainty in the development of evaluation factors used to identify transformer designs with the lowest total owning cost (TOC). The TOC methodology is described and the most highly variable parameters are discussed. The model is developed to account for uncertainties as well as statistical distributions for the important parameters. Sample calculations are presented. The TOC methodology is applied to data provided by two utilities in order to test its validity.
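
    The standard transformer evaluation has the form TOC = bid price + A * (no-load loss) + B * (load loss), with A and B ($/W) the evaluation factors; a hedged sketch of propagating uncertainty in A and B through this formula, with placeholder numbers, follows:

    ```python
    import numpy as np

    rng = np.random.default_rng(4)
    A = rng.normal(3.5, 0.5, size=10_000)   # $/W no-load loss factor (assumed)
    B = rng.normal(1.0, 0.2, size=10_000)   # $/W load loss factor (assumed)

    def toc(bid_price, no_load_w, load_w):
        # TOC = bid price + A * no-load loss + B * load loss
        return bid_price + A * no_load_w + B * load_w

    samples = toc(12_000.0, 150.0, 900.0)   # placeholder design
    print(f"TOC mean ${samples.mean():,.0f}, "
          f"5-95% range [${np.percentile(samples, 5):,.0f}, "
          f"${np.percentile(samples, 95):,.0f}]")
    ```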

  13. Performance of free-space optical communication system using differential phase-shift keying subcarrier-intensity modulated over the exponentiated Weibull channel

    NASA Astrophysics Data System (ADS)

    Gao, Zhengguang; Liu, Hongzhan; Liao, Renbo; Ma, Xiaoping

    2015-10-01

    A differential phase-shift keying modulation for free-space optical (FSO) communication is considered in atmospheric turbulence modeled by the exponentiated Weibull distribution. Selection combining (SelC) spatial diversity is used to mitigate the effects of atmospheric turbulence. We analyze the average bit error rate (BER) of the system using SelC spatial diversity by means of a Gauss-Laguerre approximation. The effect of aperture averaging and spatial diversity on the outage probability is also studied. The numerical results show that a lower signal-to-noise ratio is required to reach the same BER when a large aperture and SelC spatial diversity are deployed in the FSO system. Moreover, it is shown that aperture averaging and SelC spatial diversity are effective in improving the system's outage probability.
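
    A sketch under stated assumptions: draw irradiance fades from an exponentiated Weibull channel by inverse-transform sampling and apply selection combining (keep the strongest of N branches). All parameter values are illustrative only:

    ```python
    import numpy as np

    def exp_weibull_rvs(alpha, beta, eta, size, rng):
        # invert F(x) = (1 - exp(-(x/eta)**beta))**alpha
        u = rng.random(size)
        return eta * (-np.log(1.0 - u ** (1.0 / alpha))) ** (1.0 / beta)

    rng = np.random.default_rng(4)
    branches = exp_weibull_rvs(alpha=2.0, beta=1.8, eta=1.0,
                               size=(4, 100_000), rng=rng)
    selc = branches.max(axis=0)   # selection combining output

    # deep fades below a threshold approximate an outage count
    print("P(fade < 0.2): single branch", np.mean(branches[0] < 0.2),
          "| SelC", np.mean(selc < 0.2))
    ```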

  14. Progressive failure site generation in AlGaN/GaN high electron mobility transistors under OFF-state stress: Weibull statistics and temperature dependence

    SciTech Connect

    Sun, Huarui Bajo, Miguel Montes; Uren, Michael J.; Kuball, Martin

    2015-01-26

    Gate leakage degradation of AlGaN/GaN high electron mobility transistors under OFF-state stress is investigated using a combination of electrical, optical, and surface morphology characterizations. The generation of leakage “hot spots” at the edge of the gate is found to be strongly temperature accelerated. The time for the formation of each failure site follows a Weibull distribution with a shape parameter in the range of 0.7–0.9 from room temperature up to 120 °C. The average leakage per failure site is only weakly temperature dependent. The stress-induced structural degradation at the leakage sites exhibits a temperature dependence in the surface morphology, which is consistent with a surface defect generation process involving temperature-associated changes in the breakdown sites.

  15. Equity analysis of hospital beds distribution in Shiraz, Iran 2014

    PubMed Central

    Hatam, Nahid; Zakeri, Mohammadreza; Sadeghi, Ahmad; Darzi Ramandi, Sajad; Hayati, Ramin; Siavashi, Elham

    2016-01-01

    Background: One of the important aspects of equity in health is equality in the distribution of resources in this sector. The present study aimed to assess the distribution of hospital beds in Shiraz in 2014. Methods: In this retrospective cross-sectional study, the population density index and the fairness of bed distribution were analyzed by Lorenz curve and Gini coefficient, respectively. Descriptive data were analyzed using Excel software. We used the Distributive Analysis Stata Package (DASP) in STATA software, version 12, for computing the Gini coefficient and drawing the Lorenz curve. Results: The Gini coefficient was 0.68 in the population. Besides, the Gini coefficient of hospital beds' distribution based on population density was 0.70, which represented inequality in the distribution of hospital beds among the nine regions of Shiraz. Conclusion: Although the total number of hospital beds was reasonable in Shiraz, the distribution of these resources was not fair, and inequality was observed in their distribution among the nine regions of Shiraz. PMID:27579284
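
    The Gini computation itself is compact; a short sketch follows (the study used DASP/STATA), with placeholder beds-per-region values:

    ```python
    import numpy as np

    def gini(values):
        x = np.sort(np.asarray(values, dtype=float))
        n = x.size
        lorenz = np.cumsum(x) / x.sum()        # Lorenz curve ordinates
        return (n + 1 - 2 * lorenz.sum()) / n  # Gini from Lorenz ordinates

    beds_per_region = [1200, 150, 90, 300, 60, 45, 700, 80, 40]  # placeholder
    print(f"Gini = {gini(beds_per_region):.2f}")
    ```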

  16. A Comparison of Distribution Free and Non-Distribution Free Factor Analysis Methods

    ERIC Educational Resources Information Center

    Ritter, Nicola L.

    2012-01-01

    Many researchers recognize that factor analysis can be conducted on both correlation matrices and variance-covariance matrices. Although most researchers extract factors from non-distribution free or parametric methods, researchers can also extract factors from distribution free or non-parametric methods. The nature of the data dictates the method…

  17. Analysis of Distribution Procedures Used by States to Distribute Federal Funds for Vocational Education.

    ERIC Educational Resources Information Center

    Benson, Charles S.; And Others

    An analysis of the procedures states have adopted to distribute federal funds for vocational education under the 1976 Amendments to the Vocational Education Act shows that there is widespread confusion and variation among the states. While the Act specifies that a formula must be used for distribution of funds, the exact criteria for determining…

  18. Surface flaw reliability analysis of ceramic components with the SCARE finite element postprocessor program

    NASA Technical Reports Server (NTRS)

    Gyekenyesi, John P.; Nemeth, Noel N.

    1987-01-01

    The SCARE (Structural Ceramics Analysis and Reliability Evaluation) computer program on statistical fast fracture reliability analysis with quadratic elements for volume distributed imperfections is enhanced to include the use of linear finite elements and the capability of designing against concurrent surface flaw induced ceramic component failure. The SCARE code is presently coupled as a postprocessor to the MSC/NASTRAN general purpose, finite element analysis program. The improved version now includes the Weibull and Batdorf statistical failure theories for both surface and volume flaw based reliability analysis. The program uses the two-parameter Weibull fracture strength cumulative failure probability distribution model with the principle of independent action for poly-axial stress states, and Batdorf's shear-sensitive as well as shear-insensitive statistical theories. The shear-sensitive surface crack configurations include the Griffith crack and Griffith notch geometries, using the total critical coplanar strain energy release rate criterion to predict mixed-mode fracture. Weibull material parameters based on both surface and volume flaw induced fracture can also be calculated from modulus of rupture bar tests, using the least squares method with known specimen geometry and grouped fracture data. The statistical fast fracture theories for surface flaw induced failure, along with selected input and output formats and options, are summarized. An example problem to demonstrate various features of the program is included.
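
    A minimal sketch of the principle of independent action (PIA) option mentioned above: each tensile principal stress contributes independently to the element risk of rupture. The inputs and Weibull parameters are illustrative placeholders:

    ```python
    import numpy as np

    def pia_failure_probability(principal_stresses, volumes, m, sigma0):
        """principal_stresses: (n_elements, 3) MPa; volumes: (n_elements,) mm^3."""
        s = np.clip(np.asarray(principal_stresses, dtype=float), 0.0, None)  # tension only
        risk = ((s / sigma0) ** m).sum(axis=1) * np.asarray(volumes)         # per element
        return 1.0 - np.exp(-risk.sum())

    stresses = np.array([[300.0, 120.0, -40.0], [250.0, 90.0, 10.0]])
    print(f"P_f = {pia_failure_probability(stresses, [2.0, 3.0], m=10.0, sigma0=500.0):.3e}")
    ```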

  19. Time-dependent reliability analysis of ceramic engine components

    NASA Technical Reports Server (NTRS)

    Nemeth, Noel N.

    1993-01-01

    The computer program CARES/LIFE calculates the time-dependent reliability of monolithic ceramic components subjected to thermomechanical and/or proof test loading. This program is an extension of the CARES (Ceramics Analysis and Reliability Evaluation of Structures) computer program. CARES/LIFE accounts for the phenomenon of subcritical crack growth (SCG) by utilizing either the power or Paris law relations. The two-parameter Weibull cumulative distribution function is used to characterize the variation in component strength. The effects of multiaxial stresses are modeled using either the principle of independent action (PIA), the Weibull normal stress averaging method (NSA), or the Batdorf theory. Inert strength and fatigue parameters are estimated from rupture strength data of naturally flawed specimens loaded in static, dynamic, or cyclic fatigue. Two example problems demonstrating proof testing and fatigue parameter estimation are given.

  20. Distributed bearing fault diagnosis based on vibration analysis

    NASA Astrophysics Data System (ADS)

    Dolenc, Boštjan; Boškoski, Pavle; Juričić, Đani

    2016-01-01

    Distributed bearing faults appear under various circumstances, for example due to electroerosion or the progression of localized faults. Bearings with distributed faults tend to generate more complex vibration patterns than those with localized faults. Despite the frequent occurrence of such faults, their diagnosis has attracted limited attention. This paper examines a method for the diagnosis of distributed bearing faults employing vibration analysis. The vibrational patterns generated are modeled by incorporating the geometrical imperfections of the bearing components. Comparing envelope spectra of vibration signals shows that one can distinguish between localized and distributed faults. Furthermore, a diagnostic procedure for the detection of distributed faults is proposed. This is evaluated on several bearings with naturally developed distributed faults, which are compared with fault-free bearings and bearings with localized faults. It is shown experimentally that features extracted from vibrations in fault-free, localized and distributed fault conditions form clearly separable clusters, thus enabling diagnosis.

  1. Analysis of exposure biomarker relationships with the Johnson SBB distribution.

    PubMed

    Flynn, Michael R

    2007-08-01

    Application of the Johnson bivariate S(B) distribution, or alternatively the S(BB) distribution, is presented here as a tool for the analysis of concentration data and in particular for characterizing the relationship between exposures and biomarkers. Methods for fitting the marginal S(B) distributions are enhanced by maximizing the Shapiro-Wilk W statistic. The subsequent goodness of fit for the S(BB) distribution is evaluated with a multivariate Z statistic. Median regression results are extended here with methods for calculating the mean and standard deviation of the conditional array distributions. Application of these methods to the evaluation of the relationship between exposure to airborne bromopropane and the biomarker of serum bromide concentration suggests that the S(BB) distribution may be useful in stratifying workers by exposure based on using a biomarker. A comparison with the usual two-parameter log-normal approach shows that in some cases the S(BB) distribution may offer advantages.
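
    A rough sketch of the fitting idea: the Johnson S_B transform maps bounded data toward normality, and since gamma and delta follow by standardizing, the Shapiro-Wilk W depends only on the bounds, which can therefore be chosen to maximize W. Data and candidate grids are placeholders:

    ```python
    import numpy as np
    from scipy import stats

    def sb_transform(x, xi, lam):
        # Johnson S_B transform; requires xi < min(x) and max(x) < xi + lam
        return np.log((x - xi) / (xi + lam - x))

    def fit_sb_bounds(x, xi_grid, lam_grid):
        best = (-np.inf, None, None)
        for xi in xi_grid:
            for lam in lam_grid:
                if x.min() <= xi or x.max() >= xi + lam:
                    continue  # bounds must enclose the data
                w, _ = stats.shapiro(sb_transform(x, xi, lam))
                if w > best[0]:
                    best = (w, xi, lam)
        return best

    rng = np.random.default_rng(5)
    conc = np.exp(rng.normal(0.0, 0.4, size=60))  # placeholder concentration data
    print(fit_sb_bounds(conc, np.linspace(0.0, 0.3, 7), np.linspace(2.0, 6.0, 9)))
    ```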

  2. Charge distribution analysis of catalysts under simulated reaction conditions

    SciTech Connect

    Freund, F.

    1992-01-01

    Charge Distribution Analysis (CDA) is a technique for measuring mobile charge carriers in dielectric materials. CDA is based on dielectric polarization in an electric field gradient. The CDA apparatus is now under construction.

  3. Performance analysis of static locking in replicated distributed database systems

    NASA Technical Reports Server (NTRS)

    Kuang, Yinghong; Mukkamala, Ravi

    1991-01-01

    Data replication and transaction deadlocks can severely affect the performance of distributed database systems. Many current evaluation techniques ignore these aspects because they are difficult to evaluate through analysis and time consuming to evaluate through simulation. A technique is used that combines simulation and analysis to closely illustrate the impact of deadlocks and evaluate the performance of replicated distributed databases with both shared and exclusive locks.

  4. Performance analysis of static locking in replicated distributed database systems

    NASA Technical Reports Server (NTRS)

    Kuang, Yinghong; Mukkamala, Ravi

    1991-01-01

    Data replication and transaction deadlocks can severely affect the performance of distributed database systems. Many current evaluation techniques ignore these aspects because they are difficult to evaluate through analysis and time consuming to evaluate through simulation. Here, a technique is discussed that combines simulation and analysis to closely illustrate the impact of deadlocks and evaluate the performance of replicated distributed databases with both shared and exclusive locks.

  5. Precipitator inlet particulate distribution flow analysis

    SciTech Connect

    LaRose, J.A.; Averill, A.

    1994-12-31

    The B and W Rothemuhle precipitators located at PacifiCorp's Wyodak Generating Station in Gillette, Wyoming have, for the past two years, been experiencing discharge wire breakage. The breakage is due to corrosion of the wires; however, the exact cause of the corrosion is unknown. One aspect thought to contribute to the problem is an imbalance of ash loading among the four precipitators. Plant operation has revealed that the ash loading to precipitator C appears to be the heaviest of the four casings, which also appears to have the most severe corrosion. Data from field measurements showed that the gas flows to the four precipitators are fairly uniform, within ±9% of the average. The ash loading data showed a large maldistribution among the precipitators. Precipitator C receives 60% more ash than the next most heavily loaded precipitator. A numerical model was created which showed the same results. The model was then utilized to determine design modifications to the existing flue and turning vanes to improve the ash loading distribution. The resulting design was predicted to improve the ash loading to all the precipitators, to within ±10% of the average.

  6. A Distributed, Parallel Visualization and Analysis Tool

    SciTech Connect

    2007-12-01

    VisIt is an interactive parallel visualization and graphical analysis tool for viewing scientific data on UNIX and PC platforms. Users can quickly generate visualizations from their data, animate them through time, manipulate them, and save the resulting images for presentations. VisIt contains a rich set of visualization features so that you can view your data in a variety of ways. It can be used to visualize scalar and vector fields defined on two- and three-dimensional (2D and 3D) structured and unstructured meshes. VisIt was designed to handle very large data set sizes in the terascale range and yet can also handle small data sets in the kilobyte range.

  7. A Distributed, Parallel Visualization and Analysis Tool

    2007-12-01

    VisIt is an interactive parallel visualization and graphical analysis tool for viewing scientific data on UNIX and PC platforms. Users can quickly generate visualizations from their data, animate them through time, manipulate them, and save the resulting images for presentations. VisIt contains a rich set of visualization features so that you can view your data in a variety of ways. It can be used to visualize scalar and vector fields defined on two- and three-dimensional (2D and 3D) structured and unstructured meshes. VisIt was designed to handle very large data set sizes in the terascale range and yet can also handle small data sets in the kilobyte range.

  8. Modeling and analysis of solar distributed generation

    NASA Astrophysics Data System (ADS)

    Ortiz Rivera, Eduardo Ivan

    power point tracking algorithms. Finally, several PV power applications will be presented, such as circuit analysis for a load connected to two different PV arrays, speed control for a dc motor connected to a PVM, and a novel single-phase photovoltaic inverter system using the Z-source converter.

  9. Near field light intensity distribution analysis in bimodal polymer waveguide

    NASA Astrophysics Data System (ADS)

    Herzog, T.; Gut, K.

    2015-12-01

    The paper presents an analysis of light intensity distribution and sensitivity in a differential interferometer based on a bimodal polymer waveguide. A key part is the analysis of the optimal waveguide layer thickness in the SiO2/SU-8/H2O structure for maximum bulk refractive index sensitivity. The paper presents a new approach to detecting the phase difference between modes by registering only part of the energy propagating in the waveguide. Additionally, an analysis of changes in the light distribution when the energy in the modes is not equal was performed.

  10. Comparison of the different probability distributions for earthquake hazard assessment in the North Anatolian Fault Zone

    NASA Astrophysics Data System (ADS)

    Yilmaz, Şeyda; Bayrak, Erdem; Bayrak, Yusuf

    2016-04-01

    In this study we examined and compared three different probability distributions for determining the most suitable model for the probabilistic assessment of earthquake hazards. We analyzed a reliable homogeneous earthquake catalogue covering the period 1900-2015 for magnitudes M ≥ 6.0 and estimated the probabilistic seismic hazard in the North Anatolian Fault zone (39°-41° N, 30°-40° E) using three distributions, namely the Weibull distribution, the Frechet distribution, and the three-parameter Weibull distribution. The suitability of the distribution parameters was evaluated with the Kolmogorov-Smirnov (K-S) goodness-of-fit test. We also compared the estimated cumulative probabilities and the conditional probabilities of occurrence of earthquakes for different elapsed times using these three distributions. We used Easyfit and Matlab software to calculate the distribution parameters and plotted the conditional probability curves. We concluded that the Weibull distribution was more suitable than the other distributions in this region.
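
    The model comparison can be sketched with synthetic inter-event times standing in for the catalogue; scipy's invweibull plays the role of the Frechet law, and fits are ranked by the K-S statistic:

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(6)
    # placeholder inter-event times (years)
    intervals = stats.weibull_min.rvs(1.2, scale=5.0, size=80, random_state=rng)

    models = {
        "Weibull (2-par)": (stats.weibull_min, {"floc": 0}),
        "Frechet": (stats.invweibull, {"floc": 0}),
        "Weibull (3-par)": (stats.weibull_min, {}),
    }
    for name, (dist, kw) in models.items():
        params = dist.fit(intervals, **kw)
        res = stats.kstest(intervals, dist.cdf, args=params)
        print(f"{name:16s} KS = {res.statistic:.3f}, p = {res.pvalue:.2f}")
    ```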

  11. ANALYSIS OF DISTRIBUTION FEEDER LOSSES DUE TO ADDITION OF DISTRIBUTED PHOTOVOLTAIC GENERATORS

    SciTech Connect

    Tuffner, Francis K.; Singh, Ruchi

    2011-08-09

    Distributed generators (DG) are small scale power supplying sources owned by customers or utilities and scattered throughout the power system distribution network. Distributed generation can be both renewable and non-renewable. Addition of distributed generation is primarily to increase feeder capacity and to provide peak load reduction. However, this addition comes with several impacts on the distribution feeder. Several studies have shown that addition of DG leads to reduction of feeder loss. However, most of these studies have considered lumped load and distributed load models to analyze the effects on system losses, where the dynamic variation of load due to seasonal changes is ignored. It is very important for utilities to minimize the losses under all scenarios to decrease revenue losses, promote efficient asset utilization, and therefore, increase feeder capacity. This paper will investigate an IEEE 13-node feeder populated with photovoltaic generators on detailed residential houses with water heaters, heating, ventilation, and air conditioning (HVAC) units, lights, and other plug and convenience loads. An analysis of losses for different power system components, such as transformers, underground and overhead lines, and triplex lines, will be performed. The analysis will utilize different seasons and different solar penetration levels (15%, 30%).

  12. Adaptive walks and distribution of beneficial fitness effects.

    PubMed

    Seetharaman, Sarada; Jain, Kavita

    2014-04-01

    We study the adaptation dynamics of a maladapted asexual population on rugged fitness landscapes with many local fitness peaks. The distribution of beneficial fitness effects is assumed to belong to one of the three extreme value domains, viz. Weibull, Gumbel, and Fréchet. We work in the strong selection-weak mutation regime in which beneficial mutations fix sequentially, and the population performs an uphill walk on the fitness landscape until a local fitness peak is reached. A striking prediction of our analysis is that the fitness difference between successive steps follows a pattern of diminishing returns in the Weibull domain and accelerating returns in the Fréchet domain, as the initial fitness of the population is increased. These trends are found to be robust with respect to fitness correlations. We believe that this result can be exploited in experiments to determine the extreme value domain of the distribution of beneficial fitness effects. Our work here differs significantly from the previous ones that assume the selection coefficient to be small. On taking large effect mutations into account, we find that the length of the walk shows different qualitative trends from those derived using small selection coefficient approximation.

  13. First Experiences with LHC Grid Computing and Distributed Analysis

    SciTech Connect

    Fisk, Ian

    2010-12-01

    This presentation described the experiences of the LHC experiments using grid computing, with a focus on distributed analysis. After many years of development, preparation, exercises, and validation, the LHC (Large Hadron Collider) experiments are in operation. The computing infrastructure has been heavily utilized in the first 6 months of data collection. The general experience of exploiting the grid infrastructure for organized processing and preparation is described, as well as the successes in employing the infrastructure for distributed analysis. At the end, the expected evolution and future plans are outlined.

  14. Discriminating topology in galaxy distributions using network analysis

    NASA Astrophysics Data System (ADS)

    Hong, Sungryong; Coutinho, Bruno C.; Dey, Arjun; Barabási, Albert-L.; Vogelsberger, Mark; Hernquist, Lars; Gebhardt, Karl

    2016-07-01

    The large-scale distribution of galaxies is generally analysed using the two-point correlation function. However, this statistic does not capture the topology of the distribution, and it is necessary to resort to higher order correlations to break degeneracies. We demonstrate that an alternate approach using network analysis can discriminate between topologically different distributions that have similar two-point correlations. We investigate two galaxy point distributions, one produced by a cosmological simulation and the other by a Lévy walk. For the cosmological simulation, we adopt the redshift z = 0.58 slice from Illustris and select galaxies with stellar masses greater than 108 M⊙. The two-point correlation function of these simulated galaxies follows a single power law, ξ(r) ˜ r-1.5. Then, we generate Lévy walks matching the correlation function and abundance of the simulated galaxies. We find that, while the two simulated galaxy point distributions have the same abundance and two-point correlation function, their spatial distributions are very different; most prominently, the Lévy fractals lack the filamentary structures. To quantify these missing topologies, we adopt network analysis tools and measure diameter, giant component, and transitivity from networks built by a conventional friends-of-friends recipe with various linking lengths. Unlike the abundance and two-point correlation function, these network quantities reveal a clear separation between the two simulated distributions; therefore, the galaxy distribution simulated by Illustris is not a Lévy fractal quantitatively. We find that the described network quantities offer an efficient tool for discriminating topologies and for comparing observed and theoretical distributions.
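
    The friends-of-friends network construction and the quoted network measures admit a compact sketch; the point set and linking length below are placeholders, not the Illustris data:

    ```python
    import numpy as np
    import networkx as nx
    from scipy.spatial import cKDTree

    rng = np.random.default_rng(7)
    points = rng.random((800, 3))   # placeholder galaxy positions in a unit box
    b = 0.1                         # linking length (assumed)

    # link every pair of points closer than b (friends-of-friends)
    G = nx.Graph()
    G.add_nodes_from(range(len(points)))
    G.add_edges_from(cKDTree(points).query_pairs(r=b))

    giant = G.subgraph(max(nx.connected_components(G), key=len))
    print("giant component fraction:", giant.number_of_nodes() / G.number_of_nodes())
    print("transitivity:", round(nx.transitivity(G), 3))
    print("diameter of giant component:", nx.diameter(giant))
    ```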

  15. Bayesian analysis of a disability model for lung cancer survival.

    PubMed

    Armero, C; Cabras, S; Castellanos, M E; Perra, S; Quirós, A; Oruezábal, M J; Sánchez-Rubio, J

    2016-02-01

    Bayesian reasoning, survival analysis and multi-state models are used to assess survival times for Stage IV non-small-cell lung cancer patients and the evolution of the disease over time. Bayesian estimation is done using minimum informative priors for the Weibull regression survival model, leading to an automatic inferential procedure. Markov chain Monte Carlo methods have been used for approximating posterior distributions and the Bayesian information criterion has been considered for covariate selection. In particular, the posterior distribution of the transition probabilities, resulting from the multi-state model, constitutes a very interesting tool which could be useful to help oncologists and patients make efficient and effective decisions.
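
    As a rough, frequentist stand-in for the Weibull survival model mentioned above (the paper itself is Bayesian, with minimum informative priors and MCMC), the sketch below fits a Weibull model to right-censored survival times by maximum likelihood; all data are simulated placeholders:

        import numpy as np
        from scipy.optimize import minimize

        def weibull_negloglik(params, t, event):
            # Negative log-likelihood for right-censored Weibull data.
            # event = 1: death observed at t; event = 0: censored at t.
            log_k, log_lam = params          # log scale keeps k, lam > 0
            k, lam = np.exp(log_k), np.exp(log_lam)
            z = (t / lam) ** k
            logf = np.log(k) - np.log(lam) + (k - 1) * np.log(t / lam) - z
            logS = -z                        # log survival function
            return -np.sum(event * logf + (1 - event) * logS)

        rng = np.random.default_rng(1)
        t_true = rng.weibull(1.3, 200) * 10.0      # hypothetical survival times
        c = rng.uniform(0, 15, 200)                # hypothetical censoring times
        t = np.minimum(t_true, c)
        event = (t_true <= c).astype(float)

        res = minimize(weibull_negloglik, x0=[0.0, np.log(t.mean())],
                       args=(t, event), method="Nelder-Mead")
        k_hat, lam_hat = np.exp(res.x)
        print(f"shape={k_hat:.2f}, scale={lam_hat:.2f}")

    A Bayesian version would place priors on (k, lam) and sample the posterior with MCMC instead of maximizing this likelihood.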

  16. Global NLO Analysis of Nuclear Parton Distribution Functions

    SciTech Connect

    Hirai, M.; Kumano, S.; Nagai, T.-H.

    2008-02-21

    Nuclear parton distribution functions (NPDFs) are determined by a global analysis of experimental measurements on structure-function ratios F_2^A / F_2^{A'} and Drell-Yan cross section ratios σ_DY^A / σ_DY^{A'}, and their uncertainties are estimated by the Hessian method. The NPDFs are obtained in both leading order (LO) and next-to-leading order (NLO) of α_s. As a result, valence-quark distributions are relatively well determined, whereas antiquark distributions at x > 0.2 and gluon distributions in the whole x region have large uncertainties. The NLO uncertainties are slightly smaller than the LO ones; however, such a NLO improvement is not as significant as in the nucleonic case.

  17. Energy loss analysis of an integrated space power distribution system

    NASA Technical Reports Server (NTRS)

    Kankam, M. D.; Ribeiro, P. F.

    1992-01-01

    The results of studies related to conceptual topologies of an integrated utility-like space power system are described. The system topologies are comparatively analyzed by considering their transmission energy losses as functions of, mainly, distribution voltage level and load composition. The analysis is expedited by use of Distribution System Analysis and Simulation (DSAS) software. This recently developed computer program by the Electric Power Research Institute (EPRI) uses improved load models to solve the power flow within the system. However, present shortcomings of the software with regard to space applications, and the incompletely defined characteristics of a space power system, make the results applicable only to the fundamental trends of energy losses of the topologies studied. Accounting for the effects of the various parameters on system performance, as included here, can constitute part of a planning tool for a space power distribution system.

  18. GIS-based poverty and population distribution analysis in China

    NASA Astrophysics Data System (ADS)

    Cui, Jing; Wang, Yingjie; Yan, Hong

    2009-07-01

    Geographically, poverty status is related not only to socio-economic factors but is also strongly affected by the geographical environment. In this paper, a GIS-based poverty and population distribution analysis method is introduced for revealing their regional differences. More than 100,000 poor villages and 592 national key poor counties are chosen for the analysis. The results show that poverty distribution tends to concentrate in most of west China and the mountainous rural areas of mid China. Furthermore, the fifth census data are overlaid on those poor areas in order to capture their internal diversity of socio-economic characteristics. By overlaying poverty-related socio-economic parameters, such as sex ratio, illiteracy, education level, percentage of ethnic minorities, and family composition, the findings show that poverty distribution is strongly correlated with high illiteracy rates, a high percentage of ethnic minorities, and larger family sizes.

  19. Molecular Isotopic Distribution Analysis (MIDAs) with Adjustable Mass Accuracy

    NASA Astrophysics Data System (ADS)

    Alves, Gelio; Ogurtsov, Aleksey Y.; Yu, Yi-Kuo

    2014-01-01

    In this paper, we present Molecular Isotopic Distribution Analysis (MIDAs), a new software tool designed to compute molecular isotopic distributions with adjustable accuracies. MIDAs offers two algorithms, one polynomial-based and one Fourier-transform-based, both of which compute molecular isotopic distributions accurately and efficiently. The polynomial-based algorithm contains a few novel aspects, whereas the Fourier-transform-based algorithm consists mainly of improvements to other existing Fourier-transform-based algorithms. We have benchmarked the performance of the two algorithms implemented in MIDAs with that of eight software packages (BRAIN, Emass, Mercury, Mercury5, NeutronCluster, Qmass, JFC, IC) using a consensus set of benchmark molecules. Under the proposed evaluation criteria, MIDAs's algorithms, JFC, and Emass compute with comparable accuracy the coarse-grained (low-resolution) isotopic distributions and are more accurate than the other software packages. For fine-grained isotopic distributions, we compared IC, MIDAs's polynomial algorithm, and MIDAs's Fourier transform algorithm. Among the three, IC and MIDAs's polynomial algorithm compute isotopic distributions that better resemble their corresponding exact fine-grained (high-resolution) isotopic distributions. MIDAs can be accessed freely through a user-friendly web-interface at http://www.ncbi.nlm.nih.gov/CBBresearch/Yu/midas/index.html.
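
    The polynomial approach can be pictured as repeated convolution of per-atom isotope abundance vectors; the sketch below is a naive O(n^2) illustration of that idea on a coarse (nucleon-number) grid with rounded abundances, not the optimized MIDAs implementation:

        import numpy as np

        # Mass-number offsets and abundances (rounded textbook values).
        ISOTOPES = {
            "C": [(0, 0.9893), (1, 0.0107)],                   # 12C, 13C
            "H": [(0, 0.999885), (1, 0.000115)],               # 1H, 2H
            "O": [(0, 0.99757), (1, 0.00038), (2, 0.00205)],   # 16O, 17O, 18O
        }

        def element_poly(sym):
            # Abundance "polynomial" of one atom: index = mass offset.
            offsets = [off for off, _ in ISOTOPES[sym]]
            p = np.zeros(max(offsets) + 1)
            for off, pr in ISOTOPES[sym]:
                p[off] = pr
            return p

        def isotopic_distribution(formula):
            # Each atom contributes one polynomial factor; multiplying
            # polynomials is convolution of their coefficient vectors.
            dist = np.array([1.0])
            for sym, count in formula.items():
                for _ in range(count):
                    dist = np.convolve(dist, element_poly(sym))
            return dist / dist.sum()

        d = isotopic_distribution({"C": 6, "H": 12, "O": 6})   # glucose
        for i, p in enumerate(d[:5]):
            print(f"M+{i}: {p:.4f}")

    Real tools carry exact isotope masses alongside the probabilities and use faster multiplication schemes; this sketch only conveys the convolution structure.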

  20. Analyzing Distributed Functions in an Integrated Hazard Analysis

    NASA Technical Reports Server (NTRS)

    Morris, A. Terry; Massie, Michael J.

    2010-01-01

    Large-scale integration of today's aerospace systems is achievable through the use of distributed systems. Validating the safety of distributed systems is significantly more difficult than for centralized systems because of the complexity of the interactions between simultaneously active components. Integrated hazard analysis (IHA), a process used to identify unacceptable risks and to provide a means of controlling them, can be applied to either centralized or distributed systems. IHA, though, must be tailored to fit the particular system being analyzed. Distributed systems, for instance, must be analyzed for hazards in terms of the functions that rely on them. This paper will describe systems-oriented IHA techniques (as opposed to traditional failure-event or reliability techniques) that should be employed for distributed systems in aerospace environments. Special considerations will be addressed when dealing with specific distributed systems such as active thermal control, electrical power, command and data handling, and software systems (including the interaction with fault management systems). Because of the significance of second-order effects in large-scale distributed systems, the paper will also describe how to analyze the contributions of secondary functions to other secondary functions through the use of channelization.

  1. Tense Usage Analysis in Verb Distribution in Brazilian Portuguese.

    ERIC Educational Resources Information Center

    Hoge, Henry W., Comp.

    This section of a four-part research project investigating the syntax of Brazilian Portuguese presents data concerning tense usage in verb distribution. The data are derived from the analysis of selected literary samples from representative and contemporary writers. The selection of authors and tabulation of data are also described. Materials…

  2. Can Distributed Volunteers Accomplish Massive Data Analysis Tasks?

    NASA Technical Reports Server (NTRS)

    Kanefsky, B.; Barlow, N. G.; Gulick, V. C.

    2001-01-01

    We argue that many image analysis tasks can be performed by distributed amateurs. Our pilot study, with crater surveying and classification, has produced encouraging results in terms of both quantity (100,000 crater entries in 2 months) and quality. Additional information is contained in the original extended abstract.

  3. Systematical Analysis on Angular Distribution of Bremsstrahlung Radiation

    SciTech Connect

    Otgooloi, B.; Enkhbat, N.

    2009-03-31

    A systematic analysis has been made of the measurement results for the relative angular distribution of gamma quanta with 11÷16 MeV energy, using experimental data from Ta, W, Cu, Mo, and Ti targets with various radiator thicknesses.

  4. Systematical Analysis on Angular Distribution of Bremsstrahlung Radiation

    NASA Astrophysics Data System (ADS)

    Otgooloi, B.; Enkhbat, N.

    2009-03-01

    A systematic analysis has been made of the measurement results for the relative angular distribution of gamma quanta with 11÷16 MeV energy, using experimental data from Ta, W, Cu, Mo, and Ti targets with various radiator thicknesses.

  5. Data synthesis and display programs for wave distribution function analysis

    NASA Technical Reports Server (NTRS)

    Storey, L. R. O.; Yeh, K. J.

    1992-01-01

    At the National Space Science Data Center (NSSDC), software was written to synthesize and display artificial data for use in developing the methodology of wave distribution function analysis. The software comprises two separate interactive programs, one for data synthesis and the other for data display.

  6. WATER DISTRIBUTION SYSTEM ANALYSIS: FIELD STUDIES, MODELING AND MANAGEMENT

    EPA Science Inventory

    The user's guide entitled “Water Distribution System Analysis: Field Studies, Modeling and Management” is a reference guide for water utilities and an extensive summarization of information designed to provide drinking water utility personnel (and related consultants and research...

  7. Assessing tephra total grain-size distribution: Insights from field data analysis

    NASA Astrophysics Data System (ADS)

    Costa, A.; Pioli, L.; Bonadonna, C.

    2016-06-01

    The Total Grain-Size Distribution (TGSD) of tephra deposits is crucial for hazard assessment and provides fundamental insights into eruption dynamics. It controls both the mass distribution within the eruptive plume and the sedimentation processes and can provide essential information on the fragmentation mechanisms. TGSD is typically calculated by integrating deposit grain-size at different locations. The result of such integration is affected not only by the number, but also by the spatial distribution and distance from the vent of the sampling sites. In order to evaluate the reliability of TGSDs, we assessed representative sampling distances for pyroclasts of different sizes through dedicated numerical simulations of tephra dispersal. Results reveal that, depending on wind conditions, a representative grain-size distribution of tephra deposits down to ∼100 μm can be obtained by integrating samples collected at distances from less than one tenth up to a few tens of the column height. The statistical properties of TGSDs representative of a range of eruption styles were calculated by fitting the data with a few general distributions given by the sum of two log-normal distributions (bi-Gaussian in Φ-units), the sum of two Weibull distributions, and a generalized log-logistic distribution for the cumulative number distributions. The main parameters of the bi-lognormal fitting correlate with height of the eruptive columns and magma viscosity, allowing general relationships to be used for estimating TGSD generated in a variety of eruptive styles and for different magma compositions. Fitting results of the cumulative number distribution show two different power law trends for coarse and fine fractions of tephra particles, respectively. Our results shed light on the complex processes that control the size of particles being injected into the atmosphere during volcanic explosive eruptions and represent the first attempt to assess TGSD on the basis of pivotal physical
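
    One of the general fits mentioned above, the sum of two Weibull distributions, can be sketched as a curve fit to a binned density; the grain-size sample, bin count, and initial guesses below are hypothetical, and a real analysis would fit mass fractions per Φ class rather than a synthetic sample on a linear size axis:

        import numpy as np
        from scipy.optimize import curve_fit
        from scipy.stats import weibull_min

        def bi_weibull_pdf(x, w, k1, lam1, k2, lam2):
            # Weighted sum of two Weibull densities (w = weight of comp. 1).
            return (w * weibull_min.pdf(x, k1, scale=lam1)
                    + (1 - w) * weibull_min.pdf(x, k2, scale=lam2))

        # Hypothetical grain sizes in mm, drawn from two Weibull modes
        rng = np.random.default_rng(2)
        x_obs = np.concatenate([
            weibull_min.rvs(1.5, scale=0.3, size=600, random_state=rng),
            weibull_min.rvs(2.5, scale=2.0, size=400, random_state=rng)])

        hist, edges = np.histogram(x_obs, bins=60, density=True)
        centers = 0.5 * (edges[:-1] + edges[1:])

        p0 = [0.5, 1.0, 0.5, 2.0, 1.5]   # guess: w, k1, lam1, k2, lam2
        popt, _ = curve_fit(bi_weibull_pdf, centers, hist, p0=p0,
                            bounds=([0, 0.1, 1e-3, 0.1, 1e-3],
                                    [1, 10, 10, 10, 10]))
        print("fitted [w, k1, lam1, k2, lam2]:", np.round(popt, 3))

    The bi-Gaussian fit in Φ-units and the generalized log-logistic fit described in the paper would swap in different model functions on the same scaffold.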

  8. Spatial Distribution Characteristics of Healthcare Facilities in Nanjing: Network Point Pattern Analysis and Correlation Analysis

    PubMed Central

    Ni, Jianhua; Qian, Tianlu; Xi, Changbai; Rui, Yikang; Wang, Jiechen

    2016-01-01

    The spatial distribution of urban service facilities is largely constrained by the road network. In this study, network point pattern analysis and correlation analysis were used to analyze the relationship between road network and healthcare facility distribution. The weighted network kernel density estimation method proposed in this study identifies significant differences between the outside and inside areas of the Ming city wall. The results of network K-function analysis show that private hospitals are more evenly distributed than public hospitals, and pharmacy stores tend to cluster around hospitals along the road network. After computing the correlation analysis between different categorized hospitals and street centrality, we find that the distribution of these hospitals correlates highly with the street centralities, and that the correlations are higher with private and small hospitals than with public and large hospitals. The comprehensive analysis results could help examine the reasonability of existing urban healthcare facility distribution and optimize the location of new healthcare facilities. PMID:27548197

  9. Spatial Distribution Characteristics of Healthcare Facilities in Nanjing: Network Point Pattern Analysis and Correlation Analysis.

    PubMed

    Ni, Jianhua; Qian, Tianlu; Xi, Changbai; Rui, Yikang; Wang, Jiechen

    2016-01-01

    The spatial distribution of urban service facilities is largely constrained by the road network. In this study, network point pattern analysis and correlation analysis were used to analyze the relationship between road network and healthcare facility distribution. The weighted network kernel density estimation method proposed in this study identifies significant differences between the outside and inside areas of the Ming city wall. The results of network K-function analysis show that private hospitals are more evenly distributed than public hospitals, and pharmacy stores tend to cluster around hospitals along the road network. After computing the correlation analysis between different categorized hospitals and street centrality, we find that the distribution of these hospitals correlates highly with the street centralities, and that the correlations are higher with private and small hospitals than with public and large hospitals. The comprehensive analysis results could help examine the reasonability of existing urban healthcare facility distribution and optimize the location of new healthcare facilities. PMID:27548197

  10. Generalized Exponential Distribution in Flood Frequency Analysis for Polish Rivers.

    PubMed

    Markiewicz, Iwona; Strupczewski, Witold G; Bogdanowicz, Ewa; Kochanek, Krzysztof

    2015-01-01

    Many distributions have been used in flood frequency analysis (FFA) for fitting the flood extremes data. However, as shown in the paper, the scatter of Polish data plotted on the moment ratio diagram shows that there is still room for a new model. In the paper, we study the usefulness of the generalized exponential (GE) distribution in flood frequency analysis for Polish Rivers. We investigate the fit of GE distribution to the Polish data of the maximum flows in comparison with the inverse Gaussian (IG) distribution, which in our previous studies showed the best fitting among several models commonly used in FFA. Since the use of a discrimination procedure without the knowledge of its performance for the considered probability density functions may lead to erroneous conclusions, we compare the probability of correct selection for the GE and IG distributions along with the analysis of the asymptotic model error in respect to the upper quantile values. As an application, both GE and IG distributions are alternatively assumed for describing the annual peak flows for several gauging stations of Polish Rivers. To find the best fitting model, four discrimination procedures are used. In turn, they are based on the maximized logarithm of the likelihood function (K procedure), on the density function of the scale transformation maximal invariant (QK procedure), on the Kolmogorov-Smirnov statistics (KS procedure) and the fourth procedure based on the differences between the ML estimate of 1% quantile and its value assessed by the method of moments and linear moments, in sequence (R procedure). Due to the uncertainty of choosing the best model, the method of aggregation is applied to estimate of the maximum flow quantiles. PMID:26657239
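
    A minimal sketch of fitting the GE distribution, F(x) = (1 - e^(-λx))^α, by maximum likelihood and reading off an upper quantile such as the 1% exceedance flow; the annual-maximum series is synthetic, and the paper's K/QK/KS/R discrimination procedures are not reproduced here:

        import numpy as np
        from scipy.optimize import minimize

        def ge_negloglik(params, x):
            # Negative log-likelihood of the generalized exponential (GE)
            # distribution with CDF F(x) = (1 - exp(-lam*x))**alpha.
            log_a, log_lam = params
            a, lam = np.exp(log_a), np.exp(log_lam)
            u = -np.expm1(-lam * x)      # 1 - exp(-lam*x), stable form
            return -np.sum(np.log(a) + np.log(lam) - lam * x
                           + (a - 1) * np.log(u))

        # Hypothetical annual-maximum flow series (m^3/s)
        rng = np.random.default_rng(3)
        am = rng.gamma(shape=3.0, scale=150.0, size=60)

        res = minimize(ge_negloglik, x0=[0.0, np.log(1.0 / am.mean())],
                       args=(am,), method="Nelder-Mead")
        alpha_hat, lam_hat = np.exp(res.x)

        # Upper quantile: invert F(x) = p, e.g. the 1% exceedance flow
        p = 0.99
        x_p = -np.log(1.0 - p ** (1.0 / alpha_hat)) / lam_hat
        print(f"alpha={alpha_hat:.2f}, lambda={lam_hat:.4f}, q99={x_p:.1f}")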

  11. Generalized Exponential Distribution in Flood Frequency Analysis for Polish Rivers

    PubMed Central

    Markiewicz, Iwona; Strupczewski, Witold G.; Bogdanowicz, Ewa; Kochanek, Krzysztof

    2015-01-01

    Many distributions have been used in flood frequency analysis (FFA) for fitting the flood extremes data. However, as shown in the paper, the scatter of Polish data plotted on the moment ratio diagram shows that there is still room for a new model. In the paper, we study the usefulness of the generalized exponential (GE) distribution in flood frequency analysis for Polish Rivers. We investigate the fit of GE distribution to the Polish data of the maximum flows in comparison with the inverse Gaussian (IG) distribution, which in our previous studies showed the best fitting among several models commonly used in FFA. Since the use of a discrimination procedure without the knowledge of its performance for the considered probability density functions may lead to erroneous conclusions, we compare the probability of correct selection for the GE and IG distributions along with the analysis of the asymptotic model error in respect to the upper quantile values. As an application, both GE and IG distributions are alternatively assumed for describing the annual peak flows for several gauging stations of Polish Rivers. To find the best fitting model, four discrimination procedures are used. In turn, they are based on the maximized logarithm of the likelihood function (K procedure), on the density function of the scale transformation maximal invariant (QK procedure), on the Kolmogorov-Smirnov statistics (KS procedure) and the fourth procedure based on the differences between the ML estimate of 1% quantile and its value assessed by the method of moments and linear moments, in sequence (R procedure). Due to the uncertainty of choosing the best model, the method of aggregation is applied to estimate of the maximum flow quantiles. PMID:26657239

  12. Generalized Exponential Distribution in Flood Frequency Analysis for Polish Rivers.

    PubMed

    Markiewicz, Iwona; Strupczewski, Witold G; Bogdanowicz, Ewa; Kochanek, Krzysztof

    2015-01-01

    Many distributions have been used in flood frequency analysis (FFA) for fitting the flood extremes data. However, as shown in the paper, the scatter of Polish data plotted on the moment ratio diagram shows that there is still room for a new model. In the paper, we study the usefulness of the generalized exponential (GE) distribution in flood frequency analysis for Polish Rivers. We investigate the fit of GE distribution to the Polish data of the maximum flows in comparison with the inverse Gaussian (IG) distribution, which in our previous studies showed the best fitting among several models commonly used in FFA. Since the use of a discrimination procedure without the knowledge of its performance for the considered probability density functions may lead to erroneous conclusions, we compare the probability of correct selection for the GE and IG distributions along with the analysis of the asymptotic model error in respect to the upper quantile values. As an application, both GE and IG distributions are alternatively assumed for describing the annual peak flows for several gauging stations of Polish Rivers. To find the best fitting model, four discrimination procedures are used. In turn, they are based on the maximized logarithm of the likelihood function (K procedure), on the density function of the scale transformation maximal invariant (QK procedure), on the Kolmogorov-Smirnov statistics (KS procedure) and the fourth procedure based on the differences between the ML estimate of 1% quantile and its value assessed by the method of moments and linear moments, in sequence (R procedure). Due to the uncertainty of choosing the best model, the method of aggregation is applied to estimate of the maximum flow quantiles.

  13. Integrating software architectures for distributed simulations and simulation analysis communities.

    SciTech Connect

    Goldsby, Michael E.; Fellig, Daniel; Linebarger, John Michael; Moore, Patrick Curtis; Sa, Timothy J.; Hawley, Marilyn F.

    2005-10-01

    The one-year Software Architecture LDRD (No.79819) was a cross-site effort between Sandia California and Sandia New Mexico. The purpose of this research was to further develop and demonstrate integrating software architecture frameworks for distributed simulation and distributed collaboration in the homeland security domain. The integrated frameworks were initially developed through the Weapons of Mass Destruction Decision Analysis Center (WMD-DAC), sited at SNL/CA, and the National Infrastructure Simulation & Analysis Center (NISAC), sited at SNL/NM. The primary deliverable was a demonstration of both a federation of distributed simulations and a federation of distributed collaborative simulation analysis communities in the context of the same integrated scenario, which was the release of smallpox in San Diego, California. To our knowledge this was the first time such a combination of federations under a single scenario has ever been demonstrated. A secondary deliverable was the creation of the standalone GroupMeld™ collaboration client, which uses the GroupMeld™ synchronous collaboration framework. In addition, a small pilot experiment that used both integrating frameworks allowed a greater range of crisis management options to be performed and evaluated than would have been possible without the use of the frameworks.

  14. Spatial analysis of the distribution of Lyme disease in Wisconsin.

    PubMed

    Kitron, U; Kazmierczak, J J

    1997-03-15

    Surveillance measures for human cases of Lyme disease in Wisconsin were compared and associated with tick distribution and vegetation coverage. During 1991-1994, 1,759 confirmed human cases of Lyme disease reported to the Wisconsin Division of Health were assigned a county of residence, but only 329 (19%) could be assigned with certainty a county of exposure. Distributions of cases by county of exposure and residence were often consistent from year to year. Tick distribution in 46 of 72 Wisconsin counties was mapped based on collections by researchers, statewide surveys of infested deer, and submissions from the public. Satellite data were used to calculate a normalized difference vegetation index (NDVI) for each county. A geographic information system (GIS) was used to map distributions of human Lyme disease cases, ticks, and degree of vegetation cover. Human case distribution by county of exposure was significantly correlated with tick distribution; both were positively correlated with high NDVI values in spring and fall, when wooded vegetation could be distinguished from agricultural crops in the satellite image. Statistical analysis of spatial patterns using a measure of spatial autocorrelation indicated that counties with most human cases and ticks were clustered in parts of western Wisconsin. A map delineating the counties with highest risk for Lyme disease transmission was generated based on numbers of exposed human cases and tick concentrations. PMID:9063347

  15. Spatial Distribution Analysis of Scrub Typhus in Korea

    PubMed Central

    Jin, Hong Sung; Chu, Chaeshin; Han, Dong Yeob

    2013-01-01

    Objective: This study analyzes the spatial distribution of scrub typhus in Korea. Methods: A spatial distribution of Orientia tsutsugamushi occurrence using a geographic information system (GIS) is presented and analyzed by means of spatial clustering and correlations. Results: The provinces of Gangwon-do and Gyeongsangbuk-do show a low incidence throughout the year. Some districts have almost identical environmental conditions associated with scrub typhus incidence. The land use change of districts does not directly affect the incidence rate. Conclusion: GIS analysis shows the spatial characteristics of scrub typhus. This research can be used to construct a spatial-temporal model to understand the scrub typhus epidemic. PMID:24159523

  16. Volumetric relief map for intracranial cerebrospinal fluid distribution analysis.

    PubMed

    Lebret, Alain; Kenmochi, Yukiko; Hodel, Jérôme; Rahmouni, Alain; Decq, Philippe; Petit, Éric

    2015-09-01

    Cerebrospinal fluid imaging plays a significant role in the clinical diagnosis of brain disorders, such as hydrocephalus and Alzheimer's disease. While three-dimensional images of cerebrospinal fluid are very detailed, the complex structures they contain can be time-consuming and laborious to interpret. This paper presents a simple technique that represents the intracranial cerebrospinal fluid distribution as a two-dimensional image in such a way that the total fluid volume is preserved. We call this a volumetric relief map, and show its effectiveness in a characterization and analysis of fluid distributions and networks in hydrocephalus patients and healthy adults.

  17. Analysis of georadar data to estimate the snow depth distribution

    NASA Astrophysics Data System (ADS)

    Godio, A.; Rege, R. B.

    2016-06-01

    We have performed extensive georadar surveys for mapping the snow depth in the basin of Breuil-Cervinia (Aosta Valley) in the Italian Alps, close to the Matterhorn. More than 9 km of georadar profiles were acquired in April 2008 and 15 km in April 2009, distributed over a hydrological basin of about 12 km2. Radar surveys were carried out partially on the iced area of the Ventina glacier at elevations higher than 3000 m a.s.l. and partially at lower elevations (2500 m-3000 m) on the gentle slopes of the basin, where the winter snow accumulated directly on the ground surface. The snow distribution in the basin at the end of the season can vary significantly according to elevation range, exposition, and ground morphology. In small catchments the snow depth reached 6-7 m. At higher elevations, on the glacier, a more homogeneous distribution is usually observed. A descriptive statistical analysis of the dataset is discussed to demonstrate the high spatial variability of the snow depth distribution in the area. The probability distribution of the snow depth fits the gamma distribution with a good correlation. However, we did not find any satisfactory relationship between the snow depth and the main morphological parameters of the terrain (elevation, slope, curvature). This suggests that the snow distribution at the end of the winter season is mainly conditioned by transport phenomena and redistribution by wind action. The comparison of the georadar survey results with hand probe measurements points out the low accuracy of snow depth estimates obtained in the area by conventional hand probing alone, encouraging the development of technology for fast and accurate mapping of the snow depth at the basin scale.
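
    Fitting and checking a gamma model for snow depth, as described above, might look like the short sketch below; the depth sample is simulated, and a real check would also inspect quantile-quantile plots rather than rely on a single test p-value:

        import numpy as np
        from scipy import stats

        # Hypothetical snow-depth samples (m) extracted from georadar profiles
        rng = np.random.default_rng(4)
        depth = rng.gamma(shape=4.0, scale=0.6, size=5000)

        # Fit a gamma distribution with location fixed at zero, then test fit
        a, loc, scale = stats.gamma.fit(depth, floc=0.0)
        ks = stats.kstest(depth, "gamma", args=(a, loc, scale))
        print(f"shape={a:.2f}, scale={scale:.2f}, KS p-value={ks.pvalue:.3f}")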

  18. Multi-Scale Distributed Sensitivity Analysis of Radiative Transfer Model

    NASA Astrophysics Data System (ADS)

    Neelam, M.; Mohanty, B.

    2015-12-01

    Amidst nature's great variability and complexity, the Soil Moisture Active Passive (SMAP) mission aims to provide high-resolution soil moisture products for earth science applications. One of the biggest challenges still faced by the remote sensing community is the uncertainty, heterogeneity, and scaling exhibited by soil, land cover, topography, precipitation, etc. At each spatial scale, there are different levels of uncertainty and heterogeneity. Also, each land surface variable derived from the various satellite missions comes with its own error margins. As such, soil moisture retrieval accuracy is affected as the radiative model sensitivity changes with space, time, and scale. In this paper, we explore the distributed sensitivity analysis of the radiative model under different hydro-climates and spatial scales: 1.5 km, 3 km, 9 km, and 39 km. This analysis is conducted in three different regions: Iowa, USA (SMEX02); Arizona, USA (SMEX04); and Winnipeg, Canada (SMAPVEX12). Distributed variables such as soil moisture, soil texture, vegetation, and temperature are assumed to be uncertain and are conditionally simulated to obtain uncertainty maps, whereas roughness data, which are spatially limited, are assigned a probability distribution. The relative contribution of the uncertain model inputs to the aggregated model output is also studied, using various aggregation techniques. We use global sensitivity analysis (GSA) to conduct this analysis across spatio-temporal scales. Keywords: soil moisture, radiative transfer, remote sensing, sensitivity, SMEX02, SMAPVEX12.
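
    Global sensitivity analysis of this kind often starts with Morris screening; the sketch below implements a simplified one-at-a-time variant on a toy brightness-temperature function, where the function, the three parameters, and their unit ranges are invented placeholders, not the actual radiative transfer model:

        import numpy as np

        def morris_mu_star(f, n_params, n_points=20, delta=0.1, seed=0):
            # Simplified Morris screening: at random base points, perturb
            # each parameter by delta and record the elementary effect;
            # return mu* (mean absolute elementary effect) per parameter.
            rng = np.random.default_rng(seed)
            effects = np.zeros((n_points, n_params))
            for t in range(n_points):
                x = rng.uniform(0, 1 - delta, size=n_params)
                fx = f(x)
                for i in range(n_params):
                    xp = x.copy()
                    xp[i] += delta
                    effects[t, i] = (f(xp) - fx) / delta
            return np.abs(effects).mean(axis=0)

        # Toy stand-in for the radiative model: brightness temperature as
        # a function of (soil moisture, vegetation water content, roughness)
        def toy_tb(x):
            sm, vwc, rough = x
            return 280 - 80 * sm + 15 * vwc + 5 * rough * sm

        print(np.round(morris_mu_star(toy_tb, n_params=3), 2))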

  19. Distribution selection for hydrologic frequency analysis using subsampling method

    NASA Astrophysics Data System (ADS)

    Das, S.

    2016-08-01

    This paper investigates the potential utility of subsampling, a resampling technique, used with the aid of a goodness-of-fit test to select the best distribution for frequency analysis. Subsampling draws samples (of smaller size) from the original sample without replacement. The performance of the methodology is assessed by applying it to observed annual maximum (AM) hydrologic data series. Several AM discharge series of different record lengths are used as case studies to determine the performance. Overall, it is found that the methodology is suitable for longer data series, and good performance can be obtained when the subsample size is around half of the underlying data sample. The methodology also outperformed the standard Anderson-Darling (AD) test in terms of effectively discriminating between distributions. All results indicate that the subsampling technique can be a promising tool for discriminating between distributions.
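
    A bare-bones version of such a selection loop is sketched below, tallying which candidate distribution wins a goodness-of-fit comparison across repeated half-size subsamples; the candidate set, the KS statistic (standing in for the paper's test), and the synthetic AM series are all assumptions:

        import numpy as np
        from scipy import stats

        CANDIDATES = {"gumbel_r": stats.gumbel_r,
                      "genextreme": stats.genextreme,
                      "gamma": stats.gamma}

        def select_distribution(sample, n_subsamples=200, frac=0.5, seed=0):
            # Repeatedly subsample without replacement, fit each candidate
            # on the subsample, score it by the KS statistic, tally winners.
            rng = np.random.default_rng(seed)
            m = int(frac * len(sample))
            wins = {name: 0 for name in CANDIDATES}
            for _ in range(n_subsamples):
                sub = rng.choice(sample, size=m, replace=False)
                scores = {}
                for name, dist in CANDIDATES.items():
                    params = dist.fit(sub)
                    scores[name] = stats.kstest(sub, name, args=params).statistic
                wins[min(scores, key=scores.get)] += 1
            return wins

        rng = np.random.default_rng(5)
        am_series = stats.genextreme.rvs(-0.1, loc=100, scale=30,
                                         size=80, random_state=rng)
        print(select_distribution(am_series))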

  20. Numerical analysis of the dynamics of distributed vortex configurations

    NASA Astrophysics Data System (ADS)

    Govorukhin, V. N.

    2016-08-01

    A numerical algorithm is proposed for analyzing the dynamics of distributed plane vortex configurations in an inviscid incompressible fluid. At every time step, the algorithm involves the computation of unsteady vortex flows, an analysis of the configuration structure with the help of heuristic criteria, the visualization of the distribution of marked particles and vorticity, the construction of streamlines of fluid particles, and the computation of the field of local Lyapunov exponents. The inviscid incompressible fluid dynamic equations are solved by applying a meshless vortex method. The algorithm is used to investigate the interaction of two and three identical distributed vortices with various initial positions in the flow region with and without the Coriolis force.

  1. Comparing distributions of environmental outcomes for regulatory environmental justice analysis.

    PubMed

    Maguire, Kelly; Sheriff, Glenn

    2011-05-01

    Economists have long been interested in measuring distributional impacts of policy interventions. As environmental justice (EJ) emerged as an ethical issue in the 1970s, the academic literature has provided statistical analyses of the incidence and causes of various environmental outcomes as they relate to race, income, and other demographic variables. In the context of regulatory impacts, however, there is a lack of consensus regarding what information is relevant for EJ analysis, and how best to present it. This paper helps frame the discussion by suggesting a set of questions fundamental to regulatory EJ analysis, reviewing past approaches to quantifying distributional equity, and discussing the potential for adapting existing tools to the regulatory context. PMID:21655146

  2. Distribution System Reliability Analysis for Smart Grid Applications

    NASA Astrophysics Data System (ADS)

    Aljohani, Tawfiq Masad

    Reliability of power systems is a key aspect of modern power system planning, design, and operation. The ascendance of the smart grid concept has raised hopes of developing an intelligent, self-healing network able to overcome the interruption problems that face utilities and cost them tens of millions in repairs and losses. To address these reliability concerns, power utilities and interested parties have spent an extensive amount of time and effort analyzing and studying the reliability of the generation and transmission sectors of the power grid. Only recently has attention shifted to improving the reliability of the distribution network, the connection joint between the power providers and the consumers, where most electricity problems occur. In this work, we examine the effect of smart grid applications on improving the reliability of power distribution networks. The test system used in this thesis is the IEEE 34 node test feeder, released in 2003 by the Distribution System Analysis Subcommittee of the IEEE Power Engineering Society. The objective is to analyze the feeder for the optimal placement of automatic switching devices and to quantify their proper installation based on the performance of the distribution system, measured by changes in the system reliability indices, including SAIDI, SAIFI, and EUE. A further goal is to design and simulate the effect of installing Distributed Generators (DGs) on the utility's distribution system and to measure the potential improvement of its reliability. The software used in this work is DISREL, an intelligent power distribution package developed by General Reliability Co.
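
    The indices named above have simple definitions (per IEEE 1366); a minimal sketch of computing SAIFI, SAIDI, and the derived CAIDI from an outage log follows, with a hypothetical feeder and outage data (EUE, which needs unserved-energy estimates, is omitted):

        from dataclasses import dataclass

        @dataclass
        class Outage:
            customers_interrupted: int
            duration_hours: float

        def reliability_indices(outages, customers_served):
            # SAIFI = total customer interruptions / customers served
            # SAIDI = total customer-interruption hours / customers served
            # CAIDI = SAIDI / SAIFI (average restoration time)
            total_int = sum(o.customers_interrupted for o in outages)
            total_dur = sum(o.customers_interrupted * o.duration_hours
                            for o in outages)
            saifi = total_int / customers_served
            saidi = total_dur / customers_served
            caidi = saidi / saifi if saifi else 0.0
            return saifi, saidi, caidi

        # Hypothetical one-year outage log for a feeder with 1200 customers
        log = [Outage(350, 1.5), Outage(80, 4.0), Outage(1200, 0.25)]
        saifi, saidi, caidi = reliability_indices(log, customers_served=1200)
        print(f"SAIFI={saifi:.2f}, SAIDI={saidi:.2f} h, CAIDI={caidi:.2f} h")

    Placement studies of the kind described would recompute these indices for each candidate switch or DG configuration and compare.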

  3. GIS analysis of fluvial knickzone distribution in Japanese mountain watersheds

    NASA Astrophysics Data System (ADS)

    Hayakawa, Yuichi S.; Oguchi, Takashi

    2009-10-01

    Although a knickzone, a location at which the stream gradient is locally large and intense erosion occurs, has been regarded as an important geomorphic feature in bedrock river morphology, the distribution of knickzones has not been well investigated, especially over broad areas. This study examines the distribution of fluvial knickzones along mountain rivers for the entire Japanese Archipelago. Whereas conventional manual methods of identifying knickzones based on map readings or field observations tend to be subjective and are impractical for broad-scale analysis, this study employs a semi-automated method of knickzone extraction using DEMs and GIS. In a recent study by the authors, this method was shown to enable efficient examination of knickzone distribution over a broad area. Investigations of major mountain rivers revealed that knickzones are generally abundant in steep upstream river reaches, suggesting hydraulic origins for the knickzones. The broad presence of such knickzones in the steep Japanese mountain rivers indicates that rivers subjected to active erosion show complex morphology induced by natural irregularities of water flow hydraulics as well as various environmental perturbations such as climatic changes. There also seems to be a characteristic frequency of knickzone occurrence common to moderately steep to very steep bedrock reaches in Japan. Although volcanic products such as lavas and welded pyroclastic-flow deposits in valleys can cause distinct knickzones, substrate geology plays only a limited role in determining the distribution and form of knickzones.

  4. Electrical Power Distribution and Control Modeling and Analysis

    NASA Technical Reports Server (NTRS)

    Fu, Johnny S.; Liffring, Mark; Mehdi, Ishaque S.

    2001-01-01

    This slide presentation reviews the use of Electrical Power Distribution and Control (EPD&C) Modeling and how modeling can support analysis. The presentation discusses using the EASY5 model to simulate and analyze the Space Shuttle Electric Auxiliary Power Unit. Diagrams of the model schematics are included, as well as graphs of the battery cell impedance, hydraulic load dynamics, and EPD&C response to hydraulic load variations.

  5. Automatic analysis of attack data from distributed honeypot network

    NASA Astrophysics Data System (ADS)

    Safarik, Jakub; Voznak, MIroslav; Rezac, Filip; Partila, Pavol; Tomala, Karel

    2013-05-01

    There are many ways of getting real data about malicious activity in a network. One of them relies on masquerading monitoring servers as production ones. These servers are called honeypots, and data about attacks on them bring us valuable information about actual attacks and the techniques used by hackers. The article describes a distributed topology of honeypots, which was developed with a strong orientation toward monitoring of IP telephony traffic. IP telephony servers can be easily exposed to various types of attacks, and without protection, this situation can lead to loss of money and other unpleasant consequences. Using a distributed topology with honeypots placed in different geographical locations and networks provides more valuable and independent results. With an automatic system for gathering information from all honeypots, it is possible to work with all the information at one centralized point. Communication between the honeypots and the centralized data store uses secure SSH tunnels, and the server communicates only with authorized honeypots. The centralized server also automatically analyses the data from each honeypot. The results of this analysis, along with other statistical data about malicious activity, are easily accessible through a built-in web server. All statistical and analysis reports serve as the information basis for an algorithm which classifies the different types of VoIP attacks used. The web interface then provides a tool for quick comparison and evaluation of actual attacks in all monitored networks. The article describes both the honeypot nodes in the distributed architecture, which monitor suspicious activity, and the methods and algorithms used on the server side for analysis of the gathered data.

  6. Human leptospirosis distribution pattern analysis in Hulu Langat, Selangor

    NASA Astrophysics Data System (ADS)

    Zulkifli, Zuhafiza; Shariff, Abdul Rashid Mohamed; Tarmidi, Zakri M.

    2016-06-01

    This paper discussed the distribution pattern of human leptospirosis in the Hulu Langat District, Selangor, Malaysia. The data used in this study are reported leptospirosis cases and spatial boundaries. Leptospirosis case data were collected from the Health Office of Hulu Langat, and spatial boundaries, including lot and district boundaries, were collected from the Department of Mapping and Surveying Malaysia (JUPEM). A total of 599 leptospirosis cases were reported in 2013, and these data were mapped based on the addresses provided in the case reports. This study uses three statistical methods to analyze the distribution pattern: Moran's I, average nearest neighbor (ANN), and kernel density estimation. The analysis was used to determine the spatial distribution and average distance between leptospirosis cases and to locate the hotspots. Using Moran's I analysis, the cases appeared random, with a value of -0.202816 showing that negative spatial autocorrelation exists among leptospirosis cases. The ANN analysis indicated the cases are in a clustered pattern, with an average nearest neighbor statistic of -21.80. The hotspots have also been identified and mapped within the Hulu Langat District.
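
    Moran's I itself is a short computation once a spatial weight matrix is chosen; the sketch below uses a made-up five-region adjacency matrix and case counts purely for illustration:

        import numpy as np

        def morans_i(values, weights):
            # Global Moran's I: I = (n / sum(W)) * (z'Wz) / (z'z),
            # where z is the mean-centered attribute vector.
            x = np.asarray(values, dtype=float)
            w = np.asarray(weights, dtype=float)
            z = x - x.mean()
            n = len(x)
            return (n / w.sum()) * (z @ w @ z) / (z @ z)

        # Hypothetical case counts and binary adjacency for five regions
        cases = [12, 10, 3, 2, 8]
        adjacency = np.array([[0, 1, 0, 0, 1],
                              [1, 0, 1, 0, 0],
                              [0, 1, 0, 1, 0],
                              [0, 0, 1, 0, 1],
                              [1, 0, 0, 1, 0]])
        print(f"Moran's I = {morans_i(cases, adjacency):.3f}")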

  7. Distributed and interactive visual analysis of omics data.

    PubMed

    Farag, Yehia; Berven, Frode S; Jonassen, Inge; Petersen, Kjell; Barsnes, Harald

    2015-11-01

    The amount of publicly shared proteomics data has grown exponentially over the last decade as the solutions for sharing and storing the data have improved. However, the use of the data is often limited by the manner in which it is made available. There are two main approaches: download and inspect the proteomics data locally, or interact with the data via one or more web pages. The first is limited by having to download the data and thus requires local computational skills and resources, while the latter is most often limited in terms of interactivity and the analysis options available. A solution is to develop web-based systems supporting distributed and fully interactive visual analysis of proteomics data. The use of a distributed architecture makes it possible to perform the computational analysis at the server, while the results of the analysis can be displayed via a web browser without the need to download the whole dataset. Here the challenges related to developing such systems for omics data will be discussed, especially how this allows for multiple connected interactive visual displays of omics datasets in a web-based setting, and the benefits this provides for computational analysis of proteomics data. This article is part of a Special Issue entitled: Computational Proteomics.

  8. GPS FOM Chimney Analysis using Generalized Extreme Value Distribution

    NASA Technical Reports Server (NTRS)

    Ott, Rick; Frisbee, Joe; Saha, Kanan

    2004-01-01

    Many a time, an objective of a statistical analysis is to estimate a limit value, such as a 3-sigma 95% confidence upper limit, from a data sample. The generalized extreme value (GEV) distribution method can be profitably employed in many situations for such an estimate. It is well known that, according to the central limit theorem, the mean value of a large data set is normally distributed irrespective of the distribution of the data from which the mean value is derived. In a somewhat similar fashion, it is observed that the extreme value of a data set often has a distribution that can be formulated with a generalized distribution. In space shuttle entry with 3-string GPS navigation, the Figure Of Merit (FOM) value gives a measure of GPS navigated state accuracy. A GPS navigated state with a FOM of 6 or higher is deemed unacceptable and is said to form a FOM 6 or higher chimney. A FOM chimney is a period of time during which the FOM value stays higher than 5. A longer period of FOM value 6 or higher causes the navigated state to accumulate more error for lack of state updates. For an acceptable landing it is imperative that the state error remain low; hence, at low altitude during entry, GPS data of FOM greater than 5 must not last more than 138 seconds. To test the GPS performance, many entry test cases were simulated at the Avionics Development Laboratory. Only high-value FOM chimneys are consequential, so the extreme value statistical technique is applied to analyze them. The maximum likelihood method is used to determine the parameters that characterize the GEV distribution, and then the limit value statistics are estimated.
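
    Fitting a GEV by maximum likelihood and extracting a limit value, as described, can be sketched with scipy; the simulated chimney durations and the 99.5% level are placeholders, and note that scipy's shape parameter has the opposite sign to the usual EVT convention:

        import numpy as np
        from scipy import stats

        # Hypothetical per-run maximum FOM-chimney durations (seconds)
        rng = np.random.default_rng(6)
        chimney_max = stats.genextreme.rvs(-0.2, loc=40, scale=12,
                                           size=150, random_state=rng)

        # Maximum-likelihood GEV fit, then the extreme quantile of interest
        c, loc, scale = stats.genextreme.fit(chimney_max)
        q995 = stats.genextreme.ppf(0.995, c, loc=loc, scale=scale)
        print(f"shape={c:.3f}, loc={loc:.1f}, scale={scale:.1f}")
        print(f"99.5% chimney-duration limit: {q995:.0f} s (compare to 138 s)")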

  9. Impact of nonzero boresight pointing errors on the performance of a relay-assisted free-space optical communication system over exponentiated Weibull fading channels.

    PubMed

    Wang, Ping; Liu, Xiaoxia; Cao, Tian; Fu, Huihua; Wang, Ranran; Guo, Lixin

    2016-09-20

    The impact of nonzero boresight pointing errors on the system performance of decode-and-forward protocol-based multihop parallel optical wireless communication systems is studied. For the aggregated fading channel, the atmospheric turbulence is simulated by an exponentiated Weibull model, and pointing errors are described by one recently proposed statistical model including both boresight and jitter. The binary phase-shift keying subcarrier intensity modulation-based analytical average bit error rate (ABER) and outage probability expressions are achieved for a nonidentically and independently distributed system. The ABER and outage probability are then analyzed with different turbulence strengths, receiving aperture sizes, structure parameters (P and Q), jitter variances, and boresight displacements. The results show that aperture averaging offers almost the same system performance improvement with boresight included or not, despite the values of P and Q. The performance enhancement owing to the increase of cooperative path (P) is more evident with nonzero boresight than that with zero boresight (jitter only), whereas the performance deterioration because of the increasing hops (Q) with nonzero boresight is almost the same as that with zero boresight. Monte Carlo simulation is offered to verify the validity of ABER and outage probability expressions.
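
    For the turbulence part of the aggregated channel above, the exponentiated Weibull irradiance has CDF F(I) = (1 - exp(-(I/η)^β))^α, which is easy to sample by inverse transform; the sketch below Monte-Carlo-averages a BPSK-SIM conditional bit error rate over the fading only, with invented parameter values and no pointing errors, whereas the paper's analysis is fully analytical and includes boresight and jitter:

        import numpy as np
        from scipy.special import erfc

        def ew_sample(alpha, beta, eta, size, rng):
            # Inverse-transform sampling of exponentiated Weibull fading:
            # F(I) = (1 - exp(-(I/eta)**beta))**alpha  =>  invert at u.
            u = rng.random(size)
            return eta * (-np.log1p(-u ** (1.0 / alpha))) ** (1.0 / beta)

        def aber_bpsk_sim(alpha, beta, eta, snr, n=2_000_000, seed=0):
            # Conditional BER Q(sqrt(snr)*I), averaged over the fading;
            # Q(x) = 0.5 * erfc(x / sqrt(2)).
            rng = np.random.default_rng(seed)
            i_fade = ew_sample(alpha, beta, eta, n, rng)
            return np.mean(0.5 * erfc(np.sqrt(snr) * i_fade / np.sqrt(2.0)))

        # Hypothetical EW parameters for moderate turbulence
        print(f"ABER ~ {aber_bpsk_sim(alpha=2.1, beta=1.8, eta=0.9, snr=10.0):.2e}")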

  10. Impact of nonzero boresight pointing errors on the performance of a relay-assisted free-space optical communication system over exponentiated Weibull fading channels.

    PubMed

    Wang, Ping; Liu, Xiaoxia; Cao, Tian; Fu, Huihua; Wang, Ranran; Guo, Lixin

    2016-09-20

    The impact of nonzero boresight pointing errors on the system performance of decode-and-forward protocol-based multihop parallel optical wireless communication systems is studied. For the aggregated fading channel, the atmospheric turbulence is simulated by an exponentiated Weibull model, and pointing errors are described by one recently proposed statistical model including both boresight and jitter. The binary phase-shift keying subcarrier intensity modulation-based analytical average bit error rate (ABER) and outage probability expressions are achieved for a nonidentically and independently distributed system. The ABER and outage probability are then analyzed with different turbulence strengths, receiving aperture sizes, structure parameters (P and Q), jitter variances, and boresight displacements. The results show that aperture averaging offers almost the same system performance improvement with boresight included or not, despite the values of P and Q. The performance enhancement owing to the increase of cooperative path (P) is more evident with nonzero boresight than that with zero boresight (jitter only), whereas the performance deterioration because of the increasing hops (Q) with nonzero boresight is almost the same as that with zero boresight. Monte Carlo simulation is offered to verify the validity of ABER and outage probability expressions. PMID:27661587

  11. Thermographic Analysis of Stress Distribution in Welded Joints

    NASA Astrophysics Data System (ADS)

    Piršić, T.; Krstulović Opara, L.; Domazet, Ž.

    2010-06-01

    The fatigue life prediction of welded joints based on S-N curves in conjunction with nominal stresses is generally not reliable. The stress distribution in the welded area, affected by geometrical inhomogeneity, the irregular welded surface, and the weld toe radius, is quite complex, so the local (structural) stress concept has been adopted in recent papers. The aim of this paper is to determine the stress distribution in plate-type aluminum welded joints, to analyze the reliability of TSA (Thermal Stress Analysis) in this kind of investigation, and to obtain numerical values of stress concentration factors for practical use. The stress distribution in aluminum butt and fillet welded joints is determined using three different methods: strain gauge measurement, thermal stress analysis, and FEM. The obtained results show good agreement: the TSA confirmed both the FEM model and the stresses measured by strain gauges. According to these results, it may be stated that TSA, as a relatively new measurement technique, may in the future become a standard tool for the experimental investigation of stress concentration and fatigue in welded joints and can help to develop more accurate numerical tools for fatigue life prediction.

  12. Componential distribution analysis of food using near infrared ray image

    NASA Astrophysics Data System (ADS)

    Yamauchi, Hiroki; Kato, Kunihito; Yamamoto, Kazuhiko; Ogawa, Noriko; Ohba, Kimie

    2008-11-01

    The components of food related to its "deliciousness" are usually evaluated by componential analysis, which determines the content and type of components in the food. However, componential analysis cannot resolve the measurements spatially, and the measurement is time-consuming. We propose a method to measure the two-dimensional distribution of a component in food using a near-infrared (IR) image. The advantage of our method is the ability to visualize invisible components. Many components in food have characteristics such as absorption and reflection of light in the IR range, so the component content can be measured using subtraction between images at two wavelengths of near-IR light. In this paper, we describe this method of measuring food components using near-IR image processing, and we show an application visualizing the saccharose distribution in pumpkin.

  13. Iterative Monte Carlo analysis of spin-dependent parton distributions

    NASA Astrophysics Data System (ADS)

    Sato, Nobuo; Melnitchouk, W.; Kuhn, S. E.; Ethier, J. J.; Accardi, A.; Jefferson Lab Angular Momentum Collaboration

    2016-04-01

    We present a comprehensive new global QCD analysis of polarized inclusive deep-inelastic scattering, including the latest high-precision data on longitudinal and transverse polarization asymmetries from Jefferson Lab and elsewhere. The analysis is performed using a new iterative Monte Carlo fitting technique which generates stable fits to polarized parton distribution functions (PDFs) with statistically rigorous uncertainties. Inclusion of the Jefferson Lab data leads to a reduction in the PDF errors for the valence and sea quarks, as well as in the gluon polarization uncertainty at x ≳0.1 . The study also provides the first determination of the flavor-separated twist-3 PDFs and the d2 moment of the nucleon within a global PDF analysis.

  14. Cost analysis of gas distribution industry with spatial variables

    SciTech Connect

    Kim, Tai-Yoo; Lee, Jeong-Dong

    1995-12-31

    Cost assessment is important in the regulatory process, but it is not easy to carry out, especially for the distribution sector, because spatial conditions as well as output quantity play a major role in determining cost. The hedonic cost function is introduced to incorporate spatial characteristics (or network configurations) in the analysis of the cost behavior of the Korean gas industry. The findings in this paper are that (1) almost all of the firms are exhausting their scale economies, (2) the average cost trend can be expressed as a surface of output quantity and spatial characteristics, and (3) an imaginary firm's cost trend can be derived by the regression approach. Industries related to electricity (water, railroad, telecommunications, etc.) have the same cost property as the gas distribution industry, and the basic results and methodology of this paper would be applicable to those industries.

  15. Spatial Distribution Balance Analysis of Hospitals in Wuhan

    PubMed Central

    Yang, Nai; Chen, Shiyi; Hu, Weilu; Wu, Zhongheng; Chao, Yi

    2016-01-01

    The spatial distribution pattern of hospitals in Wuhan indicates a core in the central urban areas and a sparse distribution in the suburbs, particularly at the centers of the suburbs. This study improves the gravity and Huff models to analyze healthcare accessibility and resources. Results indicate that healthcare accessibility in the central urban areas is better than in the suburbs, worsening progressively toward the outer suburbs. A shortage of healthcare resources is observed in large-scale and high-class hospitals in the central urban areas, whereas the resources of some hospitals in the suburbs are redundant. This study proposes a multi-criteria evaluation (MCE) analysis model for location assessment in constructing new hospitals, which can effectively improve healthcare accessibility in suburban areas. This study presents implications for the planning of urban healthcare facilities. PMID:27706069
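
    The Huff model mentioned above assigns each demand point a visit probability proportional to hospital attractiveness divided by a power of distance; here is a tiny sketch with invented coordinates, bed counts, and decay exponent:

        import numpy as np

        def huff_probabilities(demand_xy, hospital_xy, attractiveness, decay=2.0):
            # Huff model: P(i visits j) proportional to S_j / d_ij**decay,
            # normalized over all hospitals j for each demand point i.
            d = np.linalg.norm(demand_xy[:, None, :] - hospital_xy[None, :, :],
                               axis=2)
            util = attractiveness[None, :] / np.maximum(d, 1e-6) ** decay
            return util / util.sum(axis=1, keepdims=True)

        # Hypothetical coordinates (km) and hospital sizes (beds)
        demand = np.array([[0.0, 0.0], [5.0, 2.0], [12.0, 8.0]])
        hospitals = np.array([[1.0, 1.0], [10.0, 9.0]])
        beds = np.array([800.0, 250.0])

        p = huff_probabilities(demand, hospitals, beds)
        print(np.round(p, 3))   # one row per demand point; rows sum to 1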

  16. Performance Analysis of an Actor-Based Distributed Simulation

    NASA Technical Reports Server (NTRS)

    Schoeffler, James D.

    1998-01-01

    Object-oriented design of simulation programs appears to be very attractive because of the natural association of components in the simulated system with objects. There is great potential in distributing the simulation across several computers for the purpose of parallel computation and its consequent handling of larger problems in less elapsed time. One approach to such a design is to use "actors", that is, active objects with their own thread of control. Because these objects execute concurrently, communication is via messages. This is in contrast to an object-oriented design using passive objects where communication between objects is via method calls (direct calls when they are in the same address space and remote procedure calls when they are in different address spaces or different machines). This paper describes a performance analysis program for the evaluation of a design for distributed simulations based upon actors.

  17. Time series power flow analysis for distribution connected PV generation.

    SciTech Connect

    Broderick, Robert Joseph; Quiroz, Jimmy Edward; Ellis, Abraham; Reno, Matthew J.; Smith, Jeff; Dugan, Roger

    2013-01-01

    Distributed photovoltaic (PV) projects must go through an interconnection study process before connecting to the distribution grid. These studies are intended to identify the likely impacts and mitigation alternatives. In the majority of the cases, system impacts can be ruled out or mitigation can be identified without an involved study, through a screening process or a simple supplemental review study. For some proposed projects, expensive and time-consuming interconnection studies are required. The challenges to performing the studies are twofold. First, every study scenario is potentially unique, as the studies are often highly specific to the amount of PV generation capacity that varies greatly from feeder to feeder and is often unevenly distributed along the same feeder. This can cause location-specific impacts and mitigations. The second challenge is the inherent variability in PV power output which can interact with feeder operation in complex ways, by affecting the operation of voltage regulation and protection devices. The typical simulation tools and methods in use today for distribution system planning are often not adequate to accurately assess these potential impacts. This report demonstrates how quasi-static time series (QSTS) simulation and high time-resolution data can be used to assess the potential impacts in a more comprehensive manner. The QSTS simulations are applied to a set of sample feeders with high PV deployment to illustrate the usefulness of the approach. The report describes methods that can help determine how PV affects distribution system operations. The simulation results are focused on enhancing the understanding of the underlying technical issues. The examples also highlight the steps needed to perform QSTS simulation and describe the data needed to drive the simulations. The goal of this report is to make the methodology of time series power flow analysis readily accessible to utilities and others responsible for evaluating the impacts of distributed PV on the distribution system.
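
    In spirit, a QSTS run steps a power flow through time-varying load and PV profiles and records quantities such as feeder voltage; the sketch below uses a one-line voltage-drop approximation with invented per-unit impedances and profiles, where a real study would drive a full distribution power flow engine at each time step:

        import numpy as np

        # Toy QSTS: end-of-feeder voltage via V ~ V0 - (P*R + Q*X)/V0 (pu)
        V0, R, X = 1.0, 0.05, 0.03              # assumed source and impedance
        t = np.arange(0, 24, 0.25)              # one day, 15-minute steps

        load_p = 0.6 + 0.3 * np.sin((t - 12) / 24 * 2 * np.pi)  # evening peak
        pv_p = np.clip(np.sin((t - 6) / 12 * np.pi), 0, None) * 0.8  # daytime PV

        net_p = load_p - pv_p                   # negative => reverse flow
        net_q = 0.2 * load_p                    # assumed constant power factor
        v = V0 - (net_p * R + net_q * X) / V0

        print(f"min V = {v.min():.3f} pu at t = {t[v.argmin()]:.2f} h")
        print(f"max V = {v.max():.3f} pu at t = {t[v.argmax()]:.2f} h")

    Even this toy shows the characteristic midday voltage rise under reverse power flow that interacts with regulator and protection settings.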

  18. Distributed analysis environment for HEP and interdisciplinary applications

    NASA Astrophysics Data System (ADS)

    Mościcki, J. T.

    2003-04-01

    The huge data volumes of the Large Hadron Collider experiments require parallel end-user analysis on clusters of hundreds of machines. While the focus of end-user High-Energy Physics analysis is on ntuples, the master-worker model of parallel processing may also be used in other contexts such as detector simulation. The aim of the DIANE R&D project (http://cern.ch/it-proj-diane), currently held by the CERN IT/API group, is to create a generic, component-based framework for distributed, parallel data processing in the master-worker model. Pre-compiled user analysis code is loaded dynamically at runtime in component libraries and called back when appropriate. Such an application-oriented framework must be flexible enough to integrate with the emerging GRID technologies as they become available in the time to come. Therefore, common services such as environment reconstruction, code distribution, load balancing, and authentication are designed and implemented as pluggable modules. This approach allows them to be easily replaced with modules implemented with newer technology as necessary. The paper gives an overview of the DIANE architecture and explains the main design choices. Selected examples of diverse applications from a variety of domains applicable to DIANE are presented, along with preliminary benchmarking results.

  19. Multiobjective sensitivity analysis and optimization of distributed hydrologic model MOBIDIC

    NASA Astrophysics Data System (ADS)

    Yang, J.; Castelli, F.; Chen, Y.

    2014-10-01

    Calibration of distributed hydrologic models usually involves dealing with a large number of distributed parameters and with optimization problems whose multiple objectives often conflict. This study presents a multiobjective sensitivity and optimization approach to handle these problems for the MOBIDIC (MOdello di Bilancio Idrologico DIstribuito e Continuo) distributed hydrologic model, which combines two sensitivity analysis techniques (the Morris method and the state-dependent parameter (SDP) method) with the multiobjective optimization (MOO) approach ɛ-NSGAII (Non-dominated Sorting Genetic Algorithm-II). This approach was implemented to calibrate MOBIDIC in its application to the Davidson watershed, North Carolina, with three objective functions: the standardized root mean square error (SRMSE) of logarithmic transformed discharge, the water balance index, and the mean absolute error of the logarithmic transformed flow duration curve. The results were compared with those of a single objective optimization (SOO) using the traditional Nelder-Mead simplex algorithm in MOBIDIC, taking the objective function as the Euclidean norm of these three objectives. Results show that (1) the two sensitivity analysis techniques are effective and efficient for determining the sensitive processes and insensitive parameters: surface runoff and evaporation are very sensitive processes for all three objective functions, while groundwater recession and soil hydraulic conductivity are not sensitive and were excluded from the optimization. (2) Both MOO and SOO lead to acceptable simulations; e.g., for MOO, the average Nash-Sutcliffe value is 0.75 in the calibration period and 0.70 in the validation period. (3) Evaporation and surface runoff show similar importance for the watershed water balance, while the contribution of baseflow can be ignored. (4) Compared to SOO, which was dependent on the initial starting location, MOO provides more
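
    The building block of Pareto-based MOO methods such as ɛ-NSGAII is the non-domination test. The sketch below shows only that building block applied to random candidate scores; it is not the ɛ-NSGAII implementation used with MOBIDIC.

        # Minimal non-domination filter, the core of Pareto-based MOO methods.
        import numpy as np

        def pareto_front(objectives):
            """objectives: (n_solutions, n_objectives), all minimized.
            Returns a boolean mask of non-dominated solutions."""
            n = objectives.shape[0]
            mask = np.ones(n, dtype=bool)
            for i in range(n):
                if not mask[i]:
                    continue
                # j dominates i if j is <= in every objective, < in at least one
                dominated = np.all(objectives <= objectives[i], axis=1) & \
                            np.any(objectives < objectives[i], axis=1)
                if dominated.any():
                    mask[i] = False
            return mask

        # Example: 200 random candidate calibrations scored on 3 error metrics
        rng = np.random.default_rng(1)
        scores = rng.random((200, 3))  # e.g., SRMSE, water balance, FDC error
        front = pareto_front(scores)
        print(f"{front.sum()} non-dominated solutions out of {len(scores)}")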

  20. The analysis and distribution of mescaline in postmortem tissues.

    PubMed

    Henry, Joni L; Epley, Jahna; Rohrig, Timothy P

    2003-09-01

    Mescaline (3,4,5-trimethoxyphenethylamine) is a hallucinogenic alkaloid found in the peyote cactus. This report documents mescaline distribution in a death caused by multiple gunshot wounds. Mescaline was extracted with a butyl chloride liquid-liquid method and identified by mass spectrometry. Quantitative analysis was performed by gas chromatography using a nitrogen-phosphorus detector. Concentrations of the drug were 2.95 mg/L, 2.36 mg/L, 8.2 mg/kg, and 2.2 mg/kg in blood, vitreous, liver, and brain, respectively.

  1. Numerical analysis of decoy state quantum key distribution protocols

    SciTech Connect

    Harrington, Jim W; Rice, Patrick R

    2008-01-01

    Decoy state protocols are a useful tool for many quantum key distribution systems implemented with weak coherent pulses, allowing significantly better secret bit rates and longer maximum distances. In this paper we present a method to numerically find optimal three-level protocols, and we examine how the secret bit rate and the optimized parameters are dependent on various system properties, such as session length, transmission loss, and visibility. Additionally, we show how to modify the decoy state analysis to handle partially distinguishable decoy states as well as uncertainty in the prepared intensities.

  2. Earthquake interevent time distribution in Kachchh, Northwestern India

    NASA Astrophysics Data System (ADS)

    Pasari, Sumanta; Dikshit, Onkar

    2015-08-01

    Statistical properties of earthquake interevent times have long been a topic of interest to seismologists and earthquake professionals, mainly for hazard-related concerns. In this paper, we present a comprehensive study of the temporal statistics of earthquake interoccurrence times of the seismically active Kachchh peninsula (western India) using thirteen probability distributions: exponential, gamma, lognormal, Weibull, Levy, Maxwell, Pareto, Rayleigh, inverse Gaussian (Brownian passage time), inverse Weibull (Frechet), exponentiated exponential, exponentiated Rayleigh (Burr type X), and exponentiated Weibull. Statistical inference for the scale and shape parameters of these distributions is based on maximum likelihood estimation and the Fisher information matrices; the latter are used as a surrogate tool to appraise the parametric uncertainty in the estimation process. Model selection was based on two goodness-of-fit criteria: the maximum likelihood criterion with its modification to the Akaike information criterion (AIC), and the Kolmogorov-Smirnov (K-S) minimum distance criterion. These results reveal that (i) the exponential model provides the best fit, (ii) the gamma, lognormal, Weibull, inverse Gaussian, exponentiated exponential, exponentiated Rayleigh, and exponentiated Weibull models provide an intermediate fit, and (iii) the rest, namely Levy, Maxwell, Pareto, Rayleigh, and inverse Weibull, fit poorly to the earthquake catalog of Kachchh and its adjacent regions. This study also analyzes the present-day seismicity in terms of the estimated recurrence interval and conditional probability curves (hazard curves). The estimated cumulative probability and the conditional probability of a magnitude 5.0 or higher event reach 0.8-0.9 by 2027-2036 and 2034-2043, respectively. These values have significant implications in a variety of practical applications including earthquake insurance, seismic zonation
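
    The model-selection recipe described here (maximum likelihood fits ranked by AIC and the K-S distance) can be sketched with SciPy for a few of the thirteen candidates. The interevent times below are synthetic, not the Kachchh catalog.

        # Fit candidate interevent-time distributions and rank by AIC and K-S.
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(42)
        interevent_days = rng.exponential(scale=120.0, size=300)  # synthetic

        candidates = {
            "exponential": stats.expon,
            "gamma": stats.gamma,
            "weibull": stats.weibull_min,
            "lognormal": stats.lognorm,
        }

        for name, dist in candidates.items():
            params = dist.fit(interevent_days, floc=0)  # fix location at zero
            loglik = np.sum(dist.logpdf(interevent_days, *params))
            k = len(params) - 1                         # free parameters
            aic = 2 * k - 2 * loglik
            ks = stats.kstest(interevent_days, dist.cdf, args=params).statistic
            print(f"{name:12s}  AIC={aic:9.1f}  K-S={ks:.4f}")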

  3. Modal distribution analysis of vibrato in musical signals

    NASA Astrophysics Data System (ADS)

    Mellody, Maureen; Wakefield, Gregory H.

    1998-10-01

    Due to the nonstationary nature of vibrato notes, standard Fourier analysis techniques may not sufficiently characterize the partials of notes undergoing vibrato. Our study employs the modal distribution, a bilinear time-frequency representation, to analyze vibrato signals. Instantaneous frequency and amplitude values for each partial are extracted using Hilbert techniques applied to local neighborhoods of the time-frequency surface. We consider vibrato in violin and vocal performance. Our study confirms the presence of both amplitude modulation and frequency modulation in the partials of notes generated by each of these instruments, and provides a fine-grained analysis of these variations. In addition, we show that these instantaneous amplitude and frequency estimates can be incorporated into methods for synthesizing signals that perceptually resemble the original sampled sounds.
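
    The Hilbert step, extracting instantaneous amplitude and frequency of one partial, can be sketched as follows. The analytic signal of a synthetic vibrato tone stands in for a partial isolated from the time-frequency surface; the modal distribution itself is not reproduced.

        # Instantaneous amplitude/frequency of a vibrato tone via the
        # analytic signal (SciPy's Hilbert transform).
        import numpy as np
        from scipy.signal import hilbert

        fs = 8000.0
        t = np.arange(0, 1.0, 1 / fs)
        # 440 Hz partial, 6 Hz vibrato of +/-8 Hz, slight tremolo
        phase = 2 * np.pi * 440 * t + (8 / 6) * np.sin(2 * np.pi * 6 * t)
        x = (1 + 0.2 * np.sin(2 * np.pi * 6 * t)) * np.cos(phase)

        analytic = hilbert(x)
        inst_amp = np.abs(analytic)
        inst_freq = np.diff(np.unwrap(np.angle(analytic))) * fs / (2 * np.pi)

        print(f"frequency swing: {inst_freq.min():.1f} .. {inst_freq.max():.1f} Hz")
        print(f"amplitude swing: {inst_amp.min():.2f} .. {inst_amp.max():.2f}")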

  4. Distribution of Deformation on Cyprus, Inferences from Morphotectonic Analysis

    NASA Astrophysics Data System (ADS)

    Altinbas, Cevza; Yildirim, Cengiz; Tuysuz, Okan; Melnick, Daniel

    2016-04-01

    Cyprus is located on the subduction zone between the African and Anatolian Plates. The topography of the island results from distributed deformation associated with subduction-related processes south of the Central Anatolian Plateau. The Troodos and Kyrenia mountains are the major morphotectonic units, integrally tied to plate-boundary deformation. To elucidate the mode and pattern of active deformation and the possible effects of subduction-related processes on topography, we integrated morphometric and topographic analyses across the island. Our regional morphometric analyses rely on topographic swath profiles and topographic residuals to identify regional topographic anomalies, as well as on steepness and concavity values of longitudinal river profiles that may reflect ongoing uplift. Accordingly, our swath profiles indicate an asymmetric topography across the Troodos Massif and Kyrenia Range. The southern side of the Troodos Massif shows relatively less dissected surfaces, partly associated with Quaternary marine terraces. Our topographic residual analysis also indicates a strong relief asymmetry on the Troodos Massif that might be related to the Arakapas Fault and the lithological contact between Neogene and pre-Neogene rocks. In the north of the island, the Kyrenia Range is a narrow, steep, elongated range delimited by the Ovgos Fault in the south. Our swath profiles across the range also display a strong southward asymmetry: the southern flank is steeper than the northern flank. The steepness index values of rivers on the southern flank of the Kyrenia Range do not give a strong signal along the Ovgos Fault. Nevertheless, longitudinal profiles of rivers on the northern flank reveal evident deviations from degraded river profiles. Together with the presence of uplifted marine terraces along the northern flank, this might indicate the presence of onshore structure(s) responsible for coastal uplift or regional uplift of the island because of

  5. BME analysis of spatiotemporal particulate matter distributions in North Carolina

    NASA Astrophysics Data System (ADS)

    Christakos, George; Serre, Marc L.

    Spatiotemporal maps of particulate matter (PM) concentrations contribute considerably to the understanding of the underlying natural processes and the adequate assessment of the PM health effects. These maps should be derived using an approach that combines rigorous mathematical formulation with sound science. To achieve such a task, the PM10 distribution in the state of North Carolina is studied using the Bayesian maximum entropy (BME) mapping method. This method is based on a realistic representation of the spatiotemporal domain, which can integrate rigorously and efficiently various forms of physical knowledge and sources of uncertainty. BME offers a complete characterization of PM10 concentration patterns in terms of multi-point probability distributions and allows considerable flexibility regarding the choice of the appropriate concentration estimates. The PM10 maps show significant variability both spatially and temporally, a finding that may be associated with geographical characteristics, climatic changes, seasonal patterns, and random fluctuations. The inherently spatiotemporal nature of PM10 variation is demonstrated by means of theoretical considerations as well as in terms of the more accurate PM10 predictions of composite space/time analysis compared to spatial estimation. It is shown that the study of PM10 distributions in North Carolina can be improved by properly incorporating uncertain data into the mapping process, whereas more informative estimates are generated by considering soft data at the estimation points. Uncertainty maps illustrate the significance of stochastic PM10 characterization in space/time, and identify limitations associated with inadequate interpolation techniques. Stochastic PM10 analysis has important applications in the optimization of monitoring networks in space and time, environmental risk assessment, health management and administration, etc.

  6. Specimen type and size effects on lithium hydride tensile strength distributions

    SciTech Connect

    Oakes, Jr, R E

    1991-12-01

    Weibull's two-parameter statistical distribution function is used to account for the effects of specimen size and loading differences on the strength distributions of lithium hydride. Three distinctly differing uniaxial specimen types (an elliptical-transition pure tensile specimen, an internally pressurized ring-tensile specimen, and two sizes of four-point-flexure specimens) are shown to provide different strength distributions as expected, because of their differing sizes and modes of loading. After separation of strengths into volumetric- and surface-initiated failure distributions, the Weibull characteristic strength parameters for the higher-strength tests associated with internal fracture initiations are shown to vary as predicted by the effective-volume Weibull relationship. Lower-strength results correlate with the effective area to a much lesser degree, probably because of the limited number of surface-related failures and the different machining methods used to prepare the specimens. The strength distribution from the fourth specimen type, the predominantly equibiaxially stressed disk-flexure specimen, is well below that predicted by the two-parameter Weibull-derived effective volume or surface area relations. The two-parameter Weibull model cannot account for the increased failure probability associated with multiaxial stress fields. Derivations of effective volume and area relationships for those specimens for which none were found in the literature (the elliptical-transition tensile, the ring tensile, and the disk flexure, including the outer region) are also included.
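
    The effective-volume argument follows from the two-parameter Weibull model: characteristic strengths of two geometries scale as the effective-volume ratio to the power 1/m. A worked example with assumed numbers, not measured LiH data:

        # Two-parameter Weibull effective-volume relation:
        # sigma_1 / sigma_2 = (V_eff2 / V_eff1)**(1/m). Numbers are assumed.
        m = 8.0            # assumed Weibull modulus (shape parameter)
        sigma_1 = 100.0    # characteristic strength of specimen 1, MPa
        V_eff1 = 50.0      # effective volume of specimen 1, mm^3
        V_eff2 = 400.0     # effective volume of specimen 2, mm^3

        sigma_2 = sigma_1 * (V_eff1 / V_eff2) ** (1.0 / m)
        print(f"predicted characteristic strength: {sigma_2:.1f} MPa")
        # 8x the effective volume -> strength drops by 8**(1/8), about 1.30x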

  7. Effects of specimen size on the flexural strength and Weibull modulus of nuclear graphite IG-110, NBG-18, and PCEA

    NASA Astrophysics Data System (ADS)

    Chi, Se-Hwan

    2015-09-01

    Changes in flexural strength and Weibull modulus due to specimen size were investigated for three nuclear graphite grades, IG-110, NBG-18, and PCEA, using four-point-1/3 point (4-1/3) loading with specimens of three different sizes: 3.18 (thickness) × 6.35 (width) × 50.8 (length), 6.50 (T) × 12.0 (W) × 52.0 (L), and 18.0 (T) × 16.0 (W) × 64 (L) mm (210 specimens in total). Results showed that the specimen size effects were grade dependent: while NBG-18 showed rather significant specimen size effects (a 37% difference between the 3 T and 18 T specimens), the differences for IG-110 and PCEA were 7.6-15%. The maximum differences in flexural strength due to specimen size were larger in PCEA and NBG-18, which have larger coke particles (medium grain size: >300 μm), than in IG-110 with its super fine coke particle size (25 μm). The Weibull modulus showed a data population dependency, in that it decreased with an increasing number of data points used for modulus determination. A good correlation between fracture surface roughness and flexural strength was confirmed.
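
    A standard way to obtain a Weibull modulus from a set of flexural strengths is median-rank regression on a Weibull probability plot, where the slope of the fitted line is the modulus. A sketch with synthetic data, not the measured IG-110/NBG-18/PCEA values:

        # Weibull modulus via median-rank linear regression.
        import numpy as np

        rng = np.random.default_rng(7)
        strengths = np.sort(30.0 * rng.weibull(10.0, size=30))  # MPa, synthetic

        n = len(strengths)
        ranks = np.arange(1, n + 1)
        F = (ranks - 0.3) / (n + 0.4)          # median-rank failure probability
        x = np.log(strengths)
        y = np.log(-np.log(1.0 - F))

        m, intercept = np.polyfit(x, y, 1)     # slope = Weibull modulus
        sigma0 = np.exp(-intercept / m)        # characteristic strength
        print(f"Weibull modulus m ~ {m:.1f}, characteristic strength ~ {sigma0:.1f} MPa")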

  8. Lacunarity and multifractal analysis of the large DLA mass distribution

    NASA Astrophysics Data System (ADS)

    Rodriguez-Romo, Suemi; Sosa-Herrera, Antonio

    2013-08-01

    We show the methodology used to analyze fractal and mass-multifractal properties of very large Diffusion-Limited Aggregation (DLA) clusters with a maximum of 10^9 particles for 2D aggregates and 10^8 particles for 3D clusters, to support our main result: the scaling behavior obtained in our experiments corresponds to the expected behavior of monofractal objects. In order to estimate lacunarity measures for large DLA clusters, we develop a variant of the gliding-box algorithm which reduces the computer time needed to obtain experimental results. We show how our mass-multifractal data tend toward monofractal behavior of the mass distribution in the limit of very large clusters. Lacunarity analysis of small-cluster mass distributions yields data which might be interpreted as two different values of the fractal dimension while the cluster grows; however, this effect tends to vanish as the cluster size increases further, in such a way that monofractality is achieved. The outcomes of this paper lead us to conclude that the previously reported mass-multifractal behavior (Vicsek et al., 1990 [13]) detected for DLA clusters is a consequence of finite-size effects and floating-point precision limitations and not an intrinsic feature of the phenomenon, since the scaling behavior of our DLA cluster space corresponds to monofractal objects, this being remarkably noticeable in the limit of very large clusters.
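
    For reference, the textbook (unaccelerated) gliding-box lacunarity, Lambda(r) = <M^2>/<M>^2 over all r x r box positions, can be computed with a summed-area table. The paper's faster variant is not reproduced here, and the test image below is random rather than a DLA cluster.

        # Gliding-box lacunarity of a 2D binary mass image.
        import numpy as np

        def lacunarity(image, r):
            """image: 2D 0/1 array; r: box size. Gliding (overlapping) boxes."""
            # A summed-area table makes each box mass an O(1) lookup.
            S = np.pad(image, ((1, 0), (1, 0))).cumsum(0).cumsum(1)
            masses = (S[r:, r:] - S[:-r, r:] - S[r:, :-r] + S[:-r, :-r]).ravel()
            return masses.var() / masses.mean() ** 2 + 1.0  # <M^2>/<M>^2

        rng = np.random.default_rng(3)
        img = (rng.random((256, 256)) < 0.1).astype(float)  # sparse random mass
        for r in (2, 4, 8, 16, 32):
            print(f"r={r:3d}  Lambda={lacunarity(img, r):.3f}")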

  9. A Distributed Flocking Approach for Information Stream Clustering Analysis

    SciTech Connect

    Cui, Xiaohui; Potok, Thomas E

    2006-01-01

    Intelligence analysts are currently overwhelmed with the amount of information streams generated every day. There is a lack of comprehensive tools that can analyze information streams in real time. Document clustering analysis plays an important role in improving the accuracy of information retrieval. However, most clustering technologies can only be applied to static document collections because they normally require large amounts of computational resources and a long time to reach accurate results. It is very difficult to cluster dynamically changing text information streams on an individual computer. Our early research resulted in a dynamic reactive flock clustering algorithm which can continually refine the clustering result and quickly react to changes in document content. This characteristic makes the algorithm suitable for clustering dynamically changing document information, such as text information streams. Because of the decentralized character of this algorithm, a distributed approach is a very natural way to increase its clustering speed. In this paper, we present a distributed multi-agent flocking approach for text information stream clustering and discuss the decentralized architectures and communication schemes for load balancing and status information synchronization in this approach.

  10. A meta-analysis of parton distribution functions

    NASA Astrophysics Data System (ADS)

    Gao, Jun; Nadolsky, Pavel

    2014-07-01

    A "meta-analysis" is a method for comparison and combination of nonperturbative parton distribution functions (PDFs) in a nucleon obtained with heterogeneous procedures and assumptions. Each input parton distribution set is converted into a "meta-parametrization" based on a common functional form. By analyzing parameters of the meta-parametrizations from all input PDF ensembles, a combined PDF ensemble can be produced that has a smaller total number of PDF member sets than the original ensembles. The meta-parametrizations simplify the computation of the PDF uncertainty in theoretical predictions and provide an alternative to the 2010 PDF4LHC convention for combination of PDF uncertainties. As a practical example, we construct a META ensemble for computation of QCD observables at the Large Hadron Collider using the next-to-next-to-leading order PDF sets from CTEQ, MSTW, and NNPDF groups as the input. The META ensemble includes a central set that reproduces the average of LHC predictions based on the three input PDF ensembles and Hessian eigenvector sets for computing the combined PDF+α s uncertainty at a common QCD coupling strength of 0.118.

  11. Phylogenetic analysis on the soil bacteria distributed in karst forest

    PubMed Central

    Zhou, JunPei; Huang, Ying; Mo, MingHe

    2009-01-01

    Phylogenetic composition of the bacterial community in soil of a karst forest was analyzed by a culture-independent molecular approach. The bacterial 16S rRNA gene was amplified directly from soil DNA and cloned to generate a library. After screening the clone library by RFLP, 16S rRNA genes of representative clones were sequenced and the bacterial community was analyzed phylogenetically. The 16S rRNA gene inserts of 190 randomly selected clones were analyzed by RFLP and generated 126 different RFLP types. After sequencing, 126 non-chimeric sequences were obtained, generating 113 phylotypes. Phylogenetic analysis revealed that the bacteria distributed in soil of the karst forest included members assigned to Proteobacteria, Acidobacteria, Planctomycetes, Chloroflexi (green nonsulfur bacteria), Bacteroidetes, Verrucomicrobia, Nitrospirae, Actinobacteria (high G+C Gram-positive bacteria), Firmicutes (low G+C Gram-positive bacteria) and candidate divisions (including SPAM and GN08). PMID:24031430

  12. Conductance Distributions for Empirical Orthogonal Function Analysis and Optimal Interpolation

    NASA Astrophysics Data System (ADS)

    Knipp, Delores; McGranaghan, Ryan; Matsuo, Tomoko

    2016-04-01

    We show the first characterizations of the primary modes of ionospheric Hall and Pedersen conductance variability as empirical orthogonal functions (EOFs). These are derived from six satellite-years of Defense Meteorological Satellite Program (DMSP) particle data acquired during the rise of solar cycles 22 and 24. The 60 million DMSP spectra were each processed through the Global Airglow Model. This is the first large-scale analysis of ionospheric conductances completely free of assumptions about the incident electron energy spectra. We show that the mean patterns and first four EOFs capture ~50.1% and 52.9% of the total Pedersen and Hall conductance variability, respectively. The mean patterns and first EOFs are consistent with typical diffuse auroral oval structures and quiet-time strengthening/weakening of the mean pattern. The second and third EOFs show major disturbance features of magnetosphere-ionosphere (MI) interactions: geomagnetically induced auroral zone expansion in EOF2 and the auroral substorm current wedge in EOF3. The fourth EOFs suggest diminished conductance associated with ionospheric substorm recovery mode. These EOFs are then used in a new optimal interpolation (OI) technique to estimate complete high-latitude ionospheric conductance distributions. The technique combines particle precipitation-based calculations of ionospheric conductances and their errors with a background model and its error covariance (estimated by EOF analysis) to infer complete distributions of the high-latitude ionospheric conductances for a week in late 2011. The OI technique (1) captures smaller-scale ionospheric conductance features associated with discrete precipitation and (2) brings ground- and space-based data into closer agreement. We show quantitatively and qualitatively that this new technique provides better ionospheric conductance specification than past statistical models, especially during heightened geomagnetic activity.
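
    Computationally, EOFs are the right singular vectors of the mean-removed space-time data matrix, and the squared singular values give each mode's variance fraction. A generic sketch on synthetic data, not the DMSP conductance processing:

        # EOF analysis via SVD of the anomaly (mean-removed) data matrix.
        import numpy as np

        rng = np.random.default_rng(0)
        n_time, n_space = 500, 300
        base = rng.standard_normal((n_time, 3)) @ rng.standard_normal((3, n_space))
        data = base + 0.5 * rng.standard_normal((n_time, n_space))  # 3 modes + noise

        anomaly = data - data.mean(axis=0)           # remove the mean pattern
        U, s, Vt = np.linalg.svd(anomaly, full_matrices=False)

        eofs = Vt                                    # rows: spatial EOF patterns
        pcs = U * s                                  # columns: PC time series
        var_frac = s**2 / np.sum(s**2)
        print("variance captured by first 4 EOFs:", round(var_frac[:4].sum(), 3))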

  13. Directional spatial frequency analysis of lipid distribution in atherosclerotic plaque

    NASA Astrophysics Data System (ADS)

    Korn, Clyde; Reese, Eric; Shi, Lingyan; Alfano, Robert; Russell, Stewart

    2016-04-01

    Atherosclerosis is characterized by the growth of fibrous plaques due to the retention of cholesterol and lipids within the artery wall, which can lead to vessel occlusion and cardiac events. One way to evaluate arterial disease is to quantify the amount of lipid present in these plaques, since a higher disease burden is characterized by a higher concentration of lipid. Although therapeutic stimulation of reverse cholesterol transport to reduce cholesterol deposits in plaque has not produced significant results, this may be due to current image analysis methods which use averaging techniques to calculate the total amount of lipid in the plaque without regard to spatial distribution, thereby discarding information that may have significance in marking response to therapy. Here we use Directional Fourier Spatial Frequency (DFSF) analysis to generate a characteristic spatial frequency spectrum for atherosclerotic plaques from C57 Black 6 mice both treated and untreated with a cholesterol scavenging nanoparticle. We then use the Cauchy product of these spectra to classify the images with a support vector machine (SVM). Our results indicate that treated plaque can be distinguished from untreated plaque using this method, where no difference is seen using the spatial averaging method. This work has the potential to increase the effectiveness of current in-vivo methods of plaque detection that also use averaging methods, such as laser speckle imaging and Raman spectroscopy.

  14. An Open Architecture for Distributed Malware Collection and Analysis

    NASA Astrophysics Data System (ADS)

    Cavalca, Davide; Goldoni, Emanuele

    Honeynets have become an important tool for researchers and network operators. However, the lack of a unified honeynet data model has impeded their effectiveness, resulting in multiple unrelated data sources, each with its own proprietary access method and format. Moreover, the deployment and management of a honeynet is a time-consuming activity, and the interpretation of collected data is far from trivial. HIVE (Honeynet Infrastructure in Virtualized Environment) is a novel, highly scalable, automated data collection and analysis architecture we designed. Our infrastructure is built on top of proven FLOSS (Free, Libre and Open Source) solutions, which have been extended and integrated with new tools we developed. We use virtualization to ease honeypot management and deployment, combining both high-interaction and low-interaction sensors in a common infrastructure. We also address the need for rapid comprehension and detailed data analysis by harnessing the power of a relational database system, which provides centralized storage and access to the collected data while ensuring its constant integrity. This chapter presents our malware data collection architecture, offering some insight into the structure and benefits of a distributed virtualized honeynet and its development. Finally, we present some techniques for the active monitoring of centralized botnets that we integrated into HIVE, which allow us to track the evolution of these menaces and deploy effective countermeasures in a timely manner.

  15. Clustering analysis of seismicity and aftershock identification.

    PubMed

    Zaliapin, Ilya; Gabrielov, Andrei; Keilis-Borok, Vladimir; Wong, Henry

    2008-07-01

    We introduce a statistical methodology for clustering analysis of seismicity in the time-space-energy domain and use it to establish the existence of two statistically distinct populations of earthquakes: clustered and nonclustered. This result can be used, in particular, for nonparametric aftershock identification. The proposed approach expands the analysis of Baiesi and Paczuski [Phys. Rev. E 69, 066106 (2004); doi:10.1103/PhysRevE.69.066106] based on the space-time-magnitude nearest-neighbor distance eta between earthquakes. We show that for a homogeneous Poisson marked point field with exponential marks, the distance eta has the Weibull distribution, which bridges our results with classical correlation analysis for point fields. The joint 2D distribution of spatial and temporal components of eta is used to identify the clustered part of a point field. The proposed technique is applied to several seismicity models and to the observed seismicity of southern California.
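
    The Baiesi-Paczuski proximity is commonly written as eta_ij = t_ij · r_ij^{d_f} · 10^{-b·m_i} for a candidate parent i preceding event j, with the nearest-neighbor distance being the minimum over parents. The sketch below assumes illustrative parameter values and a toy catalog, not the southern California data.

        # Space-time-magnitude nearest-neighbor distances on a toy catalog.
        import numpy as np

        df, b = 1.6, 1.0                    # assumed fractal dimension, b-value
        rng = np.random.default_rng(5)
        n = 200
        t = np.sort(rng.uniform(0, 1000, n))            # event times (days)
        xy = rng.uniform(0, 100, (n, 2))                # epicenters (km)
        m = rng.exponential(1 / (b * np.log(10)), n) + 2.0  # G-R magnitudes

        eta = np.full(n, np.inf)
        parent = np.full(n, -1)
        for j in range(1, n):
            dt = t[j] - t[:j]                            # elapsed time, > 0
            r = np.linalg.norm(xy[:j] - xy[j], axis=1)   # epicentral distance
            prox = dt * np.maximum(r, 1e-3) ** df * 10.0 ** (-b * m[:j])
            parent[j] = np.argmin(prox)                  # nearest neighbor
            eta[j] = prox[parent[j]]

        print("median log10(eta):", np.median(np.log10(eta[1:])))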

  16. Bivariate extreme value distributions

    NASA Technical Reports Server (NTRS)

    Elshamy, M.

    1992-01-01

    In certain engineering applications, such as those occurring in the analyses of ascent structural loads for the Space Transportation System (STS), some of the load variables have a lower bound of zero. Thus, the need for practical models of bivariate extreme value probability distribution functions with lower limits was identified. We discuss the Gumbel models and present practical forms of bivariate extreme probability distributions of Weibull and Frechet types with two parameters. Bivariate extreme value probability distribution functions can be expressed in terms of the marginal extremal distributions and a 'dependence' function subject to certain analytical conditions. Properties of such bivariate extreme distributions, sums and differences of paired extremals, as well as the corresponding forms of conditional distributions, are discussed. Practical estimation techniques are also given.
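
    The 'dependence function' representation referred to above is usually written in Pickands form; the following is the standard expression under that convention (notation assumed here, not taken from this report):

        % Pickands representation of a bivariate extreme value distribution:
        % joint cdf in terms of the marginals F_X, F_Y and a dependence
        % function A, convex on [0,1] with max(w, 1-w) <= A(w) <= 1.
        F(x,y) = \exp\left\{ \bigl(\log F_X(x) + \log F_Y(y)\bigr)\,
                 A\!\left( \frac{\log F_Y(y)}{\log F_X(x) + \log F_Y(y)} \right) \right\}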

  17. Statistical distribution of mechanical properties for three graphite-epoxy material systems

    NASA Technical Reports Server (NTRS)

    Reese, C.; Sorem, J., Jr.

    1981-01-01

    Graphite-epoxy composites are playing an increasing role as viable alternative materials in structural applications, necessitating thorough investigation into the predictability and reproducibility of their material strength properties. This investigation was concerned with tension, compression, and short-beam shear coupon testing of large samples from three different material suppliers to determine their statistical strength behavior. Statistical results indicate that a two-parameter Weibull distribution model provides better overall characterization of material behavior for the graphite-epoxy systems tested than the standard Normal distribution model that is employed for most design work. While either a Weibull or Normal distribution model provides adequate predictions for average strength values, the Weibull model provides better characterization in the lower tail region, where the predictions are of maximum design interest. The two sets of the same material were found to have essentially the same material properties, indicating that repeatability can be achieved.
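
    The point about the lower tail can be made concrete: two models fitted to the same sample can agree near the mean yet diverge at a 1st-percentile design allowable. Illustrative numbers only, not the tested graphite-epoxy data:

        # Lower-tail comparison of Normal and Weibull fits to the same sample.
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(11)
        strength = 600 * rng.weibull(12.0, size=200)   # MPa, synthetic coupons

        mu, sd = strength.mean(), strength.std(ddof=1)
        c, loc, scale = stats.weibull_min.fit(strength, floc=0)

        p = 0.01                                       # 1st-percentile allowable
        print(f"Normal  1% strength: {stats.norm.ppf(p, mu, sd):7.1f} MPa")
        print(f"Weibull 1% strength: {stats.weibull_min.ppf(p, c, loc, scale):7.1f} MPa")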

  18. Comparative analysis of aerosols elemental distribution in some Romanian regions

    NASA Astrophysics Data System (ADS)

    Amemiya, Susumu; Masuda, Toshio; Popa-Simil, Liviu; Mateescu, Liviu

    1996-04-01

    The study's main aim is to obtain the elemental distribution of aerosol particulates and map it for some Romanian regions, in order to obtain preliminary information on aerosol particle concentrations and on networking strategy versus local conditions. For this we used a mobile sampling strategy, taking account of all local conditions and weather. In July 1993 we took about 8 samples over a rather large territory of SE Romania, which were analysed and mapped. Regions showing interesting behaviour or raising doubts, such as Bucharest and Dobrogea, were examined in more detail in the same period of 1994, to compare the new details with the global picture previously obtained. An attempt was made to infer the minimum number of stations necessary in a future monitoring network. A mobile sampler was used, having two polycarbonate filters of 8 and 0.4 μm. PIXE elemental analysis was performed on a 2.5 MV Van de Graaff accelerator using a proton beam. More than 15 elements were measured. Suggestive 2D and 3D representations were drawn, as well as histogram charts for the concentration distributions in the specific regions at the specified times. In spite of the qualitatively poor samples, the experiment showed good agreement with conditions in the terrain known by other means, and highlighted the power of PIXE methods in terms of money and time. Conclusions were drawn on the link between industry, traffic, vegetation, weather, surface waters, soil composition, power plant exhaust and so on, on the one hand, and surface concentration distribution on the other. The method's weak points were also highlighted: weather dependence (especially air mass movement and precipitation), local relief, microclimate and vegetation, and the position of the sampling point relative to the pollution sources and their regime. The paper contains a synthesis of the whole

  19. Application of extreme learning machine for estimation of wind speed distribution

    NASA Astrophysics Data System (ADS)

    Shamshirband, Shahaboddin; Mohammadi, Kasra; Tong, Chong Wen; Petković, Dalibor; Porcu, Emilio; Mostafaeipour, Ali; Ch, Sudheer; Sedaghat, Ahmad

    2016-03-01

    Knowledge of the probabilistic wind speed distribution is of particular significance for reliable evaluation of wind energy potential and effective adoption of site-specific wind turbines. Among all proposed probability density functions, the two-parameter Weibull function has been extensively endorsed and utilized to model wind speeds and express wind speed distributions in various locations. In this research work, an extreme learning machine (ELM) is employed to compute the shape (k) and scale (c) factors of the Weibull distribution function. The developed ELM model is trained and tested based upon two widely successful methods used to estimate the k and c parameters. The efficiency and accuracy of ELM are compared against support vector machines, artificial neural networks and genetic programming for estimating the same Weibull parameters. The results reveal that the ELM approach attains higher precision in estimating both Weibull parameters than the other methods evaluated. The mean absolute percentage error, mean absolute bias error and root mean square error for k are 8.4600%, 0.1783 and 0.2371, while for c they are 0.2143%, 0.0118 and 0.0192 m/s, respectively. In conclusion, the application of ELM is particularly promising as an alternative method for estimating the Weibull k and c factors.
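
    One widely used reference method for the Weibull wind parameters is the empirical (moment) method, k = (σ/μ)^-1.086 and c = μ/Γ(1+1/k); a sketch of that baseline follows. The ELM pipeline itself is not reproduced, and the wind record is synthetic.

        # Baseline empirical (moment) method for Weibull wind parameters.
        import numpy as np
        from scipy.special import gamma

        rng = np.random.default_rng(2)
        wind = 7.0 * rng.weibull(2.0, size=8760)   # hourly speeds, one year, m/s

        mean, sigma = wind.mean(), wind.std(ddof=1)
        k = (sigma / mean) ** -1.086               # shape factor
        c = mean / gamma(1.0 + 1.0 / k)            # scale factor, m/s
        print(f"k ~ {k:.2f} (true 2.0), c ~ {c:.2f} m/s (true 7.0)")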

  20. Time domain analysis of the weighted distributed order rheological model

    NASA Astrophysics Data System (ADS)

    Cao, Lili; Pu, Hai; Li, Yan; Li, Ming

    2016-05-01

    This paper presents the fundamental solution and relevant properties of the weighted distributed order rheological model in the time domain. Based on the construction of the distributed order damper and the idea of distributed order element networks, this paper studies the weighted distributed order operator of the rheological model, a generalization of the distributed order linear rheological model. The inverse Laplace transform of the weighted distributed order operators of the rheological model has been obtained by cutting the complex plane and computing the complex path integral along the Hankel path, which leads to discussions of the asymptotic properties and boundary behavior. The relaxation response of the weighted distributed order rheological model is analyzed; it is closely related to many physical phenomena. A number of novel characteristics of the weighted distributed order rheological model, such as power-law decay and the intermediate phenomenon, have been discovered as well. Several illustrative examples play an important role in validating these results.

  1. Bayesian estimation of generalized exponential distribution under noninformative priors

    NASA Astrophysics Data System (ADS)

    Moala, Fernando Antonio; Achcar, Jorge Alberto; Tomazella, Vera Lúcia Damasceno

    2012-10-01

    The generalized exponential distribution, proposed by Gupta and Kundu (1999), is a good alternative to standard lifetime distributions such as the exponential, Weibull or gamma. Several authors have considered the problem of Bayesian estimation of the parameters of the generalized exponential distribution, assuming independent gamma priors and other informative priors. In this paper, we consider a Bayesian analysis of the generalized exponential distribution using conventional noninformative prior distributions, such as the Jeffreys and reference priors, to estimate the parameters. These priors are compared with independent gamma priors for both parameters. The comparison is carried out by examining the frequentist coverage probabilities of Bayesian credible intervals. We show that the maximal data information prior implies an improper posterior distribution for the parameters of a generalized exponential distribution. It is also shown that the choice of the parameter of interest is very important for the reference prior; different choices lead to different reference priors in this case. Numerical inference is illustrated for the parameters by considering data sets of different sizes and using MCMC (Markov Chain Monte Carlo) methods.
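
    For orientation, the generalized exponential density is f(x) = αλ e^{-λx}(1 - e^{-λx})^{α-1}. The sketch below computes maximum likelihood point estimates as a frequentist baseline for the Bayesian analyses discussed; the data are synthetic, and the priors and MCMC machinery are not reproduced.

        # MLE for the generalized exponential distribution on synthetic data.
        import numpy as np
        from scipy.optimize import minimize

        def neg_loglik(params, x):
            a, l = params
            if a <= 0 or l <= 0:
                return np.inf
            return -np.sum(np.log(a) + np.log(l) - l * x
                           + (a - 1) * np.log1p(-np.exp(-l * x)))

        rng = np.random.default_rng(4)
        # Inverse-cdf sampling: F(x) = (1 - exp(-l x))**a => x = -log1p(-u**(1/a))/l
        a_true, l_true = 2.0, 1.5
        u = rng.random(500)
        data = -np.log1p(-u ** (1 / a_true)) / l_true

        res = minimize(neg_loglik, x0=[1.0, 1.0], args=(data,), method="Nelder-Mead")
        print("MLE (alpha, lambda):", res.x)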

  2. Stability Analysis of Distributed Order Fractional Chen System

    PubMed Central

    Aminikhah, H.; Refahi Sheikhani, A.; Rezazadeh, H.

    2013-01-01

    We first investigate sufficient and necessary conditions of stability of nonlinear distributed order fractional system and then we generalize the integer-order Chen system into the distributed order fractional domain. Based on the asymptotic stability theory of nonlinear distributed order fractional systems, the stability of distributed order fractional Chen system is discussed. In addition, we have found that chaos exists in the double fractional order Chen system. Numerical solutions are used to verify the analytical results. PMID:24489508

  3. Distributions of Autocorrelated First-Order Kinetic Outcomes: Illness Severity

    PubMed Central

    Englehardt, James D.

    2015-01-01

    Many complex systems produce outcomes having recurring, power law-like distributions over wide ranges. However, the form necessarily breaks down at extremes, whereas the Weibull distribution has been demonstrated over the full observed range. Here the Weibull distribution is derived as the asymptotic distribution of generalized first-order kinetic processes, with convergence driven by autocorrelation, and by entropy maximization subject to a finite positive mean of the incremental compounding rates. Process increments represent multiplicative causes. In particular, illness severities are modeled as such, occurring in proportion to products of, e.g., chronic toxicant fractions passed by organs along a pathway, or rates of interacting oncogenic mutations. The Weibull form is also argued theoretically and by simulation to be robust to the onset of saturation kinetics. The Weibull exponential parameter is shown to indicate the number and widths of the first-order compounding increments, the extent of rate autocorrelation, and the degree to which process increments are exponentially distributed. In contrast with the Gaussian result in linear independent systems, the form is driven not by independence and multiplicity of process increments, but by increment autocorrelation and entropy. In some physical systems the form may be attracting, due to multiplicative evolution of outcome magnitudes towards extreme values potentially much larger and smaller than control mechanisms can contain. The Weibull distribution is demonstrated in preference to the lognormal and Pareto I for illness severities versus (a) toxicokinetic models, (b) biologically-based network models, (c) scholastic and psychological test score data for children with prenatal mercury exposure, and (d) time-to-tumor data of the ED01 study. PMID:26061263

  4. Rod internal pressure quantification and distribution analysis using Frapcon

    SciTech Connect

    Bratton, Ryan N; Jessee, Matthew Anderson; Wieselquist, William A

    2015-09-01

    This report documents work performed supporting the Department of Energy (DOE) Office of Nuclear Energy (NE) Fuel Cycle Technologies Used Fuel Disposition Campaign (UFDC) under work breakdown structure element 1.02.08.10, ST Analysis. In particular, this report fulfills the M4 milestone M4FT-15OR0810036, Quantify effects of power uncertainty on fuel assembly characteristics, within work package FT-15OR081003 ST Analysis-ORNL. This research was also supported by the Consortium for Advanced Simulation of Light Water Reactors (http://www.casl.gov), an Energy Innovation Hub (http://www.energy.gov/hubs) for Modeling and Simulation of Nuclear Reactors under U.S. Department of Energy Contract No. DE-AC05-00OR22725. The discharge rod internal pressure (RIP) and cladding hoop stress (CHS) distributions are quantified for Watts Bar Nuclear Unit 1 (WBN1) fuel rods by modeling core cycle design data, operation data (including modeling significant trips and downpowers), and as-built fuel enrichments and densities of each fuel rod in FRAPCON-3.5. A methodology is developed which tracks inter-cycle assembly movements and assembly batch fabrication information to build individual FRAPCON inputs for each evaluated WBN1 fuel rod. An alternate model for the amount of helium released from the zirconium diboride (ZrB2) integral fuel burnable absorber (IFBA) layer is derived and applied to FRAPCON output data to quantify the RIP and CHS for these types of fuel rods. SCALE/Polaris is used to quantify fuel rod-specific spectral quantities and the amount of gaseous fission products produced in the fuel for use in FRAPCON inputs. Fuel rods with ZrB2 IFBA layers (i.e., IFBA rods) are determined to have elevated RIP predictions when compared to fuel rods without IFBA layers (i.e., standard rods), despite the fact that IFBA rods often have reduced fill pressures and annular fuel pellets. The primary contributor to elevated RIP predictions at burnups less than and greater than 30 GWd

  5. Fourier analysis of polar cap electric field and current distributions

    NASA Technical Reports Server (NTRS)

    Barbosa, D. D.

    1984-01-01

    A theoretical study of high-latitude electric fields and currents, using analytic Fourier analysis methods, is conducted. A two-dimensional planar model of the ionosphere with an enhanced conductivity auroral belt and field-aligned currents at the edges is employed. Two separate topics are treated. A field-aligned current element near the cusp region of the polar cap is included to investigate the modifications to the convection pattern by the east-west component of the interplanetary magnetic field. It is shown that a sizable one-cell structure is induced near the cusp which diverts equipotential contours to the dawnside or duskside, depending on the sign of the cusp current. This produces characteristic dawn-dusk asymmetries to the electric field that have been previously observed over the polar cap. The second topic is concerned with the electric field configuration obtained in the limit of perfect shielding, where the field is totally excluded equatorward of the auroral oval. When realistic field-aligned current distributions are used, the result is to produce severely distorted, crescent-shaped equipotential contours over the cap. Exact, analytic formulae applicable to this case are also provided.

  6. Finite-key security analysis for multilevel quantum key distribution

    NASA Astrophysics Data System (ADS)

    Brádler, Kamil; Mirhosseini, Mohammad; Fickler, Robert; Broadbent, Anne; Boyd, Robert

    2016-07-01

    We present a detailed security analysis of a d-dimensional quantum key distribution protocol based on two and three mutually unbiased bases (MUBs), both in an asymptotic and in a finite-key-length scenario. The finite secret key rates (in bits per detected photon) are calculated as a function of the length of the sifted key by (i) generalizing the uncertainty relation-based insight from BB84 to any d-level 2-MUB QKD protocol and (ii) adopting recent advances in the second-order asymptotics for finite block length quantum coding (for both d-level 2- and 3-MUB QKD protocols). Since the finite and asymptotic secret key rates increase with d and the number of MUBs (together with the tolerable threshold), such QKD schemes could in principle offer an important advantage over BB84. We discuss the possibility of an experimental realization of the 3-MUB QKD protocol with the orbital angular momentum degrees of freedom of photons.

  7. Frequency distribution histograms for the rapid analysis of data

    NASA Technical Reports Server (NTRS)

    Burke, P. V.; Bullen, B. L.; Poff, K. L.

    1988-01-01

    The mean and standard error are good representations for the response of a population to an experimental parameter and are frequently used for this purpose. Frequency distribution histograms show, in addition, responses of individuals in the population. Both the statistics and a visual display of the distribution of the responses can be obtained easily using a microcomputer and available programs. The type of distribution shown by the histogram may suggest different mechanisms to be tested.
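
    A minimal sketch of the practice described above: report the mean with its standard error and also plot the frequency distribution histogram of the individual responses (synthetic data for illustration).

        # Mean +/- SEM plus a frequency distribution histogram.
        import numpy as np
        import matplotlib.pyplot as plt

        rng = np.random.default_rng(9)
        responses = rng.gamma(shape=3.0, scale=2.0, size=120)  # individual responses

        mean = responses.mean()
        sem = responses.std(ddof=1) / np.sqrt(len(responses))
        print(f"mean = {mean:.2f} +/- {sem:.2f} (SEM), n = {len(responses)}")

        plt.hist(responses, bins=15, edgecolor="black")
        plt.xlabel("response")
        plt.ylabel("frequency")
        plt.title("Frequency distribution of individual responses")
        plt.show()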

  8. Analysis Model for Domestic Hot Water Distribution Systems: Preprint

    SciTech Connect

    Maguire, J.; Krarti, M.; Fang, X.

    2011-11-01

    A thermal model was developed to estimate the energy losses from prototypical domestic hot water (DHW) distribution systems for homes. The developed model, using the TRNSYS simulation software, allows researchers and designers to better evaluate the performance of hot water distribution systems in homes. Modeling results were compared with past experimental study results and showed good agreement.

  9. Analysis of Trap Distribution Using Time-of-Flight Spectroscopy

    NASA Astrophysics Data System (ADS)

    Ohno, Akira; Hanna, Jun-ichi; Dunlap, David H.

    2008-02-01

    A new analytical method for determining the trap distribution from transient photocurrents in time-of-flight (TOF) measurements has been proposed in the context of the convection-diffusion equation with multiple-trapping and detrapping processes. The method requires, in principle, no data on temperature dependence and no initial assumption about the form of the trap distribution. A trap distribution is directly extracted from the time profiles of transient photocurrents, assuming the Einstein relation between mobility and the diffusion constant. To demonstrate the validity of the method, we first applied it to photocurrents prepared in advance by random-walk simulation for some typical assumed trap distributions. Then, we determined the trap distribution for a particular mesophase of a liquid crystal of a phenylnaphthalene derivative, for which the temperature dependence of carrier transport properties is hardly available. Indeed, we obtained an extrinsic shallow trap distribution at about 200 meV in depth together with a tail-shaped Gaussian-type density-of-states distribution. Thus, we conclude that the method may be a powerful tool for analyzing the trap distribution of a system that exhibits temperature-sensitive conformational changes and/or whose carrier transport properties are not available as a function of temperature.

  10. Analysis of DNS cache effects on query distribution.

    PubMed

    Wang, Zheng

    2013-01-01

    This paper studies the DNS cache effects on query distribution observed at the CN top-level domain (TLD) server. We first filter out malformed DNS queries, in six categories, to remove log data pollution. A model for DNS resolution, more specifically DNS caching, is presented. We demonstrate the presence and magnitude of DNS cache effects and cache sharing effects on the request distribution through an analytic model and simulation. CN TLD log data results are provided and analyzed based on the cache model. The approximate TTL distribution for each domain name is inferred quantitatively.
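
    The cache effect itself can be illustrated with a toy simulation: resolvers answer from cache for TTL seconds, so the authoritative server sees only cache misses. This is an illustrative model with assumed rates, not the CN TLD measurement methodology.

        # Toy DNS caching model: Poisson client queries against one cached record.
        import numpy as np

        rng = np.random.default_rng(6)
        ttl = 300.0                    # record TTL, seconds
        rate = 2.0                     # client queries per second (Poisson)
        horizon = 24 * 3600.0          # one day

        arrivals = np.cumsum(rng.exponential(1 / rate, int(rate * horizon * 1.2)))
        arrivals = arrivals[arrivals < horizon]

        misses, cache_expiry = 0, -np.inf
        for t in arrivals:
            if t >= cache_expiry:      # record expired: query reaches the TLD server
                misses += 1
                cache_expiry = t + ttl
        print(f"queries={len(arrivals)}, upstream misses={misses}, "
              f"hit ratio={1 - misses / len(arrivals):.3f}")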

  11. Analysis of temperature distribution in liquid-cooled turbine blades

    NASA Technical Reports Server (NTRS)

    Livingood, John N B; Brown, W Byron

    1952-01-01

    The temperature distribution in liquid-cooled turbine blades determines the amount of cooling required to reduce the blade temperature to permissible values at specified locations. This report presents analytical methods for computing temperature distributions in liquid-cooled turbine blades, or in simplified shapes used to approximate sections of the blade. The individual analyses are first presented in terms of their mathematical development. By means of numerical examples, comparisons are made between simplified and more complete solutions and the effects of several variables are examined. Nondimensional charts to simplify some temperature-distribution calculations are also given.

  12. Determination analysis of energy conservation standards for distribution transformers

    SciTech Connect

    Barnes, P.R.; Van Dyke, J.W.; McConnell, B.W.; Das, S.

    1996-07-01

    This report contains information for US DOE to use in making a determination on proposing energy conservation standards for distribution transformers as required by the Energy Policy Act of 1992. Potential for saving energy with more efficient liquid-immersed and dry-type distribution transformers could be significant because these transformers account for an estimated 140 billion kWh of the annual energy lost in the delivery of electricity. Objective was to determine whether energy conservation standards for distribution transformers would have the potential for significant energy savings, be technically feasible, and be economically justified from a national perspective. It was found that energy conservation for distribution transformers would be technically and economically feasible. Based on the energy conservation options analyzed, 3.6-13.7 quads of energy could be saved from 2000 to 2030.

  13. Nanocrystal size distribution analysis from transmission electron microscopy images

    NASA Astrophysics Data System (ADS)

    van Sebille, Martijn; van der Maaten, Laurens J. P.; Xie, Ling; Jarolimek, Karol; Santbergen, Rudi; van Swaaij, René A. C. M. M.; Leifer, Klaus; Zeman, Miro

    2015-12-01

    We propose a method, with minimal bias caused by user input, to quickly detect and measure the nanocrystal size distribution from transmission electron microscopy (TEM) images using a combination of Laplacian of Gaussian filters and non-maximum suppression. We demonstrate the proposed method on bright-field TEM images of an a-SiC:H sample containing embedded silicon nanocrystals with varying magnifications and we compare the accuracy and speed with size distributions obtained by manual measurements, a thresholding method and PEBBLES. Finally, we analytically consider the error induced by slicing nanocrystals during TEM sample preparation on the measured nanocrystal size distribution and formulate an equation to correct this effect.
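
    In the same spirit as the method described, scikit-image's blob_log combines a Laplacian-of-Gaussian scale stack with non-maximum suppression. The sketch below runs it on a synthetic image; the thresholds are assumptions to be tuned per dataset, and this is not the authors' exact pipeline.

        # LoG blob detection on a synthetic "nanocrystal" image.
        import numpy as np
        from skimage.feature import blob_log

        rng = np.random.default_rng(8)
        img = np.zeros((256, 256))
        yy, xx = np.mgrid[:256, :256]
        for _ in range(40):            # draw Gaussian-shaped particles
            cy, cx = rng.uniform(20, 236, 2)
            s = rng.uniform(2, 5)
            img += np.exp(-((yy - cy) ** 2 + (xx - cx) ** 2) / (2 * s**2))
        img += 0.05 * rng.standard_normal(img.shape)   # detector noise

        blobs = blob_log(img, min_sigma=1, max_sigma=8, num_sigma=15, threshold=0.1)
        radii = blobs[:, 2] * np.sqrt(2)  # sigma -> approximate blob radius
        print(f"detected {len(blobs)} particles, mean radius {radii.mean():.2f} px")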

  14. Performance Analysis of Distributed Object-Oriented Applications

    NASA Technical Reports Server (NTRS)

    Schoeffler, James D.

    1998-01-01

    The purpose of this research was to evaluate the efficiency of a distributed simulation architecture which creates individual modules which are made self-scheduling through the use of a message-based communication system used for requesting input data from another module which is the source of that data. To make the architecture as general as possible, the message-based communication architecture was implemented using standard remote object architectures (Common Object Request Broker Architecture (CORBA) and/or Distributed Component Object Model (DCOM)). A series of experiments were run in which different systems are distributed in a variety of ways across multiple computers and the performance evaluated. The experiments were duplicated in each case so that the overhead due to message communication and data transmission can be separated from the time required to actually perform the computational update of a module each iteration. The software used to distribute the modules across multiple computers was developed in the first year of the current grant and was modified considerably to add a message-based communication scheme supported by the DCOM distributed object architecture. The resulting performance was analyzed using a model created during the first year of this grant which predicts the overhead due to CORBA and DCOM remote procedure calls and includes the effects of data passed to and from the remote objects. A report covering the distributed simulation software and the results of the performance experiments has been submitted separately. The above report also discusses possible future work to apply the methodology to dynamically distribute the simulation modules so as to minimize overall computation time.

  15. Distributional Assumptions in Educational Assessments Analysis: Normal Distributions versus Generalized Beta Distribution in Modeling the Phenomenon of Learning

    ERIC Educational Resources Information Center

    Campos, Jose Alejandro Gonzalez; Moraga, Paulina Saavedra; Del Pozo, Manuel Freire

    2013-01-01

    This paper introduces the generalized beta (GB) model as a new modeling tool in the educational assessment area and evaluation analysis, specifically. Unlike the normal model, the GB model allows us to capture some real characteristics of the data, and it is an important tool for understanding the phenomenon of learning. This paper develops a contrast with the…

  16. Statistical analysis and modelling of small satellite reliability

    NASA Astrophysics Data System (ADS)

    Guo, Jian; Monas, Liora; Gill, Eberhard

    2014-05-01

    This paper attempts to characterize failure behaviour of small satellites through statistical analysis of actual in-orbit failures. A unique Small Satellite Anomalies Database comprising empirical failure data of 222 small satellites has been developed. A nonparametric analysis of the failure data has been implemented by means of a Kaplan-Meier estimation. An innovative modelling method, i.e. Bayesian theory in combination with Markov Chain Monte Carlo (MCMC) simulations, has been proposed to model the reliability of small satellites. An extensive parametric analysis using the Bayesian/MCMC method has been performed to fit a Weibull distribution to the data. The influence of several characteristics such as the design lifetime, mass, launch year, mission type and the type of satellite developers on the reliability has been analyzed. The results clearly show the infant mortality of small satellites. Compared with the classical maximum-likelihood estimation methods, the proposed Bayesian/MCMC method results in better fitting Weibull models and is especially suitable for reliability modelling where only very limited failures are observed.
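
    The nonparametric step can be illustrated with a hand-rolled Kaplan-Meier product-limit estimator on a toy censored sample; the Bayesian/MCMC Weibull modelling is not reproduced here.

        # Kaplan-Meier product-limit estimator on a toy censored sample.
        # Durations in years; event=1 means failure observed, 0 means censored.
        import numpy as np

        durations = np.array([0.1, 0.4, 0.5, 1.2, 2.0, 3.5, 4.0, 5.0, 5.0, 6.0])
        events    = np.array([1,   1,   0,   1,   0,   1,   0,   0,   1,   0  ])

        survival = 1.0
        print(" t (yr)   S(t)")
        for t in np.unique(durations[events == 1]):
            at_risk = np.sum(durations >= t)           # operating just before t
            deaths = np.sum((durations == t) & (events == 1))
            survival *= 1.0 - deaths / at_risk         # product-limit update
            print(f"{t:6.1f}   {survival:.3f}")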

  17. Income distribution dependence of poverty measure: A theoretical analysis

    NASA Astrophysics Data System (ADS)

    Chattopadhyay, Amit K.; Mallick, Sushanta K.

    2007-04-01

    Using a modified deprivation (or poverty) function, in this paper, we theoretically study the changes in poverty with respect to the 'global' mean and variance of the income distribution using Indian survey data. We show that when the income obeys a log-normal distribution, a rising mean income generally indicates a reduction in poverty while an increase in the variance of the income distribution increases poverty. This altruistic view for a developing economy, however, is not tenable anymore once the poverty index is found to follow a Pareto distribution. Here, although a rising mean income indicates a reduction in poverty, due to the presence of an inflexion point in the poverty function, there is a critical value of the variance below which poverty decreases with increasing variance, while beyond this value poverty undergoes a steep increase followed by a decrease with respect to higher variance. Identifying this inflexion point as the poverty line, we show that the Pareto poverty function satisfies all three standard axioms of a poverty index [N.C. Kakwani, Econometrica 43 (1980) 437; A.K. Sen, Econometrica 44 (1976) 219] whereas the log-normal distribution falls short of this requisite. Following these results, we make quantitative predictions to correlate a developing with a developed economy.

  18. Analysis and machine mapping of the distribution of band recoveries

    USGS Publications Warehouse

    Cowardin, L.M.

    1977-01-01

    A method of calculating distance and bearing from banding site to recovery location based on the solution of a spherical triangle is presented. X and Y distances on an ordinate grid were applied to computer plotting of recoveries on a map. The advantages and disadvantages of tables of recoveries by State or degree block, axial lines, and distance of recovery from banding site for presentation and comparison of the spatial distribution of band recoveries are discussed. A special web-shaped partition formed by concentric circles about the point of banding and great circles at 30-degree intervals through the point of banding has certain advantages over other methods. Comparison of distributions by means of a χ² contingency test is illustrated. The statistic V = χ²/N can be used as a measure of difference between two distributions of band recoveries, and its possible use as a measure of the degree of migrational homing is illustrated.

  19. A Distributed Information Analysis for Information Search Tasks

    PubMed Central

    Gong, Yang; Zhang, Jiajie

    2005-01-01

    Information search in a distributed environment is an interactive process that involves both retrieval and the processing of information across users and artifacts. How the information is distributed across internal representations and external representations affects the efficacy of information search. Using a human-centered method called UFuRT, we developed an information search model and a taxonomy of search tasks. We further developed several prototypes of information search interfaces with different patterns of distributed information and investigated the relations between search tasks and interface types. The results from the analyses show that UFuRT is a useful process that not only provides design guidelines but also generates estimates of representational efficiencies, task complexities and user behavioral outcomes. PMID:16779252

  20. A fractal approach to dynamic inference and distribution analysis

    PubMed Central

    van Rooij, Marieke M. J. W.; Nash, Bertha A.; Rajaraman, Srinivasan; Holden, John G.

    2013-01-01

    Event-distributions inform scientists about the variability and dispersion of repeated measurements. This dispersion can be understood from a complex systems perspective, and quantified in terms of fractal geometry. The key premise is that a distribution's shape reveals information about the governing dynamics of the system that gave rise to the distribution. Two categories of characteristic dynamics are distinguished: additive systems governed by component-dominant dynamics and multiplicative or interdependent systems governed by interaction-dominant dynamics. A logic by which systems governed by interaction-dominant dynamics are expected to yield mixtures of lognormal and inverse power-law samples is discussed. These mixtures are described by a so-called cocktail model of response times derived from human cognitive performances. The overarching goals of this article are twofold: First, to offer readers an introduction to this theoretical perspective and second, to offer an overview of the related statistical methods. PMID:23372552

  1. THE EPANET PROGRAMMER'S TOOLKIT FOR ANALYSIS OF WATER DISTRIBUTION SYSTEMS

    EPA Science Inventory

    The EPANET Programmer's Toolkit is a collection of functions that helps simplify computer programming of water distribution network analyses. The functions can be used to read in a pipe network description file, modify selected component properties, run multiple hydraulic and wa...

  2. High Resolution PV Power Modeling for Distribution Circuit Analysis

    SciTech Connect

    Norris, B. L.; Dise, J. H.

    2013-09-01

    NREL has contracted with Clean Power Research to provide 1-minute simulation datasets of PV systems located at three high penetration distribution feeders in the service territory of Southern California Edison (SCE): Porterville, Palmdale, and Fontana, California. The resulting PV simulations will be used to separately model the electrical circuits to determine the impacts of PV on circuit operations.

  3. Metagenomic Analysis of Water Distribution System Bacterial Communities

    EPA Science Inventory

    The microbial quality of drinking water is assessed using culture-based methods that are highly selective and that tend to underestimate the densities and diversity of microbial populations inhabiting distribution systems. In order to better understand the effect of different dis...

  4. A two-scale Weibull approach to the failure of porous ceramic structures made by robocasting: possibilities and limits.

    PubMed

    Genet, Martin; Houmard, Manuel; Eslava, Salvador; Saiz, Eduardo; Tomsia, Antoni P

    2013-04-01

    This paper introduces our approach to modeling the mechanical behavior of cellular ceramics, through the example of calcium phosphate scaffolds made by robocasting for bone-tissue engineering. The Weibull theory is used to deal with the statistical failure of the scaffolds' constitutive rods, and the Sanchez-Palencia theory of periodic homogenization is used to link the rod and scaffold scales. Uniaxial compression of scaffolds and three-point bending of rods were performed to calibrate and validate the model. Although calibration based on rod-scale data leads to over-conservative predictions of the scaffold's properties (as the rods' successive failures are not taken into account), we show that, for a given rod diameter, calibration based on scaffold-scale data leads to very satisfactory predictions for a wide range of rod spacings, i.e. of scaffold porosities, as well as for different loading conditions. This work establishes the proposed model as a reliable tool for understanding and optimizing the mechanical properties of cellular ceramics.
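
    The rod-scale statistics rest on weakest-link Weibull theory; a minimal sketch of the failure probability with its volume (size) effect follows, with illustrative parameters rather than the paper's calibration.

    # Weakest-link Weibull failure probability with the volume (size) effect:
    # P_f = 1 - exp(-(V/V0) * (s/s0)^m). Parameter values are illustrative.
    import numpy as np

    def weibull_failure_prob(stress, volume, m=8.0, s0=50.0, v0=1.0):
        """m: Weibull modulus; s0: characteristic strength (MPa) of volume v0 (mm^3)."""
        return 1.0 - np.exp(-(volume / v0) * (stress / s0) ** m)

    # A larger rod volume shifts failure to lower stresses (size effect):
    for v in (0.5, 1.0, 2.0):
        print(v, weibull_failure_prob(stress=40.0, volume=v))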

  5. Adaptive Weibull Multiplicative Model and Multilayer Perceptron neural networks for dark-spot detection from SAR imagery.

    PubMed

    Taravat, Alireza; Oppelt, Natascha

    2014-12-02

    Oil spills represent a major threat to ocean ecosystems and their environmental status. Previous studies have shown that Synthetic Aperture Radar (SAR), as its recording is independent of clouds and weather, can be effectively used for the detection and classification of oil spills. Dark formation detection is the first and critical stage in oil-spill detection procedures. In this paper, a novel approach for automated dark-spot detection in SAR imagery is presented. A new approach combining an adaptive Weibull Multiplicative Model (WMM) and MultiLayer Perceptron (MLP) neural networks is proposed to differentiate between dark spots and the background. The results have been compared with those of a model combining a non-adaptive WMM and pulse coupled neural networks. The presented approach removes the need to manually set the non-adaptive WMM filter parameters by developing an adaptive WMM model, a step towards fully automatic dark-spot detection. The proposed approach was tested on 60 ENVISAT and ERS2 images containing dark spots. For the overall dataset, an average accuracy of 94.65% was obtained. Our experimental results demonstrate that the proposed approach is very robust and effective, whereas the non-adaptive WMM and pulse coupled neural network (PCNN) model yields poor accuracies.

  6. Survival of Campylobacter jejuni in mineral bottled water according to difference in mineral content: application of the Weibull model.

    PubMed

    Guillou, S; Leguerinel, I; Garrec, N; Renard, M A; Cappelier, J M; Federighi, M

    2008-04-01

    The aim of the study was to examine the hypothesis proposed by Evans et al. [2003. Hazards of healthy living: bottled water and salad vegetables as risk factors for Campylobacter infection. Emerg. Infect. Dis. 9(10), 1219-1225] that mineral bottled water accidentally contaminated by Campylobacter jejuni would represent a risk factor for Campylobacter infection. The culturability of C. jejuni cells inoculated into low- and high-mineral bottled water during storage at 4 degrees C in the dark was assessed by surface plating and modelled using the Weibull model. The loss of C. jejuni culturability observed under all conditions tested was shown to depend on strain, preculture condition and water composition. Following inoculation of C. jejuni, the rapid loss of culturability was not correlated with complete cell death, as passage into embryonated eggs enabled recovery of cells from the viable but non-culturable state. In conclusion, the sanitary risk associated with contaminated bottled water cannot be excluded, although it is presumably low. Culture conditions, strain and water type must be taken into account in the evaluation of risk factors, as they significantly influence Campylobacter survival in water.
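
    A sketch of fitting the Weibull survival model in one standard inactivation form, log10 N(t) = log10 N0 - (t/delta)^p, to plate-count data; the counts below are hypothetical, not the study's measurements.

    # Least-squares fit of the Weibull survival curve to hypothetical counts.
    import numpy as np
    from scipy.optimize import curve_fit

    days = np.array([0, 2, 5, 10, 20, 30], dtype=float)
    log_counts = np.array([6.0, 5.6, 4.9, 3.8, 2.1, 1.0])  # log10 CFU/mL, hypothetical

    def weibull_survival(t, log_n0, delta, p):
        return log_n0 - (t / delta) ** p

    params, _ = curve_fit(weibull_survival, days, log_counts, p0=[6.0, 10.0, 1.0])
    print("log10 N0 = %.2f, delta = %.1f days, p = %.2f" % tuple(params))
    # p < 1 indicates tailing (resistant subpopulation); p > 1 indicates shoulders.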

  7. A two-scale Weibull approach to the failure of porous ceramic structures made by robocasting: possibilities and limits

    PubMed Central

    Genet, Martin; Houmard, Manuel; Eslava, Salvador; Saiz, Eduardo; Tomsia, Antoni P.

    2012-01-01

    This paper introduces our approach to modeling the mechanical behavior of cellular ceramics, through the example of calcium phosphate scaffolds made by robocasting for bone-tissue engineering. The Weibull theory is used to deal with the statistical failure of the scaffolds’ constitutive rods, and the Sanchez-Palencia theory of periodic homogenization is used to link the rod and scaffold scales. Uniaxial compression of scaffolds and three-point bending of rods were performed to calibrate and validate the model. Although calibration based on rod-scale data leads to over-conservative predictions of the scaffold’s properties (as the rods’ successive failures are not taken into account), we show that, for a given rod diameter, calibration based on scaffold-scale data leads to very satisfactory predictions for a wide range of rod spacings, i.e. of scaffold porosities, as well as for different loading conditions. This work establishes the proposed model as a reliable tool for understanding and optimizing the mechanical properties of cellular ceramics. PMID:23439936

  8. Adaptive Weibull Multiplicative Model and Multilayer Perceptron Neural Networks for Dark-Spot Detection from SAR Imagery

    PubMed Central

    Taravat, Alireza; Oppelt, Natascha

    2014-01-01

    Oil spills represent a major threat to ocean ecosystems and their environmental status. Previous studies have shown that Synthetic Aperture Radar (SAR), as its recording is independent of clouds and weather, can be effectively used for the detection and classification of oil spills. Dark formation detection is the first and critical stage in oil-spill detection procedures. In this paper, a novel approach for automated dark-spot detection in SAR imagery is presented. A new approach combining an adaptive Weibull Multiplicative Model (WMM) and MultiLayer Perceptron (MLP) neural networks is proposed to differentiate between dark spots and the background. The results have been compared with those of a model combining a non-adaptive WMM and pulse coupled neural networks. The presented approach removes the need to manually set the non-adaptive WMM filter parameters by developing an adaptive WMM model, a step towards fully automatic dark-spot detection. The proposed approach was tested on 60 ENVISAT and ERS2 images containing dark spots. For the overall dataset, an average accuracy of 94.65% was obtained. Our experimental results demonstrate that the proposed approach is very robust and effective, whereas the non-adaptive WMM and pulse coupled neural network (PCNN) model yields poor accuracies. PMID:25474376

  9. Nonlinear structural analysis on distributed-memory computers

    NASA Technical Reports Server (NTRS)

    Watson, Brian C.; Noor, Ahmed K.

    1995-01-01

    A computational strategy is presented for the nonlinear static and postbuckling analyses of large complex structures on massively parallel computers. The strategy is designed for distributed-memory, message-passing parallel computer systems. The key elements of the proposed strategy are: (1) a multiple-parameter reduced basis technique; (2) a nested dissection (or multilevel substructuring) ordering scheme; (3) parallel assembly of global matrices; and (4) a parallel sparse equation solver. The effectiveness of the strategy is assessed by applying it to thermo-mechanical postbuckling analyses of stiffened composite panels with cutouts, and nonlinear large-deflection analyses of HSCT models on Intel Paragon XP/S computers. The numerical studies presented demonstrate the advantages of nested dissection-based solvers over traditional skyline-based solvers on distributed memory machines.

  10. Directional data analysis under the general projected normal distribution

    PubMed Central

    Wang, Fangpo; Gelfand, Alan E.

    2013-01-01

    The projected normal distribution is an under-utilized model for explaining directional data. In particular, the general version provides flexibility, e.g., asymmetry and possible bimodality along with convenient regression specification. Here, we clarify the properties of this general class. We also develop fully Bayesian hierarchical models for analyzing circular data using this class. We show how they can be fit using MCMC methods with suitable latent variables. We show how posterior inference for distributional features such as the angular mean direction and concentration can be implemented as well as how prediction within the regression setting can be handled. With regard to model comparison, we argue for an out-of-sample approach using both a predictive likelihood scoring loss criterion and a cumulative rank probability score criterion. PMID:24046539
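
    A quick sketch of how projected-normal angles arise, by radially projecting a general bivariate normal onto the unit circle, together with the angular mean direction mentioned above; the parameters are illustrative, not from the paper.

    # Sampling the general projected normal: theta = atan2(y, x) for
    # (x, y) ~ N(mu, Sigma); a non-identity Sigma allows asymmetry/bimodality.
    import numpy as np

    rng = np.random.default_rng(1)
    mu = np.array([1.0, 0.5])
    sigma = np.array([[1.0, 0.6], [0.6, 2.0]])    # general covariance
    xy = rng.multivariate_normal(mu, sigma, size=5000)
    theta = np.arctan2(xy[:, 1], xy[:, 0])        # projected angles in (-pi, pi]

    # Circular (angular) mean direction of the sample:
    mean_dir = np.arctan2(np.sin(theta).mean(), np.cos(theta).mean())
    print("angular mean direction:", mean_dir)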

  11. Analysis of tablet compaction. II. Finite element analysis of density distributions in convex tablets.

    PubMed

    Sinka, I C; Cunningham, J C; Zavaliangos, A

    2004-08-01

    A Drucker-Prager/cap constitutive model, where the elastic and plastic model parameters are expressed as a function of relative density (RD), was presented in a companion article together with experimental calibration procedures. Here, we examine the RD distribution in curved-faced tablets with special reference to the die wall lubrication conditions. The compaction of powders is examined using finite element analysis, which involves the following factors: constitutive behavior of powder, friction between powder and tooling, geometry of die and punches, sequence of punch motions, and initial conditions that result from die fill. The predictions of the model are validated using experimental RD maps. It is shown that different die wall lubrication conditions induce opposite density distribution trends in identical tablets (weight, height, and material). The importance of the internal tablet structure is illustrated with respect to break force, failure mode, and friability: it is demonstrated that for a given average tablet density the break force and failure mode are not unique. Also, tablet regions having lower density locally have higher propensity for damage. The applicability of finite element analysis for optimizations of formulation design, process development, tablet image, and tool design is discussed.

  12. Statistical analysis of dendritic spine distributions in rat hippocampal cultures

    PubMed Central

    2013-01-01

    Background Dendritic spines serve as key computational structures in brain plasticity. Much remains to be learned about their spatial and temporal distribution among neurons. Our aim in this study was to perform exploratory analyses based on the population distributions of dendritic spines with regard to their morphological characteristics and period of growth in dissociated hippocampal neurons. We fit a log-linear model to the contingency table of spine features such as spine type and distance from the soma to first determine which features were important in modeling the spines, as well as the relationships between such features. A multinomial logistic regression was then used to predict the spine types using the features suggested by the log-linear model, along with neighboring spine information. Finally, an important variant of Ripley’s K-function applicable to linear networks was used to study the spatial distribution of spines along dendrites. Results Our study indicated that in the culture system, (i) dendritic spine densities were "completely spatially random", (ii) spine type and distance from the soma were independent quantities, and most importantly, (iii) spines had a tendency to cluster with other spines of the same type. Conclusions Although these results may vary with other systems, our primary contribution is the set of statistical tools for morphological modeling of spines which can be used to assess neuronal cultures following gene manipulation such as RNAi, and to study induced pluripotent stem cells differentiated to neurons. PMID:24088199

  13. ROD INTERNAL PRESSURE QUANTIFICATION AND DISTRIBUTION ANALYSIS USING FRAPCON

    SciTech Connect

    Ivanov, Kostadin; Jessee, Matthew Anderson

    2016-01-01

    The discharge rod internal pressure (RIP) and cladding hoop stress (CHS) distributions are quantified for Watts Bar Nuclear Unit 1 (WBN1) fuel rods by modeling core cycle design data, intercycle assembly movements, operation data (including modeling significant trips and downpowers), and as-built fuel enrichments and densities of each fuel rod in FRAPCON-3.5. An alternate model for the amount of helium released from zirconium diboride (ZrB2) integral fuel burnable absorber (IFBA) layers is derived and applied to FRAPCON output data to quantify the RIP and CHS for these fuel rods. SCALE/Polaris is used to quantify fuel rod-specific spectral quantities and the amount of gaseous fission products produced in the fuel for use in FRAPCON inputs. Fuel rods with ZrB2 IFBA layers (i.e., IFBA rods) are determined to have elevated RIP predictions compared to fuel rods without IFBA layers (i.e., standard rods), despite the fact that IFBA rods often have reduced fill pressures and annular fuel blankets. Cumulative distribution functions (CDFs) are prepared from the distribution of RIP predictions for all standard and IFBA rods. The provided CDFs allow for the determination of the portion of WBN1 fuel rods that exceed a specified RIP limit. Lastly, improvements to the computational methodology of FRAPCON are proposed.
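
    A minimal sketch of the CDF screening step described above, using hypothetical RIP values and a hypothetical limit:

    # Empirical CDF of rod internal pressure (RIP) predictions and the
    # fraction of rods exceeding a limit. All numbers are placeholders.
    import numpy as np

    rip = np.array([8.1, 9.4, 10.2, 11.0, 11.8, 12.5, 13.9, 15.2])  # MPa, hypothetical
    limit = 12.0

    rip_sorted = np.sort(rip)
    cdf = np.arange(1, rip_sorted.size + 1) / rip_sorted.size   # empirical CDF
    below = np.searchsorted(rip_sorted, limit, side="right")
    print("CDF at limit:", cdf[below - 1])
    print("fraction of rods above limit:", 1.0 - below / rip_sorted.size)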

  14. Archiving, Distribution and Analysis of Solar-B Data

    NASA Astrophysics Data System (ADS)

    Shimojo, M.

    2007-10-01

    The Solar-B Mission Operation and Data Analysis (MODA) working group has been discussing the data analysis system for Solar-B data since 2001. In this paper, based on the Solar-B MODA document and recent work in Japan, we introduce the dataflow from Solar-B to scientists, the data format and data levels of Solar-B data, and the data searching/providing system.

  15. [Transformations of parameters in the generalized Poisson distribution for test data analysis].

    PubMed

    Ogasawara, H

    1996-02-01

    The generalized Poisson distribution is a distribution which approximates various forms of mixtures of Poisson distributions. The mean and variance of the generalized Poisson distribution, which are simple functions of the two parameters of the distribution, are more useful than the original parameters in test data analysis. Therefore, we adopted two types of transformations of the parameters. The first model has new parameters of mean and standard deviation. The second model has new parameters of mean and variance-to-mean ratio. An example indicates that the transformed parameters are convenient for understanding the properties of the data. PMID:8935832
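
    Assuming Consul's form of the generalized Poisson distribution, with mean theta/(1-lambda) and variance theta/(1-lambda)^3, the reparameterization alluded to above can be inverted as in this sketch:

    # Recover (theta, lambda) from a desired mean and variance, assuming
    # Consul's generalized Poisson parameterization. Numbers are illustrative.
    from math import sqrt

    def gpd_params(mean, variance):
        one_minus_lam = sqrt(mean / variance)   # from var/mean = 1/(1-lambda)^2
        lam = 1.0 - one_minus_lam
        theta = mean * one_minus_lam            # from mean = theta/(1-lambda)
        return theta, lam

    theta, lam = gpd_params(mean=10.0, variance=25.0)  # overdispersed example
    print(theta, lam)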

  16. Analysis of a simulated heroin distribution chain by HPLC.

    PubMed

    Zelkowicz, Avraham; Magora, Amir; Ravreby, Mark D; Levy, Rina

    2005-07-01

    A heroin distribution chain was simulated by taking three different seizures and preparing four additional samples from each seizure by adding a paracetamol-caffeine mixture in varying amounts, resulting in three different batches each composed of five samples. All of the samples from the three batches were analyzed using HPLC with a UV-PDA detector at a wavelength of 230 nm. The area ratio of various opium alkaloids, acetylation products and components were compared. From the results of the UV area ratios, the fifteen samples could readily be separated into three batches of five samples, with each batch of five samples having a common origin.

  17. Analysis of the Spatial Distribution of Galaxies by Multiscale Methods

    NASA Astrophysics Data System (ADS)

    Starck, J.-L.; Martínez, V. J.; Donoho, D. L.; Levi, O.; Querre, P.; Saar, E.

    2005-12-01

    Galaxies are arranged in interconnected walls and filaments forming a cosmic web encompassing huge, nearly empty regions between the structures. Many statistical methods have been proposed in the past in order to describe the galaxy distribution and discriminate between different cosmological models. We present in this paper multiscale geometric transforms sensitive to clusters, sheets, and walls: the 3D isotropic undecimated wavelet transform, the 3D ridgelet transform, and the 3D beamlet transform. We show that statistical properties of the transform coefficients measure, in a coherent and statistically reliable way, the degree of clustering, filamentarity, sheetedness, and voidedness of a data set.

  18. A global survey on the seasonal variation of the marginal distribution of daily precipitation

    NASA Astrophysics Data System (ADS)

    Papalexiou, Simon Michael; Koutsoyiannis, Demetris

    2016-08-01

    To characterize the seasonal variation of the marginal distribution of daily precipitation, it is important to find which statistical characteristics of daily precipitation actually vary the most from month to month and which could be regarded as invariant. Relevant to the latter issue is the question whether there is a single model capable of describing effectively the nonzero daily precipitation for every month worldwide. To study these questions we introduce and apply a novel test for seasonal variation (SV-Test) and explore the performance of two flexible distributions in a massive analysis of approximately 170,000 monthly daily precipitation records at more than 14,000 stations from all over the globe. The analysis indicates that: (a) the shape characteristics of the marginal distribution of daily precipitation generally vary over the months, (b) commonly used distributions such as the Exponential, Gamma, Weibull, Lognormal, and Pareto are incapable of describing daily precipitation "universally", (c) exponential-tail distributions like the Exponential, mixed Exponentials or the Gamma can severely underestimate the magnitude of extreme events and thus may be a wrong choice, and (d) the Burr type XII and the Generalized Gamma distributions are two good models, with the latter performing exceptionally well.
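
    A sketch of fitting the Generalized Gamma, one of the two recommended models, with scipy; the sample below is a random placeholder for nonzero daily totals, and the tail comparison simply illustrates point (c) above.

    # Fit scipy's gengamma (two shape parameters a, c, plus loc and scale)
    # to stand-in nonzero daily precipitation, then compare upper tails.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(2)
    rain = rng.gamma(shape=0.7, scale=8.0, size=2000)  # placeholder daily totals (mm)

    a, c, loc, scale = stats.gengamma.fit(rain, floc=0)
    print("shape a=%.3f, shape c=%.3f, scale=%.2f" % (a, c, scale))

    # An exponential fit with the same mean can underestimate extremes:
    q = 0.999
    print("GG 99.9th pct:", stats.gengamma.ppf(q, a, c, loc=0, scale=scale))
    print("Exp 99.9th pct:", stats.expon.ppf(q, scale=rain.mean()))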

  19. Structural reliability analysis of laminated CMC components

    NASA Technical Reports Server (NTRS)

    Duffy, Stephen F.; Palko, Joseph L.; Gyekenyesi, John P.

    1991-01-01

    For laminated ceramic matrix composite (CMC) materials to realize their full potential in aerospace applications, design methods and protocols are a necessity. This work focuses on the time-independent failure response of these materials and presents a reliability analysis associated with the initiation of matrix cracking. A public domain computer algorithm is highlighted that was coupled with the laminate analysis of a finite element code and serves as a design aid for analyzing structural components made from laminated CMC materials. Issues relevant to the effect of component size are discussed, and a parameter estimation procedure is presented. The estimation procedure allows three parameters to be calculated from a failure population that has an underlying Weibull distribution.
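
    A minimal sketch of three-parameter Weibull estimation with scipy's weibull_min (shape, threshold location, and scale); note that three-parameter MLE can be numerically delicate, and the strength data below are hypothetical, not the failure population used in the paper.

    # Three-parameter Weibull MLE on hypothetical matrix-cracking strengths.
    import numpy as np
    from scipy import stats

    strengths = np.array([212., 230., 238., 251., 259., 266., 274., 283., 295., 310.])  # MPa

    c, loc, scale = stats.weibull_min.fit(strengths)   # shape, threshold, scale
    print("Weibull modulus m=%.2f, threshold=%.1f MPa, scale=%.1f MPa" % (c, loc, scale))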

  20. Visualization and analysis of lipopolysaccharide distribution in binary phospholipid bilayers

    SciTech Connect

    Henning, Maria Florencia; Sanchez, Susana; Bakas, Laura

    2009-05-22

    Lipopolysaccharide (LPS) is an endotoxin released from the outer membrane of Gram-negative bacteria during infections. It has been reported that LPS may play a role in the outer membrane of bacteria similar to that of cholesterol in eukaryotic plasma membranes. In this article we compare the effect of introducing LPS or cholesterol into liposomes made of dipalmitoylphosphatidylcholine/dioleoylphosphatidylcholine on the solubilization process by Triton X-100. The results show that liposomes containing LPS or cholesterol are more resistant to solubilization by Triton X-100 than the binary phospholipid mixtures at 4 °C. The LPS distribution was analyzed on GUVs of DPPC:DOPC using FITC-LPS. Solid and liquid-crystalline domains were visualized by labeling the GUVs with LAURDAN, and GP images were acquired using a two-photon microscope. The images show a selective distribution of LPS in gel domains. Our results support the hypothesis that LPS could aggregate and concentrate selectively in biological membranes, providing a mechanism to bring together several components of the LPS-sensing machinery.

  1. VISUALIZATION AND ANALYSIS OF LPS DISTRIBUTION IN BINARY PHOSPHOLIPID BILAYERS

    PubMed Central

    Florencia, Henning María; Susana, Sanchez; Laura, Bakás

    2010-01-01

    Lipopolysaccharide (LPS) is an endotoxin released from the outer membrane of Gram-negative bacteria during infections. It has been reported that LPS may play a role in the outer membrane of bacteria similar to that of cholesterol in eukaryotic plasma membranes. In this article we compare the effect of introducing LPS or cholesterol into liposomes made of dipalmitoylphosphatidylcholine/dioleoylphosphatidylcholine on the solubilization process by Triton X-100. The results show that liposomes containing LPS or cholesterol are more resistant to solubilization by Triton X-100 than the binary phospholipid mixtures at 4°C. The LPS distribution was analyzed on GUVs of DPPC:DOPC using FITC-LPS. Solid and liquid-crystalline domains were visualized by labeling the GUVs with LAURDAN, and GP images were acquired using a two-photon microscope. The images show a selective distribution of LPS in gel domains. Our results support the hypothesis that LPS could aggregate and concentrate selectively in biological membranes, providing a mechanism to bring together several components of the LPS-sensing machinery. PMID:19324006

  2. Multi-species analysis of ion distributions at Mars

    NASA Astrophysics Data System (ADS)

    Curry, S.; Liemohn, M. W.; Fang, X.; Ma, Y.; Johnson, B.; Bougher, S. W.; Dong, C.

    2012-12-01

    This study focuses on using the Mars Test Particle simulation to compare observations with virtual detections of O+, O2+, CO2+, and H+ in an orbital configuration in the Mars space environment. These planetary pick-up ions are formed when the solar wind directly interacts with the neutral atmosphere, causing the ions to be accelerated by the background convective electric field. The subsequent mass loading and ion escape are still the subject of great interest, specifically with respect to which species dominates ion loss from Mars. Modeling efforts and observations have found different results; some conclude that O+ is the most dominant escaping ion while others conclude that O2+ has the larger total loss rate. Furthermore, mass loss might actually favor CO2+ because of its tri-atomic structure. To address this unresolved issue, this study will present velocity space distributions for different species and discuss fluxes and escape rates using different modeling parameters. The simulation will also illustrate individual particle traces, which reveal the origin and trajectories of the different ion species. Finally, results from different solar conditions will be presented with respect to ion fluxes and energies as well as overall escape in order to robustly describe the physical processes controlling planetary ion distributions and atmospheric escape.

  3. Analysis of an algorithm for distributed recognition and accountability

    SciTech Connect

    Ko, C.; Frincke, D.A.; Goan, T. Jr.; Heberlein, L.T.; Levitt, K.; Mukherjee, B.; Wee, C.

    1993-08-01

    Computer and network systems are vulnerable to attacks. Abandoning the existing huge infrastructure of possibly insecure computer and network systems is impossible, and replacing them with totally secure systems may not be feasible or cost-effective. A common element in many attacks is that a single user will often attempt to intrude upon multiple resources throughout a network. Detecting the attack can become significantly easier by compiling and integrating evidence of such intrusion attempts across the network rather than attempting to assess the situation from the vantage point of only a single host. To solve this problem, we suggest an approach for distributed recognition and accountability (DRA), which consists of algorithms which "process," at a central location, distributed and asynchronous "reports" generated by computers (or a subset thereof) throughout the network. Our highest-priority objectives are to observe the ways by which an individual moves around in a network of computers, including changing user names to possibly hide his/her true identity, and to associate all activities of multiple instances of the same individual with the same network-wide user. We present the DRA algorithm and a sketch of its proof under an initial set of simplifying albeit realistic assumptions. Later, we relax these assumptions to accommodate pragmatic aspects such as missing or delayed "reports," clock skew, tampered "reports," etc. We believe that such algorithms will have widespread applications in the future, particularly in intrusion-detection systems.

  4. Studying bubble-particle interactions by zeta potential distribution analysis.

    PubMed

    Wu, Chendi; Wang, Louxiang; Harbottle, David; Masliyah, Jacob; Xu, Zhenghe

    2015-07-01

    Over a decade ago, Xu and Masliyah pioneered an approach to characterize the interactions between particles in dynamic environments of multicomponent systems by measuring zeta potential distributions of individual components and their mixtures. Using a Zetaphoremeter, the measured zeta potential distributions of individual components and their mixtures were used to determine the conditions of preferential attachment in multicomponent particle suspensions. The technique has been applied to study the attachment of nano-sized silica and alumina particles to sub-micron size bubbles in solutions with and without the addition of surface active agents (SDS, DAH and DF250). The degree of attachment between gas bubbles and particles is shown to be a function of the interaction energy governed by the dispersion, electrostatic double layer and hydrophobic forces. Under certain chemical conditions, the attachment of nano-particles to sub-micron size bubbles is shown to be enhanced by in-situ gas nucleation induced by hydrodynamic cavitation for the weakly interacting systems, where mixing of the two individual components results in negligible attachment. Preferential interaction in complex tertiary particle systems demonstrated strong attachment between micron-sized alumina and gas bubbles, with little attachment between micron-sized alumina and silica, possibly due to instability of the aggregates in the shear flow environment.

  5. A data analysis expert system for large established distributed databases

    NASA Technical Reports Server (NTRS)

    Gnacek, Anne-Marie; An, Y. Kim; Ryan, J. Patrick

    1987-01-01

    A design for a natural language database interface system, called the Deductively Augmented NASA Management Decision support System (DANMDS), is presented. The DANMDS system components have been chosen on the basis of the following considerations: maximal employment of the existing NASA IBM-PC computers and supporting software; local structuring and storing of external data via the entity-relationship model; a natural easy-to-use error-free database query language; user ability to alter query language vocabulary and data analysis heuristic; and significant artificial intelligence data analysis heuristic techniques that allow the system to become progressively and automatically more useful.

  6. Monsoonal differences and probability distribution of PM(10) concentration.

    PubMed

    Md Yusof, Noor Faizah Fitri; Ramli, Nor Azam; Yahaya, Ahmad Shukri; Sansuddin, Nurulilyana; Ghazali, Nurul Adyani; Al Madhoun, Wesam

    2010-04-01

    There are many factors that influence PM10 concentration in the atmosphere. This paper looks at PM10 concentrations in relation to the wet season (north-east monsoon) and dry season (south-west monsoon) in Seberang Perai, Malaysia from 2000 to 2004. PM10 is expected to peak during the south-west monsoon, when the weather becomes dry, and this study confirms that the highest PM10 concentrations from 2000 to 2004 were recorded in this monsoon. Two probability distributions, Weibull and lognormal, were used to model the PM10 concentration. The best model for prediction was selected based on performance indicators. The lognormal distribution represents the data better than the Weibull distribution model for 2000, 2001, and 2002; however, for 2003 and 2004, the Weibull distribution represents the data better than the lognormal distribution. The proposed distributions were successfully used for the estimation of exceedances and for predicting return periods in subsequent years. PMID:19365611
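
    A sketch of the Weibull-versus-lognormal comparison and the exceedance estimate described above; the PM10 series is simulated, and log-likelihood stands in for the paper's performance indicators.

    # Fit both candidate distributions to stand-in PM10 data and compare.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(3)
    pm10 = rng.lognormal(mean=3.8, sigma=0.4, size=365)  # placeholder daily PM10, ug/m3

    w = stats.weibull_min.fit(pm10, floc=0)
    l = stats.lognorm.fit(pm10, floc=0)
    ll_w = stats.weibull_min.logpdf(pm10, *w).sum()
    ll_l = stats.lognorm.logpdf(pm10, *l).sum()
    print("Weibull logL %.1f vs lognormal logL %.1f" % (ll_w, ll_l))

    # Expected exceedances of a standard (e.g. 150 ug/m3) under the better model:
    p_exceed = stats.lognorm.sf(150.0, *l)
    print("expected exceedances per year:", 365 * p_exceed)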

  7. Distributed Parallel Computing in Data Analysis of Osteoporosis.

    PubMed

    Waleska Simões, Priscyla; Venson, Ramon; Comunello, Eros; Casagrande, Rogério Antônio; Bigaton, Everson; da Silva Carlessi, Lucas; da Rosa, Maria Inês; Martins, Paulo João

    2015-01-01

    This research aimed to compare the performance of two models of load balancing (Proportional and Autotuned algorithms) of the JPPF platform in the processing of data mining from a database with osteoporosis and osteopenia. When performing the analysis of execution times, it was observed that the Proportional algorithm performed better in all cases.

  8. Parallelization of Finite Element Analysis Codes Using Heterogeneous Distributed Computing

    NASA Technical Reports Server (NTRS)

    Ozguner, Fusun

    1996-01-01

    Performance gains in computer design are quickly consumed as users seek to analyze larger problems to a higher degree of accuracy. Innovative computational methods, such as parallel and distributed computing, seek to multiply the power of existing hardware technology to satisfy the computational demands of large applications. In the early stages of this project, experiments were performed using two large, coarse-grained applications, CSTEM and METCAN. These applications were parallelized on an Intel iPSC/860 hypercube. It was found that the overall speedup was very low, due to large, inherently sequential code segments present in the applications. The overall execution time T_par of the application is dependent on these sequential segments. If these segments make up a significant fraction of the overall code, the application will have a poor speedup measure.
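
    The limit described here is Amdahl's law: with sequential fraction s, the speedup on p processors is bounded by 1/(s + (1-s)/p). A small illustration (numbers are not from the report):

    # Amdahl's law: sequential code segments cap the achievable speedup.
    def amdahl_speedup(seq_fraction, processors):
        return 1.0 / (seq_fraction + (1.0 - seq_fraction) / processors)

    for s in (0.05, 0.20, 0.40):
        print("s=%.2f -> speedup on 32 CPUs: %.1fx (limit %.1fx)"
              % (s, amdahl_speedup(s, 32), 1.0 / s))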

  9. A pair distribution function analysis of zeolite beta

    SciTech Connect

    Martinez-Inesta, M.M.; Peral, I.; Proffen, T.; Lobo, R.F.

    2010-07-20

    We describe the structural refinement of zeolite beta using the local structure obtained with the pair distribution function (PDF) method. A high quality synchrotron and two neutron scattering datasets were obtained on two samples of siliceous zeolite beta. The two polytypes that make up zeolite beta have the same local structure; therefore refinement of the two structures was possible using the same experimental PDF. Optimized structures of polytypes A and B were used to refine the structures using the program PDFfit. Refinements using only the synchrotron or the neutron datasets gave results inconsistent with each other but a cyclic refinement with the two datasets gave a good fit to both PDFs. The results show that the PDF method is a viable technique to analyze the local structure of disordered zeolites. However, given the complexity of most zeolite frameworks, the use of both X-ray and neutron radiation and high-resolution patterns is essential to obtain reliable refinements.

  10. Quantitative analysis of inclusion distributions in hot pressed silicon carbide

    SciTech Connect

    Michael Paul Bakas

    2012-12-01

    Depth of penetration measurements in hot pressed SiC have exhibited significant variability that may be influenced by microstructural defects. To obtain a better understanding of the role of microstructural defects under highly dynamic conditions, fragments of hot pressed SiC plates subjected to impact tests were examined. Two types of inclusion defects were identified: carbonaceous and an aluminum-iron-oxide phase. A disproportionate number of large inclusions were found on the rubble, indicating that the inclusion defects were a part of the fragmentation process. Distribution functions were plotted to compare the inclusion populations. Fragments from the superior performing sample had an inclusion population consisting of more numerous but smaller inclusions. One possible explanation for this result is that the superior sample withstood a greater stress before failure, causing a greater number of smaller inclusions to participate in fragmentation than in the weaker sample.

  11. Southern Arizona riparian habitat: Spatial distribution and analysis

    NASA Technical Reports Server (NTRS)

    Lacey, J. R.; Ogden, P. R.; Foster, K. E.

    1975-01-01

    The objectives of this study were centered around the demonstration of remote sensing as an inventory tool and researching the multiple uses of riparian vegetation. Specific study objectives were to: (1) map riparian vegetation along the Gila River, San Simon Creek, San Pedro River, Pantano Wash, (2) determine the feasibility of automated mapping using LANDSAT-1 computer compatible tapes, (3) locate and summarize existing maps delineating riparian vegetation, (4) summarize data relevant to Southern Arizona's riparian products and uses, (5) document recent riparian vegetation changes along a selected portion of the San Pedro River, (6) summarize historical changes in composition and distribution of riparian vegetation, and (7) summarize sources of available photography pertinent to Southern Arizona.

  12. Photoelastic analysis of stress distribution with different implant systems.

    PubMed

    Pellizzer, Eduardo Piza; Carli, Rafael Imai; Falcón-Antenucci, Rosse Mary; Verri, Fellippo Ramos; Goiato, Marcelo Coelho; Villa, Luiz Marcelo Ribeiro

    2014-04-01

    The aim of this study was to evaluate stress distribution with different implant systems through photoelasticity. Five models were fabricated with photoelastic resin PL-2. Each model was composed of a block of photoelastic resin (10 × 40 × 45 mm) with an implant and a healing abutment: model 1, internal hexagon implant (4.0 × 10 mm; Conect AR, Conexão, São Paulo, Brazil); model 2, Morse taper/internal octagon implant (4.1 × 10 mm; Standard, Straumann ITI, Andover, Mass); model 3, Morse taper implant (4.0 × 10 mm; AR Morse, Conexão); model 4, locking taper implant (4.0 × 11 mm; Bicon, Boston, Mass); model 5, external hexagon implant (4.0 × 10 mm; Master Screw, Conexão). Axial and oblique load (45°) of 150 N were applied by a universal testing machine (EMIC-DL 3000), and a circular polariscope was used to visualize the stress. The results were photographed and analyzed qualitatively using Adobe Photoshop software. For the axial load, the greatest stress concentration was exhibited in the cervical and apical thirds. However, the highest number of isochromatic fringes was observed in the implant apex and in the cervical adjacent to the load direction in all models for the oblique load. Model 2 (Morse taper, internal octagon, Straumann ITI) presented the lowest stress concentration, while model 5 (external hexagon, Master Screw, Conexão) exhibited the greatest stress. It was concluded that Morse taper implants presented a more favorable stress distribution among the test groups. The external hexagon implant showed the highest stress concentration. Oblique load generated the highest stress in all models analyzed.

  13. Microwave circuit analysis and design by a massively distributed computing network

    NASA Astrophysics Data System (ADS)

    Vai, Mankuan; Prasad, Sheila

    1995-05-01

    The advances in microelectronic engineering have rendered massively distributed computing networks practical and affordable. This paper describes one application of this distributed computing paradigm to the analysis and design of microwave circuits. A distributed computing network, constructed in the form of a neural network, is developed to automate the operations typically performed on a normalized Smith chart. Examples showing the use of this computing network for impedance matching and stabilizing are provided.

  14. Cross Section Sensitivity and Uncertainty Analysis Including Secondary Neutron Energy and Angular Distributions.

    1991-03-12

    Version 00 SUSD calculates sensitivity coefficients for one- and two-dimensional transport problems. Variance and standard deviation of detector responses or design parameters can be obtained using cross-section covariance matrices. In neutron transport problems, this code can perform sensitivity-uncertainty analysis for secondary angular distribution (SAD) or secondary energy distribution (SED).

  15. The Distribution of Use of Library Materials: Analysis of Data from the University of Pittsburgh.

    ERIC Educational Resources Information Center

    Hayes, Robert M.

    1981-01-01

    Evaluates the validity of using circulation data as the index to the total utilization of a library collection and tests the hypothesis that the proposed mixture of Poisson distributions can describe and predict various use distributions. Seven statistical analysis listings, six algorithms, 20 tables, and 43 references are provided. (Author/RBF)

  16. Advanced analysis of metal distributions in human hair

    SciTech Connect

    Kempson, Ivan M.; Skinner, William M.

    2008-06-09

    A variety of techniques (secondary electron microscopy with energy dispersive X-ray analysis, time-of-flight-secondary ion mass spectrometry, and synchrotron X-ray fluorescence) were utilized to distinguish metal contamination occurring in hair arising from endogenous uptake from an individual exposed to a polluted environment, in this case a lead smelter. Evidence was sought for elements less affected by contamination and potentially indicative of biogenic activity. The unique combination of surface sensitivity, spatial resolution, and detection limits used here has provided new insight regarding hair analysis. Metals such as Ca, Fe, and Pb appeared to have little representative value of endogenous uptake and were mainly due to contamination. Cu and Zn, however, demonstrate behaviors worthy of further investigation into relating hair concentrations to endogenous function.

  17. Advanced analysis of metal distributions in human hair.

    PubMed

    Kempson, Ivan M; Skinner, William M; Kirkbride, K Paul

    2006-05-15

    A variety of techniques (secondary electron microscopy with energy dispersive X-ray analysis, time-of-flight--secondary ion mass spectrometry, and synchrotron X-ray fluorescence) were utilized to distinguish metal contamination occurring in hair arising from endogenous uptake from an individual exposed to a polluted environment, in this case a lead smelter. Evidence was sought for elements less affected by contamination and potentially indicative of biogenic activity. The unique combination of surface sensitivity, spatial resolution, and detection limits used here has provided new insight regarding hair analysis. Metals such as Ca, Fe, and Pb appeared to have little representative value of endogenous uptake and were mainly due to contamination. Cu and Zn, however, demonstrate behaviors worthy of further investigation into relating hair concentrations to endogenous function. PMID:16749716

  18. Distributed Finite Element Analysis Using a Transputer Network

    NASA Technical Reports Server (NTRS)

    Watson, James; Favenesi, James; Danial, Albert; Tombrello, Joseph; Yang, Dabby; Reynolds, Brian; Turrentine, Ronald; Shephard, Mark; Baehmann, Peggy

    1989-01-01

    The principal objective of this research effort was to demonstrate the extraordinarily cost effective acceleration of finite element structural analysis problems using a transputer-based parallel processing network. This objective was accomplished in the form of a commercially viable parallel processing workstation. The workstation is a desktop size, low-maintenance computing unit capable of supercomputer performance yet costs two orders of magnitude less. To achieve the principal research objective, a transputer based structural analysis workstation termed XPFEM was implemented with linear static structural analysis capabilities resembling commercially available NASTRAN. Finite element model files, generated using the on-line preprocessing module or external preprocessing packages, are downloaded to a network of 32 transputers for accelerated solution. The system currently executes at about one third Cray X-MP24 speed but additional acceleration appears likely. For the NASA selected demonstration problem of a Space Shuttle main engine turbine blade model with about 1500 nodes and 4500 independent degrees of freedom, the Cray X-MP24 required 23.9 seconds to obtain a solution while the transputer network, operated from an IBM PC-AT compatible host computer, required 71.7 seconds. Consequently, the $80,000 transputer network demonstrated a cost-performance ratio about 60 times better than the $15,000,000 Cray X-MP24 system.

  19. The border-to-border distribution method for analysis of cytoplasmic particles and organelles.

    PubMed

    Yacovone, Shalane K; Ornelles, David A; Lyles, Douglas S

    2016-02-01

    Comparing the distribution of cytoplasmic particles and organelles between different experimental conditions can be challenging due to the heterogeneous nature of cell morphologies. The border-to-border distribution method was created to enable the quantitative analysis of fluorescently labeled cytoplasmic particles and organelles of multiple cells from images obtained by confocal microscopy. The method consists of four steps: (1) imaging of fluorescently labeled cells, (2) division of the image of the cytoplasm into radial segments, (3) selection of segments of interest, and (4) population analysis of fluorescence intensities at the pixel level either as a function of distance along the selected radial segments or as a function of angle around an annulus. The method was validated using the well-characterized effect of brefeldin A (BFA) on the distribution of the vesicular stomatitis virus G protein, in which intensely labeled Golgi membranes are redistributed within the cytoplasm. Surprisingly, in untreated cells, the distribution of fluorescence in Golgi membrane-containing radial segments was similar to the distribution of fluorescence in other G protein-containing segments, indicating that the presence of Golgi membranes did not shift the distribution of G protein towards the nucleus compared to the distribution of G protein in other regions of the cell. Treatment with BFA caused only a slight shift in the distribution of the brightest G protein-containing segments which had a distribution similar to that in untreated cells. Instead, the major effect of BFA was to alter the annular distribution of G protein in the perinuclear region.

  20. A numerical method for the stress analysis of stiffened-shell structures under nonuniform temperature distributions

    NASA Technical Reports Server (NTRS)

    Heldenfels, Richard R

    1951-01-01

    A numerical method is presented for the stress analysis of stiffened-shell structures of arbitrary cross section under nonuniform temperature distributions. The method is based on a previously published procedure that is extended to include temperature effects and multicell construction. The application of the method to practical problems is discussed and an illustrative analysis is presented of a two-cell box beam under the combined action of vertical loads and a nonuniform temperature distribution.

  1. Performance analysis of a fault-tolerant distributed multimedia server

    NASA Astrophysics Data System (ADS)

    Derryberry, Barbara

    1998-12-01

    The evolving demands of networks to support Webtone, H.323, AIN and other advanced services require multimedia servers that can deliver a number of value-added capabilities, such as negotiating protocols, delivering network services, and responding to QoS requests. The server is one of the primary limiters on network capacity. The next-generation server must be based upon a flexible, robust, scalable, and reliable platform to keep abreast of the revolutionary pace of service demand and development while continuing to provide the same dependability that voice networks have provided for decades. A new distributed platform, based upon the Totem fault-tolerant messaging system, is described. Processor and network resources are modeled and analyzed. Quantitative results are presented that assess this platform in terms of messaging capacity and performance for various architecture and design options, including processing technologies and fault-tolerance modes. The impacts of fault-tolerant messaging are identified based upon analytical modeling of the proposed server architecture.

  2. Classification of Cerebral Lymphomas and Glioblastomas Featuring Luminance Distribution Analysis

    PubMed Central

    Yamasaki, Toshihiko; Chen, Tsuhan; Hirai, Toshinori; Murakami, Ryuji

    2013-01-01

    Differentiating lymphomas and glioblastomas is important for proper treatment planning. A number of works have been proposed but there are still some problems. For example, many works depend on thresholding a single feature value, which is susceptible to noise. In other cases, experienced observers are required to extract the feature values or to provide some interactions with the system. Even if experts are involved, interobserver variance becomes another problem. In addition, most of the works use only one or a few slice(s) because 3D tumor segmentation is time consuming. In this paper, we propose a tumor classification system that analyzes the luminance distribution of the whole tumor region. Typical cases are classified by the luminance range thresholding and the apparent diffusion coefficients (ADC) thresholding. Nontypical cases are classified by a support vector machine (SVM). Most of the processing elements are semiautomatic. Therefore, even novice users can use the system easily and get the same results as experts. The experiments were conducted using 40 MRI datasets. The classification accuracy of the proposed method was 91.1% without the ADC thresholding and 95.4% with the ADC thresholding. On the other hand, the baseline method, the conventional ADC thresholding, yielded only 67.5% accuracy. PMID:23840280
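
    A sketch of the SVM stage for non-typical cases, feeding normalized luminance histograms of the segmented tumor region to a classifier; the histograms below are random stand-ins, and the pipeline details are assumptions rather than the paper's exact setup.

    # Luminance-histogram features -> SVM classifier (illustrative only).
    import numpy as np
    from sklearn.svm import SVC

    rng = np.random.default_rng(4)
    n_bins = 32
    # Hypothetical training set: 20 lymphoma and 20 GBM histograms.
    X = rng.dirichlet(alpha=np.ones(n_bins), size=40)
    y = np.array([0] * 20 + [1] * 20)            # 0 = lymphoma, 1 = GBM

    clf = SVC(kernel="rbf", C=1.0, gamma="scale").fit(X, y)
    new_case = rng.dirichlet(alpha=np.ones(n_bins))
    print("predicted class:", clf.predict(new_case.reshape(1, -1))[0])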

  3. Secure distributed genome analysis for GWAS and sequence comparison computation

    PubMed Central

    2015-01-01

    Background The rapid increase in the availability and volume of genomic data makes significant advances in biomedical research possible, but sharing of genomic data poses challenges due to the highly sensitive nature of such data. To address the challenges, a competition for secure distributed processing of genomic data was organized by the iDASH research center. Methods In this work we propose techniques for securing computation with real-life genomic data for minor allele frequency and chi-squared statistics computation, as well as distance computation between two genomic sequences, as specified by the iDASH competition tasks. We put forward novel optimizations, including a generalization of a version of mergesort, which might be of independent interest. Results We provide implementation results of our techniques based on secret sharing that demonstrate practicality of the suggested protocols and also report on performance improvements due to our optimization techniques. Conclusions This work describes our techniques, findings, and experimental results developed and obtained as part of iDASH 2015 research competition to secure real-life genomic computations and shows feasibility of securely computing with genomic data in practice. PMID:26733307

  4. Motion synthesis and force distribution analysis for a biped robot.

    PubMed

    Trojnacki, Maciej T; Zielińska, Teresa

    2011-01-01

    In this paper, a method for generating biped robot motion using recorded human gait is presented. The recorded data were modified taking into account the velocities available to the robot drives. The data include only selected joint angles, therefore the missing values were obtained considering the dynamic postural stability of the robot, which means obtaining an adequate motion trajectory of the so-called Zero Moment Point (ZMP). Also, the method of determining the distribution of ground reaction forces during the biped robot's dynamically stable walk is described. The method was developed by the authors. Following the description of the equations characterizing the dynamics of the robot's motion, the values of the components of the ground reaction forces were symbolically determined, as well as the coordinates of the points of the robot's feet contact with the ground. The theoretical considerations have been supported by computer simulation and animation of the robot's motion. This was done using the Matlab/Simulink package and the Simulink 3D Animation Toolbox, and it has validated the proposed method.

  5. CARES/LIFE Ceramics Analysis and Reliability Evaluation of Structures Life Prediction Program

    NASA Technical Reports Server (NTRS)

    Nemeth, Noel N.; Powers, Lynn M.; Janosik, Lesley A.; Gyekenyesi, John P.

    2003-01-01

    This manual describes the Ceramics Analysis and Reliability Evaluation of Structures Life Prediction (CARES/LIFE) computer program. The program calculates the time-dependent reliability of monolithic ceramic components subjected to thermomechanical and/or proof test loading. CARES/LIFE is an extension of the CARES (Ceramic Analysis and Reliability Evaluation of Structures) computer program. The program uses results from MSC/NASTRAN, ABAQUS, and ANSYS finite element analysis programs to evaluate component reliability due to inherent surface and/or volume type flaws. CARES/LIFE accounts for the phenomenon of subcritical crack growth (SCG) by utilizing the power law, Paris law, or Walker law. The two-parameter Weibull cumulative distribution function is used to characterize the variation in component strength. The effects of multiaxial stresses are modeled by using either the principle of independent action (PIA), the Weibull normal stress averaging method (NSA), or the Batdorf theory. Inert strength and fatigue parameters are estimated from rupture strength data of naturally flawed specimens loaded in static, dynamic, or cyclic fatigue. The probabilistic time-dependent theories used in CARES/LIFE, along with the input and output for CARES/LIFE, are described. Example problems to demonstrate various features of the program are also included.
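
    A minimal sketch of the principle of independent action (PIA) named above, in which each tensile principal stress contributes independently to a two-parameter Weibull failure probability; the parameter values are illustrative, not CARES/LIFE output.

    # PIA multiaxial Weibull model: P_f = 1 - exp(-(V/V0) * sum_i (s_i/s0)^m),
    # summing over tensile principal stresses only. Values are illustrative.
    import numpy as np

    def pia_failure_prob(principal_stresses, volume, m=10.0, sigma0=400.0, v0=1.0):
        s = np.clip(np.asarray(principal_stresses), 0.0, None)  # ignore compression
        return 1.0 - np.exp(-(volume / v0) * np.sum((s / sigma0) ** m))

    print(pia_failure_prob([300.0, 150.0, -50.0], volume=2.0))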

  6. Correlation Spectroscopy of Minor Species: Signal Purification and Distribution Analysis

    SciTech Connect

    Laurence, T A; Kwon, Y; Yin, E; Hollars, C; Camarero, J A; Barsky, D

    2006-06-21

    We are performing experiments that use fluorescence resonance energy transfer (FRET) and fluorescence correlation spectroscopy (FCS) to monitor the movement of an individual donor-labeled sliding clamp protein molecule along acceptor-labeled DNA. In addition to the FRET signal sought from the sliding clamp-DNA complexes, the detection channel for FRET contains undesirable signal from free sliding clamp and free DNA. When multiple fluorescent species contribute to a correlation signal, it is difficult or impossible to distinguish between contributions from individual species. As a remedy, we introduce "purified FCS" (PFCS), which uses single molecule burst analysis to select a species of interest and extract the correlation signal for further analysis. We show that by expanding the correlation region around a burst, the correlated signal is retained and the functional forms of FCS fitting equations remain valid. We demonstrate the use of PFCS in experiments with DNA sliding clamps. We also introduce "single molecule FCS", which obtains diffusion time estimates for each burst using expanded correlation regions. By monitoring the detachment of weakly-bound 30-mer DNA oligomers from a single-stranded DNA plasmid, we show that single molecule FCS can distinguish between bursts from species that differ by a factor of 5 in diffusion constant.

  7. Differentiating cerebral lymphomas and GBMs featuring luminance distribution analysis

    NASA Astrophysics Data System (ADS)

    Yamasaki, Toshihiko; Chen, Tsuhan; Hirai, Toshinori; Murakami, Ryuji

    2013-02-01

    Differentiating lymphomas and glioblastoma multiformes (GBMs) is important for proper treatment planning. A number of works have been proposed but there are still some problems. For example, many works depend on thresholding a single feature value, which is susceptible to noise. Non-typical cases that are not handled well by such simple thresholding are easily found. In other cases, experienced observers are required to extract the feature values or to provide some interactions with the system, which is costly. Even if experts are involved, inter-observer variance becomes another problem. In addition, most of the works use only one or a few slice(s) because 3D tumor segmentation is difficult and time-consuming. In this paper, we propose a tumor classification system that analyzes the luminance distribution of the whole tumor region. The 3D MRIs are segmented within a few tens of seconds by using our fast 3D segmentation algorithm. Then, the luminance histogram of the whole tumor region is generated. The typical cases are classified by the histogram range thresholding and the apparent diffusion coefficients (ADC) thresholding. The non-typical cases are learned and classified by a support vector machine (SVM). Most of the processing elements are semi-automatic except for the ADC value extraction. Therefore, even novice users can use the system easily and get almost the same results as experts. The experiments were conducted using 40 MRI datasets (20 lymphomas and 20 GBMs) with non-typical cases. The classification accuracy of the proposed method was 91.1% without the ADC thresholding and 95.4% with the ADC thresholding. On the other hand, the baseline method, the conventional ADC thresholding, yielded only 67.5% accuracy.

  8. Distributed representation as a principle for the analysis of cockpit information displays.

    PubMed

    Zhang, J

    1997-01-01

    This article examines the representational properties of cockpit information displays from the perspective of distributed representations (Zhang & Norman, 1994). The basic idea is that the information needed for many tasks in a cockpit is distributed across the external information displays in the cockpit and the internal minds of the pilots. It is proposed that the relative distribution of internal and external information is the major factor of a display's representational efficiency. Several functionally equivalent but representationally different navigation displays are selected to illustrate how the principle of distributed representations is applied to the analysis of the representational efficiencies of cockpit information displays.

  9. An empirical analysis of the distribution of overshoots in a stationary Gaussian stochastic process

    NASA Technical Reports Server (NTRS)

    Carter, M. C.; Madison, M. W.

    1973-01-01

    The frequency distribution of overshoots in a stationary Gaussian stochastic process is analyzed. The primary tools involved in this analysis are computer simulation and statistical estimation. Computer simulation is used to simulate stationary Gaussian stochastic processes that have selected autocorrelation functions. An analysis of the simulation results reveals a frequency distribution for overshoots with a functional dependence on the mean and variance of the process. Statistical estimation is then used to estimate the mean and variance of a process. It is shown that, given the autocorrelation function of a process together with estimates of its mean and variance, a frequency distribution for its overshoots can be estimated.
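
    A simple way to reproduce the spirit of this analysis is to simulate a stationary Gaussian process with a prescribed autocorrelation and count overshoots of a threshold. The sketch below uses an AR(1) process as a hypothetical stand-in for the autocorrelation functions studied in the report.

      import numpy as np

      rng = np.random.default_rng(0)

      def ar1_gaussian(n, rho):
          """Stationary Gaussian AR(1) process with lag-1 autocorrelation rho."""
          x = np.empty(n)
          x[0] = rng.standard_normal()
          noise = rng.standard_normal(n) * np.sqrt(1.0 - rho**2)
          for i in range(1, n):
              x[i] = rho * x[i - 1] + noise[i]
          return x

      def count_overshoots(x, level):
          """Number of upcrossings of `level` (start of each excursion above it)."""
          above = x > level
          return int(np.sum(~above[:-1] & above[1:]))

      x = ar1_gaussian(100_000, rho=0.9)
      print(count_overshoots(x, level=2.0))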

  10. Independent Orbiter Assessment (IOA): Analysis of the Electrical Power Distribution and Control Subsystem, Volume 2

    NASA Technical Reports Server (NTRS)

    Schmeckpeper, K. R.

    1987-01-01

    The results of the Independent Orbiter Assessment (IOA) of the Failure Modes and Effects Analysis (FMEA) and Critical Items List (CIL) are presented. The IOA approach features a top-down analysis of the hardware to determine failure modes, criticality, and potential critical items. To preserve independence, this analysis was accomplished without reliance upon the results contained within the NASA FMEA/CIL documentation. This report documents the independent analysis results corresponding to the Orbiter Electrical Power Distribution and Control (EPD and C) hardware. The EPD and C hardware performs the functions of distributing, sensing, and controlling 28 volt DC power and of inverting, distributing, sensing, and controlling 117 volt 400 Hz AC power to all Orbiter subsystems from the three fuel cells in the Electrical Power Generation (EPG) subsystem. Volume 2 continues the presentation of IOA analysis worksheets and contains the potential critical items list.

  11. Particle size distributions by transmission electron microscopy: an interlaboratory comparison case study

    PubMed Central

    Rice, Stephen B; Chan, Christopher; Brown, Scott C; Eschbach, Peter; Han, Li; Ensor, David S; Stefaniak, Aleksandr B; Bonevich, John; Vladár, András E; Hight Walker, Angela R; Zheng, Jiwen; Starnes, Catherine; Stromberg, Arnold; Ye, Jia; Grulke, Eric A

    2015-01-01

    This paper reports an interlaboratory comparison that evaluated a protocol for measuring and analysing the particle size distribution of discrete, metallic, spheroidal nanoparticles using transmission electron microscopy (TEM). The study was focused on automated image capture and automated particle analysis. NIST RM8012 gold nanoparticles (30 nm nominal diameter) were measured for area-equivalent diameter distributions by eight laboratories. Statistical analysis was used to (1) assess the data quality without using size distribution reference models, (2) determine reference model parameters for different size distribution reference models and non-linear regression fitting methods and (3) assess the measurement uncertainty of a size distribution parameter by using its coefficient of variation. The interlaboratory area-equivalent diameter mean, 27.6 nm ± 2.4 nm (computed based on a normal distribution), was quite similar to the area-equivalent diameter, 27.6 nm, assigned to NIST RM8012. The lognormal reference model was the preferred choice for these particle size distributions as, for all laboratories, its parameters had lower relative standard errors (RSEs) than the other size distribution reference models tested (normal, Weibull and Rosin–Rammler–Bennett). The RSEs for the fitted standard deviations were two orders of magnitude higher than those for the fitted means, suggesting that most of the parameter estimate errors were associated with estimating the breadth of the distributions. The coefficients of variation for the interlaboratory statistics also confirmed the lognormal reference model as the preferred choice. From quasi-linear plots, the typical range for good fits between the model and cumulative number-based distributions was 1.9 fitted standard deviations less than the mean to 2.3 fitted standard deviations above the mean. Automated image capture, automated particle analysis and statistical evaluation of the data and fitting coefficients provide a
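
    A hedged sketch of the lognormal fitting step, assuming a simple array of area-equivalent diameters; scipy's lognorm parameterization is used, with the location fixed at zero as is usual for particle sizes. The diameters below are synthetic, not the RM8012 data.

      import numpy as np
      from scipy import stats

      # Hypothetical area-equivalent diameters (nm) from automated particle analysis.
      rng = np.random.default_rng(1)
      diameters = rng.lognormal(mean=np.log(27.6), sigma=0.08, size=500)

      # Fit a lognormal reference model with the location parameter fixed at zero.
      shape, loc, scale = stats.lognorm.fit(diameters, floc=0.0)
      print("geometric mean (nm):", scale)          # exp(mu)
      print("geometric std dev:", np.exp(shape))    # exp(sigma)

      # Relative standard error of the mean via a quick bootstrap.
      boot = [np.mean(rng.choice(diameters, size=diameters.size)) for _ in range(200)]
      print("RSE of mean (%):", 100 * np.std(boot) / np.mean(diameters))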

  12. A Distributed Processing and Analysis System for Heliophysic Events

    NASA Astrophysics Data System (ADS)

    Hurlburt, N.; Cheung, M.; Bose, P.

    2008-12-01

    With several Virtual Observatories now under active development, the time is ripe to consider how they will interact to enable integrated studies that span the full range of Heliophysics. We present a solution that builds upon components of the Heliophysics Event Knowledgebase (HEK) being developed for the Solar Dynamics Observatory and the Heliophysics Event List Manager (HELMS), recently selected as part of the NASA VxO program. A Heliophysics Event Analysis and Processing System (HEAPS) could increase the scientific productivity of Heliophysics data by increasing the visibility of relevant events contained within them while decreasing the incremental costs of incorporating more events in research studies. Here we present the relevant precursors to such a system and show how it could operate within the Heliophysics Data Environment.

  13. Distribution of Modelling Spatial Processes Using Geostatistical Analysis

    NASA Astrophysics Data System (ADS)

    Grynyshyna-Poliuga, Oksana; Stanislawska, Iwona; Swiatek, Anna

    The Geostatistical Analyst uses sample points taken at different locations in a landscape and creates (interpolates) a continuous surface. It provides two groups of interpolation techniques: deterministic and geostatistical. All methods rely on the similarity of nearby sample points to create the surface. Deterministic techniques use mathematical functions for interpolation. Geostatistics relies on both statistical and mathematical methods, which can be used to create surfaces and assess the uncertainty of the predictions. The first step in geostatistical analysis is variography: computing and modelling a semivariogram. The semivariogram is one of the principal functions used to describe spatial correlation among observations measured at sample locations. It is commonly represented as a graph of the variance of the measure against the distance separating all pairs of sampled locations, and such a graph is helpful for building a mathematical model that describes the variability of the measure with location. Fitting such a model to the relationship among sample locations is called semivariogram modelling, and it is applied whenever the value of a measure must be estimated at a new location. Our work presents an analysis of the data in the following steps: identification of data set periods, construction and modelling of the empirical semivariogram for a single location, and use of the kriging mapping function to model TEC maps at mid-latitudes during disturbed and quiet days. Based on the semivariogram, weights for the kriging interpolation are estimated. Additional observations generally do not provide relevant extra information for the interpolation, because the spatial correlation is well described by the semivariogram. Keywords: Semivariogram, Kriging, modelling, Geostatistics, TEC
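
    The variography step can be sketched in a few lines. The code below computes an empirical semivariogram from scattered samples, with hypothetical coordinates and values standing in for the TEC observations.

      import numpy as np

      def empirical_semivariogram(coords, values, n_bins=15):
          """Average 0.5*(z_i - z_j)^2 over pairs binned by separation distance."""
          diffs = coords[:, None, :] - coords[None, :, :]
          dists = np.sqrt((diffs ** 2).sum(axis=-1))
          gammas = 0.5 * (values[:, None] - values[None, :]) ** 2
          iu = np.triu_indices(len(values), k=1)      # each pair counted once
          d, g = dists[iu], gammas[iu]
          bins = np.linspace(0, d.max(), n_bins + 1)
          idx = np.digitize(d, bins) - 1
          lags = 0.5 * (bins[:-1] + bins[1:])
          gamma = np.array([g[idx == k].mean() if np.any(idx == k) else np.nan
                            for k in range(n_bins)])
          return lags, gamma

      rng = np.random.default_rng(2)
      coords = rng.uniform(0, 100, size=(200, 2))   # hypothetical station locations
      values = np.sin(coords[:, 0] / 20) + 0.1 * rng.standard_normal(200)
      lags, gamma = empirical_semivariogram(coords, values)
      print(np.round(gamma[:5], 3))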

  14. Wavelet analysis of baryon acoustic structures in the galaxy distribution

    NASA Astrophysics Data System (ADS)

    Arnalte-Mur, P.; Labatie, A.; Clerc, N.; Martínez, V. J.; Starck, J.-L.; Lachièze-Rey, M.; Saar, E.; Paredes, S.

    2012-06-01

    Context. Baryon acoustic oscillations (BAO) are imprinted in the density field by acoustic waves travelling in the plasma of the early universe. Their fixed scale can be used as a standard ruler to study the geometry of the universe. Aims: The BAO have been previously detected using correlation functions and power spectra of the galaxy distribution. We present a new method to detect the real-space structures associated with BAO. These baryon acoustic structures are spherical shells of relatively small density contrast, surrounding high density central regions. Methods: We design a specific wavelet adapted to search for shells, and exploit the physics of the process by making use of two different mass tracers, introducing a specific statistic to detect the BAO features. We show the effect of the BAO signal in this new statistic when applied to the Λ-cold dark matter (ΛCDM) model, using an analytical approximation to the transfer function. We confirm the reliability and stability of our method by using cosmological N-body simulations from the MareNostrum Institut de Ciències de l'Espai (MICE). Results: We apply our method to the detection of BAO in a galaxy sample drawn from the Sloan Digital Sky Survey (SDSS). We use the "main" catalogue to trace the shells, and the luminous red galaxies (LRG) as tracers of the high density central regions. Using this new method, we detect, with a high significance, that the LRG in our sample are preferentially located close to the centres of shell-like structures in the density field, with characteristics similar to those expected from BAO. We show that, by stacking selected shells, we can find their characteristic density profile. Conclusions: We delineate a new feature of the cosmic web, the BAO shells. As these are real spatial structures, the BAO phenomenon can be studied in detail by examining those shells. Full Table 1 is only available at the CDS via anonymous ftp to cdsarc.u-strasbg.fr (130.79.128.5) or via http://cdsarc

  15. Extending the LWS Data Environment: Distributed Data Processing and Analysis

    NASA Technical Reports Server (NTRS)

    Narock, Thomas

    2005-01-01

    The final stages of this work saw changes to the original framework, as well as the completion and integration of several data processing services. Initially, it was thought that a peer-to-peer architecture was necessary to make this work possible. The peer-to-peer architecture provided many benefits, including the dynamic discovery of new services that would be continually added. A prototype example was built and, while it showed promise, a major disadvantage was that it was not easily integrated into the existing data environment. While the peer-to-peer system worked well for finding and accessing distributed data processing services, its use was limited by the difficulty in calling it from existing tools and services. After collaborations with members of the data community, it was determined that our data processing system was of high value and that a new interface should be pursued in order for the community to take full advantage of it. As such, the framework was modified from a peer-to-peer architecture to a more traditional web service approach. Following this change, multiple data processing services were added, including coordinate transformations and subsetting of data. A collaboration with the Virtual Heliospheric Observatory (VHO) assisted with integrating the new architecture into the VHO. This allows anyone using the VHO to search for data, and to then pass that data through our processing services prior to downloading it. As a second attempt at demonstrating the new system, a collaboration was established with the Collaborative Sun Earth Connector (CoSEC) group at Lockheed Martin. This group is working on a graphical user interface to the Virtual Observatories and data processing software. The intent is to provide a high-level, easy-to-use graphical interface that will allow access to the existing Virtual Observatories and data processing services from one convenient application. Working with the CoSEC group we provided access to our data

  16. Characterizing fibrosis in UUO mice model using multiparametric analysis of phasor distribution from FLIM images

    PubMed Central

    Ranjit, Suman; Dvornikov, Alexander; Levi, Moshe; Furgeson, Seth; Gratton, Enrico

    2016-01-01

    The phasor approach to fluorescence lifetime microscopy is used to study the development of fibrosis in the unilateral ureteral obstruction (UUO) model of kidney in mice. Traditional phasor analysis has been modified to create a multiparametric analysis scheme that splits the phasor points into four equidistant segments based on the height of the peak of the phasor distribution, and calculates six parameters, including the average phasor positions, the shape of each segment, the angle of the distribution, and the number of points in each segment. These parameters are used to create a spectrum of twenty-four points specific to the phasor distribution of each sample. Comparisons of spectra from diseased and healthy tissues result in quantitative separation and the calculation of statistical parameters including AUC values, positive predictive values, and sensitivity. This is a new method in the evolving field of analyzing phasor distributions of FLIM data and provides further insights. Additionally, the progression of fibrosis with time is detected using this multiparametric approach to phasor analysis.
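
    For readers unfamiliar with the phasor transform underlying this analysis, the sketch below maps a fluorescence decay to its phasor coordinates (g, s) at the first harmonic of the repetition frequency; the decay data and frequency are hypothetical.

      import numpy as np

      def phasor_coordinates(t, decay, omega):
          """First-harmonic phasor (g, s) of a fluorescence intensity decay."""
          total = np.trapz(decay, t)
          g = np.trapz(decay * np.cos(omega * t), t) / total
          s = np.trapz(decay * np.sin(omega * t), t) / total
          return g, s

      # Hypothetical single-exponential decay, tau = 2.5 ns, 80 MHz repetition rate.
      t = np.linspace(0, 12.5e-9, 256)
      decay = np.exp(-t / 2.5e-9)
      omega = 2 * np.pi * 80e6

      g, s = phasor_coordinates(t, decay, omega)
      print(g, s)   # single exponentials fall (approximately) on the universal semicircle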

  17. Radar signal analysis of ballistic missile with micro-motion based on time-frequency distribution

    NASA Astrophysics Data System (ADS)

    Wang, Jianming; Liu, Lihua; Yu, Hua

    2015-12-01

    The micro-motion of ballistic missile targets induces micro-Doppler modulation on the radar return signal, which is a unique feature for warhead discrimination during flight. In order to extract the micro-Doppler feature of ballistic missile targets, time-frequency analysis is employed to process the micro-Doppler modulated, time-varying radar signal. Images of the time-frequency distribution (TFD) reveal the micro-Doppler modulation characteristic very well. However, many time-frequency analysis methods exist for generating time-frequency distribution images, including the short-time Fourier transform (STFT), the Wigner distribution (WD), and the Cohen class distributions. Against the background of ballistic missile defence, this paper aims to work out an effective time-frequency analysis method for discriminating ballistic missile warheads from decoys.
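
    As a concrete starting point, the short-time Fourier transform of a simulated micro-Doppler return can be computed with scipy; the sinusoidal phase modulation below is a hypothetical stand-in for warhead micro-motion.

      import numpy as np
      from scipy.signal import stft

      fs = 1000.0                       # pulse repetition frequency (Hz), hypothetical
      t = np.arange(0, 2.0, 1.0 / fs)

      # Radar return with sinusoidal micro-Doppler phase modulation.
      f_body, f_micro, beta = 100.0, 4.0, 20.0
      signal = np.exp(1j * (2 * np.pi * f_body * t
                            + beta * np.sin(2 * np.pi * f_micro * t)))

      f, tau, Z = stft(signal, fs=fs, nperseg=128, return_onesided=False)
      # np.abs(Z) is the spectrogram; the micro-Doppler signature appears as a
      # sinusoidal frequency modulation around the body Doppler line.
      print(Z.shape)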

  18. Numerical analysis of atomic density distribution in arc driven negative ion sources

    SciTech Connect

    Yamamoto, T.; Shibata, T.; Hatayama, A.; Kashiwagi, M.; Hanada, M.; Sawada, K.

    2014-02-15

    The purpose of this study is to calculate the atomic hydrogen (H⁰) density distribution in the JAEA 10 ampere negative ion source. A collisional radiative model is developed for the calculation of the H⁰ density distribution. The non-equilibrium feature of the electron energy distribution function (EEDF), which mainly determines the H⁰ production rate, is included by substituting the EEDF calculated from a 3D electron transport analysis. In this paper, the H⁰ production rate, the ionization rate, and the density distribution in the source chamber are calculated. In the region where high energy electrons exist, H⁰ production and ionization are enhanced. The calculated H⁰ density distribution without the effect of H⁰ transport is relatively small in the upper region. In the next step, this effect should be taken into account to obtain a more realistic H⁰ distribution.

  19. Incidence, histopathologic analysis and distribution of tumours of the hand

    PubMed Central

    2014-01-01

    Background: The aim of this large collective and meticulous study of primary bone tumours and tumourous lesions of the hand was to enhance the knowledge about findings on traumatological radiographs and improve differential diagnosis. Methods: This retrospective study reviewed data collected from 1976 until 2006 in our Bone Tumour Registry. The following data were documented: age, sex, radiological investigations, tumour location, histopathological features including type and dignity (benign or malignant status) of the tumour, and diagnosis. Results: The retrospective analysis yielded 631 patients with a mean age of 35.9 ± 19.2 years. The majority of primary hand tumours were found in the phalanges (69.7%), followed by 24.7% in the metacarpals and 5.6% in the carpals. Only 10.6% of all cases were malignant. The major lesion type was cartilage-derived, at 69.1%, followed by bone cysts at 11.3% and osteogenic tumours at 8.7%. The dominant tissue type found in the phalanges and metacarpals was of cartilage origin; osteogenic tumours were predominant in the carpal bones. Enchondroma was the most commonly detected tumour in the hand (47.1%). Conclusions: All primary skeletal tumours can be found in the hand and are most often of cartilage origin, followed by bone cysts and osteogenic tumours. This study furthermore raises awareness of uncommon or rare tumours and helps clinicians establish a proper differential diagnosis, as the majority of detected hand tumours are asymptomatic, incidental findings on radiographs. PMID:24885007

  20. Statistical Scalability Analysis of Communication Operations in Distributed Applications

    SciTech Connect

    Vetter, J S; McCracken, M O

    2001-02-27

    Current trends in high performance computing suggest that users will soon have widespread access to clusters of multiprocessors with hundreds, if not thousands, of processors. This unprecedented degree of parallelism will undoubtedly expose scalability limitations in existing applications, where scalability is the ability of a parallel algorithm on a parallel architecture to effectively utilize an increasing number of processors. Users will need precise and automated techniques for detecting the cause of limited scalability. This paper addresses this dilemma. First, we argue that users face numerous challenges in understanding application scalability: managing substantial amounts of experiment data, extracting useful trends from this data, and reconciling performance information with their application's design. Second, we propose a solution to automate this data analysis problem by applying fundamental statistical techniques to scalability experiment data. Finally, we evaluate our operational prototype on several applications, and show that statistical techniques offer an effective strategy for assessing application scalability. In particular, we find that non-parametric correlation of the number of tasks to the ratio of the time for individual communication operations to overall communication time provides a reliable measure for identifying communication operations that scale poorly.
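
    The statistical core of this approach, a non-parametric correlation between the task count and the ratio of per-operation time to total communication time, can be sketched with scipy's Spearman rank correlation; the timing numbers below are hypothetical.

      from scipy.stats import spearmanr

      # Hypothetical scalability experiment data.
      tasks = [16, 32, 64, 128, 256]                  # number of parallel tasks
      allreduce_time = [1.2, 2.9, 6.5, 14.8, 33.0]    # seconds in one operation
      total_comm_time = [10.0, 14.0, 21.0, 33.0, 52.0]

      ratios = [op / total for op, total in zip(allreduce_time, total_comm_time)]
      rho, pvalue = spearmanr(tasks, ratios)
      # A rank correlation near +1 flags an operation whose share of
      # communication time grows with the task count, i.e. poor scaling.
      print(rho, pvalue)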

  1. Biomechanical Analysis of Force Distribution in Human Finger Extensor Mechanisms

    PubMed Central

    Hu, Dan; Ren, Lei; Howard, David; Zong, Changfu

    2014-01-01

    The complexities of the function and structure of human fingers have long been recognised. The in vivo forces in the human finger tendon network during different activities are critical information for clinical diagnosis, surgical treatment, prosthetic finger design, and biomimetic hand development. In this study, we propose a novel method for in vivo force estimation for the finger tendon network by combining a three-dimensional motion analysis technique and a novel biomechanical tendon network model. The extensor mechanism of a human index finger is represented by an interconnected tendinous network moving around the phalanx's dorsum. A novel analytical approach based on the “Principle of Minimum Total Potential Energy” is used to calculate the forces and deformations throughout the tendon network of the extensor mechanism when subjected to an external load and with the finger posture defined by measurement data. The predicted deformations and forces in the tendon network are in broad agreement with the results obtained by previous experimental in vitro studies. The proposed methodology provides a promising tool for investigating the biomechanical function of complex interconnected tendon networks in vivo. PMID:25126576

  2. Testing Nested Distributions

    NASA Astrophysics Data System (ADS)

    Economou, P.

    2010-09-01

    A number of criteria, test statistics, and diagnostic plots have been developed in order to test the adopted distribution assumption. Usually, a simpler distribution is tested against a more complicated one (obtained by adding an extra parameter) which includes the first distribution as a special case (nested distributions). A characteristic example of such cases is the Burr XII distribution, which can be obtained under the proportional hazards frailty model by assuming a Weibull baseline function and a Gamma frailty distribution with mean frailty equal to 1 and variance equal to θ. In this work, two new tests that are easy to construct and to interpret, a diagnostic plot and an asymptotic test, are presented for testing nested distributions. The asymptotic test is based on approximating the difference of the two estimated nested distribution functions using the first two terms of the Taylor expansion, while the diagnostic plot is constructed using the exact difference of the two fitted distribution functions. Simulation results, using data sets with and without censored observations, demonstrate that the proposed tests perform, in most cases, better than other test statistics such as the LR and Wald tests.
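
    A minimal sketch of the nested-model comparison discussed here: fit a Weibull and then a Burr XII to the same data and form the likelihood ratio statistic. The data are synthetic, scipy's parameterizations are assumed, and since the Weibull is a limiting rather than interior case of the Burr XII, the chi-square reference is only approximate.

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(3)
      data = stats.weibull_min.rvs(1.8, scale=2.0, size=400, random_state=rng)

      # Fit the simpler (Weibull) and richer (Burr XII) models; location fixed at 0.
      wb_params = stats.weibull_min.fit(data, floc=0)
      burr_params = stats.burr12.fit(data, floc=0)

      ll_wb = np.sum(stats.weibull_min.logpdf(data, *wb_params))
      ll_burr = np.sum(stats.burr12.logpdf(data, *burr_params))

      # Likelihood ratio statistic for the nested comparison.
      lr = 2 * (ll_burr - ll_wb)
      print(lr, stats.chi2.sf(lr, df=1))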

  3. Progress in Using the Generalized Wigner Distribution in the Analysis of Terrace-Width Distributions of Vicinal Surfaces

    NASA Astrophysics Data System (ADS)

    Cohen, S. D.; Richards, Howard L.; Einstein, T. L.

    2000-03-01

    The so-called generalized Wigner distribution (GWD) may provide at least as good a description of terrace width distributions (TWDs) on vicinal surfaces as the standard Gaussian fit [T.L. Einstein and O. Pierre-Louis, Surface Sci. 424, L299 (1999)]. It works well for weak elastic repulsion strengths A between steps (where the latter fails), as illustrated explicitly [S.D. Cohen, H.L. Richards, T.L. Einstein, and M. Giesen, cond-mat/9911319] for vicinal Pt(110) [K. Swamy, E. Bertel, and I. Vilfan, Surface Sci. 425, L369 (1999)]. Applications to vicinal copper surfaces confirm the general viability of the new analysis procedure [M. Giesen and T.L. Einstein, submitted to Surface Sci.]. For troublesome data, we can treat the GWD as a two-parameter fit that allows the terrace widths to be scaled by an optimal effective mean width [3]. With Monte Carlo simulations we show that, for physical values of A, the GWD provides a better overall estimate than the Gaussian models. We quantify how a GWD approaches a Gaussian for large A and present a convenient but accurate new expression relating the variance of the TWD to A [3]. We also mention how the discreteness of terrace widths impacts the standard continuum analysis [3].

  4. Advancing Collaborative Climate Studies through Globally Distributed Geospatial Analysis

    NASA Astrophysics Data System (ADS)

    Singh, R.; Percivall, G.

    2009-12-01

    Infrastructure and the broader GEOSS architecture. Of specific interest to this session is the work on geospatial workflows and geo-processing and data discovery and access. CCIP demonstrates standards-based interoperability between geospatial applications in the service of Climate Change analysis. CCIP is planned to be a yearly exercise. It consists of a network of online data services (WCS, WFS, SOS), analysis services (WPS, WCPS, WMS), and clients that exercise those services. In 2009, CCIP focuses on Australia, and the initial application of existing OGC services to climate studies. The results of the 2009 CCIP will serve as requirements for more complex geo-processing services to be developed for CCIP 2010. The benefits of CCIP include accelerating the implementation of the GCOS, and building confidence that implementations using multi-vendor interoperable technologies can help resolve vexing climate change questions. AIP-2: Architecture Implementation Pilot, Phase 2 CCIP: Climate Challenge Integration Plugfest GEO: Group on Earth Observations GEOSS: Global Earth Observing System of Systems GCOS: Global Climate Observing System OGC: Open Geospatial Consortium SOS: Sensor Observation Service WCS: Web Coverage Service WCPS: Web Coverage Processing Service WFS: Web Feature Service WMS: Web Mapping Service

  5. Regression Analysis of Physician Distribution to Identify Areas of Need: Some Preliminary Findings.

    ERIC Educational Resources Information Center

    Morgan, Bruce B.; And Others

    A regression analysis was conducted of factors that help to explain the variance in physician distribution and which identify those factors that influence the maldistribution of physicians. Models were developed for different geographic areas to determine the most appropriate unit of analysis for the Western Missouri Area Health Education Center…

  6. A network analysis of food flows within the United States of America.

    PubMed

    Lin, Xiaowen; Dang, Qian; Konar, Megan

    2014-05-20

    The world food system is globalized and interconnected, and trade plays an increasingly important role in facilitating food availability. We present a novel application of network analysis to domestic food flows within the USA, a country with global importance as a major agricultural producer and trade power. We find normal node degree distributions and Weibull node strength and betweenness centrality distributions. An unassortative network structure with high clustering coefficients exists. These network properties indicate that the USA food flow network is highly social and well-mixed. However, a power law relationship between node betweenness centrality and node degree indicates potential network vulnerability to the disturbance of key nodes. We perform an equality analysis which serves as a benchmark for global food trade, where the Gini coefficient = 0.579, Lorenz asymmetry coefficient = 0.966, and Hoover index = 0.442. These findings shed light on trade network scaling and serve as a proxy for free-trade and equitable network architectures.
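
    The equality statistics reported here are straightforward to compute; the sketch below evaluates the Gini coefficient and Hoover index for a hypothetical vector of node strengths (total food flow per state).

      import numpy as np

      def gini(x):
          """Gini coefficient via the sorted-rank formula."""
          x = np.sort(np.asarray(x, dtype=float))
          n = x.size
          ranks = np.arange(1, n + 1)
          return (2 * np.sum(ranks * x) / (n * np.sum(x))) - (n + 1) / n

      def hoover(x):
          """Hoover index: share of total flow to redistribute for equality."""
          x = np.asarray(x, dtype=float)
          share = x / x.sum()
          return 0.5 * np.abs(share - 1.0 / x.size).sum()

      rng = np.random.default_rng(4)
      strengths = rng.weibull(1.3, size=48) * 1e6   # hypothetical node strengths
      print(gini(strengths), hoover(strengths))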

  7. Integral-moment analysis of the BATSE gamma-ray burst intensity distribution

    NASA Technical Reports Server (NTRS)

    Horack, John M.; Emslie, A. Gordon

    1994-01-01

    We have applied the technique of integral-moment analysis to the intensity distribution of the first 260 gamma-ray bursts observed by the Burst and Transient Source Experiment (BATSE) on the Compton Gamma Ray Observatory. This technique provides direct measurement of properties such as the mean, variance, and skewness of the convolved luminosity-number density distribution, as well as associated uncertainties. Using this method, one obtains insight into the nature of the source distributions unavailable through computation of traditional single parameters such as V/V_max. If the luminosity function of the gamma-ray bursts is strongly peaked, giving bursts only a narrow range of luminosities, these results are then direct probes of the radial distribution of sources, regardless of whether the bursts are a local phenomenon, are distributed in a galactic halo, or are at cosmological distances. Accordingly, an integral-moment analysis of the intensity distribution of the gamma-ray bursts provides for the most complete analytic description of the source distribution available from the data, and offers the most comprehensive test of the compatibility of a given hypothesized distribution with observation.

  8. Spherical harmonic analysis of particle velocity distribution function: Comparison of moments and anisotropies using Cluster data

    NASA Astrophysics Data System (ADS)

    Viñas, Adolfo F.; Gurgiolo, Chris

    2009-01-01

    This paper presents a spherical harmonic analysis of the plasma velocity distribution function using high-angular, energy, and time resolution Cluster data obtained from the PEACE spectrometer instrument to demonstrate how this analysis models the particle distribution function and its moments and anisotropies. The results show that spherical harmonic analysis produced a robust physical representation model of the velocity distribution function, resolving the main features of the measured distributions. From the spherical harmonic analysis, a minimum set of nine spectral coefficients was obtained from which the moment (up to the heat flux), anisotropy, and asymmetry calculations of the velocity distribution function were obtained. The spherical harmonic method provides a potentially effective "compression" technique that can be easily carried out onboard a spacecraft to determine the moments and anisotropies of the particle velocity distribution function for any species. These calculations were implemented using three different approaches, namely, the standard traditional integration, the spherical harmonic (SPH) spectral coefficients integration, and the singular value decomposition (SVD) on the spherical harmonic methods. A comparison among the various methods shows that both SPH and SVD approaches provide remarkable agreement with the standard moment integration method.

  9. Just fracking: a distributive environmental justice analysis of unconventional gas development in Pennsylvania, USA

    NASA Astrophysics Data System (ADS)

    Clough, Emily; Bell, Derek

    2016-02-01

    This letter presents a distributive environmental justice analysis of unconventional gas development in the area of Pennsylvania lying over the Marcellus Shale, the largest shale gas formation in play in the United States. The extraction of shale gas using unconventional wells, which are hydraulically fractured (fracking), has increased dramatically since 2005. As the number of wells has grown, so have concerns about the potential public health effects on nearby communities. These concerns make shale gas development an environmental justice issue. This letter examines whether the hazards associated with proximity to wells and the economic benefits of shale gas production are fairly distributed. We distinguish two types of distributive environmental justice: traditional and benefit sharing. We ask the traditional question: are there a disproportionate number of minority or low-income residents in areas near to unconventional wells in Pennsylvania? However, we extend this analysis in two ways: we examine income distribution and level of education; and we compare before and after shale gas development. This contributes to discussions of benefit sharing by showing how the income distribution of the population has changed. We use a binary dasymetric technique to remap the data from the 2000 US Census and the 2009-2013 American Communities Survey and combine that data with a buffer containment analysis of unconventional wells to compare the characteristics of the population living nearer to unconventional wells with those further away before and after shale gas development. Our analysis indicates that there is no evidence of traditional distributive environmental injustice: there is not a disproportionate number of minority or low-income residents in areas near to unconventional wells. However, our analysis is consistent with the claim that there is benefit sharing distributive environmental injustice: the income distribution of the population nearer to shale gas wells

  10. SCARE: A post-processor program to MSC/NASTRAN for the reliability analysis of structural ceramic components

    NASA Technical Reports Server (NTRS)

    Gyekenyesi, J. P.

    1985-01-01

    A computer program was developed for calculating the statistical fast fracture reliability and failure probability of ceramic components. The program includes the two-parameter Weibull material fracture strength distribution model, using the principle of independent action for polyaxial stress states and Batdorf's shear-sensitive as well as shear-insensitive crack theories, all for volume distributed flaws in macroscopically isotropic solids. Both penny-shaped cracks and Griffith cracks are included in the Batdorf shear-sensitive crack response calculations, using Griffith's maximum tensile stress or critical coplanar strain energy release rate criteria to predict mixed mode fracture. Weibull material parameters can also be calculated from modulus of rupture bar tests, using the least squares method with known specimen geometry and fracture data. The reliability prediction analysis uses MSC/NASTRAN stress, temperature and volume output, obtained from the use of three-dimensional, quadratic, isoparametric, or axisymmetric finite elements. The statistical fast fracture theories employed, along with selected input and output formats and options, are summarized. An example problem to demonstrate various features of the program is included.
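
    The least-squares step for extracting Weibull parameters from modulus-of-rupture data can be sketched as a linear regression on the Weibull probability plot; the rupture strengths below are hypothetical.

      import numpy as np

      # Hypothetical modulus-of-rupture strengths (MPa), one per specimen.
      strengths = np.sort(np.array([287., 310., 325., 339., 352., 360.,
                                    371., 384., 399., 421.]))
      n = strengths.size

      # Median-rank probability estimator for the i-th ordered failure.
      prob = (np.arange(1, n + 1) - 0.3) / (n + 0.4)

      # Linearized two-parameter Weibull: ln(-ln(1-F)) = m*ln(sigma) - m*ln(sigma_0)
      x = np.log(strengths)
      y = np.log(-np.log(1.0 - prob))
      m, intercept = np.polyfit(x, y, 1)
      sigma_0 = np.exp(-intercept / m)
      print("Weibull modulus m:", m, "characteristic strength (MPa):", sigma_0)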

  11. Time-Score Analysis in Criterion-Referenced Tests. Final Report.

    ERIC Educational Resources Information Center

    Tatsuoka, Kikumi K.; Tatsuoka, Maurice M.

    The family of Weibull distributions was investigated as a model for the distributions of response times for items in computer-based criterion-referenced tests. The fit of these distributions were, with a few exceptions, good to excellent according to the Kolmogorov-Smirnov test. For a few relatively simple items, the two-parameter gamma…

  12. FASEP ultra-automated analysis of fibre length distribution in glass-fibre-reinforced products

    NASA Astrophysics Data System (ADS)

    Hartwich, Mark R.; Höhn, Norbert; Mayr, Helga; Sandau, Konrad; Stengler, Ralph

    2009-06-01

    Reinforced plastic materials are widely used in highly sophisticated applications. The length distribution of the fibres influences the mechanical properties of the final product. A method for the automatic determination of this length distribution was developed. After the fibres are separated out of the composite material without damage and prepared for microscopical analysis, a mosaic of microscope pictures is taken. After image processing and analysis with mathematical methods, a complete statistic of the fibre length distribution can be determined. A correlation between fibre length distribution and mechanical properties, measured e.g. with material test methods such as tensile and impact tests, was found. This provides a way to optimize the process and the selection of material for plastic parts. As a result, it enhances customer satisfaction and, perhaps more importantly, reduces costs for the manufacturer.

  13. Evaluating Domestic Hot Water Distribution System Options with Validated Analysis Models

    SciTech Connect

    Weitzel, E.; Hoeschele, E.

    2014-09-01

    A body of work is forming that collects data on domestic hot water consumption, water use behaviors, and the energy efficiency of various distribution systems. A full distribution system model was developed in the Transient System Simulation Tool (TRNSYS), validated using field monitoring data, and then exercised in a number of climates to understand the impact of climate on performance. In this study, the Building America team built upon previous analysis modeling work to evaluate differing distribution systems and the sensitivities of water heating energy and water use efficiency to variations in climate, load, distribution type, insulation, and compact plumbing practices. Overall, 124 different TRNSYS models were simulated. The results of this work are useful in informing future development of water heating best practices guides as well as more accurate (and simulation-time-efficient) distribution models for annual whole-house simulation programs.

  14. Comparison of photon correlation spectroscopy with photosedimentation analysis for the determination of aqueous colloid size distributions

    USGS Publications Warehouse

    Rees, T.F.

    1990-01-01

    Photon correlation spectroscopy (PCS) utilizes the Doppler frequency shift of photons scattered off particles undergoing Brownian motion to determine the size of colloids suspended in water. Photosedimentation analysis (PSA) measures the time-dependent change in optical density of a suspension of colloidal particles undergoing centrifugation. A description of both techniques, important underlying assumptions, and limitations are given. Results for a series of river water samples show that the colloid-size distribution means are statistically identical as determined by both techniques. This also is true of the mass median diameter (MMD), even though MMD values determined by PSA are consistently smaller than those determined by PCS. Because of this small negative bias, the skew parameters for the distributions are generally smaller for the PCS-determined distributions than for the PSA-determined distributions. Smaller polydispersity indices for the distributions are also determined by PCS. -from Author
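
    PCS converts a measured diffusion coefficient into a hydrodynamic diameter through the Stokes-Einstein relation d = k_B T / (3 π η D); a one-liner with hypothetical values follows.

      import math

      k_B = 1.380649e-23   # Boltzmann constant, J/K
      T = 298.15           # temperature, K
      eta = 8.9e-4         # viscosity of water at 25 C, Pa*s
      D = 4.0e-12          # hypothetical measured diffusion coefficient, m^2/s

      d = k_B * T / (3 * math.pi * eta * D)
      print(d * 1e9, "nm")   # about 123 nm colloid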

  15. Can Data Recognize Its Parent Distribution?

    SciTech Connect

    A.W. Marshall; J.C. Meza; and I. Olkin

    1999-05-01

    This study is concerned with model selection for lifetime and survival distributions arising in engineering reliability or in the medical sciences. We compare various distributions, including the gamma, Weibull, and lognormal, with a new distribution called the geometric extreme exponential. Except for the lognormal distribution, the other three distributions all have the exponential distribution as a special case. A Monte Carlo simulation was performed to determine the sample sizes for which survival distributions can distinguish data generated by their own families. Two decision methods are used: maximum likelihood and Kolmogorov distance. Neither method is uniformly best. The probability of correct selection with more than one alternative shows some surprising results when the choices are close to the exponential distribution.
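
    The Monte Carlo selection experiment can be reproduced in miniature: generate samples from one family, fit several candidates by maximum likelihood, and pick the highest log-likelihood. The sketch below compares gamma, Weibull, and lognormal fits; the geometric extreme exponential distribution is omitted since it is not available in scipy.

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(5)
      candidates = {"gamma": stats.gamma, "weibull": stats.weibull_min,
                    "lognormal": stats.lognorm}

      def select_parent(sample):
          """Return the candidate family with the highest maximized log-likelihood."""
          scores = {}
          for name, dist in candidates.items():
              params = dist.fit(sample, floc=0)       # location fixed at zero
              scores[name] = np.sum(dist.logpdf(sample, *params))
          return max(scores, key=scores.get)

      # Probability of correct selection for Weibull-generated data, n = 50.
      trials = 200
      hits = sum(select_parent(stats.weibull_min.rvs(1.5, size=50, random_state=rng))
                 == "weibull" for _ in range(trials))
      print(hits / trials)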

  16. Powerlaw: a Python package for analysis of heavy-tailed distributions.

    PubMed

    Alstott, Jeff; Bullmore, Ed; Plenz, Dietmar

    2014-01-01

    Power laws are theoretically interesting probability distributions that are also frequently used to describe empirical data. In recent years, effective statistical methods for fitting power laws have been developed, but appropriate use of these techniques requires significant programming and statistical insight. In order to greatly decrease the barriers to using good statistical methods for fitting power law distributions, we developed the powerlaw Python package. This software package provides easy commands for basic fitting and statistical analysis of distributions. Notably, it also seeks to support a variety of user needs by being exhaustive in the options available to the user. The source code is publicly available and easily extensible. PMID:24489671
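
    Basic usage of the package follows its published examples; a short session, assuming a one-dimensional array of empirical values, might look like this.

      import numpy as np
      import powerlaw

      # Hypothetical heavy-tailed data.
      data = np.random.default_rng(6).pareto(2.5, size=10_000) + 1.0

      fit = powerlaw.Fit(data)                 # estimates xmin and alpha
      print(fit.power_law.alpha, fit.power_law.xmin)

      # Compare the power law against a lognormal alternative:
      # R > 0 favors the power law; p gives the significance of the sign of R.
      R, p = fit.distribution_compare('power_law', 'lognormal')
      print(R, p)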

  17. Analysis and synthesis of distributed-lumped-active networks by digital computer

    NASA Technical Reports Server (NTRS)

    1973-01-01

    The use of digital computational techniques in the analysis and synthesis of DLA (distributed lumped active) networks is considered. This class of networks consists of three distinct types of elements, namely, distributed elements (modeled by partial differential equations), lumped elements (modeled by algebraic relations and ordinary differential equations), and active elements (modeled by algebraic relations). Such a characterization is applicable to a broad class of circuits, especially including those usually referred to as linear integrated circuits, since the fabrication techniques for such circuits readily produce elements which may be modeled as distributed, as well as the more conventional lumped and active ones.

  1. Analysis of the 3D distribution of stacked self-assembled quantum dots by electron tomography

    PubMed Central

    2012-01-01

    The 3D distribution of self-assembled stacked quantum dots (QDs) is a key parameter to obtain the highest performance in a variety of optoelectronic devices. In this work, we have measured this distribution in 3D using a combined procedure of needle-shaped specimen preparation and electron tomography. We show that conventional 2D measurements of the distribution of QDs are not reliable, and only 3D analysis allows an accurate correlation between the growth design and the structural characteristics. PMID:23249477

  2. Analysis of the spatial distribution between successive earthquakes occurred in various regions in the world

    NASA Astrophysics Data System (ADS)

    Marekova, Elisaveta

    2014-12-01

    The earthquake spatial distribution is studied using earthquake catalogs from different seismic regions (California, Canada, Central Asia, Greece, and Japan). The quality of the available catalogs is examined, taking into account the completeness of the magnitude record. Based on the analysis of the catalogs, it was determined that the probability densities of the inter-event distance distribution collapse onto a single distribution when the data are rescaled. The collapse of the data provides a clear illustration of earthquake-occurrence self-similarity in space.

  3. Size-distribution analysis of macromolecules by sedimentation velocity ultracentrifugation and Lamm equation modeling.

    PubMed Central

    Schuck, P

    2000-01-01

    A new method for the size-distribution analysis of polymers by sedimentation velocity analytical ultracentrifugation is described. It exploits the ability of Lamm equation modeling to discriminate between the spreading of the sedimentation boundary arising from sample heterogeneity and from diffusion. Finite element solutions of the Lamm equation for a large number of discrete noninteracting species are combined with maximum entropy regularization to represent a continuous size-distribution. As in the program CONTIN, the parameter governing the regularization constraint is adjusted by variance analysis to a predefined confidence level. Estimates of the partial specific volume and the frictional ratio of the macromolecules are used to calculate the diffusion coefficients, resulting in relatively high-resolution sedimentation coefficient distributions c(s) or molar mass distributions c(M). It can be applied to interference optical data that exhibit systematic noise components, and it does not require solution or solvent plateaus to be established. More details on the size-distribution can be obtained than from van Holde-Weischet analysis. The sensitivity to the values of the regularization parameter and to the shape parameters is explored with the help of simulated sedimentation data of discrete and continuous model size distributions, and by applications to experimental data of continuous and discrete protein mixtures. PMID:10692345

  4. Three-parameter discontinuous distributions for hydrological samples with zero values

    NASA Astrophysics Data System (ADS)

    Weglarczyk, Stanislaw; Strupczewski, Witold G.; Singh, Vijay P.

    2005-10-01

    A consistent approach to the frequency analysis of hydrologic data in arid and semiarid regions, i.e. the data series containing several zero values (e.g. monthly precipitation in dry seasons, annual peak flow discharges, etc.), requires using discontinuous probability distribution functions. Such an approach has received relatively limited attention. Along the lines of physically based models, the extensions of the Muskingum-based models to three parameter forms are considered. Using 44 peak flow series from the USGS data bank, the fitting ability of four three-parameter models was investigated: (1) the Dirac delta combined with Gamma distribution; (2) the Dirac delta combined with two-parameter generalized Pareto distribution; (3) the Dirac delta combined with two-parameter Weibull (DWe) distribution; (4) the kinematic diffusion with one additional parameter that controls the probability of the zero event (KD3). The goodness of fit of the models was assessed and compared both by evaluation of discrepancies between the results of both estimation methods (i.e. the method of moments (MOM) and the maximum likelihood method (MLM)) and using the log of likelihood function as a criterion. In most cases, the DWe distribution with MLM-estimated parameters showed the best fit of all the three-parameter models.

  5. A Weibull model to describe antimicrobial kinetics of oregano and lemongrass essential oils against Salmonella Enteritidis in ground beef during refrigerated storage.

    PubMed

    de Oliveira, Thales Leandro Coutinho; Soares, Rodrigo de Araújo; Piccoli, Roberta Hilsdorf

    2013-03-01

    The antimicrobial effects of oregano (Origanum vulgare L.) and lemongrass (Cymbopogon citratus (DC.) Stapf.) essential oils (EOs) against Salmonella enterica serotype Enteritidis were evaluated in vitro and in inoculated ground bovine meat during refrigerated storage (4±2 °C) for 6 days. The Weibull model was tested to fit the survival/inactivation curves, estimating the p and δ parameters. The minimum inhibitory concentration (MIC) for both EOs against S. Enteritidis was 3.90 μl/ml. The EO concentrations applied in the ground beef were 3.90, 7.80, and 15.60 μl/g, based on the MIC level and the possible reduction of activity by food constituents. Both EOs showed antimicrobial effects at all tested levels, with microbial populations decreasing (p≤0.05) over storage time. Judged by the fit-quality parameters (RSS and RSE), the Weibull model is able to describe the inactivation curves of the EOs against S. Enteritidis. The application of EOs in processed meats can be used to control pathogens during refrigerated shelf-life. PMID:23273476
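
    The Weibull survival model used in such studies is commonly written log10(N(t)/N0) = -(t/δ)^p; a sketch of fitting its p and δ parameters to hypothetical survival counts with scipy follows.

      import numpy as np
      from scipy.optimize import curve_fit

      def weibull_log_survival(t, delta, p):
          """Weibull inactivation model: log10(N/N0) = -(t/delta)**p."""
          return -((t / delta) ** p)

      # Hypothetical storage times (days) and log10 survival ratios.
      t = np.array([0.0, 1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
      log_ratio = np.array([0.0, -0.4, -0.9, -1.3, -1.8, -2.1, -2.5])

      (delta, p), _ = curve_fit(weibull_log_survival, t, log_ratio,
                                p0=(2.0, 1.0), bounds=(1e-6, np.inf))
      print("delta (time for first log reduction):", delta, "shape p:", p)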

  6. An analysis of the size distribution of Italian firms by age

    NASA Astrophysics Data System (ADS)

    Cirillo, Pasquale

    2010-02-01

    In this paper we analyze the size distribution of Italian firms by age. In other words, we want to establish whether the way that the size of firms is distributed varies as firms become old. As a proxy of size we use capital. In [L.M.B. Cabral, J. Mata, On the evolution of the firm size distribution: Facts and theory, American Economic Review 93 (2003) 1075-1090], the authors study the distribution of Portuguese firms and they find out that, while the size distribution of all firms is fairly stable over time, the distributions of firms by age groups are appreciably different. In particular, as the age of the firms increases, their size distribution on the log scale shifts to the right, the left tails becomes thinner and the right tail thicker, with a clear decrease of the skewness. In this paper, we perform a similar analysis with Italian firms using the CEBI database, also considering firms’ growth rates. Although there are several papers dealing with Italian firms and their size distribution, to our knowledge a similar study concerning size and age has not been performed yet for Italy, especially with such a big panel.

  7. The Analysis of the Strength, Distribution and Direction for the EEG Phase Synchronization by Musical Stimulus

    NASA Astrophysics Data System (ADS)

    Ogawa, Yutaro; Ikeda, Akira; Kotani, Kiyoshi; Jimbo, Yasuhiko

    In this study, we propose an EEG phase synchronization analysis that includes not only the average strength of synchronization but also its distribution and directions, under conditions in which emotion is evoked by musical stimuli. The experiment was performed with two different musical stimuli that evoke happiness or sadness for 150 seconds. It is found that while the average strength of synchronization shows no difference between the right and left sides of the frontal lobe during the happy stimulus, the distribution and directions do show significant differences. Therefore, the proposed analysis is useful for detecting emotional condition, because it provides information that cannot be obtained from the average strength of synchronization alone.

  8. Hydraulic model analysis of water distribution system, Rockwell International, Rocky Flats, Colorado

    SciTech Connect

    Perstein, J.; Castellano, J.A.

    1989-01-20

    Rockwell International requested an analysis of the existing plant site water supply distribution system at Rocky Flats, Colorado, to determine its adequacy. On September 26--29, 1988, Hughes Associates, Inc., Fire Protection Engineers, accompanied by Rocky Flats Fire Department engineers and suppression personnel, conducted water flow tests at the Rocky Flats plant site. Thirty-seven flows from various points throughout the plant site were taken on the existing domestic supply/fire main installation to assure comprehensive and thorough representation of the Rocky Flats water distribution system capability. The analysis was completed in four phases which are described, together with a summary of general conclusions and recommendations.

  12. Consideration of tip speed limitations in preliminary analysis of minimum COE wind turbines

    NASA Astrophysics Data System (ADS)

    Cuerva-Tejero, A.; Yeow, T. S.; Lopez-Garcia, O.; Gallego-Castillo, C.

    2014-12-01

    A relation between Cost Of Energy, COE, maximum allowed tip speed, and rated wind speed is obtained for wind turbines with a given goal rated power. The wind regime is characterised by the corresponding parameters of the probability density function of wind speed. The non-dimensional characteristics of the rotor (number of blades, blade radial distributions of local solidity, twist angle, and airfoil type) play the role of parameters in the mentioned relation. The COE is estimated using a cost model commonly used by designers. This cost model requires basic design data such as the rotor radius and the ratio between the hub height and the rotor radius. Certain design options, DO, related to the technology of the power plant, tower and blades are also required as inputs. The function obtained for the COE can be explored to find those values of rotor radius that give rise to minimum cost of energy for a given wind regime as the tip speed limitation changes. The analysis reveals that iso-COE lines evolve parallel to iso-radius lines for large values of limit tip speed, but that this is not the case for small values of the tip speed limits. It is concluded that, as the tip speed limit decreases, the optimum decision for keeping minimum COE values can be: a) reducing the rotor radius for sites with a high Weibull scale parameter, or b) increasing the rotor radius for sites with a low Weibull scale parameter.
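    To make the coupling between COE and the Weibull wind regime concrete, the sketch below integrates a simplified power curve against a Weibull wind speed density and divides an invented annualized cost by the resulting energy capture. All numbers (Weibull k and c, cut-in/rated/cut-out speeds, cost) are assumptions for illustration; the study itself uses a full rotor description and an industry cost model.

```python
# Hedged sketch: energy capture for a Weibull wind regime and a simple
# cost-per-energy ratio. The power-curve and cost numbers are invented.
import numpy as np
from scipy.integrate import quad

k, c = 2.0, 8.0            # hypothetical Weibull shape and scale (m/s)
P_rated = 2.0e6            # goal rated power (W)
v_in, v_rated, v_out = 3.0, 11.0, 25.0  # assumed cut-in/rated/cut-out speeds

def weibull_pdf(v):
    return (k / c) * (v / c) ** (k - 1) * np.exp(-((v / c) ** k))

def power(v):
    """Simplified power curve: cubic below rated, flat to cut-out."""
    if v < v_in or v > v_out:
        return 0.0
    if v < v_rated:
        return P_rated * (v**3 - v_in**3) / (v_rated**3 - v_in**3)
    return P_rated

mean_power, _ = quad(lambda v: power(v) * weibull_pdf(v), 0.0, v_out,
                     points=[v_in, v_rated])
aep_kwh = mean_power * 8760.0 / 1000.0          # annual energy production
annual_cost = 0.9e6                             # invented annualized cost ($/yr)
print(f"AEP = {aep_kwh/1e6:.2f} GWh/yr")
print(f"COE = {annual_cost / aep_kwh:.3f} $/kWh")
```

    With a lower tip speed limit, the rotor design (and hence radius and cost) would change; re-running such a calculation over candidate radii is the kind of exploration the abstract describes.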

  13. Strategic Sequencing for State Distributed PV Policies: A Quantitative Analysis of Policy Impacts and Interactions

    SciTech Connect

    Doris, E.; Krasko, V.A.

    2012-10-01

    State and local policymakers show increasing interest in spurring the development of customer-sited distributed generation (DG), in particular solar photovoltaic (PV) markets. Prompted by that interest, this analysis examines the use of state policy as a tool to support the development of a robust private investment market. This analysis builds on previous studies that focus on government subsidies to reduce installation costs of individual projects and provides an evaluation of the impacts of policies on stimulating private market development.

  14. Quantitative statistical analysis of dielectric breakdown in zirconia-based self-assembled nanodielectrics.

    PubMed

    Schlitz, Ruth A; Ha, Young-geun; Marks, Tobin J; Lauhon, Lincoln J

    2012-05-22

    Uniformity of the dielectric breakdown voltage distribution for several thicknesses of a zirconia-based self-assembled nanodielectric was characterized using the Weibull distribution. Two regimes of breakdown behavior are observed: self-assembled multilayers >5 nm thick are well described by a single two-parameter Weibull distribution, with β ≈ 11. Multilayers ≤5 nm thick exhibit kinks on the Weibull plot of dielectric breakdown voltage, suggesting that multiple characteristic mechanisms for dielectric breakdown are present. Both the degree of uniformity and the effective dielectric breakdown field are observed to be greater for one layer than for two layers of Zr-SAND, suggesting that this multilayer is more promising for device applications.
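    As a reading aid, here is a minimal sketch of the two-parameter Weibull analysis named above, using median-rank plotting positions and a straight-line fit on the Weibull plot; the breakdown voltages are invented, and β (shape) and the characteristic voltage are the fitted quantities.

```python
# Minimal sketch of a two-parameter Weibull analysis of breakdown voltages,
# using median-rank plotting positions and a linear fit on the Weibull plot
# ln(-ln(1 - F)) versus ln(V); the slope estimates beta. Data are invented.
import numpy as np

v_bd = np.sort(np.array([3.9, 4.1, 4.2, 4.3, 4.4, 4.5, 4.5, 4.6, 4.7, 4.9]))
n = len(v_bd)
ranks = np.arange(1, n + 1)
F = (ranks - 0.3) / (n + 0.4)            # Bernard's median-rank approximation

x = np.log(v_bd)
y = np.log(-np.log(1.0 - F))
beta, intercept = np.polyfit(x, y, 1)    # slope = Weibull modulus beta
eta = np.exp(-intercept / beta)          # characteristic breakdown voltage

print(f"beta (shape) = {beta:.1f}, eta (scale) = {eta:.2f} V")
# A kink in the (x, y) scatter, as reported for the <=5 nm multilayers,
# would suggest a mixture of breakdown mechanisms rather than one Weibull mode.
```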

  15. Scaling Analysis of Time Distribution between Successive Earthquakes in Aftershock Sequences

    NASA Astrophysics Data System (ADS)

    Marekova, Elisaveta

    2016-08-01

    The earthquake inter-event time distribution is studied, using catalogs for different recent aftershock sequences. For aftershock sequences following the Modified Omori Formula (MOF), it seems clear that the inter-event distribution is a power law. The exponent of this law is estimated and proves to be higher than the theoretically expected value (2 - 1/p). Based on the analysis of the catalogs, it is determined that the probability densities of the inter-event time distribution collapse onto a single master curve when the data are rescaled with the instantaneous intensity, R(t; Mth), defined by the MOF. The curve is approximated by a gamma distribution. The collapse of the data provides a clear view of aftershock-occurrence self-similarity.

  16. Space positional and motion SRC effects: A comparison with the use of reaction time distribution analysis

    PubMed Central

    Styrkowiec, Piotr; Szczepanowski, Remigiusz

    2013-01-01

    The analysis of reaction time (RT) distributions has become a recognized standard in studies on the stimulus-response correspondence (SRC) effect, as it allows exploring how this effect changes as a function of response speed. In this study, we compared the spatial SRC effect (the classic Simon effect) with the motion SRC effect using RT distribution analysis. Four experiments were conducted in which we manipulated the factors of spatial position and motion for stimulus and response, in order to obtain a clear distinction between positional SRC and motion SRC. Results showed that these two types of SRC effects differ in their RT distribution functions: the spatial positional SRC effect showed a decreasing function, while the motion SRC effect showed an increasing function. This suggests that different types of codes underlie these two SRC effects. Potential mechanisms and processes are discussed. PMID:24605178

  17. Superstatistics analysis of the ion current distribution function: Met3PbCl influence study.

    PubMed

    Miśkiewicz, Janusz; Trela, Zenon; Przestalski, Stanisław; Karcz, Waldemar

    2010-09-01

    A novel analysis of ion current time series is proposed. It is shown that higher (second, third and fourth) statistical moments of the ion current probability distribution function (PDF) can yield new information about ion channel properties. The method is illustrated on a two-state model in which the PDFs of the component states are given by normal distributions. The proposed method was applied to the analysis of the SV cation channels of the vacuolar membrane of Beta vulgaris and to the influence of trimethyllead chloride (Met(3)PbCl) on the ion current probability distribution. Ion currents were measured by the patch-clamp technique. It was shown that Met(3)PbCl influences the variance of the open-state ion current but does not alter the PDF of the closed-state ion current. Incorporation of higher statistical moments into the standard investigation of ion channel properties is proposed.
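    The moment-based idea is straightforward to sketch: estimate the variance, skewness and kurtosis of current samples attributed to one state and compare across treatments. The traces below are simulated stand-ins for patch-clamp records.

```python
# Sketch of the idea: compare higher moments of open-state current samples
# before and after treatment. The current traces here are simulated stand-ins,
# not patch-clamp data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
control = rng.normal(loc=12.0, scale=1.0, size=5000)   # open-state current (pA)
treated = rng.normal(loc=12.0, scale=1.6, size=5000)   # wider PDF after treatment

for name, x in [("control", control), ("treated", treated)]:
    print(f"{name}: var={np.var(x):.2f}, "
          f"skew={stats.skew(x):.3f}, kurtosis={stats.kurtosis(x):.3f}")
# Under the reported result, the treated open-state variance increases while
# the closed-state PDF is unchanged (not simulated here).
```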

  18. Analysis of the melanin distribution in different ethnic groups by in vivo laser scanning microscopy

    NASA Astrophysics Data System (ADS)

    Antoniou, C.; Lademann, J.; Richter, H.; Astner, S.; Patzelt, A.; Zastrow, L.; Sterry, W.; Koch, S.

    2009-05-01

    The aim of this study was to determine whether laser scanning confocal microscopy (LSM) is able to visualize differences in melanin content and distribution across different Skin Phototypes. The investigations were carried out on six healthy volunteers with Skin Phototypes II, IV, and VI. Representative skin samples of Skin Phototypes II, V, and VI were obtained for histological analysis from remaining tissue of skin grafts and were used for LSM-pathologic correlation. LSM evaluation showed significant differences in melanin distribution among Skin Phototypes II, IV, and VI. Based on the differences in overall reflectivity and image brightness, a visual evaluation scheme showed increasing brightness of the basal and suprabasal layers with increasing Skin Phototype. The findings correlated well with the histological analysis. The results demonstrate that LSM may serve as a promising adjunctive tool for real-time assessment of melanin content and distribution in human skin, with numerous clinical applications and therapeutic and preventive implications.

  19. Residence Time Distribution Measurement and Analysis of Pilot-Scale Pretreatment Reactors for Biofuels Production: Preprint

    SciTech Connect

    Sievers, D.; Kuhn, E.; Tucker, M.; Stickel, J.; Wolfrum, E.

    2013-06-01

    Measurement and analysis of residence time distribution (RTD) data is the focus of this study, in which data collection methods were developed specifically for the pretreatment reactor environment. Augmented physical sampling and automated online detection methods were developed and applied. Both the measurement techniques themselves and the resulting RTD data are presented and discussed.

  20. An investigation on the intra-sample distribution of cotton color by using image analysis

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The colorimeter principle is widely used to measure cotton color. This method provides the sample’s color grade, but the result does not include information about the color distribution or any variation within the sample. We conducted an investigation that used an image analysis method to study the ...

  1. Spatial sensitivity analysis of snow cover data in a distributed rainfall-runoff model

    NASA Astrophysics Data System (ADS)

    Berezowski, T.; Nossent, J.; Chormański, J.; Batelaan, O.

    2015-04-01

    As the availability of spatially distributed data sets for distributed rainfall-runoff modelling is strongly increasing, more attention should be paid to the influence of the quality of the data on the calibration. While a lot of progress has been made on using distributed data in simulations of hydrological models, the sensitivity of model results to spatial data is not well understood. In this paper we develop a spatial sensitivity analysis method for spatial input data (snow cover fraction - SCF) for a distributed rainfall-runoff model, to investigate whether the model is differently subjected to SCF uncertainty in different zones of the model. The analysis focuses on the relation between SCF sensitivity and the physical and spatial parameters and processes of a distributed rainfall-runoff model. The methodology is tested for the Biebrza River catchment, Poland, for which a distributed WetSpa model is set up to simulate 2 years of daily runoff. The sensitivity analysis uses the Latin-Hypercube One-factor-At-a-Time (LH-OAT) algorithm, which employs different response functions for each spatial parameter representing a 4 × 4 km snow zone. The results show that the spatial patterns of sensitivity can be easily interpreted by the co-occurrence of different environmental factors such as geomorphology, soil texture, land use, precipitation and temperature. Moreover, the spatial pattern of sensitivity under different response functions is related to different spatial parameters and physical processes. The results clearly show that the LH-OAT algorithm is suitable for our spatial sensitivity analysis approach and that the SCF is spatially sensitive in the WetSpa model. The developed method can be easily applied to other models and other spatial data.
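    A toy sketch of the one-factor-at-a-time (OAT) step of LH-OAT for zone-wise SCF multipliers is given below; the stand-in response function replaces the WetSpa runoff simulation, and the zone count and perturbation size are invented.

```python
# Toy sketch of the OAT step of LH-OAT for zone-wise snow cover fraction (SCF)
# multipliers. The "model" here is a stand-in function; in the study the
# response comes from WetSpa runoff simulations.
import numpy as np

rng = np.random.default_rng(0)
n_zones = 16                          # e.g., 4 x 4 km snow zones

def model_response(scf):
    """Hypothetical scalar response (e.g., mean simulated runoff)."""
    weights = np.linspace(0.2, 1.0, n_zones)   # zones matter unequally
    return float(np.sum(weights * scf))

base = rng.uniform(0.8, 1.2, size=n_zones)     # one Latin-hypercube base point
y0 = model_response(base)
frac = 0.05                                    # relative OAT perturbation

sensitivity = np.empty(n_zones)
for j in range(n_zones):
    perturbed = base.copy()
    perturbed[j] *= 1.0 + frac
    # Relative partial effect of zone j on the response:
    sensitivity[j] = abs((model_response(perturbed) - y0) / (frac * y0))

print("most sensitive zone:", int(np.argmax(sensitivity)))
```

    In LH-OAT proper, this OAT loop is repeated around many Latin-hypercube base points and the partial effects are averaged, which is what makes the method robust to where in parameter space the perturbations are taken.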

  2. Infrared thermoimage analysis as real time technique to evaluate in-field pesticide spraying quality distribution

    NASA Astrophysics Data System (ADS)

    Menesatti, P.; Biocca, M.

    2007-09-01

    Tests and calibration of sprayers have been considered a very important task for reducing chemical use in agriculture and for improving plant phytosanitary protection. A reliable, affordable and easy-to-use method to observe the distribution in the field is required, and infrared thermoimage analysis can be considered a potential method based on non-contact imaging technologies. The basic idea is that the application of water colder (by 10 °C) than the leaf surface makes it possible to distinguish and measure the targeted areas by means of an infrared thermoimage analysis based on significant and time-persistent thermal differences. Trials were carried out on a hedge of Prunus laurocerasus, 2.1 m in height with a homogeneous canopy. A trailed orchard sprayer was employed with different spraying configurations. A FLIR (S40) thermocamera was used to acquire (at 50 Hz) thermal videos, from a fixed position, at a frame rate of 10 images/s, for nearly 3 min. Distribution quality was compared to the temperature differences (ΔT) obtained from the thermal images between pre-treatment and post-treatment, according to two analyses: the time trend of average ΔT values at different hedge heights, and imaging of the ΔT distribution and area coverage by k-means clustering segmentation after 30 s of spraying. The chosen spraying configuration presented a reasonably good distribution over the entire hedge height, with the exclusion of the lower (0-1 m from the ground) and upper (>1.9 m) parts. Through the segmentation of the ΔT image by k-means clustering, it was possible to obtain a more detailed and visual appreciation of the distribution quality over the entire hedge. The thermoimage analysis revealed interesting potential for evaluating distribution quality from orchard sprayers.

  3. Reliability analysis of structural ceramics subjected to biaxial flexure

    SciTech Connect

    Chao, Luen-Yuan.

    1993-01-01

    Two weakest-link fracture statistics formulations for multiaxial loading, Batdorf's flaw density and orientation distribution approach and Evans' elemental strength approach, were compared for identical fracture criteria and flaw-size distribution function. Despite some fundamental differences in the methodology used in calculating fracture probabilities for multiaxial loading, the two approaches gave identical predictions. A recent contradictory conclusion reported in the literature is shown to be incorrect. Fracture stresses of a sintered alumina and silicon nitride were assessed in qualified uniaxial (three-point and four-point) and biaxial (uniform-pressure-on-disk) flexure tests in inert conditions. The size and stress-state effects on the inert fracture stress of alumina were explained by a reliability analysis based on randomly oriented surface flaws and a mixed-mode fracture criterion. Fracture stresses of silicon nitride were in accord with a reliability analysis based on volume flaws with preferred orientation (crack plane normal to the maximum principal stress) and a normal stress fracture criterion. The preferred orientation of the flaws in silicon nitride resulted from stress-induced nucleation of cracks around pores. Alumina ceramic was also tested in deionized water at a low stressing rate (1 MPa/s). The decreased fracture stresses measured in both uniaxial and biaxial flexure tests in water as compared to the inert fracture stresses were consistent with subcritical crack growth behavior inferred from dynamic fatigue tests in water. The analysis of the size and stress-state effects on and time-dependent degradation of fracture stresses included consideration of the statistical uncertainties (90 percent confidence intervals) of the estimated Weibull (Weibull modulus, m, and characteristic strength, σ_θ) and slow-crack-growth (stress-intensity exponent, N, and critical crack growth rate, V_C) parameters.

  4. Efficient, Distributed and Interactive Neuroimaging Data Analysis Using the LONI Pipeline.

    PubMed

    Dinov, Ivo D; Van Horn, John D; Lozev, Kamen M; Magsipoc, Rico; Petrosyan, Petros; Liu, Zhizhong; Mackenzie-Graham, Allan; Eggert, Paul; Parker, Douglas S; Toga, Arthur W

    2009-01-01

    The LONI Pipeline is a graphical environment for construction, validation and execution of advanced neuroimaging data analysis protocols (Rex et al., 2003). It enables automated data format conversion, allows Grid utilization, facilitates data provenance, and provides a significant library of computational tools. There are two main advantages of the LONI Pipeline over other graphical analysis workflow architectures. It is built as a distributed Grid computing environment and permits efficient tool integration, protocol validation and broad resource distribution. To integrate existing data and computational tools within the LONI Pipeline environment, no modification of the resources themselves is required. The LONI Pipeline provides several types of process submissions based on the underlying server hardware infrastructure. Only workflow instructions and references to data, executable scripts and binary instructions are stored within the LONI Pipeline environment. This makes it portable, computationally efficient, distributed and independent of the individual binary processes involved in pipeline data-analysis workflows. We have expanded the LONI Pipeline (V.4.2) to include server-to-server (peer-to-peer) communication and a 3-tier failover infrastructure (Grid hardware, Sun Grid Engine/Distributed Resource Management Application API middleware, and the Pipeline server). Additionally, the LONI Pipeline provides three layers of background-server executions for all users/sites/systems. These new LONI Pipeline features facilitate resource-interoperability, decentralized computing, construction and validation of efficient and robust neuroimaging data-analysis workflows. Using brain imaging data from the Alzheimer's Disease Neuroimaging Initiative (Mueller et al., 2005), we demonstrate integration of disparate resources, graphical construction of complex neuroimaging analysis protocols and distributed parallel computing. The LONI Pipeline, its features, specifications

  6. Age Dating Fluvial Sediment Storage Reservoirs to Construct Sediment Waiting Time Distributions

    NASA Astrophysics Data System (ADS)

    Skalak, K.; Pizzuto, J. E.; Benthem, A.; Karwan, D. L.; Mahan, S.

    2015-12-01

    Suspended sediment transport is an important geomorphic process that can often control the transport of nutrients and contaminants. The time a particle spends in storage remains a critical knowledge gap in understanding particle trajectories through landscapes. We dated floodplain deposits in South River, VA, using fallout radionuclides (Pb-210, Cs-137), optically stimulated luminescence (OSL), and radiocarbon dating to determine sediment ages and construct sediment waiting time distributions. We have a total of 14 age dates in two eroding banks. We combine these age dates with a well-constrained history of mercury concentrations on suspended sediment in the river from an industrial release. Ages from fallout radionuclides document sedimentation from the early 1900s to the present, and agree with the history of mercury contamination. OSL dates span approximately 200 to 17,000 years old. We performed a standard Weibull analysis of nonexceedance to construct a waiting time distribution of floodplain sediment for the South River. The mean waiting time for floodplain sediment is 2930 years, while the median is approximately 710 years. When the floodplain waiting time distribution is combined with the waiting time distribution for in-channel sediment storage (available from previous studies), the mean waiting time shifts to approximately 680 years, suggesting that quantifying sediment waiting times for both channel and floodplain storage is critical in advancing knowledge of particle trajectories through watersheds.
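    The "Weibull analysis of nonexceedance" referred to here is, in common usage, based on Weibull plotting positions F_i = i/(n+1) assigned to the ranked ages. The sketch below applies that construction to invented deposit ages; the actual 14 dates and the reported 2930-year mean are from the study.

```python
# Sketch of a Weibull plotting-position analysis of nonexceedance for sediment
# ages: sort the dated deposits and assign F_i = i / (n + 1). The ages below
# are invented; the study pooled 14 dates from fallout radionuclides, OSL,
# and radiocarbon.
import numpy as np

ages_yr = np.sort(np.array([60., 90., 120., 200., 350., 520., 710., 900.,
                            1400., 2300., 4100., 7800., 12000., 17000.]))
n = len(ages_yr)
F = np.arange(1, n + 1) / (n + 1.0)   # Weibull plotting position (nonexceedance)

median_age = np.interp(0.5, F, ages_yr)
print(f"mean waiting time   ~ {ages_yr.mean():.0f} yr")
print(f"median waiting time ~ {median_age:.0f} yr")
```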

  7. Simulation of flight maneuver-load distributions by utilizing stationary, non-Gaussian random load histories

    NASA Technical Reports Server (NTRS)

    Leybold, H. A.

    1971-01-01

    Random numbers were generated with the aid of a digital computer and transformed such that the probability density function of a discrete random load history composed of these random numbers had one of the following non-Gaussian distributions: Poisson, binomial, log-normal, Weibull, and exponential. The resulting random load histories were analyzed to determine their peak statistics and were compared with cumulative peak maneuver-load distributions for fighter and transport aircraft in flight.
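    A minimal sketch of the transformation step for the continuous cases (lognormal, Weibull, exponential): uniform random numbers are mapped through the target inverse CDF so that the resulting discrete load history has the desired distribution. Parameters are invented; the discrete Poisson and binomial cases would use the corresponding quantile functions.

```python
# Sketch of the transformation idea: draw uniform random numbers and map them
# through a target inverse CDF (percent-point function) so the resulting load
# history has the desired non-Gaussian distribution. Parameters are invented.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
u = rng.uniform(size=10_000)                     # raw uniform random numbers

loads = {
    "lognormal":   stats.lognorm.ppf(u, s=0.5, scale=np.exp(1.0)),
    "weibull":     stats.weibull_min.ppf(u, c=1.5, scale=2.0),
    "exponential": stats.expon.ppf(u, scale=1.0),
}
for name, history in loads.items():
    # Each array is one stationary random load history; its peaks could then
    # be counted and compared with measured maneuver-load spectra.
    print(f"{name:11s} mean={history.mean():.2f} max={history.max():.2f}")
```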

  8. Long-term mechanical life testing of polymeric post insulators for distribution and a comparison to porcelain

    SciTech Connect

    Cherney, E.A.

    1988-07-01

    The paper presents the results and analyses of long-term cantilever strength tests on polymeric line post insulators. The time-to-failure data for static cantilever loads are represented by the Weibull distribution. The life distribution, obtained from the maximum likelihood estimates of the accelerated failure times, fits an exponential model. An extrapolation of the life distribution to normal loads provides an estimate of the strength rating and mechanical equivalence to porcelain line post insulators.
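    As a sketch of the first step described above (representing time-to-failure data by a Weibull distribution), the following fits a two-parameter Weibull by maximum likelihood to invented failure times; the accelerated-failure-time modeling and extrapolation to normal loads are beyond this fragment.

```python
# Sketch: maximum-likelihood fit of a two-parameter Weibull distribution to
# static-cantilever times-to-failure (hours, invented values), as one would
# do before extrapolating a life distribution to service loads.
import numpy as np
from scipy import stats

t_fail = np.array([310., 480., 540., 760., 990., 1200., 1450., 1900., 2600., 3400.])
shape, loc, scale = stats.weibull_min.fit(t_fail, floc=0.0)  # fix location at 0

print(f"Weibull shape (beta) = {shape:.2f}, scale (eta) = {scale:.0f} h")
# shape < 1 suggests a decreasing hazard; shape > 1, wear-out behavior.
b10 = stats.weibull_min.ppf(0.10, shape, loc=0.0, scale=scale)
print(f"B10 life (10% failure) = {b10:.0f} h")
```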

  9. Fractal analysis of the dark matter and gas distributions in the Mare-Nostrum universe

    SciTech Connect

    Gaite, José

    2010-03-01

    We develop a method of multifractal analysis of N-body cosmological simulations that improves on the customary counts-in-cells method by taking special care of the effects of discreteness and large scale homogeneity. The analysis of the Mare-Nostrum simulation with our method provides strong evidence of self-similar multifractal distributions of dark matter and gas, with a halo mass function that is of Press-Schechter type but has a power-law exponent -2, as corresponds to a multifractal. Furthermore, our analysis shows that the dark matter and gas distributions are indistinguishable as multifractals. To determine if there is any gas biasing, we calculate the cross-correlation coefficient, with negative but inconclusive results. Hence, we develop an effective Bayesian analysis connected with information theory, which clearly demonstrates that the gas is biased in a long range of scales, up to the scale of homogeneity. However, entropic measures related to the Bayesian analysis show that this gas bias is small (in a precise sense) and is such that the fractal singularities of both distributions coincide and are identical. We conclude that this common multifractal cosmic web structure is determined by the dynamics and is independent of the initial conditions.

  10. Bivariate Frequency Analysis with Nonstationary Gumbel/GEV Marginal Distributions for Rainfall Event

    NASA Astrophysics Data System (ADS)

    Joo, Kyungwon; Kim, Sunghun; Kim, Hanbeen; Ahn, Hyunjun; Heo, Jun-Haeng

    2016-04-01

    Multivariate frequency analysis of hydrological data has been developing recently. In particular, the copula model has been used as an effective method because it imposes no limitation on the choice of marginal distributions. Time-series rainfall data can be divided into rainfall events using an inter-event time definition, and each rainfall event has a rainfall depth and duration. In addition, changes in rainfall depth have been studied recently in connection with climate change. Nonstationary (time-varying) Gumbel and Generalized Extreme Value (GEV) distributions have been developed and their performances investigated in many studies. In the current study, bivariate frequency analysis is performed for rainfall depth and duration using an Archimedean copula on stationary and nonstationary hourly rainfall data, to consider the effect of climate change. The parameter of the copula model is estimated by the inference function for margins (IFM) method, and stationary/nonstationary Gumbel and GEV distributions are used as marginal distributions. As a result, level curves of the copula model are obtained, and a goodness-of-fit test is performed to choose the appropriate marginal distribution among the applied stationary and nonstationary Gumbel and GEV distributions.
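    A small illustration of one Archimedean option: for the Gumbel copula, Kendall's tau and the copula parameter are linked by τ = 1 − 1/θ, so θ can be estimated from the rank correlation of depth and duration. The data below are simulated stand-ins, and this moment-style shortcut is used in place of the IFM likelihood maximization the study applies.

```python
# Sketch of a moment-style fit of a Gumbel (Archimedean) copula to rainfall
# event depth and duration, using the relation tau = 1 - 1/theta. Depths and
# durations are simulated stand-ins for events from an inter-event time
# definition.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
duration_h = stats.gamma.rvs(a=2.0, scale=4.0, size=300, random_state=rng)
depth_mm = 2.0 * duration_h + stats.gamma.rvs(a=2.0, scale=3.0, size=300,
                                              random_state=rng)

tau, _ = stats.kendalltau(depth_mm, duration_h)
theta = 1.0 / (1.0 - tau)                     # Gumbel copula parameter
print(f"Kendall tau = {tau:.2f} -> theta = {theta:.2f}")

def gumbel_copula(u, v, theta):
    """Gumbel copula CDF C(u, v) for uniform marginals u, v."""
    return np.exp(-(((-np.log(u)) ** theta + (-np.log(v)) ** theta) ** (1.0 / theta)))

# Joint nonexceedance at the marginal 0.9 quantiles of depth and duration:
print(f"C(0.9, 0.9) = {gumbel_copula(0.9, 0.9, theta):.3f}")
```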

  11. Analysis of variance of communication latencies in anesthesia: comparing means of multiple log-normal distributions.

    PubMed

    Ledolter, Johannes; Dexter, Franklin; Epstein, Richard H

    2011-10-01

    Anesthesiologists rely on communication over periods of minutes. The analysis of latencies between when messages are sent and responses obtained is an essential component of practical and regulatory assessment of clinical and managerial decision-support systems. Latency data, including times for anesthesia providers to respond to messages, have moderate sample sizes (n > 20), large coefficients of variation (e.g., 0.60 to 2.50), and heterogeneous coefficients of variation among groups. Highly inaccurate results are obtained both by performing analysis of variance (ANOVA) on the time scale and by performing it on the log scale and then taking the exponential of the result. To overcome these difficulties, one can compute P values and confidence intervals for mean latencies based on log-normal distributions using generalized pivotal methods. In addition, fixed-effects 2-way ANOVAs can be extended to the comparison of means of log-normal distributions. Pivotal inference does not assume that the coefficients of variation of the studied log-normal distributions are the same, and can be used to assess the proportional effects of 2 factors and their interaction. Latency data can also include a human behavioral component (e.g., complete other activity first), resulting in a bimodal distribution in the log-domain (i.e., a mixture of distributions). An ANOVA can be performed on a homogeneous segment of the data, followed by a single-group analysis applied to all or portions of the data using a robust method, insensitive to the probability distribution.
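    A hedged sketch of the generalized pivotal idea for a single group: simulate pivotal quantities for the log-scale mean and variance, combine them into a pivot for the log-normal mean, and read off percentile bounds. The latencies are simulated, and the paper's two-way ANOVA extension is not shown.

```python
# Sketch of a generalized-pivotal-quantity (GPQ) confidence interval for the
# mean of a log-normal latency distribution (applied here to one group).
# Latency data are simulated stand-ins.
import numpy as np

rng = np.random.default_rng(3)
latency = rng.lognormal(mean=4.0, sigma=1.0, size=30)   # seconds, invented

y = np.log(latency)
n, ybar, s2 = len(y), y.mean(), y.var(ddof=1)

B = 100_000
Z = rng.standard_normal(B)
U = rng.chisquare(n - 1, size=B)
G_sigma2 = (n - 1) * s2 / U                  # pivotal quantity for sigma^2
G_mu = ybar - Z * np.sqrt(G_sigma2 / n)      # pivotal quantity for mu
G_mean = np.exp(G_mu + G_sigma2 / 2.0)       # pivot for E[latency]

lo, hi = np.percentile(G_mean, [2.5, 97.5])
print(f"95% CI for mean latency: ({lo:.1f}, {hi:.1f}) s")
```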

  12. Spatial sensitivity analysis of snow cover data in a distributed rainfall-runoff model

    NASA Astrophysics Data System (ADS)

    Berezowski, T.; Nossent, J.; Chormański, J.; Batelaan, O.

    2014-10-01

    As the availability of spatially distributed data sets for distributed rainfall-runoff modelling is strongly growing, more attention should be paid to the influence of the quality of the data on the calibration. While a lot of progress has been made on using distributed data in simulations of hydrological models, the sensitivity of model results to spatial data is not well understood. In this paper we develop a spatial sensitivity analysis (SA) method for snow cover fraction (SCF) input data for a distributed rainfall-runoff model, to investigate whether the model is differently subjected to SCF uncertainty in different zones of the model. The analysis was focused on the relation between SCF sensitivity and the physical and spatial parameters and processes of a distributed rainfall-runoff model. The methodology is tested for the Biebrza River catchment, Poland, for which a distributed WetSpa model is set up to simulate two years of daily runoff. The SA uses the Latin-Hypercube One-factor-At-a-Time (LH-OAT) algorithm, which uses different response functions for each 4 km × 4 km snow zone. The results show that the spatial patterns of sensitivity can be easily interpreted by the co-occurrence of different environmental factors such as geomorphology, soil texture, land use, precipitation and temperature. Moreover, the spatial pattern of sensitivity under different response functions is related to different spatial parameters and physical processes. The results clearly show that the LH-OAT algorithm is suitable for the spatial sensitivity analysis approach and that the SCF is spatially sensitive in the WetSpa model.

  13. Characterizing fibrosis in UUO mice model using multiparametric analysis of phasor distribution from FLIM images

    PubMed Central

    Ranjit, Suman; Dvornikov, Alexander; Levi, Moshe; Furgeson, Seth; Gratton, Enrico

    2016-01-01

    The phasor approach to fluorescence lifetime microscopy is used to study the development of fibrosis in the unilateral ureteral obstruction (UUO) model of kidney in mice. Traditional phasor analysis has been modified to create a multiparametric analysis scheme that splits the phasor points into four equidistant segments based on the height of the peak of the phasor distribution and calculates six parameters, including average phasor positions, the shape of each segment, the angle of the distribution, and the number of points in each segment. These parameters are used to create a spectrum of twenty-four points specific to the phasor distribution of each sample. Comparisons of spectra from diseased and healthy tissues result in quantitative separation and calculation of statistical parameters including AUC values, positive prediction values and sensitivity. This is a new method in the evolving field of phasor-distribution analysis of FLIM data and provides further insights. Additionally, the progression of fibrosis with time is detected using this multiparametric approach to phasor analysis. PMID:27699117
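    For orientation, the phasor transform underlying this analysis maps each decay to the coordinates g = ⟨cos ωt⟩ and s = ⟨sin ωt⟩, intensity-weighted, at the repetition angular frequency ω. The sketch below computes these coordinates for synthetic single-exponential decays; the repetition rate and lifetimes are assumptions.

```python
# Sketch of the phasor transform behind this kind of FLIM analysis: each
# decay I(t) maps to g = <cos>, s = <sin> at the laser repetition (angular)
# frequency. Decays below are synthetic single exponentials.
import numpy as np

f = 80e6                                     # assumed repetition rate (Hz)
omega = 2.0 * np.pi * f
t = np.linspace(0.0, 12.5e-9, 256)           # one period, 256 time bins

def phasor(decay):
    g = np.sum(decay * np.cos(omega * t)) / np.sum(decay)
    s = np.sum(decay * np.sin(omega * t)) / np.sum(decay)
    return g, s

for tau in (0.5e-9, 2.0e-9, 4.0e-9):         # lifetimes of synthetic decays
    g, s = phasor(np.exp(-t / tau))
    print(f"tau={tau*1e9:.1f} ns -> g={g:.3f}, s={s:.3f}")
# A pixel-wise histogram of (g, s) gives the phasor distribution whose peak
# height, segment shapes and angle feed the multiparametric spectrum.
```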

  14. Evaluating Domestic Hot Water Distribution System Options With Validated Analysis Models

    SciTech Connect

    Weitzel, E.; Hoeschele, M.

    2014-09-01

    A developing body of work is forming that collects data on domestic hot water consumption, water use behaviors, and energy efficiency of various distribution systems. A full distribution system model developed in TRNSYS has been validated using field monitoring data and then exercised in a number of climates to understand the impact of climate on performance. This study builds upon previous analysis and modelling work to evaluate different distribution systems and the sensitivities of water heating energy and water use efficiency to variations in climate, load, distribution type, insulation and compact plumbing practices. Overall, 124 different TRNSYS models were simulated. Of the configurations evaluated, distribution losses account for 13-29% of the total water heating energy use and water use efficiency ranges from 11-22%. The base case, an uninsulated trunk-and-branch system, sees the most improvement in energy consumption from insulating the piping and locating the water heater centrally to all fixtures. Demand recirculation systems are not projected to provide significant energy savings and in some cases increase energy consumption. Water use is most efficient with demand recirculation systems, followed by the insulated trunk-and-branch system with a central water heater. Compact plumbing practices and insulation have the most impact on energy consumption (2-6% for insulation and 3-4% per 10 gallons of enclosed volume reduced). The results of this work are useful for informing future development of water heating best practices guides as well as more accurate (and simulation-time-efficient) distribution models for annual whole-house simulation programs.

  15. An approximate marginal logistic distribution for the analysis of longitudinal ordinal data.

    PubMed

    Nooraee, Nazanin; Abegaz, Fentaw; Ormel, Johan; Wit, Ernst; van den Heuvel, Edwin R

    2016-03-01

    Subject-specific and marginal models have been developed for the analysis of longitudinal ordinal data. Subject-specific models often lack a population-average interpretation of the model parameters due to the conditional formulation of random intercepts and slopes. Marginal models frequently lack an underlying distribution for ordinal data, in particular when generalized estimating equations are applied. To overcome these issues, latent variable models underneath the ordinal outcomes with a multivariate logistic distribution can be applied. In this article, we extend the work of O'Brien and Dunson (2004), who studied the multivariate t-distribution with marginal logistic distributions. We use maximum likelihood instead of a Bayesian approach, and incorporate covariates in the correlation structure in addition to the mean model. We compared our method with GEE and demonstrated that it performs better than GEE with respect to fixed-effect parameter estimation when the latent variables have an approximately elliptical distribution, and at least as well as GEE for other types of latent variable distributions. PMID:26458164

  16. Mathematical modeling and numerical analysis of thermal distribution in arch dams considering solar radiation effect.

    PubMed

    Mirzabozorg, H; Hariri-Ardebili, M A; Shirkhan, M; Seyed-Kolbadi, S M

    2014-01-01

    The effect of solar radiation on thermal distribution in thin high arch dams is investigated. The differential equation governing the thermal behavior of mass concrete in three-dimensional space is solved applying appropriate boundary conditions. Solar radiation is implemented considering the dam face direction relative to the sun, the slope relative to the horizon, the cloud cover of the region, and the surrounding topography. It has been observed that solar radiation changes the surface temperature drastically and leads to a nonuniform temperature distribution. Solar radiation effects should be considered in the thermal transient analysis of thin arch dams.
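    As a hedged illustration of the governing heat-conduction problem, the one-dimensional explicit finite-difference sketch below adds an absorbed solar flux to the surface heat balance of a concrete wall. All property values and the radiation history are invented, and the study itself solves the full three-dimensional problem with orientation, slope and cloud-cover effects.

```python
# Hedged 1D sketch of the thermal transient idea: explicit finite differences
# through a concrete wall, with absorbed solar radiation entering the surface
# heat balance. Properties and the radiation history are invented.
import numpy as np

L, nx = 0.5, 51                      # wall thickness (m), grid points
dx = L / (nx - 1)
alpha = 1.0e-6                       # thermal diffusivity of concrete (m^2/s)
k_c = 2.5                            # conductivity (W/m/K)
h = 15.0                             # surface convection coefficient (W/m^2/K)
absorptivity = 0.6
dt = 0.4 * dx * dx / alpha           # stable explicit time step

T = np.full(nx, 15.0)                # initial temperature (C)
T_air, t = 20.0, 0.0
for _ in range(2000):
    t += dt
    q_solar = max(0.0, 800.0 * np.sin(2 * np.pi * t / 86400.0))  # toy daily cycle
    Tn = T.copy()
    Tn[1:-1] = T[1:-1] + alpha * dt / dx**2 * (T[2:] - 2 * T[1:-1] + T[:-2])
    # Sun-exposed face: convection + absorbed solar flux, via a ghost-node balance.
    q_surf = h * (T_air - T[0]) + absorptivity * q_solar
    Tn[0] = T[0] + alpha * dt / dx**2 * (2 * T[1] - 2 * T[0] + 2 * dx * q_surf / k_c)
    Tn[-1] = T[-2]                   # insulated far face (toy choice)
    T = Tn
print(f"surface {T[0]:.1f} C vs core {T[nx//2]:.1f} C after {t/3600:.1f} h")
```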

  17. Using Micro-Synchrophasor Data for Advanced Distribution Grid Planning and Operations Analysis

    SciTech Connect

    Stewart, Emma; Kiliccote, Sila; McParland, Charles; Roberts, Ciaran

    2014-07-01

    This report reviews the potential for distribution-grid phase-angle data that will be available from new micro-synchrophasors (µPMUs) to be utilized in existing distribution-grid planning and operations analysis. This data could augment the current diagnostic capabilities of grid analysis software, used in both planning and operations for applications such as fault location, and provide data for more accurate modeling of the distribution system. µPMUs are new distribution-grid sensors that will advance measurement and diagnostic capabilities and provide improved visibility of the distribution grid, enabling analysis of the grid’s increasingly complex loads that include features such as large volumes of distributed generation. Large volumes of DG lead to concerns about continued reliable operation of the grid, due to changing power flow characteristics and active generation with its own protection and control capabilities. Using µPMU data on the change in voltage phase angle between two points, in conjunction with new and existing distribution-grid planning and operational tools, is expected to enable model validation, state estimation, fault location, and renewable resource/load characterization. Our findings include: data measurement is outstripping the processing capabilities of planning and operational tools; not every tool can visualize a voltage phase-angle measurement to the degree of accuracy measured by advanced sensors, and the degree of accuracy in measurement required for the distribution grid is not defined; solving methods cannot handle the high volumes of data generated by modern sensors, so new models and solving methods (such as graph trace analysis) are needed; standardization of sensor-data communications platforms in planning and applications tools would allow integration of different vendors’ sensors and advanced measurement devices. In addition, data from advanced sources such as µPMUs could be used to validate models to improve

  19. Rank-Ordered Multifractal Analysis of Probability Distributions in Fluid Turbulence

    NASA Astrophysics Data System (ADS)

    Wu, Cheng-Chin; Chang, Tien

    2015-11-01

    Rank-Ordered Multifractal Analysis (ROMA) was introduced by Chang and Wu (2008) to describe the multifractal characteristic of intermittent events. The procedure provides a natural connection between the rank-ordered spectrum and the idea of one-parameter scaling for monofractals. This technique has successfully been applied to MHD turbulence simulations and turbulence data observed in various space plasmas. In this paper, the technique is applied to the probability distributions in the inertial range of the turbulent fluid flow, as given in the vast Johns Hopkins University (JHU) turbulence database. In addition, a refined method of finding the continuous ROMA spectrum and the scaled probability distribution function (PDF) simultaneously is introduced.

  20. Estimating the probability distribution of the incubation period for rabies using data from the 1948-1954 rabies epidemic in Tokyo.

    PubMed

    Tojinbara, Kageaki; Sugiura, K; Yamada, A; Kakitani, I; Kwan, N C L; Sugiura, K

    2016-01-01

    Data from 98 rabies cases in dogs and cats from the 1948-1954 rabies epidemic in Tokyo were used to estimate the probability distribution of the incubation period. Lognormal, gamma and Weibull distributions were used to model the incubation period. The maximum likelihood estimates of the mean incubation period ranged from 27.30 to 28.56 days across the distributions. The mean incubation period was shortest with the lognormal distribution (27.30 days) and longest with the Weibull distribution (28.56 days). The best distribution in terms of AIC value was the lognormal distribution, with a mean of 27.30 (95% CI: 23.46-31.55) days and a standard deviation of 20.20 (15.27-26.31) days. There were no significant differences between the incubation periods for dogs and cats, or between those for male and female dogs.
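    A compact sketch of the model comparison performed in this study: fit the three candidate distributions by maximum likelihood and rank them by AIC. The incubation periods below are simulated with roughly similar summary statistics, not the Tokyo case data.

```python
# Sketch: fit lognormal, gamma and Weibull distributions to incubation
# periods by maximum likelihood and rank by AIC. Data are simulated.
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
incubation_d = rng.lognormal(mean=3.1, sigma=0.65, size=98)   # days, invented

candidates = {
    "lognormal": stats.lognorm,
    "gamma": stats.gamma,
    "weibull": stats.weibull_min,
}
for name, dist in candidates.items():
    params = dist.fit(incubation_d, floc=0.0)     # two-parameter fits
    loglik = np.sum(dist.logpdf(incubation_d, *params))
    aic = 2 * (len(params) - 1) - 2 * loglik      # loc was fixed, not estimated
    print(f"{name:9s} mean={dist.mean(*params):.2f} d  AIC={aic:.1f}")
```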

  1. A Meta-Analysis of Distributed Leadership from 2002 to 2013: Theory Development, Empirical Evidence and Future Research Focus

    ERIC Educational Resources Information Center

    Tian, Meng; Risku, Mika; Collin, Kaija

    2016-01-01

    This article provides a meta-analysis of research conducted on distributed leadership from 2002 to 2013. It continues the review of distributed leadership commissioned by the English National College for School Leadership (NCSL) ("Distributed Leadership: A Desk Study," Bennett et al., 2003), which identified two gaps in the research…

  2. SpatTrack: an imaging toolbox for analysis of vesicle motility and distribution in living cells.

    PubMed

    Lund, Frederik W; Jensen, Maria Louise V; Christensen, Tanja; Nielsen, Gitte K; Heegaard, Christian W; Wüstner, Daniel

    2014-12-01

    The endocytic pathway is a complex network of highly dynamic organelles, which has traditionally been studied by quantitative fluorescence microscopy. The data generated by this method can be overwhelming and its analysis, even for the skilled microscopist, is tedious and error-prone. We developed SpatTrack, an open source, platform-independent program that collects a variety of methods for the analysis of vesicle dynamics and distribution in living cells. SpatTrack performs 2D particle tracking, trajectory analysis and fitting of diffusion models to the calculated mean square displacement. It allows for spatial analysis of detected vesicle patterns, including calculation of the radial distribution function and particle-based colocalization. Importantly, all analysis tools are supported by Monte Carlo simulations of synthetic images. This allows the user to assess the reliability of the analysis and to study alternative scenarios. We demonstrate the functionality of SpatTrack by performing a detailed imaging study of internalized fluorescence-tagged Niemann-Pick C2 (NPC2) protein in human disease fibroblasts. Using SpatTrack, we show that NPC2 rescued the cholesterol-storage phenotype from a subpopulation of late endosomes/lysosomes (LE/LYSs). This was paralleled by repositioning and active transport of NPC2-containing vesicles to the cell surface. The potential of SpatTrack for other applications in intracellular transport studies will be discussed.
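    As one example of the trajectory-analysis step that such tools automate, the sketch below computes a time-averaged mean square displacement from a simulated 2D track and fits the normal-diffusion model MSD(τ) = 4Dτ. It is an illustration of the general method, not code from SpatTrack itself.

```python
# Minimal sketch of the trajectory analysis step: compute the time-averaged
# mean square displacement (MSD) of a 2D track and fit normal diffusion,
# MSD(lag) = 4*D*lag. The trajectory is simulated Brownian motion.
import numpy as np

rng = np.random.default_rng(9)
dt = 0.1                                   # s per frame (assumed)
D_true = 0.05                              # um^2/s
steps = rng.normal(0.0, np.sqrt(2 * D_true * dt), size=(500, 2))
track = np.cumsum(steps, axis=0)           # x, y positions (um)

max_lag = 50
lags = np.arange(1, max_lag + 1)
msd = np.array([np.mean(np.sum((track[l:] - track[:-l]) ** 2, axis=1))
                for l in lags])

D_hat = np.polyfit(lags * dt, msd, 1)[0] / 4.0   # slope / 4 for 2D diffusion
print(f"estimated D = {D_hat:.3f} um^2/s (true {D_true})")
```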

  3. Principal Effects of Axial Load on Moment-Distribution Analysis of Rigid Structures

    NASA Technical Reports Server (NTRS)

    James, Benjamin Wylie

    1935-01-01

    This thesis presents the method of moment distribution modified to include the effect of axial load upon the bending moments. This modification makes it possible to analyze accurately complex structures, such as rigid fuselage trusses, that heretofore had to be analyzed by approximate formulas and empirical rules. The method is simple enough to be practicable even for complex structures, and it gives a means of analysis for continuous beams that is simpler than the extended three-moment equation now in common use. When the effect of axial load is included, it is found that the basic principles of moment distribution remain unchanged, the only difference being that the factors used, instead of being constants for a given member, become functions of the axial load. Formulas have been developed for these factors, and curves plotted so that their application requires no more work than moment distribution without axial load. Simple problems have been included to illustrate the use of the curves.

  4. Analysis of electron energy distribution function in the Linac4 H⁻ source.

    PubMed

    Mochizuki, S; Mattei, S; Nishida, K; Hatayama, A; Lettry, J

    2016-02-01

    To understand the electron energy distribution function (EEDF) in radio-frequency inductively coupled plasmas (RF-ICPs) in hydrogen negative ion sources, a detailed analysis of the EEDFs using numerical simulation and a theoretical approach based on the Boltzmann equation has been performed. It is shown that the EEDF of RF-ICPs consists of two parts: a low-energy part, which obeys a Maxwellian distribution, and a high-energy part that deviates from the Maxwellian distribution. These simulation results have been confirmed to be reasonable by the analytical approach. The results suggest that it is possible to enhance the dissociation of molecules and the resultant H(-) negative ion production by reducing the gas pressure. PMID:26931990
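    The two-part structure described above can be illustrated with a two-temperature (bi-Maxwellian) model EEDF: on a plot of ln(f(E)/√E) versus E, a Maxwellian is a straight line of slope −1/T, and a hot tail appears as a high-energy break. The temperatures and tail fraction below are invented.

```python
# Sketch of a two-temperature (bi-Maxwellian) electron energy distribution:
# a cold bulk plus a hot tail that deviates from a single Maxwellian.
import numpy as np

def maxwellian_eedf(E, T):
    """Maxwellian EEDF in energy, normalized to 1 (E, T in eV)."""
    return 2.0 * np.sqrt(E / np.pi) * T ** (-1.5) * np.exp(-E / T)

E = np.linspace(0.01, 40.0, 400)
T_bulk, T_tail, tail_frac = 2.0, 8.0, 0.05
f = (1 - tail_frac) * maxwellian_eedf(E, T_bulk) + tail_frac * maxwellian_eedf(E, T_tail)

# On a plot of ln(f / sqrt(E)) vs E, a pure Maxwellian is a straight line of
# slope -1/T; the hot tail shows up as an upward break at high energy.
slope_low = np.polyfit(E[E < 5], np.log(f[E < 5] / np.sqrt(E[E < 5])), 1)[0]
print(f"low-energy slope -> T ~ {-1.0/slope_low:.1f} eV (bulk)")
```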

  6. Spatial analysis of the distribution of intestinal nematode infections in Uganda.

    PubMed Central

    Brooker, S.; Kabatereine, N. B.; Tukahebwa, E. M.; Kazibwe, F.

    2004-01-01

    The spatial epidemiology of intestinal nematodes in Uganda was investigated using generalized additive models and geostatistical methods. The prevalence of Ascaris lumbricoides and Trichuris trichiura was unevenly distributed in the country, with prevalence greatest in southwest Uganda, whereas hookworm was more homogeneously distributed. A. lumbricoides and T. trichiura prevalence were nonlinearly related to satellite sensor-based estimates of land surface temperature; hookworm was nonlinearly associated with rainfall. Semivariogram analysis indicated that T. trichiura prevalence exhibited no spatial structure and that A. lumbricoides exhibited some spatial dependency at small spatial distances, once large-scale, mainly environmental, trends had been removed. In contrast, there was much more spatial structure in hookworm prevalence, although the underlying factors are at present unclear. The implications of the results are discussed in relation to parasite spatial epidemiology and the prediction of infection distributions. PMID:15635963
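    For reference, the empirical semivariogram used in this kind of analysis is γ(h) = ½·mean[(z_i − z_j)²] over pairs of locations separated by roughly h. The sketch below computes binned semivariogram estimates from simulated coordinates and residuals, not the Ugandan survey data.

```python
# Sketch of the empirical semivariogram used to check spatial structure in
# prevalence residuals. Coordinates and values are simulated stand-ins.
import numpy as np
from scipy.spatial.distance import pdist

rng = np.random.default_rng(11)
xy = rng.uniform(0.0, 100.0, size=(200, 2))        # survey locations (km, toy)
z = rng.normal(0.3, 0.1, size=200)                 # prevalence residuals (toy)

d = pdist(xy)                                      # pairwise distances
g = 0.5 * pdist(z[:, None], metric="sqeuclidean")  # 0.5*(z_i - z_j)^2 per pair

bins = np.linspace(0.0, 50.0, 11)
idx = np.digitize(d, bins)
for b in range(1, len(bins)):
    mask = idx == b
    if mask.any():
        print(f"h ~ {bins[b-1]:4.0f}-{bins[b]:4.0f} km: gamma = {g[mask].mean():.4f}")
# A flat semivariogram (as for T. trichiura) indicates no spatial dependence;
# gamma rising with lag before leveling off indicates spatial structure.
```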

  7. Comparative Study on the Selection Criteria for Fitting Flood Frequency Distribution Models with Emphasis on Upper-Tail Behavior

    NASA Astrophysics Data System (ADS)

    Xiaohong, C.

    2014-12-01

    Many probability distributions have been proposed for flood frequency analysis, and several criteria have been used for selecting the distribution that best fits an observed or generated data set. The upper tail of the flood frequency distribution is of particular concern for flood control. However, different model selection criteria often identify different optimal distributions when the focus is on the upper tail of the flood frequency distribution. In this study, with emphasis on upper-tail behavior, 5 distribution selection criteria, including 2 hypothesis tests and 3 information-based criteria, are evaluated in selecting the best fitted distribution from 8 widely used distributions (Pearson 3, log-Pearson 3, two-parameter lognormal, three-parameter lognormal, Gumbel, Weibull, generalized extreme value and generalized logistic distributions), using datasets from the Thames River (UK), Wabash River (USA), and Beijiang and Huai Rivers (China), all of which lie within latitudes of 23.5-66.5 degrees north. The performance of the 5 selection criteria is verified using a composite criterion focused on upper-tail events, defined in this study. This paper shows the approach for the optimal selection of suitable flood frequency distributions for different river basins. Results illustrate that (1) different distributions are selected by the hypothesis tests and the information-based criteria for each river; (2) the information-based criteria perform better than hypothesis tests in most cases when the focus is on the goodness of predictions of the extreme upper-tail events; (3) in order to decide on a particular distribution to fit the high flows, it would be better to use combined criteria, in which the information-based criteria are used first to rank the models and the results are then inspected by hypothesis testing methods. In addition, if the information-based criteria and hypothesis tests provide different results, the composite criterion will be taken for
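    The combined recipe in point (3) can be sketched as follows: fit each candidate distribution by maximum likelihood, rank by AIC, then screen the ranking with a Kolmogorov-Smirnov test. The annual maxima are simulated, the candidate set is a subset of the eight distributions, and the KS p-values are optimistic because the parameters are estimated from the same data.

```python
# Sketch of the combined selection recipe: rank candidate distributions by
# AIC, then screen the ranking with a Kolmogorov-Smirnov test.
import numpy as np
from scipy import stats

rng = np.random.default_rng(13)
ams = stats.genextreme.rvs(c=-0.1, loc=1000.0, scale=300.0, size=60,
                           random_state=rng)     # annual max flows (m^3/s, toy)

candidates = {"GEV": stats.genextreme, "Gumbel": stats.gumbel_r,
              "Weibull": stats.weibull_min, "LN3": stats.lognorm}
results = []
for name, dist in candidates.items():
    params = dist.fit(ams)
    aic = 2 * len(params) - 2 * np.sum(dist.logpdf(ams, *params))
    # Note: parameters were estimated from the same data, so this KS p-value
    # is optimistic; it serves only as a screening check.
    ks_p = stats.kstest(ams, dist.cdf, args=params).pvalue
    results.append((aic, name, ks_p))

for aic, name, p in sorted(results):
    print(f"{name:7s} AIC={aic:8.1f}  KS p={p:.2f}")
```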

  8. Development of a Web Service for Analysis in a Distributed Network

    PubMed Central

    Jiang, Xiaoqian; Wu, Yuan; Marsolo, Keith; Ohno-Machado, Lucila

    2014-01-01

    Objective: We describe functional specifications and practicalities in the software development process for a web service that allows the construction of a multivariate logistic regression model, Grid Logistic Regression (GLORE), by aggregating partial estimates from distributed sites, with no exchange of patient-level data. Background: We recently developed and published a web service for model construction and data analysis in a distributed environment. That paper provided an overview of the system that is useful for users, but included very few details that are relevant for biomedical informatics developers or network security personnel who may be interested in implementing this or similar systems. We focus here on how the system was conceived and implemented. Methods: We followed a two-stage development approach by first implementing the backbone system and then incrementally improving the user experience through interactions with potential users during the development. Our system went through various stages such as proof of concept, algorithm validation, user interface development, and system testing. We used the Zoho Project management system to track tasks and milestones. We leveraged Google Code and Apache Subversion to share code among team members, and developed an applet-servlet architecture to support cross-platform deployment. Discussion: During the development process, we encountered challenges such as Information Technology (IT) infrastructure gaps and limited team experience in user-interface design. We identified solutions as well as enabling factors to support the translation of an innovative privacy-preserving, distributed modeling technology into a working prototype. Conclusion: Using GLORE (a distributed model that we developed earlier) as a pilot example, we demonstrated the feasibility of building and integrating distributed modeling technology into a usable framework that can support privacy-preserving, distributed data analysis among
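    The aggregation idea behind GLORE can be sketched independently of the web service: because the logistic log-likelihood is additive over sites, each site can send only its gradient and Hessian contributions for the current coefficient vector, and a central Newton-Raphson update reproduces the pooled estimate. The code below is a conceptual stand-in with simulated data, not the published implementation.

```python
# Conceptual sketch of GLORE-style distributed logistic regression: each site
# shares only its gradient and Hessian contributions per Newton-Raphson
# iteration, never patient-level rows. Data are simulated.
import numpy as np

rng = np.random.default_rng(17)
beta_true = np.array([-1.0, 2.0, -0.5])

def make_site(n):
    X = np.column_stack([np.ones(n), rng.normal(size=(n, 2))])
    p = 1.0 / (1.0 + np.exp(-X @ beta_true))
    return X, rng.binomial(1, p)

sites = [make_site(n) for n in (120, 200, 80)]      # three hospitals (toy)

def site_contrib(X, y, beta):
    p = 1.0 / (1.0 + np.exp(-X @ beta))
    grad = X.T @ (y - p)                        # score contribution
    hess = X.T @ (X * (p * (1 - p))[:, None])   # information contribution
    return grad, hess

beta = np.zeros(3)
for _ in range(10):                        # central Newton-Raphson iterations
    g = np.zeros(3); H = np.zeros((3, 3))
    for X, y in sites:                     # only aggregates cross the wire
        gi, Hi = site_contrib(X, y, beta)
        g += gi; H += Hi
    beta += np.linalg.solve(H, g)
print("pooled-equivalent estimate:", np.round(beta, 2))
```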

  9. Phenotype Clustering of Breast Epithelial Cells in Confocal Imagesbased on Nuclear Protein Distribution Analysis

    SciTech Connect

    Long, Fuhui; Peng, Hanchuan; Sudar, Damir; Levievre, Sophie A.; Knowles, David W.

    2006-09-05

    Background: The distribution of chromatin-associated proteins plays a key role in directing nuclear function. Previously, we developed an image-based method to quantify the nuclear distributions of proteins and showed that these distributions depended on the phenotype of human mammary epithelial cells. Here we describe a method that creates a hierarchical tree of the given cell phenotypes and calculates the statistical significance between them, based on clustering analysis of nuclear protein distributions. Results: Nuclear distributions of nuclear mitotic apparatus protein were previously obtained for non-neoplastic S1 and malignant T4-2 human mammary epithelial cells cultured for up to 12 days. Cell phenotype was defined as S1 or T4-2 and the number of days in culture. A probabilistic ensemble approach was used to define a set of consensus clusters from the results of multiple traditional cluster analysis techniques applied to the nuclear distribution data. Cluster histograms were constructed to show how cells in any one phenotype were distributed across the consensus clusters. Grouping various phenotypes allowed us to build phenotype trees and calculate the statistical difference between each group. The results showed that non-neoplastic S1 cells could be distinguished from malignant T4-2 cells with 94.19 percent accuracy; that proliferating S1 cells could be distinguished from differentiated S1 cells with 92.86 percent accuracy; and showed no significant difference between the various phenotypes of T4-2 cells corresponding to increasing tumor sizes. Conclusion: This work presents a cluster analysis method that can identify significant cell phenotypes, based on the nuclear distribution of specific proteins, with high accuracy.

  10. Validation results of the IAG Dancer project for distributed GPS analysis

    NASA Astrophysics Data System (ADS)

    Boomkamp, H.

    2012-12-01

    The number of permanent GPS stations in the world has grown far too large to allow processing of all this data at analysis centers. The majority of these GPS sites do not even make their observation data available to the analysis centers, for various valid reasons. The current ITRF solution is still based on centralized analysis by the IGS, and subsequent densification of the reference frame via regional network solutions. Minor inconsistencies in analysis methods, software systems and data quality imply that this centralized approach is unlikely to ever reach the ambitious accuracy objectives of GGOS. The dependence on published data also makes it clear that a centralized approach will never provide a true global ITRF solution for all GNSS receivers in the world. If the data does not come to the analysis, the only alternative is to bring the analysis to the data. The IAG Dancer project has implemented a distributed GNSS analysis system on the internet in which each receiver can have its own analysis center in the form of a freely distributed JAVA peer-to-peer application. Global parameters for satellite orbits, clocks and polar motion are solved via a distributed least squares solution among all participating receivers. A Dancer instance can run on any computer that has simultaneous access to the receiver data and to the public internet. In the future, such a process may be embedded in the receiver firmware directly. GPS network operators can join the Dancer ITRF realization without having to publish their observation data or estimation products. GPS users can run a Dancer process without contributing to the global solution, to have direct access to the ITRF in near real-time. The Dancer software has been tested on-line since late 2011. A global network of processes has gradually evolved to allow stabilization and tuning of the software in order to reach a fully operational system. This presentation reports on the current performance of the Dancer system, and

  11. A landscape analysis of cougar distribution and abundance in Montana, USA.

    PubMed

    Riley, S J; Malecki, R A

    2001-09-01

    Recent growth in the distribution and abundance of cougars (Puma concolor) throughout western North America has created opportunities, challenges, and problems for wildlife managers and raises questions about what factors affect cougar populations. We present an analysis of factors thought to affect cougar distribution and abundance across the broad geographical scales on which most population management decisions are made. Our objectives were to: (1) identify and evaluate landscape parameters that can be used to predict the capability of habitats to support cougars, and (2) evaluate factors that may account for the recent expansion in cougar numbers. Habitat values based on terrain ruggedness and forested cover explained 73% of the variation in a cougar abundance index. Indices of cougar abundance also were spatially and temporally correlated with ungulate abundance. An increase in the number and total biomass of ungulate prey species is hypothesized to account for recent increases in cougars. Cougar populations in Montana are coping with land development by humans when other components of habitat and prey populations are sufficient. Our analysis provides a better understanding of what may have influenced recent growth in cougar distribution and abundance in Montana and, when combined with insights about stakeholder acceptance capacity, offers a basis for cougar management at broad scales. Long-term conservation of cougars necessitates a better understanding of ecosystem functions that affect prey distribution and abundance, more accurate estimates of cougar populations, and management abilities to integrate these components with human values.

  12. Modeling Exon-Specific Bias Distribution Improves the Analysis of RNA-Seq Data.

    PubMed

    Liu, Xuejun; Zhang, Li; Chen, Songcan

    2015-01-01

    RNA-seq technology has become an important tool for quantifying the gene and transcript expression in transcriptome studies. The two major difficulties for gene and transcript expression quantification are the read mapping ambiguity and the overdispersion of the read distribution along the reference sequence. Many approaches have been proposed to deal with these difficulties. A number of existing methods use a Poisson distribution to model the read counts, which makes it easy to split the counts into the contributions from multiple transcripts. Meanwhile, various solutions were put forward to account for the overdispersion in the Poisson models. By checking the similarities among the variation patterns of read counts for individual genes, we found that the count variation is exon-specific and has a conserved pattern across the samples for each individual gene. We introduce Gamma-distributed latent variables to model the read sequencing preference for each exon. These variables are embedded in the rate parameter of a Poisson model to account for the overdispersion of the read distribution. The model is tractable since the Gamma priors can be integrated out in the maximum likelihood estimation. We evaluate the proposed approach, PGseq, using four real datasets and one simulated dataset, and compare its performance with other popular methods. Results show that PGseq presents competitive performance compared to other alternatives in terms of accuracy in the gene and transcript expression calculation and in the downstream differential expression analysis. In particular, we show the advantage of our method in the analysis of low expression. PMID:26448625
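
    Integrating a Gamma-distributed rate out of a Poisson likelihood yields a negative binomial marginal. The sketch below shows that bookkeeping in generic form; the variable names, counts, and parameter values are invented for illustration and are not PGseq's actual quantities:

        import numpy as np
        from scipy.special import gammaln

        # Gamma-Poisson marginal: k ~ Poisson(lam * g) with g ~ Gamma(a, a)
        # (mean 1). Integrating g out analytically gives a negative binomial,
        # where 'a' acts as an exon-specific dispersion parameter.
        def log_nb(k, lam, a):
            return (gammaln(k + a) - gammaln(a) - gammaln(k + 1)
                    + a * np.log(a / (a + lam)) + k * np.log(lam / (a + lam)))

        counts = np.array([12, 30, 7, 55])   # reads on four exons of one gene
        expr = 20.0                          # expected rate from expression level
        disp = 5.0                           # bias concentration (smaller = noisier)
        print(log_nb(counts, expr, disp).sum())

    The tractability claim in the abstract corresponds to the fact that this marginal has a closed form, so no latent variable needs to be sampled or numerically integrated during maximum likelihood estimation.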

  13. First results and analysis of collective Thomson scattering (CTS) fast ion distribution measurements on ASDEX Upgrade

    NASA Astrophysics Data System (ADS)

    Meo, F.; Stejner, M.; Salewski, M.; Bindslev, H.; Eich, T.; Furtula, V.; Korsholm, S. B.; Leuterer, F.; Leipold, F.; Michelsen, P. K.; Moseev, D.; Nielsen, S. K.; Reiter, B.; Stober, J.; Wagner, D.; Woskov, P.; ASDEX Upgrade Team

    2010-05-01

    Experimental knowledge of the fast ion physics in magnetically confined plasmas is essential. The collective Thomson scattering (CTS) diagnostic is capable of measuring localized 1D ion velocity distributions and anisotropies dependent on the angle to the magnetic field. The CTS installed at ASDEX Upgrade (AUG) uses mm-waves generated by the 1 MW dual-frequency gyrotron. The successful commissioning of the CTS at AUG enabled first scattering experiments and the consequent milestone of the first fast ion distribution measurements on AUG, presented in this paper. The first fast ion distribution results have already uncovered some physics of confined fast ions at the plasma centre with off-axis neutral beam heating. However, CTS experiments on AUG H-mode plasmas have also uncovered some unexpected signals not related to scattering that required additional analysis and treatment of the data. These secondary emission signals are generated from the plasma-gyrotron interaction and therefore contain additional physics. Although these signals complicate the fast ion analysis, they do not prevent the diagnostic from inferring the fast ion distribution function on AUG.

  14. Sensitivity analysis for large-deflection and postbuckling responses on distributed-memory computers

    NASA Technical Reports Server (NTRS)

    Watson, Brian C.; Noor, Ahmed K.

    1995-01-01

    A computational strategy is presented for calculating sensitivity coefficients for the nonlinear large-deflection and postbuckling responses of laminated composite structures on distributed-memory parallel computers. The strategy is applicable to any message-passing distributed computational environment. The key elements of the proposed strategy are: (1) a multiple-parameter reduced basis technique; (2) a parallel sparse equation solver based on a nested dissection (or multilevel substructuring) node ordering scheme; and (3) a multilevel parallel procedure for evaluating hierarchical sensitivity coefficients. The hierarchical sensitivity coefficients measure the sensitivity of the composite structure response to variations in three sets of interrelated parameters; namely, laminate, layer and micromechanical (fiber, matrix, and interface/interphase) parameters. The effectiveness of the strategy is assessed by performing hierarchical sensitivity analysis for the large-deflection and postbuckling responses of stiffened composite panels with cutouts on three distributed-memory computers. The panels are subjected to combined mechanical and thermal loads. The numerical studies presented demonstrate the advantages of the reduced basis technique for hierarchical sensitivity analysis on distributed-memory machines.

  15. Stress distribution on a valgus knee prosthetic inclined interline -- a finite element analysis.

    PubMed

    Orban, H; Stan, G; Gruionu, L; Orban, C

    2013-01-01

    Total knee arthroplasty following valgus deformity is a challenging procedure due to the unique set of problems that must be addressed. The aim of this study is to determine, with a finite element analysis, the load distribution for an inclined valgus prosthetic balanced knee and to compare these results with those of a prosthetic balanced knee with an uninclined interline. Computational simulations, using finite element analysis, focused on a comparison between load intensity and distribution for these situations. We studied valgus inclination at 3 and 8 degrees. We noticed that for an inclination of 3 degrees, the forces are distributed almost symmetrically on both condyles, similar to the distribution of forces in the uninclined interline case. The maximum contact pressure is greater, increasing from 15 MPa to 19.3 MPa (28%). At 8 degrees of inclination, the contact patch moved anterolaterally on the tibia, meaning that the tibial condyles will be unequally loaded. The maximum contact pressure increases to 25 MPa (66%). These greater forces could lead to polyethylene wear and collapse. Additional tibial resection could be a useful method for balancing in severe valgus knee, when valgus inclination does not exceed 3 degrees. PMID:23464776

  16. Modeling Exon-Specific Bias Distribution Improves the Analysis of RNA-Seq Data

    PubMed Central

    Liu, Xuejun; Zhang, Li; Chen, Songcan

    2015-01-01

    RNA-seq technology has become an important tool for quantifying the gene and transcript expression in transcriptome studies. The two major difficulties for gene and transcript expression quantification are the read mapping ambiguity and the overdispersion of the read distribution along the reference sequence. Many approaches have been proposed to deal with these difficulties. A number of existing methods use a Poisson distribution to model the read counts, which makes it easy to split the counts into the contributions from multiple transcripts. Meanwhile, various solutions were put forward to account for the overdispersion in the Poisson models. By checking the similarities among the variation patterns of read counts for individual genes, we found that the count variation is exon-specific and has a conserved pattern across the samples for each individual gene. We introduce Gamma-distributed latent variables to model the read sequencing preference for each exon. These variables are embedded in the rate parameter of a Poisson model to account for the overdispersion of the read distribution. The model is tractable since the Gamma priors can be integrated out in the maximum likelihood estimation. We evaluate the proposed approach, PGseq, using four real datasets and one simulated dataset, and compare its performance with other popular methods. Results show that PGseq presents competitive performance compared to other alternatives in terms of accuracy in the gene and transcript expression calculation and in the downstream differential expression analysis. In particular, we show the advantage of our method in the analysis of low expression. PMID:26448625

  17. Impact of hadronic and nuclear corrections on global analysis of spin-dependent parton distributions

    SciTech Connect

    Jimenez-Delgado, Pedro; Accardi, Alberto; Melnitchouk, Wally

    2014-02-01

    We present the first results of a new global next-to-leading order analysis of spin-dependent parton distribution functions from the most recent world data on inclusive polarized deep-inelastic scattering, focusing in particular on the large-x and low-Q^2 regions. By directly fitting polarization asymmetries we eliminate biases introduced by using polarized structure function data extracted under nonuniform assumptions for the unpolarized structure functions. For analysis of the large-x data we implement nuclear smearing corrections for deuterium and 3He nuclei, and systematically include target mass and higher twist corrections to the g_1 and g_2 structure functions at low Q^2. We also explore the effects of Q^2 and W^2 cuts in the data sets, and the potential impact of future data on the behavior of the spin-dependent parton distributions at large x.

  18. Identifying synonymy between SNOMED clinical terms of varying length using distributional analysis of electronic health records.

    PubMed

    Henriksson, Aron; Conway, Mike; Duneld, Martin; Chapman, Wendy W

    2013-01-01

    Medical terminologies and ontologies are important tools for natural language processing of health record narratives. To account for the variability of language use, synonyms need to be stored in a semantic resource as textual instantiations of a concept. Developing such resources manually is, however, prohibitively expensive and likely to result in low coverage. To facilitate and expedite the process of lexical resource development, distributional analysis of large corpora provides a powerful data-driven means of (semi-)automatically identifying semantic relations, including synonymy, between terms. In this paper, we demonstrate how distributional analysis of a large corpus of electronic health records - the MIMIC-II database - can be employed to extract synonyms of SNOMED CT preferred terms. A distinctive feature of our method is its ability to identify synonymous relations between terms of varying length.
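
    The core idea of distributional analysis is that terms occurring in similar contexts receive similar vector representations. A minimal single-token sketch, assuming a toy two-sentence corpus and a +/-2 word window rather than the MIMIC-II setup, and ignoring the paper's handling of multiword terms:

        from collections import Counter
        import math

        corpus = [
            "patient denies chest pain on exertion".split(),
            "patient reports thoracic pain on exertion".split(),
        ]

        # Bag of context words observed within a fixed window around a term.
        def context_vector(term, sentences, window=2):
            ctx = Counter()
            for sent in sentences:
                for i, tok in enumerate(sent):
                    if tok == term:
                        ctx.update(sent[max(0, i - window):i] + sent[i + 1:i + 1 + window])
            return ctx

        # Cosine similarity between two sparse context vectors.
        def cosine(u, v):
            dot = sum(u[w] * v[w] for w in u if w in v)
            norm = (math.sqrt(sum(x * x for x in u.values()))
                    * math.sqrt(sum(x * x for x in v.values())))
            return dot / norm if norm else 0.0

        print(cosine(context_vector("chest", corpus), context_vector("thoracic", corpus)))

    In this toy corpus "chest" and "thoracic" share most of their context words, so their cosine similarity is high; ranking candidate terms by this score is the basic mechanism behind distributional synonym extraction.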

  19. Identifying Synonymy between SNOMED Clinical Terms of Varying Length Using Distributional Analysis of Electronic Health Records

    PubMed Central

    Henriksson, Aron; Conway, Mike; Duneld, Martin; Chapman, Wendy W.

    2013-01-01

    Medical terminologies and ontologies are important tools for natural language processing of health record narratives. To account for the variability of language use, synonyms need to be stored in a semantic resource as textual instantiations of a concept. Developing such resources manually is, however, prohibitively expensive and likely to result in low coverage. To facilitate and expedite the process of lexical resource development, distributional analysis of large corpora provides a powerful data-driven means of (semi-)automatically identifying semantic relations, including synonymy, between terms. In this paper, we demonstrate how distributional analysis of a large corpus of electronic health records – the MIMIC-II database – can be employed to extract synonyms of SNOMED CT preferred terms. A distinctive feature of our method is its ability to identify synonymous relations between terms of varying length. PMID:24551362

  20. Risk analysis of highly combustible gas storage, supply, and distribution systems in PWR plants

    SciTech Connect

    Simion, G.P.; VanHorn, R.L.; Smith, C.L.; Bickel, J.H.; Sattison, M.B.; Bulmahn, K.D.

    1993-06-01

    This report presents the evaluation of the potential safety concerns for pressurized water reactors (PWRs) identified in Generic Safety Issue 106, Piping and the Use of Highly Combustible Gases in Vital Areas. A Westinghouse four-loop PWR plant was analyzed for the risk due to the use of combustible gases (predominantly hydrogen) within the plant. The analysis evaluated an actual hydrogen distribution configuration and conducted several sensitivity studies to determine the potential variability among PWRs. The sensitivity studies were based on hydrogen and safety-related equipment configurations observed at other PWRs within the United States. Several options for improving the hydrogen distribution system design were identified and evaluated for their effect on risk and core damage frequency. A cost/benefit analysis was performed to determine whether alternatives considered were justifiable based on the safety improvement and economics of each possible improvement.

  1. Principal Components Analysis on the spectral Bidirectional Reflectance Distribution Function of ceramic colour standards.

    PubMed

    Ferrero, A; Campos, J; Rabal, A M; Pons, A; Hernanz, M L; Corróns, A

    2011-09-26

    The Bidirectional Reflectance Distribution Function (BRDF) is essential to characterize an object's reflectance properties. This function depends both on the various illumination-observation geometries as well as on the wavelength. As a result, the comprehensive interpretation of the data becomes rather complex. In this work we assess the use of the multivariable analysis technique of Principal Components Analysis (PCA) applied to the experimental BRDF data of a ceramic colour standard. It will be shown that the result may be linked to the various reflection processes occurring on the surface, assuming that the incoming spectral distribution is affected by each one of these processes in a specific manner. Moreover, this procedure facilitates the task of interpolating a series of BRDF measurements obtained for a particular sample.
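
    For readers unfamiliar with the technique, PCA of a geometry-by-wavelength BRDF matrix can be carried out with a plain singular value decomposition. The array shapes below are invented for illustration and are not the paper's measurement grid:

        import numpy as np

        rng = np.random.default_rng(1)
        brdf = rng.random((180, 71))          # 180 geometries x 71 wavelengths

        # Center on the mean spectrum, then decompose; each right singular
        # vector is a spectral signature, each score says how strongly a
        # given illumination/observation geometry excites it.
        mean_spectrum = brdf.mean(axis=0)
        centered = brdf - mean_spectrum
        U, s, Vt = np.linalg.svd(centered, full_matrices=False)

        explained = s**2 / np.sum(s**2)       # fraction of variance per component
        scores = U * s                        # geometry-dependent weights
        components = Vt                       # wavelength-dependent basis spectra
        print(explained[:3])

    Linking each retained component to a physical reflection process, as the abstract describes, then amounts to interpreting the leading rows of Vt and the geometry dependence of their scores.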

  2. A performance analysis method for distributed real-time robotic systems: A case study of remote teleoperation

    NASA Technical Reports Server (NTRS)

    Lefebvre, D. R.; Sanderson, A. C.

    1994-01-01

    Robot coordination and control systems for remote teleoperation applications are by necessity implemented on distributed computers. Modeling and performance analysis of these distributed robotic systems is difficult, but important for economic system design. Performance analysis methods originally developed for conventional distributed computer systems are often unsatisfactory for evaluating real-time systems. The paper introduces a formal model of distributed robotic control systems; and a performance analysis method, based on scheduling theory, which can handle concurrent hard-real-time response specifications. Use of the method is illustrated by a case of remote teleoperation which assesses the effect of communication delays and the allocation of robot control functions on control system hardware requirements.
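
    The paper's own analysis method is not reproduced here, but a classic example of the scheduling-theory tests such methods build on is the Liu and Layland utilization bound for rate-monotonic priorities. The task set below is invented:

        # Each task is (computation_time, period), in consistent time units.
        tasks = [(2.0, 10.0), (3.0, 15.0), (5.0, 35.0)]

        n = len(tasks)
        utilization = sum(c / t for c, t in tasks)
        bound = n * (2 ** (1.0 / n) - 1)      # sufficient (not necessary) test

        print(f"U = {utilization:.3f}, bound = {bound:.3f}")
        if utilization <= bound:
            print("schedulable under rate-monotonic priorities")
        else:
            print("inconclusive; a finer response-time analysis is needed")

    Tests of this kind let a designer check whether hard-real-time response specifications can be met on a proposed hardware allocation before anything is built, which is the role scheduling analysis plays in the paper's case study.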

  3. Systematic analysis of mutation distribution in three dimensional protein structures identifies cancer driver genes

    PubMed Central

    Fujimoto, Akihiro; Okada, Yukinori; Boroevich, Keith A.; Tsunoda, Tatsuhiko; Taniguchi, Hiroaki; Nakagawa, Hidewaki

    2016-01-01

    Protein tertiary structure determines molecular function, interaction, and stability of the protein; therefore, the distribution of mutations in the tertiary structure can facilitate the identification of new driver genes in cancer. To analyze the mutation distribution in protein tertiary structures, we applied a novel three dimensional permutation test to the mutation positions. We analyzed somatic mutation datasets of 21 types of cancers obtained from exome sequencing conducted by the TCGA project. Of the 3,622 genes that had ≥3 mutations in the regions with tertiary structure data, 106 genes showed significant skew in mutation distribution. Known tumor suppressors and oncogenes were significantly enriched in these identified cancer gene sets. Physical distances between mutations in known oncogenes were significantly smaller than those of tumor suppressors. Twenty-three genes were detected in multiple cancers. Candidate genes with significant skew of the 3D mutation distribution included kinases (MAPK1, EPHA5, ERBB3, and ERBB4), an apoptosis related gene (APP), an RNA splicing factor (SF1), a miRNA processing factor (DICER1), an E3 ubiquitin ligase (CUL1) and transcription factors (KLF5 and EEF1B2). Our study suggests that systematic analysis of mutation distribution in the tertiary protein structure can help identify cancer driver genes. PMID:27225414
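
    A 3D permutation test of the kind described can be sketched as follows: compare the mean pairwise distance between mutated residues against mutations re-drawn at random residue positions of the same structure. Coordinates and counts are invented, and the paper's exact test statistic may differ:

        import numpy as np

        rng = np.random.default_rng(0)
        structure = rng.random((300, 3)) * 50        # CA coordinates of 300 residues
        mutated_idx = np.array([10, 12, 15, 18, 22]) # observed mutation positions

        # Mean of all pairwise Euclidean distances among a set of points.
        def mean_pairwise_dist(points):
            diff = points[:, None, :] - points[None, :, :]
            d = np.sqrt((diff ** 2).sum(-1))
            return d[np.triu_indices(len(points), k=1)].mean()

        observed = mean_pairwise_dist(structure[mutated_idx])
        null = np.array([
            mean_pairwise_dist(structure[rng.choice(len(structure),
                                                    mutated_idx.size,
                                                    replace=False)])
            for _ in range(2000)
        ])
        # One-sided: are the observed mutations more clustered than random?
        p_value = (np.sum(null <= observed) + 1) / (null.size + 1)
        print(observed, p_value)

    The finding that oncogene mutations sit closer together than tumor-suppressor mutations corresponds, in this framing, to a smaller observed statistic relative to the permutation null.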

  4. Systematic analysis of mutation distribution in three dimensional protein structures identifies cancer driver genes.

    PubMed

    Fujimoto, Akihiro; Okada, Yukinori; Boroevich, Keith A; Tsunoda, Tatsuhiko; Taniguchi, Hiroaki; Nakagawa, Hidewaki

    2016-01-01

    Protein tertiary structure determines molecular function, interaction, and stability of the protein, therefore distribution of mutation in the tertiary structure can facilitate the identification of new driver genes in cancer. To analyze mutation distribution in protein tertiary structures, we applied a novel three dimensional permutation test to the mutation positions. We analyzed somatic mutation datasets of 21 types of cancers obtained from exome sequencing conducted by the TCGA project. Of the 3,622 genes that had ≥3 mutations in the regions with tertiary structure data, 106 genes showed significant skew in mutation distribution. Known tumor suppressors and oncogenes were significantly enriched in these identified cancer gene sets. Physical distances between mutations in known oncogenes were significantly smaller than those of tumor suppressors. Twenty-three genes were detected in multiple cancers. Candidate genes with significant skew of the 3D mutation distribution included kinases (MAPK1, EPHA5, ERBB3, and ERBB4), an apoptosis related gene (APP), an RNA splicing factor (SF1), a miRNA processing factor (DICER1), an E3 ubiquitin ligase (CUL1) and transcription factors (KLF5 and EEF1B2). Our study suggests that systematic analysis of mutation distribution in the tertiary protein structure can help identify cancer driver genes. PMID:27225414

  5. An exploratory spatial analysis of soil organic carbon distribution in Canadian eco-regions

    NASA Astrophysics Data System (ADS)

    Tan, S.-Y.; Li, J.

    2014-11-01

    As the largest carbon reservoir in ecosystems, soil accounts for more than twice as much carbon storage as that of vegetation biomass or the atmosphere. This paper examines spatial patterns of soil organic carbon (SOC) in Canadian forest areas at an eco-region scale of analysis. The goal is to explore the relationship of SOC levels with various climatological variables, including temperature and precipitation. The first Canadian forest soil database published in 1997 by the Canada Forest Service was analyzed along with other long-term eco-climatic data (1961 to 1991) including precipitation, air temperature, slope, aspect, elevation, and Normalized Difference Vegetation Index (NDVI) derived from remote sensing imagery. In addition, the existing eco-region framework established by Environment Canada was evaluated for mapping SOC distribution. Exploratory spatial data analysis techniques, including spatial autocorrelation analysis, were employed to examine how forest SOC is spatially distributed in Canada. Correlation analysis and spatial regression modelling were applied to determine the dominant ecological factors influencing SOC patterns at the eco-region level. At the national scale, a spatial error regression model was developed to account for spatial dependency and to estimate SOC patterns based on ecological and ecosystem factors. Based on the significant variables derived from the spatial error model, a predictive SOC map in Canadian forest areas was generated. Although overall SOC distribution is influenced by climatic and topographic variables, distribution patterns are shown to differ significantly between eco-regions. These findings help to validate the eco-region classification framework for SOC zonation mapping in Canada.
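
    Spatial autocorrelation analysis of the sort mentioned typically starts from global Moran's I. A minimal sketch, assuming invented SOC values and a simple row-standardized contiguity weights matrix:

        import numpy as np

        # Global Moran's I: positive values indicate that similar SOC levels
        # cluster in space, near zero indicates spatial randomness.
        def morans_i(values, W):
            z = values - values.mean()
            n = values.size
            return n * (z @ W @ z) / (W.sum() * (z @ z))

        values = np.array([4.2, 4.0, 3.1, 2.8, 5.0])   # SOC per region (invented)

        # Binary contiguity weights for a chain of 5 regions, row-standardized.
        W = np.zeros((5, 5))
        for i in range(4):
            W[i, i + 1] = W[i + 1, i] = 1.0
        W = W / W.sum(axis=1, keepdims=True)
        print(morans_i(values, W))

    A significant Moran's I is precisely what motivates the spatial error regression the authors fit, since ordinary least squares assumes independent residuals.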

  6. CONTAIN analysis of hydrogen distribution and combustion in PWR dry containments

    SciTech Connect

    Yang, J.W.; Nimnual, S.

    1991-01-01

    Hydrogen transport and combustion in a PWR dry containment are analyzed using the CONTAIN code for a multi-compartment model of the Zion plant. The analysis includes consideration of both degraded core and full core meltdown accidents initiated by a small break LOCA. The importance of intercell flow mixing for the distributions of gas composition and temperature in the various compartments is evaluated. Thermal stratification and combustion behavior are discussed. 4 refs., 8 figs., 2 tabs.

  7. Distributed optical fiber vibration sensor based on spectrum analysis of Polarization-OTDR system.

    PubMed

    Zhang, Ziyi; Bao, Xiaoyi

    2008-07-01

    A fully distributed optical fiber vibration sensor is demonstrated based on spectrum analysis of a Polarization-OTDR system. Without performing any data averaging, vibration disturbances up to 5 kHz are successfully demonstrated in a 1 km fiber link with 10 m spatial resolution. The FFT is performed at each spatial resolution cell; relating the disturbance at each frequency component to location allows simultaneous detection of multiple events, whether they have different or the same frequency components.
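
    The "FFT at each spatial resolution" step can be pictured as follows: stacking one trace per probe pulse gives a (slow-time x position) array, and an FFT along the slow-time axis yields a vibration spectrum per location. The array shapes and pulse rate below are assumptions, not the authors' parameters:

        import numpy as np

        pulse_rate = 20_000                  # traces per second (assumed)
        rng = np.random.default_rng(0)
        traces = rng.normal(size=(4096, 100))
        # shape: (slow time, position); a 5 kHz event at some location would
        # show up as a peak in that column's spectrum

        spectra = np.abs(np.fft.rfft(traces, axis=0))    # one FFT per position
        freqs = np.fft.rfftfreq(traces.shape[0], d=1.0 / pulse_rate)

        # Dominant vibration frequency at every spatial cell (ignoring DC):
        dominant = freqs[1 + np.argmax(spectra[1:], axis=0)]
        print(dominant[:5])

    Because every column is analyzed independently, two events at different positions are separable even when they vibrate at the same frequency, which is the multi-event capability the abstract claims.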

  8. Some physics and system issues in the security analysis of quantum key distribution protocols

    NASA Astrophysics Data System (ADS)

    Yuen, Horace P.

    2014-10-01

    In this paper, we review a number of issues on the security of quantum key distribution (QKD) protocols that bear directly on the relevant physics or mathematical representation of the QKD cryptosystem. It is shown that the cryptosystem representation itself may miss out many possible attacks, which are not accounted for in the security analysis and proofs. Hence, the final security claims drawn from such analysis are not reliable, apart from foundational issues about the security criteria that are discussed elsewhere. The cases of continuous-variable QKD and multi-photon sources are elaborated upon.

  9. GRID Processing and analysis of ALICE data at distributed Russian Tier2 centre - RDIG

    NASA Astrophysics Data System (ADS)

    Bogdanov, A.; Jancurova, L.; Kiryanov, A.; Kotlyar, V.; Mitsyn, V.; Lyublev, Y.; Ryabinkin, E.; Shabratova, G.; Stepanova, L.; Tikhomirov, V.; Trofimov, V.; Urazmetov, W.; Utkin, D.; Zarochentsev, A.; Zotkin, S.

    2010-04-01

    The major subject of this paper is the presentation of the distributed computing status report for the ALICE experiment at Russian sites just before data taking at the Large Hadron Collider at CERN. We present the usage of the ALICE application software, AliEn[1], on top of the modern EGEE middleware called gLite for simulation and data analysis in the experiment at the Russian Tier2 in accordance with the ALICE computing model [2]. We outline the results of CPU and disk space usage at RDIG sites for the data simulation and the analysis of the first LHC data from the exposure of the ALICE detector.

  10. Distributed data processing and analysis environment for neutron scattering experiments at CSNS

    NASA Astrophysics Data System (ADS)

    Tian, H. L.; Zhang, J. R.; Yan, L. L.; Tang, M.; Hu, L.; Zhao, D. X.; Qiu, Y. X.; Zhang, H. Y.; Zhuang, J.; Du, R.

    2016-10-01

    China Spallation Neutron Source (CSNS) is the first high-performance pulsed neutron source in China and will meet increasing demands for fundamental research and technical applications at home and abroad. A new distributed data processing and analysis environment has been developed, which has generic functionalities for neutron scattering experiments. The environment consists of three parts: an object-oriented data processing framework adopting a data-centered architecture, a communication and data caching system based on the client/server paradigm, and data analysis and visualization software providing 2D/3D display of experimental data. This environment will be widely applied at CSNS for live data processing.

  11. Single-phase power distribution system power flow and fault analysis

    NASA Technical Reports Server (NTRS)

    Halpin, S. M.; Grigsby, L. L.

    1992-01-01

    Alternative methods for power flow and fault analysis of single-phase distribution systems are presented. The algorithms for both power flow and fault analysis utilize a generalized approach to network modeling. The generalized admittance matrix, formed using elements of linear graph theory, is an accurate network model for all possible single-phase network configurations. Unlike the standard nodal admittance matrix formulation algorithms, the generalized approach uses generalized component models for the transmission line and transformer. The standard assumption of a common node voltage reference point is not required to construct the generalized admittance matrix. Therefore, truly accurate simulation results can be obtained for networks that cannot be modeled using traditional techniques.
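
    As background, the standard nodal admittance matrix that the paper generalizes is assembled from branch data as shown below (a three-bus example with invented impedances; the generalized formulation, which drops the common voltage reference and uses generalized line and transformer models, is not reproduced here):

        import numpy as np

        # Branch list: (from bus, to bus, series impedance in per unit).
        branches = [(0, 1, 0.02 + 0.06j), (1, 2, 0.04 + 0.12j), (0, 2, 0.05 + 0.15j)]
        n_bus = 3

        # Standard Y-bus stamping: each branch admittance adds to both
        # diagonal entries and subtracts from both off-diagonal entries.
        Y = np.zeros((n_bus, n_bus), dtype=complex)
        for f, t, z in branches:
            y = 1.0 / z
            Y[f, f] += y
            Y[t, t] += y
            Y[f, t] -= y
            Y[t, f] -= y
        print(Y)

    The paper's point is that this conventional stamping breaks down for some single-phase configurations, which is what motivates building the admittance model from linear graph theory instead.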

  12. A new parallel-vector finite element analysis software on distributed-memory computers

    NASA Technical Reports Server (NTRS)

    Qin, Jiangning; Nguyen, Duc T.

    1993-01-01

    A new parallel-vector finite element analysis software package MPFEA (Massively Parallel-vector Finite Element Analysis) is developed for large-scale structural analysis on massively parallel computers with distributed-memory. MPFEA is designed for parallel generation and assembly of the global finite element stiffness matrices as well as parallel solution of the simultaneous linear equations, since these are often the major time-consuming parts of a finite element analysis. Block-skyline storage scheme along with vector-unrolling techniques are used to enhance the vector performance. Communications among processors are carried out concurrently with arithmetic operations to reduce the total execution time. Numerical results on the Intel iPSC/860 computers (such as the Intel Gamma with 128 processors and the Intel Touchstone Delta with 512 processors) are presented, including an aircraft structure and some very large truss structures, to demonstrate the efficiency and accuracy of MPFEA.

  13. Shared and Distributed Memory Parallel Security Analysis of Large-Scale Source Code and Binary Applications

    SciTech Connect

    Quinlan, D; Barany, G; Panas, T

    2007-08-30

    Many forms of security analysis on large scale applications can be substantially automated but the size and complexity can exceed the time and memory available on conventional desktop computers. Most commercial tools are understandably focused on such conventional desktop resources. This paper presents research work on the parallelization of security analysis of both source code and binaries within our Compass tool, which is implemented using the ROSE source-to-source open compiler infrastructure. We have focused on both shared and distributed memory parallelization of the evaluation of rules implemented as checkers for a wide range of secure programming rules, applicable to desktop machines, networks of workstations and dedicated clusters. While Compass as a tool focuses on source code analysis and reports violations of an extensible set of rules, the binary analysis work uses the exact same infrastructure but is less well developed into an equivalent final tool.

  14. Improving the Analysis, Storage and Sharing of Neuroimaging Data using Relational Databases and Distributed Computing

    PubMed Central

    Hasson, Uri; Skipper, Jeremy I.; Wilde, Michael J.; Nusbaum, Howard C.; Small, Steven L.

    2007-01-01

    The increasingly complex research questions addressed by neuroimaging research impose substantial demands on computational infrastructures. These infrastructures need to support management of massive amounts of data in a way that affords rapid and precise data analysis, to allow collaborative research, and to achieve these aims securely and with minimum management overhead. Here we present an approach that overcomes many current limitations in data analysis and data sharing. This approach is based on open source database management systems that support complex data queries as an integral part of data analysis, flexible data sharing, and parallel and distributed data processing using cluster computing and Grid computing resources. We assess the strengths of these approaches as compared to current frameworks based on storage of binary or text files. We then describe in detail the implementation of such a system and provide a concrete description of how it was used to enable a complex analysis of fMRI time series data. PMID:17964812

  15. Combining Different Tools for EEG Analysis to Study the Distributed Character of Language Processing.

    PubMed

    Rocha, Armando Freitas da; Foz, Flávia Benevides; Pereira, Alfredo

    2015-01-01

    Recent studies on language processing indicate that language cognition is better understood if assumed to be supported by a distributed intelligent processing system enrolling neurons located all over the cortex, in contrast to reductionism that proposes to localize cognitive functions to specific cortical structures. Here, brain activity was recorded using electroencephalogram while volunteers were listening or reading small texts and had to select pictures that translate meaning of these texts. Several techniques for EEG analysis were used to show this distributed character of neuronal enrollment associated with the comprehension of oral and written descriptive texts. Low Resolution Tomography identified the many different sets (si) of neurons activated in several distinct cortical areas by text understanding. Linear correlation was used to calculate the information H(ei) provided by each electrode of the 10/20 system about the identified si. H(ei) Principal Component Analysis (PCA) was used to study the temporal and spatial activation of these sources si. This analysis evidenced 4 different patterns of H(ei) covariation that are generated by neurons located at different cortical locations. These results clearly show that the distributed character of language processing is clearly evidenced by combining available EEG technologies. PMID:26713089

  16. Combining Different Tools for EEG Analysis to Study the Distributed Character of Language Processing

    PubMed Central

    da Rocha, Armando Freitas; Foz, Flávia Benevides; Pereira, Alfredo

    2015-01-01

    Recent studies on language processing indicate that language cognition is better understood if assumed to be supported by a distributed intelligent processing system enrolling neurons located all over the cortex, in contrast to reductionism that proposes to localize cognitive functions to specific cortical structures. Here, brain activity was recorded using electroencephalogram while volunteers were listening or reading small texts and had to select pictures that translate meaning of these texts. Several techniques for EEG analysis were used to show this distributed character of neuronal enrollment associated with the comprehension of oral and written descriptive texts. Low Resolution Tomography identified the many different sets (si) of neurons activated in several distinct cortical areas by text understanding. Linear correlation was used to calculate the information H(ei) provided by each electrode of the 10/20 system about the identified si. H(ei) Principal Component Analysis (PCA) was used to study the temporal and spatial activation of these sources si. This analysis evidenced 4 different patterns of H(ei) covariation that are generated by neurons located at different cortical locations. These results clearly show that the distributed character of language processing is clearly evidenced by combining available EEG technologies. PMID:26713089

  17. Mechanical Response of Silk Crystalline Units from Force-Distribution Analysis

    PubMed Central

    Xiao, Senbo; Stacklies, Wolfram; Cetinkaya, Murat; Markert, Bernd; Gräter, Frauke

    2009-01-01

    The outstanding mechanical toughness of silk fibers is thought to be caused by embedded crystalline units acting as cross links of silk proteins in the fiber. Here, we examine the robustness of these highly ordered β-sheet structures by molecular dynamics simulations and finite element analysis. Structural parameters and stress-strain relationships of four different models, from spider and Bombyx mori silk peptides, in antiparallel and parallel arrangement, were determined and found to be in good agreement with x-ray diffraction data. Rupture forces exceed those of any previously examined globular protein many times over, with spider silk (poly-alanine) slightly outperforming Bombyx mori silk ((Gly-Ala)n). All-atom force distribution analysis reveals both intrasheet hydrogen-bonding and intersheet side-chain interactions to contribute to stability to similar extent. In combination with finite element analysis of simplified β-sheet skeletons, we could ascribe the distinct force distribution pattern of the antiparallel and parallel silk crystalline units to the difference in hydrogen-bond geometry, featuring an in-line or zigzag arrangement, respectively. Hydrogen-bond strength was higher in antiparallel models, and ultimately resulted in higher stiffness of the crystal, compensating the effect of the mechanically disadvantageous in-line hydrogen-bond geometry. Atomistic and coarse-grained force distribution patterns can thus explain differences in mechanical response of silk crystals, opening up the road to predict full fiber mechanics. PMID:19450471

  18. [Soil Heavy Metal Spatial Distribution and Source Analysis Around an Aluminum Plant in Baotou].

    PubMed

    Zhang, Lian-ke; Li, Hai-peng; Huang, Xue-min; Li, Yu-mei; Jiao, Kun-ling; Sun, Peng; Wang, Wei-da

    2016-03-15

    Soil within 500 m of an aluminum plant in Baotou was studied. A total of 64 soil samples were taken from the 0-5 cm, 5-20 cm, 20-40 cm and 40-60 cm layers, and the contents of Cu, Pb, Zn, Cr, Cd, Ni and Mn were tested, respectively. Correlation analysis and principal component analysis were used to identify the sources of these heavy metals in soils. The results suggested that the contents of Cu, Pb, Zn, Cr, Cd, Ni and Mn in the study area were 32.9, 50.35, 69.92, 43.78, 0.54, 554.42 and 36.65 mg · kg⁻¹, respectively. All seven heavy metals tested exceeded the soil background values for Inner Mongolia. The spatial distribution of heavy metals showed that the horizontal distribution was obviously enriched in the southwest, while in the vertical distribution the heavy metal content was highest in the surface soil (0 to 5 cm), decreased with increasing depth, and tended to stabilize below a depth of 20 cm. Source analysis showed that the source of Cu, Zn, Cr and Mn might be influenced by the aluminum plant and the surrounding industrial activity, the source of Pb and Cd might be mainly related to road transportation, and the source of Ni may be affected by agricultural activities and the soil parent material together. PMID:27337911

  19. A distributed analysis and visualization system for model and observational data

    NASA Technical Reports Server (NTRS)

    Wilhelmson, Robert B.

    1994-01-01

    Software was developed with NASA support to aid in the analysis and display of the massive amounts of data generated from satellites, observational field programs, and from model simulations. This software was developed in the context of the PATHFINDER (Probing ATmospHeric Flows in an Interactive and Distributed EnviRonment) Project. The overall aim of this project is to create a flexible, modular, and distributed environment for data handling, modeling simulations, data analysis, and visualization of atmospheric and fluid flows. Software completed with NASA support includes GEMPAK analysis, data handling, and display modules for which collaborators at NASA had primary responsibility, and prototype software modules for three-dimensional interactive and distributed control and display as well as data handling, for which NCSA was responsible. Overall process control was handled through a scientific and visualization application builder from Silicon Graphics known as the Iris Explorer. In addition, the GEMPAK related work (GEMVIS) was also ported to the Advanced Visualization System (AVS) application builder. Many modules were developed to enhance those already available in Iris Explorer including HDF file support, improved visualization and display, simple lattice math, and the handling of metadata through development of a new grid datatype. Complete source and runtime binaries along with on-line documentation are available via the World Wide Web at: http://redrock.ncsa.uiuc.edu/PATHFINDER/pathre12/top/top.html.

  20. Space station electrical power distribution analysis using a load flow approach

    NASA Technical Reports Server (NTRS)

    Emanuel, Ervin M.

    1987-01-01

    The space station's electrical power system will evolve and grow in a manner much similar to the present terrestrial electrical power system utilities. The initial baseline reference configuration will contain more than 50 nodes or busses, inverters, transformers, overcurrent protection devices, distribution lines, solar arrays, and/or solar dynamic power generating sources. The system is designed to manage and distribute 75 kW of power, single phase or three phase, at 20 kHz, and grow to a level of 300 kW steady state, and must be capable of operating at a peak of 450 kW for 5 to 10 min. In order to plan far into the future and keep pace with load growth, a load flow power system analysis approach must be developed and utilized. This method is a well known energy assessment and management tool that is widely used throughout the Electrical Power Utility Industry. The results of a comprehensive evaluation and assessment of an Electrical Distribution System Analysis Program (EDSA) are discussed. Its potential use as an analysis and design tool for the 20 kHz space station electrical power system is addressed.
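
    As an illustration of what a load flow computation does (a generic Gauss-Seidel sketch with invented per-unit values, not the EDSA program assessed in the paper):

        import numpy as np

        # Three-bus network: bus 0 is the slack (source) bus; buses 1 and 2
        # are PQ buses with specified complex power draw. Y is the nodal
        # admittance matrix; loads enter as negative power injections.
        Y = np.array([[ 20-60j, -10+30j, -10+30j],
                      [-10+30j,  20-60j, -10+30j],
                      [-10+30j, -10+30j,  20-60j]])
        S = np.array([0.0, -0.8 - 0.3j, -0.6 - 0.2j])
        V = np.ones(3, dtype=complex)            # flat start, slack held at 1.0

        for _ in range(50):
            for i in (1, 2):                     # update PQ buses only
                I = np.conj(S[i] / V[i])         # injected current from power
                V[i] = (I - (Y[i] @ V - Y[i, i] * V[i])) / Y[i, i]
        print(np.abs(V), np.angle(V, deg=True))

    Repeating such a solution for projected load levels is how load-flow studies reveal voltage drops and overloaded branches before the system grows into them, which is the planning role the abstract describes.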

  1. [Soil Heavy Metal Spatial Distribution and Source Analysis Around an Aluminum Plant in Baotou].

    PubMed

    Zhang, Lian-ke; Li, Hai-peng; Huang, Xue-min; Li, Yu-mei; Jiao, Kun-ling; Sun, Peng; Wang, Wei-da

    2016-03-15

    Soil within 500 m of an aluminum plant in Baotou was studied. A total of 64 soil samples were taken from the 0-5 cm, 5-20 cm, 20-40 cm and 40-60 cm layers, and the contents of Cu, Pb, Zn, Cr, Cd, Ni and Mn were tested, respectively. Correlation analysis and principal component analysis were used to identify the sources of these heavy metals in soils. The results suggested that the contents of Cu, Pb, Zn, Cr, Cd, Ni and Mn in the study area were 32.9, 50.35, 69.92, 43.78, 0.54, 554.42 and 36.65 mg · kg⁻¹, respectively. All seven heavy metals tested exceeded the soil background values for Inner Mongolia. The spatial distribution of heavy metals showed that the horizontal distribution was obviously enriched in the southwest, while in the vertical distribution the heavy metal content was highest in the surface soil (0 to 5 cm), decreased with increasing depth, and tended to stabilize below a depth of 20 cm. Source analysis showed that the source of Cu, Zn, Cr and Mn might be influenced by the aluminum plant and the surrounding industrial activity, the source of Pb and Cd might be mainly related to road transportation, and the source of Ni may be affected by agricultural activities and the soil parent material together.

  2. Assessment of Altered 3D Blood Characteristics in Aortic Disease by Velocity Distribution Analysis

    PubMed Central

    Garcia, Julio; Barker, Alex J; van Ooij, Pim; Schnell, Susanne; Puthumana, Jyothy; Bonow, Robert O; Collins, Jeremy D; Carr, James C; Markl, Michael

    2014-01-01

    Purpose To test the feasibility of velocity distribution analysis for identifying altered 3D flow characteristics in patients with aortic disease based on 4D flow MRI volumetric analysis. Methods Forty patients with aortic (Ao) dilation (mid ascending aortic diameter MAA=40±7 mm, age=56±17 yr, 11 females) underwent cardiovascular MRI. Four groups were retrospectively defined: mild Ao dilation (n=10, MAA<35 mm); moderate Ao dilation (n=10, 35 mm<MAA<45 mm); severe Ao dilation (n=10, MAA>45 mm); and Ao dilation+aortic stenosis AS (n=10, MAA>35 mm and peak velocity >2.5 m/s). 3D PC-MR angiograms were computed and used to obtain a 3D segmentation of the aorta, which was divided into four segments: root, ascending aorta, arch, descending aorta. Radial chart displays were used to visualize multiple parameters representing segmental changes in the 3D velocity distribution associated with aortic disease. Results Changes in the velocity field and geometry between cohorts resulted in distinct hemodynamic patterns for each aortic segment. Disease progression from mild Ao dilation to Ao dilation+AS resulted in significant differences (P<0.05) in flow parameters across cohorts and increased radial chart size for root and ascending aorta segments by 146% and 99%, respectively. Conclusion Volumetric 4D velocity distribution analysis has the potential to identify characteristic changes in regional blood flow patterns in patients with aortic disease. PMID:25252029

  3. Oxygen distribution in tumors: A qualitative analysis and modeling study providing a novel Monte Carlo approach

    SciTech Connect

    Lagerlöf, Jakob H.; Kindblom, Jon; Bernhardt, Peter

    2014-09-15

    lower end, due to anoxia, but smaller tumors showed undisturbed oxygen distributions. The six different models with correlated parameters generated three classes of oxygen distributions. The first was a hypothetical, negative covariance between vessel proximity and pO₂ (VPO-C scenario); the second was a hypothetical positive covariance between vessel proximity and pO₂ (VPO+C scenario); and the third was the hypothesis of no correlation between vessel proximity and pO₂ (UP scenario). The VPO-C scenario produced a distinctly different oxygen distribution than the two other scenarios. The shape of the VPO-C scenario was similar to that of the nonvariable DOC model, and the larger the tumor, the greater the similarity between the two models. For all simulations, the mean oxygen tension decreased and the hypoxic fraction increased with tumor size. The absorbed dose required for definitive tumor control was highest for the VPO+C scenario, followed by the UP and VPO-C scenarios. Conclusions: A novel MC algorithm was presented which simulated oxygen distributions and radiation response for various biological parameter values. The analysis showed that the VPO-C scenario generated a clearly different oxygen distribution from the VPO+C scenario; the former exhibited a lower hypoxic fraction and higher radiosensitivity. In future studies, this modeling approach might be valuable for qualitative analyses of factors that affect oxygen distribution as well as analyses of specific experimental and clinical situations.

  4. A quantitative quantum-chemical analysis tool for the distribution of mechanical force in molecules

    NASA Astrophysics Data System (ADS)

    Stauch, Tim; Dreuw, Andreas

    2014-04-01

    The promising field of mechanochemistry suffers from a general lack of understanding of the distribution and propagation of force in a stretched molecule, which limits its applicability up to the present day. In this article, we introduce the JEDI (Judgement of Energy DIstribution) analysis, which is the first quantum chemical method that provides a quantitative understanding of the distribution of mechanical stress energy among all degrees of freedom in a molecule. The method is carried out on the basis of static or dynamic calculations under the influence of an external force and makes use of a Hessian matrix in redundant internal coordinates (bond lengths, bond angles, and dihedral angles), so that all relevant degrees of freedom of a molecule are included and mechanochemical processes can be interpreted in a chemically intuitive way. The JEDI method is characterized by its modest computational effort, with the calculation of the Hessian being the rate-determining step, and delivers, except for the harmonic approximation, exact ab initio results. We apply the JEDI analysis to several example molecules in both static quantum chemical calculations and Born-Oppenheimer Molecular Dynamics simulations in which molecules are subject to an external force, thus studying not only the distribution and the propagation of strain in mechanically deformed systems, but also gaining valuable insights into the mechanochemically induced isomerization of trans-3,4-dimethylcyclobutene to trans,trans-2,4-hexadiene. The JEDI analysis can potentially be used in the discussion of sonochemical reactions, molecular motors, mechanophores, and photoswitches as well as in the development of molecular force probes.
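
    The harmonic bookkeeping described can be sketched directly: with the Hessian H in redundant internal coordinates and displacements Δq under force, the strain energy assigned to coordinate i is (1/2)Δq_i Σ_j H_ij Δq_j, and these terms sum to the total harmonic strain energy. The matrices below are invented stand-ins, not a real molecular Hessian:

        import numpy as np

        rng = np.random.default_rng(2)
        M = rng.normal(size=(6, 6))
        H = M @ M.T                      # symmetric positive-definite stand-in Hessian
        dq = rng.normal(scale=0.05, size=6)   # displacements of 6 internal coordinates

        # Per-coordinate partition of the harmonic strain energy; the terms
        # sum exactly to the total quadratic form (1/2) dq^T H dq.
        E_total = 0.5 * dq @ H @ dq
        E_per_coord = 0.5 * dq * (H @ dq)
        assert np.isclose(E_per_coord.sum(), E_total)
        print(E_per_coord / E_total)     # fraction of strain energy per coordinate

    Working in redundant internal coordinates (bond lengths, angles, dihedrals) is what makes each E_i chemically interpretable, which is the point the abstract stresses.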

  5. A quantitative quantum-chemical analysis tool for the distribution of mechanical force in molecules

    SciTech Connect

    Stauch, Tim; Dreuw, Andreas

    2014-04-07

    The promising field of mechanochemistry suffers from a general lack of understanding of the distribution and propagation of force in a stretched molecule, which limits its applicability up to the present day. In this article, we introduce the JEDI (Judgement of Energy DIstribution) analysis, which is the first quantum chemical method that provides a quantitative understanding of the distribution of mechanical stress energy among all degrees of freedom in a molecule. The method is carried out on the basis of static or dynamic calculations under the influence of an external force and makes use of a Hessian matrix in redundant internal coordinates (bond lengths, bond angles, and dihedral angles), so that all relevant degrees of freedom of a molecule are included and mechanochemical processes can be interpreted in a chemically intuitive way. The JEDI method is characterized by its modest computational effort, with the calculation of the Hessian being the rate-determining step, and delivers, except for the harmonic approximation, exact ab initio results. We apply the JEDI analysis to several example molecules in both static quantum chemical calculations and Born-Oppenheimer Molecular Dynamics simulations in which molecules are subject to an external force, thus studying not only the distribution and the propagation of strain in mechanically deformed systems, but also gaining valuable insights into the mechanochemically induced isomerization of trans-3,4-dimethylcyclobutene to trans,trans-2,4-hexadiene. The JEDI analysis can potentially be used in the discussion of sonochemical reactions, molecular motors, mechanophores, and photoswitches as well as in the development of molecular force probes.

  6. Nanomaterial size distribution analysis via liquid nebulization coupled with ion mobility spectrometry (LN-IMS).

    PubMed

    Jeon, Seongho; Oberreit, Derek R; Van Schooneveld, Gary; Hogan, Christopher J

    2016-02-21

    We apply liquid nebulization (LN) in series with ion mobility spectrometry (IMS, using a differential mobility analyzer coupled to a condensation particle counter) to measure the size distribution functions (the number concentration per unit log diameter) of gold nanospheres in the 5-30 nm range, 70 nm × 11.7 nm gold nanorods, and albumin proteins originally in aqueous suspensions. In prior studies, IMS measurements have only been carried out for colloidal nanoparticles in this size range using electrosprays for aerosolization, as traditional nebulizers produce supermicrometer droplets which leave residue particles from non-volatile species. Residue particles mask the size distribution of the particles of interest. Uniquely, the LN employed in this study uses both online dilution (with dilution factors of up to 10⁴) with ultra-high purity water and a ball-impactor to remove droplets larger than 500 nm in diameter. This combination enables hydrosol-to-aerosol conversion preserving the size and morphology of particles, and also enables higher non-volatile residue tolerance than electrospray based aerosolization. Through LN-IMS measurements we show that the size distribution functions of narrowly distributed but similarly sized particles can be distinguished from one another, which is not possible with Nanoparticle Tracking Analysis in the sub-30 nm size range. Through comparison to electron microscopy measurements, we find that the size distribution functions inferred via LN-IMS measurements correspond to the particle sizes coated by surfactants, i.e. as they persist in colloidal suspensions. Finally, we show that the gas phase particle concentrations inferred from IMS size distribution functions are functions only of the liquid phase particle concentration, and are independent of particle size, shape, and chemical composition. Therefore LN-IMS enables characterization of the size, yield, and polydispersity of sub-30 nm particles.

  7. Nanomaterial size distribution analysis via liquid nebulization coupled with ion mobility spectrometry (LN-IMS).

    PubMed

    Jeon, Seongho; Oberreit, Derek R; Van Schooneveld, Gary; Hogan, Christopher J

    2016-02-21

    We apply liquid nebulization (LN) in series with ion mobility spectrometry (IMS, using a differential mobility analyzer coupled to a condensation particle counter) to measure the size distribution functions (the number concentration per unit log diameter) of gold nanospheres in the 5-30 nm range, 70 nm × 11.7 nm gold nanorods, and albumin proteins originally in aqueous suspensions. In prior studies, IMS measurements have only been carried out for colloidal nanoparticles in this size range using electrosprays for aerosolization, as traditional nebulizers produce supermicrometer droplets which leave residue particles from non-volatile species. Residue particles mask the size distribution of the particles of interest. Uniquely, the LN employed in this study uses both online dilution (with dilution factors of up to 10⁴) with ultra-high purity water and a ball-impactor to remove droplets larger than 500 nm in diameter. This combination enables hydrosol-to-aerosol conversion preserving the size and morphology of particles, and also enables higher non-volatile residue tolerance than electrospray based aerosolization. Through LN-IMS measurements we show that the size distribution functions of narrowly distributed but similarly sized particles can be distinguished from one another, which is not possible with Nanoparticle Tracking Analysis in the sub-30 nm size range. Through comparison to electron microscopy measurements, we find that the size distribution functions inferred via LN-IMS measurements correspond to the particle sizes coated by surfactants, i.e. as they persist in colloidal suspensions. Finally, we show that the gas phase particle concentrations inferred from IMS size distribution functions are functions only of the liquid phase particle concentration, and are independent of particle size, shape, and chemical composition. Therefore LN-IMS enables characterization of the size, yield, and polydispersity of sub-30 nm particles. PMID:26750519

  8. Laws prohibiting peer distribution of injecting equipment in Australia: A critical analysis of their effects.

    PubMed

    Lancaster, Kari; Seear, Kate; Treloar, Carla

    2015-12-01

    The law is a key site for the production of meanings around the 'problem' of drugs in public discourse. In this article, we critically consider the material-discursive 'effects' of laws prohibiting peer distribution of needles and syringes in Australia. Taking the laws and regulations governing possession and distribution of injecting equipment in one jurisdiction (New South Wales, Australia) as a case study, we use Carol Bacchi's poststructuralist approach to policy analysis to critically consider the assumptions and presuppositions underpinning this legislative and regulatory framework, with a particular focus on examining the discursive, subjectification and lived effects of these laws. We argue that legislative prohibitions on the distribution of injecting equipment except by 'authorised persons' within 'approved programs' constitute people who inject drugs as irresponsible, irrational, and untrustworthy and re-inscribe a familiar stereotype of the drug 'addict'. These constructions of people who inject drugs fundamentally constrain how the provision of injecting equipment may be thought about in policy and practice. We suggest that prohibitions on the distribution of injecting equipment among peers may also have other, material, effects and may be counterproductive to various public health aims and objectives. However, the actions undertaken by some people who inject drugs to distribute equipment to their peers may disrupt and challenge these constructions, through a counter-discourse in which people who inject drugs are constituted as active agents with a vital role to play in blood-borne virus prevention in the community. Such activity continues to bring with it the risk of criminal prosecution, and so it remains a vexed issue. These insights have implications of relevance beyond Australia, particularly for other countries around the world that prohibit peer distribution, but also for other legislative practices with material-discursive effects.

  9. Structure analysis and size distribution of particulate matter from candles and kerosene combustion in burning chamber

    NASA Astrophysics Data System (ADS)

    Baitimirova, M.; Osite, A.; Katkevics, J.; Viksna, A.

    2012-08-01

    Burning of candles generates fine particulate matter that degrades indoor air quality and may therefore harm human health. In this study, solid aerosol particles from the burning of candles of different composition and from kerosene combustion were collected in a closed laboratory system. The present work describes the collection of particulate matter for structure analysis and the relationship between source and size distribution of the particulate matter. The formation mechanism of the particulate matter and its tendency to agglomerate are also described. Particles obtained from kerosene combustion have a normal size distribution, whereas particles generated from the burning of stearin candles have a distribution shifted towards the finer particle size range. If stearin is added to a paraffin candle, the particle size distribution likewise shifts towards finer particles. Particles obtained from kerosene combustion tend to form agglomerates within a short time, while particles obtained from burning candles of different composition show no such tendency. Particles from candles and kerosene combustion are Aitken and accumulation mode particles.

  10. Inverse analysis of non-uniform temperature distributions using multispectral pyrometry

    NASA Astrophysics Data System (ADS)

    Fu, Tairan; Duan, Minghao; Tian, Jibin; Shi, Congling

    2016-05-01

    Optical diagnostics can be used to obtain sub-pixel temperature information in remote sensing. A multispectral pyrometry method was developed using multiple spectral radiation intensities to deduce the temperature area distribution in the measurement region. The method transforms a spot multispectral pyrometer with a fixed field of view into a pyrometer with enhanced spatial resolution that can give sub-pixel temperature information from a "one pixel" measurement region. A temperature area fraction function was defined to represent the spatial temperature distribution in the measurement region. The method is illustrated by simulations of a multispectral pyrometer with a spectral range of 8.0-13.0 μm measuring a non-isothermal region with a temperature range of 500-800 K in the spot pyrometer field of view. The inverse algorithm for the sub-pixel temperature distribution (temperature area fractions) in the "one pixel" verifies this multispectral pyrometry method. The results show that an improved Levenberg-Marquardt algorithm is effective for this ill-posed inverse problem with relative errors in the temperature area fractions of (-3%, 3%) for most of the temperatures. The analysis provides a valuable reference for the use of spot multispectral pyrometers for sub-pixel temperature distributions in remote sensing measurements.
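
    The inverse problem can be sketched as a linear mixture model: each spectral channel is a sum of blackbody intensities weighted by the temperature area fractions. The sketch below recovers the fractions with non-negative least squares on noiseless synthetic data; the paper's actual implementation uses an improved Levenberg-Marquardt algorithm and must contend with noise and ill-posedness:

        import numpy as np
        from scipy.optimize import nnls

        h, c, kB = 6.626e-34, 2.998e8, 1.381e-23

        # Planck spectral radiance for wavelength lam (m) and temperature T (K).
        def planck(lam, T):
            return 2 * h * c**2 / lam**5 / (np.exp(h * c / (lam * kB * T)) - 1)

        lams = np.linspace(8e-6, 13e-6, 20)          # 8-13 um channels
        T_grid = np.linspace(500, 800, 7)            # candidate sub-pixel temperatures

        # Mixture matrix: channel intensity per unit area at each candidate T.
        A = np.array([[planck(l, T) for T in T_grid] for l in lams])
        f_true = np.array([0.0, 0.5, 0.0, 0.2, 0.0, 0.0, 0.3])
        signal = A @ f_true
        f_hat, _ = nnls(A, signal)                   # recovered area fractions
        print(np.round(f_hat / f_hat.sum(), 3))

    Because the Planck curves of nearby temperatures are nearly collinear, the recovery degrades quickly with noise; this is the ill-posedness that motivates the regularized iterative solver discussed in the abstract.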

  11. Performance Analysis of Radial Distribution Systems with UPQC and D-STATCOM

    NASA Astrophysics Data System (ADS)

    Gupta, Atma Ram; Kumar, Ashwani

    2016-08-01

    This paper presents an effective method for finding the optimum locations of a unified power quality conditioner (UPQC) and a distributed static compensator (D-STATCOM) in a radial distribution system. The bus having the minimum losses is selected as the candidate bus for UPQC placement, and the optimal location of the D-STATCOM is found by the power loss index (PLI) method: the PLI values of all buses are calculated, and the bus with the highest PLI value is the most favorable and is therefore selected as the candidate bus for D-STATCOM placement. The main contributions of this paper are: (i) finding the optimum location of the UPQC in a radial distribution system (RDS) based on minimum power loss; (ii) finding the optimal size of the UPQC which offers minimum losses; (iii) calculation of the annual energy saving using the UPQC and D-STATCOM; (iv) cost analysis with and without UPQC and D-STATCOM placement; and (v) comparison of results with and without UPQC and D-STATCOM placement in the RDS. The algorithm is tested on the IEEE 33-bus and 69-bus radial distribution systems using MATLAB software.
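
    A minimal sketch of the PLI selection step, assuming the total feeder loss with a compensator placed at each candidate bus has already been obtained from repeated load-flow runs; the loss figures below are hypothetical and are not results for the IEEE 33-bus or 69-bus systems.

      import numpy as np

      base_loss = 210.98                 # loss of the uncompensated feeder (kW), assumed
      # hypothetical total loss after placing the D-STATCOM at each candidate bus (kW)
      loss_with_comp = np.array([202.7, 171.9, 150.2, 163.8, 190.4])

      loss_reduction = base_loss - loss_with_comp
      pli = (loss_reduction - loss_reduction.min()) / \
            (loss_reduction.max() - loss_reduction.min())
      best_bus = int(np.argmax(pli)) + 1   # 1-based bus index with the highest PLI
      print("PLI:", np.round(pli, 3), "-> candidate bus:", best_bus)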

  12. Analysis of crater distribution in mare units on the lunar far side

    NASA Technical Reports Server (NTRS)

    Walker, A. S.; El-Baz, F.

    1982-01-01

    Mare material is asymmetrically distributed on the moon. The earth-facing hemisphere, where the crust is believed to be 26 km thinner than on the farside, contains substantially more basaltic mare material. Using Lunar Topographic Orthophoto Maps, the thickness of the mare material in three farside craters, Aitken (0.59 km), Isaev (1.0 km), and Tsiolkovskiy (1.75 km), was calculated. Crater frequency distributions in five farside mare units (Aitken, Isaev, Lacus Solitudinis, Langemak, and Tsiolkovskiy) and one light plains unit (in Mendeleev) were also studied. Nearly 10,000 farside craters were counted. Analysis of the crater frequency on the light plains unit gives an age of 4.3 billion yr. Crater frequency distributions on the mare units indicate ages of 3.7 and 3.8 billion yr, suggesting that the units were emplaced over a narrow time period of approximately 100 million yr. Returned lunar samples from nearside maria give dates as young as 3.1 billion yr. The results of this study suggest that mare basalt emplacement on the far side ceased before it did on the near side.

  13. A spatial pattern analysis of the halophytic species distribution in an arid coastal environment.

    PubMed

    Badreldin, Nasem; Uria-Diez, J; Mateu, J; Youssef, Ali; Stal, Cornelis; El-Bana, Magdy; Magdy, Ahmed; Goossens, Rudi

    2015-05-01

    Obtaining information about the spatial distribution of desert plants is considered a serious challenge for ecologists and environmental modelers due to the intensive field work and infrastructure required in harsh and remote arid environments. A new method was applied for assessing the spatial distribution of the halophytic species (HS) in an arid coastal environment. This method was based on object-based image analysis of a high-resolution Google Earth satellite image. The integration of image processing techniques and field work provided accurate information about the spatial distribution of HS. The extracted objects were based on assumptions that explained the plant-pixel relationship. Three different types of digital image processing techniques were implemented and validated to obtain an accurate HS spatial distribution. A total of 2703 individuals of the HS community were found in the case study, and approximately 82% were located above an elevation of 2 m. The micro-topography exhibited a significant negative relationship with pH and EC (r = -0.79 and -0.81, respectively, p < 0.001). The spatial structure was modeled using stochastic point processes, in particular a hybrid family of Gibbs processes. A new model is proposed that uses a hard-core structure at very short distances, together with a cluster structure at short-to-medium distances and a Poisson structure at larger distances. This model was found to fit the data perfectly well. PMID:25838060

  14. Analysis of the Effects of Streamwise Lift Distribution on Sonic Boom Signature

    NASA Technical Reports Server (NTRS)

    Yoo, Paul

    2013-01-01

    Investigation of sonic boom has been one of the major areas of study in aeronautics due to the benefits a low-boom aircraft has in both civilian and military applications. This work conducts a numerical analysis of the effects of streamwise lift distribution on the shock coalescence characteristics. A simple wing-canard-stabilator body model is used in the numerical simulation. The streamwise lift distribution is varied by fixing the canard at a deflection angle while trimming the aircraft with the wing and the stabilator at the desired lift coefficient. The lift and the pitching moment coefficients are computed using Missile DATCOM v. 707. The flow field around the wing-canard-stabilator body model is resolved using the OVERFLOW-2 flow solver. Overset/chimera grid topology is used to simplify the grid generation of various configurations representing different streamwise lift distributions. The numerical simulations are performed without viscosity unless it is required for numerical stability. All configurations are simulated at Mach 1.4, an angle of attack of 1.5°, a lift coefficient of 0.05, and a pitching moment coefficient of approximately 0. Four streamwise lift distribution configurations were tested.

  15. Empirical analysis on the connection between power-law distributions and allometries for urban indicators

    NASA Astrophysics Data System (ADS)

    Alves, L. G. A.; Ribeiro, H. V.; Lenzi, E. K.; Mendes, R. S.

    2014-09-01

    We report on the connection between power-law distributions and allometries. As first reported in Gomez-Lievano et al. (2012) for the relationship between homicides and population, when these urban indicators present asymptotic power-law distributions, they can also display specific allometries among themselves. Here, we present an extensive characterization of this connection when considering all possible pairs of relationships from twelve urban indicators of Brazilian cities (such as child labor, illiteracy, income, sanitation and unemployment). Our analysis reveals that all our urban indicators are asymptotically distributed as power laws and that the proposed connection also holds for our data when the allometric relationship displays sufficiently strong correlations. We have also found that not all allometric relationships are independent and that they can be understood as a consequence of the allometric relationship between the urban indicator and the population size. We further show that the residual fluctuations surrounding the allometries are characterized by an almost constant variance and log-normal distributions.
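
    The reported link between an asymptotic power-law distribution of an indicator and an allometric scaling law can be illustrated with a plain log-log least-squares fit; the synthetic "city" data and the exponent 1.15 below are invented for demonstration and are not values from the study.

      import numpy as np

      rng = np.random.default_rng(1)
      # synthetic data: population and an indicator following Y ~ a * X**beta
      pop = 10**rng.uniform(4, 7, 300)
      indicator = 2.5e-3 * pop**1.15 * rng.lognormal(sigma=0.3, size=300)

      # allometric exponent from an ordinary least-squares fit in log-log space
      beta, log_a = np.polyfit(np.log10(pop), np.log10(indicator), 1)
      print(f"estimated exponent: {beta:.3f} (true value 1.15)")

      # residuals around the allometry; the paper reports near-constant variance
      # and log-normal residual distributions
      resid = np.log10(indicator) - (log_a + beta*np.log10(pop))
      print(f"residual standard deviation: {resid.std():.3f}")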

  16. Quantitative Analysis of Subcellular Distribution of the SUMO Conjugation System by Confocal Microscopy Imaging.

    PubMed

    Mas, Abraham; Amenós, Montse; Lois, L Maria

    2016-01-01

    Different studies point to an enrichment in SUMO conjugation in the cell nucleus, although non-nuclear SUMO targets also exist. In general, the study of subcellular localization of proteins is essential for understanding their function within a cell. Fluorescence microscopy is a powerful tool for studying subcellular protein partitioning in living cells, since fluorescent proteins can be fused to proteins of interest to determine their localization. Subcellular distribution of proteins can be influenced by binding to other biomolecules and by posttranslational modifications. Sometimes these changes affect only a portion of the protein pool or have a partial effect, and a quantitative evaluation of fluorescence images is required to identify protein redistribution among subcellular compartments. In order to obtain accurate data about the relative subcellular distribution of SUMO conjugation machinery members, and to identify the molecular determinants involved in their localization, we have applied quantitative confocal microscopy imaging. In this chapter, we will describe the fluorescent protein fusions used in these experiments, and how to measure, evaluate, and compare average fluorescence intensities in cellular compartments by image-based analysis. We show the distribution of some components of the Arabidopsis SUMOylation machinery in epidermal onion cells and how they change their distribution in the presence of interacting partners or even when its activity is affected. PMID:27424751

  18. Pore space analysis of NAPL distribution in sand-clay media

    USGS Publications Warehouse

    Matmon, D.; Hayden, N.J.

    2003-01-01

    This paper introduces a conceptual model of clays and non-aqueous phase liquids (NAPLs) at the pore scale that has been developed from a mathematical unit cell model, and direct micromodel observation and measurement of clay-containing porous media. The mathematical model uses a unit cell concept with uniform spherical grains for simulating the sand in the sand-clay matrix (~10% clay). Micromodels made with glass slides and including different clay-containing porous media were used to investigate the two clays (kaolinite and montmorillonite) and NAPL distribution within the pore space. The results were used to understand the distribution of NAPL advancing into initially saturated sand and sand-clay media, and provided a detailed analysis of the pore-scale geometry, pore size distribution, NAPL entry pressures, and the effect of clay on this geometry. Interesting NAPL saturation profiles were observed as a result of the complexity of the pore space geometry with the different packing angles and the presence of clays. The unit cell approach has applications for enhancing the mechanistic understanding and conceptualization, both visually and mathematically, of pore-scale processes such as NAPL and clay distribution. © 2003 Elsevier Science Ltd. All rights reserved.

  19. Statistical analysis of secondary particle distributions in relativistic nucleus-nucleus collisions

    NASA Technical Reports Server (NTRS)

    Mcguire, Stephen C.

    1987-01-01

    The use of several statistical techniques to characterize structure in the angular distributions of secondary particles from nucleus-nucleus collisions in the energy range 24 to 61 GeV/nucleon is described. The objective of this work was to determine whether there are correlations between emitted particle intensity and angle that may be used to support the existence of the quark-gluon plasma. The techniques include chi-square null hypothesis tests, discrete Fourier transform analysis, and fluctuation analysis. We have also used the method of composite unit vectors to test for azimuthal asymmetry in a data set of 63 JACEE-3 events. Each method is presented in a manner that provides the reader with some practical detail regarding its application. Of those events with relatively high statistics, an Fe event at 55 GeV/nucleon was found to possess an azimuthal distribution with a highly non-random structure. No evidence of non-statistical fluctuations was found in the pseudo-rapidity distributions of the events studied. It is seen that the most effective application of these methods relies upon the availability of many events or single events that possess very high multiplicities.
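
    Two of the techniques named above can be shown compactly on synthetic data: a chi-square null-hypothesis test for uniformity of the azimuthal distribution, and low-order Fourier (harmonic) magnitudes of the same angles, which play a role similar to the composite-unit-vector test for azimuthal asymmetry. The event below is simulated, not JACEE data.

      import numpy as np
      from scipy.stats import chisquare

      rng = np.random.default_rng(2)
      phi = rng.uniform(0, 2*np.pi, 400)   # azimuthal angles of one simulated event

      # chi-square test of the null hypothesis of an isotropic azimuthal distribution
      counts, _ = np.histogram(phi, bins=12, range=(0, 2*np.pi))
      stat, p = chisquare(counts)
      print(f"chi2 = {stat:.1f}, p = {p:.3f}")

      # discrete Fourier analysis: low-order harmonics pick out azimuthal asymmetry
      v_n = [abs(np.mean(np.exp(1j*n*phi))) for n in (1, 2, 3)]
      print("harmonic magnitudes |v1|..|v3|:", np.round(v_n, 3))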

  20. Disentangling subpopulations in single-molecule FRET and ALEX experiments with photon distribution analysis.

    PubMed

    Tomov, Toma E; Tsukanov, Roman; Masoud, Rula; Liber, Miran; Plavner, Noa; Nir, Eyal

    2012-03-01

    Among the advantages of the single-molecule approach when used to study biomolecular structural dynamics and interaction is its ability to distinguish between and independently observe minor subpopulations. In a single-molecule Förster resonance energy transfer (FRET) and alternating laser excitation diffusion experiment, the various populations are apparent in the resultant histograms. However, because histograms are calculated based on the per-burst mean FRET and stoichiometry ratio and not on the internal photon distribution, much of the acquired information is lost, thereby reducing the capabilities of the method. Here we suggest what to our knowledge is a novel statistical analysis tool that significantly enhances these capabilities, and we use it to identify and isolate static and dynamic subpopulations. Based on a kernel density estimator and a proper photon distribution analysis, for each individual burst, we calculate scores that reflect properties of interest. Specifically, we determine the FRET efficiency and brightness ratio distributions and use them to reveal 1), the underlying structure of a two-state DNA-hairpin and a DNA hairpin that is bound to DNA origami; 2), a minor doubly labeled dsDNA subpopulation concealed in a larger singly labeled dsDNA; and 3), functioning DNA origami motors concealed within a larger subpopulation of defective motors. Altogether, these findings demonstrate the usefulness of the proposed approach. The method was developed and tested using simulations, its rationality is described, and a computer algorithm is provided.

  1. Multimedia content analysis and indexing: evaluation of a distributed and scalable architecture

    NASA Astrophysics Data System (ADS)

    Mandviwala, Hasnain; Blackwell, Scott; Weikart, Chris; Van Thong, Jean-Manuel

    2003-11-01

    Multimedia search engines facilitate the retrieval of documents from large media content archives now available via intranets and the Internet. Over the past several years, many research projects have focused on algorithms for analyzing and indexing media content efficiently. However, special system architectures are required to process large amounts of content from real-time feeds or existing archives. Possible solutions include dedicated distributed architectures for analyzing content rapidly and for making it searchable. The system architecture we propose implements such an approach: a highly distributed and reconfigurable batch media content analyzer that can process media streams and static media repositories. Our distributed media analysis application handles media acquisition, content processing, and document indexing. This collection of modules is orchestrated by a task flow management component, exploiting data and pipeline parallelism in the application. A scheduler manages load balancing and prioritizes the different tasks. Workers implement application-specific modules that can be deployed on an arbitrary number of nodes running different operating systems. Each application module is exposed as a web service, implemented with industry-standard interoperable middleware components such as Microsoft ASP.NET and Sun J2EE. Our system architecture is the next generation system for the multimedia indexing application demonstrated by www.speechbot.com. It can process large volumes of audio recordings with minimal support and maintenance, while running on low-cost commodity hardware. The system has been evaluated on a server farm running concurrent content analysis processes.

  2. Particle size distribution of brown and white rice during gastric digestion measured by image analysis.

    PubMed

    Bornhorst, Gail M; Kostlan, Kevin; Singh, R Paul

    2013-09-01

    The particle size distribution of foods during gastric digestion indicates the amount of physical breakdown that occurred due to the peristaltic movement of the stomach walls in addition to the breakdown that initially occurred during oral processing. The objective of this study was to present an image analysis technique that was rapid, simple, and could distinguish between food components (that is, rice kernel and bran layer in brown rice). The technique was used to quantify particle breakdown of brown and white rice during gastric digestion in growing pigs (used as a model for an adult human) over 480 min of digestion. The particle area distributions were fit to a Rosin-Rammler distribution function. Brown and white rice exhibited considerable breakdown as the number of particles per image decreased over time. The median particle area (x(50)) increased during digestion, suggesting a gastric sieving phenomenon, where small particles were emptied and larger particles were retained for additional breakdown. Brown rice breakdown was further quantified by an examination of the bran layer fragments and rice grain pieces. The percentage of total particle area composed of bran layer fragments was greater in the distal stomach than the proximal stomach in the first 120 min of digestion. The results of this study showed that image analysis may be used to quantify particle breakdown of a soft food product during gastric digestion, discriminate between different food components, and help to clarify the role of food structure and processing in food breakdown during gastric digestion.
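
    Because the particle area distributions were fit to a Rosin-Rammler function, which is the Weibull form recurring throughout this collection, a minimal fitting sketch is given below; the cumulative data points are hypothetical stand-ins for image-analysis output, not measurements from the study.

      import numpy as np
      from scipy.optimize import curve_fit

      def rosin_rammler_cdf(x, x_prime, b):
          # cumulative fraction of particle area below size x (a Weibull CDF)
          return 1.0 - np.exp(-(x / x_prime)**b)

      # hypothetical cumulative particle-area data from image analysis
      x = np.array([0.5, 1, 2, 4, 8, 16, 32])          # particle area, mm^2
      F = np.array([0.08, 0.18, 0.35, 0.58, 0.80, 0.93, 0.99])

      (x_prime, b), _ = curve_fit(rosin_rammler_cdf, x, F, p0=(4.0, 1.0))
      x50 = x_prime * np.log(2)**(1.0/b)               # median particle area
      print(f"x' = {x_prime:.2f} mm^2, spread b = {b:.2f}, x50 = {x50:.2f} mm^2")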

  3. Measuring arbitrary diffusion coefficient distributions of nano-objects by Taylor dispersion analysis.

    PubMed

    Cipelletti, Luca; Biron, Jean-Philippe; Martin, Michel; Cottet, Hervé

    2015-08-18

    Taylor dispersion analysis is an absolute and straightforward characterization method that allows determining the diffusion coefficient, or equivalently the hydrodynamic radius, from angstroms to submicron size range. In this work, we investigated the use of the Constrained Regularized Linear Inversion approach as a new data processing method to extract the probability density functions of the diffusion coefficient (or hydrodynamic radius) from experimental taylorgrams. This new approach can be applied to arbitrary polydisperse samples and gives access to the whole diffusion coefficient distributions, thereby significantly enhancing the potentiality of Taylor dispersion analysis. The method was successfully applied to both simulated and real experimental data for solutions of moderately polydisperse polymers and their binary and ternary mixtures. Distributions of diffusion coefficients obtained by this method were favorably compared with those derived from size exclusion chromatography. The influence of the noise of the simulated taylorgrams on the data processing is discussed. Finally, we discuss the ability of the method to correctly resolve bimodal distributions as a function of the relative separation between the two constituent species.
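
    The inversion idea can be mimicked under simplifying assumptions: a taylorgram is modelled as a non-negative mixture of Gaussian peaks whose widths follow Taylor dispersion, and a Tikhonov-regularised non-negative least-squares solve stands in for the Constrained Regularized Linear Inversion used in the paper. The capillary radius, residence time, noise level and regularisation weight below are illustrative.

      import numpy as np
      from scipy.optimize import nnls

      t0, Rc = 100.0, 50e-6           # residence time (s) and capillary radius (m)
      t = np.linspace(60, 140, 200)   # time axis of the taylorgram (s)

      def peak(D):
          # Gaussian Taylor peak for a species with diffusion coefficient D (m^2/s)
          sigma2 = Rc**2 * t0 / (24.0 * D)
          return np.exp(-(t - t0)**2 / (2*sigma2)) / np.sqrt(2*np.pi*sigma2)

      D_grid = np.logspace(-11, -9, 40)            # candidate diffusion coefficients
      A = np.column_stack([peak(D) for D in D_grid])

      # synthetic bimodal sample: two species with D = 1e-10 and 5e-10 m^2/s
      y = 0.6*peak(1e-10) + 0.4*peak(5e-10)
      y += 0.001*np.random.default_rng(3).normal(size=t.size)

      # Tikhonov-regularised non-negative inversion
      alpha = 1e-3
      A_aug = np.vstack([A, np.sqrt(alpha)*np.eye(D_grid.size)])
      y_aug = np.concatenate([y, np.zeros(D_grid.size)])
      w, _ = nnls(A_aug, y_aug)
      print("recovered modes near D =", D_grid[w > 0.1*w.max()])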

  4. Spatial sensitivity analysis of remote sensing snow cover fraction data in a distributed hydrological model

    NASA Astrophysics Data System (ADS)

    Berezowski, Tomasz; Chormański, Jarosław; Nossent, Jiri; Batelaan, Okke

    2014-05-01

    Distributed hydrological models enhance the analysis and explanation of environmental processes. As more spatial input data and time series become available, more analysis of the sensitivity of the simulations to these data is required. Most research so far has focused on the sensitivity of distributed hydrological models to precipitation data. However, such results cannot be compared until a universal approach to quantifying the sensitivity of a model to spatial data is available. Among the most frequently tested and used remote sensing data for distributed models is snow cover. Snow cover fraction (SCF) remote sensing products are easily available from the internet, e.g. the MODIS snow cover product MOD10A1 (daily snow cover fraction at 500 m spatial resolution). In this work a spatial sensitivity analysis (SA) of remotely sensed SCF from MOD10A1 was conducted with the distributed WetSpa model. The aim is to investigate whether the WetSpa model is differently subjected to SCF uncertainty in different areas of the model domain. The analysis was extended to look not only at SA quantities but also to relate them to the physical parameters and processes in the study area. The study area is the Biebrza River catchment, Poland, which is considered a semi-natural catchment subject to a spring snow-melt regime. Hydrological simulations are performed with the distributed WetSpa model over a simulation period of 2 hydrological years. For the SA, the Latin-Hypercube One-factor-At-a-Time (LH-OAT) algorithm is used, with a set of different response functions on a regular 4 × 4 km grid. The results show that the spatial patterns of sensitivity can be readily interpreted through the co-occurrence of different landscape features. Moreover, the spatial patterns of the SA results are related to the WetSpa spatial parameters and to different physical processes. Based on the study results, it is clear that a spatial approach to SA can be performed with the proposed algorithm and the MOD10A1 SCF is spatially sensitive in
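
    A schematic implementation of the LH-OAT algorithm follows van Griensven's formulation: Latin-hypercube base points cover the parameter space, each parameter is then perturbed one at a time, and the relative partial effects are averaged over the base points. The toy response function stands in for a WetSpa response function in one grid cell; all numbers are illustrative.

      import numpy as np

      def lh_oat(model, bounds, n_samples=10, delta=0.05, seed=4):
          # Latin-Hypercube One-factor-At-a-Time sensitivity indices
          rng = np.random.default_rng(seed)
          n_par = len(bounds)
          lo, hi = np.asarray(bounds, dtype=float).T
          # Latin-hypercube base points: one random value per stratum and parameter
          strata = np.tile(np.arange(n_samples), (n_par, 1))
          u = rng.permuted(strata, axis=1).T + rng.random((n_samples, n_par))
          X = lo + u / n_samples * (hi - lo)
          S = np.zeros(n_par)
          for x in X:
              y0 = model(x)
              for j in range(n_par):              # perturb one factor at a time
                  xp = x.copy()
                  xp[j] *= 1.0 + delta
                  yp = model(xp)
                  # relative partial effect of parameter j at this base point
                  S[j] += abs((yp - y0) / ((yp + y0) / 2.0)) / delta
          return S / n_samples

      model = lambda p: p[0]**2 + 0.5*p[1] + 0.1*p[2]   # toy response function
      print(np.round(lh_oat(model, [(0.5, 2), (0.5, 2), (0.5, 2)]), 3))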

  5. Spatial Intensity Distribution Analysis Reveals Abnormal Oligomerization of Proteins in Single Cells.

    PubMed

    Godin, Antoine G; Rappaz, Benjamin; Potvin-Trottier, Laurent; Kennedy, Timothy E; De Koninck, Yves; Wiseman, Paul W

    2015-08-18

    Knowledge of membrane receptor organization is essential for understanding the initial steps in cell signaling and trafficking mechanisms, but quantitative analysis of receptor interactions at the single-cell level and in different cellular compartments has remained highly challenging. To achieve this, we apply a quantitative image analysis technique, spatial intensity distribution analysis (SpIDA), that can measure fluorescent particle concentrations and oligomerization states within different subcellular compartments in live cells. An important technical challenge faced by fluorescence microscopy-based measurement of oligomerization is the fidelity of receptor labeling. In practice, imperfect labeling biases the distribution of oligomeric states measured within an aggregated system. We extend SpIDA to enable analysis of high-order oligomers from fluorescence microscopy images, by including a probability-weighted correction algorithm for nonemitting labels. We demonstrated that this fraction of nonemitting probes could be estimated in single cells using SpIDA measurements on model systems with known oligomerization state. Previously, this artifact was measured using single-step photobleaching. This approach was validated using computer-simulated data, and the imperfect labeling was quantified in cells with ion channels of known oligomer subunit count. It was then applied to quantify the oligomerization states in different cell compartments of the proteolipid protein (PLP) expressed in COS-7 cells. Expression of a mutant PLP linked to impaired trafficking resulted in the detection of PLP tetramers that persist in the endoplasmic reticulum, while no difference was measured at the membrane between the distributions of wild-type and mutated PLPs. Our results demonstrate that SpIDA allows measurement of protein oligomerization in different compartments of intact cells, even when fractional mislabeling occurs as well as photobleaching during the imaging process, and

  6. Independent Orbiter Assessment (IOA): Analysis of the electrical power distribution and control subsystem, volume 1

    NASA Technical Reports Server (NTRS)

    Schmeckpeper, K. R.

    1987-01-01

    The results of the Independent Orbiter Assessment (IOA) of the Failure Modes and Effects Analysis (FMEA) and Critical Items List (CIL) are presented. The IOA approach features a top-down analysis of the hardware to determine failure modes, criticality, and potential critical items. To preserve independence, this analysis was accomplished without reliance upon the results contained within the NASA FMEA/CIL documentation. This report documents the independent analysis results corresponding to the Orbiter Electrical Power Distribution and Control (EPD and C) hardware. The EPD and C hardware performs the functions of distributing, sensing, and controlling 28 volt DC power and of inverting, distributing, sensing, and controlling 117 volt 400 Hz AC power to all Orbiter subsystems from the three fuel cells in the Electrical Power Generation (EPG) subsystem. Each level of hardware was evaluated and analyzed for possible failure modes and effects. Criticality was assigned based upon the severity of the effect for each failure mode. Of the 1671 failure modes analyzed, 9 single failures were determined to result in loss of crew or vehicle. Three single failures unique to intact abort were determined to result in possible loss of the crew or vehicle. A possible loss of mission could result if any of 136 single failures occurred. Six of the criticality 1/1 failures are in two rotary and two pushbutton switches that control External Tank and Solid Rocket Booster separation. The other 6 criticality 1/1 failures are fuses, one each per Aft Power Control Assembly (APCA) 4, 5, and 6 and one each per Forward Power Control Assembly (FPCA) 1, 2, and 3, that supply power to certain Main Propulsion System (MPS) valves and Forward Reaction Control System (RCS) circuits.

  7. Spatio-Temporal Distribution Characteristics and Trajectory Similarity Analysis of Tuberculosis in Beijing, China.

    PubMed

    Li, Lan; Xi, Yuliang; Ren, Fu

    2016-03-01

    Tuberculosis (TB) is an infectious disease with one of the highest reported incidences in China. The detection of the spatio-temporal distribution characteristics of TB is indicative of its prevention and control conditions. Trajectory similarity analysis detects variations and loopholes in prevention and provides urban public health officials and related decision makers more information for the allocation of public health resources and the formulation of prioritized health-related policies. This study analysed the spatio-temporal distribution characteristics of TB from 2009 to 2014 by utilizing spatial statistics, spatial autocorrelation analysis, and space-time scan statistics. Spatial statistics measured the TB incidence rate (TB patients per 100,000 residents) at the district level to determine its spatio-temporal distribution and to identify characteristics of change. Spatial autocorrelation analysis was used to detect global and local spatial autocorrelations across the study area. Purely spatial, purely temporal and space-time scan statistics were used to identify purely spatial, purely temporal and spatio-temporal clusters of TB at the district level. The other objective of this study was to compare the trajectory similarities between the incidence rates of TB and new smear-positive (NSP) TB patients in the resident population (NSPRP)/new smear-positive TB patients in the TB patient population (NSPTBP)/retreated smear-positive (RSP) TB patients in the resident population (RSPRP)/retreated smear-positive TB patients in the TB patient population (RSPTBP) to detect variations and loopholes in TB prevention and control among the districts in Beijing. The incidence rates in Beijing exhibited a gradual decrease from 2009 to 2014. Although global spatial autocorrelation was not detected overall across all of the districts of Beijing, individual districts did show evidence of local spatial autocorrelation: Chaoyang and Daxing were Low-Low districts over the six
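
    For readers unfamiliar with the statistic behind the global autocorrelation test used here, the snippet below computes global Moran's I for a toy set of district incidence rates under a hypothetical adjacency matrix; neither the rates nor the neighbourhood structure are Beijing data.

      import numpy as np

      def morans_i(x, W):
          # global Moran's I for values x under a binary spatial weight matrix W
          z = x - x.mean()
          return (len(x) / W.sum()) * (z @ W @ z) / (z @ z)

      rates = np.array([38.2, 35.9, 30.1, 28.7, 41.5, 25.3])  # per 100,000, invented
      W = np.array([[0, 1, 1, 0, 0, 0],
                    [1, 0, 1, 1, 0, 0],
                    [1, 1, 0, 1, 1, 0],
                    [0, 1, 1, 0, 1, 1],
                    [0, 0, 1, 1, 0, 1],
                    [0, 0, 0, 1, 1, 0]])
      # expectation under no autocorrelation is -1/(n-1); values near it look random
      print(f"Moran's I = {morans_i(rates, W):.3f}")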

  8. Spatio-Temporal Distribution Characteristics and Trajectory Similarity Analysis of Tuberculosis in Beijing, China

    PubMed Central

    Li, Lan; Xi, Yuliang; Ren, Fu

    2016-01-01

    Tuberculosis (TB) is an infectious disease with one of the highest reported incidences in China. The detection of the spatio-temporal distribution characteristics of TB is indicative of its prevention and control conditions. Trajectory similarity analysis detects variations and loopholes in prevention and provides urban public health officials and related decision makers more information for the allocation of public health resources and the formulation of prioritized health-related policies. This study analysed the spatio-temporal distribution characteristics of TB from 2009 to 2014 by utilizing spatial statistics, spatial autocorrelation analysis, and space-time scan statistics. Spatial statistics measured the TB incidence rate (TB patients per 100,000 residents) at the district level to determine its spatio-temporal distribution and to identify characteristics of change. Spatial autocorrelation analysis was used to detect global and local spatial autocorrelations across the study area. Purely spatial, purely temporal and space-time scan statistics were used to identify purely spatial, purely temporal and spatio-temporal clusters of TB at the district level. The other objective of this study was to compare the trajectory similarities between the incidence rates of TB and new smear-positive (NSP) TB patients in the resident population (NSPRP)/new smear-positive TB patients in the TB patient population (NSPTBP)/retreated smear-positive (RSP) TB patients in the resident population (RSPRP)/retreated smear-positive TB patients in the TB patient population (RSPTBP) to detect variations and loopholes in TB prevention and control among the districts in Beijing. The incidence rates in Beijing exhibited a gradual decrease from 2009 to 2014. Although global spatial autocorrelation was not detected overall across all of the districts of Beijing, individual districts did show evidence of local spatial autocorrelation: Chaoyang and Daxing were Low-Low districts over the six

  10. Iterative Monte Carlo analysis of spin-dependent parton distributions

    DOE PAGESBeta

    Sato, Nobuo; Melnitchouk, Wally; Kuhn, Sebastian E.; Ethier, Jacob J.; Accardi, Alberto

    2016-04-05

    We present a comprehensive new global QCD analysis of polarized inclusive deep-inelastic scattering, including the latest high-precision data on longitudinal and transverse polarization asymmetries from Jefferson Lab and elsewhere. The analysis is performed using a new iterative Monte Carlo fitting technique which generates stable fits to polarized parton distribution functions (PDFs) with statistically rigorous uncertainties. Inclusion of the Jefferson Lab data leads to a reduction in the PDF errors for the valence and sea quarks, as well as in the gluon polarization uncertainty at x ≳ 0.1. Furthermore, the study provides the first determination of the flavor-separated twist-3 PDFs and the d2 moment of the nucleon within a global PDF analysis.

  11. A Model Based on Environmental Factors for Diameter Distribution in Black Wattle in Brazil

    PubMed Central

    Sanquetta, Carlos Roberto; Behling, Alexandre; Dalla Corte, Ana Paula; Péllico Netto, Sylvio; Rodrigues, Aurelio Lourenço; Simon, Augusto Arlindo

    2014-01-01

    This article discusses the dynamics of the diameter distribution in stands of black wattle throughout the growth cycle using the Weibull probability density function. The parameters of this distribution were related to environmental variables from meteorological data and the surface soil horizon, with the aim of finding a model for the diameter distribution whose coefficients are related to the environmental variables. We found that the diameter distribution of the stand changes only slightly over time and that the estimators of the Weibull function are correlated with various environmental variables, with accumulated rainfall foremost among them. Thus, a model was obtained in which the estimators of the Weibull function are dependent on rainfall. Such a function can have important applications, such as simulating growth potential in regions where historical growth data are lacking, as well as the behavior of the stand under different environmental conditions. The model can also be used to project growth in diameter, based on the rainfall affecting the forest over a certain time period. PMID:24932909
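
    A small sketch of the modelling idea, assuming (for illustration only) that the Weibull scale parameter is a linear function of accumulated rainfall while the shape stays fixed; the coefficients a0, a1 and the shape c below are invented, not the fitted values of the study.

      from scipy.stats import weibull_min

      a0, a1, c = 1.8, 4e-4, 2.4         # illustrative coefficients, not fitted values

      def diameter_distribution(rainfall_mm):
          scale = a0 + a1 * rainfall_mm  # Weibull scale (cm) grows with rainfall
          return weibull_min(c, loc=0.0, scale=scale)

      for rain in (1000, 3000, 6000):    # accumulated rainfall (mm)
          dist = diameter_distribution(rain)
          print(f"rainfall {rain:>4} mm -> mean diameter {dist.mean():.2f} cm, "
                f"90th percentile {dist.ppf(0.9):.2f} cm")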

  12. A Grid-based solution for management and analysis of microarrays in distributed experiments

    PubMed Central

    Porro, Ivan; Torterolo, Livia; Corradi, Luca; Fato, Marco; Papadimitropoulos, Adam; Scaglione, Silvia; Schenone, Andrea; Viti, Federica

    2007-01-01

    Several systems have been presented in recent years to manage the complexity of large microarray experiments. Although good results have been achieved, most systems fall short in one or more areas. A Grid-based approach may provide a shared, standardized and reliable solution for storage and analysis of biological data, in order to maximize the results of experimental efforts. A Grid framework has therefore been adopted, owing to the necessity of remotely accessing large amounts of distributed data and of scaling computational performance for terabyte datasets. Two different biological studies were planned in order to highlight the benefits that can emerge from our Grid-based platform. The described environment relies on storage services and computational services provided by the gLite Grid middleware. The Grid environment is also able to exploit the added value of metadata to let users better classify and search experiments. A state-of-the-art Grid portal has been implemented to hide the complexity of the framework from end users and to give them easy access to the available services and data. The functional architecture of the portal is described. As a first test of system performance, a gene expression analysis was performed on a dataset of Affymetrix GeneChip® Rat Expression Array RAE230A from the ArrayExpress database. The sequence of analysis includes three steps: (i) group opening and image set uploading, (ii) normalization, and (iii) model-based gene expression (based on the PM/MM difference model). Two different Linux versions (sequential and parallel) of the dChip software were developed to implement the analysis and were tested on a cluster. From the results, it emerges that parallelization of the analysis process and execution of parallel jobs on distributed computational resources actually improve performance. Moreover, the Grid environment has been tested both against the possibility of

  13. Distributed and/or grid-oriented approach to BTeV data analysis

    SciTech Connect

    Joel N. Butler

    2002-12-23

    The BTeV collaboration will record approximately 2 petabytes of raw data per year. It plans to analyze this data using the distributed resources of the collaboration as well as dedicated resources, primarily residing in the very large BTeV trigger farm, and resources accessible through the developing world-wide data grid. The data analysis system is being designed from the very start with this approach in mind. In particular, we plan a fully disk-based data storage system with multiple copies of the data distributed across the collaboration to provide redundancy and to optimize access. We will also position ourselves to take maximum advantage of shared systems, as well as dedicated systems, at our collaborating institutions.

  14. Analysis of the size, shape, and spatial distribution of microinclusions by neutron-activation autoradiography

    SciTech Connect

    Flitsiyan, E.S.; Romanovskii, A.V.; Gurvich, L.G.; Kist, A.A.

    1987-02-01

    The local concentration and spatial distribution of some elements in minerals, rocks, and ores can be determined by means of neutron-activation autoradiography. The local element concentration is measured in this method by placing an activated section of the rock to be analyzed, together with an irradiated standard, against a photographic emulsion which acts as a radiation detector. The photographic density of the exposed emulsion varies as a function of the tested element content in the part of the sample next to the detector. In order to assess the value of neutron-activation autoradiography in the analysis of element distribution, we considered the main factors affecting the production of selective autoradiographs, viz., resolution, detection limit, and optimal irradiation conditions, holding time, and exposure.

  15. Exposure models for the prior distribution in bayesian decision analysis for occupational hygiene decision making.

    PubMed

    Lee, Eun Gyung; Kim, Seung Won; Feigley, Charles E; Harper, Martin

    2013-01-01

    This study introduces two semi-quantitative methods, Structured Subjective Assessment (SSA) and Control of Substances Hazardous to Health (COSHH) Essentials, in conjunction with two-dimensional Monte Carlo simulations for determining prior probabilities. A prior distribution based on expert judgment was included for comparison. Practical applications of the proposed methods were demonstrated using personal exposure measurements of isoamyl acetate in an electronics manufacturing facility and of isopropanol in a printing shop. The applicability of these methods in real workplaces was discussed based on the advantages and disadvantages of each method. Although these methods could not be completely independent of expert judgment, this study demonstrated a methodological improvement in the estimation of the prior distribution for the Bayesian decision analysis tool. The proposed methods provide a logical basis for the decision process by considering the determinants of worker exposure. PMID:23252451
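
    The two-dimensional Monte Carlo idea can be sketched as follows: an outer loop samples the uncertain exposure-model inputs (here a geometric mean and geometric standard deviation), an inner loop propagates day-to-day variability, and the resulting exceedance fractions are binned into rating categories to form a prior. All distributions, the exposure limit and the category cut-offs are illustrative assumptions, not the SSA or COSHH Essentials parameterisations.

      import numpy as np

      rng = np.random.default_rng(5)
      n_outer, n_inner = 1000, 500
      oel = 100.0                          # assumed occupational exposure limit (ppm)

      # outer loop: uncertainty about the exposure profile (illustrative choices)
      gm = rng.lognormal(np.log(20.0), 0.4, size=n_outer)   # geometric mean
      gsd = rng.uniform(1.5, 3.0, size=n_outer)             # geometric SD

      # inner loop: daily exposures for each candidate (gm, gsd) pair
      exposures = rng.lognormal(np.log(gm)[:, None], np.log(gsd)[:, None],
                                size=(n_outer, n_inner))
      exceedance = (exposures > oel).mean(axis=1)  # P(exposure > OEL) per draw

      # prior probabilities over illustrative exceedance categories
      edges = [0.0, 0.01, 0.10, 0.50, 1.01]
      prior, _ = np.histogram(exceedance, bins=edges)
      print("prior over exposure categories:", prior / n_outer)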

  16. Geographic Distribution of Leishmania Species in Ecuador Based on the Cytochrome B Gene Sequence Analysis

    PubMed Central

    Kato, Hirotomo; Gomez, Eduardo A.; Martini-Robles, Luiggi; Muzzio, Jenny; Velez, Lenin; Calvopiña, Manuel; Romero-Alvarez, Daniel; Mimori, Tatsuyuki; Uezato, Hiroshi; Hashiguchi, Yoshihisa

    2016-01-01

    A countrywide epidemiological study was performed to elucidate the current geographic distribution of causative species of cutaneous leishmaniasis (CL) in Ecuador by using FTA card-spotted samples and smear slides as DNA sources. Putative Leishmania in 165 samples collected from patients with CL in 16 provinces of Ecuador were examined at the species level based on the cytochrome b gene sequence analysis. Of these, 125 samples were successfully identified as Leishmania (Viannia) guyanensis, L. (V.) braziliensis, L. (V.) naiffi, L. (V.) lainsoni, and L. (Leishmania) mexicana. Two dominant species, L. (V.) guyanensis and L. (V.) braziliensis, were widely distributed in Pacific coast subtropical and Amazonian tropical areas, respectively. Recently reported L. (V.) naiffi and L. (V.) lainsoni were identified in Amazonian areas, and L. (L.) mexicana was identified in an Andean highland area. Importantly, the present study demonstrated that cases of L. (V.) braziliensis infection are increasing in Pacific coast areas. PMID:27410039

  17. Simulation and analysis of an intermediate frequency (IF) distribution system with applications for Space Station

    NASA Technical Reports Server (NTRS)

    Costello, Thomas A.; Brandt, C. Maite

    1989-01-01

    Simulation and analysis results are described for a wideband fiber optic intermediate frequency distribution channel for a frequency division multiple access (FDMA) system in which the antenna equipment is remotely located from the signal processing equipment. The fiber optic distribution channel accommodates multiple signals, received from a single antenna, with differing power levels. The performance parameters addressed are intermodulation degradation, laser noise, and adjacent channel interference, as they impact the overall system design. Simulation results showed that the laser diode modulation level can be allowed to reach 100 percent without considerable degradation. The laser noise must be controlled so as to provide a noise floor of less than -90 dBW/Hz. The fiber optic link increases the degradation due to power imbalance yet diminishes the effects of the transmit amplifier nonlinearity. Overall, optimal operating conditions can be found that yield a degradation level of about 0.1 dB caused by the fiber optic link.

  18. Preliminary analysis of the span-distributed-load concept for cargo aircraft design

    NASA Technical Reports Server (NTRS)

    Whitehead, A. H., Jr.

    1975-01-01

    A simplified computer analysis of the span-distributed-load airplane (in which payload is placed within the wing structure) has shown that the span-distributed-load concept has high potential for application to future air cargo transport design. Significant increases in payload fraction over current wide-bodied freighters are shown for gross weights in excess of 0.5 Gg (1,000,000 lb). A cruise-matching calculation shows that the trend toward higher aspect ratio improves overall efficiency; that is, less thrust and fuel are required. The optimal aspect ratio probably is not determined by structural limitations. Terminal-area constraints and increasing design-payload density, however, tend to limit aspect ratio.

  19. A distributed analysis and visualization system for model and observational data

    NASA Technical Reports Server (NTRS)

    Wilhelmson, Robert; Koch, Steven

    1993-01-01

    The objective of this proposal is to develop an integrated and distributed analysis and display software system which can be applied to all areas of the Earth System Science to study numerical model and earth observational data from storm to global scale. This system will be designed to be easy to use, portable, flexible and easily extensible and designed to adhere to current and emerging standards whenever possible. It will provide an environment for visualization of the massive amounts of data generated from satellites and other observational field measurements and from model simulations during or after their execution. Two- and three-dimensional animation will also be provided. This system will be based on a widely used software package from NASA called GEMPAK and prototype software for three-dimensional interactive displays built at NCSA. The underlying foundation of the system will be a set of software libraries which can be distributed across a UNIX based supercomputer and workstations.

  20. A distributed analysis and visualization system for model and observational data

    NASA Technical Reports Server (NTRS)

    Wilhelmson, Robert; Koch, Steven

    1992-01-01

    The objective of this proposal is to develop an integrated and distributed analysis and display software system which can be applied to all areas of the Earth System Science to study numerical model and earth observational data from storm to global scale. This system will be designed to be easy to use, portable, flexible and easily extensible and to adhere to current and emerging standards whenever possible. It will provide an environment for visualization of the massive amounts of data generated from satellites and other observational field measurements and from model simulations during or after their execution. Two- and three-dimensional animation will also be provided. This system will be based on a widely used software package from NASA called GEMPAK and prototype software for three dimensional interactive displays built at NCSA. The underlying foundation of the system will be a set of software libraries which can be distributed across a UNIX based supercomputer and workstations.

  1. EXERGY ANALYSIS OF THE CRYOGENIC HELIUM DISTRIBUTION SYSTEM FOR THE LARGE HADRON COLLIDER (LHC)

    SciTech Connect

    Claudet, S.; Lebrun, Ph.; Tavian, L.; Wagner, U.

    2010-04-09

    The Large Hadron Collider (LHC) at CERN features the world's largest helium cryogenic system, spreading over the 26.7 km circumference of the superconducting accelerator. With a total equivalent capacity of 145 kW at 4.5 K, including 18 kW at 1.8 K, the LHC refrigerators produce an unprecedented exergetic load, which must be distributed efficiently to the magnets in the tunnel over the 3.3 km length of each of the eight independent sectors of the machine. We recall the main features of the LHC cryogenic helium distribution system at different temperature levels and present its exergy analysis, which makes it possible to qualify second-principle (second-law) efficiency and to identify the main remaining sources of irreversibility.
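
    The order of magnitude of this exergetic load can be checked with the Carnot factor: the minimum reversible input power needed to extract heat Q at temperature T against ambient temperature T0 is W_min = Q(T0/T - 1). The split below treats the quoted 18 kW at 1.8 K separately and assigns the remainder of the 145 kW equivalent capacity to 4.5 K, a simplification made for illustration.

      T0 = 300.0                          # assumed ambient reference temperature (K)
      loads = {4.5: 145e3 - 18e3,         # refrigeration assigned to 4.5 K (W)
               1.8: 18e3}                 # refrigeration at 1.8 K (W)

      for T, Q in loads.items():
          w_min = Q * (T0 / T - 1.0)      # Carnot (reversible) input power
          print(f"{Q/1e3:5.0f} kW at {T} K -> at least {w_min/1e6:.2f} MW of input power")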

  2. Methods and apparatuses for information analysis on shared and distributed computing systems

    DOEpatents

    Bohn, Shawn J [Richland, WA; Krishnan, Manoj Kumar [Richland, WA; Cowley, Wendy E [Richland, WA; Nieplocha, Jarek [Richland, WA

    2011-02-22

    Apparatuses and computer-implemented methods for analyzing, on shared and distributed computing systems, information comprising one or more documents are disclosed according to some aspects. In one embodiment, information analysis can comprise distributing one or more distinct sets of documents among each of a plurality of processes, wherein each process performs operations on a distinct set of documents substantially in parallel with other processes. Operations by each process can further comprise computing term statistics for terms contained in each distinct set of documents, thereby generating a local set of term statistics for each distinct set of documents. Still further, operations by each process can comprise contributing the local sets of term statistics to a global set of term statistics, and participating in generating a major term set from an assigned portion of a global vocabulary.
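
    A toy rendering of the claimed flow: each process computes local term statistics for its distinct set of documents, the local sets are contributed to a global set, and the most frequent terms form a (trivial) major term set. This is a schematic reading of the patent text, not its actual implementation; the two document sets are invented.

      from collections import Counter
      from multiprocessing import Pool

      def local_term_stats(docs):
          # term statistics for one distinct set of documents (one process)
          counts = Counter()
          for doc in docs:
              counts.update(doc.lower().split())
          return counts

      if __name__ == "__main__":
          document_sets = [["the cat sat", "the dog ran"],
                           ["a cat and a dog", "the end"]]
          with Pool(2) as pool:                       # one process per document set
              local_sets = pool.map(local_term_stats, document_sets)
          global_stats = sum(local_sets, Counter())   # contribute local -> global
          major_terms = [t for t, _ in global_stats.most_common(3)]
          print(global_stats, major_terms)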

  3. Geographic Distribution of Leishmania Species in Ecuador Based on the Cytochrome B Gene Sequence Analysis.

    PubMed

    Kato, Hirotomo; Gomez, Eduardo A; Martini-Robles, Luiggi; Muzzio, Jenny; Velez, Lenin; Calvopiña, Manuel; Romero-Alvarez, Daniel; Mimori, Tatsuyuki; Uezato, Hiroshi; Hashiguchi, Yoshihisa

    2016-07-01

    A countrywide epidemiological study was performed to elucidate the current geographic distribution of causative species of cutaneous leishmaniasis (CL) in Ecuador by using FTA card-spotted samples and smear slides as DNA sources. Putative Leishmania in 165 samples collected from patients with CL in 16 provinces of Ecuador were examined at the species level based on the cytochrome b gene sequence analysis. Of these, 125 samples were successfully identified as Leishmania (Viannia) guyanensis, L. (V.) braziliensis, L. (V.) naiffi, L. (V.) lainsoni, and L. (Leishmania) mexicana. Two dominant species, L. (V.) guyanensis and L. (V.) braziliensis, were widely distributed in Pacific coast subtropical and Amazonian tropical areas, respectively. Recently reported L. (V.) naiffi and L. (V.) lainsoni were identified in Amazonian areas, and L. (L.) mexicana was identified in an Andean highland area. Importantly, the present study demonstrated that cases of L. (V.) braziliensis infection are increasing in Pacific coast areas. PMID:27410039

  4. Exposure models for the prior distribution in bayesian decision analysis for occupational hygiene decision making.

    PubMed

    Lee, Eun Gyung; Kim, Seung Won; Feigley, Charles E; Harper, Martin

    2013-01-01

    This study introduces two semi-quantitative methods, Structured Subjective Assessment (SSA) and Control of Substances Hazardous to Health (COSHH) Essentials, in conjunction with two-dimensional Monte Carlo simulations for determining prior probabilities. A prior distribution based on expert judgment was included for comparison. Practical applications of the proposed methods were demonstrated using personal exposure measurements of isoamyl acetate in an electronics manufacturing facility and of isopropanol in a printing shop. The applicability of these methods in real workplaces was discussed based on the advantages and disadvantages of each method. Although these methods could not be completely independent of expert judgment, this study demonstrated a methodological improvement in the estimation of the prior distribution for the Bayesian decision analysis tool. The proposed methods provide a logical basis for the decision process by considering the determinants of worker exposure.

  5. Analysis and modeling of information flow and distributed expertise in space-related operations.

    PubMed

    Caldwell, Barrett S

    2005-01-01

    Evolving space operations requirements and mission planning for long-duration expeditions require detailed examinations and evaluations of information flow dynamics, knowledge-sharing processes, and information technology use in distributed expert networks. This paper describes the work conducted with flight controllers in the Mission Control Center (MCC) of NASA's Johnson Space Center. This MCC work describes the behavior of experts in a distributed supervisory coordination framework, which extends supervisory control/command and control models of human task performance. Findings from this work are helping to develop analysis techniques, information architectures, and system simulation capabilities for knowledge sharing in an expert community. These findings are being applied to improve knowledge-sharing processes applied to a research program in advanced life support for long-duration space flight. Additional simulation work is being developed to create interoperating modules of information flow and novice/expert behavior patterns. PMID:15835058

  6. Statistical Distribution of Inflation on Lava Flows: Analysis of Flow Surfaces on Earth and Mars

    NASA Technical Reports Server (NTRS)

    Glazel, L. S.; Anderson, S. W.; Stofan, E. R.; Baloga, S.

    2003-01-01

    The surface morphology of a lava flow results from processes that take place during the emplacement of the flow. Certain types of features, such as tumuli, lava rises and lava rise pits, are indicators of flow inflation or endogenous growth of a lava flow. Tumuli in particular have been identified as possible indicators of tube location, indicating that their distribution on the surface of a lava flow is a function of the internal pathways of lava present during flow emplacement. However, the distribution of tumuli on lava flows has not been examined in a statistically thorough manner. In order to more rigorously examine the distribution of tumuli on a lava flow, we examined a discrete flow lobe with numerous lava rises and tumuli on the 1969 - 1974 Mauna Ulu flow at Kilauea, Hawaii. The lobe is located in the distal portion of the flow below Holei Pali, which is characterized by hummocky pahoehoe flows emplaced from tubes. We chose this flow due to its discrete nature allowing complete mapping of surface morphologies, well-defined boundaries, well-constrained emplacement parameters, and known flow thicknesses. In addition, tube locations for this Mauna Ulu flow were mapped by Holcomb (1976) during flow emplacement. We also examine the distribution of tumuli on the distal portion of the hummocky Thrainsskjoldur flow field provided by Rossi and Gudmundsson (1996). Analysis of the Mauna Ulu and Thrainsskjoldur flow lobes and the availability of high-resolution MOC images motivated us to look for possible tumuli-dominated flow lobes on the surface of Mars. We identified a MOC image of a lava flow south of Elysium Mons with features morphologically similar to tumuli. The flow is characterized by raised elliptical to circular mounds, some with axial cracks, that are similar in size to the tumuli measured on Earth. One potential avenue of determining whether they are tumuli is to look at the spatial distribution to see if any patterns similar to those of tumuli
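
    One standard first look at whether mound positions are clustered, random or regular is the Clark-Evans nearest-neighbour ratio, sketched below on invented tumuli coordinates; edge corrections and the true flow outline are ignored for brevity.

      import numpy as np
      from scipy.spatial import cKDTree

      rng = np.random.default_rng(7)
      extent = np.array([2000.0, 800.0])            # assumed lobe extent (m)
      pts = rng.random((120, 2)) * extent           # hypothetical tumuli positions

      # observed mean nearest-neighbour distance (k=2: nearest besides self)
      d_nn, _ = cKDTree(pts).query(pts, k=2)
      r_obs = d_nn[:, 1].mean()

      # expectation under complete spatial randomness at the same density
      density = len(pts) / extent.prod()
      r_csr = 0.5 / np.sqrt(density)
      print(f"Clark-Evans R = {r_obs / r_csr:.2f} (R<1 clustered, ~1 random, >1 regular)")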

  7. Using occlusal wear information and finite element analysis to investigate stress distributions in human molars

    PubMed Central

    Benazzi, Stefano; Kullmer, Ottmar; Grosse, Ian R; Weber, Gerhard W

    2011-01-01

    Simulations based on finite element analysis (FEA) have attracted increasing interest in dentistry and dental anthropology for evaluating the stress and strain distribution in teeth under occlusal loading conditions. Nonetheless, FEA is usually applied without considering changes in contacts between antagonistic teeth during the occlusal power stroke. In this contribution we show how occlusal information can be used to investigate the stress distribution with 3D FEA in lower first molars (M1). The antagonistic crowns M1 and P2–M1 of two dried modern human skulls were scanned by μCT in maximum intercuspation (centric occlusion) contact. A virtual analysis of the occlusal power stroke between M1 and P2–M1 was carried out in the Occlusal Fingerprint Analyser (OFA) software, and the occlusal trajectory path was recorded, while contact areas per time-step were visualized and quantified. Stress distributions of the M1 in selected occlusal stages were analyzed in strand7, considering occlusal information taken from the OFA results for individual loading direction and loading area. Our FEA results show that the stress pattern changes considerably during the power stroke, suggesting that wear facets have a crucial influence on the distribution of stress on the whole tooth. Grooves and fissures on the occlusal surface are seen as critical locations, as tensile stresses are concentrated at these features. Properly accounting for the power stroke kinematics of occluding teeth yields quite different results (less tensile stress in the crown) than the usual loading scenarios based on forces parallel to the long axis of the tooth. This leads to the conclusion that functional studies considering kinematics of teeth are important to understand biomechanics and interpret morphological adaptation of teeth. PMID:21615398

  8. Using occlusal wear information and finite element analysis to investigate stress distributions in human molars.

    PubMed

    Benazzi, Stefano; Kullmer, Ottmar; Grosse, Ian R; Weber, Gerhard W

    2011-09-01

    Simulations based on finite element analysis (FEA) have attracted increasing interest in dentistry and dental anthropology for evaluating the stress and strain distribution in teeth under occlusal loading conditions. Nonetheless, FEA is usually applied without considering changes in contacts between antagonistic teeth during the occlusal power stroke. In this contribution we show how occlusal information can be used to investigate the stress distribution with 3D FEA in lower first molars (M1). The antagonistic crowns M1 and P2-M1 of two dried modern human skulls were scanned by μCT in maximum intercuspation (centric occlusion) contact. A virtual analysis of the occlusal power stroke between M1 and P2-M1 was carried out in the Occlusal Fingerprint Analyser (OFA) software, and the occlusal trajectory path was recorded, while contact areas per time-step were visualized and quantified. Stress distributions of the M1 in selected occlusal stages were analyzed in Strand7, considering occlusal information taken from OFA results for individual loading direction and loading area. Our FEA results show that the stress pattern changes considerably during the power stroke, suggesting that wear facets have a crucial influence on the distribution of stress on the whole tooth. Grooves and fissures on the occlusal surface are seen as critical locations, as tensile stresses are concentrated at these features. Properly accounting for the power stroke kinematics of occluding teeth results in quite different results (less tensile stresses in the crown) than usual loading scenarios based on parallel forces to the long axis of the tooth. This leads to the conclusion that functional studies considering kinematics of teeth are important to understand biomechanics and interpret morphological adaptation of teeth.

  9. A statistical analysis of North East Atlantic (submicron) aerosol size distributions

    NASA Astrophysics Data System (ADS)

    Dall'Osto, M.; Monahan, C.; Greaney, R.; Beddows, D. C. S.; Harrison, R. M.; Ceburnis, D.; O'Dowd, C. D.

    2011-12-01

    The Global Atmospheric Watch research station at Mace Head (Ireland) offers the possibility to sample some of the cleanest air masses being imported into Europe as well as some of the most polluted being exported out of Europe. We present a statistical cluster analysis of the physical characteristics of aerosol size distributions in air ranging from the cleanest to the most polluted for the year 2008. Data coverage achieved was 75% throughout the year. By applying the Hartigan-Wong k-Means method, 12 clusters were identified as systematically occurring. These 12 clusters could be further combined into 4 categories with similar characteristics, namely: coastal nucleation (occurring 21.3% of the time), open ocean nucleation (occurring 32.6% of the time), background clean marine (occurring 26.1% of the time) and anthropogenic (occurring 20% of the time) aerosol size distributions. The coastal nucleation category is characterised by a clear and dominant nucleation mode at sizes less than 10 nm while the open ocean nucleation category is characterised by a dominant Aitken mode between 15 nm and 50 nm. The background clean marine aerosol exhibited a clear bimodality in the sub-micron size distribution, although it should be noted that either the Aitken mode or the accumulation mode may dominate the number concentration. However, peculiar background clean marine size distributions with coarser accumulation modes are also observed during winter months. By contrast, the continentally-influenced size distributions are generally more monomodal (accumulation), albeit with traces of bimodality. The open ocean category occurs more often during May, June and July, corresponding with the North East (NE) Atlantic high biological period. Combined with the relatively high percentage frequency of occurrence (32.6%), this suggests that the marine biota is an important source of new nano aerosol particles in NE Atlantic air.
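
    For readers wanting to reproduce this style of clustering, the sketch below groups normalized size spectra with k-means. Note one substitution: scikit-learn's KMeans implements Lloyd's algorithm rather than the Hartigan-Wong variant used in the study, and the spectra here are synthetic stand-ins.

    ```python
    import numpy as np
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(1)
    # synthetic stand-in: 500 hourly spectra over 30 size bins (e.g. 10-500 nm)
    spectra = rng.lognormal(mean=0.0, sigma=1.0, size=(500, 30))
    spectra /= spectra.sum(axis=1, keepdims=True)   # normalize so spectral shape drives the clustering

    km = KMeans(n_clusters=12, n_init=10, random_state=0).fit(spectra)
    for k in range(12):
        frac = np.mean(km.labels_ == k) * 100.0
        print(f"cluster {k:2d}: {frac:4.1f}% of hours")
    ```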

  10. Characterizing the distribution of an endangered salmonid using environmental DNA analysis

    USGS Publications Warehouse

    Laramie, Matthew B.; Pilliod, David S.; Goldberg, Caren S.

    2015-01-01

    Determining species distributions accurately is crucial to developing conservation and management strategies for imperiled species, but a challenging task for small populations. We evaluated the efficacy of environmental DNA (eDNA) analysis for improving detection and thus potentially refining the known distribution of Chinook salmon (Oncorhynchus tshawytscha) in the Methow and Okanogan Subbasins of the Upper Columbia River, which span the border between Washington, USA and British Columbia, Canada. We developed an assay to target a 90 base pair sequence of Chinook DNA and used quantitative polymerase chain reaction (qPCR) to quantify the amount of Chinook eDNA in triplicate 1-L water samples collected at 48 stream locations in June and again in August 2012. The overall probability of detecting Chinook with our eDNA method in areas within the known distribution was 0.77 (±0.05 SE). Detection probability was lower in June (0.62, ±0.08 SE) during high flows and at the beginning of spring Chinook migration than during base flows in August (0.93, ±0.04 SE). In the Methow subbasin, mean eDNA concentration was higher in August compared to June, especially in smaller tributaries, probably resulting from the arrival of spring Chinook adults, reduced discharge, or both. Chinook eDNA concentrations did not appear to change in the Okanogan subbasin from June to August. Contrary to our expectations about downstream eDNA accumulation, Chinook eDNA did not decrease in concentration in upstream reaches (0–120 km). Further examination of factors influencing spatial distribution of eDNA in lotic systems may allow for greater inference of local population densities along stream networks or watersheds. These results demonstrate the potential effectiveness of eDNA detection methods for determining landscape-level distribution of anadromous salmonids in large river systems.

  11. Analysis of facial sebum distribution using a digital fluorescent imaging system.

    PubMed

    Han, Byungkwan; Jung, Byungjo; Nelson, J Stuart; Choi, Eung-Ho

    2007-01-01

    Current methods for analysis of sebum excretion have limitations, such as irreproducible results in repeated measurements due to the point measurement method, user-dependent artifacts due to contact measurement or qualitative evaluation of the image, and long measurement time. A UV-induced fluorescent digital imaging system is developed to acquire facial images so that the distribution of sebum excretion on the face can be analyzed. The imaging system consists of a constant UV-A light source, digital color camera, and head-positioning device. The system for acquisition of a fluorescent facial image and the image analysis method are described. The imaging modality provides uniform light distribution and presents a discernible color fluorescent image. Valuable parameters of sebum excretion are obtained after image analysis. The imaging system, which provides a noncontact method, is proved to be a useful tool to evaluate the amount and pattern of sebum excretion. When compared to conventional "Wood's lamp" and "Sebutape" methods that provide similar parameters for sebum excretion, the described method is simpler and more reliable to evaluate the dynamics of sebum excretion in nearly real-time. PMID:17343481

  12. Poster — Thur Eve — 74: Distributed, asynchronous, reactive dosimetric and outcomes analysis using DICOMautomaton

    SciTech Connect

    Clark, Haley; Wu, Jonn; Moiseenko, Vitali; Thomas, Steven

    2014-08-15

    Many have speculated about the future of computational technology in clinical radiation oncology. It has been advocated that the next generation of computational infrastructure will improve on the current generation by incorporating richer aspects of automation, more heavily and seamlessly featuring distributed and parallel computation, and providing more flexibility toward aggregate data analysis. In this report we describe how a recently created and already available analysis framework (DICOMautomaton) incorporates these aspects. DICOMautomaton supports a variety of use cases but is especially suited for dosimetric outcomes correlation analysis, investigation and comparison of radiotherapy treatment efficacy, and dose-volume computation. We describe: how it overcomes computational bottlenecks by distributing workload across a network of machines; how modern, asynchronous computational techniques are used to reduce blocking and avoid unnecessary computation; and how issues of out-of-date data are addressed using reactive programming techniques and data dependency chains. We describe the internal architecture of the software and give a detailed demonstration of how DICOMautomaton could be used to search for correlations between dosimetric and outcomes data.
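
    The asynchronous, non-blocking consumption of distributed work described above can be illustrated in miniature. DICOMautomaton itself is not written in Python, so the following is only a conceptual sketch: stand-in dose-volume tasks are farmed to worker processes and results are consumed as they complete rather than in submission order.

    ```python
    from concurrent.futures import ProcessPoolExecutor, as_completed

    def dose_volume_metric(patient_id):
        # stand-in for an expensive dose-volume computation
        return patient_id, sum(i * i for i in range(200_000)) % 97

    if __name__ == "__main__":
        with ProcessPoolExecutor(max_workers=4) as pool:
            futures = [pool.submit(dose_volume_metric, pid) for pid in range(12)]
            for fut in as_completed(futures):   # consume results as they become ready
                pid, metric = fut.result()
                print(f"patient {pid}: metric {metric}")
    ```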

  13. Assessment of Forest Conservation Value Using a Species Distribution Model and Object-based Image Analysis

    NASA Astrophysics Data System (ADS)

    Jin, Y.; Lee, D. K.; Jeong, S. G.

    2015-12-01

    The ecological and social values of forests have recently been highlighted. Assessments of the biodiversity of forests, as well as their other ecological values, play an important role in regional and national conservation planning. The preservation of habitats is linked to the protection of biodiversity. For mapping habitats, a species distribution model (SDM) is used for predicting suitable habitat of significant species, and such distribution modeling is increasingly being used in conservation science. However, pixel-based analysis does not contain contextual or topological information. In order to provide more accurate habitat predictions, a continuous field view that better reflects the real world is required. Here we analyze and compare, at different scales, the habitats of the yellow marten (Martes flavigula), which is a top predator and also an umbrella species in South Korea. The object scale, which is a group of pixels that have similar spatial and spectral characteristics, and the pixel scale were used for the SDM. Our analysis using the SDM at different scales suggests that object-scale analysis provides a superior representation of continuous habitat, and thus will be useful in forest conservation planning as well as for species habitat monitoring.

  14. ImageJ analysis of dentin tubule distribution in human teeth.

    PubMed

    Williams, Casia; Wu, Yiching; Bowers, Doria F

    2015-08-01

    Mapping the distribution of dentin tubules is vital to understanding the structure-function relationship of dentin, an important indicator of tooth stability. This study compared the distances between and density of tubules in the external dentin located in the crown region of an adult human incisor and molar to determine if analysis could be conducted using light-level microscopy. Teeth were processed for routine histology, cut in cross-section, images captured using Advanced SPOT Program, and microstructure was analyzed using ImageJ (NIH). Intratubular (peritubular) dentin with or without odontoblast processes were observed and although incisor and molar images appeared visually similar, plot profile graphs differed. Distance-intervals between tubules in the incisor (5.45-7.67 μm) had an overall range of 2.22 μm and in the molar (7.43-8.42 μm) an overall range of 0.99 μm. While molar tubule distribution displayed a tighter overall range, there was a smaller distance between most incisor tubules. The average densities observed in incisors were 15,500 tubules/mm², compared with 20,100 tubules/mm² in molars. ImageJ analysis of prepared histology microscopic slides provides researchers with a rapid, inexpensive assessment tool when compared with advanced/ultrastructural methodologies. By combining routine histological processing and light microscopic observations followed by ImageJ analysis, tooth structure can be converted into numerical data and easily mastered by laboratory personnel.

  15. A new algorithm for importance analysis of the inputs with distribution parameter uncertainty

    NASA Astrophysics Data System (ADS)

    Li, Luyi; Lu, Zhenzhou

    2016-10-01

    Importance analysis is aimed at finding the contributions by the inputs to the uncertainty in a model output. For structural systems involving inputs with distribution parameter uncertainty, the contributions by the inputs to the output uncertainty are governed by both the variability and parameter uncertainty in their probability distributions. A natural and consistent way to arrive at importance analysis results in such cases would be a three-loop nested Monte Carlo (MC) sampling strategy, in which the parameters are sampled in the outer loop and the inputs are sampled in the inner nested double-loop. However, the computational effort of this procedure is often prohibitive for engineering problems. This paper, therefore, proposes a new, efficient algorithm for importance analysis of the inputs in the presence of parameter uncertainty. By introducing a 'surrogate sampling probability density function (SS-PDF)' and incorporating the single-loop MC theory into the computation, the proposed algorithm can reduce the original three-loop nested MC computation into a single-loop one in terms of model evaluation, which requires substantially less computational effort. Methods for choosing a proper SS-PDF are also discussed in the paper. The efficiency and robustness of the proposed algorithm have been demonstrated by results of several examples.
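
    The core trick of such a single-loop scheme is to draw all input samples once from a surrogate sampling PDF and then reweight them, via likelihood ratios, to any candidate value of the uncertain distribution parameters. The sketch below illustrates that idea on a toy model; the model, the SS-PDF choice, and the parameter sweep are hypothetical, not the paper's examples.

    ```python
    import numpy as np
    from scipy import stats

    def model(x):                                   # "expensive" model, evaluated once per sample
        return x[:, 0] ** 2 + 2.0 * x[:, 1]

    rng = np.random.default_rng(2)
    n = 20_000
    ss_pdf = stats.norm(loc=0.0, scale=2.0)         # wide SS-PDF covering all candidate parameter values
    x = ss_pdf.rvs(size=(n, 2), random_state=rng)
    y = model(x)                                    # the single loop of model evaluations

    def output_variance(mu1):
        """Output variance when input 1 ~ N(mu1, 1) and input 2 ~ N(0, 1), by reweighting."""
        target = stats.norm(mu1, 1.0).pdf(x[:, 0]) * stats.norm(0.0, 1.0).pdf(x[:, 1])
        w = target / (ss_pdf.pdf(x[:, 0]) * ss_pdf.pdf(x[:, 1]))
        w /= w.sum()                                # self-normalized importance weights
        m = np.sum(w * y)
        return np.sum(w * (y - m) ** 2)

    for mu1 in (-0.5, 0.0, 0.5):                    # sweep the uncertain parameter, no re-evaluation
        print(f"mu1 = {mu1:+.1f}: Var(Y) ~ {output_variance(mu1):.2f}")
    ```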

  16. Local storage federation through XRootD architecture for interactive distributed analysis

    NASA Astrophysics Data System (ADS)

    Colamaria, F.; Colella, D.; Donvito, G.; Elia, D.; Franco, A.; Luparello, G.; Maggi, G.; Miniello, G.; Vallero, S.; Vino, G.

    2015-12-01

    A cloud-based Virtual Analysis Facility (VAF) for the ALICE experiment at the LHC has been deployed in Bari. Similar facilities are currently running in other Italian sites with the aim to create a federation of interoperating farms able to provide their computing resources for interactive distributed analysis. The use of cloud technology, along with elastic provisioning of computing resources as an alternative to the grid for running data intensive analyses, is the main challenge of these facilities. One of the crucial aspects of the user-driven analysis execution is the data access. A local storage facility has the disadvantage that the stored data can be accessed only locally, i.e. from within the single VAF. To overcome such a limitation a federated infrastructure, which provides full access to all the data belonging to the federation independently from the site where they are stored, has been set up. The federation architecture exploits both cloud computing and XRootD technologies, in order to provide a dynamic, easy-to-use and well performing solution for data handling. It should allow the users to store the files and efficiently retrieve the data, since it implements a dynamic distributed cache among many datacenters in Italy connected to one another through the high-bandwidth national network. Details on the preliminary architecture implementation and performance studies are discussed.

  17. Statistical analysis of factors affecting landslide distribution in the new Madrid seismic zone, Tennessee and Kentucky

    USGS Publications Warehouse

    Jibson, R.W.; Keefer, D.K.

    1989-01-01

    More than 220 large landslides along the bluffs bordering the Mississippi alluvial plain between Cairo, Ill., and Memphis, Tenn., are analyzed by discriminant analysis and multiple linear regression to determine the relative effects of slope height and steepness, stratigraphic variation, slope aspect, and proximity to the hypocenters of the 1811-12 New Madrid, Mo., earthquakes on the distribution of these landslides. Three types of landslides are analyzed: (1) old, coherent slumps and block slides, which have eroded and revegetated features and no active analogs in the area; (2) old earth flows, which are also eroded and revegetated; and (3) young rotational slumps, which are present only along near-river bluffs, and which are the only young, active landslides in the area. Discriminant analysis shows that only one characteristic differs significantly between bluffs with and without young rotational slumps: failed bluffs tend to have sand and clay at their base, which may render them more susceptible to fluvial erosion. Bluffs having old coherent slides are significantly higher, steeper, and closer to the hypocenters of the 1811-12 earthquakes than bluffs without these slides. Bluffs having old earth flows are likewise higher and closer to the earthquake hypocenters. Multiple regression analysis indicates that the distribution of young rotational slumps is affected most strongly by slope steepness: about one-third of the variation in the distribution is explained by variations in slope steepness. The distribution of old coherent slides and earth flows is affected most strongly by slope height, but the proximity to the hypocenters of the 1811-12 earthquakes also significantly affects the distribution. The results of the statistical analyses indicate that the only recently active landsliding in the area is along actively eroding river banks, where rotational slumps formed as bluffs are undercut by the river.
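
    As a hedged illustration of the two statistical techniques named above, the sketch below runs a discriminant analysis and a multiple linear regression on synthetic bluff attributes; the variable names mirror the study's predictors, but all values and coefficients are fabricated placeholders.

    ```python
    import numpy as np
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from sklearn.linear_model import LinearRegression

    rng = np.random.default_rng(8)
    n = 220
    X = np.column_stack([
        rng.uniform(10, 60, n),      # slope height (m)
        rng.uniform(10, 45, n),      # slope steepness (deg)
        rng.uniform(5, 150, n),      # distance to the 1811-12 hypocenters (km)
    ])
    failed = (X[:, 1] + rng.normal(0, 5, n) > 30).astype(int)          # toy failed/unfailed labels
    density = 0.03 * X[:, 1] - 0.01 * X[:, 2] + rng.normal(0, 0.2, n)  # toy landslide density

    lda = LinearDiscriminantAnalysis().fit(X, failed)                  # which attributes separate the groups
    print("LDA coefficients:", np.round(lda.coef_.ravel(), 3))

    reg = LinearRegression().fit(X, density)                           # relative effects on density
    print("regression coefficients:", np.round(reg.coef_, 4), " R^2 =", round(reg.score(X, density), 2))
    ```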

  18. Tissue characterization of skin ulcer for bacterial infection by multiple statistical analysis of echo amplitude envelope

    NASA Astrophysics Data System (ADS)

    Omura, Masaaki; Yoshida, Kenji; Kohta, Masushi; Kubo, Takabumi; Ishiguro, Toshimichi; Kobayashi, Kazuto; Hozumi, Naohiro; Yamaguchi, Tadashi

    2016-07-01

    To characterize skin ulcers for bacterial infection, quantitative ultrasound (QUS) parameters were estimated by multiple statistical analysis of the echo amplitude envelope, based on both Weibull and generalized gamma distributions and on the ratio of the mean to the standard deviation of the echo amplitude envelope. The measurement objects were three rat models (noninfection, critical colonization, and infection models). Ultrasound data were acquired using a modified ultrasonic diagnosis system with a center frequency of 11 MHz. In parallel, histopathological images and two-dimensional maps of the speed of sound (SoS) were examined. It was possible to detect typical tissue characteristics such as infection by focusing on the relationships among the QUS parameters and to indicate characteristic differences consistent with the scatterer structure. Additionally, the histopathological characteristics and SoS of noninfected and infected tissues matched the characteristics of the QUS parameters in each rat model.
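
    A minimal sketch of one of the QUS parameters described above: fitting a Weibull distribution to an echo amplitude envelope and computing the mean-to-standard-deviation ratio. The envelope here is synthetic Rayleigh data (the fully developed speckle case, which is Weibull with shape near 2); the study's acquisition settings are not reproduced.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(3)
    envelope = stats.rayleigh.rvs(scale=1.0, size=5000, random_state=rng)   # synthetic envelope

    shape, loc, scale = stats.weibull_min.fit(envelope, floc=0.0)           # fix location at zero
    snr = envelope.mean() / envelope.std()                                  # mean-to-SD QUS parameter
    print(f"Weibull shape = {shape:.2f} (about 2 for fully developed speckle), scale = {scale:.2f}")
    print(f"mean/SD of envelope = {snr:.2f}")
    ```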

  19. A FORTRAN program for multivariate survival analysis on the personal computer.

    PubMed

    Mulder, P G

    1988-01-01

    In this paper a FORTRAN program is presented for multivariate survival or life table regression analysis in a competing-risks situation. The relevant failure rate (for example, a particular disease or mortality rate) is modelled as a log-linear function of a vector of (possibly time-dependent) explanatory variables. The explanatory variables may also include the variable time itself, which is useful for parameterizing piecewise exponential time-to-failure distributions in a Gompertz-like or Weibull-like way as a more efficient alternative to Cox's proportional hazards model. Maximum likelihood estimates of the coefficients of the log-linear relationship are obtained by the iterative Newton-Raphson method. The program runs on a personal computer under DOS; running time is quite acceptable, even for large samples.
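
    A hedged, modern-language sketch of this approach (Python rather than the paper's FORTRAN): the failure rate is modelled as a log-linear function of log-time and a covariate, giving a Weibull-like hazard, and the censored log-likelihood is maximized numerically. Data, covariate, and starting values are synthetic, and scipy's Nelder-Mead stands in for the paper's Newton-Raphson.

    ```python
    import numpy as np
    from scipy.optimize import minimize

    rng = np.random.default_rng(4)
    n = 500
    z = rng.binomial(1, 0.5, n)                    # a binary covariate
    t = rng.weibull(1.5, n) * np.exp(-0.5 * z)     # true failure times
    c = rng.uniform(0.5, 3.0, n)                   # censoring times
    time = np.minimum(t, c)
    delta = (t <= c).astype(float)                 # 1 = failure observed, 0 = censored

    def negloglik(beta):
        b0, b1, b2 = beta
        if b1 <= -0.99:                            # keep the cumulative hazard finite
            return np.inf
        # hazard: lambda(t) = exp(b0 + b1*log t + b2*z) = exp(b0 + b2*z) * t**b1
        log_lam = b0 + b1 * np.log(time) + b2 * z
        H = np.exp(b0 + b2 * z) * time ** (b1 + 1.0) / (b1 + 1.0)   # cumulative hazard
        return -(np.sum(delta * log_lam) - np.sum(H))

    fit = minimize(negloglik, x0=np.array([0.0, 0.1, 0.0]), method="Nelder-Mead")
    print("estimated (b0, b1, b2):", np.round(fit.x, 3))
    ```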

  20. Nanostructural analysis of water distribution in hydrated multicomponent gels using thermal analysis and NMR relaxometry.

    PubMed

    Codoni, Doroty; Belton, Peter; Qi, Sheng

    2015-06-01

    Highly complex, multicomponent gels and water-containing soft materials have varied applications in biomedical, pharmaceutical, and food sciences, but the characterization of these nanostructured materials is extremely challenging. The aim of this study was to use stearoyl macrogol-32 glycerides (Gelucire 50/13) gels containing seven different species of glycerides, PEG, and PEG-esters, as model, complex, multicomponent gels, to investigate the effect of water content on the micro- and nanoarchitecture of the gel interior. Thermal analysis and NMR relaxometry were used to probe the thermal and diffusional behavior of water molecules within the gel network. For the highly concentrated gels (low water content), the water activity was significantly lowered due to entrapment in the dense gel network. For the gels with intermediate water content, multiple populations of water molecules with different thermal responses and diffusion behavior were detected, indicating the presence of water in different microenvironments. This correlated with the network architecture of the freeze-dried gels observed using SEM. For the gels with high water content, increased quantities of water with similar diffusion characteristics as free water could be detected, indicating the presence of large water pockets in these gels. The results of this study provide new insights into structure of Gelucire gels, which have not been reported before because of the complexity of the material. They also demonstrate that the combination of thermal analysis and NMR relaxometry offers insights into the structure of soft materials not available by the use of each technique alone. However, we also note that in some instances the results of these measurements are overinterpreted and we suggest limitations of the methods that must be considered when using them.

  1. Flow distribution analysis on the cooling tube network of ITER thermal shield

    NASA Astrophysics Data System (ADS)

    Nam, Kwanwoo; Chung, Wooho; Noh, Chang Hyun; Kang, Dong Kwon; Kang, Kyoung-O.; Ahn, Hee Jae; Lee, Hyeon Gon

    2014-01-01

    Thermal shield (TS) is to be installed between the vacuum vessel or the cryostat and the magnets in the ITER tokamak to reduce the thermal radiation load to the magnets operating at 4.2 K. The TS is cooled by pressurized helium gas at an inlet temperature of 80 K. The cooling tube is welded on the TS panel surface, and the resulting flow network of TS cooling tubes is complex. The flow rate in each panel should be matched to the thermal design value for effective radiation shielding. This paper presents a one-dimensional analysis of the flow distribution in the cooling tube network of the ITER TS. The hydraulic cooling tube network is modeled by an electrical analogy. Only the cooling tube on the TS surface and its connecting pipe from the manifold are considered in the analysis model. Considering the frictional factor and the local loss in the cooling tube, the hydraulic resistance is expressed as a linear function of mass flow rate. Sub-circuits in the TS are analyzed separately because each circuit is controlled independently by its own control valve. It is found that flow rates in some panels are insufficient compared with the design values. In order to improve the flow distribution, two kinds of design modifications are proposed. The first is to connect the tubes of adjacent panels. This will increase the resistance of the tube on the panel where the flow rate is excessive. The other design suggestion is to install an orifice at the exit of the tube routing where the flow rate is to be reduced. The analysis of the design suggestions shows that the flow maldistribution is improved significantly.
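
    For panels in parallel between supply and return manifolds with linearized resistances, the electrical analogy described above reduces to a current-divider calculation. The sketch below is a toy version with hypothetical resistances; it also shows how adding an orifice (extra resistance) to an over-fed panel trims its share, as in the second design suggestion.

    ```python
    import numpy as np

    R = np.array([2.0e5, 3.5e5, 1.8e5, 4.0e5])   # Pa/(kg/s), hypothetical linearized panel resistances
    mdot_total = 0.40                             # kg/s delivered by one control-valve sub-circuit

    G = 1.0 / R                                   # conductances, as in the electrical analogy
    mdot = mdot_total * G / G.sum()               # current-divider rule for parallel branches
    dP = R[0] * mdot[0]                           # common manifold-to-manifold pressure drop

    print("panel flows [kg/s]:", np.round(mdot, 4))
    print(f"manifold-to-manifold drop: {dP:.0f} Pa")

    # Orifice trimming: extra resistance on an over-fed panel reduces its share
    R_trim = R.copy()
    R_trim[2] += 1.5e5
    mdot_trim = mdot_total * (1.0 / R_trim) / (1.0 / R_trim).sum()
    print("flows after orifice on panel 2:", np.round(mdot_trim, 4))
    ```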

  2. Knowledge-based assistance for science visualization and analysis using large distributed databases

    NASA Technical Reports Server (NTRS)

    Handley, Thomas H., Jr.; Jacobson, Allan S.; Doyle, Richard J.; Collins, Donald J.

    1993-01-01

    Within this decade, the growth in complexity of exploratory data analysis and the sheer volume of space data require new and innovative approaches to support science investigators in achieving their research objectives. To date, there have been numerous efforts addressing the individual issues involved in inter-disciplinary, multi-instrument investigations. However, while successful on a small scale, these efforts have not proven to be open and scalable. This proposal addresses four areas of significant need: scientific visualization and analysis; science data management; interactions in a distributed, heterogeneous environment; and knowledge-based assistance for these functions. The fundamental innovation embedded within this proposal is the integration of three automation technologies, namely, knowledge-based expert systems, science visualization and science data management. This integration is based on a concept called the DataHub. With the DataHub concept, NASA will be able to apply a more complete solution to all nodes of a distributed system. Both computation nodes and interactive nodes will be able to effectively and efficiently use the data services (access, retrieval, update, etc.) within a distributed, interdisciplinary information system in a uniform and standard way. This will allow the science investigators to concentrate on their scientific endeavors, rather than to involve themselves in the intricate technical details of the systems and tools required to accomplish their work. Thus, science investigators need not be programmers. The emphasis will be on the definition and prototyping of system elements with sufficient detail to enable data analysis and interpretation leading to publishable scientific results. In addition, the proposed work includes all the required end-to-end components and interfaces to demonstrate the completed concept.

  3. Knowledge-based assistance for science visualization and analysis using large distributed databases

    NASA Technical Reports Server (NTRS)

    Handley, Thomas H., Jr.; Jacobson, Allan S.; Doyle, Richard J.; Collins, Donald J.

    1992-01-01

    Within this decade, the growth in complexity of exploratory data analysis and the sheer volume of space data require new and innovative approaches to support science investigators in achieving their research objectives. To date, there have been numerous efforts addressing the individual issues involved in inter-disciplinary, multi-instrument investigations. However, while successful on a small scale, these efforts have not proven to be open and scalable. This proposal addresses four areas of significant need: scientific visualization and analysis; science data management; interactions in a distributed, heterogeneous environment; and knowledge-based assistance for these functions. The fundamental innovation embedded within this proposal is the integration of three automation technologies, namely, knowledge-based expert systems, science visualization and science data management. This integration is based on the concept called the Data Hub. With the Data Hub concept, NASA will be able to apply a more complete solution to all nodes of a distributed system. Both computation nodes and interactive nodes will be able to effectively and efficiently use the data services (access, retrieval, update, etc.) within a distributed, interdisciplinary information system in a uniform and standard way. This will allow the science investigators to concentrate on their scientific endeavors, rather than to involve themselves in the intricate technical details of the systems and tools required to accomplish their work. Thus, science investigators need not be programmers. The emphasis will be on the definition and prototyping of system elements with sufficient detail to enable data analysis and interpretation leading to publishable scientific results. In addition, the proposed work includes all the required end-to-end components and interfaces to demonstrate the completed concept.

  4. Multiobjective sensitivity analysis and optimization of a distributed hydrologic model MOBIDIC

    NASA Astrophysics Data System (ADS)

    Yang, J.; Castelli, F.; Chen, Y.

    2014-03-01

    Calibration of distributed hydrologic models usually involves dealing with a large number of distributed parameters and optimization problems with multiple, often conflicting objectives that arise naturally. This study presents a multiobjective sensitivity and optimization approach to handle these problems for the distributed hydrologic model MOBIDIC, which combines two sensitivity analysis techniques (the Morris method and the State Dependent Parameter method) with a multiobjective optimization (MOO) approach, ϵ-NSGAII. This approach was implemented to calibrate MOBIDIC in its application to the Davidson watershed, North Carolina, with three objective functions, i.e., standardized root mean square error of logarithmic transformed discharge, water balance index, and mean absolute error of logarithmic transformed flow duration curve, and its results were compared with those from a single objective optimization (SOO) with the traditional Nelder-Mead Simplex algorithm used in MOBIDIC, taking the objective function as the Euclidean norm of these three objectives. Results show: (1) the two sensitivity analysis techniques are effective and efficient in determining the sensitive processes and insensitive parameters: surface runoff and evaporation are very sensitive processes for all three objective functions, while groundwater recession and soil hydraulic conductivity are not sensitive and were excluded from the optimization; (2) both MOO and SOO lead to acceptable simulations, e.g., for MOO, average Nash-Sutcliffe is 0.75 in the calibration period and 0.70 in the validation period; (3) evaporation and surface runoff show similar importance to watershed water balance, while the contribution of baseflow can be ignored; (4) compared to SOO, which was dependent on the initial starting location, MOO provides more insight into parameter sensitivity and the conflicting characteristics of these objective functions.
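
    Of the two sensitivity techniques named above, the Morris method is the simpler to sketch. The toy example below uses a simplified one-at-a-time (radial) design rather than full Morris trajectories, on a hypothetical three-parameter function; mu* flags influential parameters and sigma flags nonlinearity or interactions.

    ```python
    import numpy as np

    def f(p):                                   # hypothetical 3-parameter model
        return p[0] ** 2 + 5.0 * p[1] + 0.1 * p[2]

    rng = np.random.default_rng(5)
    k, r, delta = 3, 50, 0.1                    # parameters, repetitions, step in the unit cube
    ee = np.empty((r, k))
    for i in range(r):
        base = rng.uniform(0.0, 1.0 - delta, size=k)
        y0 = f(base)
        for j in range(k):
            step = base.copy()
            step[j] += delta
            ee[i, j] = (f(step) - y0) / delta   # elementary effect of parameter j

    print("mu*  :", np.round(np.abs(ee).mean(axis=0), 3))   # influence
    print("sigma:", np.round(ee.std(axis=0), 3))            # nonlinearity/interactions
    ```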

  5. Flow distribution analysis on the cooling tube network of ITER thermal shield

    SciTech Connect

    Nam, Kwanwoo; Chung, Wooho; Noh, Chang Hyun; Kang, Dong Kwon; Kang, Kyoung-O; Ahn, Hee Jae; Lee, Hyeon Gon

    2014-01-29

    Thermal shield (TS) is to be installed between the vacuum vessel or the cryostat and the magnets in the ITER tokamak to reduce the thermal radiation load to the magnets operating at 4.2 K. The TS is cooled by pressurized helium gas at an inlet temperature of 80 K. The cooling tube is welded on the TS panel surface, and the resulting flow network of TS cooling tubes is complex. The flow rate in each panel should be matched to the thermal design value for effective radiation shielding. This paper presents a one-dimensional analysis of the flow distribution in the cooling tube network of the ITER TS. The hydraulic cooling tube network is modeled by an electrical analogy. Only the cooling tube on the TS surface and its connecting pipe from the manifold are considered in the analysis model. Considering the frictional factor and the local loss in the cooling tube, the hydraulic resistance is expressed as a linear function of mass flow rate. Sub-circuits in the TS are analyzed separately because each circuit is controlled independently by its own control valve. It is found that flow rates in some panels are insufficient compared with the design values. In order to improve the flow distribution, two kinds of design modifications are proposed. The first is to connect the tubes of adjacent panels. This will increase the resistance of the tube on the panel where the flow rate is excessive. The other design suggestion is to install an orifice at the exit of the tube routing where the flow rate is to be reduced. The analysis of the design suggestions shows that the flow maldistribution is improved significantly.

  6. Statistics analysis of distribution of Bradysia Ocellaris insect on Oyster mushroom cultivation

    NASA Astrophysics Data System (ADS)

    Sari, Kurnia Novita; Amelia, Ririn

    2015-12-01

    The Bradysia ocellaris insect is a pest in Oyster mushroom cultivation. The distribution of Bradysia ocellaris shows a spatial pattern that can be observed every week under several assumptions, such as independence, normality, and homogeneity. The number of Bradysia ocellaris in each week is first summarized through descriptive analysis. Next, the distribution pattern of Bradysia ocellaris is described by the semivariogram, a plot of the variance of the differences between pairs of observations separated by a distance d. The semivariogram model that best fits the Bradysia ocellaris data is an isotropic spherical model.
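
    A hedged sketch of the semivariogram computation described above: an empirical semivariogram is binned from pairwise squared differences, and an isotropic spherical model is evaluated for comparison. Trap coordinates and counts are synthetic, and no fitting of the model parameters is performed.

    ```python
    import numpy as np

    rng = np.random.default_rng(6)
    xy = rng.uniform(0.0, 50.0, size=(100, 2))       # synthetic trap coordinates (m)
    z = rng.poisson(10, size=100).astype(float)      # synthetic weekly insect counts

    # empirical semivariogram: mean of 0.5*(z_i - z_j)^2 within distance bins
    d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=2)
    iu = np.triu_indices(len(z), k=1)
    dist, sq = d[iu], 0.5 * (z[iu[0]] - z[iu[1]]) ** 2
    edges = np.linspace(0.0, 25.0, 11)
    gamma = [sq[(dist >= a) & (dist < b)].mean() for a, b in zip(edges[:-1], edges[1:])]

    def spherical(h, nugget, sill, a):               # isotropic spherical model with range a
        h = np.asarray(h, dtype=float)
        g = nugget + (sill - nugget) * (1.5 * h / a - 0.5 * (h / a) ** 3)
        return np.where(h < a, g, sill)

    print("empirical:", np.round(gamma, 2))
    print("spherical:", np.round(spherical(edges[1:], 2.0, 10.0, 20.0), 2))
    ```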

  7. Complete Distributed Hyper-Entangled-Bell-State Analysis and Quantum Super Dense Coding

    NASA Astrophysics Data System (ADS)

    Zheng, Chunhong; Gu, Yongjian; Li, Wendong; Wang, Zhaoming; Zhang, Jiying

    2016-02-01

    We propose a protocol to implement the distributed hyper-entangled-Bell-state analysis (HBSA) for photonic qubits with weak cross-Kerr nonlinearities, QND photon-number-resolving detection, and some linear optical elements. The distinct feature of our scheme is that the BSA for two different degrees of freedom can be implemented deterministically and nondestructively. Based on the present HBSA, we achieve quantum super dense coding with double information capacity, which makes our scheme more significant for long-distance quantum communication.

  8. Three-dimensional finite element analysis of the effects of posts on stress distribution in dentin.

    PubMed

    Ho, M H; Lee, S Y; Chen, H H; Lee, M C

    1994-10-01

    A finite element analysis was conducted to study the influence of posts on dentinal stress in pulpless teeth. Three-dimensional models of an intact Chinese maxillary central incisor with and without post restoration were analyzed. When the tooth was subjected to masticatory and traumatic loads, stress distributions in dentin were similar whether or not the post was present. Maximal dentinal stresses were reduced by only 7% to 10% and 10% to 14.5%, respectively, with gold alloy and stainless steel posts. Thus the reinforcement effects from posts appeared limited in pulpless incisors.

  9. [An EMD based time-frequency distribution and its application in EEG analysis].

    PubMed

    Li, Xiaobing; Chu, Meng; Qiu, Tianshuang; Bao, Haiping

    2007-10-01

    Hilbert-Huang transform (HHT) is a new time-frequency analytic method to analyze the nonlinear and the non-stationary signals. The key step of this method is the empirical mode decomposition (EMD), with which any complicated signal can be decomposed into a finite and small number of intrinsic mode functions (IMF). In this paper, a new EMD based method for suppressing the cross-term of Wigner-Ville distribution (WVD) is developed and is applied to analyze the epileptic EEG signals. The simulation data and analysis results show that the new method suppresses the cross-term of the WVD effectively with an excellent resolution.
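
    A minimal sketch of the HHT half of this pipeline, assuming the third-party PyEMD package (installed as EMD-signal): the signal is decomposed into IMFs, and the instantaneous frequency of each IMF is taken from the Hilbert transform. The cross-term-suppressed WVD step proposed in the paper is not reproduced here.

    ```python
    import numpy as np
    from scipy.signal import hilbert
    from PyEMD import EMD          # assumption: pip install EMD-signal

    fs = 256.0
    t = np.arange(0.0, 4.0, 1.0 / fs)
    x = np.sin(2 * np.pi * 3 * t) + 0.5 * np.sin(2 * np.pi * 20 * t)   # toy two-component "EEG"

    imfs = EMD()(x)                                  # intrinsic mode functions (last row is the residue)
    for k, imf in enumerate(imfs):
        phase = np.unwrap(np.angle(hilbert(imf)))
        inst_freq = np.diff(phase) * fs / (2 * np.pi)
        print(f"IMF {k}: median instantaneous frequency {np.median(inst_freq):.1f} Hz")
    ```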

  10. Prediction of product distribution in fine biomass pyrolysis in fluidized beds based on proximate analysis.

    PubMed

    Kim, Sung Won

    2015-01-01

    A predictive model was developed that satisfactorily describes the general trends of product distribution in fluidized-bed pyrolysis of lignocellulosic biomass. The model combines a mass balance based on proximate analysis with an empirical relationship to operating parameters, including fluidization hydrodynamics. The empirical relationships between product yields and fluidization conditions in fluidized bed pyrolyzers were derived from the data of this study and the literature. The gas and char yields were strong functions of temperature and vapor residence time in the pyrolyzer. The yields also correlated well with fluidization variables related to hydrodynamics and bed mixing. The product yields predicted by the model accorded well with the experimental data.

  11. Constraints on spin-dependent parton distributions at large x from global QCD analysis

    NASA Astrophysics Data System (ADS)

    Jimenez-Delgado, P.; Avakian, H.; Melnitchouk, W.

    2014-11-01

    We investigate the behavior of spin-dependent parton distribution functions (PDFs) at large parton momentum fractions x in the context of global QCD analysis. We explore the constraints from existing deep-inelastic scattering data, and from theoretical expectations for the leading x → 1 behavior based on hard gluon exchange in perturbative QCD. Systematic uncertainties from the dependence of the PDFs on the choice of parametrization are studied by considering functional forms motivated by orbital angular momentum arguments. Finally, we quantify the reduction in the PDF uncertainties that may be expected from future high-x data from Jefferson Lab at 12 GeV.

  12. Gibbs distribution analysis of temporal correlations structure in retina ganglion cells

    PubMed Central

    Vasquez, J. C.; Marre, O.; Palacios, A.G.; Berry, M.J.; Cessac, B.

    2012-01-01

    We present a method to estimate Gibbs distributions with spatio-temporal constraints on spike trains statistics. We apply this method to spike trains recorded from ganglion cells of the salamander retina, in response to natural movies. Our analysis, restricted to a few neurons, performs more accurately than pairwise synchronization models (Ising) or the 1-time step Markov models (Marre et al. (2009)) to describe the statistics of spatio-temporal spike patterns and emphasizes the role of higher order spatio-temporal interactions. PMID:22115900

  13. Fractographic principles applied to Y-TZP mechanical behavior analysis.

    PubMed

    Ramos, Carla Müller; Cesar, Paulo Francisco; Bonfante, Estevam Augusto; Rubo, José Henrique; Wang, Linda; Borges, Ana Flávia Sanches

    2016-04-01

    The purpose of this study was to evaluate the use of fractography principles to determine the fracture toughness of Y-TZP dental ceramic in which KIc was measured fractographically using controlled-flaw beam bending techniques and to correlate the flaw distribution with the mechanical properties. The Y-TZP blocks studied were: Zirconia Zirklein (ZZ); Zirconcad (ZCA); IPS e.max ZirCad (ZMAX); and In Ceram YZ (ZYZ). Samples were prepared (16 mm × 4 mm × 2 mm) according to ISO 6872 specifications and subjected to three-point bending at a crosshead speed of 0.5 mm/min. Weibull probability curves (95% confidence bounds) were calculated and a contour plot with the Weibull modulus (m) versus characteristic strength (σ0) was used to examine the differences among groups. The fractured surface of each specimen was inspected in a scanning electron microscope (SEM) for qualitative and quantitative fractographic analysis. The critical defect size (c) and fracture toughness (KIc) were estimated. The fractured surfaces of the samples from all groups showed similar fractographic characteristics, except ZCA showed pores and defects. Fracture toughness and the flexural strength values were not different among the groups except for ZCA. The characteristic strength (p<0.05) of ZZ (η=920.4) was higher than the ZCA (η=651.1) and similar to the ZMAX (η=983.6) and ZYZ (η=1054.8). By means of quantitative and qualitative fractographic analysis, this study showed fracture toughness and strength that could be correlated to the observable microstructural features of the evaluated zirconia polycrystalline ceramics. PMID:26722988
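
    The Weibull portion of this analysis is straightforward to sketch. Below, the Weibull modulus m and characteristic strength sigma0 are estimated from a linearized probability plot, and a fractographic toughness is computed from the standard relation K_Ic = Y·sigma_f·sqrt(pi·c); all strengths, the geometry factor Y, and the flaw size c are invented placeholders, not the study's measurements.

    ```python
    import numpy as np

    # synthetic flexural strengths (MPa), sorted ascending
    strengths = np.sort(np.array([612., 655., 701., 733., 768., 802., 841., 890., 932., 1011.]))
    n = len(strengths)
    pf = (np.arange(1, n + 1) - 0.5) / n                 # simple failure-probability estimator

    # Weibull CDF: Pf = 1 - exp(-(sigma/sigma0)^m)  =>  ln(-ln(1-Pf)) = m*ln(sigma) - m*ln(sigma0)
    m, intercept = np.polyfit(np.log(strengths), np.log(-np.log(1.0 - pf)), 1)
    sigma0 = np.exp(-intercept / m)
    print(f"Weibull modulus m = {m:.1f}, characteristic strength sigma0 = {sigma0:.0f} MPa")

    # fractographic toughness from one specimen: K_Ic = Y * sigma_f * sqrt(pi * c)
    Y, sigma_f, c = 1.3, 733e6, 25e-6                    # geometry factor, strength (Pa), flaw size (m)
    K_Ic = Y * sigma_f * np.sqrt(np.pi * c) / 1e6        # MPa*sqrt(m)
    print(f"K_Ic ~ {K_Ic:.1f} MPa*m^0.5")
    ```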

  14. CRAB3: Establishing a new generation of services for distributed analysis at CMS

    NASA Astrophysics Data System (ADS)

    Cinquilli, M.; Spiga, D.; Grandi, C.; Hernàndez, J. M.; Konstantinov, P.; Mascheroni, M.; Riahi, H.; Vaandering, E.

    2012-12-01

    In CMS Computing the highest priorities for analysis tools are the improvement of the end users’ ability to produce and publish reliable samples and analysis results as well as a transition to a sustainable development and operations model. To achieve these goals CMS decided to incorporate analysis processing into the same framework as data and simulation processing. This strategy foresees that all workload tools (TierO, Tier1, production, analysis) share a common core with long term maintainability as well as the standardization of the operator interfaces. The re-engineered analysis workload manager, called CRAB3, makes use of newer technologies, such as RESTFul based web services and NoSQL Databases, aiming to increase the scalability and reliability of the system. As opposed to CRAB2, in CRAB3 all work is centrally injected and managed in a global queue. A pool of agents, which can be geographically distributed, consumes work from the central services serving the user tasks. The new architecture of CRAB substantially changes the deployment model and operations activities. In this paper we present the implementation of CRAB3, emphasizing how the new architecture improves the workflow automation and simplifies maintainability. In particular, we will highlight the impact of the new design on daily operations.

  15. Rainfall extremes: Toward reconciliation after the battle of distributions

    NASA Astrophysics Data System (ADS)

    Serinaldi, Francesco; Kilsby, Chris G.

    2014-01-01

    This study attempts to reconcile the conflicting results reported in the literature concerning the behavior of peak-over-threshold (POT) daily rainfall extremes and their distribution. By using two worldwide data sets, the impact of threshold selection and record length on the upper tail behavior of POT observations is investigated. The rainfall process is studied within the framework of generalized Pareto (GP) exceedances according to the classical extreme value theory (EVT), with particular attention paid to the study of the GP shape parameter, which controls the heaviness of the upper tail of the GP distribution. A twofold effect is recognized. First, as the threshold decreases, and nonextreme values are progressively incorporated in the POT samples, the variance of the GP shape parameter reduces and the mean converges to positive values denoting a tendency to heavy tail behavior. Simultaneously, the EVT asymptotic hypotheses are less and less realistic, and the GP asymptote tends to be replaced by the Weibull penultimate asymptote whose upper tail is exponential but apparently heavy. Second, for a fixed high threshold, the variance of the GP shape parameter reduces as the record length (number of years) increases, and the mean values tend to be positive, thus denoting again the prevalence of heavy tail behavior. In both cases, i.e., threshold selection and record length effect, the heaviness of the tail may be ascribed to mechanisms such as the blend of extreme and nonextreme values, and fluctuations of the parent distributions. It is shown how these results provide a link between previous studies and pave the way for more comprehensive analyses which merge empirical, theoretical, and operational points of view. This study also provides several ancillary results, such as a set of formulae to correct the bias of the GP shape parameter estimates due to short record lengths accounting for uncertainty, thus avoiding systematic underestimation of extremes.
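
    The threshold effect described above is easy to probe with scipy's generalized Pareto fit: as the threshold drops, more non-extreme values enter the POT sample and the fitted shape parameter shifts. The sketch below uses synthetic gamma-distributed daily totals, not the worldwide data sets of the study.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(7)
    rain = stats.gamma.rvs(a=0.7, scale=8.0, size=30 * 365, random_state=rng)  # ~30 yr of synthetic daily totals

    for q in (0.90, 0.95, 0.99):
        u = np.quantile(rain, q)
        excess = rain[rain > u] - u                       # peak-over-threshold exceedances
        xi, loc, scale = stats.genpareto.fit(excess, floc=0.0)
        print(f"threshold quantile {q:.2f}: {len(excess):5d} exceedances, GP shape xi = {xi:+.3f}")
    ```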

  16. Quantitative high-pressure pair distribution function analysis of nanocrystalline gold

    NASA Astrophysics Data System (ADS)

    Martin, C. David; Antao, Sytle M.; Chupas, Peter J.; Lee, Peter L.; Shastri, Sarvjit D.; Parise, John B.

    2005-02-01

    Using a diamond anvil cell with high-energy monochromatic x rays, we have studied the total scattering of nanocrystalline gold to 20 Å⁻¹ at pressures up to 10 GPa in a hydrostatic alcohol pressure medium. Through direct Fourier transformation of the structure function [S(Q)], pair distribution functions (PDFs) [G(r)] are calculated without Kaplow-type iterative corrections. Quantitative high-pressure PDF (QHP-PDF) analysis is performed via full-profile least-squares modeling and confirmed through comparison with Rietveld analysis of Bragg diffraction. The quality of the high-pressure PDFs obtained demonstrates the integrity of our technique and suggests the feasibility of future QHP-PDF studies of liquids, disordered solids, and materials at phase transitions under pressure.
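
    The direct (non-iterative) transform mentioned above is a sine Fourier integral of the reduced structure function. The sketch below evaluates G(r) = (2/π) ∫ Q[S(Q) − 1] sin(Qr) dQ numerically for a toy S(Q) with a built-in correlation distance; Qmax matches the 20 Å⁻¹ quoted above, but the S(Q) itself is invented.

    ```python
    import numpy as np
    from scipy.integrate import trapezoid

    Q = np.linspace(0.05, 20.0, 2000)              # 1/Angstrom, Qmax = 20 as in the experiment
    S = 1.0 + np.sin(Q * 2.88) / (Q * 2.88)        # toy S(Q) with a single ~2.88 A correlation
    r = np.linspace(0.5, 10.0, 500)                # Angstrom

    # G(r) = (2/pi) * integral of Q*(S(Q)-1)*sin(Q*r) dQ, no Kaplow-type iteration
    kernel = (Q * (S - 1.0))[None, :] * np.sin(Q[None, :] * r[:, None])
    G = (2.0 / np.pi) * trapezoid(kernel, Q, axis=1)
    print(f"G(r) peaks near r = {r[np.argmax(G)]:.2f} A")
    ```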

  17. New limits on intrinsic charm in the nucleon from global analysis of parton distributions.

    PubMed

    Jimenez-Delgado, P; Hobbs, T J; Londergan, J T; Melnitchouk, W

    2015-02-27

    We present a new global QCD analysis of parton distribution functions, allowing for possible intrinsic charm (IC) contributions in the nucleon inspired by light-front models. The analysis makes use of the full range of available high-energy scattering data for Q2 ≳ 1 GeV2 and W2 ≳ 3.5 GeV2, including fixed-target proton and deuteron cross sections at lower energies that were excluded in previous global analyses. The expanded data set places more stringent constraints on the momentum carried by IC, with ⟨x⟩IC at most 0.5% (corresponding to an IC normalization of ~1%) at the 4σ level for Δχ2 = 1. We also critically assess the impact of older EMC measurements of F2c at large x, which favor a nonzero IC, but with very large χ2 values. PMID:25768757

  18. Pore size distribution analysis of activated carbons prepared from coconut shell using methane adsorption data

    NASA Astrophysics Data System (ADS)

    Ahmadpour, A.; Okhovat, A.; Darabi Mahboub, M. J.

    2013-06-01

    The application of Stoeckli theory to determine pore size distribution (PSD) of activated carbons using high pressure methane adsorption data is explored. Coconut shell was used as a raw material for the preparation of 16 different activated carbon samples. Four samples with higher methane adsorption were selected and nitrogen adsorption on these adsorbents was also investigated. Some differences are found between the PSD obtained from the analysis of nitrogen adsorption isotherms and their PSD resulting from the same analysis using methane adsorption data. It is suggested that these differences may arise from the specific interactions between nitrogen molecules and activated carbon surfaces; therefore caution is required in the interpretation of PSD obtained from the nitrogen isotherm data.

  19. New Limits on Intrinsic Charm in the Nucleon from Global Analysis of Parton Distributions

    NASA Astrophysics Data System (ADS)

    Jimenez-Delgado, P.; Hobbs, T. J.; Londergan, J. T.; Melnitchouk, W.

    2015-02-01

    We present a new global QCD analysis of parton distribution functions, allowing for possible intrinsic charm (IC) contributions in the nucleon inspired by light-front models. The analysis makes use of the full range of available high-energy scattering data for Q2 ≳ 1 GeV2 and W2 ≳ 3.5 GeV2, including fixed-target proton and deuteron cross sections at lower energies that were excluded in previous global analyses. The expanded data set places more stringent constraints on the momentum carried by IC, with ⟨x⟩IC at most 0.5% (corresponding to an IC normalization of ~1%) at the 4σ level for Δχ2 = 1. We also critically assess the impact of older EMC measurements of F2c at large x, which favor a nonzero IC, but with very large χ2 values.

  20. New limits on intrinsic charm in the nucleon from global analysis of parton distributions

    DOE PAGES

    Jimenez-Delgado, P.; Hobbs, T. J.; Londergan, J. T.; Melnitchouk, W.

    2015-02-27

    We present a new global QCD analysis of parton distribution functions, allowing for possible intrinsic charm (IC) contributions in the nucleon inspired by light-front models. The analysis makes use of the full range of available high-energy scattering data for Q2 ≥ 1 GeV2 and W2 ≥ 3.5 GeV2, including fixed-target proton and deuteron cross sections at lower energies that were excluded in previous global analyses. The expanded data set places more stringent constraints on the momentum carried by IC, with ⟨x⟩IC at most 0.5% (corresponding to an IC normalization of ~1%) at the 4σ level for Δχ2 = 1. We also assess the impact of older EMC measurements of F2c at large x, which favor a nonzero IC, but with very large χ2 values.