NASA Technical Reports Server (NTRS)
Lanzi, R. James; Vincent, Brett T.
1993-01-01
The relationship between actual and predicted re-entry maximum dynamic pressure is characterized using a probability density function and a cumulative distribution function derived from sounding rocket flight data. This paper explores the properties of this distribution and demonstrates applications of this data with observed sounding rocket re-entry body damage characteristics to assess probabilities of sustaining various levels of heating damage. The results from this paper effectively bridge the gap existing in sounding rocket reentry analysis between the known damage level/flight environment relationships and the predicted flight environment.
NASA Astrophysics Data System (ADS)
Zorila, Alexandru; Stratan, Aurel; Nemes, George
2018-01-01
We compare the ISO-recommended (the standard) data-reduction algorithm used to determine the surface laser-induced damage threshold of optical materials by the S-on-1 test with two newly suggested algorithms, both named "cumulative" algorithms/methods, a regular one and a limit-case one, intended to perform in some respects better than the standard one. To avoid additional errors due to real experiments, a simulated test is performed, named the reverse approach. This approach simulates the real damage experiments, by generating artificial test-data of damaged and non-damaged sites, based on an assumed, known damage threshold fluence of the target and on a given probability distribution function to induce the damage. In this work, a database of 12 sets of test-data containing both damaged and non-damaged sites was generated by using four different reverse techniques and by assuming three specific damage probability distribution functions. The same value for the threshold fluence was assumed, and a Gaussian fluence distribution on each irradiated site was considered, as usual for the S-on-1 test. Each of the test-data was independently processed by the standard and by the two cumulative data-reduction algorithms, the resulting fitted probability distributions were compared with the initially assumed probability distribution functions, and the quantities used to compare these algorithms were determined. These quantities characterize the accuracy and the precision in determining the damage threshold and the goodness of fit of the damage probability curves. The results indicate that the accuracy in determining the absolute damage threshold is best for the ISO-recommended method, the precision is best for the limit-case of the cumulative method, and the goodness of fit estimator (adjusted R-squared) is almost the same for all three algorithms.
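To make the reverse approach concrete, here is a minimal Python sketch: synthetic damaged/non-damaged sites are generated from an assumed threshold fluence and an assumed logistic damage-probability shape, then reduced with a simple linear zero-probability extrapolation. All numbers, the logistic shape and the reduction step are illustrative assumptions, not the ISO procedure or the paper's generation techniques.

```python
import numpy as np

rng = np.random.default_rng(0)
F_TH = 5.0   # assumed "true" damage threshold fluence, J/cm^2 (illustrative)

def damage_probability(peak_fluence, f_th=F_TH, width=1.0):
    """Assumed per-site damage probability versus peak fluence (logistic shape)."""
    return 1.0 / (1.0 + np.exp(-(peak_fluence - f_th) / width))

# "Reverse approach": irradiate n_sites per fluence level, record damaged / not damaged.
fluences = np.linspace(2.0, 10.0, 9)        # J/cm^2
n_sites = 50
damaged_fraction = np.array([(rng.random(n_sites) < damage_probability(f)).mean()
                             for f in fluences])

# Simple data reduction: linear fit of damage probability versus fluence,
# threshold taken as the zero-probability intercept.
mask = (damaged_fraction > 0.0) & (damaged_fraction < 1.0)
slope, intercept = np.polyfit(fluences[mask], damaged_fraction[mask], 1)
print("estimated threshold fluence (J/cm^2):", -intercept / slope)
```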
Nematode Damage Functions: The Problems of Experimental and Sampling Error
Ferris, H.
1984-01-01
The development and use of pest damage functions involves measurement and experimental errors associated with cultural, environmental, and distributional factors. Damage predictions are more valuable if considered with associated probability. Collapsing population densities into a geometric series of population classes allows a pseudo-replication removal of experimental and sampling error in damage function development. Recognition of the nature of sampling error for aggregated populations allows assessment of probability associated with the population estimate. The product of the probabilities incorporated in the damage function and in the population estimate provides a basis for risk analysis of the yield loss prediction and the ensuing management decision. PMID:19295865
Tornado damage risk assessment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Reinhold, T.A.; Ellingwood, B.
1982-09-01
Several proposed models were evaluated for predicting tornado wind speed probabilities at nuclear plant sites as part of a program to develop statistical data on tornadoes needed for probability-based load combination analysis. A unified model was developed which synthesized the desired aspects of tornado occurrence and damage potential. The sensitivity of wind speed probability estimates to various tornado modeling assumptions is examined, and the probability distributions of tornado wind speed that are needed for load combination studies are presented.
Probabilistic Evaluation of Blade Impact Damage
NASA Technical Reports Server (NTRS)
Chamis, C. C.; Abumeri, G. H.
2003-01-01
The response to high velocity impact of a composite blade is probabilistically evaluated. The evaluation is focused on quantifying probabilistically the effects of uncertainties (scatter) in the variables that describe the impact, the blade make-up (geometry and material), the blade response (displacements, strains, stresses, frequencies), the blade residual strength after impact, and the blade damage tolerance. The results of the probabilistic evaluations are given in terms of cumulative distribution functions and probabilistic sensitivities. Results show that the blade has relatively low damage tolerance at 0.999 probability of structural failure and substantial damage tolerance at 0.01 probability.
Probabilistic Damage Characterization Using the Computationally-Efficient Bayesian Approach
NASA Technical Reports Server (NTRS)
Warner, James E.; Hochhalter, Jacob D.
2016-01-01
This work presents a computationally-efficient approach for damage determination that quantifies uncertainty in the provided diagnosis. Given strain sensor data that are polluted with measurement errors, Bayesian inference is used to estimate the location, size, and orientation of damage. This approach uses Bayes' Theorem to combine any prior knowledge an analyst may have about the nature of the damage with information provided implicitly by the strain sensor data to form a posterior probability distribution over possible damage states. The unknown damage parameters are then estimated based on samples drawn numerically from this distribution using a Markov Chain Monte Carlo (MCMC) sampling algorithm. Several modifications are made to the traditional Bayesian inference approach to provide significant computational speedup. First, an efficient surrogate model is constructed using sparse grid interpolation to replace a costly finite element model that must otherwise be evaluated for each sample drawn with MCMC. Next, the standard Bayesian posterior distribution is modified using a weighted likelihood formulation, which is shown to improve the convergence of the sampling process. Finally, a robust MCMC algorithm, Delayed Rejection Adaptive Metropolis (DRAM), is adopted to sample the probability distribution more efficiently. Numerical examples demonstrate that the proposed framework effectively provides damage estimates with uncertainty quantification and can yield orders of magnitude speedup over standard Bayesian approaches.
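The following is a minimal sketch of this kind of workflow, reduced to a single scalar damage parameter, a hand-made linear surrogate in place of the finite element model, and a plain random-walk Metropolis sampler rather than DRAM; every model and number here is an illustrative assumption.

```python
import numpy as np

rng = np.random.default_rng(1)

def surrogate_strain(damage_size):
    """Assumed surrogate mapping damage size -> strain at three sensors."""
    return np.array([1.0, 2.0, 0.5]) * damage_size + np.array([10.0, 5.0, 8.0])

true_size, sigma = 2.0, 0.3
data = surrogate_strain(true_size) + rng.normal(0.0, sigma, 3)   # noisy "measurements"

def log_posterior(size):
    if not 0.0 < size < 10.0:                  # uniform prior on (0, 10)
        return -np.inf
    resid = data - surrogate_strain(size)
    return -0.5 * np.sum(resid**2) / sigma**2  # Gaussian likelihood (up to a constant)

# Random-walk Metropolis sampling of the posterior
samples, current = [], 1.0
lp_current = log_posterior(current)
for _ in range(20000):
    proposal = current + rng.normal(0.0, 0.2)
    lp_prop = log_posterior(proposal)
    if np.log(rng.random()) < lp_prop - lp_current:
        current, lp_current = proposal, lp_prop
    samples.append(current)

post = np.array(samples[5000:])                # discard burn-in
print("posterior mean and std of damage size:", post.mean(), post.std())
```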
Interacting damage models mapped onto ising and percolation models
DOE Office of Scientific and Technical Information (OSTI.GOV)
Toussaint, Renaud; Pride, Steven R.
The authors introduce a class of damage models on regular lattices with isotropic interactions between the broken cells of the lattice. Quasistatic fiber bundles are an example. The interactions are assumed to be weak, in the sense that the stress perturbation from a broken cell is much smaller than the mean stress in the system. The system starts intact with a surface-energy threshold required to break any cell sampled from an uncorrelated quenched-disorder distribution. The evolution of this heterogeneous system is ruled by Griffith's principle, which states that a cell breaks when the release in potential (elastic) energy in the system exceeds the surface-energy barrier necessary to break the cell. By direct integration over all possible realizations of the quenched disorder, they obtain the probability distribution of each damage configuration at any level of the imposed external deformation. They demonstrate an isomorphism between the distributions so obtained and standard generalized Ising models, in which the coupling constants and effective temperature in the Ising model are functions of the nature of the quenched-disorder distribution and the extent of accumulated damage. In particular, they show that damage models with global load sharing are isomorphic to standard percolation theory, that damage models with a local load sharing rule are isomorphic to the standard Ising model, and draw consequences thereof for the universality class and behavior of the autocorrelation length of the breakdown transitions corresponding to these models. They also treat damage models having more general power-law interactions, and classify the breakdown process as a function of the power-law interaction exponent. Last, they also show that the probability distribution over configurations is a maximum of Shannon's entropy under some specific constraints related to the energetic balance of the fracture process, which firmly relates this type of quenched-disorder based damage model to standard statistical mechanics.
Laurence, Ted A; Bude, Jeff D; Ly, Sonny; Shen, Nan; Feit, Michael D
2012-05-07
Surface laser damage limits the lifetime of optics for systems guiding high fluence pulses, particularly damage in silica optics used for inertial confinement fusion-class lasers (nanosecond-scale high energy pulses at 355 nm/3.5 eV). The density of damage precursors at low fluence has been measured using large beams (1-3 cm); higher fluences cannot be measured easily since the high density of resulting damage initiation sites results in clustering. We developed automated experiments and analysis that allow us to damage test thousands of sites with small beams (10-30 µm), and automatically image the test sites to determine if laser damage occurred. We developed an analysis method that provides a rigorous connection between these small beam damage test results of damage probability versus laser pulse energy and the large beam damage results of damage precursor densities versus fluence. We find that for uncoated and coated fused silica samples, the distribution of precursors nearly flattens at very high fluences, up to 150 J/cm2, providing important constraints on the physical distribution and nature of these precursors.
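A small sketch of the standard Poisson-precursor link between small-beam damage probability and precursor density (the general idea behind connecting the two kinds of test, not the authors' specific analysis): if rho(phi) is the cumulative areal density of precursors that initiate at or below fluence phi, the probability of damaging a site with a Gaussian beam is 1 - exp(-integral of rho over the illuminated area). The power-law rho and the beam size below are assumptions.

```python
import numpy as np

def damage_probability(peak_fluence, w0_cm=15e-4, rho=lambda phi: 0.1 * phi**2):
    """P(damage) for a Gaussian beam of 1/e^2 radius w0 (cm) at a given peak fluence (J/cm^2)."""
    r = np.linspace(0.0, 5.0 * w0_cm, 2000)                  # radial grid, cm
    phi_r = peak_fluence * np.exp(-2.0 * (r / w0_cm) ** 2)   # local fluence
    integrand = rho(phi_r) * 2.0 * np.pi * r                 # precursors per unit radius
    expected_initiations = np.sum(0.5 * (integrand[1:] + integrand[:-1]) * np.diff(r))
    return 1.0 - np.exp(-expected_initiations)

for f in (10.0, 50.0, 100.0, 150.0):                         # peak fluences, J/cm^2
    print(f, damage_probability(f))
```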
Estimating the probability of mountain pine beetle red-attack damage
Michael A Wulder; J. C. White; Barbara J Bentz; M. F. Alvarez; N. C. Coops
2006-01-01
Accurate spatial information on the location and extent of mountain pine beetle infestation is critical for the planning of mitigation and treatment activities. Areas of mixed forest and variable terrain present unique challenges for the detection and mapping of mountain pine beetle red-attack damage, as red-attack has a more heterogeneous distribution under these...
Kendal, W S
2000-04-01
To illustrate how probability-generating functions (PGFs) can be employed to derive a simple probabilistic model for clonogenic survival after exposure to ionizing irradiation. Both repairable and irreparable radiation damage to DNA were assumed to occur by independent (Poisson) processes, at intensities proportional to the irradiation dose. Also, repairable damage was assumed to be either repaired or further (lethally) injured according to a third (Bernoulli) process, with the probability of lethal conversion being directly proportional to dose. Using the algebra of PGFs, these three processes were combined to yield a composite PGF that described the distribution of lethal DNA lesions in irradiated cells. The composite PGF characterized a Poisson distribution with mean αD + βD², where D was dose and α and β were radiobiological constants. This distribution yielded the conventional linear-quadratic survival equation. To test the composite model, the derived distribution was used to predict the frequencies of multiple chromosomal aberrations in irradiated human lymphocytes. The predictions agreed well with observation. This probabilistic model was consistent with single-hit mechanisms, but it was not consistent with binary misrepair mechanisms. A stochastic model for radiation survival has been constructed from elementary PGFs that exactly yields the linear-quadratic relationship. This approach can be used to investigate other simple probabilistic survival models.
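As a quick illustration of the survival prediction implied by this abstract: if lethal lesions per cell are Poisson-distributed with mean αD + βD², the surviving fraction is the zero-lesion probability exp(-(αD + βD²)), i.e. the linear-quadratic law. The α and β values below are illustrative assumptions.

```python
import numpy as np

alpha, beta = 0.2, 0.05          # Gy^-1 and Gy^-2 (illustrative values)
dose = np.linspace(0.0, 8.0, 9)  # Gy
mean_lethal_lesions = alpha * dose + beta * dose**2
surviving_fraction = np.exp(-mean_lethal_lesions)   # P(N = 0) for a Poisson N
for d, s in zip(dose, surviving_fraction):
    print(f"D = {d:.1f} Gy  S = {s:.3f}")
```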
Flood Damages- savings potential for Austrian municipalities and evidence of adaptation
NASA Astrophysics Data System (ADS)
Unterberger, C.
2016-12-01
Recent studies show that the number of extreme precipitation events has increased globally and will continue to do so in the future. These observations are particularly true for central, northern and north-eastern Europe. These changes in the patterns of extreme events have direct repercussions for policy makers. Rojas et al. (2013) find that by 2080, annual damages could increase by a factor of 17 (from €5.5 bn/year today to €98 bn/year in 2080) in the event that no adaptation measures are taken. Steininger et al. (2015) find that climate and weather induced extreme events account for an annual current welfare loss of about €1 billion in Austria. As a result, policy makers will need to understand the interaction between hazard, exposure and vulnerability, with the goal of achieving flood risk reduction. A better understanding is needed of where exposure, vulnerability and eventually flood risk are highest, i.e. where to reduce risk first, and of which factors drive existing flood risk. This article analyzes direct flood losses as reported by 1153 Austrian municipalities between 2005 and 2013. To achieve comparability between flood damages and municipalities' ordinary spending, a "vulnerability threshold" is introduced, suggesting that flood damages should be lower than 5% of municipalities' average annual ordinary spending. It is found that the probability that flood damages exceed this vulnerability threshold is 12%. To provide a reliable estimate of that exceedance probability, the joint distribution of damages and spending is modelled by means of a copula approach. Based on the joint distribution, a Monte Carlo simulation is conducted to derive uncertainty ranges for the exceedance probability. To analyze the drivers of flood damages and the effect they have on municipalities' spending, two linear regression models are estimated. The results obtained suggest that damages increase significantly for those municipalities located along the shores of the river Danube and decrease significantly for municipalities that experienced floods in the past, indicating successful adaptation. As for the relationship between flood damages and municipalities' spending, the regression results indicate that flood damages have a significant positive impact.
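A minimal sketch of the copula step described above, assuming a Gaussian copula with lognormal marginals for municipal flood damage and ordinary spending; the paper's fitted marginals, copula family and correlation are not reproduced here, all parameter values are illustrative.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
n = 1_000_000
corr = 0.3                                     # assumed copula correlation

# Correlated standard normals -> uniforms (Gaussian copula)
cov = [[1.0, corr], [corr, 1.0]]
z = rng.multivariate_normal([0.0, 0.0], cov, size=n)
u = stats.norm.cdf(z)

# Assumed lognormal marginals (EUR): flood damage and annual ordinary spending
damage = stats.lognorm.ppf(u[:, 0], s=1.5, scale=50_000)
spending = stats.lognorm.ppf(u[:, 1], s=0.8, scale=2_000_000)

exceed = damage > 0.05 * spending              # the "vulnerability threshold"
print("P(damage > 5% of spending) ~", exceed.mean())
```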
Proposal of a method for evaluating tsunami risk using response-surface methodology
NASA Astrophysics Data System (ADS)
Fukutani, Y.
2017-12-01
Information on probabilistic tsunami inundation hazards is needed to define and evaluate tsunami risk. Several methods for calculating these hazards have been proposed (e.g. Løvholt et al. (2012), Thio (2012), Fukutani et al. (2014), Goda et al. (2015)). However, these methods are computationally expensive because they require multiple tsunami numerical simulations, and they therefore lack versatility. In this study, we proposed a simpler method for tsunami risk evaluation using response-surface methodology. Kotani et al. (2016) proposed an evaluation method for the probabilistic distribution of tsunami wave height using response-surface methodology. We expanded their study and developed a probabilistic distribution of tsunami inundation depth. We set the depth (x1) and the slip (x2) of an earthquake fault as explanatory variables and tsunami inundation depth (y) as the objective variable. Tsunami risk could then be evaluated by conducting a Monte Carlo simulation, assuming that the generation probability of an earthquake follows a Poisson distribution, that the probability distribution of tsunami inundation depth follows the distribution derived from the response surface, and that the damage probability of a target follows a log-normal distribution. We applied the proposed method to a wood building located on the coast of Tokyo Bay. We implemented a regression analysis based on the results of 25 tsunami numerical calculations and developed a response surface defined as y = a·x1 + b·x2 + c (a = 0.2615, b = 3.1763, c = -1.1802). We assumed appropriate probability distributions for earthquake generation, inundation height, and vulnerability. Based on these probability distributions, we conducted Monte Carlo simulations of 1,000,000 years. We found that the expected damage probability of the studied wood building is 22.5%, assuming that an earthquake occurs. The proposed method is therefore a useful and simple way to evaluate tsunami risk using a response surface and Monte Carlo simulation without conducting multiple tsunami numerical simulations.
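A minimal sketch of the proposed Monte Carlo workflow, using the response-surface coefficients quoted in the abstract but with assumed input ranges for fault depth and slip, an assumed mean recurrence interval, and an assumed lognormal fragility for the wood building; it will not reproduce the 22.5% figure, it only shows the mechanics.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
a, b, c = 0.2615, 3.1763, -1.1802              # response surface y = a*x1 + b*x2 + c

def inundation_depth(x1_km, x2_m):
    """Inundation depth (m) from fault depth x1 (km) and slip x2 (m)."""
    return a * x1_km + b * x2_m + c

years, rate = 1_000_000, 1.0 / 100.0           # assumed mean recurrence: 100 years
n_events = rng.poisson(rate * years)

x1 = rng.uniform(5.0, 15.0, n_events)          # assumed fault-depth range, km
x2 = rng.uniform(0.5, 2.0, n_events)           # assumed slip range, m
depth = np.clip(inundation_depth(x1, x2), 0.0, None)

# Assumed lognormal fragility for a wood building (median 2 m, log-std 0.5)
p_damage = np.where(depth > 0.0,
                    stats.lognorm.cdf(depth, s=0.5, scale=2.0), 0.0)
print("expected damage probability given an event ~", p_damage.mean())
```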
Bayesian network representing system dynamics in risk analysis of nuclear systems
NASA Astrophysics Data System (ADS)
Varuttamaseni, Athi
2011-12-01
A dynamic Bayesian network (DBN) model is used in conjunction with the alternating conditional expectation (ACE) regression method to analyze the risk associated with the loss of feedwater accident coupled with a subsequent initiation of the feed and bleed operation in the Zion-1 nuclear power plant. The use of the DBN allows the joint probability distribution to be factorized, enabling the analysis to be done on many simpler network structures rather than on one complicated structure. The construction of the DBN model assumes conditional independence relations among certain key reactor parameters. The choice of parameters to model is based on considerations of the macroscopic balance statements governing the behavior of the reactor under a quasi-static assumption. The DBN is used to relate the peak clad temperature to a set of independent variables that are known to be important in determining the success of the feed and bleed operation. A simple linear relationship is then used to relate the clad temperature to the core damage probability. To obtain a quantitative relationship among different nodes in the DBN, surrogates of the RELAP5 reactor transient analysis code are used. These surrogates are generated by applying the ACE algorithm to output data obtained from about 50 RELAP5 cases covering a wide range of the selected independent variables. These surrogates allow important safety parameters such as the fuel clad temperature to be expressed as a function of key reactor parameters such as the coolant temperature and pressure together with important independent variables such as the scram delay time. The time-dependent core damage probability is calculated by sampling the independent variables from their probability distributions and propagating the information up through the Bayesian network to give the clad temperature. With the knowledge of the clad temperature and the assumption that the core damage probability has a one-to-one relationship to it, we have calculated the core damage probability as a function of transient time. The use of the DBN model in combination with ACE allows risk analysis to be performed with much less effort than if the analysis were done using the standard techniques.
Surgical Treatment of Laser Induced Eye Injuries
1992-08-21
The project was carried out in response to the increasing incidence of... response of the eye to the initial damage. Clearly, little can be done about initial damage after the injurious event. However, it is quite probable... hemorrhage has been addressed by de Juan and Machemer (1988). These authors note similar progression of hemorrhage to fibrotic tissue, although the
NASA Astrophysics Data System (ADS)
Zuo, Weiguang; Liu, Ming; Fan, Tianhui; Wang, Pengtao
2018-06-01
This paper presents the probability distribution of the slamming pressure from an experimental study of regular wave slamming on an elastically supported horizontal deck. The time series of the slamming pressure during the wave impact were first obtained through statistical analyses of experimental data. The exceedance probability distribution of the maximum slamming pressure peak and its distribution parameters were analyzed, and the results show that the exceedance probability distribution of the maximum slamming pressure peak follows the three-parameter Weibull distribution. Furthermore, the ranges of and relationships between the distribution parameters were studied. The sum of the location parameter D and the scale parameter L was approximately equal to 1.0, and the exceedance probability was more than 36.79% when the random peak was equal to the sample average during the wave impact. The variation of the distribution parameters and slamming pressure under different model conditions is comprehensively presented, and the parameter values of the Weibull distribution of wave-slamming pressure peaks differed between test models. The parameter values were found to decrease with increased stiffness of the elastic support. A damage criterion for the structure model under wave impact is discussed in a preliminary way: the structure model was destroyed when the average slamming time exceeded a certain value during the duration of the wave impact. The conclusions of the experimental study are then described.
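For reference, the three-parameter Weibull exceedance probability used for the slamming-pressure peaks can be written as P(X > x) = exp(-((x - D)/L)^k) for x >= D, with location D, scale L and shape k. A small sketch with illustrative parameter values chosen so that D + L = 1.0 reproduces the exp(-1) ≈ 36.79% exceedance probability at a normalized peak equal to the sample average; the specific D, L, k values are assumptions, not the fitted ones.

```python
import numpy as np

def weibull_exceedance(x, loc_D, scale_L, shape_k):
    """Three-parameter Weibull exceedance probability P(X > x)."""
    z = np.clip((np.asarray(x, dtype=float) - loc_D) / scale_L, 0.0, None)
    return np.exp(-z**shape_k)

D, L, k = 0.3, 0.7, 1.4        # illustrative parameters (D + L = 1.0)
sample_average = 1.0           # normalized peak equal to the sample average
print("P(peak > sample average) =", weibull_exceedance(sample_average, D, L, k))
```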
Evaluation damage threshold of optical thin-film using an amplified spontaneous emission source
NASA Astrophysics Data System (ADS)
Zhou, Qiong; Sun, Mingying; Zhang, Zhixiang; Yao, Yudong; Peng, Yujie; Liu, Dean; Zhu, Jianqiang
2014-10-01
An accurate evaluation method with an amplified spontaneous emission (ASE) source as the irradiation source has been developed for testing thin-film damage thresholds. The partial coherence of the ASE source results in a very smooth beam profile in the near field and a uniform intensity distribution of the focal spot in the far field. The ASE is generated by an Nd:glass rod amplifier in the SG-II high power laser facility, with a pulse duration of 9 ns and a spectral width (FWHM) of 1 nm. The damage threshold of the TiO2 high-reflection film is 14.4 J/cm2 using ASE as the irradiation source, about twice the 7.4 J/cm2 measured with a laser source of the same pulse duration and central wavelength. The damage area induced by ASE is small, with small-scale desquamation and a few pits, corresponding to the defect distribution of the samples. Large-area desquamation is observed in the area damaged by the laser, mainly because of the non-uniformity of the laser light. The ASE damage threshold leads to more accurate evaluations of the sample damage probability by reducing the influence of hot spots in the irradiation beam. Furthermore, the ASE source has great potential for detecting the defect distribution of optical elements.
A novel multitarget model of radiation-induced cell killing based on the Gaussian distribution.
Zhao, Lei; Mi, Dong; Sun, Yeqing
2017-05-07
The multitarget version of the traditional target theory based on the Poisson distribution is still used to describe the dose-survival curves of cells after ionizing radiation in radiobiology and radiotherapy. However, noting that typical ionizing radiation damage is the result of two sequential stochastic processes, the probability distribution of the damage number per cell should follow a compound Poisson distribution, such as Neyman's type A distribution (N. A.). Considering that the Gaussian distribution can be regarded as an approximation of the N. A. in the case of high flux, a multitarget model based on the Gaussian distribution is proposed to describe cell inactivation effects under low linear energy transfer (LET) radiation at high dose rate. Theoretical analysis and experimental data fitting indicate that the present theory is superior to the traditional multitarget model and similar to the linear-quadratic (LQ) model in describing the biological effects of low-LET radiation at high dose rate, and the parameter ratio in the present model can be used as an alternative indicator of the radiation damage and radiosensitivity of the cells.
Flood Risk Due to Hurricane Flooding
NASA Astrophysics Data System (ADS)
Olivera, Francisco; Hsu, Chih-Hung; Irish, Jennifer
2015-04-01
In this study, we evaluated the expected economic losses caused by hurricane inundation. We used surge response functions, which are physics-based dimensionless scaling laws that give surge elevation as a function of the hurricane's parameters (i.e., central pressure, radius, forward speed, approach angle and landfall location) at specified locations along the coast. These locations were close enough to avoid significant changes in surge elevations between consecutive points, and distant enough to minimize calculations. The probability of occurrence of a surge elevation value at a given location was estimated using a joint probability distribution of the hurricane parameters. The surge elevation, at the shoreline, was assumed to project horizontally inland within a polygon of influence. Individual parcel damage was calculated based on flood water depth and damage vs. depth curves available for different building types from the HAZUS computer application developed by the Federal Emergency Management Agency (FEMA). Parcel data, including property value and building type, were obtained from the county appraisal district offices. The expected economic losses were calculated as the sum of the products of the estimated parcel damages and their probability of occurrence for the different storms considered. Anticipated changes for future climate scenarios were considered by accounting for projected hurricane intensification, as indicated by sea surface temperature rise, and sea level rise, which modify the probability distribution of hurricane central pressure and change the baseline of the damage calculation, respectively. Maps of expected economic losses have been developed for Corpus Christi in Texas, Gulfport in Mississippi and Panama City in Florida. Specifically, for Port Aransas, in the Corpus Christi area, it was found that the expected economic losses were in the range of 1% to 4% of the property value for current climate conditions, of 1% to 8% for the 2030's and of 1% to 14% for the 2080's.
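A minimal sketch of the parcel-level expected-loss computation described above: damage is read off an assumed depth-damage curve and weighted by assumed storm-scenario probabilities; all depths, probabilities, curve points and the property value are illustrative, not HAZUS or appraisal-district data.

```python
import numpy as np

# Assumed depth-damage curve for one building type: damage ratio vs. flood depth (m)
depth_grid = np.array([0.0, 0.5, 1.0, 2.0, 3.0])
damage_ratio = np.array([0.0, 0.1, 0.25, 0.55, 0.8])

def parcel_damage(depth_m, property_value):
    """Parcel damage (USD) interpolated from the depth-damage curve."""
    return property_value * np.interp(depth_m, depth_grid, damage_ratio)

# Assumed storm scenarios: annual probability and flood depth at the parcel
scenario_prob = np.array([0.02, 0.01, 0.002])
scenario_depth = np.array([0.4, 1.2, 2.5])        # m

value = 250_000.0                                  # assumed property value, USD
expected_loss = np.sum(scenario_prob * parcel_damage(scenario_depth, value))
print("expected annual loss (USD):", expected_loss)
```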
NASA Astrophysics Data System (ADS)
Le, Jia-Liang; Bažant, Zdeněk P.
2011-07-01
This paper extends the theoretical framework presented in the preceding Part I to the lifetime distribution of quasibrittle structures failing at the fracture of one representative volume element under constant amplitude fatigue. The probability distribution of the critical stress amplitude is derived for a given number of cycles and a given minimum-to-maximum stress ratio. The physical mechanism underlying the Paris law for fatigue crack growth is explained under certain plausible assumptions about the damage accumulation in the cyclic fracture process zone at the tip of subcritical crack. This law is then used to relate the probability distribution of critical stress amplitude to the probability distribution of fatigue lifetime. The theory naturally yields a power-law relation for the stress-life curve (S-N curve), which agrees with Basquin's law. Furthermore, the theory indicates that, for quasibrittle structures, the S-N curve must be size dependent. Finally, physical explanation is provided to the experimentally observed systematic deviations of lifetime histograms of various ceramics and bones from the Weibull distribution, and their close fits by the present theory are demonstrated.
NASA Technical Reports Server (NTRS)
Nemeth, Noel
2013-01-01
Models that predict the failure probability of monolithic glass and ceramic components under multiaxial loading have been developed by authors such as Batdorf, Evans, and Matsuo. These "unit-sphere" failure models assume that the strength-controlling flaws are randomly oriented, noninteracting planar microcracks of specified geometry but of variable size. This report develops a formulation to describe the probability density distribution of the orientation of critical strength-controlling flaws that results from an applied load. This distribution is a function of the multiaxial stress state, the shear sensitivity of the flaws, the Weibull modulus, and the strength anisotropy. Examples are provided showing the predicted response on the unit sphere for various stress states for isotropic and transversely isotropic (anisotropic) materials, including the most probable orientation of critical flaws for offset uniaxial loads with strength anisotropy. The author anticipates that this information could be used to determine anisotropic stiffness degradation or anisotropic damage evolution for individual brittle (or quasi-brittle) composite material constituents within finite element or micromechanics-based software.
Social tension as precursor of large damaging earthquake: legend or reality?
NASA Astrophysics Data System (ADS)
Molchanov, O.
2008-11-01
Using a case study of earthquake (EQ) activity and war conflicts in the Caucasus during the 1975-2002 time interval, and a correlation analysis of the global distribution of damaging EQs and war-related social tension during the 1901-2005 period, we conclude:
The Effect of General Statistical Fiber Misalignment on Predicted Damage Initiation in Composites
NASA Technical Reports Server (NTRS)
Bednarcyk, Brett A.; Aboudi, Jacob; Arnold, Steven M.
2014-01-01
A micromechanical method is employed to predict the behavior of unidirectional composites in which the fiber orientation can possess various statistical misalignment distributions. The method relies on the probability-weighted averaging of the appropriate concentration tensor, which is established by the micromechanical procedure. This approach provides access to the local field quantities throughout the constituents, from which the initiation of damage in the composite can be predicted. In contrast, a typical macromechanical procedure can determine the effective composite elastic properties in the presence of statistical fiber misalignment, but cannot provide the local fields. Fully random fiber distribution is presented as a special case of the proposed micromechanical method. Results are given that illustrate the effects of various amounts of fiber misalignment in terms of the standard deviations of in-plane and out-of-plane misalignment angles, where normal distributions have been employed. Damage initiation envelopes, local fields, effective moduli, and strengths are predicted for polymer and ceramic matrix composites with given normal distributions of misalignment angles, as well as fully random fiber orientation.
Differential effects of insular and ventromedial prefrontal cortex lesions on risky decision-making
Clark, L.; Bechara, A.; Damasio, H.; Aitken, M. R. F.; Sahakian, B. J.; Robbins, T. W.
2008-01-01
The ventromedial prefrontal cortex (vmPFC) and insular cortex are implicated in distributed neural circuitry that supports emotional decision-making. Previous studies of patients with vmPFC lesions have focused primarily on decision-making under uncertainty, when outcome probabilities are ambiguous (e.g. the Iowa Gambling Task). It remains unclear whether vmPFC is also necessary for decision-making under risk, when outcome probabilities are explicit. It is not known whether the effect of insular damage is analogous to the effect of vmPFC damage, or whether these regions contribute differentially to choice behaviour. Four groups of participants were compared on the Cambridge Gamble Task, a well-characterized measure of risky decision-making where outcome probabilities are presented explicitly, thus minimizing additional learning and working memory demands. Patients with focal, stable lesions to the vmPFC (n = 20) and the insular cortex (n = 13) were compared against healthy subjects (n = 41) and a group of lesion controls (n = 12) with damage predominantly affecting the dorsal and lateral frontal cortex. The vmPFC and insular cortex patients showed selective and distinctive disruptions of betting behaviour. VmPFC damage was associated with increased betting regardless of the odds of winning, consistent with a role of vmPFC in biasing healthy individuals towards conservative options under risk. In contrast, patients with insular cortex lesions failed to adjust their bets by the odds of winning, consistent with a role of the insular cortex in signalling the probability of aversive outcomes. The insular group attained a lower point score on the task and experienced more ‘bankruptcies’. There were no group differences in probability judgement. These data confirm the necessary role of the vmPFC and insular regions in decision-making under risk. Poor decision-making in clinical populations can arise via multiple routes, with functionally dissociable effects of vmPFC and insular cortex damage. PMID:18390562
Serebrianyĭ, A M; Akleev, A V; Aleshchenko, A V; Antoshchina, M M; Kudriashova, O V; Riabchenko, N I; Semenova, L P; Pelevina, I I
2011-01-01
By the cytokinesis-block micronucleus (MN) assay with cytochalasin B, the mean frequency of blood lymphocytes with MN has been determined in 76 Moscow inhabitants, 35 people from Obninsk and 122 from the Chelyabinsk region. In contrast to the distribution of individuals by the spontaneous frequency of cells with aberrations, which was shown to be binomial (Kusnetzov et al., 1980), the distribution of individuals by the spontaneous frequency of cells with MN in all three cohorts can be regarded as log-normal (chi-squared test). The distribution of individuals in the combined cohort (Moscow and Obninsk inhabitants) and in the single cohort of all those examined must, with high reliability, be regarded as log-normal (0.70 and 0.86, respectively), but it cannot be regarded as Poisson, binomial or normal. Taking into account that a log-normal distribution of children by the spontaneous frequency of lymphocytes with MN was observed in a survey of 473 children from different kindergartens in Moscow, we conclude that log-normality is a regularity inherent in this type of damage to the lymphocyte genome. In contrast, the distribution of individuals by the frequency of lymphocytes with MN induced by irradiation in vitro must in most cases be regarded as normal. This distributional character suggests that the appearance of damage (genomic instability) in a single lymphocyte of an individual increases the probability of damage appearing in other lymphocytes. We propose that damaged lymphocyte progenitor stem cells exchange information with undamaged cells, a type of bystander effect. It can also be supposed that transmission of damage to daughter cells occurs during stem cell division.
ERMiT: Estimating Post-Fire Erosion in Probabilistic Terms
NASA Astrophysics Data System (ADS)
Pierson, F. B.; Robichaud, P. R.; Elliot, W. J.; Hall, D. E.; Moffet, C. A.
2006-12-01
Mitigating the impacts of post-wildfire runoff and erosion on life, property, and natural resources has cost the United States government tens of millions of dollars over the past decade. The decision of where, when, and how to apply the most effective mitigation treatments requires land managers to assess the risk of damaging runoff and erosion events occurring after a fire. The Erosion Risk Management Tool (ERMiT) is a web-based application that estimates erosion in probabilistic terms on burned and recovering forest, range, and chaparral lands. Unlike most erosion prediction models, ERMiT does not provide 'average annual erosion rates'; rather, it provides a distribution of erosion rates with the likelihood of their occurrence. ERMiT combines rain event variability with spatial and temporal variability of hillslope burn severity, soil properties, and ground cover to estimate Water Erosion Prediction Project (WEPP) model input parameter values. Based on 20 to 40 individual WEPP runs, ERMiT produces a distribution of rain event erosion rates with a probability of occurrence for each of five post-fire years. Over the 5 years of modeled recovery, the occurrence probability of the less erodible soil parameters is increased and the occurrence probability of the more erodible soil parameters is decreased. In addition, the occurrence probabilities and the four spatial arrangements of burn severity (arrangements of overland flow elements, OFEs) are shifted toward lower burn severity with each year of recovery. These yearly adjustments are based on field measurements made through post-fire recovery periods. ERMiT also provides rain event erosion rate distributions for hillslopes that have been treated with seeding, straw mulch, straw wattles and contour-felled log erosion barriers. Such output can help managers make erosion mitigation treatment decisions based on the probability of high sediment yields occurring, the value of resources at risk of damage, cost, and other management considerations.
Ecosystem services impacts associated with environmental ...
Nitrogen release to the environment from human activities can have important and costly impacts on human health, recreation, transportation, fisheries, and ecosystem health. Recent efforts to quantify these damage costs have identified annual damages associated with reactive nitrogen release in the EU and US in the hundreds of billions of US dollars (USD). The general approach used to estimate these damages associated with reactive nitrogen is derived from a variety of methods for estimating economic damages, for example, impacts to human respiratory health in terms of hospital visits and mortality, willingness to pay to improve a water body, and costs to replace or treat drinking water systems affected by nitrate or cyanotoxin contamination. These values are then extrapolated to other areas to develop the damage cost estimates, which are probably best seen as potential damage costs, particularly for aquatic ecosystems. We seek to provide an additional verification of these potential damages using data assembled by the US EPA for case studies of measured costs of nutrient impacts across the US from 2000-2012. We compare the spatial distribution and the magnitude of these costs with the spatial distribution and magnitude of costs estimated for HUC8 watershed units across the US by Sobota et al. (2015). We anticipate that this analysis will provide a ground truthing of existing damage cost estimates, and continue to support the incorporation of cost and benefit information
Criticality in finite dynamical networks
NASA Astrophysics Data System (ADS)
Rohlf, Thimo; Gulbahce, Natali; Teuscher, Christof
2007-03-01
It has been shown analytically and experimentally that both random Boolean and random threshold networks show a transition from ordered to chaotic dynamics at a critical average connectivity Kc in the thermodynamic limit [1]. By looking at the statistical distributions of damage spreading (damage sizes), we go beyond this extensively studied mean-field approximation. We study the scaling properties of damage size distributions as a function of system size N and initial perturbation size d(t=0). We present numerical evidence that another characteristic point, Kd, exists for finite system sizes, where the expectation value of damage spreading in the network is independent of the system size N. Further, the probability of obtaining critical networks is investigated for a given system size N and average connectivity K. Our results suggest that, for finite-size dynamical networks, the phase space structure is very complex and may not exhibit a sharp order-disorder transition. Finally, we discuss the implications of our findings for evolutionary processes and learning applied to networks that solve specific computational tasks. [1] Derrida, B. and Pomeau, Y. (1986), Europhys. Lett., 1, 45-49
NASA Astrophysics Data System (ADS)
Lei, Yaguo; Liu, Zongyao; Wang, Delong; Yang, Xiao; Liu, Huan; Lin, Jing
2018-06-01
Tooth damage often causes a reduction in gear mesh stiffness. Thus time-varying mesh stiffness (TVMS) can be treated as an indication of gear health conditions. This study is devoted to investigating the mesh stiffness variations of a pair of external spur gears with tooth pitting, and proposes a new model for describing tooth pitting based on probability distributions. In the model, considering the appearance and development process of tooth pitting, we model the pitting on the surface of spur gear teeth as a series of pits with a uniform distribution in the direction of tooth width and a normal distribution in the direction of tooth height, respectively. In addition, four pitting degrees, from no pitting to severe pitting, are modeled. Finally, the influence of tooth pitting on TVMS is analyzed in detail and the proposed model is validated by comparison with a finite element model. The comparison results show that the proposed model is effective for the TVMS evaluation of pitted gears.
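A minimal sketch of the pit-generation idea in the abstract, i.e. pit centres drawn uniformly across the tooth width and normally along the tooth height, accumulated into a pitted-area fraction per width slice; tooth dimensions, pit radius and the pit counts per damage degree are illustrative assumptions, and no mesh-stiffness calculation is attempted.

```python
import numpy as np

rng = np.random.default_rng(4)

tooth_width, tooth_height = 20.0, 8.0      # mm (illustrative)
pit_radius = 0.3                           # mm (illustrative)
n_pits = {"no": 0, "slight": 40, "moderate": 150, "severe": 400}  # assumed degrees

def pitted_fraction(n, n_slices=50):
    """Approximate pitted-area fraction per width slice (pits off the flank are discarded)."""
    x = rng.uniform(0.0, tooth_width, n)                       # uniform across width
    y = rng.normal(tooth_height / 2.0, tooth_height / 6.0, n)  # normal along height
    inside = (y >= 0.0) & (y <= tooth_height)
    counts, _ = np.histogram(x[inside], bins=n_slices, range=(0.0, tooth_width))
    area_per_slice = (tooth_width / n_slices) * tooth_height
    return np.clip(counts * np.pi * pit_radius**2 / area_per_slice, 0.0, 1.0)

for degree, n in n_pits.items():
    print(degree, "max pitted fraction per slice:", pitted_fraction(n).max())
```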
Assessing hail risk for a building portfolio by generating stochastic events
NASA Astrophysics Data System (ADS)
Nicolet, Pierrick; Choffet, Marc; Demierre, Jonathan; Imhof, Markus; Jaboyedoff, Michel; Nguyen, Liliane; Voumard, Jérémie
2015-04-01
Among the natural hazards affecting buildings, hail is one of the most costly and is nowadays a major concern for building insurance companies. In Switzerland, several costly events have been reported in recent years, among which the July 2011 event, which cost the Aargauer public insurance company (north-western Switzerland) around 125 million EUR. This study presents new developments in a stochastic model that aims at evaluating the risk for a building portfolio. Using insurance and meteorological radar data from the 2011 event, vulnerability curves are proposed by comparing the damage rate to the radar intensity (i.e. the maximum hailstone size reached during the event, deduced from the radar signal). From these data, vulnerability is defined by a two-step process. The first step defines the probability for a building to be affected (i.e. to claim damages), while the second, if the building is affected, attributes a damage rate to the building from a probability distribution specific to the intensity class. To assess the risk, stochastic events are then generated by summing a set of Gaussian functions with six random parameters (X and Y location, maximum hailstone size, standard deviation, eccentricity and orientation). The location of these functions is constrained by a general event shape and by the position of the previously defined functions of the same event. For each generated event, the total cost is calculated in order to obtain a distribution of event costs. The general event parameters (shape, size, …) as well as the distributions of the Gaussian parameters are inferred from two radar intensity maps, namely that of the aforementioned event and that of an event which occurred in 2009. After a large number of simulations, the hailstone size distribution obtained in different regions is compared to the distribution inferred from pre-existing hazard maps, built from a larger set of radar data. The simulation parameters are then adjusted by trial and error in order to best reproduce the expected distributions. The value of the mean annual risk obtained using the model is also compared to the mean annual risk calculated directly from the hazard maps. According to the first results, the return period of an event inducing a total damage cost equal to or greater than 125 million EUR for the Aargauer insurance company would be around 10 to 40 years.
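A minimal sketch of the stochastic event generation described above, building a hail footprint as a sum of anisotropic Gaussian functions with the six random parameters named in the abstract; every distribution below is an illustrative assumption rather than the calibrated model, and no shape or positional constraints are applied.

```python
import numpy as np

rng = np.random.default_rng(5)

def gaussian_cell(x, y, cx, cy, size, sigma, ecc, theta):
    """Anisotropic Gaussian 'cell' giving hailstone size (cm) over the grid."""
    dx, dy = x - cx, y - cy
    xr = dx * np.cos(theta) + dy * np.sin(theta)
    yr = -dx * np.sin(theta) + dy * np.cos(theta)
    return size * np.exp(-0.5 * ((xr / sigma) ** 2 + (yr / (sigma * ecc)) ** 2))

# Grid over a 50 km x 50 km region
x, y = np.meshgrid(np.linspace(0, 50, 251), np.linspace(0, 50, 251))

footprint = np.zeros_like(x)
for _ in range(rng.integers(3, 8)):                 # number of cells per event (assumed)
    params = (rng.uniform(0, 50), rng.uniform(0, 50),   # cx, cy (km)
              rng.uniform(1, 6),                        # maximum hailstone size (cm)
              rng.uniform(1, 5),                        # standard deviation (km)
              rng.uniform(0.3, 1.0),                    # eccentricity
              rng.uniform(0, np.pi))                    # orientation (rad)
    footprint += gaussian_cell(x, y, *params)           # sum of Gaussian functions

print("maximum simulated hailstone size (cm):", footprint.max())
```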
Daily tornado frequency distributions in the United States
NASA Astrophysics Data System (ADS)
Elsner, J. B.; Jagger, T. H.; Widen, H. M.; Chavas, D. R.
2014-01-01
The authors examine daily tornado counts in the United States over the period 1994-2012 and find strong evidence for a power-law relationship in the frequency distribution. The scaling exponent is estimated at 1.64 (0.019 s.e.), giving a per-tornado-day probability of 0.014% (return period of 71 years) that a tornado day produces 145 tornadoes, as was observed on 27 April 2011. They also find that the total number of tornadoes by damage category on days with at least one violent tornado follows an exponential rule. On average, the daily number of tornadoes in the next lowest damage category is approximately twice the number in the current category. These findings are important and timely for tornado hazard models and for seasonal and sub-seasonal forecasts of tornado activity.
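As a generic illustration of how such a tail probability follows from a fitted power law, the sketch below normalizes p(n) proportional to n^(-1.64) over daily counts n = 1..nmax and sums the tail at n >= 145; the normalization choices here (minimum count, truncation point, fitting procedure) are assumptions, so the result will not exactly match the quoted 0.014%.

```python
import numpy as np

alpha, nmax = 1.64, 100_000           # exponent from the abstract; nmax is an assumption
n = np.arange(1, nmax + 1, dtype=float)
p = n**(-alpha)
p /= p.sum()                          # normalize the power law over 1..nmax
print("P(N >= 145 on a tornado day) ~", p[n >= 145].sum())
```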
DOE Office of Scientific and Technical Information (OSTI.GOV)
Farrar, Charles R; Gobbato, Maurizio; Conte, Joel
2009-01-01
The extensive use of lightweight advanced composite materials in unmanned aerial vehicles (UAVs) drastically increases the sensitivity to both fatigue- and impact-induced damage of their critical structural components (e.g., wings and tail stabilizers) during service life. The spar-to-skin adhesive joints are considered one of the most fatigue-sensitive subcomponents of a lightweight UAV composite wing, with damage progressively evolving from the wing root. This paper presents a comprehensive probabilistic methodology for predicting the remaining service life of adhesively-bonded joints in laminated composite structural components of UAVs. Non-destructive evaluation techniques and Bayesian inference are used to (i) assess the current state of damage of the system and (ii) update the probability distribution of the damage extent at various locations. A probabilistic model for future loads and a mechanics-based damage model are then used to stochastically propagate damage through the joint. Combined local (e.g., exceedance of a critical damage size) and global (e.g., flutter instability) failure criteria are finally used to compute the probability of component failure at future times. The applicability and partial validation of the proposed methodology are then briefly discussed by analyzing the debonding propagation, along a pre-defined adhesive interface, in a simply supported laminated composite beam with a solid rectangular cross section, subjected to a concentrated load applied at mid-span. A specially developed Euler-Bernoulli beam finite element with interlaminar slip along the damageable interface is used in combination with a cohesive zone model to study the fatigue-induced degradation in the adhesive material. The preliminary numerical results presented are promising for the future validation of the methodology.
Probabilistic Integrated Assessment of "Dangerous" Climate Change
NASA Astrophysics Data System (ADS)
Mastrandrea, Michael D.; Schneider, Stephen H.
2004-04-01
Climate policy decisions are being made despite layers of uncertainty. Such decisions directly influence the potential for "dangerous anthropogenic interference with the climate system." We mapped a metric for this concept, based on the Intergovernmental Panel on Climate Change assessment of climate impacts, onto probability distributions of future climate change produced from uncertainty in key parameters of the coupled social-natural system: climate sensitivity, climate damages, and discount rate. Analyses with a simple integrated assessment model found that, under midrange assumptions, endogenously calculated optimal climate policy controls can reduce the probability of dangerous anthropogenic interference from ~45% under minimal controls to near zero.
A performance-based approach to landslide risk analysis
NASA Astrophysics Data System (ADS)
Romeo, R. W.
2009-04-01
An approach to risk assessment based on a probabilistic analysis of the performance of structures threatened by landslides is shown and discussed. The risk is a possible loss due to the occurrence of a potentially damaging event. Analytically, the risk is the probability convolution of hazard, which defines the frequency of occurrence of the event (i.e., the demand), and fragility, which defines the capacity of the system to withstand the event given its characteristics (i.e., severity) and those of the exposed goods (vulnerability), that is: Risk = P(D >= d | S, V). The inequality sets a damage (or loss) threshold beyond which the system's performance is no longer met. Therefore a consistent approach to risk assessment should: 1) adopt a probabilistic model which takes into account all the uncertainties of the involved variables (capacity and demand), 2) follow a performance approach based on given loss or damage thresholds. The proposed method belongs to the category of semi-empirical ones: the theoretical component is given by the probabilistic capacity-demand model; the empirical component is given by the observed statistical behaviour of structures damaged by landslides. Only two landslide properties are required: the areal extent and the type (or kinematism). All other properties required to determine the severity of landslides (such as depth, speed and frequency) are derived via probabilistic methods. The severity (or intensity) of landslides, in terms of kinetic energy, is the demand; the resistance capacity is given by the cumulative distribution functions of the limit state performance (fragility functions) assessed via damage surveys and the compilation of damage cards. The investigated limit states are aesthetic (of nominal concern alone), functional (interruption of service) and structural (economic and social losses). The damage probability is the probabilistic convolution of hazard (the probability mass function of the frequency of occurrence of given severities) and vulnerability (the probability of a limit state performance being reached, given a certain severity). Then, for each landslide, all the exposed goods (structures and infrastructure) within the landslide area and within a buffer (representative of the maximum extension of a landslide given a reactivation) are counted. The risk is the product of the damage probability and the ratio of the exposed goods of each landslide to the whole assets exposed to the same type of landslides. Since the risk is computed numerically and by the same procedure applied to all landslides, it is free from the subjective assessments implied in qualitative methods.
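A minimal numerical sketch of the hazard-fragility convolution described above, with an assumed hazard probability mass function over discrete severity classes and assumed fragility values for the three limit states (aesthetic, functional, structural); all numbers are illustrative.

```python
import numpy as np

severity_levels = np.array([1, 2, 3, 4])                   # e.g. kinetic-energy classes
hazard_pmf = np.array([0.6, 0.25, 0.1, 0.05])              # P(severity) given an event

# Fragility: rows = limit states (aesthetic, functional, structural),
# columns = P(limit state reached | severity level)
fragility = np.array([[0.30, 0.60, 0.85, 0.95],
                      [0.05, 0.25, 0.60, 0.85],
                      [0.01, 0.05, 0.25, 0.60]])

damage_probability = fragility @ hazard_pmf                # convolution over severities
for name, p in zip(("aesthetic", "functional", "structural"), damage_probability):
    print(name, round(float(p), 3))
```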
NASA Astrophysics Data System (ADS)
Zohar, Motti; Salamon, Amos; Rubin, Rehav
2017-01-01
Israel has been hit by destructive earthquakes many times in the course of history. To properly understand the hazard and support effective preparedness for future earthquakes, we examined the spatial and temporal distribution of the resulting damage. We describe in detail our systematic approach to searching the available literature, collecting the data and screening the authenticity of that information. We used GIS (Geographic Information System) to map and evaluate the distribution of the damage and to search for recurring patterns. Overall, it is found that 186 localities were hit, 54 of them at least twice. We also found that Israel was affected by 4, 17, 8 and 2 damaging earthquakes that originated, respectively, from the southern, central, central-northern and northern parts of the Dead Sea Transform (DST). The temporal appearance of the northern earthquakes is clustered; the central earthquakes are more regular in time, whereas no damage from the central-northern and central quakes, with the exception of the year 363 earthquake, seems to have occurred south of the Dead Sea region. Analyzing the distribution of the damage, we realized that the number of damage reports reflects only half of the incidents that actually happened, attesting to the incompleteness of the historical catalogue. Jerusalem is the most reported city with 14 entries, followed by Akko (Acre), Tiberias, Nablus and Tyre with 8, 7, 7 and 6 reports, respectively. In general, localities in the Galilee and north of it suffered more severely than localities in central Israel, with the exception of Nablus and the localities along the coastal plain of Israel, most probably due to local site effects. For the sake of hazard management, these observations should be considered in future planning and risk mitigation.
A probabilistic tornado wind hazard model for the continental United States
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hossain, Q; Kimball, J; Mensing, R
A probabilistic tornado wind hazard model for the continental United States (CONUS) is described. The model incorporates both aleatory (random) and epistemic uncertainties associated with quantifying the tornado wind hazard parameters. The temporal occurrence of tornadoes within the continental United States (CONUS) is assumed to be a Poisson process. A spatial distribution of tornado touchdown locations is developed empirically based on the observed historical events within the CONUS. The hazard model is an aerial probability model that takes into consideration the size and orientation of the facility, the length and width of the tornado damage area (idealized as a rectangle and dependent on the tornado intensity scale), wind speed variation within the damage area, tornado intensity classification errors (i.e., errors in assigning a Fujita intensity scale based on surveyed damage), and the tornado path direction. Epistemic uncertainties in describing the distributions of the aleatory variables are accounted for by using more than one distribution model to describe aleatory variations. The epistemic uncertainties are based on inputs from a panel of experts. A computer program, TORNADO, has been developed incorporating this model; features of this program are also presented.
NASA Astrophysics Data System (ADS)
Abadie, Luis Maria; Galarraga, Ibon; Sainz de Murieta, Elisa
2017-01-01
A quantification of present and future mean annual losses due to extreme coastal events can be crucial for adequate decision making on adaptation to climate change in coastal areas around the globe. However, this approach is limited when uncertainty needs to be accounted for. In this paper, we assess coastal flood risk from sea-level rise and extreme events in 120 major cities around the world using an alternative stochastic approach that accounts for uncertainty. Probability distributions of future relative (local) sea-level rise have been used for each city, under three IPCC emission scenarios, RCP 2.6, 4.5 and 8.5. The approach allows a continuous stochastic function to be built to assess the yearly evolution of damages from 2030 to 2100. Additionally, we present two risk measures that put low-probability, high-damage events in the spotlight: the Value at Risk (VaR) and the Expected Shortfall (ES), which enable the damages to be estimated when a certain risk level is exceeded. This level of acceptable risk can be defined involving different stakeholders to guide progressive adaptation strategies. The method presented here is new in the field of economics of adaptation and offers a much broader picture of the challenges related to dealing with climate impacts. Furthermore, it can be applied not only to assess adaptation needs but also to put adaptation into a timeframe in each city.
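A minimal sketch of the two risk measures named above, computed from a simulated damage sample: the VaR is the damage quantile at a chosen level and the ES is the mean damage beyond the VaR; the lognormal damage distribution and the 95% level are illustrative assumptions, not the paper's fitted city-level distributions.

```python
import numpy as np

rng = np.random.default_rng(6)
damages = rng.lognormal(mean=4.0, sigma=1.0, size=1_000_000)   # annual damages (M USD)

level = 0.95
var = np.quantile(damages, level)               # Value at Risk at the 95% level
es = damages[damages > var].mean()              # Expected Shortfall beyond the VaR
print(f"VaR(95%) = {var:.1f} M USD, ES(95%) = {es:.1f} M USD")
```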
NASA Astrophysics Data System (ADS)
Rylander, Marissa N.; Feng, Yusheng; Diller, Kenneth; Bass, J.
2005-04-01
Heat shock proteins (HSP) are critical components of a complex defense mechanism essential for preserving cell survival under adverse environmental conditions. Hyperthermia can, however, enhance tumor tissue viability through HSP expression in regions where temperatures are insufficient to coagulate proteins, which would likely increase the probability of cancer recurrence. Although hyperthermia therapy is commonly used in conjunction with radiotherapy, chemotherapy, and gene therapy to increase therapeutic effectiveness, the efficacy of these therapies can be substantially hindered by HSP expression when hyperthermia is applied prior to these procedures. Therefore, in planning hyperthermia protocols, prediction of the HSP response of the tumor must be incorporated into the treatment plan to optimize the thermal dose delivery and permit prediction of overall tissue response. In this paper, we present a highly accurate, adaptive, finite element tumor model capable of predicting the HSP expression distribution and tissue damage region based on measured cellular data when hyperthermia protocols are specified. Cubic spline representations of HSP27 and HSP70 expression, and Arrhenius damage models, were integrated into the finite element model to enable prediction of the HSP expression and damage distribution in the tissue following laser heating. Application of the model can enable optimized treatment planning by controlling the tissue response to therapy based on accurate prediction of the HSP expression and cell damage distribution.
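A minimal sketch of the Arrhenius thermal damage integral referred to above, Omega(t) = A * integral of exp(-Ea / (R * T(tau))) dtau. The frequency factor A, activation energy Ea and temperature history are illustrative assumptions, not the paper's fitted values.

import numpy as np

A = 3.1e98        # 1/s, assumed frequency factor
Ea = 6.28e5       # J/mol, assumed activation energy
R = 8.314         # J/(mol K)

t = np.linspace(0.0, 60.0, 6001)                          # 60 s exposure
T = 310.0 + 10.0 * np.exp(-((t - 30.0) / 10.0) ** 2)      # assumed temperature history (K)

dt = t[1] - t[0]
omega = np.sum(A * np.exp(-Ea / (R * T))) * dt            # cumulative damage parameter
survival = np.exp(-omega)                                 # fraction of undamaged cells/protein
print(f"Omega = {omega:.4f}, surviving fraction = {survival:.4f}")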
Tropical and Extratropical Cyclone Damages under Climate Change
NASA Astrophysics Data System (ADS)
Ranson, M.; Kousky, C.; Ruth, M.; Jantarasami, L.; Crimmins, A.; Tarquinio, L.
2014-12-01
This paper provides the first quantitative synthesis of the rapidly growing literature on future tropical and extratropical cyclone losses under climate change. We estimate a probability distribution for the predicted impact of changes in global surface air temperatures on future storm damages, using an ensemble of 296 estimates of the temperature-damage relationship from twenty studies. Our analysis produces three main empirical results. First, we find strong but not conclusive support for the hypothesis that climate change will cause damages from tropical cyclones and wind storms to increase, with most models (84 and 92 percent, respectively) predicting higher future storm damages due to climate change. Second, there is substantial variation in projected changes in losses across regions. Potential changes in damages are greatest in the North Atlantic basin, where the multi-model average predicts that a 2.5°C increase in global surface air temperature would cause hurricane damages to increase by 62 percent. The ensemble predictions for Western North Pacific tropical cyclones and European wind storms (extratropical cyclones) are approximately one third of that magnitude. Finally, our analysis shows that existing models of storm damages under climate change generate a wide range of predictions, ranging from moderate decreases to very large increases in losses.
NASA Technical Reports Server (NTRS)
Huddleston, Lisa L.; Roeder, William P.; Merceret, Francis J.
2010-01-01
A new technique has been developed to estimate the probability that a nearby cloud-to-ground lightning stroke was within a specified radius of any point of interest. This process uses the bivariate Gaussian distribution of probability density provided by the current lightning location error ellipse for the most likely location of a lightning stroke and integrates it to determine the probability that the stroke is inside any specified radius of any location, even if that location is not centered on or even within the location error ellipse. This technique is adapted from a method of calculating the probability of debris collision with spacecraft. Such a technique is important in spaceport processing activities because it allows engineers to quantify the risk of induced current damage to critical electronics due to nearby lightning strokes. This technique was tested extensively and is now in use by space launch organizations at Kennedy Space Center and Cape Canaveral Air Force Station.
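A hedged sketch of the quantity the technique above computes: the probability that a stroke whose location uncertainty is a bivariate Gaussian (the location error ellipse) fell within a given radius of a point of interest. Monte Carlo integration is used here in place of the analytic integration of the original method, and all numbers are illustrative.

import numpy as np

rng = np.random.default_rng(2)

mean = np.array([0.0, 0.0])                    # most likely stroke location (km)
cov = np.array([[0.25, 0.10],                  # error-ellipse covariance (km^2)
                [0.10, 0.49]])

point = np.array([0.8, -0.3])                  # point of interest (km)
radius = 1.0                                   # radius of concern (km)

samples = rng.multivariate_normal(mean, cov, size=200_000)
inside = np.linalg.norm(samples - point, axis=1) <= radius
print(f"P(stroke within {radius} km) ~= {inside.mean():.3f}")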
Meta-analysis of attitudes toward damage-causing mammalian wildlife.
Kansky, Ruth; Kidd, Martin; Knight, Andrew T
2014-08-01
Many populations of threatened mammals persist outside formally protected areas, and their survival depends on the willingness of communities to coexist with them. An understanding of the attitudes, and specifically the tolerance, of individuals and communities and the factors that determine these is therefore fundamental to designing strategies to alleviate human-wildlife conflict. We conducted a meta-analysis to identify factors that affected attitudes toward 4 groups of terrestrial mammals. Elephants (65%) elicited the most positive attitudes, followed by primates (55%), ungulates (53%), and carnivores (44%). Urban residents presented the most positive attitudes (80%), followed by commercial farmers (51%) and communal farmers (26%). A tolerance to damage index showed that human tolerance of ungulates and primates was proportional to the probability of experiencing damage while elephants elicited tolerance levels higher than anticipated and carnivores elicited tolerance levels lower than anticipated. Contrary to conventional wisdom, experiencing damage was not always the dominant factor determining attitudes. Communal farmers had a lower probability of being positive toward carnivores irrespective of probability of experiencing damage, while commercial farmers and urban residents were more likely to be positive toward carnivores irrespective of damage. Urban residents were more likely to be positive toward ungulates, elephants, and primates when probability of damage was low, but not when it was high. Commercial and communal farmers had a higher probability of being positive toward ungulates, primates, and elephants irrespective of probability of experiencing damage. Taxonomic bias may therefore be important. Identifying the distinct factors explaining these attitudes and the specific contexts in which they operate, inclusive of the species causing damage, will be essential for prioritizing conservation investments. © 2014 The Authors. Conservation Biology published by Wiley Periodicals, Inc., on behalf of the Society for Conservation Biology.
Van den Heuvel, Frank
2014-01-01
Purpose: To present a closed formalism for calculating charged particle radiation damage induced in DNA. The formalism is valid for all types of charged particles and, due to its closed nature, is suited to provide fast conversion of dose to DNA damage. Methods: The induction of double strand breaks in DNA strings residing in irradiated cells is quantified using a single particle model. This leads to a proposal to use the cumulative Cauchy distribution to express the mix of high- and low-LET type damage probability generated by a single particle. A microscopic phenomenological Monte Carlo code is used to fit the parameters of the model as a function of kinetic energy, related to the damage to a DNA molecule embedded in a cell. The model is applied to four particles: electrons, protons, alpha particles, and carbon ions. A geometric interpretation of this observation, using the impact ionization mean free path as a quantifier, allows extension of the model to very low energies. Results: The mathematical expression describes the model adequately according to a chi-square test. This applies to all particle types, with an almost perfect fit for protons, while the other particles show some discrepancies at very low energies. The implementation calculating a strict version of the RBE based on complex damage alone is corroborated by experimental data from the measured RBE. The geometric interpretation generates a unique dimensionless parameter for each type of charged particle. In addition, it predicts a distribution of DNA damage which differs from current models. PMID:25340636
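A minimal sketch of the cumulative Cauchy distribution proposed above for the mix of high- and low-LET damage probability. The location x0 and scale gamma below are hypothetical parameters standing in for the energy-dependent fits of the paper.

import numpy as np

def cumulative_cauchy(x, x0, gamma):
    """P(x) = 1/2 + (1/pi) * arctan((x - x0) / gamma)."""
    return 0.5 + np.arctan((x - x0) / gamma) / np.pi

let_values = np.array([1.0, 10.0, 50.0, 100.0])   # keV/micron, illustrative
p_complex = cumulative_cauchy(let_values, x0=20.0, gamma=15.0)
print(p_complex)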
Laser Hazards and Safety in the Military Environment
1975-08-01
The irradiance distribution is an important factor in determining the time/temperature relationship involved in thermal damage studies of the eye. Studies quantifying this effect resulted in statistical probabilities of finding an irradiance "hotspot" a certain factor above the average.
Reed, H; Leckey, Cara A C; Dick, A; Harvey, G; Dobson, J
2018-01-01
Ultrasonic damage detection and characterization is commonly used in nondestructive evaluation (NDE) of aerospace composite components. In recent years there has been an increased development of guided wave based methods. In real materials and structures, these dispersive waves result in complicated behavior in the presence of complex damage scenarios. Model-based characterization methods utilize accurate three dimensional finite element models (FEMs) of guided wave interaction with realistic damage scenarios to aid in defect identification and classification. This work describes an inverse solution for realistic composite damage characterization by comparing the wavenumber-frequency spectra of experimental and simulated ultrasonic inspections. The composite laminate material properties are first verified through a Bayesian solution (Markov chain Monte Carlo), enabling uncertainty quantification surrounding the characterization. A study is undertaken to assess the efficacy of the proposed damage model and comparative metrics between the experimental and simulated output. The FEM is then parameterized with a damage model capable of describing the typical complex damage created by impact events in composites. The damage is characterized through a transdimensional Markov chain Monte Carlo solution, enabling a flexible damage model capable of adapting to the complex damage geometry investigated here. The posterior probability distributions of the individual delamination petals as well as the overall envelope of the damage site are determined. Copyright © 2017 Elsevier B.V. All rights reserved.
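A generic Metropolis-Hastings sketch of the Bayesian parameter estimation step described above, in which simulated output is compared against measurement. The scalar forward model and Gaussian likelihood below stand in for the wavenumber-frequency comparison and finite element model of the paper; all names and values are hypothetical.

import numpy as np

rng = np.random.default_rng(3)

observed = 2.5                                   # stand-in "measured" feature

def misfit(theta):
    simulated = theta ** 2                       # placeholder forward model
    return (simulated - observed) ** 2

def log_posterior(theta):
    return -0.5 * misfit(theta) / 0.1 ** 2       # Gaussian likelihood, flat prior

theta = 1.0
chain = []
for _ in range(20_000):
    proposal = theta + rng.normal(scale=0.1)
    # accept with probability min(1, posterior ratio)
    if np.log(rng.uniform()) < log_posterior(proposal) - log_posterior(theta):
        theta = proposal
    chain.append(theta)

print("posterior mean:", np.mean(chain[5000:]))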
Image-Based Modeling Reveals Dynamic Redistribution of DNA Damage into Nuclear Sub-Domains
DOE Office of Scientific and Technical Information (OSTI.GOV)
Costes, Sylvain V.; Ponomarev, Artem; Chen, James L.; Nguyen, David; Cucinotta, Francis A.
2007-08-03
Several proteins involved in the response to DNA double strand breaks (DSB) form microscopically visible nuclear domains, or foci, after exposure to ionizing radiation. Radiation-induced foci (RIF) are believed to be located where DNA damage occurs. To test this assumption, we analyzed the spatial distribution of 53BP1, phosphorylated ATM, and gammaH2AX RIF in cells irradiated with high linear energy transfer (LET) radiation and low LET. Since energy is randomly deposited along high-LET particle paths, RIF along these paths should also be randomly distributed. The probability to induce DSB can be derived from DNA fragment data measured experimentally by pulsed-field gel electrophoresis. We used this probability in Monte Carlo simulations to predict DSB locations in synthetic nuclei geometrically described by a complete set of human chromosomes, taking into account microscope optics from real experiments. As expected, simulations produced DNA-weighted random (Poisson) distributions. In contrast, the distributions of RIF obtained as early as 5 min after exposure to high LET (1 GeV/amu Fe) were non-random. This deviation from the expected DNA-weighted random pattern can be further characterized by "relative DNA image measurements." This novel imaging approach shows that RIF were located preferentially at the interface between high and low DNA density regions, and were more frequent than predicted in regions with lower DNA density. The same preferential nuclear location was also measured for RIF induced by 1 Gy of low-LET radiation. This deviation from random behavior was evident only 5 min after irradiation for phosphorylated ATM RIF, while gammaH2AX and 53BP1 RIF showed pronounced deviations up to 30 min after exposure. These data suggest that DNA damage induced foci are restricted to certain regions of the nucleus of human epithelial cells. It is possible that DNA lesions are collected in these nuclear sub-domains for more efficient repair.
Chromosome Model reveals Dynamic Redistribution of DNA Damage into Nuclear Sub-domains
NASA Technical Reports Server (NTRS)
Costes, Sylvain V.; Ponomarev, Artem; Chen, James L.; Cucinotta, Francis A.; Barcellos-Hoff, Helen
2007-01-01
Several proteins involved in the response to DNA double strand breaks (DSB) form microscopically visible nuclear domains, or foci, after exposure to ionizing radiation. Radiation-induced foci (RIF) are believed to be located where DNA damage is induced. To test this assumption, we analyzed the spatial distribution of 53BP1, phosphorylated ATM and gammaH2AX RIF in cells irradiated with high linear energy transfer (LET) radiation. Since energy is randomly deposited along high-LET particle paths, RIF along these paths should also be randomly distributed. The probability to induce DSB can be derived from DNA fragment data measured experimentally by pulsed-field gel electrophoresis. We used this probability in Monte Carlo simulations to predict DSB locations in synthetic nuclei geometrically described by a complete set of human chromosomes, taking into account microscope optics from real experiments. As expected, simulations produced DNA-weighted random (Poisson) distributions. In contrast, the distributions of RIF obtained as early as 5 min after exposure to high LET (1 GeV/amu Fe) were non-random. This deviation from the expected DNA-weighted random pattern can be further characterized by relative DNA image measurements. This novel imaging approach shows that RIF were located preferentially at the interface between high and low DNA density regions, and were more frequent in regions with lower density DNA than predicted. This deviation from random behavior was more pronounced within the first 5 min following irradiation for phosphorylated ATM RIF, while gammaH2AX and 53BP1 RIF showed very pronounced deviation up to 30 min after exposure. These data suggest the existence of repair centers in mammalian epithelial cells. These centers would be nuclear sub-domains where DNA lesions would be collected for more efficient repair.
NASA Astrophysics Data System (ADS)
Chang, Anteng; Li, Huajun; Wang, Shuqing; Du, Junfeng
2017-08-01
Both wave-frequency (WF) and low-frequency (LF) components of mooring tension are in principle non-Gaussian due to nonlinearities in the dynamic system. This paper conducts a comprehensive investigation of applicable probability density functions (PDFs) of mooring tension amplitudes used to assess mooring-line fatigue damage via the spectral method. Short-term statistical characteristics of mooring-line tension responses are firstly investigated, in which the discrepancy arising from Gaussian approximation is revealed by comparing kurtosis and skewness coefficients. Several distribution functions based on present analytical spectral methods are selected to express the statistical distribution of the mooring-line tension amplitudes. Results indicate that the Gamma-type distribution and a linear combination of Dirlik and Tovo-Benasciutti formulas are suitable for separate WF and LF mooring tension components. A novel parametric method based on nonlinear transformations and stochastic optimization is then proposed to increase the effectiveness of mooring-line fatigue assessment due to non-Gaussian bimodal tension responses. Using time domain simulation as a benchmark, its accuracy is further validated using a numerical case study of a moored semi-submersible platform.
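A quick check of the non-Gaussianity noted above: compare the skewness and excess kurtosis of a tension record against the Gaussian values of zero. The synthetic quadratic transformation below is only a stand-in for the low-frequency nonlinearity of a real mooring response.

import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
gaussian = rng.normal(size=200_000)
tension = gaussian + 0.15 * gaussian ** 2        # weakly nonlinear, non-Gaussian record

print("skewness:", stats.skew(tension))          # 0 for a Gaussian process
print("excess kurtosis:", stats.kurtosis(tension))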
Analytic barrage attack model. Final report, January 1986-January 1989
DOE Office of Scientific and Technical Information (OSTI.GOV)
St Ledger, J.W.; Naegeli, R.E.; Dowden, N.A.
An analytic model is developed for a nuclear barrage attack, assuming weapons with no aiming error and a cookie-cutter damage function. The model is then extended with approximations for the effects of aiming error and distance damage sigma. The final result is a fast running model which calculates probability of damage for a barrage attack. The probability of damage is accurate to within seven percent or better, for weapon reliabilities of 50 to 100 percent, distance damage sigmas of 0.5 or less, and zero to very large circular error probabilities. FORTRAN 77 coding is included in the report for the analytic model and for a numerical model used to check the analytic results.
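A hedged sketch of a single-weapon damage probability with a cookie-cutter damage function (damage radius R_d) and circular Gaussian aiming error, the textbook building block that the barrage model above extends; this is not the report's FORTRAN implementation. CEP relates to the aiming standard deviation by sigma = CEP / 1.1774.

import math

def single_shot_damage_probability(damage_radius, cep):
    """P(damage) = 1 - exp(-R_d^2 / (2 sigma^2)) for a circular Gaussian impact distribution."""
    sigma = cep / 1.1774
    return 1.0 - math.exp(-damage_radius ** 2 / (2.0 * sigma ** 2))

print(single_shot_damage_probability(damage_radius=300.0, cep=250.0))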
NASA Astrophysics Data System (ADS)
Rambalakos, Andreas
Current federal aviation regulations in the United States and around the world mandate that aircraft structures meet damage tolerance requirements throughout their service life. These requirements imply that a damaged aircraft structure must maintain adequate residual strength to sustain its integrity, which is ensured by a continuous inspection program. The multifold objective of this research is to develop a methodology based on a direct Monte Carlo simulation process and to assess the reliability of aircraft structures. Initially, the structure is modeled as a parallel system with active redundancy, comprised of elements with uncorrelated (statistically independent) strengths and subjected to an equal load distribution. Closed-form expressions for the system capacity cumulative distribution function (CDF) are developed by expanding the current expression for the capacity CDF of a parallel system of three elements to parallel systems of up to six elements. These newly developed expressions are used to check the accuracy of the implementation of a Monte Carlo simulation algorithm that determines the probability of failure of a parallel system comprised of an arbitrary number of statistically independent elements. The second objective of this work is to compute the probability of failure of a fuselage skin lap joint under static load conditions through a Monte Carlo simulation scheme, by utilizing the residual strength of the fasteners subjected to various initial load distributions and then to a new unequal load distribution resulting from subsequent sequential fastener failures. The final and main objective of this thesis is to present a methodology for computing the gradual deterioration of the reliability of an aircraft structural component by employing a direct Monte Carlo simulation approach. The uncertainties associated with the time to crack initiation, the probability of crack detection, the exponent in the crack propagation rate (Paris equation) and the yield strength of the elements are considered in the analytical model. The structural component is assumed to consist of a prescribed number of elements. This Monte Carlo simulation methodology is used to determine the required non-periodic inspections so that the reliability of the structural component does not fall below a prescribed minimum level. A sensitivity analysis is conducted to determine the effect of three key parameters on the specification of the non-periodic inspection intervals: namely, a parameter associated with the time to crack initiation, the applied nominal stress fluctuation and the minimum acceptable reliability level.
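A Monte Carlo sketch of the parallel-system failure probability described above: statistically independent elements share the load equally, and surviving elements pick up the load shed by failed ones. The strength distribution and load level are illustrative assumptions, not the thesis data.

import numpy as np

rng = np.random.default_rng(5)

def system_fails(strengths, total_load):
    """Equal load sharing with sequential failure of the weakest elements."""
    remaining = np.sort(strengths)
    while remaining.size > 0:
        per_element = total_load / remaining.size
        if remaining[0] >= per_element:
            return False                 # all surviving elements carry their share
        remaining = remaining[1:]        # weakest element fails, load redistributes
    return True

k, total_load, trials = 6, 5.0, 20_000
strengths = rng.normal(loc=1.2, scale=0.2, size=(trials, k))
p_fail = np.mean([system_fails(s, total_load) for s in strengths])
print(f"estimated probability of failure: {p_fail:.4f}")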
Fatigue damage prognosis using affine arithmetic
NASA Astrophysics Data System (ADS)
Gbaguidi, Audrey; Kim, Daewon
2014-02-01
Among the essential steps to be taken in structural health monitoring systems, damage prognosis is the least investigated, due to the complexity of the uncertainties involved. This paper presents the possibility of using Affine Arithmetic for uncertainty propagation of crack damage in damage prognosis. The structures examined are thin rectangular plates made of titanium alloys with central mode I cracks and a composite plate with an internal delamination caused by mixed mode I and II fracture modes, under a harmonic uniaxial loading condition. Model-based crack growth rates are considered, using the Paris-Erdogan law for the isotropic plates and the delamination growth law proposed by Kardomateas for the composite plate. The parameters of both models are treated as uncertain, with their uncertainties defined by intervals instead of probability distributions. A Monte Carlo method is also applied to check whether Affine Arithmetic (AA) leads to tight bounds on the lifetime of the structure.
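An interval sketch of the uncertainty propagation idea above, applied to the Paris-Erdogan law da/dN = C * (DeltaK)^m with DeltaK = Y * Dsigma * sqrt(pi * a). Plain interval arithmetic is used here in place of affine arithmetic, and the parameter intervals are hypothetical.

import math

C_lo, C_hi = 1.0e-11, 2.0e-11        # assumed interval on C (m/cycle per (MPa sqrt(m))^m)
m_lo, m_hi = 2.8, 3.2                # assumed interval on m
Y, dsigma = 1.0, 100.0               # geometry factor, stress range (MPa)

a_lo = a_hi = 1.0e-3                 # initial crack length (m)
for _ in range(50_000):              # load cycles
    dK_lo = Y * dsigma * math.sqrt(math.pi * a_lo)
    dK_hi = Y * dsigma * math.sqrt(math.pi * a_hi)
    # growth-rate bounds: da/dN is increasing in C, in dK, and in m when dK > 1 MPa sqrt(m)
    a_lo += C_lo * dK_lo ** m_lo
    a_hi += C_hi * dK_hi ** m_hi

print(f"crack length after 50,000 cycles: [{a_lo * 1e3:.2f}, {a_hi * 1e3:.2f}] mm")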
Seismic risk assessment of Navarre (Northern Spain)
NASA Astrophysics Data System (ADS)
Gaspar-Escribano, J. M.; Rivas-Medina, A.; García Rodríguez, M. J.; Benito, B.; Tsige, M.; Martínez-Díaz, J. J.; Murphy, P.
2009-04-01
The RISNA project, financed by the Emergency Agency of Navarre (Northern Spain), aims at assessing the seismic risk of the entire region. The final goal of the project is the definition of emergency plans for future earthquakes. With this purpose, four main topics are covered: seismic hazard characterization, geotechnical classification, vulnerability assessment and damage estimation to structures and exposed population. A geographic information system is used to integrate, analyze and represent all information collected in the different phases of the study. Expected ground motions on rock conditions with a 90% probability of non-exceedance in an exposure time of 50 years are determined following a Probabilistic Seismic Hazard Assessment (PSHA) methodology that includes a logic tree with different ground motion and source zoning models. As the region under study is located on the boundary between Spain and France, an effort is required to collect and homogenise seismological data from different national and regional agencies. A new homogenised seismic catalogue, merging data from Spanish, French, Catalonian and international agencies and establishing correlations between different magnitude scales, is developed. In addition, a new seismic zoning model focused on the study area is proposed. Results show that the highest ground motions on rock conditions are expected in the northeastern part of the region, decreasing southwards. Seismic hazard can be characterized as low-to-moderate. A geotechnical classification of the entire region is developed based on surface geology, available borehole data and morphotectonic constraints. Frequency-dependent amplification factors, consistent with code values, are proposed. The northern and southern parts of the region are characterized by stiff and soft soils respectively, with the softest soils located along river valleys. Seismic hazard maps including soil effects are obtained by applying these factors to the seismic hazard maps on rock conditions (for the same probability level). Again, the highest hazard is found in the northeastern part of the region; the lowest hazard is obtained along major river valleys. The vulnerability assessment of the Navarre building stock is accomplished using as proxy a combination of building age, location, number of floors and the implementation of building codes. Field surveys help constrain the extent of traditional and technological construction types. The vulnerability characterization is carried out following three methods: the European Macroseismic Scale (EMS 98), the RISK-UE vulnerability index and the capacity spectrum method implemented in Hazus. Vulnerability distribution maps for each municipality of Navarre are provided, adapted to the EMS98 vulnerability classes. The vulnerability of Navarre is medium to high, except for recent urban, highly populated developments. For each vulnerability class and expected ground motion, the damage distribution is estimated by means of damage probability matrices. Several damage indexes, embracing relative and absolute damage estimates, are used. Expected average damage is low. Whereas the largest numbers of damaged structures are found in big cities, the highest percentages are obtained in some municipalities of northeastern Navarre. Additionally, expected percentages and numbers of people affected by earthquake damage are calculated for each municipality. Expected numbers of affected people are low, reflecting the low expected damage degree.
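A minimal sketch of the damage-probability-matrix step described above: for a given macroseismic intensity and EMS-98 vulnerability class, a row of the matrix gives the probability of each damage grade (D0 to D5), from which an expected (mean) damage grade follows. The matrix values below are hypothetical, not the RISNA results.

import numpy as np

damage_grades = np.arange(6)                       # D0 (none) .. D5 (collapse)
# P(damage grade | intensity VII, vulnerability class B), illustrative only
dpm_row = np.array([0.30, 0.35, 0.20, 0.10, 0.04, 0.01])

mean_damage_grade = np.sum(damage_grades * dpm_row)
print(f"expected damage grade: {mean_damage_grade:.2f}")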
Approach to failure in porous granular materials under compression
NASA Astrophysics Data System (ADS)
Kun, Ferenc; Varga, Imre; Lennartz-Sassinek, Sabine; Main, Ian G.
2013-12-01
We investigate the approach to catastrophic failure in a model porous granular material undergoing uniaxial compression. A discrete element computational model is used to simulate both the microstructure of the material and the complex dynamics and feedbacks involved in local fracturing and the production of crackling noise. Under strain-controlled loading, microcracks initially nucleate in an uncorrelated way all over the sample. As loading proceeds the damage localizes into a narrow damage band inclined at 30°-45° to the load direction. Inside the damage band the material is crushed into a poorly sorted mixture of mainly fine powder hosting some larger fragments. The mass probability density distribution of particles in the damage zone is a power law of exponent 2.1, similar to a value of 1.87 inferred from observations of the length distribution of wear products (gouge) in natural and laboratory faults. Dynamic bursts of radiated energy, analogous to acoustic emissions observed in laboratory experiments on porous sedimentary rocks, are identified as correlated trails or cascades of local ruptures that emerge from the stress redistribution process. As the system approaches macroscopic failure consecutive bursts become progressively more correlated. Their size distribution is also a power law, with an equivalent Gutenberg-Richter b value of 1.22 averaged over the whole test, ranging from 3 to 0.5 at the time of failure, all similar to those observed in laboratory tests on granular sandstone samples. The formation of the damage band itself is marked by a decrease in the average distance between consecutive bursts and an emergent power-law correlation integral of event locations with a correlation dimension of 2.55, also similar to those observed in the laboratory (between 2.75 and 2.25).
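A sketch of the maximum-likelihood estimate of a power-law exponent, of the kind used to characterize the fragment-mass and burst-size distributions above (continuous power law above a lower cutoff x_min). The synthetic sample below is illustrative.

import numpy as np

rng = np.random.default_rng(6)
x_min, true_exponent = 1.0, 2.1
u = rng.uniform(size=10_000)
masses = x_min * (1.0 - u) ** (-1.0 / (true_exponent - 1.0))   # inverse-CDF power-law sample

# MLE for a continuous power law: alpha_hat = 1 + n / sum(ln(x / x_min))
alpha_hat = 1.0 + len(masses) / np.sum(np.log(masses / x_min))
print(f"estimated exponent: {alpha_hat:.2f}")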
A Computationally-Efficient Inverse Approach to Probabilistic Strain-Based Damage Diagnosis
NASA Technical Reports Server (NTRS)
Warner, James E.; Hochhalter, Jacob D.; Leser, William P.; Leser, Patrick E.; Newman, John A
2016-01-01
This work presents a computationally-efficient inverse approach to probabilistic damage diagnosis. Given strain data at a limited number of measurement locations, Bayesian inference and Markov Chain Monte Carlo (MCMC) sampling are used to estimate probability distributions of the unknown location, size, and orientation of damage. Substantial computational speedup is obtained by replacing a three-dimensional finite element (FE) model with an efficient surrogate model. The approach is experimentally validated on cracked test specimens where full field strains are determined using digital image correlation (DIC). Access to full field DIC data allows for testing of different hypothetical sensor arrangements, facilitating the study of strain-based diagnosis effectiveness as the distance between damage and measurement locations increases. The ability of the framework to effectively perform both probabilistic damage localization and characterization in cracked plates is demonstrated and the impact of measurement location on uncertainty in the predictions is shown. Furthermore, the analysis time to produce these predictions is orders of magnitude less than a baseline Bayesian approach with the FE method by utilizing surrogate modeling and effective numerical sampling approaches.
Revision of laser-induced damage threshold evaluation from damage probability data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bataviciute, Gintare; Grigas, Povilas; Smalakys, Linas
2013-04-15
In this study, the applicability of the commonly used Damage Frequency Method (DFM) is addressed in the context of Laser-Induced Damage Threshold (LIDT) testing with pulsed lasers. A simplified computer model representing the statistical interaction between laser irradiation and randomly distributed damage precursors is applied for Monte Carlo experiments. The reproducibility of the LIDT predicted from DFM is examined under both idealized and realistic laser irradiation conditions by performing numerical 1-on-1 tests. The widely accepted linear fitting resulted in systematic errors when estimating the LIDT and its error bars. For the same purpose, a Bayesian approach was proposed. A novel concept of parametric regression based on a varying kernel and a maximum likelihood fitting technique is introduced and studied. This approach exhibited clear advantages over conventional linear fitting and led to more reproducible LIDT evaluation. Furthermore, LIDT error bars are obtained as a natural outcome of the parametric fitting and exhibit realistic values. The proposed technique has been validated on two conventionally polished fused silica samples (355 nm, 5.7 ns).
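A minimal sketch of the Damage Frequency Method discussed above: for each fluence level the damage probability is the damaged fraction of tested sites, and a linear fit extrapolated to zero probability gives the LIDT estimate (the fitting step the study critiques). The test data below are synthetic placeholders.

import numpy as np

fluences = np.array([2.0, 4.0, 6.0, 8.0, 10.0])          # J/cm^2
sites_tested = np.array([10, 10, 10, 10, 10])
sites_damaged = np.array([0, 2, 5, 8, 10])

p_damage = sites_damaged / sites_tested
mask = (p_damage > 0) & (p_damage < 1)                    # fit only the transition region
slope, intercept = np.polyfit(fluences[mask], p_damage[mask], 1)
lidt_estimate = -intercept / slope                        # fluence where the fit crosses zero probability
print(f"LIDT estimate ~ {lidt_estimate:.2f} J/cm^2")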
Size distribution of submarine landslides along the U.S. Atlantic margin
Chaytor, J.D.; ten Brink, Uri S.; Solow, A.R.; Andrews, B.D.
2009-01-01
Assessment of the probability for destructive landslide-generated tsunamis depends on the knowledge of the number, size, and frequency of large submarine landslides. This paper investigates the size distribution of submarine landslides along the U.S. Atlantic continental slope and rise using the size of the landslide source regions (landslide failure scars). Landslide scars along the margin identified in a detailed bathymetric Digital Elevation Model (DEM) have areas that range between 0.89 km2 and 2410 km2 and volumes between 0.002 km3 and 179 km3. The area to volume relationship of these failure scars is almost linear (inverse power-law exponent close to 1), suggesting a fairly uniform failure thickness of a few 10s of meters in each event, with only rare, deep excavating landslides. The cumulative volume distribution of the failure scars is very well described by a log-normal distribution rather than by an inverse power-law, the most commonly used distribution for both subaerial and submarine landslides. A log-normal distribution centered on a volume of 0.86 km3 may indicate that landslides preferentially mobilize a moderate amount of material (on the order of 1 km3), rather than large landslides or very small ones. Alternatively, the log-normal distribution may reflect an inverse power law distribution modified by a size-dependent probability of observing landslide scars in the bathymetry data. If the latter is the case, an inverse power-law distribution with an exponent of 1.3 ± 0.3, modified by a size-dependent conditional probability of identifying more failure scars with increasing landslide size, fits the observed size distribution. This exponent value is similar to the predicted exponent of 1.2 ± 0.3 for subaerial landslides in unconsolidated material. Both the log-normal and modified inverse power-law distributions of the observed failure scar volumes suggest that large landslides, which have the greatest potential to generate damaging tsunamis, occur infrequently along the margin. © 2008 Elsevier B.V.
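A sketch of fitting the log-normal volume distribution discussed above and reading off its central volume; the synthetic sample stands in for the mapped failure-scar volumes.

import numpy as np

rng = np.random.default_rng(7)
volumes = rng.lognormal(mean=np.log(0.86), sigma=1.5, size=500)   # km^3, synthetic scar volumes

log_v = np.log(volumes)
mu_hat, sigma_hat = log_v.mean(), log_v.std(ddof=1)               # log-normal parameter estimates
print(f"log-normal centre: {np.exp(mu_hat):.2f} km^3, shape sigma: {sigma_hat:.2f}")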
NASA Astrophysics Data System (ADS)
Wilson, Michael; Price, D.; Strohecker, Steve
1994-09-01
Germanium witness samples were impacted with the NAWCADWAR modified Cambridge liquid jet device introducing varying levels of damage about the center of each sample. Surface damage statistics were collected, scatter measurements were made at 0.67 micrometers and the samples were failed in tension using a bi-axial flexure test setup. The level and character of the damage was correlated with the reflected scatter measurements as a function of local stress and flaw size distribution. Bi-axial flexure data was analyzed to predict fracture stress and the probability of failure of the germanium samples. The mechanical data were then correlated with the scatter data in order to correlate the BRDF with the material failure. The BRDF measurements were taken in several different orientations in order to study the differences in scatter character for the in-plane and out-of-plane conditions.
Osawa, Takeshi; Okawa, Shigenori; Kurokawa, Shunji; Ando, Shinichiro
2016-12-01
In this study, we propose a method for estimating the risk of agricultural damage caused by an invasive species when species-specific information is lacking. We defined the "risk" as the product of the invasion probability and the area of potentially damaged crop for production. As a case study, we estimated the risk imposed by an invasive weed, Sicyos angulatus, based on simple cellular simulations and governmental data on the area of crop that could potentially be damaged in Miyagi Prefecture, Japan. Simulation results revealed that the current distribution range was sufficiently accurate for practical purposes. Using these results and records of crop areas, we present risk maps for S. angulatus in agricultural fields. Managers will be able to use these maps to rapidly establish a management plan with minimal cost. Our approach will be valuable for establishing a management plan before or during the early stages of invasion.
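A minimal sketch of the risk definition used above: per grid cell, risk is the product of the simulated invasion probability and the area of crop that could be damaged. The arrays are illustrative placeholders for the spread-simulation output and the governmental crop statistics.

import numpy as np

invasion_probability = np.array([[0.1, 0.4],
                                 [0.7, 0.2]])        # per-cell probability from the spread model
crop_area_ha = np.array([[120.0, 35.0],
                         [60.0, 300.0]])             # potentially damaged crop area per cell (ha)

risk = invasion_probability * crop_area_ha           # expected exposed crop area per cell (ha)
print(risk)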
NASA Technical Reports Server (NTRS)
Huddleston, Lisa L.; Roeder, William P.; Merceret, Francis J.
2011-01-01
A new technique has been developed to estimate the probability that a nearby cloud-to-ground lightning stroke was within a specified radius of any point of interest. This process uses the bivariate Gaussian distribution of probability density provided by the current lightning location error ellipse for the most likely location of a lightning stroke and integrates it to determine the probability that the stroke is inside any specified radius of any location, even if that location is not centered on or even within the location error ellipse. This technique is adapted from a method of calculating the probability of debris collision with spacecraft. Such a technique is important in spaceport processing activities because it allows engineers to quantify the risk of induced current damage to critical electronics due to nearby lightning strokes. This technique was tested extensively and is now in use by space launch organizations at Kennedy Space Center and Cape Canaveral Air Force Station. Future applications could include forensic meteorology.
NASA Technical Reports Server (NTRS)
Huddleston, Lisa; Roeder, William P.; Merceret, Francis J.
2011-01-01
A new technique has been developed to estimate the probability that a nearby cloud-to-ground lightning stroke was within a specified radius of any point of interest. This process uses the bivariate Gaussian distribution of probability density provided by the current lightning location error ellipse for the most likely location of a lightning stroke and integrates it to determine the probability that the stroke is inside any specified radius of any location, even if that location is not centered on or even within the location error ellipse. This technique is adapted from a method of calculating the probability of debris collision with spacecraft. Such a technique is important in spaceport processing activities because it allows engineers to quantify the risk of induced current damage to critical electronics due to nearby lightning strokes. This technique was tested extensively and is now in use by space launch organizations at Kennedy Space Center and Cape Canaveral Air Force Station. Future applications could include forensic meteorology.
Baena, Martha L.; Escobar, Federico; Halffter, Gonzalo; García–Chávez, Juan H.
2015-01-01
Omorgus suberosus (Fabricius, 1775) has been identified as a potential predator of the eggs of the turtle Lepidochelys olivacea (Eschscholtz, 1829) on one of the main turtle nesting beaches in the world, La Escobilla in Oaxaca, Mexico. This study presents an analysis of the spatio–temporal distribution of the beetle on this beach (in areas of high and low density of L. olivacea nests over two arrival seasons) and an evaluation, under laboratory conditions, of the probability of damage to the turtle eggs by this beetle. O. suberosus adults and larvae exhibited an aggregated pattern at both turtle nest densities; however, aggregation was greater in areas of low nest density, where we found the highest proportion of damaged eggs. Also, there were fluctuations in the temporal distribution of the adult beetles following the arrival of the turtles on the beach. Under laboratory conditions, the beetles quickly damaged both dead eggs and a mixture of live and dead eggs, but were found to consume live eggs more slowly. This suggests that O. suberosus may be recycling organic material; however, its consumption of live eggs may be sufficient in some cases to interrupt the incubation period of the turtle. We intend to apply these results when making decisions regarding the L. olivacea nests on La Escobilla Beach, one of the most important sites for the conservation of this species. PMID:26422148
A Probabilistic, Facility-Centric Approach to Lightning Strike Location
NASA Technical Reports Server (NTRS)
Huddleston, Lisa L.; Roeder, William P.; Merceret, Francis J.
2012-01-01
A new probabilistic facility-centric approach to lightning strike location has been developed. This process uses the bivariate Gaussian distribution of probability density provided by the current lightning location error ellipse for the most likely location of a lightning stroke and integrates it to determine the probability that the stroke is inside any specified radius of any location, even if that location is not centered on or even within the location error ellipse. This technique is adapted from a method of calculating the probability of debris collision with spacecraft. Such a technique is important in spaceport processing activities because it allows engineers to quantify the risk of induced current damage to critical electronics due to nearby lightning strokes. This technique was tested extensively and is now in use by space launch organizations at Kennedy Space Center and Cape Canaveral Air Force Station. Future applications could include forensic meteorology.
McClure, Meredith L; Burdett, Christopher L; Farnsworth, Matthew L; Lutman, Mark W; Theobald, David M; Riggs, Philip D; Grear, Daniel A; Miller, Ryan S
2015-01-01
Wild pigs (Sus scrofa), also known as wild swine, feral pigs, or feral hogs, are one of the most widespread and successful invasive species around the world. Wild pigs have been linked to extensive and costly agricultural damage and present a serious threat to plant and animal communities due to their rooting behavior and omnivorous diet. We modeled the current distribution of wild pigs in the United States to better understand the physiological and ecological factors that may determine their invasive potential and to guide future study and eradication efforts. Using national-scale wild pig occurrence data reported between 1982 and 2012 by wildlife management professionals, we estimated the probability of wild pig occurrence across the United States using a logistic discrimination function and environmental covariates hypothesized to influence the distribution of the species. Our results suggest the distribution of wild pigs in the U.S. was most strongly limited by cold temperatures and availability of water, and that they were most likely to occur where potential home ranges had higher habitat heterogeneity, providing access to multiple key resources including water, forage, and cover. High probability of occurrence was also associated with frequent high temperatures, up to a high threshold. However, this pattern is driven by pigs' historic distribution in warm climates of the southern U.S. Further study of pigs' ability to persist in cold northern climates is needed to better understand whether low temperatures actually limit their distribution. Our model highlights areas at risk of invasion as those with habitat conditions similar to those found in pigs' current range that are also near current populations. This study provides a macro-scale approach to generalist species distribution modeling that is applicable to other generalist and invasive species.
NASA Astrophysics Data System (ADS)
Phillips, R. C.; Samadi, S. Z.; Meadows, M. E.
2018-07-01
This paper examines the frequency, distribution tails, and peak-over-threshold (POT) behavior of extreme floods through an analysis that centers on the October 2015 flooding in North Carolina (NC) and South Carolina (SC), United States (US). The most striking features of the October 2015 flooding were a short time to peak (Tp) and a multi-hour continuous flood peak which caused intense and widespread damage to human lives, properties, and infrastructure. The 2015 flooding was produced by a sequence of intense rainfall events which originated from category 4 hurricane Joaquin over a period of four days. Here, the probability distribution and distribution parameters (i.e., location, scale, and shape) of floods were investigated by comparing the upper part of the empirical distributions of the annual maximum flood (AMF) and POT series with light- to heavy-tailed theoretical distributions: Fréchet, Pareto, Gumbel, Weibull, Beta, and Exponential. Specifically, four sets of U.S. Geological Survey (USGS) gauging data from the central Carolinas with record lengths of approximately 65-125 years were used. The analysis suggests that heavier-tailed distributions are in better agreement with the POT data, and to a lesser extent the AMF data, than the more often used exponential (light-tailed) probability distributions. Further, the threshold selection and record length affect the heaviness of the tail and fluctuations of the parent distributions. The shape parameter and its evolution over the period of record play a critical and poorly understood role in determining the scaling of flood response to intense rainfall.
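A sketch of the peak-over-threshold tail analysis described above: exceedances over a chosen threshold are fitted with a Generalized Pareto distribution, whose shape parameter indicates how heavy the tail is (a positive shape suggests a heavy, Pareto-like tail). Synthetic discharges stand in for the USGS records.

import numpy as np
from scipy import stats

rng = np.random.default_rng(8)
peak_flows = rng.lognormal(mean=6.0, sigma=0.8, size=5_000)    # synthetic discharges

threshold = np.quantile(peak_flows, 0.95)                      # POT threshold choice (assumed)
exceedances = peak_flows[peak_flows > threshold] - threshold

shape, loc, scale = stats.genpareto.fit(exceedances, floc=0.0)
print(f"GPD shape (tail index): {shape:.3f}, scale: {scale:.1f}")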
MIZUMACHI, ERI; MORI, AKIRA; OSAWA, NAOYA; AKIYAMA, REIKO; TOKUCHI, NAOKO
2006-01-01
• Background and Aims Plants have the ability to compensate for damage caused by herbivores. This is important to plant growth, because a plant cannot always avoid damage, even if it has developed defence mechanisms against herbivores. In previous work, we elucidated the herbivory-induced compensatory response of Quercus (at both the individual shoot and whole sapling levels) in both low- and high-nutrient conditions throughout one growing season. In this study, we determine how the compensatory growth of Quercus serrata saplings is achieved at different nutrient levels. • Methods Quercus serrata saplings were grown under controlled conditions. Length, number of leaves and percentage of leaf area lost on all extension units (EUs) were measured. • Key Results Both the probability of flushing and the length of subsequent EUs significantly increased with an increase in the length of the parent EU. The probability of flushing increased with an increase in leaf damage of the parent EU, but the length of subsequent EUs decreased. This indicates that EU growth is fundamentally regulated at the individual EU level. The probabilities of a second and third flush were significantly higher in plants in high-nutrient soil than those in low-nutrient soil. The subsequent EUs of damaged saplings were also significantly longer at high-nutrient conditions. • Conclusions An increase in the probability of flushes in response to herbivore damage is important for damaged saplings to produce new EUs; further, shortening the length of EUs helps to effectively reproduce foliage lost by herbivory. The probability of flushing also varied according to soil nutrient levels, suggesting that the compensatory growth of individual EUs in response to local damage levels is affected by the nutrients available to the whole sapling. PMID:16709576
NASA Astrophysics Data System (ADS)
Pirotton, Michel; Stilmant, Frédéric; Erpicum, Sébastien; Dewals, Benjamin; Archambeau, Pierre
2016-04-01
Flood risk modelling has been conducted for the whole course of the river Meuse in Belgium. Major cities, such as Liege (200,000 inh.) and Namur (110,000 inh.), are located in the floodplains of river Meuse. Particular attention has been paid to uncertainty analysis and its implications for decision-making. The modelling chain contains flood frequency analysis, detailed 2D hydraulic computations, damage modelling and risk calculation. The relative importance of each source of uncertainty to the overall results uncertainty has been estimated by considering several alternate options for each step of the analysis: different distributions were considered in the flood frequency analysis; the influence of modelling assumptions and boundary conditions (e.g., steady vs. unsteady) were taken into account for the hydraulic computation; two different landuse classifications and two sets of damage functions were used; the number of exceedance probabilities involved in the risk calculation (by integration of the risk-curves) was varied. In addition, the sensitivity of the results with respect to increases in flood discharges was assessed. The considered increases are consistent with a "wet" climate change scenario for the time horizons 2021-2050 and 2071-2100 (Detrembleur et al., 2015). The results of hazard computation differ significantly between the upper and lower parts of the course of river Meuse in Belgium. In the former, inundation extents grow gradually as the considered flood discharge is increased (i.e. the exceedance probability is reduced), while in the downstream part, protection structures (mainly concrete walls) prevent inundation for flood discharges corresponding to exceedance probabilities of 0.01 and above (in the present climate). For higher discharges, large inundation extents are obtained in the floodplains. The highest values of risk (mean annual damage) are obtained in the municipalities which undergo relatively frequent flooding (upper part of the river), as well as in those of the downstream part of the Meuse in which flow depths in the urbanized floodplains are particularly high when inundation occurs. This is the case of the city of Liege, as a result of a subsidence process following former mining activities. For a given climate scenario, the uncertainty ranges affecting flood risk estimates are significant; but not so much that the results for the different municipalities would overlap substantially. Therefore, these uncertainties do not hamper prioritization in terms of allocation of risk reduction measures at the municipality level. In the present climate, the uncertainties arising from flood frequency analysis have a negligible influence in the upper part of the river, while they have a considerable impact on risk modelling in the lower part, where a threshold effect was observed due to the flood protection structures (sudden transition from no inundation to massive flooding when a threshold discharge is exceeded). Varying the number of exceedance probabilities in the integration of the risk curve has different effects for different municipalities; but it does not change the ranking of the municipalities in terms of flood risk. For the other scenarios, damage estimation contributes most to the overall uncertainties. As shown by this study, the magnitude of the uncertainty and its main origin vary in space and in time. This emphasizes the paramount importance of conducting distributed uncertainty analyses. 
In the considered study area, prioritization of risk reduction means can be reliably performed despite the modelling uncertainties. Reference Detrembleur, S., Stilmant, F., Dewals, B., Erpicum, S., Archambeau, P., & Pirotton, M. (2015). Impacts of climate change on future flood damage on the river Meuse, with a distributed uncertainty analysis. Natural Hazards, 77(3), 1533-1549. Acknowledgement Part of this research was funded through the ARC grant for Concerted Research Actions, financed by the Wallonia-Brussels Federation. It was also supported by the NWE Interreg IVB Program.
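A sketch of the risk-curve integration used in the Meuse study above: mean annual damage is the integral of damage over annual exceedance probability, approximated here with the trapezoidal rule over a small set of modelled return periods. The damage figures are illustrative, not results of the study.

import numpy as np

exceedance_prob = np.array([0.04, 0.02, 0.01, 0.004, 0.001])   # 25- to 1000-year floods
damage_meur = np.array([5.0, 40.0, 120.0, 300.0, 600.0])       # damage in million EUR

# trapezoidal integration of damage over exceedance probability (probabilities decrease)
mean_annual_damage = np.sum(0.5 * (damage_meur[1:] + damage_meur[:-1])
                            * (exceedance_prob[:-1] - exceedance_prob[1:]))
print(f"mean annual damage ~ {mean_annual_damage:.2f} MEUR/yr")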
Probability of stress-corrosion fracture under random loading
NASA Technical Reports Server (NTRS)
Yang, J. N.
1974-01-01
The mathematical formulation is based on a cumulative-damage hypothesis and experimentally determined stress-corrosion characteristics. Under stationary random loadings, the mean value and variance of the cumulative damage are obtained. The probability of stress-corrosion fracture is then evaluated using the principle of maximum entropy.
Probability of Damage to Sidewalks and Curbs by Street Trees in the Tropics
John K. Francis; Bernard R. Parresol; Juana Marin de Patino
1996-01-01
For 75 trees each of 12 species growing along streets in San Juan, Puerto Rico and Merida, Mexico, diameter at breast height and distance to sidewalk or curb were measured and damage (cracking or raising) was evaluated. Logistic analysis was used to construct a model to predict the probability of damage to the sidewalk or curb. Distance to the pavement, diameter of the tree,...
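A hedged sketch of the logistic damage model described above: the probability of sidewalk or curb damage as a function of tree diameter and distance to the pavement. The coefficients are hypothetical, not the fitted values of the study.

import math

def damage_probability(dbh_cm, distance_m, b0=-2.0, b1=0.05, b2=-1.2):
    """Logistic model: p = 1 / (1 + exp(-(b0 + b1*dbh + b2*distance)))."""
    z = b0 + b1 * dbh_cm + b2 * distance_m
    return 1.0 / (1.0 + math.exp(-z))

print(damage_probability(dbh_cm=40.0, distance_m=0.5))   # large tree close to the pavement
print(damage_probability(dbh_cm=15.0, distance_m=2.0))   # small tree far from the pavement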
Photocounting distributions for exponentially decaying sources.
Teich, M C; Card, H C
1979-05-01
Exact photocounting distributions are obtained for a pulse of light whose intensity is exponentially decaying in time, when the underlying photon statistics are Poisson. It is assumed that the starting time for the sampling interval (which is of arbitrary duration) is uniformly distributed. The probability of registering n counts in the fixed time T is given in terms of the incomplete gamma function for n ≥ 1 and in terms of the exponential integral for n = 0. Simple closed-form expressions are obtained for the count mean and variance. The results are expected to be of interest in certain studies involving spontaneous emission, radiation damage in solids, and nuclear counting. They will also be useful in neurobiology and psychophysics, since habituation and sensitization processes may sometimes be characterized by the same stochastic model.
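A numerical sketch of the counting model described above: Poisson photocounts with an exponentially decaying intensity and a uniformly distributed start time of the counting interval. The closed-form result is approximated here by averaging the Poisson distribution over the start time; all parameter values are illustrative.

import numpy as np
from scipy import stats

I0, tau = 50.0, 1.0            # initial count rate (counts/s) and decay time (s)
T = 0.5                        # counting interval (s)
t_max = 5.0 * tau              # assumed range of the uniformly distributed start time

t0 = np.linspace(0.0, t_max, 2001)
# integrated intensity over [t0, t0+T] for an exponentially decaying pulse
mean_counts = I0 * tau * (np.exp(-t0 / tau) - np.exp(-(t0 + T) / tau))

n = np.arange(0, 40)
# average the Poisson pmf over the uniform start-time grid
p_n = stats.poisson.pmf(n[:, None], mean_counts[None, :]).mean(axis=1)
print("P(n=0) =", p_n[0], " sum over n =", p_n.sum())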
Identification, prediction, and mitigation of sinkhole hazards in evaporite karst areas
Gutierrez, F.; Cooper, A.H.; Johnson, K.S.
2008-01-01
Sinkholes usually have a higher probability of occurrence and a greater genetic diversity in evaporite terrains than in carbonate karst areas. This is because evaporites have a higher solubility and, commonly, a lower mechanical strength. Subsidence damage resulting from evaporite dissolution generates substantial losses throughout the world, but the causes are only well understood in a few areas. To deal with these hazards, a phased approach is needed for sinkhole identification, investigation, prediction, and mitigation. Identification techniques include field surveys and geomorphological mapping combined with accounts from local people and historical sources. Detailed sinkhole maps can be constructed from sequential historical maps, recent topographical maps, and digital elevation models (DEMs) complemented with building-damage surveying, remote sensing, and high-resolution geodetic surveys. On a more detailed level, information from exposed paleosubsidence features (paleokarst), speleological explorations, geophysical investigations, trenching, dating techniques, and boreholes may help in investigating dissolution and subsidence features. Information on the hydrogeological pathways, including caves, springs, and swallow holes, is particularly important, especially when corroborated by tracer tests. These diverse data sources make a valuable database, the karst inventory. From this dataset, sinkhole susceptibility zonations (relative probability) may be produced based on the spatial distribution of the features and good knowledge of the local geology. Sinkhole distribution can be investigated by spatial distribution analysis techniques including studies of preferential elongation, alignment, and nearest neighbor analysis. More objective susceptibility models may be obtained by analyzing the statistical relationships between the known sinkholes and the conditioning factors. Chronological information on sinkhole formation is required to estimate the probability of occurrence of sinkholes (number of sinkholes/km2 per year). Such spatial and temporal predictions, frequently derived from limited records and based on the assumption that past sinkhole activity may be extrapolated to the future, are non-corroborated hypotheses. Validation methods allow us to assess the predictive capability of the susceptibility maps and to transform them into probability maps. Avoiding the most hazardous areas by preventive planning is the safest strategy for development in sinkhole-prone areas. Corrective measures could be applied to reduce the dissolution activity and subsidence processes. A more practical solution for safe development is to reduce the vulnerability of the structures by using subsidence-proof designs. © 2007 Springer-Verlag.
Aftershock risks such as those demonstrated by the recent events in New Zealand and Japan
Shome, Nilesh; Luco, Nicolas; Gerstenberger, Matt; Boyd, Oliver; Field, Edward; Liel, Abbie; van de Lindt, John W.
2014-01-01
Recent earthquakes in New Zealand and Japan show that it is important to consider the spatial and temporal distribution of aftershocks following large magnitude events since the probability of high intensity ground motions from aftershocks, which are capable of causing significant societal impact, can be considerable. This is due to the fact that a mainshock will have many aftershocks, some of which may occur closer to populated areas and may be large enough to cause damage. When a large magnitude event strikes a region, the chance that aftershocks will cause damage can be significant as was observed after the 2011 Tohoku and 2010 Canterbury earthquakes (e.g., damage caused by Mw6.6 April 11, 2011 Fukushima-Hamadori earthquake following Tohoku earthquake or by Mw6.3 February 22, 2011 Christchurch earthquake following Canterbury earthquake). Aftershock events may further damage already damaged buildings, thereby further complicating assessments of risk to the built environment. In this paper, the issue of aftershock risk is addressed by summarizing current research regarding: (1) aftershock hazard, (2) structural fragility/vulnerability before and after the mainshock, and (3) change in risk due to aftershocks.
Multiple Damage Progression Paths in Model-Based Prognostics
NASA Technical Reports Server (NTRS)
Daigle, Matthew; Goebel, Kai Frank
2011-01-01
Model-based prognostics approaches employ domain knowledge about a system, its components, and how they fail through the use of physics-based models. Component wear is driven by several different degradation phenomena, each resulting in its own damage progression path, overlapping to contribute to the overall degradation of the component. We develop a model-based prognostics methodology using particle filters, in which the problem of characterizing multiple damage progression paths is cast as a joint state-parameter estimation problem. The estimate is represented as a probability distribution, allowing the prediction of end of life and remaining useful life within a probabilistic framework that supports uncertainty management. We also develop a novel variance control mechanism that maintains an uncertainty bound around the hidden parameters to limit the amount of estimation uncertainty and, consequently, reduce prediction uncertainty. We construct a detailed physics-based model of a centrifugal pump, to which we apply our model-based prognostics algorithms. We illustrate the operation of the prognostic solution with a number of simulation-based experiments and demonstrate the performance of the chosen approach when multiple damage mechanisms are active.
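A compact bootstrap particle-filter sketch of the joint state-parameter estimation idea described above: each particle carries a damage state and an unknown wear-rate parameter, and the posterior is represented by the weighted particle cloud. The wear model, noise levels, and parameter jitter are illustrative assumptions, not the pump model of the paper.

import numpy as np

rng = np.random.default_rng(9)
n_particles, n_steps = 2_000, 50
true_rate = 0.02

# particles: damage state and unknown wear-rate parameter
damage = np.zeros(n_particles)
rate = rng.uniform(0.0, 0.05, n_particles)

true_damage = 0.0
for k in range(n_steps):
    # simulate the "true" system and a noisy measurement of damage
    true_damage += true_rate
    z = true_damage + rng.normal(scale=0.02)

    # propagate particles (process noise on the state, small jitter on the parameter)
    damage += rate + rng.normal(scale=0.005, size=n_particles)
    rate = np.clip(rate + rng.normal(scale=0.0005, size=n_particles), 0.0, None)

    # weight by the Gaussian measurement likelihood and resample (multinomial resampling)
    w = np.exp(-0.5 * ((z - damage) / 0.02) ** 2)
    w /= w.sum()
    idx = rng.choice(n_particles, size=n_particles, p=w)
    damage, rate = damage[idx], rate[idx]

print(f"estimated wear rate: {rate.mean():.4f} (true {true_rate})")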
Zhang, Limao; Wu, Xianguo; Qin, Yawei; Skibniewski, Miroslaw J; Liu, Wenli
2016-02-01
Tunneling excavation is bound to produce significant disturbances to surrounding environments, and the tunnel-induced damage to adjacent underground buried pipelines is of considerable importance for geotechnical practice. A fuzzy Bayesian networks (FBNs) based approach for safety risk analysis is developed in this article with detailed step-by-step procedures, consisting of risk mechanism analysis, the FBN model establishment, fuzzification, FBN-based inference, defuzzification, and decision making. In accordance with the failure mechanism analysis, a tunnel-induced pipeline damage model is proposed to reveal the cause-effect relationships between the pipeline damage and its influential variables. In terms of the fuzzification process, an expert confidence indicator is proposed to reveal the reliability of the data when determining the fuzzy probability of occurrence of basic events, with both the judgment ability level and the subjectivity reliability level taken into account. By means of the fuzzy Bayesian inference, the approach proposed in this article is capable of calculating the probability distribution of potential safety risks and identifying the most likely potential causes of accidents under both prior knowledge and given evidence circumstances. A case concerning the safety analysis of underground buried pipelines adjacent to the construction of the Wuhan Yangtze River Tunnel is presented. The results demonstrate the feasibility of the proposed FBN approach and its application potential. The proposed approach can be used as a decision tool to provide support for safety assurance and management in tunnel construction, and thus increase the likelihood of a successful project in a complex project environment. © 2015 Society for Risk Analysis.
Probabilistic evaluation of uncertainties and risks in aerospace components
NASA Technical Reports Server (NTRS)
Shah, A. R.; Shiao, M. C.; Nagpal, V. K.; Chamis, C. C.
1992-01-01
This paper summarizes a methodology developed at NASA Lewis Research Center which computationally simulates the structural, material, and load uncertainties associated with Space Shuttle Main Engine (SSME) components. The methodology was applied to evaluate the scatter in static, buckling, dynamic, fatigue, and damage behavior of the SSME turbopump blade. Also calculated are the probability densities of typical critical blade responses, such as effective stress, natural frequency, damage initiation, most probable damage path, etc. Risk assessments were performed for different failure modes, and the effect of material degradation on the fatigue and damage behaviors of a blade was calculated using a multi-factor interaction equation. Failure probabilities for different fatigue cycles were computed, and the uncertainties associated with damage initiation and damage propagation due to different load cycles were quantified. Evaluations of the effects of mistuned blades on a rotor were made; uncertainties in the excitation frequency were found to significantly amplify the blade responses of a mistuned rotor. The effects of the number of blades on a rotor were studied. The autocorrelation function of displacements and the probability density function of the first passage time for deterministic and random barriers for structures subjected to random processes also were computed. A brief discussion was included on the future direction of probabilistic structural analysis.
Milacic, Snezana; Simic, Jadranko
2009-05-01
This study investigated health risks in workers residing and working in terrain contaminated by low ionizing radiation doses originating from ammunition containing depleted uranium (DU). The studied population was composed of two test groups (T-I, T-II) who were occasionally exposed to DU, and two referent groups (R-I, R-II) not exposed to DU at any time. All were evaluated with a complete clinical examination and blood count, screening for immature forms and blasts, leukocyte alkaline phosphatase activity, and cytogenetic tests. The probability of onset of the characteristic biomarkers, chromosomal aberrations, was analyzed using the logarithmic (log-link) form of Poisson regression. The estimated Poisson probability density of chromosomal aberrations in test group T-II differed drastically from the corresponding distribution of referent group R-I and, to a somewhat lesser extent, from group R-II; an exact Wilcoxon test confirmed a significant difference between referent group R-II and test group T-II, p < 0.05. Damage to chromosomes and cells was highest in test group T-II, the workers additionally occupationally exposed to DU. The T-I group of workers, who had been exposed to DU while working on contaminated terrain, carried some risk of cell and chromosome damage, but that risk was not greater than the risk to referent group R-II, workers occupationally exposed to ionizing radiation.
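A sketch of the kind of log-linear Poisson regression described above, using synthetic aberration counts and a hypothetical exposure-group indicator rather than the study's data:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
group = np.repeat([0, 1], 50)                               # 0 = referent, 1 = DU-exposed (hypothetical)
counts = rng.poisson(lam=np.where(group == 1, 1.8, 0.9))    # synthetic aberration counts

X = sm.add_constant(group.astype(float))
fit = sm.GLM(counts, X, family=sm.families.Poisson()).fit()
print(fit.params)           # intercept (log rate, referent) and log rate ratio
print(np.exp(fit.params))   # baseline rate and rate ratio exposed/referent
```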
Multi-hazard risk analysis related to hurricanes
NASA Astrophysics Data System (ADS)
Lin, Ning
Hurricanes present major hazards to the United States. Associated with extreme winds, heavy rainfall, and storm surge, landfalling hurricanes often cause enormous structural damage to coastal regions. Hurricane damage risk assessment provides the basis for loss mitigation and related policy-making. Current hurricane risk models, however, often oversimplify the complex processes of hurricane damage. This dissertation aims to improve existing hurricane risk assessment methodology by coherently modeling the spatial-temporal processes of storm landfall, hazards, and damage. Numerical modeling technologies are used to investigate the multiplicity of hazards associated with landfalling hurricanes. The application and effectiveness of current weather forecasting technologies to predict hurricane hazards is investigated. In particular, the Weather Research and Forecasting model (WRF), with the Geophysical Fluid Dynamics Laboratory (GFDL) hurricane initialization scheme, is applied to the simulation of the wind and rainfall environment during hurricane landfall. The WRF model is further coupled with the Advanced Circulation (ADCIRC) model to simulate storm surge in coastal regions. A case study examines the multiple hazards associated with Hurricane Isabel (2003). Also, a risk assessment methodology is developed to estimate the probability distribution of hurricane storm surge heights along the coast, particularly for data-scarce regions, such as New York City. This methodology makes use of relatively simple models, specifically a statistical/deterministic hurricane model and the Sea, Lake and Overland Surges from Hurricanes (SLOSH) model, to simulate large numbers of synthetic surge events, and conducts statistical analysis. The estimates of hurricane landfall probability and hazards are combined with structural vulnerability models to estimate hurricane damage risk. Wind-induced damage mechanisms are extensively studied. An innovative windborne debris risk model is developed based on the theory of Poisson random measure, substantiated by a large amount of empirical data. An advanced vulnerability assessment methodology is then developed, by integrating this debris risk model and a component-based pressure damage model, to predict storm-specific or annual damage to coastal residential neighborhoods. The uniqueness of this vulnerability model lies in its detailed description of the interaction between wind pressure and windborne debris effects over periods of strong winds, which is a major mechanism leading to structural failures during hurricanes.
Offshore fatigue design turbulence
NASA Astrophysics Data System (ADS)
Larsen, Gunner C.
2001-07-01
Fatigue damage on wind turbines is mainly caused by stochastic loading originating from turbulence. While onshore sites display large differences in terrain topology, and thereby also in turbulence conditions, offshore sites are far more homogeneous, as the majority of them are likely to be associated with shallow water areas. However, despite this fact, specific recommendations on offshore turbulence intensities, applicable for fatigue design purposes, are lacking in the present IEC code. This article presents specific guidelines for such loading. These guidelines are based on the statistical analysis of a large amount of wind data originating from two Danish shallow water offshore sites. The turbulence standard deviation depends on the mean wind speed, upstream conditions, measuring height and thermal convection. If a population of turbulence standard deviations at a given measuring position is defined uniquely by the mean wind speed, variations in upstream conditions and atmospheric stability will appear as variability of the turbulence standard deviation. Distributions of such turbulence standard deviations, conditioned on the mean wind speed, are quantified by fitting the measured data to logarithmic Gaussian (lognormal) distributions. By combining a simple heuristic load model with the parametrized conditional probability density functions of the turbulence standard deviations, an empirical offshore design turbulence intensity is determined. For pure stochastic loading (as associated with standstill situations), the design turbulence intensity yields a fatigue damage equal to the average fatigue damage caused by the distributed turbulence intensity. If the stochastic loading is combined with a periodic deterministic loading (as in the normal operating situation), the proposed design turbulence intensity is shown to be conservative.
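The damage-equivalence idea can be sketched as follows, with an assumed Wohler exponent and assumed lognormal parameters (not the article's fitted values or its heuristic load model): the design turbulence standard deviation is the Wohler-weighted average over the conditional lognormal distribution, which has a closed form.

```python
import numpy as np

m = 10.0                            # assumed Wohler (S-N) exponent
mu_ln, s_ln = np.log(1.2), 0.25     # assumed lognormal parameters of sigma_u [m/s]

# For a lognormal, E[sigma^m] = exp(m*mu + 0.5*m^2*s^2), so the damage-equivalent
# ("design") turbulence standard deviation (E[sigma^m])**(1/m) has a closed form:
sigma_design = np.exp(mu_ln + 0.5 * m * s_ln**2)
sigma_mean = np.exp(mu_ln + 0.5 * s_ln**2)

print(f"mean sigma_u: {sigma_mean:.2f} m/s, damage-equivalent sigma_u: {sigma_design:.2f} m/s")
```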
Rupture in cemented granular media: application to wheat endosperm
NASA Astrophysics Data System (ADS)
Topin, V.; Delenne, J.-Y.; Radjai, F.
2009-06-01
The mechanical origin of the wheat hardness used to classify wheat flours is an open issue. Wheat endosperm can be considered as a cemented granular material, consisting of densely packed solid particles (the starch granules) and a pore-filling solid matrix (the protein) sticking to the particles. We use the lattice element method to investigate cemented granular materials with a texture close to that of wheat endosperm and with variable matrix volume fraction and particle-matrix adherence. From the shape of the probability density of vertical stresses we distinguish weak, intermediate and strong stresses. The large stresses occur mostly at the contact zones as in noncohesive granular media with a decreasing exponential distribution. The weak forces reflect the arching effect. The intermediate stresses belong mostly to the bulk of the particles and their distribution is well fit to a Gaussian distribution. We also observe that the stress chains are essentially guided by the cementing matrix in tension and by the particulate backbone in compression. Crack formation is analyzed in terms of particle damage as a function of matrix volume fraction and particle-matrix adherence. Our data provide evidence for three regimes of crack propagation depending on the crack path through the material. We find that particle damage scales well with the relative toughness of the particle-matrix interface. The interface toughness appears therefore to be strongly correlated with particle damage and determines transition from soft to hard behavior in wheat endosperm.
Image-Based Modeling Reveals Dynamic Redistribution of DNA Damage into Nuclear Sub-Domains
Costes, Sylvain V; Ponomarev, Artem; Chen, James L; Nguyen, David; Cucinotta, Francis A; Barcellos-Hoff, Mary Helen
2007-01-01
Several proteins involved in the response to DNA double strand breaks (DSB) form microscopically visible nuclear domains, or foci, after exposure to ionizing radiation. Radiation-induced foci (RIF) are believed to be located where DNA damage occurs. To test this assumption, we analyzed the spatial distribution of 53BP1, phosphorylated ATM, and γH2AX RIF in cells irradiated with high linear energy transfer (LET) radiation and low LET. Since energy is randomly deposited along high-LET particle paths, RIF along these paths should also be randomly distributed. The probability to induce DSB can be derived from DNA fragment data measured experimentally by pulsed-field gel electrophoresis. We used this probability in Monte Carlo simulations to predict DSB locations in synthetic nuclei geometrically described by a complete set of human chromosomes, taking into account microscope optics from real experiments. As expected, simulations produced DNA-weighted random (Poisson) distributions. In contrast, the distributions of RIF obtained as early as 5 min after exposure to high LET (1 GeV/amu Fe) were non-random. This deviation from the expected DNA-weighted random pattern can be further characterized by “relative DNA image measurements.” This novel imaging approach shows that RIF were located preferentially at the interface between high and low DNA density regions, and were more frequent than predicted in regions with lower DNA density. The same preferential nuclear location was also measured for RIF induced by 1 Gy of low-LET radiation. This deviation from random behavior was evident only 5 min after irradiation for phosphorylated ATM RIF, while γH2AX and 53BP1 RIF showed pronounced deviations up to 30 min after exposure. These data suggest that DNA damage–induced foci are restricted to certain regions of the nucleus of human epithelial cells. It is possible that DNA lesions are collected in these nuclear sub-domains for more efficient repair. PMID:17676951
Risk Preferences, Probability Weighting, and Strategy Tradeoffs in Wildfire Management.
Hand, Michael S; Wibbenmeyer, Matthew J; Calkin, David E; Thompson, Matthew P
2015-10-01
Wildfires present a complex applied risk management environment, but relatively little attention has been paid to behavioral and cognitive responses to risk among public agency wildfire managers. This study investigates responses to risk, including probability weighting and risk aversion, in a wildfire management context using a survey-based experiment administered to federal wildfire managers. Respondents were presented with a multiattribute lottery-choice experiment where each lottery is defined by three outcome attributes: expenditures for fire suppression, damage to private property, and exposure of firefighters to the risk of aviation-related fatalities. Respondents choose one of two strategies, each of which includes "good" (low cost/low damage) and "bad" (high cost/high damage) outcomes that occur with varying probabilities. The choice task also incorporates an information framing experiment to test whether information about fatality risk to firefighters alters managers' responses to risk. Results suggest that managers exhibit risk aversion and nonlinear probability weighting, which can result in choices that do not minimize expected expenditures, property damage, or firefighter exposure. Information framing tends to result in choices that reduce the risk of aviation fatalities, but exacerbates nonlinear probability weighting. © 2015 Society for Risk Analysis.
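To illustrate what nonlinear probability weighting means here, a sketch using the one-parameter Prelec weighting function; the functional form and parameter value are assumptions for illustration, not the form estimated in the study:

```python
import numpy as np

def prelec(p, alpha=0.65):
    """Prelec weighting: overweights small probabilities, underweights large ones."""
    p = np.clip(p, 1e-12, 1.0)
    return np.exp(-(-np.log(p)) ** alpha)

for p in (0.01, 0.10, 0.50, 0.90):
    print(f"objective p = {p:.2f} -> decision weight w(p) = {prelec(p):.2f}")
```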
Spatial Probability Cuing and Right Hemisphere Damage
ERIC Educational Resources Information Center
Shaqiri, Albulena; Anderson, Britt
2012-01-01
In this experiment we studied statistical learning, inter-trial priming, and visual attention. We assessed healthy controls and right brain damaged (RBD) patients with and without neglect, on a simple visual discrimination task designed to measure priming effects and probability learning. All participants showed a preserved priming effect for item…
Seismic fragility assessment of low-rise stone masonry buildings
NASA Astrophysics Data System (ADS)
Abo-El-Ezz, Ahmad; Nollet, Marie-José; Nastev, Miroslav
2013-03-01
Many historic buildings in old urban centers in Eastern Canada are made of stone masonry reputed to be highly vulnerable to seismic loads. Seismic risk assessment of stone masonry buildings is therefore the first step in the risk mitigation process to provide adequate planning for retrofit and preservation of historical urban centers. This paper focuses on development of analytical displacement-based fragility curves reflecting the characteristics of existing stone masonry buildings in Eastern Canada. The old historic center of Quebec City has been selected as a typical study area. The standard fragility analysis combines the inelastic spectral displacement, a structure-dependent earthquake intensity measure, and the building damage state correlated to the induced building displacement. The proposed procedure consists of a three-step development process: (1) mechanics-based capacity model, (2) displacement-based damage model and (3) seismic demand model. The damage estimation for a uniform hazard scenario of 2% in 50 years probability of exceedance indicates that slight to moderate damage is the most probable damage experienced by these stone masonry buildings. Comparison is also made with fragility curves implicit in the seismic risk assessment tools Hazus and ELER. Hazus shows the highest probability of the occurrence of no to slight damage, whereas the highest probability of extensive and complete damage is predicted with ELER. This comparison shows the importance of the development of fragility curves specific to the generic construction characteristics in the study area and emphasizes the need for critical use of regional risk assessment tools and generated results.
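A generic sketch of a displacement-based lognormal fragility curve of the kind developed in the paper; the median displacements and dispersion below are illustrative assumptions, not the values derived for the Quebec City building classes:

```python
import numpy as np
from scipy.stats import norm

def fragility(sd, median_sd, beta):
    """P(damage state >= ds | spectral displacement sd) as a lognormal CDF."""
    return norm.cdf(np.log(sd / median_sd) / beta)

sd_demand = 2.0                 # assumed demand spectral displacement [cm]
medians = {"slight": 0.8, "moderate": 2.5, "extensive": 6.0, "complete": 12.0}
beta = 0.7                      # assumed lognormal dispersion

for ds, med in medians.items():
    print(f"P(>= {ds:9s}) = {fragility(sd_demand, med, beta):.2f}")
```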
Loss Estimations due to Earthquakes and Secondary Technological Hazards
NASA Astrophysics Data System (ADS)
Frolova, N.; Larionov, V.; Bonnin, J.
2009-04-01
Expected loss and damage assessment for natural and technological disasters is of primary importance for emergency management just after the disaster, as well as for the development and implementation of preventive measure plans. The paper addresses procedures and simulation models for loss estimation due to strong earthquakes and secondary technological accidents. The mathematical models for shaking intensity distribution, damage to buildings and structures, debris volume, and the number of fatalities and injuries due to earthquakes and technological accidents at fire- and chemical-hazardous facilities are considered; these models are used in geographical information systems assigned for these purposes. Criteria for the occurrence of technological accidents are developed on the basis of engineering analysis of the consequences of past events. The paper provides results of scenario earthquake consequence estimation and individual seismic risk assessment, taking into account secondary technological hazards at regional and urban levels. The individual risk is understood as the probability of death (or injury) due to a possible hazardous event within one year in a given territory. It is determined through the mathematical expectation of social losses, taking into account the number of inhabitants in the considered settlement and the probability of natural and/or technological disaster.
NASA Technical Reports Server (NTRS)
Hodge, Andrew J.; Walker, James L., II
2008-01-01
A probability of detection study was performed for the detection of impact damage using flash heating infrared thermography on a full-scale honeycomb composite structure. The honeycomb structure was an intertank structure from a previous NASA technology demonstration program. The intertank was fabricated from IM7/8552 carbon fiber/epoxy facesheets and aluminum honeycomb core. The intertank was impacted in multiple locations with a range of impact energies utilizing a spherical indenter. In a single-blind study, the intertank was inspected with thermography before and after impact damage was incurred. Following thermographic inspection, several impact sites were sectioned from the intertank and cross-sectioned for microscopic comparisons of NDE detection and actual damage incurred. The study concluded that thermographic inspection was a good method of detecting delamination damage incurred by impact. The impact energy corresponding to the 90/95 confidence level on the probability of detection was close to the energy at which delaminations were first observed through cross-sectional analysis.
Probability of stress-corrosion fracture under random loading.
NASA Technical Reports Server (NTRS)
Yang, J.-N.
1972-01-01
A method is developed for predicting the probability of stress-corrosion fracture of structures under random loadings. The formulation is based on the cumulative damage hypothesis and the experimentally determined stress-corrosion characteristics. Under both stationary and nonstationary random loadings, the mean value and the variance of the cumulative damage are obtained. The probability of stress-corrosion fracture is then evaluated using the principle of maximum entropy. It is shown that, under stationary random loadings, the standard deviation of the cumulative damage increases in proportion to the square root of time, while the coefficient of variation (dispersion) decreases in inverse proportion to the square root of time. Numerical examples are worked out to illustrate the general results.
Greenhouse-gas emission targets for limiting global warming to 2 degrees C.
Meinshausen, Malte; Meinshausen, Nicolai; Hare, William; Raper, Sarah C B; Frieler, Katja; Knutti, Reto; Frame, David J; Allen, Myles R
2009-04-30
More than 100 countries have adopted a global warming limit of 2 degrees C or below (relative to pre-industrial levels) as a guiding principle for mitigation efforts to reduce climate change risks, impacts and damages. However, the greenhouse gas (GHG) emissions corresponding to a specified maximum warming are poorly known owing to uncertainties in the carbon cycle and the climate response. Here we provide a comprehensive probabilistic analysis aimed at quantifying GHG emission budgets for the 2000-50 period that would limit warming throughout the twenty-first century to below 2 degrees C, based on a combination of published distributions of climate system properties and observational constraints. We show that, for the chosen class of emission scenarios, both cumulative emissions up to 2050 and emission levels in 2050 are robust indicators of the probability that twenty-first century warming will not exceed 2 degrees C relative to pre-industrial temperatures. Limiting cumulative CO2 emissions over 2000-50 to 1,000 Gt CO2 yields a 25% probability of warming exceeding 2 degrees C (and a limit of 1,440 Gt CO2 yields a 50% probability), given a representative estimate of the distribution of climate system properties. As known 2000-06 CO2 emissions were approximately 234 Gt CO2, less than half the proven economically recoverable oil, gas and coal reserves can still be emitted up to 2050 to achieve such a goal. Recent G8 Communiqués envisage halved global GHG emissions by 2050, for which we estimate a 12-45% probability of exceeding 2 degrees C, assuming 1990 as the emission base year and a range of published climate sensitivity distributions. Emissions levels in 2020 are a less robust indicator, but for the scenarios considered, the probability of exceeding 2 degrees C rises to 53-87% if global GHG emissions are still more than 25% above 2000 levels in 2020.
Hazard rating forest stands for gypsy moth
Hicks, Ray R., Jr.
1991-01-01
A gypsy moth hazard exists when forest conditions prevail that are conducive to extensive damage from gypsy moth. Combining forest hazard rating with information on insect population trends provides the basis for predicting the probability (risk) of an event occurring. The likelihood of defoliation is termed susceptibility and the probability of damage (mortality,...
Estimated value of insurance premium due to Citarum River flood by using Bayesian method
NASA Astrophysics Data System (ADS)
Sukono; Aisah, I.; Tampubolon, Y. R. H.; Napitupulu, H.; Supian, S.; Subiyanto; Sidi, P.
2018-03-01
The Citarum River flood in South Bandung, West Java, Indonesia, occurs almost every year. It causes property damage, producing economic loss. The risk of loss can be mitigated through a flood insurance program. In this paper, we discuss the estimated value of insurance premiums for Citarum River flood losses using a Bayesian method. It is assumed that the flood-loss risk data follow a Pareto distribution with a heavy right tail. The distribution model parameters are estimated by the Bayesian method. First, parameter estimation is done under the assumption that the prior comes from the Gamma distribution family, while the observed data follow a Pareto distribution. Second, flood loss data are simulated based on the probability of damage in each flood-affected area. The analysis yields the following estimated premiums based on the pure premium principle: for a loss of IDR 629.65 million, a premium of IDR 338.63 million; for a loss of IDR 584.30 million, a premium of IDR 314.24 million; and for a loss of IDR 574.53 million, a premium of IDR 308.95 million. The premium estimate can be used as a reference for setting a reasonable premium that neither overburdens the insured nor results in a loss to the insurer.
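A minimal conjugate-Bayes sketch of the idea (synthetic losses, an assumed prior, and a known Pareto scale, not the paper's data or exact model), with the pure premium taken as the posterior-mean expected loss:

```python
import numpy as np

rng = np.random.default_rng(2)
x_m = 50.0                                        # assumed minimum loss (IDR million)
losses = x_m * (1.0 + rng.pareto(2.5, size=40))   # synthetic Pareto flood losses

a0, b0 = 2.0, 1.0                                 # assumed Gamma(a0, b0) prior on the shape
n = losses.size
a_post = a0 + n
b_post = b0 + np.sum(np.log(losses / x_m))
alpha_hat = a_post / b_post                       # posterior mean of the Pareto shape

# Pure premium principle: premium = expected loss (finite only if shape > 1).
premium = alpha_hat * x_m / (alpha_hat - 1.0) if alpha_hat > 1.0 else np.inf
print(f"posterior-mean shape: {alpha_hat:.2f}, pure premium: {premium:.1f} (IDR million)")
```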
Analysis of the progressive failure of brittle matrix composites
NASA Technical Reports Server (NTRS)
Thomas, David J.
1995-01-01
This report investigates two of the most common modes of localized failures, namely, periodic fiber-bridged matrix cracks and transverse matrix cracks. A modification of Daniels' bundle theory is combined with Weibull's weakest link theory to model the statistical distribution of the periodic matrix cracking strength for an individual layer. Results of the model predictions are compared with experimental data from the open literature. Extensions to the model are made to account for possible imperfections within the layer (i.e., nonuniform fiber lengths, irregular crack spacing, and degraded in-situ fiber properties), and the results of these studies are presented. A generalized shear-lag analysis is derived which is capable of modeling the development of transverse matrix cracks in material systems having a general multilayer configuration and under states of full in-plane load. A method for computing the effective elastic properties for the damaged layer at the global level is detailed based upon the solution for the effects of the damage at the local level. This methodology is general in nature and is therefore also applicable to (0(sub m)/90(sub n))(sub s) systems. The characteristic stress-strain response for more general cases is shown to be qualitatively correct (experimental data is not available for a quantitative evaluation), and the damage evolution is recorded in terms of the matrix crack density as a function of the applied strain. Probabilistic effects are introduced to account for the statistical nature of the material strengths, thus allowing cumulative distribution curves for the probability of failure to be generated for each of the example laminates. Additionally, Oh and Finney's classic work on fracture location in brittle materials is extended and combined with the shear-lag analysis. The result is an analytical form for predicting the probability density function for the location of the next transverse crack occurrence within a crack bounded region. The results of this study verified qualitatively the validity of assuming a uniform crack spacing (as was done in the shear-lag model).
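The Weibull weakest-link ingredient used above can be sketched as follows, with illustrative shape, reference-strength, and volume-ratio parameters rather than the report's fitted values:

```python
import numpy as np

def weibull_failure_prob(sigma, sigma0=300.0, m=8.0, v_ratio=1.0):
    """Weakest-link failure probability P_f = 1 - exp(-(V/V0) * (sigma/sigma0)^m)."""
    return 1.0 - np.exp(-v_ratio * (sigma / sigma0) ** m)

for s in (200.0, 250.0, 300.0, 350.0):
    print(f"sigma = {s:5.0f} MPa -> P_f = {weibull_failure_prob(s):.3f}")
```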
Tugnoli, Alessandro; Gubinelli, Gianfilippo; Landucci, Gabriele; Cozzani, Valerio
2014-08-30
The evaluation of the initial direction and velocity of the fragments generated in the fragmentation of a vessel due to internal pressure is important information in the assessment of damage caused by fragments, in particular within the quantitative risk assessment (QRA) of chemical and process plants. In the present study an approach is proposed for the identification and validation of probability density functions (pdfs) for the initial direction of the fragments. A detailed review of a large number of past accidents provided the background information for the validation procedure. A specific method was developed for the validation of the proposed pdfs. Validated pdfs were obtained for both the vertical and horizontal angles of projection and for the initial velocity of the fragments. Copyright © 2014 Elsevier B.V. All rights reserved.
Performance-based methodology for assessing seismic vulnerability and capacity of buildings
NASA Astrophysics Data System (ADS)
Shibin, Lin; Lili, Xie; Maosheng, Gong; Ming, Li
2010-06-01
This paper presents a performance-based methodology for the assessment of seismic vulnerability and capacity of buildings. The vulnerability assessment methodology is based on the HAZUS methodology and the improved capacity-demand-diagram method. The spectral displacement (Sd) of performance points on a capacity curve is used to estimate the damage level of a building. The relationship between Sd and peak ground acceleration (PGA) is established, and then a new vulnerability function is expressed in terms of PGA. Furthermore, the expected value of the seismic capacity index (SCev) is provided to estimate the seismic capacity of buildings based on the probability distribution of damage levels and the corresponding seismic capacity index. The results indicate that the proposed vulnerability methodology is able to assess seismic damage of a large number of building stock directly and quickly following an earthquake. The SCev provides an effective index to measure the seismic capacity of buildings and illustrate the relationship between the seismic capacity of buildings and seismic action. The estimated result is compared with damage surveys of the cities of Dujiangyan and Jiangyou in the M8.0 Wenchuan earthquake, revealing that the methodology is acceptable for seismic risk assessment and decision making. The primary reasons for discrepancies between the estimated results and the damage surveys are discussed.
Cocco, Arturo; Serra, Giuseppe; Lentini, Andrea; Deliperi, Salvatore; Delrio, Gavino
2015-09-01
The within- and between-plant distribution of the tomato leafminer, Tuta absoluta (Meyrick), was investigated in order to define action thresholds based on leaf infestation and to propose enumerative and binomial sequential sampling plans for pest management applications in protected crops. The pest spatial distribution was aggregated between plants, and median leaves were the most suitable sample to evaluate the pest density. Action thresholds of 36 and 48%, 43 and 56% and 60 and 73% infested leaves, corresponding to economic thresholds of 1 and 3% damaged fruits, were defined for tomato cultivars with big, medium and small fruits respectively. Green's method was a more suitable enumerative sampling plan as it required a lower sampling effort. Binomial sampling plans needed lower average sample sizes than enumerative plans to make a treatment decision, with probabilities of error of <0.10. The enumerative sampling plan required 87 or 343 leaves to estimate the population density in extensive or intensive ecological studies respectively. Binomial plans would be more practical and efficient for control purposes, needing average sample sizes of 17, 20 and 14 leaves to take a pest management decision in order to avoid fruit damage higher than 1% in cultivars with big, medium and small fruits respectively. © 2014 Society of Chemical Industry.
STS-27R OV-104 Orbiter TPS damage review team, volume 1
NASA Technical Reports Server (NTRS)
Thomas, John W. (Compiler)
1989-01-01
Following the return to earth on December 2, 1988, of Orbiter OV-104, Atlantis, it was observed that there was substantial Thermal Protection System (TPS) tile damage present on the lower right fuselage and wing. Damage sites were more numerous than on previous flights and conversely, there was almost no damage present on Atlantis' left side. A review team investigated the cause beginning with a detailed inspection of the Atlantis TPS damage, and a review of related inspection reports to establish an indepth anomaly definition. An exhaustive data review followed. A fault tree and several failure scenarios were developed. Finally, the failure scenarios were categorized as either not possible, possible but not probable, or probable. This and other information gained during the review formed the basis for the team's findings and recommendations. The team concluded that the most probable cause of the severe STS-27R Orbiter tile damage is that the ablative insulating material covering the RH SRB Nose Cap dislodged and struck the Orbiter tile near 85 seconds into flight and possibly that debris from other sources, including repaired insulation and missing joint cork, caused minor tile damage. Findings are presented, and recommendations that are believed pertinent to minimizing the potential for inflight debris are described.
Probabilistic flood damage modelling at the meso-scale
NASA Astrophysics Data System (ADS)
Kreibich, Heidi; Botto, Anna; Schröter, Kai; Merz, Bruno
2014-05-01
Decisions on flood risk management and adaptation are usually based on risk analyses. Such analyses are associated with significant uncertainty, even more so if changes in risk due to global change are expected. Although uncertainty analysis and probabilistic approaches have received increased attention in recent years, they are still not standard practice for flood risk assessments. Most damage models have in common that complex damaging processes are described by simple, deterministic approaches like stage-damage functions. Novel probabilistic, multi-variate flood damage models have been developed and validated on the micro-scale using a data-mining approach, namely bagging decision trees (Merz et al. 2013). In this presentation we show how the model BT-FLEMO (Bagging decision Tree based Flood Loss Estimation MOdel) can be applied on the meso-scale, namely on the basis of ATKIS land-use units. The model is applied in 19 municipalities which were affected during the 2002 flood by the River Mulde in Saxony, Germany. The application of BT-FLEMO provides a probability distribution of estimated damage to residential buildings per municipality. Validation is undertaken on the one hand via a comparison with eight other damage models including stage-damage functions as well as multi-variate models. On the other hand, the results are compared with official damage data provided by the Saxon Relief Bank (SAB). The results show that uncertainties of damage estimation remain high. Thus, the significant advantage of this probabilistic flood loss estimation model BT-FLEMO is that it inherently provides quantitative information about the uncertainty of the prediction. Reference: Merz, B.; Kreibich, H.; Lall, U. (2013): Multi-variate flood damage assessment: a tree-based data-mining approach. NHESS, 13(1), 53-64.
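A minimal sketch of the bagging-decision-tree idea behind BT-FLEMO, using synthetic predictors and a hypothetical loss-ratio relation rather than the FLEMO data; the spread of the individual tree predictions is what provides the probability distribution of damage:

```python
import numpy as np
from sklearn.ensemble import BaggingRegressor
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(3)
n = 400
water_depth = rng.uniform(0.0, 3.0, n)        # [m]
building_value = rng.uniform(50.0, 500.0, n)  # [kEUR]
precaution = rng.integers(0, 2, n)            # 0/1 indicator
X = np.column_stack([water_depth, building_value, precaution])
# Hypothetical loss ratio: grows with depth, reduced by precaution, plus noise.
y = np.clip(0.2 * water_depth - 0.1 * precaution + rng.normal(0.0, 0.05, n), 0.0, 1.0)

model = BaggingRegressor(DecisionTreeRegressor(max_depth=6),
                         n_estimators=100, random_state=0).fit(X, y)

x_new = np.array([[1.5, 200.0, 0.0]])
per_tree = [est.predict(x_new[:, feats])[0]
            for est, feats in zip(model.estimators_, model.estimators_features_)]
print(f"mean loss ratio: {np.mean(per_tree):.2f}, "
      f"5-95% range: {np.percentile(per_tree, 5):.2f}-{np.percentile(per_tree, 95):.2f}")
```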
DOE Office of Scientific and Technical Information (OSTI.GOV)
Qi, Junjian; Pfenninger, Stefan
In this paper, we propose a strategy to control the self-organizing dynamics of the Bak-Tang-Wiesenfeld (BTW) sandpile model on complex networks by allowing some degree of failure tolerance for the nodes and introducing additional active dissipation while taking the risk of possible node damage. We show that the probability for large cascades significantly increases or decreases respectively when the risk for node damage outweighs the active dissipation and when the active dissipation outweighs the risk for node damage. By considering the potential additional risk from node damage, a non-trivial optimal active dissipation control strategy which minimizes the total cost in the system can be obtained. Under some conditions the introduced control strategy can decrease the total cost in the system compared to the uncontrolled model. Moreover, when the probability of damaging a node experiencing failure tolerance is greater than the critical value, then no matter how successful the active dissipation control is, the total cost of the system will have to increase. This critical damage probability can be used as an indicator of the robustness of a network or system. Copyright (C) EPLA, 2015
A framework for global river flood risk assessment
NASA Astrophysics Data System (ADS)
Winsemius, H. C.; Van Beek, L. P. H.; Bouwman, A.; Ward, P. J.; Jongman, B.
2012-04-01
There is an increasing need for strategic global assessments of flood risks. Such assessments may be required by: (a) International Financing Institutes and Disaster Management Agencies to evaluate where, when, and which investments in flood risk mitigation are most required; (b) (re-)insurers, who need to determine their required coverage capital; and (c) large companies to account for risks of regional investments. In this contribution, we propose a framework for global river flood risk assessment. The framework combines coarse scale resolution hazard probability distributions, derived from global hydrological model runs (typical scale about 0.5 degree resolution) with high resolution estimates of exposure indicators. The high resolution is required because floods typically occur at a much smaller scale than the typical resolution of global hydrological models, and exposure indicators such as population, land use and economic value generally are strongly variable in space and time. The framework therefore estimates hazard at a high resolution ( 1 km2) by using a) global forcing data sets of the current (or in scenario mode, future) climate; b) a global hydrological model; c) a global flood routing model, and d) importantly, a flood spatial downscaling routine. This results in probability distributions of annual flood extremes as an indicator of flood hazard, at the appropriate resolution. A second component of the framework combines the hazard probability distribution with classical flood impact models (e.g. damage, affected GDP, affected population) to establish indicators for flood risk. The framework can be applied with a large number of datasets and models and sensitivities of such choices can be evaluated by the user. The framework is applied using the global hydrological model PCR-GLOBWB, combined with a global flood routing model. Downscaling of the hazard probability distributions to 1 km2 resolution is performed with a new downscaling algorithm, applied on a number of target regions. We demonstrate the use of impact models in these regions based on global GDP, population, and land use maps. In this application, we show sensitivities of the estimated risks with regard to the use of different climate input datasets, decisions made in the downscaling algorithm, and different approaches to establish distributed estimates of GDP and asset exposure to flooding.
Laser damage metrology in biaxial nonlinear crystals using different test beams
NASA Astrophysics Data System (ADS)
Hildenbrand, Anne; Wagner, Frank R.; Akhouayri, Hassan; Natoli, Jean-Yves; Commandre, Mireille
2008-01-01
Laser damage measurements in nonlinear optical crystals, in particular in biaxial crystals, may be influenced by several effects proper to these materials or greatly enhanced in these materials. Before discussion of these effects, we address the topic of error bar determination for probability measurements. Error bars for the damage probabilities are important because nonlinear crystals are often small and expensive, thus only few sites are used for a single damage probability measurement. We present the mathematical basics and a flow diagram for the numerical calculation of error bars for probability measurements that correspond to a chosen confidence level. Effects that possibly modify the maximum intensity in a biaxial nonlinear crystal are: focusing aberration, walk-off and self-focusing. Depending on focusing conditions, propagation direction, polarization of the light and the position of the focus point in the crystal, strong aberrations may change the beam profile and drastically decrease the maximum intensity in the crystal. A correction factor for this effect is proposed, but quantitative corrections are not possible without taking into account the experimental beam profile after the focusing lens. The characteristics of walk-off and self-focusing have quickly been reviewed for the sake of completeness of this article. Finally, parasitic second harmonic generation may influence the laser damage behavior of crystals. The important point for laser damage measurements is that the amount of externally observed SHG after the crystal does not correspond to the maximum amount of second harmonic light inside the crystal.
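One common way to attach error bars to a damage probability estimated from only a few test sites is an exact (Clopper-Pearson) binomial interval; the sketch below uses that construction, which may differ from the paper's own flow diagram:

```python
from scipy.stats import beta

def clopper_pearson(k, n, conf=0.95):
    """Exact two-sided confidence interval for a binomial proportion k/n."""
    a = (1.0 - conf) / 2.0
    lo = 0.0 if k == 0 else beta.ppf(a, k, n - k + 1)
    hi = 1.0 if k == n else beta.ppf(1.0 - a, k + 1, n - k)
    return lo, hi

n_sites, n_damaged = 10, 3        # e.g. 10 irradiated sites, 3 of them damaged
lo, hi = clopper_pearson(n_damaged, n_sites)
print(f"damage probability: {n_damaged / n_sites:.2f}, 95% CI: [{lo:.2f}, {hi:.2f}]")
```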
Simulation of Corrosion Process for Structure with the Cellular Automata Method
NASA Astrophysics Data System (ADS)
Chen, M. C.; Wen, Q. Q.
2017-06-01
In this paper, from a mesoscopic point of view and under the assumption that metal corrosion damage evolution is a diffusive process, a cellular automata (CA) method is proposed to numerically simulate the uniform corrosion damage evolution of the outer steel tube of concrete-filled steel tubular columns exposed to a corrosive environment; the effects of corrosive agent concentration, dissolution probability, and elapsed etching time on the corrosion damage evolution are also investigated. It was shown that corrosion damage increases nonlinearly with elapsed etching time: the longer the etching time, the more serious the corrosion damage. Different concentrations of corrosive agent had different impacts on the corrosion damage of the outer steel tube, although the differences were small; the higher the concentration, the more serious the influence. The greater the dissolution probability, the more serious the corrosion damage of the outer steel tube, but as the dissolution probability increases, the differences between its impacts on the corrosion damage become smaller and smaller. To validate the present method, corrosion damage measurements were conducted on concrete-filled square steel tubular columns (CFSSTCs) sealed at both ends and fully immersed in a simulated acid rain solution, and Faraday's law was used to predict the theoretical values. Meanwhile, the proposed CA model was applied to simulate the corrosion damage evolution of the CFSSTCs. Comparisons of the results from the three methods showed good agreement, implying that the proposed method for simulating the corrosion damage evolution of concrete-filled steel tubular columns is feasible and effective. It opens a new approach to further study and evaluate corrosion damage, load capacity, and lifetime prediction of concrete-filled steel tubular structures.
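A toy cellular-automaton corrosion sketch in the spirit of the model described above; the grid size, the coupling of agent concentration into an effective dissolution probability, and the parameter values are all illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(4)
n, steps = 100, 200
p_dissolve, concentration = 0.1, 0.3
p_eff = p_dissolve * concentration        # assumed coupling of the two factors

# 1 = intact metal, 0 = corroded/solution; the top row is the corrosive solution.
metal = np.ones((n, n), dtype=int)
metal[0, :] = 0

for _ in range(steps):
    corroded = metal == 0
    # A metal cell is exposed if any 4-neighbour is corroded.
    exposed = np.zeros_like(corroded)
    exposed[1:, :] |= corroded[:-1, :]
    exposed[:-1, :] |= corroded[1:, :]
    exposed[:, 1:] |= corroded[:, :-1]
    exposed[:, :-1] |= corroded[:, 1:]
    exposed &= metal == 1
    dissolve = exposed & (rng.random((n, n)) < p_eff)
    metal[dissolve] = 0

print(f"corrosion damage (lost metal fraction) after {steps} steps: {1.0 - metal.mean():.3f}")
```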
[In the absence of early orthodontic treatment, is there a loss of chance?].
Béry, A
2006-06-01
Chance is the probability that something will happen, and, in this sense, the loss of chance can be defined as the damage resulting from the disappearance of the probability of a favorable outcome (the contrary being the non-realization of the risk). This is autonomous damage that should be differentiated from the final damage. Moral damage is a notion very close to the loss of chance, even though it rests on the indemnification of a final damage such as an affliction or illness. This article deals with whether insufficient information is the cause of the final damage or of the loss of chance, the loss of chance being a function of the information deficit. In this sense, can the failure to begin early, appropriate dento-facial orthopedic treatment be considered a loss of chance for the child?
NASA Astrophysics Data System (ADS)
Fung, D. C. N.; Wang, J. P.; Chang, S. H.; Chang, S. C.
2014-12-01
Using a revised statistical model built on past seismic probability models, the probability of earthquakes of different magnitudes occurring within variable timespans can be estimated. The revised model is based on the Poisson distribution and uses best-estimate values, taken from literature sources, of the probability distribution of different magnitude earthquakes recurring on a fault. Our study aims to apply this model to the Taipei metropolitan area, with a population of 7 million, which lies in the Taipei Basin and is bounded by two normal faults: the Sanchaio and Taipei faults. The Sanchaio fault is suggested to be responsible for previous large-magnitude earthquakes, such as the 1694 magnitude 7 earthquake in northwestern Taipei (Cheng et al., 2010). Based on a magnitude 7 earthquake return period of 543 years, the model predicts the occurrence of a magnitude 7 earthquake within 20 years at 1.81%, within 79 years at 6.77%, and within 300 years at 21.22%. These estimates increase significantly when considering a magnitude 6 earthquake: the chance of one occurring within the next 20 years is estimated to be 3.61%, within 79 years at 13.54%, and within 300 years at 42.45%. The 79-year period represents the average lifespan of the Taiwan population. In comparison, based on data from 2013, the probability of Taiwan residents experiencing heart disease or malignant neoplasm is 11.5% and 29%, respectively. The inference of this study is that the calculated risk to the Taipei population from a potentially damaging magnitude 6 or greater earthquake occurring within their lifetime is just as great as the risk of suffering from heart disease or other health ailments.
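The Poisson building block of such estimates is P(at least one event in t years) = 1 - exp(-t/T). The sketch below uses the 543-year return period stated in the abstract; because the revised model also weights the recurrence probabilities of different magnitudes, this plain formula should not be expected to reproduce the quoted percentages exactly:

```python
import numpy as np

def poisson_prob(t_years, return_period):
    """P(at least one event within t years) = 1 - exp(-t / T)."""
    return 1.0 - np.exp(-t_years / return_period)

T_m7 = 543.0      # return period of a magnitude-7 event stated in the abstract [years]
for t in (20, 79, 300):
    print(f"P(M7 event within {t:3d} yr) = {poisson_prob(t, T_m7):.3f}")
```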
U.S. National Committee proposed revision to the ISO Laser Damage Standard
NASA Astrophysics Data System (ADS)
Arenberg, Jonathan W.; Howland, Donna; Thomas, Michael; Turner, Trey; Bellum, John; Field, Ella; Carr, C. Wren; Shaffer, Gary; Brophy, Matthew; Krisiloff, Allen
2017-11-01
This paper reports on the fundamental idea behind a US National Committee, The Optics and Electro-Optics Standards Council (OEOSC) Task Force (TF) 7, proposal for a so-called Type 1 laser damage test procedure. A Type 1 test is designed to give a simple binary, pass or fail, result. Such tests are intended for the transactional type of damage testing typical of acceptance and quality control testing. As such, it is intended for bulk certification of optics for the ability to survive a given fluence, useful for manufacturers of optics and their customers, the system builders. At the root of the proposed method is the probability that an optic of area A will have R or fewer damage occurrences with a user-specified probability P at test fluence Φ. This assessment is made by a survey of area and the observation of n events. The paper presents the derivation of the probability of N or fewer damage sites on A given n events observed in area a. The paper concludes with the remaining steps toward the development of a useful test procedure based on the idea presented.
Caso, Francesca; Agosta, Federica; Volonté, Maria Antonietta; Ferraro, Pilar M; Tiraboschi, Pietro; Copetti, Massimiliano; Valsasina, Paola; Falautano, Monica; Comi, Giancarlo; Falini, Andrea; Filippi, Massimo
2016-10-01
Beside motor symptoms, patients with progressive supranuclear palsy syndrome (PSPs) commonly present cognitive and behavioral disorders. In this study we aimed to assess the structural brain correlates of cognitive impairment in PSPs. We enrolled 23 patients with probable PSP Richardson's syndrome and 15 matched healthy controls. Patients underwent an extensive clinical and neuropsychological evaluation. Cortical thickness measures and diffusion tensor metrics of white matter tracts were obtained. Random forest analysis was used to identify the strongest MRI predictors of cognitive impairment in PSPs at an individual patient level. PSPs patients were in a moderate stage of the disease showing mild cognitive deficits with prominent executive dysfunction. Relative to controls, PSPs patients had a focal, bilateral cortical thinning mainly located in the prefrontal/precentral cortex and temporal pole. PSPs patients also showed a distributed white matter damage involving the main tracts including the superior cerebellar peduncle, corpus callosum, corticospinal tract, and extramotor tracts, such as the inferior fronto-occipital, superior longitudinal and uncinate fasciculi, and cingulum, bilaterally. Regional cortical thinning measures did not relate with cognitive features, while white matter damage showed a significant impact on cognitive impairment (r values ranging from -0.80 to 0.74). PSPs patients show both focal cortical thinning in dorsolateral anterior regions and a distributed white matter damage involving the main motor and extramotor tracts. White matter measures are highly associated with cognitive deficits. Diffusion tensor MRI metrics are likely to be the most sensitive markers of extramotor deficits in PSPs. Copyright © 2016 Elsevier Ltd. All rights reserved.
Flash floods of August 10, 2009, in the Villages of Gowanda and Silver Creek, New York
Szabo, Carolyn O.; Coon, William F.; Niziol, Thomas A.
2011-01-01
Late during the night of August 9, 2009, two storm systems intersected over western New York and produced torrential rain that caused severe flash flooding during the early morning hours of August 10 in parts of Cattaraugus, Chautauqua, and Erie Counties. Nearly 6 inches of rain fell in 1.5 hours as recorded by a National Weather Service weather observer in Perrysburg, which lies between Gowanda and Silver Creek-the communities that suffered the most damage. This storm intensity had an annual exceedance probability of less than 0.2 percent (recurrence interval greater than 500 years). Although flooding along Cattaraugus Creek occurred elsewhere, Cattaraugus Creek was responsible for very little flooding in Gowanda. Rather the small tributaries, Thatcher Brook and Grannis Brook, caused the flooding in Gowanda, as did Silver Creek and Walnut Creek in the Village of Silver Creek. Damages from the flooding were widespread. Numerous road culverts were washed out, and more than one-quarter of the roads in Cattaraugus County were damaged. Many people were evacuated or rescued in Gowanda and Silver Creek, and two deaths occurred during the flood in Gowanda. The water supplies of both communities were compromised by damages to village reservoirs and water-transmission infrastructures. Water and mud damage to residential and commercial properties was extensive. The tri-county area was declared a Federal disaster area and more than $45 million in Federal disaster assistance was distributed to more than 1,500 individuals and an estimated 1,100 public projects. The combined total estimate of damages from the flash floods was greater than $90 million. Over 240 high-water marks were surveyed by the U.S. Geological Survey; a subset of these marks was used to create flood-water-surface profiles for four streams and to delineate the areal extent of flooding in Gowanda and Silver Creek. Flood elevations exceeded previously defined 0.2-percent annual exceedance probability (500-year recurrence interval) elevations by 2 to 4 feet in Gowanda and as much as 6 to 8 feet in Silver Creek. Most of the high-water marks were used in indirect hydraulic computations to estimate peak flows for four streams. The peak flows in Grannis Brook and Thatcher Brook were computed, using the slope-area method, to be 1,400 and 7,600 cubic feet per second, respectively, and peak flow in Silver Creek was computed, using the width-contraction method, to be 19,500 cubic feet per second. The annual exceedance probabilities for flows in these and other basins with small drainage areas that fell almost entirely within the area of heaviest precipitation were less than 0.2 percent (or recurrence intervals greater than 500 years). The peak flow in Cattaraugus Creek at Gowanda was computed, using the slope-area method, to be 33,200 cubic feet per second with an annual exceedance probability of 2.2 percent (recurrence interval of 45 years).
NASA Astrophysics Data System (ADS)
Soderstrom, Ken; Alalawi, Ali
KLFromRecordingDays allows measurement of Kullback-Leibler (KL) distances between 2D probability distributions of vocal acoustic features. Greater KL distance measures reflect increased phonological divergence across the vocalizations compared. The software has been used to compare *.wav file recordings made by Sound Analysis Recorder 2011 of songbird vocalizations pre- and post-drug and surgical manipulations. Recordings from individual animals in *.wav format are first organized into subdirectories by recording day and then segmented into individual syllables uttered and acoustic features of these syllables using Sound Analysis Pro 2011 (SAP). KLFromRecordingDays uses syllable acoustic feature data output by SAP to a MySQL table to generate and compare "template" (typically pre-treatment) and "target" (typically post-treatment) probability distributions. These distributions are a series of virtual 2D plots of the duration of each syllable (as x-axis) to each of 13 other acoustic features measured by SAP for that syllable (as y-axes). Differences between "template" and "target" probability distributions for each acoustic feature are determined by calculating KL distance, a measure of divergence of the target 2D distribution pattern from that of the template. KL distances and the mean KL distance across all acoustic features are calculated for each recording day and output to an Excel spreadsheet. Resulting data for individual subjects may then be pooled across treatment groups and graphically summarized and used for statistical comparisons. Because SAP-generated MySQL files are accessed directly, data limits associated with spreadsheet output are avoided, and the totality of vocal output over weeks may be objectively analyzed all at once. The software has been useful for measuring drug effects on songbird vocalizations and assessing recovery from damage to regions of vocal motor cortex. It may be useful in studies employing other species, and as part of speech therapies tracking progress in producing distinct speech sounds in isolation.
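A sketch of the core computation (not the released software itself): the KL distance between template and target 2D distributions of two syllable features, estimated from histograms on shared bins with a small floor to avoid division by zero; feature names and values are synthetic:

```python
import numpy as np

def kl_2d(template_xy, target_xy, bins=20, eps=1e-10):
    """KL(target || template) from 2D histograms of (duration, feature) pairs."""
    both = np.vstack([template_xy, target_xy])
    edges = [np.histogram_bin_edges(both[:, i], bins=bins) for i in range(2)]
    p, _, _ = np.histogram2d(target_xy[:, 0], target_xy[:, 1], bins=edges)
    q, _, _ = np.histogram2d(template_xy[:, 0], template_xy[:, 1], bins=edges)
    p = p + eps
    q = q + eps
    p /= p.sum()
    q /= q.sum()
    return float(np.sum(p * np.log(p / q)))

rng = np.random.default_rng(5)
template = rng.normal([100.0, 3000.0], [20.0, 400.0], size=(500, 2))   # "pre" syllables
target = rng.normal([110.0, 3300.0], [25.0, 500.0], size=(500, 2))     # "post" syllables
print(f"KL distance: {kl_2d(template, target):.3f}")
```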
Schmidt, Kevin M.; Ellen, Stephen D.; Peterson, David M.
2014-01-01
To gain additional measurement of any permanent ground deformation that accompanied this damage, we compiled and conducted post-earthquake surveys along two 5-km lines of horizontal control and a 15-km level line. Measurements of horizontal distortion indicate approximately 0.1 m shortening in a NE-SW direction across the valley margin, similar to the amount measured in the channel lining. Evaluation of precise leveling by the National Geodetic Survey showed a downwarp, with an amplitude of >0.1 m over a span of >12 km, that resembled regional geodetic models of coseismic deformation. Although the leveling indicates broad, regional warping, abrupt discontinuities characteristic of faulting characterize both the broad-scale distribution of damage and the local deformation of the channel lining. Reverse movement, largely along preexisting faults and probably enhanced significantly by warping and by enhanced ground shaking, produced the documented coseismic ground deformation.
NASA Astrophysics Data System (ADS)
Liang, Jing; Yu, Jian-xing; Yu, Yang; Lam, W.; Zhao, Yi-yu; Duan, Jing-hui
2016-06-01
The energy transfer ratio is the basic factor affecting the level of pipe damage during an impact between a dropped object and a submarine pipe. To study the energy transfer and damage mechanism of submarine pipes impacted by dropped objects, a series of experiments was designed and carried out. The effective yield strength is deduced to make the quasi-static analysis more reliable, and a normal distribution of the energy transfer ratio caused by lateral impact on pipes is obtained by statistical analysis of the experimental results based on the effective yield strength, which provides an experimental and theoretical basis for the risk analysis of submarine pipe systems impacted by dropped objects. Failure strains of the pipe material are confirmed by comparing experimental results with finite element simulation. In addition, the impact contact area and impact time are shown by sensitivity analysis of the finite element simulation to be the major factors influencing energy transfer.
New feature extraction method for classification of agricultural products from x-ray images
NASA Astrophysics Data System (ADS)
Talukder, Ashit; Casasent, David P.; Lee, Ha-Woon; Keagy, Pamela M.; Schatzki, Thomas F.
1999-01-01
Classification of real-time x-ray images of randomly oriented touching pistachio nuts is discussed. The ultimate objective is the development of a system for automated non-invasive detection of defective product items on a conveyor belt. We discuss the extraction of new features that allow better discrimination between damaged and clean items. This feature extraction and classification stage is the new aspect of this paper; our new maximum representation and discriminating feature (MRDF) extraction method computes nonlinear features that are used as inputs to a new modified k nearest neighbor classifier. In this work the MRDF is applied to standard features. The MRDF is robust to various probability distributions of the input class and is shown to provide good classification and new ROC data.
Repair of clustered DNA damage caused by high LET radiation in human fibroblasts
NASA Technical Reports Server (NTRS)
Rydberg, B.; Lobrich, M.; Cooper, P. K.; Chatterjee, A. (Principal Investigator)
1998-01-01
It has recently been demonstrated experimentally that DNA damage induced by high LET radiation in mammalian cells is non-randomly distributed along the DNA molecule in the form of clusters of various sizes. The sizes of such clusters range from a few base-pairs to at least 200 kilobase-pairs. The high biological efficiency of high LET radiation for induction of relevant biological endpoints is probably a consequence of this clustering, although the exact mechanisms by which the clustering affects the biological outcome is not known. We discuss here results for induction and repair of base damage, single-strand breaks and double-strand breaks for low and high LET radiations. These results are discussed in the context of clustering. Of particular interest is to determine how clustering at different scales affects overall rejoining and fidelity of rejoining of DNA double-strand breaks. However, existing methods for measuring repair of DNA strand breaks are unable to resolve breaks that are close together in a cluster. This causes problems in interpretation of current results from high LET radiation and will require new methods to be developed.
NASA Astrophysics Data System (ADS)
Keiler, M.
2003-04-01
Reports on catastrophes with high damage caused by natural hazards seem to have increased in number recently. A new trend in dealing with these natural processes leads to the integration of risk into natural hazards evaluations and approaches of integral risk management. The risk resulting from natural hazards can be derived from the combination of parameters of physical processes (intensity and recurrence probability) and damage potential (probability of presence and expected damage value). Natural hazard research focuses mainly on the examination, modelling and estimation of individual geomorphological processes as well as on future developments caused by climate change. Even though damage potential has been taken into account more frequently, quantitative statements are still missing. Given the changes in socio-economic structures in mountain regions (urban sprawl, population growth, increased mobility and tourism), such studies are essential. This study presents a conceptual method that records the damage potential (probability of physical presence, evaluation of buildings) and shows the development of the damage potential resulting from avalanches since 1950. The study area is the community of Galtür, Austria. Thirty-six percent of the existing buildings are found in officially declared avalanche hazard zones. The majority of these buildings are either agricultural or accommodation facilities. Additionally, the effects of physical planning and/or technical measures on the spatial development of the potential damage are illustrated. The results serve to improve risk determination and point out an unnoticed increase of damage potential and risk in apparently safe settlement areas.
Meteorite falls in China and some related human casualty events
NASA Technical Reports Server (NTRS)
Yau, Kevin; Weissman, Paul; Yeomans, Donald
1994-01-01
Statistics of witnessed and recovered meteorite falls found in Chinese historical texts for the period from 700 B.C. to A.D. 1920 are presented. Several notable features can be seen in the binned distribution as a function of time. An apparent decrease in the number of meteorite reports in the 18th century is observed. An excess of observed meteorite falls in the period from 1840 to 1880 seems to correspond to a similar excess in European data. A chi-squared probability test suggests that the association between the two data sets is real. Records of human casualties and structural damage resulting from meteorite falls are also given. A calculation based on the number of casualty events in the Chinese meteorite records suggests that the probability of a meteorite striking a human is far greater than previous estimates. However, it is difficult to verify the accuracy of the reported casualty events.
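A minimal sketch of the kind of chi-squared association test mentioned above, comparing two binned series of fall counts; the decade counts below are hypothetical placeholders, not the Chinese or European data.

```python
# Sketch: chi-squared test of association between two binned series of
# meteorite-fall counts (e.g. per decade). Counts are hypothetical.
import numpy as np
from scipy import stats

series_a = np.array([12, 18, 25, 30, 22])   # falls per decade, record A
series_b = np.array([10, 15, 28, 27, 20])   # falls per decade, record B

table = np.vstack([series_a, series_b])     # 2 x 5 contingency table
chi2, p, dof, expected = stats.chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.3f}")
```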
Worth of data and natural disaster insurance
Attanasi, E.D.; Karlinger, M.R.
1979-01-01
The Federal Government in the past has provided medical and economic aid to victims of earthquakes and floods. However, regulating the use of hazard-prone areas would probably be more efficient. One way to implement such land use regulation is through the national flood and earthquake insurance program. Because insurance firms base their premium rates on available information, the benefits from additional data used to improve parameter estimates of the probability distribution (governing actual disaster events) can be measured by the changes in premiums as a function of the additional data. An insurance firm is assumed to set rates so as to trade off the penalties of overestimating and underestimating expected damages. A Bayesian preposterior analysis is applied to determine the worth of additional data, as measured by changes in consumers' surplus, by examining the effects of changes in premiums as a function of a longer hydrologic record.
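One way to read the premium-setting idea is sketched below: the premium is chosen to minimize an asymmetric expected penalty for over- and under-estimating annual damages over a posterior sample of damage estimates, and the worth of extra data is then the change in this premium as the posterior sharpens. The distribution, penalty weights and search grid are hypothetical and do not reproduce the authors' preposterior analysis.

```python
# Sketch: choose the premium minimizing an asymmetric expected penalty for
# over- vs under-estimating annual damages. All numbers are hypothetical.
import numpy as np

rng = np.random.default_rng(0)
posterior_damage = rng.lognormal(mean=np.log(1e6), sigma=0.4, size=5000)  # $/yr

def expected_penalty(premium, damages, c_under=3.0, c_over=1.0):
    under = np.maximum(damages - premium, 0.0)   # premium set too low
    over = np.maximum(premium - damages, 0.0)    # premium set too high
    return np.mean(c_under * under + c_over * over)

grid = np.linspace(0.5e6, 2.5e6, 401)
best = grid[np.argmin([expected_penalty(p, posterior_damage) for p in grid])]
print(f"premium minimizing expected penalty: ${best:,.0f}")
```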
Flies, worms and the Free Radical Theory of ageing.
Clancy, David; Birdsall, John
2013-01-01
Drosophila and Caenorhabditis elegans have provided the largest body of evidence addressing the Free Radical Theory of ageing; however, the evidence has not been unequivocally supportive. Oxidative damage to DNA is probably not a major contributor, damage to lipids is assuming greater importance, and damage to proteins is probably the source of pathology. On balance the evidence does not support a primary role of oxidative damage in ageing in C. elegans, perhaps because of its particular energy metabolic and stress resistance profile. Evidence is more numerous, varied and consistent, and hence more compelling, for Drosophila, although not conclusive. However, there is good evidence for a role of oxidative damage in later-life pathology. Future work should: 1/ make more use of protein oxidative damage measurements; 2/ use inducible transgenic systems or pharmacotherapy to ensure genetic equivalence of controls and avoid confounding effects during development; 3/ to try to delay ageing, target interventions which reduce and/or repair protein oxidative damage. Crown Copyright © 2012. Published by Elsevier B.V. All rights reserved.
NASA Technical Reports Server (NTRS)
Marshall, C. J.; Ladbury, R.; Marshall, P. W.; Reed, R. A.; Howe, C.; Weller, B.; Mendenhall, M.; Waczynski, A.; Jordan, T. M.; Fodness, B.
2006-01-01
This paper presents a combined Monte Carlo and analytic approach to the calculation of the pixel-to-pixel distribution of proton-induced damage in a HgCdTe sensor array and compares the results to measured dark current distributions after damage by 63 MeV protons. The moments of the Coulombic, nuclear elastic and nuclear inelastic damage distributions were extracted from Monte Carlo simulations and combined to form a damage distribution using the analytic techniques first described in [1]. The calculations show that the high energy recoils from the nuclear inelastic reactions (calculated using the Monte Carlo code MCNPX [2]) produce a pronounced skewing of the damage energy distribution. The nuclear elastic component (also calculated using MCNPX) has a negligible effect on the shape of the damage distribution. The Coulombic contribution was calculated using MRED [3,4], a Geant4 [4,5] application. The comparison with the dark current distribution strongly suggests that mechanisms which are not linearly correlated with nonionizing damage produced according to collision kinematics are responsible for the observed dark current increases. This has important implications for the process of predicting the on-orbit dark current response of the HgCdTe sensor array.
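As a hedged illustration of how per-pixel moments can be combined: if the Coulombic, nuclear elastic and nuclear inelastic contributions to a pixel are treated as independent, their cumulants (mean, variance, third central moment) simply add. The numbers below are hypothetical, and this is only an illustration of moment combination, not the analytic technique of reference [1].

```python
# Sketch: cumulants of independent damage contributions add per pixel.
# The component moments below are hypothetical placeholders.
mean = {"coulomb": 120.0, "elastic": 5.0, "inelastic": 40.0}     # keV per pixel
var  = {"coulomb": 900.0, "elastic": 25.0, "inelastic": 6400.0}
mu3  = {"coulomb": 1.0e4, "elastic": 1.0e2, "inelastic": 2.0e6}  # third central moment

tot_mean = sum(mean.values())
tot_var = sum(var.values())
tot_mu3 = sum(mu3.values())
skewness = tot_mu3 / tot_var**1.5        # inelastic recoils dominate the skew
print(tot_mean, tot_var, round(skewness, 3))
```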
NASA Astrophysics Data System (ADS)
Doualle, T.; Gallais, L.; Cormont, P.; Donval, T.; Lamaignère, L.; Rullier, J. L.
2016-06-01
We investigate the effect of different heat treatments on the laser-induced damage probabilities of fused silica samples. Isothermal annealing in a furnace is applied, with different temperatures in the range 700-1100 °C and 12 h annealing time, to super-polished fused silica samples. The surface flatness and laser damage probabilities at 3 ns, 351 nm are measured before and after the different annealing procedures. We have found a significant improvement of the initial laser damage probabilities of the silica surface after annealing at 1050 °C for 12 h. A similar study has been conducted on CO2 laser-processed sites on the surface of the samples. Before and after annealing, we have studied the morphology of the sites, the evolution of residual stress, and the laser-induced damage threshold measured at 351 nm, 3 ns. In this case, we observe that the laser damage resistance of the laser-created craters can reach the damage level of the bare fused silica surface after the annealing process, with complete stress relief. The obtained results are then compared to the case of a local annealing process by CO2 laser irradiation for 1 s, and we found similar improvements in both cases. The different results obtained in the study are compared to numerical simulations made with a thermo-mechanical model based on the finite-element method that allows the simulation of the isothermal or the local annealing process and the evolution of stress and fictive temperature. The simulation results were found to be very consistent with experimental observations for the evolution of stresses after annealing and the estimation of the heat-affected area during laser processing based on the dependence of density on fictive temperature. Following this work, the temperature for local annealing should reach 1330-1470 °C for an optimized reduction of damage probability and be below the threshold for material removal, whereas furnace annealing should be kept below the annealing point to avoid sample deformation.
NASA Astrophysics Data System (ADS)
Aller, D.; Hohl, R.; Mair, F.; Schiesser, H.-H.
2003-04-01
Extreme hailfall can cause massive damage to building structures. For the insurance and reinsurance industry it is essential to estimate the probable maximum hail loss of their portfolio. The probable maximum loss (PML) is usually defined with a return period of 1 in 250 years. Statistical extrapolation has a number of critical points, as historical hail loss data are usually only available from some events while insurance portfolios change over the years. At the moment, footprints are derived from historical hail damage data. These footprints (mean damage patterns) are then moved over a portfolio of interest to create scenario losses. However, damage patterns of past events are based on the specific portfolio that was damaged during that event and can be considerably different from the current spread of risks. A new method for estimating the probable maximum hail loss to a building portfolio is presented. It is shown that footprints derived from historical damages are different to footprints of hail kinetic energy calculated from radar reflectivity measurements. Based on the relationship between radar-derived hail kinetic energy and hail damage to buildings, scenario losses can be calculated. A systematic motion of the hail kinetic energy footprints over the underlying portfolio creates a loss set. It is difficult to estimate the return period of losses calculated by moving around footprints derived from historical damage. To determine the return periods of the hail kinetic energy footprints over Switzerland, 15 years of radar measurements and 53 years of agricultural hail losses are available. Based on these data, return periods of several types of hailstorms were derived for different regions in Switzerland. The loss set is combined with the return periods of the event set to obtain an exceedance frequency curve, which can be used to derive the PML.
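A minimal sketch of the final step described above: building a loss exceedance-frequency curve from scenario losses and their annual occurrence rates, then reading off the PML at a 1-in-250-year return period. All scenario values are hypothetical.

```python
# Sketch: loss exceedance-frequency curve and PML at a 250-year return period.
# Scenario losses and annual rates are hypothetical placeholders.
import numpy as np

losses = np.array([5, 12, 20, 35, 60, 90, 150, 240])   # loss per scenario (million)
rates  = np.array([0.2, 0.1, 0.05, 0.02, 0.01, 0.006, 0.003, 0.0015])  # events/yr

order = np.argsort(losses)[::-1]                 # sort losses descending
exceed_rate = np.cumsum(rates[order])            # annual rate of exceeding each loss
return_period = 1.0 / exceed_rate

target = 250.0
pml = np.interp(np.log(target), np.log(return_period[::-1]), losses[order][::-1])
print(f"PML (1 in 250 yr) ~ {pml:.0f} million")
```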
Pignata, M L; Gudiño, G L; Wannaz, E D; Plá, R R; González, C M; Carreras, H A; Orellana, L
2002-01-01
The atmospheric quality and distribution of heavy metals were evaluated throughout a wide region of Argentina. In addition, the biomonitor performance of Tillandsia capillaris Ruiz & Pav. f. capillaris was studied in relation to the accumulation of heavy metals and to its physiologic response to air pollutants. A sampling area of 50,000 km2 was selected in the central region of the Argentine Republic. This area was subdivided into grids of 25 x 25 km. Pools of T. capillaris, where present, were collected at each intersection point. From each pool three sub-samples were analyzed independently. Furthermore, five replicates were collected at 20% of the points in order to analyze the variability within the site. The content of Co, Cu, Fe, Ni, Mn, Pb and Zn was determined by Atomic Absorption Spectrometry. Chemical-physiological parameters were also determined to detect symptoms of foliar damage. Chlorophylls, phaeophytins, hydroperoxy conjugated dienes, malondialdehyde and sulfur were quantified in T. capillaris. Some of these parameters were used to calculate a foliar damage index. Data sets were evaluated by one-way ANOVA, correlation analysis, principal component analysis and mapping. Geographical distribution patterns were obtained for the different metals reflecting the contribution of natural and anthropogenic emission sources. According to our results it can be inferred that Fe, Mn and Co probably originated in the soil. For Pb, the highest values were found in the mountainous area, which can be attributed to the presence of Pb in the granitic rocks. Ni showed mainly an anthropogenic origin, with higher values found in places next to industrial centers. For Zn the highest values were in areas of agricultural development. The same was observed for Cu, whose presence could be related to the employment of pesticides. The foliar damage index distribution map showed that the central and southeastern zones were the ones where the major damage in the bioindicator was found. The central zone coincides with the city of Córdoba whereas the southeastern area is strictly agricultural, so the high values found there could be related to the use of pesticides.
Warship Combat System Selection Methodology Based on Discrete Event Simulation
2010-09-01
PD: Damage Probability; PHit: Hit Probability; PKill: Kill Probability; RSM: Response Surface Model; SAM: Surface-to-Air Missile. ...such a large target allows an assumption that the probability of a hit (PHit) is one. This structure can be considered as a bridge; therefore, the...
Reliability and Creep/Fatigue Analysis of a CMC Component
NASA Technical Reports Server (NTRS)
Murthy, Pappu L. N.; Mital, Subodh K.; Gyekenyesi, John Z.; Gyekenyesi, John P.
2007-01-01
High temperature ceramic matrix composites (CMC) are being explored as viable candidate materials for hot section gas turbine components. These advanced composites can potentially lead to reduced weight and enable higher operating temperatures requiring less cooling; thus leading to increased engine efficiencies. There is a need for convenient design tools that can accommodate various loading conditions and material data with their associated uncertainties to estimate the minimum predicted life as well as the failure probabilities of a structural component. This paper presents a review of the life prediction and probabilistic analyses performed for a CMC turbine stator vane. A computer code, NASALife, is used to predict the life of a 2-D woven silicon carbide fiber reinforced silicon carbide matrix (SiC/SiC) turbine stator vane due to a mission cycle which induces low cycle fatigue and creep. The output from this program includes damage from creep loading, damage due to cyclic loading and the combined damage due to the given loading cycle. Results indicate that the trends predicted by NASALife are as expected for the loading conditions used for this study. In addition, a combination of woven composite micromechanics, finite element structural analysis and Fast Probability Integration (FPI) techniques has been used to evaluate the maximum stress and its probabilistic distribution in a CMC turbine stator vane. Input variables causing scatter are identified and ranked based upon their sensitivity magnitude. Results indicate that reducing the scatter in proportional limit strength of the vane material has the greatest effect in improving the overall reliability of the CMC vane.
Influence of pitting defects on quality of high power laser light field
NASA Astrophysics Data System (ADS)
Ren, Huan; Zhang, Lin; Yang, Yi; Shi, Zhendong; Ma, Hua; Jiang, Hongzhen; Chen, Bo; Yang, XiaoYu; Zheng, Wanguo; Zhu, Rihong
2018-01-01
Using the split-step Fourier-transform method to solve the nonlinear paraxial wave equation, the intensity distribution of the light field as the pit diameter or depth changes is obtained by numerical simulation, including the intensity distribution inside the optical element, the beam near field, the field at different distances behind the element, and the beam far field. Results show that as the pit diameter or depth increases, the peak intensity and contrast of the light field inside the element increase correspondingly. The contrast of the intensity distribution at the rear surface of the element increases slightly. The peak intensity produced at a specific downstream location by the thermal effect continues to increase, so the damage probability of optics placed there is greatly increased. For the far-field intensity distribution, increasing the pit diameter or depth changes the focal-spot intensity distribution, and the energy in the central region of the spectrum increases steadily. This work provides a basis for the quantitative design and inspection of pitting defects and a reference for the design of the optical path arrangement.
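A minimal one-dimensional sketch of the split-step Fourier propagation idea, with a surface pit modelled as a local phase defect and a weak Kerr-type nonlinear phase. The grid, wavelength, pit size and nonlinear coefficient are hypothetical and chosen only for illustration; the paper's simulations are more complete.

```python
# Minimal 1-D split-step Fourier sketch: paraxial propagation past a small
# pit modelled as a phase defect, with a weak Kerr term. Parameters are
# hypothetical placeholders.
import numpy as np

N, L = 2048, 10e-3                      # samples, transverse window (m)
x = np.linspace(-L/2, L/2, N, endpoint=False)
dx = x[1] - x[0]
lam = 351e-9                            # wavelength (m)
k0 = 2*np.pi/lam
kx = 2*np.pi*np.fft.fftfreq(N, d=dx)    # angular spatial frequencies

E = np.exp(-(x/2e-3)**2)                # input Gaussian beam
pit = 0.5 * np.exp(-(x/20e-6)**2)       # pit phase defect (rad), ~40 um wide
E = E * np.exp(1j*pit)

dz, steps, n2k = 1e-3, 200, 1e-4        # step (m), step count, weak Kerr coeff.
H = np.exp(-1j * kx**2 / (2*k0) * dz)   # Fresnel propagator for one step
for _ in range(steps):
    E = np.fft.ifft(H * np.fft.fft(E))          # linear half of the split step
    E = E * np.exp(1j * n2k * np.abs(E)**2 * dz)  # nonlinear (Kerr) phase

I = np.abs(E)**2
print("peak/mean intensity contrast after 0.2 m:", I.max()/I.mean())
```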
Hardmeyer, Kent; Spencer, Michael A
2007-04-01
This article provides an overview of the use of risk-based analysis (RBA) in flood damage assessment, and it illustrates the use of Geographic Information Systems (GIS) in identifying flood-prone areas, which can aid in flood-mitigation planning assistance. We use RBA to calculate expected annual flood damages in an urban watershed in the state of Rhode Island, USA. The method accounts for the uncertainty in the three primary relationships used in computing flood damage: (1) the probability that a given flood will produce a given amount of floodwater, (2) the probability that a given amount of floodwater will reach a certain stage or height, and (3) the probability that a certain stage of floodwater will produce a given amount of damage. A greater than 50% increase in expected annual flood damage is estimated for the future if previous development patterns continue and flood-mitigation measures are not taken. GIS is then used to create a map that shows where and how often floods might occur in the future, which can help (1) identify priority areas for flood-mitigation planning assistance and (2) disseminate information to public officials and other decision-makers.
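The core quantity in such a risk-based analysis can be sketched as the expected annual damage, i.e. the integral of damage over annual exceedance probability; a full RBA additionally carries uncertainty through the discharge-stage-damage relationships. The probability-damage pairs below are hypothetical.

```python
# Sketch: expected annual damage (EAD) as the integral of damage over annual
# exceedance probability. Values are hypothetical placeholders.
import numpy as np

aep    = np.array([0.5, 0.2, 0.1, 0.04, 0.02, 0.01, 0.002])  # annual exceedance prob.
damage = np.array([0.0, 0.1, 0.6, 2.0,  4.5,  8.0,  20.0])   # damage ($ million)

# trapezoidal integration of damage with respect to exceedance probability
widths = aep[:-1] - aep[1:]
ead = np.sum(0.5 * (damage[:-1] + damage[1:]) * widths)
print(f"expected annual damage ~ ${ead:.2f} million")
```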
Szőke, István; Farkas, Arpád; Balásházy, Imre; Hofmann, Werner; Madas, Balázs G; Szőke, Réka
2012-06-01
The primary objective of this paper was to investigate the distribution of radiation doses and the related biological responses in cells of a central airway bifurcation of the human lung of a hypothetical worker of the New Mexico uranium mines during approximately 12 hours of exposure to short-lived radon progenies. State-of-the-art computational modelling techniques were applied to simulate the relevant biophysical and biological processes in a central human airway bifurcation. The non-uniform deposition pattern of inhaled radon daughters caused a non-uniform distribution of energy deposition among cells, and of related cell inactivation and cell transformation probabilities. When damage propagation via bystander signalling was assessed, it produced more cell killing and cell transformation events than did direct effects. If bystander signalling was considered, variations of the average probabilities of cell killing and cell transformation were supra-linear over time. Our results are very sensitive to the radiobiological parameters, derived from in vitro experiments (e.g., range of bystander signalling), applied in this work and suggest that these parameters may not be directly applicable to realistic three-dimensional (3D) epithelium models.
Earthquake outlook for the San Francisco Bay region 2014–2043
Aagaard, Brad T.; Blair, James Luke; Boatwright, John; Garcia, Susan H.; Harris, Ruth A.; Michael, Andrew J.; Schwartz, David P.; DiLeo, Jeanne S.; Jacques, Kate; Donlin, Carolyn
2016-06-13
Using information from recent earthquakes, improved mapping of active faults, and a new model for estimating earthquake probabilities, the 2014 Working Group on California Earthquake Probabilities updated the 30-year earthquake forecast for California. They concluded that there is a 72 percent probability (or likelihood) of at least one earthquake of magnitude 6.7 or greater striking somewhere in the San Francisco Bay region before 2043. Earthquakes this large are capable of causing widespread damage; therefore, communities in the region should take simple steps to help reduce injuries, damage, and disruption, as well as accelerate recovery from these earthquakes.
Mingguang, Zhang; Juncheng, Jiang
2008-10-30
Overpressure is one important cause of the domino effect in accidents involving chemical process equipment. Damage probability and the relative threshold value are two necessary parameters in QRA of this phenomenon. Some simple models had been proposed based on scarce data or oversimplified assumptions. Hence, more data about damage to chemical process equipment were gathered and analyzed, a quantitative relationship between damage probability and damage degree of equipment was built, and reliable probit models were developed for specific categories of chemical process equipment. Finally, the improvements of the present models over other models in the literature were demonstrated through comparison, taking into account parameters such as consistency between models and data and the depth of quantitative treatment in QRA.
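A hedged sketch of a probit dose-response relationship of the classical form Y = a + b ln(P), with damage probability Phi(Y - 5); the coefficients a and b below are hypothetical, not the equipment-specific values fitted in the paper.

```python
# Probit sketch: Y = a + b*ln(P), damage probability = Phi(Y - 5).
# The coefficients a, b are hypothetical, not the paper's fitted values.
import numpy as np
from scipy.stats import norm

def damage_probability(overpressure_kpa, a=-2.8, b=2.0):
    """Probability of equipment damage at a given peak overpressure (kPa)."""
    Y = a + b * np.log(overpressure_kpa)      # probit value
    return norm.cdf(Y - 5.0)

for p_kpa in (10, 30, 100, 300):
    print(p_kpa, "kPa ->", round(damage_probability(p_kpa), 3))
```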
NASA Technical Reports Server (NTRS)
Marshall, C. J.; Marshall, P. W.; Howe, C. L.; Reed, R. A.; Weller, R. A.; Mendenhall, M.; Waczynski, A.; Ladbury, R.; Jordan, T. M.
2007-01-01
This paper presents a combined Monte Carlo and analytic approach to the calculation of the pixel-to-pixel distribution of proton-induced damage in a HgCdTe sensor array and compares the results to measured dark current distributions after damage by 63 MeV protons. The moments of the Coulombic, nuclear elastic and nuclear inelastic damage distributions were extracted from Monte Carlo simulations and combined to form a damage distribution using the analytic techniques first described in [1]. The calculations show that the high energy recoils from the nuclear inelastic reactions (calculated using the Monte Carlo code MCNPX [2]) produce a pronounced skewing of the damage energy distribution. While the nuclear elastic component (also calculated using the MCNPX) contributes only a small fraction of the total nonionizing damage energy, its inclusion in the shape of the damage across the array is significant. The Coulombic contribution was calculated using MRED [3-5], a Geant4 [4,6] application. The comparison with the dark current distribution strongly suggests that mechanisms which are not linearly correlated with nonionizing damage produced according to collision kinematics are responsible for the observed dark current increases. This has important implications for the process of predicting the on-orbit dark current response of the HgCdTe sensor array.
Sequential structural damage diagnosis algorithm using a change point detection method
NASA Astrophysics Data System (ADS)
Noh, H.; Rajagopal, R.; Kiremidjian, A. S.
2013-11-01
This paper introduces a damage diagnosis algorithm for civil structures that uses a sequential change point detection method. The general change point detection method uses the known pre- and post-damage feature distributions to perform a sequential hypothesis test. In practice, however, the post-damage distribution is unlikely to be known a priori, unless we are looking for a known specific type of damage. Therefore, we introduce an additional algorithm that estimates and updates this distribution as data are collected using the maximum likelihood and the Bayesian methods. We also applied an approximate method to reduce the computation load and memory requirement associated with the estimation. The algorithm is validated using a set of experimental data collected from a four-story steel special moment-resisting frame and multiple sets of simulated data. Various features of different dimensions have been explored, and the algorithm was able to identify damage, particularly when it uses multidimensional damage sensitive features and lower false alarm rates, with a known post-damage feature distribution. For unknown feature distribution cases, the post-damage distribution was consistently estimated and the detection delays were only a few time steps longer than the delays from the general method that assumes we know the post-damage feature distribution. We confirmed that the Bayesian method is particularly efficient in declaring damage with minimal memory requirement, but the maximum likelihood method provides an insightful heuristic approach.
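A simplified sketch of the sequential change-detection idea on a Gaussian damage-sensitive feature: a CUSUM-like log-likelihood-ratio statistic in which the unknown post-damage mean is crudely re-estimated online from recent data. Thresholds, means and the simulated feature stream are hypothetical, and this is not the authors' exact algorithm.

```python
# Simplified sketch: sequential change detection on a Gaussian feature with an
# online-estimated post-damage mean. All parameters are hypothetical.
import numpy as np

rng = np.random.default_rng(1)
pre = rng.normal(0.0, 1.0, 200)            # pre-damage feature samples
post = rng.normal(1.5, 1.0, 100)           # post-damage samples (mean shift)
stream = np.concatenate([pre, post])

mu0, sigma, threshold = 0.0, 1.0, 10.0
S, recent = 0.0, []
for t, x in enumerate(stream):
    recent.append(x)
    mu1 = max(np.mean(recent[-20:]), mu0 + 0.5)           # crude post-damage estimate
    llr = ((x - mu0)**2 - (x - mu1)**2) / (2 * sigma**2)  # log-likelihood ratio
    S = max(0.0, S + llr)                                 # CUSUM recursion
    if S > threshold:
        print(f"damage declared at sample {t}")
        break
```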
Online monitoring of seismic damage in water distribution systems
NASA Astrophysics Data System (ADS)
Liang, Jianwen; Xiao, Di; Zhao, Xinhua; Zhang, Hongwei
2004-07-01
It is shown that water distribution systems can be damaged by earthquakes, and the seismic damages cannot easily be located, especially immediately after the events. Earthquake experiences show that accurate and quick location of seismic damage is critical to emergency response of water distribution systems. This paper develops a methodology to locate seismic damage -- multiple breaks in a water distribution system by monitoring water pressure online at limited positions in the water distribution system. For the purpose of online monitoring, supervisory control and data acquisition (SCADA) technology can well be used. A neural network-based inverse analysis method is constructed for locating the seismic damage based on the variation of water pressure. The neural network is trained by using analytically simulated data from the water distribution system, and validated by using a set of data that have never been used in the training. It is found that the methodology provides an effective and practical way in which seismic damage in a water distribution system can be accurately and quickly located.
A global assessment of the societal impacts of glacier outburst floods
NASA Astrophysics Data System (ADS)
Carrivick, Jonathan L.; Tweed, Fiona S.
2016-09-01
Glacier outburst floods are sudden releases of large amounts of water from a glacier. They are a pervasive natural hazard worldwide. They have an association with climate primarily via glacier mass balance and their impacts on society partly depend on population pressure and land use. Given the ongoing changes in climate and land use and population distributions there is therefore an urgent need to discriminate the spatio-temporal patterning of glacier outburst floods and their impacts. This study presents data compiled from 20 countries and comprising 1348 glacier floods spanning 10 centuries. Societal impacts were assessed using a relative damage index based on recorded deaths, evacuations, and property and infrastructure destruction and disruption. These floods originated from 332 sites; 70% were from ice-dammed lakes and 36% had recorded societal impact. The number of floods recorded has apparently reduced since the mid-1990s in all major world regions. Two thirds of sites that have produced > 5 floods (n = 32) have floods occurring progressively earlier in the year. Glacier floods have directly caused at least: 7 deaths in Iceland, 393 deaths in the European Alps, 5745 deaths in South America and 6300 deaths in central Asia. Peru, Nepal and India have experienced fewer floods yet higher levels of damage. One in five sites in the European Alps has produced floods that have damaged farmland, destroyed homes and damaged bridges; 10% of sites in South America have produced glacier floods that have killed people and damaged infrastructure; 15% of sites in central Asia have produced floods that have inundated farmland, destroyed homes, damaged roads and damaged infrastructure. Overall, Bhutan and Nepal have the greatest national-level economic consequences of glacier flood impacts. We recommend that accurate, full and standardised monitoring, recording and reporting of glacier floods is essential if spatio-temporal patterns in glacier flood occurrence, magnitude and societal impact are to be better understood. We note that future modelling of the global impact of glacier floods cannot assume that the same trends will continue and will need to consider combining land-use change with probability distributions of geomorphological responses to climate change and to human activity.
Tabassum, Samiya; Leishman, Michelle R
2018-02-01
Release from natural enemies is often cited as a key factor for understanding the success of invasive plant species in novel environments. However, with time invasive species will accumulate native enemies in their invaded range, with factors such as spread distance from the site of introduction, climate and leaf-level traits potentially affecting enemy acquisition rates. However, the influence of such factors is difficult to assess without examining enemy attack across the entire species' range. We tested the significance of factors associated with range expansion (distance from source population and maximum population density), climatic variables (annual temperature and rainfall) and leaf-level traits [specific leaf area (SLA) and foliar nitrogen concentration] in explaining variation in enemy damage across multiple populations of two coastal invasive plants (Gladiolus gueinzii Kunze and Hydrocotyle bonariensis Lam.) along their entire introduced distribution in eastern Australia. We found that for H. bonariensis, amount of foliar damage increased with distance from source population. In contrast, for G. gueinzii, probability and amount of foliar damage decreased with decreasing temperature and increasing rainfall, respectively. Our results show that patterns of enemy attack across species' ranges are complex and cannot be generalised between species or even range edges.
NASA Astrophysics Data System (ADS)
Eleftheriadou, Anastasia K.; Baltzopoulou, Aikaterini D.; Karabinis, Athanasios I.
2016-06-01
The current seismic risk assessment is based on two discrete approaches, actual and probable, validating afterwards the produced results. In the first part of this research, the seismic risk is evaluated from the available data regarding the mean statistical repair/strengthening or replacement cost for the total number of damaged structures (180,427 buildings) after the 7/9/1999 Parnitha (Athens) earthquake. The actual evaluated seismic risk is afterwards compared to the estimated probable structural losses, which is presented in the second part of the paper, based on a damage scenario in the referring earthquake. The applied damage scenario is based on recently developed damage probability matrices (DPMs) from Athens (Greece) damage database. The seismic risk estimation refers to 750,085 buildings situated in the extended urban region of Athens. The building exposure is categorized in five typical structural types and represents 18.80 % of the entire building stock in Greece. The last information is provided by the National Statistics Service of Greece (NSSG) according to the 2000-2001 census. The seismic input is characterized by the ratio a_g/a_o, where a_g is the regional peak ground acceleration (PGA), which is evaluated from the earlier estimated research macroseismic intensities, and a_o is the PGA according to the hazard map of the 2003 Greek Seismic Code. Finally, the collected investigated financial data derived from different National Services responsible for the post-earthquake crisis management concerning the repair/strengthening or replacement costs or other categories of costs for the rehabilitation of earthquake victims (construction and function of settlements for earthquake homeless, rent supports, demolitions, shorings) are used to determine the final total seismic risk factor.
Wang, Gaohong; Deng, Songqiang; Li, Cheng; Liu, Yongding; Chen, Lanzhou; Hu, Chaozhen
2012-07-01
UV-B radiation increased the damage to DNA in Scytonema javanicum, a desert-dwelling soil microorganism, and the level of damage varied with the intensity of UV-B radiation and duration of exposure. Production of reactive oxygen species (ROS) also increased because of the radiation. Different exogenous chemicals (ascorbic acid, ASC; N-acetylcysteine, NAC; glyphosate, GPS; and 2-methyl-4-chlorophenoxyacetic acid, MCPA-Na) differed in their effect on the extent of DNA damage and ROS production: whereas NAC and ASC protected the DNA from damage and resulted in reduced ROS production, the herbicides (GPS and MCPA-Na) increased the extent of damage, lowered the rate of photosynthesis, and differed in their effect on ROS production. The chemicals probably have different mechanisms by which they exert their effects: NAC and ASC probably function as antioxidant agents or as precursors of other antioxidant molecules that protect the DNA and photosynthetic apparatus directly from the ROS produced as a result of UV-B radiation, and GPS and MCPA-Na probably disrupt the normal metabolism in S. javanicum to induce the leaking of ROS into the photosynthetic electron transfer pathway following UV-B radiation, and thereby damage the DNA. Such mechanisms have serious implications for the use of environment-friendly herbicides, which, because they can destroy DNA, may prove harmful to soil microorganisms. Copyright © 2012 Elsevier Ltd. All rights reserved.
Yang, Yang; Xiao, Li; Qu, Wenzhong; Lu, Ye
2017-11-01
Recent theoretical and experimental studies have demonstrated that a local Green's function can be retrieved from the cross-correlation of an ambient noise field. This technique can be used to detect fatigue cracking in metallic structures, owing to the fact that the presence of a crack leads to a change in the Green's function. This paper presents a method for characterizing structural fatigue cracking by reconstructing Green's functions from noise excitation and verifies the feasibility of crack detection under poor noise source distributions. Fatigue cracks usually generate nonlinear effects, in which different wave amplitudes and frequency compositions cause different nonlinear responses. This study also analyses the capacity of the proposed approach to identify fatigue cracking under different noise amplitudes and frequency ranges. Experimental investigations of an aluminum plate are conducted to assess the cross-correlations of received noise between sensor pairs and finally to detect the introduced fatigue crack. A damage index is proposed according to the variation between cross-correlations obtained from the pristine crack-closed state and the crack opening-closure state when sufficient noise amplitude is used to generate nonlinearity. A probability distribution map of damage is calculated based on the damage indices. The fatigue crack introduced in the aluminum plate is successfully identified and oriented, verifying that a fatigue crack can be detected by reconstructing Green's functions from an imperfect diffuse field in which ambient noise sources exist locally. Copyright © 2017 Elsevier B.V. All rights reserved.
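A minimal sketch of the damage-index idea: cross-correlate synthetic noise records from a sensor pair in a baseline state and in a mildly distorted (crack-like) state, then score the deviation between the two cross-correlations. The signals, the distortion model and the index definition are hypothetical illustrations, not the paper's experimental procedure.

```python
# Sketch: damage index from the change of the noise cross-correlation between
# a baseline state and a distorted (crack-like) state. Signals are synthetic.
import numpy as np

rng = np.random.default_rng(2)
noise = rng.normal(size=4096)

def record_pair(delay_samples, distortion=0.0):
    """Two 'sensor' records of the same noise field; the second is delayed and
    optionally mildly distorted to mimic crack-induced nonlinearity."""
    a = noise
    b = np.roll(noise, delay_samples)
    return a, b + distortion * b**2

def xcorr(a, b):
    c = np.correlate(a - a.mean(), b - b.mean(), mode="full")
    return c / np.linalg.norm(c)

baseline = xcorr(*record_pair(50, distortion=0.0))
current = xcorr(*record_pair(50, distortion=0.05))

damage_index = 1.0 - np.dot(baseline, current)   # 0 when correlations are identical
print(f"damage index = {damage_index:.4f}")
```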
NASA Technical Reports Server (NTRS)
Ponomarev, A. L.; Huff, J. L.; Cucinotta, F. A.
2011-01-01
Future long-term space travel will face challenges from radiation concerns, as the space environment poses health risks to humans from radiation with high biological effectiveness and adverse post-flight long-term effects. Solar particle events may dramatically affect crew performance, while Galactic Cosmic Rays will induce a chronic exposure to high-linear-energy-transfer (LET) particles. These types of radiation, not present at ground level, can increase the probability of a fatal cancer later in an astronaut's life. No feasible shielding is possible from radiation in space, especially for the heavy ion component, as suggested solutions would require a dramatic increase in the mass of the mission. Our research group focuses on fundamental research and strategic analysis leading to better shielding design and to better understanding of the biological mechanisms of radiation damage. We present our recent effort to model DNA damage and tissue damage using computational models based on the physics of heavy ion radiation, DNA structure, and DNA damage and repair in human cells. Our particular areas of expertise include the clustered DNA damage from high-LET radiation, the visualization of DSBs (DNA double strand breaks) via DNA damage foci, image analysis and the statistics of the foci for different experimental situations, chromosomal aberration formation through DSB misrepair, the kinetics of DSB repair leading to a model-derived spectrum of chromosomal aberrations, and, finally, the simulation of human tissue and the pattern of apoptotic cell damage. This compendium of theoretical and experimental data sheds light on the complex nature of radiation interacting with human DNA, cells and tissues, which can lead to mutagenesis and carcinogenesis later in human life after the space mission.
The microscopic basis for strain localisation in porous media
NASA Astrophysics Data System (ADS)
Main, Ian; Kun, Ferenz; Pal, Gergo; Janosi, Zoltan
2017-04-01
The spontaneous emergence of localized cooperative deformation is an important phenomenon in the development of shear faults in porous media. It can be studied by empirical observation, by laboratory experiment or by numerical simulation. Here we investigate the evolution of damage and fragmentation leading up to and including system-sized failure in a numerical model of a porous rock, using discrete element simulations of the strain-controlled uni-axial compression of cylindrical samples of different finite size. As the system approaches macroscopic failure the number of fractures and the energy release rate both increase as a time-reversed Omori law, with scaling constants for the frequency-size distribution and the inter-event time, including their temporal evolution, that closely resemble those of natural experiments. The damage progressively localizes in a narrow shear band, ultimately a fault 'gouge' containing a large number of poorly-sorted non-cohesive fragments on a broad bandwidth of scales, with properties similar to those of natural and experimental faults. We determine the position and orientation of the central fault plane, the width of the deformation band and the spatial and mass distribution of fragments. The relative width of the deformation band decreases as a power law of the system size and the probability distribution of the angle of the damage plane converges to around 30 degrees, representing an emergent internal coefficient of friction of 0.7 or so. The mass of fragments is power law distributed, with an exponent that does not depend on scale, and is near that inferred for experimental and natural fault gouges. The fragments are in general angular, with a clear self-affine geometry. The consistency of this model with experimental and field results confirms the critical roles of preexisting heterogeneity, elastic interactions, and finite system size to grain size ratio on the development of faults, and ultimately to assessing the predictive power of forecasts of failure time in such media.
Nucleation, growth and localisation of microcracks: implications for predictability of rock failure
NASA Astrophysics Data System (ADS)
Main, I. G.; Kun, F.; Pál, G.; Jánosi, Z.
2016-12-01
The spontaneous emergence of localized co-operative deformation is an important phenomenon in the development of shear faults in porous media. It can be studied by empirical observation, by laboratory experiment or by numerical simulation. Here we investigate the evolution of damage and fragmentation leading up to and including system-sized failure in a numerical model of a porous rock, using discrete element simulations of the strain-controlled uniaxial compression of cylindrical samples of different finite size. As the system approaches macroscopic failure the number of fractures and the energy release rate both increase as a time-reversed Omori law, with scaling constants for the frequency-size distribution and the inter-event time, including their temporal evolution, that closely resemble those of natural experiments. The damage progressively localizes in a narrow shear band, ultimately a fault 'gouge' containing a large number of poorly-sorted non-cohesive fragments on a broad bandwidth of scales, with properties similar to those of natural and experimental faults. We determine the position and orientation of the central fault plane, the width of the deformation band and the spatial and mass distribution of fragments. The relative width of the deformation band decreases as a power law of the system size and the probability distribution of the angle of the damage plane converges to around 30 degrees, representing an emergent internal coefficient of friction of 0.7 or so. The mass of fragments is power law distributed, with an exponent that does not depend on scale, and is near that inferred for experimental and natural fault gouges. The fragments are in general angular, with a clear self-affine geometry. The consistency of this model with experimental and field results confirms the critical roles of pre-existing heterogeneity, elastic interactions, and finite system size to grain size ratio on the development of faults, and ultimately to assessing the predictive power of forecasts of failure time in such media.
Li, Jun; Zhang, Hong; Han, Yinshan; Wang, Baodong
2016-01-01
Focusing on the diversity, complexity and uncertainty of third-party damage accidents, the failure probability of third-party damage to urban gas pipelines was evaluated using analytic hierarchy process theory and fuzzy mathematics. A fault tree of third-party damage containing 56 basic events was built by hazard identification of third-party damage. Fuzzy evaluation of the basic event probabilities was conducted using the expert judgment method and membership functions of fuzzy sets. The weight of each expert was determined and the evaluation opinions were modified using the improved analytic hierarchy process, and the failure probability of third-party damage to the urban gas pipeline was calculated. Taking the gas pipelines of a certain large provincial capital city as an example, the risk assessment results of the method were shown to conform to the actual situation, which provides a basis for safety risk prevention.
Probability of failure prediction for step-stress fatigue under sine or random stress
NASA Technical Reports Server (NTRS)
Lambert, R. G.
1979-01-01
A previously proposed cumulative fatigue damage law is extended to predict the probability of failure or fatigue life for structural materials with S-N fatigue curves represented as a scatterband of failure points. The proposed law applies to structures subjected to sinusoidal or random stresses and includes the effect of initial crack (i.e., flaw) sizes. The corrected cycle ratio damage function is shown to have physical significance.
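For context, the sketch below shows a plain Miner's-rule Monte Carlo estimate of failure probability with lognormal scatter on the S-N curve; it is only a generic illustration of a cycle-ratio damage sum, not the corrected cycle ratio damage law proposed in the paper, and all numbers are hypothetical.

```python
# Generic sketch: probability of failure from a Miner's-rule cycle-ratio sum
# with lognormal scatter on the S-N curve. All values are hypothetical.
import numpy as np

rng = np.random.default_rng(3)

# S-N curve: N = A * S**(-m), with lognormal scatter on A
A_median, m, scatter = 1.0e12, 3.0, 0.4
blocks = [(200.0, 5.0e4), (150.0, 2.0e5)]     # (stress amplitude MPa, applied cycles)

trials = 20000
A = A_median * rng.lognormal(0.0, scatter, trials)
damage = np.zeros(trials)
for S, n_applied in blocks:
    N_fail = A * S**(-m)                      # cycles to failure at this stress
    damage += n_applied / N_fail              # Miner cycle ratio

p_fail = np.mean(damage >= 1.0)
print(f"probability of failure ~ {p_fail:.3f}")
```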
NASA Astrophysics Data System (ADS)
Noh, Hae Young; Rajagopal, Ram; Kiremidjian, Anne S.
2012-04-01
This paper introduces a damage diagnosis algorithm for civil structures that uses a sequential change point detection method for the cases where the post-damage feature distribution is unknown a priori. This algorithm extracts features from structural vibration data using time-series analysis and then declares damage using the change point detection method. The change point detection method asymptotically minimizes detection delay for a given false alarm rate. The conventional method uses the known pre- and post-damage feature distributions to perform a sequential hypothesis test. In practice, however, the post-damage distribution is unlikely to be known a priori. Therefore, our algorithm estimates and updates this distribution as data are collected using the maximum likelihood and the Bayesian methods. We also applied an approximate method to reduce the computation load and memory requirement associated with the estimation. The algorithm is validated using multiple sets of simulated data and a set of experimental data collected from a four-story steel special moment-resisting frame. Our algorithm was able to estimate the post-damage distribution consistently and resulted in detection delays only a few seconds longer than the delays from the conventional method that assumes we know the post-damage feature distribution. We confirmed that the Bayesian method is particularly efficient in declaring damage with minimal memory requirement, but the maximum likelihood method provides an insightful heuristic approach.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Doualle, T.; Gallais, L., E-mail: laurent.gallais@fresnel.fr; Cormont, P.
We investigate the effect of different heat treatments on the laser-induced damage probabilities of fused silica samples. Isothermal annealing in a furnace is applied, with different temperatures in the range 700–1100 °C and 12 h annealing time, to super-polished fused silica samples. The surface flatness and laser damage probabilities at 3 ns, 351 nm are measured before and after the different annealing procedures. We have found a significant improvement of the initial laser damage probabilities of the silica surface after annealing at 1050 °C for 12 h. A similar study has been conducted on CO2 laser-processed sites on the surface of the samples. Before and after annealing, we have studied the morphology of the sites, the evolution of residual stress, and the laser-induced damage threshold measured at 351 nm, 3 ns. In this case, we observe that the laser damage resistance of the laser-created craters can reach the damage level of the bare fused silica surface after the annealing process, with complete stress relief. The obtained results are then compared to the case of a local annealing process by CO2 laser irradiation for 1 s, and we found similar improvements in both cases. The different results obtained in the study are compared to numerical simulations made with a thermo-mechanical model based on the finite-element method that allows the simulation of the isothermal or the local annealing process and the evolution of stress and fictive temperature. The simulation results were found to be very consistent with experimental observations for the evolution of stresses after annealing and the estimation of the heat-affected area during laser processing based on the dependence of density on fictive temperature. Following this work, the temperature for local annealing should reach 1330–1470 °C for an optimized reduction of damage probability and be below the threshold for material removal, whereas furnace annealing should be kept below the annealing point to avoid sample deformation.
Stochastic damage evolution in textile laminates
NASA Technical Reports Server (NTRS)
Dzenis, Yuris A.; Bogdanovich, Alexander E.; Pastore, Christopher M.
1993-01-01
A probabilistic model utilizing random material characteristics to predict damage evolution in textile laminates is presented. The model is based on a division of each ply into two sublaminas consisting of cells. The probability of cell failure is calculated using stochastic function theory and a maximal strain failure criterion. Three modes of failure, i.e. fiber breakage, matrix failure in the transverse direction, and matrix or interface shear cracking, are taken into account. Computed failure probabilities are utilized in reducing cell stiffness based on the mesovolume concept. A numerical algorithm is developed to predict the damage evolution and deformation history of textile laminates. The effect of scatter of fiber orientation on cell properties is discussed. The influence of the weave on damage accumulation is illustrated with an example of a Kevlar/epoxy laminate.
Imaging Fukushima Daiichi reactors with muons
DOE Office of Scientific and Technical Information (OSTI.GOV)
Miyadera, Haruo; Borozdin, Konstantin N.; Greene, Steve J.
2013-05-15
A study of imaging the Fukushima Daiichi reactors with cosmic-ray muons to assess the damage to the reactors is presented. Muon scattering imaging has high sensitivity for detecting uranium fuel and debris even through thick concrete walls and a reactor pressure vessel. Technical demonstrations using a reactor mockup, detector radiation test at Fukushima Daiichi, and simulation studies have been carried out. These studies establish feasibility for the reactor imaging. A few months of measurement will reveal the spatial distribution of the reactor fuel. The muon scattering technique would be the best and probably the only way for Fukushima Daiichi to make this determination in the near future.
Imaging Fukushima Daiichi reactors with muons
NASA Astrophysics Data System (ADS)
Miyadera, Haruo; Borozdin, Konstantin N.; Greene, Steve J.; Lukić, Zarija; Masuda, Koji; Milner, Edward C.; Morris, Christopher L.; Perry, John O.
2013-05-01
A study of imaging the Fukushima Daiichi reactors with cosmic-ray muons to assess the damage to the reactors is presented. Muon scattering imaging has high sensitivity for detecting uranium fuel and debris even through thick concrete walls and a reactor pressure vessel. Technical demonstrations using a reactor mockup, detector radiation test at Fukushima Daiichi, and simulation studies have been carried out. These studies establish feasibility for the reactor imaging. A few months of measurement will reveal the spatial distribution of the reactor fuel. The muon scattering technique would be the best and probably the only way for Fukushima Daiichi to make this determination in the near future.
Black bear damage to lodgepole pine in central Oregon
Barnes, V.G.; Engeman, R.M.
1995-01-01
Black bear damage to 108 lodgepole pine trees was found in mixed conifer habitat in central Oregon. No trees of three other conifer species were injured. Eighty-nine percent of the damage occurred in the same year. Nearly 20% of the freshly damaged trees had bark removed from more than 75% of the circumference and, judging from the fate of trees damaged in prior years, probably succumbed.
Metocean design parameter estimation for fixed platform based on copula functions
NASA Astrophysics Data System (ADS)
Zhai, Jinjin; Yin, Qilin; Dong, Sheng
2017-08-01
Considering the dependent relationship among wave height, wind speed, and current velocity, we construct novel trivariate joint probability distributions via Archimedean copula functions. Thirty years of wave height, wind speed, and current velocity data for the Bohai Sea are hindcast and sampled for a case study. Four kinds of distributions, namely the Gumbel distribution, lognormal distribution, Weibull distribution, and Pearson Type III distribution, are candidate models for the marginal distributions of wave height, wind speed, and current velocity. The Pearson Type III distribution is selected as the optimal model. Bivariate and trivariate probability distributions of these environmental conditions are established based on four bivariate and trivariate Archimedean copulas, namely the Clayton, Frank, Gumbel-Hougaard, and Ali-Mikhail-Haq copulas. These joint probability models can make full use of the marginal information and the dependence among the three variables. The design return values of these three variables can be obtained by three methods: univariate probability, conditional probability, and joint probability. The joint return periods of different load combinations are estimated by the proposed models. Platform responses (including base shear, overturning moment, and deck displacement) are further calculated. For the same return period, the design values of wave height, wind speed, and current velocity obtained by the conditional and joint probability models are much smaller than those by univariate probability. Considering the dependence among variables, the multivariate probability distributions provide design parameters close to the actual sea state for ocean platform design.
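A minimal sketch of the construction: three marginal CDFs coupled by a Gumbel-Hougaard Archimedean copula to give a trivariate joint CDF. The Weibull marginals and the copula parameter theta below are hypothetical placeholders (the paper fits Pearson Type III marginals to hindcast data).

```python
# Sketch: trivariate joint CDF of (wave height, wind speed, current velocity)
# via a Gumbel-Hougaard copula. Marginals and theta are hypothetical.
import numpy as np
from scipy import stats

theta = 2.0   # Gumbel-Hougaard dependence parameter (>= 1)

def gumbel_copula(u):
    """Gumbel-Hougaard copula C(u1,...,ud) for u in (0,1)^d."""
    u = np.asarray(u, dtype=float)
    return np.exp(-np.sum((-np.log(u))**theta) ** (1.0/theta))

# placeholder marginals: wave height (m), wind speed (m/s), current (m/s)
F_h = stats.weibull_min(c=1.8, scale=2.5).cdf
F_w = stats.weibull_min(c=2.2, scale=9.0).cdf
F_c = stats.weibull_min(c=2.0, scale=0.8).cdf

h, w, c = 5.0, 20.0, 1.5
joint_cdf = gumbel_copula([F_h(h), F_w(w), F_c(c)])
print(f"P(H<={h}, W<={w}, C<={c}) = {joint_cdf:.4f}")
```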
NASA Astrophysics Data System (ADS)
Gromov, Yu Yu; Minin, Yu V.; Ivanova, O. G.; Morozova, O. N.
2018-03-01
Multidimensional discrete probability distributions of independent random variables were obtained. Their one-dimensional distributions are widely used in probability theory. Generating functions of those multidimensional distributions were also obtained.
46 CFR 38.05-20 - Insulation-TB/ALL.
Code of Federal Regulations, 2010 CFR
2010-10-01
... tested to insure suitable cargo resistive properties. (3) The insulation shall have sufficient mechanical... mechanical damage. (c) The insulation shall be adequately protected in areas of probable mechanical damage...
46 CFR 38.05-20 - Insulation-TB/ALL.
Code of Federal Regulations, 2011 CFR
2011-10-01
... tested to insure suitable cargo resistive properties. (3) The insulation shall have sufficient mechanical... mechanical damage. (c) The insulation shall be adequately protected in areas of probable mechanical damage...
Hypervelocity impact tests on Space Shuttle Orbiter thermal protection material
NASA Technical Reports Server (NTRS)
Humes, D. H.
1977-01-01
Hypervelocity impact tests were conducted to simulate the damage that meteoroids will produce in the Shuttle Orbiter leading edge structural subsystem material. The nature and extent of the damage is reported and the probability of encountering meteoroids with sufficient energy to produce such damage is discussed.
Probabilistic Simulation of Progressive Fracture in Bolted-Joint Composite Laminates
NASA Technical Reports Server (NTRS)
Minnetyan, L.; Singhal, S. N.; Chamis, C. C.
1996-01-01
This report describes computational methods to probabilistically simulate fracture in bolted composite structures. An innovative approach that is independent of stress intensity factors and fracture toughness was used to simulate progressive fracture. The effect of design variable uncertainties on structural damage was also quantified. A fast probability integrator assessed the scatter in the composite structure response before and after damage. Then the sensitivity of the response to design variables was computed. General-purpose methods, which are applicable to bolted joints in all types of structures and in all fracture processes-from damage initiation to unstable propagation and global structure collapse-were used. These methods were demonstrated for a bolted joint of a polymer matrix composite panel under edge loads. The effects of the fabrication process were included in the simulation of damage in the bolted panel. Results showed that the most effective way to reduce end displacement at fracture is to control both the load and the ply thickness. The cumulative probability for longitudinal stress in all plies was most sensitive to the load; in the 0 deg. plies it was very sensitive to ply thickness. The cumulative probability for transverse stress was most sensitive to the matrix coefficient of thermal expansion. In addition, fiber volume ratio and fiber transverse modulus both contributed significantly to the cumulative probability for the transverse stresses in all the plies.
NASA Astrophysics Data System (ADS)
Chaturvedi, Pratik; Arora, Akshit; Dutt, Varun
2018-06-01
Feedback via simulation tools is likely to help people improve their decision-making against natural disasters. However, little is known on how differing strengths of experiential feedback and feedback's availability in simulation tools influence people's decisions against landslides. We tested the influence of differing strengths of experiential feedback and feedback's availability on people's decisions against landslides in Mandi, Himachal Pradesh, India. Experiential feedback (high or low) and feedback's availability (present or absent) were varied across four between-subject conditions in a tool called the Interactive Landslide Simulation (ILS): high damage with feedback present, high damage with feedback absent, low damage with feedback present, and low damage with feedback absent. In high-damage conditions, the probabilities of damages to life and property due to landslides were 10 times higher than those in the low-damage conditions. In feedback-present conditions, experiential feedback was provided in numeric, text, and graphical formats in ILS. In feedback-absent conditions, the probabilities of damages were described; however, there was no experiential feedback present. Investments were greater in conditions where experiential feedback was present and damages were high compared to conditions where experiential feedback was absent and damages were low. Furthermore, only high-damage feedback produced learning in ILS. Simulation tools like ILS seem appropriate for landslide risk communication and for performing what-if analyses.
Tygert, Mark
2010-09-21
We discuss several tests for determining whether a given set of independent and identically distributed (i.i.d.) draws does not come from a specified probability density function. The most commonly used are Kolmogorov-Smirnov tests, particularly Kuiper's variant, which focus on discrepancies between the cumulative distribution function for the specified probability density and the empirical cumulative distribution function for the given set of i.i.d. draws. Unfortunately, variations in the probability density function often get smoothed over in the cumulative distribution function, making it difficult to detect discrepancies in regions where the probability density is small in comparison with its values in surrounding regions. We discuss tests without this deficiency, complementing the classical methods. The tests of the present paper are based on the plain fact that it is unlikely to draw a random number whose probability is small, provided that the draw is taken from the same distribution used in calculating the probability (thus, if we draw a random number whose probability is small, then we can be confident that we did not draw the number from the same distribution used in calculating the probability).
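For reference, the sketch below computes the classical discrepancies the paper contrasts its tests with: the Kolmogorov-Smirnov statistic and Kuiper's variant between the empirical CDF of i.i.d. draws and a specified CDF (a standard normal placeholder here); the paper's density-based alternatives are not reproduced.

```python
# Sketch: Kolmogorov-Smirnov and Kuiper discrepancies between an empirical CDF
# and a specified CDF (standard normal as a placeholder null).
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
draws = rng.normal(0.3, 1.0, 500)          # draws that mildly violate the null

x = np.sort(draws)
n = x.size
F = stats.norm.cdf(x)                      # specified CDF at the ordered draws
ecdf_hi = np.arange(1, n + 1) / n          # ECDF just after each point
ecdf_lo = np.arange(0, n) / n              # ECDF just before each point

d_plus = np.max(ecdf_hi - F)
d_minus = np.max(F - ecdf_lo)
ks_stat = max(d_plus, d_minus)             # Kolmogorov-Smirnov statistic
kuiper_stat = d_plus + d_minus             # Kuiper's variant
print(f"KS = {ks_stat:.3f}, Kuiper = {kuiper_stat:.3f}")
print(stats.kstest(draws, "norm"))         # library check of the KS statistic
```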
DNA Damage Dependence on the Subcellular Distribution of Low-Energy Beta Emitters
NASA Astrophysics Data System (ADS)
Cutaia, Claudia; Alloni, Daniele; Mariotti, Luca; Friedland, Werner; Ottolenghi, Andrea
One of the main issues of low-energy internal emitters is related to the short ranges of beta particles, compared to the dimensions of the biological targets (e.g. the cell nucleus). Also depending on the chemical form, the radionuclide may be more concentrated in the cytoplasm of the target cell (in our calculations a human fibroblast in interphase) and consequently the conventional dosimetry may overestimate the dose to the nucleus; whereas if the radionuclide is more concentrated in the nuclei of the cells there is a risk of underestimating the nucleus dose. The computer code PARTRAC was modified to calculate the energy depositions in the nucleus and the DNA damage for different relative concentrations of the radionuclide in the nucleus and in the cytoplasm. The nuclides considered in the simulations were Tritium (the electrons emitted due to the β-decay have an average energy of 5.7 keV, corresponding to an average range of 0.42 µm) and Nickel-63 (the electrons emitted have an average energy of 17 keV, corresponding to an average range of 5 µm). In the case of Tritium, the dose in the nucleus due to the tracks generated outside this region is 15% of the average dose in the cell, whereas in the case of Nickel-63 the dose in the nucleus was found to be 64% of the average dose in the cell. The distributions of DNA fragments as a function of the relative concentration of the nuclides in the nucleus and in the cytoplasm were also calculated. In the same conditions, the number of complex lesions (which have a high probability of inducing lethal damage to the cells) per Gy (circa 0.5-1) and the total number of double strand breaks (DSBs) per Gy (circa 40) were also calculated. To complete the characterization of the effects of internal emitters inside the cell, the distributions of DSBs per chromosome were studied for different radionuclide distributions in the cell. The results obtained from these simulations show the possible overestimation or underestimation of the risk (particularly for Tritium intake) due to the distribution of the low-energy emitters at subcellular levels.
Application of a time probabilistic approach to seismic landslide hazard estimates in Iran
NASA Astrophysics Data System (ADS)
Rajabi, A. M.; Del Gaudio, V.; Capolongo, D.; Khamehchiyan, M.; Mahdavifar, M. R.
2009-04-01
Iran is a country located in a tectonically active belt and is prone to earthquakes and related phenomena. In recent years, several earthquakes caused many fatalities and much damage to facilities, e.g. the Manjil (1990), Avaj (2002), Bam (2003) and Firuzabad-e-Kojur (2004) earthquakes. These earthquakes generated many landslides. For instance, catastrophic landslides triggered by the Manjil Earthquake (Ms = 7.7) in 1990 buried the village of Fatalak, killed more than 130 people and cut many important roads and other lifelines, resulting in major economic disruption. In general, earthquakes in Iran have been concentrated in two major zones with different seismicity characteristics: one is the region of Alborz and Central Iran and the other is the Zagros Orogenic Belt. Understanding where seismically induced landslides are most likely to occur is crucial in reducing property damage and loss of life in future earthquakes. For this purpose a time probabilistic approach for earthquake-induced landslide hazard at regional scale, proposed by Del Gaudio et al. (2003), has been applied to the whole Iranian territory to provide the basis of hazard estimates. This method consists of evaluating the recurrence of seismically induced slope failure conditions inferred from Newmark's model. First, by adopting Arias intensity to quantify seismic shaking and using different Arias attenuation relations for the Alborz - Central Iran and Zagros regions, well-established methods of seismic hazard assessment, based on the Cornell (1968) method, were employed to obtain the occurrence probabilities for different levels of seismic shaking in a time interval of interest (50 years). Then, following Jibson et al. (1998), empirical formulae specifically developed for Alborz - Central Iran and Zagros were used to represent, according to Newmark's model, the relation linking Newmark's displacement Dn to Arias intensity Ia and to slope critical acceleration ac. These formulae were employed to evaluate the slope critical acceleration (Ac)x for which a prefixed probability exists that seismic shaking would result in a Dn value equal to a threshold x whose exceedance would cause landslide triggering. The obtained ac values represent the minimum slope resistance required to keep the probability of seismic-landslide triggering within the prefixed value. In particular we calculated the spatial distribution of (Ac)x for x thresholds of 10 and 2 cm in order to represent triggering conditions for coherent slides (e.g., slumps, block slides, slow earth flows) and disrupted slides (e.g., rock falls, rock slides, rock avalanches), respectively. Then we produced a probabilistic national map that shows the spatial distribution of (Ac)10 and (Ac)2, for a 10% probability of exceedance in 50 years, which is a significant level of hazard equal to that commonly used for building codes. The spatial distribution of the calculated (Ac)x values can be compared with the in situ actual ac values of specific slopes to estimate whether these slopes have a significant probability of failing under seismic action in the future. As an example of a possible application of this kind of time probabilistic map to hazard estimates, we compared the values obtained for the Manjil region with a GIS map providing the spatial distribution of estimated ac values in the same region. The spatial distribution of slopes characterized by ac < (Ac)10 was then compared with the spatial distribution of the major landslides of coherent type triggered by the Manjil earthquake.
This comparison provides indications on the potential, problems, and limits of the tested approach for the study area. References: Cornell, C.A., 1968: Engineering seismic risk analysis, Bull. Seism. Soc. Am., 58, 1583-1606. Del Gaudio, V., Wasowski, J., & Pierri, P., 2003: An approach to time-probabilistic evaluation of seismically induced landslide hazard, Bull. Seism. Soc. Am., 93, 557-569. Jibson, R.W., Harp, E.L., and Michael, J.A., 1998: A method for producing digital probabilistic seismic landslide hazard maps: an example from the Los Angeles, California, area, U.S. Geological Survey Open-File Report 98-113, Golden, Colorado, 17 pp.
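A compact sketch of the critical-acceleration step described above: given the Arias intensity Ia with a prescribed exceedance probability, invert a Newmark-displacement regression to find the slope critical acceleration (Ac)x at which Dn just reaches the threshold x. The coefficients below follow the widely cited Jibson-type form log10(Dn) = 1.521 log10(Ia) - 1.993 log10(ac) - 1.546 (Dn in cm, Ia in m/s, ac in g); the study above used region-specific formulae for Alborz - Central Iran and Zagros, so treat these numbers, and the Ia value, as illustrative stand-ins.

```python
import numpy as np

def critical_acceleration(ia_ms, dn_threshold_cm,
                          a=1.521, b=-1.993, c=-1.546):
    """Invert log10(Dn) = a*log10(Ia) + b*log10(ac) + c for ac (in g).
    Coefficients default to an illustrative Jibson-type regression."""
    log_ac = (a * np.log10(ia_ms) + c - np.log10(dn_threshold_cm)) / (-b)
    return 10.0 ** log_ac

# Arias intensity with a 10% exceedance probability in 50 years (assumed value).
ia_475yr = 1.2   # m/s

for x in (2.0, 10.0):   # disrupted vs. coherent slide thresholds (cm)
    print(f"(Ac)_{x:g} = {critical_acceleration(ia_475yr, x):.3f} g")
```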
Heterogeneity in the Fault Damage Zone: a Field Study on the Borrego Fault, B.C., Mexico
NASA Astrophysics Data System (ADS)
Ostermeijer, G.; Mitchell, T. M.; Dorsey, M. T.; Browning, J.; Rockwell, T. K.; Aben, F. M.; Fletcher, J. M.; Brantut, N.
2017-12-01
The nature and distribution of damage around faults, and its impacts on fault zone properties, have been a hot topic of research over the past decade. Understanding the mechanisms that control the formation of off-fault damage can shed light on the processes during the seismic cycle, and the nature of fault zone development. Recently published work has identified three broad zones of damage around most faults based on the type, intensity, and extent of fracturing: tip, wall, and linking damage. Although these zones are able to adequately characterise the general distribution of damage, little has been done to identify the nature of damage heterogeneity within those zones, often simplifying the distribution to fit log-normal linear decay trends. Here, we attempt to characterise the distribution of fractures that make up the wall damage around seismogenic faults. To do so, we investigate an extensive two-dimensional fracture network exposed on a river-cut platform along the Borrego Fault, BC, Mexico, 5 m wide and extending 20 m from the fault core into the damage zone. High-resolution fracture mapping of the outcrop, covering scales ranging over three orders of magnitude (cm to m), has allowed for detailed observations of the 2D damage distribution within the fault damage zone. Damage profiles were obtained along several 1D transects perpendicular to the fault, and micro-damage was examined from thin-sections at various locations around the outcrop for comparison. Analysis of the resulting fracture network indicates heterogeneities in damage intensity at decimetre scales resulting from a patchy distribution of high- and low-intensity corridors and clusters. Such patchiness may contribute to inconsistencies in damage zone widths defined along 1D transects and the observed variability of fracture densities around decay trends. How this distribution develops with fault maturity and the scaling of heterogeneities above and below the observed range will likely play a key role in understanding the evolution of fault damage, its feedback into the seismic cycle, and its impact on fluid migration in fault zones. The dataset from the Borrego Fault offers a unique opportunity to study the distribution of fault damage in situ, and provides field observations towards improving fault zone models.
Development of damage probability matrices based on Greek earthquake damage data
NASA Astrophysics Data System (ADS)
Eleftheriadou, Anastasia K.; Karabinis, Athanasios I.
2011-03-01
A comprehensive study is presented for empirical seismic vulnerability assessment of typical structural types, representative of the building stock of Southern Europe, based on a large set of damage statistics. The observational database was obtained from post-earthquake surveys carried out in the area struck by the September 7, 1999 Athens earthquake. After analysis of the collected observational data, a unified damage database has been created which comprises 180,945 damaged buildings from the near-field area of the earthquake. The damaged buildings are classified into specific structural types, according to the materials, seismic codes and construction techniques in Southern Europe. The seismic demand is described in terms of both the regional macroseismic intensity and the ratio ag/ao, where ag is the maximum peak ground acceleration (PGA) of the earthquake event and ao is the unique PGA value that characterizes each municipality shown on the Greek hazard map. The relative and cumulative frequencies of the different damage states for each structural type and each intensity level are computed in terms of damage ratio. Damage probability matrices (DPMs) and vulnerability curves are obtained for specific structural types. A comparative analysis is carried out between the produced and the existing vulnerability models.
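A minimal sketch of how a damage probability matrix can be assembled from survey records: count buildings of a given structural type by intensity level and damage state, then normalize each (type, intensity) row into relative frequencies. The field names and toy records below are invented for illustration, not the Athens survey data.

```python
import pandas as pd

# Toy survey records: one row per surveyed building (invented values).
records = pd.DataFrame({
    "structural_type": ["RC_frame", "RC_frame", "masonry", "RC_frame",
                        "masonry", "masonry", "RC_frame", "masonry"],
    "intensity":       ["VII", "VII", "VII", "VIII",
                        "VIII", "VII", "VIII", "VIII"],
    "damage_state":    ["slight", "moderate", "moderate", "heavy",
                        "heavy", "slight", "moderate", "collapse"],
})

# Damage probability matrix: P(damage state | structural type, intensity).
dpm = (pd.crosstab([records["structural_type"], records["intensity"]],
                   records["damage_state"], normalize="index")
         .round(2))
print(dpm)
```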
Todisco, T; Dottorini, M; Rossi, F; Baldoncini, A; Palumbo, R
1989-01-01
Peripheral airspace epithelial permeability (PAEP) to diethylenetriaminepentaacetate (DTPA), an index of pulmonary integrity, was measured in 3 groups of subjects for different purposes: (1) to establish vertical regional reference values; (2) to determine the physiological role of acute doubling of total pulmonary blood flow; (3) to quantify the pulmonary epithelial damage in smokers and the possibility of lung protection by an agent stimulating surfactant production. This study broadens previous knowledge of PAEP. First of all, regional reference values are given for young normal nonsmoking subjects and the existence of a vertical gradient of PAEP is confirmed. Furthermore, this study shows that this gradient is independent of the vertical blood flow gradient, since an acute increase of total blood flow in pneumonectomized patients does not modify the regional distribution of PAEP. Finally, it is confirmed that the cigarette smoker's lung is more permeable than that of controls and that a drug stimulating surfactant production probably gives some protection against damage due to chronic smoking.
Mesoscale Fracture Analysis of Multiphase Cementitious Composites Using Peridynamics
Yaghoobi, Amin; Chorzepa, Mi G.; Kim, S. Sonny; Durham, Stephan A.
2017-01-01
Concrete is a complex heterogeneous material, and thus, it is important to develop numerical modeling methods to enhance the prediction accuracy of the fracture mechanism. In this study, a two-dimensional mesoscale model is developed using a non-ordinary state-based peridynamic (NOSBPD) method. Fracture in a concrete cube specimen subjected to pure tension is studied. The heterogeneous constituents, consisting of coarse aggregates, interfacial transition zones, air voids and cementitious matrix, are represented as particle points in a two-dimensional mesoscale model. Coarse aggregates and voids are generated using uniform probability distributions, while a statistical study is performed to account for the effect of random distributions of the constituent materials. In obtaining the steady-state response, an incremental and iterative solver is adopted for the dynamic relaxation method. Load-displacement curves and damage patterns are compared with available experimental and finite element analysis (FEA) results. Although the proposed model uses much simpler material damage models and discretization schemes, the load-displacement curves show no difference from the FEA results. Furthermore, no mesh refinement is necessary, as fracture is inherently characterized by bond breakages. Finally, a sensitivity study is conducted to understand the effect of aggregate volume fraction and porosity on the load capacity of the proposed mesoscale model. PMID:28772518
On the probability of exceeding allowable leak rates through degraded steam generator tubes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cizelj, L.; Sorsek, I.; Riesch-Oppermann, H.
1997-02-01
This paper discusses some possible ways of predicting the behavior of the total leak rate through damaged steam generator tubes. This failure mode is of special concern in cases where most through-wall defects may remain in operation. A particular example is the application of the alternate (bobbin coil voltage) plugging criterion to Outside Diameter Stress Corrosion Cracking at the tube support plate intersections. It is the authors' aim to discuss some possible modeling options that could be applied to solve the problem formulated as: estimate the probability that the sum of all individual leak rates through degraded tubes exceeds the predefined acceptable value. The probabilistic approach aims, of course, at a reliable and computationally tractable estimate of the failure probability. A closed-form solution is given for a special case of exponentially distributed individual leak rates. Also, some possibilities for the use of computationally efficient First and Second Order Reliability Methods (FORM and SORM) are discussed. The first numerical example compares the results of approximate methods with closed-form results. SORM in particular shows acceptable agreement. The second numerical example considers a realistic case of the NPP in Krsko, Slovenia.
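The closed-form special case mentioned above has a simple structure: the sum of n independent, exponentially distributed leak rates follows a gamma (Erlang) distribution, so the exceedance probability of the allowable total is a gamma survival function. A short sketch with invented example values, not the plant data used in the paper.

```python
from scipy import stats

# Assumed example values (not from the paper): 50 degraded tubes,
# each with an exponentially distributed leak rate of mean 0.2 L/min.
n_tubes = 50
mean_leak = 0.2          # L/min per tube
allowable_total = 15.0   # L/min acceptance limit

# Sum of n i.i.d. Exponential(mean) variables ~ Gamma(shape=n, scale=mean).
p_exceed = stats.gamma.sf(allowable_total, a=n_tubes, scale=mean_leak)
print(f"P(total leak > {allowable_total} L/min) = {p_exceed:.3e}")
```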
A Probabilistic Design Method Applied to Smart Composite Structures
NASA Technical Reports Server (NTRS)
Shiao, Michael C.; Chamis, Christos C.
1995-01-01
A probabilistic design method is described and demonstrated using a smart composite wing. Probabilistic structural design incorporates naturally occurring uncertainties including those in constituent (fiber/matrix) material properties, fabrication variables, structure geometry and control-related parameters. Probabilistic sensitivity factors are computed to identify those parameters that have a great influence on a specific structural reliability. Two performance criteria are used to demonstrate this design methodology. The first criterion requires that the actuated angle at the wing tip be bounded by upper and lower limits at a specified reliability. The second criterion requires that the probability of ply damage due to random impact load be smaller than an assigned value. When the relationship between reliability improvement and the sensitivity factors is assessed, the results show that a reduction in the scatter of the random variable with the largest sensitivity factor (absolute value) provides the lowest failure probability. An increase in the mean of the random variable with a negative sensitivity factor will reduce the failure probability. Therefore, the design can be improved by controlling or selecting distribution parameters associated with random variables. This can be implemented during the manufacturing process to obtain maximum benefit with minimum alterations.
Evaluation of statistical distributions to analyze the pollution of Cd and Pb in urban runoff.
Toranjian, Amin; Marofi, Safar
2017-05-01
Heavy metal pollution in urban runoff causes severe environmental damage. Identification of these pollutants and their statistical analysis is necessary to provide management guidelines. In this study, 45 continuous probability distribution functions were selected to fit the Cd and Pb data in the runoff events of an urban area during October 2014-May 2015. The sampling was conducted from the outlet of the city basin during seven precipitation events. For evaluation and ranking of the functions, we used the goodness of fit Kolmogorov-Smirnov and Anderson-Darling tests. The results of Cd analysis showed that Hyperbolic Secant, Wakeby and Log-Pearson 3 are suitable for frequency analysis of the event mean concentration (EMC), the instantaneous concentration series (ICS) and instantaneous concentration of each event (ICEE), respectively. In addition, the LP3, Wakeby and Generalized Extreme Value functions were chosen for the EMC, ICS and ICEE related to Pb contamination.
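A small sketch of the fitting-and-ranking procedure described above: fit a handful of candidate distributions to a concentration series and rank them by the Kolmogorov-Smirnov statistic. The data are synthetic stand-ins for an event mean concentration (EMC) series, and the candidate list is a short subset of the 45 families evaluated in the study.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
# Synthetic stand-in for an EMC series (mg/L).
emc = stats.lognorm(s=0.8, scale=0.05).rvs(size=60, random_state=rng)

candidates = ["lognorm", "gamma", "pearson3", "genextreme", "weibull_min"]
ranking = []
for name in candidates:
    dist = getattr(stats, name)
    params = dist.fit(emc)
    ks = stats.kstest(emc, name, args=params)   # Kolmogorov-Smirnov statistic
    ranking.append((ks.statistic, name))

for stat, name in sorted(ranking):
    print(f"{name:12s} KS statistic = {stat:.3f}")
```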
The American Climate Prospectus: a risk-centered analysis of the economic impacts of climate change
NASA Astrophysics Data System (ADS)
Jina, A.; Houser, T.; Hsiang, S. M.; Kopp, R. E., III; Delgado, M.; Larsen, K.; Mohan, S.; Rasmussen, D.; Rising, J.; Wilson, P. S.; Muir-Wood, R.
2014-12-01
The American Climate Prospectus (ACP), the analysis underlying the Risky Business project, quantitatively assessed the climate risks posed to the United States' economy in six sectors: crop yields, energy demand, coastal property, crime, labor productivity, and mortality [1]. The ACP is unique in its characterization of the full probability distribution of economic impacts of climate change throughout the 21st century, making it an extremely useful basis for risk assessments. Three key innovations allow for this characterization. First, climate projections from CMIP5 models are scaled to a temperature probability distribution derived from a coarser climate model (MAGICC). This allows a more accurate representation of the whole distribution of future climates (in particular the tails) than a simple ensemble average. These are downscaled both temporally and spatially. Second, a set of local sea level rise and tropical cyclone projections are used in conjunction with the most detailed dataset of coastal property in the US in order to capture the risks of rising seas and storm surge. Third, we base many of our sectors on empirically derived responses to temperature and precipitation. Each of these dose-response functions is resampled many times to populate a statistical distribution. Combining these with uncertainty in emissions scenario, climate model, and weather, we create the full probability distribution of climate impacts from county up to national levels, as well as model the effects upon the economy as a whole. Results are presented as likelihood ranges, as well as changes to return intervals of extreme events. The ACP analysis allows us to compare between sectors to understand the magnitude of required policy responses, and also to identify risks through time. Many sectors displaying large impacts at the end of the century, such as mortality, show smaller changes in the near term, due to non-linearities in the response functions. Other sectors, like coastal damages, have monotonically increasing costs throughout the 21st century. Taken together, the results from the ACP present a unique and novel view of the short-, medium-, and long-term economic risks of climate change in the US. References: [1] T. Houser et al (2014), American Climate Prospectus, www.climateprospectus.org.
Li, Jun; Zhang, Hong; Han, Yinshan; Wang, Baodong
2016-01-01
Focusing on the diversity, complexity and uncertainty of third-party damage accidents, the failure probability of third-party damage to urban gas pipelines was evaluated using analytic hierarchy process theory and fuzzy mathematics. The fault tree of third-party damage, containing 56 basic events, was built by hazard identification of third-party damage. The fuzzy evaluation of basic event probabilities was conducted by the expert judgment method, using membership functions of fuzzy sets. The determination of the weight of each expert and the modification of the evaluation opinions were accomplished using the improved analytic hierarchy process, and the failure probability of third-party damage to the urban gas pipeline was calculated. Taking the gas pipelines of a certain large provincial capital city as an example, the risk assessment results of the method were shown to conform to the actual situation, providing a basis for safety risk prevention. PMID:27875545
NASA Astrophysics Data System (ADS)
Shioiri, Tetsu; Asari, Naoki; Sato, Junichi; Sasage, Kosuke; Yokokura, Kunio; Homma, Mitsutaka; Suzuki, Katsumi
To investigate the reliability of vacuum insulation equipment, a study was carried out to clarify breakdown probability distributions in a vacuum gap. Further, a double-break vacuum circuit breaker was investigated for its breakdown probability distribution. The test results show that the breakdown probability distribution of the vacuum gap can be represented by a Weibull distribution using a location parameter, which shows the voltage that permits a zero breakdown probability. The location parameter obtained from the Weibull plot depends on electrode area. The shape parameter obtained from the Weibull plot of the vacuum gap was 10∼14, and is constant irrespective of the non-uniform field factor. The breakdown probability distribution after no-load switching can be represented by a Weibull distribution using a location parameter. The shape parameter after no-load switching was 6∼8.5, and is constant, irrespective of gap length. This indicates that the scatter of breakdown voltage was increased by no-load switching. If the vacuum circuit breaker uses a double break, the breakdown probability at low voltage becomes lower than the single-break probability. Although potential distribution is a concern in the double-break vacuum circuit breaker, its insulation reliability is better than that of the single-break vacuum interrupter even if the bias of the vacuum interrupter's sharing voltage is taken into account.
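A minimal sketch of the three-parameter Weibull treatment described above: fit shape, location (the zero-breakdown-probability voltage), and scale to a set of breakdown voltages and evaluate the breakdown probability at a given applied voltage. The data here are synthetic, and the voltages and the shape value near 12 are illustrative assumptions.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
# Synthetic breakdown voltages (kV): three-parameter Weibull with
# location (zero-breakdown-probability voltage) 60 kV and shape ~12.
bd_voltages = stats.weibull_min(c=12, loc=60.0, scale=40.0).rvs(
    size=200, random_state=rng)

# Fit shape (c), location, and scale from the measured breakdown voltages.
c, loc, scale = stats.weibull_min.fit(bd_voltages)
print(f"shape = {c:.1f}, location = {loc:.1f} kV, scale = {scale:.1f} kV")

# Breakdown probability at a given applied voltage (CDF of the fitted Weibull).
for v in (65.0, 80.0, 95.0):
    p = stats.weibull_min.cdf(v, c, loc=loc, scale=scale)
    print(f"P(breakdown at {v:.0f} kV) = {p:.3f}")
```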
Gajski, Goran; Ravlic, Sanda; Capuder, Zeljka; Garaj-Vrhovac, Vera
2007-08-01
Wide distribution, stability and long persistence in the environment of dichlorodiphenyltrichloroethane (DDT), probably the best-known and most useful insecticide in the world, impose the need for further examination of the effect of this chemical on human health and especially on the human genome. In this study, peripheral blood human lymphocytes from a healthy donor were exposed to a 0.025 mg/L concentration of p,p'-DDT for different time periods (1, 2, 24 and 48 h). For the assessment of genotoxic effect, the new criteria for scoring the micronucleus test and the alkaline comet assay were used. Both methods showed that p,p'-DDT induces DNA damage at the low concentration used in this research. Results of the micronucleus test showed a statistically significant (p < 0.05) genotoxic effect of p,p'-DDT on human lymphocytes compared with the corresponding controls at the different exposure times. The comet assay also showed greater DNA damage (tail length) in p,p'-DDT-exposed human lymphocytes than in the corresponding control cells. Results obtained by measuring the level of DNA migration and the incidence of micronuclei (MN), nucleoplasmic bridges (NPBs) and nuclear buds (NBUDs) indicate the sensitivity of these tests and their applicability in the detection of primary genome damage after long-term exposure, to establish the effect of p,p'-DDT on the human genome.
Effect of Cisplatin on Parotid Gland Function in Concomitant Radiochemotherapy
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hey, Jeremias; Setz, Juergen; Gerlach, Reinhard
2009-12-01
Purpose: To determine the influence of concomitant radiochemotherapy with cisplatin on parotid gland tissue complication probability. Methods and Materials: Patients treated with either radiotherapy (n = 61) or concomitant radiochemotherapy with cisplatin (n = 36) for head-and-neck cancer were prospectively evaluated. The dose and volume distributions of the parotid glands were noted in dose-volume histograms. Stimulated salivary flow rates were measured before, during the 2nd and 6th weeks, and at 4 weeks and 6 months after the treatment. The data were fit using the normal tissue complication probability model of Lyman. Complication was defined as a reduction of the salivary flow rate to less than 25% of the pretreatment flow rate. Results: The normal tissue complication probability model parameter TD50 (the dose leading to a complication probability of 50%) was found to be 32.2 Gy at 4 weeks and 32.1 Gy at 6 months for concomitant radiochemotherapy, and 41.1 Gy at 4 weeks and 39.6 Gy at 6 months for radiotherapy. The tolerated dose for concomitant radiochemotherapy was at least 7 to 8 Gy lower than for radiotherapy alone at TD50. Conclusions: In this study, concomitant radiochemotherapy tended to cause a higher probability of parotid gland tissue damage. Advanced radiotherapy planning approaches such as intensity-modulated radiotherapy may be particularly important for parotid sparing in radiochemotherapy because of cisplatin-related increased radiosensitivity of the glands.
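The Lyman model underlying these fits reduces, for a uniform gland dose D, to a probit curve NTCP = Φ((D − TD50) / (m · TD50)). A small sketch using the TD50 values reported above; the slope parameter m is not quoted in the abstract and is set here to an assumed value.

```python
from scipy.stats import norm

def lyman_ntcp(dose_gy, td50_gy, m=0.45):
    """Lyman NTCP for a uniform organ dose: Phi((D - TD50) / (m * TD50)).
    m = 0.45 is an assumed slope, not a value reported in the abstract."""
    return norm.cdf((dose_gy - td50_gy) / (m * td50_gy))

mean_parotid_dose = 35.0   # Gy, illustrative
print("radiochemotherapy :", lyman_ntcp(mean_parotid_dose, td50_gy=32.2))
print("radiotherapy alone:", lyman_ntcp(mean_parotid_dose, td50_gy=41.1))
```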
Image-based modeling of radiation-induced foci
NASA Astrophysics Data System (ADS)
Costes, Sylvain; Cucinotta, Francis A.; Ponomarev, Artem; Barcellos-Hoff, Mary Helen; Chen, James; Chou, William; Gascard, Philippe
Several proteins involved in the response to DNA double strand breaks (DSB) form microscopically visible nuclear domains, or foci, after exposure to ionizing radiation. Radiation-induced foci (RIF) are believed to be located where DNA damage occurs. To test this assumption, we used Monte Carlo simulations to predict the spatial distribution of DSB in human nuclei exposed to high- or low-LET radiation. We then compared these predictions to the distribution patterns of three DNA damage sensing proteins, i.e. 53BP1, phosphorylated ATM and γH2AX, in human mammary epithelial cells. The probability to induce DSB can be derived from DNA fragment data measured experimentally by pulsed-field gel electrophoresis. We first used this probability in Monte Carlo simulations to predict DSB locations in synthetic nuclei geometrically described by a complete set of human chromosomes, taking into account microscope optics from real experiments. Simulations showed very good agreement for high-LET, predicting 0.7 foci/µm along the path of a 1 GeV/amu Fe particle against measurements of 0.69 to 0.82 foci/µm for various RIF 5 min following exposure (LET 150 keV/µm). On the other hand, discrepancies were shown in foci frequency for low-LET, where measured frequencies departed from the predictions. One drawback of using a theoretical model for the nucleus is that it assumes a simplistic and static pattern for DNA densities. However, the DNA damage pattern is highly correlated with the DNA density pattern (i.e. the more DNA, the more likely to have a break). Therefore, we generalized our Monte Carlo approach to real microscope images, assuming pixel intensity of DAPI in the nucleus was directly proportional to the amount of DNA in that pixel. With this approach we could predict the DNA damage pattern in real images on a per-nucleus basis. Since energy is randomly deposited along high-LET particle paths, RIF along these paths should also be randomly distributed. As expected, simulations produced DNA-weighted random (Poisson) distributions. In contrast, the distributions of RIF obtained as early as 5 min after exposure to high LET (1 GeV/amu Fe) were non-random. This deviation from the expected DNA-weighted random pattern was further characterized by "relative DNA image measurements". This novel imaging approach showed that RIF were located preferentially at the interface between high and low DNA density regions, and were more frequent than predicted in regions with lower DNA density. The same preferential nuclear location was also measured for RIF induced by 1 Gy of low-LET radiation. This deviation from random behavior was evident only 5 min after irradiation for phosphorylated ATM RIF, while γH2AX and 53BP1 RIF showed pronounced deviations up to 30 min after exposure. These data suggest that RIF within a few minutes following exposure to radiation cluster into open regions of the nucleus (i.e. euchromatin). It is possible that DNA lesions are collected in these nuclear sub-domains for more efficient repair. If so, this would imply that DSB are actively transported within the nucleus, a phenomenon that has not yet been considered in modeling DNA misrepair following exposure to radiation. These results are thus critical for more accurate risk models of radiation, and we are actively working on characterizing RIF movement further in human nuclei using live cell imaging.
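A minimal sketch of the DNA-density-weighted damage placement described above: DSB positions are sampled with probability proportional to DAPI pixel intensity and compared with a uniform placement. The image here is a synthetic array standing in for a real DAPI channel, not experimental data.

```python
import numpy as np

rng = np.random.default_rng(4)

# Synthetic stand-in for a DAPI image of one nucleus (arbitrary units).
dapi = rng.gamma(shape=2.0, scale=1.0, size=(128, 128))

def place_dsbs(intensity, n_dsb, rng):
    """Sample DSB pixel locations with probability proportional to intensity."""
    p = intensity.ravel() / intensity.sum()
    idx = rng.choice(intensity.size, size=n_dsb, p=p)
    return np.column_stack(np.unravel_index(idx, intensity.shape))

dsb_weighted = place_dsbs(dapi, n_dsb=40, rng=rng)           # DNA-weighted
dsb_uniform = place_dsbs(np.ones_like(dapi), 40, rng=rng)     # uniform control

# Mean DAPI intensity at the sampled damage sites: higher for weighted placement.
print("weighted :", dapi[dsb_weighted[:, 0], dsb_weighted[:, 1]].mean())
print("uniform  :", dapi[dsb_uniform[:, 0], dsb_uniform[:, 1]].mean())
```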
Borras, Ester; Chang, Kyle; Pande, Mala; Cuddy, Amanda; Bosch, Jennifer L; Bannon, Sarah A; Mork, Maureen E; Rodriguez-Bigas, Miguel A; Taggart, Melissa W; Lynch, Patrick M; You, Y Nancy; Vilar, Eduardo
2017-10-01
Lynch syndrome (LS) is a genetic condition secondary to germline alterations in the DNA mismatch repair (MMR) genes, with 30% of changes being variants of uncertain significance (VUS). Our aim was to perform an in silico reclassification of VUS from a large single-institution cohort that will help prioritize functional validation. A total of 54 VUS were detected, with 33 (61%) novel variants. We integrated family history, pathology, and genetic information along with supporting evidence from eight different in silico tools at the RNA and protein level. Our assessment allowed us to reclassify 54% (29/54) of the VUS as probably damaging, 13% (7/54) as possibly damaging, and 28% (15/54) as probably neutral. There are more than 1,000 VUS reported in MMR genes, and our approach facilitates the prioritization of further functional efforts to assess the pathogenicity of those classified as probably damaging. Cancer Prev Res; 10(10); 580-7. ©2017 AACR.
Garriguet, Didier
2016-04-01
Estimates of the prevalence of adherence to physical activity guidelines in the population are generally the result of averaging individual probabilities of adherence based on the number of days people meet the guidelines and the number of days they are assessed. Given the number of active days and inactive days (days assessed minus days active), the conditional probability of meeting the guidelines that has been used in the past follows a Beta(1 + active days, 1 + inactive days) distribution, assuming the probability p of a day being active is bounded by 0 and 1 and averages 50%. A change in the assumption about the distribution of p is required to better match the discrete nature of the data and to better assess the probability of adherence when the percentage of active days in the population differs from 50%. Using accelerometry data from the Canadian Health Measures Survey, the probability of adherence to physical activity guidelines is estimated using a conditional probability, given the number of active and inactive days, distributed as a Beta-binomial(n, α + active days, β + inactive days), assuming that p is randomly distributed as Beta(α, β), where the parameters α and β are estimated by maximum likelihood. The resulting Beta-binomial distribution is discrete. For children aged 6 or older, the probability of meeting physical activity guidelines 7 out of 7 days is similar to published estimates. For pre-schoolers, the Beta-binomial distribution yields higher estimates of adherence to the guidelines than the Beta distribution, in line with the probability of being active on any given day. In estimating the probability of adherence to physical activity guidelines, the Beta-binomial distribution has several advantages over the previously used Beta distribution. It is a discrete distribution and maximizes the richness of accelerometer data.
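A small sketch of the Beta-binomial adherence calculation: with a Beta(α, β) prior on the daily probability of being active and an individual observed on n days with k active, the probability of being active on all 7 of 7 future days is the Beta-binomial(7, α + k, β + n − k) mass at 7. The prior parameters and observation counts below are invented for illustration, not the survey's maximum-likelihood estimates.

```python
from scipy.stats import betabinom   # requires scipy >= 1.4

# Assumed Beta prior on the daily probability of being active.
alpha, beta = 2.0, 2.0

# One child observed on 4 accelerometer days, active on 3 of them.
active, inactive = 3, 1

# Posterior predictive: probability of being active on all 7 of 7 days.
p_adhere = betabinom.pmf(7, 7, alpha + active, beta + inactive)
print(f"P(active 7 of 7 days) = {p_adhere:.3f}")

# Full predictive distribution over the number of active days out of 7.
for k in range(8):
    print(k, round(betabinom.pmf(k, 7, alpha + active, beta + inactive), 3))
```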
Versino, Daniele; Bronkhorst, Curt Allan
2018-01-31
The computational formulation of a micro-mechanical material model for the dynamic failure of ductile metals is presented in this paper. The statistical nature of porosity initiation is accounted for by introducing an arbitrary probability density function which describes the pore nucleation pressures. Each micropore within the representative volume element is modeled as a thick spherical shell made of plastically incompressible material. The treatment of porosity by a distribution of thick-walled spheres also allows for the inclusion of micro-inertia effects under conditions of shock and dynamic loading. The second-order ordinary differential equation governing the microscopic porosity evolution is solved with a robust implicit procedure. A new Chebyshev collocation method is employed to approximate the porosity distribution, and remapping is used to optimize memory usage. The adaptive approximation of the porosity distribution leads to a reduction of computational time and memory usage of up to two orders of magnitude. Moreover, the proposed model affords consistent performance: changing the nucleation pressure probability density function and/or the applied strain rate does not reduce accuracy or computational efficiency of the material model. The numerical performance of the model and algorithms presented is tested against three problems for high-density tantalum: single void, one-dimensional uniaxial strain, and two-dimensional plate impact. Here, the results using the integration and algorithmic advances suggest a significant improvement in computational efficiency and accuracy over previous treatments for dynamic loading conditions.
Calculation of the Frequency Distribution of the Energy Deposition in DNA Volumes by Heavy Ions
NASA Technical Reports Server (NTRS)
Plante, Ianik; Cucinotta, Francis A.
2012-01-01
Radiation quality effects are largely determined by energy deposition in small volumes of characteristic sizes less than 10 nm, representative of short segments of DNA, the DNA nucleosome, or molecules initiating oxidative stress in the nucleus, mitochondria, or extra-cellular matrix. On this scale, qualitatively distinct types of molecular damage are possible for high linear energy transfer (LET) radiation such as heavy ions compared to low-LET radiation. Unique types of DNA lesions or oxidative damages are the likely outcome of the energy deposition. The frequency distribution for energy imparted to 1-20 nm targets per unit dose or particle fluence is a useful descriptor and can be evaluated as a function of impact parameter from an ion's track. In this work, the simulation of 1-Gy irradiation of a 5-micron cubic volume by: 1) 450 (1)H(+) ions, 300 MeV; 2) 10 (12)C(6+) ions, 290 MeV/amu; and 3) (56)Fe(26+) ions, 1000 MeV/amu was done with the Monte-Carlo simulation code RITRACKS. Cylindrical targets are generated in the irradiated volume, with random orientation. The frequency distribution curves of the energy deposited in the targets are obtained. For small targets (i.e. <25 nm size), the probability of an ion hitting a target is very small; therefore a large number of tracks and targets, as well as a large number of histories, are necessary to obtain statistically significant results. This simulation is very time-consuming and is difficult to perform using the original version of RITRACKS. Consequently, the code RITRACKS was adapted to use multiple CPUs on a workstation or on a computer cluster. To validate the simulation results, similar calculations were performed using targets with fixed position and orientation, for which experimental data are available [5]. Since the probability of single- and double-strand breaks in DNA as a function of energy deposited is well known, the results that were obtained can be used to estimate the yield of DSBs, and can be extended to include other targeted or non-targeted effects.
Extrinsic pinning of magnetic domain walls in CoFeB-MgO nanowires with perpendicular anisotropy
NASA Astrophysics Data System (ADS)
Zhang, Xueying; Vernier, Nicolas; Zhao, Weisheng; Vila, Laurent; Ravelosona, Dafiné
2018-05-01
In this work, we have studied the mechanism of domain wall motion in 0.2-1.5 μm wide nanowires based on Ta/CoFeB/MgO films with perpendicular magnetic anisotropy. We show that domain wall propagation can be completely stopped due to the presence of strong pinning sites along the nanowires. From the analysis of the distribution of the strongest depinning fields as a function of the wire width, we evidence the presence of extrinsic pinning sites in the nanowires, probably induced by edge damage, that dominate over the intrinsic pinning of the magnetic films even at these large wire widths.
NASA Technical Reports Server (NTRS)
1973-01-01
The HD 220 program was created as part of the space shuttle solid rocket booster recovery system definition. The model was generated to investigate the damage to SRB components under water impact loads. The random nature of environmental parameters, such as ocean waves and wind conditions, necessitates estimation of the relative frequency of occurrence for these parameters. The nondeterministic nature of component strengths also lends itself to probabilistic simulation. The Monte Carlo technique allows the simultaneous perturbation of multiple independent parameters and provides outputs describing the probability distribution functions of the dependent parameters. This allows the user to determine the required statistics for each output parameter.
NASA Astrophysics Data System (ADS)
Kopp, R. E., III; Delgado, M.; Horton, R. M.; Houser, T.; Little, C. M.; Muir-Wood, R.; Oppenheimer, M.; Rasmussen, D. M., Jr.; Strauss, B.; Tebaldi, C.
2014-12-01
Global mean sea level (GMSL) rise projections are insufficient for adaptation planning; local decisions require local projections that characterize risk over a range of timeframes and tolerances. We present a global set of local sea level (LSL) projections to inform decisions on timescales ranging from the coming decades through the 22nd century. We present complete probability distributions, informed by a combination of expert community assessment, expert elicitation, and process modeling [1]. We illustrate the application of this framework by estimating the joint distribution of future sea-level change and coastal flooding, and associated economic costs [1,2]. In much of the world in the current century, differences in median LSL projections are due primarily to varying levels of non-climatic uplift or subsidence. In the 22nd century and in the high-end tails, larger ice sheet contributions, particularly from the Antarctic ice sheet (AIS), contribute significantly to site-to-site differences. Uncertainty in GMSL and most LSL projections is dominated by the uncertain AIS component. Sea-level rise dramatically reshapes flood risk. For example, at the New York City (Battery) tide gauge, our projections indicate a likely (67% probability) 21st century LSL rise under RCP 8.5 of 65-129 cm (1-in-20 chance of exceeding 154 cm). Convolving the distribution of projected sea-level rise with the extreme value distribution of flood return periods indicates that this rise will cause the current 1.80 m '1-in-100 year' flood event to occur an expected nine times over the 21st century, equivalent to the expected number of '1-in-11 year' floods in the absence of sea-level change. Projected sea-level rise for 2100 under RCP 8.5 would likely place $80-160 billion of current property in New York below the high tide line, with a 1-in-20 chance of losses >$190 billion. Even without accounting for potential changes in storms themselves, it would likely increase average annual storm damage by $2.6-5.2 billion (1-in-20 chance of >$7 billion). Projected increases in tropical cyclone intensity would further increase damages [2]. References: [1] R. E. Kopp et al. (2014), Earth's Future, doi:10.1002/2014EF000239. [2] T. Houser et al. (2014), American Climate Prospectus, www.climateprospectus.org.
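A rough sketch of the convolution step described above: sample local sea-level rise from a distribution consistent with the quoted likely range, shift a Gumbel flood-height curve calibrated so that 1.80 m is the current 1-in-100-year level, and accumulate the expected number of exceedances over the century. The Gumbel scale parameter, the normal approximation to the SLR distribution, and the linear-in-time SLR growth are assumptions for illustration, not values from the study.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)

# Current flood-height curve (assumed Gumbel): 1.80 m is the 1-in-100-yr level.
gumbel_scale = 0.25                                           # m, assumed
gumbel_loc = 1.80 - gumbel_scale * (-np.log(-np.log(1 - 0.01)))

def annual_exceedance(level_m):
    return stats.gumbel_r.sf(level_m, loc=gumbel_loc, scale=gumbel_scale)

# SLR in 2100 approximated as normal with 17th-83rd percentiles near 0.65/1.29 m.
slr_mu, slr_sigma = 0.97, 0.33
slr_2100 = np.clip(rng.normal(slr_mu, slr_sigma, 20_000), 0.0, None)

# Expected number of exceedances of today's 1.80 m level over 100 years,
# with SLR assumed to grow linearly in time.
years = np.arange(1, 101)
expected = np.mean([annual_exceedance(1.80 - slr * years / 100).sum()
                    for slr in slr_2100])
print(f"Expected exceedances of today's 1-in-100-yr flood this century: {expected:.1f}")
```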
Ecological risk assessment to support fuels treatment project decisions
Jay O'Laughlin
2010-01-01
Risk is a combined statement of the probability that something of value will be damaged and some measure of the damage's adverse effect. Wildfires burning in the uncharacteristic fuel conditions now typical throughout the Western United States can damage ecosystems and adversely affect environmental conditions. Wildfire behavior can be modified by prefire fuel...
On the Prediction of Ground Motion
NASA Astrophysics Data System (ADS)
Lavallee, D.; Schmedes, J.; Archuleta, R. J.
2012-12-01
Using a slip-weakening dynamic model of rupture, we generated earthquake scenarios that provided the spatio-temporal evolution of the slip on the fault and the radiated field at the free surface. We observed scenarios where the rupture propagates at a supershear speed on some parts of the fault while remaining subshear for other parts of the fault. For some scenarios with nearly identical initial conditions, the rupture speed was always subshear. For both types of scenarios (a mixture of supershear and subshear speeds, and only subshear), we compute the peak ground accelerations (PGA) regularly distributed over the Earth's surface. We then calculate the probability density functions (PDF) of the PGA. For both types of scenarios, the PDF curves are asymmetrically shaped and asymptotically attenuated according to a power law. This behavior of the PDF is similar to that observed for the PDF curves of PGA recorded during earthquakes. The main difference between scenarios with a supershear rupture speed and scenarios with only subshear rupture speed is the range of PGA values. Based on these results, we investigate three issues fundamental for the prediction of ground motion. It is important to recognize that recorded ground motions during an earthquake sample a small fraction of the radiation field. It is not obvious that such sampling will capture the largest ground motion generated during an earthquake, nor that the number of stations is large enough to properly infer the statistical properties associated with the radiation field. To quantify the effect of under (or low) sampling of the radiation field, we design three experiments. For a scenario where the rupture speed is only subshear, we construct multiple sets of observations. Each set comprises 100 randomly selected PGA values from all of the PGA values calculated at the Earth's surface. In the first experiment, we evaluate how the distributions of PGA in the sets compare with the distribution of all the PGA. For this experiment, we used different statistical tests (e.g. chi-square). This experiment quantifies the likelihood that a random set of PGA can be used to infer the statistical properties of all the PGA. In the second experiment, we fit the PDF of the PGA of every set with probability laws used in the literature to describe the PDF of recorded PGA: the lognormal law, the generalized extreme value law, and the Levy law. For each set, the probability laws are then used to compute the probability of observing a PGA value that will cause "moderate to heavy" potential damage according to the Instrumental Intensity scale developed by the USGS. For each probability law, we compare predictions based on the set with the prediction estimated from all the PGA. This experiment quantifies the reliability and uncertainty in predicting an outcome due to under sampling of the radiation field. The third experiment uses the same sets and repeats the two investigations above, this time comparing with a scenario where the rupture has a supershear speed over part of the fault. The objective here is to assess the additional uncertainty in predicting PGA and damage resulting from ruptures that have supershear speeds.
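A toy version of the subsampling experiment described above: draw 100-value subsets from a synthetic PGA field, fit a lognormal law to each, and compare the implied probability of exceeding a damaging PGA level with the value computed from the full field. The lognormal field parameters and the 0.34 g "moderate to heavy damage" threshold are illustrative assumptions, not values from these rupture scenarios.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(6)

# Synthetic "full radiation field": PGA (in g) at many surface points.
full_pga = stats.lognorm(s=0.9, scale=0.08).rvs(size=50_000, random_state=rng)

threshold = 0.34   # g, illustrative damaging-PGA level
p_full = np.mean(full_pga >= threshold)

# Repeatedly sample 100-station "observations" and fit a lognormal law.
p_sets = []
for _ in range(500):
    sample = rng.choice(full_pga, size=100, replace=False)
    s, loc, scale = stats.lognorm.fit(sample, floc=0.0)
    p_sets.append(stats.lognorm.sf(threshold, s, loc=loc, scale=scale))

p_sets = np.array(p_sets)
print(f"full field : P(PGA >= {threshold} g) = {p_full:.4f}")
print(f"100-station: median {np.median(p_sets):.4f}, "
      f"5th-95th pct {np.percentile(p_sets, [5, 95]).round(4)}")
```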
NASA Technical Reports Server (NTRS)
Levin, George M.; Christiansen, Eric L.
1997-01-01
The pre-flight predictions and post-flight assessments carried out in relation to a series of Space Shuttle missions are reviewed, and data are presented for the meteoroid and orbital debris damage observed on the Hubble Space Telescope during the 1994 Hubble repair mission. Pre-flight collision risk analyses are carried out prior to each mission, and in the case of an unacceptable risk, the mission profile is altered until the risk is considered to be acceptable. NASA's BUMPER code is used to compute the probability of damage from debris and meteoroid particle impacts based on the Poisson statistical model for random events. The penetration probability calculation requires information concerning the geometry of the critical systems, the penetration resistance, and mission profile parameters. Following each flight, the orbiter is inspected for meteoroid and space debris damage. The emphasis is on areas such as the radiator panels, the windows, and the reinforced carbon-carbon structures on the leading wing edges and on the nose cap. The contents of damage craters are analyzed using a scanning electron microscope to determine the nature and origin of the impactor. Hypervelocity impact tests are often performed to simulate the observed damage and to estimate the nature of the damaging particles. The number and type of damage observed provide information concerning the orbital debris environment.
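The Poisson model mentioned above reduces to a simple relation: if N is the expected number of penetrating impacts (flux times exposed area times exposure time, after the geometry and penetration-resistance screening), then the probability of no penetration is exp(−N). A small sketch with invented flux and area values, not BUMPER inputs.

```python
import math

# Assumed example inputs (not BUMPER data):
flux = 2.0e-6          # penetrating impacts per m^2 per day for this surface
area = 120.0           # exposed area, m^2
mission_days = 10.0

expected_hits = flux * area * mission_days          # N in the Poisson model
p_no_penetration = math.exp(-expected_hits)
p_one_or_more = 1.0 - p_no_penetration

print(f"expected penetrating impacts N = {expected_hits:.4f}")
print(f"P(no penetration)   = {p_no_penetration:.4f}")
print(f"P(>= 1 penetration) = {p_one_or_more:.4f}")
```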
Nuclear Targeting Terms for Engineers and Scientists
DOE Office of Scientific and Technical Information (OSTI.GOV)
St Ledger, John W.
The Department of Defense has a methodology for targeting nuclear weapons, and a jargon that is used to communicate between the analysts, planners, aircrews, and missile crews. The typical engineer or scientist in the Department of Energy may not have been exposed to the nuclear weapons targeting terms and methods. This report provides an introduction to the terms and methodologies used for nuclear targeting. Its purpose is to prepare engineers and scientists to participate in wargames, exercises, and discussions with the Department of Defense. Terms such as Circular Error Probable, probability of hit and damage, damage expectancy, and the physical vulnerability system are discussed.
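One of the quantities named above, Circular Error Probable (CEP), feeds directly into a single-shot damage probability under the usual circular-normal, cookie-cutter assumptions: P = 1 − 0.5^((R/CEP)^2), where R is the lethal radius against the target. A small sketch with invented numbers; real targeting practice uses the physical vulnerability system rather than a simple lethal radius.

```python
def single_shot_damage_probability(lethal_radius_m: float, cep_m: float) -> float:
    """Cookie-cutter damage probability for a circular-normal miss distance.

    CEP is the radius containing 50% of impacts, so
    P(miss distance <= R) = 1 - 0.5 ** ((R / CEP) ** 2).
    """
    return 1.0 - 0.5 ** ((lethal_radius_m / cep_m) ** 2)

# Illustrative values only.
for cep in (50.0, 100.0, 200.0):
    p = single_shot_damage_probability(lethal_radius_m=120.0, cep_m=cep)
    print(f"CEP = {cep:5.0f} m -> P(damage) = {p:.2f}")
```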
Project "Convective Wind Gusts" (ConWinG)
NASA Astrophysics Data System (ADS)
Mohr, Susanna; Richter, Alexandra; Kunz, Michael; Ruck, Bodo
2017-04-01
Convectively-driven strong winds usually associated with thunderstorms frequently cause substantial damage to buildings and other structures in many parts of the world. Decisive for the high damage potential are the short-term wind speed maxima with durations of a few seconds, termed gusts. Several studies have shown that convectively-driven gusts can reach even higher wind speeds than turbulent gusts associated with synoptic-scale weather systems. Due to the small-scale and non-stationary nature of convective wind gusts, there is a considerable lack of knowledge regarding their characteristics and statistics. Furthermore, their interaction with urban structures and their influence on buildings are not yet fully understood. For these two reasons, convective wind events are not included in the present wind load standards for buildings and structures, which so far have been based solely on the characteristics of synoptically-driven wind gusts in the near-surface boundary layer (e.g., DIN EN 1991-1-4:2010-12; ASCE 7). However, convective and turbulent gusts differ considerably, e.g. concerning vertical wind-speed profiles, gust factors (i.e., maximum to mean wind speed), or exceedance probability curves. In an effort to remedy this situation, the overarching objective of the DFG project "Convective Wind Gusts" (ConWinG) is to investigate the characteristics and statistics of convective gusts as well as their interaction with urban structures. Based on a set of 110 climate stations of the German Weather Service (DWD) between 1992 and 2014, we analyzed the temporal and spatial distribution, intensity, and occurrence probability of convective gusts. Similar to thunderstorm activity, the frequency of convective gusts decreases gradually from southern to northern Germany. A relation between gust intensity/probability and orography or climatic conditions cannot be identified. Rather, high wind speeds, e.g., above 30 m/s, can be expected everywhere in Germany with almost similar occurrence probabilities. A laboratory experiment with an impinging jet simulating the downdraft was performed to investigate the propagation of a gust within the built environment. The aim is to investigate the interaction of the resulting convective gusts along the near-surface layers with different urban structures, from single street canyons up to more complex block array structures. It was shown that high velocities are conserved within street canyons over longer distances compared to open terrain conditions. In addition, the experiments revealed the ratio of building height to downdraft size as a crucial factor with regard to vertical velocities at roof level and the pressure distribution on the facades.
Galactic Cosmic Ray Event-Based Risk Model (GERM) Code
NASA Technical Reports Server (NTRS)
Cucinotta, Francis A.; Plante, Ianik; Ponomarev, Artem L.; Kim, Myung-Hee Y.
2013-01-01
This software describes the transport and energy deposition of galactic cosmic rays passing through astronaut tissues during space travel, or of heavy ion beams in patients undergoing cancer therapy. Space radiation risk is a probability distribution, and time-dependent biological events must be accounted for in the physical description of space radiation transport in tissues and cells. A stochastic model can calculate the probability density directly without unverified assumptions about the shape of the probability density function. The prior art of transport codes calculates the average flux and dose of particles behind spacecraft and tissue shielding. Because of the signaling times for activation and relaxation in the cell and tissue, a transport code must describe the temporal and microspatial density functions needed to correlate DNA and oxidative damage with non-targeted effects such as signaling and bystander effects. These effects are ignored or impossible to treat in the prior art. The GERM code provides scientists with data interpretation of experiments; modeling of the beam line, shielding of target samples, and sample holders; and estimation of basic physical and biological outputs of their experiments. For mono-energetic ion beams, basic physical and biological properties are calculated for a selected ion type, such as kinetic energy, mass, charge number, absorbed dose, or fluence. Evaluated quantities are linear energy transfer (LET), range (R), absorption and fragmentation cross-sections, and the probability of nuclear interactions after 1 or 5 cm of water-equivalent material. In addition, a set of biophysical properties is evaluated, such as the Poisson distribution for a specified cellular area, cell survival curves, and DNA damage yields per cell. Also, the GERM code calculates the radiation transport of the beam line for either a fixed number of user-specified depths or at multiple positions along the Bragg curve of the particle in a selected material. The GERM code makes numerical estimates of basic physical and biophysical quantities of high-energy protons and heavy ions that have been studied at the NASA Space Radiation Laboratory (NSRL) for the purpose of simulating space radiation biological effects. In the first option, properties of monoenergetic beams are treated. In the second option, the transport of beams in different materials is treated. Similar biophysical properties as in the first option are evaluated for the primary ion and its secondary particles. Additional properties related to the nuclear fragmentation of the beam are evaluated. The GERM code is a computationally efficient Monte-Carlo heavy-ion-beam model. It includes accurate models of LET, range, residual energy, and straggling, and the quantum multiple scattering fragmentation (QMSGRG) nuclear database.
Dinov, Ivo D; Siegrist, Kyle; Pearl, Dennis K; Kalinin, Alexandr; Christou, Nicolas
2016-06-01
Probability distributions are useful for modeling, simulation, analysis, and inference on varieties of natural processes and physical phenomena. There are uncountably many probability distributions. However, a few dozen families of distributions are commonly defined and are frequently used in practice for problem solving, experimental applications, and theoretical studies. In this paper, we present a new computational and graphical infrastructure, the Distributome , which facilitates the discovery, exploration and application of diverse spectra of probability distributions. The extensible Distributome infrastructure provides interfaces for (human and machine) traversal, search, and navigation of all common probability distributions. It also enables distribution modeling, applications, investigation of inter-distribution relations, as well as their analytical representations and computational utilization. The entire Distributome framework is designed and implemented as an open-source, community-built, and Internet-accessible infrastructure. It is portable, extensible and compatible with HTML5 and Web2.0 standards (http://Distributome.org). We demonstrate two types of applications of the probability Distributome resources: computational research and science education. The Distributome tools may be employed to address five complementary computational modeling applications (simulation, data-analysis and inference, model-fitting, examination of the analytical, mathematical and computational properties of specific probability distributions, and exploration of the inter-distributional relations). Many high school and college science, technology, engineering and mathematics (STEM) courses may be enriched by the use of modern pedagogical approaches and technology-enhanced methods. The Distributome resources provide enhancements for blended STEM education by improving student motivation, augmenting the classical curriculum with interactive webapps, and overhauling the learning assessment protocols.
Random Partition Distribution Indexed by Pairwise Information
Dahl, David B.; Day, Ryan; Tsai, Jerry W.
2017-01-01
We propose a random partition distribution indexed by pairwise similarity information such that partitions compatible with the similarities are given more probability. The use of pairwise similarities, in the form of distances, is common in some clustering algorithms (e.g., hierarchical clustering), but we show how to use this type of information to define a prior partition distribution for flexible Bayesian modeling. A defining feature of the distribution is that it allocates probability among partitions within a given number of subsets, but it does not shift probability among sets of partitions with different numbers of subsets. Our distribution places more probability on partitions that group similar items yet keeps the total probability of partitions with a given number of subsets constant. The distribution of the number of subsets (and its moments) is available in closed-form and is not a function of the similarities. Our formulation has an explicit probability mass function (with a tractable normalizing constant) so the full suite of MCMC methods may be used for posterior inference. We compare our distribution with several existing partition distributions, showing that our formulation has attractive properties. We provide three demonstrations to highlight the features and relative performance of our distribution. PMID:29276318
Damage patterns of historical earthquakes in and around Israel
NASA Astrophysics Data System (ADS)
Zohar, M.; Salamon, A.; Rubin, R.
2013-12-01
Numerous historical reports of earthquakes in the Levant, including their damage and effects, have been accumulating for the last 3,000 years. Most of the seismic activity is associated with the Dead Sea Transform (DST), the plate border between Arabia and Sinai. In this work we focus on the central and southern parts of the DST, where Israel and its surroundings have repeatedly suffered damage. Many of the relevant reports were previously gathered and organized in catalogues and lists. However, some of the early cataloguers did not screen the historical sources and thus their information cannot be taken for granted. More recently, reviewed catalogues have been presented that address this shortcoming and consequently provide updated and more reliable information. Yet the bulk of that data has not yet been fully analyzed, particularly the spatial distribution of the damage. We have collected information associated with damaging events that occurred from the second millennium BCE to the first event recorded by modern instruments in 1927 CE. At the first stage we screened each of the historical reports, determined its reliability, and then characterized the events by date, size, type and approximate geographic origin. At the same time, we related the damage reports to geographic coordinates, approximate severity, and accompanying environmental effects. Finally, we stored these records in a GIS-based relational database constructed so as to enable flexible queries and data manipulations. Preliminary results of frequency-magnitude relations show that the list of events seems to be complete for the estimated magnitudes M > 7, M > 5.5 and M ≥ 5 in the last two millennia, since 1500 CE and since 1800 CE, respectively. The temporal distribution of the events indicates three periods of relatively intense seismic activity: (1) from the 4th to the mid-8th century, followed by an almost total absence of reports; (2) from the beginning of the 11th to the beginning of the 13th century; and (3) from the beginning of the 19th century until 1927 CE. While the latter period probably reflects the expansion of media and communication whereby almost every one of the felt events was reported, the first two intervals need further explanation. Most of the damage reported was from the long-term inhabited cities in central and northern Israel, giving us a broad perspective in time and space. It seems that earthquakes that originated north of Israel (e.g., in 1170, 1202, 1759, and 1837) or in the southern part of the DST (e.g., 1068) affected larger areas than those originating in central Israel. For the latter events, two groups could be distinguished: (1) events with more or less similar damage distribution, extending from the Galilee in the north to the northern Negev in the south as well as to localities east of the Jordan River (e.g., in 363, 749, 1033 and 1927); and (2) events that damaged smaller areas, mainly in central Israel (e.g., in 634, 1293, 1458 and 1546). In the next stage we will focus on the damage history and intensity of each of the localities.
A brief introduction to probability.
Di Paola, Gioacchino; Bertani, Alessandro; De Monte, Lavinia; Tuzzolino, Fabio
2018-02-01
The theory of probability has been debated for centuries: back in the 1600s, French mathematicians used the rules of probability to place and win bets. Since then, the knowledge of probability has evolved significantly and is now an essential tool for statistics. In this paper, the basic theoretical principles of probability will be reviewed, with the aim of facilitating the comprehension of statistical inference. After a brief general introduction to probability, we will review the concept of the "probability distribution", a function providing the probabilities of occurrence of the different possible outcomes of a categorical or continuous variable. Specific attention will be focused on the normal distribution, the distribution most relevant to statistical analysis.
Sonja N. Oswalt; Christopher M. Oswalt
2008-01-01
This paper compares and contrasts hurricane-related damage recorded across the Mississippi landscape in the 2 years following Katrina with initial damage assessments based on modeled parameters by the USDA Forest Service. Logistic and multiple regressions are used to evaluate the influence of stand characteristics on tree damage probability. Specifically, this paper...
Advances in threat assessment and their application to forest and rangeland management—Volume 2
H. Michael Rauscher; Yasmeen Sands; Danny C. Lee; Jerome S. Beatty
2010-01-01
Risk is a combined statement of the probability that something of value will be damaged and some measure of the damage's adverse effect. Wildfires burning in the uncharacteristic fuel conditions now typical throughout the Western United States can damage ecosystems and adversely affect environmental conditions. Wildfire behavior can be modified by prefire fuel...
Capturing changes in flood risk with Bayesian approaches for flood damage assessment
NASA Astrophysics Data System (ADS)
Vogel, Kristin; Schröter, Kai; Kreibich, Heidi; Thieken, Annegret; Müller, Meike; Sieg, Tobias; Laudan, Jonas; Kienzler, Sarah; Weise, Laura; Merz, Bruno; Scherbaum, Frank
2016-04-01
Flood risk is a function of hazard as well as of exposure and vulnerability. All three components change over space and time and have to be considered for reliable damage estimation and risk analysis, since these are the basis for efficient, adaptable risk management. Hitherto, models for estimating flood damage have been comparatively simple and cannot sufficiently account for changing conditions. The Bayesian network approach allows for multivariate modeling of complex systems without relying on expert knowledge about physical constraints. In a Bayesian network each model component is considered to be a random variable. The interactions between those variables can be learned from observations or defined by expert knowledge; even a combination of both is possible. Moreover, the probabilistic framework captures uncertainties related to the prediction and provides a probability distribution for the damage instead of a point estimate. The graphical representation of Bayesian networks helps to study how probabilities change under changing circumstances and may thus simplify the communication between scientists and public authorities. In the framework of the DFG Research Training Group "NatRiskChange" we aim to develop Bayesian networks for flood damage and vulnerability assessments of residential buildings and companies under changing conditions. A Bayesian network learned from data collected over the last 15 years in flooded regions in the Elbe and Danube catchments (Germany) reveals the impact of many variables, such as building characteristics, precaution and warning situation, on flood damage to residential buildings. While the handling of incomplete and hybrid (discrete mixed with continuous) data is the most challenging issue in the study on residential buildings, a similar study, which focuses on the vulnerability of small to medium-sized companies, poses new challenges. Because a much smaller data set is available for determining the model parameters, overly complex models have to be avoided. A so-called Markov blanket approach aims at identifying the most relevant factors and constructs a Bayesian network based on those findings. With our approach we want to exploit a major advantage of Bayesian networks, which is their ability to consider dependencies not only pairwise, but to capture the joint effects and interactions of driving forces. Hence, the flood damage network does not only show the impact of precaution on the building damage separately, but also reveals the mutual effects of precaution and the quality of warning for a variety of flood settings. Thus, it allows for a consideration of changing conditions and different courses of action and forms a novel and valuable tool for decision support. This study is funded by the Deutsche Forschungsgemeinschaft (DFG) within the research training program GRK 2043/1 "NatRiskChange - Natural hazards and risks in a changing world" at the University of Potsdam.
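As a loose illustration of the joint-effect idea described above, the following sketch builds a toy discrete Bayesian network in plain Python. The binary "precaution" and "warning" nodes and all conditional probability tables are invented for illustration only; they are not taken from the study.

```python
import numpy as np

p_precaution = np.array([0.6, 0.4])   # P(precaution = no, yes), hypothetical prior
p_warning = np.array([0.5, 0.5])      # P(warning = poor, good), hypothetical prior

# P(damage = low, high | precaution, warning); shape (precaution, warning, damage)
p_damage = np.array([[[0.3, 0.7],     # no precaution, poor warning
                      [0.5, 0.5]],    # no precaution, good warning
                     [[0.6, 0.4],     # precaution, poor warning
                      [0.8, 0.2]]])   # precaution, good warning

# Marginal damage distribution: sum over the parents of prior x conditional table
joint = p_precaution[:, None, None] * p_warning[None, :, None] * p_damage
print("P(damage = low, high):", joint.sum(axis=(0, 1)))

# Conditioning shows the mutual effect of precaution and warning quality
print("P(damage | precaution taken, good warning):", p_damage[1, 1])
```

The point of the toy example is only that the damage node is conditioned on both parents jointly, which is the property the abstract highlights over pairwise treatments.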
A tool for the calculation of rockfall fragility curves for masonry buildings
NASA Astrophysics Data System (ADS)
Mavrouli, Olga
2017-04-01
Masonry buildings are common in mountainous and coastal areas and exhibit substantial vulnerability to rockfalls. For large rockfall events or precarious structures, the damage is very high and repair is not cost-effective. Nonetheless, for small or moderate rockfalls, the damage may vary as a function of the characteristics of the impacting rock blocks and of the buildings. The evaluation of the expected damage for masonry buildings, and for different small and moderate rockfall scenarios, is useful for assessing the expected direct loss in constructed areas and its implications for life safety. A tool for the calculation of fragility curves for masonry buildings impacted by rock blocks is presented. The fragility curves provide the probability of exceeding a given damage state (low, moderate and high) for increasing impact energies of the rock blocks on the walls. The damage states are defined according to a damage index equal to the percentage of the damaged area of a wall, taken as proportional to the repair cost. Aleatoric and epistemic uncertainties are incorporated with respect to (i) rock block velocity, (ii) rock block size, (iii) masonry width, and (iv) masonry resistance. The fragility curves are calculated using a Monte Carlo simulation. Given user-defined data for the average value of these four parameters and their variability, random scenarios are developed, the respective damage index is assessed for each scenario, and the probability of exceedance of each damage state is calculated. For the assessment of the damage index, a database developed from the results of 576 analytical simulations is used. The variable ranges are: wall width 0.4-1.0 m, wall tensile strength 0.1-0.6 MPa, rock velocity 1-20 m/s, rock size 1-20 m³. Nonetheless, this tool permits the use of alternative databases, on the condition that they contain data that correlate the damage with the four aforementioned variables. The fragility curves can be calculated using this tool either for single buildings or for groups of buildings, as long as their characteristics are properly reflected in the variability of the input parameters. Selected examples of fragility curve sets are presented, demonstrating the effect of the input parameters on the calculated probability of exceeding a given damage state for different masonry typologies (stone and brick).
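A minimal sketch of the Monte Carlo fragility-curve procedure described above follows. The damage-index formula is a hypothetical placeholder (the tool instead interpolates the database of 576 analytical simulations), and the parameter distributions are illustrative only.

```python
import numpy as np

def prob_exceedance(impact_energy_kJ, damage_threshold=0.2, n_sim=10_000, seed=1):
    """Probability that a (hypothetical) damage index exceeds a damage-state threshold."""
    rng = np.random.default_rng(seed)
    width = rng.uniform(0.4, 1.0, n_sim)        # masonry width, m
    strength = rng.uniform(0.1, 0.6, n_sim)     # masonry tensile strength, MPa
    # Placeholder damage index: grows with impact energy, shrinks with wall capacity
    damage_index = impact_energy_kJ / (impact_energy_kJ + 500.0 * width * strength)
    return float(np.mean(damage_index > damage_threshold))

# One fragility-curve point per impact energy
energies = np.linspace(10.0, 2000.0, 20)
fragility_curve = [prob_exceedance(e) for e in energies]
print(list(zip(energies.round(0), np.round(fragility_curve, 2))))
```

Repeating the loop for each damage-state threshold would give the family of curves (low, moderate, high) mentioned in the abstract.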
Yura, Harold T; Hanson, Steen G
2012-04-01
Methods for simulation of two-dimensional signals with arbitrary power spectral densities and signal amplitude probability density functions are disclosed. The method relies on initially transforming a white noise sample set of random Gaussian distributed numbers into a corresponding set with the desired spectral distribution, after which this colored Gaussian probability distribution is transformed via an inverse transform into the desired probability distribution. In most cases the method provides satisfactory results and can thus be considered an engineering approach. Several illustrative examples with relevance for optics are given.
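The two-step recipe summarized above can be sketched as follows. This is not the authors' code; the Gaussian power spectrum and the gamma amplitude distribution are assumptions chosen only to make the example concrete.

```python
import numpy as np
from scipy import stats

def simulate_field(n=256, corr_length=8.0, target=stats.gamma(a=2.0), seed=0):
    rng = np.random.default_rng(seed)
    white = rng.standard_normal((n, n))                          # white Gaussian sample set
    fx = np.fft.fftfreq(n)
    fX, fY = np.meshgrid(fx, fx, indexing="ij")
    psd = np.exp(-(fX**2 + fY**2) * (np.pi * corr_length) ** 2)  # assumed Gaussian PSD
    # Step 1: color the white noise to the desired spectral distribution
    colored = np.fft.ifft2(np.fft.fft2(white) * np.sqrt(psd)).real
    colored /= colored.std()
    # Step 2: point-wise inverse transform to the desired amplitude PDF
    uniform = stats.norm.cdf(colored)
    return target.ppf(uniform)

field = simulate_field()
print(field.shape, round(field.mean(), 3))
```

Note that the point-wise transform in step 2 perturbs the spectrum somewhat, which is consistent with the abstract's remark that the method is satisfactory "in most cases" rather than exact.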
Vibration fatigue using modal decomposition
NASA Astrophysics Data System (ADS)
Mršnik, Matjaž; Slavič, Janko; Boltežar, Miha
2018-01-01
Vibration-fatigue analysis deals with the material fatigue of flexible structures operating close to natural frequencies. Based on the uniaxial stress response, calculated in the frequency domain, the high-cycle fatigue model using the S-N curve material data and the Palmgren-Miner hypothesis of damage accumulation is applied. The multiaxial criterion is used to obtain the equivalent uniaxial stress response, followed by the spectral moment approach to the cycle-amplitude probability density estimation. The vibration-fatigue analysis relates the fatigue analysis in the frequency domain to the structural dynamics. However, once the stress response within a node is obtained, the physical model of the structure dictating that response is discarded and does not propagate through the fatigue-analysis procedure. The structural model can be used to evaluate how specific dynamic properties (e.g., damping, modal shapes) affect the damage intensity. A new approach based on modal decomposition is presented in this research that directly links the fatigue-damage intensity with the dynamic properties of the system. It thus offers a valuable insight into how different modes of vibration contribute to the total damage to the material. A numerical study was performed showing good agreement between results obtained using the newly presented approach and those obtained using the classical method, especially with regard to the distribution of damage intensity and critical point location. The presented approach also offers orders of magnitude faster calculation in comparison with the conventional procedure. Furthermore, it can be applied in a straightforward way to strain experimental modal analysis results, taking advantage of experimentally measured strains.
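The Palmgren-Miner accumulation mentioned above amounts to a simple damage sum. The sketch below uses a Basquin-type S-N curve with illustrative constants, not material data from the paper.

```python
import numpy as np

def miner_damage(stress_amplitudes_MPa, cycle_counts, C=1e15, k=5.0):
    """D = sum(n_i / N_i), where N_i = C * S_i**(-k) is the allowable cycle count."""
    S = np.asarray(stress_amplitudes_MPa, dtype=float)
    n = np.asarray(cycle_counts, dtype=float)
    N_allowable = C * S ** (-k)
    return float(np.sum(n / N_allowable))

# Two stress-amplitude bins taken from a (hypothetical) cycle-counting result
D = miner_damage([80.0, 120.0], [1e5, 1e4])
print(round(D, 3))   # failure is predicted when the accumulated damage reaches 1
```

In the spectral formulation the cycle counts per amplitude bin come from the cycle-amplitude probability density rather than from time-domain counting, but the damage sum itself is the same.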
The global impact distribution of Near-Earth objects
NASA Astrophysics Data System (ADS)
Rumpf, Clemens; Lewis, Hugh G.; Atkinson, Peter M.
2016-02-01
Asteroids that could collide with the Earth are listed on the publicly available Near-Earth object (NEO) hazard web sites maintained by the National Aeronautics and Space Administration (NASA) and the European Space Agency (ESA). The impact probability distributions of 69 potentially threatening NEOs from these lists, which produce 261 dynamically distinct impact instances, or Virtual Impactors (VIs), were calculated using the Asteroid Risk Mitigation and Optimization Research (ARMOR) tool in conjunction with OrbFit. ARMOR projected the impact probability of each VI onto the surface of the Earth as a spatial probability distribution. The projection considers orbit solution accuracy and the global impact probability. The method of ARMOR is introduced and the tool is validated against two asteroid-Earth collision cases, objects 2008 TC3 and 2014 AA. In the analysis, the natural distribution of impact corridors is contrasted with the impact probability distribution to evaluate the distributions' conformity with the uniform impact distribution assumption. The distribution of impact corridors is based on the NEO population and orbital mechanics. The analysis shows that the distribution of impact corridors matches the common assumption of uniform impact distribution, and the result extends the evidence base for the uniform assumption from qualitative analysis of historic impact events into the future in a quantitative way. This finding is confirmed in a parallel analysis of impact points belonging to a synthetic population of 10,006 VIs. Taking the impact probabilities into account introduced significant variation into the results, and the impact probability distribution consequently deviates markedly from uniformity. The concept of impact probabilities is a product of the asteroid observation and orbit determination technique and thus represents a man-made component that is largely disconnected from natural processes. It is important to consider impact probabilities because such information represents the best estimate of where an impact might occur.
Space Shuttle Debris Transport
NASA Technical Reports Server (NTRS)
Gomez, Reynaldo J., III
2010-01-01
This slide presentation reviews the assessment of debris damage to the Space Shuttle and the use of computation to assist in space shuttle applications. The presentation reviews the sources of debris, a mechanism for determining the probability of damaging debris impacting the shuttle, the tools used, the elimination of potentially damaging debris sources, the use of computation to assess damage while in flight, and a chart showing the applications that have been run on increasingly powerful computers to simulate the shuttle and the debris transport.
Effect of the state of internal boundaries on granite fracture nature under quasi-static compression
NASA Astrophysics Data System (ADS)
Damaskinskaya, E. E.; Panteleev, I. A.; Kadomtsev, A. G.; Naimark, O. B.
2017-05-01
Based on an analysis of the spatial distribution of hypocenters of acoustic emission signal sources and an analysis of the energy distributions of acoustic emission signals, the effect of the liquid phase and a weak electric field on the spatiotemporal nature of granite sample fracture is studied. Experiments on uniaxial compression of granite samples of natural moisture showed that the damage accumulation process is two-stage: dispersed accumulation of damage is followed by localized accumulation of damage in the region of the forming macrofracture nucleus. In the energy distributions of acoustic emission signals, this transition is accompanied by a change in the distribution shape from exponential to power-law. Water saturation of the granite qualitatively changes the nature of damage accumulation: the process remains delocalized until macrofracture, with an exponential energy distribution of acoustic emission signals. Exposure to a weak electric field results in a selective change in the nature of damage accumulation within the sample volume.
Yi, Ren; Xiao-Ping, Gao; Hui, Liang
2015-03-27
Intracranial aneurysm, a common cause of cerebral hemorrhage, is often discovered when the aneurysm ruptures, causing subarachnoid hemorrhage. Unfortunately, the formation of cerebral aneurysms, which is associated with endothelial damage and macrophage migration, still cannot be prevented. Tight junctions (TJs) open due to the disappearance of the TJ proteins occludin and zona occludens-1 (ZO-1) in damaged endothelia, thus allowing macrophage migration and the formation of cerebral aneurysms. Therefore, cerebral aneurysm formation might be prevented by increasing the TJs of the arterial endothelium. Interestingly, statins, which can reduce saccular aneurysms, may prevent aneurysm formation by acting on different steps, but the underlying mechanism remains unclear. In this study, angiotensin II (Ang II) significantly increased the permeability of human arterial endothelial cells (HAECs). Moreover, the distribution of ZO-1 in the cell-cell junction area and its total expression in HAECs were significantly decreased by Ang II treatment. However, the abnormal distribution and decreased expression of ZO-1 and the hyperpermeability of HAECs were significantly reversed by pretreatment with atorvastatin. Furthermore, Ang II-induced phosphorylation of MYPT1, LIMK and MLC2 was significantly inhibited by pretreatment with atorvastatin or the Rho kinase (ROCK) inhibitor H1152. Knockdown of ROCK-II probably abolished the Ang II-induced abnormal ZO-1 distribution, expression deficiency and hyperpermeability of HAECs. In conclusion, atorvastatin prevented Ang II-induced rupture of HAEC monolayers by suppressing the ROCK signaling pathway. Our results may explain, at least in part, some beneficial effects of statins on cardiovascular diseases such as intracranial aneurysm. Copyright © 2015 Elsevier Inc. All rights reserved.
Abebe, Workineh; Collar, Concha; Ronda, Felicidad
2015-01-22
Tef grain is becoming very attractive in Western countries since it is a gluten-free grain with appreciated nutritional advantages. However, there is little information on its functional properties and starch digestibility, or on how they are affected by variety and particle size distribution. This work evaluates the effect of the grain variety and the mill used on tef flour physico-chemical and functional properties, mainly derived from starch behavior. In vitro starch digestibility of the flours was assessed by the Englyst method. Two types of mills were used to obtain whole flours of different granulation. Rice and wheat flours were analyzed as references. Protein molecular weight distribution and flour structure by SEM were also analyzed to explain some of the differences found among the cereals studied. Tef cultivar and mill type exhibited an important effect on granulation, bulking density and starch damage, affecting the processing performance of the flours and determining the hydration and pasting properties. The color was darker, although one of the white varieties had a lightness close to that of the reference flours. Different granulations of tef flour induced different in vitro starch digestibility. The disc attrition mill led to a higher starch digestibility rate index and more rapidly available glucose, probably as a consequence of a higher damaged starch content. The results confirm the adequacy of tef flour as an ingredient in the formulation of new cereal-based foods and the importance of the variety and the mill for its functional properties. Copyright © 2014 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Huerfano, V. A.; Cua, G.; von Hillebrandt, C.; Saffar, A.
2007-12-01
The island of Puerto Rico has a long history of damaging earthquakes. Major earthquakes from off-shore sources affected Puerto Rico in 1520, 1615, 1670, 1751, 1787, 1867, and 1918 (Mueller et al, 2003; PRSN Catalogue). Recent trenching has also yielded evidence of possible M7.0 events inland (Prentice, 2000). The high seismic hazard, large population, high tsunami potential and relatively poor construction practice form a potentially devastating combination. Efficient emergency response in the event of a large earthquake will be crucial to minimizing the loss of life and disruption of lifeline systems in Puerto Rico. The ShakeMap system (Wald et al, 2004), developed by the USGS to rapidly display and disseminate information about the geographical distribution of ground shaking (and hence potential damage) following a large earthquake, has proven to be a vital tool for post-earthquake emergency response efforts, and is being adopted or emulated in various seismically active regions worldwide. Implementing a robust ShakeMap system is among the top priorities of the Puerto Rico Seismic Network. However, the ultimate effectiveness of ShakeMap in post-earthquake response depends not only on its rapid availability, but also on the effective use of the information it provides. We developed ShakeMap scenarios for a suite of damaging historical and probable earthquakes that severely impact San Juan, Ponce, and Mayagüez, the three largest cities in Puerto Rico. Earthquake source parameters were obtained from McCann and Mercado (1998) and Huérfano (2004). For historical earthquakes that generated tsunamis, tsunami inundation maps were generated using the TIME method (Shuto, 1991). The ShakeMap ground shaking maps were presented to local and regional governmental and emergency response agencies at the 2007 Annual Conference of the Puerto Rico Emergency Management and Disaster Administration in San Juan, PR, and at numerous other emergency management talks and training sessions. Economic losses are estimated using the ShakeMap scenario ground motions (Saffar, 2007). The calibration tasks necessary to generate these scenarios (developing Vs30 maps and attenuation relationships) complement the ongoing efforts of the Puerto Rico Seismic Network to generate ShakeMaps in real time.
Methodology of Historical Flood Evaluation from Korean Historical Documents during AD 1392 to 1910
NASA Astrophysics Data System (ADS)
Cho, H. B.; Kim, H.; Noh, S.; Jang, C.
2007-12-01
Studies of extreme flood events face a critical limitation, a shortage of historical data, because modern systematic observations do not cover a long enough time series. Historical documentary records can therefore be an important source of additional information on extreme flood events that occurred before instrumental observations began. For proper data mining, documentary records satisfying the following four conditions are preferred: (1) a long enough time series, (2) official archives covering the whole Korean peninsula, (3) a sufficiently large number of records, and (4) detailed damage descriptions. The Annals of the Choson Dynasty span about 500 years and contain 511 flood records from the Choson Dynasty in ancient Korea. According to the annals, flood damage records are densest in the middle of the 17th century, and the largest damage to human life and to residences occurred in 1739 and 1856, respectively. Another source is the Jeungbo-Munheonbigo, a taxonomic document categorized by themes such as culture, social systems, and climate, which contains 79 flood damage records. An effective way to analyze these historical floods without water level data is to classify and categorize the flood damage records, because all records are written in a descriptive way. Consequently, the 556 records are categorized into 10 items by flood damage type, and each record is classified into three grades according to how much of it is expressed numerically. These grouping results are used to decide a reasonable period range from which to extract detailed information within the entire inspection period. In addition, a Historical Flood Evaluation Index (HFEI) can thereby be derived from these processes, in a quantitative and statistical way, to evaluate the magnitude of each ancient flood. In this research, flood damage evaluation focuses mainly on damage to human beings and residences. Degree ranges based on cumulative probability are also derived from the two damage inventories. The HFEI with conditional weighting factors is applied to every flood record and to the analysis of the flood distribution in the annual series.
NASA Astrophysics Data System (ADS)
Suppasri, A.; Charvet, I.; Leelawat, N.; Fukutani, Y.; Muhari, A.; Futami, T.; Imamura, F.
2014-12-01
This study focused in turn on detailed data on building and boat damage caused by the 2011 tsunami in order to understand its main causes and provide damage probability estimates. Tsunami-induced building damage data were collected from field surveys and include inundation depth, building material, number of stories and occupancy type for more than 80,000 buildings. Numerical simulations with high-resolution bathymetry and topography data were conducted to obtain characteristic tsunami measures such as flow velocity. These data were analyzed using advanced statistical methods (ordinal regression analysis) to create not only empirical 2D tsunami fragility curves, but also, for the first time, 3D tsunami fragility surfaces. The effect of floating debris was also considered by using a binary indicator of debris impact based on the proximity of a structure to a debris source (i.e., a washed-away building). Both the 2D and 3D fragility analyses provided results for each building damage level and for different topographies. While 2D fragility curves provide easily interpretable results relating tsunami flow depth to damage probability for different damage levels, 3D fragility surfaces allow several influential tsunami parameters to be taken into account, thus reducing uncertainty in the probability estimates. More than 20,000 damaged boats were used in an analysis similar to the one carried out on the buildings. Detailed data for each boat comprise information on the damage ratio (paid value over insured value), tonnage, engine type, material type and damage classification. The 2D and 3D fragility analyses were developed using representative tsunami heights for each port obtained from field surveys and flow velocities obtained from the aforementioned simulations. The results are currently being adapted for practical disaster mitigation. They are being integrated with the probabilistic tsunami hazard analysis in order to create offshore and onshore probabilistic hazard maps. Through GPS and an embedded calculation function based on the aforementioned fragility results, these applications can be used in the field for a quick estimation of possible building damage, as well as a decision support system for fishermen (whether or not they should move their boats to the deep sea upon tsunami arrival).
HLA-matched sibling bone marrow transplantation for β-thalassemia major
Sabloff, Mitchell; Chandy, Mammen; Wang, Zhiwei; Logan, Brent R.; Ghavamzadeh, Ardeshir; Li, Chi-Kong; Irfan, Syed Mohammad; Bredeson, Christopher N.; Cowan, Morton J.; Gale, Robert Peter; Hale, Gregory A.; Horan, John; Hongeng, Suradej; Eapen, Mary
2011-01-01
We describe outcomes after human leukocyte antigen-matched sibling bone marrow transplantation (BMT) for 179 patients with β-thalassemia major. The median age at transplantation was 7 years and the median follow-up was 6 years. The distribution of Pesaro risk class I, II, and III categories was 2%, 42%, and 36%, respectively. The day 30 cumulative incidence of neutrophil recovery and day 100 platelet recovery were 90% and 86%, respectively. Seventeen patients had graft failure, which was fatal in 11. Six of 9 patients with graft failure are alive after a second transplantation. The day 100 probability of acute graft-versus-host disease and the 5-year probability of chronic graft-versus-host disease were 38% and 13%, respectively. The 5-year probabilities of overall and disease-free survival were 91% and 88%, respectively, for patients with Pesaro risk class II, and 64% and 62%, respectively, for Pesaro risk class III. In multivariate analysis, mortality risks were higher in patients 7 years of age and older and those with hepatomegaly before BMT. The leading causes of death were interstitial pneumonitis (n = 7), hemorrhage (n = 8), and veno-occlusive disease (n = 6). Proceeding to BMT in children younger than 7 years before development of end-organ damage, particularly in the liver, should improve results after BMT for β-thalassemia major. PMID:21119108
Survival of Norway spruce remains higher in mixed stands under a dryer and warmer climate.
Neuner, Susanne; Albrecht, Axel; Cullmann, Dominik; Engels, Friedrich; Griess, Verena C; Hahn, W Andreas; Hanewinkel, Marc; Härtl, Fabian; Kölling, Christian; Staupendahl, Kai; Knoke, Thomas
2015-02-01
Shifts in tree species distributions caused by climatic change are expected to cause severe losses in the economic value of European forestland. However, this projection disregards potential adaptation options such as tree species conversion, shorter production periods, or establishment of mixed species forests. The effect of tree species mixture has, as yet, not been quantitatively investigated for its potential to mitigate future increases in production risks. For the first time, we use survival time analysis to assess the effects of climate, species mixture and soil condition on survival probabilities for Norway spruce and European beech. Accelerated Failure Time (AFT) models based on an extensive dataset of almost 65,000 trees from the European Forest Damage Survey (FDS)--part of the European-wide Level I monitoring network--predicted a 24% decrease in survival probability for Norway spruce in pure stands at age 120 when unfavorable changes in climate conditions were assumed. Increasing species admixture greatly reduced the negative effects of unfavorable climate conditions, resulting in a decline in survival probabilities of only 7%. We conclude that future studies of forest management under climate change as well as forest policy measures need to take this, as yet unconsidered, strongly advantageous effect of tree species mixture into account. © 2014 John Wiley & Sons Ltd.
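To make the survival-time idea concrete, here is a hedged sketch of a Weibull accelerated-failure-time survival curve with a species-mixture covariate. The coefficients are invented for illustration and are not the fitted FDS values.

```python
import numpy as np

def aft_survival(age_years, mixture_share, beta0=5.3, beta_mix=0.4, shape=1.5):
    """Weibull AFT model: S(t | x) = exp(-(t / exp(beta0 + beta_mix * x)) ** shape)."""
    scale = np.exp(beta0 + beta_mix * mixture_share)   # covariate stretches survival time
    return float(np.exp(-(age_years / scale) ** shape))

for share in (0.0, 0.5, 1.0):   # pure stand ... strongly admixed stand
    print(share, round(aft_survival(120.0, share), 3))
```

In an AFT formulation the admixture covariate acts multiplicatively on the time scale, which is why even a modest positive coefficient translates into a visibly higher survival probability at age 120.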
Ulusoy, Nuran
2017-01-01
The aim of this study was to evaluate the effects of two endocrown designs and computer aided design/manufacturing (CAD/CAM) materials on stress distribution and failure probability of restorations applied to severely damaged endodontically treated maxillary first premolar tooth (MFP). Two types of designs without and with 3 mm intraradicular extensions, endocrown (E) and modified endocrown (ME), were modeled on a 3D Finite element (FE) model of the MFP. Vitablocks Mark II (VMII), Vita Enamic (VE), and Lava Ultimate (LU) CAD/CAM materials were used for each type of design. von Mises and maximum principle values were evaluated and the Weibull function was incorporated with FE analysis to calculate the long term failure probability. Regarding the stresses that occurred in enamel, for each group of material, ME restoration design transmitted less stress than endocrown. During normal occlusal function, the overall failure probability was minimum for ME with VMII. ME restoration design with VE was the best restorative option for premolar teeth with extensive loss of coronal structure under high occlusal loads. Therefore, ME design could be a favorable treatment option for MFPs with missing palatal cusp. Among the CAD/CAM materials tested, VMII and VE were found to be more tooth-friendly than LU. PMID:29119108
Nonadditive entropies yield probability distributions with biases not warranted by the data.
Pressé, Steve; Ghosh, Kingshuk; Lee, Julian; Dill, Ken A
2013-11-01
Different quantities that go by the name of entropy are used in variational principles to infer probability distributions from limited data. Shore and Johnson showed that maximizing the Boltzmann-Gibbs form of the entropy ensures that probability distributions inferred satisfy the multiplication rule of probability for independent events in the absence of data coupling such events. Other types of entropies that violate the Shore and Johnson axioms, including nonadditive entropies such as the Tsallis entropy, violate this basic consistency requirement. Here we use the axiomatic framework of Shore and Johnson to show how such nonadditive entropy functions generate biases in probability distributions that are not warranted by the underlying data.
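For a reader who wants the one-line version of the Shore-Johnson argument sketched above: maximizing the Boltzmann-Gibbs entropy under linear constraints gives an exponential-family solution, and for two systems with no coupling constraint the joint maximizer factorizes, which is exactly the multiplication rule for independent events. The statement below paraphrases this standard result; it does not reproduce the paper's notation.

```latex
% Maximum-entropy solution under a single linear constraint
\max_{p}\; -\sum_i p_i \ln p_i
\quad\text{s.t.}\quad \sum_i p_i = 1,\ \ \sum_i p_i A_i = \langle A\rangle
\;\Longrightarrow\; p_i \propto e^{-\lambda A_i}.

% With separate constraints on two uncoupled systems the maximizer factorizes,
% recovering the multiplication rule for independent events:
p_{ij} \propto e^{-\lambda A_i - \mu B_j} = p_i\, q_j .
```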
Bundled automobile insurance coverage and accidents.
Li, Chu-Shiu; Liu, Chwen-Chi; Peng, Sheng-Chang
2013-01-01
This paper investigates the characteristics of automobile accidents by taking into account two types of automobile insurance coverage: comprehensive vehicle physical damage insurance and voluntary third-party liability insurance. By using a unique data set in the Taiwanese automobile insurance market, we explore the bundled automobile insurance coverage and the occurrence of claims. It is shown that vehicle physical damage insurance is the major automobile coverage and affects the decision to purchase voluntary liability insurance coverage as a complement. Moreover, policyholders with high vehicle physical damage insurance coverage have a significantly higher probability of filing vehicle damage claims, and if they additionally purchase low voluntary liability insurance coverage, their accident claims probability is higher than those who purchase high voluntary liability insurance coverage. Our empirical results reveal that additional automobile insurance coverage information can capture more driver characteristics and driving behaviors to provide useful information for insurers' underwriting policies and to help analyze the occurrence of automobile accidents. Copyright © 2012 Elsevier Ltd. All rights reserved.
DNA damage during glycation of lysine by methylglyoxal: assessment of vitamins in preventing damage.
Suji, G; Sivakami, S
2007-11-01
Amino acids react with methylglyoxal to form advanced glycation end products. This reaction is known to produce free radicals. In this study, cleavage of plasmid DNA was induced by the glycation of lysine with methylglyoxal in the presence of iron(III). This system was found to produce superoxide as well as hydroxyl radicals. The abilities of various vitamins to prevent damage to plasmid DNA were evaluated. Pyridoxal-5-phosphate showed maximum protection, while pyridoxamine showed no protection. The protective abilities could be directly correlated with the inhibition of the production of hydroxyl and superoxide radicals. Pyridoxal-5-phosphate exhibited low radical scavenging ability as evaluated by its TEAC, but showed maximum protection, probably by interfering with free radical production. Pyridoxamine did not inhibit free radical production. Thiamine and thiamine pyrophosphate both showed protective effects, albeit to different extents. Tetrahydrofolic acid showed better antioxidant activity than folic acid but was found to damage DNA by itself, probably through superoxide generation.
Gomez, Céline; Mangeas, Morgan; Curt, Thomas; Ibanez, Thomas; Munzinger, Jérôme; Dumas, Pascal; Jérémy, André; Despinoy, Marc; Hély, Christelle
2015-01-01
Wildfire has been recognized as one of the most ubiquitous disturbance agents impacting natural environments. In this study, our main objective was to propose a modeling approach to investigate the potential impact of wildfire on biodiversity. The method is illustrated with an application example in New Caledonia, where conservation and sustainable biodiversity management represent an important challenge. First, a biodiversity loss index, combining diversity and vulnerability indexes, was calculated for every vegetation unit in New Caledonia and mapped according to its distribution over the New Caledonian mainland. Then, based on spatially explicit fire behavior simulations (using the FLAMMAP software) and fire ignition probabilities, two original fire risk assessment approaches were proposed: a one-off event model and a multi-event burn probability model. The spatial distribution of fire risk across New Caledonia was similar for both indices, with very small localized spots having high risk. The patterns relating to the highest risk are all located around the remaining sclerophyll forest fragments and represent 0.012% of the mainland surface. A small part of the maquis and areas adjacent to dense humid forest on ultramafic substrates should also be monitored. Vegetation interfaces between secondary and primary units displayed high risk and should represent priority zones for mitigating fire effects. Low fire ignition probability in anthropogenic-free areas drastically decreases the risk. The one-off event risk allowed localization of the most likely ignition areas with potential for extensive damage. Emergency actions could aim at limiting the spread of specific fires known to have high impact, or at targeting high-risk areas to limit one-off fire ignitions. Spatially explicit information on burning probability is necessary for strategic fire and fuel management planning. Both risk indices provide clues for preserving the New Caledonian biodiversity hotspot in the face of wildfires. PMID:25691965
ProbOnto: ontology and knowledge base of probability distributions.
Swat, Maciej J; Grenon, Pierre; Wimalaratne, Sarala
2016-09-01
Probability distributions play a central role in mathematical and statistical modelling. The encoding, annotation and exchange of such models could be greatly simplified by a resource providing a common reference for the definition of probability distributions. Although some resources exist, no suitably detailed and complex ontology exists, nor any database allowing programmatic access. ProbOnto is an ontology-based knowledge base of probability distributions, featuring more than 80 uni- and multivariate distributions with their defining functions, characteristics, relationships and re-parameterization formulas. It can be used for model annotation and facilitates the encoding of distribution-based models, related functions and quantities. Availability: http://probonto.org. Contact: mjswat@ebi.ac.uk. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press.
Ma, Chihua; Luciani, Timothy; Terebus, Anna; Liang, Jie; Marai, G Elisabeta
2017-02-15
Visualizing the complex probability landscape of stochastic gene regulatory networks can further biologists' understanding of phenotypic behavior associated with specific genes. We present PRODIGEN (PRObability DIstribution of GEne Networks), a web-based visual analysis tool for the systematic exploration of probability distributions over simulation time and state space in such networks. PRODIGEN was designed in collaboration with bioinformaticians who research stochastic gene networks. The analysis tool combines in a novel way existing, expanded, and new visual encodings to capture the time-varying characteristics of probability distributions: spaghetti plots over one dimensional projection, heatmaps of distributions over 2D projections, enhanced with overlaid time curves to display temporal changes, and novel individual glyphs of state information corresponding to particular peaks. We demonstrate the effectiveness of the tool through two case studies on the computed probabilistic landscape of a gene regulatory network and of a toggle-switch network. Domain expert feedback indicates that our visual approach can help biologists: 1) visualize probabilities of stable states, 2) explore the temporal probability distributions, and 3) discover small peaks in the probability landscape that have potential relation to specific diseases.
Estimation of probability of failure for damage-tolerant aerospace structures
NASA Astrophysics Data System (ADS)
Halbert, Keith
The majority of aircraft structures are designed to be damage-tolerant such that safe operation can continue in the presence of minor damage. It is necessary to schedule inspections so that minor damage can be found and repaired. It is generally not possible to perform structural inspections prior to every flight. The scheduling is traditionally accomplished through a deterministic set of methods referred to as Damage Tolerance Analysis (DTA). DTA has proven to produce safe aircraft but does not provide estimates of the probability of failure of future flights or the probability of repair of future inspections. Without these estimates maintenance costs cannot be accurately predicted. Also, estimation of failure probabilities is now a regulatory requirement for some aircraft. The set of methods concerned with the probabilistic formulation of this problem are collectively referred to as Probabilistic Damage Tolerance Analysis (PDTA). The goal of PDTA is to control the failure probability while holding maintenance costs to a reasonable level. This work focuses specifically on PDTA for fatigue cracking of metallic aircraft structures. The growth of a crack (or cracks) must be modeled using all available data and engineering knowledge. The length of a crack can be assessed only indirectly through evidence such as non-destructive inspection results, failures or lack of failures, and the observed severity of usage of the structure. The current set of industry PDTA tools are lacking in several ways: they may in some cases yield poor estimates of failure probabilities, they cannot realistically represent the variety of possible failure and maintenance scenarios, and they do not allow for model updates which incorporate observed evidence. A PDTA modeling methodology must be flexible enough to estimate accurately the failure and repair probabilities under a variety of maintenance scenarios, and be capable of incorporating observed evidence as it becomes available. This dissertation describes and develops new PDTA methodologies that directly address the deficiencies of the currently used tools. The new methods are implemented as a free, publicly licensed and open source R software package that can be downloaded from the Comprehensive R Archive Network. The tools consist of two main components. First, an explicit (and expensive) Monte Carlo approach is presented which simulates the life of an aircraft structural component flight-by-flight. This straightforward MC routine can be used to provide defensible estimates of the failure probabilities for future flights and repair probabilities for future inspections under a variety of failure and maintenance scenarios. This routine is intended to provide baseline estimates against which to compare the results of other, more efficient approaches. Second, an original approach is described which models the fatigue process and future scheduled inspections as a hidden Markov model. This model is solved using a particle-based approximation and the sequential importance sampling algorithm, which provides an efficient solution to the PDTA problem. Sequential importance sampling is an extension of importance sampling to a Markov process, allowing for efficient Bayesian updating of model parameters. This model updating capability, the benefit of which is demonstrated, is lacking in other PDTA approaches. The results of this approach are shown to agree with the results of the explicit Monte Carlo routine for a number of PDTA problems. 
Extensions to the typical PDTA problem, which cannot be solved using currently available tools, are presented and solved in this work. These extensions include incorporating observed evidence (such as non-destructive inspection results), more realistic treatment of possible future repairs, and the modeling of failure involving more than one crack (the so-called continuing damage problem). The described hidden Markov model / sequential importance sampling approach to PDTA has the potential to improve aerospace structural safety and reduce maintenance costs by providing a more accurate assessment of the risk of failure and the likelihood of repairs throughout the life of an aircraft.
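An explicit flight-by-flight Monte Carlo of the kind used above as a baseline can be sketched in a few lines. The crack-growth law, probability-of-detection curve, and all parameter values below are illustrative assumptions rather than the dissertation's models.

```python
import numpy as np

def fleet_failure_probability(n_structures=20_000, n_flights=8_000,
                              inspect_every=2_000, seed=2):
    rng = np.random.default_rng(seed)
    a = rng.lognormal(mean=np.log(0.5), sigma=0.5, size=n_structures)  # initial crack, mm
    a_crit, growth_per_flight, pod_scale, repair_size = 25.0, 1.0005, 5.0, 0.5
    failed = np.zeros(n_structures, dtype=bool)
    for flight in range(1, n_flights + 1):
        a[~failed] *= growth_per_flight                 # simple exponential crack growth
        failed |= a > a_crit                            # failure once the crack exceeds critical size
        if flight % inspect_every == 0:                 # scheduled inspection
            pod = 1.0 - np.exp(-a / pod_scale)          # assumed probability of detection vs crack size
            found = (~failed) & (rng.random(n_structures) < pod)
            a[found] = repair_size                      # detected cracks are repaired to baseline
    return failed.mean()                                # estimated probability of failure over the horizon

print(fleet_failure_probability())
```

Such a routine is transparent but expensive, which is exactly why the hidden Markov model with sequential importance sampling is attractive: it targets the same quantities while allowing Bayesian updates from inspection evidence.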
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yilmaz, Şeyda, E-mail: seydayilmaz@ktu.edu.tr; Bayrak, Erdem, E-mail: erdmbyrk@gmail.com; Bayrak, Yusuf, E-mail: bayrak@ktu.edu.tr
In this study we examined and compared three different probability distribution methods to determine the most suitable model for the probabilistic assessment of earthquake hazards. We analyzed a reliable homogeneous earthquake catalogue for the period 1900-2015 and magnitudes M ≥ 6.0 and estimated the probabilistic seismic hazard in the North Anatolian Fault zone (39°-41° N, 30°-40° E) using three distribution methods, namely the Weibull distribution, the Frechet distribution and the three-parameter Weibull distribution. The suitability of the distribution parameters was evaluated with the Kolmogorov-Smirnov (K-S) goodness-of-fit test. We also compared the estimated cumulative probability and the conditional probabilities of earthquake occurrence for different elapsed times using these three distribution methods. We used the EasyFit and Matlab software to calculate the distribution parameters and plotted the conditional probability curves. We concluded that the Weibull distribution method was more suitable than the other distribution methods in this region.
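A hedged sketch of this kind of comparison with SciPy (synthetic data, not the North Anatolian catalogue) might look as follows; `invweibull` is SciPy's name for the Frechet-type distribution.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
elapsed_times = rng.weibull(1.5, size=60) * 10.0   # synthetic inter-event times, years

candidates = {
    "Weibull (2-parameter)": ("weibull_min", stats.weibull_min.fit(elapsed_times, floc=0)),
    "Weibull (3-parameter)": ("weibull_min", stats.weibull_min.fit(elapsed_times)),
    "Frechet": ("invweibull", stats.invweibull.fit(elapsed_times, floc=0)),
}
for name, (dist, params) in candidates.items():
    D, p_value = stats.kstest(elapsed_times, dist, args=params)   # K-S goodness of fit
    print(f"{name}: K-S statistic = {D:.3f}, p-value = {p_value:.3f}")
```

The fitted parameters can then be plugged into the corresponding survival functions to plot conditional occurrence probabilities for different elapsed times.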
Public Risk Assessment Program
NASA Technical Reports Server (NTRS)
Mendeck, Gavin
2010-01-01
The Public Entry Risk Assessment (PERA) program addresses risk to the public from shuttle or other spacecraft re-entry trajectories. Managing public risk to acceptable levels is a major component of safe spacecraft operation. PERA is given scenario inputs of vehicle trajectory, probability of failure along that trajectory, the resulting debris characteristics, and field size and distribution, and returns risk metrics that quantify the individual and collective risk posed by that scenario. Due to the large volume of data required to perform such a risk analysis, PERA was designed to streamline the analysis process by using innovative mathematical analysis of the risk assessment equations. Real-time analysis in the event of a shuttle contingency operation, such as damage to the Orbiter, is possible because PERA allows for a change to the probability of failure models, therefore providing a much quicker estimation of public risk. PERA also provides the ability to generate movie files showing how the entry risk changes as the entry develops. PERA was designed to streamline the computation of the enormous amounts of data needed for this type of risk assessment by using an average distribution of debris on the ground, rather than pinpointing the impact point of every piece of debris. This has reduced the amount of computational time significantly without reducing the accuracy of the results. PERA was written in MATLAB; a compiled version can run from a DOS or UNIX prompt.
ZERODUR - bending strength: review of achievements
NASA Astrophysics Data System (ADS)
Hartmann, Peter
2017-08-01
Increased demand for using the glass ceramic ZERODUR® under high mechanical loads called for strength data based on larger statistical samples. Design calculations for a failure probability target value below 1:100,000 cannot be made reliable with parameters derived from 20-specimen samples. The data now available for a variety of surface conditions, ground with different grain sizes and acid etched for full micro-crack removal, allow stresses four to ten times higher than before. The large samples revealed that the breakage stresses of ground surfaces follow the three-parameter Weibull distribution instead of the two-parameter version. This is more reasonable considering that the micro cracks of such surfaces have a maximum depth, which is reflected in the existence of a threshold breakage stress below which the breakage probability is zero. This minimum strength allows minimum lifetimes to be calculated. Fatigue under load can be taken into account by using the stress corrosion coefficient for the actual environmental humidity. For fully etched surfaces Weibull statistics fails: the precondition of the Weibull distribution, the existence of one unique failure mechanism, is no longer given. ZERODUR® with fully etched surfaces free from damage introduced after etching easily endures 100 MPa tensile stress. The possibility of using ZERODUR® for combined high-precision and high-stress applications was confirmed by the successful launch and continuing operation of LISA Pathfinder, the precursor experiment for the gravitational wave antenna satellite array eLISA.
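The threshold behaviour described above is what the three-parameter Weibull model encodes. A short sketch with placeholder parameters (not ZERODUR test values) is:

```python
import numpy as np

def breakage_probability(stress_MPa, threshold=30.0, scale=60.0, modulus=5.0):
    """Three-parameter Weibull CDF: zero failure probability at or below the threshold stress."""
    s = np.asarray(stress_MPa, dtype=float)
    excess = np.clip(s - threshold, 0.0, None)        # stress above the threshold only
    return 1.0 - np.exp(-((excess / scale) ** modulus))

print(breakage_probability([20.0, 50.0, 120.0]))      # 0 below the threshold, rising above it
```

The threshold parameter is the "minimum strength" the abstract refers to; with it, a guaranteed minimum lifetime can be computed once a stress corrosion coefficient for the operating humidity is applied.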
NASA Technical Reports Server (NTRS)
Fielding, Eric J.; Talebian, M.; Rosen, P. A.; Nazari, H.; Jackson, J. A.; Ghorashi, M.; Walker, R.
2005-01-01
We use the interferometric correlation from Envisat synthetic aperture radar (SAR) images to map the details of the surface ruptures related to the 26 December 2003 earthquake that devastated Bam, Iran. The main strike-slip fault rupture south of the city of Bam has a series of four segments with left steps shown by a narrow line of low correlation in the coseismic interferogram. This also has a clear expression in the field because of the net extension across the fault. Just south of the city limits, the surface strain becomes distributed over a width of about 500 m, probably because of a thicker layer of soft sedimentary material.
Wooden beverage cases cause little damage to bottle caps
R. Bruce Anderson; William C. Miller
1973-01-01
Wooden beverage cases cause little damage to aluminum resealable caps during distribution. A study at bottling plants and distribution warehouses showed that an average of 1 bottle out of 4,000 has cap damage. Most of the damage was attributed to handling at the warehouse and in transit. Some recommendations are given for improvement of wooden beverage cases to prevent...
NASA Astrophysics Data System (ADS)
Zonta, Daniele; Pozzi, Matteo; Wu, Huayong; Inaudi, Daniele
2008-03-01
This paper introduces a concept of smart structural elements for the real-time condition monitoring of bridges. These are prefabricated reinforced concrete elements embedding a permanent sensing system and capable of self-diagnosis when in operation. The real-time assessment is automatically controlled by a numerical algorithm founded on Bayesian logic: the method assigns a probability to each possible damage scenario, and estimates the statistical distribution of the damage parameters involved (such as location and extent). To verify the effectiveness of the technology, we produced and tested in the laboratory a reduced-scale smart beam prototype. The specimen is 3.8 m long, has a cross-section of 0.3 m by 0.5 m, and was prestressed using a Dywidag bar in such a way as to control the preload level. The sensor system includes a multiplexed version of SOFO interferometric sensors mounted on a composite bar, along with a number of traditional metal-foil strain gauges. The method allowed clear recognition of increasing fault states, simulated on the beam by gradually reducing the prestress level.
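The Bayesian assignment step can be illustrated with a deliberately tiny example. The damage scenarios, predicted strain shifts, and noise level below are hypothetical, not the prototype's actual models.

```python
from scipy import stats

# Hypothetical scenarios: predicted strain shift (microstrain) at one SOFO sensor
scenarios = {"undamaged": 0.0, "10% prestress loss": 10.0, "30% prestress loss": 30.0}
prior = {name: 1.0 / len(scenarios) for name in scenarios}   # uniform prior over scenarios

def scenario_posterior(measured_shift, noise_sigma=2.0):
    # Likelihood of the measurement under each scenario, assuming Gaussian sensor noise
    likelihood = {name: stats.norm.pdf(measured_shift, loc=shift, scale=noise_sigma)
                  for name, shift in scenarios.items()}
    evidence = sum(prior[n] * likelihood[n] for n in scenarios)
    return {n: prior[n] * likelihood[n] / evidence for n in scenarios}

print(scenario_posterior(measured_shift=9.0))   # most mass lands on the 10% loss scenario
```

The real system does the same kind of update continuously over many sensors, and additionally estimates continuous damage parameters (location, extent) within each scenario.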
Leite, Gabriel Adan Araújo; Figueiredo, Thamiris Moreira; Sanabria, Marciana; Dias, Ana Flávia Mota Gonçalves; Silva, Patrícia Villela E; Martins Junior, Airton da Cunha; Barbosa Junior, Fernando; Kempinas, Wilma De Grava
2017-10-01
Dyslipidemias are occurring earlier in the population due to the increase in obesity and bad eating habits. Rosuvastatin inhibits the enzyme HMG-CoA reductase, decreasing total cholesterol. Ascorbic acid is an important antioxidant compound for the male reproductive system. This study aimed to evaluate whether ascorbic acid supplementation may prevent the reproductive damage provoked by rosuvastatin administration at prepuberty. Male pups were distributed into six experimental groups that received saline solution 0.9%, 3 or 10 mg/kg/day of rosuvastatin, 150 mg/day of ascorbic acid, or 150 mg/day of ascorbic acid associated with 3 or 10 mg/kg/day of rosuvastatin from post-natal day (PND) 23 until PND53. Rosuvastatin-treated groups showed delayed puberty onset, androgen depletion and impairment of testicular and epididymal morphology. Ascorbic acid partially prevented this reproductive damage. In conclusion, rosuvastatin exposure is a probable risk to reproductive development, and ascorbic acid supplementation may be useful for preventing the reproductive impairment caused by rosuvastatin exposure. Copyright © 2017 Elsevier Inc. All rights reserved.
Incorporating Skew into RMS Surface Roughness Probability Distribution
NASA Technical Reports Server (NTRS)
Stahl, Mark T.; Stahl, H. Philip.
2013-01-01
The standard treatment of RMS surface roughness data is the application of a Gaussian probability distribution. This handling of surface roughness ignores the skew present in the surface and overestimates the most probable RMS of the surface, the mode. Using experimental data, we confirm that the Gaussian distribution overestimates the mode and that application of an asymmetric distribution provides a better fit. Implementing the proposed asymmetric distribution into the optical manufacturing process would reduce the polishing time required to meet surface roughness specifications.
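As an illustration of the contrast drawn above, the following sketch fits both a Gaussian and a skewed model to a synthetic roughness sample and compares the resulting modes. The skew-normal form, the lognormal synthetic data, and all parameter values are assumptions for illustration, not the paper's data or its chosen asymmetric distribution.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
rms = rng.lognormal(mean=np.log(2.0), sigma=0.4, size=500)  # synthetic RMS roughness values (nm, assumed)

# Symmetric (Gaussian) treatment: the implied mode coincides with the sample mean.
mu, sigma = stats.norm.fit(rms)

# Asymmetric treatment (skew-normal, an assumed stand-in for the proposed distribution).
a, loc, scale = stats.skewnorm.fit(rms)
grid = np.linspace(rms.min(), rms.max(), 2000)
skew_mode = grid[np.argmax(stats.skewnorm.pdf(grid, a, loc, scale))]

print(f"Gaussian mode (= mean): {mu:.3f}")
print(f"skew-normal mode      : {skew_mode:.3f}")   # typically lower for right-skewed data
```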
Flood Frequency Analysis For Partial Duration Series In Ganjiang River Basin
NASA Astrophysics Data System (ADS)
zhangli, Sun; xiufang, Zhu; yaozhong, Pan
2016-04-01
Accurate estimation of flood frequency is key to effective, nationwide flood damage abatement programs. The partial duration series (PDS) method is widely used in hydrologic studies because it considers all events above a certain threshold level as compared to the annual maximum series (AMS) method, which considers only the annual maximum value. However, the PDS has a drawback in that it is difficult to define the thresholds and maintain an independent and identical distribution of the partial duration time series; this drawback is discussed in this paper. The Ganjiang River is the seventh largest tributary of the Yangtze River, the longest river in China. The Ganjiang River covers a drainage area of 81,258 km2 at the Wanzhou hydrologic station as the basin outlet. In this work, 56 years of daily flow data (1954-2009) from the Wanzhou station were used to analyze flood frequency, and the Pearson-III model was employed as the hydrologic probability distribution. Generally, three tasks were accomplished: (1) the threshold of PDS by percentile rank of daily runoff was obtained; (2) trend analysis of the flow series was conducted using PDS; and (3) flood frequency analysis was conducted for partial duration flow series. The results showed a slight upward trend of the annual runoff in the Ganjiang River basin. The maximum flow with a 0.01 exceedance probability (corresponding to a 100-year flood peak under stationary conditions) was 20,000 m3/s, while that with a 0.1 exceedance probability was 15,000 m3/s. These results will serve as a guide to hydrological engineering planning, design, and management for policymakers and decision makers associated with hydrology.
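A minimal sketch of the flood-frequency step described above: extract a peaks-over-threshold series from a daily-flow record and fit a Pearson Type III distribution to read off design quantiles. The 95th-percentile threshold and the synthetic record are assumptions; the study's actual threshold selection and data differ.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
daily_flow = rng.gamma(shape=2.0, scale=1500.0, size=56 * 365)  # synthetic daily runoff, m3/s (assumed)

threshold = np.percentile(daily_flow, 95)      # percentile-rank threshold for the PDS
peaks = daily_flow[daily_flow > threshold]     # partial-duration series

skew, loc, scale = stats.pearson3.fit(peaks)   # Pearson Type III fit
for p in (0.01, 0.1):                          # exceedance probabilities
    q = stats.pearson3.ppf(1 - p, skew, loc=loc, scale=scale)
    print(f"flow with exceedance probability {p}: {q:,.0f} m3/s")
```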
2016-03-31
fiber distributions. Task 2.1 is concerned with damage evolution in a peridynamic model of poroelastic materials. Initial results for both tasks are reported. ... Task 2.1: Damage evolution in a peridynamic model of poroelastic materials. Background and Motivation: In order to model the presence of pores and...
The Estimation of Tree Posterior Probabilities Using Conditional Clade Probability Distributions
Larget, Bret
2013-01-01
In this article I introduce the idea of conditional independence of separated subtrees as a principle by which to estimate the posterior probability of trees using conditional clade probability distributions rather than simple sample relative frequencies. I describe an algorithm for these calculations and software which implements these ideas. I show that these alternative calculations are very similar to simple sample relative frequencies for high probability trees but are substantially more accurate for relatively low probability trees. The method allows the posterior probability of unsampled trees to be calculated when these trees contain only clades that are in other sampled trees. Furthermore, the method can be used to estimate the total probability of the set of sampled trees which provides a measure of the thoroughness of a posterior sample. [Bayesian phylogenetics; conditional clade distributions; improved accuracy; posterior probabilities of trees.] PMID:23479066
NASA Astrophysics Data System (ADS)
Gernez, Pierre; Stramski, Dariusz; Darecki, Miroslaw
2011-07-01
Time series measurements of fluctuations in underwater downward irradiance, Ed, within the green spectral band (532 nm) show that the probability distribution of instantaneous irradiance varies greatly as a function of depth within the near-surface ocean under sunny conditions. Because of intense light flashes caused by surface wave focusing, the near-surface probability distributions are highly skewed to the right and are heavy tailed. The coefficients of skewness and excess kurtosis at depths smaller than 1 m can exceed 3 and 20, respectively. We tested several probability models, such as lognormal, Gumbel, Fréchet, log-logistic, and Pareto, which are potentially suited to describe the highly skewed heavy-tailed distributions. We found that the models cannot approximate with consistently good accuracy the high irradiance values within the right tail of the experimental distribution where the probability of these values is less than 10%. This portion of the distribution corresponds approximately to light flashes with Ed > 1.5⟨Ed⟩, where ⟨Ed⟩ is the time-averaged downward irradiance. However, the remaining part of the probability distribution covering all irradiance values smaller than the 90th percentile can be described with a reasonable accuracy (i.e., within 20%) with a lognormal model for all 86 measurements from the top 10 m of the ocean included in this analysis. As the intensity of irradiance fluctuations decreases with depth, the probability distribution tends toward a function symmetrical around the mean like the normal distribution. For the examined data set, the skewness and excess kurtosis assumed values very close to zero at a depth of about 10 m.
Percolation of localized attack on isolated and interdependent random networks
NASA Astrophysics Data System (ADS)
Shao, Shuai; Huang, Xuqing; Stanley, H. Eugene; Havlin, Shlomo
2014-03-01
Percolation properties of isolated and interdependent random networks have been investigated extensively. The focus of these studies has been on random attacks, where each node in the network is attacked with the same probability, or targeted attacks, where each node is attacked with a probability that is a function of its centrality, such as degree. Here we discuss a new type of realistic attack which we call a localized attack, where a group of neighboring nodes in the network is attacked. We attack a randomly chosen node, its neighbors, its neighbors of neighbors, and so on, until removing a fraction (1 - p) of the network. This type of attack reflects damage due to localized disasters, such as earthquakes, floods and war zones in real-world networks. We study, both analytically and by simulations, the impact of localized attack on percolation properties of random networks with arbitrary degree distributions and discuss in detail random regular (RR) networks, Erdős-Rényi (ER) networks and scale-free (SF) networks. We extend and generalize our theoretical and simulation results from single isolated networks to networks formed of interdependent networks.
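The localized-attack procedure lends itself to a compact simulation: remove a seed node, its neighbors, its neighbors of neighbors, and so on until a fraction (1 - p) of the nodes is gone, then measure the surviving giant component. The sketch below, assuming an Erdős-Rényi network with mean degree 4 and a seed in the giant component, is illustrative only.

```python
import random
import networkx as nx

def localized_attack(G, p, seed=0):
    """Remove a breadth-first ball around a random seed until only a fraction p of nodes remains."""
    rng = random.Random(seed)
    n_remove = int(round((1 - p) * G.number_of_nodes()))
    start = rng.choice(list(G.nodes))
    order = [start] + [v for _, v in nx.bfs_edges(G, start)]  # nodes by increasing hop distance
    H = G.copy()
    H.remove_nodes_from(order[:n_remove])                     # assumes the seed's component is large enough
    if H.number_of_nodes() == 0:
        return 0.0
    giant = max(nx.connected_components(H), key=len)
    return len(giant) / G.number_of_nodes()

G = nx.erdos_renyi_graph(10_000, 4 / 10_000, seed=42)         # ER network, mean degree ~ 4 (assumed)
for p in (0.9, 0.7, 0.5, 0.3):
    print(p, round(localized_attack(G, p), 3))
```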
Toda, S.; Lin, J.; Meghraoui, M.; Stein, R.S.
2008-01-01
The Wenchuan earthquake on the Longmen Shan fault zone devastated cities of Sichuan, claiming at least 69,000 lives. We calculate that the earthquake also brought the Xianshuihe, Kunlun and Min Jiang faults, 150-400 km from the mainshock rupture in the eastern Tibetan Plateau, 0.2-0.5 bars closer to Coulomb failure. Because some portions of these stressed faults have not ruptured in more than a century, the earthquake could trigger or hasten additional M > 7 earthquakes, potentially subjecting regions from Kangding to Daofu and Maqin to Rangtag to strong shaking. We use the calculated stress changes and the observed background seismicity to forecast the rate and distribution of damaging shocks. The earthquake probability in the region is estimated to be 57-71% for M ≥ 6 shocks during the next decade, and 8-12% for M ≥ 7 shocks. These are up to twice the probabilities for the decade before the Wenchuan earthquake struck.
Resilient Grid Operational Strategies
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pasqualini, Donatella
Extreme weather-related disturbances, such as hurricanes, are a leading cause of grid outages historically. Although physical asset hardening is perhaps the most common way to mitigate the impacts of severe weather, operational strategies may be deployed to limit the extent of societal and economic losses associated with weather-related physical damage. The purpose of this study is to examine bulk power-system operational strategies that can be deployed to mitigate the impact of severe weather disruptions caused by hurricanes, thereby increasing grid resilience to maintain continuity of critical infrastructure during extreme weather. To estimate the impacts of resilient grid operational strategies, Los Alamos National Laboratory (LANL) developed a framework for hurricane probabilistic risk analysis (PRA). The probabilistic nature of this framework allows us to estimate the probability distribution of likely impacts, as opposed to the worst-case impacts. The project scope does not include strategies that are not operations related, such as transmission system hardening (e.g., undergrounding, transmission tower reinforcement and substation flood protection) and solutions in the distribution network.
[Assessment of biological corrosion of ferroconcrete of ground-based industrial structures].
Rozhanskaia, A M; Piliashenko-Novokhatnyĭ, A I; Purish, L M; Durcheva, V N; Kozlova, I A
2001-01-01
A facility of a nuclear plant built in 1983 and placed in dead storage for 15 years was the subject of a microbiological investigation aimed at estimating the degree of contamination of its ferroconcrete structures by corrosion-hazardous microorganisms and at predicting their biocorrosion state after the plant is put into operation. Sulphur-cycle bacteria (thionic and sulphate-reducing bacteria) were shown to be distributed everywhere on the surface and within the bulk of the concrete structures, and to be confined to the corrosion products of the concrete and reinforcement bars of the investigated building. Sulphate-reducing bacteria were the most widespread group at all sampling points. An indirect estimate of the degree to which the microbial communities participate in the biological damage of ferroconcrete was made from the accumulation rate of aggressive gaseous metabolites, carbon dioxide and hydrogen. The probability that the biocorrosion situation will deteriorate under full-scale operation of the facility has been substantiated.
NASA Technical Reports Server (NTRS)
Spisz, E. W.; Bowman, R. L.; Jack, J. R.
1973-01-01
The data obtained from two recent experiments conducted in a continuing series of experiments at the Lewis Research Center into the contamination characteristics of a 5-pound thrust MMH/N2O4 engine are presented. The primary objectives of these experiments were to establish the angular distribution of condensible exhaust products within the plume and the corresponding optical damage angular distribution of transmitting optical elements attributable to this contaminant. The plume mass flow distribution was measured by five quartz crystal microbalances (QCMs) located at the engine centerline elevation. The fifth QCM was located above the engine and 15 deg behind the nozzle exit plane. The optical damage was determined by ex-situ transmittance measurements for the wavelength range from 0.2 to 0.6 microns on 2.54 cm diameter fused silica discs also located at engine centerline elevation. Both the mass deposition and optical damage angular distributions followed the expected trend of decreasing deposition and damage as the angle between sensor or sample and the nozzle axis increased. A simple plume gas flow equation predicted the deposition distribution reasonably well for angles of up to 55 degrees. The optical damage measurements also indicated significant effects at large angles.
Repetitive pulses and laser-induced retinal injury thresholds
NASA Astrophysics Data System (ADS)
Lund, David J.
2007-02-01
Experimental studies with repetitively pulsed lasers show that the ED50, expressed as energy per pulse, varies as the inverse fourth power of the number of pulses in the exposure, relatively independently of the wavelength, pulse duration, or pulse repetition frequency of the laser. Models based on a thermal damage mechanism cannot readily explain this result. Menendez et al. proposed a probability-summation model for predicting the threshold for a train of pulses based on the probit statistics for a single pulse. The model assumes that each pulse is an independent trial, unaffected by any other pulse in the train, and that the probability of damage for a single pulse is adequately described by the logistic curve. The requirement that the effect of each pulse in the pulse train be unaffected by the effects of other pulses in the train is a showstopper when the end effect is viewed as a thermal effect with each pulse in the train contributing to the end temperature of the target tissue. There is evidence that the induction of cell death by microcavitation bubbles around melanin granules heated by incident laser irradiation can satisfy the condition of pulse independence as required by the probability-summation model. This paper will summarize the experimental data and discuss the relevance of the probability-summation model given microcavitation as a damage mechanism.
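A minimal sketch of the probability-summation idea: if each pulse is an independent trial with single-pulse damage probability P1(E) given by a logistic curve in log energy, then the n-pulse damage probability is Pn(E) = 1 - (1 - P1(E))^n, and the n-pulse ED50 follows by solving Pn = 0.5. The single-pulse ED50 and slope below are assumed values for illustration.

```python
import numpy as np

ed50_single = 1.0   # single-pulse ED50, arbitrary energy units (assumed)
slope = 0.1         # logistic slope in ln(energy) (assumed)

def p_single(E):
    """Single-pulse damage probability: logistic curve in log energy."""
    return 1.0 / (1.0 + np.exp(-(np.log(E) - np.log(ed50_single)) / slope))

def ed50_n_pulses(n):
    """Per-pulse ED50 for an n-pulse train under probability summation."""
    # P_n = 0.5  =>  P1 = 1 - 0.5**(1/n); then invert the logistic for the per-pulse energy.
    p1 = 1.0 - 0.5 ** (1.0 / n)
    return ed50_single * np.exp(slope * np.log(p1 / (1.0 - p1)))

for n in (1, 10, 100, 1000):
    print(f"n = {n:5d}  per-pulse ED50 = {ed50_n_pulses(n):.3f}")
```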
NASA Astrophysics Data System (ADS)
Naso, Susanna; Chen, Albert S.; Djordjević, Slobodan; Aronica, Giuseppe T.
2015-04-01
The classical approach to flood defence, aimed at reducing the probability of flooding through hard defences, has been superseded by a flood risk management approach, which accepts the idea of coping with floods and aims at reducing not only the probability of flooding but also its consequences. In this view, the concept of vulnerability becomes central, as do the (non-structural) measures that act on it. On 22 November 2011, an exceptional rainstorm hit the Longano catchment (north-east Sicily, Italy), producing local heavy rainfall, mud-debris flow and flash flooding. The flash flood affected property, buildings and roads, and more than 100 commercial estates suffered severe damage. Some days after the event, the municipality provided residents with forms to describe the damage to their properties. Unfortunately, the lack of common guidelines in compiling them, their coarseness and the impossibility of attaching monetary information to them (such as damage data from previous events) did not allow the implementation of a detailed damage analysis. What we are developing in this work is a method for a qualitative evaluation of the consequences of floods, based on vulnerability curves for structures and classes of entities at risk. The difficulty in deriving the vulnerability curves for different building typologies, as a function of water depth, was due to the lack of quantitative information both on damage caused by previous events and on buildings' value. To solve the problem, we submitted a questionnaire to a team of experts asking for an estimate of building damage at different hypothetical inundation depths. Our aim was to derive the vulnerability data from technicians' experience, in the belief that collaboration between researchers and practising engineers is of fundamental importance. Through the elaboration and synthesis of the experts' estimates we derived the vulnerability curves for different building typologies and for inundations of both short and long duration. At the same time we defined the classes of the variable Entity as a function of both the buildings' asset value and their importance for society. Once the buildings of different typologies are grouped, a GIS-based tool (using hazard information obtained from hydraulic modelling, building parcels, vulnerability curves and entity classes) is used to place each element at risk within an Entity-Vulnerability matrix. The construction of an E-V matrix allows one both to understand the actual situation of the flood-prone area (and the possible consequences of a flood event) and to study the effectiveness of non-structural measures, by examining how their implementation modifies the distribution of elements at risk within it. The proposed approach can be useful for authorities responsible for the development and periodic review of adaptive flood risk management plans.
Predicting the probability of slip in gait: methodology and distribution study.
Gragg, Jared; Yang, James
2016-01-01
The likelihood of a slip is related to the available and required friction for a certain activity, here gait. Classical slip and fall analysis presumed that a walking surface was safe if the difference between the mean available and required friction coefficients exceeded a certain threshold. Previous research was dedicated to reformulating the classical slip and fall theory to include the stochastic variation of the available and required friction when predicting the probability of slip in gait. However, when predicting the probability of a slip, previous researchers have either ignored the variation in the required friction or assumed the available and required friction to be normally distributed. Also, there are no published results that actually give the probability of slip for various combinations of required and available frictions. This study proposes a modification to the equation for predicting the probability of slip, reducing the previous equation from a double-integral to a more convenient single-integral form. Also, a simple numerical integration technique is provided to predict the probability of slip in gait: the trapezoidal method. The effect of the random variable distributions on the probability of slip is also studied. It is shown that both the required and available friction distributions cannot automatically be assumed as being normally distributed. The proposed methods allow for any combination of distributions for the available and required friction, and numerical results are compared to analytical solutions for an error analysis. The trapezoidal method is shown to be highly accurate and efficient. The probability of slip is also shown to be sensitive to the input distributions of the required and available friction. Lastly, a critical value for the probability of slip is proposed based on the number of steps taken by an average person in a single day.
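A minimal numerical sketch of the single-integral form discussed above, P(slip) = ∫ f_required(u)·F_available(u) du, evaluated with the trapezoidal method. The lognormal/normal choices and their parameters are assumptions, not the friction data analyzed in the paper.

```python
import numpy as np
from scipy import stats
from scipy.integrate import trapezoid

required = stats.lognorm(s=0.25, scale=0.22)   # required friction coefficient (assumed distribution)
available = stats.norm(loc=0.45, scale=0.10)   # available friction coefficient (assumed distribution)

u = np.linspace(0.0, 1.2, 2001)                # integration grid over the friction coefficient
# Slip occurs when available < required: P(slip) = integral of f_req(u) * F_avail(u) du.
p_slip = trapezoid(required.pdf(u) * available.cdf(u), u)

print(f"probability of slip per step: {p_slip:.2e}")
```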
Integrated-Circuit Pseudorandom-Number Generator
NASA Technical Reports Server (NTRS)
Steelman, James E.; Beasley, Jeff; Aragon, Michael; Ramirez, Francisco; Summers, Kenneth L.; Knoebel, Arthur
1992-01-01
Integrated circuit produces 8-bit pseudorandom numbers from specified probability distribution, at rate of 10 MHz. Using Boolean logic, circuit implements pseudorandom-number-generating algorithm. Circuit includes eight 12-bit pseudorandom-number generators whose outputs are uniformly distributed. 8-bit pseudorandom numbers satisfying specified nonuniform probability distribution are generated by processing uniformly distributed outputs of eight 12-bit pseudorandom-number generators through "pipeline" of D flip-flops, comparators, and memories implementing conditional probabilities on zeros and ones.
NASA Astrophysics Data System (ADS)
Ettinger, Susanne; Mounaud, Loïc; Magill, Christina; Yao-Lafourcade, Anne-Françoise; Thouret, Jean-Claude; Manville, Vern; Negulescu, Caterina; Zuccaro, Giulio; De Gregorio, Daniela; Nardone, Stefano; Uchuchoque, Juan Alexis Luque; Arguedas, Anita; Macedo, Luisa; Manrique Llerena, Nélida
2016-10-01
The focus of this study is an analysis of building vulnerability through investigating impacts from the 8 February 2013 flash flood event along the Avenida Venezuela channel in the city of Arequipa, Peru. On this day, 124.5 mm of rain fell within 3 h (monthly mean: 29.3 mm) triggering a flash flood that inundated at least 0.4 km2 of urban settlements along the channel, affecting more than 280 buildings, 23 of a total of 53 bridges (pedestrian, vehicle and railway), and leading to the partial collapse of sections of the main road, paralyzing central parts of the city for more than one week. This study assesses the aspects of building design and site-specific environmental characteristics that render a building vulnerable by considering the example of a flash flood event in February 2013. A statistical methodology is developed that enables estimation of damage probability for buildings. The applied method uses observed inundation height as a hazard proxy in areas where more detailed hydrodynamic modeling data is not available. Building design and site-specific environmental conditions determine the physical vulnerability. The mathematical approach considers both physical vulnerability and hazard related parameters and helps to reduce uncertainty in the determination of descriptive parameters, parameter interdependency and respective contributions to damage. This study aims to (1) enable the estimation of damage probability for a certain hazard intensity, and (2) obtain data to visualize variations in damage susceptibility for buildings in flood prone areas. Data collection is based on a post-flood event field survey and the analysis of high (sub-metric) spatial resolution images (Pléiades 2012, 2013). An inventory of 30 city blocks was collated in a GIS database in order to estimate the physical vulnerability of buildings. As many as 1103 buildings were surveyed along the affected drainage and 898 buildings were included in the statistical analysis. Univariate and bivariate analyses were applied to better characterize each vulnerability parameter. Multiple correspondence analysis revealed strong relationships between the "Distance to channel or bridges", "Structural building type", "Building footprint" and the observed damage. Logistic regression enabled quantification of the contribution of each explanatory parameter to potential damage, and determination of the significant parameters that express the damage susceptibility of a building. The model was applied 200 times on different calibration and validation data sets in order to examine performance. Results show that 90% of these tests have a success rate of more than 67%. Probabilities (at building scale) of experiencing different damage levels during a future event similar to the 8 February 2013 flash flood are the major outcomes of this study.
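A hedged sketch of the logistic-regression step: damage (0/1) regressed on distance to the channel, building footprint, and a structural-type indicator, with a held-out success rate in the spirit of the calibration/validation splits described above. The variable names, coefficients, and synthetic survey table are assumptions standing in for the field inventory.

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(7)
n = 898  # number of buildings in the statistical analysis
df = pd.DataFrame({
    "dist_to_channel_m": rng.uniform(0, 200, n),
    "footprint_m2": rng.uniform(40, 400, n),
    "masonry": rng.integers(0, 2, n),          # 1 = assumed vulnerable structural class
})
# Synthetic "observed" damage generated from an assumed logistic relationship.
logit = -0.04 * df["dist_to_channel_m"] - 0.004 * df["footprint_m2"] + 1.2 * df["masonry"] + 2.0
df["damaged"] = rng.random(n) < 1 / (1 + np.exp(-logit))

X_train, X_test, y_train, y_test = train_test_split(
    df[["dist_to_channel_m", "footprint_m2", "masonry"]], df["damaged"],
    test_size=0.3, random_state=0)

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print("coefficients:", dict(zip(X_train.columns, model.coef_[0].round(3))))
print("success rate on held-out buildings:", round(model.score(X_test, y_test), 2))
```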
DOT National Transportation Integrated Search
2016-07-31
This report presents a novel framework for promptly assessing the probability of barge-bridge : collision damage of piers based on probabilistic-based classification through machine learning. The main : idea of the presented framework is to divide th...
USDA-ARS?s Scientific Manuscript database
The most damaging foliar disease of pyrethrum in Australia is ray blight caused by Stagonosporopsis tanaceti. The probability of growers incurring economic losses caused by this disease has been substantially reduced by the implementation of a prophylactically-applied spring fungicide program. Th...
77 FR 37344 - Airworthiness Directives; Airbus Airplanes
Federal Register 2010, 2011, 2012, 2013, 2014
2012-06-21
... reports of extensive damage to engine air intake cowls as a result of acoustic panel collapse. This proposed AD would require repetitive inspections of the three inner acoustic panels of both engine air... finding extensive damage to engine air intake cowls as a result of acoustic panel collapse, most probably...
Sulfur Dioxide and Material Damage
ERIC Educational Resources Information Center
Gillette, Donald G.
1975-01-01
This study relates sulfur dioxide levels with material damage in heavily populated or polluted areas. Estimates of loss were determined from increased maintenance and replacement costs. The data indicate a decrease in losses during the past five years probably due to decline in pollution levels established by air quality standards. (MR)
Bivariate normal, conditional and rectangular probabilities: A computer program with applications
NASA Technical Reports Server (NTRS)
Swaroop, R.; Brownlow, J. D.; Ashwworth, G. R.; Winter, W. R.
1980-01-01
Some results for the bivariate normal distribution analysis are presented. Computer programs for conditional normal probabilities, marginal probabilities, as well as joint probabilities for rectangular regions are given: routines for computing fractile points and distribution functions are also presented. Some examples from a closed circuit television experiment are included.
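A present-day sketch of the same calculations: the rectangular (joint) probability of a bivariate normal by inclusion-exclusion on the joint CDF, and a conditional probability from the closed-form conditional distribution. The means, variances, correlation, and region bounds are assumed values, not those of the television experiment.

```python
import numpy as np
from scipy import stats

mean = np.array([0.0, 0.0])
rho = 0.6
cov = np.array([[1.0, rho], [rho, 1.0]])
bvn = stats.multivariate_normal(mean=mean, cov=cov)

# P(a1 < X < b1, a2 < Y < b2) by inclusion-exclusion on the joint CDF.
a1, b1, a2, b2 = -1.0, 1.0, -0.5, 2.0
p_rect = (bvn.cdf([b1, b2]) - bvn.cdf([a1, b2])
          - bvn.cdf([b1, a2]) + bvn.cdf([a1, a2]))
print(f"rectangular probability: {p_rect:.4f}")

# Conditional distribution of Y given X = x0 is normal with shifted mean, reduced variance.
x0 = 0.8
cond = stats.norm(loc=rho * x0, scale=np.sqrt(1 - rho ** 2))
print(f"P(Y < 1 | X = {x0}) = {cond.cdf(1.0):.4f}")
```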
Assessment of source probabilities for potential tsunamis affecting the U.S. Atlantic coast
Geist, E.L.; Parsons, T.
2009-01-01
Estimating the likelihood of tsunamis occurring along the U.S. Atlantic coast critically depends on knowledge of tsunami source probability. We review available information on both earthquake and landslide probabilities from potential sources that could generate local and transoceanic tsunamis. Estimating source probability includes defining both size and recurrence distributions for earthquakes and landslides. For the former distribution, source sizes are often distributed according to a truncated or tapered power-law relationship. For the latter distribution, sources are often assumed to occur in time according to a Poisson process, simplifying the way tsunami probabilities from individual sources can be aggregated. For the U.S. Atlantic coast, earthquake tsunami sources primarily occur at transoceanic distances along plate boundary faults. Probabilities for these sources are constrained from previous statistical studies of global seismicity for similar plate boundary types. In contrast, there is presently little information constraining landslide probabilities that may generate local tsunamis. Though there is significant uncertainty in tsunami source probabilities for the Atlantic, results from this study yield a comparative analysis of tsunami source recurrence rates that can form the basis for future probabilistic analyses.
Optimal sensor placement for active guided wave interrogation of complex metallic components
NASA Astrophysics Data System (ADS)
Coelho, Clyde K.; Kim, Seung Bum; Chattopadhyay, Aditi
2011-04-01
With research in structural health monitoring (SHM) moving towards increasingly complex structures for damage interrogation, the placement of sensors is becoming a key issue in the performance of the damage detection methodologies. For ultrasonic wave based approaches, this is especially important because of the sensitivity of the travelling Lamb waves to material properties, geometry and boundary conditions that may obscure the presence of damage if they are not taken into account during sensor placement. The framework proposed in this paper defines a sensing region for a pair of piezoelectric transducers in a pitch-catch damage detection approach by taking into account the material attenuation and probability of false alarm. Using information about the region interrogated by a sensor-actuator pair, a simulated annealing optimization framework was implemented in order to place sensors on complex metallic geometries such that a selected minimum damage type and size could be detected with an acceptable probability of false alarm anywhere on the structure. This approach was demonstrated on a lug joint to detect a crack and on a large Naval SHM test bed and resulted in a placement of sensors that was able to interrogate all parts of the structure using the minimum number of transducers.
NASA Astrophysics Data System (ADS)
Minakuchi, Shu; Tsukamoto, Haruka; Takeda, Nobuo
2009-03-01
This study proposes a novel hierarchical sensing concept for detecting damage in composite structures. In the hierarchical system, numerous three-dimensionally structured sensor devices are distributed throughout the whole structural area and connected with the optical fiber network through transducing mechanisms. The distributed "sensory nerve cell" devices detect the damage, and the fiber optic "spinal cord" network gathers damage signals and transmits the information to a measuring instrument. This study began by discussing the basic concept of the hierarchical sensing system through comparison with existing fiber-optic-based systems and nerve systems in the animal kingdom. Then, in order to validate the proposed sensing concept, an impact damage detection system for composite structures was proposed. The sensor devices were developed based on the Comparative Vacuum Monitoring (CVM) system, and Brillouin-based distributed strain sensing was utilized to gather the damage signals from the distributed devices. Finally, a verification test was conducted using prototype devices. Occurrence of barely visible impact damage was successfully detected, and it was clearly indicated that the hierarchical system has better repairability, higher robustness, and a wider monitorable area compared to existing systems utilizing embedded optical fiber sensors.
Statistical Models of Fracture Relevant to Nuclear-Grade Graphite: Review and Recommendations
NASA Technical Reports Server (NTRS)
Nemeth, Noel N.; Bratton, Robert L.
2011-01-01
The nuclear-grade (low-impurity) graphite needed for the fuel element and moderator material for next-generation (Gen IV) reactors displays large scatter in strength and a nonlinear stress-strain response from damage accumulation. This response can be characterized as quasi-brittle. In this expanded review, relevant statistical failure models for various brittle and quasi-brittle material systems are discussed with regard to strength distribution, size effect, multiaxial strength, and damage accumulation. This includes descriptions of the Weibull, Batdorf, and Burchell models as well as models that describe the strength response of composite materials, which involves distributed damage. Results from lattice simulations are included for a physics-based description of material breakdown. Consideration is given to the predicted transition between brittle and quasi-brittle damage behavior versus the density of damage (level of disorder) within the material system. The literature indicates that weakest-link-based failure modeling approaches appear to be reasonably robust in that they can be applied to materials that display distributed damage, provided that the level of disorder in the material is not too large. The Weibull distribution is argued to be the most appropriate statistical distribution to model the stochastic-strength response of graphite.
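For concreteness, a minimal sketch of a two-parameter Weibull weakest-link model with a volume size effect, one of the statistical forms reviewed above: Pf(sigma, V) = 1 - exp(-(V/V0)(sigma/sigma0)^m). The modulus, characteristic strength, and volumes are assumed values chosen to illustrate graphite-like scatter, not numbers from the review.

```python
import numpy as np

m = 10.0          # Weibull modulus (assumed)
sigma0 = 25.0     # characteristic strength of the reference volume, MPa (assumed)
V0 = 1.0          # reference volume (assumed)

def failure_probability(sigma, V):
    """Weakest-link failure probability for stress sigma acting on volume V."""
    return 1.0 - np.exp(-(V / V0) * (sigma / sigma0) ** m)

def median_strength(V):
    """Invert P_f = 0.5 for the stress: larger specimens are weaker (size effect)."""
    return sigma0 * (np.log(2.0) * V0 / V) ** (1.0 / m)

for V in (0.1, 1.0, 10.0):
    print(f"V = {V:5.1f}  median strength = {median_strength(V):.2f} MPa  "
          f"P_f at 20 MPa = {failure_probability(20.0, V):.3f}")
```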
Multinomial mixture model with heterogeneous classification probabilities
Holland, M.D.; Gray, B.R.
2011-01-01
Royle and Link (Ecology 86(9):2505-2512, 2005) proposed an analytical method that allowed estimation of multinomial distribution parameters and classification probabilities from categorical data measured with error. While useful, we demonstrate algebraically and by simulations that this method yields biased multinomial parameter estimates when the probabilities of correct category classifications vary among sampling units. We address this shortcoming by treating these probabilities as logit-normal random variables within a Bayesian framework. We use Markov chain Monte Carlo to compute Bayes estimates from a simulated sample from the posterior distribution. Based on simulations, this elaborated Royle-Link model yields nearly unbiased estimates of the multinomial parameters and the correct classification probabilities when the classification probabilities are allowed to vary according to the normal distribution on the logit scale or according to the Beta distribution. The method is illustrated using categorical submersed aquatic vegetation data. ?? 2010 Springer Science+Business Media, LLC.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Friar, James Lewis; Goldman, Terrance J.; Pérez-Mercader, J.
In this paper, we apply the Law of Total Probability to the construction of scale-invariant probability distribution functions (pdf's), and require that probability measures be dimensionless and unitless under a continuous change of scales. If the scale-change distribution function is scale invariant then the constructed distribution will also be scale invariant. Repeated application of this construction on an arbitrary set of (normalizable) pdf's results again in scale-invariant distributions. The invariant function of this procedure is given uniquely by the reciprocal distribution, suggesting a kind of universality. Finally, we separately demonstrate that the reciprocal distribution results uniquely from requiring maximum entropy for size-class distributions with uniform bin sizes.
Positive phase space distributions and uncertainty relations
NASA Technical Reports Server (NTRS)
Kruger, Jan
1993-01-01
In contrast to a widespread belief, Wigner's theorem allows the construction of true joint probabilities in phase space for distributions describing the object system as well as for distributions depending on the measurement apparatus. The fundamental role of Heisenberg's uncertainty relations in Schroedinger form (including correlations) is pointed out for these two possible interpretations of joint probability distributions. Hence, in order that a multivariate normal probability distribution in phase space may correspond to a Wigner distribution of a pure or a mixed state, it is necessary and sufficient that Heisenberg's uncertainty relation in Schroedinger form should be satisfied.
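For reference, the Schroedinger form of the uncertainty relation (including the position-momentum correlation) referred to above can be written as follows; the notation is the conventional one and is assumed here rather than taken from the paper.

```latex
\sigma_x^2\,\sigma_p^2 - \sigma_{xp}^2 \;\ge\; \frac{\hbar^2}{4},
\qquad
\sigma_{xp} \equiv \tfrac{1}{2}\,\langle \hat{x}\hat{p} + \hat{p}\hat{x} \rangle
             - \langle \hat{x}\rangle\langle \hat{p}\rangle .
```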
So, Emily; Spence, Robin
2013-01-01
Recent earthquakes such as the Haiti earthquake of 12 January 2010 and the Qinghai earthquake on 14 April 2010 have highlighted the importance of rapid estimation of casualties after the event for humanitarian response. Both of these events resulted in surprisingly high death tolls, casualties and survivors made homeless. In the Mw = 7.0 Haiti earthquake, over 200,000 people perished with more than 300,000 reported injuries and 2 million made homeless. The Mw = 6.9 earthquake in Qinghai resulted in over 2,000 deaths with a further 11,000 people with serious or moderate injuries and 100,000 people have been left homeless in this mountainous region of China. In such events relief efforts can be significantly benefitted by the availability of rapid estimation and mapping of expected casualties. This paper contributes to ongoing global efforts to estimate probable earthquake casualties very rapidly after an earthquake has taken place. The analysis uses the assembled empirical damage and casualty data in the Cambridge Earthquake Impacts Database (CEQID) and explores data by event and across events to test the relationships of building and fatality distributions to the main explanatory variables of building type, building damage level and earthquake intensity. The prototype global casualty estimation model described here uses a semi-empirical approach that estimates damage rates for different classes of buildings present in the local building stock, and then relates fatality rates to the damage rates of each class of buildings. This approach accounts for the effect of the very different types of buildings (by climatic zone, urban or rural location, culture, income level etc), on casualties. The resulting casualty parameters were tested against the overall casualty data from several historical earthquakes in CEQID; a reasonable fit was found.
Andersen, D.C.; Shafroth, P.B.
2010-01-01
Beaver convert lotic stream habitat to lentic through dam construction, and the process is reversed when a flood or other event causes dam failure. We investigated both processes on a regulated Sonoran Desert stream, using the criterion that average current velocity is < 0.2 m s⁻¹ in a lentic reach. We estimated temporal change in the lotic:lentic stream length ratio by relating beaver pond length (determined by the upstream lentic-lotic boundary position) to dam size, and coupling that to the dam-size frequency distribution and repeated censuses of dams along the 58-km river. The ratio fell from 19:1 when no beaver dams were present to < 3:1 after 7 years of flows favourable for beaver. We investigated the dam failure-flood intensity relationship in three independent trials (experimental floods) featuring peak discharge ranging from 37 to 65 m³ s⁻¹. Major damage (breach ≥ 3 m wide) occurred at ≤ 20% of monitored dams (n = 7-86) and a similar or higher proportion was moderately damaged. We detected neither a relationship between dam size and damage level nor a flood discharge threshold for initiating major damage. Dam constituent materials appeared to control the probability of major damage at low (attenuated) flood magnitude. We conclude that environmental flows prescribed to sustain desert riparian forest will also reduce beaver-created lentic habitat in a non-linear manner determined by both beaver dam and flood attributes. Consideration of both desirable and undesirable consequences of ecological engineering by beaver is important when optimizing environmental flows to meet ecological and socioeconomic goals.
Brown, A. D.; Pham, Q.; Fortin, E. V.; ...
2016-11-10
Here, three-dimensional x-ray tomography (XRT) provides a nondestructive technique to characterize the size, shape, and location of damage in dynamically loaded metals. A shape-fitting method comprising the inertia tensors of individual damage sites was applied to study differences of spall damage development in face-centered-cubic (FCC) and hexagonal-closed-packed (HCP) multicrystals and for a suite of experiments on high-purity copper to examine the influence of loading kinetics on the spall damage process. Applying a volume-weighted average to the best-fit ellipsoidal aspect-ratios allows a quantitative assessment for determining the extent of damage coalescence present in a shocked metal. It was found that incipient transgranular HCP spall damage nucleates in a lenticular shape and is heavily oriented along particular crystallographic slip directions. In polycrystalline materials, shape distributions indicate that a decrease in the tensile loading rate leads to a transition to coalesced damage dominance and that the plastic processes driving void growth are time dependent.
Wang, Junsheng; Fan, Zhiqiang; Zhao, Yile; Song, Younan; Chu, Hui; Song, Wendong; Song, Yongxin; Pan, Xinxiang; Sun, Yeqing; Li, Dongqing
2016-03-17
Space radiation brings uneven damages to cells. The detection of the distribution of cell damage plays a very important role in radiation medicine and the related research. In this paper, a new hand-held microfluidic flow cytometer was developed to evaluate the degree of radiation damage of cells. The device we propose overcomes the shortcomings (e.g., large volume and high cost) of commercial flow cytometers and can evaluate the radiation damage of cells accurately and quickly with potential for onsite applications. The distribution of radiation-damaged cells is analyzed by a simultaneous detection of immunofluorescence intensity of γ-H2AX and resistance pulse sensor (RPS) signal. The γ-H2AX fluorescence intensity provides information of the degree of radiation damage in cells. The ratio of the number of cells with γ-H2AX fluorescence signals to the total numbers of cells detected by RPS indicates the percentage of the cells that are damaged by radiation. The comparison experiment between the developed hand-held microfluidic flow cytometer and a commercial confocal microscope indicates a consistent and comparable detection performance.
NASA Astrophysics Data System (ADS)
Wang, Junsheng; Fan, Zhiqiang; Zhao, Yile; Song, Younan; Chu, Hui; Song, Wendong; Song, Yongxin; Pan, Xinxiang; Sun, Yeqing; Li, Dongqing
2016-03-01
Space radiation brings uneven damages to cells. The detection of the distribution of cell damage plays a very important role in radiation medicine and the related research. In this paper, a new hand-held microfluidic flow cytometer was developed to evaluate the degree of radiation damage of cells. The device we propose overcomes the shortcomings (e.g., large volume and high cost) of commercial flow cytometers and can evaluate the radiation damage of cells accurately and quickly with potential for onsite applications. The distribution of radiation-damaged cells is analyzed by a simultaneous detection of immunofluorescence intensity of γ-H2AX and resistance pulse sensor (RPS) signal. The γ-H2AX fluorescence intensity provides information of the degree of radiation damage in cells. The ratio of the number of cells with γ-H2AX fluorescence signals to the total numbers of cells detected by RPS indicates the percentage of the cells that are damaged by radiation. The comparison experiment between the developed hand-held microfluidic flow cytometer and a commercial confocal microscope indicates a consistent and comparable detection performance.
Sepehrinezhad, Alireza; Toufigh, Vahab
2018-05-25
Ultrasonic wave attenuation is an effective descriptor of distributed damage in inhomogeneous materials. Methods developed to measure wave attenuation have the potential to provide an in-situ evaluation of existing concrete structures insofar as they are accurate and time-efficient. In this study, material classification and distributed damage evaluation were investigated based on the sinusoidal modeling of the response from through-transmission ultrasonic tests on polymer concrete specimens. The response signal was modeled as a single damping sinusoid or the sum of damping sinusoids. Due to the inhomogeneous nature of concrete materials, model parameters may vary from one specimen to another. Therefore, these parameters are not known in advance and should be estimated while the response signal is being received. The modeling procedure used in this study involves a data-adaptive algorithm to estimate the parameters online. Data-adaptive algorithms are used due to a lack of knowledge of the model parameters. The damping factor was estimated as a descriptor of the distributed damage. The results were compared in two different cases as follows: (1) constant excitation frequency with varying concrete mixtures and (2) constant mixture with varying excitation frequencies. The specimens were also loaded up to their ultimate compressive strength to investigate the effect of distributed damage in the response signal. The results of the estimation indicated that the damping was highly sensitive to the change in material inhomogeneity, even in comparable mixtures. In addition to the proposed method, three methods were employed to compare the results based on their accuracy in the classification of materials and the evaluation of the distributed damage. It is shown that the estimated damping factor is not only sensitive to damage in the final stages of loading, but is also applicable in evaluating micro-damage in the earlier stages, providing a reliable descriptor of damage. In addition, the modified amplitude ratio method is introduced as an improvement of the classical method. The proposed methods were validated to be effective descriptors of distributed damage. The presented models were also in good agreement with the experimental data.
Ubiquity of Benford's law and emergence of the reciprocal distribution
Friar, James Lewis; Goldman, Terrance J.; Pérez-Mercader, J.
2016-04-07
In this paper, we apply the Law of Total Probability to the construction of scale-invariant probability distribution functions (pdf's), and require that probability measures be dimensionless and unitless under a continuous change of scales. If the scale-change distribution function is scale invariant then the constructed distribution will also be scale invariant. Repeated application of this construction on an arbitrary set of (normalizable) pdf's results again in scale-invariant distributions. The invariant function of this procedure is given uniquely by the reciprocal distribution, suggesting a kind of universality. Finally, we separately demonstrate that the reciprocal distribution results uniquely from requiring maximum entropy for size-class distributions with uniform bin sizes.
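A small numerical illustration of the universality suggested above: multiplying several independent, non-scale-invariant positive random variables drives the leading significant digit toward Benford's law, the digit law implied by the reciprocal (1/x) distribution. The chosen input distributions and sample size are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 100_000

# Product of several independent, non-scale-invariant positive variables (assumed choices).
x = rng.uniform(0.5, 5.0, n) * rng.gamma(2.0, 1.5, n) * rng.lognormal(0.0, 1.0, n)

# First significant digit of each value, compared with Benford's law P(d) = log10(1 + 1/d).
leading = (x / 10.0 ** np.floor(np.log10(x))).astype(int)
observed = np.bincount(leading, minlength=10)[1:10] / n
benford = np.log10(1 + 1 / np.arange(1, 10))

for d in range(1, 10):
    print(f"digit {d}: observed {observed[d - 1]:.3f}  Benford {benford[d - 1]:.3f}")
```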
NASA Astrophysics Data System (ADS)
Li, Y.; Gong, H.; Zhu, L.; Guo, L.; Gao, M.; Zhou, C.
2016-12-01
Continuous over-exploitation of groundwater causes dramatic drawdown and leads to regional land subsidence in the Huairou Emergency Water Resources region, located in the upper-middle part of the Chaobai river basin of Beijing. Owing to the spatial heterogeneity of the strata's lithofacies in the alluvial fan, ground deformation has no significant positive correlation with groundwater drawdown, and one of the challenges ahead is to quantify the spatial distribution of the lithofacies. The transition probability geostatistics approach provides a means of characterizing the distribution of heterogeneous lithofacies in the subsurface. By combining the thickness of the clay layer extracted from the simulation with the deformation field acquired from PS-InSAR technology, the influence of the strata's lithofacies on land subsidence can be analyzed quantitatively. The lithofacies derived from borehole data were generalized into four categories, of which clay was the predominant compressible material, and their probability distribution in the observation space was estimated using transition probability geostatistics. Geologically plausible realizations of the lithofacies distribution were produced, accounting for the complex heterogeneity of the alluvial plain. At a probability level of more than 40 percent, the volume of clay defined was 55 percent of the total volume of the strata's lithofacies. This level, nearly equaling the volume of compressible clay derived from the geostatistics, was thus chosen to represent the boundary between compressible and incompressible material. The method incorporates statistical geological information, such as distribution proportions, average lengths and juxtaposition tendencies of geological types, mainly derived from borehole data and expert knowledge, into the Markov chain model of transition probability. Similar patterns were indicated between the spatial distribution of the deformation field and that of the clay layer. In areas with roughly similar water table decline, more deformation occurs at locations where the subsurface has a higher probability of containing compressible material than at locations with a lower probability. Such an estimate of the spatial probability distribution is useful for analyzing the uncertainty of land subsidence.
The exact probability distribution of the rank product statistics for replicated experiments.
Eisinga, Rob; Breitling, Rainer; Heskes, Tom
2013-03-18
The rank product method is a widely accepted technique for detecting differentially regulated genes in replicated microarray experiments. To approximate the sampling distribution of the rank product statistic, the original publication proposed a permutation approach, whereas recently an alternative approximation based on the continuous gamma distribution was suggested. However, both approximations are imperfect for estimating small tail probabilities. In this paper we relate the rank product statistic to number theory and provide a derivation of its exact probability distribution and the true tail probabilities.
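For orientation, a minimal sketch of the rank product statistic for k replicates together with a simple Monte Carlo (permutation-style) tail probability, the quantity that the exact derivation above replaces. The synthetic expression matrix and the single spiked gene are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(5)
n_genes, k = 1000, 4
data = rng.normal(size=(n_genes, k))           # synthetic log-expression differences (assumed)
data[0] += 2.0                                 # one truly up-regulated gene

# Rank within each replicate, with rank 1 = most up-regulated, then geometric mean of ranks.
ranks_up = n_genes + 1 - (data.argsort(axis=0).argsort(axis=0) + 1)
rank_product = ranks_up.prod(axis=1) ** (1.0 / k)

# Simple Monte Carlo null for one gene: k independent uniform ranks per draw.
n_draws = 20_000
null = np.array([rng.integers(1, n_genes + 1, size=k).prod() ** (1.0 / k)
                 for _ in range(n_draws)])
p_value = (null <= rank_product[0]).mean()

print(f"rank product of the spiked gene: {rank_product[0]:.1f}")
print(f"Monte Carlo tail probability   : {p_value:.4f}")
```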
Passage of a ''Nemesis''-like object through the planetary system
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hills, J.G.
1985-09-01
The probability that passing stars could have perturbed the hypothetical stellar companion, Nemesis, into an orbit that penetrates the planetary system is about 15%. The planetary orbits crossed by Nemesis would become highly eccentric, and some would even become hyperbolic. If Nemesis ejects Jupiter from the solar system, the semimajor axis of the orbit of Nemesis would shrink down to a few hundred AU. The probability of any object in the inner edge of the Oort cloud at a semimajor axis of 2 × 10^4 AU having passed inside the orbit of Saturn is about 80%. The apparent lack of damage to the planetary orbits implies a low probability of there being any objects more massive than 0.02 M_sun in the inner edge of the Oort comet cloud. However, several objects less massive than 0.01 M_sun or 10 Jupiter masses could pass through the planetary system from the Oort cloud without causing any significant damage to the planetary orbits. The lack of damage to the planetary system also requires that no black dwarf more massive than 0.05 M_sun has entered the planetary system from interstellar space.
Modeling the probability distribution of peak discharge for infiltrating hillslopes
NASA Astrophysics Data System (ADS)
Baiamonte, Giorgio; Singh, Vijay P.
2017-07-01
Hillslope response plays a fundamental role in the prediction of peak discharge at the basin outlet. The peak discharge for the critical duration of rainfall and its probability distribution are needed for designing urban infrastructure facilities. This study derives the probability distribution, denoted as GABS model, by coupling three models: (1) the Green-Ampt model for computing infiltration, (2) the kinematic wave model for computing discharge hydrograph from the hillslope, and (3) the intensity-duration-frequency (IDF) model for computing design rainfall intensity. The Hortonian mechanism for runoff generation is employed for computing the surface runoff hydrograph. Since the antecedent soil moisture condition (ASMC) significantly affects the rate of infiltration, its effect on the probability distribution of peak discharge is investigated. Application to a watershed in Sicily, Italy, shows that with the increase of probability, the expected effect of ASMC to increase the maximum discharge diminishes. Only for low values of probability, the critical duration of rainfall is influenced by ASMC, whereas its effect on the peak discharge seems to be less for any probability. For a set of parameters, the derived probability distribution of peak discharge seems to be fitted by the gamma distribution well. Finally, an application to a small watershed, with the aim to test the possibility to arrange in advance the rational runoff coefficient tables to be used for the rational method, and a comparison between peak discharges obtained by the GABS model with those measured in an experimental flume for a loamy-sand soil were carried out.
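A minimal sketch of the Green-Ampt ingredient of the coupled model: the implicit equation F = K t + psi·Δθ·ln(1 + F/(psi·Δθ)) for cumulative infiltration, solved by fixed-point iteration, with the infiltration capacity then following from f = K(1 + psi·Δθ/F). The soil parameters are assumed values for a loamy-sand-like soil and depend on the antecedent soil moisture condition (ASMC).

```python
import numpy as np

K = 30.0          # saturated hydraulic conductivity, mm/h (assumed)
psi = 60.0        # wetting-front suction head, mm (assumed)
d_theta = 0.30    # initial soil moisture deficit, dimensionless (assumed; set by ASMC)

def green_ampt_F(t_hours, n_iter=50):
    """Cumulative infiltration F (mm) satisfying F = K t + psi*d_theta*ln(1 + F/(psi*d_theta))."""
    c = psi * d_theta
    F = max(K * t_hours, 1e-6)          # initial guess
    for _ in range(n_iter):             # fixed-point iteration converges for this equation
        F = K * t_hours + c * np.log(1.0 + F / c)
    return F

for t in (0.25, 0.5, 1.0, 2.0):
    F = green_ampt_F(t)
    f_rate = K * (1.0 + psi * d_theta / F)   # infiltration capacity at time t, mm/h
    print(f"t = {t:4.2f} h   F = {F:6.1f} mm   f = {f_rate:5.1f} mm/h")
```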
Constructing inverse probability weights for continuous exposures: a comparison of methods.
Naimi, Ashley I; Moodie, Erica E M; Auger, Nathalie; Kaufman, Jay S
2014-03-01
Inverse probability-weighted marginal structural models with binary exposures are common in epidemiology. Constructing inverse probability weights for a continuous exposure can be complicated by the presence of outliers, and the need to identify a parametric form for the exposure and account for nonconstant exposure variance. We explored the performance of various methods to construct inverse probability weights for continuous exposures using Monte Carlo simulation. We generated two continuous exposures and binary outcomes using data sampled from a large empirical cohort. The first exposure followed a normal distribution with homoscedastic variance. The second exposure followed a contaminated Poisson distribution, with heteroscedastic variance equal to the conditional mean. We assessed six methods to construct inverse probability weights using: a normal distribution, a normal distribution with heteroscedastic variance, a truncated normal distribution with heteroscedastic variance, a gamma distribution, a t distribution (1, 3, and 5 degrees of freedom), and a quantile binning approach (based on 10, 15, and 20 exposure categories). We estimated the marginal odds ratio for a single-unit increase in each simulated exposure in a regression model weighted by the inverse probability weights constructed using each approach, and then computed the bias and mean squared error for each method. For the homoscedastic exposure, the standard normal, gamma, and quantile binning approaches performed best. For the heteroscedastic exposure, the quantile binning, gamma, and heteroscedastic normal approaches performed best. Our results suggest that the quantile binning approach is a simple and versatile way to construct inverse probability weights for continuous exposures.
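A hedged sketch of one of the compared approaches: stabilized inverse probability weights for a continuous exposure built from normal densities (marginal density divided by the conditional-on-covariates density), followed by a weighted outcome model for the marginal odds ratio. The simulated data, variable names, and the statsmodels-based workflow are assumptions for illustration, not the paper's simulation design.

```python
import numpy as np
import statsmodels.api as sm
from scipy import stats

rng = np.random.default_rng(11)
n = 5000
z = rng.normal(size=n)                         # confounder
a = 0.8 * z + rng.normal(size=n)               # continuous exposure (homoscedastic, assumed)
y = (rng.random(n) < 1 / (1 + np.exp(-(0.3 * a + 0.5 * z)))).astype(int)

# Conditional exposure model a | z and the marginal exposure model used for stabilization.
cond_fit = sm.OLS(a, sm.add_constant(z)).fit()
cond_dens = stats.norm.pdf(a, loc=cond_fit.fittedvalues, scale=np.sqrt(cond_fit.scale))
marg_dens = stats.norm.pdf(a, loc=a.mean(), scale=a.std(ddof=1))
weights = marg_dens / cond_dens                # stabilized inverse probability weights

# Weighted (marginal structural) outcome model; weights passed as frequency weights for simplicity.
msm = sm.GLM(y, sm.add_constant(a), family=sm.families.Binomial(),
             freq_weights=weights).fit()
print("marginal odds ratio per unit exposure:", round(np.exp(msm.params[1]), 3))
```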
The HMDS Coating Flaw Removal Tool
DOE Office of Scientific and Technical Information (OSTI.GOV)
Monticelli, M V; Nostrand, M C; Mehta, N
2008-10-24
In many high energy laser systems, optics with HMDS sol gel antireflective coatings are placed in close proximity to each other making them particularly susceptible to certain types of strong optical interactions. During the coating process, halo shaped coating flaws develop around surface digs and particles. Depending on the shape and size of the flaw, the extent of laser light intensity modulation and consequent probability of damaging downstream optics may increase significantly. To prevent these defects from causing damage, a coating flaw removal tool was developed that deploys a spot of decane with a syringe and dissolves away the coating flaw. The residual liquid is evacuated leaving an uncoated circular spot approximately 1 mm in diameter. The resulting uncoated region causes little light intensity modulation and thus has a low probability of causing damage in optics downstream from the mitigated flaw site.
22 CFR 211.9 - Liability for loss damage or improper distribution of commodities.
Code of Federal Regulations, 2012 CFR
2012-04-01
... specify how such losses occurred; (E) Obtain copies of port and/or ship records including scale weights... § 211.9 Liability for loss damage or improper distribution of commodities. (Where the instructions in...
22 CFR 211.9 - Liability for loss damage or improper distribution of commodities.
Code of Federal Regulations, 2011 CFR
2011-04-01
... specify how such losses occurred; (E) Obtain copies of port and/or ship records including scale weights... § 211.9 Liability for loss damage or improper distribution of commodities. (Where the instructions in...
NASA Astrophysics Data System (ADS)
Wu, Bitao; Wu, Gang; Yang, Caiqian; He, Yi
2018-05-01
A novel damage identification method for concrete continuous girder bridges based on spatially-distributed long-gauge strain sensing is presented in this paper. First, the variation regularity of the long-gauge strain influence line of continuous girder bridges, which changes with the location of vehicles on the bridge, is studied. According to this variation regularity, a calculation method for the distribution regularity of the area of the long-gauge strain history is investigated. Second, a numerical simulation of damage identification based on the distribution regularity of the area of long-gauge strain history is conducted, and the results indicate that this method is effective for identifying damage and is not affected by the speed, axle number and weight of vehicles. Finally, a real bridge test on a highway is conducted, and the experimental results also show that this method is very effective for identifying damage in continuous girder bridges, while the local element stiffness distribution regularity can be revealed at the same time. This identified information is useful for the maintenance of continuous girder bridges on highways.
Use of the negative binomial-truncated Poisson distribution in thunderstorm prediction
NASA Technical Reports Server (NTRS)
Cohen, A. C.
1971-01-01
A probability model is presented for the distribution of thunderstorms over a small area given that thunderstorm events (1 or more thunderstorms) are occurring over a larger area. The model incorporates the negative binomial and truncated Poisson distributions. Probability tables for Cape Kennedy for spring, summer, and fall months and seasons are presented. The computer program used to compute these probabilities is appended.
Ditching Investigation of a 1/12-Scale Model of the Douglas F3D-2 Airplane, TED No. NACA DE 381
NASA Technical Reports Server (NTRS)
Fisher, Lloyd J.; Thompson, William C.
1955-01-01
An investigation of a 1/12-scale dynamically similar model of the Douglas F3D-2 airplane was made in calm water to observe the ditching behavior and to determine the safest procedure for making an emergency water landing. Various conditions of damage were simulated to determine the behavior which probably would occur in a full-scale ditching. The behavior of the model was determined from motion-picture records, time-history acceleration records, and visual observations. It was concluded that the airplane should be ditched at a medium-high attitude of about 8 degrees with the landing flaps down 40 degrees. In calm water the airplane will probably make a smooth run of about 550 feet and will have a maximum longitudinal deceleration of about 3g. The fuselage bottom will probably be damaged enough to allow the fuselage to fill with water very rapidly.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ye, G.; Liu, X.; De Schutter, G.
2007-06-15
Self-compacting concrete, as a new smart building material with various advanced properties, has been used for a wide range of structures and infrastructures. However, little investigation has been reported on the properties of self-compacting concrete when it is exposed to elevated temperatures. Previous fire-test experiments have shown the differences between high performance concrete and traditional concrete at elevated temperature. This difference largely depends on the microstructural properties of the concrete matrix, i.e. the cement paste, especially on the porosity, pore size distribution and the connectivity of pores in cement pastes. In this contribution, the investigations are focused on the cement paste. The phase distribution and microstructural changes of self-compacting cement paste at elevated temperatures are examined by mercury intrusion porosimetry and scanning electron microscopy. The chemical decomposition of self-compacting cement paste at different temperatures is determined by thermogravimetric analysis. The experimental results of self-compacting cement paste are compared with those of high performance cement paste and traditional cement paste. It was found that self-compacting cement paste shows a larger change in total porosity in comparison with high performance cement paste. When the temperature is higher than 700 deg. C, a dramatic loss of mass was observed in the self-compacting cement paste samples with addition of limestone filler. This implies that SCC made with this type of self-compacting cement paste will probably show larger damage once exposed to fire. Investigation has shown that 0.5 kg/m^3 of polypropylene fibers in the self-compacting cement paste can effectively prevent this damage.
Statistical characteristics of mechanical heart valve cavitation in accelerated testing.
Wu, Changfu; Hwang, Ned H C; Lin, Yu-Kweng M
2004-07-01
Cavitation damage has been observed on mechanical heart valves (MHVs) undergoing accelerated testing. Cavitation itself can be modeled as a stochastic process, as it varies from beat to beat of the testing machine. This in-vitro study was undertaken to investigate the statistical characteristics of MHV cavitation. A 25-mm St. Jude Medical bileaflet MHV (SJM 25) was tested in an accelerated tester at various pulse rates, ranging from 300 to 1,000 bpm, with stepwise increments of 100 bpm. A miniature pressure transducer was placed near a leaflet tip on the inflow side of the valve, to monitor regional transient pressure fluctuations at instants of valve closure. The pressure trace associated with each beat was passed through a 70 kHz high-pass digital filter to extract the high-frequency oscillation (HFO) components resulting from the collapse of cavitation bubbles. Three intensity-related measures were calculated for each HFO burst: its time span; its local root-mean-square (LRMS) value; and the area enveloped by the absolute value of the HFO pressure trace and the time axis, referred to as cavitation impulse. These were treated as stochastic processes, of which the first-order probability density functions (PDFs) were estimated for each test rate. Both the LRMS value and the cavitation impulse were log-normally distributed, and the time span was normally distributed. These distribution laws were consistent at different test rates. The present investigation was directed at understanding MHV cavitation as a stochastic process. The results provide a basis for establishing further the statistical relationship between cavitation intensity and time-evolving cavitation damage on MHV surfaces. These data are required to assess and compare the performance of MHVs of different designs.
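A minimal sketch of how the three per-beat intensity measures could be computed from a closure pressure trace is shown below; the sampling rate, filter order and burst threshold are assumptions, since none of these details are given above.

```python
# Minimal sketch with assumed sampling rate, filter order and burst threshold.
import numpy as np
from scipy.signal import butter, filtfilt

fs = 500_000          # Hz, assumed sampling rate
fc = 70_000           # Hz, high-pass cutoff quoted in the abstract

def hfo_measures(pressure, thresh=0.05):
    """Return (time_span, lrms, impulse) for one beat's pressure trace."""
    b, a = butter(4, fc / (fs / 2), btype="highpass")
    hfo = filtfilt(b, a, pressure)
    above = np.flatnonzero(np.abs(hfo) > thresh * np.max(np.abs(hfo)))  # crude burst window
    if above.size == 0:
        return 0.0, 0.0, 0.0
    seg = hfo[above[0]:above[-1] + 1]
    dt = 1.0 / fs
    time_span = seg.size * dt                             # duration of the HFO burst
    lrms = np.sqrt(np.mean(seg ** 2))                     # local root-mean-square value
    impulse = np.trapz(np.abs(seg), dx=dt)                # cavitation impulse
    return time_span, lrms, impulse

# Demo on a synthetic 100 kHz decaying burst:
t = np.arange(0, 2e-3, 1 / fs)
print(hfo_measures(np.exp(-t / 3e-4) * np.sin(2 * np.pi * 100_000 * t)))
```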
ABSTRACT The widespread use of titanium dioxide (TiO2) nanoparticles in consumer products increases the probability of exposure to humans and the environment. Although TiO2 nanoparticles have been shown to induce DNA damage (comet assay) and chromosome damage (micronucleus ass...
Comparing flood loss models of different complexity
NASA Astrophysics Data System (ADS)
Schröter, Kai; Kreibich, Heidi; Vogel, Kristin; Riggelsen, Carsten; Scherbaum, Frank; Merz, Bruno
2013-04-01
Any deliberation on flood risk requires the consideration of potential flood losses. In particular, reliable flood loss models are needed to evaluate cost-effectiveness of mitigation measures, to assess vulnerability, for comparative risk analysis and financial appraisal during and after floods. In recent years, considerable improvements have been made both concerning the data basis and the methodological approaches used for the development of flood loss models. Despite that, flood loss models remain an important source of uncertainty. Likewise, the temporal and spatial transferability of flood loss models is still limited. This contribution investigates the predictive capability of different flood loss models in a split-sample, cross-regional validation approach. For this purpose, flood loss models of different complexity, i.e. based on different numbers of explaining variables, are learned from a set of damage records that was obtained from a survey after the Elbe flood in 2002. The validation of model predictions is carried out for different flood events in the Elbe and Danube river basins in 2002, 2005 and 2006 for which damage records are available from surveys after the flood events. The models investigated are a stage-damage model, the rule based model FLEMOps+r as well as novel model approaches which are derived using data mining techniques of regression trees and Bayesian networks. The Bayesian network approach to flood loss modelling provides attractive additional information concerning the probability distribution of both model predictions and explaining variables.
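For context, a hedged sketch of the simplest model class mentioned above, a stage-damage curve; the square-root form and its coefficient are assumptions for illustration, not the coefficients of the cited models.

```python
# Hedged sketch of a stage-damage curve (assumed functional form and coefficient).
import numpy as np

def stage_damage(depth_m, a=0.27):
    """Relative building loss in [0, 1] as a function of inundation depth."""
    return np.clip(a * np.sqrt(np.maximum(depth_m, 0.0)), 0.0, 1.0)

for d in (0.1, 0.5, 1.0, 2.0):
    print(f"depth {d} m -> relative loss {stage_damage(d):.2f}")
```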
NASA Technical Reports Server (NTRS)
Plante, Ianik; Cucinotta, Francis A.
2011-01-01
Radiolytic species are formed approximately 1 ps after the passage of ionizing radiation through matter. After their formation, they diffuse and chemically react with other radiolytic species and neighboring biological molecules, leading to various types of oxidative damage. Therefore, the simulation of radiation chemistry is of considerable importance to understand how radiolytic species damage biological molecules [1]. The step-by-step simulation of chemical reactions is difficult, because the radiolytic species are distributed non-homogeneously in the medium. Consequently, computational approaches based on Green functions for diffusion-influenced reactions should be used [2]. Recently, Green functions for more complex types of reactions have been published [3-4]. We have developed exact random variate generators of these Green functions [5], which will allow us to use them in radiation chemistry codes. Moreover, simulating chemistry using the Green functions is computationally very demanding, because the probabilities of reactions between each pair of particles should be evaluated at each timestep [2]. This kind of problem is well adapted for General Purpose Graphic Processing Units (GPGPU), which can handle a large number of similar calculations simultaneously. These new developments will allow us to include more complex reactions in chemistry codes, and to improve the calculation time. This code should be of importance to link radiation track structure simulations and DNA damage models.
Marutani, Yoko; Yamauchi, Yasuo; Miyoshi, Akihito; Inoue, Kanako; Ikeda, Ken-ichi; Mizutani, Masaharu; Sugimoto, Yukihiro
2014-12-11
Photosystems of higher plants alleviate heat-induced damage in the presence of light under moderately stressed conditions; however, in the absence of light (i.e., in the dark), the same plants are damaged more easily (Yamauchi and Kimura, 2011). We demonstrate that regulating photochemical energy transfer in heat-treated wheat at 40 °C with light contributed to heat tolerance of the photosystem. Chlorophyll fluorescence analysis using heat-stressed wheat seedlings in light showed increased non-photochemical quenching (NPQ) of chlorophyll fluorescence, which was due to thermal dissipation that was increased by state 1 to state 2 transition. Transmission electron microscopy revealed structural changes in thylakoid membranes, including unstacking of grana regions under heat stress in light. It was accompanied by the phosphorylation of thylakoid proteins such as D1 and D2 proteins and the light harvesting complex II proteins Lhcb1 and Lhcb2. These results suggest that heat stress at 40 °C in light induces state 1 to state 2 transition for the preferential excitation of photosystem I (PSI) by phosphorylating thylakoid proteins more strongly. Structural changes of the thylakoid membrane also assist the remodeling of photosystems, and regulation of energy distribution by transition toward state 2 probably contributes to plastoquinone oxidation; thus, light-driven electrons flowing through PSI play a protective role against PSII damage under heat stress.
The Extraction of Post-Earthquake Building Damage Information Based on Convolutional Neural Network
NASA Astrophysics Data System (ADS)
Chen, M.; Wang, X.; Dou, A.; Wu, X.
2018-04-01
The seismic damage information of buildings extracted from remote sensing (RS) imagery is valuable for supporting relief efforts and effectively reducing losses caused by earthquakes. Both traditional pixel-based and object-oriented methods have shortcomings in extracting object information. Pixel-based methods cannot make full use of the contextual information of objects, while object-oriented methods face the problems that image segmentation is not ideal and the choice of feature space is difficult. In this paper, a new strategy is proposed that combines a Convolutional Neural Network (CNN) with imagery segmentation to extract building damage information from remote sensing imagery. The key idea of this method includes two steps: first, the CNN is used to predict the damage probability of each pixel; then, the probabilities are integrated within each segmentation spot. The method is tested by extracting collapsed and uncollapsed buildings from an aerial image acquired over Longtoushan Town after the Ms 6.5 Ludian County, Yunnan Province earthquake. The results show the effectiveness of the proposed method in extracting building damage information after an earthquake.
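A toy sketch of the two-step fusion described above, assuming the CNN's per-pixel damage probabilities and a segmentation label map are already available (both arrays below are hypothetical inputs).

```python
# Toy sketch: average per-pixel damage probabilities within each segmentation spot.
import numpy as np

def classify_segments(pixel_prob, segment_ids, threshold=0.5):
    """Average the pixel probabilities within each segment and label the segment."""
    labels = {}
    for seg in np.unique(segment_ids):
        mean_p = pixel_prob[segment_ids == seg].mean()
        labels[int(seg)] = "collapsed" if mean_p > threshold else "uncollapsed"
    return labels

prob = np.array([[0.9, 0.8, 0.2, 0.1],
                 [0.7, 0.9, 0.3, 0.2],
                 [0.8, 0.6, 0.1, 0.2],
                 [0.9, 0.7, 0.2, 0.1]])
segs = np.array([[1, 1, 2, 2]] * 4)
print(classify_segments(prob, segs))   # {1: 'collapsed', 2: 'uncollapsed'}
```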
NASA Astrophysics Data System (ADS)
Sun, Dongliang; Huang, Guangtuan; Jiang, Juncheng; Zhang, Mingguang; Wang, Zhirong
2013-04-01
Overpressure is one important cause of the domino effect in accidents involving chemical process equipment. Models considering the propagation probability and threshold values of the domino effect caused by overpressure have been proposed in previous studies. In order to test the rationality and validity of the models reported in the literature, the two boundary values separating the three reported damage degrees were treated as random variables in the interval [0, 100%]. Based on the overpressure data for damage to the equipment and the damage state, and on the calculation method reported in the references, the mean square errors of the four categories of overpressure damage probability models were calculated with random boundary values. A relationship between the mean square error and the two boundary values was then obtained, and the minimum mean square error was determined; compared with the result of the present work, the mean square error decreases by about 3%. The error is therefore within the acceptable range for engineering applications, and the reported models can be considered reasonable and valid.
Augspurger, Carol K
2013-01-01
Climate change, with both warmer spring temperatures and greater temperature fluctuations, has altered phenologies, possibly leading to greater risk of spring frost damage to temperate deciduous woody plants. Phenological observations of 20 woody species from 1993 to 2012 in Trelease Woods, Champaign County, Illinois, USA, were used to identify years with frost damage to vegetative and reproductive phases. Local temperature records were used in combination with the phenological observations to determine what combinations of the two were associated with damage. Finally, a long-term temperature record (1889-1992) was evaluated to determine if the frequency of frost damage has risen in recent decades. Frost ≤ -1.7 degrees C occurred after bud-break in 14 of the 20 years of observation. Frost damage occurred in five years in the interior and in three additional years at only the forest edge. The degree of damage varied with species, life stage, tissue (vegetative or reproductive), and phenological phase. Common features associated with the occurrence of damage to interior plants were (1) a period of unusually warm temperatures in March, followed by (2) a frost event in April with a minimum temperature ≤ -6.1 degrees C, and (3) a period of 16-33 days between the extremes. In the long-term record, 10 of 124 years met these conditions, but the yearly probability of frost damage increased significantly, from 0.03 during 1889-1979 to 0.21 during 1980-2012. When the criteria were "softened" to ≤ -1.7 degrees C in April and an interval of 16-37 days, 31 of 124 years met the conditions, and the yearly damage probability increased significantly to 0.19 for 1889-1979 and 0.42 for 1980-2012. In this forest, the combination of warming trends and temperature variability (extremes) associated with climate change is having ecologically important effects, making previously rare frost damage events more common.
Statistical analysis of low-rise building damage caused by the San Fernando earthquake
DOE Office of Scientific and Technical Information (OSTI.GOV)
Scholl, R.E.
1974-02-01
An empirical investigation of damage to low-rise buildings in two selected control areas within Glendale, California, caused by the ground motion precipitated by the San Fernando earthquake of February 9, 1971 is summarized. The procedures for obtaining the appropriate data and the methodology used in deriving ground motion-damage relationships are described. Motion-damage relationships are derived for overall damage and for the most frequently damaged building components. Overall motion-damage relationships are expressed in terms of damage incidence (damage ratio) and damage cost (damage cost factor). The motion-damage relationships derived from the earthquake data are compared with similar data obtained for low-rise buildings subjected to ground motion generated by an underground nuclear explosion. Overall comparison results show that for the same spectral acceleration, the earthquake caused slightly more damage. Differences in ground-motion characteristics for the two types of disturbances provide the most probable explanation for this discrepancy. (auth)
Modelling the Probability of Landslides Impacting Road Networks
NASA Astrophysics Data System (ADS)
Taylor, F. E.; Malamud, B. D.
2012-04-01
During a landslide triggering event, the threat of landslides blocking roads poses a risk to logistics, rescue efforts and communities dependent on those road networks. Here we present preliminary results of a stochastic model we have developed to evaluate the probability of landslides intersecting a simple road network during a landslide triggering event and apply simple network indices to measure the state of the road network in the affected region. A 4000 x 4000 cell array with a 5 m x 5 m resolution was used, with a pre-defined simple road network laid onto it, and landslides 'randomly' dropped onto it. Landslide areas (AL) were randomly selected from a three-parameter inverse gamma probability density function, consisting of a power-law decay of about -2.4 for medium and large values of AL and an exponential rollover for small values of AL; the rollover (maximum probability) occurs at about AL = 400 m2. This statistical distribution was chosen based on three substantially complete triggered landslide inventories recorded in existing literature. The number of landslide areas (NL) selected for each triggered event iteration was chosen to have an average density of 1 landslide km-2, i.e. NL = 400 landslide areas chosen randomly for each iteration, and was based on several existing triggered landslide event inventories. A simple road network was chosen, in a 'T' shape configuration, with one road 1 x 4000 cells (5 m x 20 km) in a 'T' formation with another road 1 x 2000 cells (5 m x 10 km). The landslide areas were then randomly 'dropped' over the road array and indices such as the location, size (ABL) and number of road blockages (NBL) recorded. This process was performed 500 times (iterations) in a Monte-Carlo type simulation. Initial results show that for a landslide triggering event with 400 landslides over a 400 km2 region, the number of road blocks per iteration, NBL, ranges from 0 to 7. The average blockage area for the 500 iterations (mean ABL) is about 3000 m2, which closely matches the mean landslide area AL for the triggered landslide inventories. We further find that over the 500 iterations, the probability of a given number of road blocks occurring on any given iteration, p(NBL) as a function of NBL, follows reasonably well a three-parameter inverse gamma probability density distribution with an exponential rollover (i.e., the most frequent value) at NBL = 1.3. In this paper we have begun to calculate the probability of the number of landslides blocking roads during a triggering event, and have found that this follows an inverse-gamma distribution, which is similar to that found for the statistics of landslide areas resulting from triggers. As we progress to model more realistic road networks, this work will aid in both long-term and disaster management for road networks by allowing probabilistic assessment of road network potential damage during different magnitude landslide triggering event scenarios.
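A compact sketch of this kind of Monte-Carlo experiment is given below; landslides are treated as circles for simplicity, and the inverse-gamma shape and scale are assumptions chosen so the modal area is near 400 m2, not the parameters fitted in the literature.

```python
# Compact Monte-Carlo sketch of landslides intersecting a 'T'-shaped road network.
import numpy as np
from scipy.stats import invgamma

rng = np.random.default_rng(0)
L = 20_000.0                       # domain edge length in metres (20 km x 20 km)
n_landslides = 400                 # ~1 landslide per km^2 over 400 km^2
shape, scale = 1.4, 960.0          # mode = scale / (shape + 1) = 400 m^2

def dist_to_segment(p, a, b):
    """Distances from points p (N, 2) to the segment a-b."""
    ab, ap = b - a, p - a
    t = np.clip((ap @ ab) / (ab @ ab), 0.0, 1.0)
    return np.linalg.norm(ap - np.outer(t, ab), axis=1)

# 'T' network: one 20 km road and one 10 km road meeting at the centre.
segments = [(np.array([0.0, L / 2]), np.array([L, L / 2])),
            (np.array([L / 2, L / 2]), np.array([L / 2, L]))]

def one_event():
    areas = invgamma.rvs(shape, scale=scale, size=n_landslides)
    radii = np.sqrt(areas / np.pi)
    centres = rng.uniform(0.0, L, size=(n_landslides, 2))
    blocked = np.zeros(n_landslides, dtype=bool)
    for a, b in segments:
        blocked |= dist_to_segment(centres, a, b) < radii
    return blocked.sum()

n_blocks = np.array([one_event() for _ in range(500)])
print("mean road blocks per triggering event:", n_blocks.mean())
```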
Comparison of Deterministic and Probabilistic Radial Distribution Systems Load Flow
NASA Astrophysics Data System (ADS)
Gupta, Atma Ram; Kumar, Ashwani
2017-12-01
The distribution system network today is facing the challenge of meeting increased load demands from the industrial, commercial and residential sectors. The pattern of load is highly dependent on consumer behavior and on temporal factors such as season of the year, day of the week or time of the day. For deterministic radial distribution load flow studies, load is taken as constant; however, load varies continually with a high degree of uncertainty, so there is a need to model probable realistic load. Monte-Carlo simulation is used to model the probable realistic load by generating random values of active and reactive power load from the mean and standard deviation of the load and by solving a deterministic radial load flow with these values. The probabilistic solution is reconstructed from the deterministic data obtained for each simulation. The main contributions of the work are: finding the impact of probable realistic ZIP load modeling on balanced radial distribution load flow; finding the impact of probable realistic ZIP load modeling on unbalanced radial distribution load flow; and comparing the voltage profile and losses with probable realistic ZIP load modeling for balanced and unbalanced radial distribution load flow.
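A minimal sketch of the Monte-Carlo load-flow idea on a toy two-branch radial feeder; the per-unit impedances and load statistics are assumptions, and a constant-power load is used for brevity in place of the full ZIP model.

```python
# Monte-Carlo sampling of loads around a deterministic backward/forward-sweep load flow.
import numpy as np

rng = np.random.default_rng(1)
z = np.array([0.02 + 0.04j, 0.03 + 0.06j])    # branch impedances 1-2 and 2-3 (p.u.)
mean_s = np.array([0.8 + 0.4j, 0.6 + 0.3j])   # mean complex loads at buses 2 and 3 (p.u.)
std = 0.1                                     # standard deviation applied to P and Q

def radial_load_flow(s_load, tol=1e-8, max_iter=50):
    """Backward/forward sweep; slack bus held at 1.0 p.u."""
    v = np.ones(3, dtype=complex)
    for _ in range(max_iter):
        i_load = np.conj(s_load / v[1:])                           # bus injection currents
        i_branch = np.array([i_load[0] + i_load[1], i_load[1]])    # backward sweep
        v_new = v.copy()
        v_new[1] = v_new[0] - z[0] * i_branch[0]                   # forward sweep
        v_new[2] = v_new[1] - z[1] * i_branch[1]
        if np.max(np.abs(v_new - v)) < tol:
            return v_new
        v = v_new
    return v

samples = []
for _ in range(2000):
    s = rng.normal(mean_s.real, std) + 1j * rng.normal(mean_s.imag, std)
    samples.append(np.abs(radial_load_flow(s)))
samples = np.array(samples)
print("mean |V| per bus:", samples.mean(axis=0))
print("5th/95th percentiles at bus 3:", np.percentile(samples[:, 2], [5, 95]))
```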
NASA Astrophysics Data System (ADS)
Lozovatsky, I.; Fernando, H. J. S.; Planella-Morato, J.; Liu, Zhiyu; Lee, J.-H.; Jinadasa, S. U. P.
2017-10-01
The probability distribution of the turbulent kinetic energy dissipation rate in the stratified ocean usually deviates from the classic lognormal distribution that has been formulated for and often observed in unstratified homogeneous layers of atmospheric and oceanic turbulence. Our measurements of vertical profiles of micro-scale shear, collected in the East China Sea, the northern Bay of Bengal, to the south and east of Sri Lanka, and in the Gulf Stream region, show that the probability distributions of the locally averaged dissipation rate εr in the pycnoclines (r ≈ 1.4 m is the averaging scale) can be successfully modeled by the Burr (type XII) probability distribution. In weakly stratified boundary layers, the lognormal distribution of εr is preferable, although the Burr is an acceptable alternative. The skewness Skε and the kurtosis Kε of the dissipation rate appear to be well correlated over a wide range of Skε and Kε variability.
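An illustrative sketch of the model comparison described above, fitting lognormal and Burr XII distributions to a synthetic, median-normalized sample of dissipation rates and comparing log-likelihoods; the sample itself is a stand-in, not measured data.

```python
# Fit lognormal and Burr XII distributions to a synthetic dissipation-rate sample.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
eps = np.exp(rng.normal(0.0, 1.2, size=2000))   # stand-in for measured, normalized eps_r

for name, dist in [("lognormal", stats.lognorm), ("Burr XII", stats.burr12)]:
    params = dist.fit(eps, floc=0)               # fix the location parameter at zero
    loglik = np.sum(dist.logpdf(eps, *params))
    print(f"{name}: log-likelihood = {loglik:.1f}, AIC = {2 * len(params) - 2 * loglik:.1f}")
```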
Defensive effects of extrafloral nectaries in quaking aspen differ with scale.
Mortensen, Brent; Wagner, Diane; Doak, Patricia
2011-04-01
The effects of plant defenses on herbivory can differ among spatial scales. This may be particularly common with indirect defenses, such as extrafloral nectaries (EFNs), that attract predatory arthropods and are dependent on predator distribution, abundance, and behavior. We tested the defensive effects of EFNs in quaking aspen (Populus tremuloides Michx.) against damage by a specialist herbivore, the aspen leaf miner (Phyllocnistis populiella Cham.), at the scale of individual leaves and entire ramets (i.e., stems). Experiments excluding crawling arthropods revealed that the effects of aspen EFNs differed at the leaf and ramet scales. Crawling predators caused similar reductions in the percent leaf area mined on individual leaves with and without EFNs. However, the extent to which crawling predators increased leaf miner mortality and, consequently, reduced mining damage increased with EFN expression at the ramet scale. Thus, aspen EFNs provided a diffuse defense, reducing damage to leaves across a ramet regardless of leaf-scale EFN expression. We detected lower leaf miner damage and survival unassociated with crawling predators on EFN-bearing leaves, suggesting that direct defenses (e.g., chemical defenses) were stronger on leaves with than without EFNs. Greater direct defenses on EFN-bearing leaves may reduce the probability of losing these leaves and thus weakening ramet-scale EFN defense. Aspen growth was not related to EFN expression or the presence of crawling predators over the course of a single season. Different effects of aspen EFNs at the leaf and ramet scales suggest that future studies may benefit from examining indirect defenses simultaneously at multiple scales.
Sulfonamide toxicity in brook trout
Wood, E.M.; Yasutake, W.T.; Snieszko, S.F.
1954-01-01
Sterility was observed in female brook trout that were treated with sulfamerazine at frequent intervals for 2 years to control endemic furunculosis. Feeding sulfamerazine for a period of 8 months caused massive kidney damage similar to that observed in humans who develop allergies to “sulfa” drugs. Kidney damage of the type observed would probably cause renal insufficiency which would handicap any physiological function including reproduction. Feeding sulfonamides for periods up to 13 weeks did not produce kidney damage.
Nonlinear features for product inspection
NASA Astrophysics Data System (ADS)
Talukder, Ashit; Casasent, David P.
1999-03-01
Classification of real-time X-ray images of randomly oriented touching pistachio nuts is discussed. The ultimate objective is the development of a system for automated non-invasive detection of defective product items on a conveyor belt. We discuss the extraction of new features that allow better discrimination between damaged and clean items (pistachio nuts). This feature extraction and classification stage is the new aspect of this paper; our new maximum representation and discriminating feature (MRDF) extraction method computes nonlinear features that are used as inputs to a new modified k nearest neighbor classifier. In this work, the MRDF is applied to standard features (rather than iconic data). The MRDF is robust to various probability distributions of the input class and is shown to provide good classification and new ROC (receiver operating characteristic) data.
A coupled weather generator - rainfall-runoff approach on hourly time steps for flood risk analysis
NASA Astrophysics Data System (ADS)
Winter, Benjamin; Schneeberger, Klaus; Dung Nguyen, Viet; Vorogushyn, Sergiy; Huttenlau, Matthias; Merz, Bruno; Stötter, Johann
2017-04-01
The evaluation of potential monetary damage of flooding is an essential part of flood risk management. One possibility to estimate the monetary risk is to analyze long time series of observed flood events and their corresponding damages. In reality, however, only a few flood events are documented. This limitation can be overcome by the generation of a set of synthetic, physically and spatially plausible flood events and subsequently the estimation of the resulting monetary damages. In the present work, a set of synthetic flood events is generated by a continuous rainfall-runoff simulation in combination with a coupled weather generator and temporal disaggregation procedure for the study area of Vorarlberg (Austria). Most flood risk studies focus on daily time steps; however, the mesoscale alpine study area is characterized by short concentration times, leading to large differences between daily mean and daily maximum discharge. Accordingly, an hourly time step is needed for the simulations. The hourly meteorological input for the rainfall-runoff model is generated in a two-step approach. A synthetic daily dataset is generated by a multivariate and multisite weather generator and subsequently disaggregated to hourly time steps with a k-Nearest-Neighbor model. Following the event generation procedure, the negative consequences of flooding are analyzed. The corresponding flood damage for each synthetic event is estimated by combining the synthetic discharge at representative points of the river network with a loss probability relation for each community in the study area. The loss probability relation is based on exposure and susceptibility analyses on a single object basis (residential buildings) for certain return periods. For these impact analyses official inundation maps of the study area are used. Finally, by analyzing the total event time series of damages, the expected annual damage or losses associated with a certain probability of occurrence can be estimated for the entire study area.
NASA Technical Reports Server (NTRS)
Plante, Ianik; Ponomarev, Artem L.; Wu, Honglu; Blattnig, Steve; George, Kerry
2014-01-01
The formation of DNA double-strand breaks (DSBs) and chromosome aberrations is an important consequence of ionizing radiation. To simulate DNA double-strand breaks and the formation of chromosome aberrations, we have recently merged the codes RITRACKS (Relativistic Ion Tracks) and NASARTI (NASA Radiation Track Image). The program RITRACKS is a stochastic code developed to simulate detailed event-by-event radiation track structure [1]. This code is used to calculate the dose in voxels of 20 nm, in a volume containing simulated chromosomes [2]. The number of tracks in the volume is calculated for each simulation by sampling a Poisson distribution, with the distribution parameter obtained from the irradiation dose, ion type and energy. The program NASARTI generates the chromosomes present in a cell nucleus by random walks of 20 nm, corresponding to the size of the dose voxels [3]. The generated chromosomes are located within domains which may intertwine [4]. Each segment of the random walks corresponds to approx. 2,000 DNA base pairs. NASARTI uses the pre-calculated dose at each voxel to calculate the probability of DNA damage at each random walk segment. Using the locations of double-strand breaks, possible rejoining between damaged segments is evaluated. This yields various types of chromosome aberrations, including deletions, inversions, exchanges, etc. By performing the calculations using various types of radiation, it will be possible to obtain relative biological effectiveness (RBE) values for several types of chromosome aberrations.
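A hedged sketch of the track-number sampling step described above: an assumed dose and LET are converted to a mean fluence for a unit-density target, scaled by an assumed cross-sectional area, and the number of tracks per run is drawn from a Poisson distribution. The codes' actual geometry and inputs differ.

```python
# Dose -> fluence -> Poisson-sampled track count (illustrative values throughout).
import numpy as np

rng = np.random.default_rng(6)
dose_gy = 1.0            # assumed absorbed dose
let_kev_um = 30.0        # assumed LET of the ion, keV/micron
area_cm2 = (5e-4) ** 2   # cross-section of an assumed 5 micron x 5 micron volume

# D[Gy] = 1.602e-9 * LET[keV/um] * F[cm^-2] for unit density, hence:
fluence = dose_gy / (1.602e-9 * let_kev_um)     # particles per cm^2
mean_tracks = fluence * area_cm2

print("mean tracks per volume:", round(mean_tracks, 2))
print("sampled track numbers:", rng.poisson(mean_tracks, size=10))
```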
Evaluating 99mTc Auger electrons for targeted tumor radiotherapy by computational methods.
Tavares, Adriana Alexandre S; Tavares, João Manuel R S
2010-07-01
Technetium-99m (99mTc) has been widely used as an imaging agent but only recently has been considered for therapeutic applications. This study aims to analyze the potential use of 99mTc Auger electrons for targeted tumor radiotherapy by evaluating the DNA damage and its probability of correct repair and by studying the cellular kinetics, following 99mTc Auger electron irradiation in comparison to iodine-131 (131I) beta minus particles and astatine-211 (211At) alpha particle irradiation. Computational models were used to estimate the yield of DNA damage (fast Monte Carlo damage algorithm), the probability of correct repair (Monte Carlo excision repair algorithm), and cell kinetic effects (virtual cell radiobiology algorithm) after irradiation with the selected particles. The results obtained with the algorithms used suggested that 99mTc CKMMX (all M-shell Coster-Kronig (CK) and super-CK transitions) electrons and Auger MXY (all M-shell Auger transitions) have a therapeutic potential comparable to high linear energy transfer 211At alpha particles and higher than 131I beta minus particles. All the other 99mTc electrons had a therapeutic potential similar to 131I beta minus particles. 99mTc CKMMX electrons and Auger MXY presented a higher probability to induce apoptosis than 131I beta minus particles and a probability similar to 211At alpha particles. Based on the results here, 99mTc CKMMX electrons and Auger MXY are useful electrons for targeted tumor radiotherapy.
1978-03-01
Evaluation of the Three Parameter Weibull Distribution Function for Predicting Fracture Probability in Composite Materials. Thesis, AFIT/GAE. Derives an equation for the risk of rupture of a unidirectionally laminated composite subjected to pure bending, which can be simplified further.
Jermacane, Daiga; Waite, Thomas David; Beck, Charles R; Bone, Angie; Amlôt, Richard; Reacher, Mark; Kovats, Sari; Armstrong, Ben; Leonardi, Giovanni; James Rubin, G; Oliver, Isabel
2018-03-07
The longer-term impact of flooding on health is poorly understood. In 2015, following widespread flooding in the UK during winter 2013/14, Public Health England launched the English National Study of Flooding and Health. The study identified a higher prevalence of probable psychological morbidity one year after exposure to flooding. We now report findings after two years. In year two (2016), a self-assessment questionnaire including flooding-related exposures and validated instruments to screen for probable anxiety, depression and post-traumatic stress disorder (PTSD) was sent to all participants who consented to further follow-up. Participants' exposure status was categorised according to responses in year one; we assessed for exposure to new episodes of flooding and continuing flood-related problems in respondents' homes. We calculated the prevalence and odds ratio for each outcome by exposure group relative to unaffected participants, adjusting for confounders. We used the McNemar test to assess change in outcomes between year one and year two. In year two, 1064 (70%) people responded. The prevalence of probable psychological morbidity remained elevated amongst flooded participants [n = 339] (depression 10.6%, anxiety 13.6%, PTSD 24.5%) and disrupted participants [n = 512] (depression 4.1%, anxiety 6.4%, PTSD 8.9%), although these rates were reduced compared to year one. A greater reduction was seen in anxiety, 7.6% (95% confidence interval [CI] 4.6-9.9), than in depression, 3.8% (95% CI 1.5-6.1), or PTSD, 6.6% (95% CI 3.9-9.2). Exposure to flooding was associated with higher odds of anxiety (adjusted odds ratio [aOR] 5.2, 95% CI 1.7-16.3) and depression (aOR 8.7, 95% CI 1.9-39.8) but not PTSD. Exposure to disruption caused by flooding was not significantly associated with probable psychological morbidity. Persistent damage in the home as a consequence of the original flooding event was reported by 119 participants (14%). The odds of probable psychological morbidity amongst flooded participants who reported persistent damage, compared with those who were unaffected, were significantly higher than the same comparison amongst flooded participants who did not report persistent damage. This study shows a continuance of probable psychological morbidity at least two years following exposure to flooding. Commissioners and providers of health and social care services should be aware that the increased need in populations may be prolonged. Efforts to resolve persistent damage to homes may reduce the risk of probable psychological morbidity.
NASA Astrophysics Data System (ADS)
Ghotbi, Abdoul R.
2014-09-01
The seismic behavior of skewed bridges has not been well studied compared to straight bridges. Skewed bridges have shown extensive damage, especially due to deck rotation, shear keys failure, abutment unseating and column-bent drift. This research, therefore, aims to study the behavior of skewed and straight highway overpass bridges both with and without taking into account the effects of Soil-Structure Interaction (SSI) due to near-fault ground motions. Due to several sources of uncertainty associated with the ground motions, soil and structure, a probabilistic approach is needed. Thus, a probabilistic methodology similar to the one developed by the Pacific Earthquake Engineering Research Center (PEER) has been utilized to assess the probability of damage due to various levels of shaking using appropriate intensity measures with minimum dispersions. The probabilistic analyses were performed for various bridge configurations and site conditions, including sand ranging from loose to dense and clay ranging from soft to stiff, in order to evaluate the effects. The results proved a considerable susceptibility of skewed bridges to deck rotation and shear keys displacement. It was also found that SSI had a decreasing effect on the damage probability for various demands compared to the fixed-base model without including SSI. However, deck rotation for all types of the soil and also abutment unseating for very loose sand and soft clay showed an increase in damage probability compared to the fixed-base model. The damage probability for various demands has also been found to decrease with an increase of soil strength for both sandy and clayey sites. With respect to the variations in the skew angle, an increase in skew angle has had an increasing effect on the amplitude of the seismic response for various demands. Deck rotation has been very sensitive to the increase in the skew angle; therefore, as the skew angle increased, the deck rotation responded accordingly. Furthermore, abutment unseating showed an increasing trend due to an increase in skew angle for both fixed-base and SSI models.
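For illustration, a sketch of how a PEER-style fragility evaluation expresses the damage probability given an intensity measure; the lognormal form and the median/dispersion values below are assumptions, not results from the study.

```python
# Lognormal fragility curve: P(exceeding a damage state | intensity measure).
import numpy as np
from scipy.stats import norm

def fragility(im, theta, beta):
    """P(demand exceeds the damage-state capacity | intensity measure im)."""
    return norm.cdf(np.log(im / theta) / beta)

# Hypothetical deck-rotation damage state: median capacity at Sa = 0.9 g, beta = 0.5.
for sa in (0.2, 0.5, 0.9, 1.5):
    print(f"Sa = {sa} g -> P(damage) = {fragility(sa, theta=0.9, beta=0.5):.2f}")
```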
DOE Office of Scientific and Technical Information (OSTI.GOV)
Werth, D.; Shine, G.
Recent data sets for three meteorological phenomena with the potential to inflict damage on SRS facilities - tornadoes, straight winds, and heavy precipitation - are analyzed using appropriate statistical techniques to estimate occurrence probabilities for these events in the future. Summaries of the results for DOE-mandated return periods and comparisons to similar calculations performed in 1998 by Weber et al. are given. Using tornado statistics for the states of Georgia and South Carolina, we calculated the probability per year of any location within a 2° square area surrounding SRS being struck by a tornado (the 'strike' probability) and the probability that any point will experience winds above set thresholds. The strike probability was calculated to be 1.15E-3 (1 chance in 870) per year and wind speeds for DOE mandated return periods of 50,000 years, 125,000 years, and 1E+7 years (USDOE, 2012) were estimated to be 136 mph, 151 mph and 221 mph, respectively. In 1998 the strike probability for SRS was estimated to be 3.53E-4 and the return period wind speeds were 148 mph every 50,000 years and 180 mph every 125,000 years. A 1E+7 year tornado wind speed was not calculated in 1998; however, a 3E+6 year wind speed was 260 mph. The lower wind speeds resulting from this most recent analysis are largely due to new data since 1998, and to a lesser degree differences in the models used. By contrast, default tornado wind speeds taken from ANSI/ANS-2.3-2011 are somewhat higher: 161 mph for return periods of 50,000 years, 173 mph every 125,000 years, and 230 mph every 1E+7 years (ANS, 2011). Although the ANS model and the SRS models are very similar, the region defined in ANS 2.3 that encompasses the SRS also includes areas of the Great Plains and lower Midwest, regions with much higher occurrence frequencies of strong tornadoes. The SRS straight wind values associated with various return periods were calculated by fitting existing wind data to a Gumbel distribution, and extrapolating the values for any return period from the tail of that function. For the DOE mandated return periods, we expect straight winds of 123 mph every 2500 years, and 132 mph every 6250 years at any point within the SRS. These values are similar to those from the W98 report (which also used the Gumbel distribution for wind speeds) which gave wind speeds of 115 mph and 122 mph for return periods of 2500 years and 6250 years, respectively. For extreme precipitation accumulation periods, we compared the fits of three different theoretical extreme-value distributions, and in the end decided to maintain the use of the Gumbel distribution for each period. The DOE mandated 6-hr accumulated rainfall for return periods of 2500 years and 6250 years was estimated as 7.8 inches and 8.4 inches, respectively. For the 24-hr rainfall return periods of 10,000 years and 25,000 years, total rainfall estimates were 10.4 inches and 11.1 inches, respectively. These values are substantially lower than comparable values provided in the W98 report. This is largely a consequence of the W98 use of a different extreme value distribution with its corresponding higher extreme probabilities.
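A minimal sketch of the Gumbel return-period extrapolation described above, run on synthetic annual-maximum wind speeds; the location and scale values used to generate the sample are assumptions.

```python
# Fit a Gumbel distribution to annual maxima and extrapolate return-period wind speeds.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
annual_max_mph = rng.gumbel(loc=55.0, scale=8.0, size=60)   # stand-in observed annual maxima

loc, scale = stats.gumbel_r.fit(annual_max_mph)
for T in (2_500, 6_250, 50_000, 125_000):
    # Wind speed whose annual exceedance probability is 1/T:
    speed = stats.gumbel_r.ppf(1.0 - 1.0 / T, loc=loc, scale=scale)
    print(f"{T:>7}-yr return period: {speed:.0f} mph")
```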
Bivariate extreme value distributions
NASA Technical Reports Server (NTRS)
Elshamy, M.
1992-01-01
In certain engineering applications, such as those occurring in the analyses of ascent structural loads for the Space Transportation System (STS), some of the load variables have a lower bound of zero. Thus, the need for practical models of bivariate extreme value probability distribution functions with lower limits was identified. We discuss the Gumbel models and present practical forms of bivariate extreme probability distributions of Weibull and Frechet types with two parameters. Bivariate extreme value probability distribution functions can be expressed in terms of the marginal extremal distributions and a 'dependence' function subject to certain analytical conditions. Properties of such bivariate extreme distributions, sums and differences of paired extremals, as well as the corresponding forms of conditional distributions, are discussed. Practical estimation techniques are also given.
Heterogeneous Defensive Naval Weapon Assignment To Swarming Threats In Real Time
2016-03-01
target_threat_t: Damage potential of target t if it hits the ship [integer from 0 to 3]; target_phit_t: Probability that target t hits the ship [probability ...]; number of secondary weapon systems on target t [integer]; sec_phit_t: Probability that secondary weapon systems launched from target t hit the ship [probability ...] pairing. These parameters are calculated as follows: priority_t = target_threat_t × target_phit_t × 10^3 (3.1); sec_priority_t = 10^3 × sec_...
Population-based Study of Risk Polymorphisms Associated with Vascular Disorders and Dementia
Teijido, Óscar; Carril, Juan Carlos; Cacabelos, Ramón
2017-01-01
Introduction: Cardiovascular and neurodegenerative disorders are among the major causes of mortality in the developed countries. Population studies evaluate the genetic risk, i.e. the probability of an individual carrying a specific disease-associated polymorphism. Identification of risk polymorphisms is essential for an accurate diagnosis or prognosis of a number of pathologies. Aims: The aim of this study was to characterize the influence of risk polymorphisms associated with lipid metabolism, hypertension, thrombosis, and dementia, in a large population of Spanish individuals affected by a variety of brain and vascular disorders as well as metabolic syndrome. Material & Method: We performed a cross-sectional study on 4415 individuals from a widespread regional distribution in Spain (48.15% males and 51.85% females), with mental, neurodegenerative, cerebrovascular, and metabolic disorders. We evaluated polymorphisms in 20 genes involved in obesity, vascular and cardiovascular risk, and dementia in our population and compared them with representative Spanish and European populations. Risk polymorphisms in ACE, AGT(235), IL6(573), PSEN1, and APOE (specially the APOE-ε4 allele) are representative of our population as compared to the reference data of Spanish and European individuals. Conclusion: The significantly higher distribution of risk polymorphisms in PSEN1 and APOE-ε4 is characteristic of a representative number of patients with Alzheimer’s disease; whereas polymorphisms in ACE, AGT(235), and IL6(573), are most probably related with the high number of patients with metabolic syndrome or cerebrovascular damage. PMID:29081698
Asteroid Risk Assessment: A Probabilistic Approach.
Reinhardt, Jason C; Chen, Xi; Liu, Wenhao; Manchev, Petar; Paté-Cornell, M Elisabeth
2016-02-01
Following the 2013 Chelyabinsk event, the risks posed by asteroids attracted renewed interest, from both the scientific and policy-making communities. It reminded the world that impacts from near-Earth objects (NEOs), while rare, have the potential to cause great damage to cities and populations. Point estimates of the risk (such as mean numbers of casualties) have been proposed, but because of the low-probability, high-consequence nature of asteroid impacts, these averages provide limited actionable information. While more work is needed to further refine its input distributions (e.g., NEO diameters), the probabilistic model presented in this article allows a more complete evaluation of the risk of NEO impacts because the results are distributions that cover the range of potential casualties. This model is based on a modularized simulation that uses probabilistic inputs to estimate probabilistic risk metrics, including those of rare asteroid impacts. Illustrative results of this analysis are presented for a period of 100 years. As part of this demonstration, we assess the effectiveness of civil defense measures in mitigating the risk of human casualties. We find that they are likely to be beneficial but not a panacea. We also compute the probability-but not the consequences-of an impact with global effects ("cataclysm"). We conclude that there is a continued need for NEO observation, and for analyses of the feasibility and risk-reduction effectiveness of space missions designed to deflect or destroy asteroids that threaten the Earth. © 2015 Society for Risk Analysis.
[Reversible damages: loss of chance].
Béry, Alain
2013-03-01
Chance is the probability that a particular event may or may not occur and, in this sense, a loss of chance can be defined as the missed opportunities resulting from the loss of the possibility that a favorable event will occur (a contrario, the failure to take risks). This is a self-imposed liability that should be distinguished from the final damage. Moral damage is a notion that is very close to loss of chance although it is based on indemnification from the final damage of an affliction or malady. © EDP Sciences, SFODF, 2013.
Vulnerability and seismic damage scenarios for Barcelona (Spain) by using GIS
NASA Astrophysics Data System (ADS)
Lantada, N.; Pujades, L. G.; Barbat, A.
2003-04-01
An integrated GIS-based analysis (using ArcView GIS) is performed in order to estimate damage scenarios for VI, VII and VIII EMS-98 seismic intensities in Barcelona (Spain). The analysis of vulnerability and damage of individual buildings is performed according to a simplified method developed by Giovanazzi and Lagomarsino at the University of Genoa (Italy). An index of average vulnerability is associated to each building typology, which may be refined on the basis of behaviour modifiers. The index allows identification of an analytical relationship between seismic input (intensity) and damage, described by a binomial distribution. This methodology, which is based on the EMS-98 building typologies and preserves the compatibility with preceding methods, is applied to the two main residential building typologies of Barcelona, that is, unreinforced masonry and reinforced concrete buildings. Then, the specific residential buildings of Barcelona are classified in different groups characterized by a similar seismic behaviour. Therefore, all buildings belonging to each typology are cast in the most probable class according to vulnerability. In this way, the average vulnerability index is associated to each building typology of Barcelona and it is refined later on the basis of behaviour modifiers, linked to the number of stories, the year of construction and their state of maintenance. The ability of GIS tools to store, manage, analyse, and display the large amount of spatial and tabular data involved in this study allows average vulnerability indexes and damage to be mapped for the entire city, that is, vulnerability and damage scenarios. The obtained results show a high vulnerability and high expected seismic damage. For a VI degree of intensity, the maximum expected damage is in the range 15-30% in the oldest zones of the city, the downtown, while for intensity VII it is in the range 45-60%. The developed GIS tool involves a friendly interface that allows new models and database information to be included in the same framework. As a new step to the seismic risk assessment, and in addition to the building characteristics, the destination of the building, as well as the essential buildings, and the density of population for census zones, have been included in the GIS database. Combining this information with the previous damage maps we will be able to obtain more complete damage scenarios including deaths, injuries, and homelessness.
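A hedged sketch of the vulnerability-index approach summarized above: a commonly cited closed form (assumed here, not taken from the paper) maps macroseismic intensity I and vulnerability index V to a mean damage grade, which is then spread over damage grades D0-D5 with a binomial distribution.

```python
# Mean damage grade from intensity and vulnerability, spread binomially over grades D0-D5.
import numpy as np
from scipy.stats import binom

def mean_damage_grade(I, V):
    # Assumed semi-empirical relation between intensity, vulnerability and mean damage grade.
    return 2.5 * (1.0 + np.tanh((I + 6.25 * V - 13.1) / 2.3))

def damage_grade_pmf(I, V):
    mu_d = mean_damage_grade(I, V)
    return binom.pmf(np.arange(6), 5, mu_d / 5.0)    # probabilities of grades D0..D5

for I in (6, 7, 8):
    print(I, np.round(damage_grade_pmf(I, V=0.85), 3))   # V=0.85 assumed for old masonry
```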
NASA Astrophysics Data System (ADS)
Ismaila, Aminu; Md Kasmani, Rafiziana; Meng-Hock, Koh; Termizi Ramli, Ahmad
2017-10-01
This paper deals with the assessment of an external explosion resulting from an accidental release of jet fuel from a large commercial airliner at a nuclear power plant (NPP). The study used three widely used prediction methods, Trinitrotoluene (TNT), multi-energy (TNO) and Baker-Strehlow (BST), to determine the unconfined vapour cloud explosion (UVCE) overpressure within distances of 100-1400 m from the first impact location. The containment building was taken as the reference position. The fatalities of persons and damage to structures were estimated using the probit methodology. Analysis of the results shows that both the reactor building and the control room would be highly damaged, with risk consequences and probability depending on the assumed position of the crash. The structures at a radial distance of 600 m may suffer major structural damage with probability ranging from 25 to 100%. Minor structural damage was observed throughout the bounds of the plant complex. People working within a 250 m radius may be affected, with fatality probabilities ranging from 28 to 100%. The findings of this study are valuable for evaluating the safety improvements needed on the NPP site and the risk and consequences associated with hydrocarbon fuel releases/fires due to external hazards.
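A minimal probit sketch is shown below; the coefficients are from a commonly cited relation for major structural damage and are assumptions for illustration, not necessarily the values used in the study.

```python
# Probit methodology: convert overpressure into a damage probability via Y = a + b*ln(P).
import numpy as np
from scipy.stats import norm

def probit_probability(overpressure_pa, a, b):
    """Damage probability from a probit relation Y = a + b*ln(P), with Pr = Phi(Y - 5)."""
    y = a + b * np.log(overpressure_pa)
    return norm.cdf(y - 5.0)

a, b = -23.8, 2.92   # assumed probit coefficients for major structural damage
for p_kpa in (5, 10, 20, 40):
    print(f"{p_kpa} kPa -> damage probability {probit_probability(p_kpa * 1e3, a, b):.2f}")
```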
Nanosecond multiple pulse measurements and the different types of defects
NASA Astrophysics Data System (ADS)
Wagner, Frank R.; Natoli, Jean-Yves; Beaudier, Alexandre; Commandré, Mireille
2017-11-01
Laser damage measurements with multiple pulses at constant fluence (S-on-1 measurements) are of high practical importance for the design and validation of high-power photonic instruments. For nanosecond lasers, it has long been recognized that single-pulse laser damage is linked to fabrication-related defects. Models describing the laser damage probability as the probability of encounter between the high-fluence region of the laser beam and the fabrication-related defects are thus widely used to analyze the measurements. Nanosecond S-on-1 tests often reveal the "fatigue effect", i.e. a decrease of the laser damage threshold with increasing pulse number. Most authors attribute this effect to cumulative material modifications produced by the first pulses. In this paper we discuss the different situations observed in nanosecond S-on-1 measurements of several different materials at different wavelengths and focus in particular on the defects involved in the laser damage mechanism. These defects may be fabrication-related or laser-induced, stable or evolutive, cumulative or of short lifetime. We show that the type of defect that dominates an S-on-1 experiment depends on the wavelength and the material under test, and give examples from measurements of nonlinear optical crystals, fused silica and oxide mixture coatings.
Nathenson, Manuel; Clynne, Michael A.; Muffler, L.J. Patrick
2012-01-01
Chronologies for eruptive activity of the Lassen Volcanic Center and for eruptions from the regional mafic vents in the surrounding area of the Lassen segment of the Cascade Range are here used to estimate probabilities of future eruptions. For the regional mafic volcanism, the ages of many vents are known only within broad ranges, and two models are developed that should bracket the actual eruptive ages. These chronologies are used with exponential, Weibull, and mixed-exponential probability distributions to match the data for time intervals between eruptions. For the Lassen Volcanic Center, the probability of an eruption in the next year is 1.4x10^-4 for the exponential distribution and 2.3x10^-4 for the mixed exponential distribution. For the regional mafic vents, the exponential distribution gives a probability of an eruption in the next year of 6.5x10^-4, but the mixed exponential distribution indicates that the current probability, 12,000 years after the last event, could be significantly lower. For the exponential distribution, the highest probability is for an eruption from a regional mafic vent. Data on areas and volumes of lava flows and domes of the Lassen Volcanic Center and of eruptions from the regional mafic vents provide constraints on the probable sizes of future eruptions. Probabilities of lava-flow coverage are similar for the Lassen Volcanic Center and for regional mafic vents, whereas the probable eruptive volumes for the mafic vents are generally smaller. Data have been compiled for large explosive eruptions (>≈ 5 km3 in deposit volume) in the Cascade Range during the past 1.2 m.y. in order to estimate probabilities of eruption. For erupted volumes >≈5 km3, the rate of occurrence since 13.6 ka is much higher than for the entire period, and we use these data to calculate the annual probability of a large eruption at 4.6x10^-4. For erupted volumes ≥10 km3, the rate of occurrence has been reasonably constant from 630 ka to the present, giving more confidence in the estimate, and we use those data to calculate the annual probability of a large eruption in the next year at 1.4x10^-5.
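A short sketch of how an annual eruption probability follows from the exponential and mixed-exponential interval models; the mixture weights and rates below are assumptions, not the fitted values of the paper.

```python
# Annual eruption probability from exponential and mixed-exponential interval models.
import numpy as np

def annual_prob_exponential(rate_per_yr):
    """P(eruption within the next year) for a memoryless exponential model."""
    return 1.0 - np.exp(-rate_per_yr)

def annual_prob_mixed_exponential(w, rate1, rate2, t_since_last):
    """Conditional annual probability for a two-component exponential mixture,
    given t_since_last years since the last event."""
    surv = lambda t: w * np.exp(-rate1 * t) + (1.0 - w) * np.exp(-rate2 * t)
    return 1.0 - surv(t_since_last + 1.0) / surv(t_since_last)

print(annual_prob_exponential(6.5e-4))                         # exponential, regional vents
print(annual_prob_mixed_exponential(0.5, 1e-3, 1e-5, 12_000))  # assumed mixture, 12 kyr after last event
```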
Climatic variables are associated with the prevalence of biliary trematodes in otters.
Sherrard-Smith, Ellie; Chadwick, Elizabeth A; Cable, Joanne
2013-08-01
Parasites with complex life cycles are expected to be disproportionately affected by climate change. Knowledge of current associations with weather and host-parasite interactions is therefore essential for the inference of future distributions. The Eurasian otter, Lutra lutra, is exposed to a range of parasites due to its large home range and use of terrestrial, freshwater and marine habitats. As such, it can act as a sentinel species for generalist parasites. Here we consider two biliary parasites recently reported in the United Kingdom, Pseudamphistomum truncatum and Metorchis albidus (Trematoda, Opisthorchiidae), and ask whether there are associations between abiotic factors (season, temperature, rainfall and the North Atlantic Oscillation) and the prevalence and intensities of these parasites in otters (n = 586). To control for biotic interactions we first examined whether particular sub-groups of the otter population (grouped by sex, age-class and condition) are more prone to infection and whether any damage is associated with the presence of these parasites. Even though mean intensities of the smaller trematode, P. truncatum (28.3 worms/host), were much higher than M. albidus (4.1), both parasite species had similar impacts on the otter. The distributions of parasites on host sexes were similar, but males suffered greater damage and regardless of sex, parasite intensity increased in older hosts. The probability of infection with either parasite was negatively associated with ground frost, minimum temperatures and rainfall, but was positively associated with warm long-term average temperatures. Although it is widely accepted that multiple variables influence parasite distributions, to our knowledge this is one of only a few studies to examine the combined impact of biotic and abiotic variables on parasites with complex life cycles within their wild definitive host. Identifying such associations can give greater accuracy to predictions concerning the distribution and spread of trematodes with future climate change. Copyright © 2013 Australian Society for Parasitology Inc. Published by Elsevier Ltd. All rights reserved.
Burst wait time simulation of CALIBAN reactor at delayed super-critical state
DOE Office of Scientific and Technical Information (OSTI.GOV)
Humbert, P.; Authier, N.; Richard, B.
2012-07-01
In the past, the super-prompt-critical wait time probability distribution was measured on the CALIBAN fast burst reactor [4]. Afterwards, these experiments were simulated with very good agreement by solving the non-extinction probability equation [5]. Recently, the burst wait time probability distribution has been measured at CEA-Valduc on CALIBAN at different delayed super-critical states [6]. However, in the delayed super-critical case the non-extinction probability does not give access to the wait time distribution. In this case it is necessary to compute the time-dependent evolution of the full neutron count number probability distribution. In this paper we present the point-model deterministic method used to calculate the probability distribution of the wait time before a prescribed count level, taking into account prompt neutrons and delayed neutron precursors. This method is based on the solution of the time-dependent adjoint Kolmogorov master equations for the number of detections, using the generating function methodology [8,9,10] and inverse discrete Fourier transforms. The obtained results are then compared to the measurements and to Monte-Carlo calculations based on the algorithm presented in [7]. (authors)
Steady state, relaxation and first-passage properties of a run-and-tumble particle in one-dimension
NASA Astrophysics Data System (ADS)
Malakar, Kanaya; Jemseena, V.; Kundu, Anupam; Vijay Kumar, K.; Sabhapandit, Sanjib; Majumdar, Satya N.; Redner, S.; Dhar, Abhishek
2018-04-01
We investigate the motion of a run-and-tumble particle (RTP) in one dimension. We find the exact probability distribution of the particle with and without diffusion on the infinite line, as well as in a finite interval. In the infinite domain, this probability distribution approaches a Gaussian form in the long-time limit, as in the case of a regular Brownian particle. At intermediate times, this distribution exhibits unexpected multi-modal forms. In a finite domain, the probability distribution reaches a steady-state form with peaks at the boundaries, in contrast to a Brownian particle. We also study the relaxation to the steady-state analytically. Finally we compute the survival probability of the RTP in a semi-infinite domain with an absorbing boundary condition at the origin. In the finite interval, we compute the exit probability and the associated exit times. We provide numerical verification of our analytical results.
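A simple simulation sketch of a 1D run-and-tumble particle with an assumed speed, tumble rate and diffusion constant, illustrating the long-time Gaussian behaviour described above.

```python
# Direct simulation of many independent 1D run-and-tumble particles.
import numpy as np

rng = np.random.default_rng(4)
v, gamma, D = 1.0, 1.0, 0.1        # run speed, direction-flip rate, translational diffusion
dt, t_max, n = 1e-3, 5.0, 20_000   # time step, total time, number of independent particles

x = np.zeros(n)
sigma = rng.choice([-1.0, 1.0], size=n)              # internal direction state of each RTP
for _ in range(int(t_max / dt)):
    x += v * sigma * dt + np.sqrt(2.0 * D * dt) * rng.standard_normal(n)
    flip = rng.random(n) < gamma * dt                # Poissonian tumbles flip the direction
    sigma[flip] *= -1.0

# At long times the variance grows diffusively, approximately 2*(D + v**2/(2*gamma))*t.
print("sample mean:", x.mean(), " sample variance:", x.var())
```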
Fitness Probability Distribution of Bit-Flip Mutation.
Chicano, Francisco; Sutton, Andrew M; Whitley, L Darrell; Alba, Enrique
2015-01-01
Bit-flip mutation is a common mutation operator for evolutionary algorithms applied to optimize functions over binary strings. In this paper, we develop results from the theory of landscapes and Krawtchouk polynomials to exactly compute the probability distribution of fitness values of a binary string undergoing uniform bit-flip mutation. We prove that this probability distribution can be expressed as a polynomial in p, the probability of flipping each bit. We analyze these polynomials and provide closed-form expressions for an easy linear problem (Onemax), and an NP-hard problem, MAX-SAT. We also discuss a connection of the results with runtime analysis.
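For Onemax the distribution can be written down directly: if a length-n string has k ones and each bit flips independently with probability p, the new fitness is k minus a Binomial(k, p) variable plus a Binomial(n-k, p) variable. The hedged sketch below computes this convolution exactly (each entry is a polynomial in p, consistent with the result above); it is an illustration, not the authors' Krawtchouk-polynomial derivation.

```python
import numpy as np
from scipy.stats import binom

def onemax_mutation_pmf(n, k, p):
    """Exact distribution of the Onemax fitness of a length-n string with k ones
    after independent bit-flip mutation with per-bit probability p."""
    pmf = np.zeros(n + 1)
    for x in range(k + 1):            # x ones flipped to zero
        for y in range(n - k + 1):    # y zeros flipped to one
            pmf[k - x + y] += binom.pmf(x, k, p) * binom.pmf(y, n - k, p)
    return pmf

pmf = onemax_mutation_pmf(n=20, k=15, p=0.05)
print(pmf.argmax(), pmf.sum())        # most likely fitness (15) and total probability (~1.0)
```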
NASA Technical Reports Server (NTRS)
Bell, V. L.
1980-01-01
The potential damage to electrical equipment caused by the release of carbon fibers from burning commercial airliners is assessed in terms of annual expected costs and maximum losses at low probabilities of occurrence. A materials research program to provide alternate or modified composite materials for aircraft structures is reviewed.
ERIC Educational Resources Information Center
Conant, Darcy Lynn
2013-01-01
Stochastic understanding of probability distribution undergirds development of conceptual connections between probability and statistics and supports development of a principled understanding of statistical inference. This study investigated the impact of an instructional course intervention designed to support development of stochastic…
Monte Carlo Treatment of Displacement Damage in Bandgap Engineered HgCdTe Detectors
NASA Technical Reports Server (NTRS)
Fodness, Bryan C.; Marshall, Paul W.; Reed, Robert A.; Jordan, Thomas M.; Pickel, James C.; Jun, Insoo; Xapsos, Michael A.; Burke, Edward A.
2003-01-01
The conclusions are: 1. Description of the NIEL calculation for short-, mid-, and longwave HgCdTe material compositions. 2. Full recoil spectra details are captured and analyzed, highlighting the importance of variance in high-Z materials. 3. The approach can be applied directly to calculate damage distributions in arrays. 4. Future work will provide comparisons of measured array damage with calculated NIEL and damage energy distributions. 5. The technique for assessing the full recoil spectrum behavior is extendable to other materials.
Probability distributions of the electroencephalogram envelope of preterm infants.
Saji, Ryoya; Hirasawa, Kyoko; Ito, Masako; Kusuda, Satoshi; Konishi, Yukuo; Taga, Gentaro
2015-06-01
To determine the stationary characteristics of electroencephalogram (EEG) envelopes for prematurely born (preterm) infants and investigate the intrinsic characteristics of early brain development in preterm infants. Twenty neurologically normal sets of EEGs recorded in infants with a post-conceptional age (PCA) range of 26-44 weeks (mean 37.5 ± 5.0 weeks) were analyzed. Hilbert transform was applied to extract the envelope. We determined the suitable probability distribution of the envelope and performed a statistical analysis. It was found that (i) the probability distributions for preterm EEG envelopes were best fitted by lognormal distributions at 38 weeks PCA or less, and by gamma distributions at 44 weeks PCA; (ii) the scale parameter of the lognormal distribution had positive correlations with PCA as well as a strong negative correlation with the percentage of low-voltage activity; (iii) the shape parameter of the lognormal distribution had significant positive correlations with PCA; (iv) the statistics of mode showed significant linear relationships with PCA, and, therefore, it was considered a useful index in PCA prediction. These statistics, including the scale parameter of the lognormal distribution and the skewness and mode derived from a suitable probability distribution, may be good indexes for estimating stationary nature in developing brain activity in preterm infants. The stationary characteristics, such as discontinuity, asymmetry, and unimodality, of preterm EEGs are well indicated by the statistics estimated from the probability distribution of the preterm EEG envelopes. Copyright © 2014 International Federation of Clinical Neurophysiology. Published by Elsevier Ireland Ltd. All rights reserved.
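A minimal sketch of the signal-processing step described above, using synthetic noise as a stand-in for a band-passed preterm EEG trace: the envelope is taken as the magnitude of the analytic signal from a Hilbert transform, and lognormal and gamma fits are compared by log-likelihood.

```python
import numpy as np
from scipy.signal import hilbert
from scipy import stats

rng = np.random.default_rng(0)
fs = 200.0                                  # sampling rate, Hz (illustrative)
t = np.arange(0, 60, 1 / fs)
eeg = rng.standard_normal(t.size)           # stand-in for a band-passed EEG trace

envelope = np.abs(hilbert(eeg))             # amplitude envelope via the analytic signal

# Compare lognormal and gamma fits of the envelope, as in the study.
shape, loc, scale = stats.lognorm.fit(envelope, floc=0)
a, loc_g, scale_g = stats.gamma.fit(envelope, floc=0)
print("lognormal log-likelihood:", stats.lognorm.logpdf(envelope, shape, loc, scale).sum())
print("gamma     log-likelihood:", stats.gamma.logpdf(envelope, a, loc_g, scale_g).sum())
```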
Probability of growth of small damage sites on the exit surface of fused silica optics.
Negres, Raluca A; Abdulla, Ghaleb M; Cross, David A; Liao, Zhi M; Carr, Christopher W
2012-06-04
Growth of laser damage on fused silica optical components depends on several key parameters including laser fluence, wavelength, pulse duration, and site size. Here we investigate the growth behavior of small damage sites on the exit surface of SiO₂ optics under exposure to tightly controlled laser pulses. Results demonstrate that the onset of damage growth is not governed by a threshold, but is probabilistic in nature and depends both on the current size of a damage site and the laser fluence to which it is exposed. We also develop models for use in growth prediction. In addition, we show that laser exposure history also influences the behavior of individual sites.
Ponomarev, Artem L; Costes, Sylvain V; Cucinotta, Francis A
2008-11-01
We computed the probabilities of having multiple double-strand breaks (DSB), which are produced in DNA on a regional scale, and not in close vicinity, in volumes matching the size of DNA damage foci, of a large chromatin loop, and in the physical volume of DNA containing the HPRT (human hypoxanthine phosphoribosyltransferase) locus. The model is based on a Monte Carlo description of DSB formation by heavy ions in the spatial context of the entire human genome contained within the cell nucleus, as well as at the gene sequence level. We showed that a finite physical volume corresponding to a visible DNA repair focus, believed to be associated with one DSB, can contain multiple DSB due to heavy ion track structure and the DNA supercoiled topography. A corrective distribution was introduced, which is the conditional probability of having excess DSB in a focus volume, given that one is already present. The corrective distribution was calculated for 19.5 MeV/amu N ions, 3.77 MeV/amu alpha-particles, 1000 MeV/amu Fe ions, and X-rays. The corrected initial DSB yield from the experimental data on DNA repair foci was calculated. The corrective function converts the focus yield into a DSB yield that is comparable with the DSB yield based on the earlier PFGE experiments. The distribution of DSB within the physical limits of the HPRT gene was analyzed by a similar method as well. This corrective procedure shows the applicability of the model and empowers the researcher with a tool to better analyze focus statistics. The model enables researchers to analyze the DSB yield based on focus statistics in real experimental situations that lack one-to-one focus-to-DSB correspondence.
Flood Frequency Curves - Use of information on the likelihood of extreme floods
NASA Astrophysics Data System (ADS)
Faber, B.
2011-12-01
Investment in the infrastructure that reduces flood risk for flood-prone communities must incorporate information on the magnitude and frequency of flooding in that area. Traditionally, that information has been a probability distribution of annual maximum streamflows developed from the historical gaged record at a stream site. Practice in the United States fits a Log-Pearson Type III distribution to the annual maximum flows of an unimpaired streamflow record, using the method of moments to estimate distribution parameters. The procedure makes the assumptions that annual peak streamflow events are (1) independent, (2) identically distributed, and (3) form a representative sample of the overall probability distribution. Each of these assumptions can be challenged. We rarely have enough data to form a representative sample, and therefore must compute and display the uncertainty in the estimated flood distribution. But, is there a wet/dry cycle that makes precipitation less than independent between successive years? Are the peak flows caused by different types of events from different statistical populations? How do changes in the watershed or climate over time (non-stationarity) affect the probability distribution of floods? Potential approaches to avoid these assumptions vary from estimating trend and shift and removing them from early data (and so forming a homogeneous data set), to methods that estimate statistical parameters that vary with time. A further issue in estimating a probability distribution of flood magnitude (the flood frequency curve) is whether a purely statistical approach can accurately capture the range and frequency of floods that are of interest. A meteorologically-based analysis produces "probable maximum precipitation" (PMP) and subsequently a "probable maximum flood" (PMF) that attempts to describe an upper bound on flood magnitude in a particular watershed. This analysis can help constrain the upper tail of the probability distribution, well beyond the range of gaged data or even historical or paleo-flood data, which can be very important in risk analyses performed for flood risk management and dam and levee safety studies.
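For concreteness, a hedged sketch of the standard U.S. practice mentioned above: a Log-Pearson Type III fit by the method of moments on the base-10 logarithms of illustrative annual peaks (regional skew weighting, outlier screening, and confidence limits from Bulletin 17B/17C are omitted).

```python
import numpy as np
from scipy import stats

# Annual maximum flows (illustrative values, cfs); real studies use the gaged record.
peaks = np.array([1200., 3400., 980., 5600., 2100., 1500., 7200., 2600.,
                  1900., 4300., 880., 3100., 2500., 6100., 1700., 2900.])

logq = np.log10(peaks)
mean, std = logq.mean(), logq.std(ddof=1)
skew = stats.skew(logq, bias=False)               # station skew (no regional weighting)

# Log-Pearson Type III quantile for a given annual exceedance probability (AEP).
def lp3_flow(aep):
    return 10 ** stats.pearson3.ppf(1.0 - aep, skew, loc=mean, scale=std)

for aep in (0.5, 0.1, 0.01):                      # 2-, 10-, 100-year floods
    print(f"AEP {aep:>4}: {lp3_flow(aep):,.0f} cfs")
```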
Accident hazard evaluation and control decisions on forested recreation sites
Lee A. Paine
1971-01-01
Accident hazard associated with trees on recreation sites is inherently concerned with probabilities. The major factors include the probabilities of mechanical failure and of target impact if failure occurs, the damage potential of the failure, and the target value. Hazard may be evaluated as the product of these factors; i.e., expected loss during the current...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-08-22
... adversely affect plant safety, and would have no adverse effect on the probability of any accident. For the accidents that involve damage or melting of the fuel in the reactor core, fuel rod integrity has been shown to be unaffected by extended burnup under consideration; therefore, the probability of an accident...
Modeling extreme hurricane damage in the United States using generalized Pareto distribution
NASA Astrophysics Data System (ADS)
Dey, Asim Kumer
Extreme value distributions are used to understand and model natural calamities, man-made catastrophes and financial collapses. Extreme value theory has been developed to study the frequency of such events and to construct a predictive model so that one can attempt to forecast the frequency of a disaster and the amount of damage from such a disaster. In this study, hurricane damages in the United States from 1900-2012 have been studied. The aim of the paper is three-fold. First, normalizing hurricane damage and fitting an appropriate model for the normalized damage data. Secondly, predicting the maximum economic damage from a hurricane in the future by using the concept of return period. Finally, quantifying the uncertainty in the inference of extreme return levels of hurricane losses by using a simulated hurricane series, generated by bootstrap sampling. Normalized hurricane damage data are found to follow a generalized Pareto distribution. It is demonstrated that standard deviation and coefficient of variation increase with the return period, which indicates an increase in uncertainty with model extrapolation.
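A hedged sketch of a peaks-over-threshold fit of the kind described above, using synthetic losses rather than the 1900-2012 record; the return level combines the exceedance rate with the fitted generalized Pareto quantile.

```python
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(3)
damages = rng.pareto(1.5, 500) * 1e9          # synthetic normalized damages (USD); stand-in data

u = np.quantile(damages, 0.90)                # threshold defining "extreme" losses
exceed = damages[damages > u] - u
c, loc, scale = genpareto.fit(exceed, floc=0) # shape and scale of the excess distribution

# m-observation return level: the loss exceeded on average once every m observations.
def return_level(m, rate=exceed.size / damages.size):
    return u + genpareto.ppf(1.0 - 1.0 / (m * rate), c, loc=0, scale=scale)

print(f"shape={c:.2f}, 100-event return level = {return_level(100):,.0f}")
```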
Dinov, Ivo D.; Kamino, Scott; Bhakhrani, Bilal; Christou, Nicolas
2014-01-01
Summary Data analysis requires subtle probability reasoning to answer questions like What is the chance of event A occurring, given that event B was observed? This generic question arises in discussions of many intriguing scientific questions such as What is the probability that an adolescent weighs between 120 and 140 pounds given that they are of average height? and What is the probability of (monetary) inflation exceeding 4% and housing price index below 110? To address such problems, learning some applied, theoretical or cross-disciplinary probability concepts is necessary. Teaching such courses can be improved by utilizing modern information technology resources. Students’ understanding of multivariate distributions, conditional probabilities, correlation and causation can be significantly strengthened by employing interactive web-based science educational resources. Independent of the type of a probability course (e.g. majors, minors or service probability course, rigorous measure-theoretic, applied or statistics course) student motivation, learning experiences and knowledge retention may be enhanced by blending modern technological tools within the classical conceptual pedagogical models. We have designed, implemented and disseminated a portable open-source web-application for teaching multivariate distributions, marginal, joint and conditional probabilities using the special case of bivariate Normal distribution. A real adolescent height and weight dataset is used to demonstrate the classroom utilization of the new web-application to address problems of parameter estimation, univariate and multivariate inference. PMID:25419016
Stylized facts in internal rates of return on stock index and its derivative transactions
NASA Astrophysics Data System (ADS)
Pichl, Lukáš; Kaizoji, Taisei; Yamano, Takuya
2007-08-01
Universal features in stock markets and their derivative markets are studied by means of probability distributions in internal rates of return on buy and sell transaction pairs. Unlike the stylized facts in normalized log returns, the probability distributions for such single asset encounters incorporate the time factor by means of the internal rate of return, defined as the continuous compound interest. Resulting stylized facts are shown in the probability distributions derived from the daily series of TOPIX, S & P 500 and FTSE 100 index close values. The application of the above analysis to minute-tick data of NIKKEI 225 and its futures market, respectively, reveals an interesting difference in the behavior of the two probability distributions, in case a threshold on the minimal duration of the long position is imposed. It is therefore suggested that the probability distributions of the internal rates of return could be used for causality mining between the underlying and derivative stock markets. The highly specific discrete spectrum, which results from noise trader strategies as opposed to the smooth distributions observed for fundamentalist strategies in single encounter transactions may be useful in deducing the type of investment strategy from trading revenues of small portfolio investors.
Probabilistic Reasoning for Robustness in Automated Planning
NASA Technical Reports Server (NTRS)
Schaffer, Steven; Clement, Bradley; Chien, Steve
2007-01-01
A general-purpose computer program for planning the actions of a spacecraft or other complex system has been augmented by incorporating a subprogram that reasons about uncertainties in such continuous variables as times taken to perform tasks and amounts of resources to be consumed. This subprogram computes parametric probability distributions for time and resource variables on the basis of user-supplied models of actions and resources that they consume. The current system accepts bounded Gaussian distributions over action duration and resource use. The distributions are then combined during planning to determine the net probability distribution of each resource at any time point. In addition to a full combinatoric approach, several approximations for arriving at these combined distributions are available, including maximum-likelihood and pessimistic algorithms. Each such probability distribution can then be integrated to obtain a probability that execution of the plan under consideration would violate any constraints on the resource. The key idea is to use these probabilities of conflict to score potential plans and drive a search toward planning low-risk actions. An output plan provides a balance between the user's specified averseness to risk and other measures of optimality.
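A minimal sketch of the simplest case described above, assuming independent, untruncated Gaussian resource-use models and a single capacity constraint; the actual planner uses bounded Gaussians and several combination algorithms, and all numbers here are illustrative.

```python
import numpy as np
from scipy.stats import norm

# Per-activity energy use modelled as Gaussian (mean Wh, standard deviation Wh).
activities = [(120.0, 10.0), (80.0, 15.0), (200.0, 25.0)]
capacity = 450.0                                   # resource constraint, Wh

# Assuming independence, the net usage is Gaussian with summed mean and variance.
mu = sum(m for m, s in activities)
sigma = np.sqrt(sum(s**2 for m, s in activities))

p_conflict = norm.sf(capacity, loc=mu, scale=sigma)   # P(total usage > capacity)
print(f"P(resource violation) = {p_conflict:.3f}")    # used to score candidate plans
```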
Radiation damage to DNA in DNA-protein complexes.
Spotheim-Maurizot, M; Davídková, M
2011-06-03
The most aggressive product of water radiolysis, the hydroxyl (OH) radical, is responsible for the indirect effect of ionizing radiations on DNA in solution and aerobic conditions. According to radiolytic footprinting experiments, the resulting strand breaks and base modifications are inhomogeneously distributed along the DNA molecule irradiated free or bound to ligands (polyamines, thiols, proteins). A Monte-Carlo based model of simulation of the reaction of OH radicals with the macromolecules, called RADACK, allows calculating the relative probability of damage of each nucleotide of DNA irradiated alone or in complexes with proteins. RADACK calculations require the knowledge of the three dimensional structure of DNA and its complexes (determined by X-ray crystallography, NMR spectroscopy or molecular modeling). The confrontation of the calculated values with the results of the radiolytic footprinting experiments together with molecular modeling calculations show that: (1) the extent and location of the lesions are strongly dependent on the structure of DNA, which in turns is modulated by the base sequence and by the binding of proteins and (2) the regions in contact with the protein can be protected against the attack by the hydroxyl radicals via masking of the binding site and by scavenging of the radicals. 2011 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Sembiring, N.; Ginting, E.; Darnello, T.
2017-12-01
In a company that produces refined sugar, the production floor has not reached the required availability of its critical machines because they frequently suffer damage (breakdowns), resulting in sudden losses of production time and production opportunities. This problem can be addressed with Reliability Engineering methods, in which a statistical analysis of historical failure data is performed to identify the underlying distribution. The method provides values for the reliability, failure rate, and availability of a machine over a scheduled maintenance interval. Distribution tests on the time-between-failure (MTTF) data show that the flexible hose component follows a lognormal distribution while the teflon cone lifting component follows a Weibull distribution; for the mean time to repair (MTTR) data, the flexible hose component follows an exponential distribution while the teflon cone lifting component follows a Weibull distribution. For the flexible hose component, a replacement schedule of every 720 hours gives a reliability of 0.2451 and an availability of 0.9960, while for the critical teflon cone lifting component, a replacement schedule of every 1944 hours gives a reliability of 0.4083 and an availability of 0.9927.
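As an illustration of the calculation described above, the sketch below fits a Weibull distribution to made-up time-between-failure data (standing in for the teflon cone lifting component) and evaluates the reliability at the scheduled replacement interval and a steady-state availability; none of the numbers are the plant's data.

```python
import numpy as np
from scipy.stats import weibull_min

# Illustrative time-between-failure data (hours); the study fitted a Weibull to MTTF data.
ttf = np.array([1650., 2100., 1820., 2400., 1540., 2250., 1980., 2600., 1750., 2050.])
mttr_mean = 14.0                               # mean time to repair (hours), assumed

beta, loc, eta = weibull_min.fit(ttf, floc=0)  # shape (beta) and scale (eta)

t_sched = 1944.0                               # replacement interval from the paper
reliability = weibull_min.sf(t_sched, beta, loc=0, scale=eta)   # R(t) = exp(-(t/eta)**beta)
mttf = weibull_min.mean(beta, loc=0, scale=eta)
availability = mttf / (mttf + mttr_mean)       # steady-state availability

print(f"beta={beta:.2f}, eta={eta:.0f} h, R({t_sched:.0f} h)={reliability:.3f}, A={availability:.4f}")
```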
Statistical study of single and multiple pulse laser-induced damage in glasses.
Gallais, L; Natoli, J; Amra, C
2002-12-16
Single and multiple pulse laser damage studies are performed in Suprasil silica and BK-7 borosilicate glasses. Experiments are made in the bulk of the materials at 1.064 µm with nanosecond pulses, using an accurate and reliable measurement system. By means of a statistical study of laser damage probabilities, we demonstrate that the same nano-precursors could be involved in the multiple-shot and single-shot damage processes. A damage mechanism with two stages is then proposed to explain the results. First, a pre-damage process, corresponding to material changes at a microscopic level, brings the precursor to a state that can induce damage with a single pulse. Second, final damage occurs, with a mechanism identical to the single-shot case. For each material, a law is found to predict the precursor lifetime. We can then deduce the long-term life of optical elements in high-power laser systems subjected to multipulse irradiation.
Mathematical Model to estimate the wind power using four-parameter Burr distribution
NASA Astrophysics Data System (ADS)
Liu, Sanming; Wang, Zhijie; Pan, Zhaoxu
2018-03-01
When the actual probability distribution of wind speed at a given site needs to be described, the four-parameter Burr distribution is more suitable than other common distributions. This paper introduces its important properties and characteristics. The application of the four-parameter Burr distribution to wind speed prediction is also discussed, and an expression for the probability distribution of the output power of a wind turbine is derived.
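A hedged sketch of the kind of calculation implied above: with assumed (not fitted) Burr XII parameters for the wind speed and a simplified power curve, the expected turbine output is the integral of the power curve against the Burr density. The power-curve parameters are illustrative.

```python
import numpy as np
from scipy.stats import burr12
from scipy.integrate import quad

# Burr XII wind-speed model (m/s): two shape parameters plus location (0 here) and scale.
c, d, scale = 4.0, 0.8, 9.0
wind = burr12(c, d, loc=0.0, scale=scale)

def turbine_power(v, v_in=3.0, v_rated=12.0, v_out=25.0, p_rated=2.0e6):
    """Simplified power curve (W): cubic between cut-in and rated speed."""
    if v < v_in or v > v_out:
        return 0.0
    if v >= v_rated:
        return p_rated
    return p_rated * (v**3 - v_in**3) / (v_rated**3 - v_in**3)

# Expected output power = integral of the power curve against the Burr density.
mean_power, _ = quad(lambda v: turbine_power(v) * wind.pdf(v), 0.0, 40.0,
                     points=[3.0, 12.0, 25.0])
print(f"Expected power: {mean_power/1e3:.0f} kW, capacity factor {mean_power/2.0e6:.2f}")
```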
Assessment of mean annual flood damage using simple hydraulic modeling and Monte Carlo simulation
NASA Astrophysics Data System (ADS)
Oubennaceur, K.; Agili, H.; Chokmani, K.; Poulin, J.; Marceau, P.
2016-12-01
Floods are the most frequent and the most damaging natural disaster in Canada. The issue of assessing and managing the risk related to this disaster has become increasingly crucial for both local and national authorities. Brigham, a municipality located in southern Quebec Province, is one of the regions most heavily affected by this disaster because of frequent overflows of the Yamaska River, which occur two to three times per year. Since Hurricane Irene struck the region in 2011, causing considerable socio-economic damage, the implementation of mitigation measures has become a major priority for this municipality. To do this, a preliminary study to evaluate the risk to which this region is exposed is essential. Conventionally, approaches based only on the characterization of the hazard (e.g. floodplain extent, flood depth) are adopted to study the risk of flooding. In order to improve the knowledge of this risk, a Monte Carlo simulation approach combining information on the hazard with vulnerability-related aspects has been developed. This approach integrates three main components: (1) hydrologic modelling aiming to establish a probability-discharge function which associates each measured discharge with its probability of occurrence; (2) hydraulic modeling that aims to establish the relationship between the discharge and the water stage at each building; and (3) a damage study that aims to assess the building damage using damage functions. The damage is estimated according to the water depth, defined as the difference between the water level and the elevation of the building's first floor. The application of the proposed approach allows estimation of the average annual cost of flood damage to buildings. The obtained results will be useful for authorities to support their decisions on risk management and prevention against this disaster.
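A minimal sketch of how the three components can be combined by Monte Carlo simulation; every function and number below is an illustrative stand-in, not the Brigham study's data.

```python
import numpy as np

rng = np.random.default_rng(7)
n_years = 50_000

# (1) Hydrology: annual peak discharge model (illustrative lognormal, m^3/s).
discharge = rng.lognormal(mean=np.log(250.0), sigma=0.45, size=n_years)

# (2) Hydraulics: stage-discharge relation giving the water level at a building (m).
def water_level(q):
    return 0.9 * np.log(q / 100.0)             # assumed rating curve

# (3) Damage: depth-damage function relative to the first-floor elevation.
first_floor = 1.2                               # m above datum, assumed
def damage(depth):
    return np.where(depth > 0.0, np.minimum(40_000.0 * depth, 120_000.0), 0.0)  # CAD

depth = water_level(discharge) - first_floor
ead = damage(depth).mean()                      # expected (mean) annual damage for this building
print(f"Expected annual damage: {ead:,.0f} CAD")
```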
Entropy Methods For Univariate Distributions in Decision Analysis
NASA Astrophysics Data System (ADS)
Abbas, Ali E.
2003-03-01
One of the most important steps in decision analysis practice is the elicitation of the decision-maker's belief about an uncertainty of interest in the form of a representative probability distribution. However, the probability elicitation process is a task that involves many cognitive and motivational biases. Alternatively, the decision-maker may provide other information about the distribution of interest, such as its moments, and the maximum entropy method can be used to obtain a full distribution subject to the given moment constraints. In practice however, decision makers cannot readily provide moments for the distribution, and are much more comfortable providing information about the fractiles of the distribution of interest or bounds on its cumulative probabilities. In this paper we present a graphical method to determine the maximum entropy distribution between upper and lower probability bounds and provide an interpretation for the shape of the maximum entropy distribution subject to fractile constraints, (FMED). We also discuss the problems with the FMED in that it is discontinuous and flat over each fractile interval. We present a heuristic approximation to a distribution if in addition to its fractiles, we also know it is continuous and work through full examples to illustrate the approach.
Work probability distribution for a ferromagnet with long-ranged and short-ranged correlations
NASA Astrophysics Data System (ADS)
Bhattacharjee, J. K.; Kirkpatrick, T. R.; Sengers, J. V.
2018-04-01
Work fluctuations and work probability distributions are fundamentally different in systems with short-ranged versus long-ranged correlations. Specifically, in systems with long-ranged correlations the work distribution is extraordinarily broad compared to systems with short-ranged correlations. This difference profoundly affects the possible applicability of fluctuation theorems like the Jarzynski fluctuation theorem. The Heisenberg ferromagnet, well below its Curie temperature, is a system with long-ranged correlations in very low magnetic fields due to the presence of Goldstone modes. As the magnetic field is increased the correlations gradually become short ranged. Hence, such a ferromagnet is an ideal system for elucidating the changes of the work probability distribution as one goes from a domain with long-ranged correlations to a domain with short-ranged correlations by tuning the magnetic field. A quantitative analysis of this crossover behavior of the work probability distribution and the associated fluctuations is presented.
Nims, Robert J; Durney, Krista M; Cigan, Alexander D; Dusséaux, Antoine; Hung, Clark T; Ateshian, Gerard A
2016-02-06
This study presents a damage mechanics framework that employs observable state variables to describe damage in isotropic or anisotropic fibrous tissues. In this mixture theory framework, damage is tracked by the mass fraction of bonds that have broken. Anisotropic damage is subsumed in the assumption that multiple bond species may coexist in a material, each having its own damage behaviour. This approach recovers the classical damage mechanics formulation for isotropic materials, but does not appeal to a tensorial damage measure for anisotropic materials. In contrast with the classical approach, the use of observable state variables for damage allows direct comparison of model predictions to experimental damage measures, such as biochemical assays or Raman spectroscopy. Investigations of damage in discrete fibre distributions demonstrate that the resilience to damage increases with the number of fibre bundles; idealizing fibrous tissues using continuous fibre distribution models precludes the modelling of damage. This damage framework was used to test and validate the hypothesis that growth of cartilage constructs can lead to damage of the synthesized collagen matrix due to excessive swelling caused by synthesized glycosaminoglycans. Therefore, alternative strategies must be implemented in tissue engineering studies to prevent collagen damage during the growth process.
Grigore, Bogdan; Peters, Jaime; Hyde, Christopher; Stein, Ken
2013-11-01
Elicitation is a technique that can be used to obtain probability distribution from experts about unknown quantities. We conducted a methodology review of reports where probability distributions had been elicited from experts to be used in model-based health technology assessments. Databases including MEDLINE, EMBASE and the CRD database were searched from inception to April 2013. Reference lists were checked and citation mapping was also used. Studies describing their approach to the elicitation of probability distributions were included. Data was abstracted on pre-defined aspects of the elicitation technique. Reports were critically appraised on their consideration of the validity, reliability and feasibility of the elicitation exercise. Fourteen articles were included. Across these studies, the most marked features were heterogeneity in elicitation approach and failure to report key aspects of the elicitation method. The most frequently used approaches to elicitation were the histogram technique and the bisection method. Only three papers explicitly considered the validity, reliability and feasibility of the elicitation exercises. Judged by the studies identified in the review, reports of expert elicitation are insufficient in detail and this impacts on the perceived usability of expert-elicited probability distributions. In this context, the wider credibility of elicitation will only be improved by better reporting and greater standardisation of approach. Until then, the advantage of eliciting probability distributions from experts may be lost.
Computer simulation of random variables and vectors with arbitrary probability distribution laws
NASA Technical Reports Server (NTRS)
Bogdan, V. M.
1981-01-01
Assume that there is given an arbitrary n-dimensional probability distribution F. A recursive construction is found for a sequence of functions x_1 = f_1(U_1, ..., U_n), ..., x_n = f_n(U_1, ..., U_n) such that if U_1, ..., U_n are independent random variables having uniform distribution over the open interval (0,1), then the joint distribution of the variables x_1, ..., x_n coincides with the distribution F. Since uniform independent random variables can be well simulated by means of a computer, this result allows one to simulate arbitrary n-random variables if their joint probability distribution is known.
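The construction can be illustrated with the bivariate normal distribution, where the conditional quantile functions are available in closed form; this is a sketch of the idea, not the paper's general recursive algorithm.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(11)
rho = 0.7                                     # target correlation of the bivariate normal

u1, u2 = rng.uniform(size=(2, 100_000))       # independent Uniform(0,1) inputs

# x1 = f1(u1):      inverse CDF of the first marginal
# x2 = f2(u1, u2):  inverse CDF of the conditional distribution of X2 given X1 = x1
x1 = norm.ppf(u1)
x2 = rho * x1 + np.sqrt(1.0 - rho**2) * norm.ppf(u2)

print(np.corrcoef(x1, x2)[0, 1])              # ~0.7, matching the target joint distribution
```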
Nuclear risk analysis of the Ulysses mission
NASA Astrophysics Data System (ADS)
Bartram, Bart W.; Vaughan, Frank R.; Englehart, Richard W., Dr.
1991-01-01
The use of a radioisotope thermoelectric generator fueled with plutonium-238 dioxide on the Space Shuttle-launched Ulysses mission implies some level of risk due to potential accidents. This paper describes the method used to quantify risks in the Ulysses mission Final Safety Analysis Report prepared for the U.S. Department of Energy. The starting point for the analysis described herein is the input of source term probability distributions from the General Electric Company. A Monte Carlo technique is used to develop probability distributions of radiological consequences for a range of accident scenarios throughout the mission. Factors affecting radiological consequences are identified, the probability distribution of the effect of each factor determined, and the functional relationship among all the factors established. The probability distributions of all the factor effects are then combined using a Monte Carlo technique. The results of the analysis are presented in terms of complementary cumulative distribution functions (CCDF) by mission sub-phase, phase, and the overall mission. The CCDFs show the total probability that consequences (calculated health effects) would be equal to or greater than a given value.
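A toy version of this factor-combination step, with made-up factor distributions, showing how Monte Carlo samples are turned into a complementary cumulative distribution function:

```python
import numpy as np

rng = np.random.default_rng(5)
n = 200_000

# Illustrative accident-consequence model: consequences occur only if the accident
# happens, and the released fraction and dispersion factor are uncertain multipliers.
p_accident = 1.0e-3
occurs = rng.uniform(size=n) < p_accident
release_fraction = rng.beta(2.0, 8.0, size=n)            # assumed factor distributions
dispersion = rng.lognormal(mean=0.0, sigma=0.5, size=n)
consequences = np.where(occurs, 40.0 * release_fraction * dispersion, 0.0)  # health effects

# Complementary cumulative distribution function: P(consequence >= c).
levels = np.array([0.1, 1.0, 5.0, 10.0])
ccdf = [(consequences >= c).mean() for c in levels]
print(dict(zip(levels, ccdf)))
```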
DOE Office of Scientific and Technical Information (OSTI.GOV)
Howell, W.L.; Rapp, L.M.; Williams, T.P.
1982-02-01
Distribution of melanosomes across the retinal pigment epithelium of hooded rats (Long-Evans) is studied at the light microscopic and electron microscopic levels. This distribution is shown to be nonuniform: more melanosomes exist in the periphery than elsewhere and, importantly, there are very few melanosomes in a restricted area of the central portion of the superior hemisphere compared with the corresponding part of the inferior hemisphere. The region with fewest melanosomes is precisely the one that is highly susceptible to light damage. Because this region is the same in both pigmented and albino eyes, the paucity of melanin in this region is not the cause of its great sensitivity to light damage. Nor does light cause the nonuniform distribution of melanin. A possible explanation, involving a proposed vestigial tapetum, is given in order to explain the correlation of melanosome counts and sensitivity to light damage.
Force Density Function Relationships in 2-D Granular Media
NASA Technical Reports Server (NTRS)
Youngquist, Robert C.; Metzger, Philip T.; Kilts, Kelly N.
2004-01-01
An integral transform relationship is developed to convert between two important probability density functions (distributions) used in the study of contact forces in granular physics. Developing this transform has now made it possible to compare and relate various theoretical approaches with one another and with the experimental data despite the fact that one may predict the Cartesian probability density and another the force magnitude probability density. Also, the transforms identify which functional forms are relevant to describe the probability density observed in nature, and so the modified Bessel function of the second kind has been identified as the relevant form for the Cartesian probability density corresponding to exponential forms in the force magnitude distribution. Furthermore, it is shown that this transform pair supplies a sufficient mathematical framework to describe the evolution of the force magnitude distribution under shearing. Apart from the choice of several coefficients, whose evolution of values must be explained in the physics, this framework successfully reproduces the features of the distribution that are taken to be an indicator of jamming and unjamming in a granular packing. Key words. Granular Physics, Probability Density Functions, Fourier Transforms
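The exponential/Bessel pairing mentioned above can be checked numerically: sampling isotropic 2-D forces with exponentially distributed magnitudes and histogramming one Cartesian component reproduces a (lam/pi)*K0(lam*|fx|) density. This is a sketch of that check, not the paper's derivation.

```python
import numpy as np
from scipy.special import k0

rng = np.random.default_rng(2)
lam, n = 1.0, 1_000_000

# Isotropic 2-D contact forces with exponentially distributed magnitudes.
f = rng.exponential(1.0 / lam, size=n)
theta = rng.uniform(0.0, 2.0 * np.pi, size=n)
fx = f * np.cos(theta)

# The corresponding Cartesian density is (lam/pi) * K0(lam*|fx|), a modified
# Bessel function of the second kind, as identified by the transform pair.
hist, edges = np.histogram(fx, bins=200, range=(-6, 6), density=True)
centers = 0.5 * (edges[:-1] + edges[1:])
predicted = lam / np.pi * k0(lam * np.abs(centers))
print(np.max(np.abs(hist - predicted)))   # largest near fx = 0, where K0 diverges logarithmically
```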
Strand, Janne M; Scheffler, Katja; Bjørås, Magnar; Eide, Lars
2014-06-01
The cellular genomes are continuously damaged by reactive oxygen species (ROS) from aerobic processes. The impact of DNA damage depends on the specific site as well as the cellular state. The steady-state level of DNA damage is the net result of continuous formation and subsequent repair, but it is unknown to what extent heterogeneous damage distribution is caused by variations in formation or repair of DNA damage. Here, we used a restriction enzyme/qPCR based method to analyze DNA damage in promoter and coding regions of four nuclear genes: the two house-keeping genes Gadph and Tbp, and the Ndufa9 and Ndufs2 genes encoding mitochondrial complex I subunits, as well as mt-Rnr1 encoded by mitochondrial DNA (mtDNA). The distribution of steady-state levels of damage varied in a site-specific manner. Oxidative stress induced damage in nDNA to a similar extent in promoter and coding regions, and more so in mtDNA. The subsequent removal of damage from nDNA was efficient and comparable with recovery times depending on the initial damage load, while repair of mtDNA was delayed with subsequently slower repair rate. The repair was furthermore found to be independent of transcription or the transcription-coupled repair factor CSB, but dependent on cellular ATP. Our results demonstrate that the capacity to repair DNA is sufficient to remove exogenously induced damage. Thus, we conclude that the heterogeneous steady-state level of DNA damage in promoters and coding regions is caused by site-specific DNA damage/modifications that take place under normal metabolism. Copyright © 2014 Elsevier B.V. All rights reserved.
An evaluation of procedures to estimate monthly precipitation probabilities
NASA Astrophysics Data System (ADS)
Legates, David R.
1991-01-01
Many frequency distributions have been used to evaluate monthly precipitation probabilities. Eight of these distributions (including Pearson type III, extreme value, and transform normal probability density functions) are comparatively examined to determine their ability to represent accurately variations in monthly precipitation totals for global hydroclimatological analyses. Results indicate that a modified version of the Box-Cox transform-normal distribution more adequately describes the 'true' precipitation distribution than does any of the other methods. This assessment was made using a cross-validation procedure for a global network of 253 stations for which at least 100 years of monthly precipitation totals were available.
When violence pays: a cost-benefit analysis of aggressive behavior in animals and humans.
Georgiev, Alexander V; Klimczuk, Amanda C E; Traficonte, Daniel M; Maestripieri, Dario
2013-07-18
An optimization analysis of human behavior from a comparative perspective can improve our understanding of the adaptiveness of human nature. Intra-specific competition for resources provides the main selective pressure for the evolution of violent aggression toward conspecifics, and variation in the fitness benefits and costs of aggression can account for inter-specific and inter-individual differences in aggressiveness. When aggression reflects competition for resources, its benefits vary in relation to the characteristics of the resources (their intrinsic value, abundance, spatial distribution, and controllability) while its costs vary in relation to the characteristics of organisms and how they fight (which, in turn, affects the extent to which aggression entails risk of physical injury or death, energetic depletion, exposure to predation, psychological and physiological stress, or damage to social relationships). Humans are a highly aggressive species in comparison to other animals, probably as a result of an unusually high benefit-to-cost ratio for intra-specific aggression. This conclusion is supported by frequent and widespread occurrence of male-male coalitionary killing and by male-female sexual coercion. Sex differences in violent aggression in humans and other species probably evolved by sexual selection and reflect different optimal competitive strategies for males and females.
Information-theoretical noninvasive damage detection in bridge structures
NASA Astrophysics Data System (ADS)
Sudu Ambegedara, Amila; Sun, Jie; Janoyan, Kerop; Bollt, Erik
2016-11-01
Damage detection of mechanical structures such as bridges is an important research problem in civil engineering. Using spatially distributed sensor time series data collected from a recent experiment on a local bridge in Upper State New York, we study noninvasive damage detection using information-theoretical methods. Several findings are in order. First, the time series data, which represent accelerations measured at the sensors, more closely follow Laplace distribution than normal distribution, allowing us to develop parameter estimators for various information-theoretic measures such as entropy and mutual information. Second, as damage is introduced by the removal of bolts of the first diaphragm connection, the interaction between spatially nearby sensors as measured by mutual information becomes weaker, suggesting that the bridge is "loosened." Finally, using a proposed optimal mutual information interaction procedure to prune away indirect interactions, we found that the primary direction of interaction or influence aligns with the traffic direction on the bridge even after damaging the bridge.
Schmidt, Kevin M.; Ellen, Stephen D.; Haugerud, Ralph A.; Peterson, David M.; Phelps, Geoffery A.
1995-01-01
Damage to pavement and near-surface utility pipes, caused by the October 17, 1989, Loma Prieta earthquake, provides indicators for ground deformation in a 663 km2 area near the southwest margin of the Santa Clara Valley, California. The spatial distribution of 1284 sites of such damage documents the extent and distribution of detectable ground deformation. Damage was concentrated in four zones, three of which are near previously mapped faults. The zone through Los Gatos showed the highest concentration of damage, as well as evidence for pre- and post-earthquake deformation. Damage along the foot of the Santa Cruz Mountains reflected shortening that is consistent with movement along reverse faults in the region and with the hypothesis that tectonic strain is distributed widely across numerous faults in the California Coast Ranges.
q-Gaussian distributions of leverage returns, first stopping times, and default risk valuations
NASA Astrophysics Data System (ADS)
Katz, Yuri A.; Tian, Li
2013-10-01
We study the probability distributions of daily leverage returns of 520 North American industrial companies that survive de-listing during the financial crisis, 2006-2012. We provide evidence that distributions of unbiased leverage returns of all individual firms belong to the class of q-Gaussian distributions with the Tsallis entropic parameter within the interval 1
Asmeda, R; Noorlaila, A; Norziah, M H
2016-01-15
This research was conducted to investigate the effects of different grinding techniques (dry, semi-wet and wet) of milled rice grains on the damaged starch and particle size distribution of flour produced from a new variety, MR263, specifically related to the pasting and thermal profiles. The results indicated that grinding techniques significantly (p<0.05) affected starch damage content and particle size distribution of rice flour. Wet grinding process yields flour with lowest percentage of starch damage (7.37%) and finest average particle size (8.52μm). Pasting and gelatinization temperature was found in the range of 84.45-89.63°C and 59.86-75.31°C, respectively. Dry ground flour attained the lowest pasting and gelatinization temperature as shown by the thermal and pasting profiles. Correlation analysis revealed that percentage of damaged starch granules had a significant, negative relationship with pasting temperature while average particle size distribution had a significant, strong negative relationship with gelatinization temperature. Copyright © 2015 Elsevier Ltd. All rights reserved.
O'Connor, Kelly M; Rittenhouse, Chadwick D; Millspaugh, Joshua J; Rittenhouse, Tracy A G
2015-01-01
Box turtles (Terrapene carolina) are widely distributed but vulnerable to population decline across their range. Using distance sampling, morphometric data, and an index of carapace damage, we surveyed three-toed box turtles (Terrapene carolina triunguis) at 2 sites in central Missouri, and compared differences in detection probabilities when transects were walked by one or two observers. Our estimated turtle density within forested cover was lower at the Thomas S. Baskett Wildlife Research and Education Center, a site dominated by eastern hardwood forest (d = 1.85 turtles/ha, 95% CI [1.13, 3.03]), than at the Prairie Fork Conservation Area, a site containing a mix of open field and hardwood forest (d = 4.14 turtles/ha, 95% CI [1.99, 8.62]). Turtles at Baskett were significantly older and larger than turtles at Prairie Fork. Damage to the carapace did not differ significantly between the 2 populations despite the more prevalent habitat management including mowing and prescribed fire at Prairie Fork. We achieved improved estimates of density using two rather than one observer at Prairie Fork, but negligible differences in density estimates between the two methods at Baskett. Error associated with probability of detection decreased at both sites with the addition of a second observer. We provide demographic data on three-toed box turtles that suggest the use of a range of habitat conditions by this subspecies. This case study suggests that habitat management practices and their impacts on habitat composition may be a cause of the differences observed in our focal populations of turtles.
Bearing damage assessment using Jensen-Rényi Divergence based on EEMD
NASA Astrophysics Data System (ADS)
Singh, Jaskaran; Darpe, A. K.; Singh, S. P.
2017-03-01
An Ensemble Empirical Mode Decomposition (EEMD) and Jensen Rényi divergence (JRD) based methodology is proposed for the degradation assessment of rolling element bearings using vibration data. The EEMD decomposes vibration signals into a set of intrinsic mode functions (IMFs). A systematic methodology to select IMFs that are sensitive and closely related to the fault is proposed in the paper. The change in probability distribution of the energies of the sensitive IMFs is measured through JRD which acts as a damage identification parameter. Evaluation of JRD with sensitive IMFs makes it largely unaffected by change/fluctuations in operating conditions. Further, an algorithm based on Chebyshev's inequality is applied to JRD to identify exact points of change in bearing health and remove outliers. The identified change points are investigated for fault classification as possible locations where specific defect initiation could have taken place. For fault classification, two new parameters are proposed: 'α value' and Probable Fault Index, which together classify the fault. To standardize the degradation process, a Confidence Value parameter is proposed to quantify the bearing degradation value in a range of zero to unity. A simulation study is first carried out to demonstrate the robustness of the proposed JRD parameter under variable operating conditions of load and speed. The proposed methodology is then validated on experimental data (seeded defect data and accelerated bearing life test data). The first validation on two different vibration datasets (inner/outer) obtained from seeded defect experiments demonstrate the effectiveness of JRD parameter in detecting a change in health state as the severity of fault changes. The second validation is on two accelerated life tests. The results demonstrate the proposed approach as a potential tool for bearing performance degradation assessment.
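For reference, a minimal implementation of the Jensen-Rényi divergence between discrete distributions (such as normalized IMF energies); the entropic order alpha and the example energies are arbitrary choices, not values from the paper.

```python
import numpy as np

def renyi_entropy(p, alpha):
    """Renyi entropy of order alpha (alpha != 1) of a discrete distribution."""
    p = np.asarray(p, dtype=float)
    p = p / p.sum()
    return np.log(np.sum(p ** alpha)) / (1.0 - alpha)

def jensen_renyi_divergence(dists, weights, alpha=0.5):
    """JRD of several distributions: Renyi entropy of the weighted mixture
    minus the weighted mean of the individual Renyi entropies."""
    dists = [np.asarray(p, float) / np.sum(p) for p in dists]
    weights = np.asarray(weights, float) / np.sum(weights)
    mixture = sum(w * p for w, p in zip(weights, dists))
    return renyi_entropy(mixture, alpha) - sum(
        w * renyi_entropy(p, alpha) for w, p in zip(weights, dists))

# Normalized IMF energies of a baseline signal vs. a later measurement (illustrative).
baseline = [0.45, 0.30, 0.15, 0.07, 0.03]
current  = [0.25, 0.20, 0.30, 0.15, 0.10]
print(jensen_renyi_divergence([baseline, current], weights=[0.5, 0.5]))
```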
Greenwood, J. Arthur; Landwehr, J. Maciunas; Matalas, N.C.; Wallis, J.R.
1979-01-01
Distributions whose inverse forms are explicitly defined, such as Tukey's lambda, may present problems in deriving their parameters by more conventional means. Probability weighted moments are introduced and shown to be potentially useful in expressing the parameters of these distributions.
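A short sketch of the unbiased sample estimators b_r of the probability weighted moments beta_r = E[X F(X)^r], together with the first two L-moments derived from them; the Gumbel sample is used only to check the estimates and is not from the paper.

```python
import numpy as np

def prob_weighted_moments(x, r_max=3):
    """Unbiased sample estimators b_r of beta_r = E[X F(X)^r], r = 0..r_max,
    computed from the ordered sample."""
    x = np.sort(np.asarray(x, dtype=float))
    n = x.size
    j = np.arange(1, n + 1)
    b = []
    for r in range(r_max + 1):
        w = np.ones(n)
        for i in range(r):                        # (j-1)(j-2)...(j-r) / ((n-1)...(n-r))
            w *= (j - 1 - i) / (n - 1 - i)
        b.append(np.mean(w * x))
    return np.array(b)

x = np.random.default_rng(9).gumbel(loc=10.0, scale=2.0, size=500)
b0, b1, b2, b3 = prob_weighted_moments(x)
l1, l2 = b0, 2 * b1 - b0                          # first two L-moments
print(f"L-mean={l1:.2f}, L-scale={l2:.2f}")       # Gumbel: l1 = loc + 0.5772*scale, l2 = scale*ln(2)
```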
Univariate Probability Distributions
ERIC Educational Resources Information Center
Leemis, Lawrence M.; Luckett, Daniel J.; Powell, Austin G.; Vermeer, Peter E.
2012-01-01
We describe a web-based interactive graphic that can be used as a resource in introductory classes in mathematical statistics. This interactive graphic presents 76 common univariate distributions and gives details on (a) various features of the distribution such as the functional form of the probability density function and cumulative distribution…
A probability space for quantum models
NASA Astrophysics Data System (ADS)
Lemmens, L. F.
2017-06-01
A probability space contains a set of outcomes, a collection of events formed by subsets of the set of outcomes, and probabilities defined for all events. A reformulation in terms of propositions allows the maximum entropy method to be used to assign the probabilities, taking some constraints into account. The construction of a probability space for quantum models is determined by the choice of propositions, the choice of constraints, and the probability assignment by the maximum entropy method. This approach shows how typical quantum distributions such as Maxwell-Boltzmann, Fermi-Dirac and Bose-Einstein are partly related to well-known classical distributions. The relation between the conditional probability density, given some averages as constraints, and the appropriate ensemble is elucidated.
Hurricane Hugo: South Carolina Forest Land Research and Management Related to the Storm
Jacqueline L. Haymond; William R. Harms; [Editors]
1996-01-01
Hurricane Hugo was probably one of the most destructive hurricanes to assault the forests of the Eastern United States in recorded history. Four and one-half million acres were damaged in North Carolina and South Carolina, an estimated 21.4 billion board feet of timber were destroyed or damaged, and several federally listed endangered species (red-cockaded woodpecker,...
Qiu, Lei; Yuan, Shenfang; Mei, Hanfei; Fang, Fang
2016-02-26
Structural Health Monitoring (SHM) technology is considered to be a key technology to reduce the maintenance cost and meanwhile ensure the operational safety of aircraft structures. It has gradually developed from theoretic and fundamental research to real-world engineering applications in recent decades. The problem of reliable damage monitoring under time-varying conditions is a main issue for the aerospace engineering applications of SHM technology. Among the existing SHM methods, Guided Wave (GW) and piezoelectric sensor-based SHM technique is a promising method due to its high damage sensitivity and long monitoring range. Nevertheless the reliability problem should be addressed. Several methods including environmental parameter compensation, baseline signal dependency reduction and data normalization, have been well studied but limitations remain. This paper proposes a damage propagation monitoring method based on an improved Gaussian Mixture Model (GMM). It can be used on-line without any structural mechanical model and a priori knowledge of damage and time-varying conditions. With this method, a baseline GMM is constructed first based on the GW features obtained under time-varying conditions when the structure under monitoring is in the healthy state. When a new GW feature is obtained during the on-line damage monitoring process, the GMM can be updated by an adaptive migration mechanism including dynamic learning and Gaussian components split-merge. The mixture probability distribution structure of the GMM and the number of Gaussian components can be optimized adaptively. Then an on-line GMM can be obtained. Finally, a best match based Kullback-Leibler (KL) divergence is studied to measure the migration degree between the baseline GMM and the on-line GMM to reveal the weak cumulative changes of the damage propagation mixed in the time-varying influence. A wing spar of an aircraft is used to validate the proposed method. The results indicate that the crack propagation under changing structural boundary conditions can be monitored reliably. The method is not limited by the properties of the structure, and thus it is feasible to be applied to composite structure.
Three-Dimensional Geometric Nonlinear Contact Stress Analysis of Riveted Joints
NASA Technical Reports Server (NTRS)
Shivakumar, Kunigal N.; Ramanujapuram, Vivek
1998-01-01
The problems associated with fatigue were brought into the forefront of research by the explosive decompression and structural failure of the Aloha Airlines Flight 243 in 1988. The structural failure of this airplane has been attributed to debonding and multiple cracking along the longitudinal lap splice riveted joint in the fuselage. This crash created what may be termed as a minor "Structural Integrity Revolution" in the commercial transport industry. Major steps have been taken by the manufacturers, operators and authorities to improve the structural airworthiness of the aging fleet of airplanes. Notwithstanding, this considerable effort there are still outstanding issues and concerns related to the formulation of Widespread Fatigue Damage which is believed to have been a contributing factor in the probable cause of the Aloha accident. The lesson from this accident was that Multiple-Site Damage (MSD) in "aging" aircraft can lead to extensive aircraft damage. A strong candidate in which MSD is highly probable to occur is the riveted lap joint.
NASA Astrophysics Data System (ADS)
Khanmohammadi, Neda; Rezaie, Hossein; Montaseri, Majid; Behmanesh, Javad
2017-10-01
The reference evapotranspiration (ET0) plays an important role in water management plans in arid or semi-arid countries such as Iran. For this reason, the regional analysis of this parameter is important. However, the ET0 process is affected by several meteorological parameters such as wind speed, solar radiation, temperature and relative humidity. Therefore, the effect of the distribution type of these influential meteorological variables on the ET0 distribution was analyzed. For this purpose, the regional probability distributions of the annual ET0 and its influential parameters were selected. The data used in this research were recorded at 30 synoptic stations in Iran during 1960-2014. Using the probability plot correlation coefficient (PPCC) test and the L-moment method, five common distributions were compared and the best distribution was selected. The results of the PPCC test and L-moment diagram indicated that the Pearson type III distribution was the best probability distribution for fitting annual ET0 and its four influential parameters. The RMSE results showed that the ability of the PPCC test and L-moment method for regional analysis of reference evapotranspiration and its influential parameters was similar. The results also showed that the distribution type of the parameters which affect ET0 values can affect the distribution of reference evapotranspiration.
Building a database for statistical characterization of ELMs on DIII-D
NASA Astrophysics Data System (ADS)
Fritch, B. J.; Marinoni, A.; Bortolon, A.
2017-10-01
Edge localized modes (ELMs) are bursty instabilities which occur in the edge region of H-mode plasmas and have the potential to damage in-vessel components of future fusion machines by exposing the divertor region to large energy and particle fluxes during each ELM event. While most ELM studies focus on average quantities (e.g. energy loss per ELM), this work investigates the statistical distributions of ELM characteristics, as a function of plasma parameters. A semi-automatic algorithm is being used to create a database documenting trigger times of the tens of thousands of ELMs for DIII-D discharges in scenarios relevant to ITER, thus allowing statistically significant analysis. Probability distributions of inter-ELM periods and energy losses will be determined and related to relevant plasma parameters such as density, stored energy, and current in order to constrain models and improve estimates of the expected inter-ELM periods and sizes, both of which must be controlled in future reactors. Work supported in part by US DoE under the Science Undergraduate Laboratory Internships (SULI) program, DE-FC02-04ER54698 and DE-FG02- 94ER54235.
Probabilistic reasoning in data analysis.
Sirovich, Lawrence
2011-09-20
This Teaching Resource provides lecture notes, slides, and a student assignment for a lecture on probabilistic reasoning in the analysis of biological data. General probabilistic frameworks are introduced, and a number of standard probability distributions are described using simple intuitive ideas. Particular attention is focused on random arrivals that are independent of prior history (Markovian events), with an emphasis on waiting times, Poisson processes, and Poisson probability distributions. The use of these various probability distributions is applied to biomedical problems, including several classic experimental studies.
Multiobjective fuzzy stochastic linear programming problems with inexact probability distribution
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hamadameen, Abdulqader Othman; Zainuddin, Zaitul Marlizawati
This study deals with multiobjective fuzzy stochastic linear programming problems with an uncertain probability distribution that is defined through fuzzy assertions by ambiguous experts. The problem formulation is presented, and two solution strategies are described: a fuzzy transformation via a ranking function, and a stochastic transformation in which the α-cut technique and linguistic hedges are applied to the uncertain probability distribution. A development of Sen's method is employed to find a compromise solution, supported by an illustrative numerical example.
Work probability distribution and tossing a biased coin
NASA Astrophysics Data System (ADS)
Saha, Arnab; Bhattacharjee, Jayanta K.; Chakraborty, Sagar
2011-01-01
We show that the rare events present in dissipated work that enters Jarzynski equality, when mapped appropriately to the phenomenon of large deviations found in a biased coin toss, are enough to yield a quantitative work probability distribution for the Jarzynski equality. This allows us to propose a recipe for constructing work probability distribution independent of the details of any relevant system. The underlying framework, developed herein, is expected to be of use in modeling other physical phenomena where rare events play an important role.
Ditching Tests of a 1/18-Scale Model of the Lockheed Constellation Airplane
NASA Technical Reports Server (NTRS)
Fisher, Lloyd J.; Morris, Garland J.
1948-01-01
Tests were made of a 1/18-scale dynamically similar model of the Lockheed Constellation airplane to investigate its ditching characteristics and proper ditching technique. Scale-strength bottoms were used to reproduce probable damage to the fuselage. The model was landed in calm water at the Langley tank no. 2 monorail. Various landing attitudes, speeds, and fuselage configurations were simulated. The behavior of the model was determined from visual observations, by recording the longitudinal decelerations, and by taking motion pictures of the ditchings. Data are presented in tabular form, sequence photographs, and time-history deceleration curves. It was concluded that the airplane should be ditched at a medium nose-high landing attitude with the landing flaps full down. The airplane will probably make a deep run with heavy spray and may even dive slightly. The fuselage will be damaged and will leak substantially, but in calm water it probably will not flood rapidly. Maximum longitudinal decelerations in a calm-water ditching will be about 4g.
Hybrid computer technique yields random signal probability distributions
NASA Technical Reports Server (NTRS)
Cameron, W. D.
1965-01-01
Hybrid computer determines the probability distributions of instantaneous and peak amplitudes of random signals. This combined digital and analog computer system reduces the errors and delays of manual data analysis.
NASA Astrophysics Data System (ADS)
Chen, Fan; Huang, Shaoxiong; Ding, Jinjin; Gao, Bo; Xie, Yuguang; Wang, Xiaoming
2018-01-01
This paper proposes a fast reliability assessment method for distribution grids with distributed renewable energy generation. First, the Weibull distribution and the Beta distribution are used to describe the probability distributions of wind speed and solar irradiance, respectively, and models of the wind farm, solar park and local load are built for reliability assessment. Then, based on probabilistic production cost simulation with probability discretization and linearized power flow, an optimal power flow problem whose objective is to minimize the cost of conventional generation is solved. In this way, a fast and accurate reliability assessment of the distribution grid is obtained. The Loss Of Load Probability (LOLP) and Expected Energy Not Supplied (EENS) are selected as reliability indices; a MATLAB simulation of the IEEE RBTS BUS6 system indicates that the fast method computes these indices much faster than the Monte Carlo method while maintaining accuracy.
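For context, the Monte Carlo baseline against which such fast methods are compared can be sketched in a few lines. The snippet below is illustrative only: the Weibull and Beta parameters, power curve, outage rate and load model are assumptions, not values from the paper.

```python
# Minimal Monte Carlo baseline: Weibull wind, Beta solar, LOLP and EENS indices.
import numpy as np

rng = np.random.default_rng(1)
n = 100_000                                   # sampled hours

# Wind: Weibull(shape k, scale c) speed -> simple cubic power curve, capped at rated power.
k, c = 2.0, 8.0                               # m/s (assumed)
v = c * rng.weibull(k, n)
p_wind = np.clip(2.0 * (v / 12.0) ** 3, 0.0, 2.0)   # MW, rated 2 MW at 12 m/s

# Solar: Beta-distributed clearness index scaled to 1.5 MW of PV capacity.
p_solar = 1.5 * rng.beta(2.5, 2.0, n)

# Conventional supply with random outages, plus a fluctuating local load.
p_conv = 5.0 * rng.binomial(1, 0.98, n)       # 2% forced-outage rate
load = rng.normal(6.0, 0.8, n)                # MW

shortfall = np.maximum(load - (p_wind + p_solar + p_conv), 0.0)
lolp = np.mean(shortfall > 0)                 # Loss Of Load Probability
eens = shortfall.sum()                        # energy not supplied (MWh) over the n sampled hours
print(f"LOLP = {lolp:.4f}, EENS = {eens:.1f} MWh over {n} sampled hours")
```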
40 CFR Appendix C to Part 191 - Guidance for Implementation of Subpart B
Code of Federal Regulations, 2014 CFR
2014-07-01
... that the remaining probability distribution of cumulative releases would not be significantly changed... with § 191.13 into a “complementary cumulative distribution function” that indicates the probability of... distribution function for each disposal system considered. The Agency assumes that a disposal system can be...
40 CFR Appendix C to Part 191 - Guidance for Implementation of Subpart B
Code of Federal Regulations, 2011 CFR
2011-07-01
... that the remaining probability distribution of cumulative releases would not be significantly changed... with § 191.13 into a “complementary cumulative distribution function” that indicates the probability of... distribution function for each disposal system considered. The Agency assumes that a disposal system can be...
Walter G. Thies; Douglas J. Westlind; Mark Loewen; Greg Brenner
2008-01-01
The Malheur model for fire-caused delayed mortality is presented as an easily interpreted graph (mortality-probability calculator) as part of a one-page field guide that allows the user to determine postfire probability of mortality for ponderosa pine (Pinus ponderosa Dougl. ex Laws.). Following both prescribed burns and wildfires, managers need...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Joerissen, G.; Zuend, H.
From international nuclear industries fair; Basel, Switzerland (16 Oct 1972). The probability and the consequences of an aircraft crash on a nuclear power plant incorporating a light water reactor are estimated, considering the probabilities of an aircraft strike, missile penetration through walls, and damage to structures and systems important for safety. The estimated risks are presented in a Farmer diagram and compared with tolerable risk limits. (6 references) (auth)
Federal Register 2010, 2011, 2012, 2013, 2014
2010-07-26
... probability of any accident. For the accidents that involve damage or melting of the fuel in the reactor core..., the probability of an accident will not be affected. For the accidents in which core remains intact... event of a serious accident, but because the radionuclides contributing most to the dose are short-lived...
Mycosporine-Like Amino Acids: Relevant Secondary Metabolites. Chemical and Ecological Aspects
Carreto, Jose I.; Carignan, Mario O.
2011-01-01
Taxonomically diverse marine, freshwater and terrestrial organisms have evolved the capacity to synthesize, accumulate and metabolize a variety of UV-absorbing substances called mycosporine-like amino acids (MAAs) as part of an overall strategy to diminish the direct and indirect damaging effects of environmental ultraviolet radiation (UVR). Whereas the enzymatic machinery to synthesize MAAs was probably inherited from cyanobacterial ancestors via the endosymbiont hypothesis, metazoans lack this biochemical pathway but can acquire and metabolize these compounds through trophic transfer or symbiotic or bacterial association. In this review we describe the structure and physicochemical properties of MAAs, including the recently discovered compounds and the modern methods used for their isolation and identification, updating previous reviews. On this basis, we review the metabolism and distribution of this unique class of metabolites among marine organisms. PMID:21556168
Viana, Duarte S; Santamaría, Luis; Figuerola, Jordi
2016-02-01
Propagule retention time is a key factor in determining propagule dispersal distance and the shape of "seed shadows". Propagules dispersed by animal vectors are either ingested and retained in the gut until defecation or attached externally to the body until detachment. Retention time is a continuous variable, but it is commonly measured at discrete time points, according to pre-established sampling time-intervals. Although parametric continuous distributions have been widely fitted to these interval-censored data, the performance of different fitting methods has not been evaluated. To investigate the performance of five different fitting methods, we fitted parametric probability distributions to typical discretized retention-time data with known distribution using as data-points either the lower, mid or upper bounds of sampling intervals, as well as the cumulative distribution of observed values (using either maximum likelihood or non-linear least squares for parameter estimation); then compared the estimated and original distributions to assess the accuracy of each method. We also assessed the robustness of these methods to variations in the sampling procedure (sample size and length of sampling time-intervals). Fittings to the cumulative distribution performed better for all types of parametric distributions (lognormal, gamma and Weibull distributions) and were more robust to variations in sample size and sampling time-intervals. These estimated distributions had negligible deviations of up to 0.045 in cumulative probability of retention times (according to the Kolmogorov-Smirnov statistic) in relation to original distributions from which propagule retention time was simulated, supporting the overall accuracy of this fitting method. In contrast, fitting the sampling-interval bounds resulted in greater deviations that ranged from 0.058 to 0.273 in cumulative probability of retention times, which may introduce considerable biases in parameter estimates. We recommend the use of cumulative probability to fit parametric probability distributions to propagule retention time, specifically using maximum likelihood for parameter estimation. Furthermore, the experimental design for an optimal characterization of unimodal propagule retention time should contemplate at least 500 recovered propagules and sampling time-intervals not larger than the time peak of propagule retrieval, except in the tail of the distribution where broader sampling time-intervals may also produce accurate fits.
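The fitting approach recommended above can be written compactly as interval-censored maximum likelihood on the cumulative distribution. The sketch below assumes SciPy and uses invented interval bounds and counts; it is a minimal illustration, not the authors' analysis code (which also covered gamma and Weibull fits and least-squares estimation).

```python
# Interval-censored ML fit of a lognormal to discretized retention-time data (made-up numbers).
import numpy as np
from scipy import stats, optimize

# Sampling intervals (hours) and number of propagules recovered in each interval.
lower = np.array([0.0, 1.0, 2.0, 4.0, 8.0, 12.0])
upper = np.array([1.0, 2.0, 4.0, 8.0, 12.0, 24.0])
counts = np.array([60, 180, 150, 70, 25, 15])

def neg_loglik(theta):
    # Lognormal parameterised by log-scale mu and shape sigma.
    mu, log_sigma = theta
    sigma = np.exp(log_sigma)
    p = stats.lognorm.cdf(upper, s=sigma, scale=np.exp(mu)) \
        - stats.lognorm.cdf(lower, s=sigma, scale=np.exp(mu))
    return -np.sum(counts * np.log(np.clip(p, 1e-300, None)))

res = optimize.minimize(neg_loglik, x0=np.array([1.0, 0.0]), method="Nelder-Mead")
mu_hat, sigma_hat = res.x[0], np.exp(res.x[1])
print("fitted lognormal: mu =", round(mu_hat, 3), "sigma =", round(sigma_hat, 3))
```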
How Can Histograms Be Useful for Introducing Continuous Probability Distributions?
ERIC Educational Resources Information Center
Derouet, Charlotte; Parzysz, Bernard
2016-01-01
The teaching of probability has changed a great deal since the end of the last century. The development of technologies is indeed part of this evolution. In France, continuous probability distributions began to be studied in 2002 by scientific 12th graders, but this subject was marginal and appeared only as an application of integral calculus.…
ERIC Educational Resources Information Center
Moses, Tim; Oh, Hyeonjoo J.
2009-01-01
Pseudo Bayes probability estimates are weighted averages of raw and modeled probabilities; these estimates have been studied primarily in nonpsychometric contexts. The purpose of this study was to evaluate pseudo Bayes probability estimates as applied to the estimation of psychometric test score distributions and chained equipercentile equating…
Artificial Boundary Conditions for Finite Element Model Update and Damage Detection
2017-03-01
Master's thesis by Emmanouil Damanakis, March 2017; thesis advisor: Joshua H. Gordis (report documentation page). Approved for public release; distribution is unlimited. Abstract (truncated): In structural engineering, a finite element model is often...
Failure probability under parameter uncertainty.
Gerrard, R; Tsanakas, A
2011-05-01
In many problems of risk analysis, failure is equivalent to the event of a random risk factor exceeding a given threshold. Failure probabilities can be controlled if a decision-maker is able to set the threshold at an appropriate level. This abstract situation applies, for example, to environmental risks with infrastructure controls; to supply chain risks with inventory controls; and to insurance solvency risks with capital controls. However, uncertainty around the distribution of the risk factor implies that parameter error will be present and the measures taken to control failure probabilities may not be effective. We show that parameter uncertainty increases the probability (understood as expected frequency) of failures. For a large class of loss distributions, arising from increasing transformations of location-scale families (including the log-normal, Weibull, and Pareto distributions), the article shows that failure probabilities can be exactly calculated, as they are independent of the true (but unknown) parameters. Hence it is possible to obtain an explicit measure of the effect of parameter uncertainty on failure probability. Failure probability can be controlled in two different ways: (1) by reducing the nominal required failure probability, depending on the size of the available data set, and (2) by modifying the distribution itself that is used to calculate the risk control. Approach (1) corresponds to a frequentist/regulatory view of probability, while approach (2) is consistent with a Bayesian/personalistic view. We furthermore show that the two approaches are consistent in achieving the required failure probability. Finally, we briefly discuss the effects of data pooling and its systemic risk implications. © 2010 Society for Risk Analysis.
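A toy simulation of the effect described above (the distribution, sample size and nominal level are assumptions, not the authors' examples): when the threshold is set from estimated lognormal parameters, the realised failure frequency exceeds the nominal 1% target.

```python
# Toy illustration: parameter uncertainty inflates the expected failure frequency.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
eps, n, trials = 0.01, 30, 20_000              # nominal failure prob., sample size, repetitions
mu_true, sigma_true = 0.0, 1.0

exceed = 0.0
for _ in range(trials):
    sample = rng.lognormal(mu_true, sigma_true, n)
    mu_hat = np.mean(np.log(sample))            # plug-in parameter estimates
    sigma_hat = np.std(np.log(sample), ddof=1)
    threshold = stats.lognorm.ppf(1 - eps, s=sigma_hat, scale=np.exp(mu_hat))
    # True probability that a new realisation of the risk factor exceeds the chosen threshold.
    exceed += stats.lognorm.sf(threshold, s=sigma_true, scale=np.exp(mu_true))

print(f"nominal failure probability: {eps:.3f}")
print(f"expected failure frequency under parameter uncertainty: {exceed / trials:.4f}")
```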
NASA Astrophysics Data System (ADS)
Wilks, Daniel S.
1993-10-01
Performance of 8 three-parameter probability distributions for representing annual extreme and partial duration precipitation data at stations in the northeastern and southeastern United States is investigated. Particular attention is paid to fidelity on the right tail, through use of a bootstrap procedure simulating extrapolation on the right tail beyond the data. It is found that the beta-κ distribution best describes the extreme right tail of annual extreme series, and the beta-P distribution is best for the partial duration data. The conventionally employed two-parameter Gumbel distribution is found to substantially underestimate probabilities associated with the larger precipitation amounts for both annual extreme and partial duration data. Fitting the distributions using left-censored data did not result in improved fits to the right tail.
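The Gumbel-underestimation point can be illustrated with a short SciPy fit on synthetic annual maxima. This is only a sketch under assumed parameters, not the paper's bootstrap procedure or its beta-κ/beta-P fits; when the data are genuinely heavy-tailed, the two-parameter Gumbel fit gives lower extreme quantiles than a three-parameter fit.

```python
# Compare Gumbel and GEV tail quantiles fitted to synthetic heavy-tailed annual maxima.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
# Stand-in "annual maximum precipitation" series drawn from a heavy-tailed GEV.
annual_max = stats.genextreme.rvs(c=-0.2, loc=60.0, scale=15.0, size=60, random_state=rng)

gum = stats.gumbel_r.fit(annual_max)            # (loc, scale)
gev = stats.genextreme.fit(annual_max)          # (shape, loc, scale)

for T in (50, 100, 500):                        # return periods in years
    q = 1 - 1.0 / T
    print(f"{T:>4}-yr level  Gumbel: {stats.gumbel_r.ppf(q, *gum):7.1f}   "
          f"GEV: {stats.genextreme.ppf(q, *gev):7.1f}")
```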
Rubin, Ilan N; Ellner, Stephen P; Kessler, André; Morrell, Kimberly A
2015-09-01
1. Plant induced resistance to herbivory affects the spatial distribution of herbivores, as well as their performance. In recent years, theories regarding the benefit to plants of induced resistance have shifted from ideas of optimal resource allocation towards a more eclectic set of theories that consider spatial and temporal plant variability and the spatial distribution of herbivores among plants. However, consensus is lacking on whether induced resistance causes increased herbivore aggregation or increased evenness, as both trends have been experimentally documented. 2. We created a spatial individual-based model that can describe many plant-herbivore systems with induced resistance, in order to analyse how different aspects of induced resistance might affect herbivore distribution, and the total damage to a plant population, during a growing season. 3. We analyse the specific effects on herbivore aggregation of informed herbivore movement (preferential movement to less-damaged plants) and of information transfer between plants about herbivore attacks, in order to identify mechanisms driving both aggregation and evenness. We also investigate how the resulting herbivore distributions affect the total damage to plants and aggregation of damage. 4. Even, random and aggregated herbivore distributions can all occur in our model with induced resistance. Highest levels of aggregation occurred in the models with informed herbivore movement, and the most even distributions occurred when the average number of herbivores per plant was low. With constitutive resistance, only random distributions occur. Damage to plants was spatially correlated, unless plants recover very quickly from damage; herbivore spatial autocorrelation was always weak. 5. Our model and results provide a simple explanation for the apparent conflict between experimental results, indicating that both increased aggregation and increased evenness of herbivores can result from induced resistance. We demonstrate that information transfer from plants to herbivores, and from plants to neighbouring plants, can both be major factors in determining non-random herbivore distributions. © 2015 The Authors. Journal of Animal Ecology © 2015 British Ecological Society.
A framework for global river flood risk assessments
NASA Astrophysics Data System (ADS)
Winsemius, H. C.; Van Beek, L. P. H.; Jongman, B.; Ward, P. J.; Bouwman, A.
2012-08-01
There is an increasing need for strategic global assessments of flood risks in current and future conditions. In this paper, we propose a framework for global flood risk assessment for river floods, which can be applied in current conditions, as well as in future conditions due to climate and socio-economic changes. The framework's goal is to establish flood hazard and impact estimates at a high enough resolution to allow for their combination into a risk estimate. The framework estimates hazard at high resolution (~1 km2) using global forcing datasets of the current (or in scenario mode, future) climate, a global hydrological model, a global flood routing model, and importantly, a flood extent downscaling routine. The second component of the framework combines hazard with flood impact models at the same resolution (e.g. damage, affected GDP, and affected population) to establish indicators for flood risk (e.g. annual expected damage, affected GDP, and affected population). The framework has been applied using the global hydrological model PCR-GLOBWB, which includes an optional global flood routing model DynRout, combined with scenarios from the Integrated Model to Assess the Global Environment (IMAGE). We performed downscaling of the hazard probability distributions to 1 km2 resolution with a new downscaling algorithm, applied on Bangladesh as a first case-study application area. We demonstrate the risk assessment approach in Bangladesh based on GDP per capita data, population, and land use maps for 2010 and 2050. Validation of the hazard and damage estimates has been performed using the Dartmouth Flood Observatory database and damage estimates from the EM-DAT database and World Bank sources. We discuss and show sensitivities of the estimated risks with regard to the use of different climate input sets, decisions made in the downscaling algorithm, and different approaches to establish impact models.
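The final combination step, turning a hazard probability distribution and a damage model into an annual expected damage, reduces to integrating damage over exceedance probability. The sketch below uses invented return periods and damage values and is not part of the PCR-GLOBWB/DynRout framework itself.

```python
# Expected annual damage as the integral of damage over annual exceedance probability.
import numpy as np

# Flood damage (million USD) estimated for a set of return periods at one location (assumed).
return_periods = np.array([2, 5, 10, 25, 50, 100, 250])
damage = np.array([0.0, 1.5, 4.0, 9.0, 14.0, 20.0, 28.0])

exceed_prob = 1.0 / return_periods              # annual exceedance probabilities
order = np.argsort(exceed_prob)                 # integrate over increasing probability
d, p = damage[order], exceed_prob[order]
ead = np.sum(0.5 * (d[1:] + d[:-1]) * np.diff(p))   # trapezoidal rule
print(f"expected annual damage ~ {ead:.2f} million USD")
```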
Zitnay, Jared L; Li, Yang; Qin, Zhao; San, Boi Hoa; Depalle, Baptiste; Reese, Shawn P; Buehler, Markus J; Yu, S Michael; Weiss, Jeffrey A
2017-03-22
Mechanical injury to connective tissue causes changes in collagen structure and material behaviour, but the role and mechanisms of molecular damage have not been established. In the case of mechanical subfailure damage, no apparent macroscale damage can be detected, yet this damage initiates and potentiates in pathological processes. Here, we utilize collagen hybridizing peptide (CHP), which binds unfolded collagen by triple helix formation, to detect molecular level subfailure damage to collagen in mechanically stretched rat tail tendon fascicle. Our results directly reveal that collagen triple helix unfolding occurs during tensile loading of collagenous tissues and thus is an important damage mechanism. Steered molecular dynamics simulations suggest that a likely mechanism for triple helix unfolding is intermolecular shearing of collagen α-chains. Our results elucidate a probable molecular failure mechanism associated with subfailure injuries, and demonstrate the potential of CHP targeting for diagnosis, treatment and monitoring of tissue disease and injury.
Lightning damage to a general aviation aircraft: Description and analysis
NASA Technical Reports Server (NTRS)
Hacker, P. T.
1974-01-01
The damage sustained by a Beechcraft King Air Model B90 aircraft from a single lightning discharge is presented and analyzed. The incident occurred during a landing approach at Jackson, Michigan, on Feb. 19, 1971. In addition to the usual melted-metal damage at the lightning attachment points, there was severe implosion-type damage over a large area on the lower right side of the aircraft and impact- and crushing-type damage on the upper and lower surfaces of the left wingtip near the trailing edge. Analyses indicate that the implosion-type damage was probably caused by lightning-generated shock waves, that the impact- and crushing-type damage was caused by magnetic forces, and that the lightning discharge was a multiple strike with at least 11 strokes separated in time by about 4.5 milliseconds. The evidence indicates that the lightning discharge was rather different from the average in character and severity.
The impact of tort reform on intensity of treatment: evidence from heart patients.
Avraham, Ronen; Schanzenbach, Max
2015-01-01
This paper analyzes the effect of non-economic damage caps on the treatment intensity of heart attack victims. We focus on whether a patient receives a major intervention in the form of either a coronary artery by-pass or angioplasty. We find strong evidence that treatment intensity declines after a cap on non-economic damages. The probability of receiving a major intervention in the form of either an angioplasty or bypass declines by 1.25-2 percentage points after non-economic damage caps are enacted, and this effect is larger a year or two after reform. However, we also find clear evidence of substitution between major interventions. When doctors have discretion to perform a by-pass and patients have insurance coverage, caps on non-economic damages increase the probability that a by-pass is performed. The effect of non-economic damage caps on costs is not always statistically significant, but in models with state-specific trends, total costs decline by as much as four percent. We conclude that tort reform reduces treatment intensity overall, even though it changes the mix of treatments. Using the Center for Disease Control's Vital Statistics data, we find that tort reform is not associated with an increase in mortality from coronary heart disease; if anything, mortality declines. Copyright © 2014 Elsevier B.V. All rights reserved.
Proctor, Michael C F
2003-05-01
Chlorophyll-fluorescence and infrared gas analyser measurements show saturation of photosynthetic electron flow and CO(2) uptake at generally lower irradiances in Hymenophyllum tunbrigense than in H. wilsonii, but with wide variation in both species (63-189 micromol m(-2) s(-1) PPFD in H. tunbrigense, 129-552 micromol m(-2) s(-1) PPFD in H. wilsonii), probably related to both site and season. Non-photochemical quenching (at 400 micromol m(-2) s(-1) PPFD) ranged from 2.1 to 8.1, with no significant difference between the species. Pressure-volume curves from thermocouple-psychrometer measurements give full-turgor osmotic potentials of approx. -1.4 MPa in both species, and indicate low apoplast fractions and high cell-wall elastic moduli. Leaves of H. tunbrigense recovered within 24 h from up to 7 d desiccation at water potentials ranging from -40 MPa (74 % relative humidity, RH) to -220 MPa (20 % RH); after 15 or 30 d, desiccation recovery was slower and less complete, and leaves were severely damaged at the highest and lowest humidities. Hymenophyllum wilsonii recovered well from up to 30 d desiccation at -114 and -220 MPa, but at -40 MPa it showed signs of damage after 15 d, and was severely damaged or killed after 30 d. Results are discussed in relation to the ecological and geographical distributions of the two species, and to the adaptive strategies of filmy ferns in general.
Tracking down the links between charged particles and biological response: A UK perspective
NASA Astrophysics Data System (ADS)
Hill, Mark A.
2013-07-01
The UK has a long history of radiobiology research into charged particles, with interest likely to expand in the coming years following the recent government announcement of £250 million to build two proton beam therapy facilities in the UK. A brief overview of research and facilities, past and present, with respect to radiation protection and oncology, along with biological consequences and underlying mechanisms, will be presented and discussed. Increased knowledge of the mechanisms underpinning radiation action on biological systems is important in understanding not only the risks associated with exposure, but also how to optimise radiotherapy treatment of cancer. Ionizing radiation always deposits energy in the form of structured tracks, a unique characteristic of ionizing radiation alone, producing damage grossly different from, and far more biologically effective than, endogenous damage. The track structure is the prime determinant of biological response to DNA, with charged particles of increasing LET leading to an increase in the frequency and complexity of clustered DNA damage. High-LET particles also produce a non-homogeneous dose distribution through a cell nucleus, resulting in correlated DNA breaks along the path of the particle and an increase in the probability of complex chromosomal rearrangements. However, it is now well established that there is a variety of phenomena that do not conform to the conventional paradigm of targeted radiobiology, and there is insufficient evidence to assess the implications of these non-targeted effects for radiotherapy or their relevance to risk for human health.
Application of oil spill model to marine pollution and risk control problems
NASA Astrophysics Data System (ADS)
Aseev, Nikita; Agoshkov, Valery; Sheloput, Tatyana
2017-04-01
Oil transportation by sea induces challenging problems of environmental control. Millions of tonnes of oil are released yearly during routine ship operations, not to mention vast spills due to various accidents (e.g. tanker collisions, groundings, etc.). Oil pollution is dangerous to marine organisms such as plants, fish and mammals, leading to widespread damage to our planet. In turn, fishery and travel agencies can lose money and clients, and ship operators are obliged to pay huge penalties for environmental pollution. In this work we present a method of assessing oil pollution of the marine environment using a recently developed oil spill model. The model describes the basic processes of oil slick evolution: transport due to currents, drift under the action of wind, spreading on the surface, evaporation, emulsification and dispersion. Such parameters as slick location, mass, density of oil, water content, viscosity and density of the "water-in-oil" emulsion can be calculated. We demonstrate how to apply the model to damage calculation problems using the concept of average damage to a particular marine area. We also formulate the problem of oil spill risk control, in which some accident parameters are not known but their probability distribution is given. We propose a new algorithm to solve such problems and show results of our model simulations. The work should be of interest to the broader environmental, physics and mathematics communities. The work is supported by Russian Foundation for Basic Research grant 16-31-00510.
The estimation of tree posterior probabilities using conditional clade probability distributions.
Larget, Bret
2013-07-01
In this article I introduce the idea of conditional independence of separated subtrees as a principle by which to estimate the posterior probability of trees using conditional clade probability distributions rather than simple sample relative frequencies. I describe an algorithm for these calculations and software which implements these ideas. I show that these alternative calculations are very similar to simple sample relative frequencies for high probability trees but are substantially more accurate for relatively low probability trees. The method allows the posterior probability of unsampled trees to be calculated when these trees contain only clades that are in other sampled trees. Furthermore, the method can be used to estimate the total probability of the set of sampled trees which provides a measure of the thoroughness of a posterior sample.
On the inequivalence of the CH and CHSH inequalities due to finite statistics
NASA Astrophysics Data System (ADS)
Renou, M. O.; Rosset, D.; Martin, A.; Gisin, N.
2017-06-01
Different variants of a Bell inequality, such as CHSH and CH, are known to be equivalent when evaluated on nonsignaling outcome probability distributions. However, in experimental setups, the outcome probability distributions are estimated using a finite number of samples. Therefore the nonsignaling conditions are only approximately satisfied and the robustness of the violation depends on the chosen inequality variant. We explain that phenomenon using the decomposition of the space of outcome probability distributions under the action of the symmetry group of the scenario, and propose a method to optimize the statistical robustness of a Bell inequality. In the process, we describe the finite group composed of relabeling of parties, measurement settings and outcomes, and identify correspondences between the irreducible representations of this group and properties of outcome probability distributions such as normalization, signaling or having uniform marginals.
Non-destructive evaluation of UV pulse laser-induced damage performance of fused silica optics.
Huang, Jin; Wang, Fengrui; Liu, Hongjie; Geng, Feng; Jiang, Xiaodong; Sun, Laixi; Ye, Xin; Li, Qingzhi; Wu, Weidong; Zheng, Wanguo; Sun, Dunlu
2017-11-24
The surface laser damage performance of fused silica optics is related to the distribution of surface defects. In this study, we used chemical etching assisted by ultrasound and magnetorheological finishing to modify defect distribution in a fused silica surface, resulting in fused silica samples with different laser damage performance. Non-destructive test methods such as UV laser-induced fluorescence imaging and photo-thermal deflection were used to characterize the surface defects that contribute to the absorption of UV laser radiation. Our results indicate that the two methods can quantitatively distinguish differences in the distribution of absorptive defects in fused silica samples subjected to different post-processing steps. The percentage of fluorescence defects and the weak absorption coefficient were strongly related to the damage threshold and damage density of fused silica optics, as confirmed by the correlation curves built from statistical analysis of experimental data. The results show that non-destructive evaluation methods such as laser-induced fluorescence and photo-thermal absorption can be effectively applied to estimate the damage performance of fused silica optics at 351 nm pulse laser radiation. This indirect evaluation method is effective for laser damage performance assessment of fused silica optics prior to utilization.
GIS characterization of spatially distributed lifeline damage
Toprak, Selcuk; O'Rourke, Thomas; Tutuncu, Ilker
1999-01-01
This paper describes the visualization of spatially distributed water pipeline damage following an earthquake using geographical information systems (GIS). Pipeline damage is expressed as a repair rate (RR). Repair rate contours are developed with GIS by dividing the study area into grid cells (n × n), determining the number of particular pipeline repairs in each grid cell, and dividing the number of repairs by the length of that pipeline in each cell area. The resulting contour plot is a two-dimensional visualization of point source damage. High damage zones are defined herein as areas with an RR value greater than the mean RR for the entire study area of interest. A hyperbolic relationship between visual display of high pipeline damage zones and grid size, n, was developed. The relationship is expressed in terms of two dimensionless parameters, threshold area coverage (TAC) and dimensionless grid size (DGS). The relationship is valid over a wide range of different map scales spanning approximately 1,200 km2 for the largest portion of the Los Angeles water distribution system to 1 km2 for the Marina in San Francisco. This relationship can aid GIS users to get sufficiently refined, but easily visualized, maps of damage patterns.
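A compact version of the gridding step described above (repairs per unit pipeline length per cell, then thresholding at the study-area mean) might look as follows; the coordinates, grid size and per-cell pipeline lengths are invented, and the real workflow runs inside a GIS rather than NumPy.

```python
# Repair rate (RR) per grid cell and flagging of high-damage zones (illustrative data).
import numpy as np

rng = np.random.default_rng(4)
n = 10                                          # grid is n x n cells over the study area
extent = 20.0                                   # km on each side (assumed)

repairs_xy = rng.uniform(0, extent, size=(300, 2))        # locations of pipeline repairs
pipe_length = rng.uniform(0.5, 5.0, size=(n, n))          # km of pipeline per cell (from GIS)

edges = np.linspace(0, extent, n + 1)
repair_count, _, _ = np.histogram2d(repairs_xy[:, 0], repairs_xy[:, 1], bins=[edges, edges])

rr = np.divide(repair_count, pipe_length,
               out=np.zeros_like(pipe_length), where=pipe_length > 0)  # repairs per km
high_damage = rr > rr.mean()                    # cells above the study-area mean RR
print("mean RR:", round(rr.mean(), 3), " high-damage cells:", int(high_damage.sum()))
```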
Confidence as Bayesian Probability: From Neural Origins to Behavior.
Meyniel, Florent; Sigman, Mariano; Mainen, Zachary F
2015-10-07
Research on confidence spreads across several sub-fields of psychology and neuroscience. Here, we explore how a definition of confidence as Bayesian probability can unify these viewpoints. This computational view entails that there are distinct forms in which confidence is represented and used in the brain, including distributional confidence, pertaining to neural representations of probability distributions, and summary confidence, pertaining to scalar summaries of those distributions. Summary confidence is, normatively, derived or "read out" from distributional confidence. Neural implementations of readout will trade off optimality versus flexibility of routing across brain systems, allowing confidence to serve diverse cognitive functions. Copyright © 2015 Elsevier Inc. All rights reserved.
Exact probability distribution functions for Parrondo's games
NASA Astrophysics Data System (ADS)
Zadourian, Rubina; Saakian, David B.; Klümper, Andreas
2016-12-01
We study the discrete time dynamics of Brownian ratchet models and Parrondo's games. Using the Fourier transform, we calculate the exact probability distribution functions for both the capital dependent and history dependent Parrondo's games. In certain cases we find strong oscillations near the maximum of the probability distribution with two limiting distributions for odd and even number of rounds of the game. Indications of such oscillations first appeared in the analysis of real financial data, but now we have found this phenomenon in model systems and a theoretical understanding of the phenomenon. The method of our work can be applied to Brownian ratchets, molecular motors, and portfolio optimization.
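As a point of reference for the exact Fourier-transform results, a plain Monte Carlo estimate of the final-capital distribution for the randomly mixed capital-dependent games can be sketched as below; the game-B winning probabilities follow the standard Parrondo construction, and the bias ε, number of rounds and run count are assumptions.

```python
# Monte Carlo estimate of the final-capital distribution for Parrondo's capital-dependent games.
import numpy as np

rng = np.random.default_rng(5)
eps, rounds, runs = 0.005, 100, 50_000

def play(rng):
    capital = 0
    for _ in range(rounds):
        if rng.random() < 0.5:                  # randomly mix game A and game B
            p = 0.5 - eps                       # game A
        elif capital % 3 == 0:                  # game B, "bad" branch
            p = 0.10 - eps
        else:                                   # game B, "good" branch
            p = 0.75 - eps
        capital += 1 if rng.random() < p else -1
    return capital

final = np.array([play(rng) for _ in range(runs)])
values, counts = np.unique(final, return_counts=True)
print("mean final capital:", final.mean())
print("mode of the estimated capital distribution:", values[np.argmax(counts)])
```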
What Can Quantum Optics Say about Computational Complexity Theory?
NASA Astrophysics Data System (ADS)
Rahimi-Keshari, Saleh; Lund, Austin P.; Ralph, Timothy C.
2015-02-01
Considering the problem of sampling from the output photon-counting probability distribution of a linear-optical network for input Gaussian states, we obtain results that are of interest from both the quantum theory and the computational complexity theory points of view. We derive a general formula for calculating the output probabilities, and by considering input thermal states, we show that the output probabilities are proportional to permanents of positive-semidefinite Hermitian matrices. It is believed that approximating permanents of complex matrices in general is a #P-hard problem. However, we show that these permanents can be approximated with an algorithm in the BPP^NP complexity class, as there exists an efficient classical algorithm for sampling from the output probability distribution. We further consider input squeezed-vacuum states and discuss the complexity of sampling from the probability distribution at the output.
Bear damage to young Douglas-fir.
Thomas W. Childs; Norman P. Worthington
1955-01-01
Newly formed sapwood of young conifers has probably been a food for bears since time immemorial, but damage to trees has only recently attracted the attention of foresters. In many localities, all stages of bear-caused wounds can be found, from fresh ones to those covered with callus. On a 3-acre clear-cut, made in 1950 in a rather open part of the 110-year-old stand...
Kai, Takeshi; Yokoya, Akinari; Ukai, Masatoshi; Fujii, Kentaro; Watanabe, Ritsuko
2016-11-01
To simulate the deceleration processes of secondary electrons produced by a high-energy Auger electron in water, and particularly to focus on the spatial and temporal distributions of the secondary electrons and the collision events (e.g. ionization, electronic excitation, and dissociative electron attachment) that are involved in the multiplication of lesions at sites of DNA damage. We developed a dynamic Monte Carlo code that considers the Coulombic force between an ejected electron and its parent cation produced by the Auger electron in water. Thus our code can simulate the return of some electrons to their parent cations. Using the code, we calculated, on the order of femtoseconds, the temporal evolution of collision events, the mean energy, and the mean traveling distance (including its spatial probability distribution) of an electron ejected with an energy of 20 eV. Some of the decelerating electrons in water were attracted back to the ionized atoms (cations) by the Coulombic force within hundreds of femtoseconds, although the force did not significantly enhance the number of ionization, electronic excitation, and dissociative electron attachment collision events leading to water radiolysis. The secondary electrons are decelerated in water by the Coulombic force and recombine with the ionized atoms (cations). Furthermore, some of the returning electrons might become prehydrated in the water layer near the parent cation if they were originally emitted from DNA. Such prehydrated electrons originating from return electrons might play a significant role in inducing DNA damage.
Rocca, Maria A; Valsasina, Paola; Damjanovic, Dusan; Horsfield, Mark A; Mesaros, Sarlota; Stosic-Opincal, Tatjana; Drulovic, Jelena; Filippi, Massimo
2013-01-01
To apply voxel-based methods to map the regional distribution of atrophy and T2 hyperintense lesions in the cervical cord of multiple sclerosis (MS) patients with different clinical phenotypes. Brain and cervical cord 3D T1-weighted and T2-weighted scans were acquired from 31 healthy controls (HC) and 77 MS patients (15 clinically isolated syndromes (CIS), 15 relapsing-remitting (RR), 19 benign (B), 15 primary progressive (PP) and 13 secondary progressive (SP) MS). Hyperintense cord lesions were outlined on T2-weighted scans. The T2- and 3D T1-weighted cord images were then analysed using an active surface method which created output images reformatted in planes perpendicular to the estimated cord centre line. These unfolded cervical cord images were co-registered into a common space; then smoothed binary cord masks and lesion masks underwent spatial statistic analysis (SPM8). No cord atrophy was found in CIS patients versus HC, while PPMS had significant cord atrophy. Clusters of cord atrophy were found in BMS versus RRMS, and in SPMS versus RRMS, BMS and PPMS patients, mainly involving the posterior and lateral cord segments. Cord lesion probability maps showed a significantly greater likelihood of abnormalities in RRMS, PPMS and SPMS than in CIS and BMS patients. The spatial distributions of cord atrophy and cord lesions were not correlated. In progressive MS, regional cord atrophy was correlated with clinical disability and impairment in the pyramidal system. Voxel-based assessment of cervical cord damage is feasible and may contribute to a better characterisation of the clinical heterogeneity of MS patients.
Vacuum quantum stress tensor fluctuations: A diagonalization approach
NASA Astrophysics Data System (ADS)
Schiappacasse, Enrico D.; Fewster, Christopher J.; Ford, L. H.
2018-01-01
Large vacuum fluctuations of a quantum stress tensor can be described by the asymptotic behavior of its probability distribution. Here we focus on stress tensor operators which have been averaged with a sampling function in time. The Minkowski vacuum state is not an eigenstate of the time-averaged operator, but can be expanded in terms of its eigenstates. We calculate the probability distribution and the cumulative probability distribution for obtaining a given value in a measurement of the time-averaged operator taken in the vacuum state. In these calculations, we study a specific operator that contributes to the stress-energy tensor of a massless scalar field in Minkowski spacetime, namely, the normal ordered square of the time derivative of the field. We analyze the rate of decrease of the tail of the probability distribution for different temporal sampling functions, such as compactly supported functions and the Lorentzian function. We find that the tails decrease relatively slowly, as exponentials of fractional powers, in agreement with previous work using the moments of the distribution. Our results lend additional support to the conclusion that large vacuum stress tensor fluctuations are more probable than large thermal fluctuations, and may have observable effects.
Measurements of gas hydrate formation probability distributions on a quasi-free water droplet
NASA Astrophysics Data System (ADS)
Maeda, Nobuo
2014-06-01
A High Pressure Automated Lag Time Apparatus (HP-ALTA) can measure gas hydrate formation probability distributions from water in a glass sample cell. In an HP-ALTA gas hydrate formation originates near the edges of the sample cell and gas hydrate films subsequently grow across the water-guest gas interface. It would ideally be desirable to be able to measure gas hydrate formation probability distributions of a single water droplet or mist that is freely levitating in a guest gas, but this is technically challenging. The next best option is to let a water droplet sit on top of a denser, immiscible, inert, and wall-wetting hydrophobic liquid to avoid contact of a water droplet with the solid walls. Here we report the development of a second generation HP-ALTA which can measure gas hydrate formation probability distributions of a water droplet which sits on a perfluorocarbon oil in a container that is coated with 1H,1H,2H,2H-Perfluorodecyltriethoxysilane. It was found that the gas hydrate formation probability distributions of such a quasi-free water droplet were significantly lower than those of water in a glass sample cell.
Discus: investigating subjective judgment of optic disc damage.
Denniss, Jonathan; Echendu, Damian; Henson, David B; Artes, Paul H
2011-01-01
To describe a software package (Discus) for investigating clinicians' subjective assessment of optic disc damage [diagnostic accuracy in detecting visual field (VF) damage, decision criteria, and agreement with a panel of experts] and to provide reference data from a group of expert observers. Optic disc images were selected from patients with manifest or suspected glaucoma or ocular hypertension who attended the Manchester Royal Eye Hospital. Eighty images came from eyes without evidence of VF loss in at least four consecutive tests (VF negatives), and 20 images from eyes with repeatable VF loss (VF positives). Software was written to display these images in randomized order, for up to 60 s. Expert observers (n = 12) rated optic disc damage on a 5-point scale (definitely healthy, probably healthy, not sure, probably damaged, and definitely damaged). Optic disc damage as determined by the expert observers predicted VF loss with less than perfect accuracy (mean area under receiver-operating characteristic curve, 0.78; range, 0.72 to 0.85). When the responses were combined across the panel of experts, the area under receiver-operating characteristic curve reached 0.87, corresponding to a sensitivity of ∼60% at 90% specificity. Although the observers' performances were similar, there were large differences between the criteria they adopted (p < 0.001), even though all observers had been given identical instructions. Discus provides a simple and rapid means for assessing important aspects of optic disc interpretation. The data from the panel of expert observers provide a reference against which students, trainees, and clinicians may compare themselves. The program and the analyses described in this article are freely accessible from http://www.discusproject.blogspot.com/.
Fragment size distribution in viscous bag breakup of a drop
NASA Astrophysics Data System (ADS)
Kulkarni, Varun; Bulusu, Kartik V.; Plesniak, Michael W.; Sojka, Paul E.
2015-11-01
In this study we examine the drop size distribution resulting from the fragmentation of a single drop in the presence of a continuous air jet. Specifically, we study the effect of Weber number, We, and Ohnesorge number, Oh, on the disintegration process. The regime of breakup considered is observed between 12 <= We <= 16 for Oh <= 0.1. Experiments are conducted using phase Doppler anemometry. Both the number and volume fragment size probability distributions are plotted. The volume probability distribution revealed a bi-modal behavior with two distinct peaks: one corresponding to the rim fragments and the other to the bag fragments. This behavior was suppressed in the number probability distribution. Additionally, we employ an in-house particle detection code to isolate the rim fragment size distribution from the total probability distributions. Our experiments showed that the bag fragments are smaller in diameter and larger in number, while the rim fragments are larger in diameter and smaller in number. Furthermore, with increasing We for a given Oh we observe a large number of small-diameter drops and a small number of large-diameter drops. On the other hand, with increasing Oh for a fixed We the opposite is seen.
Nuclear risk analysis of the Ulysses mission
NASA Astrophysics Data System (ADS)
Bartram, Bart W.; Vaughan, Frank R.; Englehart, Richard W.
An account is given of the method used to quantify the risks associated with the use of a radioisotope thermoelectric generator fueled by Pu-238 dioxide aboard the Space Shuttle-launched Ulysses mission. After using a Monte Carlo technique to develop probability distributions for the radiological consequences of a range of accident scenarios throughout the mission, the factors affecting those consequences are identified together with their probability distributions. The functional relationship among all the factors is then established, and probability distributions for all factor effects are combined by means of a Monte Carlo technique.
Score distributions of gapped multiple sequence alignments down to the low-probability tail
NASA Astrophysics Data System (ADS)
Fieth, Pascal; Hartmann, Alexander K.
2016-08-01
Assessing the significance of alignment scores of optimally aligned DNA or amino acid sequences can be achieved via knowledge of the score distribution of random sequences. But this requires obtaining the distribution in the biologically relevant high-scoring region, where the probabilities are exponentially small. For gapless local alignments of infinitely long sequences this distribution is known analytically to follow a Gumbel distribution. Distributions for gapped local alignments and global alignments of finite lengths can only be obtained numerically. To obtain results for the small-probability region, specific statistical mechanics-based rare-event algorithms can be applied. In previous studies, this was achieved for pairwise alignments. They showed that, contrary to results from previous simple sampling studies, strong deviations from the Gumbel distribution occur in the case of finite sequence lengths. Here we extend the studies to multiple sequence alignments with gaps, which are much more relevant for practical applications in molecular biology. We study the distributions of scores over a large range of the support, reaching probabilities as small as 10^-160, for global and local (sum-of-pair scores) multiple alignments. We find that even after suitable rescaling, eliminating the sequence-length dependence, the distributions for multiple alignment differ from the pairwise alignment case. Furthermore, we also show that the previously discussed Gaussian correction to the Gumbel distribution needs to be refined, also for the case of pairwise alignments.
Site occupancy models with heterogeneous detection probabilities
Royle, J. Andrew
2006-01-01
Models for estimating the probability of occurrence of a species in the presence of imperfect detection are important in many ecological disciplines. In these 'site occupancy' models, the possibility of heterogeneity in detection probabilities among sites must be considered because variation in abundance (and other factors) among sampled sites induces variation in detection probability (p). In this article, I develop occurrence probability models that allow for heterogeneous detection probabilities by considering several common classes of mixture distributions for p. For any mixing distribution, the likelihood has the general form of a zero-inflated binomial mixture for which inference based upon integrated likelihood is straightforward. A recent paper by Link (2003, Biometrics 59, 1123-1130) demonstrates that in closed population models used for estimating population size, different classes of mixture distributions are indistinguishable from data, yet can produce very different inferences about population size. I demonstrate that this problem can also arise in models for estimating site occupancy in the presence of heterogeneous detection probabilities. The implications of this are discussed in the context of an application to avian survey data and the development of animal monitoring programs.
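A minimal sketch of the integrated (zero-inflated binomial mixture) likelihood with a Beta mixing distribution for detection probability is given below, assuming SciPy; the detection histories and starting values are invented and the sketch is not the paper's code.

```python
# Integrated likelihood for site occupancy with Beta-distributed detection probability.
import numpy as np
from scipy import stats, optimize, integrate

J = 5                                              # visits per site (assumed)
detections = np.array([0, 0, 1, 3, 0, 2, 0, 0, 4, 1, 0, 0, 2, 0, 5])  # detections per site

def neg_loglik(theta):
    logit_psi, log_a, log_b = theta
    psi = 1.0 / (1.0 + np.exp(-logit_psi))        # occupancy probability
    a, b = np.exp(log_a), np.exp(log_b)           # Beta mixing parameters for p

    def site_lik(y):
        # Integrate Binomial(y | J, p) over the Beta(a, b) distribution of p.
        marg, _ = integrate.quad(
            lambda p: stats.binom.pmf(y, J, p) * stats.beta.pdf(p, a, b), 0.0, 1.0)
        return psi * marg + (1.0 - psi) * (y == 0)  # zero-inflated binomial mixture

    return -np.sum([np.log(max(site_lik(y), 1e-300)) for y in detections])

res = optimize.minimize(neg_loglik, x0=np.zeros(3), method="Nelder-Mead")
psi_hat = 1.0 / (1.0 + np.exp(-res.x[0]))
print("estimated occupancy probability psi:", round(psi_hat, 3))
```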
Distributed road assessment system
Beer, N. Reginald; Paglieroni, David W
2014-03-25
A system that detects damage on or below the surface of a paved structure or pavement is provided. A distributed road assessment system includes road assessment pods and a road assessment server. Each road assessment pod includes a ground-penetrating radar antenna array and a detection system that detects road damage from the return signals as the vehicle on which the pod is mounted travels down a road. Each road assessment pod transmits to the road assessment server occurrence information describing each occurrence of road damage that is newly detected on a current scan of a road. The road assessment server maintains a road damage database of occurrence information describing the previously detected occurrences of road damage. After the road assessment server receives occurrence information for newly detected occurrences of road damage for a portion of a road, the road assessment server determines which newly detected occurrences correspond to which previously detected occurrences of road damage.
Jose Negron
1997-01-01
Classification trees and linear regression analysis were used to build models to predict probabilities of infestation and amount of tree mortality in terms of basal area resulting from roundheaded pine beetle, Dendroctonus adjunctus Blandford, activity in ponderosa pine, Pinus ponderosa Laws., in the Sacramento Mountains, New Mexico. Classification trees were built for...
Measurement of Device Parameters Using Image Recovery Techniques in Large-Scale IC Devices
NASA Technical Reports Server (NTRS)
Scheick, Leif; Edmonds, Larry
2004-01-01
Devices that respond to radiation on a cell level will produce histograms showing the relative frequency of cell damage as a function of damage. The measured distribution is the convolution of distributions from radiation responses, measurement noise, and manufacturing parameters. A method of extracting device characteristics and parameters from measured distributions via mathematical and image subtraction techniques is described.
Implementation of the Iterative Proportion Fitting Algorithm for Geostatistical Facies Modeling
DOE Office of Scientific and Technical Information (OSTI.GOV)
Li Yupeng, E-mail: yupeng@ualberta.ca; Deutsch, Clayton V.
2012-06-15
In geostatistics, most stochastic algorithms for simulation of categorical variables such as facies or rock types require a conditional probability distribution. The multivariate probability distribution of all the grouped locations, including the unsampled location, permits calculation of the conditional probability directly from its definition. In this article, the iterative proportion fitting (IPF) algorithm is implemented to infer this multivariate probability. Using the IPF algorithm, the multivariate probability is obtained by iterative modification of an initial estimate of the multivariate probability using lower-order bivariate probabilities as constraints. The imposed bivariate marginal probabilities are inferred from profiles along drill holes or wells. In the IPF process, a sparse matrix is used to calculate the marginal probabilities from the multivariate probability, which makes the iterative fitting more tractable and practical. This algorithm can be extended to higher-order marginal probability constraints as used in multiple point statistics. The theoretical framework is developed and illustrated with an estimation and simulation example.
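The core IPF update can be shown on a toy bivariate facies table: an initial joint probability is repeatedly rescaled to honour fixed row and column marginals. The labels and target marginals below are assumptions for illustration, not data from the article.

```python
# Iterative proportional fitting of a joint facies probability to fixed marginals (toy example).
import numpy as np

target_row = np.array([0.5, 0.3, 0.2])          # marginal facies proportions at location 1
target_col = np.array([0.4, 0.4, 0.2])          # marginal facies proportions at location 2

joint = np.full((3, 3), 1.0 / 9.0)              # initial (uninformative) joint probability

for _ in range(100):                             # iterative proportion fitting
    joint *= (target_row / joint.sum(axis=1))[:, None]   # match row marginals
    joint *= (target_col / joint.sum(axis=0))[None, :]   # match column marginals

print(np.round(joint, 4))
print("row marginals:", np.round(joint.sum(axis=1), 4))
print("col marginals:", np.round(joint.sum(axis=0), 4))
```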
Oakes, Benjamin Donald; Mattsson, Lars-Göran; Näsman, Per; Glazunov, Andrés Alayón
2018-06-01
Modern infrastructures are becoming increasingly dependent on electronic systems, leaving them more vulnerable to electrical surges or electromagnetic interference. Electromagnetic disturbances appear in nature, e.g., lightning and solar wind; however, they may also be generated by man-made technology to maliciously damage or disturb electronic equipment. This article presents a systematic risk assessment framework for identifying possible, consequential, and plausible intentional electromagnetic interference (IEMI) attacks on an arbitrary distribution network infrastructure. In the absence of available data on IEMI occurrences, we find that a systems-based risk assessment is more useful than a probabilistic approach. We therefore modify the often applied definition of risk, i.e., a set of triplets containing scenario, probability, and consequence, to a set of quadruplets: scenario, resource requirements, plausibility, and consequence. Probability is "replaced" by resource requirements and plausibility, where the former is the minimum amount and type of equipment necessary to successfully carry out an attack scenario and the latter is a subjective assessment of the extent of the existence of attackers who possess the motivation, knowledge, and resources necessary to carry out the scenario. We apply the concept of intrusion areas and classify electromagnetic source technology according to key attributes. Worst-case scenarios are identified for different quantities of attacker resources. The most plausible and consequential of these are deemed the most important scenarios and should provide useful decision support in a countermeasures effort. Finally, an example of the proposed risk assessment framework, based on notional data, is provided on a hypothetical water distribution network. © 2017 Society for Risk Analysis.
Maximizing Statistical Power When Verifying Probabilistic Forecasts of Hydrometeorological Events
NASA Astrophysics Data System (ADS)
DeChant, C. M.; Moradkhani, H.
2014-12-01
Hydrometeorological events (e.g., floods, droughts, precipitation) are increasingly being forecasted probabilistically, owing to the uncertainties in the underlying causes of the phenomena. In these forecasts, the probability of the event, over some lead time, is estimated based on some model simulations or predictive indicators. By issuing probabilistic forecasts, agencies may communicate the uncertainty in the event occurring. Assuming that the assigned probability of the event is correct, which is referred to as a reliable forecast, the end user may perform some risk management based on the potential damages resulting from the event. Alternatively, an unreliable forecast may give false impressions of the actual risk, leading to improper decision making when protecting resources from extreme events. Due to this requisite for reliable forecasts to perform effective risk management, this study takes a renewed look at reliability assessment in event forecasts. Illustrative experiments will be presented, showing deficiencies in the commonly available approaches (Brier Score, Reliability Diagram). Overall, it is shown that the conventional reliability assessment techniques do not maximize the ability to distinguish between a reliable and unreliable forecast. In this regard, a theoretical formulation of the probabilistic event forecast verification framework will be presented. From this analysis, hypothesis testing with the Poisson-Binomial distribution is the most exact model available for the verification framework, and therefore maximizes one's ability to distinguish between a reliable and unreliable forecast. Application of this verification system was also examined within a real forecasting case study, highlighting the additional statistical power provided with the use of the Poisson-Binomial distribution.
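A hedged sketch (not the authors' code) of how hypothesis testing with the Poisson-Binomial distribution could work: the exact distribution of the number of observed events under the issued forecast probabilities is built by convolving Bernoulli distributions, and an exact-test style p-value is computed for the observed event count. The forecast values are invented.

```python
import numpy as np

def poisson_binomial_pmf(probs):
    """PMF of the number of successes among independent Bernoulli(p_i) trials,
    built by successive convolution."""
    pmf = np.array([1.0])
    for p in probs:
        pmf = np.convolve(pmf, [1.0 - p, p])
    return pmf

def reliability_p_value(forecast_probs, observed_count):
    """Exact-test style p-value for observing `observed_count` events if the
    issued forecast probabilities are reliable (i.e., correct)."""
    pmf = poisson_binomial_pmf(forecast_probs)
    p_obs = pmf[observed_count]
    # sum the probability of all outcomes at least as unlikely as the observed one
    return pmf[pmf <= p_obs].sum()

forecast_probs = [0.1, 0.4, 0.7, 0.2, 0.9, 0.3]   # issued event probabilities
observed_count = 5                                 # events that actually occurred
print(reliability_p_value(forecast_probs, observed_count))
```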
Frequency Analysis Using Bootstrap Method and SIR Algorithm for Prevention of Natural Disasters
NASA Astrophysics Data System (ADS)
Kim, T.; Kim, Y. S.
2017-12-01
The frequency analysis of hydrometeorological data is one of the most important factors in responding to natural disaster damage and in setting design standards for disaster prevention facilities. Frequency analysis of hydrometeorological data typically assumes that the observations are statistically stationary, and a parametric method based on the parameters of a probability distribution is applied. A parametric method requires a sufficiently large set of reliable data; in Korea, however, snowfall records are insufficient because the number of snowfall observation days and the mean maximum daily snowfall depth have been decreasing under climate change. In this study, we conducted a frequency analysis for snowfall using the bootstrap method and the SIR algorithm, resampling methods that can overcome the problem of insufficient data. For 58 meteorological stations distributed evenly across Korea, probabilistic snowfall depths were estimated by non-parametric frequency analysis of the maximum daily snowfall depth data. The results show that the probabilistic daily snowfall depth obtained by frequency analysis decreases at most stations, and the rates of change at most stations are consistent between the parametric and non-parametric frequency analyses. This study shows that resampling methods can support frequency analysis of snowfall depth when the observed samples are insufficient, and the approach can be applied to other natural hazards with seasonal characteristics, such as summer typhoons. Acknowledgment: This research was supported by a grant (MPSS-NH-2015-79) from the Disaster Prediction and Mitigation Technology Development Program funded by the Korean Ministry of Public Safety and Security (MPSS).
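A minimal sketch of the bootstrap side of such a non-parametric frequency analysis, using invented annual maximum daily snowfall depths: each resample yields an empirical quantile for a chosen return period, and the spread of those quantiles reflects the sampling uncertainty. The station data and return period are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# hypothetical annual maximum daily snowfall depths (cm) at one station
annual_max = np.array([12.0, 5.5, 20.1, 8.3, 15.0, 3.2, 25.4, 9.9, 11.7, 18.6])

return_period = 30                        # years
prob_non_exceedance = 1.0 - 1.0 / return_period

n_boot = 5000
boot_quantiles = np.empty(n_boot)
for i in range(n_boot):
    resample = rng.choice(annual_max, size=annual_max.size, replace=True)
    boot_quantiles[i] = np.quantile(resample, prob_non_exceedance)

print("median design depth:", np.median(boot_quantiles))
print("90% interval:", np.quantile(boot_quantiles, [0.05, 0.95]))
```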
Modeling highway travel time distribution with conditional probability models
DOE Office of Scientific and Technical Information (OSTI.GOV)
Oliveira Neto, Francisco Moraes; Chin, Shih-Miao; Hwang, Ho-Ling
Under the sponsorship of the Federal Highway Administration's Office of Freight Management and Operations, the American Transportation Research Institute (ATRI) has developed performance measures through the Freight Performance Measures (FPM) initiative. Under this program, travel speed information is derived from data collected using wireless-based global positioning systems. These telemetric data systems are subscribed to and used by the trucking industry as an operations management tool. More than one telemetric operator submits data dumps to ATRI on a regular basis. Each data transmission contains the truck location, its travel time, and a clock time/date stamp. Data from the FPM program provide a unique opportunity for studying the upstream-downstream speed distributions at different locations, as well as at different times of day and days of the week. This research is focused on the stochastic nature of successive link travel speed data on the continental United States Interstate network. Specifically, a method to estimate route probability distributions of travel time is proposed. This method uses the concepts of convolution of probability distributions and bivariate, link-to-link, conditional probability to estimate the expected distributions for the route travel time. A major contribution of this study is the consideration of speed correlation between upstream and downstream contiguous Interstate segments through conditional probability. The established conditional probability distributions, between successive segments, can be used to provide travel time reliability measures. This study also suggests an adaptive method for calculating and updating the route travel time distribution as new data or information are added. This methodology can be useful to estimate performance measures as required by the recent Moving Ahead for Progress in the 21st Century Act (MAP-21).
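To illustrate the core computation described above (with invented, discretised data rather than ATRI's actual processing), the sketch below combines a marginal travel time distribution on an upstream link with a link-to-link conditional distribution on the downstream link to obtain the route travel time distribution.

```python
import numpy as np

# hypothetical discretised travel times (minutes) for two successive links
t_values = np.array([10, 12, 14])

# marginal distribution of link 1 travel time
p_link1 = np.array([0.5, 0.3, 0.2])

# conditional distribution of link 2 travel time given link 1 travel time
# rows: link 1 outcome, columns: link 2 outcome (captures speed correlation)
p_link2_given_1 = np.array([
    [0.6, 0.3, 0.1],
    [0.2, 0.5, 0.3],
    [0.1, 0.3, 0.6],
])

# route travel time distribution: sum the probabilities of all (t1, t2) combinations
route = {}
for i, t1 in enumerate(t_values):
    for j, t2 in enumerate(t_values):
        route[t1 + t2] = route.get(t1 + t2, 0.0) + p_link1[i] * p_link2_given_1[i, j]

for total, prob in sorted(route.items()):
    print(total, round(prob, 3))
```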
Probability of success for phase III after exploratory biomarker analysis in phase II.
Götte, Heiko; Kirchner, Marietta; Sailer, Martin Oliver
2017-05-01
The probability of success or average power describes the potential of a future trial by weighting the power with a probability distribution of the treatment effect. The treatment effect estimate from a previous trial can be used to define such a distribution. During the development of targeted therapies, it is common practice to look for predictive biomarkers. The consequence is that the trial population for phase III is often selected on the basis of the most extreme result from phase II biomarker subgroup analyses. In such a case, there is a tendency to overestimate the treatment effect. We investigate whether the overestimation of the treatment effect estimate from phase II is transformed into a positive bias for the probability of success for phase III. We simulate a phase II/III development program for targeted therapies. This simulation allows us to investigate selection probabilities and to compare the estimated with the true probability of success. We consider the estimated probability of success with and without subgroup selection. Depending on the true treatment effects, there is a negative bias without selection because of the weighting by the phase II distribution. In comparison, selection increases the estimated probability of success. Thus, selection does not lead to a bias in the probability of success if underestimation due to the phase II distribution and overestimation due to selection cancel each other out. We recommend performing similar simulations in practice to obtain the necessary information about the risks and chances associated with such subgroup selection designs. Copyright © 2017 John Wiley & Sons, Ltd.
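A hedged sketch of the basic probability-of-success computation described in the first sentence: the power of a planned phase III z-test is averaged over a normal distribution of the treatment effect centred on the phase II estimate. The effect size, standard error, and sample size are invented, and the simple z-test stands in for whatever analysis a real program would use.

```python
import numpy as np
from scipy import stats

def power(theta, n_per_arm, sigma=1.0, alpha=0.025):
    """One-sided power of a two-arm z-test for a true effect `theta`."""
    se = sigma * np.sqrt(2.0 / n_per_arm)
    z_alpha = stats.norm.ppf(1.0 - alpha)
    return 1.0 - stats.norm.cdf(z_alpha - theta / se)

def probability_of_success(theta_hat, se_hat, n_per_arm, n_grid=2001):
    """Average the phase III power over a normal distribution of the
    treatment effect centred on the phase II estimate."""
    thetas = np.linspace(theta_hat - 6 * se_hat, theta_hat + 6 * se_hat, n_grid)
    weights = stats.norm.pdf(thetas, loc=theta_hat, scale=se_hat)
    weights /= weights.sum()
    return np.sum(weights * power(thetas, n_per_arm))

# hypothetical phase II estimate (possibly inflated by subgroup selection)
print(probability_of_success(theta_hat=0.35, se_hat=0.15, n_per_arm=200))
```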
Studying the effects of the heat stress on the various layers of human skin using damage function
NASA Astrophysics Data System (ADS)
Aijaz, Mir; Khanday, M. A.
2016-03-01
This paper develops a model to identify the effects of thermal stress on temperature distribution and damage in human dermal regions. The design and selection of the model take into account many factors affecting the temperature distribution of skin, e.g., thermal conductance, perfusion, metabolic heat generation and the thermal protective capabilities of the skin. The transient temperature distribution within the region is simulated using a two-dimensional finite element model of the Pennes’ bioheat equation. The relationship between temperature and time is integrated to assess the damage caused to human skin by using Henriques’ model [Henriques, F. C., Arch. Pathol. 43 (1947) 489-502]. The Henriques damage model is found to be well suited for predicting the threshold of thermal damage. This work can be helpful both in emergency medicine and to plastic surgeons in deciding upon a course of action for the treatment of different burn injuries.
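The Henriques model referenced above integrates an Arrhenius-type damage rate over the temperature history of the tissue. The sketch below uses frequency-factor and activation-energy values that are commonly quoted for skin but should be treated as assumptions here, along with an invented exposure.

```python
import numpy as np

# Henriques' damage integral: Omega = integral of A * exp(-dE / (R * T(t))) dt
# The coefficients below are commonly quoted values for skin; treat them as assumptions.
A = 3.1e98          # frequency factor, 1/s
dE = 6.28e5         # activation energy, J/mol
R = 8.314           # universal gas constant, J/(mol K)

def henriques_damage(temps_kelvin, dt):
    """Cumulative damage Omega for a sampled temperature history T(t).
    Omega >= 0.53 is often taken as first-degree burn and Omega >= 1.0 as
    complete necrosis of the basal layer (threshold values are assumptions)."""
    rates = A * np.exp(-dE / (R * np.asarray(temps_kelvin)))
    return np.sum(rates) * dt

# hypothetical exposure: skin held at 55 degC (328.15 K) for 30 s, sampled every 0.1 s
t = np.arange(0.0, 30.0, 0.1)
temps = np.full_like(t, 328.15)
print(henriques_damage(temps, dt=0.1))
```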
General formulation of long-range degree correlations in complex networks
NASA Astrophysics Data System (ADS)
Fujiki, Yuka; Takaguchi, Taro; Yakubo, Kousuke
2018-06-01
We provide a general framework for analyzing degree correlations between nodes separated by more than one step (i.e., beyond nearest neighbors) in complex networks. One joint and four conditional probability distributions are introduced to fully describe long-range degree correlations with respect to degrees k and k' of two nodes and shortest path length l between them. We present general relations among these probability distributions and clarify the relevance to nearest-neighbor degree correlations. Unlike nearest-neighbor correlations, some of these probability distributions are meaningful only in finite-size networks. Furthermore, as a baseline to determine the existence of intrinsic long-range degree correlations in a network other than inevitable correlations caused by the finite-size effect, the functional forms of these probability distributions for random networks are analytically evaluated within a mean-field approximation. The utility of our argument is demonstrated by applying it to real-world networks.
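As a small illustration of the quantities defined above (not the authors' analytical framework), the joint probability of the degrees k and k' of two nodes and the shortest path length l between them can be estimated empirically from any network; the example graph below is an arbitrary Erdős–Rényi network.

```python
import networkx as nx
from collections import Counter

def long_range_degree_joint(G):
    """Empirical joint probability P(k, k', l) of the degrees of two nodes
    and the shortest path length l between them."""
    counts = Counter()
    lengths = dict(nx.all_pairs_shortest_path_length(G))
    deg = dict(G.degree())
    total = 0
    for u, dists in lengths.items():
        for v, l in dists.items():
            if u == v:
                continue
            counts[(deg[u], deg[v], l)] += 1
            total += 1
    return {key: c / total for key, c in counts.items()}

G = nx.erdos_renyi_graph(200, 0.03, seed=1)
joint = long_range_degree_joint(G)
# example lookup: probability that two nodes at distance 3 both have degree 6
print(joint.get((6, 6, 3), 0.0))
```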
Stochastic analysis of particle movement over a dune bed
Lee, Baum K.; Jobson, Harvey E.
1977-01-01
Stochastic models are available that can be used to predict the transport and dispersion of bed-material sediment particles in an alluvial channel. These models are based on the proposition that the movement of a single bed-material sediment particle consists of a series of steps of random length separated by rest periods of random duration and, therefore, application of the models requires a knowledge of the probability distributions of the step lengths, the rest periods, the elevation of particle deposition, and the elevation of particle erosion. The procedure was tested by determining distributions from bed profiles formed in a large laboratory flume with a coarse sand as the bed material. The elevation of particle deposition and the elevation of particle erosion can be considered to be identically distributed, and their distribution can be described by either a 'truncated Gaussian' or a 'triangular' density function. The conditional probability distribution of the rest period given the elevation of particle deposition closely followed the two-parameter gamma distribution. The conditional probability distribution of the step length given the elevation of particle erosion and the elevation of particle deposition also closely followed the two-parameter gamma density function. For a given flow, the scale and shape parameters describing the gamma probability distributions can be expressed as functions of bed elevation. (Woodard-USGS)
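A small, hypothetical sketch of the distribution fitting step: a two-parameter gamma distribution (location fixed at zero) is fitted to simulated rest periods, and an exceedance probability is read off. The data here are synthetic stand-ins for flume measurements.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)

# hypothetical rest periods (s) of bed particles deposited at one elevation class
rest_periods = rng.gamma(shape=1.8, scale=40.0, size=500)

# fit a two-parameter gamma (location fixed at zero), as in the study's model
shape, loc, scale = stats.gamma.fit(rest_periods, floc=0.0)
print("shape =", round(shape, 2), "scale =", round(scale, 1))

# probability that a particle rests longer than 2 minutes
print(stats.gamma.sf(120.0, shape, loc=loc, scale=scale))
```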
Wang, Jihan; Yang, Kai
2014-07-01
An efficient operating room needs little underutilised and little overutilised time to achieve optimal cost efficiency. The probabilities of underrun and overrun of lists of cases can be estimated from a well-defined duration distribution of the lists. The aim was to propose a method of predicting the probabilities of underrun and overrun of lists of cases using the Type IV Pearson distribution to support case scheduling. Six years of data were collected. The first 5 years of data were used to fit distributions and estimate parameters. The data from the last year were used as testing data to validate the proposed methods. The percentiles of the duration distribution of lists of cases were calculated by the Type IV Pearson distribution and the t-distribution. Monte Carlo simulation was conducted to verify the accuracy of the percentiles defined by the proposed methods. Operating rooms in the John D. Dingell VA Medical Center, United States, from January 2005 to December 2011. The main outcome measure was the difference between the proportion of lists of cases completed within the percentiles of the proposed duration distribution of the lists and the corresponding percentiles. Compared with the t-distribution, the proposed new distribution is 8.31% (0.38) more accurate on average and 14.16% (0.19) more accurate in calculating the probabilities at the 10th and 90th percentiles of the distribution, which is a major concern of operating room schedulers. The absolute deviations between the percentiles defined by the Type IV Pearson distribution and those from Monte Carlo simulation varied from 0.20 min (0.01) to 0.43 min (0.03). Operating room schedulers can rely on the most recent 10 cases with the same combination of surgeon and procedure(s) for distribution parameter estimation to plan lists of cases. Values are mean (SEM). The proposed Type IV Pearson distribution is more accurate than the t-distribution for estimating the probabilities of underrun and overrun of lists of cases. However, as not all individual case durations followed log-normal distributions, there was some deviation from the true duration distribution of the lists.
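Type IV Pearson fitting is not widely packaged, so the sketch below illustrates only the underlying question: estimating underrun and overrun probabilities of a list of cases by Monte Carlo from assumed log-normal case durations (the abstract notes log-normality only approximately holds). All parameter values are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(3)

# hypothetical list of three cases; log-normal duration parameters (minutes),
# as might be estimated from recent cases with the same surgeon/procedure
case_params = [(np.log(90), 0.30), (np.log(45), 0.25), (np.log(120), 0.35)]
allocated_time = 300.0   # minutes of operating room time allocated to the list

n_sim = 100_000
totals = np.zeros(n_sim)
for mu, sigma in case_params:
    totals += rng.lognormal(mean=mu, sigma=sigma, size=n_sim)

print("P(underrun) =", np.mean(totals < allocated_time))
print("P(overrun)  =", np.mean(totals > allocated_time))
print("10th/90th percentiles:", np.percentile(totals, [10, 90]))
```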
Complex damage distribution behaviour in cobalt implanted rutile TiO2 (1 1 0) lattice
NASA Astrophysics Data System (ADS)
Joshi, Shalik Ram; Padmanabhan, B.; Chanda, Anupama; Ojha, Sunil; Kanjilal, D.; Varma, Shikha
2017-11-01
The present work investigates the radiation damage, amorphization and structural modifications that are produced by ion-solid interactions in TiO2 crystals during 200 keV cobalt ion implantation. RBS/C and GIXRD have been utilized to evaluate the damage in the host lattice as a function of ion fluence. A multiple scattering formalism has been applied to extract the depth-dependent damage distributions in TiO2(1 1 0). The results have been compared with MC simulations performed using SRIM-2013. RBS/C results delineate a buried amorphous layer at a low fluence. Surprisingly, ion-induced dynamic activation produces a recovery in this damage at higher fluences. This improvement interestingly occurs only in deep regions (60-300 nm), where a systematic lowering in damage with fluence is observed. Formation of Co-Ti-O phases and generation of stress in the TiO2 lattice can also be responsible for this improvement in deep regions. In contrast, the surface region (0-60 nm) indicates a gradual increase in damage with fluence. Such a switch in the damage behavior creates a cross point in the damage profiles at 60 nm. The surface region is a sink for vacancies, whereas the deep layers are interstitial rich. However, these regions are far separated from each other, resulting in an intermediate (100-150 nm) region with a significant dip (valley) in damage, which can be characterized by enhanced recombination of point defects. The damage profiles thus indicate a very complex behavior. MC simulations, however, present very different results. They depict a damage profile that extends to a depth of only 150 nm, which is only about half of the damage width observed here via RBS/C. Moreover, the MC simulations do not indicate the presence of any valley-like structure in the damage profile. The complex nature of the damage distribution observed here via RBS/C may be related to the high ionic nature of the chemical bonds in the TiO2 lattice.
Meta-Analysis of Attitudes toward Damage-Causing Mammalian Wildlife
KANSKY, RUTH; KIDD, MARTIN; KNIGHT, ANDREW T
2014-01-01
Many populations of threatened mammals persist outside formally protected areas, and their survival depends on the willingness of communities to coexist with them. An understanding of the attitudes, and specifically the tolerance, of individuals and communities and the factors that determine these is therefore fundamental to designing strategies to alleviate human-wildlife conflict. We conducted a meta-analysis to identify factors that affected attitudes toward 4 groups of terrestrial mammals. Elephants (65%) elicited the most positive attitudes, followed by primates (55%), ungulates (53%), and carnivores (44%). Urban residents presented the most positive attitudes (80%), followed by commercial farmers (51%) and communal farmers (26%). A tolerance to damage index showed that human tolerance of ungulates and primates was proportional to the probability of experiencing damage while elephants elicited tolerance levels higher than anticipated and carnivores elicited tolerance levels lower than anticipated. Contrary to conventional wisdom, experiencing damage was not always the dominant factor determining attitudes. Communal farmers had a lower probability of being positive toward carnivores irrespective of probability of experiencing damage, while commercial farmers and urban residents were more likely to be positive toward carnivores irrespective of damage. Urban residents were more likely to be positive toward ungulates, elephants, and primates when probability of damage was low, but not when it was high. Commercial and communal farmers had a higher probability of being positive toward ungulates, primates, and elephants irrespective of probability of experiencing damage. Taxonomic bias may therefore be important. Identifying the distinct factors explaining these attitudes and the specific contexts in which they operate, inclusive of the species causing damage, will be essential for prioritizing conservation investments.
PMID:24661270
A risk assessment method for multi-site damage
NASA Astrophysics Data System (ADS)
Millwater, Harry Russell, Jr.
This research focused on developing probabilistic methods suitable for computing small probabilities of failure, e.g., 10^-6, of structures subject to multi-site damage (MSD). MSD is defined as the simultaneous development of fatigue cracks at multiple sites in the same structural element such that the fatigue cracks may coalesce to form one large crack. MSD is modeled as an array of collinear cracks with random initial crack lengths, with the centers of the initial cracks spaced uniformly apart. The data used were chosen to be representative of aluminum structures. The structure is considered failed whenever any two adjacent cracks link up. A fatigue computer model is developed that can accurately and efficiently grow a collinear array of arbitrary-length cracks from initial size until failure. An algorithm is developed to compute the stress intensity factors of all cracks considering all interaction effects. The probability of failure of two to 100 cracks is studied. Lower bounds on the probability of failure are developed based upon the probability of the largest crack exceeding a critical crack size. The critical crack size is based on the initial crack size that will grow across the ligament when the neighboring crack has zero length. The probability is evaluated using extreme value theory. An upper bound is based on the probability of the maximum sum of initial cracks being greater than a critical crack size. A weakest link sampling approach is developed that can accurately and efficiently compute small probabilities of failure. This methodology is based on predicting the weakest link, i.e., the two cracks to link up first, for a realization of initial crack sizes, and computing the cycles-to-failure using these two cracks. Criteria to determine the weakest link are discussed. Probability results using the weakest link sampling method are compared to Monte Carlo-based benchmark results. The results indicate that very small probabilities can be computed accurately in a few minutes on a Hewlett-Packard workstation.
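A greatly simplified, hypothetical sketch of the bound described above: for each realization of random initial crack lengths in a collinear array, the adjacent pair with the largest combined length is taken as the candidate weakest link, and the fraction of realizations in which this sum exceeds a critical size gives a simple bound on the failure probability. The distribution parameters and critical size are invented, and no crack growth or stress-intensity calculation is attempted.

```python
import numpy as np

rng = np.random.default_rng(4)

n_cracks = 20            # collinear cracks at uniformly spaced centres
crit_sum = 12.0          # placeholder critical combined crack length (mm)
n_sim = 100_000

# lognormal initial crack lengths (mm) at each site -- an assumed distribution
lengths = rng.lognormal(mean=np.log(2.0), sigma=0.5, size=(n_sim, n_cracks))

# "weakest link": the adjacent pair with the largest combined initial length
pair_sums = lengths[:, :-1] + lengths[:, 1:]
weakest = pair_sums.max(axis=1)

# simple bound on the failure probability: weakest pair exceeds the critical size
print("P_f bound ~", np.mean(weakest > crit_sum))
```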
Seismic damage identification for steel structures using distributed fiber optics.
Hou, Shuang; Cai, C S; Ou, Jinping
2009-08-01
A distributed fiber optic monitoring methodology based on optical time domain reflectometry technology is developed for seismic damage identification of steel structures. An epoxy with a strength closely associated with a specified structural damage state is used for bonding zigzag-configured optical fibers on the surfaces of the structure. Sensing the local deformation of the structure, the epoxy modulates the signal change within the optical fiber in response to the damage state of the structure. A monotonic loading test is conducted on a steel specimen installed with the proposed sensing system using a selected epoxy that cracks at the designated strain level, which indicates damage of the steel structure. Then, using the selected epoxy, cyclic loading of varying amplitudes, associated with different damage states, is applied to a second specimen. The test results show that the specimen's damage can be identified by the optic sensors, and its maximum local deformation can be recorded by the sensing system; moreover, the damage evolution can also be identified.
7 CFR 252.6 - Miscellaneous provisions.
Code of Federal Regulations, 2013 CFR
2013-01-01
... Miscellaneous provisions. (a) Improper distribution or loss of or damage to donated food. If a processor improperly distributes or uses any donated food, or causes loss of or damage to a donated food through its... commodity related funds, assets, or property in child nutrition programs. Whoever embezzles, willfully...
7 CFR 252.6 - Miscellaneous provisions.
Code of Federal Regulations, 2014 CFR
2014-01-01
... Miscellaneous provisions. (a) Improper distribution or loss of or damage to donated food. If a processor improperly distributes or uses any donated food, or causes loss of or damage to a donated food through its... commodity related funds, assets, or property in child nutrition programs. Whoever embezzles, willfully...
7 CFR 252.6 - Miscellaneous provisions.
Code of Federal Regulations, 2011 CFR
2011-01-01
... Miscellaneous provisions. (a) Improper distribution or loss of or damage to donated food. If a processor improperly distributes or uses any donated food, or causes loss of or damage to a donated food through its... commodity related funds, assets, or property in child nutrition programs. Whoever embezzles, willfully...
7 CFR 252.6 - Miscellaneous provisions.
Code of Federal Regulations, 2012 CFR
2012-01-01
... Miscellaneous provisions. (a) Improper distribution or loss of or damage to donated food. If a processor improperly distributes or uses any donated food, or causes loss of or damage to a donated food through its... commodity related funds, assets, or property in child nutrition programs. Whoever embezzles, willfully...
7 CFR 252.6 - Miscellaneous provisions.
Code of Federal Regulations, 2010 CFR
2010-01-01
... Miscellaneous provisions. (a) Improper distribution or loss of or damage to donated food. If a processor improperly distributes or uses any donated food, or causes loss of or damage to a donated food through its... commodity related funds, assets, or property in child nutrition programs. Whoever embezzles, willfully...
A tool for simulating collision probabilities of animals with marine renewable energy devices.
Schmitt, Pál; Culloch, Ross; Lieber, Lilian; Molander, Sverker; Hammar, Linus; Kregting, Louise
2017-01-01
The mathematical problem of establishing a collision probability distribution is often not trivial. The shape and motion of the animal as well as of the device must be evaluated in a four-dimensional space (3D motion over time). Earlier work on wind and tidal turbines was limited to a simplified two-dimensional representation, which cannot be applied to many new structures. We present a numerical algorithm to obtain such probability distributions using transient, three-dimensional numerical simulations. The method is demonstrated using a sub-surface tidal kite as an example. Necessary pre- and post-processing of the data created by the model is explained, and numerical details and potential issues and limitations in the application of the resulting probability distributions are highlighted.
Lognormal Approximations of Fault Tree Uncertainty Distributions.
El-Shanawany, Ashraf Ben; Ardron, Keith H; Walker, Simon P
2018-01-26
Fault trees are used in reliability modeling to create logical models of fault combinations that can lead to undesirable events. The output of a fault tree analysis (the top event probability) is expressed in terms of the failure probabilities of basic events that are input to the model. Typically, the basic event probabilities are not known exactly, but are modeled as probability distributions; therefore, the top event probability is also represented as an uncertainty distribution. Monte Carlo methods are generally used for evaluating the uncertainty distribution, but such calculations are computationally intensive and do not readily reveal the dominant contributors to the uncertainty. In this article, a closed-form approximation for the fault tree top event uncertainty distribution is developed, which is applicable when the uncertainties in the basic events of the model are lognormally distributed. The results of the approximate method are compared with results from two sampling-based methods: namely, the Monte Carlo method and the Wilks method based on order statistics. It is shown that the closed-form expression can provide a reasonable approximation to results obtained by Monte Carlo sampling, without incurring the computational expense. The Wilks method is found to be a useful means of providing an upper bound for the percentiles of the uncertainty distribution while being computationally inexpensive compared with full Monte Carlo sampling. The lognormal approximation method and Wilks's method appear to be attractive, practical alternatives for the evaluation of uncertainty in the output of fault trees and similar multilinear models. © 2018 Society for Risk Analysis.
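A hedged sketch of the Monte Carlo baseline discussed above: basic-event probabilities are drawn from lognormal distributions specified by a median and an error factor, propagated through an arbitrary small tree (one AND gate over two OR gates, using the rare-event approximation), and the top-event uncertainty distribution is summarized by percentiles. The tree and the numbers are invented.

```python
import numpy as np

rng = np.random.default_rng(5)

# lognormal basic event probabilities: (median, error factor EF = p95 / p50)
basic_events = [(1e-3, 3.0), (2e-3, 5.0), (5e-4, 3.0), (1e-2, 10.0)]
z95 = 1.645
n_sim = 100_000

samples = []
for median, ef in basic_events:
    sigma = np.log(ef) / z95                # lognormal sigma from the error factor
    samples.append(rng.lognormal(np.log(median), sigma, n_sim))
p1, p2, p3, p4 = samples

# example tree: TOP = (E1 OR E2) AND (E3 OR E4), rare-event approximation for OR
top = (p1 + p2) * (p3 + p4)

print("mean =", top.mean())
print("5th/50th/95th percentiles:", np.percentile(top, [5, 50, 95]))
```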
Quantum key distribution without the wavefunction
NASA Astrophysics Data System (ADS)
Niestegge, Gerd
A well-known feature of quantum mechanics is the secure exchange of secret bit strings which can then be used as keys to encrypt messages transmitted over any classical communication channel. It is demonstrated that this quantum key distribution admits a much more general and abstract treatment than commonly thought. The results include some generalizations of the Hilbert space version of quantum key distribution, but are based upon a general nonclassical extension of conditional probability. A special state-independent conditional probability is identified as the origin of the superior security of quantum key distribution; this is a purely algebraic property of the quantum logic and represents the transition probability between the outcomes of two consecutive quantum measurements.
The complexity of divisibility.
Bausch, Johannes; Cubitt, Toby
2016-09-01
We address two sets of long-standing open questions in linear algebra and probability theory, from a computational complexity perspective: stochastic matrix divisibility, and divisibility and decomposability of probability distributions. We prove that finite divisibility of stochastic matrices is an NP-complete problem, and extend this result to nonnegative matrices, and completely-positive trace-preserving maps, i.e. the quantum analogue of stochastic matrices. We further prove a complexity hierarchy for the divisibility and decomposability of probability distributions, showing that finite distribution divisibility is in P, but decomposability is NP-hard. For the former, we give an explicit polynomial-time algorithm. All results on distributions extend to weak-membership formulations, proving that the complexity of these problems is robust to perturbations.
Belcher, Wayne R.; Sweetkind, Donald S.; Elliott, Peggy E.
2002-01-01
The use of geologic information such as lithology and rock properties is important to constrain conceptual and numerical hydrogeologic models. This geologic information is difficult to apply explicitly to numerical modeling and analyses because it tends to be qualitative rather than quantitative. This study uses a compilation of hydraulic-conductivity measurements to derive estimates of the probability distributions for several hydrogeologic units within the Death Valley regional ground-water flow system, a geologically and hydrologically complex region underlain by basin-fill sediments and volcanic, intrusive, sedimentary, and metamorphic rocks. Probability distributions of hydraulic conductivity for general rock types have been studied previously; however, this study provides a more detailed definition of hydrogeologic units based on lithostratigraphy, lithology, alteration, and fracturing, and compares the probability distributions to the aquifer test data. Results suggest that these probability distributions can be used for studies involving, for example, numerical flow modeling, recharge, evapotranspiration, and rainfall runoff, both for the hydrogeologic units in the region and for similar rock types elsewhere. Within the study area, fracturing appears to have the greatest influence on the hydraulic conductivity of carbonate bedrock hydrogeologic units. Similar to earlier studies, we find that alteration and welding in the Tertiary volcanic rocks greatly influence hydraulic conductivity. As alteration increases, hydraulic conductivity tends to decrease. Increasing degrees of welding appear to increase hydraulic conductivity because welding increases the brittleness of the volcanic rocks, thus increasing the amount of fracturing.
Bayesian alternative to the ISO-GUM's use of the Welch Satterthwaite formula
NASA Astrophysics Data System (ADS)
Kacker, Raghu N.
2006-02-01
In certain disciplines, uncertainty is traditionally expressed as an interval about an estimate for the value of the measurand. Development of such uncertainty intervals with a stated coverage probability based on the International Organization for Standardization (ISO) Guide to the Expression of Uncertainty in Measurement (GUM) requires a description of the probability distribution for the value of the measurand. The ISO-GUM propagates the estimates and their associated standard uncertainties for various input quantities through a linear approximation of the measurement equation to determine an estimate and its associated standard uncertainty for the value of the measurand. This procedure does not yield a probability distribution for the value of the measurand. The ISO-GUM suggests that under certain conditions motivated by the central limit theorem the distribution for the value of the measurand may be approximated by a scaled-and-shifted t-distribution with effective degrees of freedom obtained from the Welch-Satterthwaite (W-S) formula. The approximate t-distribution may then be used to develop an uncertainty interval with a stated coverage probability for the value of the measurand. We propose an approximate normal distribution based on a Bayesian uncertainty as an alternative to the t-distribution based on the W-S formula. A benefit of the approximate normal distribution based on a Bayesian uncertainty is that it greatly simplifies the expression of uncertainty by eliminating altogether the need for calculating effective degrees of freedom from the W-S formula. In the special case where the measurand is the difference between two means, each evaluated from statistical analyses of independent normally distributed measurements with unknown and possibly unequal variances, the probability distribution for the value of the measurand is known to be a Behrens-Fisher distribution. We compare the performance of the approximate normal distribution based on a Bayesian uncertainty and the approximate t-distribution based on the W-S formula with respect to the Behrens-Fisher distribution. The approximate normal distribution is simpler and better in this case. A thorough investigation of the relative performance of the two approximate distributions would require comparison for a range of measurement equations by numerical methods.
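For concreteness, a small sketch of the ISO-GUM route that the paper proposes to simplify: the combined standard uncertainty, the effective degrees of freedom from the Welch-Satterthwaite formula (with unit sensitivity coefficients), and a coverage factor from the t-distribution. The input uncertainties and degrees of freedom are invented.

```python
import numpy as np
from scipy import stats

# hypothetical input quantities: (standard uncertainty u_i, degrees of freedom nu_i)
inputs = [(0.12, 4), (0.05, 9), (0.08, 14)]

u = np.array([ui for ui, _ in inputs])
nu = np.array([ni for _, ni in inputs])

u_c = np.sqrt(np.sum(u**2))                       # combined standard uncertainty
nu_eff = u_c**4 / np.sum(u**4 / nu)               # Welch-Satterthwaite formula

coverage = 0.95
k = stats.t.ppf(0.5 + coverage / 2.0, df=nu_eff)  # coverage factor from the t-distribution
print("u_c =", round(u_c, 4), "nu_eff =", round(nu_eff, 1), "k =", round(k, 3))

# the Bayesian alternative discussed above would instead use a normal distribution,
# avoiding nu_eff altogether, e.g. k_normal = stats.norm.ppf(0.5 + coverage / 2)
```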
Does Litter Size Variation Affect Models of Terrestrial Carnivore Extinction Risk and Management?
Devenish-Nelson, Eleanor S.; Stephens, Philip A.; Harris, Stephen; Soulsbury, Carl; Richards, Shane A.
2013-01-01
Background Individual variation in both survival and reproduction has the potential to influence extinction risk. Especially for rare or threatened species, reliable population models should adequately incorporate demographic uncertainty. Here, we focus on an important form of demographic stochasticity: variation in litter sizes. We use terrestrial carnivores as an example taxon, as they are frequently threatened or of economic importance. Since data on intraspecific litter size variation are often sparse, it is unclear what probability distribution should be used to describe the pattern of litter size variation for multiparous carnivores. Methodology/Principal Findings We used litter size data on 32 terrestrial carnivore species to test the fit of 12 probability distributions. The influence of these distributions on quasi-extinction probabilities and the probability of successful disease control was then examined for three canid species – the island fox Urocyon littoralis, the red fox Vulpes vulpes, and the African wild dog Lycaon pictus. Best fitting probability distributions differed among the carnivores examined. However, the discretised normal distribution provided the best fit for the majority of species, because variation among litter-sizes was often small. Importantly, however, the outcomes of demographic models were generally robust to the distribution used. Conclusion/Significance These results provide reassurance for those using demographic modelling for the management of less studied carnivores in which litter size variation is estimated using data from species with similar reproductive attributes. PMID:23469140
Does litter size variation affect models of terrestrial carnivore extinction risk and management?
Devenish-Nelson, Eleanor S; Stephens, Philip A; Harris, Stephen; Soulsbury, Carl; Richards, Shane A
2013-01-01
Individual variation in both survival and reproduction has the potential to influence extinction risk. Especially for rare or threatened species, reliable population models should adequately incorporate demographic uncertainty. Here, we focus on an important form of demographic stochasticity: variation in litter sizes. We use terrestrial carnivores as an example taxon, as they are frequently threatened or of economic importance. Since data on intraspecific litter size variation are often sparse, it is unclear what probability distribution should be used to describe the pattern of litter size variation for multiparous carnivores. We used litter size data on 32 terrestrial carnivore species to test the fit of 12 probability distributions. The influence of these distributions on quasi-extinction probabilities and the probability of successful disease control was then examined for three canid species - the island fox Urocyon littoralis, the red fox Vulpes vulpes, and the African wild dog Lycaon pictus. Best fitting probability distributions differed among the carnivores examined. However, the discretised normal distribution provided the best fit for the majority of species, because variation among litter-sizes was often small. Importantly, however, the outcomes of demographic models were generally robust to the distribution used. These results provide reassurance for those using demographic modelling for the management of less studied carnivores in which litter size variation is estimated using data from species with similar reproductive attributes.
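A minimal sketch of the kind of distribution comparison described (not the authors' 12-distribution analysis): a Poisson and a discretised normal are fitted to invented litter-size counts by maximum likelihood and compared by AIC.

```python
import numpy as np
from scipy import stats

# hypothetical litter-size counts for one carnivore species
litters = np.array([3, 4, 4, 5, 4, 3, 5, 6, 4, 4, 3, 5, 4, 4, 5, 3, 4, 5, 4, 4])

# candidate 1: Poisson
lam = litters.mean()
ll_pois = stats.poisson.logpmf(litters, lam).sum()

# candidate 2: discretised normal (probability mass = CDF(k + 0.5) - CDF(k - 0.5))
mu, sd = litters.mean(), litters.std(ddof=1)
pmf = stats.norm.cdf(litters + 0.5, mu, sd) - stats.norm.cdf(litters - 0.5, mu, sd)
ll_dnorm = np.log(pmf).sum()

# compare by AIC (Poisson: 1 parameter; discretised normal: 2 parameters)
print("AIC Poisson           :", 2 * 1 - 2 * ll_pois)
print("AIC discretised normal:", 2 * 2 - 2 * ll_dnorm)
```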
Mitra, Rajib; Jordan, Michael I.; Dunbrack, Roland L.
2010-01-01
Distributions of the backbone dihedral angles of proteins have been studied for over 40 years. While many statistical analyses have been presented, only a handful of probability densities are publicly available for use in structure validation and structure prediction methods. The available distributions differ in a number of important ways, which determine their usefulness for various purposes. These include: 1) input data size and criteria for structure inclusion (resolution, R-factor, etc.); 2) filtering of suspect conformations and outliers using B-factors or other features; 3) secondary structure of input data (e.g., whether helix and sheet are included; whether beta turns are included); 4) the method used for determining probability densities ranging from simple histograms to modern nonparametric density estimation; and 5) whether they include nearest neighbor effects on the distribution of conformations in different regions of the Ramachandran map. In this work, Ramachandran probability distributions are presented for residues in protein loops from a high-resolution data set with filtering based on calculated electron densities. Distributions for all 20 amino acids (with cis and trans proline treated separately) have been determined, as well as 420 left-neighbor and 420 right-neighbor dependent distributions. The neighbor-independent and neighbor-dependent probability densities have been accurately estimated using Bayesian nonparametric statistical analysis based on the Dirichlet process. In particular, we used hierarchical Dirichlet process priors, which allow sharing of information between densities for a particular residue type and different neighbor residue types. The resulting distributions are tested in a loop modeling benchmark with the program Rosetta, and are shown to improve protein loop conformation prediction significantly. The distributions are available at http://dunbrack.fccc.edu/hdp. PMID:20442867
Probabilistic analysis of preload in the abutment screw of a dental implant complex.
Guda, Teja; Ross, Thomas A; Lang, Lisa A; Millwater, Harry R
2008-09-01
Screw loosening is a problem for a percentage of implants. A probabilistic analysis to determine the cumulative probability distribution of the preload, the probability of obtaining an optimal preload, and the probabilistic sensitivities identifying important variables is lacking. The purpose of this study was to examine the inherent variability of material properties, surface interactions, and applied torque in an implant system to determine the probability of obtaining desired preload values and to identify the significant variables that affect the preload. Using software programs, an abutment screw was subjected to a tightening torque and the preload was determined from finite element (FE) analysis. The FE model was integrated with probabilistic analysis software. Two probabilistic analysis methods (advanced mean value and Monte Carlo sampling) were applied to determine the cumulative distribution function (CDF) of preload. The coefficient of friction, elastic moduli, Poisson's ratios, and applied torque were modeled as random variables and defined by probability distributions. Separate probability distributions were determined for the coefficient of friction in well-lubricated and dry environments. The probabilistic analyses were performed and the cumulative distribution of preload was determined for each environment. A distinct difference was seen between the preload probability distributions generated in a dry environment (normal distribution, mean (SD): 347 (61.9) N) compared to a well-lubricated environment (normal distribution, mean (SD): 616 (92.2) N). The probability of obtaining a preload value within the target range was approximately 54% for the well-lubricated environment and only 0.02% for the dry environment. The preload is predominately affected by the applied torque and coefficient of friction between the screw threads and implant bore at lower and middle values of the preload CDF, and by the applied torque and the elastic modulus of the abutment screw at high values of the preload CDF. Lubrication at the threaded surfaces between the abutment screw and implant bore affects the preload developed in the implant complex. For the well-lubricated surfaces, only approximately 50% of implants will have preload values within the generally accepted range. This probability can be improved by applying a higher torque than normally recommended or a more closely controlled torque than typically achieved. It is also suggested that materials with higher elastic moduli be used in the manufacture of the abutment screw to achieve a higher preload.
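The study itself used probabilistic finite element analysis; as a much cruder stand-in, the sketch below propagates a random applied torque and a random nut factor (lumping thread friction) through the elementary torque-preload relation F = T / (K d) by Monte Carlo, giving a preload distribution and the probability of landing in a target window. Every parameter value here is an assumption.

```python
import numpy as np

rng = np.random.default_rng(6)

n_sim = 100_000
torque = rng.normal(0.32, 0.02, n_sim)          # applied torque, N*m (32 N*cm nominal)
d = 2.0e-3                                      # assumed effective screw diameter, m
K = rng.normal(0.25, 0.05, n_sim)               # nut factor lumping thread friction
K = np.clip(K, 0.05, None)                      # avoid unphysically small values

preload = torque / (K * d)                      # N

target_low, target_high = 500.0, 700.0          # hypothetical optimal preload window
print("mean preload (N):", preload.mean())
print("P(preload in target range):", np.mean((preload > target_low) & (preload < target_high)))
```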
Comparative analysis through probability distributions of a data set
NASA Astrophysics Data System (ADS)
Cristea, Gabriel; Constantinescu, Dan Mihai
2018-02-01
In practice, probability distributions are applied in such diverse fields as risk analysis, reliability engineering, chemical engineering, hydrology, image processing, physics, market research, business and economic research, customer support, medicine, sociology, and demography. This article highlights important aspects of fitting probability distributions to data and applying the analysis results to make informed decisions. There are a number of statistical methods available which can help us to select the best-fitting model. Some of the graphs display both the input data and the fitted distributions at the same time, as probability density and cumulative distribution. The goodness of fit tests can be used to determine whether a certain distribution is a good fit. The main idea used is to measure the "distance" between the data and the tested distribution, and to compare that distance to some threshold values. Calculating the goodness of fit statistics also enables us to order the fitted distributions according to how well they fit the data. This particular feature is very helpful for comparing the fitted models. The paper presents a comparison of the most commonly used goodness of fit tests: Kolmogorov-Smirnov, Anderson-Darling, and Chi-Squared. A large set of data is analyzed and conclusions are drawn by visualizing the data, comparing multiple fitted distributions and selecting the best model. These graphs should be viewed as an addition to the goodness of fit tests.
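A brief, hypothetical sketch of ranking candidate models by goodness of fit: three distributions are fitted by maximum likelihood to a synthetic data set and compared by the Kolmogorov-Smirnov statistic and log-likelihood (Anderson-Darling is available in scipy only for a few families, so it is only noted in a comment).

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
data = rng.gamma(shape=2.0, scale=3.0, size=500)       # hypothetical data set

# candidate models fitted by maximum likelihood
candidates = {
    "gamma":   stats.gamma(*stats.gamma.fit(data, floc=0)),
    "lognorm": stats.lognorm(*stats.lognorm.fit(data, floc=0)),
    "norm":    stats.norm(*stats.norm.fit(data)),
}

for name, dist in candidates.items():
    ks = stats.kstest(data, dist.cdf)                  # Kolmogorov-Smirnov distance
    ll = dist.logpdf(data).sum()                       # log-likelihood, for ranking
    print(f"{name:8s}  KS={ks.statistic:.3f}  p={ks.pvalue:.3f}  logL={ll:.1f}")

# Anderson-Darling is also available in scipy, but only for a few families,
# e.g. stats.anderson(data, dist='norm')
```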
Toxic hepatitis caused by the excretions of the Phyllomedusa bicolor frog - a case report.
Pogorzelska, Joanna; Łapiński, Tadeusz W
2017-03-01
The Kambô ritual consists of various types of skin scarification and subsequent application of Phyllomedusa bicolor secretion to the fresh wounds. In Europe, the ritual of Kambô is becoming more popular, but its use can lead to serious multiple organ damage, sometimes life-threatening. Our manuscript shows a patient with toxic liver damage probably associated with the Kambô ritual.
Toxic hepatitis caused by the excretions of the Phyllomedusa bicolor frog – a case report
Pogorzelska, Joanna
2017-01-01
The Kambô ritual consists of various types of skin scarification and subsequent application of Phyllomedusa bicolor secretion to the fresh wounds. In Europe, the ritual of Kambô is becoming more popular, but its use can lead to serious multiple organ damage, sometimes life-threatening. Our manuscript shows a patient with toxic liver damage probably associated with the Kambô ritual. PMID:28856288
Impact of temporal probability in 4D dose calculation for lung tumors.
Rouabhi, Ouided; Ma, Mingyu; Bayouth, John; Xia, Junyi
2015-11-08
The purpose of this study was to evaluate the dosimetric uncertainty in 4D dose calculation using three temporal probability distributions: uniform distribution, sinusoidal distribution, and patient-specific distribution derived from the patient respiratory trace. Temporal probability, defined as the fraction of time a patient spends in each respiratory amplitude, was evaluated in nine lung cancer patients. Four-dimensional computed tomography (4D CT), along with deformable image registration, was used to compute 4D dose incorporating the patient's respiratory motion. First, the dose of each of 10 phase CTs was computed using the same planning parameters as those used in 3D treatment planning based on the breath-hold CT. Next, deformable image registration was used to deform the dose of each phase CT to the breath-hold CT using the deformation map between the phase CT and the breath-hold CT. Finally, the 4D dose was computed by summing the deformed phase doses using their corresponding temporal probabilities. In this study, 4D dose calculated from the patient-specific temporal probability distribution was used as the ground truth. The dosimetric evaluation matrix included: 1) 3D gamma analysis, 2) mean tumor dose (MTD), 3) mean lung dose (MLD), and 4) lung V20. For seven out of nine patients, both uniform and sinusoidal temporal probability dose distributions were found to have an average gamma passing rate > 95% for both the lung and PTV regions. Compared with 4D dose calculated using the patient respiratory trace, doses using uniform and sinusoidal distribution showed a percentage difference on average of -0.1% ± 0.6% and -0.2% ± 0.4% in MTD, -0.2% ± 1.9% and -0.2% ± 1.3% in MLD, 0.09% ± 2.8% and -0.07% ± 1.8% in lung V20, -0.1% ± 2.0% and 0.08% ± 1.34% in lung V10, 0.47% ± 1.8% and 0.19% ± 1.3% in lung V5, respectively. We concluded that four-dimensional dose computed using either a uniform or sinusoidal temporal probability distribution can approximate four-dimensional dose computed using the patient-specific respiratory trace.
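A schematic sketch of the 4D dose accumulation step described above, with a flat 1D array standing in for the deformed 3D phase dose grids and invented temporal probabilities: the 4D dose is simply the temporal-probability-weighted sum of the deformed phase doses.

```python
import numpy as np

rng = np.random.default_rng(8)

# hypothetical deformed dose arrays for 10 breathing phases, mapped onto the
# breath-hold CT grid (a flat 1D array stands in for the 3D dose grid)
n_voxels = 1000
phase_doses = rng.normal(2.0, 0.1, size=(10, n_voxels))    # Gy per fraction

# temporal probabilities: fraction of time spent in each respiratory amplitude bin
uniform = np.full(10, 0.1)
sinusoidal = np.abs(np.diff(np.cos(np.linspace(0, np.pi, 11)))) / 2.0
patient = rng.dirichlet(np.ones(10))                        # stand-in for a real trace

def four_d_dose(phase_doses, temporal_prob):
    """4D dose = temporal-probability-weighted sum of the deformed phase doses."""
    return np.tensordot(temporal_prob, phase_doses, axes=1)

for name, w in [("uniform", uniform), ("sinusoidal", sinusoidal), ("patient", patient)]:
    print(name, round(four_d_dose(phase_doses, w).mean(), 4))
```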
NASA Astrophysics Data System (ADS)
Zobin, V. M.; Cruz-Bravo, A. A.; Ventura-Ramírez, F.
2010-06-01
A macroseismic methodology of seismic risk microzonation in a low-rise city based on the vulnerability of residential buildings is proposed and applied to Colima city, Mexico. The seismic risk microzonation for Colima consists of two elements: the mapping of residential blocks according to their vulnerability level and the calculation of an expert-opinion-based damage probability matrix (DPM) for a given level of earthquake intensity and a given type of residential block. The specified exposure time to seismic risk for this zonation is equal to the interval between two destructive earthquakes. The damage probability matrices were calculated for three types of urban buildings and five types of residential blocks in Colima. It was shown that only 9% of the 1409 residential blocks are able to resist Modified Mercalli (MM) intensity VII and VIII earthquakes without significant damage. The proposed DPM-2007 is in good accordance with the experimental damage curves based on the macroseismic evaluation of 3332 residential buildings in Colima that was carried out after the 21 January 2003 intensity MM VII earthquake. This methodology and the calculated DPM-2007 curves may also be applied to seismic risk microzonation for many low-rise cities in Latin America, Asia, and Africa.
Wei, Chenhui; Zhu, Wancheng; Chen, Shikuo; Ranjith, Pathegama Gamage
2016-01-01
This paper proposes a coupled thermal–hydrological–mechanical damage (THMD) model for the failure process of rock, in which coupling effects such as thermally induced rock deformation, water flow-induced thermal convection, and rock deformation-induced water flow are considered. The damage is considered to be the key factor that controls the THM coupling process and the heterogeneity of rock is characterized by the Weibull distribution. Next, numerical simulations on excavation-induced damage zones in Äspö pillar stability experiments (APSE) are carried out and the impact of in situ stress conditions on damage zone distribution is analysed. Then, further numerical simulations of damage evolution at the heating stage in APSE are carried out. The impacts of in situ stress state, swelling pressure and water pressure on damage evolution at the heating stage are simulated and analysed, respectively. The simulation results indicate that (1) the v-shaped notch at the sidewall of the pillar is predominantly controlled by the in situ stress trends and magnitude; (2) at the heating stage, the existence of confining pressure can suppress the occurrence of damage, including shear damage and tensile damage; and (3) the presence of water flow and water pressure can promote the occurrence of damage, especially shear damage. PMID:28774001
Usami, Masahide; Iwadare, Yoshitaka; Kodaira, Masaki; Watanabe, Kyota; Aoki, Momoko; Katsumi, Chiaki; Matsuda, Kumi; Makino, Kazunori; Iijima, Sonoko; Harada, Maiko; Tanaka, Hiromi; Sasaki, Yoshinori; Tanaka, Tetsuya; Ushijima, Hirokage; Saito, Kazuhiko
2012-01-01
To evaluate relationships between traumatic symptoms and environmental damage conditions among children who survived the 2011 Great East Japan Earthquake and Tsunami. The subjects were 12,524 children in kindergartens, elementary schools, and junior high schools in Ishinomaki City, Miyagi Prefecture, Japan. The Post Traumatic Stress Symptoms for Children 15 items (PTSSC-15), a self-completion questionnaire on traumatic symptoms, was distributed to the children and a questionnaire regarding environmental damage conditions affecting the children was distributed to their teachers. Of 12,524 questionnaires distributed, an effective response was obtained from 11,692 (93.3%). The PTSSC-15 score was significantly higher in females than in males among 4th to 6th grade students in elementary schools and among junior high school students. In terms of traumatic symptoms and environmental damage conditions, with the exception of kindergartners, children who had their houses damaged or experienced separation from family members had a significantly higher PTSSC-15 score than children who did not experience environmental damage. Except for kindergartners and 4th- to 6th-grade elementary school students, children who experienced evacuation had a significantly higher PTSSC-15 score. This study demonstrated relationships between traumatic symptoms and environmental damage conditions in children who had suffered from the disaster. Factors examined in studying the relationship between environmental damage conditions and traumatic symptoms were gender, age, house damage, evacuation experience, and bereavement experience. It was critical not only to examine the traumatic symptoms of the children but also to collect accurate information about environmental damage conditions.
Usami, Masahide; Iwadare, Yoshitaka; Kodaira, Masaki; Watanabe, Kyota; Aoki, Momoko; Katsumi, Chiaki; Matsuda, Kumi; Makino, Kazunori; Iijima, Sonoko; Harada, Maiko; Tanaka, Hiromi; Sasaki, Yoshinori; Tanaka, Tetsuya; Ushijima, Hirokage; Saito, Kazuhiko
2012-01-01
Background To evaluate relationships between traumatic symptoms and environmental damage conditions among children who survived the 2011 Great East Japan Earthquake and Tsunami. Methods The subjects were 12,524 children in kindergartens, elementary schools, and junior high schools in Ishinomaki City, Miyagi Prefecture, Japan. The Post Traumatic Stress Symptoms for Children 15 items (PTSSC-15), a self-completion questionnaire on traumatic symptoms, was distributed to the children and a questionnaire regarding environmental damage conditions affecting the children was distributed to their teachers. Of 12,524 questionnaires distributed, an effective response was obtained from 11,692 (93.3%). Results The PTSSC-15 score was significantly higher in females than in males among 4th to 6th grade students in elementary schools and among junior high school students. In terms of traumatic symptoms and environmental damage conditions, with the exception of kindergartners, children who had their houses damaged or experienced separation from family members had a significantly higher PTSSC-15 score than children who did not experience environmental damage. Except for kindergartners and 4th- to 6th-grade elementary school students, children who experienced evacuation had a significantly higher PTSSC-15 score. Conclusions This study demonstrated relationships between traumatic symptoms and environmental damage conditions in children who had suffered from the disaster. Factors examined in studying the relationship between environmental damage conditions and traumatic symptoms were gender, age, house damage, evacuation experience, and bereavement experience. It was critical not only to examine the traumatic symptoms of the children but also to collect accurate information about environmental damage conditions. PMID:23209817
NASA Astrophysics Data System (ADS)
Yamada, Yuhei; Yamazaki, Yoshihiro
2018-04-01
This study considered a stochastic model for cluster growth in a Markov process with cluster-size-dependent additive noise. According to this model, the probability distribution of the cluster size transiently becomes an exponential or a log-normal distribution depending on the initial condition of the growth. In this letter, a master equation is obtained for this model, and the derivation of the distributions is discussed.
NASA Astrophysics Data System (ADS)
Sato, Aki-Hiro
2010-12-01
This study considers q-Gaussian distributions and stochastic differential equations with both multiplicative and additive noises. In the M-dimensional case a q-Gaussian distribution can be theoretically derived as the stationary probability distribution of a multiplicative stochastic differential equation with mutually independent multiplicative and additive noises. Using this stochastic differential equation, a method is proposed to evaluate a default probability under a given risk buffer.
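A one-dimensional, hypothetical sketch of the mechanism described: an SDE with both multiplicative and additive noise is integrated by Euler-Maruyama, and the stationary samples exhibit the heavier-than-Gaussian tails characteristic of q-Gaussian-like distributions (diagnosed here simply through excess kurtosis). The parameter values are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(9)

# Euler-Maruyama simulation of dx = -a*x dt + b*x dW1 + c dW2, a one-dimensional
# SDE with both multiplicative and additive noise; the stationary histogram of x
# is heavy tailed relative to a Gaussian.
a, b, c = 1.0, 0.4, 0.3
dt, n_steps, n_paths = 1e-3, 20_000, 2_000

x = np.zeros(n_paths)
for _ in range(n_steps):
    dW1 = rng.normal(0.0, np.sqrt(dt), n_paths)
    dW2 = rng.normal(0.0, np.sqrt(dt), n_paths)
    x = x + (-a * x) * dt + b * x * dW1 + c * dW2

# tail heaviness relative to a Gaussian (positive excess kurtosis expected)
m2 = np.mean(x**2)
m4 = np.mean(x**4)
print("excess kurtosis:", m4 / m2**2 - 3.0)
```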
Net present value probability distributions from decline curve reserves estimates
DOE Office of Scientific and Technical Information (OSTI.GOV)
Simpson, D.E.; Huffman, C.H.; Thompson, R.S.
1995-12-31
This paper demonstrates how reserves probability distributions can be used to develop net present value (NPV) distributions. NPV probability distributions were developed from the rate and reserves distributions presented in SPE 28333. This real data study used practicing engineers' evaluations of production histories. Two approaches were examined to quantify portfolio risk. The first approach, the NPV Relative Risk Plot, compares the mean NPV with the NPV relative risk ratio for the portfolio. The relative risk ratio is the NPV standard deviation (σ) divided by the mean (μ) NPV. The second approach, a Risk-Return Plot, is a plot of the mean (μ) discounted cash flow rate of return (DCFROR) versus the standard deviation (σ) of the DCFROR distribution. This plot provides a risk-return relationship for comparing various portfolios. These methods may help evaluate property acquisition and divestiture alternatives and assess the relative risk of a suite of wells or fields for bank loans.
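A toy, hypothetical sketch of turning a reserves distribution into an NPV distribution (not the SPE 28333 workflow): lognormal reserves are spread over a fixed exponential-decline production profile, priced, discounted, and summarized by the mean NPV and the relative risk ratio σ/μ. Every economic parameter is invented.

```python
import numpy as np

rng = np.random.default_rng(10)

# hypothetical lognormal reserves distribution from a decline-curve study
n_sim = 50_000
reserves = rng.lognormal(mean=np.log(250.0), sigma=0.45, size=n_sim)   # thousand bbl

price, opex, discount = 18.0, 6.0, 0.10          # $/bbl, $/bbl, annual discount rate
years, decline = 10, 0.22                        # simple exponential decline profile

# spread each reserve volume over time with a fixed exponential-decline shape
t = np.arange(1, years + 1)
shape = np.exp(-decline * t)
shape /= shape.sum()
cash = reserves[:, None] * shape[None, :] * 1_000 * (price - opex)     # $ per year
npv = (cash / (1 + discount) ** t[None, :]).sum(axis=1)

print("mean NPV ($):", npv.mean())
print("relative risk ratio (sigma/mu):", npv.std() / npv.mean())
```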
Optimal random search for a single hidden target.
Snider, Joseph
2011-01-01
A single target is hidden at a location chosen from a predetermined probability distribution. Then, a searcher must find a second probability distribution from which random search points are sampled such that the target is found in the minimum number of trials. Here it will be shown that if the searcher must get very close to the target to find it, then the best search distribution is proportional to the square root of the target distribution regardless of dimension. For a Gaussian target distribution, the optimum search distribution is approximately a Gaussian with a standard deviation that varies inversely with how close the searcher must be to the target to find it. For a network where the searcher randomly samples nodes and looks for the fixed target along edges, the optimum is either to sample a node with probability proportional to the square root of the out-degree plus 1 or not to do so at all.
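A discrete sanity check of the square-root rule sketched below: for a target hidden at site i with probability p[i] and a searcher that samples sites independently with probability q[i], the expected number of draws to find the target is sum_i p[i]/q[i], which is minimized by q proportional to sqrt(p). The 20-site target distribution is arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)
p = rng.dirichlet(np.ones(20))        # arbitrary discrete target distribution

def expected_trials(q, p):
    # Sampling site i with probability q[i], the expected number of independent
    # draws needed to hit a target fixed at site i is 1/q[i]; average over p.
    return float(np.sum(p / q))

q_sqrt = np.sqrt(p) / np.sqrt(p).sum()        # square-root rule
q_copy = p                                    # naive "search where it probably is"
q_unif = np.full_like(p, 1.0 / p.size)        # uniform search

print("sqrt(p) rule :", expected_trials(q_sqrt, p))
print("copy of p    :", expected_trials(q_copy, p))
print("uniform      :", expected_trials(q_unif, p))
```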
On the occurrence of rainstorm damage based on home insurance and weather data
NASA Astrophysics Data System (ADS)
Spekkers, M. H.; Clemens, F. H. L. R.; ten Veldhuis, J. A. E.
2014-08-01
Rainstorm damage caused by malfunctioning of urban drainage systems and water intrusion due to defects in the building envelope can be considerable. Little research on this topic has focused on the collection of damage data, the understanding of damage mechanisms and the deepening of data analysis methods. In this paper, the relative contribution of different failure mechanisms to the occurrence of rainstorm damage is investigated, as well as the extent to which these mechanisms relate to weather variables. For a case study in Rotterdam, the Netherlands, a property level home insurance database of around 3100 water-related damage claims was analysed. Records include comprehensive transcripts of communication between insurer, insured and damage assessment experts, which allowed claims to be classified according to their actual damage cause. Results show that roof and wall leakage is the most frequent failure mechanism causing precipitation-related claims, followed by blocked roof gutters, melting snow and sewer flooding. Claims related to sewer flooding were less present in the data, but are associated with significantly larger claim sizes than claims in the majority class, i.e. roof and wall leakages. Rare events logistic regression analysis revealed that maximum rainfall intensity and rainfall volume are significant predictors for the occurrence probability of precipitation-related claims. Moreover, it was found that claims associated with rainfall intensities smaller than 7-8 mm in a 60 min window are mainly related to failure processes in the private domain, such as roof and wall leakages. For rainfall events that exceed the 7-8 mm h-1 threshold, failure of systems in the public domain, such as sewer systems, starts to contribute considerably to the overall occurrence probability of claims. The communication transcripts, however, lacked information to be conclusive about the extent to which sewer-related claims were caused by overloading of sewer systems or failure of system components.
On the occurrence of rainstorm damage based on home insurance and weather data
NASA Astrophysics Data System (ADS)
Spekkers, M. H.; Clemens, F. H. L. R.; ten Veldhuis, J. A. E.
2015-02-01
Rainstorm damage caused by the malfunction of urban drainage systems and water intrusion due to defects in the building envelope can be considerable. Little research on this topic has focused on the collection of damage data, the understanding of damage mechanisms and the deepening of data analysis methods. In this paper, the relative contribution of different failure mechanisms to the occurrence of rainstorm damage is investigated, as well as the extent to which these mechanisms relate to weather variables. For a case study in Rotterdam, the Netherlands, a property level home insurance database of around 3100 water-related damage claims was analysed. The records include comprehensive transcripts of communication between insurer, insured and damage assessment experts, which allowed claims to be classified according to their actual damage cause. The results show that roof and wall leakage is the most frequent failure mechanism causing precipitation-related claims, followed by blocked roof gutters, melting snow and sewer flooding. Claims related to sewer flooding were less present in the data, but are associated with significantly larger claim sizes than claims in the majority class, i.e. roof and wall leakages. Rare events logistic regression analysis revealed that maximum rainfall intensity and rainfall volume are significant predictors for the occurrence probability of precipitation-related claims. Moreover, it was found that claims associated with rainfall intensities smaller than 7-8 mm in a 60-min window are mainly related to failure processes in the private domain, such as roof and wall leakages. For rainfall events that exceed the 7-8 mm h-1 threshold, the failure of systems in the public domain, such as sewer systems, starts to contribute considerably to the overall occurrence probability of claims. The communication transcripts, however, lacked information to be conclusive about the extent to which sewer-related claims were caused by overloading of sewer systems or failure of system components.
NASA Astrophysics Data System (ADS)
Mahanti, P.; Robinson, M. S.; Boyd, A. K.
2013-12-01
Craters ~20-km diameter and above significantly shaped the lunar landscape. The statistical nature of the slope distribution on their walls and floors dominates the overall slope distribution statistics for the lunar surface. Slope statistics are inherently useful for characterizing the current topography of the surface, determining accurate photometric and surface scattering properties, and in defining lunar surface trafficability [1-4]. Earlier experimental studies on the statistical nature of lunar surface slopes were restricted either by resolution limits (Apollo era photogrammetric studies) or by model error considerations (photoclinometric and radar scattering studies) where the true nature of the slope probability distribution was not discernible at baselines smaller than a kilometer [2,3,5]. Accordingly, historical modeling of lunar surface slope probability distributions for applications such as scattering theory development or rover traversability assessment is more general in nature (use of simple statistical models such as the Gaussian distribution [1,2,5,6]). With the advent of high resolution, high precision topographic models of the Moon [7,8], slopes in lunar craters can now be obtained at baselines as low as 6 meters, allowing unprecedented multi-scale (multiple baselines) modeling possibilities for slope probability distributions. Topographic analysis (Lunar Reconnaissance Orbiter Camera (LROC) Narrow Angle Camera (NAC) 2-m digital elevation models (DEM)) of ~20-km diameter Copernican lunar craters revealed generally steep slopes on interior walls (30° to 36°, locally exceeding 40°) over 15-meter baselines [9]. In this work, we extend the analysis from a probability distribution modeling point of view with NAC DEMs to characterize the slope statistics for the floors and walls of the same ~20-km Copernican lunar craters. The difference in slope standard deviations between the Gaussian approximation and the actual distribution (2-meter sampling) was computed over multiple scales. This slope analysis showed that local slope distributions are non-Gaussian for both crater walls and floors. Over larger baselines (~100 meters), crater wall slope probability distributions do approximate Gaussian distributions better, but have long distribution tails. Crater floor probability distributions, however, were always asymmetric (for the baseline scales analyzed) and less affected by baseline scale variations. Accordingly, our results suggest that use of long-tailed probability distributions (like Cauchy) and a baseline-dependent multi-scale model can be more effective in describing the slope statistics for lunar topography. References: [1] Moore, H. (1971), JGR, 75(11) [2] Marcus, A. H. (1969), JGR, 74(22) [3] R. J. Pike (1970), U.S. Geological Survey Working Paper [4] N. C. Costes, J. E. Farmer and E. B. George (1972), NASA Technical Report TR R-401 [5] M. N. Parker and G. L. Tyler (1973), Radio Science, 8(3), 177-184 [6] Alekseev, V. A. et al. (1968), Soviet Astronomy, Vol. 11, p. 860 [7] Burns et al. (2012), Int. Arch. Photogramm. Remote Sens. Spatial Inf. Sci., XXXIX-B4, 483-488 [8] Smith et al. (2010), GRL 37, L18204, DOI: 10.1029/2010GL043751 [9] Wagner R., Robinson, M., Speyerer E., Mahanti, P., LPSC 2013, #2924.
Probability and the changing shape of response distributions for orientation.
Anderson, Britt
2014-11-18
Spatial attention and feature-based attention are regarded as two independent mechanisms for biasing the processing of sensory stimuli. Feature attention is held to be a spatially invariant mechanism that advantages a single feature per sensory dimension. In contrast to the prediction of location independence, I found that participants were able to report the orientation of a briefly presented visual grating better for targets defined by high probability conjunctions of features and locations even when orientations and locations were individually uniform. The advantage for high-probability conjunctions was accompanied by changes in the shape of the response distributions. High-probability conjunctions had error distributions that were not normally distributed but demonstrated increased kurtosis. The increase in kurtosis could be explained as a change in the variances of the component tuning functions that comprise a population mixture. By changing the mixture distribution of orientation-tuned neurons, it is possible to change the shape of the discrimination function. This prompts the suggestion that attention may not "increase" the quality of perceptual processing in an absolute sense but rather prioritizes some stimuli over others. This results in an increased number of highly accurate responses to probable targets and, simultaneously, an increase in the number of very inaccurate responses. © 2014 ARVO.
NASA Technical Reports Server (NTRS)
Smith, O. E.
1976-01-01
Techniques are presented to derive several statistical wind models; they follow from the properties of the multivariate normal probability function. Assuming that the winds can be considered as bivariate normally distributed, then (1) the wind components and conditional wind components are univariate normally distributed, (2) the wind speed is Rayleigh distributed, (3) the conditional distribution of wind speed given a wind direction is Rayleigh distributed, and (4) the frequency of wind direction can be derived. All of these distributions are derived from the five sample parameters of the bivariate normal wind distribution. By further assuming that the winds at two altitudes are quadravariate normally distributed, the vector wind shear is bivariate normally distributed and the modulus of the vector wind shear is Rayleigh distributed. The conditional probability of wind component shears given a wind component is normally distributed. Examples of these and other properties of the multivariate normal probability distribution function as applied to Cape Kennedy, Florida, and Vandenberg AFB, California, wind data samples are given. A technique to develop a synthetic vector wind profile model of interest to aerospace vehicle applications is presented.
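A quick numerical check of item (2) in the simplest setting, assuming zero-mean, equal-variance, uncorrelated wind components (the case in which the speed is exactly Rayleigh distributed); the component standard deviation of 5 m/s is illustrative.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

sigma = 5.0                                # assumed component standard deviation (m/s)
u = rng.normal(0.0, sigma, size=100_000)   # one horizontal wind component
v = rng.normal(0.0, sigma, size=100_000)   # orthogonal component
speed = np.hypot(u, v)                     # wind speed magnitude

# Kolmogorov-Smirnov comparison of the simulated speeds against Rayleigh(scale=sigma).
ks = stats.kstest(speed, "rayleigh", args=(0.0, sigma))
print("KS statistic vs Rayleigh:", round(ks.statistic, 4))
```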
Madenjian, C.P.; Chipman, B.D.; Marsden, J.E.
2008-01-01
Sea lamprey (Petromyzon marinus) control in North America costs millions of dollars each year, and control measures are guided by assessment of lamprey-induced damage to fisheries. The favored prey of sea lamprey in freshwater ecosystems has been lake trout (Salvelinus namaycush). A key parameter in assessing sea lamprey damage, as well as managing lake trout fisheries, is the probability of an adult lake trout surviving a lamprey attack. The conventional value for this parameter has been 0.55, based on laboratory experiments. In contrast, based on catch curve analysis, mark-recapture techniques, and observed wounding rates, we estimated that adult lake trout in Lake Champlain have a 0.74 probability of surviving a lamprey attack. Although sea lamprey growth in Lake Champlain was lower than that observed in Lake Huron, application of an individual-based model to both lakes indicated that the probability of surviving an attack in Lake Champlain was only 1.1 times higher than that in Lake Huron. Thus, we estimated that lake trout survive a lamprey attack in Lake Huron with a probability of 0.66. Therefore, our results suggested that lethality of a sea lamprey attack on lake trout has been overestimated in previous model applications used in fisheries management. © 2008 NRC.
Joint probabilities and quantum cognition
NASA Astrophysics Data System (ADS)
de Barros, J. Acacio
2012-12-01
In this paper we discuss the existence of joint probability distributions for quantumlike response computations in the brain. We do so by focusing on a contextual neural-oscillator model shown to reproduce the main features of behavioral stimulus-response theory. We then exhibit a simple example of contextual random variables not having a joint probability distribution, and describe how such variables can be obtained from neural oscillators, but not from a quantum observable algebra.
Schindler, Dirk; Grebhan, Karin; Albrecht, Axel; Schönborn, Jochen; Kohnle, Ulrich
2012-01-01
Data on storm damage attributed to the two high-impact winter storms 'Wiebke' (28 February 1990) and 'Lothar' (26 December 1999) were used for GIS-based estimation and mapping (in a 50 × 50 m resolution grid) of the winter storm damage probability (P(DAM)) for the forests of the German federal state of Baden-Wuerttemberg (Southwest Germany). The P(DAM) calculation was based on weights of evidence (WofE) methodology. A combination of information on forest type, geology, soil type, soil moisture regime, and topographic exposure, as well as the maximum gust wind speed field, was used to compute P(DAM) across the entire study area. Given the condition that maximum gust wind speed during the two storm events exceeded 35 m s(-1), the highest P(DAM) values computed were primarily where coniferous forest grows in severely exposed areas on temporarily moist soils on bunter sandstone formations. Such areas are found mainly in the mountainous ranges of the northern Black Forest, the eastern Odenwald, in the Virngrund area, and in the southwestern Alpine Foothills.
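The weights-of-evidence combination used in the study can be illustrated with a single binary evidence layer; the contingency counts below are invented for illustration, and the posterior probability is obtained by adding the weight of the observed evidence state to the prior log-odds.

```python
import numpy as np

# Toy contingency counts for one binary evidence layer (e.g., coniferous forest
# present / absent) versus observed storm damage; all numbers are illustrative.
n_dam_with, n_dam_without = 400, 100
n_nodam_with, n_nodam_without = 2000, 7500

p_e_given_d = n_dam_with / (n_dam_with + n_dam_without)
p_e_given_nd = n_nodam_with / (n_nodam_with + n_nodam_without)

w_plus = np.log(p_e_given_d / p_e_given_nd)                 # weight where evidence present
w_minus = np.log((1 - p_e_given_d) / (1 - p_e_given_nd))    # weight where evidence absent

n_total = n_dam_with + n_dam_without + n_nodam_with + n_nodam_without
prior = (n_dam_with + n_dam_without) / n_total
prior_logit = np.log(prior / (1 - prior))

posterior_logit = prior_logit + w_plus                      # cell where evidence is present
p_dam = 1.0 / (1.0 + np.exp(-posterior_logit))
print(f"W+ = {w_plus:.2f}, W- = {w_minus:.2f}, P(DAM | evidence present) = {p_dam:.3f}")
```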
Effect of Preconditioning and Soldering on Failures of Chip Tantalum Capacitors
NASA Technical Reports Server (NTRS)
Teverovsky, Alexander A.
2014-01-01
Soldering of molded-case tantalum capacitors can result in damage to the Ta2O5 dielectric and first turn-on failures due to thermo-mechanical stresses caused by CTE mismatch between materials used in the capacitors. It is also known that the presence of moisture might cause damage to plastic cases due to the pop-corning effect. However, there are only scarce literature data on the effect of moisture content on the probability of post-soldering electrical failures. In this work, which is based on a case history, different groups of similar types of CWR tantalum capacitors from two lots were prepared for soldering by baking, moisture saturation, and long-term storage at room conditions. Results of the testing showed that both factors, the initial quality of the lot and the preconditioning, affect the probability of failures. Baking before soldering was shown to be effective in preventing failures even in lots susceptible to pop-corning damage. The mechanism of failures is discussed, and recommendations for a pre-soldering bake are suggested based on analysis of the moisture characteristics of materials used in the capacitors' design.
Zhuang, Jiancang; Ogata, Yosihiko
2006-04-01
The space-time epidemic-type aftershock sequence model is a stochastic branching process in which earthquake activity is classified into background and clustering components and each earthquake triggers other earthquakes independently according to certain rules. This paper gives the probability distributions associated with the largest event in a cluster and their properties for all three cases when the process is subcritical, critical, and supercritical. One of the direct uses of these probability distributions is to evaluate the probability of an earthquake being a foreshock, and the magnitude distributions of foreshocks and nonforeshock earthquakes. To verify these theoretical results, the Japan Meteorological Agency earthquake catalog is analyzed. The proportion of events that have one or more larger descendants among all events is found to be as high as about 15%. When the differences between background events and triggered events in the behavior of triggering children are considered, a background event has a probability of about 8% of being a foreshock. This probability decreases when the magnitude of the background event increases. These results, obtained from a complicated clustering model, where the characteristics of background events and triggered events are different, are consistent with the results obtained in [Ogata, Geophys. J. Int. 127, 17 (1996)] by using the conventional single-linked cluster declustering method.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Elhadj, Selim; Yoo, Jae-hyuck; Negres, Raluca A.
The optical damage performance of electrically conductive gallium nitride (GaN) and indium tin oxide (ITO) films is addressed using large area, high power laser beam exposures at 1064 nm sub-bandgap wavelength. Analysis of the laser damage process assumes that onset of damage (threshold) is determined by the absorption and heating of a nanoscale region of a characteristic size reaching a critical temperature. We use this model to rationalize semi-quantitatively the pulse width scaling of the damage threshold from picosecond to nanosecond timescales, along with the pulse width dependence of the damage threshold probability derived by fitting large beam damage density data. Multi-shot exposures were used to address lifetime performance degradation described by an empirical expression based on the single exposure damage model. A damage threshold degradation of at least 50% was observed for both materials. Overall, the GaN films tested had 5-10× higher optical damage thresholds than the ITO films tested for comparable transmission and electrical conductivity. This route to optically robust, large aperture transparent electrodes and power optoelectronics may thus involve use of next generation widegap semiconductors such as GaN.
Ragusa-Netto, J
2014-11-01
Seed predation has major effects on the reproductive success of individuals, spatial patterns of populations, genetic variability, interspecific interactions and ultimately the diversity of tree communities. At a Brazilian savanna, I evaluated the proportional crop loss of Eriotheca gracilipes due to the Blue-Fronted Amazon (Amazona aestiva) during a fruiting period. Also, I analyzed the relationship between proportional crop loss to Amazons and both fruit crop size and the distance from the nearest damaged conspecific. Trees produced from 1 to 109 fruits, and Amazons foraged more often on trees bearing larger fruit crops, while seldom visiting less productive trees. Moreover, the relationship between fruit crop sizes and the number of depredated fruits was significant. However, when only damaged trees were assessed, I found a negative and significant relation between fruit crop size and proportional crop loss to Blue-Fronted Amazons. Taking this into account as a measure more directly related to the probability of seed survival, a negative density dependent effect emerged. Also, Amazons similarly damaged the fruit crops of either close or distant neighboring damaged trees. Hence, although Blue-Fronted Amazons searched for E. gracilipes trees bearing large fruit crops, they were swamped by the presence of more fruits than they could eat. Moderate seed predation by Blue-Fronted Amazons, either at trees with large fruit crops or in areas where fruiting trees were aggregated, implies an enhanced probability of E. gracilipes seed survival and consequent regeneration success.
Subsurface damage distribution in the lapping process.
Wang, Zhuo; Wu, Yulie; Dai, Yifan; Li, Shengyi
2008-04-01
To systematically investigate the influence of lapping parameters on subsurface damage (SSD) depth and characterize the damage feature comprehensively, maximum depth and distribution of SSD generated in the optical lapping process were measured with the magnetorheological finishing wedge technique. Then, an interaction of adjacent indentations was applied to interpret the generation of maximum depth of SSD. Eventually, the lapping procedure based on the influence of lapping parameters on the material removal rate and SSD depth was proposed to improve the lapping efficiency.
Autonomous diagnostics and prognostics of signal and data distribution systems
NASA Astrophysics Data System (ADS)
Blemel, Kenneth G.
2001-07-01
Wiring is the nervous system of any complex system and is attached to or services nearly every subsystem. Damage to optical wiring systems can cause serious interruptions in communication, command and control systems. Electrical wiring faults and failures due to opens, shorts, and arcing probably result in adverse effects to the systems serviced by the wiring. Abnormalities in a system usually can be detected by monitoring some wiring parameter such as vibration, data activity or power consumption. This paper introduces the mapping of wiring to critical functions during system engineering to automatically define the Failure Modes Effects and Criticality Analysis. This mapping can be used to define the sensory processes needed to perform diagnostics during system engineering. This paper also explains the use of Operational Modes and Criticality Effects Analysis in the development of Sentient Wiring Systems as a means for diagnostic, prognostics and health management of wiring in aerospace and transportation systems.
New nonlinear features for inspection, robotics, and face recognition
NASA Astrophysics Data System (ADS)
Casasent, David P.; Talukder, Ashit
1999-10-01
Classification of real-time X-ray images of randomly oriented touching pistachio nuts is discussed. The ultimate objective is the development of a system for automated non-invasive detection of defective product items on a conveyor belt. We discuss the extraction of new features that allow better discrimination between damaged and clean items (pistachio nuts). This feature extraction and classification stage is the new aspect of this paper; our new maximum representation and discriminating feature (MRDF) extraction method computes nonlinear features that are used as inputs to a new modified k nearest neighbor classifier. In this work, the MRDF is applied to standard features (rather than iconic data). The MRDF is robust to various probability distributions of the input class and is shown to provide good classification and new ROC (receiver operating characteristic) data. Other applications of these new feature spaces in robotics and face recognition are also noted.
Validation of extremes within the Perfect-Predictor Experiment of the COST Action VALUE
NASA Astrophysics Data System (ADS)
Hertig, Elke; Maraun, Douglas; Wibig, Joanna; Vrac, Mathieu; Soares, Pedro; Bartholy, Judith; Pongracz, Rita; Mares, Ileana; Gutierrez, Jose Manuel; Casanueva, Ana; Alzbutas, Robertas
2016-04-01
Extreme events are of widespread concern due to their damaging consequences for natural and anthropogenic systems. From science to applications, the statistical attributes of rare occurrence and low probability become connected with the socio-economic aspect of strong impact. Specific end-user needs regarding information about extreme events depend on the type of application, but as a common element there is always the need for easily accessible climate change information with a clear description of its uncertainties and limitations. Within the Perfect-Predictor Experiment of the COST Action VALUE, extreme indices modelled from a wide range of downscaling methods are compared to reference indices calculated from observational data. The experiment uses reference data from a selection of 86 weather stations representative of the different climates in Europe. Results are presented for temperature and precipitation extremes and include aspects of the marginal distribution as well as spell-length related aspects.
Optimizing Real-Time Vaccine Allocation in a Stochastic SIR Model
Nguyen, Chantal; Carlson, Jean M.
2016-01-01
Real-time vaccination following an outbreak can effectively mitigate the damage caused by an infectious disease. However, in many cases, available resources are insufficient to vaccinate the entire at-risk population, logistics result in delayed vaccine deployment, and the interaction between members of different cities facilitates a wide spatial spread of infection. Limited vaccine, time delays, and interaction (or coupling) of cities lead to tradeoffs that impact the overall magnitude of the epidemic. These tradeoffs mandate investigation of optimal strategies that minimize the severity of the epidemic by prioritizing allocation of vaccine to specific subpopulations. We use an SIR model to describe the disease dynamics of an epidemic which breaks out in one city and spreads to another. We solve a master equation to determine the resulting probability distribution of the final epidemic size. We then identify tradeoffs between vaccine, time delay, and coupling, and we determine the optimal vaccination protocols resulting from these tradeoffs. PMID:27043931
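A stochastic-simulation counterpart of the final-size calculation described above: a Gillespie-style realisation for a single city with no vaccination or coupling. The population size and rates are illustrative, and the paper itself solves a master equation rather than simulating.

```python
import numpy as np

rng = np.random.default_rng(0)

def final_size(n=200, beta=1.5, gamma=1.0, i0=1):
    """One stochastic SIR realisation; returns the final epidemic size R(inf)."""
    s, i, r = n - i0, i0, 0
    while i > 0:
        rate_inf = beta * s * i / n      # infection propensity
        rate_rec = gamma * i             # recovery propensity
        if rng.random() < rate_inf / (rate_inf + rate_rec):
            s, i = s - 1, i + 1          # next event is an infection
        else:
            i, r = i - 1, r + 1          # next event is a recovery
    return r

sizes = np.array([final_size() for _ in range(2000)])
final_size_pmf = np.bincount(sizes, minlength=201) / sizes.size
print("mean final size:", sizes.mean())
print("P(final size <= 10):", final_size_pmf[:11].sum())
```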
Histopathology of ventilator-associated pneumonia (VAP) and its clinical implications.
Torres, A; Fábregas, N; Arce, Y; López-Boado, M A
1999-01-01
Ventilator-associated pneumonia (VAP) is a diffuse polymicrobial and dynamic process, with heterogeneous distribution of lesions, showing different degrees of histological evolution predominating in the dependent lung zones, in which microbiology and histology can be dissociated. This might explain why blind endobronchial techniques to collect respiratory secretions have similar accuracy compared to visually guided samples, explaining the difficulties in validating any methods for its diagnosis. In the clinical setting the association of acute lung injury (ALI) and pneumonia is controversial. However, it is rare to detect diffuse alveolar damage (DAD) in absence of histological signs of pneumonia, probably evidencing that ALI favors the development of pneumonia. Histopathologically, it is difficult to distinguish initial and resolution phases of DAD from pneumonia and vice versa. On the other hand, there is a clear relationship between antimicrobial treatment and the decreased lung bacterial burden which strengthens the importance of distal airway sampling before starting antibiotic therapy.
NASA Astrophysics Data System (ADS)
Taylor, Gabriel James
The failure of electrical cables exposed to severe thermal fire conditions is a safety concern for operating commercial nuclear power plants (NPPs). The Nuclear Regulatory Commission (NRC) has promoted the use of risk-informed and performance-based methods for fire protection, which resulted in a need to develop realistic methods to quantify the risk of fire to NPP safety. Recent electrical cable testing has been conducted to provide empirical data on the failure modes and likelihood of fire-induced damage. This thesis evaluated numerous aspects of the data. Circuit characteristics affecting fire-induced electrical cable failure modes have been evaluated. In addition, thermal failure temperatures corresponding to cable functional failures have been evaluated to develop realistic single-point thermal failure thresholds and probability distributions for specific cable insulation types. Finally, the data were used to evaluate the prediction capabilities of a one-dimensional conductive heat transfer model used to predict cable failure.
Assessment of risk due to the use of carbon fiber composites in commercial and general aviation
NASA Technical Reports Server (NTRS)
Fiksel, J.; Rosenfield, D.; Kalelkar, A.
1980-01-01
The development of a national risk profile for the total annual aircraft losses due to carbon fiber composite (CFC) usage through 1993 is discussed. The profile was developed using separate simulation methods for commercial and general aviation aircraft. A Monte Carlo method which was used to assess the risk in commercial aircraft is described. The method projects the potential usage of CFC through 1993, investigates the incidence of commercial aircraft fires, models the potential release and dispersion of carbon fibers from a fire, and estimates potential economic losses due to CFC damaging electronic equipment. The simulation model for the general aviation aircraft is described. The model emphasizes variations in facility locations and release conditions, estimates distribution of CFC released in general aviation aircraft accidents, and tabulates the failure probabilities and aggregate economic losses in the accidents.
Dynamics of combined forest damage risks for 21st century (SRES A1B, B1)
NASA Astrophysics Data System (ADS)
Panferov, Oleg; Merklein, Johannes; Sogachev, Andrey; Junghans, Udo; Ahrends, Bernd
2010-05-01
The ongoing climate change can result in an increasing frequency of weather extremes (Leckebusch et al., 2008), which in turn can produce wide-area forest damage (windthrows, droughts, insect attacks) within forest ecosystems in Europe. The probability and extent of damage depend not only on the strength of the driving force itself but especially on combinations of contributing agents and their interactions with forest ecosystem structure and soil properties. The combined effect of several factors that are not extremes themselves can lead to biotic and/or abiotic damage, so that the combination becomes an extreme event. As soon as a damage event occurs, the forest structure is changed. The changes in forest structure in turn strengthen or inhibit the influence of different climatic factors, thus increasing or decreasing the probability of the next damage event and creating positive or negative feedbacks. To assess the roles of separate meteorological factors and their combinations in forest damage under present and future climatic conditions, a coupled model was created at the University of Goettingen as part of a Decision Support System (Jansen et al., 2008; Panferov et al., 2009). The model combines the 3D ABL model SCADIS (Panferov and Sogachev, 2008) with the modified soil hydrology model BROOK 90 (Federer, 2003; Ahrends et al., 2009) and a model of climate-dependent biotic damage. Projections of future forest damage events in the 21st century were carried out under the conditions of SRES scenarios A1B and B1; present conditions were evaluated using the measured data of the German Weather Service. Climate scenario data of the coupled ECHAM5-MPIOM were downscaled by the regional climate model Climate Local Model (CLM) to a spatial resolution of 0.2° x 0.2° and a temporal resolution of 24 hours. Using these data as input, small-scale coupled process-based modeling was then carried out for the example region of Solling, Germany, calculating the water and energy balance of forest ecosystems, wind loading on trees, and biotic damage for several tree species and typical soil types. The damage risk of a certain forest stand on a given soil results from daily combinations of air and soil temperatures, soil water characteristics, static and gust wind loads on trees with dynamic LAI, and soil texture. Some damaged stands show higher vulnerability and thus positive feedbacks to climate forcing (Vygodskaya et al., 2007). Therefore, changes of microclimate in the remaining stands after changes in forest structure are taken into account. Model output is aggregated to 30-year periods and compared to the "present conditions" of 1981-2010. The results show a considerable increase of both biotic and abiotic risks towards 2100 relative to "present", caused by weak changes in precipitation and wind patterns and a strong increase of mean air and soil temperatures. It is shown, e.g., that wind-damage-induced changes of structure and microclimate provide a positive feedback, i.e., they increase the probability of the next damage event. The study was financed by BMBF within the framework of the joint project "Decision Support System - Forest and Climate Change" (DSS-WuK) and by a grant of the Ministry for Science and Culture of Lower Saxony ("KLIFF"). We gratefully acknowledge this support.
Khan, Hafiz; Saxena, Anshul; Perisetti, Abhilash; Rafiq, Aamrin; Gabbidon, Kemesha; Mende, Sarah; Lyuksyutova, Maria; Quesada, Kandi; Blakely, Summre; Torres, Tiffany; Afesse, Mahlet
2016-12-01
Background: Breast cancer is a worldwide public health concern and is the most prevalent type of cancer in women in the United States. This study concerned the best fit of statistical probability models on the basis of survival times for nine state cancer registries: California, Connecticut, Georgia, Hawaii, Iowa, Michigan, New Mexico, Utah, and Washington. Materials and Methods: A probability random sampling method was applied to select and extract records of 2,000 breast cancer patients from the Surveillance Epidemiology and End Results (SEER) database for each of the nine state cancer registries used in this study. EasyFit software was utilized to identify the best probability models by using goodness of fit tests, and to estimate parameters for various statistical probability distributions that fit survival data. Results: Statistical analysis for the summary of statistics is reported for each of the states for the years 1973 to 2012. Kolmogorov-Smirnov, Anderson-Darling, and Chi-squared goodness of fit test values were used for survival data, the highest values of goodness of fit statistics being considered indicative of the best fit survival model for each state. Conclusions: It was found that California, Connecticut, Georgia, Iowa, New Mexico, and Washington followed the Burr probability distribution, while the Dagum probability distribution gave the best fit for Michigan and Utah, and Hawaii followed the Gamma probability distribution. These findings highlight differences between states through selected sociodemographic variables and also demonstrate probability modeling differences in breast cancer survival times. The results of this study can be used to guide healthcare providers and researchers for further investigations into social and environmental factors in order to reduce the occurrence of and mortality due to breast cancer.
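A sketch of the distribution-selection step using placeholder survival times and SciPy fits; the study itself used EasyFit on SEER records, the candidate families below are assumptions, and the ranking uses the usual smaller-is-better convention for the Kolmogorov-Smirnov statistic.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Placeholder survival times in months (not SEER data).
survival_months = rng.gamma(shape=2.0, scale=30.0, size=2000)

# Assumed candidate families: SciPy's burr is Burr Type III (Dagum-related),
# burr12 is Burr Type XII; gamma and lognormal are included for contrast.
candidates = [stats.burr, stats.burr12, stats.gamma, stats.lognorm]

results = []
for dist in candidates:
    params = dist.fit(survival_months)                       # maximum-likelihood fit
    ks = stats.kstest(survival_months, dist.name, args=params)
    results.append((dist.name, ks.statistic))

# Smaller KS statistic = closer agreement between fitted model and data.
for name, stat in sorted(results, key=lambda t: t[1]):
    print(f"{name:8s} KS = {stat:.4f}")
```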
Bouchard, Kristofer E.; Ganguli, Surya; Brainard, Michael S.
2015-01-01
The majority of distinct sensory and motor events occur as temporally ordered sequences with rich probabilistic structure. Sequences can be characterized by the probability of transitioning from the current state to upcoming states (forward probability), as well as the probability of having transitioned to the current state from previous states (backward probability). Despite the prevalence of probabilistic sequencing of both sensory and motor events, the Hebbian mechanisms that mold synapses to reflect the statistics of experienced probabilistic sequences are not well understood. Here, we show through analytic calculations and numerical simulations that Hebbian plasticity (correlation, covariance, and STDP) with pre-synaptic competition can develop synaptic weights equal to the conditional forward transition probabilities present in the input sequence. In contrast, post-synaptic competition can develop synaptic weights proportional to the conditional backward probabilities of the same input sequence. We demonstrate that to stably reflect the conditional probability of a neuron's inputs and outputs, local Hebbian plasticity requires balance between competitive learning forces that promote synaptic differentiation and homogenizing learning forces that promote synaptic stabilization. The balance between these forces dictates a prior over the distribution of learned synaptic weights, strongly influencing both the rate at which structure emerges and the entropy of the final distribution of synaptic weights. Together, these results demonstrate a simple correspondence between the biophysical organization of neurons, the site of synaptic competition, and the temporal flow of information encoded in synaptic weights by Hebbian plasticity while highlighting the utility of balancing learning forces to accurately encode probability distributions, and prior expectations over such probability distributions. PMID:26257637
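A toy numerical illustration of the forward-probability case: a correlation-type Hebbian update with presynaptic (row-wise) weight normalization applied to a sequence generated by an assumed 3-state Markov chain. The chain, learning rate, and normalization scheme are illustrative choices, not the paper's exact model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed 3-state Markov chain generating the experienced sequence (illustrative).
P = np.array([[0.1, 0.6, 0.3],
              [0.5, 0.2, 0.3],
              [0.2, 0.3, 0.5]])

states = [0]
for _ in range(100_000):
    states.append(rng.choice(3, p=P[states[-1]]))

# Correlation-type Hebbian potentiation with presynaptic (row-wise) competition:
# after each update the outgoing weights of the active presynaptic unit are
# renormalised so that they compete for a fixed total weight.
W = np.full((3, 3), 1.0 / 3.0)
eta = 0.01
for pre, post in zip(states[:-1], states[1:]):
    W[pre, post] += eta
    W[pre] /= W[pre].sum()

print("learned weights (rows approach the forward transition probabilities):")
print(np.round(W, 2))
print("true transition matrix:")
print(P)
```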
Ramírez-Zamora, Serafín; Méndez-Rodríguez, Miguel L.; Olguín-Martínez, Marisela; Sánchez-Sevilla, Lourdes; Quintana-Quintana, Miguel; García-García, Norberto; Hernández-Muñoz, Rolando
2013-01-01
Diabetes mellitus (DM) is a worldwide disease characterized by metabolic disturbances, frequently associated with high risk of atherosclerosis and renal and nervous system damage. Here, we assessed whether metabolites reflecting oxidative redox state, arginine and nitric oxide metabolism, are differentially distributed between serum and red blood cells (RBC), and whether significant metabolism of arginine exists in RBC. In 90 patients with type 2 DM without regular treatment for diabetes and 90 healthy controls, paired by age and gender, we measured serum and RBC levels of malondialdehyde (MDA), nitrites, ornithine, citrulline, and urea. In isolated RBC, metabolism of L-[14C]-arginine was also determined. In both groups, nitrites were equally distributed in serum and RBC; citrulline predominated in serum, whereas urea, arginine, and ornithine were found mainly in RBC. DM patients showed hyperglycemia and increased blood HbA1C, and increased levels of these metabolites, except for arginine, significantly correlating with blood glucose levels. RBC were observed to be capable of catabolizing arginine to ornithine, citrulline and urea, which was increased in RBC from DM patients, and correlated with an increased affinity for arginine in the activities of putative RBC arginase (Km = 0.23±0.06 vs. 0.50±0.13 mM, in controls) and nitric oxide synthase (Km = 0.28±0.06 vs. 0.43±0.09 mM, in controls). In conclusion, our results suggest that DM alters metabolite distribution between serum and RBC, demonstrating that RBC regulate serum levels of metabolites which affect nitrogen metabolism, not only by transporting them but also by metabolizing amino acids such as arginine. Moreover, we confirmed that urea can be produced also by human RBC besides hepatocytes, being much more evident in RBC from patients with type 2 DM. These events are probably involved in the specific physiopathology of this disease, i.e., endothelial damage and dysfunction. PMID:23826148
NASA Astrophysics Data System (ADS)
Christen, Alejandra; Escarate, Pedro; Curé, Michel; Rial, Diego F.; Cassetti, Julia
2016-10-01
Aims: Knowing the distribution of stellar rotational velocities is essential for understanding stellar evolution. Because we measure the projected rotational speed v sin I, we need to solve an ill-posed problem given by a Fredholm integral of the first kind to recover the "true" rotational velocity distribution. Methods: After discretization of the Fredholm integral we apply the Tikhonov regularization method to obtain directly the probability distribution function for stellar rotational velocities. We propose a simple and straightforward procedure to determine the Tikhonov parameter. We applied Monte Carlo simulations to prove that the Tikhonov method is a consistent estimator and asymptotically unbiased. Results: This method is applied to a sample of cluster stars. We obtain confidence intervals using a bootstrap method. Our results are in close agreement with those obtained using the Lucy method for recovering the probability density distribution of rotational velocities. Furthermore, Lucy estimation lies inside our confidence interval. Conclusions: Tikhonov regularization is a highly robust method that deconvolves the rotational velocity probability density function from a sample of v sin I data directly without the need for any convergence criteria.
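A compact sketch of the deconvolution step: discretize the Fredholm kernel relating the true speed v to the projected speed y = v sin i (for random orientations, p(y|v) = y / (v sqrt(v^2 - y^2)) for y < v, stated here as the assumed forward model), then solve the Tikhonov-regularized normal equations. The grid, the "true" distribution, and the regularization parameter are illustrative, and the parameter-selection and bootstrap steps of the paper are omitted.

```python
import numpy as np

v_grid = np.linspace(1.0, 400.0, 200)     # true rotational speeds (km/s)
y_grid = v_grid.copy()                    # projected speeds v*sin(i)
dv = v_grid[1] - v_grid[0]

# Discretised Fredholm kernel K[a, b] ~ p(y_a | v_b) * dv.
K = np.zeros((y_grid.size, v_grid.size))
for a, y in enumerate(y_grid):
    for b, v in enumerate(v_grid):
        if y < v:
            K[a, b] = y / (v * np.sqrt(v**2 - y**2)) * dv

# Synthetic "observed" projected distribution g from an assumed true distribution.
f_true = np.exp(-0.5 * ((v_grid - 150.0) / 40.0) ** 2)
f_true /= f_true.sum() * dv
g = K @ f_true

# Tikhonov solution: minimise ||K f - g||^2 + lam * ||f||^2.
lam = 1e-3
f_hat = np.linalg.solve(K.T @ K + lam * np.eye(v_grid.size), K.T @ g)
print("recovered peak near v =", v_grid[np.argmax(f_hat)], "km/s")
```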
Modified Spectral Fatigue Methods for S-N Curves With MIL-HDBK-5J Coefficients
NASA Technical Reports Server (NTRS)
Irvine, Tom; Larsen, Curtis
2016-01-01
The rainflow method is used for counting fatigue cycles from a stress response time history, where the fatigue cycles are stress-reversals. The rainflow method allows the application of Palmgren-Miner's rule in order to assess the fatigue life of a structure subject to complex loading. The fatigue damage may also be calculated from a stress response power spectral density (PSD) using the semi-empirical Dirlik, Single Moment, Zhao-Baker and other spectral methods. These methods effectively assume that the PSD has a corresponding time history which is stationary with a normal distribution. This paper shows how the probability density function for rainflow stress cycles can be extracted from each of the spectral methods. This extraction allows for the application of the MIL-HDBK-5J fatigue coefficients in the cumulative damage summation. A numerical example is given in this paper for the stress response of a beam undergoing random base excitation, where the excitation is applied separately by a time history and by its corresponding PSD. The fatigue calculation is performed in the time domain, as well as in the frequency domain via the modified spectral methods. The result comparison shows that the modified spectral methods give comparable results to the time domain rainflow counting method.
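The cumulative-damage bookkeeping that both the time-domain and spectral routes feed into can be sketched as follows; the Basquin-form S-N coefficients and the rainflow cycle counts are placeholders, not MIL-HDBK-5J values.

```python
# Palmgren-Miner damage sum for rainflow-counted stress cycles against a Basquin-type
# S-N curve N(S) = (S / A)**(-b); A and b below are hypothetical coefficients.
A_COEFF, B_EXP = 1000.0, 9.0            # stress in MPa

def cycles_to_failure(stress_amplitude_mpa):
    return (stress_amplitude_mpa / A_COEFF) ** (-B_EXP)

# Hypothetical rainflow output: (stress amplitude in MPa, counted cycles) pairs.
rainflow_cycles = [(120.0, 5000), (200.0, 800), (350.0, 20)]

damage = sum(n / cycles_to_failure(s) for s, n in rainflow_cycles)
print(f"Miner damage index D = {damage:.3e}  (failure is predicted as D approaches 1)")
```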
Passive imaging based multi-cue hazard detection spacecraft safe landing
NASA Technical Reports Server (NTRS)
Huertas, Andres; Cheng, Yang; Madison, Richard
2006-01-01
Accurate assessment of potentially damaging ground hazards during the spacecraft EDL (Entry, Descent and Landing) phase is crucial to ensure a high probability of safe landing. A lander that encounters a large rock, falls off a cliff, or tips over on a steep slope can sustain mission-ending damage. Guided entry is expected to shrink landing ellipses from 100-300 km to ~10 km radius for the second generation landers as early as 2009. Regardless of size and location, however, landing ellipses will almost always contain hazards such as craters, discontinuities, steep slopes, and large rocks. It is estimated that an MSL (Mars Science Laboratory)-sized lander should detect and avoid 16-150 m diameter craters, and vertical drops similar to the edges of a 16 m or 3.75 m diameter crater, for high- and low-altitude HDA (Hazard Detection and Avoidance), respectively. It should also be able to detect slopes 20° or steeper, and rocks 0.75 m or taller. In this paper we present a passive imaging based, multi-cue hazard detection and avoidance (HDA) system suitable for Martian and other lander missions. This is the first passively imaged HDA system that seamlessly integrates multiple algorithms (crater detection, slope estimation, rock detection, and texture analysis) and multiple cues (crater morphology, rock distribution) to detect these hazards in real time.
Phelps, G.A.
2008-01-01
This report describes some simple spatial statistical methods to explore the relationships of scattered points to geologic or other features, represented by points, lines, or areas. It also describes statistical methods to search for linear trends and clustered patterns within the scattered point data. Scattered points are often contained within irregularly shaped study areas, necessitating the use of methods largely unexplored in the point pattern literature. The methods take advantage of the power of modern GIS toolkits to numerically approximate the null hypothesis of randomly located data within an irregular study area. Observed distributions can then be compared with the null distribution of a set of randomly located points. The methods are non-parametric and are applicable to irregularly shaped study areas. Patterns within the point data are examined by comparing the distribution of the orientation of the set of vectors defined by each pair of points within the data with the equivalent distribution for a random set of points within the study area. A simple model is proposed to describe linear or clustered structure within scattered data. A scattered data set of damage to pavement and pipes, recorded after the 1989 Loma Prieta earthquake, is used as an example to demonstrate the analytical techniques. The damage is found to be preferentially located nearer a set of mapped lineaments than randomly scattered damage, suggesting range-front faulting along the base of the Santa Cruz Mountains is related to both the earthquake damage and the mapped lineaments. The damage data also exhibit two non-random patterns: a single cluster of damage centered in the town of Los Gatos, California, and a linear alignment of damage along the range front of the Santa Cruz Mountains, California. The linear alignment of damage is strongest between 45° and 50° northwest. This agrees well with the mean trend of the mapped lineaments, measured as 49° northwest.
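A minimal sketch of the Monte Carlo null comparison described above, assuming a toy irregular study area and comparing pairwise vector orientations of an observed point set against repeated sets of randomly located points; all geometry and counts are placeholders for the GIS-based workflow.

```python
import numpy as np

rng = np.random.default_rng(0)

def in_study_area(x, y):
    # Placeholder irregular study area (an L-shaped region); a real analysis would
    # test membership in the GIS polygon of the mapped area instead.
    return 0.0 <= x <= 10.0 and 0.0 <= y <= 10.0 and not (x > 5.0 and y > 5.0)

def random_points(n):
    pts = []
    while len(pts) < n:                        # rejection sampling from the bounding box
        x, y = rng.uniform(0.0, 10.0, size=2)
        if in_study_area(x, y):
            pts.append((x, y))
    return np.array(pts)

def pair_orientations(pts):
    d = pts[:, None, :] - pts[None, :, :]
    ang = np.degrees(np.arctan2(d[..., 1], d[..., 0])) % 180.0
    return ang[np.triu_indices(len(pts), k=1)]

observed = random_points(60)                   # stand-in for the mapped damage locations

obs_hist, _ = np.histogram(pair_orientations(observed), bins=18, range=(0, 180))
null_hists = np.array([np.histogram(pair_orientations(random_points(60)),
                                    bins=18, range=(0, 180))[0] for _ in range(200)])

# Orientation bins where the observed count exceeds the 95th percentile of the null
# counts would indicate a preferred (linear) alignment in the point pattern.
print(obs_hist > np.percentile(null_hists, 95, axis=0))
```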
Quantitative assessment of building fire risk to life safety.
Guanquan, Chu; Jinhua, Sun
2008-06-01
This article presents a quantitative risk assessment framework for evaluating fire risk to life safety. Fire risk is divided into two parts: probability and corresponding consequence of every fire scenario. The time-dependent event tree technique is used to analyze probable fire scenarios based on the effect of fire protection systems on fire spread and smoke movement. To obtain the variation of occurrence probability with time, Markov chain is combined with a time-dependent event tree for stochastic analysis on the occurrence probability of fire scenarios. To obtain consequences of every fire scenario, some uncertainties are considered in the risk analysis process. When calculating the onset time to untenable conditions, a range of fires are designed based on different fire growth rates, after which uncertainty of onset time to untenable conditions can be characterized by probability distribution. When calculating occupant evacuation time, occupant premovement time is considered as a probability distribution. Consequences of a fire scenario can be evaluated according to probability distribution of evacuation time and onset time of untenable conditions. Then, fire risk to life safety can be evaluated based on occurrence probability and consequences of every fire scenario. To express the risk assessment method in detail, a commercial building is presented as a case study. A discussion compares the assessment result of the case study with fire statistics.
NASA Astrophysics Data System (ADS)
Iskandar, I.
2018-03-01
The exponential distribution is the most widely used distribution in reliability analysis. It is very suitable for representing the lengths of life in many cases and is available in a simple statistical form. The characteristic of this distribution is a constant hazard rate. The exponential distribution is a special case of the Weibull distributions. In this paper our effort is to introduce the basic notions that constitute an exponential competing risks model in reliability analysis using a Bayesian analysis approach and to present the analytic methods. The cases are limited to models with independent causes of failure. A non-informative prior distribution is used in our analysis. We describe the likelihood function, followed by the description of the posterior function and the estimation of point values, intervals, the hazard function, and reliability. The net probability of failure if only one specific risk is present, the crude probability of failure due to a specific risk in the presence of other causes, and partial crude probabilities are also included.
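For mutually independent exponential causes, the net, crude, and overall survival quantities mentioned above have simple closed forms; the sketch below evaluates them for illustrative failure rates (a Bayesian treatment would place a posterior on each rate rather than fixing it).

```python
import numpy as np

# Independent exponential competing risks with illustrative rates (per unit time).
lam = np.array([0.01, 0.03, 0.06])
t = 10.0
total = lam.sum()

net = 1.0 - np.exp(-lam * t)                        # failure prob. if only cause k acted
crude = (lam / total) * (1.0 - np.exp(-total * t))  # prob. of failing from cause k by t
reliability = np.exp(-total * t)                    # prob. of surviving all causes to t

print("net probabilities  :", np.round(net, 4))
print("crude probabilities:", np.round(crude, 4))
print("reliability R(t)   :", round(float(reliability), 4))
```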
USDA-ARS?s Scientific Manuscript database
Armyworm, a destructive insect pest of maize, has caused a wide range of damage in both China and the U.S. in recent years. To obtain the spatial distribution of the damaged area and assess the damage severity, a fast and accurate loss assessment method is of great importance for effective management. This study, t...
On probability-possibility transformations
NASA Technical Reports Server (NTRS)
Klir, George J.; Parviz, Behzad
1992-01-01
Several probability-possibility transformations are compared in terms of the closeness of preserving second-order properties. The comparison is based on experimental results obtained by computer simulation. Two second-order properties are involved in this study: noninteraction of two distributions and projections of a joint distribution.
Theoretical size distribution of fossil taxa: analysis of a null model.
Reed, William J; Hughes, Barry D
2007-03-22
This article deals with the theoretical size distribution (of number of sub-taxa) of a fossil taxon arising from a simple null model of macroevolution. New species arise through speciations occurring independently and at random at a fixed probability rate, while extinctions either occur independently and at random (background extinctions) or cataclysmically. In addition new genera are assumed to arise through speciations of a very radical nature, again assumed to occur independently and at random at a fixed probability rate. The size distributions of the pioneering genus (following a cataclysm) and of derived genera are determined. Also the distribution of the number of genera is considered along with a comparison of the probability of a monospecific genus with that of a monogeneric family.
Newton/Poisson-Distribution Program
NASA Technical Reports Server (NTRS)
Bowerman, Paul N.; Scheuer, Ernest M.
1990-01-01
NEWTPOIS, one of two computer programs making calculations involving cumulative Poisson distributions. NEWTPOIS (NPO-17715) and CUMPOIS (NPO-17714) used independently of one another. NEWTPOIS determines Poisson parameter for given cumulative probability, from which one obtains percentiles for gamma distributions with integer shape parameters and percentiles for chi-squared (χ2) distributions with even degrees of freedom. Used by statisticians and others concerned with probabilities of independent events occurring over specific units of time, area, or volume. Program written in C.
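A short Python sketch (not the original C program) of the computation NEWTPOIS performs: solving the cumulative Poisson equation for the parameter by Newton's method, using the identity that the derivative of P(X <= k; λ) with respect to λ is -P(X = k; λ). The starting value and tolerance are illustrative.

```python
from math import exp, factorial

def poisson_cdf(k, lam):
    return sum(exp(-lam) * lam**i / factorial(i) for i in range(k + 1))

def newton_poisson(k, p, lam0=1.0, tol=1e-10, max_iter=100):
    """Solve P(X <= k; lam) = p for the Poisson parameter lam by Newton's method."""
    lam = lam0
    for _ in range(max_iter):
        f = poisson_cdf(k, lam) - p
        fprime = -exp(-lam) * lam**k / factorial(k)   # derivative of the Poisson CDF in lam
        step = f / fprime
        lam = max(lam - step, 1e-12)                  # keep the parameter positive
        if abs(step) < tol:
            break
    return lam

# Example: the Poisson mean for which at most 3 events occur with probability 0.95.
print(newton_poisson(k=3, p=0.95))
```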
Damage identification in highway bridges using distribution factors
NASA Astrophysics Data System (ADS)
Gangone, Michael V.; Whelan, Matthew J.
2017-04-01
The U.S. infrastructure system is well behind the needs of the 21st century and in dire need of improvements. The American Society of Civil Engineers (ASCE) graded America's Infrastructure as a "D+" in its recent 2013 Report Card. Bridges are a major component of the infrastructure system and were awarded a "C+". Nearly 25 percent of the nation's bridges are categorized as deficient by the Federal Highway Administration (FHWA). Most bridges were designed with an expected service life of roughly 50 years, and today the average age of a bridge is 42 years. Finding alternative methods of condition assessment that capture the true performance of the bridge is of high importance. This paper discusses the monitoring of two multi-girder/stringer bridges at different ages of service life. Normal strain measurements were used to calculate the load distribution factor at the midspan of the bridge under controlled loading conditions. Controlled progressive damage was implemented on one of the superstructures to determine if the damage could be detected using the distribution factor. An uncertainty analysis, based on the accuracy and precision of the normal strain measurement, was undertaken to determine how effective it is to use the distribution factor measurement as a damage indicator. The analysis indicates that this load testing parameter may be an effective measure for detecting damage.
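As a sketch of the load-test parameter in question, the girder distribution factor can be taken as each girder's share of the total midspan strain under a controlled load; the strain readings below are purely illustrative.

```python
# Midspan strain readings for each girder under a controlled truck load (microstrain);
# the values are hypothetical, not measurements from the monitored bridges.
girder_microstrain = [48.0, 95.0, 120.0, 88.0, 41.0]

total = sum(girder_microstrain)
distribution_factors = [eps / total for eps in girder_microstrain]

for i, df in enumerate(distribution_factors, start=1):
    print(f"Girder {i}: DF = {df:.3f}")

# A shift in a girder's DF between load tests that exceeds the propagated
# strain-measurement uncertainty is the candidate damage indicator.
```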
Probability theory for 3-layer remote sensing radiative transfer model: univariate case.
Ben-David, Avishai; Davidson, Charles E
2012-04-23
A probability model for a 3-layer radiative transfer model (foreground layer, cloud layer, background layer, and an external source at the end of line of sight) has been developed. The 3-layer model is fundamentally important as the primary physical model in passive infrared remote sensing. The probability model is described by the Johnson family of distributions that are used as a fit for theoretically computed moments of the radiative transfer model. From the Johnson family we use the SU distribution that can address a wide range of skewness and kurtosis values (in addition to addressing the first two moments, mean and variance). In the limit, SU can also describe lognormal and normal distributions. With the probability model one can evaluate the potential for detecting a target (vapor cloud layer), the probability of observing thermal contrast, and evaluate performance (receiver operating characteristics curves) in clutter-noise limited scenarios. This is (to our knowledge) the first probability model for the 3-layer remote sensing geometry that treats all parameters as random variables and includes higher-order statistics. © 2012 Optical Society of America
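A small illustration of the flexibility claimed for the S_U family, using SciPy's johnsonsu distribution; the shape-parameter values are arbitrary, and this is not the paper's moment-matching procedure.

```python
from scipy import stats

# Sweep a few assumed (a, b) shape-parameter pairs of Johnson S_U and report the
# first four standardised moments, showing the range of skewness/kurtosis covered.
for a, b in [(0.0, 2.0), (-1.0, 1.5), (2.0, 1.0)]:
    mean, var, skew, kurt = stats.johnsonsu.stats(a, b, moments="mvsk")
    print(f"a={a:+.1f}, b={b:.1f}: mean={float(mean):+.2f}, var={float(var):.2f}, "
          f"skew={float(skew):+.2f}, excess kurtosis={float(kurt):.2f}")
```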
Sampling probability distributions of lesions in mammograms
NASA Astrophysics Data System (ADS)
Looney, P.; Warren, L. M.; Dance, D. R.; Young, K. C.
2015-03-01
One approach to image perception studies in mammography using virtual clinical trials involves the insertion of simulated lesions into normal mammograms. To facilitate this, a method has been developed that allows for sampling of lesion positions across the cranio-caudal and medio-lateral radiographic projections in accordance with measured distributions of real lesion locations. 6825 mammograms from our mammography image database were segmented to find the breast outline. The outlines were averaged and smoothed to produce an average outline for each laterality and radiographic projection. Lesions in 3304 mammograms with malignant findings were mapped on to a standardised breast image corresponding to the average breast outline using piecewise affine transforms. A four dimensional probability distribution function was found from the lesion locations in the cranio-caudal and medio-lateral radiographic projections for calcification and noncalcification lesions. Lesion locations sampled from this probability distribution function were mapped on to individual mammograms using a piecewise affine transform which transforms the average outline to the outline of the breast in the mammogram. The four dimensional probability distribution function was validated by comparing it to the two dimensional distributions found by considering each radiographic projection and laterality independently. The correlation of the location of the lesions sampled from the four dimensional probability distribution function across radiographic projections was shown to match the correlation of the locations of the original mapped lesion locations. The current system has been implemented as a web-service on a server using the Python Django framework. The server performs the sampling, performs the mapping and returns the results in a javascript object notation format.
NASA Technical Reports Server (NTRS)
Schuecker, Clara; Davila, Carlos G.; Rose, Cheryl A.
2010-01-01
Five models for matrix damage in fiber reinforced laminates are evaluated for matrix-dominated loading conditions under plane stress and are compared both qualitatively and quantitatively. The emphasis of this study is on a comparison of the response of embedded plies subjected to a homogeneous stress state. Three of the models are specifically designed for modeling the non-linear response due to distributed matrix cracking under homogeneous loading, and also account for non-linear (shear) behavior prior to the onset of cracking. The remaining two models are localized damage models intended for predicting local failure at stress concentrations. The modeling approaches of distributed vs. localized cracking as well as the different formulations of damage initiation and damage progression are compared and discussed.
Crack surface roughness in three-dimensional random fuse networks
NASA Astrophysics Data System (ADS)
Nukala, Phani Kumar V. V.; Zapperi, Stefano; Šimunović, Srđan
2006-08-01
Using large system sizes with extensive statistical sampling, we analyze the scaling properties of crack roughness and damage profiles in the three-dimensional random fuse model. The analysis of damage profiles indicates that damage accumulates in a diffusive manner up to the peak load, and localization sets in abruptly at the peak load, starting from a uniform damage landscape. The global crack width scales as W ~ L^0.5 and is consistent with the scaling of the localization length ξ ~ L^0.5 used in the data collapse of damage profiles in the postpeak regime. This consistency between the global crack roughness exponent and the postpeak damage profile localization length supports the idea that the postpeak damage profile is predominantly due to the localization produced by the catastrophic failure, which at the same time results in the formation of the final crack. Finally, the crack width distributions can be collapsed for different system sizes and follow a log-normal distribution.
Engineering Design Handbook: Recoilless Rifle Weapon Systems.
1976-01-15
rifles also creates danger of physiological damage… psychological factors which may be significant… The extent and nature of the damage caused… the intensity of the blast, its impulse, the position of the subject… disruption often causes psychological effects of extreme lethargy and… General. 6. AD 351 905, Capt. L. R. Creelman; 2. David E. Walters and Edith F. Reilly, Parametric Study of the Probability of Hitting
When Violence Pays: A Cost-Benefit Analysis of Aggressive Behavior in Animals and Humans
Georgiev, Alexander V.; Klimczuk, Amanda C. E.; Traficonte, Daniel M.
2013-01-01
An optimization analysis of human behavior from a comparative perspective can improve our understanding of the adaptiveness of human nature. Intra-specific competition for resources provides the main selective pressure for the evolution of violent aggression toward conspecifics, and variation in the fitness benefits and costs of aggression can account for inter-specific and inter-individual differences in aggressiveness. When aggression reflects competition for resources, its benefits vary in relation to the characteristics of the resources (their intrinsic value, abundance, spatial distribution, and controllability) while its costs vary in relation to the characteristics of organisms and how they fight (which, in turn, affects the extent to which aggression entails risk of physical injury or death, energetic depletion, exposure to predation, psychological and physiological stress, or damage to social relationships). Humans are a highly aggressive species in comparison to other animals, probably as a result of an unusually high benefit-to-cost ratio for intra-specific aggression. This conclusion is supported by frequent and widespread occurrence of male-male coalitionary killing and by male-female sexual coercion. Sex differences in violent aggression in humans and other species probably evolved by sexual selection and reflect different optimal competitive strategies for males and females. PMID:23864299
NASA Astrophysics Data System (ADS)
Petrie, E. S.; Evans, J. P.; Richey, D.; Flores, S.; Barton, C.; Mozley, P.
2015-12-01
Sedimentary rocks in the San Rafael Swell, Utah, were deformed by Laramide compression and subsequent Neogene extension. We evaluate the effect of fault damage zone morphology as a function of structural position, and of changes in mechanical stratigraphy, on the distribution of secondary minerals across the reservoir-seal pair of the Navajo Sandstone and overlying Carmel Formation. We decipher paleo-fluid migration and examine the effect faults and fractures have on reservoir permeability and on the efficacy of the top seal for a range of geo-engineering applications. Map-scale faults have an increased probability of allowing upward migration of fluids along the fault plane and within the damage zone, potentially bypassing the top seal. Field mapping, mesoscopic structural analyses, petrography, and geochemical observations demonstrate that fault zone thickness increases at structural intersections, fault relay zones, fault-related folds, and fault tips. Higher densities of faults with meters of slip and dense fracture populations are present in relay zones relative to single, discrete faults. Curvature analysis of the San Rafael monocline and fracture density data show that fracture density is highest where curvature is highest, in the syncline hinge and near faults. Fractures cross the reservoir-seal interface where fracture density is highest, and structural diagenesis includes bleaching and calcite and gypsum mineralization. The link between fracture distributions and structural setting implies that transmissive fractures have predictable orientations and density distributions. At the m to cm scale, deformation-band faults and joints in the Navajo Sandstone penetrate the reservoir-seal interface and transition into open-mode fractures in the caprock seal. Scanline analysis and petrography of veins provide evidence for subsurface mineralization and fracture reactivation, suggesting that the fractures act as loci for fluid flow through time. Heterolithic caprock seals with variable fracture distributions and morphology highlight the strong link between the variation in material properties and the response to changing stress conditions. The variable connectivity of fractures and the changes in fracture density play a critical role in subsurface fluid flow.
The debt of nations and the distribution of ecological impacts from human activities
Srinivasan, U. Thara; Carey, Susan P.; Hallstein, Eric; Higgins, Paul A. T.; Kerr, Amber C.; Koteen, Laura E.; Smith, Adam B.; Watson, Reg; Harte, John; Norgaard, Richard B.
2008-01-01
As human impacts to the environment accelerate, disparities in the distribution of damages between rich and poor nations mount. Globally, environmental change is dramatically affecting the flow of ecosystem services, but the distribution of ecological damages and their driving forces has not been estimated. Here, we conservatively estimate the environmental costs of human activities over 1961–2000 in six major categories (climate change, stratospheric ozone depletion, agricultural intensification and expansion, deforestation, overfishing, and mangrove conversion), quantitatively connecting costs borne by poor, middle-income, and rich nations to specific activities by each of these groups. Adjusting impact valuations for different standards of living across the groups as commonly practiced, we find striking imbalances. Climate change and ozone depletion impacts predicted for low-income nations have been overwhelmingly driven by emissions from the other two groups, a pattern also observed for overfishing damages indirectly driven by the consumption of fishery products. Indeed, through disproportionate emissions of greenhouse gases alone, the rich group may have imposed climate damages on the poor group greater than the latter's current foreign debt. Our analysis provides prima facie evidence for an uneven distribution pattern of damages across income groups. Moreover, our estimates of each group's share in various damaging activities are independent from controversies in environmental valuation methods. In a world increasingly connected ecologically and economically, our analysis is thus an early step toward reframing issues of environmental responsibility, development, and globalization in accordance with ecological costs. PMID:18212119
How to model a negligible probability under the WTO sanitary and phytosanitary agreement?
Powell, Mark R
2013-06-01
Since the 1997 EC--Hormones decision, World Trade Organization (WTO) Dispute Settlement Panels have wrestled with the question of what constitutes a negligible risk under the Sanitary and Phytosanitary Agreement. More recently, the 2010 WTO Australia--Apples Panel focused considerable attention on the appropriate quantitative model for a negligible probability in a risk assessment. The 2006 Australian Import Risk Analysis for Apples from New Zealand translated narrative probability statements into quantitative ranges. The uncertainty about a "negligible" probability was characterized as a uniform distribution with a minimum value of zero and a maximum value of 10⁻⁶. The Australia--Apples Panel found that the use of this distribution would tend to overestimate the likelihood of "negligible" events and indicated that a triangular distribution with a most probable value of zero and a maximum value of 10⁻⁶ would correct the bias. The Panel observed that the midpoint of the uniform distribution is 5 × 10⁻⁷ but did not consider that the triangular distribution has an expected value of 3.3 × 10⁻⁷. Therefore, if this triangular distribution is the appropriate correction, the magnitude of the bias found by the Panel appears modest. The Panel's detailed critique of the Australian risk assessment, and the conclusions of the WTO Appellate Body about the materiality of flaws found by the Panel, may have important implications for the standard of review for risk assessments under the WTO SPS Agreement. © 2012 Society for Risk Analysis.
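The arithmetic behind the "modest bias" point above can be checked directly: the mean of the uniform(0, 10⁻⁶) distribution versus the mean of a triangular distribution with mode 0 and maximum 10⁻⁶. The short sketch below uses scipy only for illustration.

```python
# Hedged sketch reproducing the simple comparison discussed above.
from scipy import stats

upper = 1e-6
uniform_mean = stats.uniform(loc=0.0, scale=upper).mean()           # 5.0e-7
triangular_mean = stats.triang(c=0.0, loc=0.0, scale=upper).mean()  # (0 + 0 + 1e-6)/3 ~ 3.3e-7

print(f"uniform mean:    {uniform_mean:.2e}")
print(f"triangular mean: {triangular_mean:.2e}")
```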
Purcell, Jeremy J.; Rapp, Brenda
2013-01-01
Previous research has shown that damage to the neural substrates of orthographic processing can lead to functional reorganization during reading (Tsapkini et al., 2011); in this research we ask if the same is true for spelling. To examine the functional reorganization of spelling networks we present a novel three-stage Individual Peak Probability Comparison (IPPC) analysis approach for comparing the activation patterns obtained during fMRI of spelling in a single brain-damaged individual with dysgraphia to those obtained in a set of non-impaired control participants. The first analysis stage characterizes the convergence in activations across non-impaired control participants by applying a technique typically used for characterizing activations across studies: Activation Likelihood Estimate (ALE) (Turkeltaub et al., 2002). This method was used to identify locations that have a high likelihood of yielding activation peaks in the non-impaired participants. The second stage provides a characterization of the degree to which the brain-damaged individual's activations correspond to the group pattern identified in Stage 1. This involves performing a Mahalanobis distance statistics analysis (Tsapkini et al., 2011) that compares each of a control group's peak activation locations to the nearest peak generated by the brain-damaged individual. The third stage evaluates the extent to which the brain-damaged individual's peaks are atypical relative to the range of individual variation among the control participants. This IPPC analysis allows for a quantifiable, statistically sound method for comparing an individual's activation pattern to the patterns observed in a control group and, thus, provides a valuable tool for identifying functional reorganization in a brain-damaged individual with impaired spelling. Furthermore, this approach can be applied more generally to compare any individual's activation pattern with that of a set of other individuals. PMID:24399981
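A minimal sketch of the Stage-2 idea described above: for each control-group peak location, compute the Mahalanobis distance to the nearest peak produced by the single brain-damaged individual. The peak coordinates and covariance below are illustrative placeholders, not data from the study.

```python
# Hedged sketch of a nearest-peak Mahalanobis distance comparison.
import numpy as np
from scipy.spatial.distance import mahalanobis

rng = np.random.default_rng(2)
control_peaks = rng.normal(loc=[40, -60, 10], scale=6.0, size=(12, 3))   # MNI-like mm coords
individual_peaks = rng.normal(loc=[44, -58, 14], scale=6.0, size=(5, 3))

# Covariance of peak locations across controls defines the distance metric.
cov = np.cov(control_peaks, rowvar=False)
cov_inv = np.linalg.inv(cov)

nearest_d = []
for peak in control_peaks:
    d = [mahalanobis(peak, ind, cov_inv) for ind in individual_peaks]
    nearest_d.append(min(d))

print("Mahalanobis distance to nearest individual peak, per control peak:")
print(np.round(nearest_d, 2))
```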
Fourier Method for Calculating Fission Chain Neutron Multiplicity Distributions
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chambers, David H.; Chandrasekaran, Hema; Walston, Sean E.
2017-03-27
Here, a new way of utilizing the fast Fourier transform is developed to compute the probability distribution for a fission chain to create n neutrons. We then extend this technique to compute the probability distributions for detecting n neutrons. Lastly, our technique can be used for fission chains initiated by either a single neutron inducing a fission or by the spontaneous fission of another isotope.
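A hedged sketch of the FFT building block only: the distribution of the total number of neutrons emitted by k independent fissions is the k-fold convolution of the single-fission multiplicity distribution, obtained by multiplying FFTs. The full fission-chain calculation in the paper composes probability generating functions, which this sketch does not reproduce, and the multiplicity values below are illustrative rather than evaluated nuclear data.

```python
# Hedged sketch: k-fold convolution of a multiplicity distribution via the FFT.
import numpy as np

# Illustrative single-fission neutron multiplicity distribution P(nu = n).
p_nu = np.array([0.03, 0.15, 0.32, 0.30, 0.15, 0.04, 0.01])
p_nu = p_nu / p_nu.sum()

k = 4                                   # number of independent fissions
n_max = k * (len(p_nu) - 1)             # largest possible neutron count
size = n_max + 1

# k-fold convolution via the FFT: transform, raise to the k-th power, invert.
spectrum = np.fft.rfft(p_nu, n=size)
p_total = np.fft.irfft(spectrum ** k, n=size)
p_total = np.clip(p_total, 0.0, None)
p_total /= p_total.sum()

print("P(total neutrons = n) for", k, "independent fissions:")
for n, p in enumerate(p_total):
    print(f"  n={n:2d}  p={p:.4f}")
```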
[Computer diagnosis of traumatic impact by hepatic lesion].
Kimbar, V I; Sevankeev, V V
2007-01-01
A method of computer-assisted diagnosis of traumatic injury based on liver damage (the HEPAR-test program) is described. The program calculates diagnostic coefficients using Bayes' probability method combined with Wald's recognition procedure.
ERIC Educational Resources Information Center
Duerdoth, Ian
2009-01-01
The subject of uncertainties (sometimes called errors) is traditionally taught (to first-year science undergraduates) towards the end of a course on statistics that defines probability as the limit of many trials, and discusses probability distribution functions and the Gaussian distribution. We show how to introduce students to the concepts of…
Dose-rate effect of ultrashort electron beam radiation on DNA damage and repair in vitro.
Babayan, Nelly; Hovhannisyan, Galina; Grigoryan, Bagrat; Grigoryan, Ruzanna; Sarkisyan, Natalia; Tsakanova, Gohar; Haroutiunian, Samvel; Aroutiounian, Rouben
2017-11-01
Laser-generated electron beams are distinguished from conventional accelerated particles by ultrashort beam pulses in the femtoseconds to picoseconds duration range, and their application may elucidate primary radiobiological effects. The aim of the present study was to determine the dose-rate effect of laser-generated ultrashort pulses of 4 MeV electron beam radiation on DNA damage and repair in human cells. The dose rate was increased via changing the pulse repetition frequency, without increasing the electron energy. The human chronic myeloid leukemia K-562 cell line was used to estimate the DNA damage and repair after irradiation, via the comet assay. A distribution analysis of the DNA damage was performed. The same mean level of initial DNA damages was observed at low (3.6 Gy/min) and high (36 Gy/min) dose-rate irradiation. In the case of low-dose-rate irradiation, the detected DNA damages were completely repairable, whereas the high-dose-rate irradiation demonstrated a lower level of reparability. The distribution analysis of initial DNA damages after high-dose-rate irradiation revealed a shift towards higher amounts of damage and a broadening in distribution. Thus, increasing the dose rate via changing the pulse frequency of ultrafast electrons leads to an increase in the complexity of DNA damages, with a consequent decrease in their reparability. Since the application of an ultrashort pulsed electron beam permits us to describe the primary radiobiological effects, it can be assumed that the observed dose-rate effect on DNA damage/repair is mainly caused by primary lesions appearing at the moment of irradiation. © The Author 2017. Published by Oxford University Press on behalf of The Japan Radiation Research Society and Japanese Society for Radiation Oncology.
Robust approaches to quantification of margin and uncertainty for sparse data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hund, Lauren; Schroeder, Benjamin B.; Rumsey, Kelin
Characterizing the tails of probability distributions plays a key role in quantification of margins and uncertainties (QMU), where the goal is characterization of low probability, high consequence events based on continuous measures of performance. When data are collected using physical experimentation, probability distributions are typically fit using statistical methods based on the collected data, and these parametric distributional assumptions are often used to extrapolate about the extreme tail behavior of the underlying probability distribution. In this project, we characterize the risk associated with such tail extrapolation. Specifically, we conducted a scaling study to demonstrate the large magnitude of the risk; then, we developed new methods for communicating risk associated with tail extrapolation from unvalidated statistical models; lastly, we proposed a Bayesian data-integration framework to mitigate tail extrapolation risk through integrating additional information. We conclude that decision-making using QMU is a complex process that cannot be achieved using statistical analyses alone.
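A hedged illustration, not the report's method, of why tail extrapolation from an unvalidated parametric fit is risky: a normal distribution fitted to a small sample from a heavier-tailed population can badly underestimate a low-probability, high-consequence exceedance.

```python
# Hedged sketch of tail-extrapolation risk with sparse data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
true_dist = stats.t(df=3, loc=0.0, scale=1.0)      # heavier-tailed "truth"
sample = true_dist.rvs(size=30, random_state=rng)  # sparse data, as in QMU settings

mu, sigma = stats.norm.fit(sample)                  # unvalidated parametric assumption
threshold = 6.0                                     # performance threshold far in the tail

p_fit = stats.norm(mu, sigma).sf(threshold)
p_true = true_dist.sf(threshold)
print(f"fitted-normal exceedance prob: {p_fit:.2e}")
print(f"true exceedance prob:          {p_true:.2e}")
print(f"underestimation factor:        {p_true / p_fit:.1f}x")
```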
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen, Hang, E-mail: hangchen@mit.edu; Thill, Peter; Cao, Jianshu
In biochemical systems, intrinsic noise may drive the system to switch from one stable state to another. We investigate how kinetic switching between stable states in a bistable network is influenced by dynamic disorder, i.e., fluctuations in the rate coefficients. Using the geometric minimum action method, we first investigate the optimal transition paths and the corresponding minimum actions based on a genetic toggle switch model in which reaction coefficients draw from a discrete probability distribution. For the continuous probability distribution of the rate coefficient, we then consider two models of dynamic disorder in which reaction coefficients undergo different stochastic processes with the same stationary distribution. In one, the kinetic parameters follow a discrete Markov process and in the other they follow continuous Langevin dynamics. We find that regulation of the parameters modulating the dynamic disorder, as has been demonstrated to occur through allosteric control in bistable networks in the immune system, can be crucial in shaping the statistics of optimal transition paths, transition probabilities, and the stationary probability distribution of the network.
NASA Astrophysics Data System (ADS)
Jenkins, Colleen; Jordan, Jay; Carlson, Jeff
2007-02-01
This paper presents parameter estimation techniques useful for detecting background changes in a video sequence with extreme foreground activity. A specific application of interest is automated detection of the covert placement of threats (e.g., a briefcase bomb) inside crowded public facilities. We propose that a histogram of pixel intensity acquired from a fixed mounted camera over time for a series of images will be a mixture of two Gaussian functions: the foreground probability distribution function and background probability distribution function. We will use Pearson's Method of Moments to separate the two probability distribution functions. The background function can then be "remembered" and changes in the background can be detected. Subsequent comparisons of background estimates are used to detect changes. Changes are flagged to alert security forces to the presence and location of potential threats. Results are presented that indicate the significant potential for robust parameter estimation techniques as applied to video surveillance.
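A minimal sketch of the mixture idea above: pixel intensities collected over time at one location form a two-Gaussian mixture (background plus foreground). The paper separates the components with Pearson's method of moments; in the sketch below, EM fitting via scikit-learn stands in for that step, and the data are synthetic.

```python
# Hedged sketch: separate background/foreground Gaussians in a pixel's
# temporal intensity histogram (EM used as a stand-in for method of moments).
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(4)

# Synthetic temporal intensity samples for one pixel: mostly background
# (mean 90), occasionally occluded by foreground activity (mean 160).
background = rng.normal(90.0, 5.0, size=700)
foreground = rng.normal(160.0, 20.0, size=300)
intensities = np.concatenate([background, foreground]).reshape(-1, 1)

gmm = GaussianMixture(n_components=2, random_state=0).fit(intensities)
bg_idx = int(np.argmin(gmm.means_.ravel()))         # darker, tighter mode = background here
print(f"estimated background mean: {gmm.means_.ravel()[bg_idx]:.1f}")
print(f"estimated background std:  {np.sqrt(gmm.covariances_.ravel()[bg_idx]):.1f}")
print(f"background weight:         {gmm.weights_[bg_idx]:.2f}")
```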
NASA Astrophysics Data System (ADS)
He, Jingjing; Wang, Dengjiang; Zhang, Weifang
2015-03-01
This study presents an experimental and modeling study for damage detection and quantification in riveted lap joints. Embedded lead zirconate titanate piezoelectric (PZT) ceramic wafer-type sensors are employed to perform in-situ non-destructive testing during fatigue cyclical loading. A multi-feature integration method is developed to quantify the crack size using signal features of correlation coefficient, amplitude change, and phase change. In addition, a probability of detection (POD) model is constructed to quantify the reliability of the developed sizing method. Using the developed crack size quantification method and the resulting POD curve, probabilistic fatigue life prediction can be performed to provide comprehensive information for decision-making. The effectiveness of the overall methodology is demonstrated and validated using several aircraft lap joint specimens from different manufacturers and under different loading conditions.
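A hedged sketch of a probability-of-detection (POD) curve of the kind referred to above, fitted to synthetic hit/miss data with a logistic model in log crack size; the crack sizes, outcomes and the 90%-detectable size a90 are illustrative, not results from the study.

```python
# Hedged sketch: fit POD(a) = logistic(b0 + b1*ln a) to hit/miss data.
import numpy as np
from scipy.optimize import minimize
from scipy.special import expit

rng = np.random.default_rng(5)
crack_mm = rng.uniform(0.2, 5.0, size=200)
true_pod = expit(3.0 * (np.log(crack_mm) - np.log(1.0)))     # hidden "true" curve
detected = rng.random(200) < true_pod                        # hit/miss outcomes

def neg_log_lik(params):
    b0, b1 = params
    p = expit(b0 + b1 * np.log(crack_mm))
    p = np.clip(p, 1e-12, 1 - 1e-12)
    return -np.sum(detected * np.log(p) + (~detected) * np.log(1 - p))

fit = minimize(neg_log_lik, x0=[0.0, 1.0])
b0, b1 = fit.x
a90 = np.exp((np.log(0.9 / 0.1) - b0) / b1)   # size detected with 90% probability
print(f"fitted POD(a) = logistic({b0:.2f} + {b1:.2f} ln a);  a90 ~ {a90:.2f} mm")
```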
DOE Office of Scientific and Technical Information (OSTI.GOV)
Warpinski, N.R.; Branagan, P.; Sattler, A.R.
This paper is a case study of the stimulation and testing of tight lenticular sands in the paludal interval of the Mesaverde Group in the Piceance basin at DOE's Multiwell Experiment site in Colorado. Prestimulation data acquired include (1) geologic studies which delineate the size and shape of the lenses, (2) detailed core reservoir/rock property data, (3) stress test data showing the vertical distribution of stress through the interval, (4) drawdown, buildup, and interference tests which provide in situ reservoir properties, and (5) laboratory data on frac fluid invasion and damage. Stimulation of the zone was conducted in two phases. The first phase consisted of step-rate/flowback tests and two minifracs in which pre-frac design information was obtained. The main stimulation was the second phase and it consisted of 75,000 gal (284 m³) of cross-linked HPG and 193,000 lbs. (87,545 kg) of sand. Bottomhole pressure and temperature, as well as all surface data, were obtained and borehole geophones were used for fracture diagnostics. Detailed pre-frac, between-frac, and post-frac well test programs were conducted in conjunction with the stimulation. Severe cleanup problems and remedial actions are documented. The analyses and diagnostics give some indication of fracture geometry in this lenticular environment. Complications such as high treatment pressures and frac-fluid damage are detailed. Gas production was decreased (at least over short test periods) by both phases of the treatment; this was probably due to damage to the natural fractures which did not have sufficient time to clean up. 14 references, 10 figures, 1 table.
Modelling electron distributions within ESA's Gaia satellite CCD pixels to mitigate radiation damage
NASA Astrophysics Data System (ADS)
Seabroke, G. M.; Holland, A. D.; Burt, D.; Robbins, M. S.
2009-08-01
The Gaia satellite is a high-precision astrometry, photometry and spectroscopic ESA cornerstone mission, currently scheduled for launch in 2012. Its primary science drivers are the composition, formation and evolution of the Galaxy. Gaia will achieve its unprecedented positional accuracy requirements with detailed calibration and correction for radiation damage. At L2, protons cause displacement damage in the silicon of CCDs. The resulting traps capture and emit electrons from passing charge packets in the CCD pixel, distorting the image PSF and biasing its centroid. Microscopic models of Gaia's CCDs are being developed to simulate this effect. The key to calculating the probability of an electron being captured by a trap is the 3D electron density within each CCD pixel. However, this has not been physically modelled for the Gaia CCD pixels. In Seabroke, Holland & Cropper (2008), the first paper of this series, we motivated the need for such specialised 3D device modelling and outlined how its future results will fit into Gaia's overall radiation calibration strategy. In this paper, the second of the series, we present our first results using Silvaco's physics-based, engineering software: the ATLAS device simulation framework. Inputting a doping profile, pixel geometry and materials into ATLAS and comparing the results to other simulations reveals that ATLAS has a free parameter, fixed oxide charge, that needs to be calibrated. ATLAS is successfully benchmarked against other simulations and measurements of a test device, identifying how to use it to model Gaia pixels and highlighting the effect of different doping approximations.
A new method to assess damage to RCMRFs from period elongation and Park-Ang damage index using IDA
NASA Astrophysics Data System (ADS)
Aghagholizadeh, Mehrdad; Massumi, Ali
2016-09-01
Despite significant progress in loading and design codes for seismic-resistant structures and technological improvements in building structures, the field of civil engineering still faces critical challenges. An example of those challenges is the assessment of the state of damage imposed on a structure after earthquakes of different intensities. Quick assessment of damage after an earthquake is crucial for determining the operability of a structure and its resistance to probable future earthquakes. Present methods for calculating damage to structures are time consuming and do not accurately provide the extent of damage. Damage estimation is an important task in the fields of structural health monitoring and decision-making. This study examines the relationship between period elongation and the Park-Ang damage index. A dynamic non-linear analysis is employed with the IDARC program to calculate the amount of damage and the period in the current state. This new method is shown to be a quick and accurate technique for damage assessment. It is easy to calculate the period of an existing structure, and changes in the period reflect changes in the stiffness matrix.
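For reference, the Park-Ang damage index in its usual form combines a deformation term and a hysteretic-energy term. The sketch below states that form and evaluates it for illustrative numbers; the member properties are placeholders, not values from the study.

```python
# Hedged sketch: the Park-Ang damage index in its usual form,
#   DI = delta_m/delta_u + beta * E_h / (F_y * delta_u),
# where delta_m is the maximum deformation demand, delta_u the ultimate
# deformation capacity, E_h the dissipated hysteretic energy, F_y the yield
# strength, and beta a calibration constant.
def park_ang_damage_index(delta_m, delta_u, e_hyst, f_y, beta=0.05):
    """Combined deformation + hysteretic-energy damage index."""
    return delta_m / delta_u + beta * e_hyst / (f_y * delta_u)

# Example: a member pushed to 60% of its ultimate deformation capacity that
# has also dissipated some hysteretic energy (illustrative units: m, kN, kN*m).
di = park_ang_damage_index(delta_m=0.06, delta_u=0.10,
                           e_hyst=25.0, f_y=300.0, beta=0.05)
print(f"Park-Ang damage index: {di:.2f}")   # values near 1.0 are usually read as collapse
```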
Elhadj, Selim; Yoo, Jae-hyuck; Negres, Raluca A.; ...
2016-12-19
The optical damage performance of electrically conductive gallium nitride (GaN) and indium tin oxide (ITO) films is addressed using large area, high power laser beam exposures at 1064 nm sub-bandgap wavelength. Analysis of the laser damage process assumes that onset of damage (threshold) is determined by the absorption and heating of a nanoscale region of a characteristic size reaching a critical temperature. We use this model to rationalize semi-quantitatively the pulse width scaling of the damage threshold from picosecond to nanosecond timescales, along with the pulse width dependence of the damage threshold probability derived by fitting large beam damage density data. Multi-shot exposures were used to address lifetime performance degradation described by an empirical expression based on the single exposure damage model. A damage threshold degradation of at least 50% was observed for both materials. Overall, the GaN films tested had 5-10 × higher optical damage thresholds than the ITO films tested for comparable transmission and electrical conductivity. This route to optically robust, large aperture transparent electrodes and power optoelectronics may thus involve use of next generation widegap semiconductors such as GaN.
p-adic stochastic hidden variable model
NASA Astrophysics Data System (ADS)
Khrennikov, Andrew
1998-03-01
We propose a stochastic hidden variable model in which the hidden variables have a p-adic probability distribution ρ(λ) while, at the same time, the conditional probability distributions P(U,λ), U=A,A',B,B', are ordinary probabilities defined on the basis of the Kolmogorov measure-theoretical axiomatics. The frequency definition of p-adic probability is quite similar to the ordinary frequency definition of probability: p-adic frequency probability is defined as the limit of relative frequencies νₙ, but in the p-adic metric. We study a model with p-adic stochastics at the level of the hidden-variables description. Responses of macroapparatuses, of course, have to be described by ordinary stochastics. Thus our model describes a mixture of p-adic stochastics of the microworld and ordinary stochastics of macroapparatuses. In this model the probabilities for physical observables are ordinary probabilities. At the same time, Bell's inequality is violated.
Hughes, Kevin A
2003-08-01
Factors affecting fecal microorganism survival and distribution in the Antarctic marine environment include solar radiation, water salinity, temperature, sea ice conditions, and fecal input by humans and local wildlife populations. This study assessed the influence of these factors on the distribution of presumptive fecal coliforms around Rothera Point, Adelaide Island, Antarctic Peninsula during the austral summer and winter of February 1999 to September 1999. Each factor had a different degree of influence depending on the time of year. In summer (February), although the station population was high, presumptive fecal coliform concentrations were low, probably due to the biologically damaging effects of solar radiation. However, summer algal blooms reduced penetration of solar radiation into the water column. By early winter (April), fecal coliform concentrations were high, due to increased fecal input by migrant wildlife, while solar radiation doses were low. By late winter (September), fecal coliform concentrations were high near the station sewage outfall, as sea ice formation limited solar radiation penetration into the sea and prevented wind-driven water circulation near the outfall. During this study, environmental factors masked the effect of station population numbers on sewage plume size. If sewage production increases throughout the Antarctic, environmental factors may become less significant and effective sewage waste management will become increasingly important. These findings highlight the need for year-round monitoring of fecal coliform distribution in Antarctic waters near research stations to produce realistic evaluations of sewage pollution persistence and dispersal.
Snow fracture: From micro-cracking to global failure
NASA Astrophysics Data System (ADS)
Capelli, Achille; Reiweger, Ingrid; Schweizer, Jürg
2017-04-01
Slab avalanches are caused by a crack forming and propagating in a weak layer within the snow cover, which eventually causes the detachment of the overlying cohesive slab. The gradual damage process leading to the nucleation of the initial failure is still not entirely understood. Therefore, we studied the damage process preceding snow failure by analyzing the acoustic emissions (AE) generated by bond failure or micro-cracking. The AE allow studying the ongoing progressive failure in a non-destructive way. We performed fully load-controlled failure experiments on snow samples presenting a weak layer and recorded the generated AE. The size and frequency of the generated AE increased before failure, revealing an acceleration of the damage process with increased size and frequency of damage and/or microscopic cracks. The AE energy was power-law distributed and the exponent (b-value) decreased approaching failure. The waiting time followed an exponential distribution with an exponential coefficient λ that increased before failure. The decrease of the b-value and the increase of λ correspond to a change in the event distribution statistics, indicating a transition from homogeneously distributed uncorrelated damage producing mostly small AE to localized damage, which causes larger correlated events and leads to brittle failure. We observed brittle failure for the fast experiment and a more ductile behavior for the slow experiments. This rate dependence was also reflected in the AE signature. In the slow experiments the b-value and λ were almost constant, and the energy rate increase was moderate, indicating that the damage process was in a stable state, suggesting that the damage and healing processes were balanced. On a shorter time scale, however, the AE parameters varied, indicating that the damage process was not steady but consisted of a sum of small bursts. We assume that the bursts may have been generated by cascades of correlated micro-cracks caused by localization of stresses at a small scale. The healing process may then have prevented the self-organization of this small scale damage and, therefore, the total failure of the sample.
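The two quantities whose drift toward failure is described above can be estimated with standard maximum-likelihood formulas. The sketch below does this on synthetic data standing in for the AE catalogue; the parameter values are illustrative.

```python
# Hedged sketch (synthetic data, not the experiments): MLE of the power-law
# exponent (b-value) of AE energies and of the exponential waiting-time
# coefficient lambda.
import numpy as np

rng = np.random.default_rng(6)

# Synthetic AE energies from a Pareto (power-law) distribution with exponent b.
b_true, e_min = 1.8, 1.0
energies = e_min * (1.0 - rng.random(5000)) ** (-1.0 / (b_true - 1.0))

# Hill / MLE estimator for a continuous power law p(E) ~ E^-b, E >= e_min.
b_hat = 1.0 + energies.size / np.sum(np.log(energies / e_min))
print(f"estimated b-value: {b_hat:.2f} (true {b_true})")

# Synthetic waiting times between AE events, exponentially distributed.
lam_true = 4.0                       # events per second
waits = rng.exponential(1.0 / lam_true, size=5000)
lam_hat = 1.0 / waits.mean()         # MLE for the exponential coefficient
print(f"estimated lambda:  {lam_hat:.2f} 1/s (true {lam_true})")
```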
Stationary properties of maximum-entropy random walks.
Dixit, Purushottam D
2015-10-01
Maximum-entropy (ME) inference of state probabilities using state-dependent constraints is popular in the study of complex systems. In stochastic systems, how state space topology and path-dependent constraints affect ME-inferred state probabilities remains unknown. To that end, we derive the transition probabilities and the stationary distribution of a maximum path entropy Markov process subject to state- and path-dependent constraints. A main finding is that the stationary distribution over states differs significantly from the Boltzmann distribution and reflects a competition between path multiplicity and imposed constraints. We illustrate our results with particle diffusion on a two-dimensional landscape. Connections with the path integral approach to diffusion are discussed.
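In the canonical special case of an unweighted graph with no additional constraints, the maximum path entropy Markov process reduces to the familiar maximal-entropy random walk, whose transition matrix and stationary distribution follow from the Perron eigenpair of the adjacency matrix. The sketch below computes both and contrasts the stationary distribution with the degree-proportional (Boltzmann-like) one; it is an illustration of that special case, not the paper's general constrained construction.

```python
# Hedged sketch: maximal-entropy random walk on a small graph,
#   P_ij = A_ij * psi_j / (lambda * psi_i),  pi_i ~ psi_i**2,
# where (lambda, psi) is the Perron eigenpair of the adjacency matrix A.
import numpy as np

A = np.array([[0, 1, 1, 0, 0],
              [1, 0, 1, 1, 0],
              [1, 1, 0, 1, 1],
              [0, 1, 1, 0, 1],
              [0, 0, 1, 1, 0]], dtype=float)

eigvals, eigvecs = np.linalg.eigh(A)
lam = eigvals[-1]                       # Perron (largest) eigenvalue
psi = np.abs(eigvecs[:, -1])            # Perron eigenvector, taken positive

P = A * psi[np.newaxis, :] / (lam * psi[:, np.newaxis])
pi = psi ** 2 / np.sum(psi ** 2)        # stationary distribution of the MERW

print("row sums of P (should be 1):", np.round(P.sum(axis=1), 6))
print("MERW stationary distribution:", np.round(pi, 3))
print("degree-based distribution:   ", np.round(A.sum(1) / A.sum(), 3))
```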
NASA Astrophysics Data System (ADS)
Győri, Erzsébet; Gráczer, Zoltán; Tóth, László; Bán, Zoltán; Horváth, Tibor
2017-04-01
Liquefaction potential evaluations are generally made to assess the hazard from specific scenario earthquakes. These evaluations may estimate the potential in a binary fashion (yes/no), define a factor of safety, or predict the probability of liquefaction given a scenario event. Usually the level of ground shaking is obtained from the results of PSHA. Although it is determined probabilistically, a single level of ground shaking is selected and used within the liquefaction potential evaluation. In contrast, fully probabilistic liquefaction potential assessment methods provide a complete picture of liquefaction hazard, namely taking into account the joint probability distribution of PGA and magnitude of earthquake scenarios, both of which are key inputs in the stress-based simplified methods. Kramer and Mayfield (2007) developed a fully probabilistic liquefaction potential evaluation method using a performance-based earthquake engineering (PBEE) framework. The results of the procedure are the direct estimate of the return period of liquefaction and the liquefaction hazard curves as a function of depth. The method combines the disaggregation matrices computed for different exceedance frequencies during probabilistic seismic hazard analysis with one of the recent models for the conditional probability of liquefaction. We have developed a software for the assessment of performance-based liquefaction triggering on the basis of the Kramer and Mayfield method. Originally, the SPT-based probabilistic method of Cetin et al. (2004) was built into the procedure of Kramer and Mayfield to compute the conditional probability; however, there is no professional consensus about its applicability. Therefore, we have included not only Cetin's method but also the SPT-based procedure of Idriss and Boulanger (2012) and the CPT-based procedure of Boulanger and Idriss (2014) in our computer program. In 1956, a damaging earthquake of magnitude 5.6 occurred in Dunaharaszti, Hungary. Its epicenter was located about 5 km from the southern boundary of Budapest. The quake caused serious damage in the epicentral area and in the southern districts of the capital. The epicentral area of the earthquake is located along the Danube River. Sand boils were observed in some locations, indicating the occurrence of liquefaction. Because their exact locations were recorded at the time of the earthquake, in situ geotechnical measurements (CPT and SPT) could be performed at two sites (Dunaharaszti and Taksony). The different types of measurements enabled probabilistic liquefaction hazard computations at the two studied sites. We have compared the return periods of liquefaction computed using the different built-in simplified stress-based methods.
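A much-simplified sketch of the performance-based combination described above: the annual rate of liquefaction is the sum of the conditional probability of liquefaction over the joint PGA-magnitude hazard contributions from deaggregation. The hazard increments and the logistic liquefaction model below are illustrative placeholders, not the Cetin or Boulanger-Idriss relations or the Hungarian hazard data.

```python
# Hedged sketch: lambda_liq = sum_i P(liq | a_i, m_i) * d_lambda_i.
import numpy as np

# Illustrative deaggregated hazard: (PGA [g], magnitude, annual rate increment).
deagg = np.array([
    # a,    m,   d_lambda
    [0.05, 5.5, 4.0e-3],
    [0.10, 6.0, 1.5e-3],
    [0.20, 6.5, 4.0e-4],
    [0.30, 7.0, 1.0e-4],
    [0.45, 7.5, 2.0e-5],
])

def p_liq_given_a_m(a, m):
    """Placeholder conditional probability of liquefaction (logistic in a, m)."""
    z = 6.0 * np.log(a) + 1.2 * (m - 6.0) + 8.0
    return 1.0 / (1.0 + np.exp(-z))

rates = p_liq_given_a_m(deagg[:, 0], deagg[:, 1]) * deagg[:, 2]
lam_liq = rates.sum()
print(f"annual rate of liquefaction:   {lam_liq:.2e} 1/yr")
print(f"return period of liquefaction: {1.0 / lam_liq:.0f} yr")
```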
Ogunnaike, Babatunde A; Gelmi, Claudio A; Edwards, Jeremy S
2010-05-21
Gene expression studies generate large quantities of data with the defining characteristic that the number of genes (whose expression profiles are to be determined) exceed the number of available replicates by several orders of magnitude. Standard spot-by-spot analysis still seeks to extract useful information for each gene on the basis of the number of available replicates, and thus plays to the weakness of microarrays. On the other hand, because of the data volume, treating the entire data set as an ensemble, and developing theoretical distributions for these ensembles provides a framework that plays instead to the strength of microarrays. We present theoretical results that under reasonable assumptions, the distribution of microarray intensities follows the Gamma model, with the biological interpretations of the model parameters emerging naturally. We subsequently establish that for each microarray data set, the fractional intensities can be represented as a mixture of Beta densities, and develop a procedure for using these results to draw statistical inference regarding differential gene expression. We illustrate the results with experimental data from gene expression studies on Deinococcus radiodurans following DNA damage using cDNA microarrays. Copyright (c) 2010 Elsevier Ltd. All rights reserved.
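A hedged illustration of the distributional results summarised above: channel intensities drawn from a Gamma model, and the fractional intensity x/(x+y) of two Gamma-distributed channels sharing a scale parameter, which follows a Beta distribution. The parameters are illustrative only, not values fitted to the Deinococcus radiodurans data.

```python
# Hedged sketch: Gamma-distributed intensities and Beta-distributed fractions.
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)

# Gamma-distributed spot intensities for one channel; the fit recovers shape/scale.
shape_true, scale_true = 2.5, 400.0
intensity = rng.gamma(shape_true, scale_true, size=10000)
shape_hat, _, scale_hat = stats.gamma.fit(intensity, floc=0.0)
print(f"fitted Gamma shape={shape_hat:.2f}, scale={scale_hat:.0f}")

# Fractional intensity of two channels with Gamma(a, s) and Gamma(b, s)
# intensities is Beta(a, b); compare the fitted Beta parameters with (a, b).
a, b = 2.5, 3.5
x = rng.gamma(a, scale_true, size=10000)
y = rng.gamma(b, scale_true, size=10000)
frac = x / (x + y)
a_hat, b_hat, _, _ = stats.beta.fit(frac, floc=0.0, fscale=1.0)
print(f"fitted Beta parameters: a={a_hat:.2f}, b={b_hat:.2f} (true {a}, {b})")
```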
Rane, Shruti; Caroselli, Jerome Silvio; Dickinson, Mercedes; Tran, Kim; Kuang, Fanny; Hiscock, Merrill
2016-01-01
The Trail Making Test (TMT), a widely used neuropsychological test, is highly effective in detecting brain damage. A shortcoming of the test is that it requires drawing lines and thus is impractical for use with persons suffering manual impairment. The 3 studies described herein were designed to describe and evaluate a nonmanual Trail Making Test (NMTMT) that would be suitable for use with manually impaired individuals. The NMTMT utilizes color to permit oral reporting of the stimuli constituting a series of numbers (Part A) or alternating series of numbers and letters (Part B). The studies, which involved a total of 200 university students, indicate that the standard TMT and the NMTMT are moderately related to each other and have similar patterns of association and nonassociation with other neuropsychological measures. Participants with scores falling near the bottom of the NMTMT distribution have a high probability of scoring at least 1 standard deviation below the mean of the TMT distribution for Part B. The clinically important relationship of Part A to Part B seems to be retained in the NMTMT. It is concluded that the NMTMT shows promise as a substitute for the TMT when the TMT cannot be used.
de la Fuente, Vicenta; Rodríguez, Nuria; Amils, Ricardo
2012-05-01
Ferritin is of interest at the structural and functional level not only as storage for iron, a critical element, but also as a means to prevent cell damage produced by oxidative stress. The main objective of this work was to confirm by immunocytochemistry the presence and the subcellular distribution of the ferritin detected by Mössbauer spectroscopy in Imperata cylindrica, a plant which accumulates large amounts of iron. The localization of ferritin was performed in epidermal, parenchymal and vascular tissues of shoots and leaves of I. cylindrica. The highest density of immunolabeling in shoots appeared in the intracellular space of cell tissues, near the cell walls and in the cytoplasm. In leaves, ferritin was detected in the proximity of the dense network of the middle lamella of cell walls, following a similar path to that observed in shoots. Immunolabeling was also localized in chloroplasts. The abundance of immunogold labelling in mitochondria for I. cylindrica was rather low, probably because the study dealt with tissues from old plants. These results further expand the localization of ferritin in cell components other than chloroplasts and mitochondria in plants. Copyright © 2011 Elsevier GmbH. All rights reserved.
Distribution of causes of infertility in patients attending primary fertility clinics in Israel.
Farhi, Jacob; Ben-Haroush, Avi
2011-01-01
Infertility is one of the most prevalent health disorders in young adults. To study the distribution of causes of infertility in couples referred to primary infertility clinics in Israel. Data for a 9 year period were derived from two clinics of major women's hospitals run by the country's largest health insurance fund. All patients were treated by one physician. Laparoscopy was not performed to rule out endometriosis. Of the 2515 couples identified, 1991 (79.2%) had a definitive diagnosis following complete workup (including hysterosalpingography). Mean age was 29.6 +/- 6.0 years; mean duration of infertility was 1.7 +/- 1.8 years. Primary infertility accounted for 65% of cases. Causes of infertility were male factor (45%), oligo-ovulation disorders (37%), and tubal damage (18%). Infertility factors were identified in the woman alone in 30.6% of cases and the man alone in 29.2%. Two combined infertility factors were found in 18% of patients, and three combined factors in 0.5%. The rate of unexplained infertility (which probably includes non-tubal endometriosis) was 20.7%. As male factor accounts for almost half of all cases of infertility in couples, sperm analysis is mandatory before any treatment.
NASA Astrophysics Data System (ADS)
Mandal, S.; Choudhury, B. U.
2015-07-01
Sagar Island, situated on the continental shelf of the Bay of Bengal, is one of the deltas most vulnerable to extreme rainfall-driven climatic hazards. Information on the probability of occurrence of maximum daily rainfall will be useful in devising risk management for sustaining the rainfed agrarian economy vis-a-vis food and livelihood security. Using six probability distribution models and long-term (1982-2010) daily rainfall data, we studied the probability of occurrence of annual, seasonal and monthly maximum daily rainfall (MDR) in the island. To select the best-fit distribution models for the annual, seasonal and monthly time series, based on maximum rank with minimum value of the test statistics, three statistical goodness-of-fit tests, viz. the Kolmogorov-Smirnov test (K-S), the Anderson-Darling test (A²) and the chi-square test (χ²), were employed. The best-fit probability distribution was identified from the highest overall score obtained from the three goodness-of-fit tests. Results revealed that the normal probability distribution was best fitted for the annual, post-monsoon and summer season MDR, while the Lognormal, Weibull and Pearson 5 distributions were best fitted for the pre-monsoon, monsoon and winter seasons, respectively. The estimated annual MDR were 50, 69, 86, 106 and 114 mm for return periods of 2, 5, 10, 20 and 25 years, respectively. The probabilities of an annual MDR of >50, >100, >150, >200 and >250 mm were estimated as 99, 85, 40, 12 and 3 % levels of exceedance, respectively. The monsoon, summer and winter seasons exhibited comparatively higher probabilities (78 to 85 %) for MDR of >100 mm and moderate probabilities (37 to 46 %) for >150 mm. For different recurrence intervals, the percent probability of MDR varied widely across intra- and inter-annual periods. In the island, rainfall anomaly can pose a climatic threat to the sustainability of agricultural production and thus needs adequate adaptation and mitigation measures.
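A hedged sketch of the same workflow on synthetic annual maximum daily rainfall data: fit candidate distributions, rank them with a Kolmogorov-Smirnov statistic, and read off return-period values and exceedance probabilities from the best fit. The data and candidate set below are illustrative, not the Sagar Island record.

```python
# Hedged sketch: distribution fitting, goodness-of-fit ranking, return periods.
import numpy as np
from scipy import stats

rng = np.random.default_rng(8)
mdr = rng.normal(70.0, 25.0, size=29).clip(min=5.0)   # 29 years of annual MDR [mm]

candidates = {
    "normal":    stats.norm,
    "lognormal": stats.lognorm,
    "weibull":   stats.weibull_min,
}

fits = {}
for name, dist in candidates.items():
    params = dist.fit(mdr)
    ks = stats.kstest(mdr, dist.name, args=params).statistic
    fits[name] = (params, ks)
    print(f"{name:9s}  KS = {ks:.3f}")

best = min(fits, key=lambda k: fits[k][1])
params, _ = fits[best]
dist = candidates[best]

for T in (2, 5, 10, 20, 25):                          # return periods [years]
    q = dist.ppf(1.0 - 1.0 / T, *params)
    print(f"T = {T:2d} yr  ->  MDR = {q:.0f} mm")
print(f"P(MDR > 100 mm) = {dist.sf(100.0, *params):.2f}")
```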
NASA Astrophysics Data System (ADS)
Osumi, Ayumu; Ito, Youichi
2012-05-01
A fire site holds important information about the cause of fire outbreak; for instance, a concrete wall can provide a wealth of information and the distribution of fire damage of the wall is particularly valuable. If the distribution of fire damage on concrete walls can be used to trace the flow of fire, it would be possible to identify the fire origin and to clarify the cause of fire outbreak. In this study, we considered a new method based on aerial ultrasonic waves and developed a system that adopts this method for detecting fire damage of concrete walls at fire sites.
NASA Astrophysics Data System (ADS)
Lee, Jaeha; Tsutsui, Izumi
2017-05-01
We show that the joint behavior of an arbitrary pair of (generally noncommuting) quantum observables can be described by quasi-probabilities, which are an extended version of the standard probabilities used for describing the outcome of measurement for a single observable. The physical situations that require these quasi-probabilities arise when one considers quantum measurement of an observable conditioned by some other variable, with the notable example being the weak measurement employed to obtain Aharonov's weak value. Specifically, we present a general prescription for the construction of quasi-joint probability (QJP) distributions associated with a given combination of observables. These QJP distributions are introduced in two complementary approaches: one from a bottom-up, strictly operational construction realized by examining the mathematical framework of the conditioned measurement scheme, and the other from a top-down viewpoint realized by applying the results of the spectral theorem for normal operators and their Fourier transforms. It is then revealed that, for a pair of simultaneously measurable observables, the QJP distribution reduces to the unique standard joint probability distribution of the pair, whereas for a noncommuting pair there exists an inherent indefiniteness in the choice of such QJP distributions, admitting a multitude of candidates that may equally be used for describing the joint behavior of the pair. In the course of our argument, we find that the QJP distributions furnish the space of operators in the underlying Hilbert space with their characteristic geometric structures such that the orthogonal projections and inner products of observables can be given statistical interpretations as, respectively, “conditionings” and “correlations”. The weak value Aw for an observable A is then given a geometric/statistical interpretation as either the orthogonal projection of A onto the subspace generated by another observable B, or equivalently, as the conditioning of A given B with respect to the QJP distribution under consideration.
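For reference, the weak value mentioned above is usually written in the Aharonov-Albert-Vaidman form; the notation below (preselected state |ψ⟩, postselected state ⟨φ|) is generic rather than that of the paper.

```latex
A_w \;=\; \frac{\langle \phi \,|\, \hat{A} \,|\, \psi \rangle}{\langle \phi \,|\, \psi \rangle}
```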
Two Universality Properties Associated with the Monkey Model of Zipf's Law
NASA Astrophysics Data System (ADS)
Perline, Richard; Perline, Ron
2016-03-01
The distribution of word probabilities in the monkey model of Zipf's law is associated with two universality properties: (1) the power law exponent converges strongly to -1 as the alphabet size increases and the letter probabilities are specified as the spacings from a random division of the unit interval for any distribution with a bounded density function on [0, 1]; and (2) on a logarithmic scale the version of the model with a finite word length cutoff and unequal letter probabilities is approximately normally distributed in the part of the distribution away from the tails. The first property is proved using a remarkably general limit theorem for the logarithm of sample spacings from Shao and Hahn, and the second property follows from Anscombe's central limit theorem for a random number of i.i.d. random variables. The finite word length model leads to a hybrid Zipf-lognormal mixture distribution closely related to work in other areas.
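A hedged simulation sketch of the monkey model described above: letters are typed with probabilities given by the spacings of a random division of the unit interval, a space ends each word, and the rank-frequency relation of the resulting words is fitted for its exponent. The alphabet size, space probability and fitting window are illustrative, and with a small alphabet the fitted exponent will only be near, not exactly, -1.

```python
# Hedged sketch: simulate monkey typing and estimate the rank-frequency exponent.
import numpy as np
from collections import Counter

rng = np.random.default_rng(9)

n_letters = 10
p_space = 0.2
# Letter probabilities from spacings of a random division of [0, 1].
cuts = np.sort(rng.random(n_letters - 1))
letter_p = np.diff(np.concatenate(([0.0], cuts, [1.0]))) * (1.0 - p_space)
symbol_p = np.append(letter_p, p_space)          # last symbol is the space

symbols = rng.choice(n_letters + 1, size=500_000, p=symbol_p)
text = "".join(chr(97 + s) if s < n_letters else " " for s in symbols)
words = [w for w in text.split(" ") if w]

freq = np.array(sorted(Counter(words).values(), reverse=True), dtype=float)
ranks = np.arange(1, freq.size + 1)
# Fit the exponent over the middle of the distribution, away from the tails.
sel = slice(10, min(2000, freq.size))
slope, _ = np.polyfit(np.log(ranks[sel]), np.log(freq[sel]), 1)
print(f"estimated rank-frequency exponent: {slope:.2f}")
```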
Computer routines for probability distributions, random numbers, and related functions
Kirby, W.
1983-01-01
Use of previously coded and tested subroutines simplifies and speeds up program development and testing. This report presents routines that can be used to calculate various probability distributions and other functions of importance in statistical hydrology. The routines are designed as general-purpose Fortran subroutines and functions to be called from user-written main programs. The probability distributions provided include the beta, chi-square, gamma, Gaussian (normal), Pearson Type III (tables and approximation), and Weibull. Also provided are the distributions of the Grubbs-Beck outlier test, Kolmogorov's and Smirnov's D, Student's t, noncentral t (approximate), and Snedecor's F. Other mathematical functions include the Bessel function I₀, gamma and log-gamma functions, error functions, and the exponential integral. Auxiliary services include sorting and printer-plotting. Random number generators for uniform and normal numbers are provided and may be used with some of the above routines to generate numbers from other distributions. (USGS)
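The last point, combining a uniform random number generator with a distribution routine to produce samples from another distribution, is classic inverse-transform sampling. The sketch below shows the pattern in Python rather than the report's Fortran, using a Weibull inverse CDF as an illustrative example.

```python
# Hedged sketch: inverse-transform sampling from uniform variates.
import math
import numpy as np

rng = np.random.default_rng(10)

def weibull_inverse_cdf(u, shape, scale):
    """Inverse CDF of the Weibull distribution: F^-1(u) = scale * (-ln(1-u))^(1/shape)."""
    return scale * (-np.log(1.0 - u)) ** (1.0 / shape)

u = rng.random(100000)                      # uniform(0, 1) variates
x = weibull_inverse_cdf(u, shape=1.5, scale=2.0)
print(f"sample mean {x.mean():.3f} vs theoretical {2.0 * math.gamma(1 + 1/1.5):.3f}")
```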