Sample records for probabilistic damage stability

  1. Application Side Casing on Open Deck RoRo to Improve Ship Stability

    NASA Astrophysics Data System (ADS)

    Hasanudin; Utama, I. K. A. P.; Chen, Jeng-Horng

    2018-03-01

    RoRo vessels transport passengers, cargo, containers and cars. The open-car-deck RoRo is a favourite in developing countries owing to its small gross tonnage (GT), low tax and spacious car deck, but it has poor damage survivability, and many accidents involving open-car-deck RoRo vessels have caused fatalities. To ensure ship safety, IMO had applied the intact stability criteria of the IS Code 2008, adapted from Rahola's research; in 2008 IMO strengthened these criteria into the probabilistic damage stability requirements of SOLAS 2009. The open-car-deck RoRo has a wide breadth (B), small draft (D) and small freeboard, which makes it difficult to satisfy the ship stability criteria. Side casings, which have been fitted on some RoRo vessels, are known to compensate for low freeboard and improve ship safety. This paper investigates the effect of side casings on intact and damage stability survivability. Calculations were conducted for four ships in three configurations: without side casings, with existing side casings, and with full side casings. The results show that the stability deficiencies of open-deck RoRo vessels can be reduced by fitting side casings.

  2. Probabilistic Evaluation of Blade Impact Damage

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.; Abumeri, G. H.

    2003-01-01

    The response of a composite blade to high-velocity impact is probabilistically evaluated. The evaluation focuses on quantifying probabilistically the effects of uncertainties (scatter) in the variables that describe the impact, the blade make-up (geometry and material), the blade response (displacements, strains, stresses, frequencies), the blade residual strength after impact, and the blade damage tolerance. The results of the probabilistic evaluations are given in terms of cumulative distribution functions and probabilistic sensitivities. Results show that the blade has relatively low damage tolerance at 0.999 probability of structural failure and substantial damage tolerance at 0.01 probability.

  3. Proceedings, Seminar on Probabilistic Methods in Geotechnical Engineering

    NASA Astrophysics Data System (ADS)

    Hynes-Griffin, M. E.; Buege, L. L.

    1983-09-01

    Contents: Applications of Probabilistic Methods in Geotechnical Engineering; Probabilistic Seismic and Geotechnical Evaluation at a Dam Site; Probabilistic Slope Stability Methodology; Probability of Liquefaction in a 3-D Soil Deposit; Probabilistic Design of Flood Levees; Probabilistic and Statistical Methods for Determining Rock Mass Deformability Beneath Foundations: An Overview; Simple Statistical Methodology for Evaluating Rock Mechanics Exploration Data; New Developments in Statistical Techniques for Analyzing Rock Slope Stability.

  4. Damage prognosis of adhesively-bonded joints in laminated composite structural components of unmanned aerial vehicles

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Farrar, Charles R; Gobbato, Maurizio; Conte, Joel

    2009-01-01

    The extensive use of lightweight advanced composite materials in unmanned aerial vehicles (UAVs) drastically increases the sensitivity of their critical structural components (e.g., wings and tail stabilizers) to both fatigue- and impact-induced damage during service life. The spar-to-skin adhesive joints are considered one of the most fatigue-sensitive subcomponents of a lightweight UAV composite wing, with damage progressively evolving from the wing root. This paper presents a comprehensive probabilistic methodology for predicting the remaining service life of adhesively-bonded joints in laminated composite structural components of UAVs. Non-destructive evaluation techniques and Bayesian inference are used to (i) assess the current state of damage of the system and (ii) update the probability distribution of the damage extent at various locations. A probabilistic model for future loads and a mechanics-based damage model are then used to stochastically propagate damage through the joint. Combined local (e.g., exceedance of a critical damage size) and global (e.g., flutter instability) failure criteria are finally used to compute the probability of component failure at future times. The applicability and partial validation of the proposed methodology are then briefly discussed by analyzing the debonding propagation, along a pre-defined adhesive interface, in a simply supported laminated composite beam with solid rectangular cross-section, subjected to a concentrated load applied at mid-span. A specially developed Euler-Bernoulli beam finite element with interlaminar slip along the damageable interface is used in combination with a cohesive zone model to study the fatigue-induced degradation of the adhesive material. The preliminary numerical results presented are promising for the future validation of the methodology.

  5. Probabilistic Fatigue Damage Prognosis Using a Surrogate Model Trained Via 3D Finite Element Analysis

    NASA Technical Reports Server (NTRS)

    Leser, Patrick E.; Hochhalter, Jacob D.; Newman, John A.; Leser, William P.; Warner, James E.; Wawrzynek, Paul A.; Yuan, Fuh-Gwo

    2015-01-01

    Utilizing inverse uncertainty quantification techniques, structural health monitoring can be integrated with damage progression models to form probabilistic predictions of a structure's remaining useful life. However, damage evolution in realistic structures is physically complex. Accurately representing this behavior requires high-fidelity models which are typically computationally prohibitive. In the present work, a high-fidelity finite element model is represented by a surrogate model, reducing computation times. The new approach is used with damage diagnosis data to form a probabilistic prediction of remaining useful life for a test specimen under mixed-mode conditions.

  6. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Canavan, G.H.

    This note studies the impact of maximizing the stability index rather than minimizing the first strike cost in choosing offensive missile allocations. It does so in the context of a model in which exchanges between vulnerable missile forces are modeled probabilistically and converted into first and second strike costs through approximations to the value target sets at risk, with the stability index taken to be their ratio. The allocations that optimize each objective are derived analytically for both attack preferences. The former recovers results derived earlier; the latter leads to an optimum at unity allocation, for which the stability index is determined analytically. For values of the attack preference greater than about unity, maximizing the stability index increases the cost of striking first by 10-15%. For smaller values of the attack preference, maximizing the index increases the second strike cost by a similar amount. Both effects are stabilizing, so if both sides could be trusted to target missiles in order to minimize damage to value and maximize stability, the stability index for vulnerable missiles could be increased by about 15%. However, that would increase the cost to the first striker by about 15%, and it is unclear why an attacker, having decided to strike, would do so in a way that increases damage to himself.

  7. What do we gain with Probabilistic Flood Loss Models?

    NASA Astrophysics Data System (ADS)

    Schroeter, K.; Kreibich, H.; Vogel, K.; Merz, B.; Lüdtke, S.

    2015-12-01

    The reliability of flood loss models is a prerequisite for their practical usefulness. Oftentimes, traditional uni-variate damage models, such as depth-damage curves, fail to reproduce the variability of observed flood damage. Innovative multi-variate probabilistic modelling approaches promise to capture and quantify the uncertainty involved and thus to improve the basis for decision making. In this study we compare the predictive capability of two probabilistic modelling approaches, namely bagging decision trees and Bayesian networks, with traditional stage-damage functions cast in a probabilistic framework. For model evaluation we use empirical damage data from computer-aided telephone interviews compiled after the floods of 2002, 2005, 2006 and 2013 in the Elbe and Danube catchments in Germany. We carry out a split-sample test by sub-setting the damage records: one sub-set is used to derive the models and the remaining records are used to evaluate the predictive performance of the model. Further, we stratify the sample according to catchments, which allows studying model performance in a spatial-transfer context. Flood damage is estimated at the scale of individual buildings in terms of relative damage. The predictive performance of the models is assessed in terms of systematic deviations (mean bias), precision (mean absolute error) and reliability, which is represented by the proportion of observations that fall within the 5%- to 95%-quantile predictive interval. The reliability of the probabilistic predictions within validation runs decreases only slightly and achieves a very good coverage of observations within the predictive interval. Probabilistic models provide quantitative information about prediction uncertainty, which is crucial to assess the reliability of model predictions and improves the usefulness of model results.
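
    The validation scores named in this record (mean bias, mean absolute error, and coverage of the 5%-95% predictive interval) translate directly into code. Below is a minimal sketch assuming predictive samples per observation are available as an array; the synthetic data at the end are purely illustrative.

    ```python
    import numpy as np

    def validation_metrics(y_obs, y_samples):
        """Skill scores for a probabilistic loss model.

        y_obs     : (n,) observed relative damage
        y_samples : (m, n) predictive samples for each observation
        """
        y_mean = y_samples.mean(axis=0)
        mean_bias = np.mean(y_mean - y_obs)                # systematic deviation
        mae = np.mean(np.abs(y_mean - y_obs))              # precision
        lo = np.percentile(y_samples, 5, axis=0)
        hi = np.percentile(y_samples, 95, axis=0)
        coverage = np.mean((y_obs >= lo) & (y_obs <= hi))  # reliability
        return mean_bias, mae, coverage

    # illustrative synthetic check
    rng = np.random.default_rng(0)
    truth = rng.uniform(0.0, 1.0, 200)
    samples = truth + rng.normal(0.0, 0.1, size=(500, 200))
    print(validation_metrics(truth, samples))
    ```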

  8. Probabilistic stability analysis: the way forward for stability analysis of sustainable power systems.

    PubMed

    Milanović, Jovica V

    2017-08-13

    Future power systems will be significantly different from their present states. They will be characterized by an unprecedented mix of a wide range of electricity generation and transmission technologies, as well as responsive and highly flexible demand and storage devices with significant temporal and spatial uncertainty. The importance of probabilistic approaches to power system stability analysis, as a subsection of power system studies routinely carried out by power system operators, has been highlighted in previous research. However, it may not be feasible (or even possible) to accurately model all of the uncertainties that exist within a power system. This paper describes for the first time an integral approach to probabilistic stability analysis of power systems, including small and large angular stability and frequency stability. It provides guidance for handling uncertainties in power system stability studies and some illustrative examples of the most recent results of probabilistic stability analysis of uncertain power systems. This article is part of the themed issue 'Energy management: flexibility, risk and optimization'. © 2017 The Author(s).

  9. A Markov Chain Approach to Probabilistic Swarm Guidance

    NASA Technical Reports Server (NTRS)

    Acikmese, Behcet; Bayard, David S.

    2012-01-01

    This paper introduces a probabilistic guidance approach for the coordination of swarms of autonomous agents. The main idea is to drive the swarm to a prescribed density distribution in a prescribed region of the configuration space. In its simplest form, the probabilistic approach is completely decentralized and does not require communication or collaboration between agents. Agents make statistically independent probabilistic decisions based solely on their own state, which ultimately guide the swarm to the desired density distribution in the configuration space. In addition to being completely decentralized, the probabilistic guidance approach has a novel autonomous self-repair property: once the desired swarm density distribution is attained, the agents automatically repair any damage to the distribution without collaborating and without any knowledge about the damage.
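
    A standard way to realize such guidance is to equip every agent with the same Markov transition matrix whose stationary distribution is the desired density. The Metropolis construction below is a minimal sketch of that idea; the bin count, target density and agent loop are chosen for illustration rather than taken from the paper.

    ```python
    import numpy as np

    def mh_transition_matrix(pi):
        """Row-stochastic matrix with stationary distribution pi, built from
        a uniform symmetric proposal and Metropolis acceptance."""
        n = len(pi)
        q = 1.0 / n
        M = np.zeros((n, n))
        for i in range(n):
            for j in range(n):
                if i != j:
                    M[i, j] = q * min(1.0, pi[j] / pi[i])
            M[i, i] = 1.0 - M[i].sum()   # remaining mass: stay in place
        return M

    pi = np.array([0.1, 0.2, 0.4, 0.2, 0.1])   # desired density over 5 regions
    M = mh_transition_matrix(pi)

    rng = np.random.default_rng(1)
    agents = rng.integers(0, 5, size=5000)     # each agent acts independently
    for _ in range(50):
        agents = np.array([rng.choice(5, p=M[b]) for b in agents])
    print(np.bincount(agents, minlength=5) / agents.size)   # approaches pi
    ```

    Because every agent keeps applying the same transition rule, any disturbance of the achieved distribution decays back toward pi, which is the self-repair property described above.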

  10. Near Real-Time Probabilistic Damage Diagnosis Using Surrogate Modeling and High Performance Computing

    NASA Technical Reports Server (NTRS)

    Warner, James E.; Zubair, Mohammad; Ranjan, Desh

    2017-01-01

    This work investigates novel approaches to probabilistic damage diagnosis that utilize surrogate modeling and high performance computing (HPC) to achieve substantial computational speedup. Motivated by Digital Twin, a structural health management (SHM) paradigm that integrates vehicle-specific characteristics with continual in-situ damage diagnosis and prognosis, the methods studied herein yield near real-time damage assessments that could enable monitoring of a vehicle's health while it is operating (i.e. online SHM). High-fidelity modeling and uncertainty quantification (UQ), both critical to Digital Twin, are incorporated using finite element method simulations and Bayesian inference, respectively. The crux of the proposed Bayesian diagnosis methods, however, is the reformulation of the numerical sampling algorithms (e.g. Markov chain Monte Carlo) used to generate the resulting probabilistic damage estimates. To this end, three distinct methods are demonstrated for rapid sampling that utilize surrogate modeling and exploit various degrees of parallelism for leveraging HPC. The accuracy and computational efficiency of the methods are compared on the problem of strain-based crack identification in thin plates. While each approach has inherent problem-specific strengths and weaknesses, all approaches are shown to provide accurate probabilistic damage diagnoses and several orders of magnitude computational speedup relative to a baseline Bayesian diagnosis implementation.
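
    The reformulated sampling the record describes can be pictured with a single-chain Metropolis sampler whose likelihood calls a cheap surrogate instead of the finite element model. In this sketch, surrogate_strains is an analytic stand-in for a trained surrogate; the damage parameterization, sensor layout and noise level are invented for illustration.

    ```python
    import numpy as np

    def surrogate_strains(theta):
        """Analytic stand-in for a trained surrogate of the FE strain model."""
        x, y, a = theta                       # damage location (x, y) and size a
        sensors = np.linspace(0.1, 0.9, 8)
        return a * np.exp(-20.0 * ((sensors - x)**2 + (0.5 - y)**2))

    def log_likelihood(theta, data, noise_sd=0.01):
        resid = data - surrogate_strains(theta)
        return -0.5 * np.sum((resid / noise_sd)**2)

    def metropolis(data, n_samples=20_000, step=0.02, seed=0):
        rng = np.random.default_rng(seed)
        theta = np.array([0.5, 0.5, 0.05])    # initial damage guess
        ll = log_likelihood(theta, data)
        chain = []
        for _ in range(n_samples):
            prop = theta + rng.normal(0.0, step, 3)
            if np.all((prop > 0.0) & (prop < 1.0)):   # flat prior on unit cube
                ll_prop = log_likelihood(prop, data)
                if np.log(rng.uniform()) < ll_prop - ll:
                    theta, ll = prop, ll_prop
            chain.append(theta.copy())
        return np.array(chain)

    rng = np.random.default_rng(42)
    data = surrogate_strains([0.3, 0.6, 0.08]) + rng.normal(0.0, 0.01, 8)
    chain = metropolis(data)
    print(chain[5_000:].mean(axis=0))         # posterior mean damage parameters
    ```

    The speedups reported in such studies come from making each likelihood evaluation cheap and running many of them concurrently.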

  11. Probabilistic Assessment of Fracture Progression in Composite Structures

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.; Minnetyan, Levon; Mauget, Bertrand; Huang, Dade; Addi, Frank

    1999-01-01

    This report describes methods and corresponding computer codes that are used to evaluate progressive damage and fracture and to perform probabilistic assessment in built-up composite structures. Structural response is assessed probabilistically, during progressive fracture. The effects of design variable uncertainties on structural fracture progression are quantified. The fast probability integrator (FPI) is used to assess the response scatter in the composite structure at damage initiation. The sensitivity of the damage response to design variables is computed. The methods are general purpose and are applicable to stitched and unstitched composites in all types of structures and fracture processes starting from damage initiation to unstable propagation and to global structure collapse. The methods are demonstrated for a polymer matrix composite stiffened panel subjected to pressure. The results indicated that composite constituent properties, fabrication parameters, and respective uncertainties have a significant effect on structural durability and reliability. Design implications with regard to damage progression, damage tolerance, and reliability of composite structures are examined.

  12. Probabilistic flood damage modelling at the meso-scale

    NASA Astrophysics Data System (ADS)

    Kreibich, Heidi; Botto, Anna; Schröter, Kai; Merz, Bruno

    2014-05-01

    Decisions on flood risk management and adaptation are usually based on risk analyses. Such analyses are associated with significant uncertainty, even more so if changes in risk due to global change are expected. Although uncertainty analysis and probabilistic approaches have received increased attention during recent years, they are still not standard practice in flood risk assessment. Most damage models have in common that complex damaging processes are described by simple, deterministic approaches like stage-damage functions. Novel probabilistic, multi-variate flood damage models have been developed and validated on the micro-scale using a data-mining approach, namely bagging decision trees (Merz et al. 2013). In this presentation we show how the model BT-FLEMO (Bagging decision Tree based Flood Loss Estimation MOdel) can be applied on the meso-scale, namely on the basis of ATKIS land-use units. The model is applied to 19 municipalities which were affected by the 2002 flood of the River Mulde in Saxony, Germany. The application of BT-FLEMO provides a probability distribution of estimated damage to residential buildings per municipality. Validation is undertaken, on the one hand, via a comparison with eight other damage models, including stage-damage functions as well as multi-variate models, and, on the other hand, against official damage data provided by the Saxon Relief Bank (SAB). The results show that uncertainties of damage estimation remain high. A significant advantage of the probabilistic flood loss estimation model BT-FLEMO is therefore that it inherently provides quantitative information about the uncertainty of the prediction. Reference: Merz, B.; Kreibich, H.; Lall, U. (2013): Multi-variate flood damage assessment: a tree-based data-mining approach. NHESS, 13(1), 53-64.
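
    The defining output of a bagged-tree loss model, a distribution of damage estimates rather than a point value, can be reproduced with off-the-shelf tooling. The sketch below uses scikit-learn's BaggingRegressor, whose default base learner is a decision tree; the predictor variables and training data are synthetic placeholders, and the across-tree spread is only a rough proxy for BT-FLEMO's predictive distribution.

    ```python
    import numpy as np
    from sklearn.ensemble import BaggingRegressor

    rng = np.random.default_rng(3)

    # synthetic stand-ins for damage-driving variables: water depth [m],
    # inundation duration [h], normalised building value, precaution flag
    n = 800
    X = np.column_stack([
        rng.uniform(0.0, 3.0, n),
        rng.uniform(1.0, 96.0, n),
        rng.uniform(0.0, 1.0, n),
        rng.integers(0, 2, n).astype(float),
    ])
    rel_damage = np.clip(0.2 * X[:, 0] + 0.001 * X[:, 1] - 0.1 * X[:, 3]
                         + rng.normal(0.0, 0.05, n), 0.0, 1.0)

    model = BaggingRegressor(n_estimators=100, random_state=0).fit(X, rel_damage)

    # the spread of per-tree predictions approximates a damage distribution
    x_new = np.array([[1.5, 48.0, 0.7, 0.0]])
    per_tree = np.array([est.predict(x_new)[0] for est in model.estimators_])
    print("mean:", per_tree.mean(), "5-95% range:", np.percentile(per_tree, [5, 95]))
    ```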

  13. Acoustic emission based damage localization in composites structures using Bayesian identification

    NASA Astrophysics Data System (ADS)

    Kundu, A.; Eaton, M. J.; Al-Jumali, S.; Sikdar, S.; Pullin, R.

    2017-05-01

    Acoustic emission based damage detection in composite structures relies on detecting ultra-high-frequency packets of acoustic waves, emitted from damage sources (such as fibre breakage and fatigue fracture, amongst others), with a network of distributed sensors. This non-destructive monitoring scheme requires solving an inverse problem in which the measured signals are linked back to the location of the source, which in turn enables rapid deployment of mitigative measures. The significant uncertainty associated with the operating conditions and measurements makes the damage identification problem quite challenging. The uncertainties stem from the fact that the measured signals are affected by irregular geometries, manufacturing imprecision, imperfect boundary conditions and existing damage or structural degradation, amongst others. This work tackles these uncertainties within a framework of automated probabilistic damage detection. The method trains a probabilistic model of the parametrized input and output of the acoustic emission system with experimental data to give probabilistic descriptors of damage locations. A response surface, modelling the acoustic emission as a function of parametrized damage signals collected from sensors, is calibrated with a training dataset using Bayesian inference. During online monitoring, the spatially correlated time data are used in conjunction with the calibrated acoustic emission model to infer a probabilistic description of the acoustic emission source within a hierarchical Bayesian inference framework. The methodology is tested on a composite structure consisting of a carbon fibre panel with stiffeners, with damage-source behaviour experimentally simulated using standard H-N sources. The methodology would be applicable in its current form to structural damage detection under varying operational loads, which will be investigated in future studies.

  14. Confronting uncertainty in flood damage predictions

    NASA Astrophysics Data System (ADS)

    Schröter, Kai; Kreibich, Heidi; Vogel, Kristin; Merz, Bruno

    2015-04-01

    Reliable flood damage models are a prerequisite for practically useful model results. Oftentimes, traditional uni-variate damage models, such as depth-damage curves, fail to reproduce the variability of observed flood damage. Innovative multi-variate probabilistic modelling approaches promise to capture and quantify the uncertainty involved and thus to improve the basis for decision making. In this study we compare the predictive capability of two probabilistic modelling approaches, namely bagging decision trees and Bayesian networks. For model evaluation we use empirical damage data from computer-aided telephone interviews compiled after the floods of 2002, 2005 and 2006 in the Elbe and Danube catchments in Germany. We carry out a split-sample test by sub-setting the damage records: one sub-set is used to derive the models and the remaining records are used to evaluate the predictive performance of the model. Further, we stratify the sample according to catchments, which allows studying model performance in a spatial-transfer context. Flood damage is estimated at the scale of individual buildings in terms of relative damage. The predictive performance of the models is assessed in terms of systematic deviations (mean bias), precision (mean absolute error) and reliability, which is represented by the proportion of observations that fall within the 5%- to 95%-quantile predictive interval. The reliability of the probabilistic predictions within validation runs decreases only slightly and achieves a very good coverage of observations within the predictive interval. Probabilistic models provide quantitative information about prediction uncertainty, which is crucial to assess the reliability of model predictions and improves the usefulness of model results.

  15. Opportunities of probabilistic flood loss models

    NASA Astrophysics Data System (ADS)

    Schröter, Kai; Kreibich, Heidi; Lüdtke, Stefan; Vogel, Kristin; Merz, Bruno

    2016-04-01

    Oftentimes, traditional uni-variate damage models, such as depth-damage curves, fail to reproduce the variability of observed flood damage. However, reliable flood damage models are a prerequisite for practically useful model results. Innovative multi-variate probabilistic modelling approaches promise to capture and quantify the uncertainty involved and thus to improve the basis for decision making. In this study we compare the predictive capability of two probabilistic modelling approaches, namely bagging decision trees and Bayesian networks, with traditional stage-damage functions. For model evaluation we use empirical damage data from computer-aided telephone interviews compiled after the floods of 2002, 2005, 2006 and 2013 in the Elbe and Danube catchments in Germany. We carry out a split-sample test by sub-setting the damage records: one sub-set is used to derive the models and the remaining records are used to evaluate the predictive performance of the model. Further, we stratify the sample according to catchments, which allows studying model performance in a spatial-transfer context. Flood damage is estimated at the scale of individual buildings in terms of relative damage. The predictive performance of the models is assessed in terms of systematic deviations (mean bias), precision (mean absolute error), sharpness of the predictions, and reliability, which is represented by the proportion of observations that fall within the 5%- to 95%-quantile predictive interval. The comparison of the uni-variate stage-damage function with the multi-variate model approach emphasises the importance of quantifying predictive uncertainty. With each explanatory variable, the multi-variate model reveals an additional source of uncertainty. However, the predictive performance in terms of mean bias, accuracy (mean absolute error) and reliability (hit rate) is clearly improved in comparison with the uni-variate stage-damage function. Overall, probabilistic models provide quantitative information about prediction uncertainty, which is crucial to assess the reliability of model predictions and improves the usefulness of model results.

  16. Probabilistic Methodology for Estimation of Number and Economic Loss (Cost) of Future Landslides in the San Francisco Bay Region, California

    USGS Publications Warehouse

    Crovelli, Robert A.; Coe, Jeffrey A.

    2008-01-01

    The Probabilistic Landslide Assessment Cost Estimation System (PLACES) presented in this report estimates the number and economic loss (cost) of landslides during a specified future time in individual areas, and then calculates the sum of those estimates. The analytic probabilistic methodology is based upon conditional probability theory and laws of expectation and variance. The probabilistic methodology is expressed in the form of a Microsoft Excel computer spreadsheet program. Using historical records, the PLACES spreadsheet is used to estimate the number of future damaging landslides and total damage, as economic loss, from future landslides caused by rainstorms in 10 counties of the San Francisco Bay region in California. Estimates are made for any future 5-year period of time. The estimated total number of future damaging landslides for the entire 10-county region during any future 5-year period of time is about 330. Santa Cruz County has the highest estimated number of damaging landslides (about 90), whereas Napa, San Francisco, and Solano Counties have the lowest estimated number of damaging landslides (5-6 each). Estimated direct costs from future damaging landslides for the entire 10-county region for any future 5-year period are about US $76 million (year 2000 dollars). San Mateo County has the highest estimated costs ($16.62 million), and Solano County has the lowest estimated costs (about $0.90 million). Estimated direct costs are also subdivided into public and private costs.

  17. A Computationally-Efficient Inverse Approach to Probabilistic Strain-Based Damage Diagnosis

    NASA Technical Reports Server (NTRS)

    Warner, James E.; Hochhalter, Jacob D.; Leser, William P.; Leser, Patrick E.; Newman, John A

    2016-01-01

    This work presents a computationally-efficient inverse approach to probabilistic damage diagnosis. Given strain data at a limited number of measurement locations, Bayesian inference and Markov Chain Monte Carlo (MCMC) sampling are used to estimate probability distributions of the unknown location, size, and orientation of damage. Substantial computational speedup is obtained by replacing a three-dimensional finite element (FE) model with an efficient surrogate model. The approach is experimentally validated on cracked test specimens where full field strains are determined using digital image correlation (DIC). Access to full field DIC data allows for testing of different hypothetical sensor arrangements, facilitating the study of strain-based diagnosis effectiveness as the distance between damage and measurement locations increases. The ability of the framework to effectively perform both probabilistic damage localization and characterization in cracked plates is demonstrated and the impact of measurement location on uncertainty in the predictions is shown. Furthermore, the analysis time to produce these predictions is orders of magnitude less than a baseline Bayesian approach with the FE method by utilizing surrogate modeling and effective numerical sampling approaches.
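
    The surrogate-for-FE substitution at the heart of this approach can be illustrated with a Gaussian process regressor trained on a modest set of model runs. In the hedged sketch below, fe_strains is an analytic stand-in for the three-dimensional FE model, and the crack parameterization and gauge layout are assumptions made for the sketch.

    ```python
    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF

    rng = np.random.default_rng(5)

    def fe_strains(theta):
        """Analytic stand-in for an FE run: strains at 6 gauges given crack
        parameters theta = (location, length, angle)."""
        x, a, ang = theta
        gauges = np.linspace(0.0, 1.0, 6)
        return a * np.cos(ang) * np.exp(-8.0 * np.abs(gauges - x))

    # training set: a few hundred 'FE' evaluations over the parameter space
    thetas = rng.uniform([0.0, 0.01, -0.5], [1.0, 0.2, 0.5], size=(300, 3))
    strains = np.array([fe_strains(t) for t in thetas])

    surrogate = GaussianProcessRegressor(kernel=RBF(length_scale=0.2))
    surrogate.fit(thetas, strains)

    # the surrogate now answers in microseconds inside an MCMC loop
    theta_test = np.array([[0.4, 0.10, 0.1]])
    print(surrogate.predict(theta_test))
    print(fe_strains(theta_test[0]))      # compare against the stand-in
    ```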

  18. A methodology for post-mainshock probabilistic assessment of building collapse risk

    USGS Publications Warehouse

    Luco, N.; Gerstenberger, M.C.; Uma, S.R.; Ryu, H.; Liel, A.B.; Raghunandan, M.

    2011-01-01

    This paper presents a methodology for post-earthquake probabilistic risk (of damage) assessment that we propose in order to develop a computational tool for automatic or semi-automatic assessment. The methodology utilizes the same so-called risk integral which can be used for pre-earthquake probabilistic assessment. The risk integral couples (i) ground motion hazard information for the location of a structure of interest with (ii) knowledge of the fragility of the structure with respect to potential ground motion intensities. In the proposed post-mainshock methodology, the ground motion hazard component of the risk integral is adapted to account for aftershocks which are deliberately excluded from typical pre-earthquake hazard assessments and which decrease in frequency with the time elapsed since the mainshock. Correspondingly, the structural fragility component is adapted to account for any damage caused by the mainshock, as well as any uncertainty in the extent of this damage. The result of the adapted risk integral is a fully-probabilistic quantification of post-mainshock seismic risk that can inform emergency response mobilization, inspection prioritization, and re-occupancy decisions.
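
    For orientation, a common textbook form of the risk integral referred to above is sketched below in LaTeX; the notation is generic, and the Omori-type decay is included only to indicate where the aftershock adaptation enters, not as the authors' exact formulation.

    ```latex
    % Mean rate of reaching limit state LS via the risk integral:
    \lambda_{\mathrm{LS}} = \int_{0}^{\infty}
      P\bigl(\mathrm{LS} \mid IM = im\bigr)\,\bigl|\mathrm{d}\lambda(im)\bigr|
    % Post-mainshock, the ground motion hazard \lambda(im) gains an aftershock
    % contribution whose rate decays with elapsed time t (modified Omori law):
    \lambda_{\mathrm{aft}}(t) = \frac{K}{(t + c)^{\,p}}
    ```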

  19. Reliability-Based Stability Analysis of Rock Slopes Using Numerical Analysis and Response Surface Method

    NASA Astrophysics Data System (ADS)

    Dadashzadeh, N.; Duzgun, H. S. B.; Yesiloglu-Gultekin, N.

    2017-08-01

    While advanced numerical techniques in slope stability analysis are successfully used in deterministic studies, they have so far found limited use in probabilistic analyses due to their high computation cost. The first-order reliability method (FORM) is one of the most efficient probabilistic techniques for performing probabilistic stability analysis while accounting for the uncertainties in the analysis parameters. However, FORM cannot be used directly in numerical slope stability evaluations, as it requires the definition of a limit state performance function. In this study, an integrated methodology for probabilistic numerical modeling of rock slope stability is proposed. The methodology is based on the response surface method: an explicit performance function is developed from the results of numerical simulations and then evaluated with FORM. The implementation of the proposed methodology is demonstrated on a large potential rock wedge at Sumela Monastery, Turkey. The accuracy with which the developed performance function represents the limit state surface is evaluated by monitoring the slope behavior. The calculated probability of failure is compared with the Monte Carlo simulation (MCS) method. The proposed methodology is found to be 72% more efficient than MCS, at the cost of a 24% error in accuracy.
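
    Condensed, the workflow is: sample the expensive numerical model, fit an explicit polynomial performance function, locate the FORM design point, and cross-check with Monte Carlo. The sketch below works in standard-normal space with g_true as an analytic stand-in for the numerical slope model; all coefficients are illustrative.

    ```python
    import numpy as np
    from scipy.optimize import minimize
    from scipy.stats import norm

    rng = np.random.default_rng(0)

    # analytic stand-in for an expensive numerical slope model, written in
    # standard-normal space; failure is g(x) <= 0
    def g_true(x):
        return 3.0 + 0.8 * x[0] + 1.2 * x[1] - 0.15 * x[0] * x[1]

    # 1) fit a quadratic response surface to a small design of experiments
    X = rng.normal(0.0, 1.0, size=(30, 2))
    def features(x):
        return np.array([1.0, x[0], x[1], x[0]**2, x[1]**2, x[0] * x[1]])
    A = np.array([features(x) for x in X])
    coef, *_ = np.linalg.lstsq(A, np.array([g_true(x) for x in X]), rcond=None)
    g_rs = lambda x: features(x) @ coef

    # 2) FORM: the design point is the closest point to the origin on g_rs = 0
    res = minimize(lambda x: x @ x, x0=np.array([-1.0, -1.0]),
                   constraints={"type": "eq", "fun": g_rs})
    beta = np.sqrt(res.fun)
    print("beta =", beta, " Pf(FORM) =", norm.cdf(-beta))

    # 3) brute-force Monte Carlo on the 'true' model as a check
    Xmc = rng.normal(0.0, 1.0, size=(200_000, 2))
    g_mc = 3.0 + 0.8 * Xmc[:, 0] + 1.2 * Xmc[:, 1] - 0.15 * Xmc[:, 0] * Xmc[:, 1]
    print("Pf(MC)  =", np.mean(g_mc <= 0.0))
    ```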

  20. Influences of geological parameters to probabilistic assessment of slope stability of embankment

    NASA Astrophysics Data System (ADS)

    Nguyen, Qui T.; Le, Tuan D.; Konečný, Petr

    2018-04-01

    This article considers the influence of geological parameters on the slope stability of an embankment in a probabilistic analysis using the SLOPE/W computational system. The stability of a simple slope is evaluated, with and without pore-water pressure, on the basis of the variation of soil properties. Normal distributions of unit weight, cohesion and internal friction angle are assumed. The Monte Carlo simulation technique is employed to analyse the critical slip surface, and a sensitivity analysis is performed to observe the variation of the geological parameters and their effects on the factor of safety of the slope.
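
    A minimal Monte Carlo version of such an analysis can be written with an infinite-slope factor of safety standing in for the SLOPE/W method-of-slices computation; the distribution parameters, slope geometry and pore-pressure head below are illustrative assumptions.

    ```python
    import numpy as np

    rng = np.random.default_rng(7)
    n = 100_000

    # assumed normal distributions of the geological parameters
    gamma = rng.normal(18.0, 1.0, n)               # unit weight [kN/m^3]
    c     = rng.normal(10.0, 2.0, n)               # cohesion [kPa]
    phi   = np.radians(rng.normal(30.0, 3.0, n))   # friction angle

    beta, z = np.radians(25.0), 5.0                # slope angle, slip depth [m]
    u = 9.81 * 2.0                                 # pore pressure, 2 m head [kPa]

    def factor_of_safety(pore):
        sigma_n = gamma * z * np.cos(beta)**2      # normal stress on slip plane
        tau     = gamma * z * np.sin(beta) * np.cos(beta)
        return (c + (sigma_n - pore) * np.tan(phi)) / tau

    for label, pore in [("dry", 0.0), ("with pore pressure", u)]:
        fs = factor_of_safety(pore)
        print(label, " mean FS =", round(fs.mean(), 2),
              " P(FS < 1) =", np.mean(fs < 1.0))
    ```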

  1. Flood Risk and Probabilistic Benefit Assessment to Support Management of Flood-Prone Lands: Evidence From Candaba Floodplains, Philippines

    NASA Astrophysics Data System (ADS)

    Juarez, A. M.; Kibler, K. M.; Sayama, T.; Ohara, M.

    2016-12-01

    Flood management decision-making is often supported by risk assessment, which may overlook the role of coping capacity and the potential benefits derived from direct use of flood-prone land. Alternatively, risk-benefit analysis can support floodplain management to yield maximum socio-ecological benefits for the minimum flood risk. We evaluate flood risk-probabilistic benefit tradeoffs of livelihood practices compatible with direct human use of flood-prone land (agriculture/wild fisheries) and nature conservation (wild fisheries only) in Candaba, Philippines. Located north-west of Metro Manila, the Candaba area is a multi-functional landscape that provides a temporally-variable mix of possible land uses, benefits and ecosystem services of local and regional value. To characterize inundation from 1.3- to 100-year recurrence intervals, we couple frequency analysis with rainfall-runoff-inundation modelling and remotely-sensed data. By combining simulated probabilistic floods with both damage and benefit functions (e.g., fish capture and rice yield as functions of flood intensity) we estimate potential damages and benefits over varying probabilistic flood hazards. We find that although direct human uses of flood-prone land are associated with damages, for all investigated flood magnitudes and frequencies the probabilistic benefits ($91 million) exceed the risks ($33 million) by a large margin. Even accounting for risk, the probabilistic livelihood benefits of direct human uses far exceed the benefits provided by scenarios that exclude direct "risky" human uses (a difference of $85 million). In addition, we find that individual coping strategies, such as adapting crop planting periods to the flood pulse or fishing rather than cultivating rice in the wet season, minimize flood losses ($6 million) while allowing for valuable livelihood benefits ($125 million) on flood-prone land. Analysis of societal benefits and local capacities to cope with regular floods demonstrates the relevance of accounting for the full range of flood events, and their relation to both potential damages and benefits, in risk assessments. Management measures may thus be designed to reflect local contexts and support the benefits of natural hydrologic processes, while minimizing flood damage.
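
    At its core, comparing probabilistic damages and benefits reduces to integrating per-event values over exceedance probability, mirroring the usual expected-annual-damage computation. The sketch below shows that step; all return periods and monetary values are placeholders, not the study's data.

    ```python
    import numpy as np

    # placeholder per-event values over the simulated recurrence intervals
    return_periods = np.array([1.3, 2.0, 5.0, 10.0, 25.0, 50.0, 100.0])
    p_exc   = 1.0 / return_periods               # annual exceedance probability
    damage  = np.array([0.5, 1.0, 2.5, 4.0, 6.0, 8.0, 10.0])   # e.g. million USD
    benefit = np.array([12.0, 11.0, 9.0, 7.0, 5.0, 3.0, 2.0])

    def expected_annual(values, p):
        """Trapezoidal integral of event value over exceedance probability."""
        order = np.argsort(p)
        v, pp = values[order], p[order]
        return np.sum(0.5 * (v[1:] + v[:-1]) * np.diff(pp))

    print("expected annual damage :", expected_annual(damage, p_exc))
    print("expected annual benefit:", expected_annual(benefit, p_exc))
    ```

    The integral only covers the simulated range of events, which is why the record stresses accounting for the full range of flood frequencies.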

  2. Stability analysis for discrete-time stochastic memristive neural networks with both leakage and probabilistic delays.

    PubMed

    Liu, Hongjian; Wang, Zidong; Shen, Bo; Huang, Tingwen; Alsaadi, Fuad E

    2018-06-01

    This paper is concerned with the globally exponential stability problem for a class of discrete-time stochastic memristive neural networks (DSMNNs) with both leakage delays and probabilistic time-varying delays. For the probabilistic delays, a sequence of Bernoulli distributed random variables is utilized to determine within which intervals the time-varying delays fall at each time instant. The sector-bounded activation function is considered in the addressed DSMNN. By taking into account the state-dependent characteristics of the network parameters and choosing an appropriate Lyapunov-Krasovskii functional, some sufficient conditions are established under which the underlying DSMNN is globally exponentially stable in the mean square. The derived conditions depend on both the leakage and the probabilistic delays, and are therefore less conservative than traditional delay-independent criteria. A simulation example is given to show the effectiveness of the proposed stability criterion. Copyright © 2018 Elsevier Ltd. All rights reserved.
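
    A scalar toy system makes the Bernoulli delay-switching mechanism concrete: at each step the delay is d1 with probability p and d2 otherwise, and mean-square decay of the state is checked empirically. This is only a sketch of the delay model, not the memristive network or its Lyapunov-Krasovskii analysis.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # x(k+1) = a*x(k) + b*f(x(k - d(k))), with d(k) = d1 w.p. p, else d2
    a, b, p = 0.5, 0.3, 0.7
    d1, d2 = 1, 4
    f = np.tanh                      # sector-bounded activation

    def simulate(steps=200):
        x = np.zeros(steps + d2)
        x[:d2 + 1] = 1.0             # constant initial history
        for k in range(d2, steps + d2 - 1):
            d = d1 if rng.uniform() < p else d2
            x[k + 1] = a * x[k] + b * f(x[k - d])
        return x

    runs = np.array([simulate() for _ in range(500)])
    print("mean-square state at end:", np.mean(runs[:, -1]**2))   # ~0 if stable
    ```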

  3. Probabilistic evaluation of uncertainties and risks in aerospace components

    NASA Technical Reports Server (NTRS)

    Shah, A. R.; Shiao, M. C.; Nagpal, V. K.; Chamis, C. C.

    1992-01-01

    A methodology is presented for the computational simulation of primitive variable uncertainties, and attention is given to the simulation of specific aerospace components. Specific examples treated encompass a probabilistic material behavior model, as well as static, dynamic, and fatigue/damage analyses of a turbine blade in a mistuned bladed rotor in the SSME turbopumps. An account is given of the use of the NESSUS probabilistic finite element analysis code.

  4. Probabilistic Fatigue Damage Program (FATIG)

    NASA Technical Reports Server (NTRS)

    Michalopoulos, Constantine

    2012-01-01

    FATIG computes fatigue damage and fatigue life using the stress rms (root mean square) value, the total number of cycles, and S-N curve parameters. The damage is computed by two methods: (a) the traditional method using Miner's rule, with stress cycles determined from a Rayleigh distribution up to 3*sigma; and (b) the classical fatigue damage formula involving the Gamma function, which is derived from the integral version of Miner's rule, with the integration carried out over all stress amplitudes. The software thus solves the problem of probabilistic fatigue damage using the integral form of the Palmgren-Miner rule, computing fatigue life from all stress amplitudes up to N*sigma, as specified by the user. It can be used in the design of structural components subjected to random dynamic loading, or by any stress analyst with minimal training for fatigue life estimates of structural components.
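
    Both computations are compact enough to sketch directly. The fragment below assumes an S-N curve of the form N(S) = A * S**(-m), for which the closed-form damage under Rayleigh-distributed amplitudes is D = (n_total/A) * (sqrt(2)*sigma)**m * Gamma(1 + m/2); the numeric inputs are illustrative, not FATIG defaults.

    ```python
    import numpy as np
    from scipy.special import gamma as Gamma
    from scipy.stats import rayleigh

    sigma, n_total = 50.0, 1e7        # stress rms [MPa], total cycle count
    A, m = 1e12, 3.0                  # assumed S-N parameters: N(S) = A * S**(-m)

    # (a) discrete Miner's rule, Rayleigh amplitudes truncated at 3*sigma
    edges = np.linspace(0.0, 3.0 * sigma, 31)
    mids = 0.5 * (edges[:-1] + edges[1:])
    probs = np.diff(rayleigh.cdf(edges, scale=sigma))
    D_discrete = n_total * np.sum(probs * mids**m) / A

    # (b) closed form from the integral version of Miner's rule
    D_closed = n_total / A * (np.sqrt(2.0) * sigma)**m * Gamma(1.0 + m / 2.0)

    # D = 1 marks failure, so the life estimate is roughly n_total / D cycles
    print(D_discrete, D_closed)
    ```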

  5. Corroded Anchor Structure Stability/Reliability (CAS_Stab-R) Software for Hydraulic Structures

    DTIC Science & Technology

    2017-12-01

    This report describes software that provides a probabilistic estimate of time-to-failure for a corroding anchor strand system. These anchor... stability to the structure. A series of unique pull-test experiments conducted by Ebeling et al. (2016) at the U.S. Army Engineer Research and... Reliability (CAS_Stab-R) produces probabilistic remaining anchor lifetime estimates for anchor cables based upon the direct corrosion rate for the

  6. Probabilistic Prognosis of Non-Planar Fatigue Crack Growth

    NASA Technical Reports Server (NTRS)

    Leser, Patrick E.; Newman, John A.; Warner, James E.; Leser, William P.; Hochhalter, Jacob D.; Yuan, Fuh-Gwo

    2016-01-01

    Quantifying the uncertainty in model parameters for the purpose of damage prognosis can be accomplished utilizing Bayesian inference and damage diagnosis data from sources such as non-destructive evaluation or structural health monitoring. The number of samples required to solve the Bayesian inverse problem through common sampling techniques (e.g., Markov chain Monte Carlo) renders high-fidelity finite element-based damage growth models unusable due to prohibitive computation times. However, these types of models are often the only option when attempting to model complex damage growth in real-world structures. Here, a recently developed high-fidelity crack growth model is used which, when compared to finite element-based modeling, has demonstrated reductions in computation times of three orders of magnitude through the use of surrogate models and machine learning. The model is flexible in that only the expensive computation of the crack driving forces is replaced by the surrogate models, leaving the remaining parameters accessible for uncertainty quantification. A probabilistic prognosis framework incorporating this model is developed and demonstrated for non-planar crack growth in a modified, edge-notched, aluminum tensile specimen. Predictions of remaining useful life are made over time for five updates of the damage diagnosis data, and prognostic metrics are utilized to evaluate the performance of the prognostic framework. Challenges specific to the probabilistic prognosis of non-planar fatigue crack growth are highlighted and discussed in the context of the experimental results.

  7. Probabilistic Seismic Risk Model for Western Balkans

    NASA Astrophysics Data System (ADS)

    Stejskal, Vladimir; Lorenzo, Francisco; Pousse, Guillaume; Radovanovic, Slavica; Pekevski, Lazo; Dojcinovski, Dragi; Lokin, Petar; Petronijevic, Mira; Sipka, Vesna

    2010-05-01

    A probabilistic seismic risk model for insurance and reinsurance purposes is presented for the Western Balkans, covering the former Yugoslavia and Albania. This territory experienced many severe earthquakes during past centuries, producing significant damage to many population centres in the region. The highest hazard is related to the external Dinarides, namely the collision zone of the Adriatic plate. The model is based on a unified catalogue for the region and a seismic source model consisting of more than 30 zones covering all three main structural units: the Southern Alps, the Dinarides and the south-western margin of the Pannonian Basin. A probabilistic methodology using Monte Carlo simulation was applied to generate the hazard component of the model. A unique set of damage functions, based on both loss experience and engineering assessments, is used to convert the modelled ground motion severity into monetary loss.

  8. A performance-based approach to landslide risk analysis

    NASA Astrophysics Data System (ADS)

    Romeo, R. W.

    2009-04-01

    An approach to risk assessment based on a probabilistic analysis of the performance of structures threatened by landslides is presented and discussed. The risk is a possible loss due to the occurrence of a potentially damaging event. Analytically, the risk is the probability convolution of hazard, which defines the frequency of occurrence of the event (i.e., the demand), and fragility, which defines the capacity of the system to withstand the event given its characteristics (i.e., severity) and those of the exposed goods (vulnerability), that is: Risk = P(D >= d | S, V). The inequality sets a damage (or loss) threshold beyond which the system's performance is no longer met. Therefore a consistent approach to risk assessment should: 1) adopt a probabilistic model which takes into account all the uncertainties of the involved variables (capacity and demand); 2) follow a performance approach based on given loss or damage thresholds. The proposed method belongs to the category of semi-empirical methods: the theoretical component is given by the probabilistic capacity-demand model, and the empirical component by the observed statistical behaviour of structures damaged by landslides. Only two landslide properties are required: the areal extent and the type (or kinematism). All other properties required to determine the severity of landslides (such as depth, speed and frequency) are derived via probabilistic methods. The severity (or intensity) of landslides, in terms of kinetic energy, is the demand of resistance; the resistance capacity is given by the cumulative distribution functions of the limit state performance (fragility functions), assessed via damage surveys and card compilation. The investigated limit states are aesthetic (of nominal concern alone), functional (interruption of service) and structural (economic and social losses). The damage probability is the probabilistic convolution of hazard (the probability mass function of the frequency of occurrence of given severities) and vulnerability (the probability that a limit state performance is reached, given a certain severity). Then, for each landslide, all the exposed goods (structures and infrastructures) within the landslide area and within a buffer (representative of the maximum extension of a landslide given a reactivation) are counted. The risk is the product of the damage probability and the ratio of the exposed goods of each landslide to the whole assets exposed to the same type of landslides. Since the risk is computed numerically, by the same procedure applied to all landslides, it is free from the subjective assessments implied in qualitative methods.
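
    With severities discretized, the hazard-vulnerability convolution described above reduces to a weighted sum per limit state. A minimal sketch follows; the severity classes, probabilities and fragility values are invented for illustration.

    ```python
    import numpy as np

    # illustrative severity classes (e.g. landslide kinetic energy bands)
    hazard_pmf = np.array([0.60, 0.25, 0.10, 0.05])  # P(severity class occurs)

    # fragility: P(limit state reached | severity), one row per performance level
    fragility = {
        "aesthetic":  np.array([0.30, 0.60, 0.90, 1.00]),
        "functional": np.array([0.05, 0.25, 0.60, 0.90]),
        "structural": np.array([0.01, 0.05, 0.30, 0.70]),
    }

    # damage probability = sum over severities of hazard * vulnerability
    for level, frag in fragility.items():
        print(level, "P(D >= d) =", np.sum(hazard_pmf * frag))
    ```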

  9. Probabilistic Methods for Structural Design and Reliability

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.; Whitlow, Woodrow, Jr. (Technical Monitor)

    2002-01-01

    This report describes a formal method to quantify structural damage tolerance and reliability in the presence of a multitude of uncertainties in turbine engine components. The method is based at the material behavior level, where primitive variables with their respective scatter ranges are used to describe behavior. Computational simulation is then used to propagate the uncertainties to the structural scale, where damage tolerance and reliability are usually specified. Several sample cases are described to illustrate the effectiveness, versatility, and maturity of the method. Typical results from this method demonstrate that it is mature and that it can be used to probabilistically evaluate turbine engine structural components. It may be inferred from the results that the method is suitable for probabilistically predicting the remaining life in aging or deteriorating structures, for making strategic projections and plans, and for achieving better, cheaper, faster products that give competitive advantages in world markets.

  10. Analysis of the French insurance market exposure to floods: a stochastic model combining river overflow and surface runoff

    NASA Astrophysics Data System (ADS)

    Moncoulon, D.; Labat, D.; Ardon, J.; Onfroy, T.; Leblois, E.; Poulard, C.; Aji, S.; Rémy, A.; Quantin, A.

    2013-07-01

    The analysis of flood exposure at a national scale for the French insurance market must combine the generation of a probabilistic event set, covering all possible but not yet observed flood situations, with hazard and damage modeling. In this study, hazard and damage models are calibrated on a 1995-2012 historical event set, both for hazard results (river flow, flooded areas) and for loss estimations. Thus, the uncertainties in the deterministic estimation of a single event loss are known before a probabilistic event set is simulated. To account for at least 90% of the insured flood losses, the probabilistic event set must combine river overflow (small and large catchments) with surface runoff due to heavy rainfall on the slopes of the watershed. Indeed, internal studies of the CCR claims database have shown that approximately 45% of the insured flood losses are located inside floodplains and 45% outside; the remaining 10% are due to sea-surge floods and groundwater rise. In this approach, two independent probabilistic methods are combined to create a single flood loss distribution: generation of fictive river flows based on the historical records of the river gauge network, and generation of fictive rain fields on small catchments calibrated on the 1958-2010 Météo-France rain database SAFRAN. All the events in the probabilistic event sets are simulated with the deterministic model. This hazard and damage distribution is used to simulate flood losses at the national scale for an insurance company (MACIF) and to generate flood areas associated with hazard return periods. The flood maps cover river overflow and surface water runoff. Validation of these maps is conducted by comparison with address-located claim data on a small catchment (the downstream Argens).

  11. A probabilistic estimate of maximum acceleration in rock in the contiguous United States

    USGS Publications Warehouse

    Algermissen, Sylvester Theodore; Perkins, David M.

    1976-01-01

    This paper presents a probabilistic estimate of the maximum ground acceleration to be expected from earthquakes occurring in the contiguous United States. It is based primarily upon the historic seismic record, which ranges from very incomplete before 1930 to moderately complete after 1960. Geologic data, primarily the distribution of faults, have been employed only to a minor extent, because most such data have not yet been interpreted with earthquake hazard evaluation in mind. The map provides a preliminary estimate of the relative hazard in various parts of the country. The report provides a method for evaluating the relative importance of the many parameters and assumptions in hazard analysis. The map and methods of evaluation described reflect the current state of understanding and are intended to be useful for engineering purposes in reducing the effects of earthquakes on buildings and other structures. Studies are underway on improved methods for evaluating the relative earthquake hazard of different regions. Comments on this paper are invited to help guide future research and revisions of the accompanying map. The earthquake hazard in the United States has been estimated in a variety of ways since the initial effort by Ulrich (see Roberts and Ulrich, 1950). In general, the earlier maps provided an estimate of the severity of ground shaking or damage, but the frequency of occurrence of the shaking or damage was not given. Ulrich's map showed the distribution of expected damage in terms of no damage (zone 0), minor damage (zone 1), moderate damage (zone 2), and major damage (zone 3). The zones were not defined further and the frequency of occurrence of damage was not suggested. Richter (1959) and Algermissen (1969) estimated the ground motion in terms of maximum Modified Mercalli intensity. Richter used the terms "occasional" and "frequent" to characterize intensity IX shaking, and Algermissen included recurrence curves for various parts of the country in the paper accompanying his map. The first probabilistic hazard maps covering portions of the United States were by Milne and Davenport (1969a). Recently, Wiggins, Hirshberg and Bronowicki (1974) prepared a probabilistic map of maximum particle velocity and Modified Mercalli intensity for the entire United States. The maps are based on an analysis of the historical seismicity. In general, geological data were not incorporated into the development of the maps.

  12. A probabilistic fatigue analysis of multiple site damage

    NASA Technical Reports Server (NTRS)

    Rohrbaugh, S. M.; Ruff, D.; Hillberry, B. M.; Mccabe, G.; Grandt, A. F., Jr.

    1994-01-01

    The variability in initial crack size and fatigue crack growth is incorporated in a probabilistic model that is used to predict the fatigue lives for unstiffened aluminum alloy panels containing multiple site damage (MSD). The uncertainty of the damage in the MSD panel is represented by a distribution of fatigue crack lengths that are analytically derived from equivalent initial flaw sizes. The variability in fatigue crack growth rate is characterized by stochastic descriptions of crack growth parameters for a modified Paris crack growth law. A Monte-Carlo simulation explicitly describes the MSD panel by randomly selecting values from the stochastic variables and then grows the MSD cracks with a deterministic fatigue model until the panel fails. Different simulations investigate the influences of the fatigue variability on the distributions of remaining fatigue lives. Six cases that consider fixed and variable conditions of initial crack size and fatigue crack growth rate are examined. The crack size distribution exhibited a dominant effect on the remaining fatigue life distribution, and the variable crack growth rate exhibited a lesser effect on the distribution. In addition, the probabilistic model predicted that only a small percentage of the life remains after a lead crack develops in the MSD panel.
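
    The engine of such a simulation is deterministic Paris-law growth repeated over randomly drawn inputs. Below is a single-crack sketch (a full MSD analysis would track several interacting cracks per panel); the distributions, constants and units are illustrative assumptions.

    ```python
    import numpy as np

    rng = np.random.default_rng(11)
    n_panels = 2000

    # stochastic inputs: equivalent initial crack size and Paris coefficient C
    a0 = rng.lognormal(mean=np.log(0.5e-3), sigma=0.4, size=n_panels)  # [m]
    C  = rng.lognormal(mean=np.log(1e-11), sigma=0.2, size=n_panels)
    m_exp = 3.0                        # Paris exponent (held fixed here)
    d_sigma, Y = 100.0, 1.12           # stress range [MPa], geometry factor
    a_crit = 25e-3                     # failure crack length [m]

    def cycles_to_failure(a0_i, C_i, dN=1000):
        """Integrate da/dN = C * (dK)^m in blocks of dN cycles."""
        a, n = a0_i, 0
        while a < a_crit:
            dK = Y * d_sigma * np.sqrt(np.pi * a)     # [MPa sqrt(m)]
            a += C_i * dK**m_exp * dN
            n += dN
            if n > 5e7:                # guard against run-away loops
                break
        return n

    lives = np.array([cycles_to_failure(a, c) for a, c in zip(a0, C)])
    print("median life:", np.median(lives), " 5% life:", np.percentile(lives, 5))
    ```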

  13. Probabilistic analysis of the influence of the bonding degree of the stem-cement interface in the performance of cemented hip prostheses.

    PubMed

    Pérez, M A; Grasa, J; García-Aznar, J M; Bea, J A; Doblaré, M

    2006-01-01

    The long-term behavior of the stem-cement interface is one of the most frequent topics of discussion in the design of cemented total hip replacements, especially with regard to the process of damage accumulation in the cement layer. This effect is analyzed here by comparing two different interface conditions: completely bonded, and debonded with friction. The comparative analysis is performed using a probabilistic computational approach that considers the variability and uncertainty of determinant factors that directly affect damage accumulation in the cement mantle. This stochastic technique is based on the combination of probabilistic finite elements (PFEM) and a cumulative damage approach known as the B-model. Three random variables were considered: muscle and joint contact forces at the hip (both for walking and stair climbing), cement damage, and the fatigue properties of the cement. The results predicted that the regions with higher failure probability in the bulk cement are completely different depending on the stem-cement interface characteristics. For a bonded interface, critical sites appeared at the distal and medial parts of the cement, while for debonded interfaces the critical regions were found distally and proximally. For bonded interfaces, the failure probability was higher than for debonded ones; the same conclusion may be drawn for stair climbing in comparison with walking.

  14. Hybrid Intrusion Forecasting Framework for Early Warning System

    NASA Astrophysics Data System (ADS)

    Kim, Sehun; Shin, Seong-Jun; Kim, Hyunwoo; Kwon, Ki Hoon; Han, Younggoo

    Recently, cyber attacks have become a serious hindrance to the stability of the Internet. These attacks exploit the interconnectivity of networks, propagate in an instant, and have become more sophisticated and evolutionary. Traditional Internet security systems such as firewalls, IDS and IPS are limited in their ability to detect recent cyber attacks in advance, as these systems respond to Internet attacks only after the attacks inflict serious damage. In this paper, we propose a hybrid intrusion forecasting system framework for an early warning system. The proposed system utilizes three types of forecasting methods: time-series analysis, probabilistic modeling, and data mining. By combining these methods, it is possible to take advantage of each technique's forecasting strengths while overcoming its individual drawbacks. Experimental results show that the hybrid intrusion forecasting method outperforms each of the three individual forecasting methods.

  15. Long-term strength and damage accumulation in laminates

    NASA Astrophysics Data System (ADS)

    Dzenis, Yuris A.; Joshi, Shiv P.

    1993-04-01

    A modified version of the probabilistic model developed by the authors for damage evolution analysis of laminates subjected to random loading is utilized to predict the long-term strength of laminates. The model assumes that each ply in a laminate consists of a large number of mesovolumes. Probabilistic variation functions for mesovolume stiffnesses and strengths are used in the analysis. Stochastic strains are calculated using lamination theory and random function theory. Deterioration of ply stiffnesses is calculated from the probabilities of mesovolume failures using the theory of excursions of a random process beyond given limits. Long-term strength and damage accumulation in a Kevlar/epoxy laminate under tension and complex in-plane loading are investigated. Effects of the mean level and stochastic deviation of loading on damage evolution and time-to-failure of the laminate are discussed. Long-term cumulative damage at the time of final failure is greater at low loading levels than at high loading levels, and the effect of the deviation in loading is more pronounced at lower mean loading levels.
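
    For a stationary Gaussian stress process, the excursion rate that such analyses typically build on is Rice's upcrossing formula, stated below in generic notation as a reference point rather than as the authors' exact formulation.

    ```latex
    % Rice's mean upcrossing rate of level u for a stationary Gaussian process
    % X(t) with mean \mu_X, std. \sigma_X and derivative std. \sigma_{\dot X}:
    \nu^{+}(u) = \frac{1}{2\pi}\,\frac{\sigma_{\dot X}}{\sigma_X}
                 \exp\!\left(-\frac{(u-\mu_X)^{2}}{2\sigma_X^{2}}\right)
    % For rare excursions, the failure probability over [0, T] is then roughly
    P_f(T) \approx 1 - \exp\!\left(-\nu^{+}(u)\,T\right)
    ```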

  16. The application of probabilistic fracture analysis to residual life evaluation of embrittled reactor vessels

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dickson, T.L.; Simonen, F.A.

    1992-05-01

    Probabilistic fracture mechanics analysis is a major element of the comprehensive probabilistic methodology on which current NRC regulatory requirements for pressurized water reactor vessel integrity evaluation are based. Computer codes such as OCA-P and VISA-II perform probabilistic fracture analyses to estimate the increase in vessel failure probability that occurs as the vessel material accumulates radiation damage over the operating life of the vessel. The results of such analyses, when compared with limits of acceptable failure probabilities, provide an estimate of the residual life of a vessel. Such codes can be applied to evaluate the potential benefits of plant-specific mitigating actions designed to reduce the probability of failure of a reactor vessel. 10 refs.

  17. The application of probabilistic fracture analysis to residual life evaluation of embrittled reactor vessels

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dickson, T.L.; Simonen, F.A.

    1992-01-01

    Probabilistic fracture mechanics analysis is a major element of the comprehensive probabilistic methodology on which current NRC regulatory requirements for pressurized water reactor vessel integrity evaluation are based. Computer codes such as OCA-P and VISA-II perform probabilistic fracture analyses to estimate the increase in vessel failure probability that occurs as the vessel material accumulates radiation damage over the operating life of the vessel. The results of such analyses, when compared with limits of acceptable failure probabilities, provide an estimate of the residual life of a vessel. Such codes can be applied to evaluate the potential benefits of plant-specific mitigating actions designed to reduce the probability of failure of a reactor vessel. 10 refs.

  17. Wind effects on long-span bridges: Probabilistic wind data format for buffeting and VIV load assessments

    NASA Astrophysics Data System (ADS)

    Hoffmann, K.; Srouji, R. G.; Hansen, S. O.

    2017-12-01

    The technology development within the structural design of long-span bridges in Norwegian fjords has created a need to reformulate the calculation format and the physical quantities used to describe the properties of wind and the associated wind-induced effects on bridge decks. Parts of a new probabilistic format describing the incoming, undisturbed wind are presented. It is expected that a fixed probabilistic format will facilitate a more physically consistent and precise description of the wind conditions, which in turn increases the accuracy and considerably reduces the uncertainties in wind load assessments. Because the format is probabilistic, a quantification of the level of safety and uncertainty in predicted wind loads is readily accessible. A simple buffeting response calculation demonstrates the use of probabilistic wind data in the assessment of wind loads and responses. Furthermore, vortex-induced fatigue damage is discussed in relation to probabilistic wind turbulence data and response measurements from wind tunnel tests.

  18. Demonstration of the Application of Composite Load Spectra (CLS) and Probabilistic Structural Analysis (PSAM) Codes to SSME Heat Exchanger Turnaround Vane

    NASA Technical Reports Server (NTRS)

    Rajagopal, Kadambi R.; DebChaudhury, Amitabha; Orient, George

    2000-01-01

    This report describes a probabilistic structural analysis performed to determine the probabilistic structural response under fluctuating random pressure loads for the Space Shuttle Main Engine (SSME) turnaround vane. It uses a newly developed frequency- and distance-dependent correlation model that captures the decay phenomena along and across the flow, with the capability to introduce a phase delay. The analytical results are compared using two computer codes, SAFER (Spectral Analysis of Finite Element Responses) and NESSUS (Numerical Evaluation of Stochastic Structures Under Stress), and against experimentally observed strain gage data. The computer code NESSUS, with an interface to a subset of the Composite Load Spectra (CLS) code, is used for the probabilistic analysis. A fatigue code was used to calculate fatigue damage due to the random pressure excitation. The random variables modeled include engine system primitive variables that influence the operating conditions, the convection velocity coefficient, the stress concentration factor, structural damping, and the thicknesses of the inner and outer vanes. The need for an appropriate correlation model, in addition to the magnitude of the PSD, is emphasized. The study demonstrates that correlation characteristics, even under random pressure loads, are capable of causing resonance-like effects for some modes. The study identifies the important variables that contribute to the structural alternating stress response and drive the fatigue damage for the new design. Since the alternating stress for the redesign is less than the endurance limit of the material, the damage due to high-cycle fatigue is negligible.

  19. Damage Tolerance and Reliability of Turbine Engine Components

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.

    1999-01-01

    This report describes a formal method to quantify structural damage tolerance and reliability in the presence of a multitude of uncertainties in turbine engine components. The method is based at the material behavior level, where primitive variables with their respective scatter ranges are used to describe behavior. Computational simulation is then used to propagate the uncertainties to the structural scale, where damage tolerance and reliability are usually specified. Several sample cases are described to illustrate the effectiveness, versatility, and maturity of the method. Typical results from this method demonstrate that it is mature and that it can be used to probabilistically evaluate turbine engine structural components. It may be inferred from the results that the method is suitable for probabilistically predicting the remaining life in aging or deteriorating structures, for making strategic projections and plans, and for achieving better, cheaper, faster products that give competitive advantages in world markets.

  1. In-situ monitoring and assessment of post barge-bridge collision damage for minimizing traffic delay and detour : final report.

    DOT National Transportation Integrated Search

    2016-07-31

    This report presents a novel framework for promptly assessing the probability of barge-bridge collision damage of piers based on probabilistic classification through machine learning. The main idea of the presented framework is to divide th...

  2. Probabilistic, meso-scale flood loss modelling

    NASA Astrophysics Data System (ADS)

    Kreibich, Heidi; Botto, Anna; Schröter, Kai; Merz, Bruno

    2016-04-01

    Flood risk analyses are an important basis for decisions on flood risk management and adaptation. However, such analyses are associated with significant uncertainty, all the more so if changes in risk due to global change are expected. Although uncertainty analysis and probabilistic approaches have received increased attention in recent years, they are still not standard practice for flood risk assessments, let alone for flood loss modelling. The state of the art in flood loss modelling is still the use of simple, deterministic approaches like stage-damage functions. Novel probabilistic, multi-variate flood loss models have been developed and validated on the micro-scale using a data-mining approach, namely bagging decision trees (Merz et al. 2013). In this presentation we demonstrate and evaluate the upscaling of the approach to the meso-scale, namely on the basis of land-use units. The model is applied in 19 municipalities which were affected during the 2002 flood by the River Mulde in Saxony, Germany (Botto et al. submitted). The application of bagging decision-tree-based loss models provides a probability distribution of estimated loss per municipality. Validation is undertaken, on the one hand, via a comparison with eight deterministic loss models, including stage-damage functions as well as multi-variate models. On the other hand, the results are compared with official loss data provided by the Saxon Relief Bank (SAB). The results show that uncertainties of loss estimation remain high. Thus, the significant advantage of this probabilistic flood loss estimation approach is that it inherently provides quantitative information about the uncertainty of the prediction. References: Merz, B.; Kreibich, H.; Lall, U. (2013): Multi-variate flood damage assessment: a tree-based data-mining approach. NHESS, 13(1), 53-64. Botto A, Kreibich H, Merz B, Schröter K (submitted) Probabilistic, multi-variable flood loss modelling on the meso-scale with BT-FLEMO. Risk Analysis.
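
    A small illustration of the underlying technique, bagging decision trees, using scikit-learn on synthetic data (the predictors, coefficients, and sample sizes are invented for illustration; BT-FLEMO itself is trained on the authors' micro-scale data). Each tree in the ensemble returns one loss estimate, and the spread across trees gives the loss distribution the abstract describes:

    ```python
    import numpy as np
    from sklearn.ensemble import BaggingRegressor

    rng = np.random.default_rng(1)
    n = 500
    # hypothetical predictors: water depth [m], flood duration [h], precaution score
    X = np.column_stack([rng.uniform(0, 3, n),
                         rng.uniform(1, 72, n),
                         rng.integers(0, 4, n)])
    # synthetic loss ratios standing in for observed micro-scale damage data
    y = np.clip(0.2 * X[:, 0] + 0.002 * X[:, 1] - 0.03 * X[:, 2]
                + rng.normal(0, 0.05, n), 0, 1)

    # bagged decision trees (the default base estimator is a regression tree)
    model = BaggingRegressor(n_estimators=100, random_state=0).fit(X, y)

    x_new = np.array([[1.5, 24.0, 1.0]])
    per_tree = np.array([t.predict(x_new)[0] for t in model.estimators_])
    print(f"mean loss ratio {per_tree.mean():.3f}, "
          f"5-95% interval [{np.percentile(per_tree, 5):.3f}, "
          f"{np.percentile(per_tree, 95):.3f}]")
    ```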

  3. Analysis of the French insurance market exposure to floods: a stochastic model combining river overflow and surface runoff

    NASA Astrophysics Data System (ADS)

    Moncoulon, D.; Labat, D.; Ardon, J.; Leblois, E.; Onfroy, T.; Poulard, C.; Aji, S.; Rémy, A.; Quantin, A.

    2014-09-01

    The analysis of flood exposure at a national scale for the French insurance market must combine the generation of a probabilistic event set of all possible (but not yet observed) flood situations with hazard and damage modeling. In this study, hazard and damage models are calibrated on a 1995-2010 historical event set, both for hazard results (river flow, flooded areas) and loss estimations. Thus, uncertainties in the deterministic estimation of a single event loss are known before simulating a probabilistic event set. To take into account at least 90 % of the insured flood losses, the probabilistic event set must combine river overflow (small and large catchments) with surface runoff due to heavy rainfall on the slopes of the watershed. Indeed, internal studies of the CCR (Caisse Centrale de Réassurance) claim database have shown that approximately 45 % of the insured flood losses are located inside the floodplains and 45 % outside; the remaining 10 % is due to sea surge floods and groundwater rise. In this approach, two independent probabilistic methods are combined to create a single flood loss distribution: a generation of fictive river flows based on the historical records of the river gauge network, and a generation of fictive rain fields on small catchments calibrated on the 1958-2010 Météo-France rain database SAFRAN. All the events in the probabilistic event sets are simulated with the deterministic model. This hazard and damage distribution is used to simulate flood losses at the national scale for an insurance company (Macif) and to generate flood areas associated with hazard return periods. The flood maps cover river overflow and surface water runoff. Validation of these maps is conducted by comparison with address-located claim data on a small catchment (downstream Argens).

  4. Rainfall-induced landslide vulnerability Assessment in urban area reflecting Urban structure and building characteristics

    NASA Astrophysics Data System (ADS)

    Park, C.; Cho, M.; Lee, D.

    2017-12-01

    A landslide vulnerability assessment methodology for urban areas is proposed that reflects urban structure and building characteristics and can consider the total damage cost of climate impacts. We used a probabilistic analysis method for modeling rainfall-induced shallow landslide susceptibility by slope stability analysis and Monte Carlo simulations, and we combined debris flows while considering their spatial movement under topographical conditions and the built environment. Urban vulnerability to landslides is assessed in two categories: physical damage and urban structure. Physical vulnerability is related to buildings, roads, and other urban infrastructure. Urban structure vulnerability is considered a function of socio-economic factors, triggers of secondary damage, and the preparedness level of the local government. An index-based model is developed to evaluate life loss and indirect damage under landslides as well as the resilience against disasters. The analysis was performed in a geographic information system (GIS) environment because GIS can deal efficiently with a large volume of spatial data. The results of the landslide susceptibility assessment were compared with the landslide inventory, and the proposed approach demonstrated good predictive performance. The general trend found in this study indicates that areas with higher population density and weaker fiscal conditions, located downstream of mountainous areas, are more vulnerable than areas in the opposite conditions.

  5. Probabilistic evaluation of uncertainties and risks in aerospace components

    NASA Technical Reports Server (NTRS)

    Shah, A. R.; Shiao, M. C.; Nagpal, V. K.; Chamis, C. C.

    1992-01-01

    This paper summarizes a methodology developed at NASA Lewis Research Center which computationally simulates the structural, material, and load uncertainties associated with Space Shuttle Main Engine (SSME) components. The methodology was applied to evaluate the scatter in static, buckling, dynamic, fatigue, and damage behavior of the SSME turbopump blade. Also calculated are the probability densities of typical critical blade responses, such as effective stress, natural frequency, damage initiation, and most probable damage path. Risk assessments were performed for different failure modes, and the effect of material degradation on the fatigue and damage behaviors of a blade was calculated using a multi-factor interaction equation. Failure probabilities for different fatigue cycles were computed, and the uncertainties associated with damage initiation and damage propagation due to different load cycles were quantified. Evaluations of the effects of mistuned blades on a rotor were made; uncertainties in the excitation frequency were found to significantly amplify the blade responses of a mistuned rotor. The effects of the number of blades on a rotor were also studied. The autocorrelation function of displacements and the probability density function of the first passage time for deterministic and random barriers for structures subjected to random processes were also computed. A brief discussion is included on the future direction of probabilistic structural analysis.

  6. Optimization of Systems with Uncertainty: Initial Developments for Performance, Robustness and Reliability Based Designs

    NASA Technical Reports Server (NTRS)

    Crespo, Luis G.; Bushnell, Dennis M. (Technical Monitor)

    2002-01-01

    This paper presents a study on the optimization of systems with structured uncertainties, whose inputs and outputs can be exhaustively described in the probabilistic sense. By propagating the uncertainty from the input to the output in the space of probability density functions and moments, optimization problems that pursue performance-, robustness-, and reliability-based designs are studied. By specifying the desired outputs first in terms of desired probability density functions and then in terms of meaningful probabilistic indices, we establish a computationally viable framework for solving practical optimization problems. Applications to static optimization and stability control are used to illustrate the relevance of incorporating uncertainty in the early stages of design. Several examples that admit a full probabilistic description of the output in terms of the design variables and the uncertain inputs are used to elucidate the main features of the generic problem and its solution. Extensions to problems that do not admit closed-form solutions are also evaluated. Concrete evidence of the importance of using a consistent probabilistic formulation of the optimization problem and a meaningful probabilistic description of its solution is provided in the examples. In the stability control problem, the analysis shows that standard deterministic approaches lead to designs with a high probability of running into instability. The implementation of such designs can indeed have catastrophic consequences.

  7. A Probabilistic Typhoon Risk Model for Vietnam

    NASA Astrophysics Data System (ADS)

    Haseemkunju, A.; Smith, D. F.; Brolley, J. M.

    2017-12-01

    Annually, the coastal provinces of Vietnam, from the low-lying Mekong River delta in the southwest to the Red River delta in the north, are exposed to severe wind and flood risk from landfalling typhoons. On average, two to three tropical cyclones with maximum sustained wind speeds of at least 34 knots make landfall along the Vietnam coast. Recently, Typhoon Wutip (2013) crossed central Vietnam as a category 2 typhoon, causing significant property damage. As tropical cyclone risk is expected to grow with increasing exposure and population along the coastal provinces of Vietnam, insurance/reinsurance and capital markets need a comprehensive probabilistic model to assess typhoon risk in Vietnam. In 2017, CoreLogic expanded the geographical coverage of its basin-wide Western North Pacific probabilistic typhoon risk model to estimate the economic and insured losses from landfalling and by-passing tropical cyclones in Vietnam. The updated model is based on 71 years (1945-2015) of typhoon best-track data and 10,000 years of basin-wide simulated stochastic tracks covering eight countries, including Vietnam. The model is capable of estimating damage from wind, storm surge, and rainfall flooding using vulnerability models, which relate typhoon hazard to building damageability. The hazard and loss models are validated against past historical typhoons affecting Vietnam; notable typhoons causing significant damage in Vietnam are Lola (1993), Frankie (1996), Xangsane (2006), and Ketsana (2009). The central and northern coastal provinces of Vietnam are more vulnerable to wind and flood hazard, while typhoon risk in the southern provinces is relatively low.

  8. Probabilistic liquefaction triggering based on the cone penetration test

    USGS Publications Warehouse

    Moss, R.E.S.; Seed, R.B.; Kayen, R.E.; Stewart, J.P.; Tokimatsu, K.

    2005-01-01

    Performance-based earthquake engineering requires a probabilistic treatment of potential failure modes in order to accurately quantify the overall stability of the system. This paper is a summary of the application portions of the probabilistic liquefaction triggering correlations recently proposed by Moss and co-workers. To enable probabilistic treatment of liquefaction triggering, the variables comprising the seismic load and the liquefaction resistance were treated as inherently uncertain. Supporting data from an extensive Cone Penetration Test (CPT)-based liquefaction case history database were used to develop a probabilistic correlation. The methods used to measure the uncertainty of the load and resistance variables, how the interactions of these variables were treated using Bayesian updating, and how reliability analysis was applied to produce curves of equal probability of liquefaction are presented. The normalization for effective overburden stress, the magnitude-correlated duration weighting factor, and the non-linear shear mass participation factor used are also discussed.

  9. Robust Control Design for Systems With Probabilistic Uncertainty

    NASA Technical Reports Server (NTRS)

    Crespo, Luis G.; Kenny, Sean P.

    2005-01-01

    This paper presents a reliability- and robustness-based formulation for robust control synthesis for systems with probabilistic uncertainty. In a reliability-based formulation, the probability of violating design requirements prescribed by inequality constraints is minimized. In a robustness-based formulation, a metric which measures the tendency of a random variable/process to cluster close to a target scalar/function is minimized. A multi-objective optimization procedure, which combines stability and performance requirements in the time and frequency domains, is used to search for robustly optimal compensators. Some of the fundamental differences between the proposed strategy and conventional robust control methods are: (i) unnecessary conservatism is eliminated since there is no need for convex supports, (ii) the most likely plants are favored during synthesis, allowing for probabilistic robust optimality, (iii) the tradeoff between robust stability and robust performance can be explored numerically, (iv) the uncertainty set is closely related to parameters with clear physical meaning, and (v) compensators with improved robust characteristics for a given control structure can be synthesized.

  10. Probabilistic Harmonic Analysis on Distributed Photovoltaic Integration Considering Typical Weather Scenarios

    NASA Astrophysics Data System (ADS)

    Bin, Che; Ruoying, Yu; Dongsheng, Dang; Xiangyan, Wang

    2017-05-01

    Distributed generation (DG) integrated into the network causes harmonic pollution, which can damage electrical devices and affect the normal operation of the power system. On the other hand, due to the randomness of wind and solar irradiation, the output of DG is random too, which leads to uncertainty in the harmonics generated by the DG. Thus, probabilistic methods are needed to analyse the impacts of DG integration. In this work we studied the probabilistic distribution of harmonic voltage and the harmonic distortion in a distribution network after integration of a distributed photovoltaic (DPV) system under different weather conditions, namely sunny, cloudy, rainy, and snowy days. The probabilistic distribution function of the DPV output power in each typical weather condition was acquired via maximum likelihood estimation. The Monte Carlo simulation method was adopted to calculate the probabilistic distribution of harmonic voltage content at different harmonic orders, as well as the total harmonic distortion (THD), in typical weather conditions. The case study was based on the IEEE 33-bus system, and the results for the probabilistic distribution of harmonic voltage content as well as THD in typical weather conditions were compared.
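
    A minimal sketch of the Monte Carlo step, with Beta-distributed PV output per weather type standing in for the maximum-likelihood fits mentioned above (the Beta parameters, harmonic orders, and injection levels are invented for illustration, and no network model is included):

    ```python
    import numpy as np

    rng = np.random.default_rng(2)

    # hypothetical Beta parameters for normalized PV output per weather type
    weather_params = {"sunny": (5, 2), "cloudy": (2, 2),
                      "rainy": (1.2, 4), "snowy": (1, 6)}
    orders = np.array([5, 7, 11, 13])               # characteristic inverter harmonics
    base_inj = np.array([0.03, 0.02, 0.01, 0.008])  # p.u. harmonic level at rated output

    for weather, (a, b) in weather_params.items():
        p = rng.beta(a, b, size=10_000)             # Monte Carlo samples of PV output
        # harmonic voltage content assumed proportional to output, with 10% scatter
        vh = p[:, None] * base_inj * rng.normal(1.0, 0.1, size=(10_000, len(orders)))
        thd = np.sqrt((vh ** 2).sum(axis=1))        # THD w.r.t. a 1 p.u. fundamental
        print(f"{weather:6s}: mean THD {100 * thd.mean():.2f}%, "
              f"95th percentile {100 * np.percentile(thd, 95):.2f}%")
    ```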

  11. Bayesian wavelet PCA methodology for turbomachinery damage diagnosis under uncertainty

    NASA Astrophysics Data System (ADS)

    Xu, Shengli; Jiang, Xiaomo; Huang, Jinzhi; Yang, Shuhua; Wang, Xiaofang

    2016-12-01

    Centrifugal compressors often suffer defects such as impeller cracking, resulting in forced outages of the entire plant. Damage diagnostics and condition monitoring of such turbomachinery systems have become an increasingly important and powerful tool to prevent potential failure in components and reduce unplanned forced outages and further maintenance costs, while improving the reliability, availability, and maintainability of a turbomachinery system. This paper presents a probabilistic signal processing methodology for damage diagnostics using multiple time history data collected from different locations of a turbomachine, considering data uncertainty and multivariate correlation. The proposed methodology is based on the integration of three advanced state-of-the-art data mining techniques: discrete wavelet packet transform, Bayesian hypothesis testing, and probabilistic principal component analysis. The multiresolution wavelet analysis approach is employed to decompose a time series signal into different levels of wavelet coefficients. These coefficients represent multiple time-frequency resolutions of a signal. Bayesian hypothesis testing is then applied to each level of wavelet coefficients to remove possible imperfections. The ratio-of-posterior-odds Bayesian approach provides a direct means to assess whether there is imperfection in the decomposed coefficients, thus avoiding over-denoising. Power spectral density, estimated by the Welch method, is utilized to evaluate the effectiveness of the Bayesian wavelet cleansing method. Furthermore, the probabilistic principal component analysis approach is developed to reduce the dimensionality of multiple time series and to address multivariate correlation and data uncertainty for damage diagnostics. The proposed methodology and generalized framework are demonstrated with a set of sensor data collected from a real-world centrifugal compressor with impeller cracks, through both time series and contour analyses of vibration signals and principal components.
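
    A compact sketch of the wavelet-packet cleansing stage using PyWavelets; a universal soft threshold stands in here for the paper's Bayesian posterior-odds test, and the probabilistic PCA stage is omitted (the signal and all parameters are synthetic):

    ```python
    import numpy as np
    import pywt  # PyWavelets

    rng = np.random.default_rng(3)
    t = np.linspace(0, 1, 1024)
    # synthetic vibration signal: two tones plus noise, standing in for sensor data
    x = (np.sin(2 * np.pi * 50 * t) + 0.5 * np.sin(2 * np.pi * 120 * t)
         + rng.normal(0, 0.8, t.size))

    # wavelet packet decomposition to level 3
    wp = pywt.WaveletPacket(data=x, wavelet="db4", mode="symmetric", maxlevel=3)
    for node in wp.get_level(3, order="natural"):
        c = node.data
        # universal threshold per coefficient level, as a simple stand-in for
        # the paper's Bayesian hypothesis test on each decomposition level
        thr = np.median(np.abs(c)) / 0.6745 * np.sqrt(2 * np.log(c.size))
        node.data = pywt.threshold(c, thr, mode="soft")
    cleaned = wp.reconstruct(update=True)[: x.size]
    print("signal std before/after cleansing:",
          round(float(x.std()), 3), round(float(cleaned.std()), 3))
    ```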

  12. Probabilistic Aeroelastic Analysis Developed for Turbomachinery Components

    NASA Technical Reports Server (NTRS)

    Reddy, T. S. R.; Mital, Subodh K.; Stefko, George L.; Pai, Shantaram S.

    2003-01-01

    Aeroelastic analyses for advanced turbomachines are being developed for use at the NASA Glenn Research Center and industry. At present, however, these analyses are used for turbomachinery design with uncertainties accounted for by safety factors. This approach may lead to overly conservative designs, thereby reducing the potential of designing higher-efficiency engines. An integration of the deterministic aeroelastic analysis methods with probabilistic analysis methods offers the potential to design efficient engines with fewer aeroelastic problems and to make a quantum leap toward designing safe, reliable engines. In this research, probabilistic analysis is integrated with aeroelastic analysis: (1) to determine the parameters that most affect the aeroelastic characteristics (forced response and stability) of a turbomachine component such as a fan, compressor, or turbine and (2) to give the acceptable standard deviation on the design parameters for an aeroelastically stable system. The approach taken is to combine the aeroelastic analysis of the MISER (MIStuned Engine Response) code with the FPI (fast probability integration) code. The role of MISER is to provide the functional relationships that tie the structural and aerodynamic parameters (the primitive variables) to the forced response amplitudes and stability eigenvalues (the response properties). The role of FPI is to perform probabilistic analyses by utilizing the response properties generated by MISER. The results are a probability density function for the response properties. The probabilistic sensitivities of the response variables to uncertainty in the primitive variables are obtained as a byproduct of the FPI technique. The combined aeroelastic and probabilistic analysis is applied to a 12-bladed cascade vibrating in bending and torsion. Of the 11 design parameters, 6 are considered to have probabilistic variation: space-to-chord ratio (SBYC), stagger angle (GAMA), elastic axis (ELAXS), Mach number (MACH), mass ratio (MASSR), and frequency ratio (WHWB). The cascade is considered to be in subsonic flow at Mach 0.7. The results of the probabilistic aeroelastic analysis are the probability density functions of the predicted aerodynamic damping and frequency for flutter and of the response amplitudes for forced response.

  13. Probabilistic Meteorological Characterization for Turbine Loads

    NASA Astrophysics Data System (ADS)

    Kelly, M.; Larsen, G.; Dimitrov, N. K.; Natarajan, A.

    2014-06-01

    Beyond the existing, limited IEC prescription to describe fatigue loads on wind turbines, we look toward probabilistic characterization of the loads via analogous characterization of the atmospheric flow, particularly for today's "taller" turbines with rotors well above the atmospheric surface layer. Based on data from multiple sites as well as theoretical bases from boundary-layer meteorology and atmospheric turbulence, we offer probabilistic descriptions of shear and turbulence intensity, elucidating the connection of each to the other as well as to atmospheric stability and terrain. These are used as input to load calculations, and with a statistical description of the load output, they allow for improved design and load calculations.
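
    As an illustration of what such a probabilistic description can look like in practice, here is a small sketch that fits a lognormal distribution to synthetic 10-minute turbulence-intensity samples (the sample values and the choice of distribution family are assumptions for illustration, not the paper's data):

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(4)
    # synthetic 10-min turbulence intensity samples at hub height (hypothetical site)
    ti = rng.lognormal(mean=np.log(0.12), sigma=0.35, size=5000)

    # fit a lognormal with location fixed at zero
    shape, loc, scale = stats.lognorm.fit(ti, floc=0)
    ti90 = stats.lognorm.ppf(0.90, shape, loc=loc, scale=scale)
    # such quantiles can feed directly into design load cases
    print(f"median TI {scale:.3f}, 90% quantile {ti90:.3f}")
    ```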

  14. A generative, probabilistic model of local protein structure.

    PubMed

    Boomsma, Wouter; Mardia, Kanti V; Taylor, Charles C; Ferkinghoff-Borg, Jesper; Krogh, Anders; Hamelryck, Thomas

    2008-07-01

    Despite significant progress in recent years, protein structure prediction maintains its status as one of the prime unsolved problems in computational biology. One of the key remaining challenges is an efficient probabilistic exploration of the structural space that correctly reflects the relative conformational stabilities. Here, we present a fully probabilistic, continuous model of local protein structure in atomic detail. The generative model makes efficient conformational sampling possible and provides a framework for the rigorous analysis of local sequence-structure correlations in the native state. Our method represents a significant theoretical and practical improvement over the widely used fragment assembly technique by avoiding the drawbacks associated with a discrete and nonprobabilistic approach.

  15. Probabilistic evaluation of damage potential in earthquake-induced liquefaction in a 3-D soil deposit

    NASA Astrophysics Data System (ADS)

    Halder, A.; Miller, F. J.

    1982-03-01

    A probabilistic model to evaluate the risk of liquefaction at a site, and to limit or eliminate damage during earthquake-induced liquefaction, is proposed. The model is extended to consider three-dimensional nonhomogeneous soil properties. The parameters relevant to the liquefaction phenomenon are identified, including: (1) soil parameters; (2) parameters required to consider laboratory test and sampling effects; and (3) loading parameters. The fundamentals of risk-based design concepts pertinent to liquefaction are reviewed. A detailed statistical evaluation of the soil parameters in the proposed liquefaction model is provided, and the uncertainty associated with the estimation of in situ relative density is evaluated for both direct and indirect methods. It is found that, for the liquefaction potential, the uncertainties in the load parameters could be higher than those in the resistance parameters.

  16. Dynamic Stability of Uncertain Laminated Beams Under Subtangential Loads

    NASA Technical Reports Server (NTRS)

    Goyal, Vijay K.; Kapania, Rakesh K.; Adelman, Howard (Technical Monitor); Horta, Lucas (Technical Monitor)

    2002-01-01

    Because of the inherent complexity of fiber-reinforced laminated composites, it can be challenging to manufacture composite structures to their exact design specifications, resulting in unwanted material and geometric uncertainties. In this research, we focus on the deterministic and probabilistic stability analysis of laminated structures subject to subtangential loading, a combination of conservative and nonconservative tangential loads, using the dynamic criterion. Thus, a shear-deformable laminated beam element, including warping effects, is derived to study the deterministic and probabilistic response of laminated beams. This twenty-one-degree-of-freedom element can be used for solving both static and dynamic problems. In the first-order shear-deformable model used here, we have employed a more accurate method to obtain the transverse shear correction factor. The dynamic version of the principle of virtual work for laminated composites is expressed in its nondimensional form, and the element tangent stiffness and mass matrices are obtained using analytical integration. The stability is studied by giving the structure a small disturbance about an equilibrium configuration and observing whether the resulting response remains small. In order to study the dynamic behavior with uncertainties included in the problem, three models were developed: exact Monte Carlo simulation, sensitivity-based Monte Carlo simulation, and probabilistic FEA. These methods were integrated into the developed finite element analysis. Perturbation and sensitivity analyses have also been used to study nonconservative problems, as well as the stability analysis, using the dynamic criterion.

  17. Verification of recursive probabilistic integration (RPI) method for fatigue life management using non-destructive inspections

    NASA Astrophysics Data System (ADS)

    Chen, Tzikang J.; Shiao, Michael

    2016-04-01

    This paper verifies a generic and efficient assessment concept for probabilistic fatigue life management. The concept is developed based on an integration of damage tolerance methodology, simulation methods [1, 2], and the probabilistic algorithm RPI (recursive probability integration) [3-9], considering maintenance for damage tolerance and risk-based fatigue life management. RPI is an efficient semi-analytical probabilistic method for risk assessment subject to various uncertainties, such as the variability in material properties including crack growth rate, initial flaw size, repair quality, random process modeling of flight loads for failure analysis, and inspection reliability represented by the probability of detection (POD). In addition, unlike traditional Monte Carlo simulation (MCS), which requires a rerun of the MCS when the maintenance plan is changed, RPI can repeatedly use a small set of baseline random crack growth histories, excluding maintenance-related parameters, from a single MCS for various maintenance plans. In order to fully validate the RPI method, a verification procedure was performed. In this study, MC simulations on the order of several hundred billion samples were conducted for various flight conditions, material properties, inspection schedules, PODs, and repair/replacement strategies. Since MC simulation is time-consuming, the simulations were conducted in parallel on DoD High Performance Computing (HPC) systems using a specialized random number generator for parallel computing. The study has shown that the RPI method is several orders of magnitude more efficient than traditional Monte Carlo simulation.
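
    To make the comparison concrete, here is a hedged Monte Carlo sketch of the kind of crack-growth-with-inspection simulation that RPI is designed to accelerate (the distributions, the simplified Paris-law growth, and the logistic POD curve are all assumptions for illustration, not values from the paper):

    ```python
    import numpy as np

    rng = np.random.default_rng(5)
    n = 100_000                                  # MC samples (RPI reuses such histories)
    a = rng.lognormal(np.log(0.5), 0.3, n)       # initial flaw size [mm], assumed scatter
    C = rng.lognormal(np.log(1e-8), 0.2, n)      # Paris-law coefficient scatter, assumed
    m, dK0, a_crit = 3.0, 20.0, 25.0             # exponent, load factor, critical size

    failed = np.zeros(n, dtype=bool)
    for block in range(10):                      # ten usage blocks of 10,000 cycles each
        # simplified Paris-law growth increment, da/dN = C * (dK0 * sqrt(a))**m
        a = a + C * (dK0 * np.sqrt(a)) ** m * 10_000
        failed |= a >= a_crit
        a = np.where(failed, 0.0, a)             # freeze failed samples (avoid blow-up)
        if block == 4:                           # one scheduled mid-life inspection
            pod = 1.0 / (1.0 + np.exp(-(a - 5.0)))   # logistic probability of detection
            detected = (~failed) & (rng.random(n) < pod)
            a[detected] = 0.5                    # detected cracks repaired to baseline
    print("probability of failure:", failed.mean())
    ```

    Changing the inspection schedule forces a full rerun of this loop, which is exactly the cost RPI avoids by reusing the baseline growth histories.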

  18. Integrated health management and control of complex dynamical systems

    NASA Astrophysics Data System (ADS)

    Tolani, Devendra K.

    2005-11-01

    A comprehensive control and health management strategy for human-engineered complex dynamical systems is formulated for achieving high performance and reliability over a wide range of operation. Results from diverse research areas such as Probabilistic Robust Control (PRC), Damage Mitigating/Life Extending Control (DMC), Discrete Event Supervisory (DES) Control, Symbolic Time Series Analysis (STSA) and Health and Usage Monitoring System (HUMS) have been employed to achieve this goal. Continuous-domain control modules at the lower level are synthesized by PRC and DMC theories, whereas the upper-level supervision is based on DES control theory. In the PRC approach, by allowing different levels of risk under different flight conditions, the control system can achieve the desired trade-off between stability robustness and nominal performance. In the DMC approach, component damage is incorporated in the control law to reduce the damage rate for enhanced structural durability. The DES controller monitors the system performance and, based on the mission requirements (e.g., performance metrics and level of damage mitigation), switches among various lower-level controllers. The core idea is to design a framework where the DES controller at the upper level mimics human intelligence and makes appropriate decisions to satisfy mission requirements and enhance system performance and structural durability. Recently developed tools in STSA have been used for anomaly detection and failure prognosis. The DMC deals with the usage monitoring or operational control part of health management, whereas the issue of health monitoring is addressed by the anomaly detection tools. The proposed decision and control architecture has been validated on two test-beds, simulating the operations of rotorcraft dynamics and aircraft propulsion.

  19. Learning probabilistic models of hydrogen bond stability from molecular dynamics simulation trajectories.

    PubMed

    Chikalov, Igor; Yao, Peggy; Moshkov, Mikhail; Latombe, Jean-Claude

    2011-02-15

    Hydrogen bonds (H-bonds) play a key role in both the formation and stabilization of protein structures. They form and break while a protein deforms, for instance during the transition from a non-functional to a functional state. The intrinsic strength of an individual H-bond has been studied from an energetic viewpoint, but energy alone may not be a very good predictor. This paper describes inductive learning methods to train protein-independent probabilistic models of H-bond stability from molecular dynamics (MD) simulation trajectories of various proteins. The training data contains 32 input attributes (predictors) that describe an H-bond and its local environment in a conformation c and the output attribute is the probability that the H-bond will be present in an arbitrary conformation of this protein achievable from c within a time duration Δ. We model dependence of the output variable on the predictors by a regression tree. Several models are built using 6 MD simulation trajectories containing over 4000 distinct H-bonds (millions of occurrences). Experimental results demonstrate that such models can predict H-bond stability quite well. They perform roughly 20% better than models based on H-bond energy alone. In addition, they can accurately identify a large fraction of the least stable H-bonds in a conformation. In most tests, about 80% of the 10% H-bonds predicted as the least stable are actually among the 10% truly least stable. The important attributes identified during the tree construction are consistent with previous findings. We use inductive learning methods to build protein-independent probabilistic models to study H-bond stability, and demonstrate that the models perform better than H-bond energy alone.
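
    A toy sketch of the modeling step with scikit-learn, training a regression tree to predict H-bond persistence (the five synthetic features and the logistic ground truth merely stand in for the paper's 32 descriptors and MD-derived labels):

    ```python
    import numpy as np
    from sklearn.tree import DecisionTreeRegressor
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(6)
    n = 4000
    # hypothetical stand-ins for H-bond descriptors (distances, angles, ...)
    X = rng.normal(size=(n, 5))
    # synthetic "probability the H-bond persists within a duration", illustrative only
    y = 1 / (1 + np.exp(-(1.5 * X[:, 0] - X[:, 1] + 0.5 * X[:, 2])))

    Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)
    tree = DecisionTreeRegressor(max_depth=6, min_samples_leaf=50).fit(Xtr, ytr)
    print("R^2 on held-out bonds:", round(tree.score(Xte, yte), 3))
    # ranking bonds by predicted stability flags the least stable ones
    print("10 least stable (predicted):", np.argsort(tree.predict(Xte))[:10])
    ```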

  1. Centralized Multi-Sensor Square Root Cubature Joint Probabilistic Data Association.

    PubMed

    Liu, Yu; Liu, Jun; Li, Gang; Qi, Lin; Li, Yaowen; He, You

    2017-11-05

    This paper focuses on the problem of tracking multiple targets with multiple sensors in a nonlinear, cluttered environment. To avoid Jacobian matrix computation and scaling parameter adjustment, improve numerical stability, and acquire more accurate estimates for centralized nonlinear tracking, a novel centralized multi-sensor square root cubature joint probabilistic data association algorithm (CMSCJPDA) is proposed. First, the multi-sensor tracking problem is decomposed into several single-sensor multi-target tracking problems, which are sequentially processed during the estimation. Then, in each sensor, the assignment of its measurements to target tracks is accomplished on the basis of joint probabilistic data association (JPDA), and a weighted probability fusion method with a square root version of the cubature Kalman filter (SRCKF) is utilized to estimate the targets' state. With the measurements from all sensors processed, CMSCJPDA is derived and the global estimated state is achieved. Experimental results show that CMSCJPDA is superior to state-of-the-art algorithms in terms of tracking accuracy, numerical stability, and computational cost, which provides a new idea for solving multi-sensor tracking problems.

  2. Probability of growth of small damage sites on the exit surface of fused silica optics.

    PubMed

    Negres, Raluca A; Abdulla, Ghaleb M; Cross, David A; Liao, Zhi M; Carr, Christopher W

    2012-06-04

    Growth of laser damage on fused silica optical components depends on several key parameters including laser fluence, wavelength, pulse duration, and site size. Here we investigate the growth behavior of small damage sites on the exit surface of SiO₂ optics under exposure to tightly controlled laser pulses. Results demonstrate that the onset of damage growth is not governed by a threshold, but is probabilistic in nature and depends both on the current size of a damage site and the laser fluence to which it is exposed. We also develop models for use in growth prediction. In addition, we show that laser exposure history also influences the behavior of individual sites.

  3. Sarma-based key-group method for rock slope reliability analyses

    NASA Astrophysics Data System (ADS)

    Yarahmadi Bafghi, A. R.; Verdel, T.

    2005-08-01

    The methods used in conducting static stability analyses have remained pertinent to this day for reasons of both simplicity and speed of execution. The most well-known of these methods for the stability analysis of fractured rock masses is the key-block method (KBM). This paper proposes an extension to the KBM, called the key-group method (KGM), which combines not only individual key-blocks but also groups of collapsible blocks into an iterative and progressive analysis of the stability of discontinuous rock slopes. To take intra-group forces into account, the Sarma method has been implemented within the KGM in order to generate a Sarma-based KGM, abbreviated SKGM. We discuss herein the hypothesis behind this new method, details regarding its implementation, and validation through comparison with results obtained from the distinct element method. Furthermore, as an alternative to deterministic methods, reliability analyses or probabilistic analyses have been proposed to take account of the uncertainty in analytical parameters and models. The FOSM and ASM probabilistic methods could be implemented within the KGM and SKGM framework in order to account for the uncertainty due to physical and mechanical data (density, cohesion and angle of friction). We then show how such reliability analyses can be introduced into the SKGM to give rise to the probabilistic SKGM (PSKGM) and how it can be used for rock slope reliability analyses.

  4. Technical report. The application of probability-generating functions to linear-quadratic radiation survival curves.

    PubMed

    Kendal, W S

    2000-04-01

    To illustrate how probability-generating functions (PGFs) can be employed to derive a simple probabilistic model for clonogenic survival after exposure to ionizing irradiation. Both repairable and irreparable radiation damage to DNA were assumed to occur by independent (Poisson) processes, at intensities proportional to the irradiation dose. Also, repairable damage was assumed to be either repaired or further (lethally) injured according to a third (Bernoulli) process, with the probability of lethal conversion being directly proportional to dose. Using the algebra of PGFs, these three processes were combined to yield a composite PGF that described the distribution of lethal DNA lesions in irradiated cells. The composite PGF characterized a Poisson distribution with mean αD + βD², where D was dose and α and β were radiobiological constants. This distribution yielded the conventional linear-quadratic survival equation. To test the composite model, the derived distribution was used to predict the frequencies of multiple chromosomal aberrations in irradiated human lymphocytes. The predictions agreed well with observation. This probabilistic model was consistent with single-hit mechanisms, but it was not consistent with binary misrepair mechanisms. A stochastic model for radiation survival has been constructed from elementary PGFs that exactly yields the linear-quadratic relationship. This approach can be used to investigate other simple probabilistic survival models.
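
    A sketch of the PGF algebra implied by the abstract, in notation assumed here (not necessarily the author's): irreparable lesions arrive as Poisson with mean proportional to dose, and repairable lesions as Poisson with mean proportional to dose, thinned by a Bernoulli lethal-conversion probability that is itself proportional to dose.

    ```latex
    % Irreparable lesions: N_1 ~ Poisson(\alpha D), with PGF
    G_1(s) = e^{\alpha D (s-1)}.
    % Repairable lesions: N ~ Poisson(\rho D); each is lethally converted with
    % probability p = \kappa D (Bernoulli thinning), so the lethal subset is
    % N_2 ~ Poisson(\rho D \cdot \kappa D), with PGF
    G_2(s) = e^{\rho \kappa D^{2} (s-1)}.
    % Independence lets the PGFs multiply:
    G(s) = G_1(s)\,G_2(s) = e^{(\alpha D + \beta D^{2})(s-1)},
    \qquad \beta \equiv \rho \kappa .
    % Survival is the probability of zero lethal lesions:
    S(D) = \Pr(N_1 + N_2 = 0) = G(0) = e^{-(\alpha D + \beta D^{2})}.
    ```

    Setting s = 0 in the composite PGF recovers exactly the linear-quadratic survival equation quoted above.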

  5. Landslide prediction using combined deterministic and probabilistic methods in hilly area of Mt. Medvednica in Zagreb City, Croatia

    NASA Astrophysics Data System (ADS)

    Wang, Chunxiang; Watanabe, Naoki; Marui, Hideaki

    2013-04-01

    The hilly slopes of Mt. Medvednica stretch across the northwestern part of Zagreb City, Croatia, and extend over approximately 180 km². In this area, landslides, e.g. the Kostanjek landslide and the Črešnjevec landslide, have damaged many houses, roads, farmlands, and grasslands. Therefore, it is necessary to predict potential landslides and to enhance the landslide inventory for hazard mitigation and security management of the local society in this area. We combined a deterministic method and a probabilistic method to assess potential landslides, including their locations, sizes, and sliding surfaces. First, the study area is divided into several slope units that have similar topographic and geological characteristics using the hydrology analysis tool in ArcGIS. Second, a GIS-based modified three-dimensional Hovland's method for slope stability analysis is developed to identify the sliding surface and the corresponding three-dimensional safety factor for each slope unit. Each sliding surface is assumed to be the lower part of an ellipsoid. The direction of inclination of the ellipsoid is taken to be the same as the main dip direction of the slope unit, and the center point of the ellipsoid is randomly set to the center point of a grid cell in the slope unit. The minimum three-dimensional safety factor and the corresponding critical sliding surface are obtained for each slope unit. Third, since a single value of the safety factor is insufficient to evaluate the slope stability of a slope unit, the ratio of the number of calculation cases in which the three-dimensional safety factor is less than 1.0 to the total number of trial calculations is defined as the failure probability of the slope unit. If the failure probability is more than 80%, the slope unit is classified as 'unstable', and the landslide hazard can be mapped for the whole study area.
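
    A minimal sketch of the failure-probability step, using a one-dimensional infinite-slope safety factor in place of the paper's 3-D Hovland analysis (all soil parameters and their distributions are assumed for illustration):

    ```python
    import numpy as np

    rng = np.random.default_rng(7)
    n = 100_000
    # hypothetical soil parameters for one slope unit (not the paper's data)
    c = rng.normal(12.0, 3.0, n)                 # cohesion [kPa]
    phi = np.radians(rng.normal(28.0, 3.0, n))   # friction angle
    gamma, gamma_w = 19.0, 9.81                  # unit weights [kN/m^3]
    z, beta = 3.0, np.radians(35.0)              # slip depth [m], slope angle
    m = rng.uniform(0.2, 1.0, n)                 # saturation ratio (rainfall-driven)

    # infinite-slope factor of safety as a 1-D stand-in for the 3-D method
    fs = (c + (gamma - m * gamma_w) * z * np.cos(beta) ** 2 * np.tan(phi)) \
         / (gamma * z * np.sin(beta) * np.cos(beta))
    p_fail = np.mean(fs < 1.0)   # fraction of trials with FS < 1.0
    print(f"failure probability: {p_fail:.3f} -> "
          f"{'unstable' if p_fail > 0.8 else 'not flagged'}")
    ```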

  6. Proceedings, Seminar on Probabilistic Methods in Geotechnical Engineering Held at Vicksburg, Mississippi on 21 September 1982.

    DTIC Science & Technology

    1983-09-01

    ...al. (1981) was conducted on the Copper City No. 2 tailings embankment dam near Miami, Arizona. Due to the extreme topographic relief in the area of the... mode of behavior and scale. This dependency is summarized in the factor R. For example, circular shear instability as in a copper porphyry slope... OF THE PROBABILISTIC SLOPE STABILITY MODEL... 6.1 DESCRIPTION OF COPPER CITY NUMBER 2 TAILINGS DAM... 6.2 SUBSURFACE INVESTIGATION

  7. Proposal of a method for evaluating tsunami risk using response-surface methodology

    NASA Astrophysics Data System (ADS)

    Fukutani, Y.

    2017-12-01

    Information on probabilistic tsunami inundation hazards is needed to define and evaluate tsunami risk. Several methods for calculating these hazards have been proposed (e.g. Løvholt et al. (2012), Thio (2012), Fukutani et al. (2014), Goda et al. (2015)). However, these methods are computationally expensive and lack versatility, since they require multiple tsunami numerical simulations. In this study, we propose a simpler method for tsunami risk evaluation using response-surface methodology. Kotani et al. (2016) proposed an evaluation method for the probabilistic distribution of tsunami wave height using response-surface methodology; we expanded their study and developed a probabilistic distribution of tsunami inundation depth. We set the depth (x1) and the slip (x2) of an earthquake fault as explanatory variables and tsunami inundation depth (y) as the objective variable. Tsunami risk can then be evaluated by conducting a Monte Carlo simulation, assuming that earthquake generation follows a Poisson process, that the probability distribution of tsunami inundation depth follows the distribution derived from the response surface, and that the damage probability of a target follows a log-normal distribution. We applied the proposed method to a wood building located on the coast of Tokyo Bay. We implemented a regression analysis based on the results of 25 tsunami numerical calculations and developed a response surface, defined as y = a·x1 + b·x2 + c (a = 0.2615, b = 3.1763, c = -1.1802). We assumed appropriate probabilistic distributions for earthquake generation, inundation height, and vulnerability, and based on these we conducted Monte Carlo simulations covering 1,000,000 years. We found that the expected damage probability of the studied wood building is 22.5%, given that an earthquake occurs. The proposed method is therefore a useful and simple way to evaluate tsunami risk using a response surface and Monte Carlo simulation, without conducting multiple tsunami numerical simulations.
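
    A hedged sketch of the proposed workflow using the response-surface coefficients quoted in the abstract; the Poisson rate, the ranges of fault depth and slip, and the log-normal fragility parameters are invented for illustration:

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(8)
    a, b, c = 0.2615, 3.1763, -1.1802      # response surface y = a*x1 + b*x2 + c
    years, lam = 1_000_000, 0.01           # simulated years, assumed annual quake rate

    n_events = rng.poisson(lam * years)    # Poisson occurrence over the horizon
    x1 = rng.uniform(1, 10, n_events)      # fault depth, assumed range
    x2 = rng.uniform(0.2, 1.5, n_events)   # fault slip, assumed range
    depth = np.maximum(a * x1 + b * x2 + c, 0.0)   # inundation depth [m] at the site

    # log-normal fragility of a wood building (median 4 m, beta 0.5 -- assumed)
    p_dmg = stats.lognorm.cdf(depth, s=0.5, scale=4.0)
    damaged = rng.random(n_events) < p_dmg
    print("P(damage | earthquake):", damaged.mean())
    print("annual damage probability:", damaged.sum() / years)
    ```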

  8. Probabilistic structural analysis of space propulsion system LOX post

    NASA Technical Reports Server (NTRS)

    Newell, J. F.; Rajagopal, K. R.; Ho, H. W.; Cunniff, J. M.

    1990-01-01

    The probabilistic structural analysis program NESSUS (Numerical Evaluation of Stochastic Structures Under Stress; Cruse et al., 1988) is applied to characterize the dynamic loading and response of the Space Shuttle main engine (SSME) LOX post. The design and operation of the SSME are reviewed; the LOX post structure is described; and particular attention is given to the generation of composite load spectra, the finite-element model of the LOX post, and the steps in the NESSUS structural analysis. The results are presented in extensive tables and graphs, and it is shown that NESSUS correctly predicts the structural effects of changes in the temperature loading. The probabilistic approach also facilitates (1) damage assessments for a given failure model (based on gas temperature, heat-shield gap, and material properties) and (2) correlation of the gas temperature with operational parameters such as engine thrust.

  9. Performance-based seismic assessment of skewed bridges with and without considering soil-foundation interaction effects for various site classes

    NASA Astrophysics Data System (ADS)

    Ghotbi, Abdoul R.

    2014-09-01

    The seismic behavior of skewed bridges has not been well studied compared to that of straight bridges. Skewed bridges have shown extensive damage, especially due to deck rotation, shear key failure, abutment unseating, and column-bent drift. This research therefore aims to study the behavior of skewed and straight highway overpass bridges, both with and without the effects of Soil-Structure Interaction (SSI), under near-fault ground motions. Due to several sources of uncertainty associated with the ground motions, soil, and structure, a probabilistic approach is needed. Thus, a probabilistic methodology similar to the one developed by the Pacific Earthquake Engineering Research Center (PEER) has been utilized to assess the probability of damage due to various levels of shaking, using appropriate intensity measures with minimum dispersion. The probabilistic analyses were performed for various bridge configurations and site conditions, including sand ranging from loose to dense and clay ranging from soft to stiff. The results showed a considerable susceptibility of skewed bridges to deck rotation and shear key displacement. It was also found that SSI decreased the damage probability for various demands compared to the fixed-base model; however, deck rotation for all soil types, and abutment unseating for very loose sand and soft clay, showed an increase in damage probability compared to the fixed-base model. The damage probability for various demands was also found to decrease with increasing soil strength for both sandy and clayey sites. With respect to variations in the skew angle, an increase in skew angle amplified the seismic response for various demands; deck rotation was especially sensitive to the skew angle. Furthermore, abutment unseating showed an increasing trend with skew angle for both fixed-base and SSI models.

  10. A probabilistic tornado wind hazard model for the continental United States

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hossain, Q; Kimball, J; Mensing, R

    A probabilistic tornado wind hazard model for the continental United States (CONUS) is described. The model incorporates both aleatory (random) and epistemic uncertainties associated with quantifying the tornado wind hazard parameters. The temporal occurrence of tornadoes within the CONUS is assumed to be a Poisson process. A spatial distribution of tornado touchdown locations is developed empirically based on the observed historical events within the CONUS. The hazard model is an areal probability model that takes into consideration the size and orientation of the facility, the length and width of the tornado damage area (idealized as a rectangle and dependent on the tornado intensity scale), wind speed variation within the damage area, tornado intensity classification errors (i.e., errors in assigning a Fujita intensity scale based on surveyed damage), and the tornado path direction. Epistemic uncertainties in describing the distributions of the aleatory variables are accounted for by using more than one distribution model to describe aleatory variations; the epistemic uncertainties are based on inputs from a panel of experts. A computer program, TORNADO, has been developed incorporating this model; features of this program are also presented.
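
    A back-of-the-envelope sketch of an areal-probability calculation of this general type (the regional rate, F-scale distribution, and damage-area footprints are all assumed values, and facility size, orientation, and classification-error effects are ignored for brevity):

    ```python
    import numpy as np

    nu = 2.5                                 # tornadoes per year in the region (assumed)
    area_region = 1.0e4                      # region area [km^2] (assumed)
    p_fscale = np.array([0.40, 0.30, 0.17, 0.09, 0.03, 0.01])  # P(F0..F5), assumed
    # mean damage-area footprint (path length x width) per F-scale [km^2], assumed
    damage_area = np.array([0.05, 0.3, 1.2, 4.0, 10.0, 25.0])

    # areal probability model: annual strike probability for a point target
    p_strike = nu / area_region * (p_fscale * damage_area).sum()
    print(f"annual strike probability: {p_strike:.2e}")

    # Poisson occurrence over a 50-year facility life
    print("P(at least one strike in 50 yr):", 1 - np.exp(-50 * p_strike))
    ```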

  11. 46 CFR 172.065 - Damage stability.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 46 Shipping 7 2014-10-01 2014-10-01 false Damage stability. 172.065 Section 172.065 Shipping COAST... § 172.065 Damage stability. (a) Definitions. As used in this section, Length or L means load line length... paragraph (c) of this section, assuming the damage specified in paragraph (d) of this section. (c...

  12. 46 CFR 172.065 - Damage stability.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 46 Shipping 7 2012-10-01 2012-10-01 false Damage stability. 172.065 Section 172.065 Shipping COAST... § 172.065 Damage stability. (a) Definitions. As used in this section, Length or L means load line length... paragraph (c) of this section, assuming the damage specified in paragraph (d) of this section. (c...

  13. 46 CFR 172.065 - Damage stability.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 46 Shipping 7 2013-10-01 2013-10-01 false Damage stability. 172.065 Section 172.065 Shipping COAST... § 172.065 Damage stability. (a) Definitions. As used in this section, Length or L means load line length... paragraph (c) of this section, assuming the damage specified in paragraph (d) of this section. (c...

  14. Dynamics and Adaptive Control for Stability Recovery of Damaged Aircraft

    NASA Technical Reports Server (NTRS)

    Nguyen, Nhan; Krishnakumar, Kalmanje; Kaneshige, John; Nespeca, Pascal

    2006-01-01

    This paper presents a recent study of a damaged generic transport model as part of a NASA research project to investigate adaptive control methods for stability recovery of damaged aircraft operating in off-nominal flight conditions under damage and/or failures. Aerodynamic modeling of damage effects is performed using an aerodynamic code to assess changes in the stability and control derivatives of a generic transport aircraft. Certain types of damage, such as damage to one of the wings or horizontal stabilizers, can cause the aircraft to become asymmetric, resulting in a coupling between the longitudinal and lateral motions. Flight dynamics for a general asymmetric aircraft are derived to account for changes in the center of gravity that can compromise the stability of the damaged aircraft. An iterative trim analysis for the translational motion is developed to refine the trim procedure by accounting for the effects of control surface deflection. A hybrid direct-indirect neural network adaptive flight control is proposed as an adaptive law for stabilizing the rotational motion of the damaged aircraft. The indirect adaptation is designed to estimate the plant dynamics of the damaged aircraft in conjunction with the direct adaptation that computes the control augmentation. Two approaches are presented: 1) an adaptive law derived from Lyapunov stability theory to ensure that the signals are bounded, and 2) a recursive least-squares method for parameter identification. A hardware-in-the-loop simulation is conducted and demonstrates the effectiveness of the direct neural network adaptive flight control in the stability recovery of the damaged aircraft. A preliminary simulation of the hybrid adaptive flight control has been performed, and initial data have shown the effectiveness of the proposed hybrid approach. Future work will include further investigations and high-fidelity simulations of the proposed hybrid adaptive flight control approach.

  15. Flight dynamics and control modelling of damaged asymmetric aircraft

    NASA Astrophysics Data System (ADS)

    Ogunwa, T. T.; Abdullah, E. J.

    2016-10-01

    This research investigates the use of a Linear Quadratic Regulator (LQR) controller to assist a commercial Boeing 747-200 aircraft in regaining its stability in the event of damage. Damage causes an aircraft to become asymmetric, and in the case of damage to a fraction (33%) of its left wing or complete loss of its vertical stabilizer, the loss of stability may lead to a fatal crash. In this study, aircraft models for the two damage scenarios mentioned above are constructed using stability derivatives. An LQR controller is used as a direct adaptive control design technique for the observable and controllable system. Dynamic stability analysis is conducted in the time domain for all systems in this study.
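
    A minimal sketch of the LQR design step, assuming an illustrative two-state linear model rather than actual Boeing 747-200 stability derivatives, follows; the gain solves the continuous-time algebraic Riccati equation:

```python
import numpy as np
from scipy.linalg import solve_continuous_are

# Hypothetical short-period longitudinal model; A and B are illustrative.
A = np.array([[-0.6, 1.0],
              [-4.0, -1.2]])
B = np.array([[0.0],
              [-2.5]])
Q = np.diag([10.0, 1.0])    # state weighting
R = np.array([[1.0]])       # control weighting

P = solve_continuous_are(A, B, Q, R)    # algebraic Riccati equation
K = np.linalg.solve(R, B.T @ P)         # optimal state feedback u = -K x

print("LQR gain K:", K)
print("closed-loop eigenvalues:", np.linalg.eigvals(A - B @ K))
```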

  16. Probabilistic Model for Laser Damage to the Human Retina

    DTIC Science & Technology

    2012-03-01

    the beam. Power density may be measured in radiant exposure, J/cm2, or by irradiance, W/cm2. In the experimental database used in this study and...to quantify a binary response, either lethal or non-lethal, within a population such as insects or rats. In directed energy research, probit...value of the normalized Arrhenius damage integral. In a one-dimensional simulation, the source term is determined as a spatially averaged irradiance (W

  17. Probabilistic Simulation of Progressive Fracture in Bolted-Joint Composite Laminates

    NASA Technical Reports Server (NTRS)

    Minnetyan, L.; Singhal, S. N.; Chamis, C. C.

    1996-01-01

    This report describes computational methods to probabilistically simulate fracture in bolted composite structures. An innovative approach that is independent of stress intensity factors and fracture toughness was used to simulate progressive fracture. The effect of design variable uncertainties on structural damage was also quantified. A fast probability integrator assessed the scatter in the composite structure response before and after damage. Then the sensitivity of the response to design variables was computed. General-purpose methods, which are applicable to bolted joints in all types of structures and in all fracture processes, from damage initiation to unstable propagation and global structural collapse, were used. These methods were demonstrated for a bolted joint of a polymer matrix composite panel under edge loads. The effects of the fabrication process were included in the simulation of damage in the bolted panel. Results showed that the most effective way to reduce end displacement at fracture is to control both the load and the ply thickness. The cumulative probability for longitudinal stress in all plies was most sensitive to the load; in the 0 deg. plies it was very sensitive to ply thickness. The cumulative probability for transverse stress was most sensitive to the matrix coefficient of thermal expansion. In addition, fiber volume ratio and fiber transverse modulus both contributed significantly to the cumulative probability for the transverse stresses in all the plies.

  18. A time-dependent probabilistic seismic-hazard model for California

    USGS Publications Warehouse

    Cramer, C.H.; Petersen, M.D.; Cao, T.; Toppozada, Tousson R.; Reichle, M.

    2000-01-01

    For the purpose of sensitivity testing and illuminating nonconsensus components of time-dependent models, the California Department of Conservation, Division of Mines and Geology (CDMG) has assembled a time-dependent version of its statewide probabilistic seismic hazard (PSH) model for California. The model incorporates available consensus information from within the earth-science community, except for a few faults or fault segments where consensus information is not available. For these latter faults, published information has been incorporated into the model. As in the 1996 CDMG/U.S. Geological Survey (USGS) model, the time-dependent models incorporate three multisegment ruptures: a 1906, an 1857, and a southern San Andreas earthquake. Sensitivity tests are presented to show the effect on hazard and expected damage estimates of (1) intrinsic (aleatory) sigma, (2) multisegment (cascade) vs. independent segment (no cascade) ruptures, and (3) time-dependence vs. time-independence. Results indicate that (1) differences in hazard and expected damage estimates between time-dependent and independent models increase with decreasing intrinsic sigma, (2) differences in hazard and expected damage estimates between full cascading and not cascading are insensitive to intrinsic sigma, (3) differences in hazard increase with increasing return period (decreasing probability of occurrence), and (4) differences in moment-rate budgets increase with decreasing intrinsic sigma and with the degree of cascading, but are within the expected uncertainty in PSH time-dependent modeling and do not always significantly affect hazard and expected damage estimates.

  19. Landslide Hazard Probability Derived from Inherent and Dynamic Determinants

    NASA Astrophysics Data System (ADS)

    Strauch, Ronda; Istanbulluoglu, Erkan

    2016-04-01

    Landslide hazard research has typically been conducted independently from hydroclimate research. We unify these two lines of research to provide regional scale landslide hazard information for risk assessments and resource management decision-making. Our approach combines an empirical inherent landslide probability with a numerical dynamic probability, generated by combining routed recharge from the Variable Infiltration Capacity (VIC) macro-scale land surface hydrologic model with a finer resolution probabilistic slope stability model run in a Monte Carlo simulation. Landslide hazard mapping is advanced by adjusting the dynamic model of stability with an empirically-based scalar representing the inherent stability of the landscape, creating a probabilistic quantitative measure of geohazard prediction at a 30-m resolution. Climatology, soil, and topography control the dynamic nature of hillslope stability and the empirical information further improves the discriminating ability of the integrated model. This work will aid resource management decision-making in current and future landscape and climatic conditions. The approach is applied as a case study in North Cascade National Park Complex, a rugged terrain with nearly 2,700 m (9,000 ft) of vertical relief, covering 2757 sq km (1064 sq mi) in northern Washington State, U.S.A.
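
    The coupling of a dynamic, hydrologically driven failure probability with an inherent empirical scalar can be sketched per grid cell with a Monte Carlo infinite-slope calculation; all distributions and the frequency-ratio scalar below are assumptions for illustration:

```python
import numpy as np

rng = np.random.default_rng(9)
n = 50_000

# Simplified infinite-slope model for one 30-m grid cell.
slope = np.radians(38.0)                       # hillslope angle
z = 1.2                                        # failure-plane depth, m
gamma_s, gamma_w = 18000.0, 9810.0             # unit weights, N/m^3
c = rng.lognormal(np.log(3000.0), 0.4, n)      # cohesion, Pa
phi = np.radians(rng.uniform(28.0, 38.0, n))   # friction angle
m = rng.uniform(0.0, 1.0, n)                   # relative wetness from routed recharge

fs = (np.tan(phi) / np.tan(slope)
      + (c - m * z * gamma_w * np.tan(phi))
        / (gamma_s * z * np.sin(slope) * np.cos(slope)))

p_dynamic = (fs < 1.0).mean()     # probability of instability for this cell
p_inherent = 0.6                  # empirical frequency-ratio scalar (assumed)
print(f"P(FS < 1) = {p_dynamic:.3f}; adjusted hazard = {p_dynamic * p_inherent:.3f}")
```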

  20. Landslide Hazard from Coupled Inherent and Dynamic Probabilities

    NASA Astrophysics Data System (ADS)

    Strauch, R. L.; Istanbulluoglu, E.; Nudurupati, S. S.

    2015-12-01

    Landslide hazard research has typically been conducted independently from hydroclimate research. We sought to unify these two lines of research to provide regional scale landslide hazard information for risk assessments and resource management decision-making. Our approach couples an empirical inherent landslide probability, based on a frequency ratio analysis, with a numerical dynamic probability, generated by combining subsurface water recharge and surface runoff from the Variable Infiltration Capacity (VIC) macro-scale land surface hydrologic model with a finer resolution probabilistic slope stability model. Landslide hazard mapping is advanced by combining static and dynamic models of stability into a probabilistic measure of geohazard prediction in both space and time. This work will aid resource management decision-making in current and future landscape and climatic conditions. The approach is applied as a case study in North Cascade National Park Complex in northern Washington State.

  1. Sensitivity of Asteroid Impact Risk to Uncertainty in Asteroid Properties and Entry Parameters

    NASA Astrophysics Data System (ADS)

    Wheeler, Lorien; Mathias, Donovan; Dotson, Jessie L.; NASA Asteroid Threat Assessment Project

    2017-10-01

    A central challenge in assessing the threat posed by asteroids striking Earth is the large amount of uncertainty inherent throughout all aspects of the problem. Many asteroid properties are not well characterized and can range widely from strong, dense, monolithic irons to loosely bound, highly porous rubble piles. Even for an object of known properties, the specific entry velocity, angle, and impact location can swing the potential consequence from no damage to causing millions of casualties. Due to the extreme rarity of large asteroid strikes, there are also large uncertainties in how different types of asteroids will interact with the atmosphere during entry, how readily they may break up or ablate, and how much surface damage will be caused by the resulting airbursts or impacts. In this work, we use our Probabilistic Asteroid Impact Risk (PAIR) model to investigate the sensitivity of asteroid impact damage to uncertainties in key asteroid properties, entry parameters, or modeling assumptions. The PAIR model combines physics-based analytic models of asteroid entry and damage in a probabilistic Monte Carlo framework to assess the risk posed by a wide range of potential impacts. The model samples from uncertainty distributions of asteroid properties and entry parameters to generate millions of specific impact cases, and models the atmospheric entry and damage for each case, including blast overpressure, thermal radiation, tsunami inundation, and global effects. To assess the risk sensitivity, we alternately fix and vary the different input parameters and compare the effect on the resulting range of damage produced. The goal of these studies is to help guide future efforts in asteroid characterization and model refinement by determining which properties most significantly affect the potential risk.
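
    The sample-then-model structure of such a probabilistic pipeline can be sketched as follows; the property distributions, the energy-to-damage-radius scaling, and all constants are illustrative assumptions, not the PAIR model's:

```python
import numpy as np

rng = np.random.default_rng(7)
n = 1_000_000

# Illustrative sampling distributions (assumptions, not PAIR inputs).
diameter_m = 10 ** rng.uniform(1.0, 2.5, n)     # ~10 m to ~316 m
density = rng.uniform(1500.0, 3500.0, n)        # kg/m^3
velocity = rng.uniform(12e3, 25e3, n)           # entry velocity, m/s

mass = density * (np.pi / 6.0) * diameter_m ** 3
energy_mt = 0.5 * mass * velocity ** 2 / 4.184e15   # kinetic energy, Mt TNT

# Toy cube-root damage-radius scaling, purely illustrative.
damage_radius_km = 2.0 * np.cbrt(energy_mt)

print(f"median impact energy: {np.median(energy_mt):.2f} Mt")
print(f"95th percentile damage radius: {np.percentile(damage_radius_km, 95):.1f} km")
```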

  2. Integrated multi-parameters Probabilistic Seismic Landslide Hazard Analysis (PSLHA): the case study of Ischia island, Italy

    NASA Astrophysics Data System (ADS)

    Caccavale, Mauro; Matano, Fabio; Sacchi, Marco; Mazzola, Salvatore; Somma, Renato; Troise, Claudia; De Natale, Giuseppe

    2014-05-01

    The Ischia island is a large, complex, partly submerged, active volcanic field located about 20 km east of Campi Flegrei, a major active volcano-tectonic area near Naples. The island is morphologically characterized in its central part by the resurgent block of Mt. Epomeo, controlled by NW-SE and NE-SW trending fault systems, by mountain stream basins with high relief energy, and by a heterogeneous coastline alternating between beaches and tuff/lava cliffs that is continuously reshaped by weather and sea erosion. The volcano-tectonic process is a main factor for slope stability, as it produces seismic activity and has generated steep slopes in volcanic deposits (lava, tuff, pumice and ash layers) of variable strength. In the Campi Flegrei and surrounding areas, the possible occurrence of a moderate/large seismic event represents a serious threat to the inhabitants, to infrastructure, and to the environment. The most relevant seismic sources for Ischia are the Campi Flegrei caldera and a 5 km long fault located below the island's north coast. Both sources are difficult to constrain: the first because its onshore and offshore extent is not yet completely defined, the second because it is characterized by only a few large historical events and is therefore difficult to parameterize in the framework of a probabilistic hazard approach. The high population density, the many infrastructures, and the relevant archaeological sites, together with the area's natural and artistic value, make this a strategic natural laboratory for developing new methodologies. Moreover, Ischia is the only sector of the Campi Flegrei area with documented historical landslides triggered by earthquakes, which allows the adequacy and stability of the method to be tested. In the framework of the Italian project MON.I.C.A (infrastructural coastlines monitoring), an innovative and dedicated probabilistic methodology has been applied to identify the areas with the highest susceptibility to landslides induced by seismic effects. The PSLHA combines probability-of-exceedance maps for different ground motion (GM) parameters with geological and geomorphological information, in terms of critical acceleration and dynamic stability factor. The maps are generally evaluated for Peak Ground Acceleration, Velocity or Intensity, which correlate well with damage to anthropic infrastructure (e.g., streets, buildings). Each ground motion parameter represents a different aspect of the hazard and has a different correlation with the generation of possible damage. Many works have pointed out that other GM parameters, such as the Arias and Housner intensities and the absolute displacement, may be better suited to analyses such as cliff stability, so the selection of the GM parameter is of crucial importance to obtain the most useful hazard maps. In recent decades, Ground Motion Prediction Equations for a new set of GM parameters have been published; based on this information, a series of landslide hazard maps can be produced. The new maps will lead to the identification of the areas with the highest probability of earthquake-induced landslides. For a strategic site like Ischia, these new methodologies represent an innovative and advanced tool for landslide hazard mitigation.

  3. Probabilistic Flexural Fatigue in Plain and Fiber-Reinforced Concrete

    PubMed Central

    Ríos, José D.

    2017-01-01

    The objective of this work is two-fold. First, we attempt to fit the experimental data on the flexural fatigue of plain and fiber-reinforced concrete with a probabilistic model (Saucedo, Yu, Medeiros, Zhang and Ruiz, Int. J. Fatigue, 2013, 48, 308–318). This model was validated for compressive fatigue at various loading frequencies, but not for flexural fatigue. Since the model is probabilistic, it is not necessarily related to the specific mechanism of fatigue damage, but rather generically explains the fatigue distribution in concrete (plain or reinforced with fibers) for damage under compression, tension or flexion. In this work, more than 100 series of flexural fatigue tests in the literature are fit with excellent results. Since the distribution of monotonic tests was not available in the majority of cases, a two-step procedure is established to estimate the model parameters based solely on fatigue tests. The coefficient of regression was more than 0.90 except for particular cases where not all tests were strictly performed under the same loading conditions, which confirms the applicability of the model to flexural fatigue data analysis. Moreover, the model parameters are closely related to fatigue performance, which demonstrates the predictive capacity of the model. For instance, the scale parameter is related to flexural strength, which improves with the addition of fibers. Similarly, fiber increases the scattering of fatigue life, which is reflected by the decreasing shape parameter. PMID:28773123

  4. Probabilistic Flexural Fatigue in Plain and Fiber-Reinforced Concrete.

    PubMed

    Ríos, José D; Cifuentes, Héctor; Yu, Rena C; Ruiz, Gonzalo

    2017-07-07

    The objective of this work is two-fold. First, we attempt to fit the experimental data on the flexural fatigue of plain and fiber-reinforced concrete with a probabilistic model (Saucedo, Yu, Medeiros, Zhang and Ruiz, Int. J. Fatigue, 2013, 48, 308-318). This model was validated for compressive fatigue at various loading frequencies, but not for flexural fatigue. Since the model is probabilistic, it is not necessarily related to the specific mechanism of fatigue damage, but rather generically explains the fatigue distribution in concrete (plain or reinforced with fibers) for damage under compression, tension or flexion. In this work, more than 100 series of flexural fatigue tests in the literature are fit with excellent results. Since the distribution of monotonic tests was not available in the majority of cases, a two-step procedure is established to estimate the model parameters based solely on fatigue tests. The coefficient of regression was more than 0.90 except for particular cases where not all tests were strictly performed under the same loading conditions, which confirms the applicability of the model to flexural fatigue data analysis. Moreover, the model parameters are closely related to fatigue performance, which demonstrates the predictive capacity of the model. For instance, the scale parameter is related to flexural strength, which improves with the addition of fibers. Similarly, fiber increases the scattering of fatigue life, which is reflected by the decreasing shape parameter.
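
    A minimal sketch of fitting a probabilistic (Weibull) life distribution to a single fatigue-test series, using synthetic lives in place of the paper's two-step estimation procedure:

```python
import numpy as np
from scipy.stats import weibull_min

# Synthetic fatigue lives (cycles) standing in for one test series.
lives = weibull_min.rvs(c=1.8, scale=2.0e5, size=60, random_state=3)

shape, loc, scale = weibull_min.fit(lives, floc=0)   # fix location at zero
print(f"shape = {shape:.2f} (lower = more scatter), scale = {scale:.3g} cycles")

# Probability of failure before 1e5 cycles under the fitted distribution.
print("P(N < 1e5) =", round(weibull_min.cdf(1e5, shape, 0, scale), 3))
```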

  5. Probabilistic liquefaction hazard analysis at liquefied sites of 1956 Dunaharaszti earthquake, in Hungary

    NASA Astrophysics Data System (ADS)

    Győri, Erzsébet; Gráczer, Zoltán; Tóth, László; Bán, Zoltán; Horváth, Tibor

    2017-04-01

    Liquefaction potential evaluations are generally made to assess the hazard from specific scenario earthquakes. These evaluations may estimate the potential in a binary fashion (yes/no), define a factor of safety, or predict the probability of liquefaction given a scenario event. Usually the level of ground shaking is obtained from the results of PSHA; although it is determined probabilistically, a single level of ground shaking is selected and used within the liquefaction potential evaluation. In contrast, fully probabilistic liquefaction potential assessment methods provide a complete picture of the liquefaction hazard by taking into account the joint probability distribution of PGA and magnitude of earthquake scenarios, both of which are key inputs in the stress-based simplified methods. Kramer and Mayfield (2007) developed a fully probabilistic liquefaction potential evaluation method using a performance-based earthquake engineering (PBEE) framework. The results of the procedure are a direct estimate of the return period of liquefaction and liquefaction hazard curves as a function of depth. The method combines the disaggregation matrices computed for different exceedance frequencies during probabilistic seismic hazard analysis with one of the recent models for the conditional probability of liquefaction. We have developed software for the assessment of performance-based liquefaction triggering on the basis of the Kramer and Mayfield method. Originally, the SPT-based probabilistic method of Cetin et al. (2004) was built into the procedure of Kramer and Mayfield to compute the conditional probability; however, there is no professional consensus about its applicability. Therefore we have included not only Cetin's method but also the SPT-based procedure of Idriss and Boulanger (2012) and the CPT-based procedure of Boulanger and Idriss (2014) in our computer program. In 1956, a damaging earthquake of magnitude 5.6 occurred in Dunaharaszti, in Hungary. Its epicenter was located about 5 km from the southern boundary of Budapest. The quake caused serious damage in the epicentral area and in the southern districts of the capital. The epicentral area of the earthquake is located along the Danube River. Sand boils were observed in some locations, indicating the occurrence of liquefaction. Because their exact locations were recorded at the time of the earthquake, in situ geotechnical measurements (CPT and SPT) could be performed at two sites (Dunaharaszti and Taksony). The different types of measurements enabled probabilistic liquefaction hazard computations at the two studied sites. We have compared the return periods of liquefaction computed using the different built-in simplified stress-based methods.
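
    The performance-based calculation reduces to summing the conditional probability of liquefaction over the joint hazard increments from disaggregation. The hazard increments and the logistic stand-in below are illustrative, not the Cetin et al. or Boulanger-Idriss relations:

```python
import numpy as np

# Hypothetical disaggregated hazard: d_lambda[i, j] is the annual rate of
# shaking in PGA bin i contributed by magnitude bin j.
pga_bins = np.array([0.05, 0.10, 0.20, 0.30, 0.50])    # g
mag_bins = np.array([5.0, 5.5, 6.0, 6.5])
d_lambda = np.array([
    [2.0e-2, 1.0e-2, 5.0e-3, 2.0e-3],
    [8.0e-3, 5.0e-3, 3.0e-3, 1.0e-3],
    [2.0e-3, 1.5e-3, 1.0e-3, 5.0e-4],
    [5.0e-4, 4.0e-4, 3.0e-4, 2.0e-4],
    [1.0e-4, 1.0e-4, 8.0e-5, 6.0e-5],
])

def p_liq_given(pga, mag):
    # Stand-in conditional model (logistic in log PGA with magnitude scaling).
    z = 3.0 * np.log(pga / 0.15) + 0.8 * (mag - 6.0)
    return 1.0 / (1.0 + np.exp(-z))

rate = sum(p_liq_given(pga_bins[i], mag_bins[j]) * d_lambda[i, j]
           for i in range(len(pga_bins)) for j in range(len(mag_bins)))
print(f"annual rate of liquefaction: {rate:.2e} (return period {1.0 / rate:.0f} yr)")
```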

  6. Time dependence of breakdown in a global fiber-bundle model with continuous damage.

    PubMed

    Moral, L; Moreno, Y; Gómez, J B; Pacheco, A F

    2001-06-01

    A time-dependent global fiber-bundle model of fracture with continuous damage is formulated in terms of a set of coupled nonlinear differential equations. A first integral of this set is obtained analytically. The time evolution of the system is studied by applying a discrete probabilistic method. Several results are discussed, emphasizing their differences from the standard time-dependent model. The results obtained show that with this simple model a variety of experimental observations can be qualitatively reproduced.
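
    The discrete probabilistic treatment can be mimicked with a Gillespie-style simulation of a global load-sharing bundle, under the simplifying assumption of a power-law breakdown rate (not the paper's continuous-damage formulation):

```python
import numpy as np

rng = np.random.default_rng(1)

N = 10_000        # number of fibers
rho = 2.0         # breakdown-rate exponent (illustrative)
sigma0 = 1.0      # applied load per initial fiber

alive, t = N, 0.0
while alive > 0:
    load = sigma0 * N / alive          # equal load sharing among survivors
    total_rate = alive * load ** rho   # hazard rate of the next fiber failure
    t += rng.exponential(1.0 / total_rate)
    alive -= 1

print(f"time to complete breakdown: {t:.4f} (arbitrary units)")
```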

  7. Design of Composite Structures for Reliability and Damage Tolerance

    NASA Technical Reports Server (NTRS)

    Rais-Rohani, Masoud

    1999-01-01

    A summary of research conducted during the first year is presented. The research objectives were sought by conducting two tasks: (1) investigation of probabilistic design techniques for reliability-based design of composite sandwich panels, and (2) examination of strain energy density failure criterion in conjunction with response surface methodology for global-local design of damage tolerant helicopter fuselage structures. This report primarily discusses the efforts surrounding the first task and provides a discussion of some preliminary work involving the second task.

  8. A probabilistic approach to aircraft design emphasizing stability and control uncertainties

    NASA Astrophysics Data System (ADS)

    Delaurentis, Daniel Andrew

    In order to address identified deficiencies in current approaches to aerospace systems design, a new method has been developed. This new method for design is based on the premise that design is a decision-making activity, and that deterministic analysis and synthesis can lead to poor or misguided decision making. This is due to a lack of disciplinary knowledge of sufficient fidelity about the product, to the presence of uncertainty at multiple levels of the aircraft design hierarchy, and to a failure to focus on overall affordability metrics as measures of goodness. Design solutions are desired which are robust to uncertainty and are based on the maximum knowledge possible. The new method represents advances in the two following general areas. 1. Design models and uncertainty. The research performed completes a transition from a deterministic design representation to a probabilistic one through a modeling of design uncertainty at multiple levels of the aircraft design hierarchy, including: (1) consistent, traceable uncertainty classification and representation; (2) a concise mathematical statement of the Probabilistic Robust Design problem; (3) variants of the Cumulative Distribution Functions (CDFs) as decision functions for Robust Design; (4) Probabilistic Sensitivities which identify the most influential sources of variability. 2. Multidisciplinary analysis and design. Embedded in the probabilistic methodology is a new approach for multidisciplinary design analysis and optimization (MDA/O), employing disciplinary analysis approximations formed through statistical experimentation and regression. These approximation models are a function of design variables common to the system level as well as other disciplines. For aircraft, it is proposed that synthesis/sizing is the proper avenue for integrating multiple disciplines. Research hypotheses are translated into a structured method, which is subsequently tested for validity. Specifically, the implementation involves the study of relaxed static stability technology for a supersonic commercial transport aircraft. The probabilistic robust design method is exercised, resulting in a series of robust design solutions based on different interpretations of "robustness". Insightful results are obtained, and the ability of the method to expose trends in the design space is noted as a key advantage.

  9. Approximation of state variables for discrete-time stochastic genetic regulatory networks with leakage, distributed, and probabilistic measurement delays: a robust stability problem.

    PubMed

    Pandiselvi, S; Raja, R; Cao, Jinde; Rajchakit, G; Ahmad, Bashir

    2018-01-01

    This work addresses the problem of approximation of state variables for discrete-time stochastic genetic regulatory networks with leakage, distributed, and probabilistic measurement delays. Here we design a linear estimator in such a way that the absorption of mRNA and protein can be approximated via known measurement outputs. By utilizing a Lyapunov-Krasovskii functional and some stochastic analysis techniques, we obtain stability conditions for the estimation error systems in the form of linear matrix inequalities (LMIs), under which the estimation error dynamics is robustly exponentially stable. Further, the obtained conditions can be solved effortlessly by available software packages. Moreover, the specific expression of the desired estimator is also given in the main section. Finally, two illustrative mathematical examples are provided to show the advantage of the proposed conceptual results.

  10. Evolution and stability of altruist strategies in microbial games

    NASA Astrophysics Data System (ADS)

    Adami, Christoph; Schossau, Jory; Hintze, Arend

    2012-01-01

    When microbes compete for limited resources, they often engage in chemical warfare using bacterial toxins. This competition can be understood in terms of evolutionary game theory (EGT). We study the predictions of EGT for the bacterial “suicide bomber” game in terms of the phase portraits of population dynamics, for parameter combinations that cover all interesting games for two players, and seven of the 38 possible phase portraits of the three-player game. We compare these predictions to simulations of these competitions in finite well-mixed populations, but also allowing for probabilistic rather than pure strategies, as well as Darwinian adaptation over tens of thousands of generations. We find that Darwinian evolution of probabilistic strategies stabilizes games of the rock-paper-scissors type that emerge for parameters describing realistic bacterial populations, and point to ways in which the population fixed point can be selected by changing those parameters.
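
    The deterministic EGT predictions referenced above come from replicator dynamics; a minimal sketch for a rock-paper-scissors-type payoff matrix with an attracting interior equilibrium (illustrative payoffs, not the bacterial parameters) is:

```python
import numpy as np

# Generalized rock-paper-scissors payoffs: win = 1.0, lose = -0.5.
# With the win payoff exceeding the loss magnitude, the interior
# fixed point of the replicator dynamics is asymptotically stable.
A = np.array([[0.0, -0.5, 1.0],
              [1.0, 0.0, -0.5],
              [-0.5, 1.0, 0.0]])

x = np.array([0.5, 0.3, 0.2])   # initial strategy frequencies
dt = 0.01
for _ in range(50_000):
    f = A @ x                   # fitness of each pure strategy
    phi = x @ f                 # mean population fitness
    x = x + dt * x * (f - phi)  # replicator equation (forward Euler)
    x = np.clip(x, 0.0, None)
    x /= x.sum()

print("long-run strategy frequencies:", x.round(3))
```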

  11. Probabilistic analysis algorithm for UA slope software program.

    DOT National Transportation Integrated Search

    2013-12-01

    A reliability-based computational algorithm for using a single row of equally spaced drilled shafts to stabilize an unstable slope has been developed in this research. The Monte-Carlo simulation (MCS) technique was used in the previously develop...

  12. A Probabilistic Design Method Applied to Smart Composite Structures

    NASA Technical Reports Server (NTRS)

    Shiao, Michael C.; Chamis, Christos C.

    1995-01-01

    A probabilistic design method is described and demonstrated using a smart composite wing. Probabilistic structural design incorporates naturally occurring uncertainties including those in constituent (fiber/matrix) material properties, fabrication variables, structure geometry and control-related parameters. Probabilistic sensitivity factors are computed to identify those parameters that have a great influence on a specific structural reliability. Two performance criteria are used to demonstrate this design methodology. The first criterion requires that the actuated angle at the wing tip be bounded by upper and lower limits at a specified reliability. The second criterion requires that the probability of ply damage due to random impact load be smaller than an assigned value. When the relationship between reliability improvement and the sensitivity factors is assessed, the results show that a reduction in the scatter of the random variable with the largest sensitivity factor (absolute value) provides the lowest failure probability. An increase in the mean of the random variable with a negative sensitivity factor will reduce the failure probability. Therefore, the design can be improved by controlling or selecting distribution parameters associated with random variables. This can be implemented during the manufacturing process to obtain maximum benefit with minimum alterations.
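
    The link between scatter reduction and failure probability can be sketched with a toy Monte Carlo reliability calculation; the linearized response and every coefficient below are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 200_000

def tip_angle(e, t, v):
    # Hypothetical linearized actuated wing-tip angle (degrees) in terms of
    # normalized fiber modulus e, ply thickness t, and actuation voltage v.
    return 5.0 + 40.0 * (v - 1.0) - 3.0 * (e - 1.0) - 8.0 * (t - 1.0)

def failure_prob(se=1.0, st=1.0, sv=1.0):
    e = rng.normal(1.0, 0.05 * se, n)
    t = rng.normal(1.0, 0.03 * st, n)
    v = rng.normal(1.0, 0.04 * sv, n)
    a = tip_angle(e, t, v)
    return ((a < 3.0) | (a > 7.0)).mean()   # angle outside required bounds

print(f"baseline P_f = {failure_prob():.4f}")
# Crude sensitivity study: halve each variable's scatter in turn.
print(f"halved modulus scatter   -> P_f = {failure_prob(se=0.5):.4f}")
print(f"halved thickness scatter -> P_f = {failure_prob(st=0.5):.4f}")
print(f"halved voltage scatter   -> P_f = {failure_prob(sv=0.5):.4f}")
```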

  13. Probabilistic Parameter Uncertainty Analysis of Single Input Single Output Control Systems

    NASA Technical Reports Server (NTRS)

    Smith, Brett A.; Kenny, Sean P.; Crespo, Luis G.

    2005-01-01

    The current standards for handling uncertainty in control systems use interval bounds for definition of the uncertain parameters. This approach gives no information about the likelihood of system performance, but simply gives the response bounds. When used in design, current methods of μ-analysis can lead to overly conservative controller design. With these methods, worst-case conditions are weighted equally with the most likely conditions. This research explores a unique approach for probabilistic analysis of control systems. Current reliability methods are examined, showing the strong areas of each in handling probability. A hybrid method is developed using these reliability tools for efficiently propagating probabilistic uncertainty through classical control analysis problems. The method developed is applied to classical response analysis as well as to analysis methods that explore the effects of the uncertain parameters on stability and performance metrics. The benefits of using this hybrid approach for calculating the mean and variance of response cumulative distribution functions are shown. Results of the probabilistic analysis of a missile pitch control system and a non-collocated mass-spring system show the added information provided by this hybrid analysis.
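
    A toy version of propagating parameter uncertainty into the distribution of a classical response metric (here, the resonant peak of a second-order system) might look like this; the plant form and all distributions are assumptions, not the study's models:

```python
import numpy as np

rng = np.random.default_rng(11)
n = 200_000

# Illustrative uncertain parameters of G(s) = k*wn^2 / (s^2 + 2*zeta*wn*s + wn^2).
k = rng.normal(1.0, 0.05, n)          # DC gain
zeta = rng.uniform(0.2, 0.5, n)       # damping ratio

# Resonant peak of |G(jw)| (valid for zeta < 1/sqrt(2)).
m_r = k / (2.0 * zeta * np.sqrt(1.0 - zeta ** 2))

print(f"mean resonant peak: {m_r.mean():.3f}")
print(f"std resonant peak : {m_r.std():.3f}")
print(f"P(peak > 2.0)     : {(m_r > 2.0).mean():.4f}")
```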

  14. Probabilistic estimation of numbers and costs of future landslides in the San Francisco Bay region

    USGS Publications Warehouse

    Crovelli, R.A.; Coe, J.A.

    2009-01-01

    We used historical records of damaging landslides triggered by rainstorms and a newly developed Probabilistic Landslide Assessment Cost Estimation System (PLACES) to estimate the numbers and direct costs of future landslides in the 10-county San Francisco Bay region. Historical records of damaging landslides in the region are incomplete. Therefore, our estimates of numbers and costs of future landslides are minimal estimates. The estimated mean annual number of future damaging landslides for the entire 10-county region is about 65. Santa Cruz County has the highest estimated mean annual number of damaging future landslides (about 18), whereas Napa, San Francisco, and Solano Counties have the lowest estimated mean numbers of damaging landslides (about 1 each). The estimated mean annual cost of future landslides in the entire region is about US $14.80 million (year 2000 $). The estimated mean annual cost is highest for San Mateo County ($3.24 million) and lowest for Solano County ($0.18 million). The annual per capita cost for the entire region will be about $2.10. Santa Cruz County will have the highest annual per capita cost at $8.45, whereas San Francisco County will have the lowest per capita cost at $0.31. Normalising costs by dividing by the percentage of land area with slopes equal to or greater than 17% indicates that San Francisco County will have the highest cost per square km ($7,101), whereas Santa Clara County will have the lowest cost per square km ($229). These results indicate that the San Francisco Bay region has one of the highest levels of landslide risk in the United States. Compared with landslide cost estimates from the rest of the world, the risk level in the Bay region seems high, but not exceptionally high.

  15. Nonlinear analysis of NPP safety against the aircraft attack

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Králik, Juraj, E-mail: juraj.kralik@stuba.sk; Králik, Juraj, E-mail: kralik@fa.stuba.sk

    The paper presents the nonlinear probabilistic analysis of the reinforced concrete buildings of nuclear power plant under the aircraft attack. The dynamic load is defined in time on base of the airplane impact simulations considering the real stiffness, masses, direction and velocity of the flight. The dynamic response is calculated in the system ANSYS using the transient nonlinear analysis solution method. The damage of the concrete wall is evaluated in accordance with the standard NDRC considering the spalling, scabbing and perforation effects. The simple and detailed calculations of the wall damage are compared.

  16. Stochastic damage evolution in textile laminates

    NASA Technical Reports Server (NTRS)

    Dzenis, Yuris A.; Bogdanovich, Alexander E.; Pastore, Christopher M.

    1993-01-01

    A probabilistic model utilizing random material characteristics to predict damage evolution in textile laminates is presented. The model is based on a division of each ply into two sublaminae consisting of cells. The probability of cell failure is calculated using stochastic function theory and a maximal strain failure criterion. Three modes of failure, i.e., fiber breakage, matrix failure in the transverse direction, and matrix or interface shear cracking, are taken into account. Computed failure probabilities are utilized in reducing cell stiffness based on the mesovolume concept. A numerical algorithm is developed to predict the damage evolution and deformation history of textile laminates. The effect of scatter in fiber orientation on cell properties is discussed. The influence of the weave on damage accumulation is illustrated with the example of a Kevlar/epoxy laminate.

  17. Study of Composite Plate Damages Using Embedded PZT Sensors with Various Center Frequency

    NASA Astrophysics Data System (ADS)

    Kang, Kyoung-Tak; Chun, Heoung-Jae; Son, Ju-Hyun; Byun, Joon-Hyung; Um, Moon-Kwang; Lee, Sang-Kwan

    This study presents part of an experimental and analytical survey of candidate methods for damage detection in composite structures. Embedded piezoceramic (PZT) sensors were excited with a high-power ultrasonic wave generator, producing stress waves that propagate along the composite plate. The same embedded PZT sensors are used as receivers to acquire the stress signals. The effect of the center frequency of the embedded sensors on damage identification capability was evaluated with known localized defects. The study assessed damage in the composite plate by fusing information from multiple sensing paths of the embedded network, based on the Hilbert transform, signal correlation, and probabilistic searching. The obtained results show that satisfactory detection of defects can be achieved by the proposed method.

  18. The Study of the Relationship between Probabilistic Design and Axiomatic Design Methodology. Volume 3

    NASA Technical Reports Server (NTRS)

    Onwubiko, Chin-Yere; Onyebueke, Landon

    1996-01-01

    Structural failure is rarely a "sudden death" type of event; such sudden failures may occur only under abnormal loadings like bomb or gas explosions and very strong earthquakes. In most cases, structures fail due to damage accumulated under normal loadings such as wind loads and dead and live loads. The consequences of cumulative damage affect the reliability of surviving components and may finally cause collapse of the system. The effects of cumulative damage on system reliability under time-invariant loadings are of practical interest in structural design and are therefore investigated in this study. The scope of this study is, however, restricted to the consideration of damage accumulation as the increase in the number of failed components due to the violation of their strength limits.

  19. 46 CFR 179.212 - Watertight bulkheads for subdivision and damage stability.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... this chapter, a monohull vessel which undergoes a simplified stability proof test in accordance with... stability. 179.212 Section 179.212 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) SMALL PASSENGER VESSELS (UNDER 100 GROSS TONS) SUBDIVISION, DAMAGE STABILITY, AND WATERTIGHT INTEGRITY Subdivision...

  20. 46 CFR 179.212 - Watertight bulkheads for subdivision and damage stability.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... this chapter, a monohull vessel which undergoes a simplified stability proof test in accordance with... stability. 179.212 Section 179.212 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) SMALL PASSENGER VESSELS (UNDER 100 GROSS TONS) SUBDIVISION, DAMAGE STABILITY, AND WATERTIGHT INTEGRITY Subdivision...

  1. 46 CFR 179.212 - Watertight bulkheads for subdivision and damage stability.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... this chapter, a monohull vessel which undergoes a simplified stability proof test in accordance with... stability. 179.212 Section 179.212 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) SMALL PASSENGER VESSELS (UNDER 100 GROSS TONS) SUBDIVISION, DAMAGE STABILITY, AND WATERTIGHT INTEGRITY Subdivision...

  2. 46 CFR 179.212 - Watertight bulkheads for subdivision and damage stability.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... this chapter, a monohull vessel which undergoes a simplified stability proof test in accordance with... stability. 179.212 Section 179.212 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) SMALL PASSENGER VESSELS (UNDER 100 GROSS TONS) SUBDIVISION, DAMAGE STABILITY, AND WATERTIGHT INTEGRITY Subdivision...

  3. Probabilistic inspection strategies for minimizing service failures

    NASA Technical Reports Server (NTRS)

    Brot, Abraham

    1994-01-01

    The INSIM computer program is described which simulates the 'limited fatigue life' environment in which aircraft structures generally operate. The use of INSIM to develop inspection strategies which aim to minimize service failures is demonstrated. Damage-tolerance methodology, inspection thresholds and customized inspections are simulated using the probability of failure as the driving parameter.

  4. Functional Topography of Early Periventricular Brain Lesions in Relation to Cytoarchitectonic Probabilistic Maps

    ERIC Educational Resources Information Center

    Staudt, Martin; Ticini, Luca F.; Grodd, Wolfgang; Krageloh-Mann, Ingeborg; Karnath, Hans-Otto

    2008-01-01

    Early periventricular brain lesions can not only cause cerebral palsy, but can also induce a reorganization of language. Here, we asked whether these different functional consequences can be attributed to topographically distinct portions of the periventricular white matter damage. Eight patients with pre- and perinatally acquired left-sided…

  5. Effect of α-damage on fission-track annealing in zircon

    USGS Publications Warehouse

    Kasuya, Masao; Naeser, Charles W.

    1988-01-01

    The thermal stability of confined fission-track lengths in four zircon samples having different spontaneous track densities (i.e., different amounts of α-damage) has been studied by one-hour isochronal annealing experiments. The thermal stability of spontaneous track lengths is independent of initial spontaneous track density. The thermal stability of induced track lengths in pre-annealed zircon, however, is significantly higher than that of spontaneous track lengths. The results indicate that the presence of α-damage lowers the thermal stability of fission-tracks in zircon.

  6. Empirical Fragility Analysis of Buildings and Boats Damaged By the 2011 Great East Japan Tsunami and Their Practical Application

    NASA Astrophysics Data System (ADS)

    Suppasri, A.; Charvet, I.; Leelawat, N.; Fukutani, Y.; Muhari, A.; Futami, T.; Imamura, F.

    2014-12-01

    This study examined, in turn, detailed data on the building and boat damage caused by the 2011 tsunami in order to understand its main causes and provide damage probability estimates. Tsunami-induced building damage data were collected from field surveys and include inundation depth, building material, number of stories, and occupancy type for more than 80,000 buildings. Numerical simulations with high-resolution bathymetry and topography data were conducted to obtain characteristic tsunami measures such as flow velocity. These data were analyzed using advanced statistical methods (ordinal regression analysis) to create not only empirical 2D tsunami fragility curves but also, for the first time, 3D tsunami fragility surfaces. The effect of floating debris was also considered, using a binary indicator of debris impact based on the proximity of a structure to a debris source (i.e., a washed-away building). Both the 2D and 3D fragility analyses provided results for each building damage level and for different topographies. While 2D fragility curves provide easily interpretable results relating tsunami flow depth to damage probability for different damage levels, 3D fragility surfaces allow several influential tsunami parameters to be taken into account, thus reducing uncertainty in the probability estimates. More than 20,000 damaged boats were used in an analysis similar to the one carried out for the buildings. Detailed data for each boat comprise information on the damage ratio (paid value over insured value), tonnage, engine type, material type, and damage classification. The 2D and 3D fragility analyses were developed using representative tsunami heights for each port obtained from field surveys and flow velocities obtained from the aforementioned simulations. The results are currently being adapted for practical disaster mitigation: they are being integrated with probabilistic tsunami hazard analysis in order to create offshore and onshore probabilistic hazard maps. Through GPS and an embedded calculation function based on the aforementioned fragility results, these applications can be used in the field for quick estimation of possible building damage, as well as a decision support system for fishermen (whether or not they should move their boats to the deep sea upon tsunami arrival).
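
    A common way to construct such 2D fragility curves is maximum-likelihood fitting of a lognormal CDF to binary damage observations; the sketch below uses synthetic depth/collapse data rather than the surveyed dataset:

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

rng = np.random.default_rng(5)

# Synthetic survey: inundation depth (m) and a binary "collapsed" outcome.
depth = rng.uniform(0.1, 8.0, 2000)
collapsed = rng.random(2000) < norm.cdf(np.log(depth / 2.5) / 0.5)

def neg_log_lik(params):
    log_median, log_beta = params        # log-parameterized for robustness
    p = norm.cdf((np.log(depth) - log_median) / np.exp(log_beta))
    p = np.clip(p, 1e-12, 1.0 - 1e-12)
    return -(collapsed * np.log(p) + (~collapsed) * np.log(1.0 - p)).sum()

res = minimize(neg_log_lik, x0=[0.0, 0.0], method="Nelder-Mead")
median, beta = np.exp(res.x)
print(f"fitted median depth = {median:.2f} m, lognormal std = {beta:.2f}")
```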

  7. Modelling multi-hazard hurricane damages on an urbanized coast with a Bayesian Network approach

    USGS Publications Warehouse

    van Verseveld, H.C.W.; Van Dongeren, A. R.; Plant, Nathaniel G.; Jäger, W.S.; den Heijer, C.

    2015-01-01

    Hurricane flood impacts to residential buildings in coastal zones are caused by a number of hazards, such as inundation, overflow currents, erosion, and wave attack. However, traditional hurricane damage models typically make use of stage-damage functions, where the stage is related to flooding depth only. Moreover, these models are deterministic and do not consider the large amount of uncertainty associated with both the processes themselves and with the predictions. This uncertainty becomes increasingly important when multiple hazards (flooding, wave attack, erosion, etc.) are considered simultaneously. This paper focusses on establishing relationships between observed damage and multiple hazard indicators in order to make better probabilistic predictions. The concept consists of (1) determining Local Hazard Indicators (LHIs) from a hindcasted storm with use of a nearshore morphodynamic model, XBeach, and (2) coupling these LHIs and building characteristics to the observed damages. We chose a Bayesian Network approach in order to make this coupling and used the LHIs ‘Inundation depth’, ‘Flow velocity’, ‘Wave attack’, and ‘Scour depth’ to represent flooding, current, wave impact, and erosion-related hazards. The coupled hazard model was tested against four thousand damage observations from a case site at the Rockaway Peninsula, NY, that was impacted by Hurricane Sandy in late October, 2012. The model was able to accurately distinguish ‘Minor damage’ from all other outcomes 95% of the time and could distinguish areas that were affected by the storm, but not severely damaged, 68% of the time. For the most heavily damaged buildings (‘Major Damage’ and ‘Destroyed’), projections of the expected damage underestimated the observed damage. The model demonstrated that including multiple hazards doubled the prediction skill, with Log-Likelihood Ratio test (a measure of improved accuracy and reduction in uncertainty) scores between 0.02 and 0.17 when only one hazard is considered and a score of 0.37 when multiple hazards are considered simultaneously. The LHIs with the most predictive skill were ‘Inundation depth’ and ‘Wave attack’. The Bayesian Network approach has several advantages over the market-standard stage-damage functions: the predictive capacity of multiple indicators can be combined; probabilistic predictions can be obtained, which include uncertainty; and quantitative as well as descriptive information can be used simultaneously.
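
    The predictive and diagnostic queries of such a Bayesian Network can be illustrated with a toy two-indicator version; every probability table below is invented for illustration:

```python
import numpy as np

# Toy network: damage state D depends on discretized inundation depth H
# (low/med/high) and wave attack W (low/high).
p_h = np.array([0.6, 0.3, 0.1])
p_w = np.array([0.7, 0.3])

# P(D = minor, major, destroyed | H, W), shape (3, 2, 3); rows sum to 1.
p_d_given_hw = np.array([
    [[0.90, 0.08, 0.02], [0.70, 0.20, 0.10]],
    [[0.60, 0.30, 0.10], [0.30, 0.40, 0.30]],
    [[0.30, 0.40, 0.30], [0.05, 0.35, 0.60]],
])

# Prediction: P(D) = sum over h, w of P(D|h,w) P(h) P(w).
p_d = np.einsum("hwd,h,w->d", p_d_given_hw, p_h, p_w)
print("P(minor, major, destroyed):", p_d.round(3))

# Diagnosis by Bayes' rule: P(H | D = destroyed).
joint_hd = np.einsum("hwd,h,w->hd", p_d_given_hw, p_h, p_w)
print("P(H | destroyed):", (joint_hd[:, 2] / joint_hd[:, 2].sum()).round(3))
```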

  8. Damage and Loss Estimation for Natural Gas Networks: The Case of Istanbul

    NASA Astrophysics Data System (ADS)

    Çaktı, Eser; Hancılar, Ufuk; Şeşetyan, Karin; Bıyıkoǧlu, Hikmet; Şafak, Erdal

    2017-04-01

    Natural gas networks are one of the major lifeline systems supporting human, urban and industrial activities. The continuity of gas supply is critical for almost all functions of modern life. Under natural phenomena such as earthquakes and landslides, damage to system elements may lead to explosions and fires, compromising human life and damaging the physical environment. Furthermore, disruption of the gas supply puts human activities at risk and also results in economic losses. This study is concerned with the performance of one of the largest natural gas distribution systems in the world. Physical damage to Istanbul's natural gas network is estimated under the most recent probabilistic earthquake hazard models available, as well as under simulated ground motions from physics-based models. Several vulnerability functions are used in modelling damage to system elements. A first-order assessment of monetary losses to Istanbul's natural gas distribution network is also attempted.

  9. Reliability assessment of slender concrete columns at the stability failure

    NASA Astrophysics Data System (ADS)

    Valašík, Adrián; Benko, Vladimír; Strauss, Alfred; Täubling, Benjamin

    2018-01-01

    The European Standard for the design of concrete columns using non-linear methods shows deficiencies in terms of global reliability in cases where the columns fail by loss of stability. Buckling failure is a brittle failure which occurs without warning, and the probability of its occurrence depends on the column's slenderness. Experiments with slender concrete columns were carried out in cooperation with STRABAG Bratislava LTD in the Central Laboratory of the Faculty of Civil Engineering, SUT in Bratislava. The following article compares the global reliability of slender concrete columns with slenderness of 90 and higher, designed according to the methods offered by EN 1992-1-1 [1]. The experiments served as the basis for deterministic nonlinear modelling of the columns and the subsequent probabilistic evaluation of the variability of the structural response. The final results may be used as loading thresholds for the produced structural elements, and they present probabilistic design as less conservative than classic partial-safety-factor-based design and the alternative ECOV method.

  10. The effect of α-damage on fission-track annealing in zircon

    USGS Publications Warehouse

    Kasuya, M.; Naeser, C.W.

    1988-01-01

    The thermal stability of confined fission-track lengths in four zircon samples having different spontaneous track densities (i.e., different amounts of α-damage) has been studied by one-hour isochronal annealing experiments. The thermal stability of spontaneous track lengths is independent of initial spontaneous track density. The thermal stability of induced track lengths in pre-annealed zircon, however, is significantly higher than that of spontaneous track lengths. The results indicate that the presence of α-damage lowers the thermal stability of fission-tracks in zircon. © 1988.

  11. Mitochondria damage checkpoint in apoptosis and genome stability.

    PubMed

    Singh, Keshav K

    2004-11-01

    Mitochondria perform multiple cellular functions including energy production, cell proliferation and apoptosis. Studies described in this paper suggest a role for mitochondria in maintaining genomic stability. Genomic stability appears to be dependent on mitochondrial functions involved in the maintenance of proper intracellular redox status, ATP-dependent transcription, DNA replication, DNA repair and DNA recombination. To further elucidate the role of mitochondria in genomic stability, I propose a mitochondria damage checkpoint (mitocheckpoint) that monitors and responds to damaged mitochondria. The mitocheckpoint can coordinate and maintain a proper balance between apoptotic and anti-apoptotic signals. When mitochondria are damaged, the mitocheckpoint can be activated to help cells repair damaged mitochondria, restore normal mitochondrial function, and avoid the production of mitochondria-defective cells. If mitochondria are severely damaged, the mitocheckpoint may not be able to repair the damage and protect cells; such an event triggers apoptosis. If damage to mitochondria is continuous or persistent, such as damage to mitochondrial DNA resulting in mutations, the mitocheckpoint may fail, which can lead to genomic instability and increased cell survival in yeast; in humans it can cause cancer. In support of this proposal, we provide evidence that mitochondrial genetic defects in both yeast and mammalian systems lead to impaired DNA repair, increased genomic instability and increased cell survival. This study reveals molecular genetic mechanisms underlying a role for mitochondria in carcinogenesis in humans.

  12. 46 CFR 174.070 - General damage stability assumptions.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 46 Shipping 7 2013-10-01 2013-10-01 false General damage stability assumptions. 174.070 Section 174.070 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) SUBDIVISION AND STABILITY SPECIAL RULES PERTAINING TO SPECIFIC VESSEL TYPES Special Rules Pertaining to Mobile Offshore Drilling...

  13. 46 CFR 174.070 - General damage stability assumptions.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 46 Shipping 7 2010-10-01 2010-10-01 false General damage stability assumptions. 174.070 Section 174.070 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) SUBDIVISION AND STABILITY SPECIAL RULES PERTAINING TO SPECIFIC VESSEL TYPES Special Rules Pertaining to Mobile Offshore Drilling...

  14. 46 CFR 174.065 - Damage stability requirements.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 46 Shipping 7 2010-10-01 2010-10-01 false Damage stability requirements. 174.065 Section 174.065 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) SUBDIVISION AND STABILITY SPECIAL RULES PERTAINING TO SPECIFIC VESSEL TYPES Special Rules Pertaining to Mobile Offshore Drilling Units § 174.065...

  15. 46 CFR 174.065 - Damage stability requirements.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 46 Shipping 7 2013-10-01 2013-10-01 false Damage stability requirements. 174.065 Section 174.065 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) SUBDIVISION AND STABILITY SPECIAL RULES PERTAINING TO SPECIFIC VESSEL TYPES Special Rules Pertaining to Mobile Offshore Drilling Units § 174.065...

  16. Modeling landslide recurrence in Seattle, Washington, USA

    USGS Publications Warehouse

    Salciarini, Diana; Godt, Jonathan W.; Savage, William Z.; Baum, Rex L.; Conversini, Pietro

    2008-01-01

    To manage the hazard associated with shallow landslides, decision makers need an understanding of where and when landslides may occur. A variety of approaches have been used to estimate the hazard from shallow, rainfall-triggered landslides, such as empirical rainfall threshold methods or probabilistic methods based on historical records. The wide availability of Geographic Information Systems (GIS) and digital topographic data has led to the development of analytic methods for landslide hazard estimation that couple steady-state hydrological models with slope stability calculations. Because these methods typically neglect the transient effects of infiltration on slope stability, results cannot be linked with historical or forecasted rainfall sequences. Estimates of the frequency of conditions likely to cause landslides are critical for quantitative risk and hazard assessments. We present results to demonstrate how a transient infiltration model coupled with an infinite slope stability calculation may be used to assess shallow landslide frequency in the City of Seattle, Washington, USA. A module called CRF (Critical RainFall) for estimating deterministic rainfall thresholds has been integrated in the TRIGRS (Transient Rainfall Infiltration and Grid-based Slope-Stability) model that combines a transient, one-dimensional analytic solution for pore-pressure response to rainfall infiltration with an infinite slope stability calculation. Input data for the extended model include topographic slope, colluvial thickness, initial water-table depth, material properties, and rainfall durations. This approach is combined with a statistical treatment of rainfall using a GEV (General Extreme Value) probabilistic distribution to produce maps showing the shallow landslide recurrence induced, on a spatially distributed basis, as a function of rainfall duration and hillslope characteristics.
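
    The infinite-slope calculation that the model couples with the transient pore-pressure solution has a compact closed form, FS = tan(phi)/tan(alpha) + (c - psi*gamma_w*tan(phi)) / (gamma_s*z*sin(alpha)*cos(alpha)); a sketch with illustrative soil parameters shows FS dropping as infiltration raises the pressure head:

```python
import numpy as np

def infinite_slope_fs(slope_deg, z, psi, c=4000.0, phi_deg=33.0,
                      gamma_s=19000.0, gamma_w=9810.0):
    """Infinite-slope factor of safety at failure-plane depth z (m) with
    pressure head psi (m). Parameter values are illustrative defaults:
    c cohesion [Pa], phi friction angle, gamma unit weights [N/m^3]."""
    a, phi = np.radians(slope_deg), np.radians(phi_deg)
    return (np.tan(phi) / np.tan(a)
            + (c - psi * gamma_w * np.tan(phi))
              / (gamma_s * z * np.sin(a) * np.cos(a)))

# FS falls below 1 (instability) as psi rises at 1.5 m depth on a 35-degree slope.
for psi in [0.0, 0.5, 1.0, 1.5]:
    print(f"psi = {psi:.1f} m -> FS = {infinite_slope_fs(35.0, 1.5, psi):.2f}")
```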

  17. Seismically induced landslides: current research by the US Geological Survey.

    USGS Publications Warehouse

    Harp, E.L.; Wilson, R.C.; Keefer, D.K.; Wieczorek, G.F.

    1986-01-01

    We have produced a regional seismic slope-stability map and a probabilistic prediction of landslide distribution from a postulated earthquake. For liquefaction-induced landslides, in situ measurements of seismically induced pore-water pressures have been used to establish an elastic model of pore pressure generation. -from Authors

  18. Sensitivity to Uncertainty in Asteroid Impact Risk Assessment

    NASA Astrophysics Data System (ADS)

    Mathias, D.; Wheeler, L.; Prabhu, D. K.; Aftosmis, M.; Dotson, J.; Robertson, D. K.

    2015-12-01

    The Engineering Risk Assessment (ERA) team at NASA Ames Research Center is developing a physics-based impact risk model for probabilistically assessing threats from potential asteroid impacts on Earth. The model integrates probabilistic sampling of asteroid parameter ranges with physics-based analyses of entry, breakup, and impact to estimate damage areas and casualties from various impact scenarios. Assessing these threats is a highly coupled, dynamic problem involving significant uncertainties in the range of expected asteroid characteristics, how those characteristics may affect the level of damage, and the fidelity of various modeling approaches and assumptions. The presented model is used to explore the sensitivity of impact risk estimates to these uncertainties in order to gain insight into what additional data or modeling refinements are most important for producing effective, meaningful risk assessments. In the extreme cases of very small or very large impacts, the results are generally insensitive to many of the characterization and modeling assumptions. However, the nature of the sensitivity can change across moderate-sized impacts. Results will focus on the value of additional information in this critical, mid-size range, and how this additional data can support more robust mitigation decisions.

  19. Optimization of monitoring and inspections in the life-cycle of wind turbines

    NASA Astrophysics Data System (ADS)

    Hanish Nithin, Anu; Omenzetter, Piotr

    2016-04-01

    The past decade has witnessed a surge in offshore wind farm developments across the world. Although this form of cleaner and greener energy is beneficial and eco-friendly, the production of wind energy entails high life-cycle costs. The costs associated with inspections, monitoring and repairs of wind turbines are primary contributors to the high cost of electricity produced in this way and are disadvantageous in today's competitive economic environment. Limited research has been done on the probabilistic optimization of life-cycle costs of offshore wind turbine structures and their components. This paper proposes a framework for assessing the life-cycle cost of wind turbine structures subject to damage and deterioration. The objective of the paper is to develop a mathematical probabilistic cost assessment framework which considers deterioration, inspection, monitoring, repair and maintenance models and their uncertainties. The uncertainties stem from the accuracy and precision of the monitoring and inspection methods and can be represented through each method's probability of damage detection. Schedules for inspection, monitoring and repair actions are demonstrated using a decision tree. Examples of a generalised deterioration process integrated with the cost analysis using a decision tree are shown for a wind turbine foundation structure.
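
    The decision-tree comparison of maintenance policies reduces to an expected-cost calculation over the branches; a minimal one-component sketch with invented costs and probability of detection:

```python
# Two-branch decision tree for one wind-turbine component.
# All numbers are illustrative assumptions.
p_damage = 0.15          # prior probability the component is damaged
pod = 0.80               # probability of detection of the inspection method
c_inspect = 5_000.0      # cost of one inspection
c_repair = 50_000.0      # planned repair cost after detection
c_failure = 1_000_000.0  # cost of an in-service failure

# Branch 1: inspect; damage is found (and repaired) with probability p_damage*pod.
cost_inspect = (c_inspect
                + p_damage * pod * c_repair
                + p_damage * (1.0 - pod) * c_failure)

# Branch 2: no inspection; any damage leads to failure.
cost_no_inspect = p_damage * c_failure

print(f"E[cost | inspect]    = {cost_inspect:,.0f}")
print(f"E[cost | no inspect] = {cost_no_inspect:,.0f}")
```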

  20. Computational Prediction of Shock Ignition Thresholds and Ignition Probability of Polymer-Bonded Explosives

    NASA Astrophysics Data System (ADS)

    Wei, Yaochi; Kim, Seokpum; Horie, Yasuyuki; Zhou, Min

    2017-06-01

    A computational approach is developed to predict the probabilistic ignition thresholds of polymer-bonded explosives (PBXs). The simulations explicitly account for microstructure, constituent properties, and interfacial responses and capture the processes responsible for the development of hotspots and damage. The specific damage mechanisms considered include viscoelasticity, viscoplasticity, fracture, post-fracture contact, frictional heating, and heat conduction. The probabilistic analysis uses sets of statistically similar microstructure samples to mimic relevant experiments for statistical variations of material behavior due to inherent material heterogeneities. The ignition thresholds and corresponding ignition probability maps are predicted for PBX 9404 and PBX 9501 for the impact loading regime of Up = 200–1200 m/s. James and Walker-Wasley relations are utilized to establish explicit analytical expressions for the ignition probability as a function of load intensities. The predicted results are in good agreement with available experimental measurements. The capability to computationally predict the macroscopic response from material microstructures and basic constituent properties lends itself to the design of new materials and the analysis of existing materials. The authors gratefully acknowledge the support from the Air Force Office of Scientific Research (AFOSR) and the Defense Threat Reduction Agency (DTRA).
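
    The probability-versus-load idea lends itself to a compact sketch: collect (or here, synthesize) go/no-go ignition outcomes over a sweep of piston velocities, then fit a cumulative-normal ignition probability curve. The fitted form stands in for the James and Walker-Wasley relations used in the paper, and the synthetic data and 700/150 m/s parameters are invented:

      import numpy as np
      from scipy.stats import norm
      from scipy.optimize import curve_fit

      # Hypothetical go/no-go outcomes: 20 microstructure samples per velocity.
      up  = np.repeat(np.arange(200, 1300, 100), 20)        # piston speed, m/s
      rng = np.random.default_rng(0)
      go  = (rng.random(up.size) < norm.cdf(up, 700, 150)).astype(float)

      # Fit P(ignition | Up) as a cumulative normal in load intensity,
      # a stand-in for the James / Walker-Wasley forms in the paper.
      popt, _ = curve_fit(lambda u, mu, s: norm.cdf(u, mu, s), up, go,
                          p0=(700.0, 100.0))
      print("50%% ignition threshold ~ %.0f m/s, spread ~ %.0f m/s" % tuple(popt))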

  1. A global probabilistic tsunami hazard assessment from earthquake sources

    USGS Publications Warehouse

    Davies, Gareth; Griffin, Jonathan; Lovholt, Finn; Glimsdal, Sylfest; Harbitz, Carl; Thio, Hong Kie; Lorito, Stefano; Basili, Roberto; Selva, Jacopo; Geist, Eric L.; Baptista, Maria Ana

    2017-01-01

    Large tsunamis occur infrequently but have the capacity to cause enormous numbers of casualties, damage to the built environment and critical infrastructure, and economic losses. A sound understanding of tsunami hazard is required to underpin management of these risks, and while tsunami hazard assessments are typically conducted at regional or local scales, globally consistent assessments are required to support international disaster risk reduction efforts, and can serve as a reference for local and regional studies. This study presents a global-scale probabilistic tsunami hazard assessment (PTHA), extending previous global-scale assessments based largely on scenario analysis. Only earthquake sources are considered, as they represent about 80% of the recorded damaging tsunami events. Globally extensive estimates of tsunami run-up height are derived at various exceedance rates, and the associated uncertainties are quantified. Epistemic uncertainties in the exceedance rates of large earthquakes often lead to large uncertainties in tsunami run-up. Deviations between modelled tsunami run-up and event observations are quantified, and found to be larger than suggested in previous studies. Accounting for these deviations in PTHA is important, as it leads to a pronounced increase in predicted tsunami run-up for a given exceedance rate.

  2. Climatology and Predictability of Cool-Season High Wind Events in the New York City Metropolitan and Surrounding Area

    NASA Astrophysics Data System (ADS)

    Layer, Michael

    Damaging wind events not associated with severe convective storms or tropical cyclones can occur over the Northeast U.S. during the cool season and can cause significant problems with transportation, infrastructure, and public safety. These non-convective wind events (NCWEs) are difficult for operational forecasters to predict in the NYC region, as revealed by relatively poor verification statistics in recent years. This study investigates the climatology of NCWEs occurring between 15 September and 15 May over 13 seasons from 2000-2001 through 2012-2013. The events are broken down into three distinct types commonly observed in the region: pre-cold frontal (PRF), post-cold frontal (POF), and nor'easter/coastal storm (NEC) cases. Relationships between observed winds and atmospheric parameters such as the 900 hPa height gradient, 3-hour MSLP tendency, low-level wind profile, and stability are also studied. Overall, PRF and NEC events exhibit stronger height gradients, stronger low-level winds, and stronger low-level stability than POF events. Model verification is also conducted over the 2009-2014 time period using the Short Range Ensemble Forecast system (SREF) from the National Centers for Environmental Prediction (NCEP). Both deterministic and probabilistic verification metrics are used to evaluate the performance of the ensemble during NCWEs. Although the SREF has better forecast skill than most of the deterministic SREF control members, it is rather poorly calibrated and exhibits significant overforecasting (a positive wind speed bias) in the lower atmosphere.

  3. Damage evaluation by a guided wave-hidden Markov model based method

    NASA Astrophysics Data System (ADS)

    Mei, Hanfei; Yuan, Shenfang; Qiu, Lei; Zhang, Jinjin

    2016-02-01

    Guided wave based structural health monitoring has shown great potential in aerospace applications. However, one of the key challenges for practical engineering applications is the accurate interpretation of guided wave signals under time-varying environmental and operational conditions. This paper presents a guided wave-hidden Markov model (HMM) based method to improve the damage evaluation reliability of real aircraft structures under time-varying conditions. In the proposed approach, an HMM-based unweighted moving average trend estimation method, which can capture the trend of damage propagation from the posterior probability obtained by HMM modeling, is used to achieve a probabilistic evaluation of the structural damage. To validate the developed method, experiments are performed on a hole-edge crack specimen under fatigue loading conditions and on a real aircraft wing spar under changing structural boundary conditions. Experimental results show the advantage of the proposed method.
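
    To make the posterior-plus-moving-average idea concrete, here is a self-contained two-state (healthy/damaged) forward filter with Gaussian emissions, followed by an unweighted moving average of the damaged-state posterior. The transition matrix, emission parameters, and synthetic damage-index series are all assumed for illustration; the paper's actual HMM is trained on guided wave features:

      import numpy as np

      def forward_posterior(obs, A, means, sd, pi):
          """Filtered posterior P(state_t | obs_1..t) for a 2-state Gaussian HMM."""
          alpha = np.zeros((len(obs), 2))
          like = lambda x: np.exp(-0.5 * ((x - means) / sd) ** 2) / sd
          alpha[0] = pi * like(obs[0]); alpha[0] /= alpha[0].sum()
          for t in range(1, len(obs)):
              alpha[t] = like(obs[t]) * (alpha[t - 1] @ A)   # predict-update
              alpha[t] /= alpha[t].sum()
          return alpha

      A     = np.array([[0.98, 0.02], [0.0, 1.0]])   # healthy -> damaged only
      means = np.array([0.0, 1.0]); sd = np.array([0.5, 0.5])
      obs   = np.concatenate([np.random.normal(0, 0.5, 200),
                              np.random.normal(1, 0.5, 200)])  # synthetic DI series
      post  = forward_posterior(obs, A, means, sd, pi=np.array([1.0, 0.0]))

      k = 25                                         # unweighted moving average
      trend = np.convolve(post[:, 1], np.ones(k) / k, mode="valid")
      print("P(damaged) near end of record:", trend[-1].round(3))

    The moving average suppresses the posterior jitter caused by time-varying conditions, which is the role the trend-estimation step plays in the paper.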

  4. Scalable Algorithms for Global Scale Remote Sensing Applications

    NASA Astrophysics Data System (ADS)

    Vatsavai, R. R.; Bhaduri, B. L.; Singh, N.

    2015-12-01

    The recent decade has witnessed major changes on the Earth, for example, deforestation, varying cropping and human settlement patterns, and crippling damage due to disasters. Accurate assessment of the damage caused by major natural and anthropogenic disasters is becoming critical due to increases in human and economic losses. This increase in loss of life and severe damage can be attributed to the growing population, as well as human migration to the disaster-prone regions of the world. Rapid assessment of these changes and dissemination of accurate information is critical for creating an effective emergency response. Change detection using high-resolution satellite images is a primary tool in assessing damage, monitoring biomass and critical infrastructure, and identifying new settlements. Existing change detection methods suffer from registration errors and are often based on pixel-wise (location-wise) comparison of spectral observations from a single sensor. In this paper we present a novel probabilistic change detection framework based on patch comparison, and a GPU implementation that supports a near real-time rapid damage exploration capability.
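
    A simplified stand-in for the patch-comparison idea: model each patch by its local Gaussian statistics and flag patches whose symmetric KL divergence between the two epochs exceeds a threshold. Patch-level statistics tolerate small registration errors that defeat pixel-wise differencing. The patch size, threshold, and synthetic images are assumptions:

      import numpy as np

      def patch_change_map(img1, img2, p=8, thresh=1.0):
          """Flag changed patches via symmetric KL between per-patch Gaussians."""
          h, w = (img1.shape[0] // p) * p, (img1.shape[1] // p) * p
          out = np.zeros((h // p, w // p), dtype=bool)
          for i in range(0, h, p):
              for j in range(0, w, p):
                  a, b = img1[i:i+p, j:j+p], img2[i:i+p, j:j+p]
                  m1, v1 = a.mean(), a.var() + 1e-6
                  m2, v2 = b.mean(), b.var() + 1e-6
                  # Symmetric KL divergence between two 1-D Gaussians.
                  kl = 0.5 * ((v1/v2 + v2/v1) + (m1-m2)**2 * (1/v1 + 1/v2) - 2)
                  out[i//p, j//p] = kl > thresh
          return out

      rng = np.random.default_rng(3)
      before = rng.normal(0, 1, (64, 64))
      after  = before.copy(); after[16:32, 16:32] += 2.0   # synthetic "damage"
      print("changed patches:", patch_change_map(before, after).sum())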

  5. An Approach to Risk-Based Design Incorporating Damage Tolerance Analyses

    NASA Technical Reports Server (NTRS)

    Knight, Norman F., Jr.; Glaessgen, Edward H.; Sleight, David W.

    2002-01-01

    Incorporating risk-based design as an integral part of spacecraft development is becoming more and more common. Assessment of uncertainties associated with design parameters and environmental aspects such as loading provides increased knowledge of the design and its performance. Results of such studies can contribute to mitigating risk through a system-level assessment. Understanding the risk of an event occurring, the probability of its occurrence, and the consequences of its occurrence can lead to robust, reliable designs. This paper describes an approach to risk-based structural design incorporating damage-tolerance analysis. The application of this approach to a candidate Earth-entry vehicle is described. The emphasis of the paper is on describing an approach for establishing damage-tolerant structural response inputs to a system-level probabilistic risk assessment.

  6. Aerodynamic Effects and Modeling of Damage to Transport Aircraft

    NASA Technical Reports Server (NTRS)

    Shah, Gautam H.

    2008-01-01

    A wind tunnel investigation was conducted to measure the aerodynamic effects of damage to lifting and stability/control surfaces of a commercial transport aircraft configuration. The modeling of such effects is necessary for the development of flight control systems to recover aircraft from adverse, damage-related loss-of-control events, as well as for the estimation of aerodynamic characteristics from flight data under such conditions. Damage in the form of partial or total loss of area was applied to the wing, horizontal tail, and vertical tail. Aerodynamic stability and control implications of damage to each surface are presented, to aid in the identification of potential boundaries in recoverable stability or control degradation. The aerodynamic modeling issues raised by the wind tunnel results are discussed, particularly the additional modeling requirements necessitated by asymmetries due to damage, and the potential benefits of such expanded modeling.

  7. Probabilistic, Seismically-Induced Landslide Hazard Mapping of Western Oregon

    NASA Astrophysics Data System (ADS)

    Olsen, M. J.; Sharifi Mood, M.; Gillins, D. T.; Mahalingam, R.

    2015-12-01

    Earthquake-induced landslides can generate significant damage within urban communities by damaging structures, obstructing lifeline connection routes and utilities, generating various environmental impacts, and possibly resulting in loss of life. Reliable hazard and risk maps are important to assist agencies in efficiently allocating and managing limited resources to prepare for such events. This research presents a new methodology for communicating site-specific landslide hazard assessments in a large-scale, regional map. Implementation of the proposed methodology results in seismically-induced landslide hazard maps that depict the probabilities of exceeding landslide displacement thresholds (e.g. 0.1, 0.3, 1.0 and 10 meters). These maps integrate a variety of data sources including: recent landslide inventories, LIDAR and photogrammetric topographic data, geologic maps, mapped NEHRP site classifications based on available shear wave velocity data in each geologic unit, and USGS probabilistic seismic hazard curves. Soil strength estimates were obtained by evaluating slopes present along landslide scarps and deposits for major geologic units. Code was then developed to integrate these layers and perform a rigid sliding-block analysis to determine the amount and associated probabilities of displacement based on each bin of peak ground acceleration in the seismic hazard curve at each pixel. The methodology was applied to western Oregon, which contains weak, weathered, and often wet soils on steep slopes. Such conditions pose a high landslide hazard even without seismic events. A series of landslide hazard maps highlighting the probabilities of exceeding the aforementioned thresholds were generated for the study area. These output maps were then utilized in a performance-based design framework, enabling them to be analyzed in conjunction with other hazards for fully probabilistic hazard evaluation and risk assessment.
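
    The per-pixel computation can be sketched as follows: combine a hazard curve (annual rates binned by PGA) with a Newmark-type sliding-block displacement predictor and sum the rates of bins whose displacement exceeds each threshold. The hazard-curve numbers and yield acceleration are hypothetical, and the displacement regression is a generic illustrative form with placeholder coefficients, not the study's calibrated model:

      import numpy as np

      # Hypothetical hazard curve: PGA bins (g) and annual occurrence rates.
      pga  = np.array([0.1, 0.2, 0.3, 0.4, 0.6, 0.8])
      rate = np.array([2e-2, 8e-3, 3e-3, 1e-3, 3e-4, 8e-5])
      ac   = 0.05                    # critical (yield) acceleration of slope, g

      def newmark_disp(amax, ac):
          """Illustrative rigid sliding-block regression (placeholder coefficients)."""
          if ac >= amax:
              return 0.0
          r = ac / amax
          return 10 ** (0.215 + 2.341 * np.log10(1 - r)
                        - 1.438 * np.log10(r))             # displacement, cm

      d = np.array([newmark_disp(a, ac) for a in pga])
      for thr_cm in (10.0, 30.0, 100.0):
          annual = rate[d > thr_cm].sum()    # annual rate of exceeding threshold
          p50 = 1 - np.exp(-annual * 50)     # Poisson: probability in 50 years
          print(f"P(D > {thr_cm:5.0f} cm in 50 yr) = {p50:.3f}")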

  8. A Hypergraph and Arithmetic Residue-based Probabilistic Neural Network for classification in Intrusion Detection Systems.

    PubMed

    Raman, M R Gauthama; Somu, Nivethitha; Kirthivasan, Kannan; Sriram, V S Shankar

    2017-08-01

    Over the past few decades, the design of an intelligent Intrusion Detection System (IDS) remains an open challenge to the research community. Continuous efforts by the researchers have resulted in the development of several learning models based on Artificial Neural Network (ANN) to improve the performance of the IDSs. However, there exists a tradeoff with respect to the stability of ANN architecture and the detection rate for less frequent attacks. This paper presents a novel approach based on Helly property of Hypergraph and Arithmetic Residue-based Probabilistic Neural Network (HG AR-PNN) to address the classification problem in IDS. The Helly property of Hypergraph was exploited for the identification of the optimal feature subset and the arithmetic residue of the optimal feature subset was used to train the PNN. The performance of HG AR-PNN was evaluated using KDD CUP 1999 intrusion dataset. Experimental results prove the dominance of HG AR-PNN classifier over the existing classifiers with respect to the stability and improved detection rate for less frequent attacks. Copyright © 2017 Elsevier Ltd. All rights reserved.

  9. Probabilistic soil erosion modeling using the Erosion Risk Management Tool (ERMIT) after wildfires

    Treesearch

    P. R. Robichaud; W. J. Elliot; J. W. Wagenbrenner

    2011-01-01

    The decision of whether or not to apply post-fire hillslope erosion mitigation treatments, and if so, where these treatments are most needed, is a multi-step process. Land managers must assess the risk of damaging runoff and sediment delivery events occurring on the unrecovered burned hillslope. We developed the Erosion Risk Management Tool (ERMiT) to address this need...

  10. A probabilistic approach for shallow rainfall-triggered landslide modeling at basin scale. A case study in the Luquillo Forest, Puerto Rico

    NASA Astrophysics Data System (ADS)

    Dialynas, Y. G.; Arnone, E.; Noto, L. V.; Bras, R. L.

    2013-12-01

    Slope stability depends on geotechnical and hydrological factors that exhibit wide natural spatial variability, yet sufficient measurements of the related parameters are rarely available over entire study areas. The uncertainty associated with the inability to fully characterize hydrologic behavior has an impact on any attempt to model landslide hazards. This work suggests a way to systematically account for this uncertainty in coupled distributed hydrological-stability models for shallow landslide hazard assessment. A probabilistic approach for the prediction of rainfall-triggered landslide occurrence at basin scale was implemented in an existing distributed eco-hydrological and landslide model, tRIBS-VEGGIE-Landslide (Triangulated Irregular Network (TIN)-based Real-time Integrated Basin Simulator - VEGetation Generator for Interactive Evolution). More precisely, we upgraded tRIBS-VEGGIE-Landslide to assess the likelihood of shallow landslides by accounting for uncertainty related to the geotechnical and hydrological factors that directly affect slope stability. Natural variability of geotechnical soil characteristics was considered by randomizing soil cohesion and friction angle. Hydrological uncertainty related to the estimation of matric suction was taken into account by considering soil retention parameters as correlated random variables. The probability of failure is estimated through an assumed theoretical Factor of Safety (FS) distribution, conditioned on soil moisture content. At each cell, the temporally varying FS statistics are approximated by the First Order Second Moment (FOSM) method, as a function of the parameters' statistical properties. The model was applied to the Rio Mameyes Basin, located in the Luquillo Experimental Forest in Puerto Rico, where previous landslide analyses have been carried out. At each time step, model outputs include the probability of landslide occurrence across the basin and the most probable depth of failure at each soil column. The proposed probabilistic approach for shallow landslide prediction is able to reveal and quantify landslide risk on slopes assessed as stable by simpler deterministic methods.
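
    A single-cell sketch of the FOSM step: propagate the means and standard deviations of cohesion and friction angle through an infinite-slope factor of safety via numerical first derivatives, then read P(FS < 1) from a normal approximation. All parameter values are hypothetical, and the pore pressure, fixed here, would come from the hydrologic model in practice:

      import numpy as np
      from scipy.stats import norm

      gamma, z, beta = 19e3, 2.0, np.radians(30)   # unit wt (N/m^3), depth (m), slope
      u = 5e3                                      # pore pressure (Pa), assumed

      def fs(c, phi):
          """Infinite-slope factor of safety."""
          tau_r = c + (gamma * z * np.cos(beta) ** 2 - u) * np.tan(phi)
          return tau_r / (gamma * z * np.sin(beta) * np.cos(beta))

      mu  = np.array([8e3, np.radians(32)])        # mean cohesion, friction angle
      sig = np.array([2e3, np.radians(3)])         # their standard deviations

      # FOSM: mean at mu, variance from numerical gradients (inputs independent).
      g = np.zeros(2)
      for i in range(2):
          d = np.zeros(2); d[i] = 1e-6 * max(abs(mu[i]), 1.0)
          g[i] = (fs(*(mu + d)) - fs(*(mu - d))) / (2 * d[i])
      fs_mean = fs(*mu)
      fs_sd   = np.sqrt(np.sum((g * sig) ** 2))

      print(f"FS mean {fs_mean:.2f}, sd {fs_sd:.2f}, "
            f"P(FS<1) = {norm.cdf((1 - fs_mean) / fs_sd):.3f}")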

  11. ATM phosphorylation of Mdm2 Ser394 regulates the amplitude and duration of the DNA damage response in mice

    PubMed Central

    Gannon, Hugh S.; Woda, Bruce A.; Jones, Stephen N.

    2012-01-01

    Summary DNA damage induced by ionizing radiation (IR) activates the ATM kinase, which subsequently stabilizes and activates the p53 tumor suppressor protein. Although phosphorylation of p53 by ATM was found previously to modulate p53 levels and transcriptional activities in vivo, it does not appear to be a major regulator of p53 stability. We have utilized mice bearing altered Mdm2 alleles to demonstrate that ATM phosphorylation of Mdm2 serine 394 is required for robust p53 stabilization and activation after DNA damage. In addition, we demonstrate that dephosphorylation of Mdm2 Ser394 regulates attenuation of the p53-mediated response to DNA damage. Therefore, the phosphorylation status of Mdm2 Ser394 governs p53 protein levels and functions in cells undergoing DNA damage. PMID:22624716

  12. Damage tolerance certification of a fighter horizontal stabilizer

    NASA Astrophysics Data System (ADS)

    Huang, Jia-Yen; Tsai, Ming-Yang; Chen, Jong-Sheng; Ong, Ching-Long

    1995-05-01

    A review of the program for the damage tolerance certification test of a composite horizontal stabilizer (HS) of a fighter is presented. The object of this program is to certify that the fatigue life and damage tolerance strength of a damaged composite horizontal stabilizer meets the design requirements. According to the specification for damage tolerance certification, a test article should be subjected to two design lifetimes of flight-by-flight load spectra simulating the in-service fatigue loading condition for the aircraft. However, considering the effect of environmental change on the composite structure, one additional lifetime test was performed. In addition, to evaluate the possibilities for extending the service life of the structure, one more lifetime test was carried out with the spectrum increased by a factor of 1.4. To assess the feasibility and reliability of repair technology on a composite structure, two damaged areas were repaired after two lifetimes of damage tolerance test. On completion of four lifetimes of the damage tolerance test, the static residual strength was measured to check whether structural strength after repair met the requirements. Stiffness and static strength of the composite HS with and without damage were evaluated and compared.

  13. Methodological framework for the probabilistic risk assessment of multi-hazards at a municipal scale: a case study in the Fella river valley, Eastern Italian Alps

    NASA Astrophysics Data System (ADS)

    Hussin, Haydar; van Westen, Cees; Reichenbach, Paola

    2013-04-01

    Local and regional authorities in mountainous areas that deal with hydro-meteorological hazards like landslides and floods try to set aside budgets for emergencies and risk mitigation. However, future losses are often not calculated in a probabilistic manner when allocating budgets or determining how much risk is acceptable. The absence of probabilistic risk estimates can create a lack of preparedness for reconstruction and risk reduction costs and a deficiency in promoting risk mitigation and prevention in an effective way. The probabilistic risk of natural hazards at local scale is usually ignored altogether due to the difficulty in acknowledging, processing and incorporating uncertainties in the estimation of losses (e.g. physical damage, fatalities and monetary loss). This study attempts to set up a working framework for a probabilistic risk assessment (PRA) of landslides and floods at a municipal scale, using the Fella river valley (Eastern Italian Alps) as a multi-hazard case study area. The emphasis is on the evaluation and determination of the uncertainty in the estimation of losses from multi-hazards. The framework consists of three steps: (1) using physically based stochastic landslide and flood models, we calculate the probability of physical impact on individual elements at risk; (2) this is then combined with a statistical analysis of the vulnerability and monetary value of the elements at risk, in order to include their uncertainty in the risk assessment; (3) finally, the uncertainty from each risk component is propagated into the loss estimation. The combined effect of landslides and floods on the direct risk to communities in narrow alpine valleys is also one of the important aspects that needs to be studied.

  14. The sampled-data consensus of multi-agent systems with probabilistic time-varying delays and packet losses

    NASA Astrophysics Data System (ADS)

    Sui, Xin; Yang, Yongqing; Xu, Xianyun; Zhang, Shuai; Zhang, Lingzhong

    2018-02-01

    This paper investigates the consensus of multi-agent systems with probabilistic time-varying delays and packet losses via sampled-data control. On the one hand, a Bernoulli-distributed white sequence is employed to model random packet losses among agents. On the other hand, a switched system is used to describe packet dropouts in a deterministic way. Based on the special property of the Laplacian matrix, the consensus problem can be converted into a stabilization problem of a switched system with lower dimensions. Some mean square consensus criteria are derived in terms of constructing an appropriate Lyapunov function and using linear matrix inequalities (LMIs). Finally, two numerical examples are given to show the effectiveness of the proposed method.

  15. THE LIQUEFACTION RISK ANALYSIS OF CEMENT-TREATED SANDY GROUND CONSIDERING THE SPATIAL VARIABILITY OF SOIL STRENGTH

    NASA Astrophysics Data System (ADS)

    Kataoka, Norio; Kasama, Kiyonobu; Zen, Kouki; Chen, Guangqi

    This paper presents a probabilistic method for assessing the liquefaction risk of cement-treated ground, an anti-liquefaction ground improvement achieved by cement-mixing. In this study, the liquefaction potential of cement-treated ground is analyzed statistically using Monte Carlo simulation based on nonlinear earthquake response analysis, considering the spatial variability of soil properties. The seismic bearing capacity of partially liquefied ground is analyzed in order to estimate the damage costs induced by partial liquefaction. Finally, the annual liquefaction risk is calculated by multiplying the liquefaction potential by the damage costs. The results indicated that the proposed method makes it possible to evaluate the probability of liquefaction and to estimate the damage costs using the hazard curve, the liquefaction-induced fragility curve, and the liquefaction risk curve.
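
    The risk = (rate) x (fragility) x (cost) chain can be sketched in a few lines: lump the spatial variability of treated-soil strength into a lognormal multiplier on the factor of safety against liquefaction, estimate the fragility by Monte Carlo, and multiply by the event rate and damage cost. Every number below is an assumption for illustration:

      import numpy as np

      rng = np.random.default_rng(7)
      n_sim = 10_000

      # Hypothetical inputs: nominal FS against liquefaction, strength COV,
      # annual rate of the design ground motion, damage cost if it occurs.
      fs_nominal, cov = 1.3, 0.35
      annual_rate, damage_cost = 1 / 475, 5e6     # events/yr, currency units

      sigma_ln = np.sqrt(np.log(1 + cov ** 2))    # lognormal shape from COV
      fs = fs_nominal * rng.lognormal(mean=0.0, sigma=sigma_ln, size=n_sim)

      p_liq = np.mean(fs < 1.0)                   # fragility at this motion level
      annual_risk = annual_rate * p_liq * damage_cost
      print(f"P(liquefaction | design motion) = {p_liq:.3f}")
      print(f"annual risk ~ {annual_risk:,.0f} per year")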

  16. Stability metrics for multi-source biomedical data based on simplicial projections from probability distribution distances.

    PubMed

    Sáez, Carlos; Robles, Montserrat; García-Gómez, Juan M

    2017-02-01

    Biomedical data may be composed of individuals generated from distinct, meaningful sources. Due to possible contextual biases in the processes that generate data, there may exist an undesirable and unexpected variability among the probability distribution functions (PDFs) of the source subsamples, which, when uncontrolled, may lead to inaccurate or unreproducible research results. Classical statistical methods may have difficulty uncovering such variability when dealing with multi-modal, multi-type, multi-variate data. This work proposes two metrics for the analysis of stability among multiple data sources, robust to the aforementioned conditions, and defined in the context of data quality assessment. Specifically, a global probabilistic deviation metric and a source probabilistic outlyingness metric are proposed. The first provides a bounded degree of the global multi-source variability, designed as an estimator equivalent to the notion of normalized standard deviation of PDFs. The second provides a bounded degree of the dissimilarity of each source to a latent central distribution. The metrics are based on the projection of a simplex geometrical structure constructed from the Jensen-Shannon distances among the source PDFs. The metrics have been evaluated and have demonstrated correct behaviour on a simulated benchmark and with real multi-source biomedical data using the UCI Heart Disease data set. Biomedical data quality assessment based on the proposed stability metrics may improve the efficiency and effectiveness of biomedical data exploitation and research.
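
    A reduced stand-in for these metrics, skipping the simplex projection: compute Jensen-Shannon distances of each source PDF to the pooled distribution as a crude "outlyingness", and average them as a crude global deviation. The histograms and source distributions are synthetic, and the pooled mean is only an approximation of the latent central distribution defined in the paper:

      import numpy as np
      from scipy.spatial.distance import jensenshannon

      # Hypothetical: histograms of one variable from three data sources.
      bins = np.linspace(0, 10, 21)
      rng = np.random.default_rng(11)
      sources = [rng.normal(5.0, 1.0, 500),
                 rng.normal(5.2, 1.0, 500),
                 rng.normal(7.0, 1.5, 500)]
      pdfs = [np.histogram(s, bins=bins, density=True)[0] + 1e-12 for s in sources]
      pdfs = [p / p.sum() for p in pdfs]

      pooled = np.mean(pdfs, axis=0)              # proxy for the central PDF
      outlyingness = [jensenshannon(p, pooled) for p in pdfs]
      global_dev = float(np.mean(outlyingness))   # crude global variability score

      print("per-source outlyingness:", np.round(outlyingness, 3))
      print("global probabilistic deviation (simplified):", round(global_dev, 3))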

  17. A spatio-temporal model for probabilistic seismic hazard zonation of Tehran

    NASA Astrophysics Data System (ADS)

    Hashemi, Mahdi; Alesheikh, Ali Asghar; Zolfaghari, Mohammad Reza

    2013-08-01

    A precondition for all disaster management steps, building damage prediction, and construction code development is a hazard assessment that shows the exceedance probabilities of different ground motion levels at a site, considering different near- and far-field earthquake sources. The seismic sources are usually categorized as time-independent area sources and time-dependent fault sources. While the former incorporates the small and medium events, the latter takes into account only the large characteristic earthquakes. In this article, a probabilistic approach is proposed to aggregate the effects of time-dependent and time-independent sources on seismic hazard. The methodology is then applied to generate three probabilistic seismic hazard maps of Tehran for 10%, 5%, and 2% exceedance probabilities in 50 years. The results indicate an increase in peak ground acceleration (PGA) values toward the southeastern part of the study area, and the PGA variations are mostly controlled by the shear wave velocities across the city. In addition, the implementation of the methodology takes advantage of GIS capabilities, especially raster-based analyses and representations. During the estimation of the PGA exceedance rates, the emphasis has been placed on incorporating the effects of different attenuation relationships and seismic source models by using a logic tree.
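
    The aggregation across source types and logic-tree branches reduces to a weighted sum of exceedance rates. In the sketch below, the two ground-motion models are arbitrary placeholder functions (not real attenuation relations), and the source rates, magnitudes, and distances are invented:

      import numpy as np
      from scipy.stats import norm

      # Placeholder ground-motion models (NOT real attenuation relations).
      gmm_a = lambda M, R: np.exp(0.60 * M - 1.3 * np.log(R + 10.0) - 2.0)  # PGA, g
      gmm_b = lambda M, R: np.exp(0.55 * M - 1.2 * np.log(R + 8.0) - 2.1)
      branches = [(0.6, gmm_a), (0.4, gmm_b)]     # logic-tree branch weights

      # Sources as (annual rate, magnitude, distance km): an area source for
      # small/medium events and a fault source for the large characteristic one.
      sources = [(0.05, 6.0, 15.0), (0.002, 7.4, 8.0)]

      def exceedance_rate(a_target, sigma_ln=0.6):
          rate = 0.0
          for nu, M, R in sources:
              for w, gmm in branches:
                  # Lognormal aleatory variability about the median prediction.
                  p_exc = 1.0 - norm.cdf(np.log(a_target / gmm(M, R)) / sigma_ln)
                  rate += w * nu * p_exc
          return rate

      for a in (0.1, 0.2, 0.4):
          lam = exceedance_rate(a)
          print(f"PGA > {a:.1f} g: {lam:.4f}/yr, P(50 yr) = {1-np.exp(-50*lam):.3f}")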

  18. Risk assessment for construction projects of transport infrastructure objects

    NASA Astrophysics Data System (ADS)

    Titarenko, Boris

    2017-10-01

    The paper analyzes and compares different methods of risk assessment for construction projects of transport objects. The management of such projects demands the application of special probabilistic methods due to the large level of uncertainty in their implementation. Risk management in these projects requires the use of probabilistic and statistical methods. The aim of the work is to develop a methodology for using traditional methods in combination with robust methods that allow obtaining reliable risk assessments in projects. The robust approach is based on the principle of maximum likelihood and, in assessing risk, allows the researcher to obtain reliable results in situations of great uncertainty. The application of robust procedures allows a quantitative assessment of the main risk indicators of projects to be carried out when solving the tasks of managing innovation-investment projects. Any competent specialist can calculate the damage from the onset of a risky event, but assessing the probability of occurrence of a risky event requires special probabilistic methods based on the proposed robust approaches. Practice shows the effectiveness and reliability of the results. The methodology developed in the article can be used to create information technologies and to apply them in automated control systems for complex projects.

  19. Inspection Correlation Study of Ultrasonic-Based In Situ Structural Health Monitoring Monthly Report for December 2014-January 2015

    DTIC Science & Technology

    2015-05-01

    During fatigue loading, an induced ultrasonic elastic vibration (via piezoelectric transducers [PZTs]) propagates through the dogbone specimen; a receiver PZT picks up the signal, enabling inspection of fatigue crack growth in aluminum 7075-T6 dogbone specimens. Acellent Technologies, Inc., is supporting this project through providing... Subject terms: structural health monitoring, probabilistics, fatigue damage, guided waves, Lamb waves.

  20. Probabilistic Risk Assessment of a Turbine Disk

    NASA Astrophysics Data System (ADS)

    Carter, Jace A.; Thomas, Michael; Goswami, Tarun; Fecke, Ted

    Current Federal Aviation Administration (FAA) rotor design certification practice performs risk assessment using a probabilistic framework focused only on the life-limiting defect location of a component. This method generates conservative approximations of the operational risk. The first section of this article covers a discretization method, which allows a transition from this relative risk to an absolute risk, where the component is discretized into regions called zones. General guidelines were established for the zone-refinement process based on the stress gradient topology in order to reach risk convergence. The second section covers a risk assessment method for predicting the total fatigue life due to fatigue-induced damage. The total fatigue life incorporates a dual-mechanism approach including the crack initiation life and propagation life while simultaneously determining the associated initial flaw sizes. A microstructure-based model was employed to address uncertainties in material response and relate crack initiation life with crack size, while propagation life was characterized by large-crack growth laws. The two proposed methods were applied to a representative Inconel 718 turbine disk. The zone-based method reduces the conservatism of current approaches, while showing the effects of feature-based inspection on the risk assessment. In the fatigue damage assessment, the predicted initial crack distribution was found to be the most sensitive probabilistic parameter and can be used to establish enhanced inspection planning.

  1. Sensor Based Engine Life Calculation: A Probabilistic Perspective

    NASA Technical Reports Server (NTRS)

    Guo, Ten-Huei; Chen, Philip

    2003-01-01

    It is generally known that an engine component will accumulate damage (life usage) during its lifetime of use in a harsh operating environment. The commonly used cycle count for engine component usage monitoring has an inherent range of uncertainty which can be overly costly or potentially less safe from an operational standpoint. With the advance of computer technology, engine operation modeling, and the understanding of damage accumulation physics, it is possible (and desirable) to use the available sensor information to make a more accurate assessment of engine component usage. This paper describes a probabilistic approach to quantify the effects of engine operating parameter uncertainties on the thermomechanical fatigue (TMF) life of a selected engine part. A closed-loop engine simulation with a TMF life model is used to calculate the life consumption of different mission cycles. A Monte Carlo simulation approach is used to generate the statistical life usage profile for different operating assumptions. The probabilities of failure of different operating conditions are compared to illustrate the importance of the engine component life calculation using sensor information. The results of this study clearly show that a sensor-based life cycle calculation can greatly reduce the risk of component failure as well as extend on-wing component life by avoiding unnecessary maintenance actions.
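
    A stripped-down version of the Monte Carlo idea: sample uncertain operating parameters (sensor-informed in practice), push each sample through a life model, and read off failure probabilities for different mission lengths. The temperature spreads and the power-law/exponential life model below are invented placeholders, not the NASA TMF model:

      import numpy as np

      rng = np.random.default_rng(42)
      n = 50_000

      # Hypothetical per-cycle operating uncertainty (sensor-informed in practice):
      t_max = rng.normal(1150.0, 25.0, n)      # peak metal temperature, K
      dT    = rng.normal(600.0, 40.0, n)       # thermal strain-driving range, K

      # Invented power-law/exponential TMF life model (placeholder only):
      cycles_to_failure = 1e12 * dT ** -2.5 * np.exp(-(t_max - 1100.0) / 80.0)

      for mission_cycles in (5_000, 10_000, 20_000):
          p_fail = np.mean(cycles_to_failure < mission_cycles)
          print(f"P(failure before {mission_cycles:6d} cycles) = {p_fail:.4f}")

    Comparing such distributions for "nominal cycle counting" versus "sensor-informed parameters" is the kind of contrast the paper draws when arguing for sensor-based life usage tracking.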

  2. A probabilistic damage model of stress-induced permeability anisotropy during cataclastic flow

    NASA Astrophysics Data System (ADS)

    Zhu, Wenlu; Montési, Laurent G. J.; Wong, Teng-Fong

    2007-10-01

    A fundamental understanding of the effect of stress on permeability evolution is important for many fault mechanics and reservoir engineering problems. Recent laboratory measurements demonstrate that in the cataclastic flow regime, the stress-induced anisotropic reduction of permeability in porous rocks can be separated into 3 different stages. In the elastic regime (stage I), permeability and porosity reduction are solely controlled by the effective mean stress, with negligible permeability anisotropy. Stage II starts at the onset of shear-enhanced compaction, when a critical yield stress is attained. In stage II, the deviatoric stress exerts primary control over permeability and porosity evolution. The increase in deviatoric stress results in drastic permeability and porosity reduction and considerable permeability anisotropy. The transition from stage II to stage III takes place progressively during the development of pervasive cataclastic flow. In stage III, permeability and porosity reduction becomes gradual again, and permeability anisotropy diminishes. Microstructural observations on deformed samples using laser confocal microscopy reveal that stress-induced microcracking and pore collapse are the primary forms of damage during cataclastic flow. A probabilistic damage model is formulated to characterize the effects of stress on permeability and its anisotropy. In our model, the effects of both effective mean stress and differential stress on permeability evolution are calculated. By introducing stress sensitivity coefficients, we propose a first-order description of the dependence of permeability evolution on different loading paths. Built upon the micromechanisms of deformation in porous rocks, this unified model provides new insight into the coupling of stress and permeability.

  3. Reliability and Creep/Fatigue Analysis of a CMC Component

    NASA Technical Reports Server (NTRS)

    Murthy, Pappu L. N.; Mital, Subodh K.; Gyekenyesi, John Z.; Gyekenyesi, John P.

    2007-01-01

    High temperature ceramic matrix composites (CMC) are being explored as viable candidate materials for hot section gas turbine components. These advanced composites can potentially lead to reduced weight and enable higher operating temperatures requiring less cooling; thus leading to increased engine efficiencies. There is a need for convenient design tools that can accommodate various loading conditions and material data with their associated uncertainties to estimate the minimum predicted life as well as the failure probabilities of a structural component. This paper presents a review of the life prediction and probabilistic analyses performed for a CMC turbine stator vane. A computer code, NASALife, is used to predict the life of a 2-D woven silicon carbide fiber reinforced silicon carbide matrix (SiC/SiC) turbine stator vane due to a mission cycle which induces low cycle fatigue and creep. The output from this program includes damage from creep loading, damage due to cyclic loading and the combined damage due to the given loading cycle. Results indicate that the trends predicted by NASALife are as expected for the loading conditions used for this study. In addition, a combination of woven composite micromechanics, finite element structural analysis and Fast Probability Integration (FPI) techniques has been used to evaluate the maximum stress and its probabilistic distribution in a CMC turbine stator vane. Input variables causing scatter are identified and ranked based upon their sensitivity magnitude. Results indicate that reducing the scatter in proportional limit strength of the vane material has the greatest effect in improving the overall reliability of the CMC vane.

  4. Developing a Malaysia flood model

    NASA Astrophysics Data System (ADS)

    Haseldine, Lucy; Baxter, Stephen; Wheeler, Phil; Thomson, Tina

    2014-05-01

    Faced with growing exposures in Malaysia, insurers need models to help them assess their exposure to flood losses. The need for improved management of flood risks has been further highlighted by the 2011 floods in Thailand and recent events in Malaysia. The increasing demand for loss accumulation tools in Malaysia has led to the development of the first nationwide probabilistic Malaysia flood model, which we present here. The model is multi-peril, including river flooding for thousands of kilometres of river and rainfall-driven surface water flooding in major cities, which may cause losses equivalent to river flood in some high-density urban areas. The underlying hazard maps are based on a 30 m digital surface model (DSM) and 1D/2D hydraulic modelling in JFlow and RFlow. Key mitigation schemes such as the SMART tunnel and drainage capacities are also considered in the model. The probabilistic element of the model is driven by a stochastic event set based on rainfall data, enabling per-event and annual figures to be calculated for a specific insurance portfolio and a range of return periods. Losses are estimated via depth-damage vulnerability functions which link the insured damage to water depths for different property types in Malaysia. The model provides a unique insight into Malaysian flood risk profiles and provides insurers with return-period estimates of flood damage and loss to property portfolios through loss exceedance curve outputs. It has been successfully validated against historic flood events in Malaysia and is now being used by insurance companies in the Malaysian market to obtain reinsurance cover.
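
    The event-set-to-exceedance-curve pipeline can be sketched end to end: simulate Poisson flood occurrences per year, assign each event a depth, convert depth to loss with a vulnerability curve, and sort annual losses to read off return-period losses. The event frequency, depth distribution, and toy depth-damage curve are all synthetic assumptions, not the model's calibrated inputs:

      import numpy as np

      rng = np.random.default_rng(5)
      years = 10_000

      def depth_damage(depth_m, value=1e6):
          """Toy vulnerability curve: damage fraction grows with depth, caps at 0.8."""
          frac = np.clip(depth_m / 3.0, 0.0, 0.8)
          return frac * value

      annual_loss = np.zeros(years)
      n_events = rng.poisson(0.2, size=years)      # ~1 flood every 5 years
      for y in range(years):
          depths = rng.gamma(shape=2.0, scale=0.6, size=n_events[y])  # metres
          annual_loss[y] = depth_damage(depths).sum()

      losses = np.sort(annual_loss)[::-1]          # descending annual losses
      for rp in (10, 100, 1000):                   # return-period losses
          print(f"{rp:5d}-yr loss ~ {losses[years // rp]:12,.0f}")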

  5. Life prediction of different commercial dental implants as influenced by uncertainties in their fatigue material properties and loading conditions.

    PubMed

    Pérez, M A

    2012-12-01

    Probabilistic analyses allow the effect of uncertainty in system parameters to be determined. In the literature, many researchers have investigated static loading effects on dental implants. However, the intrinsic variability and uncertainty of most of the main problem parameters are not accounted for. The objective of this research was to apply a probabilistic computational approach to predict the fatigue life of three different commercial dental implants considering the variability and uncertainty in their fatigue material properties and loading conditions. For one of the commercial dental implants, the influence of its diameter in the fatigue life performance was also studied. This stochastic technique was based on the combination of a probabilistic finite element method (PFEM) and a cumulative damage approach known as B-model. After 6 million of loading cycles, local failure probabilities of 0.3, 0.4 and 0.91 were predicted for the Lifecore, Avinent and GMI implants, respectively (diameter of 3.75mm). The influence of the diameter for the GMI implant was studied and the results predicted a local failure probability of 0.91 and 0.1 for the 3.75mm and 5mm, respectively. In all cases the highest failure probability was located at the upper screw-threads. Therefore, the probabilistic methodology proposed herein may be a useful tool for performing a qualitative comparison between different commercial dental implants. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.

  6. Asteroid Risk Assessment: A Probabilistic Approach.

    PubMed

    Reinhardt, Jason C; Chen, Xi; Liu, Wenhao; Manchev, Petar; Paté-Cornell, M Elisabeth

    2016-02-01

    Following the 2013 Chelyabinsk event, the risks posed by asteroids attracted renewed interest, from both the scientific and policy-making communities. It reminded the world that impacts from near-Earth objects (NEOs), while rare, have the potential to cause great damage to cities and populations. Point estimates of the risk (such as mean numbers of casualties) have been proposed, but because of the low-probability, high-consequence nature of asteroid impacts, these averages provide limited actionable information. While more work is needed to further refine its input distributions (e.g., NEO diameters), the probabilistic model presented in this article allows a more complete evaluation of the risk of NEO impacts because the results are distributions that cover the range of potential casualties. This model is based on a modularized simulation that uses probabilistic inputs to estimate probabilistic risk metrics, including those of rare asteroid impacts. Illustrative results of this analysis are presented for a period of 100 years. As part of this demonstration, we assess the effectiveness of civil defense measures in mitigating the risk of human casualties. We find that they are likely to be beneficial but not a panacea. We also compute the probability, but not the consequences, of an impact with global effects ("cataclysm"). We conclude that there is a continued need for NEO observation, and for analyses of the feasibility and risk-reduction effectiveness of space missions designed to deflect or destroy asteroids that threaten the Earth. © 2015 Society for Risk Analysis.

  7. A probabilistic atlas of the cerebellar white matter.

    PubMed

    van Baarsen, K M; Kleinnijenhuis, M; Jbabdi, S; Sotiropoulos, S N; Grotenhuis, J A; van Cappellen van Walsum, A M

    2016-01-01

    Imaging of the cerebellar cortex, deep cerebellar nuclei and their connectivity is gaining attention, due to the important role the cerebellum plays in cognition and motor control. Atlases of the cerebellar cortex and nuclei are used to locate regions of interest in clinical and neuroscience studies. However, the white matter that connects these relay stations is of at least similar functional importance. Damage to these cerebellar white matter tracts may lead to serious language, cognitive and emotional disturbances, although the pathophysiological mechanism behind it is still debated. Differences in white matter integrity between patients and controls might shed light on structure-function correlations. A probabilistic parcellation atlas of the cerebellar white matter would help these studies by facilitating automatic segmentation of the cerebellar peduncles, the localization of lesions and the comparison of white matter integrity between patients and controls. In this work a digital three-dimensional probabilistic atlas of the cerebellar white matter is presented, based on high-quality 3T, 1.25 mm resolution diffusion MRI data from 90 subjects participating in the Human Connectome Project. The white matter tracts were estimated using probabilistic tractography. Results over 90 subjects were symmetrical, and trajectories of the superior, middle and inferior cerebellar peduncles resembled the anatomy as known from anatomical studies. This atlas will contribute to a better understanding of cerebellar white matter architecture. It may eventually aid in defining structure-function correlations in patients with cerebellar disorders. Copyright © 2015 Elsevier Inc. All rights reserved.

  8. Passenger Vessel Damage Stability Study for 1990 SOLAS Amendments. Volume 2. Appendix B.

    DOT National Transportation Integrated Search

    1994-09-01

    The application of new damage stability requirements in the 1990 Safety of Life at Sea (SOLAS) amendments to the United States domestic passenger fleet is investigated. The amendments specify new minimums for positive range, righting energy, and down...

  9. Passenger vessel damage stability study for the 1990 SOLAS amendments, volume 1

    DOT National Transportation Integrated Search

    1994-09-01

    The application of new damage stability requirements in the 1990 Safety of Life at Sea (SOLAS) amendments to the United States domestic passenger fleet is investigated. The amendments specify new minimums for positive range, righting energy, and down...

  10. Analyzing the reliability of mechanical parts in 10 kV aerial transmission lines under ice-coating and wind effects in view of their design features

    NASA Astrophysics Data System (ADS)

    Doletskaya, L. I.; Solopov, R. V.; Kavchenkov, V. P.; Andreenkov, E. S.

    2017-12-01

    The physical features of damage to 10 kV aerial lines under ice and wind loads are examined. Mathematical models for estimating the reliability of the mechanical parts of aerial lines are described, applying analytical theoretical methods together with corresponding mathematical models that take into account the probabilistic nature of ice and wind loads. Calculation results on reliability, specific damage, and average restoration time in case of emergency outages of 10 kV aerial transmission lines using uninsulated and protected wires are presented.

  11. Characterizing Fracturing of Clay-Rich Lower Watrous Rock: From Laboratory Experiments to Nonlocal Damage-Based Simulations

    NASA Astrophysics Data System (ADS)

    Guy, N.; Seyedi, D. M.; Hild, F.

    2018-06-01

    The work presented herein aims at characterizing and modeling fracturing (i.e., initiation and propagation of cracks) in a clay-rich rock. The analysis is based on two experimental campaigns. The first one relies on a probabilistic analysis of crack initiation considering Brazilian and three-point flexural tests. The second one involves digital image correlation to characterize crack propagation. A nonlocal damage model based on stress regularization is used for the simulations. Two thresholds both based on regularized stress fields are considered. They are determined from the experimental campaigns performed on Lower Watrous rock. The results obtained with the proposed approach are favorably compared with the experimental results.

  12. A Bayesian state-space approach for damage detection and classification

    NASA Astrophysics Data System (ADS)

    Dzunic, Zoran; Chen, Justin G.; Mobahi, Hossein; Büyüköztürk, Oral; Fisher, John W.

    2017-11-01

    The problem of automatic damage detection in civil structures is complex and requires a system that can interpret collected sensor data into meaningful information. We apply our recently developed switching Bayesian model for dependency analysis to the problems of damage detection and classification. The model relies on a state-space approach that accounts for noisy measurement processes and missing data, which also infers the statistical temporal dependency between measurement locations signifying the potential flow of information within the structure. A Gibbs sampling algorithm is used to simultaneously infer the latent states, parameters of the state dynamics, the dependence graph, and any changes in behavior. By employing a fully Bayesian approach, we are able to characterize uncertainty in these variables via their posterior distribution and provide probabilistic estimates of the occurrence of damage or a specific damage scenario. We also implement a single class classification method which is more realistic for most real world situations where training data for a damaged structure is not available. We demonstrate the methodology with experimental test data from a laboratory model structure and accelerometer data from a real world structure during different environmental and excitation conditions.

  13. Baseline-free damage detection in composite plates based on the reciprocity principle

    NASA Astrophysics Data System (ADS)

    Huang, Liping; Zeng, Liang; Lin, Jing

    2018-01-01

    Lamb wave based damage detection techniques have been widely used in composite structures. However, these techniques usually rely on reference signals, which are significantly influenced by operational and environmental conditions. To address this issue, this paper presents a baseline-free damage inspection method based on the reciprocity principle. If a localized nonlinear scatterer exists along the wave path, reciprocity breaks down; by estimating the loss of reciprocity, delamination can be detected. A reciprocity index (RI), which quantifies the discrepancy between the signal received at transducer B when emitting from transducer A and the signal received at A when the same source is located at B, is established to analyze the reciprocity quantitatively. Experimental results show that the RI value of a damaged path is much higher than that of a healthy path. In addition, the effects of the excitation signal parameters (i.e., central frequency and bandwidth) and the position of the delamination on the RI value are discussed. Furthermore, an RI-based probabilistic imaging algorithm is proposed for detecting delamination damage in composite plates without reference signals. Finally, the effectiveness of this baseline-free damage detection method is validated by an experimental example.
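
    A minimal sketch of one plausible RI definition, the normalized discrepancy between the two transmission directions; the paper's exact normalization may differ, and the tone burst and synthetic "damage" perturbation below are assumptions:

      import numpy as np

      def reciprocity_index(s_ab, s_ba):
          """Normalized discrepancy between A->B and B->A guided-wave signals.

          Near zero for a healthy (reciprocal) path; grows when a nonlinear
          scatterer such as a delamination breaks reciprocity.
          """
          s_ab, s_ba = np.asarray(s_ab, float), np.asarray(s_ba, float)
          return np.linalg.norm(s_ab - s_ba) / np.linalg.norm(s_ab + s_ba)

      t = np.linspace(0, 1e-4, 2000)
      tone = np.sin(2 * np.pi * 150e3 * t) * np.hanning(t.size)   # 150 kHz burst
      healthy = reciprocity_index(tone, tone + 1e-4 * np.random.randn(t.size))
      damaged = reciprocity_index(tone, tone + 0.05 * np.roll(tone, 40))
      print(f"RI healthy {healthy:.4f} vs damaged path {damaged:.4f}")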

  14. Stability and Multiattractor Dynamics of a Toggle Switch Based on a Two-Stage Model of Stochastic Gene Expression

    PubMed Central

    Strasser, Michael; Theis, Fabian J.; Marr, Carsten

    2012-01-01

    A toggle switch consists of two genes that mutually repress each other. This regulatory motif is active during cell differentiation and is thought to act as a memory device, being able to choose and maintain cell fate decisions. Commonly, this switch has been modeled in a deterministic framework where transcription and translation are lumped together. In this description, bistability occurs for transcription factor cooperativity, whereas autoactivation leads to a tristable system with an additional undecided state. In this contribution, we study the stability and dynamics of a two-stage gene expression switch within a probabilistic framework inspired by the properties of the Pu/Gata toggle switch in myeloid progenitor cells. We focus on low mRNA numbers, high protein abundance, and monomeric transcription-factor binding. Contrary to the expectation from a deterministic description, this switch shows complex multiattractor dynamics without autoactivation and cooperativity. Most importantly, the four attractors of the system, which only emerge in a probabilistic two-stage description, can be identified with committed and primed states in cell differentiation. To begin, we study the dynamics of the system and infer the mechanisms that move the system between attractors using both the quasipotential and the probability flux of the system. Next, we show that the residence times of the system in one of the committed attractors are geometrically distributed. We derive an analytical expression for the parameter of the geometric distribution, therefore completely describing the statistics of the switching process and elucidate the influence of the system parameters on the residence time. Moreover, we find that the mean residence time increases linearly with the mean protein level. This scaling also holds for a one-stage scenario and for autoactivation. Finally, we study the implications of this distribution for the stability of a switch and discuss the influence of the stability on a specific cell differentiation mechanism. Our model explains lineage priming and proposes the need of either high protein numbers or long-term modifications such as chromatin remodeling to achieve stable cell fate decisions. Notably, we present a system with high protein abundance that nevertheless requires a probabilistic description to exhibit multistability, complex switching dynamics, and lineage priming. PMID:22225794

  15. A Probabilistic, Facility-Centric Approach to Lightning Strike Location

    NASA Technical Reports Server (NTRS)

    Huddleston, Lisa L.; Roeder, William p.; Merceret, Francis J.

    2012-01-01

    A new probabilistic facility-centric approach to lightning strike location has been developed. This process uses the bivariate Gaussian distribution of probability density provided by the current lightning location error ellipse for the most likely location of a lightning stroke and integrates it to determine the probability that the stroke is inside any specified radius of any location, even if that location is not centered on or even within the location error ellipse. This technique is adapted from a method of calculating the probability of debris collision with spacecraft. Such a technique is important in spaceport processing activities because it allows engineers to quantify the risk of induced-current damage to critical electronics due to nearby lightning strokes. This technique was tested extensively and is now in use by space launch organizations at Kennedy Space Center and Cape Canaveral Air Force Station. Future applications could include forensic meteorology.
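
    The integration step is easy to approximate numerically: sample the bivariate Gaussian implied by the error ellipse and count samples falling within the radius of interest around the facility. This Monte Carlo version is a stand-in for the integration used operationally, and the covariance and facility coordinates are hypothetical:

      import numpy as np

      def p_within_radius(mu, cov, center, radius, n=1_000_000, seed=0):
          """P(stroke within `radius` of `center`) for a bivariate Gaussian
          location error with mean `mu` and covariance `cov` (Monte Carlo)."""
          rng = np.random.default_rng(seed)
          pts = rng.multivariate_normal(mu, cov, size=n)
          return np.mean(np.hypot(*(pts - center).T) <= radius)

      mu     = np.array([0.0, 0.0])                      # reported stroke (km)
      cov    = np.array([[0.25, 0.10], [0.10, 0.09]])    # error-ellipse covariance
      center = np.array([0.4, 0.3])                      # facility, off-axis

      print(f"P(stroke within 0.5 km of facility) = "
            f"{p_within_radius(mu, cov, center, 0.5):.3f}")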

  16. Study of VLCC tanker ship damage stability during off-shore operation

    NASA Astrophysics Data System (ADS)

    Hanzu-Pazara, R.; Arsenie, P.; Duse, A.; Varsami, C.

    2016-08-01

    Today, larger tanker ships, especially of the VLCC class, are used for the carriage of crude oil by sea. Operating this type of ship often requires special conditions, mainly related to water depth in the terminal area and sufficient maneuvering space for entrance and departure. Because many ports around the world cannot accommodate these ships at dedicated inner-harbor oil terminals, they have opted to develop outside terminals, i.e., off-shore oil terminals. Such terminals solve the problems of water depth and maneuvering space, but they raise other issues concerning operational safety and the impact of environmental factors on the ship while it is moored at the oil transfer buoy. The present paper describes a simulation-based study of a VLCC-class tanker in a damage condition resulting from a possible collision with another ship during a loading operation at an off-shore terminal. From the outset, we require the ship's intact stability in all possible loading conditions to be high enough that, if damage floods various compartments through a hypothetical hull breach, the ship's stability in the final stage of flooding satisfies the damage stability requirements as well as the complementary requirements for damaged-ship stability.

  17. Damage prognosis: the future of structural health monitoring.

    PubMed

    Farrar, Charles R; Lieven, Nick A J

    2007-02-15

    This paper concludes the theme issue on structural health monitoring (SHM) by discussing the concept of damage prognosis (DP). DP attempts to forecast system performance by assessing the current damage state of the system (i.e. SHM), estimating the future loading environments for that system, and predicting through simulation and past experience the remaining useful life of the system. The successful development of a DP capability will require the further development and integration of many technology areas including both measurement/processing/telemetry hardware and a variety of deterministic and probabilistic predictive modelling capabilities, as well as the ability to quantify the uncertainty in these predictions. The multidisciplinary and challenging nature of the DP problem, its current embryonic state of development, and its tremendous potential for life-safety and economic benefits qualify DP as a 'grand challenge' problem for engineers in the twenty-first century.

  18. Coupling of Bayesian Networks with GIS for wildfire risk assessment on natural and agricultural areas of the Mediterranean

    NASA Astrophysics Data System (ADS)

    Scherb, Anke; Papakosta, Panagiota; Straub, Daniel

    2014-05-01

    Wildfires cause severe damage to ecosystems, socio-economic assets, and human lives in the Mediterranean. To facilitate coping with wildfire risks, an understanding of the factors influencing wildfire occurrence and behavior (e.g. human activity, weather conditions, topography, fuel loads) and their interaction is important, as is the implementation of this knowledge in improved wildfire hazard and risk prediction systems. In this project, a probabilistic wildfire risk prediction model is developed, with integrated prediction of fire occurrence probability, fire propagation probability, and potential impact on natural and cultivated areas. Bayesian Networks (BNs) are used to facilitate the probabilistic modeling. The final BN model is a spatial-temporal prediction system at the meso scale (1 km2 spatial and 1 day temporal resolution). The modeled consequences account for potential restoration costs and production losses for forests, agriculture, and (semi-)natural areas. BNs and a geographic information system (GIS) are coupled within this project to support semi-automated BN parameter learning and spatial-temporal risk prediction. The coupling also enables the visualization of prediction results by means of daily maps. The BN parameters are learnt for Cyprus with data from 2006-2009; data from 2010 is used as the validation data set. A special focus is put on the performance evaluation of the BN for fire occurrence, which is modeled as a binary classifier and thus could be validated by means of Receiver Operating Characteristic (ROC) curves. With the final best models, AUC values of more than 70% were achieved in validation, which indicates the potential for reliable prediction performance via BNs. Maps of selected days in 2010 are shown to illustrate the final prediction results. The resulting system can easily be expanded to predict additional expected damage at the meso scale (e.g. building and infrastructure damage). The system can support the planning of preventive measures (e.g. state resource allocation for wildfire prevention and preparedness) and assist recuperation plans for damaged areas.

  19. 76 FR 13072 - Airworthiness Directives; Saab AB, Saab Aerosystems Model SAAB 2000 Airplanes

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-03-10

    ... important to the structural integrity of the horizontal stabilizer. Corrosion damage in these areas, if not... structural integrity of the horizontal stabilizer. Corrosion damage in these areas, if not detected and... convoluted tubing on the harness, applying corrosion prevention compound to the inspected area, making sure...

  20. 75 FR 77796 - Airworthiness Directives; Saab AB, Saab Aerosystems Model SAAB 2000 Airplanes

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-12-14

    ... of the horizontal stabilizer. Corrosion damage in these areas, if not detected and corrected, can... of the horizontal stabilizer. Corrosion damage in these areas, if not detected and corrected, can... convoluted tubing on the harness, applying corrosion prevention compound to the inspected area, making sure...

  1. 33 CFR 155.245 - Damage stability information for inland oil barges.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 33 Navigation and Navigable Waters 2 2011-07-01 2011-07-01 false Damage stability information for inland oil barges. 155.245 Section 155.245 Navigation and Navigable Waters COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) POLLUTION OIL OR HAZARDOUS MATERIAL POLLUTION PREVENTION REGULATIONS FOR...

  2. 33 CFR 155.245 - Damage stability information for inland oil barges.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 33 Navigation and Navigable Waters 2 2010-07-01 2010-07-01 false Damage stability information for inland oil barges. 155.245 Section 155.245 Navigation and Navigable Waters COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) POLLUTION OIL OR HAZARDOUS MATERIAL POLLUTION PREVENTION REGULATIONS FOR...

  3. 33 CFR 155.240 - Damage stability information for oil tankers and offshore oil barges.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 33 Navigation and Navigable Waters 2 2011-07-01 2011-07-01 false Damage stability information for oil tankers and offshore oil barges. 155.240 Section 155.240 Navigation and Navigable Waters COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) POLLUTION OIL OR HAZARDOUS MATERIAL POLLUTION...

  4. 33 CFR 155.240 - Damage stability information for oil tankers and offshore oil barges.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 33 Navigation and Navigable Waters 2 2010-07-01 2010-07-01 false Damage stability information for oil tankers and offshore oil barges. 155.240 Section 155.240 Navigation and Navigable Waters COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) POLLUTION OIL OR HAZARDOUS MATERIAL POLLUTION...

  5. 33 CFR 157.21 - Subdivision and stability.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    .... vessel must meet the following subdivision and damage stability criteria after assuming side and bottom damages, as defined in appendix B of this part. A U.S. vessel that meets the requirements in this section... account sinkage, heel, and trim, must be below the lower edge of an opening through which progressive...

  6. 33 CFR 157.21 - Subdivision and stability.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    .... vessel must meet the following subdivision and damage stability criteria after assuming side and bottom damages, as defined in appendix B of this part. A U.S. vessel that meets the requirements in this section... account sinkage, heel, and trim, must be below the lower edge of an opening through which progressive...

  7. 33 CFR 157.21 - Subdivision and stability.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    .... vessel must meet the following subdivision and damage stability criteria after assuming side and bottom damages, as defined in appendix B of this part. A U.S. vessel that meets the requirements in this section... account sinkage, heel, and trim, must be below the lower edge of an opening through which progressive...

  8. 33 CFR 155.240 - Damage stability information for oil tankers and offshore oil barges.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 33 Navigation and Navigable Waters 2 2012-07-01 2012-07-01 false Damage stability information for oil tankers and offshore oil barges. 155.240 Section 155.240 Navigation and Navigable Waters COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) POLLUTION OIL OR HAZARDOUS MATERIAL POLLUTION...

  9. 33 CFR 155.245 - Damage stability information for inland oil barges.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 33 Navigation and Navigable Waters 2 2012-07-01 2012-07-01 false Damage stability information for inland oil barges. 155.245 Section 155.245 Navigation and Navigable Waters COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) POLLUTION OIL OR HAZARDOUS MATERIAL POLLUTION PREVENTION REGULATIONS FOR...

  10. Tracking composite material damage evolution using Bayesian filtering and flash thermography data

    NASA Astrophysics Data System (ADS)

    Gregory, Elizabeth D.; Holland, Steve D.

    2016-05-01

    We propose a method for tracking the condition of a composite part using Bayesian filtering of flash thermography data over the lifetime of the part. In this demonstration, composite panels were fabricated; impacted to induce subsurface delaminations; and loaded in compression over multiple time steps, causing the delaminations to grow in size. Flash thermography data was collected between each damage event to serve as a time history of the part. The flash thermography indicated some areas of damage but provided little additional information as to the exact nature or depth of the damage. Computed tomography (CT) data was also collected after each damage event and provided a high resolution volume model of damage that acted as truth. After each cycle, the condition estimate, from the flash thermography data and the Bayesian filter, was compared to 'ground truth'. The Bayesian process builds on the lifetime history of flash thermography scans and can give better estimates of material condition as compared to the most recent scan alone, which is common practice in the aerospace industry. Bayesian inference provides probabilistic estimates of damage condition that are updated as each new set of data becomes available. The method was tested on simulated data and then on an experimental data set.
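
    The filtering idea can be illustrated with a minimal discrete Bayes filter: maintain a posterior over delamination size, predict growth after each load event, and update with a noisy thermography size estimate. The growth kernel and noise level below are illustrative assumptions, not the paper's models.

    ```python
    # Minimal sketch of Bayesian filtering of damage size. A discrete
    # posterior over delamination size is propagated through a Gaussian
    # growth kernel (predict) and conditioned on a noisy measurement
    # (update). All models and numbers are illustrative.
    import numpy as np

    sizes = np.linspace(0, 50, 101)             # candidate damage sizes (mm)
    belief = np.ones_like(sizes) / sizes.size   # flat prior

    def predict(belief, growth_mm=2.0, spread=1.5):
        # kernel[i, j] = P(next size = sizes[i] | current size = sizes[j])
        shifted = sizes[:, None] - (sizes[None, :] + growth_mm)
        kernel = np.exp(-0.5 * (shifted / spread) ** 2)
        kernel /= kernel.sum(axis=0, keepdims=True)
        return kernel @ belief

    def update(belief, measurement, noise=3.0):
        likelihood = np.exp(-0.5 * ((measurement - sizes) / noise) ** 2)
        posterior = likelihood * belief
        return posterior / posterior.sum()

    for meas in [12.0, 15.5, 17.2]:             # thermography estimates per cycle
        belief = predict(belief)
        belief = update(belief, meas)
    print("MAP damage size:", sizes[np.argmax(belief)], "mm")
    ```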

  11. Probabilistic Damage Characterization Using the Computationally-Efficient Bayesian Approach

    NASA Technical Reports Server (NTRS)

    Warner, James E.; Hochhalter, Jacob D.

    2016-01-01

    This work presents a computationally-efficient approach for damage determination that quantifies uncertainty in the provided diagnosis. Given strain sensor data that are polluted with measurement errors, Bayesian inference is used to estimate the location, size, and orientation of damage. This approach uses Bayes' Theorem to combine any prior knowledge an analyst may have about the nature of the damage with information provided implicitly by the strain sensor data to form a posterior probability distribution over possible damage states. The unknown damage parameters are then estimated based on samples drawn numerically from this distribution using a Markov Chain Monte Carlo (MCMC) sampling algorithm. Several modifications are made to the traditional Bayesian inference approach to provide significant computational speedup. First, an efficient surrogate model is constructed using sparse grid interpolation to replace a costly finite element model that must otherwise be evaluated for each sample drawn with MCMC. Next, the standard Bayesian posterior distribution is modified using a weighted likelihood formulation, which is shown to improve the convergence of the sampling process. Finally, a robust MCMC algorithm, Delayed Rejection Adaptive Metropolis (DRAM), is adopted to sample the probability distribution more efficiently. Numerical examples demonstrate that the proposed framework effectively provides damage estimates with uncertainty quantification and can yield orders of magnitude speedup over standard Bayesian approaches.
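
    A toy version of the sampling machinery is sketched below: plain random-walk Metropolis over a damage location and size, with a cheap closed-form stand-in for the surrogate model (the paper itself uses sparse-grid interpolation and the DRAM sampler). All models and numbers are illustrative.

    ```python
    # Illustrative Metropolis sampler for damage parameters. A toy
    # forward model maps (location, size) to strains at 8 sensors;
    # the posterior combines uniform prior bounds with a Gaussian
    # measurement-error likelihood.
    import numpy as np

    rng = np.random.default_rng(1)

    def forward(loc, size, x=np.linspace(0, 1, 8)):
        # toy strain response of 8 sensors to damage at `loc` of `size`
        return size * np.exp(-((x - loc) ** 2) / 0.02)

    strain_data = forward(0.6, 2.0) + rng.normal(0, 0.05, 8)  # noisy "measurements"

    def log_post(theta, sigma=0.05):
        loc, size = theta
        if not (0 < loc < 1 and 0 < size < 5):   # uniform prior bounds
            return -np.inf
        resid = strain_data - forward(loc, size)
        return -0.5 * np.sum((resid / sigma) ** 2)

    theta, samples = np.array([0.5, 1.0]), []
    lp = log_post(theta)
    for _ in range(20000):
        prop = theta + rng.normal(0, [0.02, 0.1])  # random-walk proposal
        lp_prop = log_post(prop)
        if np.log(rng.random()) < lp_prop - lp:    # Metropolis accept/reject
            theta, lp = prop, lp_prop
        samples.append(theta)
    samples = np.array(samples)[5000:]             # discard burn-in
    print("posterior mean (loc, size):", samples.mean(axis=0))
    ```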

  12. Localization and stability in damageable amorphous solids

    NASA Astrophysics Data System (ADS)

    de Tommasi, D.; Marzano, S.; Puglisi, G.; Saccomandi, G.

    2010-01-01

    In the present article, based on a recently proposed model (De Tommasi et al. in J Rheol 50:495-512, 2006; Phys Rev Lett 100:085502, 2008), we analyze the influence of microstructure properties on the damage behavior of amorphous materials. In accordance with experimental observations, different scenarios of damage nucleation and evolution are associated with different material distributions at the microscale. In particular, we observe the possibility of uniform or localized damage and strain geometries, with a macroscopic behavior that may range from brittle to ductile or rubber-like. To illustrate the possibility of extending our stability analysis to three-dimensional damageable amorphous bodies, we consider a simple boundary value problem of engineering interest.

  13. Stability Study on Steel Structural Columns with Initial Blast Damage under High Temperatures

    NASA Astrophysics Data System (ADS)

    Baoxin, Qi; Yan, Shi; Li, Peng

    2018-03-01

    Blast loading may leave lightweight steel columns with initial damage, lowering their critical fire-resistance temperature by an amount that depends on the form and degree of the damage. The finite element analysis software ANSYS was used in this paper to analyze the fire-resistance temperature of columns with blast damage, applying a coupled thermal-structural method during the simulation. Emphasis was placed on parametric factors: the axial compression ratio, the form and degree of the initial damage, and the restraint conditions at the ends of the columns. The numerical results showed that the fire-resistance temperature decreases with increasing axial compression ratio and with the form and degree of the initial damage, and that it is also affected by the restraint conditions at the ends of the columns. A critical stress formula for columns with initial bending damage at elevated temperature was derived under the small-deflection flexural assumption; the stability coefficient was then determined and a method for evaluating the limit temperature of the column was put forward. The theoretical result was compared with that of the finite element method (FEM). Both showed that the stability capacity of the damaged columns is dramatically reduced as the temperature and the initial damage level increase.

  14. Maintenance of Genome Stability and Breast Cancer: Molecular Analysis of DNA Damage-Activated Kinases

    DTIC Science & Technology

    2008-03-01

    Breast Cancer: Molecular Analysis of DNA Damage-Activated Kinases PRINCIPAL INVESTIGATOR: Daniel Mordes...Maintenance of Genome Stability and Breast Cancer: Molecular Analysis of DNA Damage-Activated Kinases. GRANT NUMBER W81XWH-06-1-0352...shown that this domain of Dpb11 stimulates the kinase activity of wild-type Mec1-Ddc2 yet did not stimulate Mec1-ddc2-top. Thus, we have demonstrated

  15. Maximizing Statistical Power When Verifying Probabilistic Forecasts of Hydrometeorological Events

    NASA Astrophysics Data System (ADS)

    DeChant, C. M.; Moradkhani, H.

    2014-12-01

    Hydrometeorological events (i.e. floods, droughts, precipitation) are increasingly being forecasted probabilistically, owing to the uncertainties in the underlying causes of the phenomenon. In these forecasts, the probability of the event, over some lead time, is estimated based on some model simulations or predictive indicators. By issuing probabilistic forecasts, agencies may communicate the uncertainty in the event occurring. Assuming that the assigned probability of the event is correct, which is referred to as a reliable forecast, the end user may perform some risk management based on the potential damages resulting from the event. Alternatively, an unreliable forecast may give false impressions of the actual risk, leading to improper decision making when protecting resources from extreme events. Due to this requisite for reliable forecasts to perform effective risk management, this study takes a renewed look at reliability assessment in event forecasts. Illustrative experiments will be presented, showing deficiencies in the commonly available approaches (Brier Score, Reliability Diagram). Overall, it is shown that the conventional reliability assessment techniques do not maximize the ability to distinguish between a reliable and unreliable forecast. In this regard, a theoretical formulation of the probabilistic event forecast verification framework will be presented. From this analysis, hypothesis testing with the Poisson-Binomial distribution is the most exact model available for the verification framework, and therefore maximizes one's ability to distinguish between a reliable and unreliable forecast. Application of this verification system was also examined within a real forecasting case study, highlighting the additional statistical power provided with the use of the Poisson-Binomial distribution.
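
    The proposed test is straightforward to implement: with forecast probabilities p_i for n independent events, the number of occurrences follows a Poisson-Binomial distribution whose PMF can be built by repeated convolution, and an observed count deep in the tail argues against reliability. The probabilities below are synthetic.

    ```python
    # Sketch of the Poisson-Binomial reliability test. Each forecast
    # contributes one Bernoulli trial with its own probability; the
    # total event count K is Poisson-Binomial. Forecast probabilities
    # and the observed count are synthetic.
    import numpy as np

    def poisson_binomial_pmf(probs):
        pmf = np.array([1.0])                   # P(K = 0) with no trials
        for p in probs:
            pmf = np.convolve(pmf, [1 - p, p])  # add one Bernoulli trial
        return pmf

    forecast_p = np.random.default_rng(2).uniform(0.05, 0.6, size=50)
    observed_count = 25

    pmf = poisson_binomial_pmf(forecast_p)
    p_tail = pmf[observed_count:].sum()         # one-sided upper tail
    print(f"P(K >= {observed_count}) = {p_tail:.4f}")
    # a very small tail probability rejects the hypothesis that the
    # issued probabilities were reliable
    ```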

  16. Probabilistic fatigue life prediction of metallic and composite materials

    NASA Astrophysics Data System (ADS)

    Xiang, Yibing

    Fatigue is one of the most common failure modes for engineering structures, such as aircraft, rotorcraft, and aviation transports. Both metallic materials and composite materials are widely used and affected by fatigue damage. Huge uncertainties arise from material properties, measurement noise, imperfect models, future anticipated loads, and environmental conditions. These uncertainties are critical issues for accurate remaining useful life (RUL) prediction for engineering structures in service. Probabilistic fatigue prognosis considering various uncertainties is of great importance for structural safety. The objective of this study is to develop probabilistic fatigue life prediction models for metallic materials and composite materials. A fatigue model based on crack growth analysis and the equivalent initial flaw size concept is proposed for metallic materials. The developed model is then extended to include structural geometry effects (notch effect), environmental effects (corroded specimens), and manufacturing effects (shot peening). Due to their inhomogeneity and anisotropy, the fatigue model suitable for metallic materials cannot be directly applied to composite materials. A composite fatigue life prediction model is therefore proposed based on a mixed-mode delamination growth model and a stiffness degradation law. After the development of deterministic fatigue models for metallic and composite materials, a general probabilistic life prediction methodology is developed. The proposed methodology employs an efficient Inverse First-Order Reliability Method (IFORM) for the uncertainty propagation in fatigue life prediction. An equivalent stress transformation has been developed to enhance computational efficiency under realistic random amplitude loading. Finally, a systematic reliability-based maintenance optimization framework is proposed for fatigue risk management and mitigation of engineering structures.
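
    The metallic-material part of such a framework can be caricatured in a few lines: sample an equivalent initial flaw size and a Paris-law coefficient, integrate crack growth to a critical size, and collect the resulting life distribution. Parameter values below are illustrative, not those calibrated in the study.

    ```python
    # Hedged sketch of probabilistic crack-growth life prediction with
    # an equivalent initial flaw size (EIFS): integrate the Paris law
    # da/dN = C * (dK)^m from a sampled initial flaw to a critical
    # size. All parameter values are illustrative (SI units).
    import numpy as np

    rng = np.random.default_rng(3)
    m, a_crit, dsigma, Y = 3.0, 0.025, 100e6, 1.12  # exponent, m, Pa, geometry

    def cycles_to_failure(a0, C, da=1e-5):
        a, n = a0, 0.0
        while a < a_crit:
            dK = Y * dsigma * np.sqrt(np.pi * a)    # stress intensity range
            n += da / (C * dK ** m)                 # cycles to grow by da
            a += da
        return n

    lives = [cycles_to_failure(a0=rng.lognormal(np.log(2e-4), 0.3),
                               C=rng.lognormal(np.log(1e-31), 0.2))
             for _ in range(500)]
    print("median life:", np.median(lives), "cycles")
    print("1st percentile life:", np.percentile(lives, 1), "cycles")
    ```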

  17. 46 CFR 173.055 - Watertight subdivision and damage stability standards for existing sailing school vessels.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... for existing sailing school vessels. 173.055 Section 173.055 Shipping COAST GUARD, DEPARTMENT OF... § 173.055 Watertight subdivision and damage stability standards for existing sailing school vessels. (a) Except as provided in paragraph (c) of this section, an existing sailing school vessel which carries more...

  18. 46 CFR 173.055 - Watertight subdivision and damage stability standards for existing sailing school vessels.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... for existing sailing school vessels. 173.055 Section 173.055 Shipping COAST GUARD, DEPARTMENT OF... § 173.055 Watertight subdivision and damage stability standards for existing sailing school vessels. (a) Except as provided in paragraph (c) of this section, an existing sailing school vessel which carries more...

  19. Findings of a review of spacecraft fire safety needs

    NASA Technical Reports Server (NTRS)

    Apostolakis, G. E.; Catton, I.; Paulos, T.; Paxton, K.; Jones, S.

    1992-01-01

    Discussions from a workshop held to guide UCLA and NASA investigators on the state of knowledge and perceived needs in spacecraft fire safety and its risk management are reviewed, as an introduction to an analytical and experimental project in this field. The report summarizes the workshop discussions and includes the visual aids used in the presentations. Probabilistic Safety Assessment (PSA) methods, which are currently not used, would be of great value in the design and operation of future human-crew spacecraft. Key points in the discussions were the importance of understanding and testing smoldering as a likely fire scenario in space and the need for smoke damage modeling, since many fire-risk models ignore this mechanism and consider only heat damage.

  20. A Probabilistic Analysis of Surface Water Flood Risk in London.

    PubMed

    Jenkins, Katie; Hall, Jim; Glenis, Vassilis; Kilsby, Chris

    2018-06-01

    Flooding in urban areas during heavy rainfall, often characterized by short duration and high-intensity events, is known as "surface water flooding." Analyzing surface water flood risk is complex as it requires understanding of biophysical and human factors, such as the localized scale and nature of heavy precipitation events, characteristics of the urban area affected (including detailed topography and drainage networks), and the spatial distribution of economic and social vulnerability. Climate change is recognized as having the potential to enhance the intensity and frequency of heavy rainfall events. This study develops a methodology to link high spatial resolution probabilistic projections of hourly precipitation with detailed surface water flood depth maps and characterization of urban vulnerability to estimate surface water flood risk. It incorporates probabilistic information on the range of uncertainties in future precipitation in a changing climate. The method is applied to a case study of Greater London and highlights that both the frequency and spatial extent of surface water flood events are set to increase under future climate change. The expected annual damage from surface water flooding is estimated to be £171 million, £343 million, and £390 million per year under the baseline, 2030 high, and 2050 high climate change scenarios, respectively. © 2017 Society for Risk Analysis.

  1. Adaptive Decision Making Using Probabilistic Programming and Stochastic Optimization

    DTIC Science & Technology

    2018-01-01

    world optimization problems (and hence ...Pred. demand (uncertain; discrete ...simplify the setting, we further assume that the demands are discrete, taking on values d1, . . . , dk with probabilities (conditional on x) (pθ)i ≡ p...Tyrrell Rockafellar. Implicit functions and solution mappings. Springer Monogr. Math., 2009. Anthony V Fiacco and Yo Ishizuka. Sensitivity and stability

  2. Evaluation of moisture damage in asphalt concrete with CRM motorcycle tire waste passing #50 sieve size

    NASA Astrophysics Data System (ADS)

    Siswanto, Henri; Supriyanto, Bambang; Pranoto, Pranoto; Chandra, Pria Rizky; Hakim, Arief Rahman

    2017-09-01

    The objective of this experimental research is to evaluate moisture damage in Asphalt Concrete (AC) with Crumb Rubber Modified (CRM) motorcycle tire waste passing the #50 and retained on the #100 sieve. Two gradations were used in this research: the first is typical for asphalt concrete base (ACB) and the second for asphalt concrete wearing course (ACWC). A Marshall testing apparatus was used for testing the Marshall specimens. Seven levels of CRM content were used, namely 0%, 0.5%, 1%, 1.5%, 3%, 4.5% and 6% by weight of the mixture. Retained stability represents the level of moisture damage of AC pavement. The results indicate that adding CRM to the AC mixture increases stability up to a maximum value, beyond which further addition decreases it, and that adding CRM to AC decreases its moisture damage susceptibility. AC with 1% CRM is the best asphalt-CRM mix.

  3. Active Faults and Earthquake Hazards in the FY 79 Verification Sites - Nevada-Utah Siting Region.

    DTIC Science & Technology

    1980-03-26

    structures, such as shelters and command/control facilities, away from rupture hazards. Again, the probability of rupture, the effect of damage and ...accommodate an MCE, and less critical structures (such as the shelters) designed for a probabilistically determined event, may have merit for the MX...B., and Eaton, G. P., eds., Cenozoic tectonics and regional geophysics of the western cordillera: Geol. Soc. Am. Mem. 152, p. 1-32. Stewart, J. H

  4. Fracture mechanics methodology: Evaluation of structural components integrity

    NASA Astrophysics Data System (ADS)

    Sih, G. C.; de Oliveira Faria, L.

    1984-09-01

    The application of fracture mechanics to structural-design problems is discussed in lectures presented in the AGARD Fracture Mechanics Methodology course held in Lisbon, Portugal, in June 1981. The emphasis is on aeronautical design, and chapters are included on fatigue-life prediction for metals and composites, the fracture mechanics of engineering structural components, failure mechanics and damage evaluation of structural components, flaw-acceptance methods, and reliability in probabilistic design. Graphs, diagrams, drawings, and photographs are provided.

  5. Frost risk for overwintering crops in a changing climate

    NASA Astrophysics Data System (ADS)

    Vico, Giulia; Weih, Martin

    2013-04-01

    Climate change scenarios predict a general increase in daily temperatures and a decline in snow cover duration. On the one hand, higher temperatures in fall and spring may facilitate the development of overwintering crops and allow the expansion of winter cropping in locations where the growing season is currently too short. On the other hand, higher temperatures prior to winter crop dormancy slow down frost hardening, enhancing crop vulnerability to temperature fluctuations. Such vulnerability may be exacerbated by reduced snow cover, with potential further negative impacts on yields under extremely low temperatures. We propose a parsimonious probabilistic model to quantify the winter frost damage risk for overwintering crops, based on a coupled model of air temperature, snow cover, and crop minimum tolerable temperature. The latter is determined by crop features, the previous history of temperature, and snow cover. The temperature-snow cover model is tested against meteorological data collected over 50 years in Sweden and applied to winter wheat varieties differing in their ability to acquire frost resistance. Hence, exploiting experimental results assessing crop frost damage under limited temperature and snow cover realizations, this probabilistic framework allows the quantification of frost risk for different crop varieties, even under full temperature and precipitation unpredictability. Climate change scenarios are explored to quantify the effects of changes in temperature mean and variance and in the precipitation regime on crops differing in winter frost resistance and response to temperature.

  6. Medium Range Flood Forecasting for Agriculture Damage Reduction

    NASA Astrophysics Data System (ADS)

    Fakhruddin, S. H. M.

    2014-12-01

    Early warning is a key element of disaster risk reduction. In recent decades, major advancements have been made in medium range and seasonal flood forecasting. This progress provides a great opportunity to reduce agricultural damage and improve advisories for early action and planning for flood hazards, facilitating proactive rather than reactive management of the adverse consequences of floods. In the agricultural sector, for instance, farmers can choose among a diversity of options such as changing cropping patterns, applying fertilizer, irrigating, and changing planting timing. An experimental medium range (1-10 day) flood forecasting model has been developed for Bangladesh and Thailand. It provides 51-member ensemble discharge forecasts at lead times of 1-10 days with significant persistence and high certainty. This type of forecast could assist farmers and other stakeholders with differential preparedness activities. These probabilistic ensemble flood forecasts have been customized to user needs for community-level application focused on the agricultural system. The vulnerabilities of the agricultural system were calculated based on exposure, sensitivity, and adaptive capacity. Indicators for risk and vulnerability assessment were developed through community consultations. The forecast lead time requirements, user needs, impacts, and management options for crops were identified through focus group discussions, informal interviews, and community surveys. This paper illustrates potential applications of such ensemble probabilistic medium range flood forecasts in a way that is not commonly practiced globally today.
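
    A minimal example of turning such an ensemble into an actionable product is to compute, for each lead day, the fraction of members exceeding a damage-relevant discharge threshold. The 51-member ensemble and threshold below are synthetic.

    ```python
    # Sketch: exceedance probabilities from a discharge ensemble.
    # Synthetic gamma-distributed traces stand in for the 51-member
    # forecast; the threshold is a hypothetical bankfull discharge.
    import numpy as np

    rng = np.random.default_rng(4)
    # 51 ensemble members x 10 lead days of forecast discharge (m^3/s)
    ensemble = rng.gamma(shape=3.0, scale=400.0, size=(51, 10))

    threshold = 1800.0
    p_exceed = (ensemble > threshold).mean(axis=0)  # fraction of members

    for day, p in enumerate(p_exceed, start=1):
        print(f"day {day:2d}: P(Q > {threshold:.0f} m3/s) = {p:.2f}")
    ```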

  7. Probabilistic fatigue methodology for six nines reliability

    NASA Technical Reports Server (NTRS)

    Everett, R. A., Jr.; Bartlett, F. D., Jr.; Elber, Wolf

    1990-01-01

    Fleet readiness and flight safety strongly depend on the degree of reliability that can be designed into rotorcraft flight-critical components. The current U.S. Army fatigue life specification for new rotorcraft is the so-called six nines reliability, or a probability of failure of one in a million. The progress of a round robin established by the American Helicopter Society (AHS) Subcommittee for Fatigue and Damage Tolerance is reviewed to investigate reliability-based fatigue methodology. The participants in this cooperative effort are the U.S. Army Aviation Systems Command (AVSCOM) and the rotorcraft industry. One phase of the joint activity examined fatigue reliability under uniquely defined conditions for which only one answer was correct. The other phases were set up to learn how the different industry methods of defining fatigue strength affected the mean fatigue life and reliability calculations. Hence, constant amplitude and spectrum fatigue test data were provided so that each participant could perform their standard fatigue life analysis. As a result of this round robin, the probabilistic logic which includes both fatigue strength and spectrum loading variability in developing a consistent reliability analysis was established. In this first study, the reliability analysis was limited to the linear cumulative damage approach. However, it is expected that superior fatigue life prediction methods will ultimately be developed through this open AHS forum. To that end, these preliminary results were useful in identifying some topics for additional study.
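
    The linear cumulative damage logic used in the round robin can be sketched as a Monte Carlo over fatigue-strength scatter: accumulate Miner's-rule damage D = Σ n_i/N_i for each sampled unit and count the fraction failing within the service life. The S-N curve and scatter parameters below are illustrative assumptions, not AHS round-robin data.

    ```python
    # Sketch of reliability estimation with Miner's rule. Fatigue
    # strength scatter is modeled by a lognormal scale factor on a toy
    # S-N curve; the load spectrum is fixed. All numbers illustrative.
    import numpy as np

    rng = np.random.default_rng(5)

    def damage_per_year(spectrum, scale, N_ref=1e7, S_ref=200.0, b=5.0):
        # toy S-N curve: cycles to failure N(S) = N_ref * (scale * S_ref / S)**b
        return sum(n / (N_ref * (scale * S_ref / S) ** b) for S, n in spectrum)

    # load spectrum: (stress amplitude in MPa, applied cycles per year)
    spectrum = [(200.0, 1e5), (250.0, 2e4), (300.0, 5e3)]
    life_years, trials, failures = 10.0, 200_000, 0
    for _ in range(trials):
        scale = rng.lognormal(0.0, 0.08)     # unit-to-unit strength scatter
        if damage_per_year(spectrum, scale) * life_years >= 1.0:  # Miner's rule
            failures += 1
    print("estimated P(failure) =", failures / trials)  # target <= 1e-6
    ```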

  8. Reliable Cellular Automata with Self-Organization

    NASA Astrophysics Data System (ADS)

    Gács, Peter

    2001-04-01

    In a probabilistic cellular automaton in which all local transitions have positive probability, the problem of keeping a bit of information indefinitely is nontrivial, even in an infinite automaton. Still, there is a solution in 2 dimensions, and this solution can be used to construct a simple 3-dimensional discrete-time universal fault-tolerant cellular automaton. This technique does not help much to solve the following problems: remembering a bit of information in 1 dimension; computing in dimensions lower than 3; computing in any dimension with non-synchronized transitions. Our more complex technique organizes the cells in blocks that perform a reliable simulation of a second (generalized) cellular automaton. The cells of the latter automaton are also organized in blocks, simulating even more reliably a third automaton, etc. Since all this (a possibly infinite hierarchy) is organized in "software," it must be under repair all the time from damage caused by errors. A large part of the problem is essentially self-stabilization recovering from a mess of arbitrary size and content. The present paper constructs an asynchronous one-dimensional fault-tolerant cellular automaton, with the further feature of "self-organization." The latter means that unless a large amount of input information must be given, the initial configuration can be chosen homogeneous.

  9. A Comparison of Traditional, Step-Path, and Geostatistical Techniques in the Stability Analysis of a Large Open Pit

    NASA Astrophysics Data System (ADS)

    Mayer, J. M.; Stead, D.

    2017-04-01

    With the increased drive towards deeper and more complex mine designs, geotechnical engineers are often forced to reconsider traditional deterministic design techniques in favour of probabilistic methods. These alternative techniques allow for the direct quantification of uncertainties within a risk and/or decision analysis framework. However, conventional probabilistic practices typically discretize geological materials into discrete, homogeneous domains, with attributes defined by spatially constant random variables, despite the fact that geological media display inherent heterogeneous spatial characteristics. This research directly simulates this phenomenon using a geostatistical approach, known as sequential Gaussian simulation. The method utilizes the variogram which imposes a degree of controlled spatial heterogeneity on the system. Simulations are constrained using data from the Ok Tedi mine site in Papua New Guinea and designed to randomly vary the geological strength index and uniaxial compressive strength using Monte Carlo techniques. Results suggest that conventional probabilistic techniques have a fundamental limitation compared to geostatistical approaches, as they fail to account for the spatial dependencies inherent to geotechnical datasets. This can result in erroneous model predictions, which are overly conservative when compared to the geostatistical results.
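
    The contrast drawn here can be made concrete in a few lines: sampling rock properties independently point by point versus drawing a spatially correlated field consistent with a variogram. The sketch below samples an exponential covariance model exactly via Cholesky factorization (an exact sampler for the same multivariate Gaussian model that sequential Gaussian simulation targets); the transect and GSI statistics are hypothetical, not Ok Tedi data.

    ```python
    # Sketch: spatially independent vs. correlated random properties.
    # An exponential covariance (sill, range) derived from a variogram
    # is factorized with Cholesky to draw a correlated realization.
    import numpy as np

    rng = np.random.default_rng(6)
    x = np.linspace(0.0, 100.0, 60)        # 1-D transect through a slope (m)
    sill, rng_a = 1.0, 25.0                # variogram sill and range

    dist = np.abs(x[:, None] - x[None, :])
    cov = sill * np.exp(-dist / rng_a)     # exponential covariance model
    L = np.linalg.cholesky(cov + 1e-10 * np.eye(x.size))

    gsi_mean, gsi_std = 45.0, 5.0          # hypothetical GSI statistics
    correlated = gsi_mean + gsi_std * (L @ rng.standard_normal(x.size))
    independent = gsi_mean + gsi_std * rng.standard_normal(x.size)

    # same marginal statistics, very different spatial structure: the
    # correlated field contains coherent weak zones that independent
    # sampling cannot reproduce
    print("correlated field min :", correlated.min().round(1))
    print("independent field min:", independent.min().round(1))
    ```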

  10. Against all odds -- Probabilistic forecasts and decision making

    NASA Astrophysics Data System (ADS)

    Liechti, Katharina; Zappa, Massimiliano

    2015-04-01

    In the city of Zurich (Switzerland), the damage potential due to flooding of the river Sihl is estimated at about 5 billion US dollars. The flood forecasting system used by the administration for decision making has run continuously since 2007. It has a time horizon of up to five days and operates at hourly time steps. The flood forecasting system includes three different model chains: two driven by the deterministic NWP models COSMO-2 and COSMO-7, and one driven by the probabilistic NWP COSMO-LEPS. The model chains have been consistent since February 2010, so five full years are available for the evaluation of the system. The system has been evaluated continuously and is a very nice example of the added value that lies in probabilistic forecasts. The forecasts are available to the decision makers on an online platform. Several graphical representations of the forecasts and forecast history are available to support decision making and to rate the current situation, and the communication between forecasters and decision makers is quite close. In short, an ideal situation. However, an event, or better put a non-event, in summer 2014 showed that knowledge of the general superiority of probabilistic forecasts doesn't necessarily mean that the decisions taken in a specific situation will be based on the probabilistic forecast. Some years of experience allow confidence in the system to grow, both for the forecasters and for the decision makers. Even if, from the theoretical point of view, the handling of crisis situations is well designed, a first event demonstrated that the dialog with the decision makers still lacks exercise during such situations. We argue that a false alarm is a needed experience to consolidate real-time emergency procedures relying on ensemble predictions. A missed event would probably also fit, but in our case we are very happy not to report on this option.

  11. Probabilistic Reversal Learning in Schizophrenia: Stability of Deficits and Potential Causal Mechanisms.

    PubMed

    Reddy, Lena Felice; Waltz, James A; Green, Michael F; Wynn, Jonathan K; Horan, William P

    2016-07-01

    Although individuals with schizophrenia show impaired feedback-driven learning on probabilistic reversal learning (PRL) tasks, the specific factors that contribute to these deficits remain unknown. Recent work has suggested several potential causes including neurocognitive impairments, clinical symptoms, and specific types of feedback-related errors. To examine this issue, we administered a PRL task to 126 stable schizophrenia outpatients and 72 matched controls, and patients were retested 4 weeks later. The task involved an initial probabilistic discrimination learning phase and subsequent reversal phases in which subjects had to adjust their responses to sudden shifts in the reinforcement contingencies. Patients showed poorer performance than controls for both the initial discrimination and reversal learning phases of the task, and performance overall showed good test-retest reliability among patients. A subgroup analysis of patients (n = 64) and controls (n = 49) with good initial discrimination learning revealed no between-group differences in reversal learning, indicating that the patients who were able to achieve all of the initial probabilistic discriminations were not impaired in reversal learning. Regarding potential contributors to impaired discrimination learning, several factors were associated with poor PRL, including higher levels of neurocognitive impairment, poor learning from both positive and negative feedback, and higher levels of indiscriminate response shifting. The results suggest that poor PRL performance in schizophrenia can be the product of multiple mechanisms. © The Author 2016. Published by Oxford University Press on behalf of the Maryland Psychiatric Research Center. All rights reserved. For permissions, please email: journals.permissions@oup.com.

  12. Transient-Free Operations With Physics-Based Real-time Analysis and Control

    NASA Astrophysics Data System (ADS)

    Kolemen, Egemen; Burrell, Keith; Eggert, William; Eldon, David; Ferron, John; Glasser, Alex; Humphreys, David

    2016-10-01

    In order to understand and predict disruptions, the two most common methods currently employed in tokamak analysis are the time-consuming ``kinetic EFITs,'' which are done offline with significant human involvement, and the search for correlations with global precursors using various parameterization techniques. We are developing automated ``kinetic EFITs'' at DIII-D to enable calculation of the stability as the plasma evolves close to the disruption. This allows us to quantify the probabilistic nature of the stability calculations and provides a stability metric for all possible linear perturbations to the plasma. This study also provides insight into how the control system can avoid the unstable operating space, which is critical for high-performance operations close to stability thresholds at ITER. A novel, efficient ideal stability calculation method and new real-time CER acquisition system are being developed, and a new 77-core server has been installed on the DIII-D PCS to enable experimental use. Sponsored by US DOE under DE-SC0015878 and DE-FC02-04ER54698.

  13. 46 CFR 173.054 - Watertight subdivision and damage stability standards for new sailing school vessels.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... for new sailing school vessels. 173.054 Section 173.054 Shipping COAST GUARD, DEPARTMENT OF HOMELAND....054 Watertight subdivision and damage stability standards for new sailing school vessels. (a) Each new sailing school vessel which has a mean length greater than 75 feet (22.8 meters) or which carries more...

  14. 46 CFR 173.054 - Watertight subdivision and damage stability standards for new sailing school vessels.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... for new sailing school vessels. 173.054 Section 173.054 Shipping COAST GUARD, DEPARTMENT OF HOMELAND....054 Watertight subdivision and damage stability standards for new sailing school vessels. (a) Each new sailing school vessel which has a mean length greater than 75 feet (22.8 meters) or which carries more...

  15. Development of hazard-compatible building fragility and vulnerability models

    USGS Publications Warehouse

    Karaca, E.; Luco, N.

    2008-01-01

    We present a methodology for transforming the structural and non-structural fragility functions in HAZUS into a format that is compatible with conventional seismic hazard analysis information. The methodology makes use of the building capacity (or pushover) curves and related building parameters provided in HAZUS. Instead of the capacity spectrum method applied in HAZUS, building response is estimated by inelastic response history analysis of corresponding single-degree-of-freedom systems under a large number of earthquake records. Statistics of the building response are used with the damage state definitions from HAZUS to derive fragility models conditioned on spectral acceleration values. Using the developed fragility models for structural and nonstructural building components, with corresponding damage state loss ratios from HAZUS, we also derive building vulnerability models relating spectral acceleration to repair costs. Whereas in HAZUS the structural and nonstructural damage states are treated as if they are independent, our vulnerability models are derived assuming "complete" nonstructural damage whenever the structural damage state is complete. We show the effects of considering this dependence on the final vulnerability models. The use of spectral acceleration (at selected vibration periods) as the ground motion intensity parameter, coupled with the careful treatment of uncertainty, makes the new fragility and vulnerability models compatible with conventional seismic hazard curves and hence useful for extensions to probabilistic damage and loss assessment.
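
    One common way to derive such response-based fragilities is to record, at each spectral-acceleration level, whether the simulated response exceeds a damage-state threshold, and then fit a lognormal fragility curve by maximum likelihood. The sketch below does this on synthetic data; it mirrors the general approach rather than the paper's exact procedure.

    ```python
    # Hedged sketch: fit a lognormal fragility P(damage | Sa) to binary
    # exceedance observations from response analyses. Synthetic data.
    import numpy as np
    from scipy.optimize import minimize
    from scipy.stats import norm

    rng = np.random.default_rng(7)
    sa = np.repeat(np.linspace(0.1, 2.0, 10), 30)   # IM levels (g), 30 runs each
    # synthetic "true" capacities: median 0.8 g, dispersion 0.4
    damaged = rng.lognormal(np.log(0.8), 0.4, sa.size) < sa

    def neg_log_like(params):
        med, beta = params
        if med <= 0 or beta <= 0:
            return np.inf
        p = np.clip(norm.cdf(np.log(sa / med) / beta), 1e-9, 1 - 1e-9)
        return -(damaged * np.log(p) + (~damaged) * np.log(1 - p)).sum()

    fit = minimize(neg_log_like, x0=[1.0, 0.5], method="Nelder-Mead")
    median, beta = fit.x
    print(f"fragility median = {median:.2f} g, dispersion = {beta:.2f}")
    ```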

  16. Multiple Damage Progression Paths in Model-Based Prognostics

    NASA Technical Reports Server (NTRS)

    Daigle, Matthew; Goebel, Kai Frank

    2011-01-01

    Model-based prognostics approaches employ domain knowledge about a system, its components, and how they fail through the use of physics-based models. Component wear is driven by several different degradation phenomena, each resulting in their own damage progression path, overlapping to contribute to the overall degradation of the component. We develop a model-based prognostics methodology using particle filters, in which the problem of characterizing multiple damage progression paths is cast as a joint state-parameter estimation problem. The estimate is represented as a probability distribution, allowing the prediction of end of life and remaining useful life within a probabilistic framework that supports uncertainty management. We also develop a novel variance control mechanism that maintains an uncertainty bound around the hidden parameters to limit the amount of estimation uncertainty and, consequently, reduce prediction uncertainty. We construct a detailed physics-based model of a centrifugal pump, to which we apply our model-based prognostics algorithms. We illustrate the operation of the prognostic solution with a number of simulation-based experiments and demonstrate the performance of the chosen approach when multiple damage mechanisms are active.
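
    The joint state-parameter formulation reduces to a compact particle filter: each particle carries a damage state and an unknown growth-rate parameter, weights come from the measurement likelihood, and remaining useful life follows by propagating particles to a failure threshold. The scalar wear model below is an illustrative stand-in for the paper's pump model.

    ```python
    # Minimal particle-filter sketch of joint state-parameter
    # estimation for prognostics. Models and numbers are illustrative.
    import numpy as np

    rng = np.random.default_rng(8)
    N = 2000
    x = rng.uniform(0.0, 0.1, N)            # damage-state particles
    w = rng.uniform(0.01, 0.05, N)          # unknown growth-rate particles

    true_x, true_w, meas_sigma = 0.05, 0.03, 0.01
    for t in range(25):
        true_x += true_w                                    # true degradation
        z = true_x + rng.normal(0, meas_sigma)              # noisy measurement
        x = x + w + rng.normal(0, 0.002, N)                 # propagate particles
        weights = np.exp(-0.5 * ((z - x) / meas_sigma) ** 2)
        weights /= weights.sum()
        idx = rng.choice(N, N, p=weights)                   # resample
        x, w = x[idx], w[idx] + rng.normal(0, 0.0005, N)    # parameter jitter

    threshold = 1.0                          # failure threshold on damage
    rul = (threshold - x) / np.maximum(w, 1e-6)  # per-particle remaining life
    print("median RUL:", np.median(rul), "steps")
    ```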

  17. Damage assessment of composite plate structures with material and measurement uncertainty

    NASA Astrophysics Data System (ADS)

    Chandrashekhar, M.; Ganguli, Ranjan

    2016-06-01

    Composite materials are very useful in structural engineering, particularly in weight-sensitive applications. Two different test models of the same structure made from composite materials can display very different dynamic behavior due to large uncertainties associated with composite material properties. Also, composite structures can suffer from pre-existing imperfections like delaminations, voids or cracks introduced during fabrication. In this paper, we show that modeling and material uncertainties in composite structures can cause considerable problems in damage assessment. A recently developed C0 shear deformable locking-free refined composite plate element is employed in the numerical simulations to alleviate modeling uncertainty. A qualitative estimate of the impact of modeling uncertainty on the damage detection problem is made. A robust Fuzzy Logic System (FLS) with a sliding window defuzzifier is used for delamination damage detection in composite plate-type structures. The FLS is designed using variations in modal frequencies due to randomness in material properties. Probabilistic analysis is performed using Monte Carlo Simulation (MCS) on a composite plate finite element model. It is demonstrated that the FLS shows excellent robustness in delamination detection at very high levels of randomness in input data.

  18. A novel regulation mechanism of DNA repair by damage-induced and RAD23-dependent stabilization of xeroderma pigmentosum group C protein

    PubMed Central

    Ng, Jessica M.Y.; Vermeulen, Wim; van der Horst, Gijsbertus T.J.; Bergink, Steven; Sugasawa, Kaoru; Vrieling, Harry; Hoeijmakers, Jan H.J.

    2003-01-01

    Primary DNA damage sensing in mammalian global genome nucleotide excision repair (GG-NER) is performed by the xeroderma pigmentosum group C (XPC)/HR23B protein complex. HR23B and HR23A are human homologs of the yeast ubiquitin-domain repair factor RAD23, the function of which is unknown. Knockout mice revealed that mHR23A and mHR23B have a fully redundant role in NER, and a partially redundant function in embryonic development. Inactivation of both genes causes embryonic lethality, but appeared still compatible with cellular viability. Analysis of mHR23A/B double-mutant cells showed that HR23 proteins function in NER by governing XPC stability via partial protection against proteasomal degradation. Interestingly, NER-type DNA damage further stabilizes XPC and thereby enhances repair. These findings resolve the primary function of RAD23 in repair and reveal a novel DNA-damage-dependent regulation mechanism of DNA repair in eukaryotes, which may be part of a more global damage-response circuitry. PMID:12815074

  19. Quinacrine pretreatment reduces microwave-induced neuronal damage by stabilizing the cell membrane

    PubMed Central

    Ding, Xue-feng; Wu, Yan; Qu, Wen-rui; Fan, Ming; Zhao, Yong-qi

    2018-01-01

    Quinacrine, widely used to treat parasitic diseases, binds to cell membranes. We previously found that quinacrine pretreatment reduced microwave radiation damage in rat hippocampal neurons, but the molecular mechanism remains poorly understood. Considering the thermal effects of microwave radiation and the protective effects of quinacrine on heat damage in cells, we hypothesized that quinacrine would prevent microwave radiation damage to cells in a mechanism associated with cell membrane stability. To test this, we used retinoic acid to induce PC12 cells to differentiate into neuron-like cells. We then pretreated the neurons with quinacrine (20 and 40 mM) and irradiated them with 50 mW/cm2 microwaves for 3 or 6 hours. Flow cytometry, atomic force microscopy and western blot assays revealed that irradiated cells pretreated with quinacrine showed markedly less apoptosis, necrosis, and membrane damage, and greater expression of heat shock protein 70, than cells exposed to microwave irradiation alone. These results suggest that quinacrine stabilizes the neuronal membrane structure by upregulating the expression of heat shock protein 70, thus reducing neuronal injury caused by microwave radiation. PMID:29623929

  20. Spatially Informed Plant PRA Models for Security Assessment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wheeler, Timothy A.; Thomas, Willard; Thornsbury, Eric

    2006-07-01

    Traditional risk models can be adapted to evaluate plant response for situations where plant systems and structures are intentionally damaged, such as from sabotage or terrorism. This paper describes a process by which traditional risk models can be spatially informed to analyze the effects of compound and widespread harsh environments through the use of 'damage footprints'. A 'damage footprint' is a spatial map of regions of the plant (zones) where equipment could be physically destroyed or disabled as a direct consequence of an intentional act. The use of 'damage footprints' requires that the basic events from the traditional probabilistic risk assessment (PRA) be spatially transformed so that the failure of individual components can be linked to the destruction of or damage to specific spatial zones within the plant. Given the nature of intentional acts, extensive modifications must be made to the risk models to account for the special nature of the 'initiating events' associated with deliberate adversary actions. Intentional acts might produce harsh environments that in turn could subject components and structures to one or more insults, such as structural, fire, flood, and/or vibration and shock damage. Furthermore, the potential for widespread damage from some of these insults requires an approach that addresses the impacts of these potentially severe insults even when they occur in locations distant from the actual physical location of a component or structure modeled in the traditional PRA. (authors)

  1. The influence of bone damage on press-fit mechanics.

    PubMed

    Bishop, Nicholas E; Höhn, Jan-Christian; Rothstock, Stephan; Damm, Niklas B; Morlock, Michael M

    2014-04-11

    Press-fitting is used to anchor uncemented implants in bone. It relies in part on friction resistance to relative motion at the implant-bone interface to allow bone ingrowth and long-term stability. Frictional shear capacity is related to the interference fit of the implant and the roughness of its surface. It was hypothesised here that a rough implant could generate trabecular bone damage during implantation, which would reduce its stability. A device was constructed to simulate implantation by displacement of angled platens with varying surface finishes (polished, beaded and flaked) onto the surface of an embedded trabecular bone cube, to different nominal interferences. Push-in (implantation) and Pull-out forces were measured and micro-CT scans were made before and after testing to assess permanent bone deformation. Depth of permanent trabecular bone deformation ('damage'), Pull-out force and Radial force all increased with implantation displacement and with implantation force, for all surface roughnesses. The proposed hypothesis was rejected, since primary stability did not decrease with trabecular bone damage. In fact, Pull-out force linearly increased with push-in force, independently of trabecular bone damage or implant surface. This similar behaviour for the different surfaces might be explained by the compaction of bone into the surfaces during push-in so that Pull-out resistance is governed by bone-on-bone, rather than implant surface-on-bone friction. The data suggest that maximum stability is achieved for the maximum implantation force possible (regardless of trabecular bone damage or surface roughness), but this must be limited to prevent periprosthetic cortical bone fracture, patient damage and component malpositioning. Copyright © 2014 Elsevier Ltd. All rights reserved.

  2. Laser damage initiation and growth of antireflection coated S-FAP crystal surfaces prepared by pitch lap and magnetorheological finishing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stolz, C J; Menapace, J A; Schaffers, K I

    Antireflection (AR) coatings typically damage at the interface between the substrate and coating. Therefore the substrate finishing technology can have an impact on the laser resistance of the coating. For this study, AR coatings were deposited on Yb:S-FAP [Yb{sup 3+}:Sr{sub 5}(PO{sub 4}){sub 3}F] crystals that received a final polish by both conventional pitch lap finishing and magnetorheological finishing (MRF). SEM images of the damage morphology reveal that laser damage originates at scratches and at substrate-coating interfacial absorbing defects. Previous damage stability tests on multilayer mirror coatings and bare surfaces revealed that damage growth can occur at fluences below the initiation fluence. The results from this study suggest the opposite trend for AR coatings. Investigation of unstable HR and uncoated surface damage morphologies reveals significant radial cracking that is not apparent with AR damage, due to AR delamination from the coated surface with few apparent cracks at the damage boundary. Damage stability tests show that coated Yb:S-FAP crystals can operate at 1057 nm at fluences around 20 J/cm{sup 2} at 10 ns, almost twice the initiation damage threshold.

  3. Ecohydrology of agroecosystems: probabilistic description of yield reduction risk under limited water availability

    NASA Astrophysics Data System (ADS)

    Vico, Giulia; Porporato, Amilcare

    2013-04-01

    Supplemental irrigation represents one of the main strategies to mitigate the effects of climate variability and stabilize yields. Irrigated agriculture currently provides 40% of food production and its relevance is expected to further increase in the near future, in face of the projected alterations of rainfall patterns and increase in food, fiber, and biofuel demand. Because of the significant investments and water requirements involved in irrigation, strategic choices are needed to preserve productivity and profitability, while maintaining a sustainable water management - a nontrivial task given the unpredictability of the rainfall forcing. To facilitate decision making under uncertainty, a widely applicable probabilistic framework is proposed. The occurrence of rainfall events and irrigation applications are linked probabilistically to crop development during the growing season and yields at harvest. Based on these linkages, the probability density function of yields and corresponding probability density function of required irrigation volumes, as well as the probability density function of yields under the most common case of limited water availability are obtained analytically, as a function of irrigation strategy, climate, soil and crop parameters. The full probabilistic description of the frequency of occurrence of yields and water requirements is a crucial tool for decision making under uncertainty, e.g., via expected utility analysis. Furthermore, the knowledge of the probability density function of yield allows us to quantify the yield reduction hydrologic risk. Two risk indices are defined and quantified: the long-term risk index, suitable for long-term irrigation strategy assessment and investment planning, and the real-time risk index, providing a rigorous probabilistic quantification of the emergence of drought conditions during a single growing season in an agricultural setting. Our approach employs relatively few parameters and is thus easily and broadly applicable to different crops and sites, under current and future climate scenarios. Hence, the proposed probabilistic framework provides a quantitative tool to assess the impact of irrigation strategy and water allocation on the risk of not meeting a certain target yield, thus guiding the optimal allocation of water resources for human and environmental needs.

  4. Probabilistic design of fibre concrete structures

    NASA Astrophysics Data System (ADS)

    Pukl, R.; Novák, D.; Sajdlová, T.; Lehký, D.; Červenka, J.; Červenka, V.

    2017-09-01

    Advanced computer simulation has recently become a well-established methodology for evaluating the resistance of concrete engineering structures. Nonlinear finite element analysis enables realistic prediction of structural damage, peak load, failure, post-peak response, development of cracks in concrete, yielding of reinforcement, concrete crushing, and shear failure. The nonlinear material models can cover various types of concrete and reinforced concrete: ordinary concrete, plain or reinforced, without or with prestressing, fibre concrete, (ultra) high performance concrete, lightweight concrete, etc. Advanced material models taking into account fibre concrete properties such as the shape of the tensile softening branch, high toughness, and ductility are described in the paper. Since the variability of fibre concrete material properties is rather high, probabilistic analysis seems to be the most appropriate format for structural design and for evaluation of structural performance, reliability, and safety. The presented combination of nonlinear analysis with advanced probabilistic methods allows evaluation of structural safety characterized by the failure probability or by the reliability index, respectively. The authors offer a methodology and computer tools for realistic safety assessment of concrete structures; the utilized approach is based on randomization of the nonlinear finite element analysis of the structural model. Uncertainty of the material properties, or their randomness obtained from material tests, is accounted for in the random distribution. Furthermore, degradation of the reinforced concrete materials, such as carbonation of concrete and corrosion of reinforcement, can be accounted for in order to analyze life-cycle structural performance and to enable prediction of structural reliability and safety over time. The results can serve as a rational basis for the design of fibre concrete engineering structures based on advanced nonlinear computer analysis. The presented methodology is illustrated with results from two probabilistic studies of different types of concrete structures related to practical applications and made from various materials (with parameters obtained from real material tests).
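
    The safety-evaluation step described here reduces, in its simplest form, to comparing samples of resistance and load. A minimal sketch, assuming a lognormal resistance from randomized nonlinear analyses and a Gumbel load model (both illustrative, not from the cited studies):

    ```python
    # Sketch: failure probability and a second-moment (Cornell)
    # reliability index from Monte Carlo samples of resistance R and
    # load S. Distributions and parameters are illustrative.
    import numpy as np
    from scipy.stats import norm

    rng = np.random.default_rng(9)
    R = rng.lognormal(mean=np.log(950.0), sigma=0.12, size=100_000)  # kN
    S = rng.gumbel(loc=600.0, scale=60.0, size=100_000)              # kN

    g = R - S                                  # limit-state margin
    p_f = (g < 0).mean()
    beta = g.mean() / g.std()                  # Cornell (second-moment) index
    print(f"p_f ~ {p_f:.2e}, beta ~ {beta:.2f}")
    print("beta implied by p_f:", -norm.ppf(max(p_f, 1e-12)))
    ```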

  5. Infrared maritime target detection using a probabilistic single Gaussian model of sea clutter in Fourier domain

    NASA Astrophysics Data System (ADS)

    Zhou, Anran; Xie, Weixin; Pei, Jihong; Chen, Yapei

    2018-02-01

    For ship target detection in cluttered infrared image sequences, a robust detection method based on a probabilistic single Gaussian model of the sea background in the Fourier domain is put forward. The amplitude spectrum sequences at each frequency point of pure-seawater images in the Fourier domain, being more stable than the gray value sequences of each background pixel in the spatial domain, are modeled as Gaussian. Next, a probability weight matrix is built based on the stability of the pure seawater's total energy spectrum in the row direction, to make the Gaussian model more accurate. Then, the foreground frequency points are separated from the background frequency points by the model. Finally, false-alarm points are removed using the ships' shape features. The performance of the proposed method is tested by visual and quantitative comparisons with other methods.
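
    A stripped-down version of the detector: fit a per-frequency Gaussian model to the amplitude spectra of pure-seawater training rows, then flag test-frame frequency points that deviate improbably from it. The synthetic data, the 3-sigma rule, and the omission of the probability-weight matrix and shape filtering are simplifications of the paper's method.

    ```python
    # Sketch: per-frequency Gaussian clutter model in the Fourier
    # domain. Synthetic rows stand in for the infrared sequences.
    import numpy as np

    rng = np.random.default_rng(10)
    # 100 training frames of pure-seawater rows, 64 pixels wide
    clutter = rng.normal(0.0, 1.0, (100, 64))
    train_spec = np.abs(np.fft.rfft(clutter, axis=1))

    mu = train_spec.mean(axis=0)            # per-frequency Gaussian mean
    sigma = train_spec.std(axis=0) + 1e-9   # per-frequency Gaussian std

    test_row = rng.normal(0.0, 1.0, 64)
    test_row[20:28] += 2.5                  # a ship-like intensity bump
    test_spec = np.abs(np.fft.rfft(test_row))

    z = (test_spec - mu) / sigma            # standardized deviation
    foreground = z > 3.0                    # flag 3-sigma frequency points
    print("flagged frequency bins:", np.nonzero(foreground)[0])
    ```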

  6. Why is Probabilistic Seismic Hazard Analysis (PSHA) still used?

    NASA Astrophysics Data System (ADS)

    Mulargia, Francesco; Stark, Philip B.; Geller, Robert J.

    2017-03-01

    Even though it has never been validated by objective testing, Probabilistic Seismic Hazard Analysis (PSHA) has been widely used for almost 50 years by governments and industry in applications with lives and property hanging in the balance, such as deciding safety criteria for nuclear power plants, making official national hazard maps, developing building code requirements, and determining earthquake insurance rates. PSHA rests on assumptions now known to conflict with earthquake physics; many damaging earthquakes, including the 1988 Spitak, Armenia, event and the 2011 Tohoku, Japan, event, have occurred in regions rated relatively low-risk by PSHA hazard maps. No extant method, including PSHA, produces reliable estimates of seismic hazard. Earthquake hazard mitigation should be recognized to be inherently political, involving a tradeoff between uncertain costs and uncertain risks. Earthquake scientists, engineers, and risk managers can make important contributions to the hard problem of allocating limited resources wisely, but government officials and stakeholders must take responsibility for the risks of accidents due to natural events that exceed the adopted safety criteria.

  7. A probabilistic seismic risk assessment procedure for nuclear power plants: (I) Methodology

    USGS Publications Warehouse

    Huang, Y.-N.; Whittaker, A.S.; Luco, N.

    2011-01-01

    A new procedure for probabilistic seismic risk assessment of nuclear power plants (NPPs) is proposed. This procedure modifies the current procedures using tools developed recently for performance-based earthquake engineering of buildings. The proposed procedure uses (a) response-based fragility curves to represent the capacity of structural and nonstructural components of NPPs, (b) nonlinear response-history analysis to characterize the demands on those components, and (c) Monte Carlo simulations to determine the damage state of the components. The use of response- rather than ground-motion-based fragility curves enables the curves to be independent of seismic hazard and closely related to component capacity. The use of the Monte Carlo procedure enables the correlation in the responses of components to be directly included in the risk assessment. An example of the methodology is presented in a companion paper to demonstrate its use and provide the technical basis for aspects of the methodology. © 2011 Published by Elsevier B.V.
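
    The Monte Carlo damage-state step can be sketched as follows, assuming a lognormal response-based fragility (median capacity and dispersion) and demands standing in for nonlinear response-history results; all numbers and variable names are illustrative assumptions, not values from the procedure.

      import numpy as np

      rng = np.random.default_rng(0)
      n = 10_000
      # Demands on a component, as would be produced by response-history analysis
      demand = rng.lognormal(np.log(0.8), 0.3, size=n)    # e.g., floor acceleration (g)
      # Response-based fragility: lognormal capacity with median and dispersion beta
      capacity = rng.lognormal(np.log(1.2), 0.4, size=n)
      # Correlation between components would be introduced by reusing the same
      # underlying normal draws across the components' samples.
      print("P(damage) =", np.mean(demand > capacity))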

  8. Resilient Grid Operational Strategies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pasqualini, Donatella

    Extreme weather-related disturbances, such as hurricanes, are a leading cause of grid outages historically. Although physical asset hardening is perhaps the most common way to mitigate the impacts of severe weather, operational strategies may be deployed to limit the extent of societal and economic losses associated with weather-related physical damage. The purpose of this study is to examine bulk power-system operational strategies that can be deployed to mitigate the impact of severe weather disruptions caused by hurricanes, thereby increasing grid resilience to maintain continuity of critical infrastructure during extreme weather. To estimate the impacts of resilient grid operational strategies, Los Alamos National Laboratory (LANL) developed a framework for hurricane probabilistic risk analysis (PRA). The probabilistic nature of this framework allows us to estimate the probability distribution of likely impacts, as opposed to the worst-case impacts. The project scope does not include strategies that are not operations related, such as transmission system hardening (e.g., undergrounding, transmission tower reinforcement and substation flood protection) and solutions in the distribution network.

  9. Including foreshocks and aftershocks in time-independent probabilistic seismic hazard analyses

    USGS Publications Warehouse

    Boyd, Oliver S.

    2012-01-01

    Time-independent probabilistic seismic-hazard analysis treats each source as temporally and spatially independent; hence foreshocks and aftershocks, which are both spatially and temporally dependent on the mainshock, are removed from earthquake catalogs. Yet, intuitively, these earthquakes should be considered part of the seismic hazard, capable of producing damaging ground motions. In this study, I consider the mainshock and its dependents as a time-independent cluster, each cluster being temporally and spatially independent from any other. The cluster has the recurrence time of the mainshock, and by considering the earthquakes in the cluster as a union of events, dependent events have an opportunity to contribute to seismic ground motions and hazard. Based on the methods of the U.S. Geological Survey for a high-hazard site, the inclusion of dependent events causes ground motions that are exceeded at probability levels of engineering interest to increase by about 10%, and by as much as 20% if variations in aftershock productivity can be accounted for reliably.
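
    The union-of-events idea reduces to elementary probability: if the events of a cluster independently exceed the target ground motion with probabilities p_i, the cluster exceeds it with probability 1 - prod(1 - p_i). The per-event probabilities below are invented for illustration.

      import numpy as np

      # Exceedance probabilities for a mainshock and its dependent events (illustrative)
      p_exceed = np.array([0.010, 0.002, 0.001, 0.0005])

      # Cluster treated as a union of events with the mainshock's recurrence rate
      p_cluster = 1.0 - np.prod(1.0 - p_exceed)
      print(p_cluster)   # slightly above the mainshock-only 0.010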

  10. Seismic safety assessment of unreinforced masonry low-rise buildings in Pakistan and its neighbourhood

    NASA Astrophysics Data System (ADS)

    Korkmaz, K. A.

    2009-06-01

    Pakistan and its neighbourhood experience numerous earthquakes, most of which result in damaged or collapsed buildings and loss of life that also affects the economy adversely. On 29 October 2008, an earthquake of magnitude 6.5 occurred in Ziarat, Quetta Region, Pakistan, which was followed by more than 400 aftershocks. Many villages were completely destroyed and more than 200 people died. The previous major earthquake, the 2005 South Asian earthquake (Mw=7.6), occurred in Kashmir, where 80,000 people died. The inadequate building stock is to blame for the degree of disaster, as the majority of the buildings in the region are unreinforced masonry low-rise buildings. In this study, the seismic vulnerability of regionally common unreinforced masonry low-rise buildings was investigated using probabilistic seismic safety assessment. The results of the study showed that unreinforced masonry low-rise buildings develop higher displacements and shear forces, and the probability of damage due to these higher displacements and shear forces can be directly related to damage or collapse.

  11. Visuo-motor and cognitive procedural learning in children with basal ganglia pathology.

    PubMed

    Mayor-Dubois, C; Maeder, P; Zesiger, P; Roulet-Perez, E

    2010-06-01

    We investigated procedural learning in 18 children with basal ganglia (BG) lesions or dysfunctions of various aetiologies, using a visuo-motor learning test, the Serial Reaction Time (SRT) task, and a cognitive learning test, the Probabilistic Classification Learning (PCL) task. We compared patients with early onset (<1 year old, n=9), later onset (>6 years old, n=7), or progressive disorder (idiopathic dystonia, n=2). All patients showed deficits in both the visuo-motor and cognitive domains, except those with idiopathic dystonia, who displayed preserved classification learning skills. Impairments appear to be independent of the age of onset of the pathology. As far as we know, this study is the first to investigate motor and cognitive procedural learning in children with BG damage. Procedural impairments were documented whatever the aetiology of the BG damage/dysfunction and the time of pathology onset, supporting the claim of very early skill-learning development and a lack of plasticity in the case of damage. Copyright 2010 Elsevier Ltd. All rights reserved.

  12. Probabilistic assessment of landslide tsunami hazard for the northern Gulf of Mexico

    NASA Astrophysics Data System (ADS)

    Pampell-Manis, A.; Horrillo, J.; Shigihara, Y.; Parambath, L.

    2016-01-01

    The devastating consequences of recent tsunamis affecting Indonesia and Japan have prompted a scientific response to better assess unexpected tsunami hazards. Although much uncertainty exists regarding the recurrence of large-scale tsunami events in the Gulf of Mexico (GoM), geological evidence indicates that a tsunami is possible and would most likely come from a submarine landslide triggered by an earthquake. This study customizes for the GoM a first-order probabilistic landslide tsunami hazard assessment. Monte Carlo Simulation (MCS) is employed to determine landslide configurations based on distributions obtained from observational submarine mass failure (SMF) data. Our MCS approach incorporates a Cholesky decomposition method for correlated landslide size parameters to capture correlations seen in the data as well as uncertainty inherent in these events. Slope stability analyses are performed using landslide and sediment properties and regional seismic loading to determine landslide configurations which fail and produce a tsunami. The probability of each tsunamigenic failure is calculated based on the joint probability of slope failure and probability of the triggering earthquake. We are thus able to estimate sizes and return periods for probabilistic maximum credible landslide scenarios. We find that the Cholesky decomposition approach generates landslide parameter distributions that retain the trends seen in observational data, improving the statistical validity and relevancy of the MCS technique in the context of landslide tsunami hazard assessment. Estimated return periods suggest that probabilistic maximum credible SMF events in the north and northwest GoM have a recurrence of 5000-8000 years, in agreement with age dates of observed deposits.
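
    The Cholesky step for sampling correlated landslide size parameters can be sketched as below, assuming lognormal marginals and a two-parameter (length, thickness) correlation structure; the correlation coefficient and marginal parameters are illustrative assumptions, not the values fitted to the SMF data.

      import numpy as np

      rng = np.random.default_rng(42)
      corr = np.array([[1.0, 0.7],
                       [0.7, 1.0]])                 # correlation of log-size parameters
      L = np.linalg.cholesky(corr)
      z = rng.standard_normal((10_000, 2)) @ L.T    # correlated standard normals
      length = np.exp(np.log(5000.0) + 0.5 * z[:, 0])    # m, illustrative marginal
      thickness = np.exp(np.log(40.0) + 0.4 * z[:, 1])   # m, illustrative marginal
      # the target correlation is recovered in the samples
      print(np.corrcoef(np.log(length), np.log(thickness))[0, 1])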

  13. Vaccine stabilization: research, commercialization, and potential impact.

    PubMed

    Kristensen, Debra; Chen, Dexiang; Cummings, Ray

    2011-09-22

    All vaccines are susceptible to damage by elevated temperatures and many are also damaged by freezing. The distribution, storage, and use of vaccines therefore present challenges that could be reduced by enhanced thermostability, with resulting improvements in vaccine effectiveness. Formulation and processing technologies exist that can improve the stability of vaccines at temperature extremes; however, customization is required for individual vaccines and results are variable. Considerations affecting decisions about stabilization approaches include development cost, manufacturing cost, and the ease of use of the final product. Public sector agencies can incentivize vaccine developers to prioritize stabilization efforts through advocacy and by implementing policies that increase demand for thermostable vaccines. Copyright © 2011 Elsevier Ltd. All rights reserved.

  14. USP7S-dependent inactivation of Mule regulates DNA damage signalling and repair.

    PubMed

    Khoronenkova, Svetlana V; Dianov, Grigory L

    2013-02-01

    The E3 ubiquitin ligase Mule/ARF-BP1 plays an important role in the cellular DNA damage response by controlling base excision repair and p53 protein levels. However, how the activity of Mule is regulated in response to DNA damage is currently unknown. Here, we report that the Ser18-containing isoform of the USP7 deubiquitylation enzyme (USP7S) controls Mule stability by preventing its self-ubiquitylation and subsequent proteasomal degradation. We find that in response to DNA damage, downregulation of USP7S leads to self-ubiquitylation and proteasomal degradation of Mule, which eventually leads to p53 accumulation. Cells that are unable to downregulate Mule show reduced ability to upregulate p53 levels in response to DNA damage. We also find that, as Mule inactivation is required for stabilization of base excision repair enzymes, the failure of cells to downregulate Mule after DNA damage results in deficient DNA repair. Our data describe a novel mechanism by which Mule is regulated in response to DNA damage and coordinates cellular DNA damage responses and DNA repair.

  15. Advanced Reactor Passive System Reliability Demonstration Analysis for an External Event

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bucknor, Matthew; Grabaskas, David; Brunett, Acacia J.

    We report that many advanced reactor designs rely on passive systems to fulfill safety functions during accident sequences. These systems depend heavily on boundary conditions to induce a motive force, meaning the system can fail to operate as intended because of deviations in boundary conditions, rather than as the result of physical failures. Furthermore, passive systems may operate in intermediate or degraded modes. These factors make passive system operation difficult to characterize within a traditional probabilistic framework that only recognizes discrete operating modes and does not allow for the explicit consideration of time-dependent boundary conditions. Argonne National Laboratory has been examining various methodologies for assessing passive system reliability within a probabilistic risk assessment for a station blackout event at an advanced small modular reactor. This paper provides an overview of a passive system reliability demonstration analysis for an external event. Considering an earthquake with the possibility of site flooding, the analysis focuses on the behavior of the passive Reactor Cavity Cooling System following potential physical damage and system flooding. The assessment approach seeks to combine mechanistic and simulation-based methods to leverage the benefits of the simulation-based approach without the need to substantially deviate from conventional probabilistic risk assessment techniques. Lastly, although this study is presented as only an example analysis, the results appear to demonstrate a high level of reliability of the Reactor Cavity Cooling System (and the reactor system in general) for the postulated transient event.

  16. Nek7 Protects Telomeres from Oxidative DNA Damage by Phosphorylation and Stabilization of TRF1.

    PubMed

    Tan, Rong; Nakajima, Satoshi; Wang, Qun; Sun, Hongxiang; Xue, Jing; Wu, Jian; Hellwig, Sabine; Zeng, Xuemei; Yates, Nathan A; Smithgall, Thomas E; Lei, Ming; Jiang, Yu; Levine, Arthur S; Su, Bing; Lan, Li

    2017-03-02

    Telomeric repeat binding factor 1 (TRF1) is essential to the maintenance of telomere chromatin structure and integrity. However, how telomere integrity is maintained, especially in response to damage, remains poorly understood. Here, we identify Nek7, a member of the Never in Mitosis Gene A (NIMA) kinase family, as a regulator of telomere integrity. Nek7 is recruited to telomeres and stabilizes TRF1 at telomeres after damage in an ATM activation-dependent manner. Nek7 deficiency leads to telomere aberrations, long-lasting γH2AX and 53BP1 foci, and augmented cell death upon oxidative telomeric DNA damage. Mechanistically, Nek7 interacts with and phosphorylates TRF1 on Ser114, which prevents TRF1 from binding to Fbx4, an Skp1-Cul1-F box E3 ligase subunit, thereby alleviating proteasomal degradation of TRF1, leading to a stable association of TRF1 with Tin2 to form a shelterin complex. Our data reveal a mechanism of efficient protection of telomeres from damage through Nek7-dependent stabilization of TRF1. Copyright © 2017 Elsevier Inc. All rights reserved.

  17. Application of Medium and Seasonal Flood Forecasts for Agriculture Damage Assessment

    NASA Astrophysics Data System (ADS)

    Fakhruddin, Shamsul; Ballio, Francesco; Menoni, Scira

    2015-04-01

    Early warning is a key element of disaster risk reduction. In recent decades, major advancements have been made in medium-range and seasonal flood forecasting. This progress provides a great opportunity to reduce agricultural damage and improve advisories for early action and planning for flood hazards. This approach can facilitate proactive rather than reactive management of the adverse consequences of floods. In the agricultural sector, for instance, farmers can take a diversity of options such as changing cropping patterns, applying fertilizer, irrigating, and changing planting timing. An experimental medium-range (1-10 day) and seasonal (20-25 day) flood forecasting model has been developed for Thailand and Bangladesh. It provides 51 sets of discharge ensemble forecasts for 1-10 days, with significant persistence and high certainty, and qualitative outlooks for 20-25 days. This type of forecast could assist farmers and other stakeholders with differentiated preparedness activities. These ensemble probabilistic flood forecasts have been customized based on user needs for community-level application focused on the agricultural system. The vulnerabilities of the agricultural system were calculated based on exposure, sensitivity, and adaptive capacity. Indicators for risk and vulnerability assessment were developed through community consultations. The forecast lead-time requirements, user needs, impacts, and management options for crops were identified through focus group discussions, informal interviews, and community surveys. This paper illustrates potential applications of such probabilistic medium-range and seasonal flood forecasts in a way that is not commonly practiced globally today.

  18. A Decision Support System for effective use of probability forecasts

    NASA Astrophysics Data System (ADS)

    De Kleermaeker, Simone; Verkade, Jan

    2013-04-01

    Often, water management decisions are based on hydrological forecasts. These forecasts, however, are affected by inherent uncertainties. It is increasingly common for forecasting agencies to make explicit estimates of these uncertainties and thus produce probabilistic forecasts. Associated benefits include the decision makers' increased awareness of forecasting uncertainties and the potential for risk-based decision-making. Also, a stricter separation of responsibilities between forecasters and decision makers can be made. However, simply having probabilistic forecasts available is not sufficient to realise the associated benefits. Additional effort is required in areas such as forecast visualisation and communication, decision-making under uncertainty, and forecast verification. The revised separation of responsibilities also requires a shift in institutional arrangements. A recent study identified a number of additional issues related to the effective use of probability forecasts. When moving from deterministic to probability forecasting, a dimension is added to an already multi-dimensional problem; this makes it increasingly difficult for forecast users to extract relevant information from a forecast. A second issue is that while probability forecasts provide a necessary ingredient for risk-based decision-making, other ingredients may not be present. For example, in many cases no estimates of flood damage, of the costs of management measures, or of damage reduction are available. This paper presents the results of the study, including some suggestions for resolving these issues and the integration of those solutions in a prototype decision support system (DSS). A pathway for further development of the DSS is outlined.

  19. A Simple Demonstration of Concrete Structural Health Monitoring Framework

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mahadevan, Sankaran; Agarwal, Vivek; Cai, Guowei

    Assessment and management of aging concrete structures in nuclear power plants require a more systematic approach than simple reliance on existing code margins of safety. Structural health monitoring of concrete structures aims to understand the current health condition of a structure based on heterogeneous measurements to produce high-confidence actionable information regarding structural integrity that supports operational and maintenance decisions. This ongoing research project is seeking to develop a probabilistic framework for health diagnosis and prognosis of aging concrete structures in a nuclear power plant subjected to physical, chemical, environmental, and mechanical degradation. The proposed framework consists of four elements—damage modeling, monitoring, data analytics, and uncertainty quantification. This report describes a proof-of-concept example on a small concrete slab subjected to a freeze-thaw experiment that explores techniques in each of the four elements of the framework and their integration. An experimental set-up at Vanderbilt University's Laboratory for Systems Integrity and Reliability is used to research an effective combination of full-field techniques that include infrared thermography, digital image correlation, and ultrasonic measurement. The measured data are linked to the probabilistic framework: the thermography, digital image correlation, and ultrasonic measurement data are used for Bayesian calibration of model parameters, for diagnosis of damage, and for prognosis of future damage. The proof-of-concept demonstration presented in this report highlights the significance of each element of the framework and their integration.

  20. Probabilistic modeling of condition-based maintenance strategies and quantification of its benefits for airliners

    NASA Astrophysics Data System (ADS)

    Pattabhiraman, Sriram

    Airplane fuselage structures are designed with the concept of damage tolerance, wherein small damage is allowed to remain on the airplane, and damage that would otherwise affect the safety of the structure is repaired. Damage critical to the safety of the fuselage is repaired by scheduling maintenance at pre-determined intervals. Scheduling maintenance is an interesting trade-off between damage tolerance and cost: tolerance of larger damage would require less frequent maintenance and hence a lower cost to maintain a certain level of reliability. Alternatively, condition-based maintenance techniques have been developed using on-board sensors, which track damage continuously and request maintenance only when the damage size crosses a particular threshold. This permits tolerance of larger damage than scheduled maintenance, leading to savings in cost. This work quantifies the savings of condition-based maintenance over scheduled maintenance, and also quantifies the conversion of the cost savings into weight savings. Structural health monitoring will need time to establish itself as a stand-alone system for maintenance, owing to concerns about its diagnosis accuracy and reliability. This work therefore also investigates the effect of synchronizing the structural health monitoring system with scheduled maintenance, using on-board SHM equipment to skip structural airframe maintenance (a subset of scheduled maintenance) whenever it is deemed unnecessary, while maintaining a desired level of structural safety. The work also predicts the necessary maintenance for a fleet of airplanes based on the current damage status of the airplanes, analyses the possibility of false alarms, wherein maintenance is requested with no critical damage on the airplane, and uses SHM as a tool to identify lemons in a fleet, i.e., those airplanes that warrant more maintenance trips than the average behavior of the fleet.
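
    The scheduled-versus-condition-based trade-off can be caricatured with a Monte Carlo sketch: damage grows stochastically, a repair is triggered whenever the damage size crosses a threshold at an inspection epoch, and a larger tolerable threshold yields fewer maintenance trips. The growth model, thresholds, and intervals are illustrative assumptions only.

      import numpy as np

      rng = np.random.default_rng(7)

      def trips(threshold, a0=1.0, growth=0.25, n_intervals=30):
          # One airframe life: multiplicative stochastic damage growth per interval;
          # repair (reset) whenever the size exceeds the tolerable threshold.
          a, count = a0, 0
          for _ in range(n_intervals):
              a *= np.exp(rng.normal(growth, 0.05))
              if a > threshold:
                  a, count = a0, count + 1
          return count

      scheduled = np.mean([trips(threshold=8.0) for _ in range(2_000)])
      cbm = np.mean([trips(threshold=15.0) for _ in range(2_000)])
      print(scheduled, cbm)   # tolerating larger damage means fewer trips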

  1. Spin-transfer-torque efficiency enhanced by edge-damage of perpendicular magnetic random access memories

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Song, Kyungmi; Lee, Kyung-Jin, E-mail: kj-lee@korea.ac.kr; Department of Materials Science and Engineering, Korea University, Seoul 136-713

    2015-08-07

    We numerically investigate the effect of magnetic and electrical damage at the edge of a perpendicular magnetic random access memory (MRAM) cell on the spin-transfer-torque (STT) efficiency, defined as the ratio of the thermal stability factor to the switching current. We find that the switching mode of an edge-damaged cell differs from that of an undamaged cell, which results in a sizable reduction in the switching current. Together with a marginal reduction of the thermal stability factor of an edge-damaged cell, this feature makes the STT efficiency large. Our results suggest that precise edge control is viable for the optimization of STT-MRAM.

  2. Robust Control Design for Uncertain Nonlinear Dynamic Systems

    NASA Technical Reports Server (NTRS)

    Kenny, Sean P.; Crespo, Luis G.; Andrews, Lindsey; Giesy, Daniel P.

    2012-01-01

    Robustness to parametric uncertainty is fundamental to successful control system design and as such it has been at the core of many design methods developed over the decades. Despite its prominence, most of the work on robust control design has focused on linear models and uncertainties that are non-probabilistic in nature. Recently, researchers have acknowledged this disparity and have been developing theory to address a broader class of uncertainties. This paper presents an experimental application of robust control design for a hybrid class of probabilistic and non-probabilistic parametric uncertainties. The experimental apparatus is based upon the classic inverted pendulum on a cart. The physical uncertainty is realized by a known additional lumped mass at an unknown location on the pendulum. This unknown location has the effect of substantially altering the nominal frequency and controllability of the nonlinear system, and in the limit has the capability to make the system neutrally stable and uncontrollable. Another uncertainty to be considered is a direct current motor parameter. The control design objective is to design a controller that satisfies stability, tracking error, control power, and transient behavior requirements for the largest range of parametric uncertainties. This paper presents an overview of the theory behind the robust control design methodology and the experimental results.

  3. Low power lasers on genomic stability.

    PubMed

    Trajano, Larissa Alexsandra da Silva Neto; Sergio, Luiz Philippe da Silva; Stumbo, Ana Carolina; Mencalha, Andre Luiz; Fonseca, Adenilson de Souza da

    2018-03-01

    Exposure of cells to genotoxic agents causes modifications in DNA, resulting in alterations to the genome. To reduce genomic instability, cells have DNA damage responses in which DNA repair proteins remove these lesions. Excessive free radicals cause DNA damage, which is repaired by the base excision repair and nucleotide excision repair pathways. When non-oxidative lesions occur, genomic stability is maintained through checkpoints in which the cell cycle stops and DNA repair occurs. Telomere shortening is related to the development of various diseases, such as cancer. Low power lasers are used for treatment of a number of diseases, but they are also suggested to cause DNA damage at sub-lethal levels and to alter transcript levels of DNA repair genes. This review focuses on the modulation of genomic and telomere stabilization as possible targets to improve therapeutic protocols based on low power lasers. Several studies have been carried out to evaluate laser-induced effects on genome and telomere stabilization, suggesting that exposure to these lasers modulates DNA repair mechanisms, telomere maintenance, and genomic stabilization. Although the mechanisms are not yet well understood, low power lasers could be effective against DNA-harmful agents by induction of DNA repair mechanisms and modulation of telomere maintenance and genomic stability. Copyright © 2018 Elsevier B.V. All rights reserved.

  4. A DESIGN METHOD FOR RETAINING WALL BASED ON RETURN PERIOD OF RAINFALL AND SNOWMELT

    NASA Astrophysics Data System (ADS)

    Ebana, Ryo; Uehira, Kenichiro; Yamada, Tadashi

    The main purpose of this study is to develop a new design method for retaining walls in cold districts. In cold districts, snowfall and snowmelt are among the main factors in sediment-related disasters; however, the effect of snowmelt is not taken into account in sediment disaster precaution and evacuation systems. In this study, we examine a past slope failure disaster, quantitatively evaluate the effect of rainfall and snowmelt on the groundwater level, and then verify the stability of the slope. The water supplied to the slope was determined from a probabilistic treatment of the snowmelt using the degree-day method. Furthermore, a slope stability analysis was carried out based on the groundwater level obtained from unsaturated infiltration flow and saturated seepage flow simulations. The slope stability analysis showed that the effect of the groundwater level on the stability of the slope is much larger than that of other factors.
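
    The degree-day snowmelt term used above is simple enough to state directly: daily melt is proportional to the positive part of the air temperature. A minimal sketch, with an assumed degree-day factor:

      def degree_day_melt(temps_c, ddf=4.0, t_base=0.0):
          # melt (mm w.e./day) = DDF * max(T - T_base, 0); DDF in mm per degC per day
          return [ddf * max(t - t_base, 0.0) for t in temps_c]

      # water supplied to the slope = rainfall + melt, fed into the seepage analysis
      print(degree_day_melt([-3.0, 1.5, 4.2]))   # [0.0, 6.0, 16.8]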

  5. Computational modelling of the cerebral cortical microvasculature: effect of x-ray microbeams versus broad beam irradiation

    NASA Astrophysics Data System (ADS)

    Merrem, A.; Bartzsch, S.; Laissue, J.; Oelfke, U.

    2017-05-01

    Microbeam Radiation Therapy is an innovative pre-clinical strategy which uses arrays of parallel, tens-of-micrometres-wide kilovoltage photon beams to treat tumours. These x-ray beams are typically generated at a synchrotron source. It has been shown that these beam geometries allow exceptional normal-tissue sparing from radiation damage while still being effective in tumour ablation. A final biological explanation for this enhanced therapeutic ratio has still not been found; some experimental data support an important role of the vasculature. In this work, the effect of microbeams on a normal microvascular network of the cerebral cortex was assessed in computer simulations and compared to the effect of homogeneous, seamless exposures at equal energy absorption. The anatomy of a cerebral microvascular network and the inflicted radiation damage were simulated to closely mimic experimental data, using a novel probabilistic model of radiation damage to blood vessels. It was found that the spatial dose fractionation by microbeam arrays significantly decreased the vascular damage, and the higher the peak-to-valley dose ratio, the more pronounced the sparing effect. Simulations of the radiation damage as a function of morphological parameters of the vascular network demonstrated that the distribution of blood vessel radii is a key parameter determining both the overall radiation damage of the vasculature and the dose-dependent differential effect of microbeam irradiation.

  6. Games With Estimation of Non-Damage Objectives

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Canavan, G.H.

    1998-09-14

    Games against nature illustrate the role of non-damage objectives in producing conflict with uncertain rewards, and the role of probing and estimation in reducing that uncertainty and restoring optimal strategies. Such games capture the role of objectives and uncertainty at the core of detailed treatments of crisis stability, and they can also illustrate how these processes generate and deepen crises, as well as the optimal strategies that might be used to end them. This note discusses two essential elements of the analysis of crisis stability that are omitted from current treatments based on first-strike stability: a non-damage objective that motivates disputes sufficiently serious to lead to conflicts, and the process of sequential interactions that could cause those conflicts to deepen. The model used is a game against nature, simplified sufficiently to make the role of each of these elements obvious.

  7. Quantifying the risks of winter damage on overwintering crops under future climates: Will low-temperature damage be more likely in warmer climates?

    NASA Astrophysics Data System (ADS)

    Vico, G.; Weih, M.

    2014-12-01

    Autumn-sown crops act as winter cover crop, reducing soil erosion and nutrient leaching, while potentially providing higher yields than spring varieties in many environments. Nevertheless, overwintering crops are exposed for longer periods to the vagaries of weather conditions. Adverse winter conditions, in particular, may negatively affect the final yield, by reducing crop survival or its vigor. The net effect of the projected shifts in climate is unclear. On the one hand, warmer temperatures may reduce the frequency of low temperatures, thereby reducing damage risk. On the other hand, warmer temperatures, by reducing plant acclimation level and the amount and duration of snow cover, may increase the likelihood of damage. Thus, warmer climates may paradoxically result in more extensive low temperature damage and reduced viability for overwintering plants. The net effect of a shift in climate is explored by means of a parsimonious probabilistic model, based on a coupled description of air temperature, snow cover, and crop tolerable temperature. Exploiting an extensive dataset of winter wheat responses to low temperature exposure, the risk of winter damage occurrence is quantified under conditions typical of northern temperate latitudes. The full spectrum of variations expected with climate change is explored, quantifying the joint effects of alterations in temperature averages and their variability as well as shifts in precipitation. The key features affecting winter wheat vulnerability to low temperature damage under future climates are singled out.
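
    The structure of such a parsimonious model can be sketched probabilistically: damage occurs in a winter when the coldest air temperature falls below the acclimation-dependent tolerable temperature while snow cover is too thin to insulate the crop. Every distribution and coefficient below is an invented stand-in, chosen only to show that warming can shift the two competing terms in opposite directions.

      import numpy as np

      rng = np.random.default_rng(3)

      def winter_damage_prob(t_mean, n=100_000):
          t_min = rng.normal(t_mean, 6.0, n)               # coldest-night temperature (degC)
          snow = rng.gamma(2.0, 4.0 - 0.4 * t_mean, n)     # warmer winter -> thinner snow (cm)
          lt50 = -25.0 + (t_mean + 10.0)                   # warmer winter -> weaker acclimation
          exposed = snow < 10.0                            # thin snow insulates poorly
          return np.mean(exposed & (t_min < lt50))

      # a warmer mean winter can show a *higher* damage probability
      print(winter_damage_prob(-10.0), winter_damage_prob(-6.0))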

  8. CFD Assessment of Aerodynamic Degradation of a Subsonic Transport Due to Airframe Damage

    NASA Technical Reports Server (NTRS)

    Frink, Neal T.; Pirzadeh, Shahyar Z.; Atkins, Harold L.; Viken, Sally A.; Morrison, Joseph H.

    2010-01-01

    A computational study is presented to assess the utility of two NASA unstructured Navier-Stokes flow solvers for capturing the degradation in static stability and aerodynamic performance of a NASA Generic Transport Model (GTM) due to airframe damage. The approach is to correlate computational results with a substantial subset of experimental data for the GTM undergoing progressive losses to the wing, vertical tail, and horizontal tail components. The ultimate goal is to advance the use of computational data in the creation of advanced flight simulation models of damaged subsonic aircraft in order to improve pilot training. Results presented in this paper demonstrate good correlations with slope-derived quantities, such as pitch static margin and static directional stability, and with the incremental rolling moment due to wing damage. This study further demonstrates that high-fidelity Navier-Stokes flow solvers could augment flight simulation models with additional aerodynamic data for various airframe damage scenarios.

  9. The numerical-statistical approach for hazard prediction of landslides and its application in Ukraine

    NASA Astrophysics Data System (ADS)

    Trofimchuk, O.; Kaliukh, Iu.

    2012-04-01

    More than 90% of the territory of Ukraine has complex ground conditions. Unpredictable changes in the natural geological and man-made factors governing ground conditions may lead to dangerous deformation processes resulting in accidents and disasters. Among them, landslides rank first in Ukraine by the amount of inflicted damage, and second only to earthquakes worldwide. In total, about 23,000 landslides have been identified in the territory of Ukraine. The standard deterministic procedure for assessing slope stability, especially with a lack of reference engineering-geological data, in many cases yields estimated stability coefficients that differ from the real ones. Application of a probabilistic approach allows the variable properties of soils to be taken into account and the danger and risk of landslide movements to be determined. The choice of landslide protection measures is directly connected with risk: expensive but reliable, or cheaper but with a greater probability of accidents. The risk determines the consequences, whether economic, social or other, of a potential landslide movement on the slope, both during construction of a retaining structure and during its further maintenance. The essence of risk determination is the study and extrapolation of past events for each specific occurrence. Expected outcomes and probable damages resulting from a calculated and accepted risk can be determined only with a certain level of uncertainty; improving the accuracy of the numerical and analytical estimates used in calculating the risk magnitude therefore makes it possible to reduce this uncertainty. Calculations of the Chernivtsi shear landslides (Ukraine) were made with the Plaxis software, and the risk of displacement was assessed for the typical cross-section of the landslide-prone slope. The calculations showed that seismic events of intensity up to 6 points can significantly impair the characteristics of the soil along the sliding surface and affect the slope stability and the landslide pressure on supporting structures. A further increase in site seismicity (up to 7-8 points) results in a substantial decrease of the stability coefficient: the slope, at first stable, passes into a limiting equilibrium state, and then becomes unstable. Based on the calculation results, it is possible to follow step by step the process of stress redistribution in the landslide slope as seismicity increases, which eventually causes the slope to move (unloading the accumulated stresses). Landslide risk management measures aim to ensure and maintain an acceptable, or in some cases allowable, risk level. In addition to calculations of soil mass stability, structural parameters, drawings and cost estimates, it is also necessary to take measures to compensate for uncertainties and prevent unforeseen situations due to incomplete (unreliable) surveys, etc. Depending on the initial situation or requirements, landslide risk management addresses three problems: prevention of negative consequences; reduction of danger and risk; and rectification of consequences and prevention of new dangers.

  10. Lamb wave-based damage quantification and probability of detection modeling for fatigue life assessment of riveted lap joint

    NASA Astrophysics Data System (ADS)

    He, Jingjing; Wang, Dengjiang; Zhang, Weifang

    2015-03-01

    This study presents an experimental and modeling study for damage detection and quantification in riveted lap joints. Embedded lead zirconate titanate piezoelectric (PZT) ceramic wafer-type sensors are employed to perform in-situ non-destructive testing during fatigue cyclical loading. A multi-feature integration method is developed to quantify the crack size using signal features of correlation coefficient, amplitude change, and phase change. In addition, a probability of detection (POD) model is constructed to quantify the reliability of the developed sizing method. Using the developed crack size quantification method and the resulting POD curve, probabilistic fatigue life prediction can be performed to provide comprehensive information for decision-making. The effectiveness of the overall methodology is demonstrated and validated using several aircraft lap joint specimens from different manufacturers and under different loading conditions.
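
    A common functional form for POD models such as the one mentioned above is log-normal, POD(a) = Phi((ln a - mu) / sigma); whether the authors used this exact form is not stated here, and the parameter values below are illustrative assumptions that would in practice be fitted to detection trials.

      import numpy as np
      from scipy.stats import norm

      MU, SIGMA = np.log(2.0), 0.5    # hypothetical fit (crack size a in mm)

      def pod(a):
          # log-normal POD model: POD(a) = Phi((ln a - MU) / SIGMA)
          return norm.cdf((np.log(a) - MU) / SIGMA)

      a90 = np.exp(MU + SIGMA * norm.ppf(0.90))   # size detected 90% of the time
      print(np.round(pod(np.array([1.0, 2.0, 4.0])), 3), round(a90, 2))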

  11. Effects of stacking sequence on impact damage resistance and residual strength for quasi-isotropic laminates

    NASA Technical Reports Server (NTRS)

    Dost, Ernest F.; Ilcewicz, Larry B.; Avery, William B.; Coxon, Brian R.

    1991-01-01

    Residual strength of an impacted composite laminate is dependent on details of the damage state. Stacking sequence was varied to judge its effect on damage caused by low-velocity impact. This was done for quasi-isotropic layups of a toughened composite material. Experimental observations on changes in the impact damage state and postimpact compressive performance were presented for seven different laminate stacking sequences. The applicability and limitations of analysis compared to experimental results were also discussed. Postimpact compressive behavior was found to be a strong function of the laminate stacking sequence. This relationship was found to depend on thickness, stacking sequence, size, and location of sublaminates that comprise the impact damage state. The postimpact strength for specimens with a relatively symmetric distribution of damage through the laminate thickness was accurately predicted by models that accounted for sublaminate stability and in-plane stress redistribution. An asymmetric distribution of damage in some laminate stacking sequences tended to alter specimen stability. Geometrically nonlinear finite element analysis was used to predict this behavior.

  12. Divergence instability of pipes conveying fluid with uncertain flow velocity

    NASA Astrophysics Data System (ADS)

    Rahmati, Mehdi; Mirdamadi, Hamid Reza; Goli, Sareh

    2018-02-01

    This article investigates the probabilistic stability of pipes conveying fluid with stochastic flow velocity in the time domain. The study focuses on the effects of randomness in the flow velocity on the stability of pipes conveying fluid, whereas most research efforts have focused only on the influence of deterministic parameters on system stability. The Euler-Bernoulli beam and plug-flow theories are employed to model the pipe structure and the internal flow, respectively. In addition, the flow velocity is considered a stationary random process with Gaussian distribution. The stochastic averaging method and Routh's stability criterion are then used to investigate the stability conditions of the system. Consequently, the effects of boundary conditions, viscoelastic damping, mass ratio, and elastic foundation on the stability regions are discussed. Results show that the critical mean flow velocity decreases with increasing power spectral density (PSD) of the random velocity. Moreover, as the PSD increases from zero, the effects of the boundary-condition type and of the presence of an elastic foundation diminish, while the influences of viscoelastic damping and mass ratio grow. Finally, to make the study more applicable, regression analysis is used to develop design equations and facilitate further analyses for design purposes.

  13. Bounding the first exit from the basin: Independence times and finite-time basin stability

    NASA Astrophysics Data System (ADS)

    Schultz, Paul; Hellmann, Frank; Webster, Kevin N.; Kurths, Jürgen

    2018-04-01

    We study the stability of deterministic systems, given sequences of large, jump-like perturbations. Our main result is the derivation of a lower bound for the probability of the system to remain in the basin, given that perturbations are rare enough. This bound is efficient to evaluate numerically. To quantify rare enough, we define the notion of the independence time of such a system. This is the time after which a perturbed state has probably returned close to the attractor, meaning that subsequent perturbations can be considered separately. The effect of jump-like perturbations that occur at least the independence time apart is thus well described by a fixed probability to exit the basin at each jump, allowing us to obtain the bound. To determine the independence time, we introduce the concept of finite-time basin stability, which corresponds to the probability that a perturbed trajectory returns to an attractor within a given time. The independence time can then be determined as the time scale at which the finite-time basin stability reaches its asymptotic value. Besides that, finite-time basin stability is a novel probabilistic stability measure on its own, with potential broad applications in complex systems.
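
    Finite-time basin stability lends itself to a direct Monte Carlo estimate: draw random perturbed states, integrate the deterministic dynamics for a time T, and count the fraction that has returned close to the attractor. The damped-pendulum system, tolerance, and sample region below are illustrative choices, not from the paper.

      import numpy as np
      from scipy.integrate import solve_ivp

      def finite_time_basin_stability(f, attractor, sample, T, eps=0.05, n=300):
          hits = 0
          for _ in range(n):
              sol = solve_ivp(f, (0.0, T), sample(), rtol=1e-6)
              hits += np.linalg.norm(sol.y[:, -1] - attractor) < eps
          return hits / n

      # damped pendulum, x = (angle, velocity), stable fixed point at the origin
      f = lambda t, x: np.array([x[1], -0.5 * x[1] - np.sin(x[0])])
      rng = np.random.default_rng(0)
      sample = lambda: rng.uniform([-np.pi, -2.0], [np.pi, 2.0])
      for T in (5.0, 20.0, 80.0):   # the plateau in T marks the independence time scale
          print(T, finite_time_basin_stability(f, np.zeros(2), sample, T))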

  14. Long Non-coding RNA, PANDA, Contributes to the Stabilization of p53 Tumor Suppressor Protein.

    PubMed

    Kotake, Yojiro; Kitagawa, Kyoko; Ohhata, Tatsuya; Sakai, Satoshi; Uchida, Chiharu; Niida, Hiroyuki; Naemura, Madoka; Kitagawa, Masatoshi

    2016-04-01

    P21-associated noncoding RNA DNA damage-activated (PANDA) is induced in response to DNA damage and represses apoptosis by inhibiting the function of nuclear transcription factor Y subunit alpha (NF-YA) transcription factor. Herein, we report that PANDA affects regulation of p53 tumor-suppressor protein. U2OS cells were transfected with PANDA siRNAs. At 72 h post-transfection, cells were subjected to immunoblotting and quantitative reverse transcription-polymerase chain reaction. Depletion of PANDA was associated with decreased levels of p53 protein, but not p53 mRNA. The stability of p53 protein was markedly reduced by PANDA silencing. Degradation of p53 protein by silencing PANDA was prevented by treatment of MG132, a proteasome inhibitor. Moreover, depletion of PANDA prevented accumulation of p53 protein, as a result of DNA damage, induced by the genotoxic agent etoposide. These results suggest that PANDA stabilizes p53 protein in response to DNA damage, and provide new insight into the regulatory mechanisms of p53. Copyright© 2016 International Institute of Anticancer Research (Dr. John G. Delinassios), All rights reserved.

  15. A Probabilistic and Observation Based Methodology to Estimate Small Craft Harbor Vulnerability to Tsunami Events

    NASA Astrophysics Data System (ADS)

    Keen, A. S.; Lynett, P. J.; Ayca, A.

    2016-12-01

    Because of the damage resulting from the 2010 Chile and 2011 Japan tele-tsunamis, the tsunami risk to the small craft marinas in California has become an important concern. The talk will outline an assessment tool which can be used to assess the tsunami hazard to small craft harbors. The methodology is based on the demand on, and structural capacity of, the floating dock system, composed of floating docks/fingers and moored vessels. The structural demand is determined using a Monte Carlo methodology: a probabilistic computational tool used where the governing equations might be well known, but the independent variables of the input (demand) as well as of the resisting structural components (capacity) may not be completely known. The Monte Carlo approach uses a distribution of each variable, draws a random value from within the described parameters, and generates a single computation; the process then repeats hundreds or thousands of times. The numerical model "Method of Splitting Tsunamis" (MOST) has been used to determine the inputs for the small craft harbors within California. Hydrodynamic model results of current speed, direction, and surface elevation were incorporated via the drag equations to provide the basis of the demand term. To determine the capacities, an inspection program was developed to identify common features of structural components. A total of six harbors have been inspected, ranging from Crescent City in Northern California to Oceanside Harbor in Southern California. Results from the inspection program were used to develop component capacity tables which incorporate the basic specifications of each component (e.g., bolt size and configuration) and a reduction factor (which accounts for the component's reduction in capacity with age) to estimate in situ capacities. Like the demand term, these capacities are added probabilistically into the model. To date the model has been applied to Santa Cruz Harbor as well as Noyo River. Once calibrated, the model was able to hindcast the damage produced in Santa Cruz Harbor during the 2010 Chile and 2011 Japan events. Results of the Santa Cruz analysis will be presented and discussed.
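
    The demand side of the Monte Carlo model described above rests on the drag equation, F = 1/2 rho Cd A U^2, with the current speed U drawn from the hydrodynamic results and the capacity drawn from the inspection-based tables. The following sketch uses invented distributions and component ratings purely for illustration.

      import numpy as np

      rng = np.random.default_rng(5)
      n = 100_000
      rho, Cd, A = 1025.0, 1.0, 8.0                   # seawater density, drag coeff., area (m^2)
      U = rng.lognormal(np.log(2.0), 0.35, n)         # current speed (m/s), stand-in for MOST output
      demand = 0.5 * rho * Cd * A * U**2 / 1000.0     # drag load on the dock component (kN)

      # capacity: rated strength degraded by an age-dependent reduction factor
      capacity = rng.normal(60.0, 10.0, n) * rng.uniform(0.6, 1.0, n)   # kN
      print("P(component failure) =", np.mean(demand > capacity))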

  16. Effect of Carbon-Cycle Uncertainty on Estimates of the 1.5°C Carbon Budget

    NASA Astrophysics Data System (ADS)

    Mengis, N.; Jalbert, J.; Partanen, A. I.; Matthews, D.

    2017-12-01

    In December 2015, the participants of COP21 agreed to pursue efforts to limit the global temperature increase to 1.5°C relative to the preindustrial level. A robust estimate of the carbon budget for this temperature target is one precondition for well-informed political discussions. These estimates, however, depend on Earth system models and need to account for model-inherent uncertainties. Here, we quantify the effect of carbon-cycle uncertainty within an intermediate-complexity Earth system model. Using a Bayesian inversion approach, we obtain a probabilistic estimate for the 1.5°C carbon budget of 66 PgC, with a range of 20 to 112 PgC. This estimate is in good agreement with the IPCC's estimate and additionally provides a probabilistic range accounting for uncertainties in the natural carbon sinks. Furthermore, our results suggest that for a long-term temperature stabilization at 1.5°C, negative fossil fuel emissions on the order of 1 PgC yr-1 would be needed. Two effects cause the fossil fuel emissions during temperature stabilization to turn negative: 1) the reduced uptake potential of the natural carbon sinks, which arises from increasing ocean temperatures and the fact that the land turns from a net carbon sink to a source; and 2) the residual positive anthropogenic forcing in the extended scenario, which remains as high as 2.5 W m-2 until the end of 2200. In contrast to previous studies, our results suggest the need for negative fossil fuel emissions for long-term temperature stabilization, to compensate for residual anthropogenic forcing and a decreasing natural carbon sink potential.

  17. Diseases Associated with Defective Responses to DNA Damage

    PubMed Central

    O’Driscoll, Mark

    2012-01-01

    Within the last decade, multiple novel congenital human disorders have been described with genetic defects in known and/or novel components of several well-known DNA repair and damage response pathways. Examples include disorders of impaired nucleotide excision repair, DNA double-strand and single-strand break repair, as well as compromised DNA damage-induced signal transduction including phosphorylation and ubiquitination. These conditions further reinforce the importance of multiple genome stability pathways for health and development in humans. Furthermore, these conditions inform our knowledge of the biology of the mechanics of genome stability and in some cases provide potential routes to help exploit these pathways therapeutically. Here, I will review a selection of these exciting findings from the perspective of the disorders themselves, describing how they were identified, how genotype informs phenotype, and how these defects contribute to our growing understanding of genome stability pathways. PMID:23209155

  18. Role of the site of synaptic competition and the balance of learning forces for Hebbian encoding of probabilistic Markov sequences

    PubMed Central

    Bouchard, Kristofer E.; Ganguli, Surya; Brainard, Michael S.

    2015-01-01

    The majority of distinct sensory and motor events occur as temporally ordered sequences with rich probabilistic structure. Sequences can be characterized by the probability of transitioning from the current state to upcoming states (forward probability), as well as the probability of having transitioned to the current state from previous states (backward probability). Despite the prevalence of probabilistic sequencing of both sensory and motor events, the Hebbian mechanisms that mold synapses to reflect the statistics of experienced probabilistic sequences are not well understood. Here, we show through analytic calculations and numerical simulations that Hebbian plasticity (correlation, covariance, and STDP) with pre-synaptic competition can develop synaptic weights equal to the conditional forward transition probabilities present in the input sequence. In contrast, post-synaptic competition can develop synaptic weights proportional to the conditional backward probabilities of the same input sequence. We demonstrate that to stably reflect the conditional probability of a neuron's inputs and outputs, local Hebbian plasticity requires balance between competitive learning forces that promote synaptic differentiation and homogenizing learning forces that promote synaptic stabilization. The balance between these forces dictates a prior over the distribution of learned synaptic weights, strongly influencing both the rate at which structure emerges and the entropy of the final distribution of synaptic weights. Together, these results demonstrate a simple correspondence between the biophysical organization of neurons, the site of synaptic competition, and the temporal flow of information encoded in synaptic weights by Hebbian plasticity while highlighting the utility of balancing learning forces to accurately encode probability distributions, and prior expectations over such probability distributions. PMID:26257637
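
    The forward-probability result can be reproduced in a few lines: train a Hebbian rule on a sampled Markov sequence and normalise each unit's outgoing weights (pre-synaptic competition). The transition matrix, learning rate, and normalisation scheme below are illustrative choices consistent with, but much simpler than, the models analysed in the paper.

      import numpy as np

      rng = np.random.default_rng(2)
      P = np.array([[0.0, 0.7, 0.3],      # forward transition probabilities
                    [0.5, 0.0, 0.5],
                    [0.9, 0.1, 0.0]])
      states = [0]
      for _ in range(20_000):
          states.append(rng.choice(3, p=P[states[-1]]))

      W = np.full((3, 3), 1.0 / 3.0)      # synaptic weights, uniform prior
      eta = 0.01
      for i, j in zip(states[:-1], states[1:]):
          W[i, j] += eta                  # Hebbian strengthening of the used transition
          W[i] /= W[i].sum()              # pre-synaptic competition (normalise outputs of i)
      print(np.round(W, 2))               # approaches the forward matrix P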

  19. DNA-damage response during mitosis induces whole-chromosome missegregation.

    PubMed

    Bakhoum, Samuel F; Kabeche, Lilian; Murnane, John P; Zaki, Bassem I; Compton, Duane A

    2014-11-01

    Many cancers display both structural (s-CIN) and numerical (w-CIN) chromosomal instabilities. Defective chromosome segregation during mitosis has been shown to cause DNA damage that induces structural rearrangements of chromosomes (s-CIN). In contrast, whether DNA damage can disrupt mitotic processes to generate whole chromosomal instability (w-CIN) is unknown. Here, we show that activation of the DNA-damage response (DDR) during mitosis selectively stabilizes kinetochore-microtubule (k-MT) attachments to chromosomes through Aurora-A and PLK1 kinases, thereby increasing the frequency of lagging chromosomes during anaphase. Inhibition of DDR proteins, ATM or CHK2, abolishes the effect of DNA damage on k-MTs and chromosome segregation, whereas activation of the DDR in the absence of DNA damage is sufficient to induce chromosome segregation errors. Finally, inhibiting the DDR during mitosis in cancer cells with persistent DNA damage suppresses inherent chromosome segregation defects. Thus, the DDR during mitosis inappropriately stabilizes k-MTs, creating a link between s-CIN and w-CIN. The genome-protective role of the DDR depends on its ability to delay cell division until damaged DNA can be fully repaired. Here, we show that when DNA damage is induced during mitosis, the DDR unexpectedly induces errors in the segregation of entire chromosomes, thus linking structural and numerical chromosomal instabilities. ©2014 American Association for Cancer Research.

  20. Advanced Reactor Passive System Reliability Demonstration Analysis for an External Event

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bucknor, Matthew D.; Grabaskas, David; Brunett, Acacia J.

    2016-01-01

    Many advanced reactor designs rely on passive systems to fulfill safety functions during accident sequences. These systems depend heavily on boundary conditions to induce a motive force, meaning the system can fail to operate as intended due to deviations in boundary conditions, rather than as the result of physical failures. Furthermore, passive systems may operate in intermediate or degraded modes. These factors make passive system operation difficult to characterize within a traditional probabilistic framework that only recognizes discrete operating modes and does not allow for the explicit consideration of time-dependent boundary conditions. Argonne National Laboratory has been examining various methodologies for assessing passive system reliability within a probabilistic risk assessment for a station blackout event at an advanced small modular reactor. This paper provides an overview of a passive system reliability demonstration analysis for an external event. Centering on an earthquake with the possibility of site flooding, the analysis focuses on the behavior of the passive reactor cavity cooling system following potential physical damage and system flooding. The assessment approach seeks to combine mechanistic and simulation-based methods to leverage the benefits of the simulation-based approach without the need to substantially deviate from conventional probabilistic risk assessment techniques. While this study is presented as only an example analysis, the results appear to demonstrate a high level of reliability for the reactor cavity cooling system (and the reactor system in general) for the postulated transient event.

  1. Advanced Reactor Passive System Reliability Demonstration Analysis for an External Event

    DOE PAGES

    Bucknor, Matthew; Grabaskas, David; Brunett, Acacia J.; ...

    2017-01-24

    We report that many advanced reactor designs rely on passive systems to fulfill safety functions during accident sequences. These systems depend heavily on boundary conditions to induce a motive force, meaning the system can fail to operate as intended because of deviations in boundary conditions, rather than as the result of physical failures. Furthermore, passive systems may operate in intermediate or degraded modes. These factors make passive system operation difficult to characterize within a traditional probabilistic framework that only recognizes discrete operating modes and does not allow for the explicit consideration of time-dependent boundary conditions. Argonne National Laboratory has been examining various methodologies for assessing passive system reliability within a probabilistic risk assessment for a station blackout event at an advanced small modular reactor. This paper provides an overview of a passive system reliability demonstration analysis for an external event. Considering an earthquake with the possibility of site flooding, the analysis focuses on the behavior of the passive Reactor Cavity Cooling System following potential physical damage and system flooding. The assessment approach seeks to combine mechanistic and simulation-based methods to leverage the benefits of the simulation-based approach without the need to substantially deviate from conventional probabilistic risk assessment techniques. Lastly, although this study is presented as only an example analysis, the results appear to demonstrate a high level of reliability of the Reactor Cavity Cooling System (and the reactor system in general) for the postulated transient event.

  2. Probabilistic Flood Maps to support decision-making: Mapping the Value of Information

    NASA Astrophysics Data System (ADS)

    Alfonso, L.; Mukolwe, M. M.; Di Baldassarre, G.

    2016-02-01

Floods are among the most frequent and disruptive natural hazards affecting man. Annually, significant flood damage is documented worldwide. Flood mapping is a common pre-impact flood hazard mitigation measure, for which advanced methods and tools (such as flood inundation models) are used to estimate potential flood extent maps that are used in spatial planning. However, these tools are affected, largely to an unknown degree, by both epistemic and aleatory uncertainty. Over the past few years, advances in uncertainty analysis with respect to flood inundation modeling show that it is appropriate to adopt Probabilistic Flood Maps (PFM) to account for uncertainty. However, the following question arises: how can probabilistic flood hazard information be incorporated into spatial planning? A consistent framework to incorporate PFMs into decision-making is therefore required. In this paper, a novel methodology based on theories of decision-making under uncertainty, in particular Value of Information (VOI), is proposed. Specifically, the methodology entails the use of a PFM to generate a VOI map, which highlights floodplain locations where additional information is valuable with respect to available floodplain management actions and their potential consequences. The methodology is illustrated with a simplified example and also applied to a real case study in the South of France, where a VOI map is analyzed on the basis of historical land use change decisions over a period of 26 years. Results show that uncertain flood hazard information encapsulated in PFMs can aid decision-making in floodplain planning.
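
    The VOI mapping step reduces, at each floodplain location, to a small decision-theoretic calculation. The sketch below (in Python, with purely illustrative numbers and a hypothetical binary protect/do-nothing action set, not the paper's actual options) shows how one entry of a probabilistic flood map can be turned into a VOI value; mapping it over all cells would yield the VOI map.

      # Value of Information at one floodplain location (illustrative numbers).
      p_flood = 0.3                  # flooding probability taken from the PFM
      damage_if_flooded = 100.0      # loss if unprotected and flooded
      cost_of_protection = 40.0      # cost of the management action

      # Expected cost of each action under the prior (PFM) information.
      cost_do_nothing = p_flood * damage_if_flooded       # 30.0
      cost_protect = cost_of_protection                   # 40.0
      prior_cost = min(cost_do_nothing, cost_protect)     # best action now: 30.0

      # With perfect information, one protects only where flooding occurs.
      perfect_cost = p_flood * min(damage_if_flooded, cost_of_protection)  # 12.0

      # VOI: the most that additional information is worth at this location.
      voi = prior_cost - perfect_cost
      print(f"VOI = {voi:.1f}")      # 18.0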

  3. Probabilistic failure analysis of bone using a finite element model of mineral-collagen composites.

    PubMed

    Dong, X Neil; Guda, Teja; Millwater, Harry R; Wang, Xiaodu

    2009-02-09

    Microdamage accumulation is a major pathway for energy dissipation during the post-yield deformation of bone. In this study, a two-dimensional probabilistic finite element model of a mineral-collagen composite was developed to investigate the influence of the tissue and ultrastructural properties of bone on the evolution of microdamage from an initial defect in tension. The probabilistic failure analyses indicated that the microdamage progression would be along the plane of the initial defect when the debonding at mineral-collagen interfaces was either absent or limited in the vicinity of the defect. In this case, the formation of a linear microcrack would be facilitated. However, the microdamage progression would be scattered away from the initial defect plane if interfacial debonding takes place at a large scale. This would suggest the possible formation of diffuse damage. In addition to interfacial debonding, the sensitivity analyses indicated that the microdamage progression was also dependent on the other material and ultrastructural properties of bone. The intensity of stress concentration accompanied with microdamage progression was more sensitive to the elastic modulus of the mineral phase and the nonlinearity of the collagen phase, whereas the scattering of failure location was largely dependent on the mineral to collagen ratio and the nonlinearity of the collagen phase. The findings of this study may help understanding the post-yield behavior of bone at the ultrastructural level and shed light on the underlying mechanism of bone fractures.

  4. Probabilistic Failure Analysis of Bone Using a Finite Element Model of Mineral-Collagen Composites

    PubMed Central

    Dong, X. Neil; Guda, Teja; Millwater, Harry R.; Wang, Xiaodu

    2009-01-01

    Microdamage accumulation is a major pathway for energy dissipation during the post-yield deformation of bone. In this study, a two-dimensional probabilistic finite element model of a mineral-collagen composite was developed to investigate the influence of the tissue and ultrastructural properties of bone on the evolution of microdamage from an initial defect in tension. The probabilistic failure analyses indicated that the microdamage progression would be along the plane of the initial defect when the debonding at mineral-collagen interfaces was either absent or limited in the vicinity of the defect. In this case, the formation of a linear microcrack would be facilitated. However, the microdamage progression would be scattered away from the initial defect plane if interfacial debonding takes place at a large scale. This would suggest the possible formation of diffuse damage. In addition to interfacial debonding, the sensitivity analyses indicated that the microdamage progression was also dependent on the other material and ultrastructural properties of bone. The intensity of stress concentration accompanied with microdamage progression was more sensitive to the elastic modulus of the mineral phase and the nonlinearity of the collagen phase, whereas the scattering of failure location was largely dependent on the mineral to collagen ratio and the nonlinearity of the collagen phase. The findings of this study may help understanding the post-yield behavior of bone at the ultrastructural level and shed light on the underlying mechanism of bone fractures. PMID:19058806

  5. Probabilistic characterization of wind turbine blades via aeroelasticity and spinning finite element formulation

    NASA Astrophysics Data System (ADS)

    Velazquez, Antonio; Swartz, R. Andrew

    2012-04-01

Wind energy is an increasingly important component of this nation's renewable energy portfolio; however, safe and economical wind turbine operation is a critical need to ensure continued adoption. Safe operation of wind turbine structures requires not only information regarding their condition, but also their operational environment. Given the difficulty inherent in SHM processes for wind turbines (damage detection, location, and characterization), some uncertainty in condition assessment is expected. Furthermore, given the stochastic nature of the loading on turbine structures, a probabilistic framework is appropriate to characterize their risk of failure at a given time. Such information will be invaluable to turbine controllers, allowing them to operate the structures within acceptable risk profiles. This study explores the characterization of the turbine loading and response envelopes for critical failure modes of the turbine blade structures. A framework is presented to develop an analytical estimation of the loading environment (including loading effects) based on the dynamic behavior of the blades. This is influenced by behaviors including along-wind and across-wind aeroelastic effects, wind shear gradient, tower shadow effects, and centrifugal stiffening effects. The proposed solution includes methods that are based on modal decomposition of the blades and require frequent updates to the estimated modal properties to account for the time-varying nature of the turbine and its environment. The estimated demand statistics are compared to a code-based resistance curve to determine a probabilistic estimate of the risk of blade failure given the loading environment.
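
    The final comparison described above, demand statistics against a code-based resistance curve, is a load-versus-capacity reliability calculation. A minimal Monte Carlo sketch in Python, with assumed placeholder distributions rather than the paper's calibrated models:

      import numpy as np

      rng = np.random.default_rng(0)

      # Monte Carlo estimate of blade failure risk as P(demand > resistance).
      # Both distributions below are illustrative assumptions (units: MPa).
      n = 100_000
      # Demand: e.g. blade-root stress from the estimated loading envelope.
      demand = rng.lognormal(mean=np.log(40.0), sigma=0.25, size=n)
      # Capacity: a code-based resistance curve summarized as a random variable.
      resistance = rng.normal(loc=80.0, scale=8.0, size=n)

      p_fail = np.mean(demand > resistance)
      print(f"estimated P(blade failure) = {p_fail:.2e}")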

  6. Modeling of a Flooding Induced Station Blackout for a Pressurized Water Reactor Using the RISMC Toolkit

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mandelli, Diego; Prescott, Steven R; Smith, Curtis L

    2011-07-01

In the Risk Informed Safety Margin Characterization (RISMC) approach we want to understand not just the frequency of an event like core damage, but how close we are (or are not) to key safety-related events and how we might increase our safety margins. The RISMC Pathway uses the probabilistic margin approach to quantify impacts to reliability and safety by coupling both probabilistic (via stochastic simulation) and mechanistic (via physics models) approaches. This coupling takes place through the interchange of physical parameters and operational or accident scenarios. In this paper we apply the RISMC approach to evaluate the impact of a power uprate on a pressurized water reactor (PWR) for a tsunami-induced flooding test case. This analysis is performed using the RISMC toolkit: RELAP-7 and RAVEN codes. RELAP-7 is the new generation of system analysis codes that is responsible for simulating the thermal-hydraulic dynamics of PWR and boiling water reactor systems. RAVEN has two capabilities: to act as a controller of the RELAP-7 simulation (e.g., system activation) and to perform statistical analyses (e.g., run multiple RELAP-7 simulations where sequencing/timing of events have been changed according to a set of stochastic distributions). By using the RISMC toolkit, we can evaluate how the power uprate affects the system recovery measures needed to avoid core damage after the PWR has lost all available AC power due to tsunami-induced flooding. The simulation of the actual flooding is performed by using a smooth particle hydrodynamics code: NEUTRINO.
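
    The RAVEN-over-RELAP-7 pattern described above can be caricatured in a few lines: sample the timing of stochastic events, run a physics model per sample, and tally core-damage outcomes. In the Python sketch below the heatup law, the recovery-time distribution, and all numbers are invented placeholders for the real thermal-hydraulic simulation.

      import numpy as np

      rng = np.random.default_rng(42)

      def time_to_core_damage(power_fraction):
          # Toy stand-in for the RELAP-7 physics: a hotter core (after an
          # uprate) leaves less time before core damage (hours).
          return 10.0 / power_fraction

      n = 10_000
      power_fraction = 1.2                   # 20% power uprate scenario
      # Stochastic AC power recovery time after the flooding-induced
      # blackout (hours), playing the role of RAVEN's sampled event timing.
      recovery_time = rng.gamma(shape=2.0, scale=4.0, size=n)

      core_damage = recovery_time > time_to_core_damage(power_fraction)
      print(f"conditional core damage probability = {core_damage.mean():.3f}")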

  7. Numerical modeling of the load effect on PZT-induced guided wave for load compensation of damage detection

    NASA Astrophysics Data System (ADS)

    Sun, Hu; Zhang, Aijia; Wang, Yishou; Qing, Xinlin P.

    2017-04-01

Guided wave-based structural health monitoring (SHM) has been given considerable attention and widely studied for large-scale aircraft structures. Nevertheless, it is difficult to apply SHM systems on board or online, one of the most serious obstacles being environmental influence. Load is one factor that affects not only the host structure, in which the guided wave propagates, but also the PZT, by which the guided wave is transmitted and received. In this paper, numerical analysis using the finite element method is used to study the load effect on guided waves acquired by PZT. Static loads of different grades are considered to analyze their effect on the guided wave signals that the PZT transmits and receives. Based on the variation trend of guided waves versus load, a load compensation method is developed to eliminate the effects of load in the process of damage detection. A probabilistic reconstruction algorithm based on the signal variation of transmitter-receiver paths is employed to identify the damage. Numerical tests are conducted to verify the feasibility and effectiveness of the given method.

  8. Ho Chi Minh City adaptation to increasing risk of coastal and fluvial floods

    NASA Astrophysics Data System (ADS)

    Scussolini, Paolo; Lasage, Ralph

    2016-04-01

Coastal megacities in southeast Asia are a hotspot of vulnerability to floods. In such contexts, the combination of fast socio-economic development and of climate change impacts on precipitation and sea level generates concerns about flood damage to people and assets. This work focuses on Ho Chi Minh City, Vietnam, for which we estimate the present and future direct risk from river and coastal floods. A model cascade is used that comprises the Saigon river basin and the urban network, plus the land-use-dependent damaging process. Changes in discharge for five return periods are simulated, enabling the probabilistic calculation of the expected annual economic damage to assets, for different scenarios of global emissions, local socio-economic growth, and land subsidence, up to year 2100. The implementation of a range of adaptation strategies is simulated, including building dykes, elevating, creating reservoirs, managing water and sediment upstream, flood-proofing, and halting groundwater abstraction. Results are presented on 1) the relative weight of each future driver in determining the flood risk of Ho Chi Minh City, and 2) the efficiency and feasibility of each adaptation strategy.
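
    The risk metric used here, expected annual damage, follows from integrating simulated damages over annual exceedance probability. A minimal Python sketch with illustrative values (not the Ho Chi Minh City results):

      import numpy as np

      # Expected annual damage (EAD) via the trapezoidal rule over the
      # exceedance-probability axis. Return periods and damages are assumed.
      return_periods = np.array([2.0, 10.0, 50.0, 100.0, 1000.0])  # years
      damages = np.array([0.1, 0.8, 2.5, 4.0, 9.0])                # bn US$

      p = 1.0 / return_periods               # annual exceedance probabilities
      ead = float(np.sum((damages[:-1] + damages[1:]) / 2.0 * (p[:-1] - p[1:])))
      print(f"EAD = {ead:.3f} bn US$/year")  # about 0.403 for these numbers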

  9. GUI to Facilitate Research on Biological Damage from Radiation

    NASA Technical Reports Server (NTRS)

    Cucinotta, Frances A.; Ponomarev, Artem Lvovich

    2010-01-01

    A graphical-user-interface (GUI) computer program has been developed to facilitate research on the damage caused by highly energetic particles and photons impinging on living organisms. The program brings together, into one computational workspace, computer codes that have been developed over the years, plus codes that will be developed during the foreseeable future, to address diverse aspects of radiation damage. These include codes that implement radiation-track models, codes for biophysical models of breakage of deoxyribonucleic acid (DNA) by radiation, pattern-recognition programs for extracting quantitative information from biological assays, and image-processing programs that aid visualization of DNA breaks. The radiation-track models are based on transport models of interactions of radiation with matter and solution of the Boltzmann transport equation by use of both theoretical and numerical models. The biophysical models of breakage of DNA by radiation include biopolymer coarse-grained and atomistic models of DNA, stochastic- process models of deposition of energy, and Markov-based probabilistic models of placement of double-strand breaks in DNA. The program is designed for use in the NT, 95, 98, 2000, ME, and XP variants of the Windows operating system.

  10. Modeling, Analysis, and Control of Swarming Agents in a Probabilistic Framework

    DTIC Science & Technology

    2012-11-01

configurations, which can ultimately lead the swarm towards configurations close to the global minimum of the total potential of interactions. ... The drawback of this approach is that the assumed form of the field can be unrealistic. In the approach that we are presenting here ...

  11. Future Directions and Challenges in Shell Stability Analysis

    NASA Technical Reports Server (NTRS)

    Arbocz, Johann

    1998-01-01

An answer is sought to the question why today, in 1997, after so many years of concentrated research effort in designing buckling-critical thin-walled shells, one cannot do any better than use the rather conservative Lower Bound Design Philosophy of the sixties. It will be shown that with the establishment of Initial Imperfection Data Banks and the introduction of Probabilistic Design Procedures one has a viable alternative that, when used judiciously, may lead to improved shell design recommendations.

  12. Probabilistic seismic loss estimation via endurance time method

    NASA Astrophysics Data System (ADS)

    Tafakori, Ehsan; Pourzeynali, Saeid; Estekanchi, Homayoon E.

    2017-01-01

Probabilistic Seismic Loss Estimation is a methodology used as a quantitative and explicit expression of the performance of buildings, using terms that address the interests of both owners and insurance companies. Applying the ATC 58 approach for seismic loss assessment of buildings requires Incremental Dynamic Analysis (IDA), which needs hundreds of time-consuming analyses and in turn hinders its wide application. The Endurance Time Method (ETM) is proposed herein as part of a demand propagation prediction procedure and is shown to be an economical alternative to IDA. Various scenarios were considered to achieve this purpose and their appropriateness has been evaluated using statistical methods. The most precise and efficient scenario was validated through comparison against IDA-driven response predictions of 34 code-conforming benchmark structures and was proven to be sufficiently precise while offering a great deal of efficiency. The loss values were estimated by replacing IDA with the proposed ETM-based procedure in the ATC 58 methodology, and it was found that these values suffer from varying inaccuracies, which were attributed to the discretized nature of the damage and loss prediction functions provided by ATC 58.

  13. Life Modeling and Design Analysis for Ceramic Matrix Composite Materials

    NASA Technical Reports Server (NTRS)

    2005-01-01

The primary research efforts focused on characterizing and modeling static failure, environmental durability, and creep-rupture behavior of two classes of ceramic matrix composites (CMC), silicon carbide fibers in a silicon carbide matrix (SiC/SiC) and carbon fibers in a silicon carbide matrix (C/SiC). An engineering life prediction model (Probabilistic Residual Strength model) has been developed specifically for CMCs. The model uses residual strength as the damage metric for evaluating remaining life and is posed probabilistically in order to account for the stochastic nature of the material's response. In support of the modeling effort, extensive testing of C/SiC in partial pressures of oxygen has been performed. This includes creep testing, tensile testing, half life and residual tensile strength testing. C/SiC is proposed for airframe and propulsion applications in advanced reusable launch vehicles. Figures 1 and 2 illustrate the model's predictive capabilities as well as the manner in which experimental tests are being selected so as to ensure sufficient data is available to aid in model validation.

  14. Verification of Small Hole Theory for Application to Wire Chafing Resulting in Shield Faults

    NASA Technical Reports Server (NTRS)

    Schuet, Stefan R.; Timucin, Dogan A.; Wheeler, Kevin R.

    2011-01-01

Our work is focused upon developing methods for wire chafe fault detection through the use of reflectometry to assess shield integrity. When shielded electrical aircraft wiring first begins to chafe, the typical resulting evidence is small hole(s) in the shielding. We are focused upon developing the algorithms and signal processing necessary to detect these small holes before the inner conductors incur damage. Our approach has been to develop a first-principles physics model combined with probabilistic inference, and to verify this model with laboratory experiments as well as through simulation. Previously we presented the electromagnetic small-hole theory and how it might be applied to coaxial cable. In this presentation, we present our efforts to verify this theoretical approach with high-fidelity electromagnetic simulations (COMSOL). Laboratory observations are used to parameterize the computationally efficient theoretical model with probabilistic inference, resulting in quantification of hole size and location. Our efforts in characterizing faults in coaxial cable are subsequently leading to fault detection in shielded twisted pair as well as analysis of intermittent faulty connectors using similar techniques.

  15. Probabilistic Multi-Hazard Assessment of Dry Cask Structures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bencturk, Bora; Padgett, Jamie; Uddin, Rizwan

... systems the concrete shall not only provide shielding but also ensure stability of the upright canister, facilitate anchoring, allow ventilation, and provide physical protection against theft, severe weather, and natural (seismic) as well as man-made events (blast incidences). Given the need to remain functional for 40 years or even longer in the case of interim storage, the concrete overpack and the internal canister components need to be evaluated with regard to their long-term ability to perform their intended design functions. Just as evidenced by deteriorating concrete bridges, there are reported visible degradation mechanisms of dry storage systems, especially when highly corrosive environments are considered in maritime locations. The degradation of reinforced concrete is caused by multiple physical and chemical mechanisms, which may be summarized under the heading of environmental aging. The underlying hygro-thermal transport processes are accelerated by irradiation effects; hence creep and shrinkage need to include the effect of chloride penetration, alkali-aggregate reaction as well as corrosion of the reinforcing steel. In light of the above, the two main objectives of this project are to (1) develop a probabilistic multi-hazard assessment framework, and (2) through experimental and numerical research perform a comprehensive assessment under combined earthquake loads and aging-induced deterioration, which will also provide data for the development and validation of the probabilistic framework.

  16. Chromatin Dynamics in Genome Stability: Roles in Suppressing Endogenous DNA Damage and Facilitating DNA Repair

    PubMed Central

    Nair, Nidhi; Shoaib, Muhammad

    2017-01-01

    Genomic DNA is compacted into chromatin through packaging with histone and non-histone proteins. Importantly, DNA accessibility is dynamically regulated to ensure genome stability. This is exemplified in the response to DNA damage where chromatin relaxation near genomic lesions serves to promote access of relevant enzymes to specific DNA regions for signaling and repair. Furthermore, recent data highlight genome maintenance roles of chromatin through the regulation of endogenous DNA-templated processes including transcription and replication. Here, we review research that shows the importance of chromatin structure regulation in maintaining genome integrity by multiple mechanisms including facilitating DNA repair and directly suppressing endogenous DNA damage. PMID:28698521

  17. 46 CFR 174.065 - Damage stability requirements.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... lowest edge of any opening through which additional flooding could occur if the unit were subjected simultaneously to— (1) Damage causing flooding described in §§ 174.075 through 174.085; and (2) A wind heeling...

  18. 46 CFR 174.065 - Damage stability requirements.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... lowest edge of any opening through which additional flooding could occur if the unit were subjected simultaneously to— (1) Damage causing flooding described in §§ 174.075 through 174.085; and (2) A wind heeling...

  19. 46 CFR 174.065 - Damage stability requirements.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... lowest edge of any opening through which additional flooding could occur if the unit were subjected simultaneously to— (1) Damage causing flooding described in §§ 174.075 through 174.085; and (2) A wind heeling...

  20. Downregulation of VRK1 by p53 in Response to DNA Damage Is Mediated by the Autophagic Pathway

    PubMed Central

    Valbuena, Alberto; Castro-Obregón, Susana; Lazo, Pedro A.

    2011-01-01

Human VRK1 induces stabilization and accumulation of p53 by specific phosphorylation at Thr18. This p53 accumulation is reversed by Hdm2-mediated downregulation, which requires a dephosphorylated p53 and therefore also the removal of VRK1 as stabilizer. This process requires export of VRK1 to the cytosol and is inhibited by leptomycin B. We have identified that downregulation of VRK1 protein levels requires expression of DRAM, a p53-induced gene. DRAM is located in the endosomal-lysosomal compartment. Induction of DNA damage by UV, IR, etoposide and doxorubicin stabilizes p53 and induces DRAM expression, followed by VRK1 downregulation and a reduction in p53 Thr18 phosphorylation. DRAM expression is induced by wild-type p53, but not by the common human p53 mutants R175H, R248W and R273H. Overexpression of DRAM induces VRK1 downregulation, and the opposite effect was observed upon its knockdown. LC3 and p62 were also downregulated, like VRK1, in response to UV-induced DNA damage. The implication of the autophagic pathway was confirmed by its requirement for Beclin1. We propose a model with a double regulatory loop in response to DNA damage: the accumulated p53 is removed by induction of Hdm2 and degradation in the proteasome, and the p53 stabilizer VRK1 is eliminated by the induction of DRAM, which leads to its lysosomal degradation in the autophagic pathway, thus permitting p53 degradation by Hdm2. This VRK1 downregulation is necessary to modulate the block in cell cycle progression induced by p53 as part of its DNA damage response. PMID:21386980

  1. Future trends in flood risk in Indonesia - A probabilistic approach

    NASA Astrophysics Data System (ADS)

    Muis, Sanne; Guneralp, Burak; Jongman, Brenden; Ward, Philip

    2014-05-01

Indonesia is one of the 10 most populous countries in the world and is highly vulnerable to (river) flooding. Catastrophic floods occur on a regular basis; total estimated damages were US$ 0.8 bn in 2010 and US$ 3 bn in 2013. Large parts of Greater Jakarta, the capital city, are annually subject to flooding. Flood risks (i.e. the product of hazard, exposure and vulnerability) are increasing due to rapid increases in exposure, such as strong population growth and ongoing economic development. The increase in risk may also be amplified by increasing flood hazards, such as increasing flood frequency and intensity due to climate change and land subsidence. The implementation of adaptation measures, such as the construction of dykes and strategic urban planning, may counteract these increasing trends. However, despite its importance for adaptation planning, a comprehensive assessment of current and future flood risk in Indonesia is lacking. This contribution addresses this issue and aims to provide insight into how socio-economic trends and climate change projections may shape future flood risks in Indonesia. Flood risks were calculated using an adapted version of the GLOFRIS global flood risk assessment model. Using this approach, we produced probabilistic maps of flood risks (i.e. annual expected damage) at a resolution of 30"x30" (ca. 1km x 1km at the equator). To represent flood exposure, we produced probabilistic projections of urban growth in a Monte-Carlo fashion based on probability density functions of projected population and GDP values for 2030. To represent flood hazard, inundation maps were computed using the hydrological-hydraulic component of GLOFRIS. These maps show flood inundation extent and depth for several return periods and were produced for several combinations of GCMs and future socioeconomic scenarios. Finally, the implementation of different adaptation strategies was incorporated into the model to explore to what extent adaptation may be able to decrease future risks. Preliminary results show that the urban extent in Indonesia is projected to increase by 211 to 351% over the period 2000-2030 (5th and 95th percentiles). Driven mainly by this rapid urbanization, potential flood losses in Indonesia increase rapidly and are primarily concentrated on the island of Java. The results reveal the large risk-reducing potential of adaptation measures. Since much of the urban development between 2000 and 2030 takes place in flood-prone areas, strategic urban planning (i.e. building in safe areas) may significantly reduce the urban population and infrastructure exposed to flooding. We conclude that a probabilistic risk approach in future flood risk assessment is vital; the drivers behind risk trends (exposure, hazard, vulnerability) should be understood to develop robust and efficient adaptation pathways.

  2. A framework for probabilistic pluvial flood nowcasting for urban areas

    NASA Astrophysics Data System (ADS)

    Ntegeka, Victor; Murla, Damian; Wang, Lipen; Foresti, Loris; Reyniers, Maarten; Delobbe, Laurent; Van Herk, Kristine; Van Ootegem, Luc; Willems, Patrick

    2016-04-01

Pluvial flood nowcasting is gaining ground, not least because of the advancements in rainfall forecasting schemes. Short-term forecasts and applications have benefited from the availability of such forecasts with high resolution in space (~1km) and time (~5min). In this regard, it is vital to evaluate the potential of nowcasting products for urban inundation applications. One of the most advanced Quantitative Precipitation Forecasting (QPF) techniques is the Short-Term Ensemble Prediction System, which was originally co-developed by the UK Met Office and the Australian Bureau of Meteorology. The scheme was further tuned to better estimate extreme and moderate events for the Belgian area (STEPS-BE). Against this backdrop, a probabilistic framework has been developed that consists of: (1) rainfall nowcasts; (2) a sewer hydraulic model; (3) flood damage estimation; and (4) urban inundation risk mapping. STEPS-BE forecasts are provided at high resolution (1km/5min) with 20 ensemble members with a lead time of up to 2 hours, using a 4 C-band radar composite as input. Forecast verification was performed over the cities of Leuven and Ghent and biases were found to be small. The hydraulic model consists of the 1D sewer network and an innovative 'nested' 2D surface model to simulate 2D urban surface inundations at high resolution. The surface components are categorized into three groups and each group is modelled using triangular meshes at different resolutions; these include streets (3.75 - 15 m2), high flood hazard areas (12.5 - 50 m2) and low flood hazard areas (75 - 300 m2). Functions describing urban flood damage and social consequences were empirically derived based on questionnaires given to people in the region who were recently affected by sewer floods. Probabilistic urban flood risk maps were prepared based on spatial interpolation techniques of flood inundation. The method has been implemented and tested for the villages of Oostakker and Sint-Amandsberg, which are part of the larger city of Ghent, Belgium. After each of the different above-mentioned components was evaluated, they were combined and tested for recent historical flood events. The rainfall nowcasting, hydraulic sewer and 2D inundation modelling and socio-economic flood risk results could each be partly evaluated: the rainfall nowcasting results based on radar data and rain gauges; the hydraulic sewer model results based on water level and discharge data at pumping stations; the 2D inundation modelling results based on limited data on some recent flood locations and inundation depths; and the results for the socio-economic flood consequences of the most extreme events based on claims in the database of the national disaster agency. Different methods for visualization of the probabilistic inundation results are proposed and tested.

  3. Disruption of the nucleolus mediates stabilization of p53 in response to DNA damage and other stresses

    PubMed Central

    Rubbi, Carlos P.; Milner, Jo

    2003-01-01

    p53 protects against cancer through its capacity to induce cell cycle arrest or apoptosis under a large variety of cellular stresses. It is not known how such diversity of signals can be integrated by a single molecule. However, the literature reveals that a common denominator in all p53-inducing stresses is nucleolar disruption. We thus postulated that the impairment of nucleolar function might stabilize p53 by preventing its degradation. Using micropore irradiation, we demonstrate that large amounts of nuclear DNA damage fail to stabilize p53 unless the nucleolus is also disrupted. Forcing nucleolar disruption by anti-upstream binding factor (UBF) microinjection (in the absence of DNA damage) also causes p53 stabilization. We propose that the nucleolus is a stress sensor responsible for maintenance of low levels of p53, which are automatically elevated as soon as nucleolar function is impaired in response to stress. Our model integrates all known p53-inducing agents and also explains cell cycle-related variations in p53 levels which correlate with established phases of nucleolar assembly/disassembly through the cell cycle. PMID:14609953

  4. Parametric study of irradiation effects on the ductile damage and flow stress behavior in ferritic-martensitic steels

    NASA Astrophysics Data System (ADS)

    Chakraborty, Pritam; Biner, S. Bulent

    2015-10-01

Ferritic-martensitic steels are currently being considered as structural materials in fusion and Gen-IV nuclear reactors. These materials are expected to experience high dose radiation, which can increase their ductile to brittle transition temperature and susceptibility to failure during operation. Hence, to estimate the safe operational life of the reactors, precise evaluation of the ductile to brittle transition temperatures of ferritic-martensitic steels is necessary. Owing to the scarcity of irradiated samples, particularly at high dose levels, micro-mechanistic models are being employed to predict the shifts in the ductile to brittle transition temperatures. These models consider the ductile damage evolution, in the form of nucleation, growth and coalescence of voids, and the brittle fracture, in the form of probabilistic cleavage initiation, to estimate the influence of irradiation on the ductile to brittle transition temperature. However, the assessment of irradiation dependent material parameters is challenging and influences the accuracy of these models. In the present study, the effects of irradiation on the overall flow stress and ductile damage behavior of two ferritic-martensitic steels are parametrically investigated. The results indicate that the ductile damage model parameters are mostly insensitive to irradiation levels at higher dose levels, though the resulting flow stress behavior varies significantly.

  5. Damage instability and Earthquake nucleation

    NASA Astrophysics Data System (ADS)

    Ionescu, I. R.; Gomez, Q.; Campillo, M.; Jia, X.

    2017-12-01

Earthquake nucleation (initiation) is usually associated with the loss of stability of the geological structure under a slip-weakening friction acting on the fault. The key parameters involved in the stability of the fault are the stress drop and the critical slip distance, but also the elastic stiffness of the surrounding materials (rocks). We want to explore here how the nucleation phenomena are correlated with material softening during damage accumulation by dynamic and/or quasi-static processes. Since damage models describe micro-crack growth, which is generally an unstable phenomenon, it is natural to expect some loss of stability in the associated micromechanics-based models. If the model accurately captures the material behavior, then this can be due to the unstable nature of the brittle material itself. We obtained stability criteria at the microscopic scale, which are related to a large class of damage models. We show that for a given continuous strain history the quasi-static or dynamic problems are unstable or ill-posed (multiplicity of material responses) and, whatever selection rule is adopted, shocks (time discontinuities) will occur. We show that the quasi-static equilibria chosen by the "perfect delay convention" are always stable. These stability criteria are used to analyze how the NIC (Non-Interacting Crack) effective elasticity associated with the "self-similar growth" model behaves in some special configurations (one family of micro-cracks in mode I, II and III, and in plane strain or plane stress). In each case we determine a critical crack density parameter and critical micro-crack radius (length) which distinguish between stable and unstable behaviors. This critical crack density depends only on the chosen configuration and on the Poisson ratio.

  6. Novel probabilistic and distributed algorithms for guidance, control, and nonlinear estimation of large-scale multi-agent systems

    NASA Astrophysics Data System (ADS)

    Bandyopadhyay, Saptarshi

Multi-agent systems are widely used for constructing a desired formation shape, exploring an area, surveillance, coverage, and other cooperative tasks. This dissertation introduces novel algorithms in the three main areas of shape formation, distributed estimation, and attitude control of large-scale multi-agent systems. In the first part of this dissertation, we address the problem of shape formation for thousands to millions of agents. Here, we present two novel algorithms for guiding a large-scale swarm of robotic systems into a desired formation shape in a distributed and scalable manner. These probabilistic swarm guidance algorithms adopt an Eulerian framework, where the physical space is partitioned into bins and the swarm's density distribution over each bin is controlled using tunable Markov chains. In the first algorithm - Probabilistic Swarm Guidance using Inhomogeneous Markov Chains (PSG-IMC) - each agent determines its bin transition probabilities using a time-inhomogeneous Markov chain that is constructed in real-time using feedback from the current swarm distribution. This PSG-IMC algorithm minimizes the expected cost of the transitions required to achieve and maintain the desired formation shape, even when agents are added to or removed from the swarm. The algorithm scales well with a large number of agents and complex formation shapes, and can also be adapted for area exploration applications. In the second algorithm - Probabilistic Swarm Guidance using Optimal Transport (PSG-OT) - each agent determines its bin transition probabilities by solving an optimal transport problem, which is recast as a linear program. In the presence of perfect feedback of the current swarm distribution, this algorithm minimizes the given cost function, guarantees faster convergence, reduces the number of transitions for achieving the desired formation, and is robust to disturbances or damages to the formation. We demonstrate the effectiveness of these two proposed swarm guidance algorithms using results from numerical simulations and closed-loop hardware experiments on multiple quadrotors. In the second part of this dissertation, we present two novel discrete-time algorithms for distributed estimation, which track a single target using a network of heterogeneous sensing agents. In the Distributed Bayesian Filtering (DBF) algorithm, the sensing agents combine their normalized likelihood functions using the logarithmic opinion pool and the discrete-time dynamic average consensus algorithm. Each agent's estimated likelihood function converges to an error ball centered on the joint likelihood function of the centralized multi-sensor Bayesian filtering algorithm. Using a new proof technique, the convergence, stability, and robustness properties of the DBF algorithm are rigorously characterized. The explicit bounds on the time step of the robust DBF algorithm are shown to depend on the time-scale of the target dynamics. Furthermore, the DBF algorithm for linear-Gaussian models can be cast into a modified form of the Kalman information filter. In the Bayesian Consensus Filtering (BCF) algorithm, the agents combine their estimated posterior pdfs multiple times within each time step using the logarithmic opinion pool scheme. Thus, each agent's consensual pdf minimizes the sum of Kullback-Leibler divergences with the local posterior pdfs. The performance and robustness properties of these algorithms are validated using numerical simulations.
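
    The bin-transition idea behind PSG-IMC can be illustrated with a heavily simplified Python sketch: agents hop between bins of a one-dimensional space under a Metropolis-style Markov chain whose stationary distribution is the desired formation density. The bin layout, proposal rule, and target density below are assumptions for illustration, not the dissertation's actual algorithm.

      import numpy as np

      rng = np.random.default_rng(1)

      target = np.array([0.1, 0.2, 0.4, 0.2, 0.1])   # desired swarm density
      n_bins, n_agents = len(target), 5_000
      bins = rng.integers(0, n_bins, size=n_agents)  # arbitrary initial swarm

      for _ in range(200):
          # Symmetric proposal: each agent tries one bin left or right
          # (clipped at the edges, which only adds harmless self-loops).
          proposal = np.clip(bins + rng.choice([-1, 1], size=n_agents),
                             0, n_bins - 1)
          # Metropolis acceptance keeps `target` invariant.
          accept = rng.random(n_agents) < target[proposal] / target[bins]
          bins = np.where(accept, proposal, bins)

      achieved = np.bincount(bins, minlength=n_bins) / n_agents
      print(np.round(achieved, 2))  # approaches [0.1, 0.2, 0.4, 0.2, 0.1]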
In the third part of this dissertation, we present an attitude control strategy and a new nonlinear tracking controller for a spacecraft carrying a large object, such as an asteroid or a boulder. If the captured object is larger or comparable in size to the spacecraft and has significant modeling uncertainties, conventional nonlinear control laws that use exact feed-forward cancellation are not suitable because they exhibit a large resultant disturbance torque. The proposed nonlinear tracking control law guarantees global exponential convergence of tracking errors with finite-gain Lp stability in the presence of modeling uncertainties and disturbances, and reduces the resultant disturbance torque. Further, this control law permits the use of any attitude representation and its integral control formulation eliminates any constant disturbance. Under small uncertainties, the best strategy for stabilizing the combined system is to track a fuel-optimal reference trajectory using this nonlinear control law, because it consumes the least amount of fuel. In the presence of large uncertainties, the most effective strategy is to track the derivative plus proportional-derivative based reference trajectory, because it reduces the resultant disturbance torque. The effectiveness of the proposed attitude control law is demonstrated by using results of numerical simulation based on an Asteroid Redirect Mission concept. The new algorithms proposed in this dissertation will facilitate the development of versatile autonomous multi-agent systems that are capable of performing a variety of complex tasks in a robust and scalable manner.

  7. Probabilistic Integrated Assessment of ``Dangerous'' Climate Change

    NASA Astrophysics Data System (ADS)

    Mastrandrea, Michael D.; Schneider, Stephen H.

    2004-04-01

Climate policy decisions are being made despite layers of uncertainty. Such decisions directly influence the potential for ``dangerous anthropogenic interference with the climate system.'' We mapped a metric for this concept, based on the Intergovernmental Panel on Climate Change assessment of climate impacts, onto probability distributions of future climate change produced from uncertainty in key parameters of the coupled social-natural system: climate sensitivity, climate damages, and discount rate. Analyses with a simple integrated assessment model found that, under midrange assumptions, endogenously calculated, optimal climate policy controls can reduce the probability of dangerous anthropogenic interference from ~45% under minimal controls to near zero.
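
    The kind of calculation described, propagating parameter uncertainty into a probability of crossing a danger threshold, can be sketched in a few lines of Python; the distributions, forcing range, and threshold below are invented for illustration and are not the paper's values.

      import numpy as np

      rng = np.random.default_rng(9)

      # Propagate uncertain climate sensitivity through a trivial warming
      # model and report the probability of exceeding a "dangerous" level.
      n = 200_000
      sensitivity = rng.lognormal(np.log(3.0), 0.4, n)  # deg C per doubling
      doublings = rng.uniform(0.8, 1.6, n)              # uncertain emissions
      warming = sensitivity * doublings

      threshold = 3.5                                   # assumed DAI metric
      print(f"P(dangerous interference) = {np.mean(warming > threshold):.2f}")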

  8. Seismic Evaluation of A Historical Structure In Kastamonu - Turkey

    NASA Astrophysics Data System (ADS)

Usta, Pınar; Çarhoğlu, Asuman Işıl; Evci, Ahmet

    2018-01-01

The Kastamonu province is a seismically active zone, and the city has many historical buildings made of stone masonry. In case of any probable future earthquakes, existing buildings may suffer substantial or heavy damage. In the present study, one of the traditional historical houses located in Kastamonu was structurally investigated through a probabilistic seismic risk assessment methodology. The building was modeled using the Finite Element Modeling (FEM) software SAP2000. Time history analyses were carried out using 10 different ground motion records on the FEM models. Displacements were interpreted, and the results were displayed graphically and discussed.

  9. Negative Selection Algorithm for Aircraft Fault Detection

    NASA Technical Reports Server (NTRS)

    Dasgupta, D.; KrishnaKumar, K.; Wong, D.; Berry, M.

    2004-01-01

We investigated a real-valued Negative Selection Algorithm (NSA) for fault detection in man-in-the-loop aircraft operation. The detection algorithm uses body-axes angular rate sensory data exhibiting normal flight behavior patterns to probabilistically generate a set of fault detectors that can detect any abnormalities (including faults and damage) in the behavior of the aircraft flight. We performed experiments with datasets (collected under normal and various simulated failure conditions) using the NASA Ames man-in-the-loop high-fidelity C-17 flight simulator. The paper provides results of experiments with different datasets representing various failure conditions.
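
    A real-valued negative selection algorithm of this kind can be sketched briefly: candidate detectors are generated at random, censored against "self" (normal flight) data, and the survivors flag abnormal samples. The feature space, radius, and data below are assumptions for illustration, not the NASA C-17 implementation.

      import numpy as np

      rng = np.random.default_rng(7)

      # "Self" set: normal-behavior features normalized to [0, 1]^2.
      self_data = rng.normal(0.5, 0.05, size=(500, 2))
      radius = 0.1

      # Censoring: keep only detectors that avoid every self sample.
      detectors = []
      while len(detectors) < 200:
          candidate = rng.random(2)
          if np.min(np.linalg.norm(self_data - candidate, axis=1)) > radius:
              detectors.append(candidate)
      detectors = np.array(detectors)

      def is_anomalous(sample):
          # A sample is flagged if it falls inside any detector's hypersphere.
          return bool(np.min(np.linalg.norm(detectors - sample, axis=1)) <= radius)

      print(is_anomalous(np.array([0.5, 0.5])))  # normal regime: False
      print(is_anomalous(np.array([0.9, 0.1])))  # off-nominal: True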

  10. Safe Life Propulsion Design Technologies (3rd Generation Propulsion Research and Technology)

    NASA Technical Reports Server (NTRS)

    Ellis, Rod

    2000-01-01

    The tasks outlined in this viewgraph presentation on safe life propulsion design technologies (third generation propulsion research and technology) include the following: (1) Ceramic matrix composite (CMC) life prediction methods; (2) Life prediction methods for ultra high temperature polymer matrix composites for reusable launch vehicle (RLV) airframe and engine application; (3) Enabling design and life prediction technology for cost effective large-scale utilization of MMCs and innovative metallic material concepts; (4) Probabilistic analysis methods for brittle materials and structures; (5) Damage assessment in CMC propulsion components using nondestructive characterization techniques; and (6) High temperature structural seals for RLV applications.

  11. Residual thermal stresses in composites for dimensionally stable spacecraft applications

    NASA Technical Reports Server (NTRS)

    Bowles, David E.; Tompkins, Stephen S.; Funk, Joan G.

    1992-01-01

    An overview of NASA LaRC's research on thermal residual stresses and their effect on the dimensional stability of carbon fiber reinforced polymer-matrix composites is presented. The data show that thermal residual stresses can induce damage in polymer matrix composites and significantly affect the dimensional stability of these composites by causing permanent residual strains and changes in CTE. The magnitude of these stresses is primarily controlled by the laminate configuration and the applied temperature change. The damage caused by thermal residual stresses initiates at the fiber/matrix interface and micromechanics level analyses are needed to accurately predict it. An increased understanding of fiber/matrix interface interactions appears to be the best approach for improving a composite's resistance to thermally induced damage.

  12. Neural Stability, Sparing, and Behavioral Recovery Following Brain Damage

    ERIC Educational Resources Information Center

    LeVere, T. E.

    1975-01-01

    The present article discusses the possibility that behavioral recovery following brain damage is not dependent on the functional reorganization of neural tissue but is rather the result of the continued normal operation of spared neural mechanisms. (Editor)

  13. Stochastic Routing and Scheduling Policies for Energy Harvesting Communication Networks

    NASA Astrophysics Data System (ADS)

    Calvo-Fullana, Miguel; Anton-Haro, Carles; Matamoros, Javier; Ribeiro, Alejandro

    2018-07-01

In this paper, we study the joint routing-scheduling problem in energy harvesting communication networks. Our policies, which are based on stochastic subgradient methods in the dual domain, act as an energy harvesting variant of the stochastic family of backpressure algorithms. Specifically, we propose two policies: (i) the Stochastic Backpressure with Energy Harvesting (SBP-EH), in which a node's routing-scheduling decisions are determined by the difference between the Lagrange multipliers associated with its queue stability constraints and its neighbors'; and (ii) the Stochastic Soft Backpressure with Energy Harvesting (SSBP-EH), an improved algorithm where the routing-scheduling decision is of a probabilistic nature. For both policies, we show that given sustainable data and energy arrival rates, the stability of the data queues over all network nodes is guaranteed. Numerical results corroborate the stability guarantees and illustrate the minimal gap in performance that our policies offer with respect to classical ones which work with an unlimited energy supply.
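
    The differential, queue-based decision rule underlying such backpressure policies can be sketched on a toy line network; in the Python sketch below, plain queue backlogs stand in for the Lagrange multipliers, and the arrival, harvesting, and service models are assumptions for illustration rather than the SBP-EH policy itself.

      import numpy as np

      rng = np.random.default_rng(3)

      # Line network 0 -> 1 -> 2, where node 2 is the sink. A link transmits
      # only when its differential backlog is positive and the sender has
      # harvested energy (one energy unit per transmission).
      q = np.zeros(3)        # data queues
      battery = np.zeros(3)  # harvested energy stores

      for t in range(10_000):
          q[0] += rng.random() < 0.3          # exogenous arrivals at node 0
          battery[:2] += rng.random(2) < 0.5  # stochastic energy arrivals

          for (i, j) in [(0, 1), (1, 2)]:
              weight = q[i] - q[j]            # differential backlog
              if weight > 0 and battery[i] >= 1 and q[i] >= 1:
                  q[i] -= 1                   # forward one packet on (i, j)
                  if j != 2:                  # the sink absorbs packets
                      q[j] += 1
                  battery[i] -= 1

      print("final queues:", q)  # bounded when arrival rates are sustainable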

  14. Homeostatic Agent for General Environment

    NASA Astrophysics Data System (ADS)

    Yoshida, Naoto

    2018-03-01

One of the essential aspects of biological agents is dynamic stability. This aspect, called homeostasis, is widely discussed in ethology, neuroscience and during the early stages of artificial intelligence. Ashby's homeostats are general-purpose learning machines for stabilizing essential variables of the agent in the face of general environments. However, despite their generality, the original homeostats could not be scaled because they searched their parameters randomly. In this paper, we first re-define the objective of homeostats as the maximization of a multi-step survival probability from the viewpoint of sequential decision theory and probability theory. We then show that this optimization problem can be treated by using reinforcement learning algorithms with special agent architectures and theoretically-derived intrinsic reward functions. Finally, we empirically demonstrate that agents with our architecture automatically learn to survive in a given environment, including environments with visual stimuli. Our survival agents can learn to eat food, avoid poison and stabilize essential variables through theoretically-derived single intrinsic reward formulations.
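
    A toy version of this setup, an agent keeping an essential variable inside a viability zone via reinforcement learning with a survival-motivated intrinsic reward, might look as follows in Python; the dynamics, reward shape, and learning details are invented for illustration and are not the paper's derivation.

      import numpy as np

      rng = np.random.default_rng(5)

      # Essential variable (e.g. energy) lives in bins 0..10; bins 0 and 10
      # are fatal. Action 0 = wait (energy -1), action 1 = eat (energy +2).
      n_states, set_point = 11, 5
      Q = np.zeros((n_states, 2))
      state = set_point

      for step in range(50_000):
          a = rng.integers(2) if rng.random() < 0.1 else int(np.argmax(Q[state]))
          nxt = min(state + 2, n_states - 1) if a == 1 else max(state - 1, 0)
          reward = -abs(nxt - set_point)      # intrinsic: stay near homeostasis
          if nxt in (0, n_states - 1):        # leaving the viability zone
              reward -= 10.0
          Q[state, a] += 0.1 * (reward + 0.95 * np.max(Q[nxt]) - Q[state, a])
          state = set_point if nxt in (0, n_states - 1) else nxt

      print("policy by energy bin (0=wait, 1=eat):", np.argmax(Q, axis=1))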

  15. Percolation and Reinforcement on Complex Networks

    NASA Astrophysics Data System (ADS)

    Yuan, Xin

Complex networks appear in almost every aspect of our daily life and are widely studied in the fields of physics, mathematics, finance, biology and computer science. This work utilizes percolation theory in statistical physics to explore the percolation properties of complex networks and develops a reinforcement scheme on improving network resilience. This dissertation covers two major parts of my Ph.D. research on complex networks: i) probe--in the context of both traditional percolation and k-core percolation--the resilience of complex networks with tunable degree distributions or directed dependency links under random, localized or targeted attacks; ii) develop and propose a reinforcement scheme to eradicate catastrophic collapses that occur very often in interdependent networks. We first use generating function and probabilistic methods to obtain analytical solutions to percolation properties of interest, such as the giant component size and the critical occupation probability. We study uncorrelated random networks with Poisson, bi-Poisson, power-law, and Kronecker-delta degree distributions and construct those networks which are based on the configuration model. The computer simulation results show remarkable agreement with theoretical predictions. We discover an increase of network robustness as the degree distribution broadens and a decrease of network robustness as directed dependency links come into play under random attacks. We also find that targeted attacks exert the biggest damage to the structure of both single and interdependent networks in k-core percolation. To strengthen the resilience of interdependent networks, we develop and propose a reinforcement strategy and obtain the critical amount of reinforced nodes analytically for interdependent Erdős-Rényi networks and numerically for scale-free and for random regular networks. Our mechanism leads to improvement of network stability of the West U.S. power grid. This dissertation provides us with a deeper understanding of the effects of structural features on network stability and fresher insights into designing resilient interdependent infrastructure networks.
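
    For Poisson (Erdős-Rényi) degree distributions, the generating-function machinery mentioned above reduces to a one-line fixed point: with mean degree c, both generating functions equal exp(c(x - 1)), so the giant-component fraction S solves S = 1 - exp(-cS). A short numerical sketch in Python:

      import numpy as np

      def giant_component_fraction(c, iters=200):
          # u is the probability that a randomly followed edge fails to
          # reach the giant component; iterate u = G1(u) = exp(c (u - 1)).
          u = 0.5
          for _ in range(iters):
              u = np.exp(c * (u - 1.0))
          return 1.0 - u                      # S = 1 - G0(u)

      for c in [0.5, 1.5, 3.0]:
          print(f"mean degree {c}: giant component = "
                f"{giant_component_fraction(c):.3f}")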

  16. Genomic stability and telomere regulation in skeletal muscle tissue.

    PubMed

    Trajano, Larissa Alexsandra da Silva Neto; Trajano, Eduardo Tavares Lima; Silva, Marco Aurélio Dos Santos; Stumbo, Ana Carolina; Mencalha, Andre Luiz; Fonseca, Adenilson de Souza da

    2018-02-01

Muscle injuries are common, especially in sports and cumulative trauma disorders, and their repair is influenced by free radical formation, which causes damage to lipids, proteins and DNA. Oxidative DNA damage is repaired by base excision repair and nucleotide excision repair, ensuring telomeric and genomic stability. There are few studies on this topic in skeletal muscle cells. This review focuses on base excision repair and nucleotide excision repair, telomere regulation and how telomeric stabilization influences healthy muscle, injured muscle, exercise, and its relationship with aging. In skeletal muscle, genomic stabilization and telomere regulation seem to play an important role in tissue health, influencing muscle injury repair. Thus, therapies targeting mechanisms of DNA repair and telomeric regulation could be new approaches for improving repair and prevention of skeletal muscle injuries in young and old people. Copyright © 2018 Elsevier Masson SAS. All rights reserved.

  17. 46 CFR 171.080 - Damage stability standards for vessels with Type I or Type II subdivision.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... to which the vessel heels after sustained damage, are swung out if necessary, fully loaded and ready...; and (C) Survival craft on the side of the vessel opposite that to which the vessel heels remain stowed... damage occurs. (3) The estimated maximum angle of heel before equalization must be approved by the...

  18. Probabilistic Learning in Junior High School: Investigation of Student Probabilistic Thinking Levels

    NASA Astrophysics Data System (ADS)

    Kurniasih, R.; Sujadi, I.

    2017-09-01

This paper investigates students' levels of probabilistic thinking. Probabilistic thinking is thinking about probabilistic or uncertain matters in probability material, and a probabilistic thinking level describes the sophistication of that thinking. The research subjects were 8th grade junior high school students. The main instrument was the researcher, supported by a probabilistic thinking skills test and interview guidelines. Data were analyzed using a triangulation method. The results showed that before instruction, students' probabilistic thinking was at the subjective and transitional levels; after instruction, the students' probabilistic thinking levels changed. Some 8th grade students reached the numerical level, the highest of the levels. Students' probabilistic thinking levels can be used as a reference for designing learning materials and strategies.

  19. The analysis of probability task completion; Taxonomy of probabilistic thinking-based across gender in elementary school students

    NASA Astrophysics Data System (ADS)

    Sari, Dwi Ivayana; Budayasa, I. Ketut; Juniati, Dwi

    2017-08-01

The formulation of mathematical learning goals is now oriented not only toward cognitive products but also toward cognitive processes, such as probabilistic thinking. Probabilistic thinking is needed by students to make decisions. Elementary school students are required to develop probabilistic thinking as a foundation for learning probability at a higher level. A framework for students' probabilistic thinking had been developed using the SOLO taxonomy, consisting of prestructural, unistructural, multistructural and relational probabilistic thinking. This study aimed to analyze probability task completion based on this taxonomy of probabilistic thinking. The subjects were two fifth-grade students, a boy and a girl, selected on the basis of high scores on a test of mathematical ability. The subjects were given probability tasks covering sample space, probability of an event and probability comparison. The data analysis consisted of categorization, reduction, interpretation and conclusion. Data credibility was established using time triangulation. The results showed that the boy's probabilistic thinking in completing the probability tasks was at the multistructural level, while the girl's was at the unistructural level; that is, the boy's level of probabilistic thinking was higher than the girl's. These results could help curriculum developers in formulating probability learning goals for elementary school students, and teachers could take gender differences into account when teaching probability.

  20. Analysis of sensitivity to different parameterization schemes for a subtropical cyclone

    NASA Astrophysics Data System (ADS)

    Quitián-Hernández, L.; Fernández-González, S.; González-Alemán, J. J.; Valero, F.; Martín, M. L.

    2018-05-01

A sensitivity analysis of diverse WRF model physical parameterization schemes is carried out over the lifecycle of a subtropical cyclone (STC). STCs are low-pressure systems that share tropical and extratropical characteristics, with hybrid thermal structures. In October 2014, an STC made landfall in the Canary Islands, causing widespread damage from strong winds and precipitation there. The system began to develop on October 18 and its effects lasted until October 21. Accurate simulation of this type of cyclone continues to be a major challenge because of its rapid intensification and unique characteristics. In the present study, several numerical simulations were performed using the WRF model to carry out a sensitivity analysis of its various parameterization schemes for the development and intensification of the STC. The combination of parameterization schemes that best simulated this type of phenomenon was thereby determined. In particular, the parameterization combinations that included the Tiedtke cumulus schemes had the most positive effects on model results. Moreover, concerning STC track validation, optimal results were attained when the STC was fully formed and all convective processes had stabilized. Furthermore, to find the parameterization schemes that optimally categorize STC structure, a verification using Cyclone Phase Space was performed. Consequently, the combinations of parameterizations including the Tiedtke cumulus schemes were again the best at categorizing the cyclone's subtropical structure. For strength validation, related atmospheric variables such as wind speed and precipitable water were analyzed. Finally, the effects of using a deterministic or probabilistic approach in simulating intense convective phenomena were evaluated.

  1. Re-evaluation Of The Shallow Seismicity On Mt Etna Applying Probabilistic Earthquake Location Algorithms.

    NASA Astrophysics Data System (ADS)

    Tuve, T.; Mostaccio, A.; Langer, H. K.; di Grazia, G.

    2005-12-01

    A recent research project carried out together with the Italian Civil Protection concerns the study of amplitude decay laws in various areas of the Italian territory, including Mt Etna. A particular feature of the seismic activity there is the presence of moderate-magnitude earthquakes that frequently cause considerable damage in the epicentre areas. These earthquakes are supposed to occur at rather shallow depth, no more than 5 km. Given the geological context, however, such shallow earthquakes would originate in rather weak sedimentary material. In this study we check the reliability of standard earthquake locations, in particular with respect to the calculated focal depth, using standard location methods as well as more advanced approaches such as the NONLINLOC software proposed by Lomax et al. (2000) with its various options (i.e., Grid Search, Metropolis-Gibbs and Oct-Tree) and a 3D velocity model (Cocina et al., 2005). All three options of NONLINLOC gave comparable results with respect to hypocenter locations and quality. Compared to standard locations we note a significant improvement in location quality and, in particular, a considerable difference in focal depths (on the order of 1.5-2 km); however, we cannot find a clear bias towards greater or lower depth. Further analyses concern the assessment of the stability of the locations: for this purpose we carry out various Monte Carlo experiments, randomly perturbing the travel-time readings. Further investigations are devoted to possible biases that may arise from the use of an unsuitable velocity model.
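
    The Monte Carlo stability test described above can be illustrated with a toy relocation experiment: perturb synthetic travel-time picks with Gaussian noise and relocate by grid search. The homogeneous velocity, station geometry, noise level and grid are all assumptions for illustration; the study itself used a 3D velocity model and the NONLINLOC location algorithms.

        # Toy Monte Carlo location-stability test: perturb travel-time picks
        # and relocate by grid search in a homogeneous half-space (all values
        # assumed; the study used a 3D model and NONLINLOC).
        import numpy as np

        rng = np.random.default_rng(0)
        v = 3.5                              # km/s, assumed P velocity
        stations = np.array([[0.0, 0.0, 0.0], [10.0, 0.0, 0.0],
                             [0.0, 10.0, 0.0], [10.0, 10.0, 0.0]])
        true_src = np.array([5.0, 5.0, 3.0])  # km, true hypocentre

        def travel_times(src):
            return np.linalg.norm(stations - src, axis=1) / v

        # candidate hypocentre grid, 0.5 km spacing
        ax = np.arange(3.0, 7.5, 0.5)
        zs = np.arange(0.5, 8.0, 0.5)
        grid = np.array([[x, y, z] for x in ax for y in ax for z in zs])

        def locate(picks):
            # grid search: hypocentre minimising the RMS travel-time residual
            rms = [np.sqrt(np.mean((picks - travel_times(g)) ** 2)) for g in grid]
            return grid[int(np.argmin(rms))]

        clean = travel_times(true_src)
        depths = []
        for _ in range(100):                  # Monte Carlo pick perturbations
            noisy = clean + rng.normal(0.0, 0.08, clean.shape)  # 80 ms noise
            depths.append(locate(noisy)[2])

        print(f"relocated depth: {np.mean(depths):.2f} +/- "
              f"{np.std(depths):.2f} km (true depth {true_src[2]:.1f} km)")

    With only surface stations, the depth scatter is typically much larger than the epicentral scatter, which is the kind of focal-depth uncertainty the study set out to quantify.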

  2. Wind-tunnel studies of the effects of simulated damage on the aerodynamic characteristics of airplanes and missiles

    NASA Technical Reports Server (NTRS)

    Spearman, M. L.

    1979-01-01

    In order to assess the effects on static aerodynamic characteristics of battle damage to an aircraft or missile, wind tunnel studies were performed on models from which all or parts of the wing or horizontal or vertical tail had been removed. The effects of damage on the lift, longitudinal stability, lateral stability and directional stability of a swept-wing fighter are presented, along with the effects of wing removal on the control requirements of a delta-wing fighter. Results indicate that the loss of a major part of the vertical tail will probably result in the loss of the aircraft at any speed, while the loss of major parts of the horizontal tail generally results in catastrophic instability at subsonic speeds but, at low supersonic speeds, may allow the aircraft to return to friendly territory before pilot ejection. Major damage to the wing may be sustained without the loss of aircraft or pilot. The loss of some of the aerodynamic surfaces of cruise or surface-to-air missiles may result in catastrophic instability or may permit a ballistic trajectory to be maintained, depending upon the location of the lost surface with respect to the center of gravity of the missile.

  3. Stromal regulation of vessel stability by MMP14 and TGFβ

    PubMed Central

    Sounni, Nor E.; Dehne, Kerstin; van Kempen, Leon; Egeblad, Mikala; Affara, Nesrine I.; Cuevas, Ileana; Wiesen, Jane; Junankar, Simon; Korets, Lidiya; Lee, Jake; Shen, Jennifer; Morrison, Charlotte J.; Overall, Christopher M.; Krane, Stephen M.; Werb, Zena; Boudreau, Nancy; Coussens, Lisa M.

    2010-01-01

    Innate regulatory networks within organs maintain tissue homeostasis and facilitate rapid responses to damage. We identified a novel pathway regulating vessel stability in tissues that involves matrix metalloproteinase 14 (MMP14) and transforming growth factor beta 1 (TGFβ1). Whereas plasma proteins rapidly extravasate out of vasculature in wild-type mice following acute damage, short-term treatment of mice in vivo with a broad-spectrum metalloproteinase inhibitor, neutralizing antibodies to TGFβ1, or an activin-like kinase 5 (ALK5) inhibitor significantly enhanced vessel leakage. By contrast, in a mouse model of age-related dermal fibrosis, where MMP14 activity and TGFβ bioavailability are chronically elevated, or in mice that ectopically express TGFβ in the epidermis, cutaneous vessels are resistant to acute leakage. Characteristic responses to tissue damage are reinstated if the fibrotic mice are pretreated with metalloproteinase inhibitors or TGFβ signaling antagonists. Neoplastic tissues, however, are in a constant state of tissue damage and exhibit altered hemodynamics owing to hyperleaky angiogenic vasculature. In two distinct transgenic mouse tumor models, inhibition of ALK5 further enhanced vascular leakage into the interstitium and facilitated increased delivery of high molecular weight compounds into premalignant tissue and tumors. Taken together, these data define a central pathway involving MMP14 and TGFβ that mediates vessel stability and vascular response to tissue injury. Antagonists of this pathway could be therapeutically exploited to improve the delivery of therapeutics or molecular contrast agents into tissues where chronic damage or neoplastic disease limits their efficient delivery. PMID:20223936

  4. Future Directions and Challenges in Shell Stability Analysis

    NASA Technical Reports Server (NTRS)

    Arbocz, Johann

    1997-01-01

    An answer is sought to the question of why, in 1997, after so many years of concentrated research effort, one can still do no better when designing buckling-critical thin-walled shells than to use the rather conservative Lower Bound Design Philosophy of the sixties. It will be shown that with the establishment of Initial Imperfection Data Banks and the introduction of Probabilistic Design Procedures one has what appears to be a viable alternative that, when used judiciously, may lead step by step to improved shell design recommendations.
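
    As a cartoon of how an Initial Imperfection Data Bank feeds a probabilistic design procedure, the sketch below draws imperfection amplitudes from an assumed lognormal distribution and converts each to a buckling knockdown factor through an assumed Koiter-type power law; the design knockdown is then read off as a percentile at a target reliability rather than taken from a fixed lower-bound curve. Every number here is illustrative.

        # Sketch: probabilistic knockdown factor from an assumed imperfection
        # distribution (standing in for an Initial Imperfection Data Bank).
        import numpy as np

        rng = np.random.default_rng(11)

        # imperfection amplitude as a fraction of wall thickness (assumed)
        xi = rng.lognormal(mean=np.log(0.3), sigma=0.5, size=100_000)

        # assumed Koiter-type knockdown relation: lambda = 1/(1 + 2.5*xi^0.75)
        knockdown = 1.0 / (1.0 + 2.5 * xi ** 0.75)

        reliability = 0.99
        design_knockdown = np.quantile(knockdown, 1.0 - reliability)
        print(f"mean knockdown: {knockdown.mean():.3f}")
        print(f"knockdown at {reliability:.0%} reliability: {design_knockdown:.3f}")
        # compare with the fixed empirical lower bound (e.g., roughly 0.33 for
        # some cylinder geometries) used by the classical design philosophy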

  5. Application of a time probabilistic approach to seismic landslide hazard estimates in Iran

    NASA Astrophysics Data System (ADS)

    Rajabi, A. M.; Del Gaudio, V.; Capolongo, D.; Khamehchiyan, M.; Mahdavifar, M. R.

    2009-04-01

    Iran is a country located in a tectonically active belt and is prone to earthquakes and related phenomena. In recent years, several earthquakes have caused many fatalities and much damage to facilities, e.g. the Manjil (1990), Avaj (2002), Bam (2003) and Firuzabad-e-Kojur (2004) earthquakes, all of which generated many landslides. For instance, catastrophic landslides triggered by the Manjil earthquake (Ms = 7.7) in 1990 buried the village of Fatalak, killed more than 130 people and cut many important roads and other lifelines, resulting in major economic disruption. In general, earthquakes in Iran are concentrated in two major zones with different seismicity characteristics: the region of Alborz and Central Iran, and the Zagros Orogenic Belt. Understanding where seismically induced landslides are most likely to occur is crucial to reducing property damage and loss of life in future earthquakes. For this purpose, a time-probabilistic approach for earthquake-induced landslide hazard at regional scale, proposed by Del Gaudio et al. (2003), has been applied to the whole Iranian territory to provide the basis for hazard estimates. This method consists of evaluating the recurrence of seismically induced slope-failure conditions inferred from Newmark's model. First, adopting Arias intensity to quantify seismic shaking and using different Arias attenuation relations for Alborz - Central Iran and for Zagros, well-established methods of seismic hazard assessment, based on the Cornell (1968) method, were employed to obtain the occurrence probabilities for different levels of seismic shaking in a time interval of interest (50 years). Then, following Jibson (1998), empirical formulae specifically developed for Alborz - Central Iran and for Zagros were used to represent, according to Newmark's model, the relation linking Newmark's displacement Dn to Arias intensity Ia and to the slope critical acceleration ac. These formulae were employed to evaluate the critical acceleration (Ac)x for which a prefixed probability exists that seismic shaking would produce a Dn value equal to a threshold x whose exceedance would trigger a landslide. The obtained values represent the minimum slope resistance required to keep the probability of seismic landslide triggering within the prefixed value. In particular, we calculated the spatial distribution of (Ac)x for thresholds x of 10 and 2 cm, representing triggering conditions for coherent slides (e.g., slumps, block slides, slow earth flows) and disrupted slides (e.g., rock falls, rock slides, rock avalanches), respectively. We then produced a probabilistic national map showing the spatial distribution of (Ac)10 and (Ac)2 for a 10% probability of exceedance in 50 years, a hazard level equal to that commonly used in building codes. The spatial distribution of the calculated (Ac)x values can be compared with the actual in situ ac values of specific slopes to estimate whether these slopes have a significant probability of failing under seismic action in the future. As an example of the possible application of this kind of time-probabilistic map to hazard estimates, we compared the values obtained for the Manjil region with a GIS map providing the spatial distribution of estimated ac values in the same region. The spatial distribution of slopes characterized by ac < (Ac)10 was then compared with the spatial distribution of the major landslides of coherent type triggered by the Manjil earthquake. This comparison provides indications of the potential, problems and limits of the approach for the study area.
    References: Cornell, C.A., 1968: Engineering seismic risk analysis, Bull. Seism. Soc. Am., 58, 1583-1606. Del Gaudio, V., Wasowski, J., and Pierri, P., 2003: An approach to time-probabilistic evaluation of seismically induced landslide hazard, Bull. Seism. Soc. Am., 93, 557-569. Jibson, R.W., Harp, E.L., and Michael, J.A., 1998: A method for producing digital probabilistic seismic landslide hazard maps: an example from the Los Angeles, California, area, U.S. Geological Survey Open-File Report 98-113, Golden, Colorado, 17 pp.
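
    A minimal sketch of the critical-acceleration step described above: given an Arias intensity with the prefixed exceedance probability, invert a Jibson-style regression log10(Dn) = a log10(Ia) + b log10(ac) + c to obtain (Ac)x for the displacement thresholds x = 2 and 10 cm. The coefficients used are the widely quoted values from Jibson et al. (1998); the study's region-specific formulae for Alborz - Central Iran and Zagros are not reproduced here, and the hazard-level Ia is an assumed placeholder.

        # Sketch: invert a Jibson-style regression to find the critical
        # acceleration (Ac)x at which Arias intensity Ia produces a Newmark
        # displacement Dn = x. Coefficients a=1.521, b=-1.993, c=-1.546 are
        # the widely quoted Jibson et al. (1998) values (Dn in cm, Ia in m/s,
        # ac in g); the study used region-specific refits instead.
        import math

        A, B, C = 1.521, -1.993, -1.546

        def critical_acceleration(ia_ms, dn_cm):
            """(Ac)x in g such that shaking Ia yields displacement Dn = x."""
            log_ac = (math.log10(dn_cm) - A * math.log10(ia_ms) - C) / B
            return 10.0 ** log_ac

        # Ia with 10% exceedance probability in 50 years (placeholder value;
        # it would come from the probabilistic seismic hazard assessment)
        ia_475yr = 1.2   # m/s

        for x in (2.0, 10.0):   # thresholds for disrupted and coherent slides
            print(f"(Ac)_{x:g} = {critical_acceleration(ia_475yr, x):.3f} g")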

  6. DNA damage induces down-regulation of Prp19 via impairing Prp19 stability in hepatocellular carcinoma cells.

    PubMed

    Yin, Jie; Zhang, Yi-An; Liu, Tao-Tao; Zhu, Ji-Min; Shen, Xi-Zhong

    2014-01-01

    Pre-mRNA processing factor 19 (Prp19) activates the pre-mRNA spliceosome and also mediates the DNA damage response. Prp19 overexpression in cells with functional p53 leads to decreased apoptosis and increased cell survival after DNA damage. Here we show that in hepatocellular carcinoma (HCC) cells with either inactive or functional p53, Prp19 was down-regulated owing to impaired stability under chemotherapeutic drug treatment. Silencing Prp19 expression enhanced apoptosis of HCC cells with or without chemotherapeutic drug treatment. Furthermore, high levels of Prp19 may inhibit chemotherapeutic drug-induced apoptosis in HCC cells by modulating myeloid leukemia cell differentiation 1 expression. These results indicate that targeting Prp19 may potentiate the pro-apoptotic effect of chemotherapeutic agents on HCC.

  7. The mast cell stabilizer sodium cromoglycate reduces histamine release and status epilepticus-induced neuronal damage in the rat hippocampus.

    PubMed

    Valle-Dorado, María Guadalupe; Santana-Gómez, César Emmanuel; Orozco-Suárez, Sandra Adela; Rocha, Luisa

    2015-05-01

    Experiments were designed to evaluate changes in histamine release, mast cell number and neuronal damage in the hippocampus induced by status epilepticus. We also evaluated whether sodium cromoglycate, a stabilizer of mast cells with a possible stabilizing effect on neuronal membranes, was able to prevent the release of histamine, γ-aminobutyric acid (GABA) and glutamate during status epilepticus. During microdialysis experiments, rats were treated with saline (SS-SE) or sodium cromoglycate (CG-SE) and 30 min later received pilocarpine to induce status epilepticus. Twenty-four hours after the status epilepticus, the brains were used to determine neuronal damage and the number of mast cells in the hippocampus. During the status epilepticus, the SS-SE group showed enhanced release of histamine (138.5%, p = 0.005), GABA (331 ± 91%, p ≤ 0.001) and glutamate (467%, p ≤ 0.001), even after diazepam administration. One day after the status epilepticus, the SS-SE group showed an increased number of mast cells in the stratum pyramidale of CA1 (88%, p < 0.001) and neuronal damage in the dentate gyrus, CA1 and CA3. In contrast, rats from the CG-SE group showed an increased latency to the establishment of status epilepticus (p = 0.048), absence of wet-dog shakes, reduced histamine (but not GABA or glutamate) release, a lower number of mast cells (p = 0.008) and reduced neuronal damage in the hippocampus. Our data reveal that histamine, possibly from mast cells, is released in the hippocampus during status epilepticus; this effect may be involved in the subsequent neuronal damage and is diminished by sodium cromoglycate pretreatment.

  8. Brain MRI fiber-tracking reveals white matter alterations in hypertensive patients without damage at conventional neuroimaging.

    PubMed

    Carnevale, Lorenzo; D'Angelosante, Valentina; Landolfi, Alessandro; Grillea, Giovanni; Selvetella, Giulio; Storto, Marianna; Lembo, Giuseppe; Carnevale, Daniela

    2018-06-12

    Hypertension is one of the main risk factors for dementia. The subtle damage provoked by chronic high blood pressure in the brain is usually evidenced by conventional magnetic resonance imaging (MRI) as white matter (WM) hyperintensities or cerebral atrophy. However, by the time brain damage is visible it may be too late to hamper neurodegeneration. The aim of this study was to characterize a signature of early brain damage induced by hypertension, before the neurodegenerative injury manifests. The work was conducted on hypertensive and normotensive subjects with no sign of structural damage at conventional neuroimaging and no diagnosis of dementia on neuropsychological assessment. All individuals underwent clinical cardiological examination to define their hypertensive status and the related target organ damage. Additionally, patients underwent DTI-MRI scans to identify microstructural damage of the WM by probabilistic fiber-tracking, and a specific battery of tests was administered to profile their neurocognitive status. As the primary outcome of the study, we aimed to find a specific signature of fiber-tract alterations in hypertensive patients associated with impairment of the related cognitive functions. Hypertensive patients showed significant alterations in three specific WM fiber-tracts: the anterior thalamic radiation, the superior longitudinal fasciculus and the forceps minor. Hypertensive patients also scored significantly worse in the cognitive domains ascribable to the brain regions connected through those WM fiber-tracts, showing decreased performance in executive functions, processing speed, memory and paired associative learning tasks. Overall, WM fiber-tracking on MRI evidenced an early signature of damage in hypertensive patients that was otherwise undetectable by conventional neuroimaging. In perspective, this approach could allow the identification of patients in the initial stages of brain damage who could benefit from therapies aimed at limiting the transition to dementia and neurodegeneration.

  9. Deterministic and Probabilistic Creep and Creep Rupture Enhancement to CARES/Creep: Multiaxial Creep Life Prediction of Ceramic Structures Using Continuum Damage Mechanics and the Finite Element Method

    NASA Technical Reports Server (NTRS)

    Jadaan, Osama M.; Powers, Lynn M.; Gyekenyesi, John P.

    1998-01-01

    High-temperature, long-duration applications of monolithic ceramics can place their failure mode in the creep-rupture regime. A previous model advanced by the authors described a methodology by which the creep-rupture life of a loaded component can be predicted. That model was based on the life-fraction damage accumulation rule in association with the modified Monkman-Grant creep-rupture criterion. However, it did not take into account the deteriorating state of the material due to creep damage (e.g., cavitation) as time elapsed. In addition, the material creep parameters used in that life-prediction methodology were based on uniaxial creep curves displaying primary and secondary creep behavior, with no tertiary regime. The objective of this paper is to present a creep-life prediction methodology based on a modified form of the Kachanov-Rabotnov continuum damage mechanics (CDM) theory. In this theory, the uniaxial creep rate is described in terms of stress, temperature, time, and the current state of material damage. This scalar damage-state parameter is essentially an abstract measure of the current state of material damage due to creep deformation. The damage rate is assumed to vary with stress, temperature, time, and the current state of damage itself. Multiaxial creep and creep-rupture formulations of the CDM approach are presented in this paper. Parameter-estimation methodologies based on nonlinear regression analysis are also described for both isothermal constant stress states and anisothermal variable stress conditions. This creep-life prediction methodology was preliminarily added to the integrated design code CARES/Creep (Ceramics Analysis and Reliability Evaluation of Structures/Creep), a postprocessor to commercially available finite element analysis (FEA) packages. Two examples, showing comparisons between experimental and predicted creep lives of ceramic specimens, demonstrate the viability of this methodology and the CARES/Creep program.
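
    A minimal sketch, under assumed parameter values, of the Kachanov-Rabotnov ingredients named above: a scalar damage state omega grows with stress and with the current damage itself, and the creep rate is amplified as (1 - omega) shrinks. The closed-form rupture time for constant stress provides a check on the explicit integration; none of the constants are those used by CARES/Creep.

        # Minimal Kachanov-Rabotnov sketch: integrate the damage-state ODE
        #   d(omega)/dt = B * sigma**chi / (1 - omega)**phi
        # and the damage-coupled creep rate
        #   d(eps)/dt   = A * sigma**n   / (1 - omega)**n
        # under constant uniaxial stress. All parameter values are invented.

        A, n   = 1.0e-12, 4.0     # creep-rate coefficient, stress exponent
        B, chi = 1.0e-13, 5.0     # damage-rate coefficient, stress exponent
        phi    = 4.0              # damage sensitivity exponent
        sigma  = 100.0            # MPa, constant applied stress

        # closed-form rupture time for constant stress
        t_rupture = 1.0 / (B * (phi + 1.0) * sigma ** chi)
        print(f"analytic rupture time: {t_rupture:.1f} h")

        dt = t_rupture / 20000
        t = omega = eps = 0.0
        while omega < 0.999:                 # explicit Euler to near-rupture
            d_omega = B * sigma ** chi / (1.0 - omega) ** phi
            d_eps   = A * sigma ** n   / (1.0 - omega) ** n
            omega  += d_omega * dt
            eps    += d_eps * dt
            t      += dt

        print(f"numerical rupture time: {t:.1f} h, "
              f"creep strain at rupture: {eps:.4f}")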

  10. Response surface method in geotechnical/structural analysis, phase 1

    NASA Astrophysics Data System (ADS)

    Wong, F. S.

    1981-02-01

    In the response surface approach, an approximating function is fit to a long-running computer code on the basis of a limited number of code calculations. The approximating function, called the response surface, is then used in place of the code in the subsequent repetitive computations required in a statistical analysis. The procedure of response surface development and the feasibility of the method are demonstrated on a sample problem in slope stability, which is based on data from centrifuge experiments on model soil slopes and involves five random soil parameters. It is shown that a response surface can be constructed from as few as four code calculations and that the response surface is computationally far more efficient than the code calculation. Potential applications of this research include the probabilistic analysis of dynamic, complex, nonlinear soil/structure systems such as slope stability, liquefaction, and nuclear reactor safety.
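
    The procedure lends itself to a compact illustration: run the expensive code a handful of times, fit a low-order polynomial response surface, and then perform the repetitive statistical computations on the surrogate. In the sketch below the long-running code is mocked by a cheap placeholder function, and the two-parameter quadratic basis, the design points and the parameter distributions are all assumptions.

        # Sketch of the response-surface idea: fit a quadratic surrogate to a
        # few runs of an expensive code, then do cheap Monte Carlo on the
        # surrogate. `expensive_code` is a stand-in for the real analysis.
        import numpy as np

        rng = np.random.default_rng(1)

        def expensive_code(c, phi):
            # placeholder: factor of safety from cohesion c (kPa) and
            # friction angle phi (deg) -- purely illustrative
            return 0.02 * c + 0.03 * phi + 0.0004 * c * phi

        # a small design of code runs (9 here; the study used as few as 4)
        design = np.array([(c, p) for c in (10, 20, 30)
                                  for p in (25, 30, 35)], float)
        y = np.array([expensive_code(c, p) for c, p in design])

        def basis(x):
            # quadratic basis: 1, c, phi, c^2, phi^2, c*phi
            c, p = x[..., 0], x[..., 1]
            return np.stack([np.ones_like(c), c, p, c**2, p**2, c * p],
                            axis=-1)

        coef, *_ = np.linalg.lstsq(basis(design), y, rcond=None)

        # Monte Carlo on the surrogate with assumed soil-parameter scatter
        c_mc   = rng.normal(20.0, 4.0, 100_000)
        phi_mc = rng.normal(30.0, 2.5, 100_000)
        fs = basis(np.stack([c_mc, phi_mc], axis=-1)) @ coef
        print(f"P(FS < 1) = {np.mean(fs < 1.0):.4f}")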

  11. A probabilistic multidimensional approach to quantify large wood recruitment from hillslopes in mountainous-forested catchments

    NASA Astrophysics Data System (ADS)

    Cislaghi, Alessio; Rigon, Emanuel; Lenzi, Mario Aristide; Bischetti, Gian Battista

    2018-04-01

    Large wood (LW) plays a key role in the physical, chemical, environmental, and biological processes of most natural and seminatural streams. However, it is also a source of hydraulic hazard in anthropised territories. Recruitment from fluvial processes has been the subject of many studies, whereas less attention has been given to hillslope recruitment, which is linked to episodic, spatially distributed events and requires a reliable and accurate slope stability model together with a hillslope-channel transfer model. The purpose of this study is to develop an innovative LW hillslope-recruitment estimation approach that combines forest stand characteristics in a spatially distributed form, a probabilistic multidimensional slope stability model able to include the reinforcement exerted by roots, and a hillslope-channel transfer procedure. The approach was tested on a small mountain headwater catchment in the eastern Italian Alps that is prone to shallow landslides and debris flows. The slope stability model, although uncalibrated, performed accurately, both in identifying unstable areas consistent with the landslide inventory (AUC = 0.832) and in estimating LW volume compared with the volume produced by inventoried landslides (7702 m3, corresponding to a recurrence time of about 30 years on the susceptibility curve). The results showed that most LW potentially mobilised by landslides does not reach the channel network (only about 16% does), in agreement with the few data reported in other studies, as well as with the data normalized per unit length of channel and per unit length of channel per year (0-116 m3/km and 0-4 m3/km y-1). This study represents an important contribution to LW research: a rigorous, site-specific estimation of LW hillslope recruitment should be an integral part of more general studies on LW dynamics, of forest planning and management, and of the positioning of in-channel wood retention structures.
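
    The role of root reinforcement in a probabilistic stability model can be illustrated with the standard one-dimensional infinite-slope simplification (the study's model is multidimensional). In the Monte Carlo sketch below, an assumed root-cohesion term is added to the factor-of-safety numerator and the failure probability is compared with and without it; all distributions and parameter values are invented for illustration.

        # Minimal probabilistic slope-stability sketch: infinite-slope factor
        # of safety with an added root-cohesion term, by Monte Carlo.
        import numpy as np

        rng = np.random.default_rng(42)
        N = 100_000

        beta  = np.radians(35.0)                 # slope angle
        z     = 1.0                              # m, soil depth
        gamma = 18.0                             # kN/m^3, soil unit weight
        gw    = 9.81                             # kN/m^3, water unit weight
        m      = rng.uniform(0.0, 1.0, N)        # saturation ratio z_w/z
        c_soil = rng.lognormal(np.log(2.0), 0.5, N)   # kPa, soil cohesion
        c_root = rng.lognormal(np.log(5.0), 0.6, N)   # kPa, root cohesion
        phi    = np.radians(rng.normal(33.0, 3.0, N)) # friction angle

        den = gamma * z * np.sin(beta) * np.cos(beta)
        eff = (gamma * z * np.cos(beta) ** 2 - gw * m * z) * np.tan(phi)

        fs_roots = (c_soil + c_root + eff) / den
        fs_bare  = (c_soil + eff) / den

        print(f"P(FS < 1) with roots:    {np.mean(fs_roots < 1.0):.3f}")
        print(f"P(FS < 1) without roots: {np.mean(fs_bare < 1.0):.3f}")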

  12. Probabilistic Models and Generative Neural Networks: Towards an Unified Framework for Modeling Normal and Impaired Neurocognitive Functions

    PubMed Central

    Testolin, Alberto; Zorzi, Marco

    2016-01-01

    Connectionist models can be characterized within the more general framework of probabilistic graphical models, which make it possible to efficiently describe complex statistical distributions involving a large number of interacting variables. This integration allows the building of more realistic computational models of cognitive functions that more faithfully reflect the underlying neural mechanisms while providing a useful bridge to higher-level descriptions in terms of Bayesian computations. Here we discuss a powerful class of graphical models that can be implemented as stochastic, generative neural networks. These models overcome many limitations associated with classic connectionist models, for example by exploiting unsupervised learning in hierarchical architectures (deep networks) and by taking into account top-down, predictive processing supported by feedback loops. We review some recent cognitive models based on generative networks and point out promising research directions for investigating neuropsychological disorders within this approach. Though further efforts are required to fill the gap between structured Bayesian models and more realistic, biophysical models of neuronal dynamics, we argue that generative neural networks have the potential to bridge these levels of analysis, thereby improving our understanding of the neural bases of cognition and of pathologies caused by brain damage. PMID:27468262
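
    A concrete, minimal instance of the stochastic generative networks discussed is a Restricted Boltzmann Machine trained with one-step contrastive divergence. The sketch below learns two toy binary prototypes and then samples from the model by brief Gibbs chains; layer sizes, learning rate and data are arbitrary illustrative choices.

        # Tiny Restricted Boltzmann Machine trained with CD-1 on toy binary
        # patterns; all hyperparameters are arbitrary illustrative choices.
        import numpy as np

        rng = np.random.default_rng(0)
        n_vis, n_hid, lr = 6, 3, 0.1
        W = rng.normal(0, 0.1, (n_vis, n_hid))
        b_v, b_h = np.zeros(n_vis), np.zeros(n_hid)

        def sigmoid(x):
            return 1.0 / (1.0 + np.exp(-x))

        # toy dataset: two prototype patterns with 5% bit-flip noise
        protos = np.array([[1, 1, 1, 0, 0, 0], [0, 0, 0, 1, 1, 1]], float)
        data = protos[rng.integers(0, 2, 200)]
        data = np.abs(data - (rng.random(data.shape) < 0.05))

        for epoch in range(500):
            v0 = data
            ph0 = sigmoid(v0 @ W + b_h)               # hidden probabilities
            h0 = (rng.random(ph0.shape) < ph0) * 1.0  # sample hidden states
            pv1 = sigmoid(h0 @ W.T + b_v)             # reconstruction
            ph1 = sigmoid(pv1 @ W + b_h)
            W   += lr * (v0.T @ ph0 - pv1.T @ ph1) / len(data)
            b_v += lr * (v0 - pv1).mean(axis=0)
            b_h += lr * (ph0 - ph1).mean(axis=0)

        # generate: start from noise and run a short Gibbs chain
        v = (rng.random(n_vis) < 0.5) * 1.0
        for _ in range(20):
            h = (rng.random(n_hid) < sigmoid(v @ W + b_h)) * 1.0
            v = (rng.random(n_vis) < sigmoid(h @ W.T + b_v)) * 1.0
        print("sampled pattern:", v.astype(int))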

  13. Estimating earthquake-induced failure probability and downtime of critical facilities.

    PubMed

    Porter, Keith; Ramer, Kyle

    2012-01-01

    Fault trees have long been used to estimate failure risk in earthquakes, especially for nuclear power plants (NPPs). One interesting application is assessing and managing the probability that two facilities - a primary and a backup - would be rendered inoperative simultaneously in a single earthquake. Another is calculating the probabilistic time required to restore a facility to functionality, and the probability that, during any given planning period, the facility would be rendered inoperative for any specified duration. A large new peer-reviewed library of component damageability and repair-time data for the first time enables fault trees to be used to calculate the seismic risk of operational failure and downtime for a wide variety of buildings other than NPPs. With the new library, both the failure probability and the probabilistic downtime can be assessed and managed, considering a facility's unique combination of structural and non-structural components, their seismic installation conditions, and the other systems on which the facility relies. An example is offered of real computer data centres operated by a California utility. The fault trees were created and tested in collaboration with utility operators, and the failure-probability and downtime results were validated in several ways.
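
    The basic fault-tree arithmetic behind such estimates reduces to combining basic-event probabilities through AND/OR gates. The sketch below scores a toy data-centre tree and the joint failure of a primary and a backup site given a shared earthquake; all probabilities are invented, and treating the two sites as independent given the event understates the correlation of real seismic demands.

        # Toy fault-tree sketch: independent basic events combined through
        # AND/OR gates. All probabilities are invented for illustration.

        def p_or(*ps):
            """Probability that at least one independent event occurs."""
            q = 1.0
            for p in ps:
                q *= (1.0 - p)
            return 1.0 - q

        def p_and(*ps):
            """Probability that all independent events occur."""
            q = 1.0
            for p in ps:
                q *= p
            return q

        # conditional failure probabilities given the shaking level (assumed)
        p_power   = 0.05   # loss of offsite power not restored in time
        p_cooling = 0.02   # HVAC/chiller damage
        p_racks   = 0.01   # equipment-rack anchorage failure

        # a site fails if any one of these subsystems fails (OR gate)
        p_site = p_or(p_power, p_cooling, p_racks)

        # primary AND backup down, conditional on one shared earthquake
        p_both = p_and(p_site, p_site)
        print(f"P(single site fails | event) = {p_site:.4f}")
        print(f"P(primary and backup fail | same event) = {p_both:.6f}")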

  14. Probabilistic Models and Generative Neural Networks: Towards an Unified Framework for Modeling Normal and Impaired Neurocognitive Functions.

    PubMed

    Testolin, Alberto; Zorzi, Marco

    2016-01-01

    Connectionist models can be characterized within the more general framework of probabilistic graphical models, which make it possible to efficiently describe complex statistical distributions involving a large number of interacting variables. This integration allows the building of more realistic computational models of cognitive functions that more faithfully reflect the underlying neural mechanisms while providing a useful bridge to higher-level descriptions in terms of Bayesian computations. Here we discuss a powerful class of graphical models that can be implemented as stochastic, generative neural networks. These models overcome many limitations associated with classic connectionist models, for example by exploiting unsupervised learning in hierarchical architectures (deep networks) and by taking into account top-down, predictive processing supported by feedback loops. We review some recent cognitive models based on generative networks and point out promising research directions for investigating neuropsychological disorders within this approach. Though further efforts are required to fill the gap between structured Bayesian models and more realistic, biophysical models of neuronal dynamics, we argue that generative neural networks have the potential to bridge these levels of analysis, thereby improving our understanding of the neural bases of cognition and of pathologies caused by brain damage.

  15. Landslide hazard analysis for pipelines: The case of the Simonette river crossing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Grivas, D.A.; Schultz, B.C.; O'Neil, G.

    1995-12-31

    The overall objective of this study is to develop a probabilistic methodology to analyze landslide hazards and their effects on the safety of buried pipelines. The methodology incorporates a range of models that can accommodate differences in ground movement modes and in the amount and type of information available at various site locations. Two movement modes are considered, namely (a) instantaneous (catastrophic) slides, and (b) gradual ground movement, which may result in cumulative displacements over the pipeline design life (30-40 years) in excess of allowable values. Probabilistic analysis is applied in each case to address the uncertainties associated with the important factors that control slope stability. The available information ranges from relatively well-studied, instrumented installations to cases where data is limited to what can be derived from topographic and geologic maps. The methodology distinguishes between procedures applied where there is little information and those that can be used when relatively extensive data is available. Important aspects of the methodology are illustrated in a case study involving a pipeline located in Northern Alberta, Canada, in the Simonette river valley.

  16. Coastal flood implications of 1.5°C, 2°C and 2.5°C global mean temperature stabilization targets for small island nations

    NASA Astrophysics Data System (ADS)

    Rasmussen, D.; Buchanan, M. K.; Kopp, R. E.; Oppenheimer, M.

    2017-12-01

    Sea-level rise (SLR) is magnifying the frequency and severity of flooding in coastal regions. The rate and amount of global-mean SLR are a function of the trajectory of the global mean surface temperature (GMST). Temperature stabilization targets (e.g., 1.5°C or 2°C, as in the Paris Agreement) therefore have important implications for regulating coastal flood risk. Quantifying the differences in SLR impact between these and other GMST stabilization targets is necessary for assessing the benefits and harms of mitigation goals. Low-lying small island nations are particularly vulnerable to inundation and coastal flooding from SLR because building protective and resilient infrastructure may not be physically or economically feasible. For small island nations, keeping GMST below a specified threshold may be the only option for maintaining habitability. Here, we assess differences in the return levels of coastal floods for small island nations under 1.5°C, 2.0°C, and 2.5°C GMST stabilization. We employ probabilistic, localized SLR projections and long-term hourly tide gauge records to construct estimates of local flood risk, and we then estimate the number of small island nations' inhabitants at risk of permanent inundation under the different GMST stabilization targets.
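
    One common way to turn an hourly tide-gauge record into return levels is to fit an extreme-value distribution to the annual maxima and then shift its location by the projected local SLR. The sketch below does this with a Gumbel fit on synthetic data; the SLR offsets attached to each stabilization target are placeholders, not the probabilistic projections used in the study.

        # Sketch: fit a Gumbel distribution to annual-maximum water levels
        # and shift it by projected local SLR to compare flood frequencies.
        # SLR offsets below are placeholders, not the study's projections.
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(7)

        # stand-in for annual maxima extracted from an hourly gauge record
        annual_max = stats.gumbel_r.rvs(loc=1.0, scale=0.15, size=60,
                                        random_state=rng)

        loc, scale = stats.gumbel_r.fit(annual_max)

        level_100yr = stats.gumbel_r.ppf(1 - 1 / 100, loc, scale)
        print(f"present-day 100-yr flood level: {level_100yr:.2f} m")

        for target, slr in [("1.5C", 0.45), ("2.0C", 0.55), ("2.5C", 0.70)]:
            # with SLR the same absolute level is exceeded far more often
            p_exceed = stats.gumbel_r.sf(level_100yr, loc + slr, scale)
            print(f"{target}: today's 100-yr level recurs about every "
                  f"{1 / p_exceed:.1f} yr")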

  17. Effects of various asphalt binder additives/modifiers on moisture susceptible asphaltic mixtures.

    DOT National Transportation Integrated Search

    2014-01-01

    Moisture damage of asphalt concrete is defined as the loss of strength and stability caused by the active presence of moisture. The most common technique to mitigate moisture damage is using additives or modifiers with the asphalt binder or the a...

  18. The application of encapsulation material stability data to photovoltaic module life assessment

    NASA Technical Reports Server (NTRS)

    Coulbert, C. D.

    1983-01-01

    For any piece of hardware that degrades when subject to environmental and application stresses, the route or sequence that describes the degradation process may be summarized in terms of six key words: LOADS, RESPONSE, CHANGE, DAMAGE, FAILURE, and PENALTY. Applied to photovoltaic modules, these six factors form the core outline of an expanded failure analysis matrix for unifying and integrating relevant material degradation data and analyses. An important feature of this approach is the deliberate differentiation between factors such as CHANGE, DAMAGE, and FAILURE. The application of this outline to materials degradation research facilitates the distinction between quantifying material property changes and quantifying module damage or power loss with their economic consequences. The approach recommended for relating material stability data to photovoltaic module life is to use the degree of DAMAGE to (1) optical coupling, (2) encapsulant package integrity, (3) PV circuit integrity or (4) electrical isolation as the quantitative criterion for assessing module potential service life rather than simply using module power loss.

  19. XRN2 Links Transcription Termination to DNA Damage and Replication Stress

    PubMed Central

    Patidar, Praveen L.; Motea, Edward A.; Dang, Tuyen T.; Manley, James L.

    2016-01-01

    XRN2 is a 5’-3’ exoribonuclease implicated in transcription termination. Here we demonstrate an unexpected role for XRN2 in the DNA damage response involving resolution of R-loop structures and prevention of DNA double-strand breaks (DSBs). We show that XRN2 undergoes DNA damage-inducible nuclear re-localization, co-localizing with 53BP1 and R loops, in a transcription and R-loop-dependent process. XRN2 loss leads to increased R loops, genomic instability, replication stress, DSBs and hypersensitivity of cells to various DNA damaging agents. We demonstrate that the DSBs that arise with XRN2 loss occur at transcriptional pause sites. XRN2-deficient cells also exhibited an R-loop- and transcription-dependent delay in DSB repair after ionizing radiation, suggesting a novel role for XRN2 in R-loop resolution, suppression of replication stress, and maintenance of genomic stability. Our study highlights the importance of regulating transcription-related activities as a critical component in maintaining genetic stability. PMID:27437695

  20. XRN2 Links Transcription Termination to DNA Damage and Replication Stress.

    PubMed

    Morales, Julio C; Richard, Patricia; Patidar, Praveen L; Motea, Edward A; Dang, Tuyen T; Manley, James L; Boothman, David A

    2016-07-01

    XRN2 is a 5'-3' exoribonuclease implicated in transcription termination. Here we demonstrate an unexpected role for XRN2 in the DNA damage response involving resolution of R-loop structures and prevention of DNA double-strand breaks (DSBs). We show that XRN2 undergoes DNA damage-inducible nuclear re-localization, co-localizing with 53BP1 and R loops, in a transcription and R-loop-dependent process. XRN2 loss leads to increased R loops, genomic instability, replication stress, DSBs and hypersensitivity of cells to various DNA damaging agents. We demonstrate that the DSBs that arise with XRN2 loss occur at transcriptional pause sites. XRN2-deficient cells also exhibited an R-loop- and transcription-dependent delay in DSB repair after ionizing radiation, suggesting a novel role for XRN2 in R-loop resolution, suppression of replication stress, and maintenance of genomic stability. Our study highlights the importance of regulating transcription-related activities as a critical component in maintaining genetic stability.

  1. The tumour suppressor CYLD regulates the p53 DNA damage response

    PubMed Central

    Fernández-Majada, Vanesa; Welz, Patrick-Simon; Ermolaeva, Maria A.; Schell, Michael; Adam, Alexander; Dietlein, Felix; Komander, David; Büttner, Reinhard; Thomas, Roman K.; Schumacher, Björn; Pasparakis, Manolis

    2016-01-01

    The tumour suppressor CYLD is a deubiquitinase previously shown to inhibit NF-κB, MAP kinase and Wnt signalling. However, the tumour suppressing mechanisms of CYLD remain poorly understood. Here we show that loss of CYLD catalytic activity causes impaired DNA damage-induced p53 stabilization and activation in epithelial cells and sensitizes mice to chemical carcinogen-induced intestinal and skin tumorigenesis. Mechanistically, CYLD interacts with and deubiquitinates p53 facilitating its stabilization in response to genotoxic stress. Ubiquitin chain-restriction analysis provides evidence that CYLD removes K48 ubiquitin chains from p53 indirectly by cleaving K63 linkages, suggesting that p53 is decorated with complex K48/K63 chains. Moreover, CYLD deficiency also diminishes CEP-1/p53-dependent DNA damage-induced germ cell apoptosis in the nematode Caenorhabditis elegans. Collectively, our results identify CYLD as a deubiquitinase facilitating DNA damage-induced p53 activation and suggest that regulation of p53 responses to genotoxic stress contributes to the tumour suppressor function of CYLD. PMID:27561390

  2. Erythrocyte membrane stability to hydrogen peroxide is decreased in Alzheimer disease.

    PubMed

    Gilca, Marilena; Lixandru, Daniela; Gaman, Laura; Vîrgolici, Bogdana; Atanasiu, Valeriu; Stoian, Irina

    2014-01-01

    The brain and erythrocytes have similar susceptibility to free radicals. Therefore, erythrocyte abnormalities might indicate the progression of oxidative damage in Alzheimer disease (AD). The aim of this study was to investigate erythrocyte membrane stability and plasma antioxidant status in AD. Fasting blood samples (from 17 patients with AD and 14 healthy controls) were obtained, and erythrocyte membrane stability against hydrogen peroxide and 2,2'-azobis-(2-amidinopropane) dihydrochloride (AAPH), serum Trolox equivalent antioxidant capacity (TEAC), residual antioxidant activity or gap (GAP), erythrocyte catalase (CAT) activity, erythrocyte superoxide dismutase (SOD) activity, erythrocyte nonproteic thiols, and total plasma thiols were determined. A significant decrease in erythrocyte membrane stability to hydrogen peroxide was found in AD patients compared with controls (P<0.05). In contrast, CAT activity (P<0.0001) and total plasma thiols (P<0.05) were increased in patients with AD compared with controls. Our results indicate that the most satisfactory measurement of the oxidative stress level in the blood of patients with AD is erythrocyte membrane stability to hydrogen peroxide. Reduced erythrocyte membrane stability may be further evaluated as a potential peripheral marker of oxidative damage in AD.

  3. Drosophila MOF controls Checkpoint protein2 and regulates genomic stability during early embryogenesis

    PubMed Central

    2013-01-01

    Background: In Drosophila embryos, checkpoints maintain genome stability by delaying cell cycle progression, allowing time for damage repair or for the completion of DNA synthesis. Drosophila MOF, a member of the MYST histone acetyltransferase family, is an essential component of the male X hyperactivation process. Until recently, its involvement in G2/M cell cycle arrest and in ionizing-radiation-induced DNA damage pathways was not well established. Results: Drosophila MOF is highly expressed during early embryogenesis. In the present study we show that haplo-insufficiency of maternal MOF leads to spontaneous mitotic defects such as mitotic asynchrony, mitotic catastrophe and chromatid bridges in syncytial embryos. Such abnormal nuclei are eliminated and digested in the yolk tissues by the nuclear fallout mechanism. MOF negatively regulates the Drosophila checkpoint kinase 2 tumor suppressor homologue. In response to DNA damage, the checkpoint gene Chk2 (Drosophila mnk) is activated in mof mutants, thereby causing centrosomal inactivation, suggesting its role in the response to genotoxic stress. A drastic decrease in fallout nuclei in syncytial embryos derived from mof1/+; mnkp6/+ females further confirms the role of the DNA damage response gene Chk2 in ensuring the removal of abnormal nuclei from the embryonic precursor pool and maintaining genome stability. That mof mutants undergo DNA damage is further demonstrated by the increased number of single- and double-stranded DNA breaks. Conclusion: mof mutants exhibited genomic instability, as evidenced by the occurrence of frequent mitotic bridges in anaphase, asynchronous nuclear divisions, disruption of the cytoskeleton and inactivation of centrosomes, finally leading to DNA damage. Our findings are consistent with earlier reports in mammals that reduced levels of MOF result in increased genomic instability while total loss results in lethality. The study can be extended using Drosophila as a model system to examine the interaction of MOF with the known components of the DNA damage pathway. PMID:23347679

  4. Drosophila MOF controls Checkpoint protein2 and regulates genomic stability during early embryogenesis.

    PubMed

    Pushpavalli, Sreerangam N C V L; Sarkar, Arpita; Ramaiah, M Janaki; Chowdhury, Debabani Roy; Bhadra, Utpal; Pal-Bhadra, Manika

    2013-01-24

    In Drosophila embryos, checkpoints maintain genome stability by delaying cell cycle progression, allowing time for damage repair or for the completion of DNA synthesis. Drosophila MOF, a member of the MYST histone acetyltransferase family, is an essential component of the male X hyperactivation process. Until recently, its involvement in G2/M cell cycle arrest and in ionizing-radiation-induced DNA damage pathways was not well established. Drosophila MOF is highly expressed during early embryogenesis. In the present study we show that haplo-insufficiency of maternal MOF leads to spontaneous mitotic defects such as mitotic asynchrony, mitotic catastrophe and chromatid bridges in syncytial embryos. Such abnormal nuclei are eliminated and digested in the yolk tissues by the nuclear fallout mechanism. MOF negatively regulates the Drosophila checkpoint kinase 2 tumor suppressor homologue. In response to DNA damage, the checkpoint gene Chk2 (Drosophila mnk) is activated in mof mutants, thereby causing centrosomal inactivation, suggesting its role in the response to genotoxic stress. A drastic decrease in fallout nuclei in syncytial embryos derived from mof¹/+; mnkp⁶/+ females further confirms the role of the DNA damage response gene Chk2 in ensuring the removal of abnormal nuclei from the embryonic precursor pool and maintaining genome stability. That mof mutants undergo DNA damage is further demonstrated by the increased number of single- and double-stranded DNA breaks. mof mutants exhibited genomic instability, as evidenced by the occurrence of frequent mitotic bridges in anaphase, asynchronous nuclear divisions, disruption of the cytoskeleton and inactivation of centrosomes, finally leading to DNA damage. Our findings are consistent with earlier reports in mammals that reduced levels of MOF result in increased genomic instability while total loss results in lethality. The study can be extended using Drosophila as a model system to examine the interaction of MOF with the known components of the DNA damage pathway.

  5. Shielded-Twisted-Pair Cable Model for Chafe Fault Detection via Time-Domain Reflectometry

    NASA Technical Reports Server (NTRS)

    Schuet, Stefan R.; Timucin, Dogan A.; Wheeler, Kevin R.

    2012-01-01

    This report details the development, verification, and validation of an innovative physics-based model of electrical signal propagation through shielded-twisted-pair cable, which is commonly found on aircraft and offers an ideal proving ground for detection of small holes in a shield well before catastrophic damage occurs. The accuracy of this model is verified through numerical electromagnetic simulations using a commercially available software tool. The model is shown to be representative of more realistic (analytically intractable) cable configurations as well. A probabilistic framework is developed for validating the model accuracy with reflectometry data obtained from real aircraft-grade cables chafed in the laboratory.
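
    The underlying TDR principle can be stated in a few lines: a chafe that locally changes the cable's characteristic impedance produces an echo of amplitude Gamma = (Zf - Z0)/(Zf + Z0) arriving after the two-way travel time to the fault. The sketch below evaluates this first-order picture with assumed values; the report itself develops a full physics-based shielded-twisted-pair model and a probabilistic detection framework precisely because such echoes are small.

        # Toy TDR principle: a chafe modelled as a short section whose
        # characteristic impedance differs from the healthy cable's Z0.
        # All values are illustrative assumptions.

        Z0      = 100.0      # ohm, healthy cable impedance (assumed)
        Z_fault = 110.0      # ohm, impedance over the chafed section (assumed)
        d_fault = 12.0       # m, distance from the instrument to the chafe
        vp      = 0.7 * 3e8  # m/s, assumed propagation velocity (0.7 c)

        gamma = (Z_fault - Z0) / (Z_fault + Z0)   # reflection coefficient
        t_echo = 2.0 * d_fault / vp               # two-way travel time

        print(f"reflection coefficient: {gamma:+.4f}")
        print(f"echo delay: {t_echo * 1e9:.1f} ns")
        print(f"echo amplitude for a 1 V step: {gamma:+.4f} V "
              "(small -- hence the probabilistic detection framework)")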

  6. Applicability of Linear Analysis in Probabilistic Estimation of Seismic Building Damage to Reinforced-Concrete Structures

    DTIC Science & Technology

    2012-06-01

    [Extraction residue from the report's list of figures and plot axes; the recoverable content refers to ground-motion time histories (acceleration in g versus time in seconds) for the Imperial Valley 10/15/79 23:16 Chihuahua record and the San Fernando 02/09/71 14:00 LA Hollywood Storage Lot record.]

  7. A machine-learning approach for damage detection in aircraft structures using self-powered sensor data

    NASA Astrophysics Data System (ADS)

    Salehi, Hadi; Das, Saptarshi; Chakrabartty, Shantanu; Biswas, Subir; Burgueño, Rigoberto

    2017-04-01

    This study proposes a novel strategy for damage identification in aircraft structures. The strategy was evaluated based on the simulation of binary data generated from self-powered wireless sensors employing a pulse switching architecture. The energy-aware pulse switching communication protocol uses single pulses instead of multi-bit packets for information delivery, resulting in discrete binary data. A system employing this energy-efficient technology must deal with time-delayed binary data due to the management of power budgets for sensing and communication. This paper presents an intelligent machine-learning framework based on a combination of low-rank matrix decomposition and pattern recognition (PR) methods. Further, data fusion is employed as part of the framework to take into account the effect of data time delay on its interpretation. Simulated time-delayed binary data from self-powered sensors were used to determine damage-indicator variables. The performance and accuracy of the damage detection strategy were examined and tested for the case of an aircraft horizontal stabilizer. Damage states were simulated on a finite element model by reducing stiffness in a region of the stabilizer's skin. The proposed strategy shows satisfactory performance in identifying the presence and location of damage, even with noisy and incomplete data. It is concluded that PR is a promising machine-learning approach for damage detection with time-delayed binary data from novel self-powered wireless sensors.
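
    To make the pattern-recognition step concrete, the sketch below treats the pulse counts accumulated by each self-powered sensor over a time window as the feature vector and assigns a damage class with a nearest-centroid rule. The sensor count, Poisson pulse rates, damage zones and classifier choice are all illustrative assumptions, not the framework evaluated in the paper.

        # Sketch: classify damage location from binary pulse counts using a
        # nearest-centroid rule. Rates, zones and geometry are invented.
        import numpy as np

        rng = np.random.default_rng(5)
        n_sensors, n_train, n_test = 8, 200, 50
        classes = ["healthy", "damage_zone_A", "damage_zone_B"]

        # assumed mean pulse rates per sensor for each structural state
        rates = {
            "healthy":       np.full(n_sensors, 2.0),
            "damage_zone_A": np.array([2, 2, 6, 6, 2, 2, 2, 2], float),
            "damage_zone_B": np.array([2, 2, 2, 2, 2, 6, 6, 2], float),
        }

        def sample(state, n):
            # pulse counts over one time window, Poisson by assumption
            return rng.poisson(rates[state], size=(n, n_sensors)).astype(float)

        train = {c: sample(c, n_train) for c in classes}
        centroids = {c: x.mean(axis=0) for c, x in train.items()}

        def classify(x):
            return min(centroids, key=lambda c: np.linalg.norm(x - centroids[c]))

        correct = 0
        for c in classes:
            for x in sample(c, n_test):
                correct += (classify(x) == c)
        print(f"accuracy: {correct / (len(classes) * n_test):.2%}")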

  8. Radiation damage in cubic ZrO 2 and yttria-stabilized zirconia from molecular dynamics simulations

    DOE PAGES

    Aidhy, Dilpuneet S.; Zhang, Yanwen; Weber, William J.

    2014-11-20

    Here, we perform molecular dynamics simulations on cubic ZrO 2 and yttria-stabilized zirconia (YSZ) to elucidate the defect cluster formation resulting from radiation damage and to evaluate the impact of Y dopants. Interstitial clusters composed of split-interstitial building blocks, i.e., Zr-Zr or Y-Zr, are formed. Moreover, oxygen vacancies control cation defect migration: in their presence, Zr interstitials aggregate to form split-interstitials, whereas in their absence Zr interstitials remain immobile as isolated single interstitials. Y-doping prevents interstitial cluster formation owing to the sequestration of oxygen vacancies.

  9. Robust evaluation of time series classification algorithms for structural health monitoring

    NASA Astrophysics Data System (ADS)

    Harvey, Dustin Y.; Worden, Keith; Todd, Michael D.

    2014-03-01

    Structural health monitoring (SHM) systems provide real-time damage and performance information for civil, aerospace, and mechanical infrastructure through analysis of structural response measurements. The supervised learning methodology for data-driven SHM involves computation of low-dimensional, damage-sensitive features from raw measurement data that are then used in conjunction with machine learning algorithms to detect, classify, and quantify damage states. However, these systems often suffer from performance degradation in real-world applications due to varying operational and environmental conditions. Probabilistic approaches to robust SHM system design suffer from incomplete knowledge of all conditions a system will experience over its lifetime. Info-gap decision theory enables nonprobabilistic evaluation of the robustness of competing models and systems in a variety of decision making applications. Previous work employed info-gap models to handle feature uncertainty when selecting various components of a supervised learning system, namely features from a pre-selected family and classifiers. In this work, the info-gap framework is extended to robust feature design and classifier selection for general time series classification through an efficient, interval arithmetic implementation of an info-gap data model. Experimental results are presented for a damage type classification problem on a ball bearing in a rotating machine. The info-gap framework in conjunction with an evolutionary feature design system allows for fully automated design of a time series classifier to meet performance requirements under maximum allowable uncertainty.
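
    The interval-arithmetic core of an info-gap evaluation can be shown with a fixed linear classifier: if features live in a box that grows with the uncertainty horizon h, the classifier's worst-case score is linear in h, and the robustness is the largest h at which the decision is still correct. The weights, nominal feature vector and uncertainty shape below are invented for illustration, not the evolutionary feature-design system or bearing data used in the study.

        # Minimal info-gap robustness sketch for a fixed linear classifier.
        # Features lie in an interval box [x0 - h*u, x0 + h*u]; h_hat is the
        # largest horizon at which the worst-case decision stays correct.
        import numpy as np

        w  = np.array([0.8, -0.5, 1.2])   # classifier weights (damage score)
        b  = -0.3
        x0 = np.array([1.0, 0.4, 0.6])    # nominal features, true class: damaged
        u  = np.array([0.2, 0.2, 0.3])    # per-feature uncertainty weights

        nominal_score = w @ x0 + b
        # interval arithmetic: over the box the score spans
        #   nominal -+ h * sum(|w_i| * u_i)
        spread_per_h = np.abs(w) @ u

        # robustness: largest h with the worst-case score still positive
        h_hat = nominal_score / spread_per_h if nominal_score > 0 else 0.0
        print(f"nominal score: {nominal_score:.3f}")
        print(f"info-gap robustness h_hat: {h_hat:.3f}")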

  10. "What--me worry?" "Why so serious?": a personal view on the Fukushima nuclear reactor accidents.

    PubMed

    Gallucci, Raymond

    2012-09-01

    Infrequently, it seems, a significant accident precursor or, worse, an actual accident involving a commercial nuclear power reactor occurs to remind us of the need to reexamine the safety of this important electrical power technology from a risk perspective. Twenty-five years after the major core-damage accident at Chernobyl in Ukraine, the Fukushima reactor complex in Japan experienced multiple core-damage events as a result of an earthquake-induced tsunami beyond either the earthquake or the tsunami design basis for the site. Although the tsunami itself killed tens of thousands of people and left the area devastated and virtually uninhabitable, much concern still arose from the potential radioactive releases from the damaged reactors, even though little population was left in the area to be affected. As a lifelong probabilistic safety analyst in nuclear engineering, even I must admit to a recurrence of the doubt regarding nuclear power safety after Fukushima that I had experienced after Three Mile Island and Chernobyl. This article is my attempt to "recover" my personal perspective on acceptable risk by examining both the domestic and the worldwide history of commercial nuclear power plant accidents and attempting to quantify the risk in terms of the frequency of core damage that one might glean from a review of operational history.

  11. Alcohol-induced one-carbon metabolism impairment promotes dysfunction of DNA base excision repair in adult brain.

    PubMed

    Fowler, Anna-Kate; Hewetson, Aveline; Agrawal, Rajiv G; Dagda, Marisela; Dagda, Raul; Moaddel, Ruin; Balbo, Silvia; Sanghvi, Mitesh; Chen, Yukun; Hogue, Ryan J; Bergeson, Susan E; Henderson, George I; Kruman, Inna I

    2012-12-21

    The brain is one of the major targets of chronic alcohol abuse, yet the fundamental mechanisms underlying alcohol-mediated brain damage remain unclear. The products of alcohol metabolism cause DNA damage, which under conditions of DNA repair dysfunction leads to genomic instability and neural death. We propose that the one-carbon metabolism (OCM) impairment associated with long-term chronic ethanol intake is a key factor in ethanol-induced neurotoxicity, because OCM provides cells with DNA precursors for DNA repair and methyl groups for DNA methylation, both critical for genomic stability. Using histological (immunohistochemistry and stereological counting) and biochemical assays, we show that 3-week chronic exposure of adult mice to 5% ethanol (Lieber-DeCarli diet) results in increased DNA damage, reduced DNA repair, and neuronal death in the brain. These were concomitant with compromised OCM, as evidenced by elevated homocysteine, a marker of OCM dysfunction. We conclude that OCM dysfunction plays a causal role in alcohol-induced genomic instability in the brain, because OCM status determines the effect of alcohol on DNA damage/repair and genomic stability. Short ethanol exposure, which did not disturb OCM, also did not affect the response to DNA damage, whereas an additional OCM disturbance, induced by deficiency in the key OCM enzyme methylenetetrahydrofolate reductase (MTHFR) in Mthfr(+/-) mice, exaggerated the ethanol effect on DNA repair. Thus, the impact of long-term ethanol exposure on DNA repair and genomic stability in the brain results from OCM dysfunction, and MTHFR mutations such as Mthfr 677C→T, common in the human population, may exaggerate the adverse effects of ethanol on the brain.

  12. Development of a First-of-a-Kind Deterministic Decision-Making Tool for Supervisory Control System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cetiner, Sacit M; Kisner, Roger A; Muhlheim, Michael David

    2015-07-01

    Decision-making is the process of identifying and choosing alternatives where each alternative offers a different approach or path to move from a given state or condition to a desired state or condition. The generation of consistent decisions requires that a structured, coherent process be defined, immediately leading to a decision-making framework. The overall objective of the generalized framework is for it to be adopted into an autonomous decision-making framework and tailored to specific requirements for various applications. In this context, automation is the use of computing resources to make decisions and implement a structured decision-making process with limited or no human intervention. The overriding goal of automation is to replace or supplement human decision makers with reconfigurable decision-making modules that can perform a given set of tasks reliably. Risk-informed decision making requires a probabilistic assessment of the likelihood of success given the status of the plant/systems and component health, and a deterministic assessment between plant operating parameters and reactor protection parameters to prevent unnecessary trips and challenges to plant safety systems. The implementation of the probabilistic portion of the decision-making engine of the proposed supervisory control system was detailed in previous milestone reports. Once the control options are identified and ranked based on the likelihood of success, the supervisory control system transmits the options to the deterministic portion of the platform. The deterministic multi-attribute decision-making framework uses variable sensor data (e.g., outlet temperature) and calculates where the plant is within the challenge state, its trajectory, and its margin within the controllable domain, using utility functions to evaluate the current and projected plant state space for different control decisions. Metrics to be evaluated include stability, cost, time to complete (action), power level, etc. The integration of deterministic calculations using multi-physics analyses (i.e., neutronics, thermal, and thermal-hydraulics) and probabilistic safety calculations allows for the examination and quantification of margin recovery strategies; the thermal-hydraulics analyses thereby validate the control options identified from the probabilistic assessment. Future work includes evaluating other possible metrics and computational efficiencies.
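
    The deterministic ranking step described above can be sketched as a weighted additive utility over metrics such as stability margin, cost and time to complete, multiplied by the probabilistic likelihood of success supplied by the upstream engine. The options, weights and scores below are invented for illustration and carry no connection to an actual plant procedure.

        # Sketch: rank control options by P(success) times a weighted
        # multi-attribute utility. All numbers are invented.

        options = {
            # option: (P(success), stability margin [0-1],
            #          cost [0-1, lower is better], time [0-1, lower is better])
            "reduce power 10%":  (0.95, 0.80, 0.30, 0.20),
            "trip one pump":     (0.90, 0.90, 0.50, 0.10),
            "full reactor trip": (0.99, 0.99, 0.95, 0.05),
        }

        weights = {"stability": 0.5, "cost": 0.3, "time": 0.2}   # assumed

        def utility(margin, cost, time):
            # simple additive utility; cost and time enter as (1 - value)
            return (weights["stability"] * margin
                    + weights["cost"] * (1.0 - cost)
                    + weights["time"] * (1.0 - time))

        ranked = sorted(options.items(),
                        key=lambda kv: kv[1][0] * utility(*kv[1][1:]),
                        reverse=True)
        for name, (p, m, c, t) in ranked:
            print(f"{name:20s} expected utility = {p * utility(m, c, t):.3f}")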

  13. Mutations in DONSON disrupt replication fork stability and cause microcephalic dwarfism.

    PubMed

    Reynolds, John J; Bicknell, Louise S; Carroll, Paula; Higgs, Martin R; Shaheen, Ranad; Murray, Jennie E; Papadopoulos, Dimitrios K; Leitch, Andrea; Murina, Olga; Tarnauskaitė, Žygimantė; Wessel, Sarah R; Zlatanou, Anastasia; Vernet, Audrey; von Kriegsheim, Alex; Mottram, Rachel M A; Logan, Clare V; Bye, Hannah; Li, Yun; Brean, Alexander; Maddirevula, Sateesh; Challis, Rachel C; Skouloudaki, Kassiani; Almoisheer, Agaadir; Alsaif, Hessa S; Amar, Ariella; Prescott, Natalie J; Bober, Michael B; Duker, Angela; Faqeih, Eissa; Seidahmed, Mohammed Zain; Al Tala, Saeed; Alswaid, Abdulrahman; Ahmed, Saleem; Al-Aama, Jumana Yousuf; Altmüller, Janine; Al Balwi, Mohammed; Brady, Angela F; Chessa, Luciana; Cox, Helen; Fischetto, Rita; Heller, Raoul; Henderson, Bertram D; Hobson, Emma; Nürnberg, Peter; Percin, E Ferda; Peron, Angela; Spaccini, Luigina; Quigley, Alan J; Thakur, Seema; Wise, Carol A; Yoon, Grace; Alnemer, Maha; Tomancak, Pavel; Yigit, Gökhan; Taylor, A Malcolm R; Reijns, Martin A M; Simpson, Michael A; Cortez, David; Alkuraya, Fowzan S; Mathew, Christopher G; Jackson, Andrew P; Stewart, Grant S

    2017-04-01

    To ensure efficient genome duplication, cells have evolved numerous factors that promote unperturbed DNA replication and protect, repair and restart damaged forks. Here we identify downstream neighbor of SON (DONSON) as a novel fork protection factor and report biallelic DONSON mutations in 29 individuals with microcephalic dwarfism. We demonstrate that DONSON is a replisome component that stabilizes forks during genome replication. Loss of DONSON leads to severe replication-associated DNA damage arising from nucleolytic cleavage of stalled replication forks. Furthermore, ATM- and Rad3-related (ATR)-dependent signaling in response to replication stress is impaired in DONSON-deficient cells, resulting in decreased checkpoint activity and the potentiation of chromosomal instability. Hypomorphic mutations in DONSON substantially reduce DONSON protein levels and impair fork stability in cells from patients, consistent with defective DNA replication underlying the disease phenotype. In summary, we have identified mutations in DONSON as a common cause of microcephalic dwarfism and established DONSON as a critical replication fork protein required for mammalian DNA replication and genome stability.

  14. DNA damage during S-phase mediates the proliferation-quiescence decision in the subsequent G1 via p21 expression

    PubMed Central

    Barr, Alexis R.; Cooper, Samuel; Heldt, Frank S.; Butera, Francesca; Stoy, Henriette; Mansfeld, Jörg; Novák, Béla; Bakal, Chris

    2017-01-01

    Following DNA damage caused by exogenous sources, such as ionizing radiation, the tumour suppressor p53 mediates cell cycle arrest via expression of the CDK inhibitor, p21. However, the role of p21 in maintaining genomic stability in the absence of exogenous DNA-damaging agents is unclear. Here, using live single-cell measurements of p21 protein in proliferating cultures, we show that naturally occurring DNA damage incurred over S-phase causes p53-dependent accumulation of p21 during mother G2- and daughter G1-phases. High p21 levels mediate G1 arrest via CDK inhibition, yet lower levels have no impact on G1 progression, and the ubiquitin ligases CRL4Cdt2 and SCFSkp2 couple to degrade p21 prior to the G1/S transition. Mathematical modelling reveals that a bistable switch, created by CRL4Cdt2, promotes irreversible S-phase entry by keeping p21 levels low, preventing premature S-phase exit upon DNA damage. Thus, we characterize how p21 regulates the proliferation-quiescence decision to maintain genomic stability. PMID:28317845

  15. 30 CFR 816.99 - Slides and other damage.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... § 816.99 Slides and other damage. (a) An undisturbed natural barrier shall be provided beginning at the... determined by the regulatory authority as is needed to assure stability. The barrier shall be retained in... affect on public property, health, safety, or the environment, the person who conducts the surface mining...

  16. 33 CFR Appendix B to Part 157 - Subdivision and Stability Assumptions

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... liquids from damaged compartments. (b) The permeabilities are assumed as follows: Intended space use... space located aft is involved in the damage assumption. The machinery space is calculated as a single... between adjacent transverse bulkheads except the machinery space. (b) The extent and the character of the...

  17. 33 CFR Appendix B to Part 157 - Subdivision and Stability Assumptions

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... liquids from damaged compartments. (b) The permeabilities are assumed as follows: Intended space use... space located aft is involved in the damage assumption. The machinery space is calculated as a single... between adjacent transverse bulkheads except the machinery space. (b) The extent and the character of the...

  18. 33 CFR Appendix B to Part 157 - Subdivision and Stability Assumptions

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... liquids from damaged compartments. (b) The permeabilities are assumed as follows: Intended space use... space located aft is involved in the damage assumption. The machinery space is calculated as a single... between adjacent transverse bulkheads except the machinery space. (b) The extent and the character of the...

  19. 33 CFR Appendix B to Part 157 - Subdivision and Stability Assumptions

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... liquids from damaged compartments. (b) The permeabilities are assumed as follows: Intended space use... space located aft is involved in the damage assumption. The machinery space is calculated as a single... between adjacent transverse bulkheads except the machinery space. (b) The extent and the character of the...

  20. 33 CFR Appendix B to Part 157 - Subdivision and Stability Assumptions

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... liquids from damaged compartments. (b) The permeabilities are assumed as follows: Intended space use... space located aft is involved in the damage assumption. The machinery space is calculated as a single... between adjacent transverse bulkheads except the machinery space. (b) The extent and the character of the...

  1. A Life Cycle Based Approach to Multi-Hazard Risk Assessment

    NASA Astrophysics Data System (ADS)

    Keen, A. S.; Lynett, P. J.

    2017-12-01

    Small craft harbors are important facets of many coastal communities, providing a transition from land to ocean. Because of the damage resulting from the 2010 Chile and 2011 Japan tele-tsunamis, the tsunami risk to small craft marinas in California has become an important concern. However, tsunamis represent only one of many hazards a California harbor is likely to experience. Other hazards, including wave attack, storm surge, and sea level rise, can also damage a harbor but are not typically addressed in traditional risk studies. Existing approaches to assessing small craft harbor vulnerability typically consider single events, assigning likely damage levels to each event. However, a harbor will likely experience damage from several different types of hazards over its service life, with each event contributing proportionally to the total damage state. A new, fully probabilistic risk method will be presented which considers the distribution of return periods for the various hazards over a harbor's service life. The likelihood of failure is connected to each hazard via vulnerability curves. By tabulating the expected damage levels from each event, the method provides a quantitative measure of a harbor's risk from various types of hazards as well as the likelihood of failure (i.e., cumulative risk) during the service life. Crescent City Harbor in Northern California and Kings Harbor in Southern California have been chosen as case studies. The two harbors are dynamically different and were chosen to highlight the strengths and weaknesses of the method. Findings of each study will focus on assisting stakeholders and decision makers in better understanding the relative risk to each harbor, with the goal of providing them a tool to better plan for the future maritime environment.
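
    Where the abstract above describes tabulating expected damage from each hazard over the service life, a minimal Monte Carlo sketch of that idea follows. It is illustrative only: the hazard set, annual rates, per-event damage fractions, 50-year service life, and failure threshold are all invented, not taken from the study.

        # Sketch of a life-cycle multi-hazard risk tabulation (all numbers hypothetical).
        import random

        HAZARDS = {                    # hazard: (mean annual rate, mean damage fraction per event)
            "tsunami":     (0.02, 0.30),
            "wave_attack": (0.20, 0.05),
            "storm_surge": (0.10, 0.10),
        }
        SERVICE_LIFE = 50              # years
        TRIALS = 20_000
        FAIL_THRESHOLD = 1.0           # cumulative damage at which the harbor is deemed failed

        failures = 0
        mean_damage = 0.0
        for _ in range(TRIALS):
            total = 0.0
            for rate, dmg in HAZARDS.values():
                # Bernoulli trial per year approximates Poisson arrivals for small rates;
                # each event adds an exponentially distributed damage fraction.
                n_events = sum(random.random() < rate for _ in range(SERVICE_LIFE))
                total += sum(random.expovariate(1.0 / dmg) for _ in range(n_events))
            mean_damage += total / TRIALS
            failures += total >= FAIL_THRESHOLD

        print(f"expected cumulative damage over service life: {mean_damage:.2f}")
        print(f"probability of failure during service life:   {failures / TRIALS:.3f}")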

  2. Kernel Smoothing Methods for Non-Poissonian Seismic Hazard Analysis

    NASA Astrophysics Data System (ADS)

    Woo, Gordon

    2017-04-01

    For almost fifty years, the mainstay of probabilistic seismic hazard analysis has been the methodology developed by Cornell, which assumes that earthquake occurrence is a Poisson process, and that the spatial distribution of epicentres can be represented by a set of polygonal source zones, within which seismicity is uniform. Based on Vere-Jones' use of kernel smoothing methods for earthquake forecasting, these methods were adapted in 1994 by the author for application to probabilistic seismic hazard analysis. There is no need for ambiguous boundaries of polygonal source zones, nor for the hypothesis of time independence of earthquake sequences. In Europe, there are many regions where seismotectonic zones are not well delineated, and where there is a dynamic stress interaction between events, so that they cannot be described as independent. From the Amatrice earthquake of 24 August 2016 onwards, the subsequent damaging earthquakes in Central Italy over the following months were not independent events. Removing foreshocks and aftershocks is not only an ill-defined task; it also has a material effect on seismic hazard computation. Because of the spatial dispersion of epicentres, and the clustering of magnitudes for the largest events in a sequence, which might all be around magnitude 6, the specific event causing the highest ground motion can vary from one site location to another. Where significant active faults have been clearly identified geologically, they should be modelled as individual seismic sources. The remaining background seismicity should be modelled as non-Poissonian using statistical kernel smoothing methods. This approach was first applied for seismic hazard analysis at a UK nuclear power plant two decades ago, and should be included within logic-trees for future probabilistic seismic hazard analyses at critical installations within Europe. In this paper, various salient European applications are given.
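
    As a concrete illustration of zone-free kernel smoothing, the sketch below estimates an epicentre density with a 2-D Gaussian kernel. It is a toy under stated assumptions: the three epicentres, the 50 km bandwidth, and the flat-earth degree-to-km conversion are invented, and the actual method uses kernels and bandwidths calibrated to the catalogue.

        # Kernel-smoothed seismicity density without polygonal source zones (toy data).
        import numpy as np

        epicentres = np.array([[13.2, 42.7], [13.3, 42.6], [13.1, 42.8]])  # lon, lat (hypothetical)
        bandwidth_km = 50.0
        km_per_deg = 111.0          # rough conversion, ignores latitude dependence

        def activity_density(site, events, h_km=bandwidth_km):
            """Gaussian-kernel event density (events per km^2) at a site."""
            d_km = np.linalg.norm((events - site) * km_per_deg, axis=1)
            kernel = np.exp(-0.5 * (d_km / h_km) ** 2) / (2 * np.pi * h_km ** 2)
            return kernel.sum()

        print(activity_density(np.array([13.25, 42.65]), epicentres))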

  3. [Patient management in polytrauma with injuries of the cervical spine].

    PubMed

    Kohler, A; Friedl, H P; Käch, K; Stocker, R; Trentz, O

    1994-04-01

    Complex unstable cervical spine injuries in polytraumatized patients are stabilized ventro-dorsally in a two-stage procedure. The ventral stabilization is performed on day one, with the goal of achieving primary stability for intensive care, early spinal decompression, and protection against secondary damage to the spinal cord. The additional dorsal stabilization allows early functional treatment or, in the case of spinal cord lesions, early neurorehabilitation. The combination of severe brain injury and unstable cervical spine injury is especially demanding with regard to diagnostic and therapeutic procedures.

  4. Two-dimensional probabilistic inversion of plane-wave electromagnetic data: methodology, model constraints and joint inversion with electrical resistivity data

    NASA Astrophysics Data System (ADS)

    Rosas-Carbajal, Marina; Linde, Niklas; Kalscheuer, Thomas; Vrugt, Jasper A.

    2014-03-01

    Probabilistic inversion methods based on Markov chain Monte Carlo (MCMC) simulation are well suited to quantify parameter and model uncertainty of nonlinear inverse problems. Yet, application of such methods to CPU-intensive forward models can be a daunting task, particularly if the parameter space is high dimensional. Here, we present a 2-D pixel-based MCMC inversion of plane-wave electromagnetic (EM) data. Using synthetic data, we investigate how model parameter uncertainty depends on model structure constraints using different norms of the likelihood function and the model constraints, and study the added benefits of joint inversion of EM and electrical resistivity tomography (ERT) data. Our results demonstrate that model structure constraints are necessary to stabilize the MCMC inversion results of a highly discretized model. These constraints decrease model parameter uncertainty and facilitate model interpretation. A drawback is that these constraints may lead to posterior distributions that do not fully include the true underlying model, because some of its features exhibit a low sensitivity to the EM data, and hence are difficult to resolve. This problem can be partly mitigated if the plane-wave EM data is augmented with ERT observations. The hierarchical Bayesian inverse formulation introduced and used herein is able to successfully recover the probabilistic properties of the measurement data errors and a model regularization weight. Application of the proposed inversion methodology to field data from an aquifer demonstrates that the posterior mean model realization is very similar to that derived from a deterministic inversion with similar model constraints.
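
    The pixel-based inversion above rests on standard MCMC machinery; the toy random-walk Metropolis sampler below shows that machinery on a one-parameter inverse problem. The forward model, datum, noise level, and prior bounds are invented for illustration and bear no relation to the plane-wave EM problem itself.

        # Random-walk Metropolis sampler for a toy one-parameter inverse problem.
        import math, random

        data, sigma = 2.9, 0.2
        forward = lambda m: m ** 2          # invented forward model: d = m^2

        def log_post(m):
            # Gaussian likelihood with a flat prior on [0, 5]
            if not 0.0 <= m <= 5.0:
                return -math.inf
            return -0.5 * ((forward(m) - data) / sigma) ** 2

        m, lp_m, samples = 1.0, log_post(1.0), []
        for _ in range(50_000):
            prop = m + random.gauss(0.0, 0.1)            # random-walk proposal
            lp_prop = log_post(prop)
            if random.random() < math.exp(min(0.0, lp_prop - lp_m)):
                m, lp_m = prop, lp_prop                  # accept
            samples.append(m)

        burn_in = samples[10_000:]
        print(f"posterior mean ~ {sum(burn_in) / len(burn_in):.3f}")   # ~ sqrt(2.9) ~ 1.70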

  5. Design of robust reliable control for T-S fuzzy Markovian jumping delayed neutral type neural networks with probabilistic actuator faults and leakage delays: An event-triggered communication scheme.

    PubMed

    Syed Ali, M; Vadivel, R; Saravanakumar, R

    2018-06-01

    This study examines the problem of robust reliable control for Takagi-Sugeno (T-S) fuzzy Markovian jumping delayed neural networks with probabilistic actuator faults and leakage terms, under an event-triggered communication scheme. First, the randomly occurring actuator faults and their failure rates are governed by two sets of unrelated random variables satisfying certain probabilistic failures of every actuator, and a new type of distribution-based event-triggered fault model is proposed, which accounts for the effect of transmission delay. Second, a Takagi-Sugeno (T-S) fuzzy model is adopted for the neural networks and the randomness of actuator failures is modeled in a Markov jump model framework. Third, to guarantee that the considered closed-loop system is exponentially mean-square stable with a prescribed reliable control performance, a Markov jump event-triggered scheme is designed in this paper, which is the main purpose of our study. Fourth, by constructing an appropriate Lyapunov-Krasovskii functional and employing the Newton-Leibniz formulation and integral inequalities, several delay-dependent criteria for the solvability of the addressed problem are derived. The obtained stability criteria are stated in terms of linear matrix inequalities (LMIs), which can be checked numerically using the effective LMI toolbox in MATLAB. Finally, numerical examples, among them one supported by a real-life benchmark application, are given to illustrate the effectiveness and reduced conservatism of the proposed results over existing ones. Copyright © 2018 ISA. Published by Elsevier Ltd. All rights reserved.
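
    The final numerical step described above, checking delay-dependent criteria stated as LMIs, reduces to a semidefinite feasibility problem. The sketch below checks the simplest such criterion, a Lyapunov LMI for a linear system, using cvxpy in place of the MATLAB LMI toolbox; the system matrix A is invented, and the paper's fuzzy Markovian-jump criteria are far richer than this.

        # Feasibility of the Lyapunov LMI: find P > 0 with A'P + PA < 0 (toy A).
        import cvxpy as cp
        import numpy as np

        A = np.array([[-2.0, 1.0],
                      [0.5, -3.0]])           # hypothetical stable system matrix
        n = A.shape[0]
        eps = 1e-6                            # enforces strict inequalities

        P = cp.Variable((n, n), symmetric=True)
        constraints = [P >> eps * np.eye(n),
                       A.T @ P + P @ A << -eps * np.eye(n)]
        problem = cp.Problem(cp.Minimize(0), constraints)
        problem.solve()
        print("LMI feasible (system stable):", problem.status == "optimal")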

  6. An accurate and efficient reliability-based design optimization using the second order reliability method and improved stability transformation method

    NASA Astrophysics Data System (ADS)

    Meng, Zeng; Yang, Dixiong; Zhou, Huanlin; Yu, Bo

    2018-05-01

    The first order reliability method has been extensively adopted for reliability-based design optimization (RBDO), but it shows inaccuracy in calculating the failure probability with highly nonlinear performance functions. Thus, the second order reliability method is required to evaluate the reliability accurately. However, its application to RBDO is quite challenging owing to the expensive computational cost incurred by the repeated reliability evaluations and Hessian calculations of the probabilistic constraints. In this article, a new improved stability transformation method is proposed to search for the most probable point efficiently, and the Hessian matrix is calculated by the symmetric rank-one update. The computational capability of the proposed method is illustrated and compared to existing RBDO approaches through three mathematical and two engineering examples. The comparison results indicate that the proposed method is very efficient and accurate, providing an alternative tool for the RBDO of engineering structures.
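
    For orientation, the sketch below implements the classic HL-RF iteration for locating the most probable point (MPP) in standard normal space, the baseline search that the improved stability transformation method refines; neither that refinement nor the symmetric rank-one Hessian update is reproduced here, and the performance function is a made-up example.

        # HL-RF iteration for the most probable point of a toy limit state g(u).
        import numpy as np

        def g(u):                               # failure when g < 0 (hypothetical)
            return u[0] ** 2 - u[1] + 3.0

        def grad_g(u, h=1e-6):                  # central-difference gradient
            return np.array([(g(u + h * e) - g(u - h * e)) / (2 * h)
                             for e in np.eye(len(u))])

        u = np.zeros(2)
        for _ in range(100):
            grd = grad_g(u)
            # HL-RF update: project onto the linearized limit-state surface
            u_new = grd * (grd @ u - g(u)) / (grd @ grd)
            if np.linalg.norm(u_new - u) < 1e-8:
                break
            u = u_new

        beta = np.linalg.norm(u)                # first-order reliability index
        print(f"MPP = {u}, beta = {beta:.4f}")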

  7. Superparamagnetic perpendicular magnetic tunnel junctions for true random number generators

    NASA Astrophysics Data System (ADS)

    Parks, Bradley; Bapna, Mukund; Igbokwe, Julianne; Almasi, Hamid; Wang, Weigang; Majetich, Sara A.

    2018-05-01

    Superparamagnetic perpendicular magnetic tunnel junctions are fabricated and analyzed for use in random number generators. Time-resolved resistance measurements are used as streams of bits in statistical tests for randomness. Voltage control of the thermal stability enables tuning the average speed of random bit generation up to 70 kHz in a 60 nm diameter device. In its most efficient operating mode, the device generates random bits at an energy cost of 600 fJ/bit. A narrow range of magnetic field tunes the probability of a given state from 0 to 1, offering a means of probabilistic computing.
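
    The "statistical tests for randomness" mentioned above are typically drawn from the NIST SP 800-22 suite; the sketch below applies the simplest of them, the monobit frequency test, to a simulated bitstream standing in for the device's resistance-derived bits.

        # NIST monobit (frequency) test on a simulated bitstream.
        import math, random

        bits = [random.getrandbits(1) for _ in range(100_000)]   # stand-in for device bits

        s = sum(2 * b - 1 for b in bits)                 # map {0,1} -> {-1,+1} and sum
        s_obs = abs(s) / math.sqrt(len(bits))
        p_value = math.erfc(s_obs / math.sqrt(2))

        verdict = "pass" if p_value >= 0.01 else "fail"
        print(f"monobit p-value = {p_value:.3f} -> {verdict} at the 1% level")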

  8. Wheelock during Expedition 16/STS-120 EVA 4

    NASA Image and Video Library

    2007-11-03

    ISS016-E-009179 (3 Nov. 2007) --- Astronaut Doug Wheelock, STS-120 mission specialist, participates in the mission's fourth session of extravehicular activity (EVA) while Space Shuttle Discovery is docked with the International Space Station. During the 7-hour, 19-minute spacewalk, astronaut Scott Parazynski (out of frame), mission specialist, cut a snagged wire and installed homemade stabilizers designed to strengthen the damaged solar array's structure and stability in the vicinity of the damage. Wheelock assisted from the truss by keeping an eye on the distance between Parazynski and the array. Once the repair was complete, flight controllers on the ground successfully completed the deployment of the array.

  9. Wheelock during Expedition 16/STS-120 EVA 4

    NASA Image and Video Library

    2007-11-03

    ISS016-E-009192 (3 Nov. 2007) --- Astronaut Doug Wheelock, STS-120 mission specialist, participates in the mission's fourth session of extravehicular activity (EVA) while Space Shuttle Discovery is docked with the International Space Station. During the 7-hour, 19-minute spacewalk, astronaut Scott Parazynski (out of frame), mission specialist, cut a snagged wire and installed homemade stabilizers designed to strengthen the damaged solar array's structure and stability in the vicinity of the damage. Wheelock assisted from the truss by keeping an eye on the distance between Parazynski and the array. Once the repair was complete, flight controllers on the ground successfully completed the deployment of the array.

  10. Nucleolus as an emerging hub in maintenance of genome stability and cancer pathogenesis.

    PubMed

    Lindström, Mikael S; Jurada, Deana; Bursac, Sladana; Orsolic, Ines; Bartek, Jiri; Volarevic, Sinisa

    2018-05-01

    The nucleolus is the major site for synthesis of ribosomes, complex molecular machines that are responsible for protein synthesis. A wealth of research over the past 20 years has clearly indicated that both quantitative and qualitative alterations in ribosome biogenesis can drive the malignant phenotype via dysregulation of protein synthesis. However, numerous recent proteomic, genomic, and functional studies have implicated the nucleolus in the regulation of processes that are unrelated to ribosome biogenesis, including DNA-damage response, maintenance of genome stability and its spatial organization, epigenetic regulation, cell-cycle control, stress responses, senescence, global gene expression, as well as assembly or maturation of various ribonucleoprotein particles. In this review, the focus will be on features of rDNA genes, which make them highly vulnerable to DNA damage and intra- and interchromosomal recombination as well as built-in mechanisms that prevent and repair rDNA damage, and how dysregulation of this interplay affects genome-wide DNA stability, gene expression and the balance between euchromatin and heterochromatin. We will also present the most recent insights into how malfunction of these cellular processes may be a central driving force of human malignancies, and propose a promising new therapeutic approach for the treatment of cancer.

  11. Relative binding affinity of carboxylate-, phosphonate-, and bisphosphonate-functionalized gold nanoparticles targeted to damaged bone tissue

    NASA Astrophysics Data System (ADS)

    Ross, Ryan D.; Cole, Lisa E.; Roeder, Ryan K.

    2012-10-01

    Functionalized Au NPs have received considerable recent interest for targeting and labeling cells and tissues. Damaged bone tissue can be targeted by functionalizing Au NPs with molecules exhibiting affinity for calcium. Therefore, the relative binding affinity of Au NPs surface functionalized with either carboxylate (l-glutamic acid), phosphonate (2-aminoethylphosphonic acid), or bisphosphonate (alendronate) was investigated for targeted labeling of damaged bone tissue in vitro. Targeted labeling of damaged bone tissue was qualitatively verified by visual observation and backscattered electron microscopy, and quantitatively measured by the surface density of Au NPs using field-emission scanning electron microscopy. The surface density of functionalized Au NPs was significantly greater within damaged tissue compared to undamaged tissue for each functional group. Bisphosphonate-functionalized Au NPs exhibited a greater surface density labeling damaged tissue compared to glutamic acid- and phosphonic acid-functionalized Au NPs, which was consistent with the results of previous work comparing the binding affinity of the same functionalized Au NPs to synthetic hydroxyapatite crystals. Targeted labeling was enabled not only by the functional groups but also by the colloidal stability in solution. Functionalized Au NPs were stabilized by the presence of the functional groups, and were shown to remain well dispersed in ionic (phosphate buffered saline) and serum (fetal bovine serum) solutions for up to 1 week. Therefore, the results of this study suggest that bisphosphonate-functionalized Au NPs have potential for targeted delivery to damaged bone tissue in vitro and provide motivation for in vivo investigation.

  12. Lipid stability in meat and meat products.

    PubMed

    Morrissey, P A; Sheehy, P J; Galvin, K; Kerry, J P; Buckley, D J

    1998-01-01

    Lipid oxidation is one of the main factors limiting the quality and acceptability of meats and meat products. Oxidative damage to lipids occurs in the living animal because of an imbalance between the production of reactive oxygen species and the animal's defence mechanisms. This may be brought about by a high intake of oxidized lipids or poly-unsaturated fatty acids, or a low intake of nutrients involved in the antioxidant defence system. Damage to lipids may be accentuated in the immediate post-slaughter period and, in particular, during handling, processing, storage and cooking. In recent years, pressure to reduce artificial additive use in foods has led to attempts to increase meat stability by dietary strategies. These include supplementation of animal diets with vitamin E, ascorbic acid, or carotenoids, or withdrawal of trace mineral supplements. Dietary vitamin E supplementation reduces lipid and myoglobin oxidation, and, in certain situations, drip losses in meats. However, vitamin C supplementation appears to have little, if any, beneficial effects on meat stability. The effect of feeding higher levels of carotenoids on meat stability requires further study. Some studies have demonstrated that reducing the iron and copper content of feeds improves meat stability. Post-slaughter carnosine addition may be an effective means of improving lipid stability in processed meats, perhaps in combination with dietary vitamin E supplementation.

  13. Reversing Optical Damage In LiNbO3 Switches

    NASA Technical Reports Server (NTRS)

    Gee, C. M.; Thurmond, G. D.

    1985-01-01

    One symptom of optical damage in a Ti-diffused LiNbO3 directional-coupler switch is reversed by temporarily raising the input illumination to a higher-than-normal power level. This healing phenomenon is used to restore normal operation, increase the operating-power rating, and stabilize operating characteristics at lower powers. Higher operating power is tolerated after treatment.

  14. Chronic Obstructive Pulmonary Disease: From Injury to Genomic Stability.

    PubMed

    Sergio, Luiz Philippe da Silva; de Paoli, Flavia; Mencalha, Andre Luiz; da Fonseca, Adenilson de Souza

    2017-08-01

    Chronic obstructive pulmonary disease (COPD) is the fourth leading cause of death in the world and currently presents a major global public health challenge, causing premature death from pathophysiological complications and rising economic and social burdens. COPD develops from a combination of factors following exposure to pollutants and cigarette smoke, presenting a combination of both emphysema and chronic obstructive bronchitis, which causes lung airflow limitations that are not fully reversible by bronchodilators. Oxidative stress plays a key role in the maintenance and amplification of inflammation in tissue injury, and also induces DNA damage. Once the DNA molecule is damaged, enzymatic mechanisms act to repair it. These mechanisms are specific either to the repair of oxidative damage, such as nitrogen base modifications, or to larger DNA lesions, such as double-strand breaks. In addition, there is an enzymatic mechanism for the control of telomere length. All these mechanisms contribute to cell viability and homeostasis. Thus, therapies based on the modulation of DNA repair and genomic stability could be effective in improving repair and recovery of lung tissue in patients with COPD.

  15. Probabilistic simulation of stress concentration in composite laminates

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.; Murthy, P. L. N.; Liaw, L.

    1993-01-01

    A computational methodology is described to probabilistically simulate the stress concentration factors in composite laminates. This new approach consists of coupling probabilistic composite mechanics with probabilistic finite element structural analysis. The probabilistic composite mechanics is used to probabilistically describe all the uncertainties inherent in composite material properties while probabilistic finite element is used to probabilistically describe the uncertainties associated with methods to experimentally evaluate stress concentration factors such as loads, geometry, and supports. The effectiveness of the methodology is demonstrated by using it to simulate the stress concentration factors in composite laminates made from three different composite systems. Simulated results match experimental data for probability density and for cumulative distribution functions. The sensitivity factors indicate that the stress concentration factors are influenced by local stiffness variables, by load eccentricities and by initial stress fields.

  16. Probabilistic load simulation: Code development status

    NASA Astrophysics Data System (ADS)

    Newell, J. F.; Ho, H.

    1991-05-01

    The objective of the Composite Load Spectra (CLS) project is to develop generic load models to simulate the composite load spectra that are included in space propulsion system components. The probabilistic loads thus generated are part of the probabilistic design analysis (PDA) of a space propulsion system that also includes probabilistic structural analyses, reliability, and risk evaluations. Probabilistic load simulation for space propulsion systems demands sophisticated probabilistic methodology and requires large amounts of load information and engineering data. The CLS approach is to implement a knowledge based system coupled with a probabilistic load simulation module. The knowledge base manages and furnishes load information and expertise and sets up the simulation runs. The load simulation module performs the numerical computation to generate the probabilistic loads with load information supplied from the CLS knowledge base.

  17. Efficacy and predictability of soft tissue ablation using a prototype Raman-shifted alexandrite laser

    NASA Astrophysics Data System (ADS)

    Kozub, John A.; Shen, Jin-H.; Joos, Karen M.; Prasad, Ratna; Shane Hutson, M.

    2015-10-01

    Previous research showed that mid-infrared free-electron lasers could reproducibly ablate soft tissue with little collateral damage. The potential for surgical applications motivated searches for alternative tabletop lasers providing thermally confined pulses in the 6- to 7-μm wavelength range with sufficient pulse energy, stability, and reliability. Here, we evaluate a prototype Raman-shifted alexandrite laser. We measure ablation thresholds, etch rates, and collateral damage in gelatin and cornea as a function of laser wavelength (6.09, 6.27, or 6.43 μm), pulse energy (up to 3 mJ/pulse), and spot diameter (100 to 600 μm). We find modest wavelength dependence for ablation thresholds and collateral damage, with the lowest thresholds and least damage for 6.09 μm. We find a strong spot-size dependence for all metrics. When the beam is tightly focused (~100-μm diameter), ablation requires more energy, is highly variable and less efficient, and can yield large zones of mechanical damage (for pulse energies >1 mJ). When the beam is softly focused (~300-μm diameter), ablation proceeded at surgically relevant etch rates, with reasonable reproducibility (5% to 12% within a single sample), and little collateral damage. With improvements in pulse-energy stability, this prototype laser may have significant potential for soft-tissue surgical applications.

  18. Efficacy and predictability of soft tissue ablation using a prototype Raman-shifted alexandrite laser

    PubMed Central

    Kozub, John A.; Shen, Jin-H.; Joos, Karen M.; Prasad, Ratna; Shane Hutson, M.

    2015-01-01

    Previous research showed that mid-infrared free-electron lasers could reproducibly ablate soft tissue with little collateral damage. The potential for surgical applications motivated searches for alternative tabletop lasers providing thermally confined pulses in the 6- to 7-μm wavelength range with sufficient pulse energy, stability, and reliability. Here, we evaluate a prototype Raman-shifted alexandrite laser. We measure ablation thresholds, etch rates, and collateral damage in gelatin and cornea as a function of laser wavelength (6.09, 6.27, or 6.43 μm), pulse energy (up to 3 mJ/pulse), and spot diameter (100 to 600 μm). We find modest wavelength dependence for ablation thresholds and collateral damage, with the lowest thresholds and least damage for 6.09 μm. We find a strong spot-size dependence for all metrics. When the beam is tightly focused (~100-μm diameter), ablation requires more energy, is highly variable and less efficient, and can yield large zones of mechanical damage (for pulse energies >1 mJ). When the beam is softly focused (~300-μm diameter), ablation proceeded at surgically relevant etch rates, with reasonable reproducibility (5% to 12% within a single sample), and little collateral damage. With improvements in pulse-energy stability, this prototype laser may have significant potential for soft-tissue surgical applications. PMID:26456553

  19. Loss‐of‐function mutation of rice SLAC7 decreases chloroplast stability and induces a photoprotection mechanism in rice

    PubMed Central

    Fan, Xiaolei; Wu, Jiemin; Chen, Taiyu; Tie, Weiwei; Chen, Hao; Zhou, Fei

    2015-01-01

    Plants absorb sunlight to power the photochemical reactions of photosynthesis, which can potentially damage the photosynthetic machinery. However, the mechanism that protects chloroplasts from the damage remains unclear. In this work, we demonstrated that rice (Oryza sativa L.) SLAC7 is a generally expressed membrane protein. Loss-of-function of SLAC7 caused continuous damage to the chloroplasts of mutant leaves under normal light conditions. Ion leakage and other indicators of leaf damage, such as H2O2 and abscisic acid levels, were significantly higher in slac7-1 than in the wild type. Consistently, the photosynthesis efficiency and Fv/Fm ratio of slac7-1 were significantly decreased (similar to photoinhibition). In response to chloroplast damage, slac7-1 altered its leaf morphology (curled or fused leaves) through the synergy between plant hormones and transcription factors to decrease the absorption of light, suggesting that a photoprotection mechanism against chloroplast damage was activated in slac7-1. When grown in dark conditions, slac7-1 displayed a normal phenotype. SLAC7 under the control of the AtSLAC1 promoter could partially complement the phenotypes of Arabidopsis slac1 mutants, indicating a partial conservation of SLAC protein functions. These results suggest that SLAC7 is essential for maintaining chloroplast stability in rice. PMID:25739330

  20. Joint analysis of epistemic and aleatory uncertainty in stability analysis for geo-hazard assessments

    NASA Astrophysics Data System (ADS)

    Rohmer, Jeremy; Verdel, Thierry

    2017-04-01

    Uncertainty analysis is an unavoidable task in the stability analysis of any geotechnical system. Such analysis usually relies on the safety factor SF: if SF is below some specified threshold, failure is considered possible. The objective of the stability analysis is then to estimate the failure probability P that SF falls below the specified threshold. When dealing with uncertainties, two facets should be considered, as outlined by several authors in the domain of geotechnics, namely "aleatoric uncertainty" (also named "randomness" or "intrinsic variability") and "epistemic uncertainty" (i.e., when facing "vague, incomplete or imprecise information" such as limited databases and observations or "imperfect" modelling). The benefits of separating both facets of uncertainty can be seen from a risk management perspective because: - Aleatoric uncertainty, being a property of the system under study, cannot be reduced. However, practical actions can be taken to circumvent the potentially dangerous effects of such variability; - Epistemic uncertainty, being due to the incomplete/imprecise nature of the available information, can be reduced by, e.g., increasing the number of tests (lab or in situ survey), improving the measurement methods, evaluating the calculation procedure with model tests, or confronting more information sources (expert opinions, data from literature, etc.). Uncertainty treatment in stability analysis is usually restricted to the probabilistic framework to represent both facets of uncertainty. Yet, in the domain of geo-hazard assessments (like landslides, mine pillar collapse, rockfalls, etc.), the validity of this approach can be debatable. In the present communication, we review the major criticisms available in the literature against the systematic use of probability in situations with a high degree of uncertainty. On this basis, the feasibility of using a more flexible uncertainty representation tool, namely possibility distributions (e.g., Baudrit et al., 2007), is then investigated for geo-hazard assessments. A graphical tool is then developed to explore: 1. the contribution of both types of uncertainty, aleatoric and epistemic; 2. the regions of the imprecise or random parameters which contribute the most to the imprecision on the failure probability P. The method is applied to two case studies (a mine pillar and a steep slope stability analysis, Rohmer and Verdel, 2014) to investigate the necessity for extra data acquisition on parameters whose imprecision can hardly be modelled by probabilities due to the scarcity of the available information (respectively, the extraction ratio and the cliff geometry). References: Baudrit, C., Couso, I., & Dubois, D. (2007). Joint propagation of probability and possibility in risk analysis: Towards a formal framework. International Journal of Approximate Reasoning, 45(1), 82-105. Rohmer, J., & Verdel, T. (2014). Joint exploration of regional importance of possibilistic and probabilistic uncertainty in stability analysis. Computers and Geotechnics, 61, 308-315.

  1. The probabilistic nature of preferential choice.

    PubMed

    Rieskamp, Jörg

    2008-11-01

    Previous research has developed a variety of theories explaining when and why people's decisions under risk deviate from the standard economic view of expected utility maximization. These theories are limited in their predictive accuracy in that they do not explain the probabilistic nature of preferential choice, that is, why an individual makes different choices in nearly identical situations, or why the magnitude of these inconsistencies varies in different situations. To illustrate the advantage of probabilistic theories, three probabilistic theories of decision making under risk are compared with their deterministic counterparts. The probabilistic theories are (a) a probabilistic version of a simple choice heuristic, (b) a probabilistic version of cumulative prospect theory, and (c) decision field theory. By testing the theories with the data from three experimental studies, the superiority of the probabilistic models over their deterministic counterparts in predicting people's decisions under risk becomes evident. When testing the probabilistic theories against each other, decision field theory provides the best account of the observed behavior.
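
    To make the contrast concrete, the sketch below implements one common way to build a probabilistic theory from a deterministic utility model: a logit (softmax) choice rule over expected utilities, so that nearly identical gambles produce near-chance choice probabilities. The gambles, the power-utility exponent, and the sensitivity parameter are illustrative, not taken from the article.

        # Logit choice rule over expected utilities for two toy gambles.
        import math

        def expected_utility(gamble, alpha=0.8):
            # power utility over (probability, outcome) pairs
            return sum(p * (x ** alpha) for p, x in gamble)

        def p_choose_a(gamble_a, gamble_b, sensitivity=2.0):
            ua, ub = expected_utility(gamble_a), expected_utility(gamble_b)
            return 1.0 / (1.0 + math.exp(-sensitivity * (ua - ub)))

        A = [(0.8, 4.0), (0.2, 0.0)]     # 80% chance of 4, else nothing
        B = [(1.0, 3.0)]                 # 3 for sure
        print(f"P(choose A over B) = {p_choose_a(A, B):.3f}")   # close to 0.5: near-chance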

  2. Mutations in DONSON disrupt replication fork stability and cause microcephalic dwarfism

    PubMed Central

    Reynolds, John J; Bicknell, Louise S; Carroll, Paula; Higgs, Martin R; Shaheen, Ranad; Murray, Jennie E; Papadopoulos, Dimitrios K; Leitch, Andrea; Murina, Olga; Tarnauskaitė, Žygimantė; Wessel, Sarah R; Zlatanou, Anastasia; Vernet, Audrey; von Kriegsheim, Alex; Mottram, Rachel MA; Logan, Clare V; Bye, Hannah; Li, Yun; Brean, Alexander; Maddirevula, Sateesh; Challis, Rachel C; Skouloudaki, Kassiani; Almoisheer, Agaadir; Alsaif, Hessa S; Amar, Ariella; Prescott, Natalie J; Bober, Michael B; Duker, Angela; Faqeih, Eissa; Seidahmed, Mohammed Zain; Al Tala, Saeed; Alswaid, Abdulrahman; Ahmed, Saleem; Al-Aama, Jumana Yousuf; Altmüller, Janine; Al Balwi, Mohammed; Brady, Angela F; Chessa, Luciana; Cox, Helen; Fischetto, Rita; Heller, Raoul; Henderson, Bertram D; Hobson, Emma; Nürnberg, Peter; Percin, E Ferda; Peron, Angela; Spaccini, Luigina; Quigley, Alan J; Thakur, Seema; Wise, Carol A; Yoon, Grace; Alnemer, Maha; Tomancak, Pavel; Yigit, Gökhan; Taylor, A Malcolm R; Reijns, Martin AM; Simpson, Michael A; Cortez, David; Alkuraya, Fowzan S; Mathew, Christopher G; Jackson, Andrew P; Stewart, Grant S

    2017-01-01

    To ensure efficient genome duplication, cells have evolved numerous factors that promote unperturbed DNA replication, and protect, repair and restart damaged forks. Here we identify DONSON as a novel fork protection factor, and report biallelic DONSON mutations in 29 individuals with microcephalic dwarfism. We demonstrate that DONSON is a replisome component that stabilises forks during genome replication. Loss of DONSON leads to severe replication-associated DNA damage arising from nucleolytic cleavage of stalled replication forks. Furthermore, ATR-dependent signalling in response to replication stress is impaired in DONSON-deficient cells, resulting in decreased checkpoint activity, and potentiating chromosomal instability. Hypomorphic mutations substantially reduce DONSON protein levels and impair fork stability in patient cells, consistent with defective DNA replication underlying the disease phenotype. In summary, we identify mutations in DONSON as a common cause of microcephalic dwarfism, and establish DONSON as a critical replication fork protein required for mammalian DNA replication and genome stability. PMID:28191891

  3. Mechanical stability of the cell nucleus: roles played by the cytoskeleton in nuclear deformation and strain recovery.

    PubMed

    Wang, Xian; Liu, Haijiao; Zhu, Min; Cao, Changhong; Xu, Zhensong; Tsatskis, Yonit; Lau, Kimberly; Kuok, Chikin; Filleter, Tobin; McNeill, Helen; Simmons, Craig A; Hopyan, Sevan; Sun, Yu

    2018-05-18

    Extracellular forces transmitted through the cytoskeleton can deform the cell nucleus. Large nuclear deformation increases the risk of disrupting the nuclear envelope's integrity and causing DNA damage. Mechanical stability of the nucleus defines its capability of maintaining nuclear shape by minimizing nuclear deformation and recovering strain when deformed. Understanding the deformation and recovery behavior of the nucleus requires characterization of nuclear viscoelastic properties. Here, we quantified the decoupled viscoelastic parameters of the cell membrane, cytoskeleton, and the nucleus. The results indicate that the cytoskeleton enhances nuclear mechanical stability by lowering the effective deformability of the nucleus while maintaining nuclear sensitivity to mechanical stimuli. Additionally, the cytoskeleton decreases the strain energy release rate of the nucleus and might thus prevent shape change-induced structural damage to chromatin. © 2018. Published by The Company of Biologists Ltd.

  4. Overview of Future of Probabilistic Methods and RMSL Technology and the Probabilistic Methods Education Initiative for the US Army at the SAE G-11 Meeting

    NASA Technical Reports Server (NTRS)

    Singhal, Surendra N.

    2003-01-01

    The SAE G-11 RMSL Division and Probabilistic Methods Committee meeting, sponsored by the Picatinny Arsenal during March 1-3, 2004 at the Westin Morristown, will report progress on projects for the probabilistic assessment of Army systems and launch an initiative for probabilistic education. The meeting features several Army and industry senior executives and an Ivy League professor to provide an industry/government/academia forum to review RMSL technology; reliability and probabilistic technology; reliability-based design methods; software reliability; and maintainability standards. With over 100 members, including members of national and international standing, the mission of the G-11's Probabilistic Methods Committee is to enable and facilitate rapid deployment of probabilistic technology to enhance the competitiveness of our industries through better, faster, greener, smarter, affordable and reliable product development.

  5. Stability of the cancer target DDIAS is regulated by the CHIP/HSP70 pathway in lung cancer cells

    PubMed Central

    Won, Kyoung-Jae; Im, Joo-Young; Kim, Bo-Kyung; Ban, Hyun Seung; Jung, Young-Jin; Jung, Kyeong Eun; Won, Misun

    2017-01-01

    DNA damage-induced apoptosis suppressor (DDIAS) rescues lung cancer cells from apoptosis in response to DNA damage. DDIAS is transcriptionally activated by NFATc1 and EGF-mediated ERK5/MEF2B, leading to cisplatin resistance and cell invasion. Therefore, DDIAS is suggested as a therapeutic target for lung cancer. Here, we report that DDIAS stability is regulated by E3 U-box ubiquitin ligase carboxyl terminus of HSP70-interacting protein (CHIP)-mediated proteasomal degradation. We first isolated CHIP as an interacting partner of DDIAS by yeast two-hybrid screening. CHIP physically associated with both the N- and C-terminal regions of DDIAS, targeting it for proteasomal degradation and reducing the DDIAS half-life. CHIP overexpression analyses indicated that the tetratrico peptide repeat (TPR) domain and the U-box are required for DDIAS ubiquitination. It is likely that HSP70-bound DDIAS is recruited to the CHIP E3 ligase via the TPR domain, suggesting DDIAS as a client protein of HSP70. In addition, CHIP overexpression in lung cancer cells expressing high DDIAS levels induced significant growth inhibition by enhancing DDIAS degradation. Furthermore, simultaneous CHIP overexpression and DNA damage agent treatment caused a substantial increase in the apoptosis of lung cancer cells. Taken together, these findings indicate that the stability of the DDIAS protein is regulated by CHIP/HSP70-mediated proteasomal degradation and that CHIP overexpression stimulates the apoptosis of lung cancer cells in response to DNA-damaging agents. PMID:28079882

  6. Stability of the cancer target DDIAS is regulated by the CHIP/HSP70 pathway in lung cancer cells.

    PubMed

    Won, Kyoung-Jae; Im, Joo-Young; Kim, Bo-Kyung; Ban, Hyun Seung; Jung, Young-Jin; Jung, Kyeong Eun; Won, Misun

    2017-01-12

    DNA damage-induced apoptosis suppressor (DDIAS) rescues lung cancer cells from apoptosis in response to DNA damage. DDIAS is transcriptionally activated by NFATc1 and EGF-mediated ERK5/MEF2B, leading to cisplatin resistance and cell invasion. Therefore, DDIAS is suggested as a therapeutic target for lung cancer. Here, we report that DDIAS stability is regulated by E3 U-box ubiquitin ligase carboxyl terminus of HSP70-interacting protein (CHIP)-mediated proteasomal degradation. We first isolated CHIP as an interacting partner of DDIAS by yeast two-hybrid screening. CHIP physically associated with both the N- and C-terminal regions of DDIAS, targeting it for proteasomal degradation and reducing the DDIAS half-life. CHIP overexpression analyses indicated that the tetratrico peptide repeat (TPR) domain and the U-box are required for DDIAS ubiquitination. It is likely that HSP70-bound DDIAS is recruited to the CHIP E3 ligase via the TPR domain, suggesting DDIAS as a client protein of HSP70. In addition, CHIP overexpression in lung cancer cells expressing high DDIAS levels induced significant growth inhibition by enhancing DDIAS degradation. Furthermore, simultaneous CHIP overexpression and DNA damage agent treatment caused a substantial increase in the apoptosis of lung cancer cells. Taken together, these findings indicate that the stability of the DDIAS protein is regulated by CHIP/HSP70-mediated proteasomal degradation and that CHIP overexpression stimulates the apoptosis of lung cancer cells in response to DNA-damaging agents.

  7. [An international perspective on nuclear safety]

    NASA Astrophysics Data System (ADS)

    Birkhofer, Adolf

    2002-10-01

    Safety has always been an important objective in nuclear technology. Starting with a set of sound physical principles and prudent design approaches, safety concepts have gradually been refined and now cover a wide range of provisions related to design, quality and operation. Research, the evaluation of operating experience and probabilistic risk assessments constitute an essential basis, and international co-operation plays a significant role in that context. Concerning future developments, a major objective for new reactor concepts, such as the EPR, is to practically exclude a severe core damage accident with large-scale consequences outside the plant. To cite this article: A. Birkhofer, C. R. Physique 3 (2002) 1059-1065.

  8. Upgrading the fuel-handling machine of the Novovoronezh nuclear power plant unit no. 5

    NASA Astrophysics Data System (ADS)

    Terekhov, D. V.; Dunaev, V. I.

    2014-02-01

    The calculation of safety parameters was carried out in the process of upgrading the fuel-handling machine (FHM) of the Novovoronezh nuclear power plant (NPP) unit no. 5 based on the results of quantitative safety analysis of nuclear fuel transfer operations using a dynamic logical-and-probabilistic model of the processing procedure. Specific engineering and design concepts that made it possible to reduce the probability of damaging the fuel assemblies (FAs) when performing various technological operations by an order of magnitude and introduce more flexible algorithms into the modernized FHM control system were developed. The results of pilot operation during two refueling campaigns prove that the total reactor shutdown time is lowered.

  9. Space shuttle solid rocket booster recovery system definition. Volume 2: SRB water impact Monte Carlo computer program, user's manual

    NASA Technical Reports Server (NTRS)

    1973-01-01

    The HD 220 program was created as part of the space shuttle solid rocket booster recovery system definition. The model was generated to investigate the damage to SRB components under water impact loads. The random nature of environmental parameters, such as ocean waves and wind conditions, necessitates estimation of the relative frequency of occurrence for these parameters. The nondeterministic nature of component strengths also lends itself to probabilistic simulation. The Monte Carlo technique allows the simultaneous perturbation of multiple independent parameters and provides outputs describing the probability distribution functions of the dependent parameters. This allows the user to determine the required statistics for each output parameter.
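
    In the spirit of the program described above, the sketch below perturbs several independent environmental and strength parameters simultaneously and reads off a damage probability; every distribution and the load model are invented placeholders, not the HD 220 program's actual models.

        # Monte Carlo perturbation of environment and strength parameters (toy models).
        import random

        TRIALS = 100_000
        damaged = 0
        for _ in range(TRIALS):
            wave_height = random.weibullvariate(1.5, 2.0)    # m; crude sea-state model
            wind_speed = max(random.gauss(8.0, 3.0), 0.0)    # m/s
            impact_load = 50.0 + 30.0 * wave_height + 2.0 * wind_speed   # kN, invented
            strength = random.gauss(150.0, 20.0)             # component strength, kN
            damaged += impact_load > strength

        print(f"P(component damage at water impact) ~ {damaged / TRIALS:.4f}")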

  10. [Tanning lamp radiation-induced photochemical retinal damage].

    PubMed

    Volkov, V V; Kharitonova, N N; Mal'tsev, D S

    2014-01-01

    On the basis of original clinical research a rare case of bilateral retinal damage due to tanning lamp radiation exposure is presented. Along with significant decrease of visual acuity and light sensitivity of central visual field as well as color vision impairment, bilateral macular dystrophy was found during an ophthalmoscopy and confirmed by optical coherent tomography and fluorescent angiography. Intensive retinoprotective, vascular, and antioxidant therapy was effective and led to functional improvement and stabilization of the pathologic process associated with photochemical retinal damage. A brief review of literature compares mechanisms of retinal damage by either short or long-wave near visible radiation.

  11. Students’ difficulties in probabilistic problem-solving

    NASA Astrophysics Data System (ADS)

    Arum, D. P.; Kusmayadi, T. A.; Pramudya, I.

    2018-03-01

    Many errors can be identified when students solve mathematics problems, particularly probabilistic problems. This study aims to investigate students' difficulties in solving probabilistic problems, focusing on analyzing and describing students' errors during problem solving. The research used a qualitative method with a case study strategy. The subjects were ten 9th-grade students selected by purposive sampling. The data comprise students' probabilistic problem-solving results and recorded interviews regarding their difficulties, and were analyzed descriptively using Miles and Huberman's steps. The results show that students' difficulties in solving probabilistic problems fall into three categories: first, difficulties in understanding the probabilistic problem; second, difficulties in choosing and using appropriate solution strategies; third, difficulties with the computational process. The results suggest that students are not yet able to use their knowledge and abilities to respond to probabilistic problems. It is therefore important for mathematics teachers to plan probabilistic learning that can optimize students' probabilistic thinking ability.

  12. WHEN MODEL MEETS REALITY – A REVIEW OF SPAR LEVEL 2 MODEL AGAINST FUKUSHIMA ACCIDENT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhegang Ma

    The Standardized Plant Analysis Risk (SPAR) models are a set of probabilistic risk assessment (PRA) models used by the Nuclear Regulatory Commission (NRC) to evaluate the risk of operations at U.S. nuclear power plants and provide inputs to the risk-informed regulatory process. A small number of SPAR Level 2 models have been developed, mostly for feasibility study purposes. They extend the Level 1 models to include containment systems, group plant damage states, and model containment phenomenology and accident progression in containment event trees. A severe earthquake and tsunami hit the eastern coast of Japan in March 2011 and caused significant damage to the reactors at the Fukushima Daiichi site. Station blackout (SBO), core damage, containment damage, hydrogen explosion, and intensive radioactivity release, which had previously been analyzed and assumed as postulated accident progressions in PRA models, occurred to various degrees at the multi-unit Fukushima Daiichi site. This paper reviews and compares a typical BWR SPAR Level 2 model with the "real" accident progressions and sequences that occurred in Fukushima Daiichi Units 1, 2, and 3. It shows that the SPAR Level 2 model is a robust PRA model that can very reasonably describe the accident progression of a real and complicated nuclear accident. On the other hand, the comparison shows that the SPAR model could be enhanced by incorporating some accident characteristics for a better representation of severe accident progression.

  13. Isochronal annealing effects on local structure, crystalline fraction, and undamaged region size of radiation damage in Ga-stabilized δ-Pu

    DOE PAGES

    Olive, D. T.; Booth, C. H.; Wang, D. L.; ...

    2016-07-19

    The effects on the local structure due to self-irradiation damage of Ga stabilized δ-Pu stored at cryogenic temperatures have been examined using extended x-ray absorption fine structure (EXAFS) experiments. Extensive damage, seen as a loss of local order, was evident after 72 days of storage below 15 K. The effect was observed from both the Pu and the Ga sites, although less pronounced around Ga. Isochronal annealing was performed on this sample to study the annealing processes that occur between cryogenic and room temperature storage conditions, where damage is mostly reversed. Damage fractions at various points along the annealing curve have been determined using an amplitude-ratio method, a standard EXAFS fitting, and a spherical crystallite model, and provide information complementary to the previous electrical resistivity- and susceptibility-based isochronal annealing studies. The use of a spherical crystallite model accounts for the changes in EXAFS spectra using just two parameters, namely, the crystalline fraction and the particle radius. Altogether, these results are discussed in terms of changes to the local structure around Ga and Pu throughout the annealing process and highlight the unusual role of Ga in the behavior of the lowest temperature anneals.

  14. Isochronal annealing effects on local structure, crystalline fraction, and undamaged region size of radiation damage in Ga-stabilized δ-Pu

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Olive, D. T.; Materials Science and Technology Division, Los Alamos National Laboratory, Los Alamos, New Mexico 87545; Wang, D. L.

    2016-07-21

    The effects on the local structure due to self-irradiation damage of Ga stabilized δ-Pu stored at cryogenic temperatures have been examined using extended x-ray absorption fine structure (EXAFS) experiments. Extensive damage, seen as a loss of local order, was evident after 72 days of storage below 15 K. The effect was observed from both the Pu and the Ga sites, although less pronounced around Ga. Isochronal annealing was performed on this sample to study the annealing processes that occur between cryogenic and room temperature storage conditions, where damage is mostly reversed. Damage fractions at various points along the annealing curve have been determined using an amplitude-ratio method, a standard EXAFS fitting, and a spherical crystallite model, and provide information complementary to the previous electrical resistivity- and susceptibility-based isochronal annealing studies. The use of a spherical crystallite model accounts for the changes in EXAFS spectra using just two parameters, namely, the crystalline fraction and the particle radius. Together, these results are discussed in terms of changes to the local structure around Ga and Pu throughout the annealing process and highlight the unusual role of Ga in the behavior of the lowest temperature anneals.

  15. A-Type Lamins Maintain the Positional Stability of DNA Damage Repair Foci in Mammalian Nuclei

    PubMed Central

    Mahen, Robert; Hattori, Hiroyoshi; Lee, Miyoung; Sharma, Pooja; Jeyasekharan, Anand D.; Venkitaraman, Ashok R.

    2013-01-01

    A-type lamins encoded by LMNA form a structural fibrillar meshwork within the mammalian nucleus. How this nuclear organization may influence the execution of biological processes involving DNA transactions remains unclear. Here, we characterize changes in the dynamics and biochemical interactions of lamin A/C after DNA damage. We find that DNA breakage reduces the mobility of nucleoplasmic GFP-lamin A throughout the nucleus as measured by dynamic fluorescence imaging and spectroscopy in living cells, suggestive of incorporation into stable macromolecular complexes, but does not induce the focal accumulation of GFP-lamin A at damage sites. Using a proximity ligation assay and biochemical analyses, we show that lamin A engages chromatin via histone H2AX and its phosphorylated form (γH2AX) induced by DNA damage, and that these interactions are enhanced after DNA damage. Finally, we use three-dimensional time-lapse imaging to show that LMNA inactivation significantly reduces the positional stability of DNA repair foci in living cells. This defect is partially rescued by the stable expression of GFP-lamin A. Thus collectively, our findings suggest that the dynamic structural meshwork formed by A-type lamins anchors sites of DNA repair in mammalian nuclei, providing fresh insight into the control of DNA transactions by nuclear structural organization. PMID:23658700

  16. Probabilistic Feasibility of the Reconstruction Process of Russian-Orthodox Churches

    NASA Astrophysics Data System (ADS)

    Chizhova, M.; Brunn, A.; Stilla, U.

    2016-06-01

    The cultural heritage of humanity is important for the identity of following generations and has to be preserved in a suitable manner. In the course of time, much information about former cultural constructions has been lost: some objects were strongly damaged by natural erosion or human activity, or were even destroyed. It is important to capture the still-available building parts of former buildings, mostly ruins; these data could be the basis for a virtual reconstruction. Laser scanning offers, in principle, the possibility of extensively capturing building surfaces in their actual state. In this paper we assume that 3D laser scanner data, a 3D point cloud of the partly destroyed church, are given a priori. There are many well-known algorithms that describe different methods for the extraction and detection of geometric primitives, each recognized separately in 3D point clouds. In our work we put them into a common probabilistic framework, which guides the complete reconstruction process of complex buildings, in our case Russian-Orthodox churches. Churches are modeled with their functional volumetric components, enriched with a priori known probabilities deduced from a database of Russian-Orthodox churches. Each set of components represents a complete church. The power of the new method is shown on a simulated dataset of 100 Russian-Orthodox churches.

  17. Sundanese ancient manuscripts search engine using probability approach

    NASA Astrophysics Data System (ADS)

    Suryani, Mira; Hadi, Setiawan; Paulus, Erick; Nurma Yulita, Intan; Supriatna, Asep K.

    2017-10-01

    Today, information and communication technology (ICT) touches every aspect of life, including culture and heritage. Sundanese ancient manuscripts, part of the Sundanese heritage, are in damaged condition, as is the information they contain. To preserve the information in the Sundanese ancient manuscripts and make them easier to search, a search engine has been developed. The search engine must have good computational performance. To find the best configuration for the developed search engine, three probabilistic approaches were compared in this study: the Bayesian Networks Model, Divergence from Randomness with the PL2 distribution (DFR-PL2), and DFR-PL2F, a derivative of DFR-PL2. The three probabilistic approaches are supported by a document index and three different weighting methods: term occurrence, term frequency, and TF-IDF. The experiment involved 12 Sundanese ancient manuscripts containing 474 distinct terms. The developed search engine was tested with 50 random queries for each of three types of query. The experimental results showed that, for both single and multiple queries, the best search performance was given by the combination of the PL2F approach and the TF-IDF weighting method, with an average response time of about 0.08 seconds and a Mean Average Precision (MAP) of about 0.33.
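
    A minimal sketch of the TF-IDF weighting used in the best-performing configuration is given below. The two "documents" and the query are placeholders, and the smoothed idf follows a common textbook variant rather than the exact weighting specified in the paper.

        # TF-IDF scoring of a query against a tiny document collection.
        import math

        docs = ["carita pantun sunda kuno", "naskah sunda kuno carita carita"]
        tokenized = [d.split() for d in docs]
        N = len(tokenized)

        def tf_idf(term, doc_tokens):
            tf = doc_tokens.count(term) / len(doc_tokens)
            df = sum(term in d for d in tokenized)
            idf = math.log((1 + N) / (1 + df)) + 1.0      # smoothed idf
            return tf * idf

        def score(query):
            return [(i, sum(tf_idf(t, d) for t in query.split()))
                    for i, d in enumerate(tokenized)]

        print(sorted(score("carita sunda"), key=lambda s: -s[1]))   # ranked (doc, score)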

  18. A probabilistic method for determining the volume fraction of pre-embedded capsules in self-healing materials

    NASA Astrophysics Data System (ADS)

    Lv, Zhong; Chen, Huisu

    2014-10-01

    Autonomous healing of cracks using pre-embedded capsules containing a healing agent is becoming a promising approach to restore the strength of damaged structures. In addition to the material properties, the size and volume fraction of the capsules influence crack healing in the matrix. Understanding the crack-capsule interaction is therefore critical in the development and design of structures made of self-healing materials. Assuming that the pre-embedded capsules are randomly dispersed, we theoretically model the interaction of a flat ellipsoidal crack with capsules and determine the probability of a crack intersecting the pre-embedded capsules, i.e. the self-healing probability. We also develop a probabilistic model of a crack simultaneously meeting capsules and catalyst carriers in a two-component self-healing matrix. Using a risk-based healing approach, we determine the volume fraction and size of the pre-embedded capsules required to achieve a given self-healing probability. To understand the effect of capsule shape on self-healing, we also theoretically model crack interaction with spherical and cylindrical capsules. We compare the results of our theoretical model with Monte-Carlo simulations of crack interaction with capsules. The formulae presented in this paper provide guidelines on material selection and maintenance for engineers working with self-healing structures.
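
    A minimal Monte-Carlo check in the spirit of the simulations mentioned above can be sketched as follows. The geometry is deliberately simplified (one circular planar crack and spherical capsules in a box), and all dimensions are illustrative rather than taken from the paper.

```python
# Monte-Carlo estimate of P(crack intersects >= 1 capsule) vs. volume fraction.
# Assumed geometry: circular planar crack of radius a, spherical capsules of
# radius r with centres uniform in a cube of side L (illustrative values only).
import numpy as np

rng = np.random.default_rng(0)

def hit_probability(phi, r=0.5, a=2.0, L=20.0, trials=2000):
    """Self-healing probability for capsule volume fraction phi."""
    v_cap = 4.0 / 3.0 * np.pi * r**3
    n = int(phi * L**3 / v_cap)                   # number of capsules in the box
    hits = 0
    for _ in range(trials):
        c = rng.uniform(-L / 2, L / 2, size=(n, 3))   # capsule centres
        rho = np.hypot(c[:, 0], c[:, 1])              # radial distance in crack plane
        d = np.where(rho <= a, np.abs(c[:, 2]),       # distance centre -> crack disc
                     np.hypot(rho - a, c[:, 2]))
        hits += np.any(d <= r)
    return hits / trials

for phi in (0.005, 0.01, 0.02, 0.05):
    print(f"phi={phi:.3f}  P(hit)~{hit_probability(phi):.2f}")
```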

  19. Anesthesia patient risk: a quantitative approach to organizational factors and risk management options.

    PubMed

    Paté-Cornell, M E; Lakats, L M; Murphy, D M; Gaba, D M

    1997-08-01

    The risk of death or brain damage to anesthesia patients is relatively low, particularly for healthy patients in modern hospitals. When an accident does occur, its cause is usually an error made by the anesthesiologist, either in triggering the accident sequence or in failing to take timely corrective measures. This paper presents a pilot study which explores the feasibility of extending probabilistic risk analysis (PRA) of anesthesia accidents to assess the effects of human and management components on patient risk. We first develop a classic PRA model for the patient risk per operation. We then link the probabilities of the different accident types to their root causes using a probabilistic analysis of the performance shaping factors. These factors are described here as the "state of the anesthesiologist", characterized in terms of both alertness and competence. We then analyze the effects of different management factors that affect the state of the anesthesiologist, and we compute the risk reduction benefits of several risk management policies. Our data sources include the published version of the Australian Incident Monitoring Study as well as expert opinions. We conclude that patient risk could be reduced substantially by closer supervision of residents, the use of anesthesia simulators both in training and for periodic recertification, and regular medical examinations for all anesthesiologists.
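
    The structure of such a model can be sketched in a few lines. Every probability below is hypothetical and serves only to show how accident-type probabilities are conditioned on the "state of the anesthesiologist"; it is not the paper's calibrated model.

```python
# Minimal PRA structure sketch (all numbers hypothetical).
# Risk per operation = sum over states of P(state) * [per-type accident chain],
# with the chain = P(initiating event) * P(no timely recovery) * P(severe outcome).

states = {                 # "state of the anesthesiologist": P(state)
    "alert_competent": 0.90,
    "degraded":        0.10,   # e.g. fatigue, inexperience
}
p_trigger    = {"alert_competent": 1e-4, "degraded": 5e-4}   # per accident type, per op
p_no_recover = {"alert_competent": 0.05, "degraded": 0.30}
p_severe = 0.2             # death or brain damage given an unrecovered accident
n_accident_types = 6       # e.g. circuit disconnect, esophageal intubation, ...

risk = sum(
    p_s * n_accident_types * p_trigger[s] * p_no_recover[s] * p_severe
    for s, p_s in states.items()
)
print(f"P(death or brain damage) per operation ~ {risk:.2e}")
```

    A management policy (e.g. closer supervision) would then be evaluated by shifting the state probabilities and recomputing the risk.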

  20. MRAC Control with Prior Model Knowledge for Asymmetric Damaged Aircraft

    PubMed Central

    Zhang, Jing

    2015-01-01

    This paper develops a novel state-tracking multivariable model reference adaptive control (MRAC) technique that utilizes prior knowledge of plant models to recover the control performance of an aircraft with asymmetric structural damage. A modified linear model representation is given. Using prior knowledge of the structural damage, a polytope linear parameter-varying (LPV) model is derived to cover all damage conditions of concern. An MRAC method is developed for the polytope model, whose stability and asymptotic error convergence are theoretically proved. The proposed technique reduces the number of parameters to be adapted, thus decreasing computational cost, and requires less input information. The method is validated by simulations of the NASA generic transport model (GTM) with damage. PMID:26180839

  1. Posterior stabilized versus cruciate retaining total knee arthroplasty designs: conformity affects the performance reliability of the design over the patient population.

    PubMed

    Ardestani, Marzieh M; Moazen, Mehran; Maniei, Ehsan; Jin, Zhongmin

    2015-04-01

    Commercially available fixed-bearing knee prostheses are mainly divided into two groups: posterior stabilized (PS) versus cruciate retaining (CR). Despite widespread comparative studies, the debate continues regarding the superiority of one type over the other. This study used combined finite element (FE) simulation and principal component analysis (PCA) to evaluate the "reliability" and "sensitivity" of two PS designs versus two CR designs over a patient population. Four fixed-bearing implants were chosen: PFC (DePuy), PFC Sigma (DePuy), NexGen (Zimmer) and Genesis II (Smith & Nephew). Using PCA, a large probabilistic knee joint motion and loading database was generated from the available experimental data in the literature. The probabilistic knee joint data were applied to each implant in an FE simulation to calculate the potential envelopes of kinematics (i.e. anterior-posterior [AP] displacement and internal-external [IE] rotation) and contact mechanics. The performance envelopes were considered an indicator of performance reliability. For each implant, PCA was used to highlight how much the implant performance was influenced by changes in each input parameter (sensitivity). Results showed that (1) conformity directly affected the reliability of the knee implant over a patient population, such that lower-conformity designs (PS or CR) had higher kinematic variability and were more influenced by AP force and IE torque, (2) contact reliability did not differ noticeably among the designs, and (3) the CR or PS design affected the relative rank of the critical factors that influenced the reliability of each design. Such investigations illuminate the underlying biomechanics of various implant designs and can be used to estimate the potential performance of an implant design over a patient population. Copyright © 2015 IPEM. Published by Elsevier Ltd. All rights reserved.
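
    The PCA-based generation of a probabilistic input database can be illustrated as follows; the synthetic axial-force curves below stand in for the experimental gait data used in the study, and the retained mode count is arbitrary.

```python
# Sketch: fit PCA to a few load waveforms, then sample PC scores to generate
# a probabilistic envelope of plausible loading profiles (synthetic data).
import numpy as np

rng = np.random.default_rng(1)
t = np.linspace(0, 1, 101)                 # % of gait cycle
trials = np.array([2000 + 400 * np.sin(2 * np.pi * (t + d))
                   + 100 * rng.standard_normal(t.size)
                   for d in rng.uniform(-0.05, 0.05, 25)])   # 25 force curves [N]

mean = trials.mean(axis=0)
X = trials - mean
U, s, Vt = np.linalg.svd(X, full_matrices=False)   # rows of Vt = principal modes
k = 3                                              # modes retained
scores_sd = s[:k] / np.sqrt(len(trials) - 1)       # std dev of each PC score

def sample_profile():
    """Draw one plausible loading profile by sampling PC scores ~ N(0, sd^2)."""
    z = rng.standard_normal(k) * scores_sd
    return mean + z @ Vt[:k]

envelope = np.array([sample_profile() for _ in range(500)])
print(envelope.min(), envelope.max())      # bounds of the performance envelope input
```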

  2. A probabilistic Hu-Washizu variational principle

    NASA Technical Reports Server (NTRS)

    Liu, W. K.; Belytschko, T.; Besterfield, G. H.

    1987-01-01

    A Probabilistic Hu-Washizu Variational Principle (PHWVP) for the Probabilistic Finite Element Method (PFEM) is presented. This formulation is developed for both linear and nonlinear elasticity. The PHWVP allows incorporation of the probabilistic distributions for the constitutive law, compatibility condition, equilibrium, domain and boundary conditions into the PFEM. Thus, a complete probabilistic analysis can be performed where all aspects of the problem are treated as random variables and/or fields. The Hu-Washizu variational formulation is available in many conventional finite element codes thereby enabling the straightforward inclusion of the probabilistic features into present codes.

  3. Probabilistic seismic vulnerability and risk assessment of stone masonry structures

    NASA Astrophysics Data System (ADS)

    Abo El Ezz, Ahmad

    Earthquakes represent major natural hazards that regularly impact the built environment in seismic-prone areas worldwide and cause considerable social and economic losses. The high losses incurred following past destructive earthquakes promoted the need for assessment of the seismic vulnerability and risk of existing buildings. Many historic buildings in the old urban centers of Eastern Canada, such as Old Quebec City, are built of stone masonry and represent invaluable architectural and cultural heritage. These buildings were built to resist gravity loads only and generally offer poor resistance to lateral seismic loads. Seismic vulnerability assessment of stone masonry buildings is therefore the first necessary step in developing seismic retrofitting and pre-disaster mitigation plans. The objective of this study is to develop a set of probability-based analytical tools for efficient seismic vulnerability and uncertainty analysis of stone masonry buildings. A simplified probabilistic analytical methodology for vulnerability modelling of stone masonry buildings, with systematic treatment of uncertainties throughout the modelling process, is developed in the first part of this study. Building capacity curves are developed using a simplified mechanical model. A displacement-based procedure is used to develop damage state fragility functions in terms of spectral displacement response, based on drift thresholds of stone masonry walls. A simplified probabilistic seismic demand analysis is proposed to capture the combined uncertainty in capacity and demand on fragility functions. In the second part, a robust analytical procedure for the development of seismic hazard compatible fragility and vulnerability functions is proposed. The results are given by sets of seismic hazard compatible vulnerability functions in terms of a structure-independent intensity measure (e.g. spectral acceleration) that can be used for seismic risk analysis. The procedure is very efficient for conducting rapid vulnerability assessment of stone masonry buildings; with modification of the input structural parameters, it can be adapted and applied to any other building class. A sensitivity analysis of the seismic vulnerability modelling is conducted to quantify the uncertainties associated with each of the input parameters. The proposed methodology was validated through a scenario-based seismic risk assessment of existing buildings in Old Quebec City. The procedure for hazard compatible vulnerability modelling was used to develop seismic fragility functions in terms of spectral acceleration representative of the inventoried buildings. A total of 1220 buildings were considered. The assessment was performed for a scenario event of magnitude 6.2 at a distance of 15 km, with a probability of exceedance of 2% in 50 years. The study showed that most of the expected damage is concentrated in the old brick and stone masonry buildings.
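
    Damage-state fragility functions of the kind developed here are commonly expressed in lognormal form, P(DS >= ds | IM) = Phi(ln(IM/theta)/beta). The sketch below uses illustrative medians and dispersions, not the values derived in this study.

```python
# Lognormal fragility curves for a few damage states (illustrative parameters).
import numpy as np
from scipy.stats import norm

def fragility(im, theta, beta):
    """P(damage state reached | intensity measure im), lognormal form."""
    return norm.cdf(np.log(im / theta) / beta)

sa = np.linspace(0.05, 1.5, 30)            # spectral acceleration [g]
params = {"slight": (0.15, 0.6),           # (median theta [g], dispersion beta)
          "moderate": (0.35, 0.6),
          "extensive": (0.60, 0.6)}
for ds, (theta, beta) in params.items():
    print(ds, fragility(sa, theta, beta)[::10].round(2))
```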

  4. Probabilistic analysis for identifying the driving force of protein folding

    NASA Astrophysics Data System (ADS)

    Tokunaga, Yoshihiko; Yamamori, Yu; Matubayasi, Nobuyuki

    2018-03-01

    Toward identifying the driving force of protein folding, energetics was analyzed in water for Trp-cage (20 residues), protein G (56 residues), and ubiquitin (76 residues) at their native (folded) and heat-denatured (unfolded) states. All-atom molecular dynamics simulation was conducted, and the hydration effect was quantified by the solvation free energy. The free-energy calculation was done by employing the solution theory in the energy representation, and it was seen that the sum of the protein intramolecular (structural) energy and the solvation free energy is more favorable for a folded structure than for an unfolded one generated by heat. Probabilistic arguments were then developed to determine which of the electrostatic, van der Waals, and excluded-volume components of the interactions in the protein-water system governs the relative stabilities between the folded and unfolded structures. It was found that the electrostatic interaction does not correspond to the preference order of the two structures. The van der Waals and excluded-volume components were shown, on the other hand, to provide the right order of preference at probabilities of almost unity, and it is argued that a useful modeling of protein folding is possible on the basis of the excluded-volume effect.

  5. A National Security Strategy for A New Century.

    DTIC Science & Technology

    1997-05-01

    enhancing the prospects for political stability, peaceful conflict resolution and greater hope for the people of the world. At the same time, the dangers we...proliferation of weapons of mass destruction are global concerns that transcend national borders; and environmental damage and rapid population growth undermine economic prosperity and political stability in many countries.

  6. 46 CFR 175.400 - Definitions of terms used in this subchapter.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... designated in § 1.01 of this chapter.; Consideration means an economic benefit, inducement, right, or profit... stability criteria and means: (1) Waters, except the Great Lakes, more than 20 nautical miles from a harbor... bulkhead that must be maintained watertight in order for the vessel to meet the damage stability and...

  7. A surface acoustic wave ICP sensor with good temperature stability.

    PubMed

    Zhang, Bing; Hu, Hong; Ye, Aipeng; Zhang, Peng

    2017-07-20

    Intracranial pressure (ICP) monitoring is very important for assessing and monitoring hydrocephalus, head trauma, and hypertension patients, in whom elevated ICP can lead to devastating neurological damage. The mortality rate due to these conditions can be reduced through ICP monitoring, because precautions can be taken against brain damage. This paper presents a surface acoustic wave (SAW) pressure sensor for ICP monitoring, capable of wireless and passive transmission with an antenna attached. Two methods were adopted to improve the temperature stability of the sensor. First, ST-cut quartz was chosen as the sensor substrate due to its good temperature stability. Second, a differential temperature compensation method was proposed to reduce the effects of temperature. Two resonators were designed based on coupling-of-modes (COM) theory, and a prototype was fabricated and verified using a system established for testing pressure and temperature. The experimental results show that the sensor has a linearity of 2.63% and a hysteresis of 1.77%. The temperature stability of the sensor is greatly improved by the differential compensation method, which validates the effectiveness of the proposed approach.
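
    The differential-compensation idea can be stated compactly: both resonators experience (to first order) the same temperature drift, so the frequency difference isolates the pressure signal. All constants below are hypothetical, chosen only to make the cancellation visible; they are not the paper's calibration values.

```python
# Differential SAW read-out sketch (hypothetical constants).
def icp_pressure(f_sense_hz, f_ref_hz, df0_hz=1200.0, k_hz_per_mmhg=25.0):
    """Both resonators share the temperature drift, so (f_sense - f_ref)
    depends, to first order, on pressure only. df0_hz is the zero-pressure
    frequency offset; k_hz_per_mmhg is the pressure sensitivity."""
    return ((f_sense_hz - f_ref_hz) - df0_hz) / k_hz_per_mmhg

# A 50 kHz common-mode temperature shift cancels out of the difference:
print(icp_pressure(433_250_000.0 + 50_000, 433_248_300.0 + 50_000))  # -> 20.0 mmHg
```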

  8. Probabilistic Aeroelastic Analysis of Turbomachinery Components

    NASA Technical Reports Server (NTRS)

    Reddy, T. S. R.; Mital, S. K.; Stefko, G. L.

    2004-01-01

    A probabilistic approach is described for the aeroelastic analysis of turbomachinery blade rows. Blade rows with subsonic flow and blade rows with supersonic flow with a subsonic leading edge are considered. To demonstrate the probabilistic approach, the flutter frequency, damping, and forced response of a blade row representing a compressor geometry are considered. The analysis accounts for uncertainties in structural and aerodynamic design variables. The results are presented in the form of probability density functions (PDFs) and sensitivity factors. For the subsonic-flow cascade, comparisons are also made with different probabilistic distributions, probabilistic methods, and Monte-Carlo simulation. The study shows that the probabilistic approach provides a more realistic and systematic way to assess the effect of uncertainties in design variables on aeroelastic instabilities and response.

  9. E2F1 and E2F2 induction in response to DNA damage preserves genomic stability in neuronal cells.

    PubMed

    Castillo, Daniela S; Campalans, Anna; Belluscio, Laura M; Carcagno, Abel L; Radicella, J Pablo; Cánepa, Eduardo T; Pregi, Nicolás

    2015-01-01

    E2F transcription factors regulate a wide range of biological processes, including the cellular response to DNA damage. In the present study, we examined whether E2F family members are transcriptionally induced following treatment with several genotoxic agents and whether they play a role in the cellular DNA damage response. We show a novel mechanism, conserved among diverse species, in which E2F1 and E2F2, the latter specifically in neuronal cells, are transcriptionally induced after DNA damage. This upregulation leads to increased E2F1 and E2F2 protein levels as a consequence of de novo protein synthesis. Ectopic expression of these E2Fs in neuronal cells reduces the level of DNA damage following genotoxic treatment, while ablation of E2F1 and E2F2 leads to the accumulation of DNA lesions and an increased apoptotic response. Cell viability and DNA repair capability in response to DNA damage induction are also reduced by the E2F1 and E2F2 deficiencies. Finally, E2F1 and E2F2 accumulate at sites of oxidative and UV-induced DNA damage, and interact with the γH2AX DNA repair factor. As previously reported for E2F1, E2F2 promotes Rad51 foci formation, interacts with GCN5 acetyltransferase, and induces histone acetylation following genotoxic insult. The results presented here unveil a new mechanism involving E2F1 and E2F2 in the maintenance of genomic stability in response to DNA damage in neuronal cells.

  11. Alteration/deficiency in activation-3 (Ada3) plays a critical role in maintaining genomic stability

    PubMed Central

    Mirza, Sameer; Katafiasz, Bryan J.; Kumar, Rakesh; Wang, Jun; Mohibi, Shakur; Jain, Smrati; Gurumurthy, Channabasavaiah Basavaraju; Pandita, Tej K.; Dave, Bhavana J.; Band, Hamid; Band, Vimla

    2012-01-01

    Cell cycle regulation and DNA repair following damage are essential for maintaining genome integrity. DNA damage activates checkpoints in order to repair damaged DNA prior to exit to the next phase of the cell cycle. Recently, we have shown a role for Ada3, a component of various histone acetyltransferase complexes, in cell cycle regulation, with loss of Ada3 resulting in mouse embryonic lethality. Here, we used adenovirus-Cre-mediated Ada3 deletion in Ada3fl/fl mouse embryonic fibroblasts (MEFs) to assess the role of Ada3 in the DNA damage response following exposure to ionizing radiation (IR). We report that Ada3 depletion was associated with increased levels of phospho-ATM (pATM), γH2AX, phospho-53BP1 (p53BP1) and phospho-RAD51 (pRAD51) in untreated cells; however, the radiation response was intact in Ada3−/− cells. Notably, Ada3−/− cells exhibited a significant delay in the disappearance of DNA damage foci for several critical proteins involved in the DNA repair process. Significantly, loss of Ada3 led to enhanced chromosomal aberrations, such as chromosome breaks, fragments, deletions and translocations, which further increased upon DNA damage. Notably, after IR the aberrations were most evident in S phase, as compared with the G₁ or G₂ phases of the cell cycle. Lastly, comparison of DNA damage in Ada3fl/fl and Ada3−/− cells confirmed higher residual DNA damage in Ada3−/− cells, underscoring a critical role of Ada3 in the DNA repair process. Taken together, these findings provide evidence for a novel role for Ada3 in maintenance of the DNA repair process and genomic stability. PMID:23095635

  12. The MIT Integrated Global System Model: A facility for Assessing and Communicating Climate Change Uncertainty (Invited)

    NASA Astrophysics Data System (ADS)

    Prinn, R. G.

    2013-12-01

    The world is facing major challenges that create tensions between human development and environmental sustenance. In facing these challenges, computer models are invaluable tools for addressing the need for probabilistic approaches to forecasting. To illustrate this, I use the MIT Integrated Global System Model framework (IGSM; http://globalchange.mit.edu). The IGSM consists of a set of coupled sub-models of global economic and technological development and resultant emissions, and physical, dynamical and chemical processes in the atmosphere, land, ocean and ecosystems (natural and managed). Some of the sub-models have both complex and simplified versions available, with the choice of which version to use being guided by the questions being addressed. Some sub-models (e.g. urban air pollution) are reduced forms of complex ones created by probabilistic collocation with polynomial chaos bases. Given the significant uncertainties in the model components, it is highly desirable that forecasts be probabilistic. We achieve this by running 400-member ensembles (Latin hypercube sampling) with different choices for key uncertain variables and processes within the human and natural system model components (pdfs of inputs estimated by model-observation comparisons, literature surveys, or expert elicitation). The IGSM has recently been used for probabilistic forecasts of climate, each using 400-member ensembles: one ensemble assumes no explicit climate mitigation policy and others assume increasingly stringent policies involving stabilization of greenhouse gases at various levels. These forecasts indicate clearly that the greatest effect of these policies is to lower the probability of extreme changes. The value of such probability analyses for policy decision-making lies in their ability to compare relative (not just absolute) risks of various policies, which are less affected by the earth system model uncertainties. Given the uncertainties in forecasts, it is also clear that we need to evaluate policies based on their ability to lower risk, and to re-evaluate decisions over time as new knowledge is gained. Reference: R. G. Prinn, Development and Application of Earth System Models, Proceedings, National Academy of Science, June 15, 2012, http://www.pnas.org/cgi/doi/10.1073/pnas.1107470109.
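
    A generic Latin hypercube sampler of the kind used to build such ensembles fits in a few lines. This is a sketch, not the IGSM code, and the example parameter names are illustrative.

```python
# Minimal Latin hypercube sampler: one stratified draw per equal-probability
# bin in each dimension, with strata decoupled across dimensions.
import numpy as np

def latin_hypercube(n_samples, n_dims, rng=np.random.default_rng(2)):
    """Return an (n_samples, n_dims) array of uniform [0,1) LHS samples."""
    u = (rng.random((n_samples, n_dims))
         + np.arange(n_samples)[:, None]) / n_samples   # one sample per stratum
    for j in range(n_dims):
        rng.shuffle(u[:, j])                             # shuffle strata per dimension
    return u

# e.g. 400 members over 3 uncertain inputs (climate sensitivity, aerosol
# forcing, ocean heat uptake); map columns through inverse CDFs as needed.
ens = latin_hypercube(400, 3)
print(ens.shape, ens.min(), ens.max())
```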

  13. Probabilistic structural analysis methods for space propulsion system components

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.

    1986-01-01

    The development of a three-dimensional inelastic analysis methodology for the Space Shuttle main engine (SSME) structural components is described. The methodology is composed of: (1) composite load spectra, (2) probabilistic structural analysis methods, (3) the probabilistic finite element theory, and (4) probabilistic structural analysis. The methodology has led to significant technical progress in several important aspects of probabilistic structural analysis. The program and accomplishments to date are summarized.

  14. Review of the probabilistic failure analysis methodology and other probabilistic approaches for application in aerospace structural design

    NASA Technical Reports Server (NTRS)

    Townsend, J.; Meyers, C.; Ortega, R.; Peck, J.; Rheinfurth, M.; Weinstock, B.

    1993-01-01

    Probabilistic structural analyses and design methods are steadily gaining acceptance within the aerospace industry. The safety-factor approach to design has long been the industry standard, and it is believed by many to be overly conservative and thus costly. A probabilistic approach to design may offer substantial cost savings. This report summarizes several probabilistic approaches: the probabilistic failure analysis (PFA) methodology developed by the Jet Propulsion Laboratory, fast probability integration (FPI) methods, the NESSUS finite element code, and response surface methods. Example problems are provided to help identify the advantages and disadvantages of each method.

  15. Efficient probabilistic inference in generic neural networks trained with non-probabilistic feedback.

    PubMed

    Orhan, A Emin; Ma, Wei Ji

    2017-07-26

    Animals perform near-optimal probabilistic inference in a wide range of psychophysical tasks. Probabilistic inference requires trial-to-trial representation of the uncertainties associated with task variables and subsequent use of this representation. Previous work has implemented such computations using neural networks with hand-crafted and task-dependent operations. We show that generic neural networks trained with a simple error-based learning rule perform near-optimal probabilistic inference in nine common psychophysical tasks. In a probabilistic categorization task, error-based learning in a generic network simultaneously explains a monkey's learning curve and the evolution of qualitative aspects of its choice behavior. In all tasks, the number of neurons required for a given level of performance grows sublinearly with the input population size, a substantial improvement on previous implementations of probabilistic inference. The trained networks develop a novel sparsity-based probabilistic population code. Our results suggest that probabilistic inference emerges naturally in generic neural networks trained with error-based learning rules.

    Behavioural tasks often require probability distributions to be inferred about task-specific variables. Here, the authors demonstrate that generic neural networks can be trained using a simple error-based learning rule to perform such probabilistic computations efficiently without any need for task-specific operations.

  16. Staged decision making based on probabilistic forecasting

    NASA Astrophysics Data System (ADS)

    Booister, Nikéh; Verkade, Jan; Werner, Micha; Cranston, Michael; Cumiskey, Lydia; Zevenbergen, Chris

    2016-04-01

    Flood forecasting systems reduce, but cannot eliminate, uncertainty about the future. Probabilistic forecasts make the remaining uncertainty explicit; however, compared to deterministic forecasts they add a dimension ('probability' or 'likelihood'), and this added dimension makes decision making slightly more complicated. One decision-support technique is the cost-loss approach, a risk-based method that determines whether or not to issue a warning or implement mitigation measures: a warning is issued when the ratio of the response costs to the damage reduction is less than or equal to the probability of the possible flood event. The cost-loss method is not widely used, because it is motivated by economic values alone and is relatively static (no reasoning, a single yes/no decision). Nevertheless, it has high potential to improve risk-based decision making based on probabilistic flood forecasting, since no other methods are known that deal with probabilities in decision making. The main aim of this research was to explore ways of making probability-based decision making with the cost-loss method more applicable in practice. The exploration began by identifying other situations in which decisions are taken based on uncertain forecasts or predictions. These cases spanned a range of degrees of uncertainty, from known uncertainty to deep uncertainty. Based on the types of uncertainty, concepts for dealing with such situations and responses were analysed, and the concepts of flexibility and robustness emerged as fitting complements to the existing method. Instead of taking one big decision with bigger consequences at once, the idea is to split actions and decisions into smaller stages, with the final decision to implement made based on the economic costs of decisions and measures and the reduction of flood damage. The more lead time there is in flood event management, the more damage can be reduced. With decisions based on probabilistic forecasts, partial decisions can be made earlier (at a lower probability) and scaled up or down later, when there is more certainty about whether the event will take place or not. Partial decisions are often cheaper, or shorten the final mitigation time needed once there is more certainty. The proposed method was tested on Stonehaven, on the Carron River in Scotland, where decisions to deploy demountable defences in the town are currently made with very short lead time due to the absence of certainty. Application showed that staged decision making is possible and gives the decision maker more time to respond to a situation: a lower-regret decision can be taken under higher uncertainty, with fewer associated negative consequences. Although it is not possible to quantify intangible effects, reducing them is part of the analysis. Above all, the proposed approach has shown to be a possible improvement in economic terms and opens up possibilities for more flexible and robust decision making.
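
    The cost-loss trigger itself is a one-line rule, shown below with invented numbers for the demountable-defence example.

```python
# Cost-loss decision rule: act when the forecast probability of the event is
# at least the ratio of mitigation cost C to avoidable loss L.
def issue_warning(p_event, cost, loss):
    """Risk-based trigger: warn iff p >= C/L (assumes 0 < cost < loss)."""
    return p_event >= cost / loss

# Deploying defences costing 20k against 200k of avoidable damage becomes
# worthwhile once the forecast flood probability reaches C/L = 0.10:
print(issue_warning(0.12, 20_000, 200_000))   # True
```

    Staged decision making then amounts to applying this rule repeatedly as lead time shrinks, with cheaper partial measures triggered at lower probabilities.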

  17. Earthquake Hazard Mitigation Using a Systems Analysis Approach to Risk Assessment

    NASA Astrophysics Data System (ADS)

    Legg, M.; Eguchi, R. T.

    2015-12-01

    The earthquake hazard mitigation goal is to reduce losses due to severe natural events. The first step is to conduct a seismic risk assessment consisting of 1) hazard estimation, 2) vulnerability analysis, and 3) exposure compilation. Seismic hazards include ground deformation, shaking, and inundation. The hazard estimation may be probabilistic or deterministic. Probabilistic Seismic Hazard Assessment (PSHA) is generally applied to site-specific risk assessments, but may involve large areas as in a national seismic hazard mapping program. Deterministic hazard assessments are needed for geographically distributed exposure such as lifelines (infrastructure), but may also be important for large communities. Vulnerability evaluation includes quantification of fragility for construction or components, including personnel. Exposure represents the existing or planned construction, facilities, infrastructure, and population in the affected area. Risk (expected loss) is the product of the quantified hazard, vulnerability (damage algorithm), and exposure, and may be used to prepare emergency response plans, retrofit existing construction, or guide community planning to avoid hazards. The risk estimate provides the data needed to acquire earthquake insurance to assist with effective recovery following a severe event. Earthquake scenarios used in deterministic risk assessments provide detailed information on where hazards may be most severe, which system components are most susceptible to failure, and the combined effects of a severe earthquake on the whole system or community. Casualties (injuries and deaths) have been the primary factor in defining building codes for seismic-resistant construction. Economic losses may be equally significant factors that can influence proactive hazard mitigation. Large urban earthquakes may produce catastrophic losses due to a cascading of effects often missed in PSHA. Economic collapse may ensue if damaged workplaces, disruption of utilities, and the resultant loss of income produce widespread default on payments. With increased computational power and more complete inventories of exposure, Monte Carlo methods may provide more accurate estimation of severe losses and the opportunity to increase the resilience of vulnerable systems and communities.
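
    The hazard-vulnerability-exposure product can be written out directly; the scenario set and values below are invented purely for illustration.

```python
# Expected annual loss = sum over scenarios of
#   P(scenario) * damage_ratio(vulnerability) * exposure value.
scenarios = [                 # (annual probability, shaking intensity tag)
    (0.010, "strong"),
    (0.002, "severe"),
]
damage_ratio = {"strong": 0.05, "severe": 0.25}   # fraction of value lost
exposure_value = 5_000_000.0                      # replacement value [$]

annual_expected_loss = sum(p * damage_ratio[tag] * exposure_value
                           for p, tag in scenarios)
print(f"expected annual loss ~ ${annual_expected_loss:,.0f}")   # -> $5,000
```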

  18. ERMiT: Estimating Post-Fire Erosion in Probabilistic Terms

    NASA Astrophysics Data System (ADS)

    Pierson, F. B.; Robichaud, P. R.; Elliot, W. J.; Hall, D. E.; Moffet, C. A.

    2006-12-01

    Mitigating the impacts of post-wildfire runoff and erosion on life, property, and natural resources has cost the United States government tens of millions of dollars over the past decade. The decision of where, when, and how to apply the most effective mitigation treatments requires land managers to assess the risk of damaging runoff and erosion events occurring after a fire. The Erosion Risk Management Tool (ERMiT) is a web-based application that estimates erosion in probabilistic terms on burned and recovering forest, range, and chaparral lands. Unlike most erosion prediction models, ERMiT does not provide 'average annual erosion rates'; rather, it provides a distribution of erosion rates with the likelihood of their occurrence. ERMiT combines rain event variability with the spatial and temporal variability of hillslope burn severity, soil properties, and ground cover to estimate Water Erosion Prediction Project (WEPP) model input parameter values. Based on 20 to 40 individual WEPP runs, ERMiT produces a distribution of rain event erosion rates with a probability of occurrence for each of five post-fire years. Over the 5 years of modeled recovery, the occurrence probability of the less erodible soil parameters is increased and the occurrence probability of the more erodible soil parameters is decreased. In addition, the occurrence probabilities and the four spatial arrangements of burn severity (arrangements of overland flow elements, OFEs) are shifted toward lower burn severity with each year of recovery. These yearly adjustments are based on field measurements made through post-fire recovery periods. ERMiT also provides rain event erosion rate distributions for hillslopes that have been treated with seeding, straw mulch, straw wattles, and contour-felled log erosion barriers. Such output can help managers make erosion mitigation treatment decisions based on the probability of high sediment yields occurring, the value of resources at risk of damage, cost, and other management considerations.
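
    The conversion of a set of weighted model runs into a probability distribution of event erosion rates can be sketched as follows; the rates and occurrence probabilities below are invented, not ERMiT output.

```python
# Turn weighted run results into an exceedance distribution of erosion rates.
import numpy as np

rates = np.array([0.1, 0.4, 1.2, 3.5, 8.0, 20.0])        # t/ha per event, per run
probs = np.array([0.40, 0.25, 0.15, 0.10, 0.07, 0.03])   # occurrence probability of each run

order = np.argsort(rates)[::-1]          # sort runs from highest to lowest rate
exceedance = np.cumsum(probs[order])     # P(event erosion >= rate)
for r, p in zip(rates[order], exceedance):
    print(f"P(event erosion >= {r:5.1f} t/ha) = {p:.2f}")
```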

  19. Have recent earthquakes exposed flaws in or misunderstandings of probabilistic seismic hazard analysis?

    USGS Publications Warehouse

    Hanks, Thomas C.; Beroza, Gregory C.; Toda, Shinji

    2012-01-01

    In a recent Opinion piece in these pages, Stein et al. (2011) offer a remarkable indictment of the methods, models, and results of probabilistic seismic hazard analysis (PSHA). The principal object of their concern is the PSHA map for Japan released by the Japan Headquarters for Earthquake Research Promotion (HERP), which is reproduced by Stein et al. (2011) as their Figure 1 and also here as our Figure 1. It shows the probability of exceedance (also referred to as the “hazard”) of the Japan Meteorological Agency (JMA) intensity 6–lower (JMA 6–) in Japan for the 30-year period beginning in January 2010. JMA 6– is an earthquake-damage intensity measure that is associated with fairly strong ground motion that can be damaging to well-built structures and is potentially destructive to poor construction (HERP, 2005, appendix 5). Reiterating Geller (2011, p. 408), Stein et al. (2011, p. 623) have this to say about Figure 1: The regions assessed as most dangerous are the zones of three hypothetical “scenario earthquakes” (Tokai, Tonankai, and Nankai; see map). However, since 1979, earthquakes that caused 10 or more fatalities in Japan actually occurred in places assigned a relatively low probability. This discrepancy—the latest in a string of negative results for the characteristic model and its cousin the seismic-gap model—strongly suggest that the hazard map and the methods used to produce it are flawed and should be discarded. Given the central role that PSHA now plays in seismic risk analysis, performance-based engineering, and design-basis ground motions, discarding PSHA would have important consequences. We are not persuaded by the arguments of Geller (2011) and Stein et al. (2011) for doing so because important misunderstandings about PSHA seem to have conditioned them. In the quotation above, for example, they have confused important differences between earthquake-occurrence observations and ground-motion hazard calculations.

  20. Probabilistic and Evolutionary Early Warning System: concepts, performances, and case-studies

    NASA Astrophysics Data System (ADS)

    Zollo, A.; Emolo, A.; Colombelli, S.; Elia, L.; Festa, G.; Martino, C.; Picozzi, M.

    2013-12-01

    PRESTo (PRobabilistic and Evolutionary early warning SysTem) is a software platform for earthquake early warning that integrates algorithms for real-time earthquake location, magnitude estimation, and damage assessment into a highly configurable and easily portable package. In its regional configuration, the software processes in real time the 3-component acceleration data streams coming from seismic stations to detect P-wave arrivals and, when a sufficiently large event occurs, promptly performs event detection and location, magnitude estimation, and peak ground-motion prediction at target sites. The regional approach has been integrated with a threshold-based early warning method that allows, in the very first seconds after a moderate-to-large earthquake, identification of the most Probable Damaged Zone, starting from the real-time measurement, at near-source stations located at increasing distances from the earthquake epicenter, of the peak displacement (Pd) and the predominant period of P-waves (τc) over a few-second-long window after the P-wave arrival. Each recording site thus independently provides an evolutionary alert level, according to the Pd and τc it measures, through a decisional table. Since 2009, PRESTo has been under continuous real-time testing using data streaming from the Irpinia Seismic Network (Southern Italy) and has produced a bulletin of several hundred low-magnitude events, including all the M≥2.5 earthquakes that occurred in that period in Irpinia. Recently, PRESTo has also been implemented on the accelerometric and broad-band networks in South Korea and in Romania, and tested off-line in the Iberian Peninsula, Turkey, Israel, and Japan. The feasibility of an early warning system at national scale is currently being tested by studying the performance of the PRESTo platform on the Italian Accelerometric Network. Moreover, PRESTo is under experimentation to provide alerts in a high school located near Naples, about 100 km from the Irpinia region.
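
    A toy version of the station-level decisional table reads as follows; the thresholds are placeholders for illustration, not PRESTo's calibrated values.

```python
# Toy threshold-based decisional table: map a station's real-time (Pd, tau_c)
# measurement to a local alert level (illustrative thresholds only).
def alert_level(pd_cm, tau_c_s, pd_thr=0.2, tau_thr=1.0):
    """0 = no alert ... 3 = strongest: both peak displacement (strong local
    shaking expected) and predominant period (large magnitude suggested)
    exceed their thresholds."""
    high_pd = pd_cm >= pd_thr
    high_tau = tau_c_s >= tau_thr
    return {(False, False): 0, (False, True): 1,
            (True, False): 2, (True, True): 3}[(high_pd, high_tau)]

print(alert_level(0.35, 1.4))   # -> 3: large AND nearby event indicated
```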

  1. Probabilistic evaluation of the water footprint of a river basin: Accounting method and case study in the Segura River Basin, Spain.

    PubMed

    Pellicer-Martínez, Francisco; Martínez-Paz, José Miguel

    2018-06-15

    In the current study a method for the probabilistic accounting of the water footprint (WF) at the river basin level is proposed and developed. It is based upon the simulation of the anthropised water cycle and combines a hydrological model and a decision support system. The methodology was applied in the Segura River Basin (SRB) in South-eastern Spain, and four historical scenarios were evaluated (1998-2010-2015-2027). The results indicate that the WF of the river basin reached 5581 Mm³/year on average in the base scenario, with high variability. The green component (3231 Mm³/year), mainly generated by rainfed crops (62%), was responsible for most of the variability of the WF. The blue WF (1201 Mm³/year) was broken down into surface water (56%), renewable groundwater (20%) and non-renewable groundwater (24%), and it revealed the generalized overexploitation of aquifers. Regarding the grey component (1150 Mm³/year), the study reveals that wastewater, and especially phosphates (90%), was the main source of water pollution in surface water bodies. The temporal evolution of the four scenarios highlighted the success of the water treatment plans developed in the river basin, with a sharp decrease in the grey WF, as well as the stability of the WF and its three components in the future. The accounting of the three components of the WF in a basin was thus integrated into the management of water resources, making it possible to predict their evolution, characterise them spatially, and even assess them in probabilistic terms. The WF was thereby incorporated into the set of indicators usually used in water resources management and hydrological planning. Copyright © 2018 Elsevier B.V. All rights reserved.

  2. Iterative Most-Likely Point Registration (IMLP): A Robust Algorithm for Computing Optimal Shape Alignment

    PubMed Central

    Billings, Seth D.; Boctor, Emad M.; Taylor, Russell H.

    2015-01-01

    We present a probabilistic registration algorithm that robustly solves the problem of rigid-body alignment between two shapes with high accuracy, by aptly modeling measurement noise in each shape, whether isotropic or anisotropic. For point-cloud shapes, the probabilistic framework additionally enables modeling locally-linear surface regions in the vicinity of each point to further improve registration accuracy. The proposed Iterative Most-Likely Point (IMLP) algorithm is formed as a variant of the popular Iterative Closest Point (ICP) algorithm, which iterates between point-correspondence and point-registration steps. IMLP’s probabilistic framework is used to incorporate a generalized noise model into both the correspondence and the registration phases of the algorithm, hence its name as a most-likely point method rather than a closest-point method. To efficiently compute the most-likely correspondences, we devise a novel search strategy based on a principal direction (PD)-tree search. We also propose a new approach to solve the generalized total-least-squares (GTLS) sub-problem of the registration phase, wherein the point correspondences are registered under a generalized noise model. Our GTLS approach has improved accuracy, efficiency, and stability compared to prior methods presented for this problem and offers a straightforward implementation using standard least squares. We evaluate the performance of IMLP relative to a large number of prior algorithms including ICP, a robust variant on ICP, Generalized ICP (GICP), and Coherent Point Drift (CPD), as well as drawing close comparison with the prior anisotropic registration methods of GTLS-ICP and A-ICP. The performance of IMLP is shown to be superior with respect to these algorithms over a wide range of noise conditions, outliers, and misalignments using both mesh and point-cloud representations of various shapes. PMID:25748700
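
    For orientation, the classic ICP baseline that IMLP generalizes (closest-point correspondence plus an SVD-based rigid fit) can be written compactly. This is the standard algorithm on synthetic points, not the IMLP code.

```python
# Classic ICP: iterate closest-point matching and Kabsch rigid registration.
import numpy as np

def kabsch(P, Q):
    """Least-squares rotation R and translation t with R @ P + t ~ Q (3xN)."""
    cp, cq = P.mean(axis=1, keepdims=True), Q.mean(axis=1, keepdims=True)
    H = (Q - cq) @ (P - cp).T
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(U @ Vt))])  # avoid reflections
    R = U @ D @ Vt
    return R, (cq - R @ cp).ravel()

def icp(source, target, iters=30):
    """Align 3xN source to 3xM target; returns cumulative (R, t)."""
    R_tot, t_tot = np.eye(3), np.zeros(3)
    src = source.copy()
    for _ in range(iters):
        d = ((src[:, :, None] - target[:, None, :]) ** 2).sum(axis=0)
        matched = target[:, d.argmin(axis=1)]   # closest target point per source point
        R, t = kabsch(src, matched)
        src = R @ src + t[:, None]
        R_tot, t_tot = R @ R_tot, R @ t_tot + t
    return R_tot, t_tot

rng = np.random.default_rng(3)
tgt = rng.random((3, 200))
R_true = np.array([[0.96, -0.28, 0.0], [0.28, 0.96, 0.0], [0.0, 0.0, 1.0]])
src = R_true.T @ (tgt - 0.05)                   # a misaligned copy of the target
print(np.round(icp(src, tgt)[0] @ R_true.T, 2)) # ~ identity when pose is recovered
```

    IMLP replaces the closest-point step with most-likely correspondences under per-point noise models and the Kabsch step with a generalized total-least-squares fit.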

  3. Automatized near-real-time short-term Probabilistic Volcanic Hazard Assessment of tephra dispersion before eruptions: BET_VHst for Vesuvius and Campi Flegrei during recent exercises

    NASA Astrophysics Data System (ADS)

    Selva, Jacopo; Costa, Antonio; Sandri, Laura; Rouwet, Dmtri; Tonini, Roberto; Macedonio, Giovanni; Marzocchi, Warner

    2015-04-01

    Probabilistic Volcanic Hazard Assessment (PVHA) represents the most complete scientific contribution for planning rational strategies aimed at mitigating the risk posed by volcanic activity at different time scales. The definition of the space-time window for PVHA is related to the kind of risk mitigation actions under consideration. Short temporal intervals (days to weeks) are important for short-term risk mitigation actions like the evacuation of a volcanic area. During volcanic unrest episodes or eruptions, it is of primary importance to produce short-term tephra fallout forecasts and to update them frequently to account for the rapidly evolving situation. This information is crucial for crisis management, since tephra may heavily affect building stability, public health, transportation and evacuation routes (airports, trains, road traffic) and lifelines (electric power supply). In this study, we propose a methodology named BET_VHst (Selva et al. 2014) for short-term PVHA of volcanic tephra dispersal based on automatic interpretation of measurements from the monitoring system and on physical models of tephra dispersal from all possible vent positions and eruptive sizes, driven by frequently updated meteorological forecasts. The large uncertainty at all steps of the analysis, both aleatory and epistemic, is treated by means of Bayesian inference and statistical mixing of long- and short-term analyses. The BET_VHst model is presented here through its implementation during two exercises organized for volcanoes in the Neapolitan area: MESIMEX for Mt. Vesuvius, and VUELCO for Campi Flegrei. References Selva J., Costa A., Sandri L., Macedonio G., Marzocchi W. (2014) Probabilistic short-term volcanic hazard in phases of unrest: a case study for tephra fallout, J. Geophys. Res., 119, doi: 10.1002/2014JB011252

  4. Probabilistic classifiers with high-dimensional data

    PubMed Central

    Kim, Kyung In; Simon, Richard

    2011-01-01

    For medical classification problems, it is often desirable to have a probability associated with each class. Probabilistic classifiers have received relatively little attention for small-n, large-p classification problems despite their importance in medical decision making. In this paper, we introduce two criteria for the assessment of probabilistic classifiers, well-calibratedness and refinement, and develop corresponding evaluation measures. We evaluated several published high-dimensional probabilistic classifiers and developed two extensions of the Bayesian compound covariate classifier. Based on simulation studies and analysis of gene expression microarray data, we found that proper probabilistic classification is more difficult than deterministic classification. It is important to ensure that a probabilistic classifier is well calibrated, or at least not "anticonservative", using the methods developed here. We provide this evaluation for several probabilistic classifiers and also evaluate their refinement as a function of sample size under weak and strong signal conditions. We also present a cross-validation method for evaluating the calibration and refinement of any probabilistic classifier on any data set. PMID:21087946
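
    The well-calibratedness criterion can be checked generically by binning predicted probabilities and comparing each bin's mean prediction with the observed event frequency. The predictions below are synthetic (drawn from a calibrated source on purpose); this is not the paper's evaluation code.

```python
# Generic calibration check: bin predictions, compare with observed frequency.
import numpy as np

rng = np.random.default_rng(4)
p_pred = rng.uniform(0, 1, 5000)                    # predicted P(class = 1)
y = (rng.uniform(0, 1, 5000) < p_pred).astype(int)  # outcomes from a calibrated source

bins = np.linspace(0, 1, 11)
idx = np.digitize(p_pred, bins) - 1
for b in range(10):
    m = idx == b
    if m.any():   # well calibrated: mean prediction ~ observed frequency per bin
        print(f"pred ~{p_pred[m].mean():.2f}  observed {y[m].mean():.2f}  n={m.sum()}")
```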

  5. Probabilistic analysis and fatigue damage assessment of offshore mooring system due to non-Gaussian bimodal tension processes

    NASA Astrophysics Data System (ADS)

    Chang, Anteng; Li, Huajun; Wang, Shuqing; Du, Junfeng

    2017-08-01

    Both wave-frequency (WF) and low-frequency (LF) components of mooring tension are in principle non-Gaussian due to nonlinearities in the dynamic system. This paper conducts a comprehensive investigation of applicable probability density functions (PDFs) of mooring tension amplitudes used to assess mooring-line fatigue damage via the spectral method. Short-term statistical characteristics of mooring-line tension responses are firstly investigated, in which the discrepancy arising from Gaussian approximation is revealed by comparing kurtosis and skewness coefficients. Several distribution functions based on present analytical spectral methods are selected to express the statistical distribution of the mooring-line tension amplitudes. Results indicate that the Gamma-type distribution and a linear combination of Dirlik and Tovo-Benasciutti formulas are suitable for separate WF and LF mooring tension components. A novel parametric method based on nonlinear transformations and stochastic optimization is then proposed to increase the effectiveness of mooring-line fatigue assessment due to non-Gaussian bimodal tension responses. Using time domain simulation as a benchmark, its accuracy is further validated using a numerical case study of a moored semi-submersible platform.
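
    As a point of reference for the spectral method, the narrow-band Gaussian baseline that formulas such as Dirlik's and Tovo-Benasciutti's correct can be computed from the spectral moments alone. The S-N constants and tension statistics below are illustrative, not the paper's values.

```python
# Narrow-band Gaussian fatigue baseline: Rayleigh-distributed amplitudes,
# S-N curve N * S^m = K, Palmgren-Miner damage over duration T.
import math

def narrowband_damage(m0, m2, T, sn_m=3.0, sn_K=1.0e12):
    """Damage over T [s] from spectral moments m0 (variance) and m2."""
    nu0 = math.sqrt(m2 / m0) / (2 * math.pi)     # mean up-crossing rate [Hz]
    e_s_m = (math.sqrt(2 * m0)) ** sn_m * math.gamma(1 + sn_m / 2)  # E[S^m]
    return nu0 * T / sn_K * e_s_m

# One year of a process with 40 MPa RMS stress and 0.1 Hz mean crossing rate:
m0 = 40.0 ** 2
m2 = (0.1 * 2 * math.pi) ** 2 * m0
print(narrowband_damage(m0, m2, T=365 * 24 * 3600))   # ~ 0.76 (dimensionless)
```

    The bimodal, non-Gaussian corrections investigated in the paper modify this baseline through the amplitude distribution rather than the Miner summation itself.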

  6. Super DNAging-New insights into DNA integrity, genome stability and telomeres in the oldest old.

    PubMed

    Franzke, Bernhard; Neubauer, Oliver; Wagner, Karl-Heinz

    2015-01-01

    Reductions in DNA integrity, genome stability, and telomere length are strongly associated with the aging process, age-related diseases as well as the age-related loss of muscle mass. However, in people reaching an age far beyond their statistical life expectancy the prevalence of diseases, such as cancer, cardiovascular disease, diabetes or dementia, is much lower compared to "averagely" aged humans. These inverse observations in nonagenarians (90-99 years), centenarians (100-109 years) and super-centenarians (110 years and older) require a closer look into dynamics underlying DNA damage within the oldest old of our society. Available data indicate improved DNA repair and antioxidant defense mechanisms in "super old" humans, which are comparable with much younger cohorts. Partly as a result of these enhanced endogenous repair and protective mechanisms, the oldest old humans appear to cope better with risk factors for DNA damage over their lifetime compared to subjects whose lifespan coincides with the statistical life expectancy. This model is supported by study results demonstrating superior chromosomal stability, telomere dynamics and DNA integrity in "successful agers". There is also compelling evidence suggesting that life-style related factors including regular physical activity, a well-balanced diet and minimized psycho-social stress can reduce DNA damage and improve chromosomal stability. The most conclusive picture that emerges from reviewing the literature is that reaching "super old" age appears to be primarily determined by hereditary/genetic factors, while a healthy lifestyle additionally contributes to achieving the individual maximum lifespan in humans. More research is required in this rapidly growing population of super old people. In particular, there is need for more comprehensive investigations including short- and long-term lifestyle interventions as well as investigations focusing on the mechanisms causing DNA damage, mutations, and telomere shortening. Copyright © 2015 Elsevier B.V. All rights reserved.

  7. 46 CFR 171.080 - Damage stability standards for vessels with Type I or Type II subdivision.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... following conditions in the final stage of flooding: (1) On a vessel required to survive assumed damage with... in the final stage of flooding and to meet the conditions set forth in paragraphs (f) (8) and (9) of this section in each intermediate stage of flooding. For the purposes of establishing boundaries to...

  8. 46 CFR 171.080 - Damage stability standards for vessels with Type I or Type II subdivision.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... following conditions in the final stage of flooding: (1) On a vessel required to survive assumed damage with... in the final stage of flooding and to meet the conditions set forth in paragraphs (f) (8) and (9) of this section in each intermediate stage of flooding. For the purposes of establishing boundaries to...

  9. 46 CFR 171.080 - Damage stability standards for vessels with Type I or Type II subdivision.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... following conditions in the final stage of flooding: (1) On a vessel required to survive assumed damage with... in the final stage of flooding and to meet the conditions set forth in paragraphs (f) (8) and (9) of this section in each intermediate stage of flooding. For the purposes of establishing boundaries to...

  10. DNA-PKcs, ATM, and ATR Interplay Maintains Genome Integrity during Neurogenesis.

    PubMed

    Enriquez-Rios, Vanessa; Dumitrache, Lavinia C; Downing, Susanna M; Li, Yang; Brown, Eric J; Russell, Helen R; McKinnon, Peter J

    2017-01-25

    The DNA damage response (DDR) orchestrates a network of cellular processes that integrates cell-cycle control and DNA repair or apoptosis, which serves to maintain genome stability. DNA-PKcs (the catalytic subunit of the DNA-dependent kinase, encoded by PRKDC), ATM (ataxia telangiectasia, mutated), and ATR (ATM and Rad3-related) are related PI3K-like protein kinases and central regulators of the DDR. Defects in these kinases have been linked to neurodegenerative or neurodevelopmental syndromes. In all cases, the key neuroprotective function of these kinases is uncertain. It also remains unclear how interactions between the three DNA damage-responsive kinases coordinate genome stability, particularly in a physiological context. Here, we used a genetic approach to identify the neural function of DNA-PKcs and the interplay between ATM and ATR during neurogenesis. We found that DNA-PKcs loss in the mouse sensitized neuronal progenitors to apoptosis after ionizing radiation because of excessive DNA damage. DNA-PKcs was also required to prevent endogenous DNA damage accumulation throughout the adult brain. In contrast, ATR coordinated the DDR during neurogenesis to direct apoptosis in cycling neural progenitors, whereas ATM regulated apoptosis in both proliferative and noncycling cells. We also found that ATR controls a DNA damage-induced G2/M checkpoint in cortical progenitors, independent of ATM and DNA-PKcs. These nonoverlapping roles were further confirmed via sustained murine embryonic or cortical development after all three kinases were simultaneously inactivated. Thus, our results illustrate how DNA-PKcs, ATM, and ATR have unique and essential roles during the DDR, collectively ensuring comprehensive genome maintenance in the nervous system. The DNA damage response (DDR) is essential for prevention of a broad spectrum of different human neurologic diseases. However, a detailed understanding of the DDR at a physiological level is lacking. In contrast to many in vitro cellular studies, here we demonstrate independent biological roles for the DDR kinases DNA-PKcs, ATM, and ATR during neurogenesis. We show that DNA-PKcs is central to DNA repair in nonproliferating cells, and restricts DNA damage accumulation, whereas ATR controls the damage-induced G2/M checkpoint and apoptosis in proliferating cells. Conversely, ATM is critical for controlling apoptosis in immature noncycling neural cells after DNA damage. These data demonstrate functionally distinct, but cooperative, roles for each kinase in preserving genome stability in the nervous system. Copyright © 2017 the authors 0270-6474/17/370893-13$15.00/0.

  11. A Proposed Probabilistic Extension of the Halpern and Pearl Definition of ‘Actual Cause’

    PubMed Central

    2017-01-01

    Joseph Halpern and Judea Pearl ([2005]) draw upon structural equation models to develop an attractive analysis of 'actual cause'. Their analysis is designed for the case of deterministic causation. I show that their account can be naturally extended to provide an elegant treatment of probabilistic causation. Outline: 1 Introduction; 2 Preemption; 3 Structural Equation Models; 4 The Halpern and Pearl Definition of 'Actual Cause'; 5 Preemption Again; 6 The Probabilistic Case; 7 Probabilistic Causal Models; 8 A Proposed Probabilistic Extension of Halpern and Pearl's Definition; 9 Twardy and Korb's Account; 10 Probabilistic Fizzling; 11 Conclusion. PMID:29593362

  12. Disclosure of the oscillations in kinetics of the reactor pressure vessel steel damage at fast neutron intensity decreasing

    NASA Astrophysics Data System (ADS)

    Krasikov, E.; Nikolaenko, V.

    2017-01-01

    The influence of fast neutron intensity (flux) on the radiation damage of reactor materials is a critically important question for the correct use of accelerated irradiation test data to substantiate the workability of materials under real irradiation conditions, i.e. at low neutron intensity. Investigations of the influence of fast neutron intensity on radiation damage, and of the scatter in experimental data, reveal the existence of non-monotonic sections in the damage kinetics of reactor pressure vessel (RPV) steels. The discovery of these oscillations, an indicator of self-organization processes, motivates the search for new ways of increasing the radiation stability of RPV steel and attempts to develop self-restoring metals. The wavelike behaviour revealed by the non-monotonic parts of the radiation embrittlement kinetics testifies that periodic transformations of the structure take place. This fact makes a more precise definition of the radiation embrittlement mechanisms of RPV materials a pressing problem, and motivates the search for ways to manage radiation stability (e.g. nanostructuring to stimulate the annihilation of radiation defects) and the development of more stable, self-recovering smart materials.

  13. Evaluation of DNA damage and cytotoxicity of polyurethane-based nano- and microparticles as promising biomaterials for drug delivery systems

    NASA Astrophysics Data System (ADS)

    Caon, Thiago; Zanetti-Ramos, Betina Giehl; Lemos-Senna, Elenara; Cloutet, Eric; Cramail, Henri; Borsali, Redouane; Soldi, Valdir; Simões, Cláudia Maria Oliveira

    2010-06-01

    The in vitro cytotoxicity and DNA damage of biodegradable polyurethane-based micro- and nanoparticles were evaluated on animal fibroblasts. Cytotoxicity and primary DNA damage were measured with the MTT and Comet assays, respectively. Different formulations were tested to evaluate the influence of the chemical composition and physicochemical characteristics of the particles on cell toxicity. No inhibition of cell growth surrounding the polyurethane particles was observed. On the other hand, a decrease in cell viability was verified when the anionic surfactant sodium dodecyl sulfate (SDS) was used as the droplet stabilizer of the monomeric phase. Polyurethane nanoparticles stabilized with Tween 80 and Pluronic F68 caused minor cytotoxic effects. These results indicate that the surface charge plays an important role in cytotoxicity. Particles synthesized from MDI displayed a higher cytotoxicity than those synthesized from IPDI. The size and physicochemical properties of the particles may explain the higher degree of DNA damage produced by two of the tested formulations. A rational choice of particle constituents based on their cytotoxicity and genotoxicity could thus be very useful for designing biomaterials to be used in drug delivery systems.

  14. Structural Damage Detection Using Virtual Passive Controllers

    NASA Technical Reports Server (NTRS)

    Lew, Jiann-Shiun; Juang, Jer-Nan

    2001-01-01

    This paper presents novel approaches for structural damage detection that use virtual passive controllers attached to structures; passive controllers are energy-dissipative devices and thus guarantee closed-loop stability. Using the identified parameters of various closed-loop systems addresses the problem that reliable identified parameters of the open-loop system alone, such as its natural frequencies, may not provide enough information for damage detection. Only a small number of sensors are required for the proposed approaches. The identified natural frequencies, which are generally much less sensitive to noise and more reliable than other identified parameters, are used for damage detection. Two damage detection techniques are presented: one is based on structures with direct output feedback controllers, while the other uses second-order dynamic feedback controllers. A least-squares technique, based on the sensitivity of the natural frequencies to the damage variables, is used to accurately identify the damage variables.
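
    As a toy illustration of the least-squares step just described, the sketch below (Python) stacks assumed frequency-shift sensitivities from two closed-loop configurations and recovers the damage variables; the sensitivity matrix, noise level, and damage values are invented for illustration, not taken from the paper.

        # Sensitivity-based damage identification (illustrative values only).
        # Measured natural-frequency shifts from two closed-loop systems are
        # stacked and the damage variables recovered by least squares: df = S @ dd.
        import numpy as np

        # Rows: 6 identified frequencies (two configurations); columns:
        # stiffness reductions of 3 substructures. Assumed sensitivities.
        S = np.array([[0.80, 0.10, 0.05],
                      [0.20, 0.70, 0.10],
                      [0.05, 0.15, 0.60],
                      [0.75, 0.12, 0.08],
                      [0.18, 0.65, 0.12],
                      [0.07, 0.20, 0.55]])

        d_true = np.array([0.10, 0.00, 0.05])     # true damage (10%, 0%, 5%)
        noise = 0.002 * np.random.default_rng(0).standard_normal(6)
        d_f = S @ d_true + noise                  # "measured" frequency shifts

        d_hat, *_ = np.linalg.lstsq(S, d_f, rcond=None)
        print(np.round(d_hat, 3))                 # approx [0.1, 0.0, 0.05]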

  15. Repair of DNA Damage Induced by the Cytidine Analog Zebularine Requires ATR and ATM in Arabidopsis[OPEN

    PubMed Central

    Liu, Chun-Hsin; Finke, Andreas; Díaz, Mariana; Rozhon, Wilfried; Poppenberger, Brigitte; Baubec, Tuncay; Pecinka, Ales

    2015-01-01

    DNA damage repair is an essential cellular mechanism that maintains genome stability. Here, we show that the nonmethylable cytidine analog zebularine induces a DNA damage response in Arabidopsis thaliana, independent of changes in DNA methylation. In contrast to genotoxic agents that induce damage in a cell cycle stage-independent manner, zebularine induces damage specifically during strand synthesis in DNA replication. The signaling of this damage is mediated by additive activity of ATAXIA TELANGIECTASIA MUTATED AND RAD3-RELATED and ATAXIA TELANGIECTASIA MUTATED kinases, which cause postreplicative cell cycle arrest and increased endoreplication. The repair requires a functional STRUCTURAL MAINTENANCE OF CHROMOSOMES5 (SMC5)-SMC6 complex and is accomplished predominantly by synthesis-dependent strand-annealing homologous recombination. Here, we provide insight into the response mechanism for coping with the genotoxic effects of zebularine and identify several components of the zebularine-induced DNA damage repair pathway. PMID:26023162

  16. Probabilistic Structural Analysis Methods (PSAM) for select space propulsion system components

    NASA Technical Reports Server (NTRS)

    1991-01-01

    The fourth year of technical developments on the Numerical Evaluation of Stochastic Structures Under Stress (NESSUS) system for Probabilistic Structural Analysis Methods is summarized. The effort focused on the continued expansion of the Probabilistic Finite Element Method (PFEM) code, the implementation of the Probabilistic Boundary Element Method (PBEM), and the implementation of the Probabilistic Approximate Methods (PAppM) code. The principal focus for the PFEM code is the addition of a multilevel structural dynamics capability. The strategy includes probabilistic loads, treatment of material and geometry uncertainty, and fully probabilistic variables. Enhancements are included for the Fast Probability Integration (FPI) algorithms, and Monte Carlo simulation is added as an alternative. Work on the expert system and boundary element developments continues. The enhanced capability in the computer codes is validated by applications to a turbine blade and to an oxidizer duct.
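
    A minimal sketch of the kind of computation the Monte Carlo alternative performs: estimating a failure probability P[g(X) < 0] for a simple stress-strength limit state. The distributions and their parameters below are assumed for illustration and are unrelated to the NESSUS validation cases.

        # Monte Carlo estimate of a failure probability, the alternative to
        # the Fast Probability Integration (FPI) algorithms. All assumed.
        import numpy as np

        rng = np.random.default_rng(42)
        n = 1_000_000
        strength = rng.normal(500.0, 40.0, n)   # MPa, assumed scatter
        stress   = rng.normal(350.0, 50.0, n)   # MPa, assumed load uncertainty

        g = strength - stress                   # limit state: failure when g < 0
        pf = np.mean(g < 0.0)
        se = np.sqrt(pf * (1.0 - pf) / n)       # sampling standard error
        print(f"P_f = {pf:.4f} +/- {se:.4f}")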

  17. Duplex Interrogation by a Direct DNA Repair Protein in Search of Base Damage

    PubMed Central

    Yi, Chengqi; Chen, Baoen; Qi, Bo; Zhang, Wen; Jia, Guifang; Zhang, Liang; Li, Charles J.; Dinner, Aaron R.; Yang, Cai-Guang; He, Chuan

    2012-01-01

    ALKBH2 is a direct DNA repair dioxygenase guarding the mammalian genome against N1-methyladenine, N3-methylcytosine, and 1,N6-ethenoadenine damage. A prerequisite for repair is to identify these lesions in the genome. Here we present crystal structures of ALKBH2 bound to different duplex DNAs. Together with computational and biochemical analyses, our results suggest that DNA interrogation by ALKBH2 displays two novel features: i) ALKBH2 probes base-pair stability and detects base pairs with reduced stability; ii) ALKBH2 neither has nor needs a “damage-checking site”, which is critical for preventing spurious base cleavage in several glycosylases. The demethylation mechanism of ALKBH2 ensures that only cognate lesions are oxidized and reversed to normal bases, and that a flipped, non-substrate base remains intact in the active site. Overall, the combination of duplex interrogation and oxidation chemistry allows ALKBH2 to detect and process diverse lesions efficiently and correctly. PMID:22659876

  18. UV-radiation Induced Disruption of Dry-Cavities in Human γD-crystallin Results in Decreased Stability and Faster Unfolding

    PubMed Central

    Xia, Zhen; Yang, Zaixing; Huynh, Tien; King, Jonathan A.; Zhou, Ruhong

    2013-01-01

    Age-onset cataracts are believed to be expedited by the accumulation of UV-damaged human γD-crystallins in the eye lens. Here we show with molecular dynamics simulations that the stability of γD-crystallin is greatly reduced by the conversion of tryptophan to kynurenine due to UV radiation, consistent with previous experimental evidence. Furthermore, our atomic-detail results reveal that kynurenine attracts more water and other polar sidechains, owing to the additional amino and carbonyl groups on the damaged tryptophan sidechain, thus breaching the integrity of the nearby dry center regions formed by the two Greek key motifs in each domain. The damaged tryptophan residues cause large fluctuations in the Tyr-Trp-Tyr sandwich-like hydrophobic clusters, which in turn break crucial hydrogen bonds bridging two β-strands in the Greek key motifs at the “tyrosine corner”. Our findings may provide new insights for understanding the molecular mechanism of the initial stages of UV-induced cataractogenesis. PMID:23532089

  19. The Aurora-B-dependent NoCut checkpoint prevents damage of anaphase bridges after DNA replication stress.

    PubMed

    Amaral, Nuno; Vendrell, Alexandre; Funaya, Charlotta; Idrissi, Fatima-Zahra; Maier, Michael; Kumar, Arun; Neurohr, Gabriel; Colomina, Neus; Torres-Rosell, Jordi; Geli, María-Isabel; Mendoza, Manuel

    2016-05-01

    Anaphase chromatin bridges can lead to chromosome breakage if not properly resolved before completion of cytokinesis. The NoCut checkpoint, which depends on Aurora B at the spindle midzone, delays abscission in response to chromosome segregation defects in yeast and animal cells. How chromatin bridges are detected, and whether abscission inhibition prevents their damage, remain key unresolved questions. We find that bridges induced by DNA replication stress and by condensation or decatenation defects, but not dicentric chromosomes, delay abscission in a NoCut-dependent manner. Decatenation and condensation defects lead to spindle stabilization during cytokinesis, allowing bridge detection by Aurora B. NoCut does not prevent DNA damage following condensin or topoisomerase II inactivation; however, it protects anaphase bridges and promotes cellular viability after replication stress. Therefore, the molecular origin of chromatin bridges is critical for activation of NoCut, which plays a key role in the maintenance of genome stability after replicative stress.

  20. Multifunctional Role of ATM/Tel1 Kinase in Genome Stability: From the DNA Damage Response to Telomere Maintenance

    PubMed Central

    2014-01-01

    The mammalian protein kinase ataxia telangiectasia mutated (ATM) is a key regulator of the DNA double-strand-break response and belongs to the evolutionary conserved phosphatidylinositol-3-kinase-related protein kinases. ATM deficiency causes ataxia telangiectasia (AT), a genetic disorder that is characterized by premature aging, cerebellar neuropathy, immunodeficiency, and predisposition to cancer. AT cells show defects in the DNA damage-response pathway, cell-cycle control, and telomere maintenance and length regulation. Likewise, in Saccharomyces cerevisiae, haploid strains defective in the TEL1 gene, the ATM ortholog, show chromosomal aberrations and short telomeres. In this review, we outline the complex role of ATM/Tel1 in maintaining genomic stability through its control of numerous aspects of cellular survival. In particular, we describe how ATM/Tel1 participates in the signal transduction pathways elicited by DNA damage and in telomere homeostasis and its importance as a barrier to cancer development. PMID:25247188

  1. Diagnostic of Gravitropism-like Stabilizer of Inspection Drone Using Neural Networks

    NASA Astrophysics Data System (ADS)

    Kruglova, Tatyana; Sayfeddine, Daher; Bulgakov, Alexey

    2018-03-01

    This paper discusses the enhancement of the flight stability of an inspection drone used to scan the condition of buildings at low and high altitude. Due to aerial perturbations and wakes, the drone starts to shake and may be damaged. One mechanical optimization method is to add a built-in stabilizing mechanism. However, the performance of this supporting device becomes critical at certain flying heights if the drone is not to be lost. The paper is divided into two parts: the description of the gravitropism-like stabilizer, and the diagnostics of its status using wavelet transformation and neural network classification.
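
    A hedged sketch of the two-stage diagnostic pipeline named above: wavelet sub-band energies as features, then a neural-network classifier. The signal model, wavelet choice ('db4'), network size, and fault signature are all assumptions for illustration, not details from the paper.

        # Wavelet features + neural-network classification (illustrative).
        import numpy as np
        import pywt
        from sklearn.neural_network import MLPClassifier

        rng = np.random.default_rng(2)

        def features(sig):
            # relative energy of each wavelet sub-band as the feature vector
            coeffs = pywt.wavedec(sig, "db4", level=4)
            e = np.array([np.sum(c ** 2) for c in coeffs])
            return e / e.sum()

        def sample(faulty):
            t = np.linspace(0.0, 1.0, 512)
            sig = np.sin(2 * np.pi * 20 * t) + 0.2 * rng.standard_normal(t.size)
            if faulty:                # assumed fault: extra high-frequency content
                sig += 0.5 * np.sin(2 * np.pi * 90 * t)
            return features(sig)

        X = np.array([sample(k % 2 == 1) for k in range(200)])
        y = np.array([k % 2 for k in range(200)])
        clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000,
                            random_state=0).fit(X, y)
        print("training accuracy:", clf.score(X, y))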

  2. Rad53 regulates replication fork restart after DNA damage in Saccharomyces cerevisiae

    PubMed Central

    Szyjka, Shawn J.; Aparicio, Jennifer G.; Viggiani, Christopher J.; Knott, Simon; Xu, Weihong; Tavaré, Simon; Aparicio, Oscar M.

    2008-01-01

    Replication fork stalling at a DNA lesion generates a damage signal that activates the Rad53 kinase, which plays a vital role in survival by stabilizing stalled replication forks. However, evidence that Rad53 directly modulates the activity of replication forks has been lacking, and the nature of fork stabilization has remained unclear. Recently, cells lacking the Psy2–Pph3 phosphatase were shown to be defective in dephosphorylation of Rad53 as well as replication fork restart after DNA damage, suggesting a mechanistic link between Rad53 deactivation and fork restart. To test this possibility we examined the progression of replication forks in methyl-methanesulfonate (MMS)-damaged cells, under different conditions of Rad53 activity. Hyperactivity of Rad53 in pph3Δ cells slows fork progression in MMS, whereas deactivation of Rad53, through expression of dominant-negative Rad53-KD, is sufficient to allow fork restart during recovery. Furthermore, combined deletion of PPH3 and PTC2, a second, unrelated Rad53 phosphatase, results in complete replication fork arrest and lethality in MMS, demonstrating that Rad53 deactivation is a key mechanism controlling fork restart. We propose a model for regulation of replication fork progression through damaged DNA involving a cycle of Rad53 activation and deactivation that coordinates replication restart with DNA repair. PMID:18628397

  3. Ubiquitylation and the Fanconi Anemia Pathway

    PubMed Central

    Garner, Elizabeth; Smogorzewska, Agata

    2012-01-01

    The Fanconi anemia (FA) pathway maintains genome stability through co-ordination of DNA repair of interstrand crosslinks (ICLs). Disruption of the FA pathway yields hypersensitivity to interstrand crosslinking agents, bone marrow failure and cancer predisposition. Early steps in DNA damage dependent activation of the pathway are governed by monoubiquitylation of FANCD2 and FANCI by the intrinsic FA E3 ubiquitin ligase, FANCL. Downstream FA pathway components and associated factors such as FAN1 and SLX4 exhibit ubiquitin-binding motifs that are important for their DNA repair function, underscoring the importance of ubiquitylation in FA pathway mediated repair. Importantly, ubiquitylation provides the foundations for cross-talk between repair pathways, which in concert with the FA pathway, resolve interstrand crosslink damage and maintain genomic stability. PMID:21605559

  4. Uncertainty Considerations for Ballistic Limit Equations

    NASA Technical Reports Server (NTRS)

    Schonberg, W. P.; Evans, H. J.; Williamsen, J. E; Boyer, R. L.; Nakayama, G. S.

    2005-01-01

    The overall risk for any spacecraft system is typically determined using a Probabilistic Risk Assessment (PRA). A PRA determines the overall risk associated with a particular mission by factoring in all known risks to the spacecraft during its mission. The threat to mission and human life posed by the micro-meteoroid and orbital debris (MMOD) environment is one of the risks. NASA uses the BUMPER II program to provide point estimate predictions of MMOD risk for the Space Shuttle and the ISS. However, BUMPER II does not provide uncertainty bounds or confidence intervals for its predictions. In this paper, we present possible approaches through which uncertainty bounds can be developed for the various damage prediction and ballistic limit equations encoded within the Shuttle and Station versions of BUMPER II.

  5. Is probabilistic bias analysis approximately Bayesian?

    PubMed Central

    MacLehose, Richard F.; Gustafson, Paul

    2011-01-01

    Case-control studies are particularly susceptible to differential exposure misclassification when exposure status is determined following incident case status. Probabilistic bias analysis methods have been developed as ways to adjust standard effect estimates based on the sensitivity and specificity of exposure misclassification. The iterative sampling method advocated in probabilistic bias analysis bears a distinct resemblance to a Bayesian adjustment; however, it is not identical. Furthermore, without a formal theoretical framework (Bayesian or frequentist), the results of a probabilistic bias analysis remain somewhat difficult to interpret. We describe, both theoretically and empirically, the extent to which probabilistic bias analysis can be viewed as approximately Bayesian. While the differences between probabilistic bias analysis and Bayesian approaches to misclassification can be substantial, these situations often involve unrealistic prior specifications and are relatively easy to detect. Outside of these special cases, probabilistic bias analysis and Bayesian approaches to exposure misclassification in case-control studies appear to perform equally well. PMID:22157311
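
    A minimal sketch of the iterative sampling method referred to above, for nondifferential exposure misclassification in a case-control study: sensitivity and specificity are drawn from Beta priors, the observed 2x2 table is back-corrected, and the distribution of the adjusted odds ratio is summarized. The cell counts and priors are invented for illustration.

        # Probabilistic bias analysis by iterative sampling (illustrative data).
        import numpy as np

        rng = np.random.default_rng(1)
        a, b = 215, 1449     # observed exposed/unexposed cases (assumed)
        c, d = 668, 4296     # observed exposed/unexposed controls (assumed)

        n = 50_000
        se = rng.beta(80, 20, n)   # sensitivity prior ~ Beta(80, 20), assumed
        sp = rng.beta(95, 5, n)    # specificity prior ~ Beta(95, 5), assumed

        # Back-correct each margin: A = (obs_exposed - (1-sp)*N) / (se + sp - 1)
        A = (a - (1 - sp) * (a + b)) / (se + sp - 1)
        C = (c - (1 - sp) * (c + d)) / (se + sp - 1)
        B, D = (a + b) - A, (c + d) - C

        or_adj = (A * D) / (B * C)
        ok = (A > 0) & (B > 0) & (C > 0) & (D > 0)   # drop impossible draws
        print(np.percentile(or_adj[ok], [2.5, 50, 97.5]))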

  6. Probabilistic Structural Analysis of SSME Turbopump Blades: Probabilistic Geometry Effects

    NASA Technical Reports Server (NTRS)

    Nagpal, V. K.

    1985-01-01

    A probabilistic study was initiated to evaluate the effects of geometric and material-property tolerances on the structural response of turbopump blades. To complete this study, a number of important probabilistic variables were identified that are believed to affect the structural response of the blade. In addition, a methodology was developed to statistically quantify the influence of these probabilistic variables in an optimized way. The identified variables include random geometric and material-property perturbations, different loadings, and a probabilistic combination of these loadings. The influences of these probabilistic variables are to be quantified by evaluating the blade structural response. Studies of the geometric perturbations were conducted for a flat-plate geometry as well as for a Space Shuttle Main Engine blade geometry, using a special-purpose code based on the finite element approach. The analyses indicate that the variances of the perturbations about given mean values have a significant influence on the response.

  7. Development of probabilistic multimedia multipathway computer codes.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yu, C.; LePoire, D.; Gnanapragasam, E.

    2002-01-01

    The deterministic multimedia dose/risk assessment codes RESRAD and RESRAD-BUILD have been widely used for many years for evaluation of sites contaminated with residual radioactive materials. The RESRAD code applies to the cleanup of sites (soils) and the RESRAD-BUILD code applies to the cleanup of buildings and structures. This work describes the procedure used to enhance the deterministic RESRAD and RESRAD-BUILD codes for probabilistic dose analysis. A six-step procedure was used in developing default parameter distributions and the probabilistic analysis modules. These six steps include (1) listing and categorizing parameters; (2) ranking parameters; (3) developing parameter distributions; (4) testing parameter distributions for probabilistic analysis; (5) developing probabilistic software modules; and (6) testing probabilistic modules and integrated codes. The procedures used can be applied to the development of other multimedia probabilistic codes. The probabilistic versions of the RESRAD and RESRAD-BUILD codes provide tools for studying the uncertainty in dose assessment caused by uncertain input parameters. The parameter distribution data collected in this work can also be applied to other multimedia assessment tasks and multimedia computer codes.

  8. Probabilistic structural analysis methods for select space propulsion system components

    NASA Technical Reports Server (NTRS)

    Millwater, H. R.; Cruse, T. A.

    1989-01-01

    The Probabilistic Structural Analysis Methods (PSAM) project developed at the Southwest Research Institute integrates state-of-the-art structural analysis techniques with probability theory for the design and analysis of complex large-scale engineering structures. An advanced efficient software system (NESSUS) capable of performing complex probabilistic analysis has been developed. NESSUS contains a number of software components to perform probabilistic analysis of structures. These components include: an expert system, a probabilistic finite element code, a probabilistic boundary element code and a fast probability integrator. The NESSUS software system is shown. An expert system is included to capture and utilize PSAM knowledge and experience. NESSUS/EXPERT is an interactive menu-driven expert system that provides information to assist in the use of the probabilistic finite element code NESSUS/FEM and the fast probability integrator (FPI). The expert system menu structure is summarized. The NESSUS system contains a state-of-the-art nonlinear probabilistic finite element code, NESSUS/FEM, to determine the structural response and sensitivities. A broad range of analysis capabilities and an extensive element library is present.

  9. A decade of understanding spatio-temporal regulation of DNA repair by the nuclear architecture.

    PubMed

    Saad, Hicham; Cobb, Jennifer A

    2016-10-01

    The nucleus is a hub for gene expression and is a highly organized entity. The nucleoplasm is heterogeneous, owing to the preferential localization of specific metabolic factors, which lead to the definition of nuclear compartments or bodies. The genome is organized into chromosome territories, as well as heterochromatin and euchromatin domains. Recent observations have indicated that nuclear organization is important for maintaining genomic stability. For example, nuclear organization has been implicated in stabilizing damaged DNA, repair-pathway choice, and in preventing chromosomal rearrangements. Over the past decade, several studies have revealed that dynamic changes in the nuclear architecture are important during double-strand break repair. Stemming from work in yeast, relocation of a damaged site prior to repair appears to be at least partially conserved in multicellular eukaryotes. In this review, we will discuss genome and nucleoplasm architecture, particularly the importance of the nuclear periphery in genome stability. We will also discuss how the site of relocation regulates repair-pathway choice.

  10. Frontal and Parietal Contributions to Probabilistic Association Learning

    PubMed Central

    Rushby, Jacqueline A.; Vercammen, Ans; Loo, Colleen; Short, Brooke

    2011-01-01

    Neuroimaging studies have shown both dorsolateral prefrontal (DLPFC) and inferior parietal cortex (iPARC) activation during probabilistic association learning. Whether these cortical brain regions are necessary for probabilistic association learning is presently unknown. Participants' ability to acquire probabilistic associations was assessed during disruptive 1 Hz repetitive transcranial magnetic stimulation (rTMS) of the left DLPFC, left iPARC, and sham using a crossover single-blind design. On subsequent sessions, performance improved relative to baseline except during DLPFC rTMS that disrupted the early acquisition beneficial effect of prior exposure. A second experiment examining rTMS effects on task-naive participants showed that neither DLPFC rTMS nor sham influenced naive acquisition of probabilistic associations. A third experiment examining consecutive administration of the probabilistic association learning test revealed early trial interference from previous exposure to different probability schedules. These experiments, showing disrupted acquisition of probabilistic associations by rTMS only during subsequent sessions with an intervening night's sleep, suggest that the DLPFC may facilitate early access to learned strategies or prior task-related memories via consolidation. Although neuroimaging studies implicate DLPFC and iPARC in probabilistic association learning, the present findings suggest that early acquisition of the probabilistic cue-outcome associations in task-naive participants is not dependent on either region. PMID:21216842

  11. Influence of damage and basal friction on the grounding line dynamics

    NASA Astrophysics Data System (ADS)

    Brondex, Julien; Gagliardini, Olivier; Gillet-Chaulet, Fabien; Durand, Gael

    2016-04-01

    The understanding of grounding line dynamics is a major issue in the prediction of future sea-level rise due to ice released from polar ice sheets into the ocean. These dynamics are complex and significantly affected by several physical processes not always adequately accounted for in current ice flow models. Among those processes, our study focuses on ice damage and evolving basal friction conditions. Softening of the ice due to damage processes is known to have a strong impact on its rheology by reducing its viscosity and therefore promoting flow acceleration. Damage develops where shear stresses are high enough, which is usually the case at shear margins and in the vicinity of pinning points in contact with ice shelves. Those areas are known to have a buttressing effect on ice shelves, contributing to stabilizing the grounding line. We aim at evaluating the extent to which this stabilizing effect is hampered by damage processes. Several friction laws have been proposed by various authors to model the contact between grounded ice and bedrock. Among them, Coulomb-type friction laws make it possible to account for the reduced friction associated with low effective pressure (the ice pressure minus the water pressure). Combining such a friction law with a parametrization of the effective pressure that accounts for the fact that the area upstream of the grounding line is connected to the ocean is expected to have a significant impact on grounding line dynamics. Using the finite-element code Elmer/Ice, within which the Coulomb-type friction law, the effective-pressure parametrization, and the damage model have all been implemented, the goal of this study is to investigate the sensitivity of grounding line dynamics to damage and to an evolving basal friction, as well as the relative importance of these two processes.

  12. p53, a New Master Regulator of Stem Cell Differentiation | Center for Cancer Research

    Cancer.gov

    When the genome is damaged, a key player in stabilizing and maintaining genomic integrity is a protein called p53.  This protein can activate or shut down gene activity in response to DNA damage.  But how exactly does p53 accomplish its task? This question has yet to be answered completely at the molecular level.   

  13. La Recherche Aerospatiale, Bimonthly Bulletin, no. 1982-6, 211/November-December 1982

    NASA Astrophysics Data System (ADS)

    Sevestre, C.

    1983-04-01

    A modular method for centrifugal compressor performance prediction is presented. Cyclic hardening of stainless steel under complex loading is described. Fatigue failure microinitiation, micropropagation and damage is considered. The stability of a tilting rotor aircraft model is studied. The thermal stability of titanium alloys is investigated. A compensator for thermal effects on quartz oscillators is described.

  14. 30 CFR 250.1003 - Installation, testing, and repair requirements for DOI pipelines.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... installed in water depths of less than 200 feet shall be buried to a depth of at least 3 feet unless they... damage potential exists. (b)(1) Pipelines shall be pressure tested with water at a stabilized pressure of... repair, the pipeline shall be pressure tested with water or processed natural gas at a minimum stabilized...

  15. Quantitative diagnosis and prognosis framework for concrete degradation due to alkali-silica reaction

    NASA Astrophysics Data System (ADS)

    Mahadevan, Sankaran; Neal, Kyle; Nath, Paromita; Bao, Yanqing; Cai, Guowei; Orme, Peter; Adams, Douglas; Agarwal, Vivek

    2017-02-01

    This research is seeking to develop a probabilistic framework for health diagnosis and prognosis of aging concrete structures in nuclear power plants that are subjected to physical, chemical, environment, and mechanical degradation. The proposed framework consists of four elements: monitoring, data analytics, uncertainty quantification, and prognosis. The current work focuses on degradation caused by ASR (alkali-silica reaction). Controlled concrete specimens with reactive aggregate are prepared to develop accelerated ASR degradation. Different monitoring techniques — infrared thermography, digital image correlation (DIC), mechanical deformation measurements, nonlinear impact resonance acoustic spectroscopy (NIRAS), and vibro-acoustic modulation (VAM) — are studied for ASR diagnosis of the specimens. Both DIC and mechanical measurements record the specimen deformation caused by ASR gel expansion. Thermography is used to compare the thermal response of pristine and damaged concrete specimens and generate a 2-D map of the damage (i.e., ASR gel and cracked area), thus facilitating localization and quantification of damage. NIRAS and VAM are two separate vibration-based techniques that detect nonlinear changes in dynamic properties caused by the damage. The diagnosis results from multiple techniques are then fused using a Bayesian network, which also helps to quantify the uncertainty in the diagnosis. Prognosis of ASR degradation is then performed based on the current state of degradation obtained from diagnosis, by using a coupled thermo-hydro-mechanical-chemical (THMC) model for ASR degradation. This comprehensive approach of monitoring, data analytics, and uncertainty-quantified diagnosis and prognosis will facilitate the development of a quantitative, risk informed framework that will support continuous assessment and risk management of structural health and performance.
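
    A minimal sketch of the fusion step, reduced to a two-technique naive-Bayes update (the paper uses a full Bayesian network); the prior and the detection/false-alarm rates of each technique are assumed numbers, not calibrated values from the study.

        # Naive-Bayes fusion of two damage indications (illustrative rates).
        prior = 0.30                    # prior probability of ASR damage (assumed)

        # (P[positive | damage], P[positive | no damage]) per technique, assumed
        techniques = {"thermography": (0.85, 0.10),
                      "NIRAS":        (0.75, 0.20)}
        readings   = {"thermography": True, "NIRAS": True}   # both flag damage

        num = prior
        den = 1.0 - prior
        for name, (tpr, fpr) in techniques.items():
            pos = readings[name]
            num *= tpr if pos else (1.0 - tpr)
            den *= fpr if pos else (1.0 - fpr)

        posterior = num / (num + den)
        print(f"P(damage | readings) = {posterior:.3f}")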

  16. Structural response of Nd-stabilized zirconia and its composite under extreme conditions of swift heavy ion irradiation

    NASA Astrophysics Data System (ADS)

    Nandi, Chiranjit; Grover, V.; Kulriya, P. K.; Poswal, A. K.; Prakash, Amrit; Khan, K. B.; Avasthi, D. K.; Tyagi, A. K.

    2018-02-01

    The inert matrix fuel concept for minor actinide transmutation proposes stabilized zirconia as the major component of the inert matrix. The present study explores Nd-stabilized zirconia (Zr0.8Nd0.2O1.9; Nd as a surrogate for Am) and its composites for radiation tolerance against fission fragments. MgO is introduced into the composite with stabilized zirconia with a view to enhancing the thermal conductivity. The radiation damage is also compared with that of Nd-stabilized zirconia co-doped with Y3+ (Zr0.8Nd0.1Y0.1O1.9), in order to mimic the doping of minor actinides into Y3+-containing stabilized zirconia. The compositions were synthesized by gel combustion followed by high-temperature sintering and characterised by XRD, SEM, and EDS. Irradiation was carried out with 120 MeV Au ions at various fluences, and irradiation-induced structural changes were probed by in-situ X-ray diffraction (XRD). XRD demonstrated the retention of crystallinity for all three samples, but the extent of the damage was found to be highly dependent on the nominal composition. It was observed that introducing Y3+ along with Nd3+ to stabilize cubic zirconia imparted poorer radiation stability. On the other hand, the formation of a CERCER composite of MgO with Nd-stabilised zirconia enhanced its behaviour under swift heavy ion irradiation. XANES spectroscopy of these compositions after irradiation did not show any change in the local electronic structure of the constituent ions.

  17. Use of Savitzky-Golay Filter for Performances Improvement of SHM Systems Based on Neural Networks and Distributed PZT Sensors.

    PubMed

    de Oliveira, Mario A; Araujo, Nelcileno V S; da Silva, Rodolfo N; da Silva, Tony I; Epaarachchi, Jayantha

    2018-01-08

    A considerable amount of research has focused on monitoring structural damage using Structural Health Monitoring (SHM) technologies, which have seen recent advances. However, it is important to note the challenges and unresolved problems that limit currently developed monitoring systems. One of the frontline SHM technologies, the Electromechanical Impedance (EMI) technique, has shown its potential to overcome the remaining problems and challenges. Unfortunately, recently developed neural network algorithms have not shown significant improvements in accuracy rate and required processing time. In order to fill this gap in advanced neural networks used with EMI techniques, this paper proposes an enhanced and reliable strategy for improving structural damage detection via: (1) a Savitzky-Golay (SG) filter, using both first and second derivatives; (2) a Probabilistic Neural Network (PNN); and (3) a Simplified Fuzzy ARTMAP Network (SFAN). These three methods were employed to analyze EMI data experimentally obtained from an aluminum plate containing three attached PZT (lead zirconate titanate) patches. In the present study, damage scenarios were simulated by attaching a small metallic nut at three different positions on the aluminum plate. We found that the proposed method achieves a hit rate of more than 83%, which is significantly higher than current state-of-the-art approaches. Furthermore, this approach results in an improvement of 93% when considering the best-case scenario.
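
    A short sketch of step (1) of this strategy, using scipy's Savitzky-Golay filter to smooth a signature and extract first- and second-derivative features; the synthetic signal, frequency band, and window/polynomial settings are illustrative, not the paper's EMI data.

        # Savitzky-Golay smoothing and derivatives of a synthetic signature.
        import numpy as np
        from scipy.signal import savgol_filter

        freq = np.linspace(10e3, 100e3, 901)                 # Hz, assumed band
        rng = np.random.default_rng(7)
        signature = np.sin(freq / 4e3) + 0.05 * rng.standard_normal(freq.size)

        win, poly = 31, 3                                    # window, poly order
        smooth = savgol_filter(signature, win, poly)         # denoised signature
        d1 = savgol_filter(signature, win, poly, deriv=1)    # first derivative
        d2 = savgol_filter(signature, win, poly, deriv=2)    # second derivative
        features = np.concatenate([d1, d2])                  # e.g., PNN/SFAN input
        print(smooth.shape, features.shape)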

  18. Use of Savitzky–Golay Filter for Performances Improvement of SHM Systems Based on Neural Networks and Distributed PZT Sensors

    PubMed Central

    Araujo, Nelcileno V. S.; da Silva, Rodolfo N.; da Silva, Tony I.; Epaarachchi, Jayantha

    2018-01-01

    A considerable amount of research has focused on monitoring structural damage using Structural Health Monitoring (SHM) technologies, which have seen recent advances. However, it is important to note the challenges and unresolved problems that limit currently developed monitoring systems. One of the frontline SHM technologies, the Electromechanical Impedance (EMI) technique, has shown its potential to overcome the remaining problems and challenges. Unfortunately, recently developed neural network algorithms have not shown significant improvements in accuracy rate and required processing time. In order to fill this gap in advanced neural networks used with EMI techniques, this paper proposes an enhanced and reliable strategy for improving structural damage detection via: (1) a Savitzky–Golay (SG) filter, using both first and second derivatives; (2) a Probabilistic Neural Network (PNN); and (3) a Simplified Fuzzy ARTMAP Network (SFAN). These three methods were employed to analyze EMI data experimentally obtained from an aluminum plate containing three attached PZT (lead zirconate titanate) patches. In the present study, damage scenarios were simulated by attaching a small metallic nut at three different positions on the aluminum plate. We found that the proposed method achieves a hit rate of more than 83%, which is significantly higher than current state-of-the-art approaches. Furthermore, this approach results in an improvement of 93% when considering the best-case scenario. PMID:29316693

  19. Failure Predictions for VHTR Core Components using a Probabilistic Continuum Damage Mechanics Model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fok, Alex

    2013-10-30

    The proposed work addresses the key research need for the development of constitutive models and overall failure models for graphite and high temperature structural materials, with the long-term goal being to maximize the design life of the Next Generation Nuclear Plant (NGNP). To this end, the capability of a Continuum Damage Mechanics (CDM) model, which has been used successfully for modeling fracture of virgin graphite, will be extended as a predictive and design tool for the core components of the very high-temperature reactor (VHTR). Specifically, irradiation and environmental effects pertinent to the VHTR will be incorporated into the model to allow fracture of graphite and ceramic components under in-reactor conditions to be modeled explicitly using the finite element method. The model uses a combined stress-based and fracture mechanics-based failure criterion, so it can simulate both the initiation and propagation of cracks. Modern imaging techniques, such as x-ray computed tomography and digital image correlation, will be used during material testing to help define the baseline material damage parameters. Monte Carlo analysis will be performed to address inherent variations in material properties, the aim being to reduce the arbitrariness and uncertainties associated with the current statistical approach. The results can potentially contribute to the current development of American Society of Mechanical Engineers (ASME) codes for the design and construction of VHTR core components.
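
    An illustrative Monte Carlo sketch of a combined stress-based and fracture-mechanics-based failure check under scattered material properties, in the spirit of the analysis proposed above; all distributions, the applied stress, and the edge-crack geometry factor are assumed, not graphite design data.

        # Monte Carlo over material scatter with a combined failure criterion:
        # a sample fails by stress (sigma > sigma_c) or by fracture (K > K_Ic).
        import numpy as np

        rng = np.random.default_rng(3)
        n = 100_000
        sigma_c = rng.normal(20.0, 2.5, n)             # MPa, strength scatter (assumed)
        K_Ic    = rng.normal(1.2, 0.15, n)             # MPa*sqrt(m), toughness (assumed)
        a       = rng.lognormal(np.log(2e-3), 0.4, n)  # m, initial flaw size (assumed)

        sigma = 12.0                                   # MPa, applied stress (assumed)
        K = 1.12 * sigma * np.sqrt(np.pi * a)          # edge-crack approximation

        fails = (sigma > sigma_c) | (K > K_Ic)         # initiation OR propagation
        print(f"P_f = {fails.mean():.3e}")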

  20. Self-Healing Characteristics of Damaged Rock Salt under Different Healing Conditions

    PubMed Central

    Chen, Jie; Ren, Song; Yang, Chunhe; Jiang, Deyi; Li, Lin

    2013-01-01

    Salt deposits are commonly regarded as ideal hosts for geologic energy reservoirs. Underground cavern construction-induced damage in salt is reduced by self-healing; thus, studying the factors influencing such healing processes is important. This research uses ultrasonic technology to monitor the longitudinal wave velocity variations of stress-damaged rock salts during self-recovery experiments under different recovery conditions. The influences of stress-induced initial damage, temperature, humidity, and oil on the self-recovery of damaged rock salts are analyzed. The wave velocity values of the damaged rock salts increase rapidly during the first 200 h of recovery, and the values gradually increase toward stabilization after 600 h. The degree of recovery depends on the initial damage stress to which the rock salt was subjected. Water plays an important role in damage recovery. An increase in temperature improves damage recovery when water is abundant, but hinders recovery when water evaporates. The presence of residual hydraulic oil blocks the inter-granular role of water and restrains recovery under triaxial compression. The results indicate that rock salt damage recovery is related to the damage degree, pore pressure, temperature, humidity, and the presence of oil, given the sealing integrity of the jacket material. PMID:28811444

  1. Self-Healing Characteristics of Damaged Rock Salt under Different Healing Conditions.

    PubMed

    Chen, Jie; Ren, Song; Yang, Chunhe; Jiang, Deyi; Li, Lin

    2013-08-12

    Salt deposits are commonly regarded as ideal hosts for geologic energy reservoirs. Underground cavern construction-induced damage in salt is reduced by self-healing; thus, studying the factors influencing such healing processes is important. This research uses ultrasonic technology to monitor the longitudinal wave velocity variations of stress-damaged rock salts during self-recovery experiments under different recovery conditions. The influences of stress-induced initial damage, temperature, humidity, and oil on the self-recovery of damaged rock salts are analyzed. The wave velocity values of the damaged rock salts increase rapidly during the first 200 h of recovery, and the values gradually increase toward stabilization after 600 h. The degree of recovery depends on the initial damage stress to which the rock salt was subjected. Water plays an important role in damage recovery. An increase in temperature improves damage recovery when water is abundant, but hinders recovery when water evaporates. The presence of residual hydraulic oil blocks the inter-granular role of water and restrains recovery under triaxial compression. The results indicate that rock salt damage recovery is related to the damage degree, pore pressure, temperature, humidity, and the presence of oil, given the sealing integrity of the jacket material.

  2. Probabilistic Ontology Architecture for a Terrorist Identification Decision Support System

    DTIC Science & Technology

    2014-06-01

    Application in real-world problems requires probabilistic ontologies, which integrate the inferential reasoning power of probabilistic representations with the first-order expressivity of ontologies. Keywords: probabilistic ontology, terrorism, inferential reasoning, architecture.

  3. Probabilistic Simulation of Stress Concentration in Composite Laminates

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.; Murthy, P. L. N.; Liaw, D. G.

    1994-01-01

    A computational methodology is described to probabilistically simulate the stress concentration factors (SCFs) in composite laminates. This new approach consists of coupling probabilistic composite mechanics with probabilistic finite element structural analysis. The composite mechanics is used to probabilistically describe all the uncertainties inherent in composite material properties, whereas the finite element analysis is used to probabilistically describe the uncertainties associated with methods to experimentally evaluate SCFs, such as loads, geometry, and supports. The effectiveness of the methodology is demonstrated by using it to simulate the SCFs in three different composite laminates. Simulated results match experimental data for probability density and cumulative distribution functions. The sensitivity factors indicate that the SCFs are influenced by local stiffness variables, by load eccentricities, and by initial stress fields.

  4. Probabilistic and deterministic evaluation of uncertainty in a local scale multi-risk analysis

    NASA Astrophysics Data System (ADS)

    Lari, S.; Frattini, P.; Crosta, G. B.

    2009-04-01

    We performed a probabilistic multi-risk analysis (QPRA) at the local scale for a 420 km2 area surrounding the town of Brescia (Northern Italy). We calculated the expected annual loss, in terms of economic damage and loss of life, for a set of risk scenarios of flood, earthquake, and industrial accident with different occurrence probabilities and different intensities. The territorial unit used for the study was the census parcel, of variable area, for which a large amount of data was available. Because of the lack of information on the hazards, on the value of the exposed elements (e.g., residential and industrial areas, population, lifelines, and sensitive elements such as schools and hospitals), and on the process-specific vulnerability, together with limited knowledge of the processes themselves (floods, industrial accidents, earthquakes), we assigned an uncertainty to the input variables of the analysis. For some variables a homogeneous uncertainty was assigned over the whole study area, for instance for the number of buildings of various typologies and for the event occurrence probability. In other cases, such as phenomenon intensity (e.g., water depth during a flood) and probability of impact, the uncertainty was defined in relation to the census parcel area. In fact, by assuming some variables to be homogeneously distributed or averaged over the census parcels, we introduce a larger error for larger parcels. We propagated the uncertainty through the analysis using three different models, each describing the reliability of the output (risk) as a function of the uncertainty of the inputs (scenarios and vulnerability functions): a probabilistic approach based on Monte Carlo simulation, and two deterministic models, namely First Order Second Moment (FOSM) and Point Estimate (PE). In general, similar values of expected losses are obtained with the three models. The uncertainty of the final risk value is in all three cases around 30% of the expected value. Each of the models nevertheless requires different assumptions and computational efforts, and provides results with a different level of detail.
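
    A minimal sketch contrasting FOSM propagation with Monte Carlo for a toy annual-loss model (occurrence probability x vulnerability x exposed value); the loss model and the input moments are invented for illustration and do not reproduce the study's models.

        # FOSM (first-order Taylor) vs Monte Carlo uncertainty propagation.
        import numpy as np

        mean = np.array([0.01, 0.4, 5.0e6])          # p, v, w means (assumed)
        cov  = np.diag([0.002, 0.1, 1.0e6]) ** 2     # independent variances (assumed)

        def risk(x):
            return x[0] * x[1] * x[2]                # expected annual loss model

        # FOSM: gradient at the mean by central differences
        eps = mean * 1e-6
        grad = np.array([(risk(mean + e) - risk(mean - e)) / (2.0 * h)
                         for e, h in zip(np.diag(eps), eps)])
        var_fosm = grad @ cov @ grad
        print("FOSM:", risk(mean), np.sqrt(var_fosm))

        # Monte Carlo check
        rng = np.random.default_rng(0)
        X = rng.multivariate_normal(mean, cov, 200_000)
        mc = X[:, 0] * X[:, 1] * X[:, 2]
        print("MC:  ", mc.mean(), mc.std())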

  5. Probabilistic seismic hazard assessment of the Eastern and Central groups of the Azores - Portugal

    NASA Astrophysics Data System (ADS)

    Fontiela, João; Bezzeghoud, Mourad; Rosset, Philippe; Borges, José; Rodrigues, Francisco; Caldeira, Bento

    2017-04-01

    The Azores islands of the Eastern and Central groups are located at the triple junction of the American, Eurasian, and Nubian plates, which induces a large number of low-magnitude earthquakes. Since the islands' settlement in the 15th century, 33 earthquakes with intensity ≥ VII have caused severe damage and a high death toll. The most severe occurred in 1522 at São Miguel Island, with a maximum MM intensity of X; in 1614 at Terceira Island (X); in 1757 at São Jorge Island (XI); in 1852 at São Miguel Island (VIII); in 1926 at Faial Island (Mb 5.3-5.9); in 1980 at Terceira Island (Mw 7.1); and in 1998 at Faial Island (Mw 6.2). The Probabilistic Seismic Hazard Assessment (PSHA) was carried out using the classical Cornell-McGuire approach with seismogenic zones recently defined by Fontiela et al. (2014). We created a new earthquake catalogue, merging local and global datasets over a large time span (1522-2016), to calculate recurrence times and maximum magnitudes. In order to reduce the epistemic uncertainties, we tested several ground motion prediction equations consistent with the geological heterogeneities typical of young volcanic islands. Probabilistic seismic hazard maps are proposed for 475- and 975-year return periods, as well as hazard curves and uniform hazard spectra for the main cities. REFERENCES: Fontiela, J. et al., 2014. Azores seismogenic zones. Comunicações Geológicas, 101(1), pp.351-354. ACKNOWLEDGMENTS: João Fontiela is supported by grant M3.1.2/F/060/2011 of the Regional Science Fund of the Regional Government of the Azores, and this study is co-funded by the European Union through the European Fund of Regional Development, framed in COMPETE 2020 (Operational Competitiveness Programme and Internationalization), through the ICT project (UID/GEO/04683/2013) with the reference POCI-01-0145-FEDER-007690.

  6. Improved reliability of wind turbine towers with active tuned mass dampers (ATMDs)

    NASA Astrophysics Data System (ADS)

    Fitzgerald, Breiffni; Sarkar, Saptarshi; Staino, Andrea

    2018-04-01

    Modern multi-megawatt wind turbines are composed of slender, flexible, and lightly damped blades and towers. These components exhibit high susceptibility to wind-induced vibrations. As the size, flexibility, and cost of the towers have increased in recent years, the need to protect these structures against damage induced by turbulent aerodynamic loading has become apparent. This paper combines structural dynamic models and probabilistic assessment tools to demonstrate improvements in structural reliability when modern wind turbine towers are equipped with active tuned mass dampers (ATMDs). The study proposes a multi-modal wind turbine model for control design and analysis and incorporates an ATMD into the tower of this model. The model is subjected to stochastically generated wind loads of varying speeds to develop wind-induced probabilistic demand models for towers of modern multi-megawatt wind turbines under structural uncertainty. Numerical simulations were carried out to ascertain the effectiveness of the active control system in improving the structural performance and reliability of the wind turbine. The study constructs fragility curves, which illustrate reductions in the vulnerability of towers to wind loading owing to the inclusion of the damper. Results show that the active controller is successful in increasing the reliability of the tower responses; in particular, a strong reduction of the probability of exceeding a given displacement at the rated wind speed is observed.
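
    A sketch of how such fragility curves can be assembled from stochastic response samples, with and without the damper; the lognormal demand model, the 25% median-response reduction for the ATMD, and the displacement threshold are assumed stand-ins for the paper's simulations.

        # Empirical fragility points: P[peak displacement > limit] vs wind speed.
        import numpy as np

        rng = np.random.default_rng(5)
        speeds = np.arange(6, 26, 2)        # m/s, mean wind speeds (assumed)
        limit = 0.60                        # m, displacement threshold (assumed)

        def peak_disp(u, controlled, n=20_000):
            # toy lognormal demand model; ATMD cuts the median response ~25%
            median = 0.035 * u * (0.75 if controlled else 1.0)
            return rng.lognormal(np.log(median), 0.35, n)

        for u in speeds:
            p_unc = np.mean(peak_disp(u, False) > limit)
            p_con = np.mean(peak_disp(u, True) > limit)
            print(f"{u:2d} m/s  P_exceed: uncontrolled {p_unc:.3f}  ATMD {p_con:.3f}")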

  7. Design and evaluation guidelines for Department of Energy facilities subjected to natural phenomena hazards

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kennedy, R.P.; Short, S.A.; McDonald, J.R.

    1990-06-01

    The Department of Energy (DOE) and the DOE Natural Phenomena Hazards Panel have developed uniform design and evaluation guidelines for protection against natural phenomena hazards at DOE sites throughout the United States. The goal of the guidelines is to assure that DOE facilities can withstand the effects of natural phenomena such as earthquakes, extreme winds, tornadoes, and flooding. The guidelines apply to both new facilities (design) and existing facilities (evaluation, modification, and upgrading). The intended audience is primarily the civil/structural or mechanical engineers conducting the design or evaluation of DOE facilities. The likelihood of occurrence of natural phenomena hazards at each DOE site has been evaluated by the DOE Natural Phenomena Hazard Program. Probabilistic hazard models are available for earthquake, extreme wind/tornado, and flood. Alternatively, site organizations are encouraged to develop site-specific hazard models utilizing the most recent information and techniques available. In this document, performance goals and natural hazard levels are expressed in probabilistic terms, and design and evaluation procedures are presented in deterministic terms. Design/evaluation procedures conform closely to common standard practices so that the procedures will be easily understood by most engineers. Performance goals are expressed in terms of structure or equipment damage to the extent that: (1) the facility cannot function; (2) the facility would need to be replaced; or (3) personnel are endangered. 82 refs., 12 figs., 18 tabs.

  8. E2F1 transcription is induced by genotoxic stress through ATM/ATR activation.

    PubMed

    Carcagno, Abel L; Ogara, María F; Sonzogni, Silvina V; Marazita, Mariela C; Sirkin, Pablo F; Ceruti, Julieta M; Cánepa, Eduardo T

    2009-05-01

    E2F1, a member of the E2F family of transcription factors, plays a critical role in controlling both cell cycle progression and apoptotic cell death in response to DNA damage and oncogene activation. Following genotoxic stresses, E2F1 protein is stabilized by phosphorylation and acetylation, leading to its accumulation. The aim of the present work was to examine whether the increase in E2F1 protein levels observed after DNA damage is only a reflection of increased E2F1 protein stability or is also the consequence of enhanced transcription of the E2F1 gene. The data presented here demonstrate that UV light and other genotoxic agents induce transcription of the E2F1 gene in an ATM/ATR-dependent manner, which results in increased E2F1 mRNA and protein levels. After genotoxic stress, transcription of cyclin E, an E2F1 target gene, was significantly induced. This induction was the result of two well-differentiated effects, one dependent on de novo protein synthesis and the other on protein stabilization. Our results strongly support a transcriptional effect of DNA-damaging agents on E2F1 expression and uncover a new mechanism involving E2F1 in the response to genotoxic stress.

  9. Probabilistic soft sets and dual probabilistic soft sets in decision making with positive and negative parameters

    NASA Astrophysics Data System (ADS)

    Fatimah, F.; Rosadi, D.; Hakim, R. B. F.

    2018-03-01

    In this paper, we motivate and introduce probabilistic soft sets and dual probabilistic soft sets for handling decision making problem in the presence of positive and negative parameters. We propose several types of algorithms related to this problem. Our procedures are flexible and adaptable. An example on real data is also given.
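
    A hedged sketch of one plausible decision rule of this kind: each object's score adds its membership probabilities over positive (benefit) parameters and subtracts those over negative (cost) parameters. The objects, parameters, probabilities, and the scoring rule itself are illustrative assumptions, not the algorithms proposed in the paper.

        # Toy decision rule over a probabilistic soft set with positive and
        # negative parameters (all data and the rule are assumed).
        objects = ["h1", "h2", "h3"]
        positive = {"cheap":   {"h1": 0.9, "h2": 0.4, "h3": 0.7},
                    "durable": {"h1": 0.5, "h2": 0.8, "h3": 0.6}}
        negative = {"risky":   {"h1": 0.2, "h2": 0.6, "h3": 0.1}}

        score = {o: sum(p[o] for p in positive.values())
                    - sum(n[o] for n in negative.values())
                 for o in objects}
        best = max(score, key=score.get)
        print(score, "->", best)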

  10. Learning Probabilistic Logic Models from Probabilistic Examples

    PubMed Central

    Chen, Jianzhong; Muggleton, Stephen; Santos, José

    2009-01-01

    We revisit an application developed originally using abductive Inductive Logic Programming (ILP) for modeling inhibition in metabolic networks. The example data were derived from studies of the effects of toxins on rats, using Nuclear Magnetic Resonance (NMR) time-trace analysis of their biofluids together with background knowledge representing a subset of the Kyoto Encyclopedia of Genes and Genomes (KEGG). We now apply two Probabilistic ILP (PILP) approaches, abductive Stochastic Logic Programs (SLPs) and PRogramming In Statistical modeling (PRISM), to the application. Both approaches support abductive learning and probability predictions. Abductive SLPs are a PILP framework that provides possible-worlds semantics to SLPs through abduction. Instead of learning logic models from non-probabilistic examples, as done in ILP, the PILP approach applied in this paper is based on a general technique for introducing probability labels within a standard scientific experimental setting involving control and treated data. Our results demonstrate that the PILP approach provides a way of learning probabilistic logic models from probabilistic examples, and the PILP models learned from probabilistic examples lead to a significant decrease in error, accompanied by improved insight from the learned results, compared with PILP models learned from non-probabilistic examples. PMID:19888348

  11. Learning Probabilistic Logic Models from Probabilistic Examples.

    PubMed

    Chen, Jianzhong; Muggleton, Stephen; Santos, José

    2008-10-01

    We revisit an application developed originally using abductive Inductive Logic Programming (ILP) for modeling inhibition in metabolic networks. The example data were derived from studies of the effects of toxins on rats, using Nuclear Magnetic Resonance (NMR) time-trace analysis of their biofluids together with background knowledge representing a subset of the Kyoto Encyclopedia of Genes and Genomes (KEGG). We now apply two Probabilistic ILP (PILP) approaches, abductive Stochastic Logic Programs (SLPs) and PRogramming In Statistical modeling (PRISM), to the application. Both approaches support abductive learning and probability predictions. Abductive SLPs are a PILP framework that provides possible-worlds semantics to SLPs through abduction. Instead of learning logic models from non-probabilistic examples, as done in ILP, the PILP approach applied in this paper is based on a general technique for introducing probability labels within a standard scientific experimental setting involving control and treated data. Our results demonstrate that the PILP approach provides a way of learning probabilistic logic models from probabilistic examples, and the PILP models learned from probabilistic examples lead to a significant decrease in error, accompanied by improved insight from the learned results, compared with PILP models learned from non-probabilistic examples.

  12. Probabilistic simulation of uncertainties in thermal structures

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.; Shiao, Michael

    1990-01-01

    Development of probabilistic structural analysis methods for hot structures is a major activity at Lewis Research Center. It consists of five program elements: (1) probabilistic loads; (2) probabilistic finite element analysis; (3) probabilistic material behavior; (4) assessment of reliability and risk; and (5) probabilistic structural performance evaluation. Recent progress includes: (1) quantification of the effects of uncertainties for several variables on high pressure fuel turbopump (HPFT) blade temperature, pressure, and torque of the Space Shuttle Main Engine (SSME); (2) the evaluation of the cumulative distribution function for various structural response variables based on assumed uncertainties in primitive structural variables; (3) evaluation of the failure probability; (4) reliability and risk-cost assessment, and (5) an outline of an emerging approach for eventual hot structures certification. Collectively, the results demonstrate that the structural durability/reliability of hot structural components can be effectively evaluated in a formal probabilistic framework. In addition, the approach can be readily extended to computationally simulate certification of hot structures for aerospace environments.

  13. Adjustment of Adaptive Gain with Bounded Linear Stability Analysis to Improve Time-Delay Margin for Metrics-Driven Adaptive Control

    NASA Technical Reports Server (NTRS)

    Bakhtiari-Nejad, Maryam; Nguyen, Nhan T.; Krishnakumar, Kalmanje Srinvas

    2009-01-01

    This paper presents the application of the Bounded Linear Stability Analysis (BLSA) method to metrics-driven adaptive control. The BLSA method is used for analyzing the stability of adaptive control models without linearizing the adaptive laws. Metrics-driven adaptive control introduces the notion that adaptation should be driven by stability metrics to achieve robustness. By applying the BLSA method, the adaptive gain is adjusted during adaptation in order to meet certain phase margin requirements. Metrics-driven adaptive control is evaluated for a linear damaged twin-engine generic transport aircraft model. The analysis shows that the system with the adjusted adaptive gain becomes more robust to unmodeled dynamics or time delay.

  14. The formation of catalytically competent enzyme-substrate complex is not a bottleneck in lesion excision by human alkyladenine DNA glycosylase.

    PubMed

    Kuznetsov, N A; Kiryutin, A S; Kuznetsova, A A; Panov, M S; Barsukova, M O; Yurkovskaya, A V; Fedorova, O S

    2017-04-01

    Human alkyladenine DNA glycosylase (AAG) protects DNA from alkylated and deaminated purine lesions. AAG flips out the damaged nucleotide from the double helix of DNA and catalyzes the hydrolysis of the N-glycosidic bond to release the damaged base. To better understand how the step of nucleotide eversion influences the overall catalytic process, we performed a pre-steady-state kinetic analysis of AAG interaction with specific DNA substrates: 13-base-pair duplexes containing, at the 7th position, 1-N6-ethenoadenine (εA), hypoxanthine (Hx), or the stable product analogue tetrahydrofuran (F). The combination of the fluorescence of tryptophan, 2-aminopurine, and 1-N6-ethenoadenine was used to record conformational changes of the enzyme and DNA during the processes of DNA lesion recognition, damaged base eversion, excision of the N-glycosidic bond, and product release. The thermal stability of the duplexes, characterized by the melting temperature Tm, and the rates of spontaneous opening of individual nucleotide base pairs were determined by NMR spectroscopy. The data show that the relative thermal stability of duplexes containing a particular base pair in position 7 (Tm(F/T) < Tm(εA/T) < Tm(Hx/T) < Tm(A/T)) correlates with the rate of reversible spontaneous opening of the base pair. In contrast, the catalytic lesion excision rate is two orders of magnitude higher for Hx-containing substrates than for substrates containing εA, showing that catalytic activity is not correlated with the stability of the damaged base pair. Our study reveals that the formation of the catalytically competent enzyme-substrate complex is not the bottleneck controlling the catalytic activity of AAG.

  15. Role of temperature in the radiation stability of yttria stabilized zirconia under swift heavy ion irradiation: A study from the perspective of nuclear reactor applications

    NASA Astrophysics Data System (ADS)

    Kalita, Parswajit; Ghosh, Santanu; Sattonnay, Gaël; Singh, Udai B.; Grover, Vinita; Shukla, Rakesh; Amirthapandian, S.; Meena, Ramcharan; Tyagi, A. K.; Avasthi, Devesh K.

    2017-07-01

    The search for materials that can withstand the harsh radiation environments of the nuclear industry has become an urgent challenge in the face of ever-increasing demands for nuclear energy. To this end, polycrystalline yttria stabilized zirconia (YSZ) pellets were irradiated with 80 MeV Ag6+ ions to investigate their radiation tolerance against fission fragments. To better simulate a nuclear reactor environment, the irradiations were carried out at a typical nuclear reactor temperature (850 °C); for comparison, irradiations were also performed at room temperature. Grazing incidence X-ray diffraction and Raman spectroscopy measurements reveal degradation in crystallinity for the room temperature irradiated samples. However, no bulk structural amorphization was observed; instead, defect clusters were formed, as indicated by transmission electron microscopy and supported by thermal spike simulation results. A significant reduction of the irradiation induced defects/damage, i.e., an improvement in the radiation tolerance, was seen under irradiation at 850 °C. This is attributed to the high temperature of the environment slowing the rapid thermal quenching of the localized hot molten zones (which arise from the spike in the lattice temperature upon irradiation), thereby reducing the defects/damage produced. Our results present strong evidence for the applicability of YSZ as an inert matrix fuel in nuclear reactors, where competitive effects of radiation damage and dynamic thermal healing mechanisms may lead to a strong reduction in the damage production and thus sustain its physical integrity.

  16. Tripeptidyl peptidase II plays a role in the radiation response of selected primary cell types but not based on nuclear translocation and p53 stabilization.

    PubMed

    Firat, Elke; Tsurumi, Chizuko; Gaedicke, Simone; Huai, Jisen; Niedermann, Gabriele

    2009-04-15

    The giant cytosolic protease tripeptidyl peptidase II (TPPII) was recently proposed to play a role in the DNA damage response. Shown were nuclear translocation of TPPII after gamma-irradiation, lack of radiation-induced p53 stabilization in TPPII-siRNA-treated cells, and complete tumor regression in mice after gamma-irradiation when combined with TPPII-siRNA silencing or a protease inhibitor reported to inhibit TPPII. This suggested that TPPII could be a novel target for tumor radiosensitization and prompted us to study radiation responses using TPPII-knockout mice. Neither the sensitivity to total body irradiation nor the radiosensitivity of resting lymphoid cells, which both strongly depend on p53, was altered in the absence of TPPII. Functional integrity of p53 in TPPII-knockout cells is further shown by a proper G(1) arrest and by the accumulation of p53 and its transcriptional targets, p21, Bax, and Fas, on gamma-irradiation. Furthermore, we could not confirm radiation-induced nuclear translocation of TPPII. Nevertheless, after gamma-irradiation, we found slightly increased mitotic catastrophe of TPPII-deficient primary fibroblasts and increased apoptosis of TPPII-deficient activated CD8(+) T cells. The latter was accompanied by delayed resolution of the DNA double-strand break marker gammaH2AX. This could, however, be due to increased apoptotic DNA damage rather than reduced DNA damage repair. Our data do not confirm a role for TPPII in the DNA damage response based on nuclear TPPII translocation and p53 stabilization but nevertheless do show increased radiation-induced cell death of selected nontransformed cell types in the absence of the TPPII protease.

  17. Recent developments of the NESSUS probabilistic structural analysis computer program

    NASA Technical Reports Server (NTRS)

    Millwater, H.; Wu, Y.-T.; Torng, T.; Thacker, B.; Riha, D.; Leung, C. P.

    1992-01-01

    The NESSUS probabilistic structural analysis computer program combines state-of-the-art probabilistic algorithms with general purpose structural analysis methods to compute the probabilistic response and the reliability of engineering structures. Uncertainty in loading, material properties, geometry, boundary conditions and initial conditions can be simulated. The structural analysis methods include nonlinear finite element and boundary element methods. Several probabilistic algorithms are available such as the advanced mean value method and the adaptive importance sampling method. The scope of the code has recently been expanded to include probabilistic life and fatigue prediction of structures in terms of component and system reliability and risk analysis of structures considering cost of failure. The code is currently being extended to structural reliability considering progressive crack propagation. Several examples are presented to demonstrate the new capabilities.
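
    For orientation, the following is a minimal importance-sampling sketch for estimating a small failure probability, one of the algorithm families named above. It is illustrative only: the limit state and sampling density are invented, and the advanced mean value and adaptive importance sampling methods in NESSUS are considerably more sophisticated.

        import numpy as np

        rng = np.random.default_rng(1)
        n = 50_000

        def g(x1, x2):                 # invented limit state: failure if g < 0
            return 6.0 - x1 - x2

        # Sample around the assumed most-probable failure point, not the origin.
        mu_is = np.array([3.0, 3.0])
        x = rng.normal(mu_is, 1.0, size=(n, 2))

        # Weight = true standard-normal density / importance density.
        log_w = -0.5 * (x**2).sum(1) + 0.5 * ((x - mu_is)**2).sum(1)
        pf = np.mean((g(x[:, 0], x[:, 1]) < 0) * np.exp(log_w))
        print("estimated P_f =", pf)   # exact: Phi(-6/sqrt(2)) ~ 1.1e-5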

  18. Correlation of Risk Analysis Method Results with Numerical and Limit Equilibrium Methods in Overall Slope Stability Analysis of Southern Wall of Chadormalu Iron Open Pit Mine-Iran

    NASA Astrophysics Data System (ADS)

    Ahangari, Kaveh; Paji, Arman Gholinezhad; Behdani, Alireza Siami

    2013-06-01

    Slope stability analysis is one of the most important factors in designing open pit mines; an optimal slope design must balance both economy and safety. There are many different methods of slope stability analysis, including empirical, limit equilibrium, block theory, numerical, and probabilistic methods. In this study, three approaches, numerical, limit equilibrium, and probabilistic, were used to analyze the overall stability of the southern wall of the Chadormalu iron open pit mine. The software and methods used for the analytical investigation were the FLAC software for numerical analysis, the SLIDE software and circular failure charts for limit equilibrium analysis, and a qualitative fault tree and a semi-quantitative risk matrix for probabilistic analysis. All of the above methods indicated a circular failure in the metasomatite rock zone between the 1405 and 1525 m levels. The main factors contributing to failure in this range were heavy jointing and the presence of faults. The safety factors obtained from the numerical method, the circular chart method, and the SLIDE software are 1.16, 1.25, and 1.27, respectively. Given the instability and low safety factors in the metasomatite rock zone, measures such as reducing bench angle and bench height should be planned to stabilize this zone. The risk matrix method likewise identified this zone as a high-risk zone, which the numerical and limit equilibrium methods confirmed.
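
    As a generic illustration of what a probabilistic analysis adds to a single deterministic safety factor (a hedged sketch using the textbook infinite-slope model, not the FLAC/SLIDE/risk-matrix analyses of this study; all parameter values are invented):

        import numpy as np

        rng = np.random.default_rng(2)
        n = 200_000
        c   = rng.lognormal(np.log(20e3), 0.25, n)     # cohesion, Pa
        phi = np.radians(rng.normal(32.0, 3.0, n))     # friction angle
        gamma, z, beta = 26e3, 30.0, np.radians(30.0)  # unit weight, depth, slope

        # Infinite-slope factor of safety (dry case, no pore pressure).
        fs = (c + gamma * z * np.cos(beta)**2 * np.tan(phi)) \
             / (gamma * z * np.sin(beta) * np.cos(beta))
        print("mean FS:", fs.mean())          # a comfortable-looking ~1.1
        print("P(FS < 1):", np.mean(fs < 1))  # yet failure is not improbable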

  19. Maintaining Genome Stability: The Role of Helicases and Deaminases

    DTIC Science & Technology

    2008-07-01

    ...deaminases. The cell's response to different forms of damage is fundamental to its ability to repair itself when challenged by environmental or... Genomic DNA stores all the information for living organisms. The faithful duplication and maintenance of DNA are... maturation of the immune system by modifying enzymes called deaminases.

  20. Atomic Oxygen Effects on POSS Polyimides

    DTIC Science & Technology

    2011-07-25

    ...stability, insulation properties, IR transparency, low solar absorptance, resistance to UV damage, and excellent thermal properties [1]. Despite the desirable properties of Kapton, this polyimide and all organic polymeric materials... 8 × 10(21) atoms cm(-2). Free-standing films of MC-POSS polyimide were sewn to a Kapton blanket and exposed to a sweeping ram in LEO on MISSE-5.

  1. Epigenomic maintenance through dietary intervention can facilitate DNA repair process to slow down the progress of premature aging.

    PubMed

    Ghosh, Shampa; Sinha, Jitendra Kumar; Raghunath, Manchala

    2016-09-01

    DNA damage caused by various sources remains one of the most researched topics in the area of aging and neurodegeneration. Increased DNA damage causes premature aging. Aging is plastic and is characterised by the decline in the ability of a cell/organism to maintain genomic stability. Lifespan can be modulated by various interventions like calorie restriction, a balanced diet of macro and micronutrients, or supplementation with nutrients/nutrient formulations such as Amalaki rasayana, docosahexaenoic acid, resveratrol, curcumin, etc. Increased levels of DNA damage in the form of double-stranded and single-stranded breaks are associated with decreased longevity in animal models like WNIN/Ob obese rats. Erroneous DNA repair can result in accumulation of DNA damage products, which in turn result in premature aging disorders such as Hutchinson-Gilford progeria syndrome. Epigenomic studies of the aging process have opened a completely new arena for research and development of drugs and therapeutic agents. We propose here that agents or interventions that can maintain epigenomic stability and facilitate the DNA repair process can slow down the progress of premature aging, if not completely prevent it.

  2. NASA space materials research

    NASA Technical Reports Server (NTRS)

    Tenney, D. R.; Tompkins, S. S.; Sykes, G. F.

    1985-01-01

    The effects of the space environment on: (1) thermal control coatings and thin polymer films; (2) the radiation stability of 250 F and 350 F cured graphite/epoxy composites; and (3) the thermal mechanical stability of graphite/epoxy and graphite/glass composites are considered. Degradation in mechanical properties due to combined radiation and thermal cycling is highlighted. Damage mechanisms are presented and chemistry modifications to improve stability are suggested. The dimensional instabilities in graphite/epoxy composites associated with microcracking during thermal cycling are examined, as is the thermal strain hysteresis found in metal-matrix composites.

  3. Advanced Booster Liquid Engine Combustion Stability

    NASA Technical Reports Server (NTRS)

    Tucker, Kevin; Gentz, Steve; Nettles, Mindy

    2015-01-01

    Combustion instability is a phenomenon in liquid rocket engines caused by complex coupling between the time-varying combustion processes and the fluid dynamics in the combustor. The large pressure oscillations associated with combustion instability often cause significant hardware damage and can be catastrophic. The current combustion stability assessment tools are limited by the level of empiricism in many inputs and embedded models. This limited predictive capability creates significant uncertainty in stability assessments, which in turn increases hardware development costs due to heavy reliance on expensive and time-consuming testing.

  4. A flexible and qualitatively stable model for cell cycle dynamics including DNA damage effects.

    PubMed

    Jeffries, Clark D; Johnson, Charles R; Zhou, Tong; Simpson, Dennis A; Kaufmann, William K

    2012-01-01

    This paper includes a conceptual framework for cell cycle modeling into which the experimenter can map observed data and evaluate mechanisms of cell cycle control. The basic model exhibits qualitative stability, meaning that regardless of the magnitudes of the system parameters its instances are guaranteed to be stable, in the sense that all feasible trajectories converge to a certain trajectory. Qualitative stability can also be described in terms of the signs of the real parts of the eigenvalues of the system matrix. On the biological side, the resulting model can be tuned to approximate experimental data pertaining to human fibroblast cell lines treated with ionizing radiation, with or without disabled DNA damage checkpoints. Together these properties validate a fundamental, first-order systems view of cell dynamics. Classification Codes: 15A68.
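
    A small numerical illustration of qualitative (sign) stability: in the sketch below (our own toy example, not the paper's cell cycle model), a tridiagonal sign pattern with negative diagonal and opposite-signed couplings keeps every eigenvalue's real part negative no matter what positive magnitudes are drawn.

        import numpy as np

        rng = np.random.default_rng(3)
        signs = np.array([[-1, -1,  0],
                          [ 1, -1, -1],
                          [ 0,  1, -1]])   # a qualitatively stable pattern

        worst = -np.inf
        for _ in range(1000):
            A = signs * rng.uniform(0.1, 10.0, size=(3, 3))
            worst = max(worst, np.linalg.eigvals(A).real.max())
        print("largest eigenvalue real part over 1000 draws:", worst)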

  5. Learning Sparse Feature Representations using Probabilistic Quadtrees and Deep Belief Nets

    DTIC Science & Technology

    2015-04-24

    Learning sparse feature representations is a useful instrument for solving an... novel framework for the classification of handwritten digits that learns sparse representations using probabilistic quadtrees and Deep Belief Nets...

  6. A Coupled Thermal–Hydrological–Mechanical Damage Model and Its Numerical Simulations of Damage Evolution in APSE

    PubMed Central

    Wei, Chenhui; Zhu, Wancheng; Chen, Shikuo; Ranjith, Pathegama Gamage

    2016-01-01

    This paper proposes a coupled thermal–hydrological–mechanical damage (THMD) model for the failure process of rock, in which coupling effects such as thermally induced rock deformation, water flow-induced thermal convection, and rock deformation-induced water flow are considered. The damage is considered to be the key factor that controls the THM coupling process, and the heterogeneity of rock is characterized by the Weibull distribution. Next, numerical simulations of excavation-induced damage zones in the Äspö pillar stability experiment (APSE) are carried out and the impact of in situ stress conditions on damage zone distribution is analysed. Then, further numerical simulations of damage evolution at the heating stage in APSE are carried out, and the impacts of in situ stress state, swelling pressure, and water pressure on damage evolution at the heating stage are simulated and analysed. The simulation results indicate that (1) the v-shaped notch at the sidewall of the pillar is predominantly controlled by the in situ stress trends and magnitude; (2) at the heating stage, the existence of confining pressure can suppress the occurrence of damage, including shear damage and tensile damage; and (3) the presence of water flow and water pressure can promote the occurrence of damage, especially shear damage. PMID:28774001
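
    The Weibull heterogeneity assumption can be made concrete in a few lines (a hedged sketch; the parameters are illustrative and not taken from the paper): element strengths are sampled from a Weibull distribution, so an applied stress damages the weakest elements first.

        import numpy as np

        rng = np.random.default_rng(4)
        m, s0 = 3.0, 60e6    # assumed Weibull modulus and scale (Pa)
        strength = s0 * rng.weibull(m, size=10_000)  # element strength field

        stress = 30e6                      # applied local stress, Pa
        print(np.mean(strength < stress))  # damaged fraction of elements
        # Theory: 1 - exp(-(30/60)**3) = 0.1175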

  7. Thermal stability of electron-irradiated poly(tetrafluoroethylene) - X-ray photoelectron and mass spectroscopic study

    NASA Technical Reports Server (NTRS)

    Wheeler, Donald R.; Pepper, Stephen V.

    1990-01-01

    Polytetrafluoroethylene (PTFE) was subjected to 3 keV electron bombardment and then heated in vacuum to 300 C. The behavior of the material as a function of radiation dose and temperature was studied by X-ray photoelectron spectroscopy (XPS) of the surface and mass spectroscopy of the species evolved. Lightly damaged material heated to 300 C evolved saturated fluorocarbon species, whereas unsaturated fluorocarbon species were evolved from heavily damaged material. After heating the heavily damaged material, those features in the XPS spectrum that were associated with damage diminished, giving the appearance that the radiation damage had annealed. The observations were interpreted by incorporating mass transport of severed chain fragments and thermal decomposition of severely damaged material into the branched and cross-linked network model of irradiated PTFE. The apparent annealing of the radiation damage was due to covering of the network by saturated fragments that easily diffused through the decomposed material to the surface region upon heating.

  8. X-ray photoelectron and mass spectroscopic study of electron irradiation and thermal stability of polytetrafluoroethylene

    NASA Technical Reports Server (NTRS)

    Wheeler, Donald R.; Pepper, Stephen V.

    1990-01-01

    Polytetrafluoroethylene (PTFE) was subjected to 3 keV electron bombardment and then heated in vacuum to 300 C. The behavior of the material as a function of radiation dose and temperature was studied by X-ray photoelectron spectroscopy (XPS) of the surface and mass spectroscopy of the species evolved. A quantitative comparison of the radiation dose rate with that in other reported studies showed that, for a given total dose, the damage observed by XPS is greater for higher dose rates. Lightly damaged material heated to 300 C evolved saturated fluorocarbon species, whereas unsaturated fluorocarbon species evolved from heavily damaged material. After heating the heavily damaged material, those features in the XPS that were associated with damage diminished, giving the appearance that the radiation damage annealed. The apparent annealing of the radiation damage was found to be due to the covering of the network by saturated fragments that easily diffused through the decomposed material to the surface region upon heating.

  9. Transcription and DNA Damage: Holding Hands or Crossing Swords?

    PubMed

    D'Alessandro, Giuseppina; d'Adda di Fagagna, Fabrizio

    2017-10-27

    Transcription has classically been considered a potential threat to genome integrity. Collision between transcription and DNA replication machinery, and retention of DNA:RNA hybrids, may result in genome instability. On the other hand, it has been proposed that active genes repair faster and preferentially via homologous recombination. Moreover, while canonical transcription is inhibited in the proximity of DNA double-strand breaks, a growing body of evidence supports active non-canonical transcription at DNA damage sites. Small non-coding RNAs accumulate at DNA double-strand break sites in mammals and other organisms, and are involved in DNA damage signaling and repair. Furthermore, RNA binding proteins are recruited to DNA damage sites and participate in the DNA damage response. Here, we discuss the impact of transcription on genome stability, the role of RNA binding proteins at DNA damage sites, and the function of small non-coding RNAs generated upon damage in the signaling and repair of DNA lesions.

  10. Stochastic Characterization of Communication Network Latency for Wide Area Grid Control Applications.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ameme, Dan Selorm Kwami; Guttromson, Ross

    This report characterizes communications network latency under various network topologies and qualities of service (QoS). The characterizations are probabilistic in nature, allowing deeper analysis of stability for Internet Protocol (IP) based feedback control systems used in grid applications. The work involves the use of Raspberry Pi computers as a proxy for a controlled resource, and an ns-3 network simulator on a Linux server to create an experimental platform (testbed) that can be used to model wide-area grid control network communications in smart grid. Modbus protocol is used for information transport, and Routing Information Protocol is used for dynamic route selection within the simulated network.

  11. Predicted reliability of aerospace electronics: Application of two advanced probabilistic concepts

    NASA Astrophysics Data System (ADS)

    Suhir, E.

    Two advanced probabilistic design-for-reliability (PDfR) concepts are addressed and discussed in application to the prediction, quantification and assurance of aerospace electronics reliability: 1) the Boltzmann-Arrhenius-Zhurkov (BAZ) model, which is an extension of the currently widely used Arrhenius model and, in combination with the exponential law of reliability, enables one to obtain a simple, easy-to-use and physically meaningful formula for the evaluation of the probability of failure (PoF) of a material or a device after the given time in operation at the given temperature and under the given stress (not necessarily mechanical); and 2) the Extreme Value Distribution (EVD) technique, which can be used to assess the number of repetitive loadings that result in material/device degradation and eventually lead to failure by closing, in a step-wise fashion, the gap between the bearing capacity (stress-free activation energy) of the material or the device and the demand (loading). It is shown that material degradation (aging, damage accumulation, flaw propagation, etc.) can be viewed, when the BAZ model is considered, as a Markovian process, and that the BAZ model can be obtained as the ultimate steady-state solution to the well-known Fokker-Planck equation in the theory of Markovian processes. It is shown also that the BAZ model addresses the worst, but reasonably conservative, situation. It is suggested therefore that the transient period preceding the condition addressed by the steady-state BAZ model need not be accounted for in engineering evaluations. However, when there is an interest in understanding the transient degradation process, the obtained solution to the Fokker-Planck equation can be used for this purpose. As to the EVD concept, it attributes the degradation process to the accumulation of damage caused by a train of repetitive high-level loadings, while loadings of levels that are considerably lower than their extreme values do not contribute appreciably to the finite lifetime of a material or a device. In our probabilistic risk management (PRM) based analysis we treat the stress-free activation energy (capacity) as a normally distributed random variable, and choose, for the sake of simplicity, the (single-parametric) Rayleigh law as the basic distribution underlying the EVD. The general concepts addressed and discussed are illustrated by numerical examples. It is concluded that the application of the PDfR approach, and particularly the above two advanced models, should be considered a natural, physically meaningful, informative, comprehensive, and insightful technique that reflects well the physics underlying the degradation processes in materials, devices and systems. It is the author's belief that they will be widely used in engineering practice, when high reliability is imperative, and the ability to quantify it is highly desirable.
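
    For orientation, a commonly cited form of the BAZ model (a hedged reconstruction in our notation, not necessarily the paper's exact formula) writes the mean time to failure and, via the exponential law of reliability, the probability of failure as

        \tau = \tau_0 \exp\!\left(\frac{U_0 - \gamma\sigma}{kT}\right),
        \qquad
        P_f(t) = 1 - \exp\!\left(-\frac{t}{\tau}\right),

    where \tau_0 is a time constant, U_0 the stress-free activation energy (the "capacity"), \sigma the applied stress of whatever relevant nature, \gamma the stress sensitivity factor, k Boltzmann's constant, and T the absolute temperature.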

  12. Scytonemin Plays a Potential Role in Stabilizing the Exopolysaccharidic Matrix in Terrestrial Cyanobacteria.

    PubMed

    Gao, Xiang

    2017-02-01

    Cyanobacteria are photosynthetic oxygen-evolving prokaryotes that are distributed in diverse habitats. They synthesize the ultraviolet (UV)-screening pigments scytonemin (SCY) and mycosporine-like amino acids (MAAs), located in the exopolysaccharide (EPS) matrix. Multiple roles for both pigments have gradually been recognized, such as sunscreen ability, antioxidant activity, and heat dissipation from absorbed UV radiation. In this study, the filamentous terrestrial cyanobacterium Nostoc flagelliforme was used to evaluate the potential stabilizing role of SCY on the EPS matrix. SCY (∼3.7 %) was partially removed from N. flagelliforme filaments by rinsing with 100 % acetone for 5 s. The physiological damage to cells resulting from this treatment, in terms of the photosystem II activity parameter Fv/Fm, was repaired after culturing the sample for 40 h. The physiologically recovered sample was further desiccated by natural or rapid drying and then allowed to recover for 24 h. Compared with the normal sample, a relatively slower Fv/Fm recovery was observed in the SCY-depleted sample, suggesting that the decreased SCY concentration in the EPS matrix caused cells to suffer further damage upon desiccation. In addition, the SCY-depleted sample allowed the release of MAAs (∼25 %) from the EPS matrix, while the normal sample did not. Therefore, the damage caused by drying of the former resulted at least in part from the reduced structural stability of the EPS matrix as well as from the loss of part of the antioxidant compounds. Considering that an approximately 4 % loss of SCY led to this significant effect, the stabilizing potential of SCY on the EPS matrix is crucial for terrestrial cyanobacteria survival in complex environments.

  13. Hard alpha-keratin degradation inside a tissue under high flux X-ray synchrotron micro-beam: a multi-scale time-resolved study.

    PubMed

    Leccia, Emilie; Gourrier, Aurélien; Doucet, Jean; Briki, Fatma

    2010-04-01

    X-rays interact strongly with biological organisms. Synchrotron radiation sources deliver very intense X-ray photon fluxes within micro- or submicro cross-section beams, resulting in doses in excess of a MGy. The relevance of synchrotron radiation analyses of biological materials is therefore questionable, since such doses, a million times higher than the ones used in radiotherapy, can cause severe damage in tissues, affecting not only DNA but also protein and lipid organization. Very few data concerning the effect of very high X-ray doses in tissues are available in the literature. We present here an analysis of the structural phenomena which occur when the model tissue of human hair is irradiated by a synchrotron X-ray micro-beam. The choice of hair is supported by its hierarchical and partially ordered keratin structure, which can be analysed inside the tissue by X-ray diffraction. To assess the damage caused by hard X-ray micro-beams (1 μm(2) cross-section), short-exposure SAXS/WAXS scattering patterns have been recorded at beamline ID13 (ESRF) after various irradiation times. Various modifications of the scattering patterns are observed; they provide fine insight into the radiation damage at the various hierarchical levels and also, unexpectedly, provide information about the stability of the various hierarchical structural levels. It appears that the molecular level, i.e. the alpha helices, which are stabilized by hydrogen bonds, and the alpha-helical coiled coils, which are stabilized by hydrophobic interactions, is more sensitive to radiation than the supramolecular architecture of the keratin filament and the filament packing within the keratin associated proteins matrix, which is stabilized by disulphide bonds.

  14. NEK8 regulates DNA damage-induced RAD51 foci formation and replication fork protection

    PubMed Central

    Abeyta, Antonio; Castella, Maria; Jacquemont, Celine; Taniguchi, Toshiyasu

    2017-01-01

    Proteins essential for homologous recombination play a pivotal role in the repair of DNA double strand breaks, DNA inter-strand crosslinks and replication fork stability. Defects in homologous recombination also play a critical role in the development of cancer and the sensitivity of these cancers to chemotherapy. RAD51, an essential factor for homologous recombination and replication fork protection, accumulates and forms immunocytochemically detectable nuclear foci at sites of DNA damage. To identify kinases that may regulate RAD51 localization to sites of DNA damage, we performed a human kinome siRNA library screen, using DNA damage-induced RAD51 foci formation as readout. We found that NEK8, a NIMA family kinase member, is required for efficient DNA damage-induced RAD51 foci formation. Interestingly, knockout of Nek8 in murine embryonic fibroblasts led to cellular sensitivity to the replication inhibitor, hydroxyurea, and inhibition of the ATR kinase. Furthermore, NEK8 was required for proper replication fork protection following replication stall with hydroxyurea. Loading of RAD51 to chromatin was decreased in NEK8-depleted cells and Nek8-knockout cells. Single-molecule DNA fiber analyses revealed that nascent DNA tracts were degraded in the absence of NEK8 following treatment with hydroxyurea. Consistent with this, Nek8-knockout cells showed increased chromosome breaks following treatment with hydroxyurea. Thus, NEK8 plays a critical role in replication fork stability through its regulation of the DNA repair and replication fork protection protein RAD51. PMID:27892797

  15. Msc1 acts through histone H2A.Z to promote chromosome stability in Schizosaccharomyces pombe.

    PubMed

    Ahmed, Shakil; Dul, Barbara; Qiu, Xinxing; Walworth, Nancy C

    2007-11-01

    As a central component of the DNA damage checkpoint pathway, the conserved protein kinase Chk1 mediates cell cycle progression when DNA damage is generated. Msc1 was identified as a multicopy suppressor capable of facilitating survival in response to DNA damage of cells mutant for chk1. We demonstrate that loss of msc1 function results in an increased rate of chromosome loss and that an msc1 null allele exhibits genetic interactions with mutants in key kinetochore components. Multicopy expression of msc1 robustly suppresses a temperature-sensitive mutant (cnp1-1) in the centromere-specific histone H3 variant CENP-A, and localization of CENP-A to the centromere is compromised in msc1 null cells. We present several lines of evidence to suggest that Msc1 carries out its function through the histone H2A variant H2A.Z, encoded by pht1 in fission yeast. Like an msc1 mutant, a pht1 mutant also exhibits chromosome instability and genetic interactions with kinetochore mutants. Suppression of cnp1-1 by multicopy msc1 requires pht1. Likewise, suppression of the DNA damage sensitivity of a chk1 mutant by multicopy msc1 also requires pht1. We present the first genetic evidence that histone H2A.Z may participate in centromere function in fission yeast and propose that Msc1 acts through H2A.Z to promote chromosome stability and cell survival following DNA damage.

  16. Msc1 Acts Through Histone H2A.Z to Promote Chromosome Stability in Schizosaccharomyces pombe

    PubMed Central

    Ahmed, Shakil; Dul, Barbara; Qiu, Xinxing; Walworth, Nancy C.

    2007-01-01

    As a central component of the DNA damage checkpoint pathway, the conserved protein kinase Chk1 mediates cell cycle progression when DNA damage is generated. Msc1 was identified as a multicopy suppressor capable of facilitating survival in response to DNA damage of cells mutant for chk1. We demonstrate that loss of msc1 function results in an increased rate of chromosome loss and that an msc1 null allele exhibits genetic interactions with mutants in key kinetochore components. Multicopy expression of msc1 robustly suppresses a temperature-sensitive mutant (cnp1-1) in the centromere-specific histone H3 variant CENP-A, and localization of CENP-A to the centromere is compromised in msc1 null cells. We present several lines of evidence to suggest that Msc1 carries out its function through the histone H2A variant H2A.Z, encoded by pht1 in fission yeast. Like an msc1 mutant, a pht1 mutant also exhibits chromosome instability and genetic interactions with kinetochore mutants. Suppression of cnp1-1 by multicopy msc1 requires pht1. Likewise, suppression of the DNA damage sensitivity of a chk1 mutant by multicopy msc1 also requires pht1. We present the first genetic evidence that histone H2A.Z may participate in centromere function in fission yeast and propose that Msc1 acts through H2A.Z to promote chromosome stability and cell survival following DNA damage. PMID:17947424

  17. NEK8 regulates DNA damage-induced RAD51 foci formation and replication fork protection.

    PubMed

    Abeyta, Antonio; Castella, Maria; Jacquemont, Celine; Taniguchi, Toshiyasu

    2017-02-16

    Proteins essential for homologous recombination play a pivotal role in the repair of DNA double strand breaks, DNA inter-strand crosslinks and replication fork stability. Defects in homologous recombination also play a critical role in the development of cancer and the sensitivity of these cancers to chemotherapy. RAD51, an essential factor for homologous recombination and replication fork protection, accumulates and forms immunocytochemically detectable nuclear foci at sites of DNA damage. To identify kinases that may regulate RAD51 localization to sites of DNA damage, we performed a human kinome siRNA library screen, using DNA damage-induced RAD51 foci formation as readout. We found that NEK8, a NIMA family kinase member, is required for efficient DNA damage-induced RAD51 foci formation. Interestingly, knockout of Nek8 in murine embryonic fibroblasts led to cellular sensitivity to the replication inhibitor, hydroxyurea, and inhibition of the ATR kinase. Furthermore, NEK8 was required for proper replication fork protection following replication stall with hydroxyurea. Loading of RAD51 to chromatin was decreased in NEK8-depleted cells and Nek8-knockout cells. Single-molecule DNA fiber analyses revealed that nascent DNA tracts were degraded in the absence of NEK8 following treatment with hydroxyurea. Consistent with this, Nek8-knockout cells showed increased chromosome breaks following treatment with hydroxyurea. Thus, NEK8 plays a critical role in replication fork stability through its regulation of the DNA repair and replication fork protection protein RAD51.

  18. Probabilistic Structural Analysis Methods (PSAM) for Select Space Propulsion System Components

    NASA Technical Reports Server (NTRS)

    1999-01-01

    Probabilistic Structural Analysis Methods (PSAM) are described for the probabilistic structural analysis of engine components for current and future space propulsion systems. Components for these systems are subjected to stochastic thermomechanical launch loads. Uncertainties or randomness also occur in material properties, structural geometry, and boundary conditions. Material property stochasticity, such as in modulus of elasticity or yield strength, exists in every structure and is a consequence of variations in material composition and manufacturing processes. Procedures are outlined for computing the probabilistic structural response or reliability of the structural components. The response variables include static or dynamic deflections, strains, and stresses at one or several locations, natural frequencies, fatigue or creep life, etc. Sample cases illustrate how the PSAM methods and codes simulate input uncertainties and compute probabilistic response or reliability using a finite element model with probabilistic methods.

  19. Processing of probabilistic information in weight perception and motor prediction.

    PubMed

    Trampenau, Leif; van Eimeren, Thilo; Kuhtz-Buschbeck, Johann

    2017-02-01

    We studied the effects of probabilistic cues, i.e., of information of limited certainty, in the context of an action task (GL: grip-lift) and of a perceptual task (WP: weight perception). Normal subjects (n = 22) saw four different probabilistic visual cues, each of which announced the likely weight of an object. In the GL task, the object was grasped and lifted with a pinch grip, and the peak force rates indicated that the grip and load forces were scaled predictively according to the probabilistic information. The WP task provided the expected heaviness related to each probabilistic cue; the participants gradually adjusted the object's weight until its heaviness matched the expected weight for a given cue. Subjects were randomly assigned to two groups: one started with the GL task and the other one with the WP task. The four different probabilistic cues influenced weight adjustments in the WP task and peak force rates in the GL task in a similar manner. The interpretation and utilization of the probabilistic information were critically influenced by the initial task. Participants who started with the WP task classified the four probabilistic cues into four distinct categories and applied these categories to the subsequent GL task. On the other hand, participants who started with the GL task applied three distinct categories to the four cues and retained this classification in the following WP task. The initial strategy, once established, determined how the probabilistic information was interpreted and implemented.

  20. Relative risk of probabilistic category learning deficits in patients with schizophrenia and their siblings

    PubMed Central

    Weickert, Thomas W.; Goldberg, Terry E.; Egan, Michael F.; Apud, Jose A.; Meeter, Martijn; Myers, Catherine E.; Gluck, Mark A; Weinberger, Daniel R.

    2010-01-01

    Background While patients with schizophrenia display an overall probabilistic category learning performance deficit, the extent to which this deficit occurs in unaffected siblings of patients with schizophrenia is unknown. There are also discrepant findings regarding probabilistic category learning acquisition rate and performance in patients with schizophrenia. Methods A probabilistic category learning test was administered to 108 patients with schizophrenia, 82 unaffected siblings, and 121 healthy participants. Results Patients with schizophrenia displayed significant differences from their unaffected siblings and healthy participants with respect to probabilistic category learning acquisition rates. Although siblings on the whole failed to differ from healthy participants on strategy and quantitative indices of overall performance and learning acquisition, application of a revised learning criterion enabling classification into good and poor learners based on individual learning curves revealed significant differences between percentages of sibling and healthy poor learners: healthy (13.2%), siblings (34.1%), patients (48.1%), yielding a moderate relative risk. Conclusions These results clarify previous discrepant findings pertaining to probabilistic category learning acquisition rate in schizophrenia and provide the first evidence for the relative risk of probabilistic category learning abnormalities in unaffected siblings of patients with schizophrenia, supporting genetic underpinnings of probabilistic category learning deficits in schizophrenia. These findings also raise questions regarding the contribution of antipsychotic medication to the probabilistic category learning deficit in schizophrenia. The distinction between good and poor learning may be used to inform genetic studies designed to detect schizophrenia risk alleles. PMID:20172502

  1. Probabilistic finite elements for fracture mechanics

    NASA Technical Reports Server (NTRS)

    Besterfield, Glen

    1988-01-01

    The probabilistic finite element method (PFEM) is developed for probabilistic fracture mechanics (PFM). A finite element which has the near crack-tip singular strain embedded in the element is used. Probabilistic characteristics, such as the expectation, covariance, and correlation of the stress intensity factors, are calculated for random load, random material, and random crack length. The method is computationally quite efficient and can be used to determine the probability of fracture or reliability.
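
    A hedged worked example of the kind of propagation involved (not Besterfield's PFEM, which embeds the crack-tip singularity in the element): Monte Carlo propagation of random load and crack length through the closed-form stress intensity factor K = Y·σ·sqrt(π·a), with all numbers invented.

        import numpy as np

        rng = np.random.default_rng(5)
        n = 200_000
        Y = 1.12                                      # edge-crack geometry factor
        sigma = rng.normal(120e6, 12e6, n)            # Pa, random remote stress
        a     = rng.lognormal(np.log(10e-3), 0.2, n)  # m, random crack length

        K = Y * sigma * np.sqrt(np.pi * a)            # stress intensity, Pa*m^0.5
        K_Ic = 30e6                                   # assumed toughness, Pa*m^0.5
        print("mean K:", K.mean())                    # expectation of K
        print("P(K > K_Ic):", np.mean(K > K_Ic))      # probability of fracture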

  2. Probabilistic Structural Analysis Methods (PSAM) for select space propulsion systems components

    NASA Technical Reports Server (NTRS)

    1991-01-01

    Summarized here is the technical effort and computer code developed during the five-year duration of the program for probabilistic structural analysis methods. The summary includes a brief description of the computer code manuals and a detailed description of code validation demonstration cases for random vibrations of a discharge duct, probabilistic material nonlinearities of a liquid oxygen post, and probabilistic buckling of a transfer tube liner.

  3. A Hough Transform Global Probabilistic Approach to Multiple-Subject Diffusion MRI Tractography

    DTIC Science & Technology

    2010-04-01

    A global probabilistic fiber tracking approach based on the voting process provided by the Hough transform is introduced in... criteria for aligning curves and particularly tracts. In this work, we present a global probabilistic approach inspired by the voting procedure provided...

  4. Probabilistic Structural Analysis Program

    NASA Technical Reports Server (NTRS)

    Pai, Shantaram S.; Chamis, Christos C.; Murthy, Pappu L. N.; Stefko, George L.; Riha, David S.; Thacker, Ben H.; Nagpal, Vinod K.; Mital, Subodh K.

    2010-01-01

    NASA/NESSUS 6.2c is a general-purpose, probabilistic analysis program that computes probability of failure and probabilistic sensitivity measures of engineered systems. Because NASA/NESSUS uses highly computationally efficient and accurate analysis techniques, probabilistic solutions can be obtained even for extremely large and complex models. Once the probabilistic response is quantified, the results can be used to support risk-informed decisions regarding reliability for safety-critical and one-of-a-kind systems, as well as for maintaining a level of quality while reducing manufacturing costs for larger-quantity products. NASA/NESSUS has been successfully applied to a diverse range of problems in aerospace, gas turbine engines, biomechanics, pipelines, defense, weaponry, and infrastructure. This program combines state-of-the-art probabilistic algorithms with general-purpose structural analysis and lifing methods to compute the probabilistic response and reliability of engineered structures. Uncertainties in load, material properties, geometry, boundary conditions, and initial conditions can be simulated. The structural analysis methods include non-linear finite-element methods, heat-transfer analysis, polymer/ceramic matrix composite analysis, monolithic (conventional metallic) materials life-prediction methodologies, boundary element methods, and user-written subroutines. Several probabilistic algorithms are available such as the advanced mean value method and the adaptive importance sampling method. NASA/NESSUS 6.2c is structured in a modular format with 15 elements.

  5. Probabilistic Structural Analysis Methods (PSAM) for select space propulsion system structural components

    NASA Technical Reports Server (NTRS)

    Cruse, T. A.

    1987-01-01

    The objective is the development of several modular structural analysis packages capable of predicting the probabilistic response distribution for key structural variables such as maximum stress, natural frequencies, transient response, etc. The structural analysis packages are to include stochastic modeling of loads, material properties, geometry (tolerances), and boundary conditions. The solution is to be in terms of the cumulative probability of exceedance distribution (CDF) and confidence bounds. Two methods of probability modeling are to be included as well as three types of structural models - probabilistic finite-element method (PFEM); probabilistic approximate analysis methods (PAAM); and probabilistic boundary element methods (PBEM). The purpose in doing probabilistic structural analysis is to provide the designer with a more realistic ability to assess the importance of uncertainty in the response of a high performance structure. Probabilistic Structural Analysis Method (PSAM) tools will estimate structural safety and reliability, while providing the engineer with information on the confidence that should be given to the predicted behavior. Perhaps most critically, the PSAM results will directly provide information on the sensitivity of the design response to those variables which are seen to be uncertain.

  6. Probabilistic Structural Analysis Methods for select space propulsion system structural components (PSAM)

    NASA Technical Reports Server (NTRS)

    Cruse, T. A.; Burnside, O. H.; Wu, Y.-T.; Polch, E. Z.; Dias, J. B.

    1988-01-01

    The objective is the development of several modular structural analysis packages capable of predicting the probabilistic response distribution for key structural variables such as maximum stress, natural frequencies, transient response, etc. The structural analysis packages are to include stochastic modeling of loads, material properties, geometry (tolerances), and boundary conditions. The solution is to be in terms of the cumulative probability of exceedance distribution (CDF) and confidence bounds. Two methods of probability modeling are to be included as well as three types of structural models - probabilistic finite-element method (PFEM); probabilistic approximate analysis methods (PAAM); and probabilistic boundary element methods (PBEM). The purpose in doing probabilistic structural analysis is to provide the designer with a more realistic ability to assess the importance of uncertainty in the response of a high performance structure. Probabilistic Structural Analysis Method (PSAM) tools will estimate structural safety and reliability, while providing the engineer with information on the confidence that should be given to the predicted behavior. Perhaps most critically, the PSAM results will directly provide information on the sensitivity of the design response to those variables which are seen to be uncertain.

  7. Novel (188)Re multi-functional bone-seeking compounds: Synthesis, biological and radiotoxic effects in metastatic breast cancer cells.

    PubMed

    Fernandes, Célia; Monteiro, Sofia; Belchior, Ana; Marques, Fernanda; Gano, Lurdes; Correia, João D G; Santos, Isabel

    2016-02-01

    Radiolabeled bisphosphonates (BPs) have been used for bone imaging and delivery of β(-) emitting radionuclides for bone pain palliation. As a β(-) emitter, (188)Re has been considered particularly promising for bone metastases therapy. Aimed at finding innovative bone-seeking agents for systemic radiotherapy of bone metastases, we describe herein novel organometallic compounds of the type fac-[(188)Re(CO)3(k(3)-L)] (L = BP-containing chelator), their in vitro and in vivo stability, and the cellular damage they induce in MDAMB231 cells, a metastatic breast cancer cell line. After synthesis and characterization of the novel organometallic compounds of the type fac-[(188)Re(CO)3(k(3)-L)], their radiochemical purity and in vitro stability were assessed by HPLC. In vivo stability and the pharmacokinetic profile were evaluated in mice, and the radiocytotoxic activity and DNA damage were assessed by the MTT assay and by the cytokinesis-block micronucleus (CBMN) assay, respectively. Among all complexes, (188)Re3 was obtained with high radiochemical purity (>95%) and high specific activity and presented high in vitro and in vivo stability. Biodistribution studies of (188)Re3 in Balb/c mice showed fast blood clearance, high bone uptake (16.1 ± 3.3% IA/g organ, 1 h p.i.) and high bone-to-blood and bone-to-muscle radioactivity ratios, indicating that it is able to deliver radiation to bone in a very selective way. The radiocytotoxic effect elicited by (188)Re3 in the MDAMB231 cells was dependent on its concentration and was higher than that induced by identical concentrations of [(188)ReO4](-). Additionally, (188)Re3 elicited morphological changes in the cells and induced DNA damage, as indicated by the increased number of micronuclei (MN) observed. Altogether, our results demonstrate that (188)Re3 could be considered an attractive candidate for further preclinical evaluation for systemic radionuclide therapy of bone metastases, considering its ability to deliver radiation to bone in a very selective way and to induce radiation damage.

  8. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sattison, M.B.; Thatcher, T.A.; Knudsen, J.K.

    The US Nuclear Regulatory Commission (NRC) has been using full-power, Level 1, limited-scope risk models for the Accident Sequence Precursor (ASP) program for over fifteen years. These models have evolved and matured over the years, as have probabilistic risk assessment (PRA) and computer technologies. Significant upgrading activities have been undertaken over the past three years, with involvement from the Offices of Nuclear Reactor Regulation (NRR), Analysis and Evaluation of Operational Data (AEOD), and Nuclear Regulatory Research (RES), and several national laboratories. Part of these activities was an RES-sponsored feasibility study investigating the ability to extend the ASP models to include contributors to core damage from events initiated with the reactor at low power or shutdown (LP/SD), both internal events and external events. This paper presents only the LP/SD internal event modeling efforts.

  9. Radiation-hardened optically reconfigurable gate array exploiting holographic memory characteristics

    NASA Astrophysics Data System (ADS)

    Seto, Daisaku; Watanabe, Minoru

    2015-09-01

    In this paper, we present a proposal for a radiation-hardened optically reconfigurable gate array (ORGA). The ORGA is a type of field programmable gate array (FPGA). By exploiting holographic memory characteristics, the ORGA configuration can be executed correctly even if 20% of the configuration data are damaged. Moreover, the optoelectronic technology enables high-speed reconfiguration of the programmable gate array. Such high-speed reconfiguration can increase the radiation tolerance of the programmable gate array to 9.3 × 10(4) times that of current FPGAs. Through experimentation, this study clarified the configuration dependability of the ORGA under impulse-noise emulation and its high-speed configuration capability with corrupt configuration contexts. Moreover, the radiation tolerance of the programmable gate array was confirmed theoretically through probabilistic calculation.
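
    A back-of-envelope sketch of the kind of dependability calculation involved (our own simplified model, not the paper's: it assumes N independent per-bit upsets and correct configuration whenever at most 20% of the context is damaged):

        from math import comb

        def config_failure_prob(n_bits, p_upset, tolerate_frac=0.20):
            """P(configuration fails) = P(more than tolerate_frac of the
            bits are upset), a binomial tail under independence."""
            k_max = int(tolerate_frac * n_bits)
            ok = sum(comb(n_bits, k) * p_upset**k * (1 - p_upset)**(n_bits - k)
                     for k in range(k_max + 1))
            return max(0.0, 1.0 - ok)

        print(config_failure_prob(1000, 0.05))  # far below threshold: ~0
        print(config_failure_prob(1000, 0.19))  # near threshold: ~0.2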

  10. Risks from Solar Particle Events for Long Duration Space Missions Outside Low Earth Orbit

    NASA Technical Reports Server (NTRS)

    Over, S.; Myers, J.; Ford, J.

    2016-01-01

    The Integrated Medical Model (IMM) simulates the medical occurrences and mission outcomes for various mission profiles using probabilistic risk assessment techniques. As part of the work with the IMM, this project focuses on radiation risks from acute events during extended human missions outside low Earth orbit (LEO). Of primary importance in acute risk assessment are solar particle events (SPEs), which are low-probability, high-consequence events that could adversely affect mission outcomes through acute radiation damage to astronauts. SPEs can be further classified into coronal mass ejections (CMEs) and solar flares/impulsive events (Fig. 1). CMEs are eruptions of solar material with shock enhancements that make these types of events higher in total fluence than impulsive events.

  11. Probabilistic structural analysis methods of hot engine structures

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.; Hopkins, D. A.

    1989-01-01

    Development of probabilistic structural analysis methods for hot engine structures is a major activity at Lewis Research Center. Recent activities have focused on extending the methods to include the combined uncertainties in several factors on structural response. This paper briefly describes recent progress on composite load spectra models, probabilistic finite element structural analysis, and probabilistic strength degradation modeling. Progress is described in terms of fundamental concepts, computer code development, and representative numerical results.

  12. Probabilistic structural analysis of aerospace components using NESSUS

    NASA Technical Reports Server (NTRS)

    Shiao, Michael C.; Nagpal, Vinod K.; Chamis, Christos C.

    1988-01-01

    Probabilistic structural analysis of a Space Shuttle main engine turbopump blade is conducted using the computer code NESSUS (numerical evaluation of stochastic structures under stress). The goal of the analysis is to derive probabilistic characteristics of blade response given probabilistic descriptions of uncertainties in blade geometry, material properties, and temperature and pressure distributions. Probability densities are derived for critical blade responses. Risk assessment and failure life analysis is conducted assuming different failure models.

  13. Probabilistic record linkage

    PubMed Central

    Sayers, Adrian; Ben-Shlomo, Yoav; Blom, Ashley W; Steele, Fiona

    2016-01-01

    Studies involving the use of probabilistic record linkage are becoming increasingly common. However, the methods underpinning probabilistic record linkage are not widely taught or understood, and therefore these studies can appear to be a ‘black box’ research tool. In this article, we aim to describe the process of probabilistic record linkage through a simple exemplar. We first introduce the concept of deterministic linkage and contrast this with probabilistic linkage. We illustrate each step of the process using a simple exemplar and describe the data structure required to perform a probabilistic linkage. We describe the process of calculating and interpreting matched weights and how to convert matched weights into posterior probabilities of a match using Bayes theorem. We conclude this article with a brief discussion of some of the computational demands of record linkage, how you might assess the quality of your linkage algorithm, and how epidemiologists can maximize the value of their record-linked research using robust record linkage methods. PMID:26686842
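
    The weight-and-posterior calculation described above condenses into a short sketch (field names, m/u values, and the prior are invented for illustration; m and u are the probabilities that a field agrees for true matches and for non-matches, respectively):

        from math import log2

        fields = {            # (m, u) per field
            "surname":    (0.95, 0.01),
            "birth_year": (0.98, 0.05),
            "postcode":   (0.90, 0.02),
        }
        agreements = {"surname": True, "birth_year": True, "postcode": False}

        weight = 0.0
        for f, (m, u) in fields.items():
            if agreements[f]:
                weight += log2(m / u)              # agreement weight
            else:
                weight += log2((1 - m) / (1 - u))  # disagreement weight

        prior = 1e-4                               # P(match) for a random pair
        prior_odds = prior / (1 - prior)
        posterior_odds = prior_odds * 2**weight    # likelihood ratio = 2^weight
        posterior = posterior_odds / (1 + posterior_odds)
        print(f"match weight = {weight:.2f}, posterior = {posterior:.4f}")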

  14. Probabilistic Structural Analysis Theory Development

    NASA Technical Reports Server (NTRS)

    Burnside, O. H.

    1985-01-01

    The objective of the Probabilistic Structural Analysis Methods (PSAM) project is to develop analysis techniques and computer programs for predicting the probabilistic response of critical structural components for current and future space propulsion systems. This technology will play a central role in establishing system performance and durability. The first year's technical activity is concentrating on probabilistic finite element formulation strategy and code development. Work is also in progress to survey critical materials and Space Shuttle main engine components. The probabilistic finite element computer program NESSUS (Numerical Evaluation of Stochastic Structures Under Stress) is being developed. The final probabilistic code will have, in the general case, the capability of performing nonlinear dynamic analysis of stochastic structures. It is the goal of the approximate methods effort to increase problem-solving efficiency relative to finite element methods by using energy methods to generate trial solutions which satisfy the structural boundary conditions. These approximate methods will be less computer intensive than the finite element approach.
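
    As a generic illustration of the energy-method idea (a Rayleigh-Ritz sketch under stated assumptions, not the PSAM implementation): choose polynomial trial functions that satisfy the clamped-end boundary conditions of a cantilever and make the potential energy stationary; once the cubic term is included, the exact tip deflection is recovered:

      # Rayleigh-Ritz sketch: trial functions phi_i(x) = x**(i+1) satisfy the
      # cantilever boundary conditions w(0) = w'(0) = 0; minimizing the
      # potential energy gives a small linear system. Values are illustrative.
      import numpy as np

      E, I, L, P, N = 70e9, 1e-8, 1.0, 100.0, 3   # modulus, inertia, length, tip load, terms

      K = np.zeros((N, N))
      f = np.zeros(N)
      for a, i in enumerate(range(1, N + 1)):
          f[a] = P * L ** (i + 1)                  # work of the tip load, P*phi_i(L)
          for b, j in enumerate(range(1, N + 1)):
              # K_ab = EI * integral of phi_i'' * phi_j'' over [0, L]
              K[a, b] = E * I * (i + 1) * i * (j + 1) * j * L ** (i + j - 1) / (i + j - 1)

      c = np.linalg.solve(K, f)                    # stationarity of the energy
      w_tip = sum(ci * L ** (i + 1) for ci, i in zip(c, range(1, N + 1)))
      print("Ritz tip deflection:", w_tip, " exact:", P * L ** 3 / (3 * E * I))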

  15. Flight Investigation of Prescribed Simultaneous Independent Surface Excitations for Real-Time Parameter Identification

    NASA Technical Reports Server (NTRS)

    Moes, Timothy R.; Smith, Mark S.; Morelli, Eugene A.

    2003-01-01

    Near real-time stability and control derivative extraction is required to support flight demonstration of Intelligent Flight Control System (IFCS) concepts being developed by NASA, academia, and industry. Traditionally, flight maneuvers would be designed and flown to obtain stability and control derivative estimates using a postflight analysis technique. The goal of the IFCS concept is to be able to modify the control laws in real time for an aircraft that has been damaged in flight. In some IFCS implementations, real-time parameter identification (PID) of the stability and control derivatives of the damaged aircraft is necessary for successfully reconfiguring the control system. This report investigates the usefulness of Prescribed Simultaneous Independent Surface Excitations (PreSISE) to provide data for rapidly obtaining estimates of the stability and control derivatives. Flight test data were analyzed using both equation-error and output-error PID techniques. The equation-error PID technique is known as Fourier Transform Regression (FTR) and is a frequency-domain real-time implementation. Selected results were compared with a time-domain output-error technique. The real-time equation-error technique combined with the PreSISE maneuvers provided excellent derivative estimation in the longitudinal axis. However, the PreSISE maneuvers as presently defined were not adequate for accurate estimation of the lateral-directional derivatives.
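
    As an illustration of the equation-error idea behind FTR (synthetic signals and assumed derivative values, not IFCS flight data): evaluate finite Fourier transforms of the measured states on a band of analysis frequencies and solve a complex least-squares problem for the derivatives:

      # Frequency-domain equation-error sketch in the spirit of Fourier
      # Transform Regression: fit qdot = Ma*alpha + Mq*q + Md*de in the
      # frequency domain. Dynamics and values are synthetic assumptions.
      import numpy as np

      dt, T = 0.02, 20.0
      t = np.arange(0, T, dt)
      Ma, Mq, Md = -4.0, -1.2, -6.0                # "true" derivatives (assumed)

      # Synthetic maneuver: multisine elevator input, simple 1-DOF pitch model.
      de = 0.05 * (np.sin(2 * np.pi * 0.4 * t) + np.sin(2 * np.pi * 0.9 * t))
      alpha = np.zeros_like(t); q = np.zeros_like(t)
      for k in range(len(t) - 1):
          qdot = Ma * alpha[k] + Mq * q[k] + Md * de[k]
          q[k + 1] = q[k] + qdot * dt
          alpha[k + 1] = alpha[k] + q[k] * dt       # crude kinematic coupling

      # Finite Fourier transforms on a band of analysis frequencies.
      freqs = np.arange(0.2, 1.5, 0.05)             # Hz
      E = np.exp(-2j * np.pi * np.outer(freqs, t)) * dt
      Q, A, D = E @ q, E @ alpha, E @ de
      Qdot = 2j * np.pi * freqs * Q                 # differentiation in frequency domain

      X = np.column_stack([A, Q, D])
      theta, *_ = np.linalg.lstsq(X, Qdot, rcond=None)
      print("estimated [Ma, Mq, Md] ~", np.real(theta))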

  16. Effects of climate change and UV-B on materials.

    PubMed

    Andrady, Anthony L; Hamid, Halim S; Torikai, Ayako

    2003-01-01

    The outdoor service life of common plastic materials is limited by their susceptibility to solar ultraviolet radiation. Of the solar wavelengths, the UV-B component is particularly efficient in bringing about photodamage in synthetic and naturally occurring materials. This is particularly true of the plastics, rubber and wood used in the building and agricultural industries. Any depletion in the stratospheric ozone layer and resulting increase in the UV-B component of terrestrial sunlight will therefore tend to decrease the service life of these materials. The extent to which the service life is reduced is, however, difficult to estimate as it depends on several factors. These include the chemical nature of the material, the additives it contains, the type and the amount of light stabilizers (or protective coatings) used, and the amount of solar exposure it receives. Concomitant climate change is likely to increase the ambient temperature and humidity in some of the same regions likely to receive increased UV-B radiation. These factors, particularly higher temperatures, are also well known to accelerate the rate of photodegradation of materials, and may therefore further limit the service life of materials in these regions. To reliably assess the damage to materials as a consequence of ozone layer depletion, the wavelength sensitivity of the degradation process, dose-response relationships for the material, and the effectiveness of available stabilizers need to be quantified. The data needed for this purpose are not readily available at this time for most of the commonly used plastics or wood materials. Wavelength sensitivities of a number of common plastic materials and natural biopolymers are available and generally show that the damage (per photon) decreases exponentially with wavelength. Despite the relatively higher fraction of UV-A in sunlight, the UV-B content is responsible for a significant part of the light-induced damage of materials. The primary approach to mitigation relies on the effectiveness of the existing light stabilizers (such as hindered amine light stabilizers, HALS) used in plastics exposed to harsh solar UV conditions coupled with climate change factors. In developing advanced light-stabilizer technologies, more light-resistant grades of common plastics, or surface protection technologies for wood, the harsh weathering environment created by the simultaneous action of increased UV-B levels due to ozone depletion and the relevant climate change factors needs to be taken into consideration. Recent literature includes several studies on the synergism of HALS-based stabilizers, stabilizer effectiveness in the new m-polyolefins, and elucidation of the mechanism of stabilization afforded by titania pigment in vinyl plastics.

  17. Management of civilian ballistic fractures.

    PubMed

    Seng, V S; Masquelet, A C

    2013-12-01

    The management of ballistic fractures, which are open fractures, has often been studied in wartime and has benefited from the principles of military surgery with debridement and lavage, and the use of external fixation for bone stabilization. In civilian practice, bone stabilization of these fractures is different and is not performed by external fixation. Fifteen civilian ballistic fractures, Gustilo II or IIIa, two associated with nerve damage and none with vascular damage, were reviewed. After debridement and lavage, ten internal fixations and five conservative treatments were used. No superficial or deep surgical site infection was noted. Fourteen of the 15 fractures (93%) healed without reoperation. Eleven of the 15 patients (73%) regained normal function. Ballistic fractures have a bad reputation due to their many complications, including infections. In civilian practice, the use of internal fixation is not responsible for excessive morbidity, provided debridement and lavage are performed. Civilian ballistic fractures, when they are caused by low-velocity firearms, differ from military ballistic fractures. Although the principle of surgical debridement and lavage remains the same, bone stabilization is different and is similar to conventional open fractures.

  18. Rehabilitation of patients with thoracic spine injury treated by spring alloplasty.

    PubMed

    Kiwerski, J

    1983-12-01

    Stabilization of the traumatically injured spine by means of springs, called spring alloplasty, was introduced into clinical practice by Professor M. Weiss in 1965 and has been applied in the Warsaw Medical Academy Rehabilitation Clinic (Konstancin) ever since. The springs replace the damaged system of posterior ligaments of the spine, restoring its stability and relieving load on the anterior (often damaged) part of the vertebral body. This method has been used in surgery on about 350 patients, mainly with spinal injury at the thoracic and thoracolumbar levels. Spine stabilization by this method usually makes it possible to start early verticalization and active rehabilitation. Verticalization of the patient in a specially designed bed is introduced as early as a few days after the accident, and attempts at active verticalization are made 2-3 weeks after surgery; the rehabilitation process is thus substantially accelerated and the period of hospital treatment significantly reduced. The methodology of rehabilitation of these patients is presented and the functional effects of the treatment are discussed in the paper.

  19. Probabilistic approach to damage of tunnel lining due to fire

    NASA Astrophysics Data System (ADS)

    Šejnoha, Jiří; Sýkora, Jan; Novotná, Eva; Šejnoha, Michal

    2017-09-01

    In this paper, risk is perceived as the probable damage caused to the tunnel lining by fire. In the first part, the traffic flow is described as a Markov chain of joint states consisting of combinations of trucks/buses (TB) and personal cars (PC) from adjoining lanes. The heat release rate is then taken as a measure of the fire power. The intensity λf, reflecting the frequency of fires, was assessed based on extensive studies carried out in Austria [1] and Italy [2, 3]. The traffic density AADT, the length of the tunnel L, the percentage of TBs, and the number of lanes are the remaining parameters characterizing the traffic flow. In the second part, a special combination of the models originally proposed by Bažant and Thonguthai [4] and by Künzel and Kiessl [5] for the description of transport processes in concrete at very high temperatures creates a basis for predicting the thickness of the spalling zone and the volume of concrete degraded by temperatures that exceed a certain level. The model was validated against a macroscopic test on concrete samples placed in a furnace.
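
    As an order-of-magnitude illustration of how such an intensity parameter enters the risk picture (with assumed values, not the paper's calibrated figures): with λf expressed per vehicle-kilometre, the expected annual number of fires scales with traffic density and tunnel length, and a Poisson model gives the occurrence probability:

      # Back-of-envelope fire-occurrence estimate for a tunnel. The intensity,
      # traffic density, and length below are illustrative assumptions.
      import math

      lam_f = 2e-8        # fires per vehicle-km (assumed order of magnitude)
      AADT  = 30_000      # average annual daily traffic, vehicles/day
      L     = 2.5         # tunnel length, km

      n_expected = lam_f * AADT * 365 * L           # expected fires per year
      p_at_least_one = 1 - math.exp(-n_expected)    # Poisson occurrence model
      print(f"E[fires/yr] = {n_expected:.4f}, P(>=1 fire/yr) = {p_at_least_one:.4f}")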

  20. Stream stability at highway structures.

    DOT National Transportation Integrated Search

    1995-11-01

    This document provides guidelines for identifying stream instability problems at highway stream crossings and for the selection and design of appropriate countermeasures to mitigate potential damages to bridges and other highway components at stream ...

  1. 49 CFR 570.8 - Suspension systems.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... cracked. Structural parts shall not be bent or damaged. Stabilizer bars shall be connected. Springs shall..., shall be installed on both front springs, both rear springs, or on all four springs. Shock absorber...

  2. 49 CFR 570.8 - Suspension systems.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... cracked. Structural parts shall not be bent or damaged. Stabilizer bars shall be connected. Springs shall..., shall be installed on both front springs, both rear springs, or on all four springs. Shock absorber...

  3. 49 CFR 570.8 - Suspension systems.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... cracked. Structural parts shall not be bent or damaged. Stabilizer bars shall be connected. Springs shall..., shall be installed on both front springs, both rear springs, or on all four springs. Shock absorber...

  4. 49 CFR 570.8 - Suspension systems.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... cracked. Structural parts shall not be bent or damaged. Stabilizer bars shall be connected. Springs shall..., shall be installed on both front springs, both rear springs, or on all four springs. Shock absorber...

  5. Tradeoffs between water requirements and yield stability in annual vs. perennial crops

    NASA Astrophysics Data System (ADS)

    Vico, Giulia; Brunsell, Nathaniel A.

    2018-02-01

    Population growth and changes in climate and diets will likely further increase the pressure on agriculture and water resources globally. Currently, staple crops are obtained from annual plants. A shift towards perennial crops may enhance many ecosystem services, but at the cost of higher water requirements and lower yields. It is still unclear when the advantages of perennial crops outweigh their disadvantages, making perennial crops a sustainable solution. Here we combine a probabilistic description of the soil water balance and crop development with an extensive dataset of traits of congeneric annuals and perennials to identify the conditions under which perennial crops are more viable than annual ones with reference to yield, yield stability, and effective use of water. We show that the larger and more developed roots of perennial crops allow a better exploitation of soil water resources and a reduction of yield variability with respect to annual species, but their yields remain lower when considering grain crops. Furthermore, perennial crops have higher and more variable irrigation requirements and lower water productivity. These results are important to understand the potential consequences for yield, yield stability, and water resource use of a shift from annual to perennial crops and, more generally, whether perennial crops may be more resilient than annual crops in the face of climatic fluctuations.

  6. Saturation Length of Erodible Sediment Beds Subject to Shear Flow

    NASA Astrophysics Data System (ADS)

    Casler, D. M.; Kahn, B. P.; Furbish, D. J.; Schmeeckle, M. W.

    2016-12-01

    We examine the initial growth and wavelength selection of sand ripples based on probabilistic formulations of the flux and the Exner equation. Current formulations of this problem as a linear stability analysis appeal to the idea of a saturation length (the lag between the bed stress and the flux) as a key stabilizing influence leading to selection of a finite wavelength. We present two contrasting formulations. The first is based on the Fokker-Planck approximation of the divergence form of the Exner equation, and thus involves particle diffusion associated with variations in particle activity, in addition to the conventionally assumed advective term. The role of a saturation length associated with the particle activity is similar to previous analyses. However, particle diffusion provides an attenuating influence on the growth of small wavelengths, independent of a saturation length, and is thus a sufficient, if not necessary, condition contributing to selection of a finite wavelength. The second formulation is based on the entrainment form of the Exner equation. As a precise, probabilistic formulation of conservation, this form of the Exner equation does not distinguish between advection and diffusion, and, because it directly accounts for all particle motions via a convolution of the distribution of particle hop distances, it pays no attention to the idea of a saturation length. The formulation and resulting description of initial ripple growth and wavelength selection thus inherently subsume the effects embodied in the ideas of advection, diffusion, and a saturation length as used in other formulations. Moreover, the formulation does not distinguish between bed load and suspended load, and is thus broader in application. The analysis reveals that the length scales defined by the distribution of hop distances are more fundamental than the saturation length in determining the initial growth or decay of bedforms. Formulations involving the saturation length coincide with the special case of an exponential distribution of hop distance, where the saturation length is equal to the mean hop distance.
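
    In one standard notation (a hedged reconstruction for orientation, not necessarily the authors' exact equations), the divergence form with an advective-diffusive flux and the entrainment form read:

      (1 - \phi)\,\frac{\partial \eta}{\partial t} = -\frac{\partial q}{\partial x},
      \qquad q = \bar{u}\,\gamma - \frac{\partial}{\partial x}\big(D\,\gamma\big)

      (1 - \phi)\,\frac{\partial \eta}{\partial t} = -E(x,t) + \int_{0}^{\infty} E(x - r, t)\, f_r(r)\,\mathrm{d}r

    where η is the bed elevation, φ the porosity, γ the particle activity, ū and D advective and diffusive coefficients, E the entrainment rate, and f_r the distribution of particle hop distances; as noted in the abstract, an exponential f_r recovers saturation-length behavior with the saturation length equal to the mean hop distance.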

  7. Landslide susceptibility mapping along PLUS expressways in Malaysia using probabilistic based model in GIS

    NASA Astrophysics Data System (ADS)

    Yusof, Norbazlan M.; Pradhan, Biswajeet

    2014-06-01

    PLUS Berhad holds the concession for a total of 987 km of toll expressways in Malaysia, the longest of which is the North-South Expressway or NSE. Acting as the 'backbone' of the west coast of the peninsula, the NSE stretches from the Malaysian-Thai border in the north to the border with neighbouring Singapore in the south, linking several major cities and towns along the way. The North-South Expressway contributes to the country's economic development through the trade, social and tourism sectors. Presently, the highway is in good condition and connects every state along its route, but some locations need urgent attention. Stability of slopes at these locations is of most concern, as any instability can endanger motorists. In this paper, two study locations have been analysed: Gua Tempurung (soil slope) and Jelapang (rock slope), which evidently have two different characteristics. These locations pass through undulating terrain with steep slopes where landslides are common and the probability of slope instability due to human activities in surrounding areas is high. A database of twelve (12) landslide conditioning factors was compiled: slope gradient and slope aspect were extracted from IFSAR (interferometric synthetic aperture radar) data, while landuse, lithology and structural geology were constructed from interpretation of high-resolution satellite data from WorldView-2, QuickBird and IKONOS. All this information was analysed in a geographic information system (GIS) environment for landslide susceptibility mapping using a probabilistic frequency-ratio model. In addition, information on the slopes such as inventories, condition assessments and maintenance records was assessed through the total expressway maintenance management system, better known as TEMAN. The above-mentioned system is used by PLUS as an asset management and decision support tool for maintenance activities along the highways as well as for data quality checking and integrity. In this study, TEMAN data were further analysed and subsequently integrated with the landslide susceptibility maps for the Gua Tempurung and Jelapang areas in Perak.
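
    The frequency-ratio model named above is simple enough to state directly: for each class of each conditioning factor, FR is the share of landslide cells in the class divided by the share of all cells in the class. A sketch with illustrative counts:

      # Frequency-ratio computation as commonly used in GIS landslide
      # susceptibility mapping. Cell counts below are illustrative.
      classes = {
          # class:           (landslide cells, total cells)
          "slope 0-15 deg":  (12, 40_000),
          "slope 15-30 deg": (55, 30_000),
          "slope >30 deg":   (93, 10_000),
      }

      total_slides = sum(s for s, _ in classes.values())
      total_cells  = sum(c for _, c in classes.values())

      for name, (slides, cells) in classes.items():
          fr = (slides / total_slides) / (cells / total_cells)
          print(f"{name}: FR = {fr:.2f}")   # FR > 1 indicates higher susceptibility

    Summing FR over all conditioning factors per cell then yields the susceptibility index used for mapping.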

  8. Level 1 Tornado PRA for the High Flux Beam Reactor

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bozoki, G.E.; Conrad, C.S.

    This report describes a risk analysis primarily directed at providing an estimate for the frequency of tornado-induced damage to the core of the High Flux Beam Reactor (HFBR), and thus it constitutes a Level 1 Probabilistic Risk Assessment (PRA) covering tornado-induced accident sequences. The basic methodology of the risk analysis was to develop a tornado-specific plant logic model that integrates internal random hardware failures with failures caused externally by the tornado strike and includes operator errors worsened by the tornado-modified environment. The tornado hazard frequency, as well as previously prepared structural and equipment fragility data, were used as input data to the model. To keep modeling and calculational complexity as simple as reasonable, a slightly conservative 'bounding' approach was applied. Through a thorough screening process, a single dominant initiating event was selected as a representative initiator, defined as 'Tornado-Induced Loss of Offsite Power'. The frequency of this initiator was determined to be 6.37E-5/year. The safety response of the HFBR facility resulted in a total conditional core damage probability of 0.621. Thus, the point estimate of the HFBR's tornado-induced core damage frequency (CDF) was found to be CDF_Tornado = 3.96E-5/year. This value represents only 7.8% of the internal CDF and is thus considered a small contribution to the overall facility risk expressed in terms of total core damage frequency. In addition to providing the estimate of CDF_Tornado, the report documents the relative importance of the various tornado-induced system, component, and operator failures that contribute most to CDF_Tornado.
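
    The quoted point estimate follows from the standard PRA identity (initiating-event frequency multiplied by the conditional core damage probability), reproduced here with the report's own numbers:

      # Level 1 PRA point estimate: CDF = IE frequency x conditional core
      # damage probability, using the values quoted in the abstract.
      ie_freq = 6.37e-5    # tornado-induced loss of offsite power, per year
      ccdp    = 0.621      # conditional core damage probability
      print(f"CDF_Tornado = {ie_freq * ccdp:.2e} per year")   # ~3.96e-5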

  9. Estimation of potential impacts and natural resource damages of oil.

    PubMed

    McCay, Deborah French; Rowe, Jill Jennings; Whittier, Nicole; Sankaranarayanan, Sankar; Etkin, Dagmar Schmidt

    2004-02-27

    Methods were developed to estimate the potential impacts and natural resource damages resulting from oil spills using probabilistic modeling techniques. The oil fates model uses wind data, current data, and transport and weathering algorithms to calculate mass balance of fuel components in various environmental compartments (water surface, shoreline, water column, atmosphere, sediments, etc.), oil pathway over time (trajectory), surface distribution, shoreline oiling, and concentrations of the fuel components in water and sediments. Exposure of aquatic habitats and organisms to whole oil and toxic components is estimated in the biological model, followed by estimation of resulting acute mortality and ecological losses. Natural resource damages are based on estimated costs to restore equivalent resources and/or ecological services, using Habitat Equivalency Analysis (HEA) and Resource Equivalency Analysis (REA) methods. Oil spill modeling was performed for two spill sites in central San Francisco Bay, three spill sizes (20th, 50th, and 95th percentile volumes from tankers and larger freight vessels, based on an analysis of likely spill volumes given a spill has occurred) and four oil types (gasoline, diesel, heavy fuel oil, and crude oil). The scenarios were run in stochastic mode to determine the frequency distribution, mean and standard deviation of fates, impacts, and damages. This work is significant as it demonstrates a statistically quantifiable method for estimating potential impacts and financial consequences that may be used in ecological risk assessment and cost-benefit analyses. The statistically-defined spill volumes and consequences provide an objective measure of the magnitude, range and variability of impacts to wildlife, aquatic organisms and shorelines for potential spills of four oil/fuel types, each having distinct environmental fates and effects.
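
    The coupled fates/biology models are far beyond a snippet, but the "stochastic mode" aggregation step described above can be sketched: run many randomized scenario realizations, then report the frequency distribution, mean, and standard deviation of a damage measure. The volume and cost models below are placeholders, not the paper's models:

      # Stochastic-mode aggregation sketch: distributional statistics of a
      # damage measure across randomized realizations. All models/values are
      # illustrative placeholders.
      import numpy as np

      rng = np.random.default_rng(7)
      n_runs = 5_000

      spill_volume = rng.lognormal(mean=6.0, sigma=1.0, size=n_runs)   # m^3, assumed
      wind_factor  = rng.uniform(0.5, 1.5, size=n_runs)                # transport scatter
      damage_usd   = 1_500.0 * spill_volume * wind_factor              # placeholder cost model

      for pct in (20, 50, 95):
          print(f"{pct}th percentile damage: ${np.percentile(damage_usd, pct):,.0f}")
      print(f"mean = ${damage_usd.mean():,.0f}, std = ${damage_usd.std():,.0f}")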

  10. Probabilistic Structural Analysis of the Solid Rocket Booster Aft Skirt External Fitting Modification

    NASA Technical Reports Server (NTRS)

    Townsend, John S.; Peck, Jeff; Ayala, Samuel

    2000-01-01

    NASA has funded several major programs (the Probabilistic Structural Analysis Methods Project is an example) to develop probabilistic structural analysis methods and tools for engineers to apply in the design and assessment of aerospace hardware. A probabilistic finite element software code, known as Numerical Evaluation of Stochastic Structures Under Stress, is used to determine the reliability of a critical weld of the Space Shuttle solid rocket booster aft skirt. An external bracket modification to the aft skirt provides a comparison basis for examining the details of the probabilistic analysis and its contributions to the design process. Also, analysis findings are compared with measured Space Shuttle flight data.

  11. Tankyrase1-mediated poly(ADP-ribosyl)ation of TRF1 maintains cell survival after telomeric DNA damage

    PubMed Central

    Yang, Lu; Sun, Luxi; Teng, Yaqun; Chen, Hao; Gao, Ying; Levine, Arthur S.; Nakajima, Satoshi

    2017-01-01

    Oxidative DNA damage triggers telomere erosion and cellular senescence. However, how repair is initiated at telomeres is largely unknown. Here, we found that unlike PARP1-mediated poly-ADP-ribosylation (PARylation) at genomic damage sites, PARylation at telomeres is mainly dependent on tankyrase1 (TNKS1). TNKS1 is recruited to damaged telomeres via its interaction with TRF1, which subsequently facilitates the PARylation of TRF1 after damage. TNKS inhibition abolishes the recruitment of the repair proteins XRCC1 and polymerase β at damaged telomeres, while the PARP1/2 inhibitor has such an effect only at non-telomeric damage sites. The ANK domain of TNKS1 is essential for the telomeric damage response and TRF1 interaction. Mutation of the tankyrase-binding motif (TBM) on TRF1 (13R/18G to AA) disrupts its interaction with TNKS1 and the concomitant recruitment of TNKS1 and repair proteins after damage. Either TNKS1 inhibition or expression of TBM-mutated TRF1 markedly sensitizes cells to telomere oxidative damage as well as XRCC1 inhibition. Together, our data reveal a novel role of TNKS1 in facilitating single-strand break repair at damaged telomeres through PARylation of TRF1, thereby protecting genome stability and cell viability. PMID:28160604

  12. Analysis of scale effect in compressive ice failure and implications for design

    NASA Astrophysics Data System (ADS)

    Taylor, Rocky Scott

    The main focus of the study was the analysis of scale effect in local ice pressure resulting from probabilistic (spalling) fracture and the relationship between local and global loads due to the averaging of pressures across the width of a structure. A review of fundamental theory, relevant ice mechanics and a critical analysis of data and theory related to the scale dependent pressure behavior of ice were completed. To study high pressure zones (hpzs), data from small-scale indentation tests carried out at the NRC-IOT were analyzed, including small-scale ice block and ice sheet tests. Finite element analysis was used to model a sample ice block indentation event using a damaging, viscoelastic material model and element removal techniques (for spalling). Medium scale tactile sensor data from the Japan Ocean Industries Association (JOIA) program were analyzed to study details of hpz behavior. The averaging of non-simultaneous hpz loads during an ice-structure interaction was examined using local panel pressure data. Probabilistic averaging methodology for extrapolating full-scale pressures from local panel pressures was studied and an improved correlation model was formulated. Panel correlations for high speed events were observed to be lower than panel correlations for low speed events. Global pressure estimates based on probabilistic averaging were found to give substantially lower average errors in estimation of load compared with methods based on linear extrapolation (no averaging). Panel correlations were analyzed for Molikpaq and compared with JOIA results. From this analysis, it was shown that averaging does result in decreasing pressure for increasing structure width. The relationship between local pressure and ice thickness for a panel of unit width was studied in detail using full-scale data from the STRICE, Molikpaq, Cook Inlet and JOIA data sets. A distinct trend of decreasing pressure with increasing ice thickness was observed. The pressure-thickness behavior was found to be well modeled by the power-law relationships P_avg = 0.278 h^(-0.408) MPa and P_std = 0.172 h^(-0.273) MPa for the mean and standard deviation of pressure, respectively. To study theoretical aspects of spalling fracture and the pressure-thickness scale effect, probabilistic failure models have been developed. A probabilistic model based on Weibull theory (tensile stresses only) was first developed. Estimates of failure pressure obtained with this model were orders of magnitude higher than the pressures observed from benchmark data due to the assumption of only tensile failure. A probabilistic fracture mechanics (PFM) model including both tensile and compressive (shear) cracks was developed. Criteria for unstable fracture in tensile and compressive (shear) zones were given. From these results a clear theoretical scale effect in peak (spalling) pressure was observed. This scale effect followed the relationship P_p,th = 0.15 h^(-0.50) MPa, which agreed well with the benchmark data. The PFM model was applied to study the effect of ice edge shape (taper angle) and hpz eccentricity. Results indicated that specimens with flat edges spall at lower pressures while those with more tapered edges spall less readily. The mean peak (failure) pressure was also observed to decrease with increased eccentricity. It was concluded that hpzs centered about the middle of the ice thickness are the zones most likely to create the peak pressures that are of interest in design.
Promising results were obtained using the PFM model, which provides strong support for continued research in the development and application of probabilistic fracture mechanics to the study of scale effects in compressive ice failure and to guide the development of methods for the estimation of design ice pressures.
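
    The quoted pressure-thickness relations are power laws of the form P = a·h^b, so a and b can be recovered by linear regression in log-log space; the sketch below uses synthetic data scattered about the reported mean curve:

      # Power-law fit P = a * h**b via log-log regression; synthetic points
      # scattered about the reported curve P_avg = 0.278 * h**-0.408.
      import numpy as np

      rng = np.random.default_rng(1)
      h = rng.uniform(0.3, 2.0, 200)                        # ice thickness, m
      P = 0.278 * h ** -0.408 * rng.lognormal(0, 0.1, 200)  # pressure, MPa

      b, log_a = np.polyfit(np.log(h), np.log(P), 1)
      print(f"fit: P = {np.exp(log_a):.3f} * h^({b:.3f}) MPa")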

  13. DISCOUNTING OF DELAYED AND PROBABILISTIC LOSSES OVER A WIDE RANGE OF AMOUNTS

    PubMed Central

    Green, Leonard; Myerson, Joel; Oliveira, Luís; Chang, Seo Eun

    2014-01-01

    The present study examined delay and probability discounting of hypothetical monetary losses over a wide range of amounts (from $20 to $500,000) in order to determine how amount affects the parameters of the hyperboloid discounting function. In separate conditions, college students chose between immediate payments and larger, delayed payments and between certain payments and larger, probabilistic payments. The hyperboloid function accurately described both types of discounting, and amount of loss had little or no systematic effect on the degree of discounting. Importantly, the amount of loss also had little systematic effect on either the rate parameter or the exponent of the delay and probability discounting functions. The finding that the parameters of the hyperboloid function remain relatively constant across a wide range of amounts of delayed and probabilistic loss stands in contrast to the robust amount effects observed with delayed and probabilistic rewards. At the individual level, the degree to which delayed losses were discounted was uncorrelated with the degree to which probabilistic losses were discounted, and delay and probability loaded on two separate factors, similar to what is observed with delayed and probabilistic rewards. Taken together, these findings argue that although delay and probability discounting involve fundamentally different decision-making mechanisms, nevertheless the discounting of delayed and probabilistic losses share an insensitivity to amount that distinguishes it from the discounting of delayed and probabilistic gains. PMID:24745086
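
    For reference, the hyperboloid discounting function fitted in this literature is commonly written (a standard form stated here for orientation, not quoted from the paper) as:

      V = \frac{A}{(1 + kD)^{s}} \quad \text{(delayed outcomes)}, \qquad
      V = \frac{A}{(1 + h\Theta)^{s}}, \quad \Theta = \frac{1 - p}{p} \quad \text{(probabilistic outcomes)}

    where V is the subjective value of an outcome of amount A, D is the delay, Θ is the odds against receiving the outcome, k and h are the rate parameters, and s is the exponent whose sensitivity to amount is at issue in the abstract.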

  14. Probabilistic Evaluation of Advanced Ceramic Matrix Composite Structures

    NASA Technical Reports Server (NTRS)

    Abumeri, Galib H.; Chamis, Christos C.

    2003-01-01

    The objective of this report is to summarize the deterministic and probabilistic structural evaluation results of two structures made with advanced ceramic matrix composites (CMC): an internally pressurized tube and a uniformly loaded flange. The deterministic structural evaluation includes stress, displacement, and buckling analyses. It is carried out using the finite element code MHOST, developed for the 3-D inelastic analysis of structures that are made with advanced materials. The probabilistic evaluation is performed using the integrated probabilistic assessment of composite structures computer code IPACS. The effects of uncertainties in primitive variables related to the material, fabrication process, and loadings on the material properties and structural response behavior are quantified. The primitive variables considered are: thermo-mechanical properties of fiber and matrix, fiber and void volume ratios, use temperature, and pressure. The probabilistic structural analysis and probabilistic strength results are used by IPACS to perform reliability and risk evaluation of the two structures. The results show that the sensitivity information obtained for the two composite structures from the computational simulation can be used to alter the design process to meet desired service requirements. In addition to the detailed probabilistic analysis of the two structures, the following were performed specifically on the CMC tube: (1) prediction of the failure load and the buckling load, (2) coupled non-deterministic multi-disciplinary structural analysis, and (3) demonstration that probabilistic sensitivities can be used to select a reduced set of design variables for optimization.

  15. Low-cost rural surface alternatives : demonstration project : [tech transfer summary].

    DOT National Transportation Integrated Search

    2015-06-01

    Identify the most effective and economical methods for preventing or : mitigating freeze-thaw damage to granular surfaced roads in seasonally : cold regions : Construct demonstration test sections using several stabilization : methods recomme...

  16. Enhanced thermomechanical stability on laser-induced damage by functionally graded layers in quasi-rugate filters

    NASA Astrophysics Data System (ADS)

    Pu, Yunti; Ma, Ping; Lv, Liang; Zhang, Mingxiao; Lu, Zhongwen; Qiao, Zhao; Qiu, Fuming

    2018-05-01

    Ta2O5-SiO2 quasi-rugate filters based on a reasonable optimization of a rugate notch filter design were prepared by ion-beam sputtering, and their optical properties and laser-induced damage thresholds are studied. Compared with the spectra of HL stacks, the spectra of the quasi-rugate filters have weaker second-harmonic peaks and narrower stopbands. Owing to the effect of functionally graded layers (FGLs), the 1-on-1 and S-on-1 laser-induced damage thresholds (LIDTs) of the quasi-rugate filters are about 22% and 50% higher than those of HL stacks, respectively. Analysis of the damage morphologies shows that laser-induced damage of the films under nanosecond multi-pulse irradiation is dominated by a combination of thermal shock stress and thermomechanical instability due to nodules. Compared with catastrophic damage, the damage sites of the quasi-rugate filters develop in a more moderate way, and the growth of defect-induced damage sites has been effectively restrained by the FGL structure. Generally, FGLs reduce thermal stress through the similar thermal-expansion coefficients of neighboring layers and mitigate problems such as instability and cracking arising from the interface discontinuity at nodular boundaries.

  17. Expression Profile of DNA Damage Signaling Genes in Proton Exposed Mouse Brain

    NASA Astrophysics Data System (ADS)

    Ramesh, Govindarajan; Wu, Honglu

    Exposure of living systems to radiation results in a wide assortment of lesions, the most significant of which is damage to genomic DNA, which induces several cellular responses such as cell cycle arrest, repair, and apoptosis. Radiation-induced DNA damage is an important area of investigation in biology, but the information available regarding the effects of protons is still very limited. In this report, we investigated the differential expression of DNA damage signaling genes, particularly those involved in damaged-DNA binding, repair, cell cycle arrest, checkpoints, and apoptosis, using a quantitative real-time RT-PCR array in proton-exposed mouse brain tissues. The expression profiles showed significant changes in DNA damage-related genes in 2 Gy proton-exposed mouse brain tissues as compared with control brain tissues. Furthermore, we also show significantly increased levels of apoptosis-related genes and of caspase-3 and -8 activities in these tissues, suggesting that in addition to the differential expression of DNA damage genes, the alteration of apoptosis-related genes may also contribute to radiation-induced DNA damage followed by programmed cell death. In summary, our findings suggest that proton-exposed brain tissue undergoes severe DNA damage, which in turn compromises chromatin stability.

  18. Adaptive correlation filter-based video stabilization without accumulative global motion estimation

    NASA Astrophysics Data System (ADS)

    Koh, Eunjin; Lee, Chanyong; Jeong, Dong Gil

    2014-12-01

    We present a digital video stabilization approach that provides both robustness and efficiency for practical applications. In this approach, we adopt a stabilization model that efficiently maintains spatio-temporal information from past input frames and can track the original stabilization position. Because of this model, the proposed method does not need accumulative global motion estimation and can recover the original position even if interframe motion estimation fails. It can also intelligently handle damaged or interrupted video sequences. Moreover, because it is simple and well suited to parallel schemes, we readily implemented it on a commercial field-programmable gate array and on a graphics processing unit board using the compute unified device architecture (CUDA). Experimental results show that the proposed approach is both fast and robust.
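
    The paper's adaptive correlation filter is not reproduced here, but the correlation idea underlying interframe motion estimation can be illustrated with plain phase correlation, which recovers a global translation between two frames from the cross-power spectrum:

      # Phase-correlation sketch (illustrative reference, not the paper's
      # adaptive-filter method): estimate a global integer translation.
      import numpy as np

      def phase_correlation(f0, f1):
          """Estimate the integer (dy, dx) translation of f1 relative to f0."""
          F0, F1 = np.fft.fft2(f0), np.fft.fft2(f1)
          cross = F1 * np.conj(F0)
          cross /= np.abs(cross) + 1e-12           # keep phase only
          corr = np.fft.ifft2(cross).real
          dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
          h, w = f0.shape                           # wrap to signed shifts
          return dy - h if dy > h // 2 else dy, dx - w if dx > w // 2 else dx

      frame0 = np.random.default_rng(0).random((128, 128))
      frame1 = np.roll(frame0, shift=(5, -3), axis=(0, 1))   # known displacement
      print(phase_correlation(frame0, frame1))               # expect (5, -3)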

  19. Seismic vulnerability: theory and application to Algerian buildings

    NASA Astrophysics Data System (ADS)

    Mebarki, Ahmed; Boukri, Mehdi; Laribi, Abderrahmane; Farsi, Mohammed; Belazougui, Mohamed; Kharchi, Fattoum

    2014-04-01

    When dealing with structural damage under natural hazards such as earthquakes, it is still a scientific challenge to predict the potential damage before occurrence of a given hazard, as well as to evaluate the damage once the earthquake has occurred. In the present study, two distinct methods addressing these topics are developed. Thousands (~54,000) of existing buildings damaged during the Boumerdes earthquake that occurred in Algeria (Mw = 6.8, May 21, 2003) are considered in order to study the accuracy and sensitivity of the methods. Once an earthquake has occurred, quick evaluations of the damage are required in order to distinguish the structures that should be demolished or evacuated immediately from those that can be kept in service without evacuation of their inhabitants. For this purpose, visual inspections are performed by trained and qualified engineers. For the case of Algeria, an evaluation form has been developed and has been in use since the early 1980s; five categories of damage are considered (no damage or very slight, slight, moderate, major, and very severe/collapse). This paper develops a theoretical methodology that processes the observed damage caused to the structural and nonstructural components (foundations, roofs, slabs, walls, beams, columns, fillings, partition walls, stairways, balconies, etc.) in order to help the evaluator derive the global damage evaluation. The methodology transforms each damage category into a corresponding "residual" risk of failure ranging from zero (no damage) to one (complete damage). The global failure risk, and hence its corresponding damage category, is then derived according to given combinations of probabilistic events in order to express the influence of any component on the global damage and behavior. The method is calibrated on a set of ~54,000 buildings inspected after the Boumerdes earthquake. Agreement of almost 80% (same damage category) is obtained when comparing the theoretical results to the observed damage. For pre-earthquake analysis, the methodology widely used around the world relies on the prior calibration of the seismic response of the structures under given expected scenarios. As the structural response is governed by the constitutive materials and structural typology as well as the seismic input and soil conditions, the damage prediction depends intimately on the accuracy of the so-called fragility curve and response spectrum established for each type of structure (RC framed structures, confined or unconfined masonry, etc.) and soil (hard rock, soft soil, etc.). In the present study, the adaptation to Algerian buildings concerns the specific soil conditions as well as the structural dynamic response. The theoretical prediction of the expected damage is helpful for the calibration of the methodology. Thousands (~3,700) of real structures and the damage caused by the earthquake (Algeria, Boumerdes: Mw = 6.8, May 21, 2003) are considered for the a posteriori calibration and validation process. The theoretical predictions show the importance of the elastic response spectrum, the local soil conditions, and the structural typology. Although the observed and predicted categories of damage are close, it appears that the existing form used for visual damage inspection would still require further improvement in order to allow easy evaluation and identification of the damage level.
These methods, coupled with databases and GIS tools, could be helpful to local and technical authorities during the post-earthquake evaluation process, providing real-time information on the damage extent at urban or regional scales as well as the extent of losses and the resources required for reconstruction, evacuation, strengthening, etc.
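
    The component-to-global combination step can be illustrated with a deliberately simplified sketch: map each component's observed damage category to a residual failure risk in [0, 1], weight it by an assumed structural importance, and combine as a series system. The mapping, weights, and combination rule are illustrative assumptions, not the paper's calibrated procedure:

      # Illustrative combination of component-level residual failure risks
      # into a global building risk. Risk values and weights are assumed.
      damage_to_risk = {            # residual risk per category (assumed)
          "no damage/very slight": 0.05, "slight": 0.2, "moderate": 0.45,
          "major": 0.7, "very severe/collapse": 1.0,
      }

      components = {                # observed category, structural weight (assumed)
          "columns": ("major",    0.35),
          "beams":   ("moderate", 0.25),
          "walls":   ("slight",   0.20),
          "slabs":   ("moderate", 0.20),
      }

      survival = 1.0
      for name, (category, weight) in components.items():
          survival *= 1.0 - weight * damage_to_risk[category]   # series-system rule
      global_risk = 1.0 - survival
      print(f"global residual failure risk = {global_risk:.2f}")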

  20. A core hSSB1–INTS complex participates in the DNA damage response

    PubMed Central

    Zhang, Feng; Ma, Teng; Yu, Xiaochun

    2013-01-01

    Human single-stranded DNA-binding protein 1 (hSSB1) plays an important role in the DNA damage response and the maintenance of genomic stability. It has been shown that the core hSSB1 complex contains hSSB1, INTS3 and C9orf80. Using protein affinity purification, we have identified integrator complex subunit 6 (INTS6) as a major subunit of the core hSSB1 complex. INTS6 forms a stable complex with INTS3 and hSSB1 both in vitro and in vivo. In this complex, INTS6 directly interacts with INTS3. In response to DNA damage, INTS6, along with INTS3 and hSSB1, relocates to the DNA damage sites. Moreover, the hSSB1–INTS complex regulates the accumulation of RAD51 and BRCA1 at DNA damage sites and the correlated homologous recombination. PMID:23986477
