Science.gov

Sample records for probabilistic damage tolerance

  1. Multidisciplinary design optimization of a fighter aircraft with damage tolerance constraints and a probabilistic model of the fatigue environment

    NASA Astrophysics Data System (ADS)

    Arrieta, Albert Joseph

    2001-07-01

    Damage tolerance analysis (DTA) was considered in the global design optimization of an aircraft wing structure. Residual strength and fatigue life requirements, based on the damage tolerance philosophy, were investigated as new design constraints. In general, accurate fatigue prediction is difficult if the load environment is not known with a high degree of certainty. To address this issue, a probabilistic approach was used to describe the uncertain load environment. Probabilistic load spectra models were developed from flight recorder data. The global/local finite element approach allowed local fatigue requirements to be considered in the global design optimization. AFGROW fatigue crack growth analysis provided a new strength criterion for satisfying damage tolerance requirements within a global optimization environment. Initial research with the ASTROS program used the probabilistic load model and this damage tolerance constraint to optimize cracked skin panels on the lower wing of a fighter/attack aircraft. For an aerodynamic and structural model similar to an F-16, ASTROS simulated symmetric and asymmetric maneuvers during the optimization. Symmetric maneuvers, without underwing stores, produced the highest stresses and drove the optimization of the inboard lower wing skin. Asymmetric maneuvers, with underwing stores, affected the optimum thickness of the outboard hard points. Subsequent design optimizations included von Mises stress, aileron effectiveness, and lift effectiveness constraints simultaneously. This optimization was driven by the DTA and von Mises stress constraints, demonstrating that DTA requirements can play an active role in preliminary aircraft design.

  2. Probabilistic Fatigue Damage Program (FATIG)

    NASA Technical Reports Server (NTRS)

    Michalopoulos, Constantine

    2012-01-01

    FATIG computes fatigue damage/fatigue life using the stress rms (root mean square) value, the total number of cycles, and S-N curve parameters. The damage is computed by the following methods: (a) traditional method using Miner's rule with stress cycles determined from a Rayleigh distribution up to 3*sigma; and (b) classical fatigue damage formula involving the Gamma function, which is derived from the integral version of Miner's rule. The integration is carried out over all stress amplitudes. This software solves the problem of probabilistic fatigue damage using the integral form of the Palmgren-Miner rule. The software computes fatigue life using an approach involving all stress amplitudes, up to N*sigma, as specified by the user. It can be used in the design of structural components subjected to random dynamic loading, or by any stress analyst with minimal training for fatigue life estimates of structural components.
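
    The closed-form route described above can be sketched in a few lines. The following is an illustrative implementation, not the FATIG source: it assumes an S-N curve of the form N(S) = C / S^b and Rayleigh-distributed stress amplitudes, for which the integral form of the Palmgren-Miner rule reduces to Miles' equation, E[D] = (n/C) * (sqrt(2)*sigma)^b * Gamma(1 + b/2). All numerical values are hypothetical.

```python
import math

def damage_closed_form(n_total, sigma_rms, b, C):
    """Expected Miner damage for a narrow-band Gaussian stress history.

    Assumes an S-N curve N(S) = C / S**b and Rayleigh-distributed stress
    amplitudes, giving E[D] = (n_total/C) * (sqrt(2)*sigma)**b * Gamma(1 + b/2).
    """
    return (n_total / C) * (math.sqrt(2.0) * sigma_rms) ** b * math.gamma(1.0 + b / 2.0)

def damage_truncated(n_total, sigma_rms, b, C, k_max=3.0, n_bins=300):
    """Miner damage summed over Rayleigh amplitudes truncated at k_max*sigma."""
    dS = k_max * sigma_rms / n_bins
    total = 0.0
    for i in range(n_bins):
        S = (i + 0.5) * dS                                  # bin-centre amplitude
        rayleigh_pdf = (S / sigma_rms**2) * math.exp(-S**2 / (2.0 * sigma_rms**2))
        total += rayleigh_pdf * dS * S**b                   # cycle fraction / N(S)
    return (n_total / C) * total

# Hypothetical inputs: 20 MPa rms stress, 1e6 cycles, S-N exponent 4.
sigma, b, C, n = 20.0, 4.0, 1.0e12, 1.0e6
d_exact  = damage_closed_form(n, sigma, b, C)
d_trunc3 = damage_truncated(n, sigma, b, C, k_max=3.0)   # method (a): 3-sigma cutoff
d_trunc8 = damage_truncated(n, sigma, b, C, k_max=8.0)   # effectively untruncated
```

    With these numbers the closed form gives a damage index of 1.28 (failure predicted, since D > 1), while the 3*sigma truncation omits the damaging high-amplitude tail and undershoots it, which is why the user-specified N*sigma cutoff matters.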

  3. Certification of damage tolerant composite structure

    NASA Technical Reports Server (NTRS)

    Rapoff, Andrew J.; Dill, Harold D.; Sanger, Kenneth B.; Kautz, Edward F.

    1990-01-01

    A reliability based certification testing methodology for impact damage tolerant composite structure was developed. Cocured, adhesively bonded, and impact damaged composite static strength and fatigue life data were statistically analyzed to determine the influence of test parameters on the data scatter. The impact damage resistance and damage tolerance of various structural configurations were characterized through the analysis of an industry wide database of impact test results. Realistic impact damage certification requirements were proposed based on actual fleet aircraft data. The capabilities of available impact damage analysis methods were determined through correlation with experimental data. Probabilistic methods were developed to estimate the reliability of impact damaged composite structures.

  4. Damage Tolerance of Composites

    NASA Technical Reports Server (NTRS)

    Hodge, Andy

    2007-01-01

    Fracture control requirements have been developed to address damage tolerance of composites for manned space flight hardware. The requirements provide the framework for critical and noncritical hardware assessment and testing. The need for damage threat assessments, impact damage protection plans, and nondestructive evaluation is also addressed. Hardware intended to be damage tolerant has extensive coupon, sub-element, and full-scale testing requirements in line with the Building Block Approach concept from MIL-HDBK-17, the Department of Defense Composite Materials Handbook.

  5. Composites Damage Tolerance Workshop

    NASA Technical Reports Server (NTRS)

    Gregg, Wayne

    2006-01-01

    The Composite Damage Tolerance Workshop included participants from NASA, academia, and private industry. The objectives of the workshop were to begin dialogue in order to establish a working group within the Agency, create awareness of damage tolerance requirements for Constellation, and discuss potential composite hardware for the Crew Launch Vehicle (CLV) Upper Stage (US) and Crew Module. It was proposed that a composites damage tolerance working group be created that acts within the framework of the existing NASA Fracture Control Methodology Panel. The working group charter would be to identify damage tolerance gaps and obstacles for implementation of composite structures into manned space flight systems and to develop strategies and recommendations to overcome these obstacles.

  6. Damage Tolerance Assessment Branch

    NASA Technical Reports Server (NTRS)

    Walker, James L.

    2013-01-01

    The Damage Tolerance Assessment Branch evaluates the ability of a structure to perform reliably throughout its service life in the presence of a defect, crack, or other form of damage. Such assessment is fundamental to the use of structural materials and requires an integral blend of materials engineering, fracture testing and analysis, and nondestructive evaluation. The vision of the Branch is to increase the safety of manned space flight by improving the fracture control and the associated nondestructive evaluation processes through development and application of standards, guidelines, advanced test and analytical methods. The Branch also strives to assist and solve non-aerospace related NDE and damage tolerance problems, providing consultation, prototyping and inspection services.

  7. Damage identification with probabilistic neural networks

    SciTech Connect

    Klenke, S.E.; Paez, T.L.

    1995-12-01

    This paper investigates the use of artificial neural networks (ANNs) to identify damage in mechanical systems. Two probabilistic neural networks (PNNs) are developed and used to judge whether or not damage has occurred in a specific mechanical system, based on experimental measurements. The first PNN is a classical type that casts Bayesian decision analysis into an ANN framework; it uses exemplars measured from the undamaged and damaged system to establish whether system response measurements of unknown origin come from the former class (undamaged) or the latter class (damaged). The second PNN establishes the character of the undamaged system in terms of a kernel density estimator of measures of system response; when presented with system response measures of unknown origin, it makes a probabilistic judgment whether or not the data come from the undamaged population. The physical system used to carry out the experiments is an aerospace system component, and the environment used to excite the system is a stationary random vibration. The results of damage identification experiments are presented along with conclusions regarding the effectiveness of the approaches.
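
    The first (classifier-style) PNN can be illustrated with a minimal Parzen-window sketch. This is not the authors' network: it uses a one-dimensional hypothetical response feature, Gaussian kernels, and equal priors, but it applies the same decision rule of assigning a measurement to whichever class's estimated density is higher.

```python
import math, random

def kde_pdf(x, samples, h):
    """Parzen-window density estimate at x with Gaussian kernels of width h."""
    total = sum(math.exp(-0.5 * ((x - xi) / h) ** 2) for xi in samples)
    return total / (len(samples) * h * math.sqrt(2.0 * math.pi))

def classify(x, undamaged, damaged, h=0.25):
    """Bayesian decision rule with equal priors: pick the class whose
    kernel-density estimate is higher at the measurement x."""
    return "damaged" if kde_pdf(x, damaged, h) > kde_pdf(x, undamaged, h) else "undamaged"

random.seed(1)
# Hypothetical scalar response feature (say, an rms level); damage shifts its mean.
undamaged = [random.gauss(1.0, 0.2) for _ in range(300)]
damaged   = [random.gauss(1.6, 0.2) for _ in range(300)]

label_a = classify(0.95, undamaged, damaged)   # lies in the undamaged cloud
label_b = classify(1.70, undamaged, damaged)   # lies in the damaged cloud
```

    The second (novelty-detection) PNN corresponds to keeping only the undamaged KDE and flagging damage whenever the estimated density falls below a threshold.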

  8. Mechanism of DNA damage tolerance.

    PubMed

    Bi, Xin

    2015-08-26

    DNA damage may compromise genome integrity and lead to cell death. Cells have evolved a variety of processes to respond to DNA damage including damage repair and tolerance mechanisms, as well as damage checkpoints. The DNA damage tolerance (DDT) pathway promotes the bypass of single-stranded DNA lesions encountered by DNA polymerases during DNA replication. This prevents the stalling of DNA replication. Two mechanistically distinct DDT branches have been characterized. One is translesion synthesis (TLS) in which a replicative DNA polymerase is temporarily replaced by a specialized TLS polymerase that has the ability to replicate across DNA lesions. TLS is mechanistically simple and straightforward, but it is intrinsically error-prone. The other is the error-free template switching (TS) mechanism in which the stalled nascent strand switches from the damaged template to the undamaged newly synthesized sister strand for extension past the lesion. Error-free TS is a complex but preferable process for bypassing DNA lesions. However, our current understanding of this pathway is sketchy. An increasing number of factors are being found to participate or regulate this important mechanism, which is the focus of this editorial. PMID:26322163

  9. Damage Tolerance and Reliability of Turbine Engine Components

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.

    1999-01-01

    This report describes a formal method to quantify structural damage tolerance and reliability in the presence of a multitude of uncertainties in turbine engine components. The method is based at the material behavior level where primitive variables with their respective scatter ranges are used to describe behavior. Computational simulation is then used to propagate the uncertainties to the structural scale where damage tolerance and reliability are usually specified. Several sample cases are described to illustrate the effectiveness, versatility, and maturity of the method. Typical results from this method demonstrate that it is mature and that it can be used to probabilistically evaluate turbine engine structural components. It may be inferred from the results that the method is suitable for probabilistically predicting the remaining life in aging or deteriorating structures, for making strategic projections and plans, and for achieving better, cheaper, faster products that give competitive advantages in world markets.
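
    The idea of propagating scatter in primitive variables up to a structural-scale reliability can be sketched with a plain Monte Carlo loop. The variables, distributions, and numbers below are illustrative stand-ins, not the report's computational simulation of engine components.

```python
import random

random.seed(42)

def margin():
    """One realization: propagate scatter in primitive variables up to a
    structural margin. Distributions and values are illustrative only."""
    strength  = random.gauss(900.0, 60.0)    # MPa, material strength scatter
    knockdown = random.uniform(0.85, 1.0)    # damage/environment knockdown factor
    load      = random.gauss(600.0, 80.0)    # MPa, applied stress scatter
    return strength * knockdown - load       # positive margin = component survives

n_samples = 100_000
failures = sum(1 for _ in range(n_samples) if margin() <= 0.0)
reliability = 1.0 - failures / n_samples
```

    Repeating the loop with the scatter range of one primitive variable tightened at a time shows which uncertainty dominates the structural reliability, which is the practical payoff of this kind of propagation.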

  10. Derivation Of Probabilistic Damage Definitions From High Fidelity Deterministic Computations

    SciTech Connect

    Leininger, L D

    2004-10-26

    This paper summarizes a methodology used by the Underground Analysis and Planning System (UGAPS) at Lawrence Livermore National Laboratory (LLNL) for the derivation of probabilistic damage curves for US Strategic Command (USSTRATCOM). UGAPS uses high fidelity finite element and discrete element codes on the massively parallel supercomputers to predict damage to underground structures from military interdiction scenarios. These deterministic calculations can be riddled with uncertainty, especially when intelligence, the basis for this modeling, is uncertain. The technique presented here attempts to account for this uncertainty by bounding the problem with reasonable cases and using those bounding cases as a statistical sample. Probability-of-damage curves that account for uncertainty within the sample are computed and presented, enabling the war planner to make informed decisions. This work is flexible enough to incorporate any desired damage mechanism and can utilize the variety of finite element and discrete element codes within the national laboratory and government contractor community.
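
    Treating a handful of bounding-case deterministic runs as a statistical sample can be sketched as follows. The damage-metric values are invented, and the Wilson interval is one generic way to express sampling uncertainty; the abstract does not specify the paper's actual curve construction.

```python
import math

# Hypothetical peak damage metric (e.g., fraction of wall crushed) from a set
# of bounding-case deterministic runs -- the "statistical sample" of the paper.
runs = [0.12, 0.18, 0.25, 0.31, 0.40, 0.44, 0.52, 0.61, 0.70, 0.83]

def p_damage(threshold):
    """Empirical probability that the damage metric reaches the threshold."""
    return sum(1 for r in runs if r >= threshold) / len(runs)

def wilson_interval(p, n, z=1.96):
    """95% Wilson score interval on a sample proportion."""
    denom  = 1.0 + z * z / n
    centre = (p + z * z / (2.0 * n)) / denom
    half   = z * math.sqrt(p * (1.0 - p) / n + z * z / (4.0 * n * n)) / denom
    return centre - half, centre + half

# Probability-of-damage curve with sampling uncertainty at three thresholds.
curve = {t: (p_damage(t), *wilson_interval(p_damage(t), len(runs)))
         for t in (0.2, 0.5, 0.8)}
```

    With only ten bounding cases the confidence bands are wide, which is exactly the uncertainty the methodology aims to surface for the planner.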

  11. A Novel Approach to Rotorcraft Damage Tolerance

    NASA Technical Reports Server (NTRS)

    Forth, Scott C.; Everett, Richard A.; Newman, John A.

    2002-01-01

    Damage-tolerance methodology is positioned to replace safe-life methodologies for designing rotorcraft structures. The argument for implementing a damage-tolerance method comes from the fundamental fact that rotorcraft structures typically fail by fatigue cracking. Therefore, if technology permits prediction of fatigue-crack growth in structures, a damage-tolerance method should deliver the most accurate prediction of component life. Implementing damage tolerance (DT) in high-cycle-fatigue (HCF) components will require a shift from traditional DT methods that rely on detecting an initial flaw with nondestructive inspection (NDI) methods. The rapid accumulation of cycles in an HCF component means that a design based on a traditional DT method is either impractical because of frequent inspections or too heavy to operate efficiently. Furthermore, once an HCF component develops a detectable propagating crack, the remaining fatigue life is short, sometimes less than one flight hour, which does not leave sufficient time for inspection. Therefore, designing an HCF component will require basing the life analysis on an initial flaw that is undetectable with current NDI technology.
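
    A life analysis based on an assumed initial flaw can be illustrated with a textbook Paris-law integration. This is not the authors' method: the material constants and flaw sizes below are hypothetical, and a unit geometry factor is assumed.

```python
import math

def cycles_to_failure(a0, ac, dsigma, C, m, beta=1.0, n_steps=10_000):
    """Midpoint integration of the Paris law da/dN = C*(dK)**m with
    dK = beta * dsigma * sqrt(pi * a). Consistent units assumed (MPa, m)."""
    da, N = (ac - a0) / n_steps, 0.0
    for i in range(n_steps):
        a_mid = a0 + (i + 0.5) * da
        dK = beta * dsigma * math.sqrt(math.pi * a_mid)
        N += da / (C * dK ** m)
    return N

# Hypothetical values: life from an assumed undetectable 0.5 mm initial flaw
# to a 10 mm critical crack under a 100 MPa stress range.
N_life = cycles_to_failure(a0=0.5e-3, ac=10.0e-3, dsigma=100.0, C=1.0e-11, m=3.0)
```

    Because growth accelerates with crack length, most of the computed life is spent while the crack is still near its small initial size, which is why basing the analysis on an undetectable initial flaw, rather than on inspection, is attractive for HCF components.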

  12. 77 FR 4890 - Damage Tolerance and Fatigue Evaluation for Composite Rotorcraft Structures, and Damage Tolerance...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-02-01

    ... Composite Rotorcraft Structures'' (76 FR 74655), published December 1, 2011, and ``Damage Tolerance and... Tolerance and Fatigue Evaluation for Composite Rotorcraft Structures'' (76 FR 74655). On December 2, 2011... Structures'' (76 FR 75435). In the ``Composite Rotorcraft Structures'' final rule, the FAA amended...

  13. Damage Tolerance of Composite Laminates from an Empirical Perspective

    NASA Technical Reports Server (NTRS)

    Nettles, Alan T.

    2009-01-01

    Damage tolerance consists of analysis and experimentation working together. Impact damage is usually of most concern for laminated composites. Once impacted, the residual compression strength is usually of most interest. Other properties may be of more interest than compression (application dependent). A damage tolerance program is application specific (not everyone is building aircraft). The "Building Block Approach" is suggested for damage tolerance. Advantage can be taken of the excellent fatigue resistance of damaged laminates to save time and costs.

  14. Low cost damage tolerant composite fabrication

    NASA Technical Reports Server (NTRS)

    Palmer, R. J.; Freeman, W. T.

    1988-01-01

    The resin transfer molding (RTM) process applied to composite aircraft parts offers the potential for using low cost resin systems with dry graphite fabrics that can be significantly less expensive than prepreg tape fabricated components. Stitched graphite fabric composites have demonstrated compression after impact failure performance that equals or exceeds that of thermoplastic or tough thermoset matrix composites. This paper reviews methods developed to fabricate complex shape composite parts using stitched graphite fabrics to increase damage tolerance with RTM processes to reduce fabrication cost.

  15. Damage tolerant design using collapse techniques

    NASA Technical Reports Server (NTRS)

    Haftka, R. T.

    1982-01-01

    A new approach to the design of structures for improved global damage tolerance is presented. In its undamaged condition the structure is designed subject to strength, displacement and buckling constraints. In the damaged condition the only constraint is that the structure will not collapse. The collapse load calculation is formulated as a maximization problem and solved by an interior extended penalty function. The design for minimum weight subject to constraints on the undamaged structure and a specified level of the collapse load is a minimization problem which is also solved by a penalty function formulation. Thus the overall problem is of a nested or multilevel optimization. Examples are presented to demonstrate the difference between the present and more traditional approaches.

  16. Damage-tolerance strategies for nacre tablets.

    PubMed

    Wang, Shengnan; Zhu, Xinqiao; Li, Qiyang; Wang, Rizhi; Wang, Xiaoxiang

    2016-05-01

    Nacre, a natural armor, exhibits prominent penetration resistance against predatory attacks. Unraveling its hierarchical toughening mechanisms and damage-tolerance design strategies may provide significant inspiration for the pursuit of high-performance artificial armors. In this work, relationships between the structure and mechanical performance of nacre were investigated. The results show that other than their brick-and-mortar structure, individual nacre tablets significantly contribute to the damage localization of nacre. Affected by intracrystalline organics, the tablets exhibit a unique fracture behavior. The synergistic action of the nanoscale deformation mechanisms increases the energy dissipation efficiency of the tablets and contributes to the preservation of the structural and functional integrity of the shell. PMID:26892674

  17. High damage tolerance of electrochemically lithiated silicon

    SciTech Connect

    Wang, Xueju; Fan, Feifei; Wang, Jiangwei; Wang, Haoran; Tao, Siyu; Yang, Avery; Liu, Yang; Beng Chew, Huck; Mao, Scott X.; Zhu, Ting; Xia, Shuman

    2015-09-24

    Mechanical degradation and resultant capacity fade in high-capacity electrode materials critically hinder their use in high-performance rechargeable batteries. Despite tremendous efforts devoted to the study of the electro–chemo–mechanical behaviours of high-capacity electrode materials, their fracture properties and mechanisms remain largely unknown. In this paper, we report a nanomechanical study on the damage tolerance of electrochemically lithiated silicon. Our in situ transmission electron microscopy experiments reveal a striking contrast of brittle fracture in pristine silicon versus ductile tensile deformation in fully lithiated silicon. Quantitative fracture toughness measurements by nanoindentation show a rapid brittle-to-ductile transition of fracture as the lithium-to-silicon molar ratio is increased to above 1.5. Molecular dynamics simulations elucidate the mechanistic underpinnings of the brittle-to-ductile transition governed by atomic bonding and lithiation-induced toughening. Finally, our results reveal the high damage tolerance in amorphous lithium-rich silicon alloys and have important implications for the development of durable rechargeable batteries.

  18. High damage tolerance of electrochemically lithiated silicon

    DOE PAGES Beta

    Wang, Xueju; Fan, Feifei; Wang, Jiangwei; Wang, Haoran; Tao, Siyu; Yang, Avery; Liu, Yang; Beng Chew, Huck; Mao, Scott X.; Zhu, Ting; et al

    2015-09-24

    Mechanical degradation and resultant capacity fade in high-capacity electrode materials critically hinder their use in high-performance rechargeable batteries. Despite tremendous efforts devoted to the study of the electro–chemo–mechanical behaviours of high-capacity electrode materials, their fracture properties and mechanisms remain largely unknown. In this paper, we report a nanomechanical study on the damage tolerance of electrochemically lithiated silicon. Our in situ transmission electron microscopy experiments reveal a striking contrast of brittle fracture in pristine silicon versus ductile tensile deformation in fully lithiated silicon. Quantitative fracture toughness measurements by nanoindentation show a rapid brittle-to-ductile transition of fracture as the lithium-to-silicon molar ratio is increased to above 1.5. Molecular dynamics simulations elucidate the mechanistic underpinnings of the brittle-to-ductile transition governed by atomic bonding and lithiation-induced toughening. Finally, our results reveal the high damage tolerance in amorphous lithium-rich silicon alloys and have important implications for the development of durable rechargeable batteries.

  19. High damage tolerance of electrochemically lithiated silicon

    PubMed Central

    Wang, Xueju; Fan, Feifei; Wang, Jiangwei; Wang, Haoran; Tao, Siyu; Yang, Avery; Liu, Yang; Beng Chew, Huck; Mao, Scott X.; Zhu, Ting; Xia, Shuman

    2015-01-01

    Mechanical degradation and resultant capacity fade in high-capacity electrode materials critically hinder their use in high-performance rechargeable batteries. Despite tremendous efforts devoted to the study of the electro–chemo–mechanical behaviours of high-capacity electrode materials, their fracture properties and mechanisms remain largely unknown. Here we report a nanomechanical study on the damage tolerance of electrochemically lithiated silicon. Our in situ transmission electron microscopy experiments reveal a striking contrast of brittle fracture in pristine silicon versus ductile tensile deformation in fully lithiated silicon. Quantitative fracture toughness measurements by nanoindentation show a rapid brittle-to-ductile transition of fracture as the lithium-to-silicon molar ratio is increased to above 1.5. Molecular dynamics simulations elucidate the mechanistic underpinnings of the brittle-to-ductile transition governed by atomic bonding and lithiation-induced toughening. Our results reveal the high damage tolerance in amorphous lithium-rich silicon alloys and have important implications for the development of durable rechargeable batteries. PMID:26400671

  20. Probabilistic Fatigue Damage Prognosis Using a Surrogate Model Trained Via 3D Finite Element Analysis

    NASA Technical Reports Server (NTRS)

    Leser, Patrick E.; Hochhalter, Jacob D.; Newman, John A.; Leser, William P.; Warner, James E.; Wawrzynek, Paul A.; Yuan, Fuh-Gwo

    2015-01-01

    Utilizing inverse uncertainty quantification techniques, structural health monitoring can be integrated with damage progression models to form probabilistic predictions of a structure's remaining useful life. However, damage evolution in realistic structures is physically complex. Accurately representing this behavior requires high-fidelity models which are typically computationally prohibitive. In the present work, a high-fidelity finite element model is represented by a surrogate model, reducing computation times. The new approach is used with damage diagnosis data to form a probabilistic prediction of remaining useful life for a test specimen under mixed-mode conditions.
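
    The inverse-UQ workflow (measurements, then a posterior over model parameters, then a probabilistic remaining-useful-life estimate) can be sketched with a deliberately simple damage-progression model standing in for the finite element surrogate. Everything here is hypothetical: exponential crack growth with an unknown rate k, a grid posterior in place of MCMC, and invented measurement values.

```python
import math, random

random.seed(3)

# Simple damage-progression model standing in for the surrogate:
# crack length a(N) = a0 * exp(k * N), with the growth rate k unknown.
a0, k_true, ac = 1.0, 2.0e-4, 10.0       # mm, 1/cycle, critical size (invented)
obs_N = [1000, 2000, 3000, 4000]
noise = 0.05                             # ~5% multiplicative measurement error
obs_a = [a0 * math.exp(k_true * N) * (1.0 + random.gauss(0.0, noise)) for N in obs_N]

# Grid posterior over k (flat prior, Gaussian likelihood on log crack size).
ks = [1.0e-4 + i * 2.0e-6 for i in range(151)]
def log_like(k):
    return -0.5 * sum(((math.log(a / a0) - k * N) / noise) ** 2
                      for N, a in zip(obs_N, obs_a))
ll = [log_like(k) for k in ks]
ll_max = max(ll)
post = [math.exp(l - ll_max) for l in ll]
Z = sum(post)
post = [p / Z for p in post]

# Remaining useful life (cycles from the last measurement to the critical
# size) as a distribution induced by the posterior over k.
a_now = obs_a[-1]
rul = [math.log(ac / a_now) / k for k in ks]
rul_mean = sum(p * r for p, r in zip(post, rul))
k_map = ks[post.index(max(post))]        # posterior-mode growth rate
```

    In the paper the cheap forward model is a trained surrogate for a 3D finite element analysis, and the posterior is sampled with MCMC rather than tabulated on a grid, but the diagnosis-to-prognosis flow is the same.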

  1. A probabilistic model for the fault tolerance of multilayer perceptrons.

    PubMed

    Merchawi, N S; Kumara, S T; Das, C R

    1996-01-01

    This paper presents a theoretical approach to determine the probability of misclassification of the multilayer perceptron (MLP) neural model, subject to weight errors. The type of applications considered are classification/recognition tasks involving binary input-output mappings. The analytical models are validated via simulation of a small illustrative example. The theoretical results, in agreement with simulation results, show that, for the example considered, Gaussian weight errors of standard deviation up to 22% of the weight value can be tolerated. The theoretical method developed here adds predictability to the fault tolerance capability of neural nets and shows that this capability is heavily dependent on the problem data.
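
    A Monte Carlo version of the question the paper answers analytically: how often does a binary-mapping MLP misclassify when each weight is perturbed by a Gaussian error proportional to its value? The hard-threshold XOR network below is a toy stand-in, so its tolerance threshold differs from the 22% figure reported for the paper's example.

```python
import random

random.seed(7)

def step(x):
    return 1 if x > 0 else 0

# Fixed 2-2-1 threshold MLP implementing XOR; the last weight in each row is
# the bias. This toy net stands in for the MLP of the paper.
W_HIDDEN = [[1.0, 1.0, -0.5],   # OR unit
            [1.0, 1.0, -1.5]]   # AND unit
W_OUT    = [1.0, -2.0, -0.5]    # OR and not-AND -> XOR

def forward(x, Wh, Wo):
    h = [step(w[0] * x[0] + w[1] * x[1] + w[2]) for w in Wh]
    return step(Wo[0] * h[0] + Wo[1] * h[1] + Wo[2])

def perturb(W, s):
    """Multiply each weight by (1 + e), e ~ N(0, s): Gaussian errors with
    standard deviation proportional to the weight value, as in the paper."""
    if isinstance(W[0], list):
        return [perturb(row, s) for row in W]
    return [w * (1.0 + random.gauss(0.0, s)) for w in W]

INPUTS  = [(0, 0), (0, 1), (1, 0), (1, 1)]
TARGETS = [0, 1, 1, 0]

def misclassification_rate(s, trials=2000):
    errors = 0
    for _ in range(trials):
        Wh, Wo = perturb(W_HIDDEN, s), perturb(W_OUT, s)
        errors += sum(forward(x, Wh, Wo) != t for x, t in zip(INPUTS, TARGETS))
    return errors / (trials * len(INPUTS))

r_small = misclassification_rate(0.05)   # 5% weight errors: essentially tolerated
r_large = misclassification_rate(0.50)   # 50% weight errors: frequent failures
```

    The paper's contribution is to predict this curve analytically instead of simulating it; the simulation here plays the role of the validation study mentioned in the abstract.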

  2. Damage Tolerance Analysis of a Pressurized Liquid Oxygen Tank

    NASA Technical Reports Server (NTRS)

    Forth, Scott C.; Harvin, Stephen F.; Gregory, Peyton B.; Mason, Brian H.; Thompson, Joe E.; Hoffman, Eric K.

    2006-01-01

    A damage tolerance assessment was conducted of an 8,000 gallon pressurized Liquid Oxygen (LOX) tank. The LOX tank is constructed of a stainless steel pressure vessel enclosed by a thermal-insulating vacuum jacket. The vessel is pressurized to 2,250 psi with gaseous nitrogen, resulting in both thermal and pressure stresses on the tank wall. Finite element analyses were performed on the tank to characterize the stresses from operation. Engineering material data were obtained from both the construction of the tank and the technical literature. An initial damage state was assumed based on records of a nondestructive inspection performed on the tank. The damage tolerance analyses were conducted using the NASGRO computer code. This paper contains the assumptions and justifications made for the input parameters to the damage tolerance analyses, as well as the results of the analyses, with a discussion of the operational safety of the LOX tank.

  3. Damage-Tolerant Composites Made By Stitching Carbon Fabrics

    NASA Technical Reports Server (NTRS)

    Dow, Marvin B.; Smith, Donald L.

    1992-01-01

    Work conducted at NASA Langley Research Center to investigate stitching combined with resin transfer molding to make composites more tolerant of damage and potentially cost competitive with metals. Composite materials tailored for damage tolerance by stitching layers of dry carbon fabric with closely spaced threads to provide reinforcement through thickness. Epoxy resin then infused into stitched preforms, and epoxy was cured. Various stitching patterns and thread materials evaluated by use of flat plate specimens. Also, blade-stiffened structural elements fabricated and tested. Stitched flat laminates showed outstanding damage tolerance, excellent compression strength in notched specimens, and acceptable fatigue behavior. Development of particular interest to aircraft and automotive industries.

  4. Damage-tolerant composite materials produced by stitching carbon fibers

    NASA Technical Reports Server (NTRS)

    Dow, Marvin B.; Smith, Donald L.

    1989-01-01

    NASA-Langley has undertaken the investigation of composite damage-tolerance enhancement and fabrication economies-maximization via reinforcement-stitching, in combination with resin transfer molding. Attention is given to results obtained by an experimental evaluation of composites tailored for damage tolerance by stitching layers of dry carbon-fiber fabric with closely-spaced threads, in order to furnish through-the-thickness reinforcement. Various stitching patterns and thread materials have been evaluated, using flat-plate specimens; blade-stiffened structural elements have been fabricated and tested. The results presented indicate that stitched laminates furnish damage tolerance performance comparable to that of more expensive, toughened-matrix composites.

  5. An Approach to Risk-Based Design Incorporating Damage Tolerance Analyses

    NASA Technical Reports Server (NTRS)

    Knight, Norman F., Jr.; Glaessgen, Edward H.; Sleight, David W.

    2002-01-01

    Incorporating risk-based design as an integral part of spacecraft development is becoming more and more common. Assessment of uncertainties associated with design parameters and environmental aspects such as loading provides increased knowledge of the design and its performance. Results of such studies can contribute to mitigating risk through a system-level assessment. Understanding the risk of an event occurring, the probability of its occurrence, and the consequences of its occurrence can lead to robust, reliable designs. This paper describes an approach to risk-based structural design incorporating damage-tolerance analysis. The application of this approach to a candidate Earth-entry vehicle is described. The emphasis of the paper is on describing an approach for establishing damage-tolerant structural response inputs to a system-level probabilistic risk assessment.

  6. Probabilistic Assessment of Structural Seismic Damage for Buildings in Mid-America

    SciTech Connect

    Bai, Jong-Wha; Hueste, Mary Beth D.; Gardoni, Paolo

    2008-07-08

    This paper provides an approach to conduct a probabilistic assessment of structural damage due to seismic events with an application to typical building structures in Mid-America. The developed methodology includes modified damage state classifications based on the ATC-13 and ATC-38 damage states and the ATC-38 database of building damage. Damage factors are assigned to each damage state to quantify structural damage as a percentage of structural replacement cost. To account for the inherent uncertainties, these factors are expressed as random variables with a Beta distribution. A set of fragility curves, quantifying the structural vulnerability of a building, is mapped onto the developed methodology to determine the expected structural damage. The total structural damage factor for a given seismic intensity is then calculated using a probabilistic approach. Prediction and confidence bands are also constructed to account for the prevailing uncertainties. The expected seismic structural damage is assessed for a typical building structure in the Mid-America region using the developed methodology. The developed methodology provides a transparent procedure, where the structural damage factors can be updated as additional seismic damage data becomes available.
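
    The mapping from fragility curves and per-state damage factors to an expected structural damage factor can be sketched as follows. The lognormal fragility parameters and mean damage factors are invented for illustration; the paper derives its damage factors as Beta random variables from the ATC-13 and ATC-38 data, and only hypothetical means are used here.

```python
import math

def norm_cdf(z):
    """Standard normal CDF."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# Hypothetical lognormal fragility curves, P(damage >= state | Sa):
# (median capacity in g, lognormal dispersion) per damage state.
FRAGILITY = {"light": (0.15, 0.6), "moderate": (0.40, 0.6), "severe": (0.90, 0.6)}

# Mean damage factor (fraction of structural replacement cost) per state.
MEAN_DF = {"none": 0.0, "light": 0.05, "moderate": 0.25, "severe": 0.70}

def p_exceed(sa, median, beta):
    return norm_cdf(math.log(sa / median) / beta)

def expected_damage_factor(sa):
    pl = p_exceed(sa, *FRAGILITY["light"])
    pm = p_exceed(sa, *FRAGILITY["moderate"])
    ps = p_exceed(sa, *FRAGILITY["severe"])
    # Exceedance probabilities differenced into state-occupancy probabilities.
    probs = {"none": 1.0 - pl, "light": pl - pm, "moderate": pm - ps, "severe": ps}
    return sum(probs[ds] * MEAN_DF[ds] for ds in probs)

edf = expected_damage_factor(0.4)   # expected damage factor at Sa = 0.4 g
```

    Replacing the fixed means with draws from the Beta distributions, and repeating over many draws, yields the prediction and confidence bands described in the abstract.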

  7. Progressive Fracture and Damage Tolerance of Composite Pressure Vessels

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.; Gotsis, Pascal K.; Minnetyan, Levon

    1997-01-01

    Structural performance (integrity, durability and damage tolerance) of fiber reinforced composite pressure vessels, designed for pressured shelters for planetary exploration, is investigated via computational simulation. An integrated computer code is utilized for the simulation of damage initiation, growth, and propagation under pressure. Aramid fibers are considered in a rubbery polymer matrix for the composite system. Effects of fiber orientation and fabrication defect/accidental damages are investigated with regard to the safety and durability of the shelter. Results show the viability of fiber reinforced pressure vessels as damage tolerant shelters for planetary colonization.

  8. Probabilistic Damage Characterization Using the Computationally-Efficient Bayesian Approach

    NASA Technical Reports Server (NTRS)

    Warner, James E.; Hochhalter, Jacob D.

    2016-01-01

    This work presents a computationally-efficient approach for damage determination that quantifies uncertainty in the provided diagnosis. Given strain sensor data that are polluted with measurement errors, Bayesian inference is used to estimate the location, size, and orientation of damage. This approach uses Bayes' Theorem to combine any prior knowledge an analyst may have about the nature of the damage with information provided implicitly by the strain sensor data to form a posterior probability distribution over possible damage states. The unknown damage parameters are then estimated based on samples drawn numerically from this distribution using a Markov Chain Monte Carlo (MCMC) sampling algorithm. Several modifications are made to the traditional Bayesian inference approach to provide significant computational speedup. First, an efficient surrogate model is constructed using sparse grid interpolation to replace a costly finite element model that must otherwise be evaluated for each sample drawn with MCMC. Next, the standard Bayesian posterior distribution is modified using a weighted likelihood formulation, which is shown to improve the convergence of the sampling process. Finally, a robust MCMC algorithm, Delayed Rejection Adaptive Metropolis (DRAM), is adopted to sample the probability distribution more efficiently. Numerical examples demonstrate that the proposed framework effectively provides damage estimates with uncertainty quantification and can yield orders of magnitude speedup over standard Bayesian approaches.
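
    The core loop (a posterior from Bayes' theorem, sampled with MCMC) can be sketched with a plain random-walk Metropolis sampler and a hypothetical three-sensor strain model standing in for the surrogate. The paper's DRAM sampler, sparse-grid surrogate, and weighted likelihood are not reproduced here.

```python
import math, random

random.seed(11)

# Hypothetical forward model standing in for the sparse-grid surrogate:
# predicted strains at three sensors as functions of one damage parameter.
def predict(theta):
    return [100.0 + 40.0 * theta, 80.0 + 25.0 * theta ** 2, 60.0 - 15.0 * theta]

THETA_TRUE, SIGMA = 1.5, 2.0                 # invented truth and noise level
data = [mu + random.gauss(0.0, SIGMA) for mu in predict(THETA_TRUE)]

def log_post(theta):
    """Bayes' theorem: flat prior on [0, 5] times a Gaussian likelihood."""
    if not 0.0 <= theta <= 5.0:
        return -math.inf
    return -0.5 * sum(((d - mu) / SIGMA) ** 2 for d, mu in zip(data, predict(theta)))

# Random-walk Metropolis (the paper uses the more robust DRAM variant).
samples, theta, lp = [], 2.5, log_post(2.5)
for i in range(20_000):
    prop = theta + random.gauss(0.0, 0.2)
    lp_prop = log_post(prop)
    if math.log(random.random()) < lp_prop - lp:   # accept/reject step
        theta, lp = prop, lp_prop
    if i >= 5_000:                                 # discard burn-in
        samples.append(theta)

theta_mean = sum(samples) / len(samples)           # damage estimate
theta_sd = (sum((s - theta_mean) ** 2 for s in samples) / len(samples)) ** 0.5
```

    The sample spread (theta_sd) is the uncertainty quantification the abstract emphasizes: the diagnosis is a distribution over damage states, not a single value.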

  9. Impact damage tolerance of thin wall composite struts

    NASA Astrophysics Data System (ADS)

    Chen, G.-S.; Bidinger, G. M.; Lou, M. C.

    1993-04-01

    An experimental investigation was made to study the impact damage tolerance of thin wall composite struts made of both brittle epoxy and toughened epoxy based composite materials. Damage parameters such as barely visible surface damage and internal damage represented by the ultrasonic C-scan, and residual compressive strengths were evaluated against impact energy for two impactor sizes. From both a damage resistance (internal damage vs. impact energy) and a damage tolerance (residual compressive strength vs. internal damage) point of view, the toughened IM7/977-2 struts exhibited better performance than the brittle epoxy based T50/934 struts. This is attributed to the toughening mechanism in 977-2 which impedes delamination initiation from impact, and delamination growth and subsequent buckling under a compression loading. At barely visible damage thresholds, regardless of the impactor sizes, a maximum strength reduction of 45-55 percent was observed for the T50/934 struts, and approximately 10 percent for IM7/977-2 struts. This is of great interest for developing a damage tolerance design approach and risk assessment methodology in which the design allowable would be defined by the residual strength at the threshold of barely visible damage.

  10. Impact damage tolerance of thin wall composite struts

    NASA Technical Reports Server (NTRS)

    Chen, G.-S.; Bidinger, G. M.; Lou, M. C.

    1993-01-01

    An experimental investigation was made to study the impact damage tolerance of thin wall composite struts made of both brittle epoxy and toughened epoxy based composite materials. Damage parameters such as barely visible surface damage and internal damage represented by the ultrasonic C-scan, and residual compressive strengths were evaluated against impact energy for two impactor sizes. From both a damage resistance (internal damage vs. impact energy) and a damage tolerance (residual compressive strength vs. internal damage) point of view, the toughened IM7/977-2 struts exhibited better performance than the brittle epoxy based T50/934 struts. This is attributed to the toughening mechanism in 977-2 which impedes delamination initiation from impact, and delamination growth and subsequent buckling under a compression loading. At barely visible damage thresholds, regardless of the impactor sizes, a maximum strength reduction of 45-55 percent was observed for the T50/934 struts, and approximately 10 percent for IM7/977-2 struts. This is of great interest for developing a damage tolerance design approach and risk assessment methodology in which the design allowable would be defined by the residual strength at the threshold of barely visible damage.

  11. Some Examples of the Relations Between Processing and Damage Tolerance

    NASA Technical Reports Server (NTRS)

    Nettles, Alan T.

    2012-01-01

    Most structures made of laminated polymer matrix composites (PMCs) must be designed to some damage tolerance requirement that includes foreign object impact damage. Thus, from the beginning of a part's life, impact damage is assumed to exist in the material and the part is designed to carry the required load with the prescribed impact damage present. By doing this, some processing defects may automatically be accounted for in the reduced design allowable due to these impacts. This paper will present examples of how a given level of impact damage and certain processing defects affect the compression strength of a laminate that contains both. Knowledge of the impact damage tolerance requirements, before processing begins, can broaden material options and processing techniques since the structure is not being designed to pristine properties.

  12. Damage tolerance and structural monitoring for wind turbine blades

    PubMed Central

    McGugan, M.; Pereira, G.; Sørensen, B. F.; Toftegaard, H.; Branner, K.

    2015-01-01

    The paper proposes a methodology for reliable design and maintenance of wind turbine rotor blades using a condition monitoring approach and a damage tolerance index coupling the material and structure. By improving the understanding of material properties that control damage propagation it will be possible to combine damage tolerant structural design, monitoring systems, inspection techniques and modelling to manage the life cycle of the structures. This will allow an efficient operation of the wind turbine in terms of load alleviation, limited maintenance and repair leading to a more effective exploitation of offshore wind. PMID:25583858

  13. Damage tolerance and structural monitoring for wind turbine blades.

    PubMed

    McGugan, M; Pereira, G; Sørensen, B F; Toftegaard, H; Branner, K

    2015-02-28

    The paper proposes a methodology for reliable design and maintenance of wind turbine rotor blades using a condition monitoring approach and a damage tolerance index coupling the material and structure. By improving the understanding of material properties that control damage propagation it will be possible to combine damage tolerant structural design, monitoring systems, inspection techniques and modelling to manage the life cycle of the structures. This will allow an efficient operation of the wind turbine in terms of load alleviation, limited maintenance and repair leading to a more effective exploitation of offshore wind.

  15. Damage Tolerance Issues as Related to Metallic Rotorcraft Dynamic Components

    NASA Technical Reports Server (NTRS)

    Everett, R. A., Jr.; Elber, W.

    2005-01-01

    In this paper issues related to the use of damage tolerance in life managing rotorcraft dynamic components are reviewed. In the past, rotorcraft fatigue design has combined constant amplitude tests of full-scale parts with flight loads and usage data in a conservative manner to provide "safe life" component replacement times. In contrast to the safe life approach over the past twenty years the United States Air Force and several other NATO nations have used damage tolerance design philosophies for fixed wing aircraft to improve safety and reliability. The reliability of the safe life approach being used in rotorcraft started to be questioned shortly after presentations at an American Helicopter Society's specialist meeting in 1980 showed predicted fatigue lives for a hypothetical pitch-link problem to vary from a low of 9 hours to a high in excess of 2594 hours. This presented serious cost, weight, and reliability implications. Somewhat after the U.S. Army introduced its six nines reliability on fatigue life, attention shifted towards using a possible damage tolerance approach to the life management of rotorcraft dynamic components. The use of damage tolerance in life management of dynamic rotorcraft parts will be the subject of this paper. This review will start with past studies on using damage tolerance life management with existing helicopter parts that were safe life designed. Also covered will be a successful attempt at certifying a tail rotor pitch rod using damage tolerance, which was designed using the safe life approach. The FAA review of rotorcraft fatigue design and their recommendations along with some on-going U.S. industry research in damage tolerance on rotorcraft will be reviewed. Finally, possible problems and future needs for research will be highlighted.

  16. Damage Tolerance Issues as Related to Metallic Rotorcraft Dynamic Components

    NASA Technical Reports Server (NTRS)

    Everett, R. A., Jr.; Elber, W.

    1999-01-01

    In this paper issues related to the use of damage tolerance in life managing rotorcraft dynamic components are reviewed. In the past, rotorcraft fatigue design has combined constant amplitude tests of full-scale parts with flight loads and usage data in a conservative manner to provide "safe life" component replacement times. In contrast to the safe life approach over the past twenty years the United States Air Force and several other NATO nations have used damage tolerance design philosophies for fixed wing aircraft to improve safety and reliability. The reliability of the safe life approach being used in rotorcraft started to be questioned shortly after presentations at an American Helicopter Society's specialist meeting in 1980 showed predicted fatigue lives for a hypothetical pitch-link problem to vary from a low of 9 hours to a high in excess of 2594 hours. This presented serious cost, weight, and reliability implications. Somewhat after the U.S. Army introduced its six nines reliability on fatigue life, attention shifted towards using a possible damage tolerance approach to the life management of rotorcraft dynamic components. The use of damage tolerance in life management of dynamic rotorcraft parts will be the subject of this paper. This review will start with past studies on using damage tolerance life management with existing helicopter parts that were safe life designed. Also covered will be a successful attempt at certifying a tail rotor pitch rod using damage tolerance, which was designed using the safe life approach. The FAA review of rotorcraft fatigue design and their recommendations along with some on-going U.S. industry research in damage tolerance on rotorcraft will be reviewed.

  17. Social-Stratification Probabilistic Routing Algorithm in Delay-Tolerant Network

    NASA Astrophysics Data System (ADS)

    Alnajjar, Fuad; Saadawi, Tarek

    Routing in mobile ad hoc networks (MANET) is complicated by the fact that the network graph is only episodically connected. In a MANET, the topology changes rapidly because of weather, terrain and jamming. A key challenge is to create a mechanism that can provide good delivery performance and low end-to-end delay in an intermittent network graph where nodes may move freely. The Delay-Tolerant Networking (DTN) architecture is designed to provide communication in intermittently connected networks by moving messages towards the destination via a "store, carry and forward" technique that supports multi-routing algorithms to acquire the best path towards the destination. In this paper, we propose the use of probabilistic routing in the DTN architecture using the concept of a social-stratification network. We use the Opportunistic Network Environment (ONE) simulator as a simulation tool to compare the proposed Social-stratification Probabilistic Routing Algorithm (SPRA) with the common DTN-based protocols. Our results show that SPRA outperforms the other protocols.
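
The abstract does not give SPRA's update rules, so as an illustration the sketch below implements the classic PRoPHET delivery-predictability updates that probabilistic DTN routing is typically built on (encounter, aging, transitivity), using the usual PRoPHET default constants; SPRA's social-stratification weighting is not represented.

```python
# PRoPHET-style delivery predictability, the mechanism probabilistic
# DTN routers build on. Constants are the common PRoPHET defaults.
P_INIT, BETA, GAMMA = 0.75, 0.25, 0.98

def on_encounter(P, a, b):
    """Nodes a and b meet: a becomes a better carrier for b."""
    old = P.get((a, b), 0.0)
    P[(a, b)] = old + (1.0 - old) * P_INIT

def age(P, pair, k):
    """Predictability decays over k time units without contact."""
    P[pair] = P.get(pair, 0.0) * GAMMA ** k

def transitive(P, a, b, c):
    """If a meets b often and b meets c often, a is a useful relay for c."""
    old = P.get((a, c), 0.0)
    P[(a, c)] = old + (1.0 - old) * P.get((a, b), 0.0) * P.get((b, c), 0.0) * BETA

P = {}
on_encounter(P, "a", "b")      # direct contact a-b
on_encounter(P, "b", "c")      # direct contact b-c
transitive(P, "a", "b", "c")   # a gains an indirect route to c via b
age(P, ("a", "b"), k=10)       # decay after 10 idle time units
print(round(P[("a", "c")], 4), round(P[("a", "b")], 4))
```

A forwarding decision then hands a message to an encountered node only if that node's predictability for the destination exceeds the carrier's own.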

  18. On the enhancement of impact damage tolerance of composite laminates

    NASA Technical Reports Server (NTRS)

    Nettles, A. T.; Lance, D. G.

    1993-01-01

    This paper examines the use of a thin layer of Ultra High Molecular Weight Polyethylene (UHMWPE) on the outer surface of carbon/epoxy composite materials as a method of improving impact resistance and damage tolerance through hybridization. Flat 16-ply laminates as well as honeycomb sandwich structures with eight-ply facesheets were tested in this study. Instrumented drop-weight impact testing was used to inflict damage upon the specimens. Evaluation of damage resistance included instrumented impact data, visual examination, C-scanning and compression after impact (CAI) testing. The results show that only one lamina of UHMWPE did not improve the damage tolerance (strength retention) of the 16-ply flat laminate specimens or the honeycomb sandwich beams, however, a modest gain in impact resistance (detectable damage) was found for the honeycomb sandwich specimens that contained an outer layer of UHMWPE.

  19. Probabilistic, multi-variate flood damage modelling using random forests and Bayesian networks

    NASA Astrophysics Data System (ADS)

    Kreibich, Heidi; Schröter, Kai

    2015-04-01

    Decisions on flood risk management and adaptation are increasingly based on risk analyses. Such analyses are associated with considerable uncertainty, even more so if changes in risk due to global change are expected. Although uncertainty analysis and probabilistic approaches have received increased attention recently, they are hardly applied in flood damage assessments. Most of the damage models applied in standard practice have in common that complex damaging processes are described by simple, deterministic approaches like stage-damage functions. This presentation will show approaches for probabilistic, multi-variate flood damage modelling on the micro- and meso-scale and discuss their potential and limitations. References: Merz, B.; Kreibich, H.; Lall, U. (2013): Multi-variate flood damage assessment: a tree-based data-mining approach. NHESS, 13(1), 53-64. Schröter, K.; Kreibich, H.; Vogel, K.; Riggelsen, C.; Scherbaum, F.; Merz, B. (2014): How useful are complex flood damage models? Water Resources Research, 50(4), 3378-3395.
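
To make the contrast concrete, here is a toy comparison of a deterministic stage-damage function with a tree-style, multi-variate predictor that returns a distribution rather than a point estimate; all thresholds, records, and values are invented for illustration and are not from the cited models.

```python
def stage_damage(depth_m):
    """Classic deterministic approach: damage ratio from water depth alone."""
    return min(1.0, 0.25 * depth_m)

# Toy training records: (water depth [m], building area [m^2], damage ratio).
records = [
    (0.4, 80, 0.05), (0.5, 90, 0.10), (1.2, 85, 0.30),
    (1.4, 200, 0.15), (2.5, 95, 0.70), (2.8, 210, 0.45),
]

def leaf(depth_m, area_m2):
    """One hand-built split of a 'tree': partition on depth, then on area."""
    return [d for dep, a, d in records
            if (dep > 1.0) == (depth_m > 1.0) and (a > 150) == (area_m2 > 150)]

def predict_distribution(depth_m, area_m2):
    """Probabilistic output: the empirical leaf distribution, not a point."""
    damages = leaf(depth_m, area_m2)
    mean = sum(damages) / len(damages)
    return mean, min(damages), max(damages)

mean, lo, hi = predict_distribution(depth_m=2.0, area_m2=100)
print(round(stage_damage(2.0), 2), round(mean, 2), lo, hi)
```

The multi-variate predictor conditions on more than water depth and exposes the scatter of observed damage ratios, which is exactly the uncertainty a stage-damage function hides.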

  20. Ontogenetic contingency of tolerance mechanisms in response to apical damage

    PubMed Central

    Gruntman, Michal; Novoplansky, Ariel

    2011-01-01

    Background and Aims Plants are able to tolerate tissue loss through vigorous branching which is often triggered by release from apical dominance and activation of lateral meristems. However, damage-induced branching might not be a mere physiological outcome of released apical dominance, but an adaptive response to environmental signals, such as damage timing and intensity. Here, branching responses to both factors were examined in the annual plant Medicago truncatula. Methods Branching patterns and allocation to reproductive traits were examined in response to variable clipping intensities and timings in M. truncatula plants from two populations that vary in the onset of reproduction. Phenotypic selection analysis was used to evaluate the strength and direction of selection on branching under the damage treatments. Key Results Plants of both populations exhibited an ontogenetic shift in tolerance mechanisms: while early damage induced greater meristem activation, late damage elicited investment in late-determined traits, including mean pod and seed biomass, and supported greater germination rates. Severe damage mostly elicited simultaneous development of multiple-order lateral branches, but this response was limited to early damage. Selection analyses revealed positive directional selection on branching in plants under early- compared with late- or no-damage treatments. Conclusions The results demonstrate that damage-induced meristem activation is an adaptive response that could be modified according to the plant's developmental stage, severity of tissue loss and their interaction, stressing the importance of considering these effects when studying plastic responses to apical damage. PMID:21873259

  1. Mechanical Data for Use in Damage Tolerance Analyses

    NASA Technical Reports Server (NTRS)

    Forth, Scott C.; James, Mark A.; Newman, John A.; Everett, Richard A., Jr.; Johnston, William M., Jr.

    2004-01-01

    This report describes the results of a research program to determine the damage tolerance properties of metallic propeller materials. Three alloys were selected for investigation: 2025-T6 Aluminum, D6AC Steel and 4340 Steel. Mechanical response, fatigue (S-N) and fatigue crack growth rate data are presented for all of the alloys. The main conclusions that can be drawn from this study are as follows. The damage tolerant design of a propeller system will require a complete understanding of the fatigue crack growth threshold. There exists no experimental procedure to reliably develop the fatigue crack growth threshold data that is needed for damage tolerant design methods. Significant research will be required to fully understand the fatigue crack growth threshold. The development of alternative precracking methods, evaluating the effect of specimen configuration and attempting to identify micromechanical issues are simply the first steps to understanding the mechanics of the threshold.

  2. Rapid Damage eXplorer (RDX): A Probabilistic Framework for Learning Changes From Bitemporal Images

    SciTech Connect

    Vatsavai, Raju

    2012-01-01

    The recent decade has witnessed major changes on the Earth, for example deforestation, varying cropping and human settlement patterns, and crippling damage due to disasters. Accurate assessment of the damage caused by major natural and anthropogenic disasters is becoming critical due to increases in human and economic loss. This increase in loss of life and severe damage can be attributed to the growing population, as well as human migration to the disaster-prone regions of the world. Rapid assessment of these changes and dissemination of accurate information is critical for creating an effective emergency response. Change detection using high-resolution satellite images is a primary tool in assessing damage, monitoring biomass and critical infrastructure, and identifying new settlements. In this demo, we present a novel supervised probabilistic framework for identifying changes using very high-resolution, multispectral, bitemporal remote sensing images. Our demo shows that the Rapid Damage eXplorer (RDX) system is resilient to registration errors and differing sensor characteristics.

  3. An Experimental Investigation of Damage Resistances and Damage Tolerance of Composite Materials

    NASA Technical Reports Server (NTRS)

    Prabhakaran, R.

    2003-01-01

    The project included three lines of investigation, aimed at a better understanding of the damage resistance and damage tolerance of pultruded composites. The three lines of investigation were: (i) measurement of permanent dent depth after transverse indentation at different load levels, and correlation with other damage parameters such as damage area (from x-radiography) and back surface crack length, (ii) estimation of point stress and average stress characteristic dimensions corresponding to measured damage parameters, and (iii) an attempt to measure the damage area by a reflection photoelastic technique. All the three lines of investigation were pursued.

  4. Damage tolerant composite wing panels for transport aircraft

    NASA Technical Reports Server (NTRS)

    Smith, Peter J.; Wilson, Robert D.; Gibbins, M. N.

    1985-01-01

    Commercial aircraft advanced composite wing surface panels were tested for durability and damage tolerance. The wing of a fuel-efficient, 200-passenger airplane for 1990 delivery was sized using graphite-epoxy materials. The damage tolerance program was structured to allow a systematic progression from material evaluations to the optimized large panel verification tests. The program included coupon testing to evaluate toughened material systems, static and fatigue tests of compression coupons with varying amounts of impact damage, and element tests of three-stiffener panels to evaluate upper wing panel design concepts; in addition, the wing structure damage environment was studied. A series of technology demonstration tests of large compression panels was performed, and a repair investigation was included in the final large panel test.

  5. A Framework for Probabilistic Evaluation of Interval Management Tolerance in the Terminal Radar Control Area

    NASA Technical Reports Server (NTRS)

    Herencia-Zapana, Heber; Hagen, George E.; Neogi, Natasha

    2012-01-01

    Projections of future traffic in the national airspace show that most of the hub airports and their attendant airspace will need to undergo significant redevelopment and redesign in order to accommodate any significant increase in traffic volume. Even though closely spaced parallel approaches increase throughput into a given airport, controller workload in oversubscribed metroplexes is further taxed by these approaches that require stringent monitoring in a saturated environment. The interval management (IM) concept in the TRACON area is designed to shift some of the operational burden from the control tower to the flight deck, placing the flight crew in charge of implementing the required speed changes to maintain a relative spacing interval. The interval management tolerance is a measure of the allowable deviation from the desired spacing interval for the IM aircraft (and its target aircraft). For this complex task, Formal Methods can help to ensure better design and system implementation. In this paper, we propose a probabilistic framework to quantify the uncertainty and performance associated with the major components of the IM tolerance. The analytical basis for this framework may be used to formalize both correctness and probabilistic system safety claims in a modular fashion at the algorithmic level in a way compatible with several Formal Methods tools.

  6. Life assessment and damage tolerance of wind turbines

    NASA Astrophysics Data System (ADS)

    Wanhill, R. J. H.

    1983-11-01

    Safe and durable operation of fatigue-critical structures in high-technology windmills, including safe-life assessment and the possible application of damage tolerance principles, was surveyed. A research program to assist safe and durable operation of windmill rotors in the Netherlands is reviewed.

  7. 75 FR 11734 - Damage Tolerance Data for Repairs and Alterations

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-03-12

    ... make damage tolerance data for repairs and alterations to fatigue critical airplane structure available... that affect fatigue critical baseline structure. Operators, therefore, will have the DT data for TC... critical alteration structure for alteration data approved on or after January 11, 2008. This change...

  8. Heat tolerance of higher plants cenosis to damaging air temperatures

    NASA Astrophysics Data System (ADS)

    Ushakova, Sofya; Shklavtsova, Ekaterina

    In designing sustained biological-technical life support systems (BTLSS) that include higher plants as part of a photosynthesizing unit, it is important to foresee the reaction of the multi-species cenosis to various stress factors. One such factor is a change of air temperature in the BTLSS (because of a failure of the thermoregulation system) up to values that cause irreversible damage to photosynthetic processes. However, it is possible to increase, within certain limits, the tolerance of the plant cenosis to unfavorable temperatures by choosing higher plants that are resistant both to elevated and to lowered air temperatures. In addition, the heat tolerance of the plants can be increased by subjecting them, during their growth, to hardening-off temperatures. Thus, we have come to the conclusion that it is possible to increase the heat tolerance of a multi-species cenosis under the damaging effect of an air temperature of 45°C.

  9. A Computationally-Efficient Inverse Approach to Probabilistic Strain-Based Damage Diagnosis

    NASA Technical Reports Server (NTRS)

    Warner, James E.; Hochhalter, Jacob D.; Leser, William P.; Leser, Patrick E.; Newman, John A

    2016-01-01

    This work presents a computationally-efficient inverse approach to probabilistic damage diagnosis. Given strain data at a limited number of measurement locations, Bayesian inference and Markov Chain Monte Carlo (MCMC) sampling are used to estimate probability distributions of the unknown location, size, and orientation of damage. Substantial computational speedup is obtained by replacing a three-dimensional finite element (FE) model with an efficient surrogate model. The approach is experimentally validated on cracked test specimens where full field strains are determined using digital image correlation (DIC). Access to full field DIC data allows for testing of different hypothetical sensor arrangements, facilitating the study of strain-based diagnosis effectiveness as the distance between damage and measurement locations increases. The ability of the framework to effectively perform both probabilistic damage localization and characterization in cracked plates is demonstrated and the impact of measurement location on uncertainty in the predictions is shown. Furthermore, the analysis time to produce these predictions is orders of magnitude less than a baseline Bayesian approach with the FE method by utilizing surrogate modeling and effective numerical sampling approaches.

  10. Optimization of Aerospace Structure Subject to Damage Tolerance Criteria

    NASA Technical Reports Server (NTRS)

    Akgun, Mehmet A.

    1999-01-01

    The objective of this cooperative agreement was to seek computationally efficient ways to optimize aerospace structures subject to damage tolerance criteria. Optimization was to involve sizing as well as topology optimization. The work was done in collaboration with Steve Scotti, Chauncey Wu and Joanne Walsh at the NASA Langley Research Center. Computation of constraint sensitivity is normally the most time-consuming step of an optimization procedure. The cooperative work first focused on this issue and implemented the adjoint method of sensitivity computation in an optimization code (runstream) written in Engineering Analysis Language (EAL). The method was implemented both for bar and plate elements, including buckling sensitivity for the latter. Lumping of constraints was investigated as a means to reduce the computational cost. Adjoint sensitivity computation was developed and implemented for lumped stress and buckling constraints. The cost of the direct method and the adjoint method was compared for various structures with and without lumping. The results were reported in two papers. It is desirable to optimize the topology of an aerospace structure subject to a large number of damage scenarios so that a damage tolerant structure is obtained. Including damage scenarios in the design procedure is critical in order to avoid large mass penalties at later stages. A common method for topology optimization is that of compliance minimization, which has not been used for damage tolerant design. In the present work, topology optimization is treated as a conventional problem aiming to minimize the weight subject to stress constraints. Multiple damage configurations (scenarios) are considered. Each configuration has its own structural stiffness matrix and, normally, requires factoring of the matrix and solution of the system of equations. Damage that is expected to be tolerated is local and represents a small change in the stiffness matrix compared to the baseline (undamaged) configuration.

  11. Damage Tolerance and Reliability of Turbine Engine Components

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.

    1998-01-01

    A formal method is described to quantify structural damage tolerance and reliability in the presence of a multitude of uncertainties in turbine engine components. The method is based at the materials behavior level, where primitive variables with their respective scatters are used to describe that behavior. Computational simulation is then used to propagate those uncertainties to the structural scale, where damage tolerance and reliability are usually specified. Several sample cases are described to illustrate the effectiveness, versatility, and maturity of the method. Typical results from these methods demonstrate that the methods are mature and that they can be used for future strategic projections and planning to assure better, cheaper, faster products for competitive advantages in world markets. These results also indicate that the methods are suitable for predicting remaining life in aging or deteriorating structures.
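
A minimal Monte Carlo sketch of the idea of propagating primitive-variable scatter up to a structural reliability figure; the distributions and the rule-of-mixtures limit state below are invented assumptions for illustration, not the method's actual material models.

```python
import random

rng = random.Random(42)

def sample_strength():
    # Primitive variables with scatter (illustrative values):
    # fiber strength [MPa] and fiber volume fraction.
    fiber_strength = rng.gauss(3500.0, 200.0)
    volume_fraction = rng.gauss(0.60, 0.03)
    # Simple rule-of-mixtures stand-in for the structural-scale strength.
    return fiber_strength * volume_fraction

applied_stress = 1800.0  # MPa, deterministic demand for this sketch
n = 100000
failures = sum(1 for _ in range(n) if sample_strength() < applied_stress)
pf = failures / n        # estimated probability of failure
print(pf)
```

Repeating the simulation with degraded primitive-variable distributions is the same mechanism by which such methods estimate remaining life in deteriorating structures.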

  12. Damage Tolerance and Reliability of Turbine Engine Components

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.

    1999-01-01

    A formal method is described to quantify structural damage tolerance and reliability in the presence of a multitude of uncertainties in turbine engine components. The method is based at the materials behavior level, where primitive variables with their respective scatters are used to describe that behavior. Computational simulation is then used to propagate those uncertainties to the structural scale, where damage tolerance and reliability are usually specified. Several sample cases are described to illustrate the effectiveness, versatility, and maturity of the method. Typical results from these methods demonstrate that the methods are mature and that they can be used for future strategic projections and planning to assure better, cheaper, faster products for competitive advantages in world markets. These results also indicate that the methods are suitable for predicting remaining life in aging or deteriorating structures.

  13. Damage-tolerant nanotwinned metals with nanovoids under radiation environments

    DOE PAGES Beta

    Chen, Y.; Yu, K. Y.; Liu, Y.; Shao, S.; Wang, H.; Kirk, M. A.; Wang, J.; Zhang, X.

    2015-04-24

    Material performance in extreme radiation environments is central to the design of future nuclear reactors. Radiation induces significant damage in the form of dislocation loops and voids in irradiated materials, and continuous radiation often leads to void growth and subsequent void swelling in metals with low stacking fault energy. Here we show that by using in situ heavy ion irradiation in a transmission electron microscope, pre-introduced nanovoids in nanotwinned Cu efficiently absorb radiation-induced defects accompanied by gradual elimination of nanovoids, enhancing radiation tolerance of Cu. In situ studies and atomistic simulations reveal that such remarkable self-healing capability stems from high density of coherent and incoherent twin boundaries that rapidly capture and transport point defects and dislocation loops to nanovoids, which act as storage bins for interstitial loops. This study describes a counterintuitive yet significant concept: deliberate introduction of nanovoids in conjunction with nanotwins enables unprecedented damage tolerance in metallic materials.

  14. Damage-tolerant nanotwinned metals with nanovoids under radiation environments

    PubMed Central

    Chen, Y.; Yu, K Y.; Liu, Y.; Shao, S.; Wang, H.; Kirk, M. A.; Wang, J.; Zhang, X.

    2015-01-01

    Material performance in extreme radiation environments is central to the design of future nuclear reactors. Radiation induces significant damage in the form of dislocation loops and voids in irradiated materials, and continuous radiation often leads to void growth and subsequent void swelling in metals with low stacking fault energy. Here we show that by using in situ heavy ion irradiation in a transmission electron microscope, pre-introduced nanovoids in nanotwinned Cu efficiently absorb radiation-induced defects accompanied by gradual elimination of nanovoids, enhancing radiation tolerance of Cu. In situ studies and atomistic simulations reveal that such remarkable self-healing capability stems from high density of coherent and incoherent twin boundaries that rapidly capture and transport point defects and dislocation loops to nanovoids, which act as storage bins for interstitial loops. This study describes a counterintuitive yet significant concept: deliberate introduction of nanovoids in conjunction with nanotwins enables unprecedented damage tolerance in metallic materials. PMID:25906997

  15. The research and development of damage tolerant carbon fiber composites

    NASA Astrophysics Data System (ADS)

    Miranda, John Armando

    This record of study takes a first-hand look at corporate research and development efforts to improve the damage tolerance of two unique composite materials used in high-performance aerospace applications. The professional internship with The Dow Chemical Company (Dow/United Technologies joint venture) describes the intern's involvement in developing patentable process technologies for interleave toughening of high-temperature resins and their composites. The subsequent internship with Hexcel Corporation describes the intern's involvement in developing the damage tolerance of novel and existing honeycomb sandwich structure technologies. Through the Doctor of Engineering professional internship experience, this student exercised fundamental academic understanding and methods toward accomplishing the corporate objectives of the internship sponsors in a resource-efficient and cost-effective manner. The student also gained tremendous autonomy through exceptional training, working in focused team environments with highly trained engineers and scientists to achieve important corporate objectives.

  16. Fatigue Crack Growth Database for Damage Tolerance Analysis

    NASA Technical Reports Server (NTRS)

    Forman, R. G.; Shivakumar, V.; Cardinal, J. W.; Williams, L. C.; McKeighan, P. C.

    2005-01-01

    The objective of this project was to begin the process of developing a fatigue crack growth database (FCGD) of metallic materials for use in damage tolerance analysis of aircraft structure. For this initial effort, crack growth rate data in the NASGRO (Registered trademark) database, the United States Air Force Damage Tolerant Design Handbook, and other publicly available sources were examined and used to develop a database that characterizes crack growth behavior for specific applications (materials). The focus of this effort was on materials for general commercial aircraft applications, including large transport airplanes, small transport commuter airplanes, general aviation airplanes, and rotorcraft. The end products of this project are the FCGD software and this report. The specific goal of this effort was to present fatigue crack growth data in three usable formats: (1) NASGRO equation parameters, (2) Walker equation parameters, and (3) tabular data points. The development of this FCGD will begin the process of developing a consistent set of standard fatigue crack growth material properties. It is envisioned that the end product of the process will be a general repository for credible and well-documented fracture properties that may be used as a default standard in damage tolerance analyses.
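
    The Walker equation named above is a stress-ratio (R) correction to the Paris law; the NASGRO equation extends the same idea with additional threshold and toughness terms. A minimal sketch of the Walker form, with illustrative placeholder coefficients rather than handbook values:

```python
def walker_rate(dK, R, C=1e-11, n=3.0, gamma=0.5):
    """Walker-modified Paris law: da/dN = C * [dK / (1 - R)**(1 - gamma)]**n.
    C, n, and gamma here are illustrative placeholders, not handbook data."""
    dK_eff = dK / (1.0 - R) ** (1.0 - gamma)  # stress-ratio-corrected range
    return C * dK_eff ** n
```

    With gamma = 0.5, raising R at fixed dK increases the predicted growth rate, which is the mean-stress effect the tabulated parameters are meant to capture.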

  17. Elastic properties, strength and damage tolerance of pultruded composites

    NASA Astrophysics Data System (ADS)

    Saha, Mrinal Chandra

    Pultruded composites are candidate materials for civil engineering infrastructure applications due to their higher corrosion resistance and lower life-cycle cost. Efficient use of such materials as structural members requires a thorough understanding of the mechanisms that affect their response. The present investigation addresses the modeling and characterization of E-glass fiber/polyester resin matrix pultruded composites in the form of sheets of various thicknesses. The elastic constants were measured using static, vibration, and ultrasonic methods. Two types of piezoelectric crystals were used in the ultrasonic measurements. Finally, the feasibility of measuring all the elastic constants from a single specimen, in the form of a circular disk, was demonstrated using the ultrasonic technique. The effects of stress gradient on tensile strength were investigated. A large number of specimens, parallel and transverse to the pultrusion direction, were tested in tension, 3-point flexure, and 4-point flexure. A 2-parameter Weibull model was applied to predict the tensile strength from the flexure tests. The measured and Weibull-predicted ratios did not show consistent agreement. Microstructural observations suggested that the flaw distribution in the material was not uniform, which appears to be a basic requirement for the Weibull distribution. Compressive properties were measured using a short-block compression test specimen 44.4 mm long and 25.4 mm wide. Specimens were tested at 0°, 30°, 45°, 60°, and 90° orientations. The compression test specimen was modeled using 4-noded isoparametric layered plate and shell elements. The predicted elastic properties for the roving layer and the continuous strand mat layer were used for the finite element study. The damage resistance and damage tolerance were investigated experimentally. Using quasi-static indentation loading, damage was induced at incrementally increased force levels to investigate the damage growth process. Damage
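
    The 2-parameter Weibull strength-scaling prediction used above can be sketched as follows. The ratio formula below is one common form of the model, assuming equal specimen volumes, volume-distributed flaws, and 3-point flexure; it is not necessarily the exact fit used in this study:

```python
def flexure_to_tension_ratio(m):
    """Predicted (3-point flexural strength)/(tensile strength) ratio for
    equal volumes of a material whose strength follows a 2-parameter Weibull
    distribution with shape parameter m (volume-distributed flaws)."""
    effective_volume_ratio = 2.0 * (m + 1.0) ** 2  # V_tension / V_eff,3pt
    return effective_volume_ratio ** (1.0 / m)
```

    For m near 10, typical of glass-fiber composites, the model predicts flexural strength about 1.7 times the tensile strength; the inconsistent measured ratios reported above are what point to a non-uniform flaw distribution.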

  18. Quantifying grain boundary damage tolerance with atomistic simulations

    NASA Astrophysics Data System (ADS)

    Foley, Daniel; Tucker, Garritt J.

    2016-10-01

    Grain boundaries play a pivotal role in defect evolution and accommodation within materials. Irradiated metals have been observed to form defect-denuded zones in the vicinity of grain boundaries. This is especially apparent in nanocrystalline metals, which have an increased grain boundary concentration compared to their polycrystalline counterparts. Importantly, the effect of individual grain boundaries on microstructural damage tolerance is related to the character, or structural state, of the grain boundary. In this work, the damage accommodation behavior of a variety of copper grain boundaries is studied using atomistic simulations. Damage accumulation is found to reach a saturation point where both the free volume and the energy of a grain boundary fluctuate within an elliptical manifold, which varies in size for different boundary characters. Analysis of the grain boundaries shows that extrinsic damage accommodation occurs through localized atomic shuffling accompanied by free-volume rearrangement within the boundary. Continuous damage accumulation leads to altered atomic structural states that oscillate around a mean non-equilibrium state that is energetically metastable. Our results suggest that the variation of grain boundary behavior, both from equilibrium and under saturation, is directly related to grain boundary equilibrium energy, and that some boundaries have a greater propensity than others to continually accommodate damage.

  19. A preliminary damage tolerance methodology for composite structures

    NASA Technical Reports Server (NTRS)

    Wilkins, D. J.

    1983-01-01

    The certification experience for the primary, safety-of-flight composite structure applications on the F-16 is discussed. The rationale for the selection of delamination as the major issue for damage tolerance is discussed, as well as the modeling approach selected. The development of the necessary coupon-level data base is briefly summarized. The major emphasis is on the description of a full-scale fatigue test where delamination growth was obtained to demonstrate the validity of the selected approach. A summary is used to review the generic features of the methodology.

  20. DNA damage tolerance by recombination: Molecular pathways and DNA structures.

    PubMed

    Branzei, Dana; Szakal, Barnabas

    2016-08-01

    Replication perturbations activate DNA damage tolerance (DDT) pathways, which are crucial to promote replication completion and to prevent fork breakage, a leading cause of genome instability. One mode of DDT uses translesion synthesis polymerases, which however can also introduce mutations. The other DDT mode involves recombination-mediated mechanisms, which are generally accurate. DDT occurs prevalently postreplicatively, but in certain situations homologous recombination is needed to restart forks. Fork reversal can function to stabilize stalled forks, but may also promote error-prone outcome when used for fork restart. Recent years have witnessed important advances in our understanding of the mechanisms and DNA structures that mediate recombination-mediated damage-bypass and highlighted principles that regulate DDT pathway choice locally and temporally. In this review we summarize the current knowledge and paradoxes on recombination-mediated DDT pathways and their workings, discuss how the intermediate DNA structures may influence genome integrity, and outline key open questions for future research. PMID:27236213

  1. Review of the Oconee-3 probabilistic risk assessment: external events, core damage frequency. Volume 2

    SciTech Connect

    Hanan, N.A.; Ilberg, D.; Xue, D.; Youngblood, R.; Reed, J.W.; McCann, M.; Talwani, T.; Wreathall, J.; Kurth, P.D.; Bandyopadhyay, K.

    1986-03-01

    A review of the Oconee-3 Probabilistic Risk Assessment (OPRA) was conducted with the broad objective of evaluating, qualitatively and quantitatively (as much as possible), the OPRA assessment of the important sequences that are "externally" generated and lead to core damage. The review included a technical assessment of the assumptions and methods used in the OPRA within its stated objective and with the limited information available. Within this scope, BNL performed a detailed reevaluation of the accident sequences generated by internal floods and earthquakes, and a less detailed review (in some cases a scoping review) of the accident sequences generated by fires, tornadoes, external floods, and aircraft impact. 12 refs., 24 figs., 31 tabs.

  2. Damage-Tolerant Fan Casings for Jet Engines

    NASA Technical Reports Server (NTRS)

    2006-01-01

    All turbofan engines work on the same principle. A large fan at the front of the engine draws air in. A portion of the air enters the compressor, but a greater portion passes along the outside of the engine; this is called bypass air. The air that enters the compressor passes through several stages of rotating fan blades that compress it further, and then it passes into the combustor. In the combustor, fuel is injected into the airstream, and the fuel-air mixture is ignited. The hot gases produced expand rapidly to the rear, and the engine reacts by moving forward. If there is a flaw in the system, such as an unexpected obstruction, a fan blade can break, spin off, and harm other engine components. Fan casings, therefore, need to be strong enough to contain errant blades and damage-tolerant enough to withstand the punishment of a loose blade turned projectile. NASA has spearheaded research into improving jet engine fan casings, ultimately discovering a cost-effective approach to manufacturing damage-tolerant fan cases that also offer significant weight reduction. In an aircraft, weight reduction translates directly into fuel burn savings, increased payload, and greater range. This technology increases safety and structural integrity; it is an attractive, viable option for engine manufacturers because of its low-cost manufacturing; and it is a practical alternative for customers, with the added cost-saving benefits of the weight reduction.

  3. Damage-tolerant nanotwinned metals with nanovoids under radiation environments

    SciTech Connect

    Chen, Y.; Yu, K. Y.; Liu, Y.; Shao, S.; Wang, H.; Kirk, M. A.; Wang, J.; Zhang, X.

    2015-04-24

    Material performance in extreme radiation environments is central to the design of future nuclear reactors. Radiation induces significant damage in the form of dislocation loops and voids in irradiated materials, and continuous radiation often leads to void growth and subsequent void swelling in metals with low stacking fault energy. Here we show, using in situ heavy ion irradiation in a transmission electron microscope, that pre-introduced nanovoids in nanotwinned Cu efficiently absorb radiation-induced defects, accompanied by gradual elimination of the nanovoids, enhancing the radiation tolerance of Cu. In situ studies and atomistic simulations reveal that this remarkable self-healing capability stems from the high density of coherent and incoherent twin boundaries that rapidly capture and transport point defects and dislocation loops to nanovoids, which act as storage bins for interstitial loops. This study describes a counterintuitive yet significant concept: deliberate introduction of nanovoids in conjunction with nanotwins enables unprecedented damage tolerance in metallic materials.

  4. Estimation of probability of failure for damage-tolerant aerospace structures

    NASA Astrophysics Data System (ADS)

    Halbert, Keith

    The majority of aircraft structures are designed to be damage-tolerant, such that safe operation can continue in the presence of minor damage. It is necessary to schedule inspections so that minor damage can be found and repaired. It is generally not possible to perform structural inspections prior to every flight. The scheduling is traditionally accomplished through a deterministic set of methods referred to as Damage Tolerance Analysis (DTA). DTA has proven to produce safe aircraft but does not provide estimates of the probability of failure of future flights or the probability of repair at future inspections. Without these estimates, maintenance costs cannot be accurately predicted. Also, estimation of failure probabilities is now a regulatory requirement for some aircraft. The set of methods concerned with the probabilistic formulation of this problem is collectively referred to as Probabilistic Damage Tolerance Analysis (PDTA). The goal of PDTA is to control the failure probability while holding maintenance costs to a reasonable level. This work focuses specifically on PDTA for fatigue cracking of metallic aircraft structures. The growth of a crack (or cracks) must be modeled using all available data and engineering knowledge. The length of a crack can be assessed only indirectly, through evidence such as non-destructive inspection results, failures or lack of failures, and the observed severity of usage of the structure. The current set of industry PDTA tools is lacking in several ways: the tools may in some cases yield poor estimates of failure probabilities, they cannot realistically represent the variety of possible failure and maintenance scenarios, and they do not allow for model updates that incorporate observed evidence. A PDTA modeling methodology must be flexible enough to estimate accurately the failure and repair probabilities under a variety of maintenance scenarios, and be capable of incorporating observed evidence as it becomes available. This
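
    To illustrate the PDTA idea in miniature (this is a toy sketch, not any of the industry tools critiqued above), a Monte Carlo model can propagate an assumed initial-crack-size distribution through a closed-form Paris-law life calculation; every parameter value here is hypothetical:

```python
import math
import random

def cycles_to_failure(a0, d_sigma=300.0, C=1e-11, m=3.0, Kc=60.0):
    """Closed-form Paris-law life (m != 2): cycles to grow a crack from a0 (m)
    to the critical size set by fracture toughness Kc (MPa*sqrt(m))."""
    a_c = (Kc / d_sigma) ** 2 / math.pi  # critical crack length
    if a0 >= a_c:
        return 0.0
    k = C * (d_sigma * math.sqrt(math.pi)) ** m * (m / 2.0 - 1.0)
    return (a0 ** (1.0 - m / 2.0) - a_c ** (1.0 - m / 2.0)) / k

def prob_of_failure(flights, cycles_per_flight=20, n_samples=5000, seed=1):
    """Monte Carlo failure probability over an assumed lognormal
    initial-crack-size distribution (median 1 mm)."""
    rng = random.Random(seed)
    fails = sum(
        cycles_to_failure(rng.lognormvariate(math.log(1e-3), 0.5))
        <= flights * cycles_per_flight
        for _ in range(n_samples)
    )
    return fails / n_samples
```

    The estimated failure probability rises with accumulated flights; an inspection schedule would be chosen to keep it below a target level.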

  5. Towards a damage tolerance philosophy for composite materials and structures

    NASA Technical Reports Server (NTRS)

    Obrien, T. Kevin

    1988-01-01

    A damage-threshold/fail-safe approach is proposed to ensure that composite structures are both sufficiently durable for economy of operation and adequately fail-safe, or damage tolerant, for flight safety. Matrix cracks are assumed to exist throughout the off-axis plies. Delamination onset is predicted using a strain energy release rate characterization. Delamination growth is accounted for in one of three ways: analytically, using delamination growth laws in conjunction with strain energy release rate analyses incorporating delamination resistance curves; experimentally, using measured stiffness loss; or conservatively, assuming delamination onset corresponds to catastrophic delamination growth. Fail-safety is assessed by accounting for the accumulation of delaminations through the thickness. A tension fatigue life prediction for composite laminates is presented as a case study to illustrate how this approach may be implemented. Suggestions are made for applying the damage-threshold/fail-safe approach to compression fatigue, tension/compression fatigue, and compression strength following low-velocity impact.
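
    The strain energy release rate characterization for delamination onset can be illustrated with O'Brien's classical edge-delamination relation, G = (eps^2 * t / 2)(E_lam - E*), where t is the laminate thickness, E_lam the intact laminate modulus, and E* the modulus of the delaminated sublaminates. A small sketch solving it for the onset strain at G = G_c; the input values are illustrative, not from this paper:

```python
def edge_delamination_onset_strain(Gc, t, E_lam, E_star):
    """Onset strain from the edge-delamination relation
    G = (eps**2 * t / 2) * (E_lam - E_star), solved for eps at G = Gc.
    Units must be consistent, e.g. Gc in J/m^2, t in m, moduli in Pa."""
    return (2.0 * Gc / (t * (E_lam - E_star))) ** 0.5
```

    A tougher matrix (larger G_c) or a smaller stiffness drop on delamination raises the strain at which delamination onset is predicted.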

  6. Towards a damage tolerance philosophy for composite materials and structures

    NASA Technical Reports Server (NTRS)

    O'Brien, T. Kevin

    1990-01-01

    A damage-threshold/fail-safe approach is proposed to ensure that composite structures are both sufficiently durable for economy of operation, as well as adequately fail-safe or damage tolerant for flight safety. Matrix cracks are assumed to exist throughout the off-axis plies. Delamination onset is predicted using a strain energy release rate characterization. Delamination growth is accounted for in one of three ways: either analytically, using delamination growth laws in conjunction with strain energy release rate analyses incorporating delamination resistance curves; experimentally, using measured stiffness loss; or conservatively, assuming delamination onset corresponds to catastrophic delamination growth. Fail-safety is assessed by accounting for the accumulation of delaminations through the thickness. A tension fatigue life prediction for composite laminates is presented as a case study to illustrate how this approach may be implemented. Suggestions are made for applying the damage-threshold/fail-safe approach to compression fatigue, tension/compression fatigue, and compression strength following low velocity impact.

  7. Durability and Damage Tolerance of High Temperature Polymeric Composites

    NASA Technical Reports Server (NTRS)

    Case, Scott W.; Reifsnider, Kenneth L.

    1996-01-01

    Modern durability and damage tolerance predictions for composite material systems rely on accurate estimates of the local stress and material states for each of the constituents, as well as the manner in which the constituents interact. In this work, a number of approaches to estimating the stress states and interactions are developed. First, an elasticity solution is presented for the problem of a penny-shaped crack in an N-phase composite material system opened by a prescribed normal pressure. The stress state around such a crack is then used to estimate the stress concentrations due to adjacent fiber fractures in composite materials. The resulting stress concentrations are then used to estimate the tensile strength of the composite, and the predicted results are compared with experimental values. In addition, a cumulative damage model for fatigue is presented. Modifications to the model are made to include the effects of variable amplitude loading; these are based on the use of remaining strength as a damage metric and the definition of an equivalent generalized time. The model is initially validated using results from the literature. Experimental data from APC-2 and IM7/K3B laminates are also used in the model. The use of such data for notched laminates requires an effective hole size, which is calculated from strain distribution measurements. Measured remaining strengths after fatigue loading are compared with predicted values for specimens fatigued at room temperature and at 350 °F (177 °C).

  8. Fuel containment and damage tolerance for large composite primary aircraft structures. Phase 1: Testing

    NASA Technical Reports Server (NTRS)

    Sandifer, J. P.

    1983-01-01

    Technical problems associated with fuel containment and damage tolerance of composite material wings for transport aircraft were identified. The major tasks are the following: (1) the preliminary design of damage tolerant wing surface using composite materials; (2) the evaluation of fuel sealing and lightning protection methods for a composite material wing; and (3) an experimental investigation of the damage tolerant characteristics of toughened resin graphite/epoxy materials. The test results, the test techniques, and the test data are presented.

  9. Design, testing, and damage tolerance study of bonded stiffened composite wing cover panels

    NASA Technical Reports Server (NTRS)

    Madan, Ram C.; Sutton, Jason O.

    1988-01-01

    Results are presented from the application of damage tolerance criteria for composite panels to multistringer composite wing cover panels developed under NASA's Composite Transport Wing Technology Development contract. This conceptual wing design integrated aeroelastic stiffness constraints with an enhanced damage tolerance material system, in order to yield optimized producibility and structural performance. Damage tolerance was demonstrated in a test program using full-sized cover panel subcomponents; panel skins were impacted at midbay between stiffeners, directly over a stiffener, and over the stiffener flange edge. None of the impacts produced visible damage. NASTRAN analyses were performed to simulate NDI-detected invisible damage.

  10. The combined effect of glass buffer strips and stitching on the damage tolerance of composites

    NASA Technical Reports Server (NTRS)

    Kullerd, Susan M.

    1993-01-01

    Recent research has demonstrated that through-the-thickness stitching provides major improvements in the damage tolerance of composite laminates loaded in compression. However, the brittle nature of polymer matrix composites makes them susceptible to damage propagation, requiring special material applications and designs to limit damage growth. Glass buffer strips, embedded within laminates, have shown the potential for improving the damage tolerance of unstitched composite laminates loaded in tension. The glass buffer strips, less stiff than the surrounding carbon fibers, arrest crack growth in composites under tensile loads. The present study investigates the damage tolerance characteristics of laminates that contain both stitching and glass buffer strips.

  11. The effect of resin on the impact damage tolerance of graphite-epoxy laminates

    NASA Technical Reports Server (NTRS)

    Williams, J. G.; Rhodes, M. D.

    1981-01-01

    The effect of the matrix resin on the impact damage tolerance of graphite-epoxy composite laminates was investigated. The materials were evaluated on the basis of the damage incurred due to local impact and on their ability to retain compression strength in the presence of impact damage. Twenty-four different resin systems were evaluated. Five of the systems demonstrated substantial improvements over the baseline system, including retention of compression strength in the presence of impact damage. Examination of the neat resin mechanical properties indicates that the resin tensile properties significantly influence laminate damage tolerance, and that improvements in laminate damage tolerance are not necessarily made at the expense of room-temperature mechanical properties. Preliminary results indicate that a resin volume fraction on the order of 40 percent or greater may be required to permit the plastic flow between fibers necessary for improved damage tolerance.

  12. 14 CFR 29.573 - Damage Tolerance and Fatigue Evaluation of Composite Rotorcraft Structures.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 14 Aeronautics and Space 1 2013-01-01 2013-01-01 false Damage Tolerance and Fatigue Evaluation of... Requirements Fatigue Evaluation § 29.573 Damage Tolerance and Fatigue Evaluation of Composite Rotorcraft... practice, the applicant must do a fatigue evaluation in accordance with paragraph (e) of this section....

  13. 14 CFR 27.573 - Damage Tolerance and Fatigue Evaluation of Composite Rotorcraft Structures.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 14 Aeronautics and Space 1 2013-01-01 2013-01-01 false Damage Tolerance and Fatigue Evaluation of... Requirements Fatigue Evaluation § 27.573 Damage Tolerance and Fatigue Evaluation of Composite Rotorcraft... practice, the applicant must do a fatigue evaluation in accordance with paragraph (e) of this section....

  14. 14 CFR 23.574 - Metallic damage tolerance and fatigue evaluation of commuter category airplanes.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 14 Aeronautics and Space 1 2010-01-01 2010-01-01 false Metallic damage tolerance and fatigue... COMMUTER CATEGORY AIRPLANES Structure Fatigue Evaluation § 23.574 Metallic damage tolerance and fatigue... evaluation of the strength, detail design, and fabrication must show that catastrophic failure due to...

  15. 14 CFR 23.574 - Metallic damage tolerance and fatigue evaluation of commuter category airplanes.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... evaluation of commuter category airplanes. 23.574 Section 23.574 Aeronautics and Space FEDERAL AVIATION... COMMUTER CATEGORY AIRPLANES Structure Fatigue Evaluation § 23.574 Metallic damage tolerance and fatigue evaluation of commuter category airplanes. For commuter category airplanes— (a) Metallic damage tolerance....

  16. 14 CFR 23.574 - Metallic damage tolerance and fatigue evaluation of commuter category airplanes.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... evaluation of commuter category airplanes. 23.574 Section 23.574 Aeronautics and Space FEDERAL AVIATION... COMMUTER CATEGORY AIRPLANES Structure Fatigue Evaluation § 23.574 Metallic damage tolerance and fatigue evaluation of commuter category airplanes. For commuter category airplanes— (a) Metallic damage tolerance....

  17. Non-probabilistic information fusion technique for structural damage identification based on measured dynamic data with uncertainty

    NASA Astrophysics Data System (ADS)

    Wang, Xiao-Jun; Yang, Chen; Qiu, Zhi-Ping

    2013-04-01

    Based on measured natural frequencies and acceleration responses, a non-probabilistic information fusion technique is proposed for structural damage detection, adopting set-membership identification (SMI) and a two-step model updating procedure. Because the information obtained from measurements is insufficient and uncertain, the damage identification problem is addressed with interval variables in this paper. Based on a first-order Taylor series expansion, the interval bounds of the elemental stiffness parameters in the undamaged and damaged models are estimated, respectively. The possibility of damage existence (PoDE) in elements is proposed as a quantitative measure of structural damage probability, which is more reasonable when measurement data are insufficient. In comparison with identification methods based on a single kind of information, the SMI method improves the accuracy of damage identification, reflecting the information fusion concept based on non-probabilistic sets. A numerical example demonstrates the feasibility and effectiveness of the proposed technique.
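
    A minimal sketch of the two ingredients described above: first-order Taylor interval propagation, and a possibility measure for comparing undamaged and damaged parameter intervals. The possibility-degree formula is one common interval-comparison definition, assumed here as a stand-in for the authors' PoDE rather than their exact expression:

```python
def taylor_interval(f_center, sensitivities, half_widths):
    """First-order Taylor interval of f: [f_c - r, f_c + r], where
    r = sum_i |df/dx_i| * dx_i over the input half-widths dx_i."""
    r = sum(abs(s) * w for s, w in zip(sensitivities, half_widths))
    return (f_center - r, f_center + r)

def interval_possibility_less(A, B):
    """Possibility degree, in [0, 1], that interval A lies below interval B
    (one common interval-comparison definition)."""
    (a1, a2), (b1, b2) = A, B
    width = (a2 - a1) + (b2 - b1)
    if width == 0.0:
        return 1.0 if a1 < b1 else 0.0
    return min(max((b2 - a1) / width, 0.0), 1.0)
```

    Applied per element, a damaged-state stiffness interval lying entirely below the undamaged interval gives possibility 1 (damage very likely), while partial overlap gives an intermediate value.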

  18. Structurally Integrated, Damage-Tolerant, Thermal Spray Coatings

    NASA Astrophysics Data System (ADS)

    Vackel, Andrew; Dwivedi, Gopal; Sampath, Sanjay

    2015-07-01

    Thermal spray coatings are used extensively for the protection and life extension of engineering components exposed to harsh wear and/or corrosion during service in the aerospace, energy, and heavy machinery sectors. Cermet coatings applied via high-velocity thermal spray are used in aggressive wear situations, almost always coupled with corrosive environments. In several instances (e.g., landing gear), coatings are considered part of the structure, requiring system-level considerations. Despite their widespread use, the technology has lacked generalized scientific principles for robust coating design, manufacturing, and performance analysis. Advances in process and in situ diagnostics have provided significant insights into the process-structure-property-performance correlations, providing a framework for enhanced design. In this overview, critical aspects of materials, process parametrics, and performance are discussed through exemplary studies on relevant compositions. The underlying connective theme is understanding and controlling residual stress generation, which not only addresses process dynamics but also links process to property for both the system (e.g., fatigue) and the surface (wear and corrosion). The anisotropic microstructure also invokes the need for damage-tolerant material design to meet future goals.

  19. Damage Tolerance Behavior of Friction Stir Welds in Aluminum Alloys

    NASA Technical Reports Server (NTRS)

    McGill, Preston; Burkholder, Jonathan

    2012-01-01

    Friction stir welding is a solid state welding process used in the fabrication of various aerospace structures. Self-reacting and conventional friction stir welding are variations of the friction stir weld process employed in the fabrication of cryogenic propellant tanks, which are classified as pressurized structure in many spaceflight vehicle architectures. In order to address the damage tolerance behavior of friction stir welds in these safety-critical structures, nondestructive inspection and proof testing may be required to screen hardware for mission-critical defects. The efficacy of the nondestructive evaluation or the proof test is based on an assessment of the critical flaw size. Test data describing fracture behavior, residual strength capability, and cyclic mission life capability of friction stir welds at ambient and cryogenic temperatures have been generated and are presented in this paper. Fracture behavior includes fracture toughness and tearing (R-curve) response of the friction stir welds. Residual strength behavior includes an evaluation of the effects of lack of penetration on conventional friction stir welds, the effects of internal defects (wormholes) on self-reacting friction stir welds, and an evaluation of the effects of fatigue-cycled surface cracks on both conventional and self-reacting welds. Cyclic mission life capability demonstrates the effects of surface crack defects on service load cycle capability. The fracture data will be used to evaluate nondestructive inspection and proof test requirements for the welds.

  20. Damage Tolerance Assessment of Friction Pull Plug Welds

    NASA Technical Reports Server (NTRS)

    McGill, Preston; Burkholder, Jonathan

    2012-01-01

    Friction stir welding is a solid state welding process developed and patented by The Welding Institute in Cambridge, England. Friction stir welding has been implemented in the aerospace industry in the fabrication of longitudinal welds in pressurized cryogenic propellant tanks. As the industry looks to implement friction stir welding in circumferential welds in pressurized cryogenic propellant tanks, techniques to close out the termination hole associated with retracting the pin tool are being evaluated. Friction pull plug welding is under development as one means of closing out the termination hole. A friction pull plug weld placed in a friction stir weld results in a non-homogeneous weld joint in which the initial weld, the plug weld, their respective heat-affected zones, and the base metal all interact. The welded joint is a composite, plastically deformed material system with a complex residual stress field. In order to address damage tolerance concerns associated with friction plug welds in safety-critical structures, such as propellant tanks, nondestructive inspection and proof testing may be required to screen hardware for mission-critical defects. The efficacy of the nondestructive evaluation or the proof test is based on an assessment of the critical flaw size in the test or service environments. Test data relating residual strength capability to flaw size in two aluminum alloy friction plug weld configurations are presented.

  1. Recent Advances in Durability and Damage Tolerance Methodology at NASA Langley Research Center

    NASA Technical Reports Server (NTRS)

    Ransom, J. B.; Glaessgen, E. H.; Raju, I. S.; Harris, C. E.

    2007-01-01

    Durability and damage tolerance (D&DT) issues are critical to the development of lighter, safer and more efficient aerospace vehicles. Durability is largely an economic life-cycle design consideration, whereas damage tolerance directly addresses the structural airworthiness (safety) of the vehicle. Both D&DT methodologies must address the deleterious effects of changes in material properties and the initiation and growth of damage that may occur during the vehicle's service lifetime. The result of unanticipated D&DT response is often manifested in the form of catastrophic and potentially fatal accidents. As such, durability and damage tolerance requirements must be rigorously addressed for commercial transport aircraft and NASA spacecraft systems. This paper presents an overview of the recent and planned future research in durability and damage tolerance analytical and experimental methods for both metallic and composite aerospace structures at NASA Langley Research Center (LaRC).

  2. 14 CFR 27.573 - Damage Tolerance and Fatigue Evaluation of Composite Rotorcraft Structures.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... effects of material and process variability along with environmental conditions in the strength and..., DEPARTMENT OF TRANSPORTATION AIRCRAFT AIRWORTHINESS STANDARDS: NORMAL CATEGORY ROTORCRAFT Strength... intervals of the rotorcraft by performing damage tolerance evaluations of the strength of composite PSEs...

  3. 14 CFR 29.573 - Damage Tolerance and Fatigue Evaluation of Composite Rotorcraft Structures.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... effects of material and process variability along with environmental conditions in the strength and..., DEPARTMENT OF TRANSPORTATION AIRCRAFT AIRWORTHINESS STANDARDS: TRANSPORT CATEGORY ROTORCRAFT Strength... intervals of the rotorcraft by performing damage tolerance evaluations of the strength of composite PSEs...

  4. 14 CFR 29.573 - Damage Tolerance and Fatigue Evaluation of Composite Rotorcraft Structures.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... effects of material and process variability along with environmental conditions in the strength and..., DEPARTMENT OF TRANSPORTATION AIRCRAFT AIRWORTHINESS STANDARDS: TRANSPORT CATEGORY ROTORCRAFT Strength... intervals of the rotorcraft by performing damage tolerance evaluations of the strength of composite PSEs...

  5. 14 CFR 27.573 - Damage Tolerance and Fatigue Evaluation of Composite Rotorcraft Structures.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... effects of material and process variability along with environmental conditions in the strength and..., DEPARTMENT OF TRANSPORTATION AIRCRAFT AIRWORTHINESS STANDARDS: NORMAL CATEGORY ROTORCRAFT Strength... intervals of the rotorcraft by performing damage tolerance evaluations of the strength of composite PSEs...

  6. Concepts for improving the damage tolerance of composite compression panels. [aircraft structures

    NASA Technical Reports Server (NTRS)

    Rhodes, M. D.; Williams, J. G.

    1984-01-01

    The residual strength of specimens with damage and their sensitivity to damage while subjected to an applied in-plane compression load were determined for flat-plate specimens and blade-stiffened panels. The results suggest that matrix materials that fail by delamination have the lowest damage tolerance capability. Alternate matrix materials or laminates which are transversely reinforced suppress the delamination mode of failure and change the failure mode to transverse shear crippling, which occurs at a higher strain value. Several damage-tolerant blade-stiffened panel design concepts are evaluated. Structural efficiency studies show that only small mass penalties may result from incorporating these damage-tolerant features in panel design. The implication of the test results for the design of aircraft structures was examined with respect to FAR requirements.

  7. ADVANCED COMPOSITE WIND TURBINE BLADE DESIGN BASED ON DURABILITY AND DAMAGE TOLERANCE

    SciTech Connect

    Galib Abumeri; Frank Abdi

    2012-02-16

    A computational simulation for durability, damage tolerance (D&DT) and reliability of composite wind turbine blade structures in the presence of uncertainties in material properties was performed. A composite turbine blade was first assessed with finite element based multi-scale progressive failure analysis to determine failure modes and locations as well as the fracture load. The D&DT analyses were then validated against a static test performed at Sandia National Laboratories; the predicted damage and fracture modes resemble those reported in the tests, showing that computational simulation can be relied on to enhance the design of tapered composite structures such as those used in wind turbine blades. The work was followed by a detailed weight analysis to identify the contribution of various materials to the overall weight of the blade. The methodology ensured that certain types of failure modes, such as delamination progression, are contained to reduce risk to the structure. Probabilistic analysis indicated that composite shear strength has a great influence on the blade ultimate load under static loading. Weight was reduced by 12% with robust design without loss in reliability or D&DT. Structural benefits obtained with the use of enhanced matrix properties through nanoparticle infusion were also assessed. Thin unidirectional fiberglass layers enriched with silica nanoparticles were applied to the outer surfaces of a wind blade to improve its overall structural performance and durability. The wind blade was a 9-meter prototype structure manufactured and tested under three-saddle static loading at Sandia National Laboratory (SNL). The blade manufacturing did not include the use of any nano-material.
With silica nanoparticles in glass composite applied to the exterior surfaces of the blade, the durability and damage tolerance (D&DT) results from multi-scale PFA showed an increase in ultimate load of the blade by 9.2% as compared to baseline structural performance (without nano
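
    The probabilistic finding above (shear strength dominating the blade's ultimate static load) can be illustrated with a toy Monte Carlo limit-state model. The failure-mode scalings and strength distributions below are invented for illustration only and are not taken from the study:

```python
import random

def ultimate_load(shear_strength, tensile_strength):
    """Toy limit state: the blade fails in the weaker of two scaled modes."""
    return min(0.9 * shear_strength, 0.6 * tensile_strength)

random.seed(0)
# Hypothetical strength scatter (MPa): shear ~ N(70, 7), tensile ~ N(600, 30)
samples = [
    ultimate_load(random.gauss(70.0, 7.0), random.gauss(600.0, 30.0))
    for _ in range(10_000)
]
mean_load = sum(samples) / len(samples)
print(f"mean ultimate-load proxy: {mean_load:.1f} MPa")
```

    With these assumed numbers the shear mode governs every sample, so scatter in shear strength passes straight through to scatter in ultimate load, which is the qualitative sensitivity the abstract reports.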

  8. Strong, damage tolerant oxide-fiber/oxide matrix composites

    NASA Astrophysics Data System (ADS)

    Bao, Yahua

    cationic polyelectrolytes to have a positive surface charge and then dipped into a diluted, negatively-charged AlPO4 colloidal suspension (0.05 M) at pH 7.5. Amorphous AlPO4 (which crystallizes to tridymite- and cristobalite-forms at 1080°C) nanoparticles were coated on the fibers layer-by-layer using an electrostatic attraction protocol. A uniform and smooth coating was formed which allowed fiber pullout from the matrix of a Nextel 720/alumina mini-composite hot-pressed at 1250°C/20 MPa. Reaction-bonded mullite (RBM), with a low formation temperature and low sintering shrinkage, was synthesized by incorporation of mixed-rare-earth-oxide (MREO) and mullite seeds. Pure mullite formed with 7.5 wt% MREO at 1300°C. Introduction of 5 wt% mullite seeds gave RBM with less than 3% shrinkage and 20% porosity. AlPO4-coated Nextel 720/RBM composites were successfully fabricated by EPID and pressureless sintering at 1300°C. Significant fiber pullout occurred, and the 4-point bend strength was around 170 MPa (with 25-30 vol% fibers) at room temperature and 1100°C, with a work-of-fracture of 7 kJ/m2. At 1200°C, the composite failed in shear due to the MREO-based glassy phase in the matrix. AlPO4-coated Nextel 720 fiber/aluminosilicate (no MREO) showed damage tolerance at 1200°C with a bend strength of 170 MPa.

  9. 14 CFR 23.574 - Metallic damage tolerance and fatigue evaluation of commuter category airplanes.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 14 Aeronautics and Space 1 2012-01-01 2012-01-01 false Metallic damage tolerance and fatigue evaluation of commuter category airplanes. 23.574 Section 23.574 Aeronautics and Space FEDERAL AVIATION..., corrosion, defects, or damage will be avoided throughout the operational life of the airplane....

  10. 14 CFR 23.574 - Metallic damage tolerance and fatigue evaluation of commuter category airplanes.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 14 Aeronautics and Space 1 2014-01-01 2014-01-01 false Metallic damage tolerance and fatigue evaluation of commuter category airplanes. 23.574 Section 23.574 Aeronautics and Space FEDERAL AVIATION..., corrosion, defects, or damage will be avoided throughout the operational life of the airplane....

  11. Use of a New Portable Instrumented Impactor on the NASA Composite Crew Module Damage Tolerance Program

    NASA Technical Reports Server (NTRS)

    Jackson, Wade C.; Polis, Daniel L.

    2014-01-01

    Damage tolerance performance is critical to composite structures because surface impacts at relatively low energies may result in a significant strength loss. For certification, damage tolerance criteria require aerospace vehicles to meet design loads while containing damage at critical locations. Data from standard small coupon testing are difficult to apply to larger more complex structures. Due to the complexity of predicting both the impact damage and the residual properties, damage tolerance is demonstrated primarily by testing. A portable, spring-propelled, impact device was developed which allows the impact damage response to be investigated on large specimens, full-scale components, or entire vehicles. During impact, both the force history and projectile velocity are captured. The device was successfully used to demonstrate the damage tolerance performance of the NASA Composite Crew Module. The impactor was used to impact 18 different design features at impact energies up to 35 J. Detailed examples of these results are presented, showing impact force histories, damage inspection results, and response to loading.
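
    Because the instrumented impactor records both the force history and the projectile velocity, the two measurements can be cross-checked through the impulse-momentum theorem, Δv = (1/m)∫F dt. A sketch using a synthetic half-sine force pulse (all numbers below are hypothetical, not data from the Composite Crew Module tests):

```python
import math

def velocity_change(force_samples, dt, mass):
    """Delta-v from the impulse-momentum theorem, trapezoidal integration."""
    impulse = sum(0.5 * (f0 + f1) * dt
                  for f0, f1 in zip(force_samples, force_samples[1:]))
    return impulse / mass

# Synthetic half-sine contact pulse: 5 kN peak over 4 ms, 2 kg impactor
dt, n = 1e-5, 400
pulse = [5000.0 * math.sin(math.pi * i / n) for i in range(n + 1)]
dv = velocity_change(pulse, dt, mass=2.0)
print(f"velocity change during impact: {dv:.2f} m/s")
```

    Agreement between the integrated force history and the directly measured velocity change is a simple sanity check on such an instrument.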

  12. 76 FR 74655 - Damage Tolerance and Fatigue Evaluation of Composite Rotorcraft Structures

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-12-01

    ... static strength of composite rotorcraft structures using a damage tolerance evaluation, or a fatigue... also harmonize this standard with international standards for evaluating the fatigue strength of normal... damage and loading conditions. This rule addresses the unique characteristics of composite materials...

  13. 14 CFR 23.573 - Damage tolerance and fatigue evaluation of structure.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... material variability and environmental conditions on the strength and durability properties of the... established that ensure the strength of each joint. (6) Structural components for which the damage tolerance... supported by test evidence. The extent of damage for residual strength evaluation at any time within...

  14. Phosphorylation of human INO80 is involved in DNA damage tolerance

    SciTech Connect

    Kato, Dai; Waki, Mayumi; Umezawa, Masaki; Aoki, Yuka; Utsugi, Takahiko; Ohtsu, Masaya; Murakami, Yasufumi

    2012-01-06

    Highlights: Depletion of hINO80 significantly reduced PCNA ubiquitination. Depletion of hINO80 significantly reduced the intensity of RAD18 nuclear dots after UV irradiation. Western blot analyses showed a phosphorylated hINO80 C-terminus. Overexpression of a phosphorylation-mutant hINO80 reduced PCNA ubiquitination. -- Abstract: Double strand breaks (DSBs) are the most serious type of DNA damage. DSBs can be generated directly by exposure to ionizing radiation or indirectly by replication fork collapse. The DNA damage tolerance pathway, which is conserved from bacteria to humans, prevents this collapse by overcoming replication blockages. The INO80 chromatin remodeling complex plays an important role in the DNA damage response. The yeast INO80 complex participates in the DNA damage tolerance pathway. The mechanisms regulating the yINO80 complex are not fully understood, but the yeast INO80 complex is necessary for efficient proliferating cell nuclear antigen (PCNA) ubiquitination and for recruitment of Rad18 to replication forks. In contrast, the function of the mammalian INO80 complex in DNA damage tolerance is less clear. Here, we show that human INO80 was necessary for PCNA ubiquitination and recruitment of Rad18 to DNA damage sites. Moreover, the C-terminal region of human INO80 was phosphorylated, and overexpression of a phosphorylation-deficient mutant of human INO80 resulted in decreased ubiquitination of PCNA during DNA replication. These results suggest that the human INO80 complex, like the yeast complex, is involved in the DNA damage tolerance pathway and that phosphorylation of human INO80 plays a role in this pathway. These findings provide new insights into the DNA damage tolerance pathway in mammalian cells.

  15. Low velocity instrumented impact testing of four new damage tolerant carbon/epoxy composite systems

    NASA Technical Reports Server (NTRS)

    Lance, D. G.; Nettles, A. T.

    1990-01-01

    Low velocity drop weight instrumented impact testing was utilized to examine the damage resistance of four recently developed carbon fiber/epoxy resin systems. A fifth material, T300/934, for which a large data base exists, was also tested for comparison purposes. A 16-ply quasi-isotropic lay-up configuration was used for all the specimens. Force/absorbed energy-time plots were generated for each impact test. The specimens were cross-sectionally analyzed to record the damage corresponding to each impact energy level. Maximum force of impact versus impact energy plots were constructed to compare the various systems for impact damage resistance. Results show that the four new damage tolerant fiber/resin systems far outclassed the T300/934 material. The most damage tolerant material tested was the IM7/1962 fiber/resin system.

  16. Genetic variation in herbivore resistance and tolerance: the role of plant life-history stage and type of damage.

    PubMed

    Muola, A; Mutikainen, P; Laukkanen, L; Lilley, M; Leimu, R

    2010-10-01

    Information on the patterns of genetic variation in plant resistance and tolerance against herbivores, and on genetic trade-offs between these two defence strategies, is central to our understanding of the evolution of plant defence. We found genetic variation in resistance to two specialist herbivores and in tolerance to artificial damage, but not to a specialist leaf herbivore, in a long-lived perennial herb. Seedlings tended to have genetic variation in tolerance to artificial damage. Genetic variation in tolerance of adult plants to artificial damage was not consistent in time. Our results suggest that the level of genetic variation in tolerance and resistance depends on plant life-history stage, type of damage and timing of estimating the tolerance relative to the occurrence of the damage, which might reflect the pattern of selection imposed by herbivory. Furthermore, we found no trade-offs between resistance and tolerance, which suggests that the two defence strategies can evolve independently.

  17. Collection, processing, and reporting of damage tolerant design data for non-aerospace structural materials

    NASA Technical Reports Server (NTRS)

    Huber, P. D.; Gallagher, J. P.

    1994-01-01

    This report describes the organization, format, and content of the NASA Johnson damage tolerant database, which was created to store damage tolerant property data for non-aerospace structural materials. The database is designed to store fracture toughness data (K(sub IC), K(sub c), J(sub IC) and CTOD(sub IC)), resistance curve data (K(sub R) vs. delta a(sub eff) and J(sub R) vs. delta a(sub eff)), as well as subcritical crack growth data (a vs. N and da/dN vs. delta K). The database contains complementary material property data for both stainless and alloy steels, as well as for aluminum, nickel, and titanium alloys which were not incorporated into the Damage Tolerant Design Handbook database.
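
    da/dN vs. delta K data of the kind stored in such a database are typically reduced to a Paris-law fit, da/dN = C(ΔK)^m, which can then be integrated for a crack-growth life estimate. A sketch with made-up, aluminum-like constants (not values from the handbook or the database):

```python
import math

def cycles_to_grow(a0, af, c, m, stress_range, geometry_factor=1.0, steps=10_000):
    """Integrate the Paris law da/dN = C * (dK)^m from a0 to af (midpoint rule)."""
    da = (af - a0) / steps
    a, n_cycles = a0, 0.0
    for _ in range(steps):
        a_mid = a + 0.5 * da
        dk = geometry_factor * stress_range * math.sqrt(math.pi * a_mid)
        n_cycles += da / (c * dk ** m)
        a += da
    return n_cycles

# Hypothetical constants: C = 1e-11 (m/cycle), m = 3, delta-sigma = 100 MPa,
# crack grown from 1 mm to 10 mm
life = cycles_to_grow(a0=0.001, af=0.010, c=1e-11, m=3.0, stress_range=100.0)
print(f"predicted crack-growth life: {life:,.0f} cycles")
```

    Raising the stress range sharply shortens the predicted life, which is why damage-tolerance allowables and inspection intervals are tied to this kind of integration.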

  18. Damage tolerance of candidate thermoset composites for use on single stage to orbit vehicles

    NASA Technical Reports Server (NTRS)

    Nettles, A. T.; Lance, D.; Hodge, A.

    1994-01-01

    Four fiber/resin systems were compared for resistance to damage and damage tolerance. One toughened epoxy and three toughened bismaleimide (BMI) resins were used, all with IM7 carbon fiber reinforcement. A statistical design of experiments technique was used to evaluate the effects of impact energy, specimen thickness, and impactor diameter on the damage area, as computed by C-scans, and residual compression-after-impact (CAI) strength. Results showed that two of the BMI systems sustained relatively large damage zones yet had an excellent retention of CAI strength.

  19. Effect of resin on impact damage tolerance of graphite/epoxy laminates

    NASA Technical Reports Server (NTRS)

    Williams, J. G.; Rhodes, M. D.

    1982-01-01

    Twenty-four different epoxy resin systems were evaluated by a variety of test techniques to identify materials that exhibited improved impact damage tolerance in graphite/epoxy composite laminates. Forty-eight-ply composite panels of five of the material systems were able to sustain 100 m/s impact by a 1.27-cm-diameter aluminum projectile while statically loaded to strains of 0.005. Of the five materials with the highest tolerance to impact, two had elastomeric additives, two had thermoplastic additives, and one had a vinyl modifier; all five systems used bisphenol A as the base resin. An evaluation of test results shows that the laminate damage tolerance is largely determined by the resin tensile properties, and that improvements in laminate damage tolerance are not necessarily made at the expense of room-temperature mechanical properties. The results also suggest that a resin volume fraction of 40 percent or greater may be required to permit the plastic flow between fibers necessary for improved damage tolerance.

  20. Safe-life and damage-tolerant design approaches for helicopter structures

    NASA Technical Reports Server (NTRS)

    Reddick, H. K., Jr.

    1983-01-01

    The safe-life and damage-tolerant design approaches discussed apply to both metallic and fibrous composite helicopter structures. The application of these design approaches to fibrous composite structures is emphasized. Safe-life and damage-tolerant criteria are applied to all helicopter flight-critical components, which are generally categorized as: dynamic components, comprising the main and tail rotor systems (blades, hub, and rotating controls) and the drive train (transmission and main and interconnecting rotor shafts); and the airframe, composed of the fuselage, aerodynamic surfaces, and landing gear.

  1. Nrf2 as a master regulator of tissue damage control and disease tolerance to infection.

    PubMed

    Soares, Miguel P; Ribeiro, Ana M

    2015-08-01

    Damage control refers to actions taken to minimize damage or loss. Depending on the context, these can range from emergency procedures dealing with the sinking of a ship, to surgery dealing with severe trauma, or even to an imaginary company in Marvel comics that repairs property damaged in conflicts between superheroes and villains. In the context of host-microbe interactions, tissue damage control refers to an adaptive response that limits the extent of tissue damage associated with infection. Tissue damage control can limit the severity of infectious diseases without interfering with pathogen burden, conferring disease tolerance to infection. This contrasts with immune-driven resistance mechanisms, which, although essential to protect the host from infection, can inflict damage on host parenchymal tissues. This damaging effect is countered by stress responses that confer tissue damage control and disease tolerance to infection. Here we discuss how the stress response regulated by the transcription factor nuclear factor-erythroid 2-related factor 2 (Nrf2) acts in such a manner.

  2. Nrf2 as a master regulator of tissue damage control and disease tolerance to infection

    PubMed Central

    Soares, Miguel P.; Ribeiro, Ana M.

    2015-01-01

    Damage control refers to actions taken to minimize damage or loss. Depending on the context, these can range from emergency procedures dealing with the sinking of a ship, to surgery dealing with severe trauma, or even to an imaginary company in Marvel comics that repairs property damaged in conflicts between superheroes and villains. In the context of host-microbe interactions, tissue damage control refers to an adaptive response that limits the extent of tissue damage associated with infection. Tissue damage control can limit the severity of infectious diseases without interfering with pathogen burden, conferring disease tolerance to infection. This contrasts with immune-driven resistance mechanisms, which, although essential to protect the host from infection, can inflict damage on host parenchymal tissues. This damaging effect is countered by stress responses that confer tissue damage control and disease tolerance to infection. Here we discuss how the stress response regulated by the transcription factor nuclear factor-erythroid 2-related factor 2 (Nrf2) acts in such a manner. PMID:26551709

  3. Applications of a damage tolerance analysis methodology in aircraft design and production

    NASA Technical Reports Server (NTRS)

    Woodward, M. R.; Owens, S. D.; Law, G. E.; Mignery, L. A.

    1992-01-01

    Objectives of customer-mandated aircraft structural integrity initiatives in design are to guide material selection, to incorporate fracture-resistant concepts in the design, and to utilize damage-tolerance-based allowables and planned inspection procedures necessary to enhance the safety and reliability of manned flight vehicles. However, validated fracture analysis tools for composite structures are needed to accomplish these objectives in a timely and economical manner. This paper briefly describes the development, validation, and application of a damage tolerance methodology for composite airframe structures. A closed-form analysis code, entitled SUBLAM, was developed to predict the critical biaxial strain state necessary to cause sublaminate buckling-induced delamination extension in an impact damaged composite laminate. An embedded elliptical delamination separating a thin sublaminate from a thick parent laminate is modelled. Predicted failure strains were correlated against a variety of experimental data that included results from compression after impact coupon and element tests. An integrated analysis package was developed to predict damage tolerance based margin-of-safety (MS) using NASTRAN generated loads and element information. Damage tolerance aspects of new concepts are quickly and cost-effectively determined without the need for excessive testing.

  4. Fuel containment, lightning protection and damage tolerance in large composite primary aircraft structures

    NASA Technical Reports Server (NTRS)

    Griffin, Charles F.; James, Arthur M.

    1985-01-01

    The damage-tolerance characteristics of high strain-to-failure graphite fibers and toughened resins were evaluated. Test results show that conventional fuel tank sealing techniques are applicable to composite structures. Techniques were developed to prevent fuel leaks due to low-energy impact damage. For wing panels subjected to swept stroke lightning strikes, a surface protection of graphite/aluminum wire fabric and a fastener treatment proved effective in eliminating internal sparking and reducing structural damage. The technology features developed were incorporated and demonstrated in a test panel designed to meet the strength, stiffness, and damage tolerance requirements of a large commercial transport aircraft. The panel test results exceeded design requirements for all test conditions. Wing surfaces constructed with composites offer large weight savings if design allowable strains for compression can be increased from current levels.

  5. Damage tolerance of woven graphite-epoxy buffer strip panels

    NASA Technical Reports Server (NTRS)

    Kennedy, John M.

    1990-01-01

    Graphite-epoxy panels with S glass buffer strips were tested in tension and shear to measure their residual strengths with crack-like damage. The buffer strips were regularly spaced narrow strips of continuous S glass. Panels were made with a uniweave graphite cloth in which the S glass buffer material was woven directly into the cloth. Panels were made with different width and thickness buffer strips. The panels were loaded to failure while remote strain, strain at the end of the slit, and crack opening displacement were monitored. The notched region and nearby buffer strips were radiographed periodically to reveal crack growth and damage. Except for panels with short slits, the buffer strips arrested the propagating crack. The strength (or failing strain) of the panels was significantly higher than the strength of all-graphite panels with the same length slit. Panels with wide, thick buffer strips were stronger than panels with thin, narrow buffer strips. A shear-lag model predicted the failing strength of tension panels with wide buffer strips accurately, but over-estimated the strength of the shear panels and the tension panels with narrow buffer strips.

  6. 14 CFR 23.573 - Damage tolerance and fatigue evaluation of structure.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 14 Aeronautics and Space 1 2012-01-01 2012-01-01 false Damage tolerance and fatigue evaluation of structure. 23.573 Section 23.573 Aeronautics and Space FEDERAL AVIATION ADMINISTRATION, DEPARTMENT OF... operational life of the airplane must be consistent with the initial detectability and subsequent growth...

  7. New discoveries linking transcription to DNA repair and damage tolerance pathways.

    PubMed

    Cohen, Susan E; Walker, Graham C

    2011-01-01

    In Escherichia coli, the transcription elongation factor NusA is associated with all elongating RNA polymerases where it functions in transcription termination and antitermination. Here, we review our recent results implicating NusA in the recruitment of DNA repair and damage tolerance mechanisms to sites of stalled transcription complexes.

  8. Fuel containment and damage tolerance in large composite primary aircraft structures

    NASA Technical Reports Server (NTRS)

    Griffin, C. F.

    1983-01-01

    Technical problems related to fuel containment and damage tolerance of composite material wings for transport aircraft were investigated. The major tasks were the following: (1) the preliminary design of a damage-tolerant wing surface using composite materials; (2) the evaluation of fuel sealing and lightning protection methods for a composite material wing; and (3) an experimental investigation of the damage-tolerant characteristics of toughened resin graphite/epoxy materials. The design concepts investigated for the upper and lower surfaces of a composite wing for a transport aircraft are presented, and the relationship between weight savings and the design allowable strain used within the analysis is discussed. Experiments which compare the fuel sealing characteristics of bolt-bonded joints and bolted joints sealed with a polysulphide sealant are reviewed. Data from lightning strike tests on stiffened and unstiffened graphite/epoxy panels are presented. A wide variety of coupon tests were conducted to evaluate the relative damage tolerance of toughened resin graphite/epoxies. Data from these tests are presented and their relevance to the wing surface design concepts is discussed.

  9. Assessment of the Damage Tolerance of Postbuckled Hat-Stiffened Panels Using Single-Stringer Specimens

    NASA Technical Reports Server (NTRS)

    Bisagni, Chiara; Vescovini, Riccardo; Davila, Carlos G.

    2010-01-01

    A procedure is proposed for the assessment of the damage tolerance and collapse of stiffened composite panels using a single-stringer compression specimen. The dimensions of the specimen are determined such that the specimen's nonlinear response and collapse are representative of an equivalent multi-stringer panel in compression. Experimental tests are conducted on specimens with and without an embedded delamination. A shell-based finite element model with intralaminar and interlaminar damage capabilities is developed to predict the postbuckling response as well as the damage evolution from initiation to collapse.

  10. An assessment of buffer strips for improving damage tolerance

    NASA Technical Reports Server (NTRS)

    Poe, C. C., Jr.; Kennedy, J. M.

    1981-01-01

    Graphite/epoxy panels with buffer strips were tested in tension to measure their residual strength with crack-like damage. Panels were made with 45/0/-45/90(2S) and 45/0/-45/0(2S) layups. The buffer strips were parallel to the loading direction. They were made by replacing narrow strips of the 0 deg graphite plies with strips of either 0 deg S-Glass/epoxy or Kevlar-49/epoxy on either a one-for-one or a two-for-one basis. In a third case, 0 deg graphite/epoxy was used as the buffer material and thin, perforated Mylar strips were placed between the 0 deg plies and the cross-plies to weaken the interfaces and thus to isolate the 0 deg plies. Some panels were made with buffer strips of different widths and spacings. The buffer strips arrested the cracks and increased the residual strengths significantly over those of plain laminates without buffer strips. A shear-lag type stress analysis correctly predicted the effects of layup, buffer material, buffer strip width and spacing, and the number of plies of buffer material.
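
    The shear-lag analyses cited in this and the earlier buffer-strip record rest on the same fiber-load-transfer idea as Hedgepeth's classical 2D shear-lag result, in which the stress concentration on the fibers adjacent to n contiguous broken fibers is K_n = prod_{i=1..n} (2i+2)/(2i+1). This is an illustrative simplification, not the authors' full analysis:

```python
from fractions import Fraction

def hedgepeth_scf(n_broken):
    """Hedgepeth shear-lag stress concentration on the fibers adjacent to
    n contiguous broken fibers: K_n = prod_{i=1..n} (2i + 2) / (2i + 1)."""
    k = Fraction(1)
    for i in range(1, n_broken + 1):
        k *= Fraction(2 * i + 2, 2 * i + 1)
    return float(k)

for n in (1, 2, 4):
    print(f"{n} broken fiber(s) -> SCF = {hedgepeth_scf(n):.3f}")
```

    The factor grows with the number of consecutive breaks (4/3 for one, 1.6 for two, and so on), which is why compliant buffer strips that interrupt a run of broken graphite fibers can arrest a crack.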

  11. DNA damage tolerance: a double-edged sword guarding the genome

    PubMed Central

    Ghosal, Gargi; Chen, Junjie

    2013-01-01

    Preservation of genome integrity is an essential process for cell homeostasis. During the course of the life of a single cell, the genome is constantly damaged by endogenous and exogenous agents. To ensure genome stability, cells use a global signaling network, namely the DNA damage response (DDR), to sense and repair DNA damage. The DDR senses different types of DNA damage and coordinates a response that includes activation of transcription, cell cycle control, DNA repair pathways, apoptosis, senescence, and cell death. Despite several repair mechanisms that repair different types of DNA lesions, it is likely that the replication machinery will still encounter lesions that are mis-repaired or not repaired. Replication of a damaged genome would result in a high frequency of fork collapse and genome instability. In this scenario, cells employ the DNA damage tolerance (DDT) pathway, which recruits a specialized low-fidelity translesion synthesis (TLS) polymerase to bypass the lesions for repair at a later time point. Thus, DDT is not a repair pathway per se, but provides a mechanism to tolerate DNA lesions during replication, thereby increasing survival and preventing genome instability. Paradoxically, the DDT process is also associated with increased mutagenesis, which can in turn drive the cell to cancer development. Thus, the DDT process functions as a double-edged sword guarding the genome. In this review, we will discuss the replication stress induced DNA damage-signaling cascade, the stabilization and rescue of stalled replication forks by the DDT pathway, and the effect of the DDT pathway on cancer. PMID:24058901

  12. Durability and damage tolerance of Large Composite Primary Aircraft Structure (LCPAS)

    NASA Technical Reports Server (NTRS)

    Mccarty, John E.; Roeseler, William G.

    1984-01-01

    Analysis and testing addressing the key technology areas of durability and damage tolerance were completed for wing surface panels. The wing of a fuel-efficient, 200-passenger commercial transport airplane for 1990 delivery was sized using graphite-epoxy materials. Coupons of various layups used in the wing sizing were tested in tension, compression, and spectrum fatigue with typical fastener penetrations. The compression strength after barely visible impact damage was determined from coupon and structural element tests. One current material system and one toughened system were evaluated by coupon testing. The results of the coupon and element tests were used to design three distinctly different compression panels meeting the strength, stiffness, and damage-tolerance requirements of the upper wing panels. These three concepts were tested with various amounts of damage ranging from barely visible impact to through-penetration. The results of this program provide the key technology data required to assess the durability and damage-tolerance capability of advanced composites for use in commercial aircraft wing panel structure.

  13. Application of damage tolerance methodology in certification of the Piaggio P-180 Avanti

    NASA Technical Reports Server (NTRS)

    Johnson, Jerry

    1992-01-01

    The Piaggio P-180 Avanti, a twin pusher-prop engine nine-passenger business aircraft was certified in 1990, to the requirements of FAR Part 23 and Associated Special Conditions for Composite Structure. Certification included the application of a damage tolerant methodology to the design of the composite forward wing and empennage (vertical fin, horizontal stabilizer, tailcone, and rudder) structure. This methodology included an extensive analytical evaluation coupled with sub-component and full-scale testing of the structure. The work from the Damage Tolerance Analysis Assessment was incorporated into the full-scale testing. Damage representing hazards such as dropped tools, ground equipment, handling, and runway debris, was applied to the test articles. Additional substantiation included allowing manufacturing discrepancies to exist unrepaired on the full-scale articles and simulated bondline failures in critical elements. The importance of full-scale testing in the critical environmental conditions and the application of critical damage are addressed. The implication of damage tolerance on static and fatigue testing is discussed. Good correlation between finite element solutions and experimental test data was observed.

  14. Materials and processes laboratory composite materials characterization task, part 1. Damage tolerance

    NASA Technical Reports Server (NTRS)

    Nettles, A. T.; Tucker, D. S.; Patterson, W. J.; Franklin, S. W.; Gordon, G. H.; Hart, L.; Hodge, A. J.; Lance, D. G.; Russel, S. S.

    1991-01-01

    A test run was performed on IM6/3501-6 carbon-epoxy in which the material was processed, machined into specimens, and tested for damage tolerance capabilities. Nondestructive test data played a major role in this element of composite characterization. A time chart was produced showing the time the composite material spent within each Branch or Division in order to identify those areas which produce a long turnaround time. Instrumented drop weight testing was performed on the specimens, with nondestructive evaluation being performed before and after the impacts. Destructive testing in the form of cross-sectional photomicrography and compression-after-impact testing was also used. Results show that the processing and machining steps need to be performed more rapidly if data on composite materials are to be collected within a reasonable timeframe. The results of the damage tolerance testing showed that IM6/3501-6 is a brittle material that is very susceptible to impact damage.

  15. The Regulation of DNA Damage Tolerance by Ubiquitin and Ubiquitin-Like Modifiers

    PubMed Central

    Cipolla, Lina; Maffia, Antonio; Bertoletti, Federica; Sabbioneda, Simone

    2016-01-01

    DNA replication is an extremely complex process that needs to be executed in a highly accurate manner in order to propagate the genome. This task requires the coordination of a number of enzymatic activities and it is fragile and prone to arrest after DNA damage. DNA damage tolerance provides a last line of defense that allows completion of DNA replication in the presence of an unrepaired template. One such mechanism is post-replication repair (PRR), which is used by cells to bypass highly distorted templates caused by damaged bases. PRR is extremely important for cell survival and performs damage bypass in both an error-free and an error-prone manner. In light of these two possible outcomes, PRR needs to be tightly controlled in order to prevent the accumulation of mutations leading ultimately to genome instability. Post-translational modifications of PRR proteins provide the framework for this regulation, with ubiquitylation and SUMOylation playing a pivotal role in choosing which pathway to activate, thus controlling the different outcomes of damage bypass. The proliferating cell nuclear antigen (PCNA), the DNA clamp for replicative polymerases, plays a central role in the regulation of damage tolerance, and its modification by ubiquitin and SUMO controls both the error-free and error-prone branches of PRR. Furthermore, a significant number of the polymerases involved in the bypass of DNA damage possess domains that can bind post-translational modifications, and they are themselves targets for ubiquitylation. In this review, we will focus on how ubiquitin and ubiquitin-like modifications can regulate the DNA damage tolerance systems and how they control the recruitment of different proteins to the replication fork. PMID:27379156

  16. Probabilistic Methods for Structural Design and Reliability

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.; Whitlow, Woodrow, Jr. (Technical Monitor)

    2002-01-01

    This report describes a formal method to quantify structural damage tolerance and reliability in the presence of a multitude of uncertainties in turbine engine components. The method is based at the material behavior level, where primitive variables with their respective scatter ranges are used to describe behavior. Computational simulation is then used to propagate the uncertainties to the structural scale, where damage tolerance and reliability are usually specified. Several sample cases are described to illustrate the effectiveness, versatility, and maturity of the method. Typical results from this method demonstrate that it is mature and that it can be used to probabilistically evaluate turbine engine structural components. It may be inferred from the results that the method is suitable for probabilistically predicting the remaining life in aging or deteriorating structures, for making strategic projections and plans, and for achieving better, cheaper, faster products that give competitive advantages in world markets.
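The report's simulation code is not reproduced in the abstract; as a hedged sketch of the core idea it describes, the snippet below propagates assumed scatter in primitive variables (load, section area, strength; all numbers hypothetical, not the report's data) up to a structural failure probability by plain Monte Carlo sampling.

```python
import random
import statistics

# Hedged sketch: propagate scatter in "primitive" variables to the
# structural scale. All distributions and values are illustrative.
random.seed(1)
N = 100_000
stresses, failures = [], 0
for _ in range(N):
    load = random.gauss(50_000.0, 5_000.0)   # axial load, N
    area = random.gauss(500.0, 20.0)         # cross-section, mm^2
    strength = random.gauss(140.0, 10.0)     # material strength, MPa
    stress = load / area                     # MPa
    stresses.append(stress)
    if stress > strength:
        failures += 1

mean_stress = statistics.mean(stresses)
p_fail = failures / N
print(f"mean stress ~ {mean_stress:.1f} MPa, P(failure) ~ {p_fail:.4f}")
```

With small coefficients of variation, the response scatter is dominated by the load term; the same loop structure extends directly to more primitive variables or a real structural model in place of the one-line stress formula.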

  17. Reduced calcium-dependent mitochondrial damage underlies the reduced vulnerability of excitotoxicity-tolerant hippocampal neurons.

    PubMed

    Pivovarova, Natalia B; Stanika, Ruslan I; Watts, Charlotte A; Brantner, Christine A; Smith, Carolyn L; Andrews, S Brian

    2008-03-01

    In central neurons, over-stimulation of NMDA receptors leads to excessive mitochondrial calcium accumulation and damage, which is a critical step in excitotoxic death. This raises the possibility that low susceptibility to calcium overload-induced mitochondrial damage might characterize excitotoxicity-resistant neurons. In this study, we have exploited two complementary models of preconditioning-induced excitotoxicity resistance to demonstrate reduced calcium-dependent mitochondrial damage in NMDA-tolerant hippocampal neurons. We have further identified adaptations in mitochondrial calcium handling that account for enhanced mitochondrial integrity. In both models, enhanced tolerance was associated with improved preservation of mitochondrial membrane potential and structure. In the first model, which exhibited modest neuroprotection, mitochondria-dependent calcium deregulation was delayed, even though cytosolic and mitochondrial calcium loads were quantitatively unchanged, indicating that enhanced mitochondrial calcium capacity accounts for reduced injury. In contrast, the second model, which exhibited strong neuroprotection, displayed further delayed calcium deregulation and reduced mitochondrial damage because downregulation of NMDA receptor surface expression depressed calcium loading. Reducing calcium entry also modified the chemical composition of the calcium-buffering precipitates that form in calcium-loaded mitochondria. It thus appears that reduced mitochondrial calcium loading is a major factor underlying the robust neuroprotection seen in highly tolerant cells. PMID:18036152

  18. Damage Tolerance Testing of a NASA TransHab Derivative Woven Inflatable Module

    NASA Technical Reports Server (NTRS)

    Edgecombe, John; delaFuente, Horacio; Valle, Gerard

    2009-01-01

    Current options for Lunar habitat architecture include inflatable habitats and airlocks. Inflatable structures can have mass and volume advantages over conventional structures. However, inflatable structures carry different inherent risks and are at a lower Technical Readiness Level (TRL) than more conventional metallic structures. One of the risks associated with inflatable structures is in understanding the tolerance to induced damage. The Damage Tolerance Test (DTT) is designed to study the structural integrity of an expandable structure. TransHab (Figure 1) was an experimental inflatable module developed at the NASA/Johnson Space Center in the 1990s. The TransHab design was originally envisioned for use in Mars transits but was also studied as a potential habitat for the International Space Station (ISS). The design of the TransHab module was based on a woven design using an Aramid fabric. Testing of this design demonstrated a high level of predictability and repeatability with analytical predictions of stresses and deflections. Based on JSC's experience with the design and analysis of woven inflatable structures, the Damage Tolerance Test article was designed and fabricated using a woven design. The DTT article was inflated to 45 psig, representing 25% of the ultimate burst pressure, and one of the one-inch-wide longitudinal structural members was severed by initiating a Linear Shaped Charge (LSC). Strain gage measurements, at the interface between the expandable elements (straps) and the nonexpandable metallic elements for pre-selected longitudinal straps, were taken throughout pressurization of the module and strap separation. Strain gage measurements show no change in longitudinal strap loading at the bulkhead interface after strap separation, indicating that loads in the restraint layer were redistributed locally around the damaged area due to the effects of friction under high internal pressure loading. The test completed all primary objectives with better than

  19. Failure Predictions for VHTR Core Components using a Probabilistic Continuum Damage Mechanics Model

    SciTech Connect

    Fok, Alex

    2013-10-30

    The proposed work addresses the key research need for the development of constitutive models and overall failure models for graphite and high temperature structural materials, with the long-term goal being to maximize the design life of the Next Generation Nuclear Plant (NGNP). To this end, the capability of a Continuum Damage Mechanics (CDM) model, which has been used successfully for modeling fracture of virgin graphite, will be extended as a predictive and design tool for the core components of the very high-temperature reactor (VHTR). Specifically, irradiation and environmental effects pertinent to the VHTR will be incorporated into the model to allow fracture of graphite and ceramic components under in-reactor conditions to be modeled explicitly using the finite element method. The model uses a combined stress-based and fracture mechanics-based failure criterion, so it can simulate both the initiation and propagation of cracks. Modern imaging techniques, such as x-ray computed tomography and digital image correlation, will be used during material testing to help define the baseline material damage parameters. Monte Carlo analysis will be performed to address inherent variations in material properties, the aim being to reduce the arbitrariness and uncertainties associated with the current statistical approach. The results can potentially contribute to the current development of American Society of Mechanical Engineers (ASME) codes for the design and construction of VHTR core components.
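The statistical approach referred to for graphite strength is commonly built on a two-parameter Weibull model; a minimal sketch of the kind of Monte Carlo check described above (shape, scale, and applied stress are illustrative values, not NGNP data) compares a sampled failure probability against the closed-form Weibull CDF:

```python
import math
import random

# Two-parameter Weibull strength model, commonly used for brittle
# materials such as nuclear graphite (m and s0 are illustrative).
m, s0 = 10.0, 25.0        # Weibull modulus, characteristic strength (MPa)
applied = 15.0            # applied stress (MPa)

random.seed(0)
N = 200_000
fails = sum(random.weibullvariate(s0, m) < applied for _ in range(N))
p_mc = fails / N

# Closed-form check: P(strength < s) = 1 - exp(-(s/s0)^m)
p_exact = 1.0 - math.exp(-((applied / s0) ** m))
print(f"Monte Carlo: {p_mc:.4f}, exact: {p_exact:.4f}")
```

In a finite element setting, the sampled strengths would feed the stress-based part of the combined failure criterion element by element rather than a single scalar comparison.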

  20. Damage Tolerance Enhancement of Carbon Fiber Reinforced Polymer Composites by Nanoreinforcement of Matrix

    NASA Astrophysics Data System (ADS)

    Fenner, Joel Stewart

    Nanocomposites are a relatively new class of materials which incorporate exotic, engineered nanoparticles to achieve superior material properties. Because of their extremely small size and well-ordered structure, many nanoparticles possess properties that exceed those offered by a wide range of other known materials, making them attractive candidates for novel materials engineering development. Their small size is also an impediment to their practical use, as they typically cannot be employed by themselves to realize those properties in large structures. Furthermore, nanoparticles typically possess strong self-affinity, rendering them difficult to disperse uniformly into a composite. However, contemporary research has shown that, if well-dispersed, nanoparticles have great capacity to improve the mechanical properties of composites, especially damage tolerance, in the form of fracture toughness, fatigue life, and impact damage mitigation. This research focuses on the development, manufacturing, and testing of hybrid micro/nanocomposites comprised of woven carbon fibers with a carbon nanotube reinforced epoxy matrix. Material processing consisted of dispersant-and-sonication based methods to disperse nanotubes into the matrix, and a vacuum-assisted wet lay-up process to prepare the hybrid composite laminates. Various damage tolerance properties of the hybrid composite were examined, including static strength, fracture toughness, fatigue life, fatigue crack growth rate, and impact damage behavior, and compared with similarly-processed reference material produced without nanoreinforcement. Significant improvements were obtained in interlaminar shear strength (15%), Mode-I fracture toughness (180%), shear fatigue life (order of magnitude), Mode-I fatigue crack growth rate (factor of 2), and effective impact damage toughness (40%). 
Observations by optical microscopy, scanning electron microscopy, and ultrasonic imaging showed significant differences in failure behavior between the hybrid and reference materials.

  1. Damage tolerance and assessment of unidirectional carbon fiber composites: An experimental and numerical study

    NASA Astrophysics Data System (ADS)

    Flores, Mark David

    Composites are beginning to be used in a variety of different applications throughout industry. However, certification and damage tolerance are growing concerns in many aerospace and marine applications. Although compression-after-impact behavior has been studied thoroughly, a damage tolerance methodology that accurately characterizes the failure of composites has not been established. An experimental investigation was performed to study the effect of stacking sequence, low-velocity impact response, and residual strength due to compression and fatigue. Digital Image Correlation (DIC) captured the strains and deformation of the plate due to compression. Computational investigations integrated non-destructive techniques (C-Scan, X-Ray) to determine the extent of the damage created by the manufacturing process and by impact, in order to accurately create a representation of the pre-existing damage. Fiber/matrix cracking, delamination growth, buckling, and other failure mechanisms occur in the compression-after-impact laminated specimens examined experimentally. The results from this study provide knowledge of the compression-after-impact strength of plates, and a basis for validation of detailed modeling of progressive failure from impact-damaged composites.

  2. FAA/NASA International Symposium on Advanced Structural Integrity Methods for Airframe Durability and Damage Tolerance

    NASA Technical Reports Server (NTRS)

    Harris, Charles E. (Editor)

    1994-01-01

    International technical experts in durability and damage tolerance of metallic airframe structures were assembled to present and discuss recent research findings and the development of advanced design and analysis methods, structural concepts, and advanced materials. The symposium focused on the dissemination of new knowledge and the peer-review of progress on the development of advanced methodologies. Papers were presented on: structural concepts for enhanced durability, damage tolerance, and maintainability; new metallic alloys and processing technology; fatigue crack initiation and small crack effects; fatigue crack growth models; fracture mechanics failure criteria for ductile materials; structural mechanics methodology for residual strength and life prediction; development of flight load spectra for design and testing; and advanced approaches to resist corrosion and environmentally assisted fatigue.

  3. Advanced Durability and Damage Tolerance Design and Analysis Methods for Composite Structures: Lessons Learned from NASA Technology Development Programs

    NASA Technical Reports Server (NTRS)

    Harris, Charles E.; Starnes, James H., Jr.; Shuart, Mark J.

    2003-01-01

    Aerospace vehicles are designed to be durable and damage tolerant. Durability is largely an economic life-cycle design consideration whereas damage tolerance directly addresses the structural airworthiness (safety) of the vehicle. However, both durability and damage tolerance design methodologies must address the deleterious effects of changes in material properties and the initiation and growth of microstructural damage that may occur during the service lifetime of the vehicle. Durability and damage tolerance design and certification requirements are addressed for commercial transport aircraft and NASA manned spacecraft systems. The state-of-the-art in advanced design and analysis methods is illustrated by discussing the results of several recently completed NASA technology development programs. These programs include the NASA Advanced Subsonic Technology Program demonstrating technologies for large transport aircraft and the X-33 hypersonic test vehicle demonstrating technologies for a single-stage-to-orbit space launch vehicle.

  4. Fatigue crack growth in damage tolerant Al-Li sheet alloys

    NASA Astrophysics Data System (ADS)

    Wanhill, R. J. H.

    1990-03-01

    The fatigue crack growth properties of two candidate damage tolerant Al-Li sheet alloys, 2091 and 8090, are compared with those of the conventional and widely used 2024 alloy. There were three load histories: constant amplitude, gust spectrum, and constant amplitude with occasional peak loads. The results are interpreted with the aid of fractographic observations and measurements of fracture surface roughness. The practical significance of the results is assessed, and recommendations are made for further evaluations.
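Constant-amplitude crack-growth data of this kind are commonly reduced to a Paris-law fit, da/dN = C(ΔK)^m. As a hedged sketch (C and m are illustrative round numbers, not the measured 2091/8090 or 2024 values), integrating such a fit for a centre crack gives the cycle count to grow between two crack lengths:

```python
import math

def cycles_to_grow(a0, af, dsigma, C=1e-11, m=3.0, steps=10_000):
    """Integrate the Paris law da/dN = C * (dK)**m for a centre crack,
    with dK = dsigma * sqrt(pi * a) in MPa*sqrt(m) and a in metres.
    Simple fixed-step quadrature over crack length."""
    da = (af - a0) / steps
    cycles, a = 0.0, a0
    for _ in range(steps):
        dK = dsigma * math.sqrt(math.pi * a)
        cycles += da / (C * dK ** m)
        a += da
    return cycles

# Grow a 1 mm half-crack to 10 mm under a 100 MPa stress range.
n = cycles_to_grow(0.001, 0.010, 100.0)
print(f"~{n:,.0f} cycles")
```

Spectrum loading and occasional peak loads, as in the test programme above, require cycle-by-cycle models with retardation effects; this closed constant-amplitude integration is only the baseline case.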

  5. Variation and fitness costs for tolerance to different types of herbivore damage in Boechera stricta genotypes with contrasting glucosinolate structures

    PubMed Central

    Manzaneda, Antonio J.; Prasad, Kasavajhala V. S. K.; Mitchell-Olds, Thomas

    2010-01-01

    Analyses of plant tolerance in response to different modes of herbivory are essential to understand plant defense evolution, yet are still scarce. Allocation costs and trade-offs between tolerance and plant chemical defenses may influence genetic variation for tolerance. However, variation in defenses also occurs as the presence or absence of discrete chemical structures, and the effects of such intra-specific polymorphisms on tolerance to multiple herbivores have not been evaluated. Here, in a glasshouse experiment, we investigated variation for tolerance to different types of herbivory damage, and direct allocation costs, in 10 genotypes of Boechera stricta (Brassicaceae), a wild relative of Arabidopsis, with contrasting foliar glucosinolate chemical structures (methionine-derived glucosinolates vs glucosinolates derived from branched-chain amino acids). We found significant genetic variation for tolerance to different types of herbivory. Structural variation in the glucosinolate profile did not influence tolerance to damage, but did predict plant fitness. Levels of constitutive and induced glucosinolates varied between genotypes with different structural profiles, but we did not detect any cost of tolerance explaining genetic variation in tolerance among genotypes. Trade-offs among plant tolerance to multiple herbivores may not explain the existence of intermediate levels of tolerance to damage in plants with contrasting chemical defensive profiles. PMID:20663059

  6. Damage Tolerance of Pre-Stressed Composite Panels Under Impact Loads

    NASA Astrophysics Data System (ADS)

    Johnson, Alastair F.; Toso-Pentecôte, Nathalie; Schueler, Dominik

    2014-02-01

    An experimental test campaign studied the structural integrity of carbon fibre/epoxy panels preloaded in tension or compression and then subjected to gas gun impact tests causing significant damage. The test programme used representative composite aircraft fuselage panels composed of aerospace carbon fibre toughened epoxy prepreg laminates. Preload levels in tension were representative of design limit loads for fuselage panels of this size, and maximum compression preloads were in the post-buckle region. Two main impact scenarios were considered: notch damage from a 12 mm steel cube projectile, at velocities in the range 93-136 m/s; and blunt impact damage from 25 mm diameter glass balls, at velocities of 64-86 m/s. The combined influence of preload and impact damage on panel residual strengths was measured and the results analysed in the context of damage tolerance requirements for composite aircraft panels. The tests showed structural integrity well above design limit loads for composite panels preloaded in tension and compression with visible notch impact damage from hard body impact tests. However, blunt impact tests on buckled compression-loaded panels caused large delamination damage regions which lowered plate bending stiffness and significantly reduced compression strengths in buckling.

  7. Damage tolerance evaluation of PEEK (polyether ether ketone) composites: Final report

    SciTech Connect

    Frazier, J.L.

    1988-12-01

    A polyether ether ketone (PEEK) thermoplastic system is currently being evaluated in flight service as a structural element for the US Air Force C-130 transport plane. The particular structure under study is the C-130 belly skin, a fuselage panel that is located on the underside of the aircraft and is subjected to impact from runway debris. A current Air Force objective is to reduce maintenance and replacement requirements of aircraft using lightweight composite structures to replace or supplement existing metal alloy components. The incorporation of lighter weight composite structures would result in aircraft weight reductions, allowing greater range and fuel economy. The impact-damage susceptibility of composite structures often results in strain-limited application of composite materials where the mechanical properties' advantages over traditional metal alloys are not attained. Methods developed to enhance the damage tolerance of composite material systems should increase their potential uses in existing and future aircraft. A materials evaluation program was conducted to determine the possible benefits of interleaving thermoplastic film layers between the plies of a PEEK/graphite composite material system to produce a material system with increased resistance to impact damage. Several laminate designs incorporating PEEK thermoplastic film as an interleaf material were subjected to impacts of various energies and projectile velocities. Mechanical properties of unimpacted, open-hole, and impacted laminate panels were measured to determine the effectiveness of the interleaf concept for improving damage tolerance relative to the existing baseline material. 5 refs., 19 figs., 8 tabs.

  8. Probabilistic Assessment of Fracture Progression in Composite Structures

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.; Minnetyan, Levon; Mauget, Bertrand; Huang, Dade; Addi, Frank

    1999-01-01

    This report describes methods and corresponding computer codes that are used to evaluate progressive damage and fracture and to perform probabilistic assessment in built-up composite structures. Structural response is assessed probabilistically, during progressive fracture. The effects of design variable uncertainties on structural fracture progression are quantified. The fast probability integrator (FPI) is used to assess the response scatter in the composite structure at damage initiation. The sensitivity of the damage response to design variables is computed. The methods are general purpose and are applicable to stitched and unstitched composites in all types of structures and fracture processes starting from damage initiation to unstable propagation and to global structure collapse. The methods are demonstrated for a polymer matrix composite stiffened panel subjected to pressure. The results indicated that composite constituent properties, fabrication parameters, and respective uncertainties have a significant effect on structural durability and reliability. Design implications with regard to damage progression, damage tolerance, and reliability of composite structures are examined.
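The abstract does not detail the fast probability integrator itself; as a hedged stand-in for the response-scatter and sensitivity assessment it describes, a first-order second-moment estimate using finite-difference sensitivities of a toy response function (thin-wall hoop-stress burst pressure; all names and values hypothetical) looks like:

```python
import math

def burst_pressure(thickness, strength, radius=250.0):
    # Toy response: thin-walled cylinder burst pressure from hoop stress,
    # p = strength * t / r, standing in for a progressive-fracture model.
    return strength * thickness / radius

# Assumed means and standard deviations of two design variables.
means = {"thickness": 2.0, "strength": 600.0}   # mm, MPa
sds   = {"thickness": 0.1, "strength": 30.0}

# First-order second-moment: response variance from finite-difference
# sensitivities evaluated at the mean design point.
h, var = 1e-6, 0.0
for k in means:
    hi, lo = dict(means), dict(means)
    hi[k] += h
    lo[k] -= h
    sens = (burst_pressure(**hi) - burst_pressure(**lo)) / (2 * h)
    var += (sens * sds[k]) ** 2

mean_p = burst_pressure(**means)
sd_p = math.sqrt(var)
print(f"burst pressure ~ {mean_p:.2f} +/- {sd_p:.2f} MPa")
```

The per-variable terms (sens * sd)**2 are exactly the sensitivity ranking the abstract mentions: the variable with the largest term dominates the response scatter at damage initiation.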

  9. An assessment of buffer strips for improving damage tolerance of composite laminates at elevated temperature

    NASA Technical Reports Server (NTRS)

    Bigelow, C. A.

    1981-01-01

    Buffer strips greatly improve the damage tolerance of graphite/epoxy laminates loaded in tension. Graphite/polyimide buffer strip panels were made and tested to determine their residual strength at ambient and elevated (177 °C) temperature. Each panel was cut in the center to represent damage. Panels were radiographed and crack-opening displacements were recorded to indicate fracture, fracture arrest, and the extent of damage in the buffer strip after arrest. All panels had the same buffer strip spacing and width. The buffer strip material was 0 deg S-glass/PMR-15. The buffer strips were made by replacing narrow strips of the 0 deg graphite plies with strips of the 0 deg S-glass on either a one-for-one or a two-for-one basis. Half of the panels were heated to 177 ± 3 °C before and during the testing. Elevated temperature did not alter the fracture behavior of the buffer configuration.

  10. Recent developments in the design, testing and impact-damage tolerance of stiffened composite panels

    NASA Technical Reports Server (NTRS)

    Williams, J. G.; Anderson, M. S.; Rhodes, M. D.; Starnes, J. H., Jr.; Stroud, W. J.

    1980-01-01

    The structural technology of laminated filamentary-composite stiffened-panel structures under combined in-plane and lateral loadings is discussed. Emphasis is on analyzing the behavior of the structures under load, determining appropriate structural proportions for weight efficient configurations, and effects of impact damage and geometric imperfections on structural performance. Experimental data on buckling of panels under in-plane compression validate the analysis and sizing methods, and illustrate structural performance and efficiency obtained from representative structures. It is shown that the strength of panels under in-plane compression can be degraded by low-velocity impact damage, and data are presented which indicate that the matrix is a significant factor influencing tolerance to impact damage.

  11. Nitroglycerin induces DNA damage and vascular cell death in the setting of nitrate tolerance.

    PubMed

    Mikhed, Yuliya; Fahrer, Jörg; Oelze, Matthias; Kröller-Schön, Swenja; Steven, Sebastian; Welschof, Philipp; Zinßius, Elena; Stamm, Paul; Kashani, Fatemeh; Roohani, Siyer; Kress, Joana Melanie; Ullmann, Elisabeth; Tran, Lan P; Schulz, Eberhard; Epe, Bernd; Kaina, Bernd; Münzel, Thomas; Daiber, Andreas

    2016-07-01

    Nitroglycerin (GTN) and other organic nitrates are widely used vasodilators. Their side effects are development of nitrate tolerance and endothelial dysfunction. Given the potential of GTN to induce nitro-oxidative stress, we investigated the interaction between nitro-oxidative DNA damage and vascular dysfunction in experimental nitrate tolerance. Cultured endothelial hybridoma cells (EA.hy 926) and Wistar rats were treated with GTN (ex vivo: 10-1000 µM; in vivo: 10, 20 and 50 mg/kg/day for 3 days, s.c.). The level of DNA strand breaks, 8-oxoguanine and O(6)-methylguanine DNA adducts was determined by Comet assay, dot blot and immunohistochemistry. Vascular function was determined by isometric tension recording. DNA adducts and strand breaks were induced by GTN in cells in vitro in a concentration-dependent manner. In vivo GTN administration led to endothelial dysfunction, nitrate tolerance, aortic and cardiac oxidative stress, formation of DNA adducts, stabilization of p53 and apoptotic death of vascular cells in a dose-dependent fashion. Mice lacking O(6)-methylguanine-DNA methyltransferase displayed more vascular O(6)-methylguanine adducts and oxidative stress under GTN therapy than wild-type mice. Although we were not able to prove a causal role of DNA damage in the etiology of nitrate tolerance, the finding of GTN-induced DNA damage such as the mutagenic and toxic adduct O(6)-methylguanine, and of cell death, supports the notion that GTN-based therapy may provoke adverse side effects, including endothelial dysfunction. Further studies are warranted to clarify whether the pro-apoptotic effects of GTN are related to an impaired recovery of patients upon myocardial infarction.

  12. Modeling continuous-fiber reinforced polymer composites for exploration of damage tolerant concepts

    NASA Astrophysics Data System (ADS)

    Matthews, Peter J.

    This work aims to improve the predictive capability for fiber-reinforced polymer matrix composite laminates using the finite element method. A new tool for modeling composite damage was developed which considers the important modes of failure. Well-known micromechanical models were implemented to predict material values for material systems of interest in aerospace applications. These generated material values served as input to intralaminar and interlaminar damage models. A three-dimensional in-plane damage material model was implemented and its behavior verified. Deficiencies in current state-of-the-art interlaminar capabilities were explored using the virtual crack closure technique and the cohesive zone model. A user-defined cohesive element was implemented to establish the importance of traction-separation material constitutive behavior. A novel method for correlation of traction-separation parameters was created. This new damage modeling tool was used for evaluation of novel material systems to improve damage tolerance. Classical laminate plate theory was used in a full-factorial study of layerwise-hybrid laminates. Filament-wound laminated composite cylindrical shells were subjected to quasi-static loading to validate the finite element computational composite damage model. The new modeling tool provides sufficient accuracy and generality for use on a wide range of problems.
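The traction-separation behavior central to cohesive zone modeling is most often idealized as a bilinear law; a minimal sketch (parameter values hypothetical, not the dissertation's correlated ones) shows the shape and the fracture energy it implies:

```python
def bilinear_traction(delta, delta0=0.01, delta_f=0.10, t_max=30.0):
    """Bilinear cohesive law: linear elastic up to (delta0, t_max),
    then linear softening to zero traction at delta_f.
    Illustrative units: delta in mm, traction in MPa."""
    if delta <= 0.0 or delta >= delta_f:
        return 0.0  # unloaded or fully debonded
    if delta < delta0:
        return t_max * delta / delta0          # elastic loading branch
    return t_max * (delta_f - delta) / (delta_f - delta0)  # softening

# Mode-I fracture energy is the area under the triangle: Gc = t_max*delta_f/2
Gc = 0.5 * 30.0 * 0.10
print(bilinear_traction(0.005), bilinear_traction(0.055), Gc)
```

Correlating such a law to experiment amounts to choosing (t_max, Gc) so that simulated load-displacement curves match the measured ones, which is the kind of parameter-correlation task the abstract refers to.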

  13. DNA lesion identity drives choice of damage tolerance pathway in murine cell chromosomes.

    PubMed

    Cohen, Isadora S; Bar, Carmit; Paz-Elizur, Tamar; Ainbinder, Elena; Leopold, Karoline; de Wind, Niels; Geacintov, Nicholas; Livneh, Zvi

    2015-02-18

    DNA-damage tolerance (DDT) via translesion DNA synthesis (TLS) or homology-dependent repair (HDR) functions to bypass DNA lesions encountered during replication, and is critical for maintaining genome stability. Here, we present piggyBlock, a new chromosomal assay that, using piggyBac transposition of DNA containing a known lesion, measures the division of labor between the two DDT pathways. We show that in the absence of DNA damage response, tolerance of the most common sunlight-induced DNA lesion, TT-CPD, is achieved by TLS in mouse embryo fibroblasts. Meanwhile, BP-G, a major smoke-induced DNA lesion, is bypassed primarily by HDR, providing the first evidence for this mechanism being the main tolerance pathway for a biologically important lesion in a mammalian genome. We also show that, far from being a last-resort strategy as it is sometimes portrayed, TLS operates alongside nucleotide excision repair, handling 40% of TT-CPDs in repair-proficient cells. Finally, DDT acts in mouse embryonic stem cells, exhibiting the same pattern—mutagenic TLS included—despite the risk of propagating mutations along all cell lineages. The new method highlights the importance of HDR, and provides an effective tool for studying DDT in mammalian cells. PMID:25589543


  15. Damage Tolerance Testing of a NASA TransHab Derivative Woven Inflatable Module

    NASA Technical Reports Server (NTRS)

    Edgecombe, John; delaFuente, Horacio; Valle, Gerald D.

    2008-01-01

    Current options for Lunar habitat architecture include inflatable habitats and airlocks. Inflatable structures can have mass and volume advantages over conventional structures, but they are perceived to carry additional risk because they are at a lower Technology Readiness Level (TRL) than conventional metallic structures. One of the risks associated with inflatable structures is understanding the tolerance to component damage and the resulting behavior of the system after the damage is introduced. The Damage Tolerance Test (DTT) is designed to study the structural integrity of an expandable structure during and subsequent to induced damage. TransHab was an experimental inflatable module developed at Johnson Space Center in the 1990s. The TransHab design was originally envisioned for use in Mars transits but was also studied as a potential habitat for the International Space Station (ISS). The design of the TransHab module was based on a woven design using an Aramid fabric. Testing of this design demonstrated a high level of predictability and repeatability and good correlation with analytical predictions of stresses and deflections. Based on JSC's experience with the design and analysis of woven inflatable structures, the Damage Tolerance Test article was designed and fabricated using a woven design. The Damage Tolerance Test Article consists of a load-bearing restraint layer, a bladder or gas barrier, and a structural metallic core. The test article restraint layer is fabricated from one-inch-wide Kevlar webbing woven in a basket weave pattern. Underneath the structural restraint layer is the bladder or gas barrier. For this test the bladder was required to maintain pressure for testing only and was not representative of a flight design. The bladder and structural restraint layer attach to the structural core of the module at steel bulkheads at each end. The two bulkheads are separated by a 10-foot center tube which provides

  16. Reduction of female copulatory damage by resilin represents evidence for tolerance in sexual conflict.

    PubMed

    Michels, Jan; Gorb, Stanislav N; Reinhardt, Klaus

    2015-03-01

    Intergenomic evolutionary conflicts increase biological diversity. In sexual conflict, female defence against males is generally assumed to be resistance, which, however, often leads to trait exaggeration but not diversification. Here, we address whether tolerance, a female defence mechanism known from interspecific conflicts, exists in sexual conflict. We examined the traumatic insemination of female bed bugs via cuticle penetration by males, a textbook example of sexual conflict. Confocal laser scanning microscopy revealed large proportions of the soft and elastic protein resilin in the cuticle of the spermalege, the female defence organ. Reduced tissue damage and haemolymph loss were identified as adaptive female benefits from resilin. These did not arise from resistance because microindentation showed that the penetration force necessary to breach the cuticle was significantly lower at the resilin-rich spermalege than at other cuticle sites. Furthermore, a male survival analysis indicated that the spermalege did not impose antagonistic selection on males. Our findings suggest that the specific spermalege material composition evolved to tolerate the traumatic cuticle penetration. They demonstrate the importance of tolerance in sexual conflict and genitalia evolution, extend fundamental coevolution and speciation models and contribute to explaining the evolution of complexity. We propose that tolerance can drive trait diversity. PMID:25673297

  17. Reduction of female copulatory damage by resilin represents evidence for tolerance in sexual conflict

    PubMed Central

    Michels, Jan; Gorb, Stanislav N.; Reinhardt, Klaus

    2015-01-01

    Intergenomic evolutionary conflicts increase biological diversity. In sexual conflict, female defence against males is generally assumed to be resistance, which, however, often leads to trait exaggeration but not diversification. Here, we address whether tolerance, a female defence mechanism known from interspecific conflicts, exists in sexual conflict. We examined the traumatic insemination of female bed bugs via cuticle penetration by males, a textbook example of sexual conflict. Confocal laser scanning microscopy revealed large proportions of the soft and elastic protein resilin in the cuticle of the spermalege, the female defence organ. Reduced tissue damage and haemolymph loss were identified as adaptive female benefits from resilin. These did not arise from resistance because microindentation showed that the penetration force necessary to breach the cuticle was significantly lower at the resilin-rich spermalege than at other cuticle sites. Furthermore, a male survival analysis indicated that the spermalege did not impose antagonistic selection on males. Our findings suggest that the specific spermalege material composition evolved to tolerate the traumatic cuticle penetration. They demonstrate the importance of tolerance in sexual conflict and genitalia evolution, extend fundamental coevolution and speciation models and contribute to explaining the evolution of complexity. We propose that tolerance can drive trait diversity. PMID:25673297

  18. Regulation of Rad6/Rad18 Activity During DNA Damage Tolerance.

    PubMed

    Hedglin, Mark; Benkovic, Stephen J

    2015-01-01

    Replicative polymerases (pols) cannot accommodate damaged template bases, and these pols stall when such offenses are encountered during S phase. Rather than repairing the damaged base, replication past it may proceed via one of two DNA damage tolerance (DDT) pathways, allowing replicative DNA synthesis to resume. In translesion DNA synthesis (TLS), a specialized TLS pol is recruited to catalyze stable, yet often erroneous, nucleotide incorporation opposite damaged template bases. In template switching, the newly synthesized sister strand is used as a damage-free template to synthesize past the lesion. In eukaryotes, both pathways are regulated by the conjugation of ubiquitin to the PCNA sliding clamp by distinct E2/E3 pairs. Whereas monoubiquitination by Rad6/Rad18 mediates TLS, extension of this ubiquitin to a polyubiquitin chain by Ubc13-Mms2/Rad5 routes DDT to the template switching pathway. In this review, we focus on the monoubiquitination of PCNA by Rad6/Rad18 and summarize the current knowledge of how this process is regulated. PMID:26098514

  19. Assessing inspection sensitivity as it relates to damage tolerance in composite rotor hubs

    NASA Astrophysics Data System (ADS)

    Roach, Dennis P.; Rackow, Kirk

    2001-08-01

    Increasing niche applications, growing international markets, and the emergence of advanced rotorcraft technology are expected to greatly increase the population of helicopters over the next decade. In terms of fuselage fatigue, helicopters show similar trends as fixed-wing aircraft. The highly unsteady loads experienced by rotating wings not only directly affect components in the dynamic systems but are also transferred to the fixed airframe structure. Expanded use of rotorcraft has focused attention on the use of new materials and the optimization of maintenance practices. The FAA's Airworthiness Assurance Center (AANC) at Sandia National Labs has joined with Bell Helicopter and other agencies in the rotorcraft industry to evaluate nondestructive inspection (NDI) capabilities in light of the damage tolerance of assorted rotorcraft structure components. Currently, the program's emphasis is on composite rotor hubs. The rotorcraft industry is constantly evaluating new types of lightweight composite materials that not only enhance the safety and reliability of rotor components but also improve performance and extend operating life. Composite rotor hubs have led to the use of bearingless rotor systems that are less complex and require less maintenance than their predecessors. The test facility described in this paper allows the structural stability and damage tolerance of composite hubs to be evaluated using realistic flight load spectra of centrifugal force and bending loads. NDI was integrated into the life-cycle fatigue tests in order to evaluate flaw detection sensitivity simultaneously with residual strength and general rotor hub performance. This paper will describe the evolving use of damage tolerance analysis (DTA) to direct and improve rotorcraft maintenance along with the related use of nondestructive inspections to manage helicopter safety. Overall, the data from this project will provide information to improve the producibility, inspectability

  20. Development of pressure containment and damage tolerance technology for composite fuselage structures in large transport aircraft

    NASA Technical Reports Server (NTRS)

    Smith, P. J.; Thomson, L. W.; Wilson, R. D.

    1986-01-01

    NASA sponsored composites research and development programs were set in place to develop the critical engineering technologies in large transport aircraft structures. This NASA-Boeing program focused on the critical issues of damage tolerance and pressure containment generic to the fuselage structure of large pressurized aircraft. Skin-stringer and honeycomb sandwich composite fuselage shell designs were evaluated to resolve these issues. Analyses were developed to model the structural response of the fuselage shell designs, and a development test program evaluated the selected design configurations to appropriate load conditions.

  1. Effect of Buckling Modes on the Fatigue Life and Damage Tolerance of Stiffened Structures

    NASA Technical Reports Server (NTRS)

    Davila, Carlos G.; Bisagni, Chiara; Rose, Cheryl A.

    2015-01-01

    The postbuckling response and the collapse of composite specimens with a co-cured hat stringer are investigated experimentally and numerically. These specimens are designed to evaluate the postbuckling response and the effect of an embedded defect on the collapse load and the mode of failure. Tests performed using controlled conditions and detailed instrumentation demonstrate that the damage tolerance, fatigue life, and collapse loads are closely tied with the mode of the postbuckling deformation, which can be different between two nominally identical specimens. Modes that tend to open skin/stringer defects are the most damaging to the structure. However, skin/stringer bond defects can also propagate under shearing modes. In the proposed paper, the effects of initial shape imperfections on the postbuckling modes and the interaction between different postbuckling deformations and the propagation of skin/stringer bond defects under quasi-static or fatigue loads will be examined.

  2. Long-term hygrothermal effects on damage tolerance of hybrid composite sandwich panels

    NASA Technical Reports Server (NTRS)

    Ishai, Ori; Hiel, Clement; Luft, Michael

    1995-01-01

    A sandwich construction, composed of hybrid carbon-glass fiber-reinforced plastic skins and a syntactic foam core, was selected as the design concept for a wind tunnel compressor blade application, where high damage tolerance and durability are of major importance. Beam specimens were prepared from open-edge and encapsulated sandwich panels which had previously been immersed in water at different temperatures for periods of up to about two years in the extreme case. Moisture absorption and strength characteristics, as related to time of exposure to hygrothermal conditions, were evaluated for the sandwich specimens and their constituents (skins and foam). After different exposure periods, low-velocity impact damage was inflicted on most sandwich specimens and damage characteristics were related to impact energy. Eventually, the residual compressive strengths of the damaged (and undamaged) beams were determined flexurally. Test results show that exposure to hygrothermal conditions leads to significant strength reductions for foam specimens and open-edge sandwich panels, compared with reference specimens stored at room temperature. In the case of skin specimens and for beams prepared from encapsulated sandwich panels that had previously been exposed to hygrothermal conditions, moisture absorption was found to improve strength relative to the reference case. The beneficial effect of moisture on skin performance was, however, limited to moisture contents below 1% (at 50 C and lower temperatures). Above this moisture level and at higher temperatures, strength degradation of the skin seems to prevail.

  3. Insensitivity to Flaws Leads to Damage Tolerance in Brittle Architected Meta-Materials

    PubMed Central

    Montemayor, L. C.; Wong, W. H.; Zhang, Y.-W.; Greer, J. R.

    2016-01-01

    Cellular solids are instrumental in creating lightweight, strong, and damage-tolerant engineering materials. By extending feature size down to the nanoscale, we simultaneously exploit the architecture and material size effects to substantially enhance structural integrity of architected meta-materials. We discovered that hollow-tube alumina nanolattices with 3D kagome geometry that contained pre-fabricated flaws always failed at the same load as the pristine specimens when the ratio of notch length (a) to sample width (w) is no greater than 1/3, with no correlation between failure occurring at or away from the notch. Samples with (a/w) > 0.3, and notch length-to-unit cell size ratios of (a/l) > 5.2, failed at lower peak loads because of the higher sample compliance when fewer unit cells span the intact region. Finite element simulations show that the failure is governed by purely tensile loading for (a/w) < 0.3 for the same (a/l); bending begins to play a significant role in failure as (a/w) increases. This experimental and computational work demonstrates that the discrete-continuum duality of architected structural meta-materials may give rise to their damage tolerance and insensitivity of failure to the presence of flaws even when made entirely of intrinsically brittle materials. PMID:26837581

  4. Crack resistance, fracture toughness and instability in damage tolerant Al-Li alloys

    NASA Astrophysics Data System (ADS)

    Wanhill, R. J. H.; Schra, L.; 't Hart, W. G. J.

    1990-05-01

    A comparison of the crack resistance (R curve), fracture toughness and instability behavior of candidate damage tolerant aluminum lithium alloys, 2091 and 8090, and the widely used conventional 2024-T3 alloy is addressed. The 2091 alloy was in three heat treatment conditions, T8X, TX and TY, all artificially aged. The 8090 alloy was in the T81 condition. The crack resistances and fracture toughnesses of 2091-T8X and 8090-T81 were similar to those of 2024-T3, but at a 50 MPa lower strength level. The crack resistances and fracture toughnesses of 2091-TX and 2091-TY were much inferior. In all cases, stable (slow) crack growth was ductile, but unstable crack growth in 2091-TX and 2091-TY was 100 percent intergranular and macroscopically brittle. Unstable crack growth in 2091-T8X was 50 percent intergranular and macroscopically ductile. Fractographic analysis indicated the 2091-TX and 2091-TY alloys to be sensitive to dynamic effects, such that the dynamic fracture toughness could be significantly lower than the quasistatic fracture toughness. This may also be true of 2091-T8X. These results point out that fail safe crack arrest tests should be included in any evaluation of damage tolerant Al-Li sheet alloys for aircraft structures.

  5. Ultra-strong and damage tolerant metallic bulk materials: A lesson from nanostructured pearlitic steel wires

    NASA Astrophysics Data System (ADS)

    Hohenwarter, A.; Völker, B.; Kapp, M. W.; Li, Y.; Goto, S.; Raabe, D.; Pippan, R.

    2016-09-01

    Structural materials used for safety critical applications require high strength and simultaneously high resistance against crack growth, referred to as damage tolerance. However, the two properties typically exclude each other and research efforts towards ever stronger materials are hampered by drastic loss of fracture resistance. Therefore, future development of novel ultra-strong bulk materials requires a fundamental understanding of the toughness determining mechanisms. As model material we use today’s strongest metallic bulk material, namely, a nanostructured pearlitic steel wire, and measured the fracture toughness on micron-sized specimens in different crack growth directions and found an unexpected strong anisotropy in the fracture resistance. Along the wire axis the material reveals ultra-high strength combined with so far unprecedented damage tolerance. We attribute this excellent property combination to the anisotropy in the fracture toughness inducing a high propensity for micro-crack formation parallel to the wire axis. This effect causes a local crack tip stress relaxation and enables the high fracture toughness without being detrimental to the material’s strength.

  6. Ultra-strong and damage tolerant metallic bulk materials: A lesson from nanostructured pearlitic steel wires

    PubMed Central

    Hohenwarter, A.; Völker, B.; Kapp, M. W.; Li, Y.; Goto, S.; Raabe, D.; Pippan, R.

    2016-01-01

    Structural materials used for safety critical applications require high strength and simultaneously high resistance against crack growth, referred to as damage tolerance. However, the two properties typically exclude each other and research efforts towards ever stronger materials are hampered by drastic loss of fracture resistance. Therefore, future development of novel ultra-strong bulk materials requires a fundamental understanding of the toughness determining mechanisms. As model material we use today’s strongest metallic bulk material, namely, a nanostructured pearlitic steel wire, and measured the fracture toughness on micron-sized specimens in different crack growth directions and found an unexpected strong anisotropy in the fracture resistance. Along the wire axis the material reveals ultra-high strength combined with so far unprecedented damage tolerance. We attribute this excellent property combination to the anisotropy in the fracture toughness inducing a high propensity for micro-crack formation parallel to the wire axis. This effect causes a local crack tip stress relaxation and enables the high fracture toughness without being detrimental to the material’s strength. PMID:27624220

  7. Insensitivity to Flaws Leads to Damage Tolerance in Brittle Architected Meta-Materials

    NASA Astrophysics Data System (ADS)

    Montemayor, L. C.; Wong, W. H.; Zhang, Y.-W.; Greer, J. R.

    2016-02-01

    Cellular solids are instrumental in creating lightweight, strong, and damage-tolerant engineering materials. By extending feature size down to the nanoscale, we simultaneously exploit the architecture and material size effects to substantially enhance structural integrity of architected meta-materials. We discovered that hollow-tube alumina nanolattices with 3D kagome geometry that contained pre-fabricated flaws always failed at the same load as the pristine specimens when the ratio of notch length (a) to sample width (w) is no greater than 1/3, with no correlation between failure occurring at or away from the notch. Samples with (a/w) > 0.3, and notch length-to-unit cell size ratios of (a/l) > 5.2, failed at lower peak loads because of the higher sample compliance when fewer unit cells span the intact region. Finite element simulations show that the failure is governed by purely tensile loading for (a/w) < 0.3 for the same (a/l); bending begins to play a significant role in failure as (a/w) increases. This experimental and computational work demonstrates that the discrete-continuum duality of architected structural meta-materials may give rise to their damage tolerance and insensitivity of failure to the presence of flaws even when made entirely of intrinsically brittle materials.
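
    The quantitative thresholds reported in this abstract (failure at the pristine load for a/w ≤ 1/3; reduced peak load when a/w > 0.3 and a/l > 5.2, with bending growing in importance as a/w increases) can be collected into a small illustrative classifier. This is a hypothetical helper written for this listing, not code from the paper, and the regime labels are paraphrases of the abstract's findings:

    ```python
    def notch_regime(a: float, w: float, l: float) -> str:
        """Classify a notched nanolattice sample by the thresholds quoted
        in the abstract: a = notch length, w = sample width, l = unit cell size."""
        if a / w <= 1 / 3:
            # Flaws this small did not reduce the failure load of the samples.
            return "flaw-insensitive: fails at the pristine load, tension-dominated"
        if a / l > 5.2:
            # Few intact unit cells span the ligament; compliance lowers peak load.
            return "compliance-limited: lower peak load, bending contributes"
        return "transition: bending increasingly important as a/w grows"

    print(notch_regime(a=2.0, w=10.0, l=0.5))   # a/w = 0.2 -> flaw-insensitive
    print(notch_regime(a=4.0, w=10.0, l=0.5))   # a/w = 0.4, a/l = 8 -> compliance-limited
    ```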

  8. Ultra-strong and damage tolerant metallic bulk materials: A lesson from nanostructured pearlitic steel wires.

    PubMed

    Hohenwarter, A; Völker, B; Kapp, M W; Li, Y; Goto, S; Raabe, D; Pippan, R

    2016-09-14

    Structural materials used for safety critical applications require high strength and simultaneously high resistance against crack growth, referred to as damage tolerance. However, the two properties typically exclude each other and research efforts towards ever stronger materials are hampered by drastic loss of fracture resistance. Therefore, future development of novel ultra-strong bulk materials requires a fundamental understanding of the toughness determining mechanisms. As model material we use today's strongest metallic bulk material, namely, a nanostructured pearlitic steel wire, and measured the fracture toughness on micron-sized specimens in different crack growth directions and found an unexpected strong anisotropy in the fracture resistance. Along the wire axis the material reveals ultra-high strength combined with so far unprecedented damage tolerance. We attribute this excellent property combination to the anisotropy in the fracture toughness inducing a high propensity for micro-crack formation parallel to the wire axis. This effect causes a local crack tip stress relaxation and enables the high fracture toughness without being detrimental to the material's strength.

  9. Strain rate and manufacturing technique effects on the damage tolerance of composite laminates

    NASA Astrophysics Data System (ADS)

    McManus, Hugh L.; Mak, Yew-Po

    1993-04-01

    The tensile failure and damage tolerance of graphite/epoxy laminates under uniaxial tensile loading at strain rates ranging from 0.0042 epsilon/min to 2 epsilon/min was studied experimentally. Two materials (IM7/977-2 and AS4/938) and two manufacturing methods (manual tape layup and automated tow placement) were used. The failure strengths, failure modes and laminate properties of both unnotched and notched specimens were measured. IM7/977-2 specimens were insensitive to strain rates, except for observed differences in the failure modes of the notched specimens. The unnotched response of the AS4/938 laminates was not dependent on strain rate or manufacturing technique, but their damage tolerance was dependent on both factors. Notched specimens were up to 20 percent weaker at the highest strain rates. Tow-placed specimens were less notch sensitive than tape layup specimens, and also less sensitive to strain rates. Strain rates and manufacturing techniques appeared to affect the progression from initiation of failure at the notch tip to final failure.

  10. Ultra-strong and damage tolerant metallic bulk materials: A lesson from nanostructured pearlitic steel wires.

    PubMed

    Hohenwarter, A; Völker, B; Kapp, M W; Li, Y; Goto, S; Raabe, D; Pippan, R

    2016-01-01

    Structural materials used for safety critical applications require high strength and simultaneously high resistance against crack growth, referred to as damage tolerance. However, the two properties typically exclude each other and research efforts towards ever stronger materials are hampered by drastic loss of fracture resistance. Therefore, future development of novel ultra-strong bulk materials requires a fundamental understanding of the toughness determining mechanisms. As model material we use today's strongest metallic bulk material, namely, a nanostructured pearlitic steel wire, and measured the fracture toughness on micron-sized specimens in different crack growth directions and found an unexpected strong anisotropy in the fracture resistance. Along the wire axis the material reveals ultra-high strength combined with so far unprecedented damage tolerance. We attribute this excellent property combination to the anisotropy in the fracture toughness inducing a high propensity for micro-crack formation parallel to the wire axis. This effect causes a local crack tip stress relaxation and enables the high fracture toughness without being detrimental to the material's strength. PMID:27624220

  11. Probabilistic analysis of cascade failure dynamics in complex network

    NASA Astrophysics Data System (ADS)

    Zhang, Ding-Xue; Zhao, Dan; Guan, Zhi-Hong; Wu, Yonghong; Chi, Ming; Zheng, Gui-Lin

    2016-11-01

    The impact of initial load and tolerance parameter distributions on cascade failure is investigated. Using mean field theory, a probabilistic cascade failure model is established. Based on the model, the damage caused by an attack of a given size can be predicted, and the critical attack size is derived from the condition for the cascade to end without collapse. For networks with randomly distributed parameters, the critical attack size is larger than in the case of a constant tolerance parameter. Comparing three typical distributions, simulation results indicate that a network whose initial load and tolerance parameter both follow a Weibull distribution performs better than the others.
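
    The mean-field cascade mechanism summarized above can be illustrated with a toy simulation. This is a minimal sketch written for this listing, not the authors' model: it assumes each node carries a Weibull-distributed initial load and tolerance parameter α, a capacity of (1 + α) × load, and an even mean-field redistribution of shed load over surviving nodes; all parameter values are illustrative:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def cascade(n=1000, attack_frac=0.05, shape=2.0, scale=1.0):
        """Toy mean-field cascade: a node fails when redistributed load
        from failed nodes pushes it past its capacity (1 + alpha) * load."""
        load = rng.weibull(shape, n) * scale      # Weibull initial loads
        alpha = rng.weibull(shape, n) * 0.5       # Weibull tolerance parameters
        capacity = load * (1.0 + alpha)
        failed = np.zeros(n, dtype=bool)
        failed[: int(attack_frac * n)] = True     # initial attack
        while True:
            survivors = ~failed
            if not survivors.any():
                break                              # total collapse
            # mean-field step: shed load spreads evenly over survivors
            extra = load[failed].sum() / survivors.sum()
            newly = survivors & (load + extra > capacity)
            if not newly.any():
                break                              # cascade has ended
            failed |= newly
        return failed.mean()                       # final failed fraction

    print(cascade())
    ```

    Sweeping `attack_frac` in such a sketch locates the attack size at which the failed fraction jumps toward 1, a crude stand-in for the critical attack size the paper derives analytically.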

  12. Damage tolerance assessment of bonded composite doubler repairs for commercial aircraft applications

    SciTech Connect

    Roach, D.

    1998-08-01

    The Federal Aviation Administration has sponsored a project at its Airworthiness Assurance NDI Validation Center (AANC) to validate the use of bonded composite doublers on commercial aircraft. A specific application was chosen in order to provide a proof-of-concept driving force behind this test and analysis project. However, the data stemming from this study serves as a comprehensive evaluation of bonded composite doublers for general use. The associated documentation package provides guidance regarding the design, analysis, installation, damage tolerance, and nondestructive inspection of these doublers. This report describes a series of fatigue and strength tests which were conducted to study the damage tolerance of Boron-Epoxy composite doublers. Tension-tension fatigue and ultimate strength tests attempted to grow engineered flaws in coupons with composite doublers bonded to aluminum skin. An array of design parameters, including various flaw scenarios, the effects of surface impact, and other off-design conditions, were studied. The structural tests were used to: (1) assess the potential for interply delaminations and disbonds between the aluminum and the laminate, and (2) determine the load transfer and crack mitigation capabilities of composite doublers in the presence of severe defects. A series of specimens were subjected to ultimate tension tests in order to determine strength values and failure modes. It was demonstrated that even in the presence of extensive damage in the original structure (cracks, material loss) and in spite of non-optimum installations (adhesive disbonds), the composite doubler allowed the structure to survive more than 144,000 cycles of fatigue loading. Installation flaws in the composite laminate did not propagate over 216,000 fatigue cycles. Furthermore, the added impediments of impact--severe enough to deform the parent aluminum skin--and hot-wet exposure did not affect the doubler's performance. 
Since the tests were conducted

  13. A damage tolerance comparison of IM7/8551 and IM8G/8553 carbon/epoxy composites

    NASA Technical Reports Server (NTRS)

    Lance, D. G.; Nettles, A. T.

    1991-01-01

    A damage tolerance study of two new toughened carbon fiber/epoxy resin systems was undertaken as a continuation of ongoing work into screening new composites for resistance to foreign object impact. This report is intended to be a supplement to NASA TP 3029, in which four new fiber/resin systems were tested for damage tolerance. Instrumented drop weight impact testing was used to inflict damage on 16-ply quasi-isotropic specimens. Instrumented output data and cross-sectional examinations of the damage zone were utilized to quantify the damage. It was found that the two fiber/resin systems tested in this study were much more impact resistant than an untoughened composite such as T300/934, but were not as impact resistant as other materials previously studied.

  14. Coordination of DNA damage tolerance mechanisms with cell cycle progression in fission yeast

    PubMed Central

    Callegari, A. John; Kelly, Thomas J.

    2016-01-01

    ABSTRACT DNA damage tolerance (DDT) mechanisms allow cells to synthesize a new DNA strand when the template is damaged. Many mutations resulting from DNA damage in eukaryotes are generated during DDT when cells use the mutagenic translesion polymerases, Rev1 and Polζ, rather than mechanisms with higher fidelity. The coordination among DDT mechanisms is not well understood. We used live-cell imaging to study the function of DDT mechanisms throughout the cell cycle of the fission yeast Schizosaccharomyces pombe. We report that checkpoint-dependent mitotic delay provides a cellular mechanism to ensure the completion of high fidelity DDT, largely by homology-directed repair (HDR). DDT by mutagenic polymerases is suppressed during the checkpoint delay by a mechanism dependent on Rad51 recombinase. When cells pass the G2/M checkpoint and can no longer delay mitosis, they completely lose the capacity for HDR and simultaneously exhibit a requirement for Rev1 and Polζ. Thus, DDT is coordinated with the checkpoint response so that the activity of mutagenic polymerases is confined to a vulnerable period of the cell cycle when checkpoint delay and HDR are not possible. PMID:26652183

  15. Coordination of DNA damage tolerance mechanisms with cell cycle progression in fission yeast.

    PubMed

    Callegari, A John; Kelly, Thomas J

    2016-01-01

    DNA damage tolerance (DDT) mechanisms allow cells to synthesize a new DNA strand when the template is damaged. Many mutations resulting from DNA damage in eukaryotes are generated during DDT when cells use the mutagenic translesion polymerases, Rev1 and Polζ, rather than mechanisms with higher fidelity. The coordination among DDT mechanisms is not well understood. We used live-cell imaging to study the function of DDT mechanisms throughout the cell cycle of the fission yeast Schizosaccharomyces pombe. We report that checkpoint-dependent mitotic delay provides a cellular mechanism to ensure the completion of high fidelity DDT, largely by homology-directed repair (HDR). DDT by mutagenic polymerases is suppressed during the checkpoint delay by a mechanism dependent on Rad51 recombinase. When cells pass the G2/M checkpoint and can no longer delay mitosis, they completely lose the capacity for HDR and simultaneously exhibit a requirement for Rev1 and Polζ. Thus, DDT is coordinated with the checkpoint response so that the activity of mutagenic polymerases is confined to a vulnerable period of the cell cycle when checkpoint delay and HDR are not possible. PMID:26652183

  16. Role of interfaces in the design of ultra-high strength, radiation damage tolerant nanocomposites

    SciTech Connect

    Misra, Amit; Wang, Yongqiang; Nastasi, Michael A; Baldwin, Jon K; Wei, Qiangmin; Li, Nan; Mara, Nathan; Zhang, Xinghang; Fu, Engang; Anderoglu, Osman; Li, Hongqi; Bhattacharyya, Dhriti

    2010-12-09

    The combination of high strength and high radiation damage tolerance in nanolaminate composites can be achieved when the individual layers in these composites are only a few nanometers thick and contain special interfaces that act both as obstacles to slip, as well as sinks for radiation-induced defects. The morphological and phase stabilities and strength and ductility of these nano-composites under ion irradiation are explored as a function of layer thickness, temperature and interface structure. Magnetron sputtered metallic multilayers such as Cu-Nb and V-Ag with a range of individual layer thickness from approximately 2 nm to 50 nm and the corresponding 1000 nm thick single layer films were implanted with helium ions at room temperature. Cross-sectional Transmission Electron Microscopy (TEM) was used to measure the distribution of helium bubbles and correlated with the helium concentration profile measured via ion beam analysis techniques to obtain the helium concentration at which bubbles are detected in TEM. It was found that in multilayers the minimum helium concentration to form bubbles (approximately 1 nm in size) that are easily resolved in through-focus TEM imaging was several atomic %, orders of magnitude higher than that in single layer metal films. This observation is consistent with an increased solubility of helium at interfaces that is predicted by atomistic modeling of the atomic structures of fcc-bcc interfaces. At helium concentrations as high as 7 at.%, a uniform distribution of 1 nm diameter bubbles results in negligible irradiation hardening and loss of deformability in multilayers with layer thicknesses of a few nanometers. The control of atomic structures of interfaces to produce high helium solubility at interfaces is crucial in the design of nano-composite materials that are radiation damage tolerant. Reduced radiation damage also leads to a reduction in the irradiation hardening, particularly at layer thickness of approximately 5 nm

  17. Genomic assay reveals tolerance of DNA damage by both translesion DNA synthesis and homology-dependent repair in mammalian cells.

    PubMed

    Izhar, Lior; Ziv, Omer; Cohen, Isadora S; Geacintov, Nicholas E; Livneh, Zvi

    2013-04-16

    DNA lesions can block replication forks and lead to the formation of single-stranded gaps. These replication complications are mitigated by DNA damage tolerance mechanisms, which prevent deleterious outcomes such as cell death, genomic instability, and carcinogenesis. The two main tolerance strategies are translesion DNA synthesis (TLS), in which low-fidelity DNA polymerases bypass the blocking lesion, and homology-dependent repair (HDR; postreplication repair), which is based on the homologous sister chromatid. Here we describe a unique high-resolution method for the simultaneous analysis of TLS and HDR across defined DNA lesions in mammalian genomes. The method is based on insertion of plasmids carrying defined site-specific DNA lesions into mammalian chromosomes, using phage integrase-mediated integration. Using this method we show that mammalian cells use HDR to tolerate DNA damage in their genome. Moreover, analysis of the tolerance of the UV light-induced 6-4 photoproduct, the tobacco smoke-induced benzo[a]pyrene-guanine adduct, and an artificial trimethylene insert shows that each of these three lesions is tolerated by both TLS and HDR. We also determined the specificity of nucleotide insertion opposite these lesions during TLS in human genomes. This unique method will be useful in elucidating the mechanism of DNA damage tolerance in mammalian chromosomes and their connection to pathological processes such as carcinogenesis. PMID:23530190

  18. Plasticity and ductility in graphene oxide through a mechanochemically induced damage tolerance mechanism

    PubMed Central

    Wei, Xiaoding; Mao, Lily; Soler-Crespo, Rafael A.; Paci, Jeffrey T.; Espinosa, Horacio D.

    2015-01-01

    The ability to bias chemical reaction pathways is a fundamental goal for chemists and material scientists to produce innovative materials. Recently, two-dimensional materials have emerged as potential platforms for exploring novel mechanically activated chemical reactions. Here we report a mechanochemical phenomenon in graphene oxide membranes, covalent epoxide-to-ether functional group transformations that deviate from epoxide ring-opening reactions, discovered through nanomechanical experiments and density functional-based tight binding calculations. These mechanochemical transformations in a two-dimensional system are directionally dependent, and confer pronounced plasticity and damage tolerance to graphene oxide monolayers. Additional experiments on chemically modified graphene oxide membranes, with ring-opened epoxide groups, verify this unique deformation mechanism. These studies establish graphene oxide as a two-dimensional building block with highly tuneable mechanical properties for the design of high-performance nanocomposites, and stimulate the discovery of new bond-selective chemical transformations in two-dimensional materials. PMID:26289729

  19. Plasticity and ductility in graphene oxide through a mechanochemically induced damage tolerance mechanism

    NASA Astrophysics Data System (ADS)

    Wei, Xiaoding; Mao, Lily; Soler-Crespo, Rafael A.; Paci, Jeffrey T.; Huang, Jiaxing; Nguyen, Sonbinh T.; Espinosa, Horacio D.

    2015-08-01

    The ability to bias chemical reaction pathways is a fundamental goal for chemists and material scientists to produce innovative materials. Recently, two-dimensional materials have emerged as potential platforms for exploring novel mechanically activated chemical reactions. Here we report a mechanochemical phenomenon in graphene oxide membranes, covalent epoxide-to-ether functional group transformations that deviate from epoxide ring-opening reactions, discovered through nanomechanical experiments and density functional-based tight binding calculations. These mechanochemical transformations in a two-dimensional system are directionally dependent, and confer pronounced plasticity and damage tolerance to graphene oxide monolayers. Additional experiments on chemically modified graphene oxide membranes, with ring-opened epoxide groups, verify this unique deformation mechanism. These studies establish graphene oxide as a two-dimensional building block with highly tuneable mechanical properties for the design of high-performance nanocomposites, and stimulate the discovery of new bond-selective chemical transformations in two-dimensional materials.

  20. Transparency and damage tolerance of patternable omniphobic lubricated surfaces based on inverse colloidal monolayers

    NASA Astrophysics Data System (ADS)

    Vogel, Nicolas; Belisle, Rebecca A.; Hatton, Benjamin; Wong, Tak-Sing; Aizenberg, Joanna

    2013-07-01

    A transparent coating that repels a wide variety of liquids, prevents staining, is capable of self-repair and is robust towards mechanical damage can have a broad technological impact, from solar cell coatings to self-cleaning optical devices. Here we employ colloidal templating to design transparent, nanoporous surface structures. A lubricant can be firmly locked into the structures and, owing to its fluidic nature, forms a defect-free, self-healing interface that eliminates the pinning of a second liquid applied to its surface, leading to efficient liquid repellency, prevention of adsorption of liquid-borne contaminants, and reduction of ice adhesion strength. We further show how this method can be applied to locally pattern the repellent character of the substrate, thus opening opportunities to spatially confine any simple or complex fluids. The coating is highly defect-tolerant due to its interconnected, honeycomb wall structure, and repellency prevails after the application of strong shear forces and mechanical damage. The regularity of the coating allows us to understand and predict the stability or failure of repellency as a function of lubricant layer thickness and defect distribution based on a simple geometric model.
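The "simple geometric model" cited in this record is not specified in the abstract. As a purely hypothetical toy sketch of how such a thickness-versus-defect stability criterion might be expressed, the following assumes (my assumption, not the authors') that repellency fails when a surface defect protrudes above the lubricant film, and that defect heights follow a normal distribution; all function names and parameters are illustrative.

```python
import math

def failure_probability(film_thickness_nm, defect_mean_nm, defect_sd_nm):
    """Hypothetical criterion: a defect breaches the lubricant overlayer
    (and can pin a droplet) when its height exceeds the film thickness.
    Assuming defect heights ~ Normal(mean, sd), the per-defect breach
    probability is the Gaussian upper tail above the film thickness."""
    z = (film_thickness_nm - defect_mean_nm) / defect_sd_nm
    return 0.5 * math.erfc(z / math.sqrt(2.0))

def repellency_survives(film_thickness_nm, defect_heights_nm):
    # Deterministic check against a list of measured defect heights:
    # repellency prevails only if every defect stays submerged.
    return all(h < film_thickness_nm for h in defect_heights_nm)
```

Under this toy criterion, a thick film over shallow defects gives a negligible failure probability, while a film thinner than the tallest defect fails deterministically, qualitatively matching the thickness/defect-distribution dependence the abstract describes.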

  1. Optimal Design and Damage Tolerance Verification of an Isogrid Structure for Helicopter Application

    NASA Technical Reports Server (NTRS)

    Baker, Donald J.; Fudge, Jack; Ambur, Damodar R.; Kassapoglou, Christos

    2003-01-01

    A composite isogrid panel design for application to a rotorcraft fuselage is presented. An optimum panel design for the lower fuselage of the rotorcraft that is subjected to combined in-plane compression and shear loads was generated using a design tool that utilizes a smeared-stiffener theory in conjunction with a genetic algorithm. A design feature was introduced along the edges of the panel that facilitates introduction of loads into the isogrid panel without producing undesirable local bending gradients. A low-cost manufacturing method for the isogrid panel that incorporates these design details is also presented. Axial compression tests were conducted on the undamaged and low-speed impact damaged panels to demonstrate the damage tolerance of this isogrid panel. A combined loading test fixture was designed and utilized that allowed simultaneous application of compression and shear loads to the test specimen. Results from finite element analyses are presented for the isogrid panel designs and these results are compared with experimental results. This study illustrates the isogrid concept to be a viable candidate for application to the helicopter lower fuselage structure.

  2. Transparency and damage tolerance of patternable omniphobic lubricated surfaces based on inverse colloidal monolayers

    SciTech Connect

    Vogel, Nicolas; Belisle, Rebecca A.; Hatton, Benjamin; Wong, Tak-Sing; Aizenberg, Joanna

    2013-07-31

    A transparent coating that repels a wide variety of liquids, prevents staining, is capable of self-repair and is robust towards mechanical damage can have a broad technological impact, from solar cell coatings to self-cleaning optical devices. Here we employ colloidal templating to design transparent, nanoporous surface structures. A lubricant can be firmly locked into the structures and, owing to its fluidic nature, forms a defect-free, self-healing interface that eliminates the pinning of a second liquid applied to its surface, leading to efficient liquid repellency, prevention of adsorption of liquid-borne contaminants, and reduction of ice adhesion strength. We further show how this method can be applied to locally pattern the repellent character of the substrate, thus opening opportunities to spatially confine any simple or complex fluids. The coating is highly defect-tolerant due to its interconnected, honeycomb wall structure, and repellency prevails after the application of strong shear forces and mechanical damage. The regularity of the coating allows us to understand and predict the stability or failure of repellency as a function of lubricant layer thickness and defect distribution based on a simple geometric model.

  3. Transparency and damage tolerance of patternable omniphobic lubricated surfaces based on inverse colloidal monolayers

    DOE PAGES

    Vogel, Nicolas; Belisle, Rebecca A.; Hatton, Benjamin; Wong, Tak-Sing; Aizenberg, Joanna

    2013-07-31

    A transparent coating that repels a wide variety of liquids, prevents staining, is capable of self-repair and is robust towards mechanical damage can have a broad technological impact, from solar cell coatings to self-cleaning optical devices. Here we employ colloidal templating to design transparent, nanoporous surface structures. A lubricant can be firmly locked into the structures and, owing to its fluidic nature, forms a defect-free, self-healing interface that eliminates the pinning of a second liquid applied to its surface, leading to efficient liquid repellency, prevention of adsorption of liquid-borne contaminants, and reduction of ice adhesion strength. We further show how this method can be applied to locally pattern the repellent character of the substrate, thus opening opportunities to spatially confine any simple or complex fluids. The coating is highly defect-tolerant due to its interconnected, honeycomb wall structure, and repellency prevails after the application of strong shear forces and mechanical damage. The regularity of the coating allows us to understand and predict the stability or failure of repellency as a function of lubricant layer thickness and defect distribution based on a simple geometric model.

  4. Damage tolerance and arrest characteristics of pressurized graphite/epoxy tape cylinders

    NASA Technical Reports Server (NTRS)

    Ranniger, Claudia U.; Lagace, Paul A.; Graves, Michael J.

    1993-01-01

    An investigation of the damage tolerance and damage arrest characteristics of internally-pressurized graphite/epoxy tape cylinders with axial notches was conducted. An existing failure prediction methodology, developed and verified for quasi-isotropic graphite/epoxy fabric cylinders, was investigated for applicability to general tape layups. In addition, the effect of external circumferential stiffening bands on the direction of fracture path propagation and possible damage arrest was examined. Quasi-isotropic (90/0/±45)s and structurally anisotropic (±45/0)s and (±45/90)s coupons and cylinders were constructed from AS4/3501-6 graphite/epoxy tape. Notched and unnotched coupons were tested in tension and the data correlated using the equation of Mar and Lin. Cylinders with through-thickness axial slits were pressurized to failure, achieving a far-field two-to-one biaxial stress state. Experimental failure pressures of the (90/0/±45)s cylinders agreed with predicted values for all cases but the specimen with the smallest slit. However, the failure pressures of the structurally anisotropic cylinders, (±45/0)s and (±45/90)s, were above the values predicted utilizing the predictive methodology in all cases. Possible factors neglected by the predictive methodology include structural coupling in the laminates and axial loading of the cylindrical specimens. Furthermore, applicability of the predictive methodology depends on the similarity of initial fracture modes in the coupon specimens and the cylinder specimens of the same laminate type. The existence of splitting, which may be exacerbated by the axial loading in the cylinders, shows that this condition is not always met. The circumferential stiffeners were generally able to redirect fracture propagation from longitudinal to circumferential. A quantitative assessment for stiffener effectiveness in containing the fracture, based on cylinder

  5. Structurally Integrated, Damage Tolerant Thermal Spray Coatings: Processing Effects on Surface and System Functionalities

    NASA Astrophysics Data System (ADS)

    Vackel, Andrew

    Thermal Spray (TS) coatings have seen extensive application as protective surfaces to enhance the service life of substrates prone to damage in their operating environment (wear, corrosion, heat, etc.). With the advent of high-velocity TS processes, the ability to deposit highly dense (>99%) metallic and cermet coatings has further enhanced the protective ability of these coatings. In addition to surface functionality, the influence of the coating application on the mechanical performance of a coated component is of great concern when such a component will experience either static or cyclic loading during service. Using a process mapping methodology, the processing-property interplay is explored, in terms of relevant mechanical properties, for coating materials meant to provide damage-tolerant surfaces or structural restoration. Most importantly, the residual stresses inherent in TS-deposited coatings are shown to play a significant role in the integrated mechanical performance of these coatings. Unique to high-velocity TS processes is the ability to produce compressive stresses within the deposit from the cold working induced by the high kinetic energy particles upon impact. The extent of these formation stresses is explored for different coating materials, as well as the processing influence. The ability of dense TS coatings to carry significant structural load and synergistically strengthen coated tensile specimens is demonstrated as a function of coating material, processing, and thickness. The sharing of load between the substrate and otherwise brittle coating enables higher loads before yield for the bi-material specimens, offering a methodology to improve the tensile performance of coated components for structural repair or multi-functionality (surface and structure). The concern of cyclic fatigue damage in coated components is explored, since the majority of service applications are designed for loading well below the yield point. The role of

  6. Integration of Principles of Systems Biology and Radiation Biology: Toward Development of in silico Models to Optimize IUdR-Mediated Radiosensitization of DNA Mismatch Repair Deficient (Damage Tolerant) Human Cancers

    PubMed Central

    Kinsella, Timothy J.; Gurkan-Cavusoglu, Evren; Du, Weinan; Loparo, Kenneth A.

    2011-01-01

    Over the last 7 years, we have focused our experimental and computational research efforts on improving our understanding of the biochemical, molecular, and cellular processing of iododeoxyuridine (IUdR) and ionizing radiation (IR) induced DNA base damage by DNA mismatch repair (MMR). These coordinated research efforts, sponsored by the National Cancer Institute Integrative Cancer Biology Program (ICBP), brought together system scientists with expertise in engineering, mathematics, and complex systems theory and translational cancer researchers with expertise in radiation biology. Our overall goal was to begin to develop computational models of IUdR- and/or IR-induced base damage processing by MMR that may provide new clinical strategies to optimize IUdR-mediated radiosensitization in MMR deficient (MMR−) “damage tolerant” human cancers. Using multiple scales of experimental testing, ranging from purified protein systems to in vitro (cellular) and to in vivo (human tumor xenografts in athymic mice) models, we have begun to integrate and interpolate these experimental data with hybrid stochastic biochemical models of MMR damage processing and probabilistic cell cycle regulation models through a systems biology approach. In this article, we highlight the results and current status of our integration of radiation biology approaches and computational modeling to enhance IUdR-mediated radiosensitization in MMR− damage tolerant cancers. PMID:22649757

  7. Fuel containment and damage tolerance in large composite primary aircraft structures. Phase 2: Testing

    NASA Technical Reports Server (NTRS)

    Sandifer, J. P.; Denny, A.; Wood, M. A.

    1985-01-01

    Technical issues associated with fuel containment and damage tolerance of composite wing structures for transport aircraft were investigated. Material evaluation tests were conducted on two toughened resin composites: Celion/HX1504 and Celion/5245. These consisted of impact, tension, compression, edge delamination, and double cantilever beam tests. Another test series was conducted on graphite/epoxy box beams simulating a wing cover to spar cap joint configuration of a pressurized fuel tank. These tests evaluated the effectiveness of sealing methods with various fastener types and spacings under fatigue loading and with pressurized fuel. Another test series evaluated the ability of the selected coatings, film, and materials to prevent fuel leakage through 32-ply AS4/2220-1 laminates at various impact energy levels. To verify the structural integrity of the technology demonstration article structural details, tests were conducted on blade stiffened panels and sections. Compression tests were performed on undamaged and impacted stiffened AS4/2220-1 panels and smaller element tests to evaluate stiffener pull-off, side load and failsafe properties. Compression tests were also performed on panels subjected to Zone 2 lightning strikes. All of these data were integrated into a demonstration article representing a moderately loaded area of a transport wing. This test combined lightning strike, pressurized fuel, impact, impact repair, fatigue and residual strength.

  8. Damage Tolerance Assessment of Friction Pull Plug Welds in an Aluminum Alloy

    NASA Technical Reports Server (NTRS)

    McGill, Preston; Burkholder, Jonathan

    2012-01-01

    Friction stir welding is a solid state welding process used in the fabrication of cryogenic propellant tanks. Self-reacting friction stir welding is one variation of the friction stir weld process being developed for manufacturing tanks. Friction pull plug welding is used to seal the exit hole that remains in a circumferential self-reacting friction stir weld. A friction plug weld placed in a self-reacting friction stir weld results in a non-homogenous weld joint where the initial weld, plug weld, their respective heat affected zones and the base metal all interact. The welded joint is a composite plastically deformed material system with a complex residual stress field. In order to address damage tolerance concerns associated with friction plug welds in safety critical structures, such as propellant tanks, nondestructive inspection and proof testing may be required to screen hardware for mission critical defects. The efficacy of the nondestructive evaluation or the proof test is based on an assessment of the critical flaw size. Test data relating residual strength capability to flaw size in an aluminum alloy friction plug weld will be presented.

  9. DNA Damage Tolerance and a Web of Connections with DNA Repair at Yale

    PubMed Central

    Wood, Richard D.

    2013-01-01

    This short article summarizes some of the research carried out recently by my laboratory colleagues on the function of DNA polymerase zeta (polζ) in mammalian cells. Some personal background is also described, relevant to research associations with Yale University and its continuing influence. Polζ is involved in the bypass of many DNA lesions by translesion DNA synthesis and is responsible for the majority of DNA damage-induced point mutagenesis in mammalian cells (including human cells), as well as in yeast. We also found that the absence of this enzyme leads to gross chromosomal instability in mammalian cells and increased spontaneous tumorigenesis in mice. Recently, we discovered a further unexpectedly critical role for polζ: it plays an essential role in allowing continued rapid proliferation of cells and tissues. These observations and others indicate that polζ engages frequently during DNA replication to bypass and tolerate DNA lesions or unusual DNA structures that are barriers for the normal DNA replication machinery. PMID:24348215

  10. Aerothermal performance and damage tolerance of a Rene 41 metallic standoff thermal protection system at Mach 6.7

    NASA Technical Reports Server (NTRS)

    Avery, D. E.

    1984-01-01

    A flight-weight, metallic thermal protection system (TPS) model applicable to Earth-entry and hypersonic-cruise vehicles was subjected to multiple cycles of both radiant and aerothermal heating in order to evaluate its aerothermal performance, structural integrity, and damage tolerance. The TPS was designed for a maximum operating temperature of 2060 R and featured a shingled, corrugation-stiffened corrugated-skin heat shield of Rene 41, a nickel-base alloy. The model was subjected to 10 radiant heating tests and to 3 radiant preheat/aerothermal tests. Under radiant-heating conditions with a maximum surface temperature of 2050 R, the TPS performed as designed and limited the primary structure away from the support ribs to temperatures below 780 R. During the first attempt at aerothermal exposure, a failure in the panel-holder test fixture severely damaged the model. However, two radiant preheat/aerothermal tests were made with the damaged model to test its damage tolerance. During these tests, the damaged area did not enlarge; however, the rapidly increasing structural temperatures measured during these tests indicate that had the damaged area been exposed to aerodynamic heating for the entire trajectory, an aluminum burn-through would have occurred.

  11. Damage tolerant functionally graded materials for advanced wear and friction applications

    NASA Astrophysics Data System (ADS)

    Prchlik, Lubos

    The research work presented in this dissertation focused on processing effects, microstructure development, characterization, and performance evaluation of composite and graded coatings used for friction and wear control. The following issues were addressed. (1) Definition of prerequisites for a successful composite and graded coating formation by means of thermal spraying. (2) Improvement of characterization methods available for homogenous thermally sprayed coatings and their extension to composite and graded materials. (3) Development of novel characterization methods specifically for FGMs, with a focus on through-thickness property measurement by indentation and in-situ curvature techniques. (4) Design of composite materials with improved properties compared to homogenous coatings. (5) Fabrication and performance assessment of FGMs with improved wear and impact damage properties. Materials. The materials studied included several material systems relevant to low-friction and contact damage tolerant applications: Mo-Mo2C and WC-Co cermets, as materials commonly used in sliding components of industrial machinery, and NiCrAlY/8%-Yttria Partially Stabilized Zirconia composites as a potential solution for abradable sections of gas turbines and aircraft engines. In addition, uniform coatings such as molybdenum and Ni5%Al alloy were evaluated as model systems to assess the influence of microstructure variation on mechanical properties and wear response. Methods. The contact response of the materials was investigated through several techniques. These included methods evaluating the relevant intrinsic coating properties such as elastic modulus, residual stress, fracture toughness, and scratch resistance, and tests measuring the abrasion and friction-sliding behavior. Dry-sand and wet two-body abrasion testing was performed in addition to traditional ball-on-disc sliding tests. Among all characterization techniques, spherical indentation deserved the most attention and enabled to

  12. Decreased drug accumulation and increased tolerance to DNA damage in tumor cells with a low level of cisplatin resistance.

    PubMed

    Lanzi, C; Perego, P; Supino, R; Romanelli, S; Pensa, T; Carenini, N; Viano, I; Colangelo, D; Leone, R; Apostoli, P; Cassinelli, G; Gambetta, R A; Zunino, F

    1998-04-15

    In an attempt to examine the cellular changes associated with cisplatin resistance, we selected a cisplatin-resistant (A431/Pt) human cervix squamous cell carcinoma cell line following continuous in vitro drug exposure. The resistant subline was characterized by a 2.5-fold degree of resistance. In particular, we investigated the expression of cellular defence systems and other cellular factors probably involved in dealing with cisplatin-induced DNA damage. Resistant cells exhibited decreased platinum accumulation and reduced levels of DNA-bound platinum and interstrand cross-link frequency after short-term drug exposure. Analysis of the effect of cisplatin on cell cycle progression revealed a cisplatin-induced G2M arrest in sensitive and resistant cells. Interestingly, a slowdown in S-phase transit was found in A431/Pt cells. A comparison of the ability of sensitive and resistant cells to repair drug-induced DNA damage suggested that resistant cells were able to tolerate higher levels of cisplatin-induced DNA damage than their parental counterparts. Analysis of the expression of proteins involved in DNA mismatch repair showed a decreased level of MSH2 in resistant cells. Since MSH2 seems to be involved in recognition of drug-induced DNA damage, this change may account for the increased tolerance to DNA damage observed in the resistant subline. In conclusion, the involvement of accumulation defects and the increased tolerance to cisplatin-induced DNA damage in these cisplatin-resistant cells support the notion that multiple changes contribute to confer a low level of cisplatin resistance. PMID:9719480

  13. FAA/NASA International Symposium on Advanced Structural Integrity Methods for Airframe Durability and Damage Tolerance, part 2

    NASA Technical Reports Server (NTRS)

    Harris, Charles E. (Editor)

    1994-01-01

    The international technical experts in the areas of durability and damage tolerance of metallic airframe structures were assembled to present and discuss recent research findings and the development of advanced design and analysis methods, structural concepts, and advanced materials. The principal focus of the symposium was on the dissemination of new knowledge and the peer-review of progress on the development of advanced methodologies. Papers were presented on the following topics: structural concepts for enhanced durability, damage tolerance, and maintainability; new metallic alloys and processing technology; fatigue crack initiation and small crack effects; fatigue crack growth models; fracture mechanics failure criteria for ductile materials; structural mechanics methodology for residual strength and life prediction; development of flight load spectra for design and testing; and corrosion resistance.

  14. Resistance and tolerance of Terminalia sericea trees to simulated herbivore damage under different soil nutrient and moisture conditions.

    PubMed

    Katjiua, Mutjinde L J; Ward, David

    2006-07-01

    Resource availability, degree of herbivore damage, genetic variability, and their interactions influence the allocation of investment by plants to resistance and tolerance traits. We evaluated the independent and interactive effects of soil nutrients and moisture, and simulated the effects of herbivore damage on condensed tannins (resistance) and growth/regrowth (tolerance) traits of Terminalia sericea, a deciduous tree in the Kalahari desert that constitutes a major component of livestock diet. We used a completely crossed randomized-block design experiment to examine the effects of nutrients, water availability, and herbivore damage on regrowth and resistance traits of T. sericea seedlings. Plant height, number of branches, internode length, leaf area, leaf mass for each seedling, combined weight of stems and twigs, and root mass were recorded. Condensed tannin concentrations were 22.5 and 21.5% higher under low nutrients and low soil moisture than under high nutrient and high water treatment levels. Tannin concentrations did not differ significantly between control and experimental seedlings 2 mo after simulated herbivore damage. Tannin concentrations correlated more strongly with growth traits under low- than under high-nutrient conditions. No trade-offs were detected among individual growth traits, nor between growth traits and condensed tannins. T. sericea appeared to invest more in both resistance and regrowth traits when grown under low-nutrient conditions. Investment in the resistance trait (condensed tannin) under high-nutrient conditions was minimal and, to a lesser degree, correlated with plant growth. These results suggest that T. sericea displays both resistance and tolerance strategies, and that the degree to which each is expressed is resource-dependent.

  15. Absence of tolerance to the anticonvulsant and neuroprotective effects of imidazenil against DFP-induced seizure and neuronal damage.

    PubMed

    Kadriu, Bashkim; Gocel, James; Larson, John; Guidotti, Alessandro; Davis, John M; Nambiar, Madhusoodana P; Auta, James

    2011-12-01

    The clinical use of diazepam or midazolam to control organophosphate (OP) nerve agent-induced seizure activity is limited by their unwanted effects including sedation, amnesia, withdrawal, and anticonvulsant tolerance. Imidazenil is an imidazo-benzodiazepine derivative with high intrinsic efficacy and selectivity for α2-, α3-, and α5- but low intrinsic efficacy for α1-containing GABA(A) receptors. We have previously shown that imidazenil is more efficacious than diazepam at protecting rats and mice from diisopropyl fluorophosphate (DFP)-induced seizures and neuronal damage without producing sedation. In the present study, we compared the tolerance liability of imidazenil and diazepam to attenuate the seizure activity and neurotoxic effects of DFP. Rats received protracted (14 days) oral treatment with increasing doses of imidazenil (1-4 mg/kg), diazepam (5-20 mg/kg), or vehicle. Eighteen hours after the last dose of the protracted treatment schedule, rats were tested for anticonvulsant tolerance after a 30 min pretreatment with a single test dose of imidazenil (0.5 mg/kg) or diazepam (5 mg/kg) prior to a DFP challenge (1.5 mg/kg). The anticonvulsant (modified Racine score scale) and neuroprotective (fluoro-jade B staining) effects of diazepam were significantly reduced in protracted diazepam-treated animals whereas the effects of imidazenil were not altered in protracted imidazenil-treated animals. The present findings indicate that protracted imidazenil treatment does not produce tolerance to its protective action against the neurotoxic effects of OP exposure.

  16. Design, analysis, and fabrication of a pressure box test fixture for tension damage tolerance testing of curved fuselage panels

    NASA Technical Reports Server (NTRS)

    Smith, P. J.; Bodine, J. B.; Preuss, C. H.; Koch, W. J.

    1993-01-01

    A pressure box test fixture was designed and fabricated to evaluate the effects of internal pressure, biaxial tension loads, curvature, and damage on the fracture response of composite fuselage structure. Previous work in composite fuselage tension damage tolerance, performed during NASA contract NAS1-17740, evaluated the above effects on unstiffened panels only. This work extends the tension damage tolerance testing to curved stiffened fuselage crown structure that contains longitudinal stringers and circumferential frame elements. The pressure box fixture was designed to apply internal pressure up to 20 psi, and axial tension loads up to 5000 lb/in, either separately or simultaneously. A NASTRAN finite element model of the pressure box fixture and composite stiffened panel was used to help design the test fixture, and was compared to a finite element model of a full composite stiffened fuselage shell. This was done to ensure that the test panel was loaded in a similar way to a panel in the full fuselage shell, and that the fixture and its attachment plates did not adversely affect the panel.

  17. Rad18 confers hematopoietic progenitor cell DNA damage tolerance independently of the Fanconi Anemia pathway in vivo.

    PubMed

    Yang, Yang; Poe, Jonathan C; Yang, Lisong; Fedoriw, Andrew; Desai, Siddhi; Magnuson, Terry; Li, Zhiguo; Fedoriw, Yuri; Araki, Kimi; Gao, Yanzhe; Tateishi, Satoshi; Sarantopoulos, Stefanie; Vaziri, Cyrus

    2016-05-19

    In cultured cancer cells the E3 ubiquitin ligase Rad18 activates Trans-Lesion Synthesis (TLS) and the Fanconi Anemia (FA) pathway. However, physiological roles of Rad18 in DNA damage tolerance and carcinogenesis are unknown and were investigated here. Primary hematopoietic stem and progenitor cells (HSPC) co-expressed RAD18 and FANCD2 proteins, potentially consistent with a role for Rad18 in FA pathway function during hematopoiesis. However, hematopoietic defects typically associated with fanc-deficiency (decreased HSPC numbers, reduced engraftment potential of HSPC, and Mitomycin C (MMC) -sensitive hematopoiesis), were absent in Rad18(-/-) mice. Moreover, primary Rad18(-/-) mouse embryonic fibroblasts (MEF) retained robust Fancd2 mono-ubiquitination following MMC treatment. Therefore, Rad18 is dispensable for FA pathway activation in untransformed cells and the Rad18 and FA pathways are separable in hematopoietic cells. In contrast with responses to crosslinking agents, Rad18(-/-) HSPC were sensitive to in vivo treatment with the myelosuppressive agent 7,12 Dimethylbenz[a]anthracene (DMBA). Rad18-deficient fibroblasts aberrantly accumulated DNA damage markers after DMBA treatment. Moreover, in vivo DMBA treatment led to increased incidence of B cell malignancy in Rad18(-/-) mice. These results identify novel hematopoietic functions for Rad18 and provide the first demonstration that Rad18 confers DNA damage tolerance and tumor-suppression in a physiological setting. PMID:26883629

  19. Rad18 confers hematopoietic progenitor cell DNA damage tolerance independently of the Fanconi Anemia pathway in vivo

    PubMed Central

    Yang, Yang; Poe, Jonathan C.; Yang, Lisong; Fedoriw, Andrew; Desai, Siddhi; Magnuson, Terry; Li, Zhiguo; Fedoriw, Yuri; Araki, Kimi; Gao, Yanzhe; Tateishi, Satoshi; Sarantopoulos, Stefanie; Vaziri, Cyrus

    2016-01-01

    In cultured cancer cells the E3 ubiquitin ligase Rad18 activates Trans-Lesion Synthesis (TLS) and the Fanconi Anemia (FA) pathway. However, physiological roles of Rad18 in DNA damage tolerance and carcinogenesis are unknown and were investigated here. Primary hematopoietic stem and progenitor cells (HSPC) co-expressed RAD18 and FANCD2 proteins, potentially consistent with a role for Rad18 in FA pathway function during hematopoiesis. However, hematopoietic defects typically associated with fanc-deficiency (decreased HSPC numbers, reduced engraftment potential of HSPC, and Mitomycin C (MMC) -sensitive hematopoiesis), were absent in Rad18−/− mice. Moreover, primary Rad18−/− mouse embryonic fibroblasts (MEF) retained robust Fancd2 mono-ubiquitination following MMC treatment. Therefore, Rad18 is dispensable for FA pathway activation in untransformed cells and the Rad18 and FA pathways are separable in hematopoietic cells. In contrast with responses to crosslinking agents, Rad18−/− HSPC were sensitive to in vivo treatment with the myelosuppressive agent 7,12 Dimethylbenz[a]anthracene (DMBA). Rad18-deficient fibroblasts aberrantly accumulated DNA damage markers after DMBA treatment. Moreover, in vivo DMBA treatment led to increased incidence of B cell malignancy in Rad18−/− mice. These results identify novel hematopoietic functions for Rad18 and provide the first demonstration that Rad18 confers DNA damage tolerance and tumor-suppression in a physiological setting. PMID:26883629

  20. 14 CFR 25.571 - Damage-tolerance and fatigue evaluation of structure.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ..., corrosion, or accidental damage. Repeated load and static analyses supported by test evidence and (if... accidental damage. Repeated load and static analyses supported by test evidence and (if available) service... catastrophic failure of the airplane; and (iii) An analysis, supported by test evidence, of the...

  1. 14 CFR 25.571 - Damage-tolerance and fatigue evaluation of structure.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... which could cause catastrophic failure of the airplane; and (iii) An analysis, supported by test.... Inspection thresholds for the following types of structure must be established based on crack growth analyses... locations and modes of damage due to fatigue, corrosion, or accidental damage. Repeated load and...

  2. GENETIC AND MOLECULAR ANALYSIS OF DNA DAMAGE REPAIR AND TOLERANCE PATHWAYS.

    SciTech Connect

    SUTHERLAND, B.M.

    2001-07-26

    Radiation can damage cellular components, including DNA. Organisms have developed a panoply of means of dealing with DNA damage. Some repair paths have rather narrow substrate specificity (e.g. photolyases), which act on specific pyrimidine photoproducts in a specific type (e.g., DNA) and conformation (double-stranded B conformation) of nucleic acid. Others, for example, nucleotide excision repair, deal with larger classes of damages, in this case bulky adducts in DNA. A detailed discussion of DNA repair mechanisms is beyond the scope of this article, but one can be found in the excellent book of Friedberg et al. [1] for further detail. However, some DNA damages and paths for repair of those damages important for photobiology will be outlined below as a basis for the specific examples of genetic and molecular analysis that will be presented below.

  3. Desiccation sensitivity and tolerance in the moss Physcomitrella patens: assessing limits and damage.

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The moss Physcomitrella patens is becoming the model of choice for functional genomic studies at the cellular level. Studies report that P. patens survives moderate osmotic and salt stress, and that desiccation tolerance can be induced by exogenous ABA. Our goal was to quantify the extent of dehydr...

  4. Honey bee (Apis mellifera) drones survive oxidative stress due to increased tolerance instead of avoidance or repair of oxidative damage.

    PubMed

    Li-Byarlay, Hongmei; Huang, Ming Hua; Simone-Finstrom, Michael; Strand, Micheline K; Tarpy, David R; Rueppell, Olav

    2016-10-01

    Oxidative stress can lead to premature aging symptoms and cause acute mortality at higher doses in a range of organisms. Oxidative stress resistance and longevity are mechanistically and phenotypically linked; considerable variation in oxidative stress resistance exists among and within species and typically covaries with life expectancy. However, it is unclear whether stress-resistant, long-lived individuals avoid, repair, or tolerate molecular damage to survive longer than others. The honey bee (Apis mellifera L.) is an emerging model system that is well-suited to address this question. Furthermore, this species is the most economically important pollinator, whose health may be compromised by pesticide exposure, including oxidative stressors. Here, we develop a protocol for inducing oxidative stress in honey bee males (drones) via Paraquat injection. After injection, individuals from different colony sources were kept in common social conditions to monitor their survival compared to saline-injected controls. Oxidative stress was measured in susceptible and resistant individuals. Paraquat drastically reduced survival but individuals varied in their resistance to treatment within and among colony sources. Longer-lived individuals exhibited higher levels of lipid peroxidation than individuals dying early. In contrast, the level of protein carbonylation was not significantly different between the two groups. This first study of oxidative stress in male honey bees suggests that survival of an acute oxidative stressor is due to tolerance, not prevention or repair, of oxidative damage to lipids. It also demonstrates colony differences in oxidative stress resistance that might be useful for breeding stress-resistant honey bees.

  5. DNA damage tolerance pathway involving DNA polymerase ι and the tumor suppressor p53 regulates DNA replication fork progression.

    PubMed

    Hampp, Stephanie; Kiessling, Tina; Buechle, Kerstin; Mansilla, Sabrina F; Thomale, Jürgen; Rall, Melanie; Ahn, Jinwoo; Pospiech, Helmut; Gottifredi, Vanesa; Wiesmüller, Lisa

    2016-07-26

    DNA damage tolerance facilitates the progression of replication forks that have encountered obstacles on the template strands. It involves either translesion DNA synthesis initiated by proliferating cell nuclear antigen monoubiquitination or less well-characterized fork reversal and template switch mechanisms. Herein, we characterize a novel tolerance pathway requiring the tumor suppressor p53, the translesion polymerase ι (POLι), the ubiquitin ligase Rad5-related helicase-like transcription factor (HLTF), and the SWI/SNF catalytic subunit (SNF2) translocase zinc finger ran-binding domain containing 3 (ZRANB3). This novel p53 activity is lost in the exonuclease-deficient but transcriptionally active p53(H115N) mutant. Wild-type p53, but not p53(H115N), associates with POLι in vivo. Strikingly, the concerted action of p53 and POLι decelerates nascent DNA elongation and promotes HLTF/ZRANB3-dependent recombination during unperturbed DNA replication. Particularly after cross-linker-induced replication stress, p53 and POLι also act together to promote meiotic recombination enzyme 11 (MRE11)-dependent accumulation of (phospho-)replication protein A (RPA)-coated ssDNA. These results implicate a direct role of p53 in the processing of replication forks encountering obstacles on the template strand. Our findings define an unprecedented function of p53 and POLι in the DNA damage response to endogenous or exogenous replication stress. PMID:27407148

  6. DNA damage tolerance pathway involving DNA polymerase ι and the tumor suppressor p53 regulates DNA replication fork progression

    PubMed Central

    Hampp, Stephanie; Kiessling, Tina; Buechle, Kerstin; Mansilla, Sabrina F.; Thomale, Jürgen; Rall, Melanie; Ahn, Jinwoo; Pospiech, Helmut; Gottifredi, Vanesa; Wiesmüller, Lisa

    2016-01-01

    DNA damage tolerance facilitates the progression of replication forks that have encountered obstacles on the template strands. It involves either translesion DNA synthesis initiated by proliferating cell nuclear antigen monoubiquitination or less well-characterized fork reversal and template switch mechanisms. Herein, we characterize a novel tolerance pathway requiring the tumor suppressor p53, the translesion polymerase ι (POLι), the ubiquitin ligase Rad5-related helicase-like transcription factor (HLTF), and the SWI/SNF catalytic subunit (SNF2) translocase zinc finger ran-binding domain containing 3 (ZRANB3). This novel p53 activity is lost in the exonuclease-deficient but transcriptionally active p53(H115N) mutant. Wild-type p53, but not p53(H115N), associates with POLι in vivo. Strikingly, the concerted action of p53 and POLι decelerates nascent DNA elongation and promotes HLTF/ZRANB3-dependent recombination during unperturbed DNA replication. Particularly after cross-linker–induced replication stress, p53 and POLι also act together to promote meiotic recombination enzyme 11 (MRE11)-dependent accumulation of (phospho-)replication protein A (RPA)-coated ssDNA. These results implicate a direct role of p53 in the processing of replication forks encountering obstacles on the template strand. Our findings define an unprecedented function of p53 and POLι in the DNA damage response to endogenous or exogenous replication stress. PMID:27407148

  8. The key regulator of submergence tolerance, SUB1A, promotes photosynthetic and metabolic recovery from submergence damage in rice leaves.

    PubMed

    Alpuerto, Jasper Benedict; Hussain, Rana Muhammad Fraz; Fukao, Takeshi

    2016-03-01

    The submergence-tolerance regulator, SUBMERGENCE1A (SUB1A), of rice (Oryza sativa L.) modulates gene regulation, metabolism and elongation growth during submergence. Its benefits continue during desubmergence through protection from reactive oxygen species and dehydration, but there is limited understanding of SUB1A's role in physiological recovery from the stress. Here, we investigated the contribution of SUB1A to desubmergence recovery using the two near-isogenic lines, submergence-sensitive M202 and tolerant M202(Sub1). No visible damage was detected in the two genotypes after 3 d of submergence, but the sublethal stress differentially altered photosynthetic parameters and accumulation of energy reserves. Submergence inhibited photosystem II photochemistry and stimulated breakdown of protein and accumulation of several amino acids in both genotypes at similar levels. Upon desubmergence, however, more rapid return to homeostasis of these factors was observed in M202(Sub1). Submergence considerably restrained non-photochemical quenching (NPQ) in M202, whereas the value was unaltered in M202(Sub1) during the stress. Upon reaeration, submerged plants encounter sudden exposure to higher light. A greater capability for NPQ-mediated photoprotection can benefit the rapid recovery of photosynthetic performance and energy reserve metabolism in M202(Sub1). Our findings illuminate the significant role of SUB1A in active physiological recovery upon desubmergence, a component of enhanced tolerance to submergence. PMID:26477688

  9. Concerted and differential actions of two enzymatic domains underlie Rad5 contributions to DNA damage tolerance.

    PubMed

    Choi, Koyi; Batke, Sabrina; Szakal, Barnabas; Lowther, Jonathan; Hao, Fanfan; Sarangi, Prabha; Branzei, Dana; Ulrich, Helle D; Zhao, Xiaolan

    2015-03-11

    Many genome maintenance factors have multiple enzymatic activities. In most cases, how their distinct activities functionally relate with each other is unclear. Here we examined the conserved budding yeast Rad5 protein that has both ubiquitin ligase and DNA helicase activities. The Rad5 ubiquitin ligase activity mediates PCNA poly-ubiquitination and subsequently recombination-based DNA lesion tolerance. Interestingly, the ligase domain is embedded in a larger helicase domain comprising seven consensus motifs. How features of the helicase domain influence ligase function is controversial. To clarify this issue, we use genetic, 2D gel and biochemical analyses and show that a Rad5 helicase motif important for ATP binding is also required for PCNA poly-ubiquitination and recombination-based lesion tolerance. We determine that this requirement is due to a previously unrecognized contribution of the motif to the PCNA and ubiquitination enzyme interaction, and not due to its canonical role in supporting helicase activity. We further show that Rad5's helicase-mediated contribution to replication stress survival is separable from recombination. These findings delineate how two Rad5 enzymatic domains concertedly influence PCNA modification, and unveil their discrete contributions to stress tolerance.

  10. Micro(mi) RNA-34a targets protein phosphatase (PP)1γ to regulate DNA damage tolerance

    PubMed Central

    Takeda, Yuko; Venkitaraman, Ashok R

    2015-01-01

    The DNA damage response (DDR) triggers widespread changes in gene expression, mediated partly by alterations in micro(mi) RNA levels, whose nature and significance remain uncertain. Here, we report that miR-34a, which is upregulated during the DDR, modulates the expression of protein phosphatase 1γ (PP1γ) to regulate cellular tolerance to DNA damage. Multiple bio-informatic algorithms predict that miR-34a targets the PP1CCC gene encoding PP1γ protein. Ionising radiation (IR) decreases cellular expression of PP1γ in a dose-dependent manner. An miR-34a-mimic reduces cellular PP1γ protein. Conversely, an miR-34a inhibitor antagonizes IR-induced decreases in PP1γ protein expression. A wild-type (but not mutant) miR-34a seed match sequence from the 3′ untranslated region (UTR) of PP1CCC when transplanted to a luciferase reporter gene makes it responsive to an miR-34a-mimic. Thus, miR-34a upregulation during the DDR targets the 3′ UTR of PP1CCC to decrease PP1γ protein expression. PP1γ is known to antagonize DDR signaling via the ataxia-telangiectasia-mutated (ATM) kinase. Interestingly, we find that cells exposed to DNA damage become more sensitive – in an miR-34a-dependent manner – to a second challenge with damage. Increased sensitivity to the second challenge is marked by enhanced phosphorylation of ATM and p53, increased γH2AX formation, and increased cell death. Increased sensitivity can be partly recapitulated by a miR-34a-mimic, or antagonized by an miR-34a-inhibitor. Thus, our findings suggest a model in which damage-induced miR-34a induction reduces PP1γ expression and enhances ATM signaling to decrease tolerance to repeated genotoxic challenges. This mechanism has implications for tumor suppression and the response of cancers to therapeutic radiation. PMID:26111201

  11. Aldehyde dehydrogenase 2 protects human umbilical vein endothelial cells against oxidative damage and increases endothelial nitric oxide production to reverse nitroglycerin tolerance.

    PubMed

    Hu, X Y; Fang, Q; Ma, D; Jiang, L; Yang, Y; Sun, J; Yang, C; Wang, J S

    2016-06-10

    Medical nitroglycerin (glyceryl trinitrate, GTN) use is limited principally by tolerance typified by a decrease in nitric oxide (NO) produced by biotransformation. Such tolerance may lead to endothelial dysfunction by inducing oxidative stress. In vivo studies have demonstrated that aldehyde dehydrogenase 2 (ALDH2) plays important roles in GTN biotransformation and tolerance. Thus, modification of ALDH2 expression represents a potentially effective strategy to prevent and reverse GTN tolerance and endothelial dysfunction. In this study, a eukaryotic expression vector containing the ALDH2 gene was introduced into human umbilical vein endothelial cells (HUVECs) by liposome-mediated transfection. An indirect immunofluorescence assay showed that ALDH2 expression increased 24 h after transfection. Moreover, real-time polymerase chain reaction and western blotting revealed significantly higher ALDH2 mRNA and protein expression in the gene-transfected group than in the two control groups. GTN tolerance was induced by treating HUVECs with 10 mM GTN for 16 h + 10 min, which significantly decreased NO levels in control cells, but not in those transfected with ALDH2. Overexpression of ALDH2 increased cell survival against GTN-induced cytotoxicity and conferred protection from oxidative damage resulting from nitrate tolerance, accompanied by decreased production of intracellular reactive oxygen species and reduced expression of heme oxygenase 1. Furthermore, ALDH2 overexpression promoted Akt phosphorylation under GTN tolerance conditions. ALDH2 gene transfection can reverse and prevent tolerance to GTN through its bioactivation and protect against oxidative damage, preventing the development of endothelial dysfunction.

  12. A damage tolerance comparison of 7075-T6 aluminum alloy and IM7/977-2 carbon/epoxy

    NASA Technical Reports Server (NTRS)

    Nettles, Alan T.; Lance, David G.; Hodge, Andrew J.

    1991-01-01

This study compared low-velocity impact damage in one of the strongest aluminum alloys and in a new, damage-tolerant resin system used as a matrix for high-strength carbon fibers. The aluminum and composite materials were used as face sheets on a 0.13 g/cu cm aluminum honeycomb core. Four levels of impact energy were used: 2.6 J, 5.3 J, 7.8 J, and 9.9 J. The beams were compared for static strength and fatigue life using the four-point bend flexure test. In the undamaged state, the specific strength of the composite face sheets was about twice that of the aluminum face sheets. A sharp drop in strength was observed for the composite specimens impacted at the lowest (2.6 J) energy level, but their specific strength was still higher than that of the aluminum specimens. At all impact energy levels tested, the static specific strength of the composite face sheets was significantly higher than that of the aluminum face sheets. The fatigue life of the most severely damaged composite specimen was about 17 times greater than that of the undamaged aluminum specimens when cycled at 1 Hz between 20 and 85 percent of ultimate breaking load.

  13. 14 CFR 23.573 - Damage tolerance and fatigue evaluation of structure.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... damage can be expected to occur. The evaluation must incorporate repeated load and static analyses... demonstrated by tests, or by analysis supported by tests, that the structure is capable of carrying ultimate... analysis supported by tests. (3) The structure must be shown by residual strength tests, or...

  14. 14 CFR 23.573 - Damage tolerance and fatigue evaluation of structure.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... damage can be expected to occur. The evaluation must incorporate repeated load and static analyses... demonstrated by tests, or by analysis supported by tests, that the structure is capable of carrying ultimate... analysis supported by tests. (3) The structure must be shown by residual strength tests, or...

  15. Damage tolerance in filament-wound graphite/epoxy pressure vessels

    NASA Technical Reports Server (NTRS)

    Simon, William E.; Ngueyen, Vinh D.; Chenna, Ravi K.

    1995-01-01

    Graphite/epoxy composites are extensively used in the aerospace and sporting goods industries due to their superior engineering properties compared to those of metals. However, graphite/epoxy is extremely susceptible to impact damage which can cause considerable and sometimes undetected reduction in strength. An inelastic impact model was developed to predict damage due to low-velocity impact. A transient dynamic finite element formulation was used in conjunction with the 3D Tsai-Wu failure criterion to determine and incorporate failure in the materials during impact. Material degradation can be adjusted from no degradation to partial degradation to full degradation. The developed software is based on an object-oriented implementation framework called Extensible Implementation Framework for Finite Elements (EIFFE).

  16. Recent development in the design, testing and impact-damage tolerance of stiffened composite panels

    NASA Technical Reports Server (NTRS)

    Williams, J. G.; Anderson, M. S.; Rhodes, M. D.; Starnes, J. H., Jr.; Stroud, W. J.

    1979-01-01

    Structural technology of laminated filamentary-composite stiffened-panel structures under combined inplane and lateral loadings is discussed. Attention is focused on: (1) methods for analyzing the behavior of these structures under load and for determining appropriate structural proportions for weight-efficient configurations; and (2) effects of impact damage and geometric imperfections on structural performance. Recent improvements in buckling analysis involving combined inplane compression and shear loadings and transverse shear deformations are presented. A computer code is described for proportioning or sizing laminate layers and cross-sectional dimensions, and the code is used to develop structural efficiency data for a variety of configurations, loading conditions, and constraint conditions. Experimental data on buckling of panels under inplane compression is presented. Mechanisms of impact damage initiation and propagation are described.

  17. Simplification of Fatigue Test Requirements for Damage Tolerance of Composite Interstage Launch Vehicle Hardware

    NASA Technical Reports Server (NTRS)

    Nettles, A. T.; Hodge, A. J.; Jackson, J. R.

    2010-01-01

The issue of fatigue loading of structures composed of composite materials is considered in a requirements document that is currently in place for manned launch vehicles. By taking into account the short service life of these parts, coupled with design considerations, it is demonstrated that the necessary coupon-level fatigue data collapse to a static case. Data from a literature review of past studies that examined compressive fatigue loading after impact, together with data generated in this experimental study, are presented to support this finding. Damage growth, monitored by infrared thermography, was difficult to detect because compressive properties degraded rapidly once damage growth initiated. Unrealistically high fatigue amplitudes were needed to fail 5 of 15 specimens before 10,000 cycles were reached. Since a typical vehicle structure, such as the Ares I interstage, only experiences a few cycles near limit load, it is concluded that static compression after impact (CAI) strength data will suffice for most launch vehicle structures.
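The collapse of fatigue requirements to a static case follows from the characteristically flat S-N curve of composites: at the handful of load cycles a launch vehicle sees, the allowable stress amplitude is barely below static strength. A minimal sketch with an illustrative power-law S-N model (the exponent k is an assumption chosen to reflect a flat composite S-N curve, not a value from this study):

```python
def sn_fraction(n_cycles, k=0.02):
    """Illustrative power-law S-N model: allowable cyclic stress as a
    fraction of static strength, S/S0 = N**(-k). A small exponent k
    (assumed here) models the flat S-N behavior typical of composites."""
    return n_cycles ** (-k)

# A launch vehicle part sees only a few cycles near limit load...
print(sn_fraction(10))      # close to 1: essentially the static case
# ...whereas a long-life airframe sees millions.
print(sn_fraction(1e6))     # noticeably reduced allowable
```

Under this kind of model, ten cycles cost only a few percent of static strength, which is why static CAI data can bound the fatigue case for short-life hardware.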

  18. An examination of the damage tolerance enhancement of carbon/epoxy using an outer lamina of spectra (R)

    NASA Technical Reports Server (NTRS)

    Lance, D. G.; Nettles, A. T.

    1991-01-01

    Low velocity instrumented impact testing was utilized to examine the effects of an outer lamina of ultra-high molecular weight polyethylene (Spectra) on the damage tolerance of carbon epoxy composites. Four types of 16-ply quasi-isotropic panels (0, +45, 90, -45) were tested. Some panels contained no Spectra, while others had a lamina of Spectra bonded to the top (impacted side), bottom, or both sides of the composite plates. The specimens were impacted with energies up to 8.5 J. Force time plots and maximum force versus impact energy graphs were generated for comparison purposes. Specimens were also subjected to cross-sectional analysis and compression after impact tests. The results show that while the Spectra improved the maximum load that the panels could withstand before fiber breakage, the Spectra seemingly reduced the residual strength of the composites.

  20. A study of the damage tolerance enhancement of carbon/epoxy laminates by utilizing an outer lamina of ultra high molecular weight polyethylene

    NASA Technical Reports Server (NTRS)

    Nettles, Alan T.; Lance, David G.

    1991-01-01

The damage tolerance of carbon/epoxy was examined when an outer layer of ultra high molecular weight polyethylene (Spectra) material was utilized on the specimen. Four types of 16-ply quasi-isotropic panels, (0/+45/90/-45)s2, were tested. The first contained no Spectra, while the others had one lamina of Spectra placed on the top (impacted side), the bottom, or both surfaces of the composite plate. A range of impact energies up to approximately 8.5 Joules (6.3 ft-lb) was used to inflict damage upon these specimens. Glass/phenolic honeycomb beams with a core density of 314 N/m3 (2.0 lb/ft3) and 8-ply quasi-isotropic facesheets were also tested for compression-after-impact strength with and without Spectra at impact energies of 1, 2, 3, and 4 Joules (0.74, 1.47, 2.21, and 2.95 ft-lb). It was observed that the composite plates showed little change in damage tolerance due to the Spectra, while the honeycomb panels demonstrated a slight increase in damage tolerance when Spectra was added, with the improvement being greater at higher impact energies.

  1. Correlating grain size to radiation damage tolerance of tungsten materials exposed to relevant fusion conditions

    NASA Astrophysics Data System (ADS)

    Gonderman, Sean

Tungsten remains a leading candidate for plasma-facing components (PFCs) in future fusion devices, in large part due to its strong thermal and mechanical properties. The ITER project has already chosen to use an all-tungsten divertor. Despite having a high melting temperature and low erosion rate, tungsten faces a large variety of issues when subjected to fusion-like conditions, including embrittlement, melting, and extreme morphology change (growth of fuzz nanostructure). The work presented here investigates mechanisms that drive surface morphology change in tungsten materials exposed to fusion-relevant plasmas. Specifically, tungsten materials of different grain sizes are studied to elucidate the impact of grain boundaries on irradiation damage. Ultrafine (< 500 nm) and nanocrystalline (< 100 nm) grain materials are exposed to high-flux helium plasmas at the Dutch Institute for Fundamental Energy Research (DIFFER) in the Netherlands. These samples are then compared to large-grain (1-5 micron) tungsten materials exposed to similar conditions at DIFFER or to tungsten samples from other published studies. After exposing the ultrafine grain materials to a variety of helium plasmas at fluences between 1 x 10^23 and 1 x 10^27 ions m^-2, temperatures between 600-1500 °C, and ion energies between 25-70 eV, it is observed that ultrafine-grained tungsten samples develop fuzz at an order of magnitude larger fluence when compared to large-grained tungsten. These observations suggest that grain boundaries play a role in dictating the damage accumulation and damage rate caused by ion bombardment of tungsten surfaces. These experiments are complemented by in-situ TEM analysis during 8 keV helium irradiation of ultrafine tungsten samples to observe damage propagation in different sized grains in real time. The in-situ TEM work was completed in a JEOL JEM-2000FX TEM at the Microscope and Ion Accelerator for Materials Investigation (MIAMI) facility at the

  2. Fatigue and damage tolerance of Y-TZP ceramics in layered biomechanical systems.

    PubMed

    Zhang, Yu; Pajares, Antonia; Lawn, Brian R

    2004-10-15

The fatigue properties of fine-grain Y-TZP in cyclic flexural testing are studied. Comparative tests on a coarser-grain alumina provide a baseline control. A bilayer configuration with ceramic plates bonded to a compliant polymeric substrate and loaded with concentrated forces at the top surfaces, simulating basic layer structures in dental crowns and hip replacement prostheses, is used as a basic test specimen. Critical times to initiate radial crack failure at the ceramic undersurfaces at prescribed maximum surface loads are measured for Y-TZP with as-polished surfaces, mechanically predamaged undersurfaces, and after a thermal aging treatment. No differences in critical failure conditions are observed between monotonic and cyclic loading on as-polished surfaces, or between as-polished and mechanically damaged surfaces in monotonic loading, consistent with fatigue controlled by slow crack growth. However, the data for mechanically damaged and aged specimens show substantial declines in sustainable stresses and times to failure in cyclic loading, indicating an augmenting role of mechanical and thermal processes in certain instances. In all cases, however, the sustainable stresses in the Y-TZP remain higher than those of the alumina, suggesting that with proper measures to avoid inherent structural instabilities, Y-TZP could provide superior performance in biomechanical applications.

  3. Extreme tolerance and developmental buffering of UV-C induced DNA damage in embryos of the annual killifish Austrofundulus limnaeus.

    PubMed

    Wagner, Josiah T; Podrabsky, Jason E

    2015-01-01

    Free-living aquatic embryos are often at risk of exposure to ultraviolet radiation (UV-R). Successful completion of embryonic development depends on efficient removal of DNA lesions, and thus many aquatic embryos have mechanisms to reverse DNA lesions induced by UV-R. However, little is known of how embryos that are able to enter embryonic dormancy may respond to UV-R exposure and subsequent DNA damage. Embryos of the annual killifish Austrofundulus limnaeus are unique among vertebrates because their normal embryonic development includes (1) a complete dispersion of embryonic blastomeres prior to formation of the definitive embryonic axis, and (2) entry into a state of metabolic depression and developmental arrest termed diapause. Here, we show that developing and diapausing embryos of A. limnaeus have exceptional tolerance of UV-C radiation and can successfully complete embryonic development after receiving substantial doses of UV-C, especially if allowed to recover in full-spectrum light. Recovery in full-spectrum light permits efficient removal of the most common type of DNA lesion induced by UV-R: cyclobutane pyrimidine dimers. Interestingly, whole-mount embryo TUNEL assays suggest that apoptosis may not be a major contributor to cell death in embryos UV-C irradiated during dispersion/reaggregation or diapause. We also observed embryo mortality to be significantly delayed by several weeks in diapausing embryos irradiated and allowed to recover in the dark. These atypical responses to UV-R induced DNA damage may be due to the unique annual killifish life history and provide insight into DNA damage repair and recognition mechanisms during embryonic dormancy.

  4. Elimination of damaged mitochondria through mitophagy reduces mitochondrial oxidative stress and increases tolerance to trichothecenes

    PubMed Central

    Bin-Umer, Mohamed Anwar; McLaughlin, John E.; Butterly, Matthew S.; McCormick, Susan; Tumer, Nilgun E.

    2014-01-01

    Trichothecene mycotoxins are natural contaminants of small grain cereals and are encountered in the environment, posing a worldwide threat to human and animal health. Their mechanism of toxicity is poorly understood, and little is known about cellular protection mechanisms against trichothecenes. We previously identified inhibition of mitochondrial protein synthesis as a novel mechanism for trichothecene-induced cell death. To identify cellular functions involved in trichothecene resistance, we screened the Saccharomyces cerevisiae deletion library for increased sensitivity to nonlethal concentrations of trichothecin (Tcin) and identified 121 strains exhibiting higher sensitivity than the parental strain. The largest group of sensitive strains had significantly higher reactive oxygen species (ROS) levels relative to the parental strain. A dose-dependent increase in ROS levels was observed in the parental strain treated with different trichothecenes, but not in a petite version of the parental strain or in the presence of a mitochondrial membrane uncoupler, indicating that mitochondria are the main site of ROS production due to toxin exposure. Cytotoxicity of trichothecenes was alleviated after treatment of the parental strain and highly sensitive mutants with antioxidants, suggesting that oxidative stress contributes to trichothecene sensitivity. Cotreatment with rapamycin and trichothecenes reduced ROS levels and cytotoxicity in the parental strain relative to the trichothecene treatment alone, but not in mitophagy deficient mutants, suggesting that elimination of trichothecene-damaged mitochondria by mitophagy improves cell survival. These results reveal that increased mitophagy is a cellular protection mechanism against trichothecene-induced mitochondrial oxidative stress and a potential target for trichothecene resistance. PMID:25071194

  5. Novel high damage-tolerant, wear resistant MoSi2-based nanocomposite coatings

    NASA Astrophysics Data System (ADS)

    Xu, Jiang; Li, Zhengyang; Xie, Zong-Han; Munroe, Paul; Lu, Xiao Lin; Lan, Xiu Feng

    2013-04-01

In this study, novel MoSi2-based nanocomposite coatings were deposited on Ti-6Al-4V substrates by a two-step process involving firstly, deposition of MoSi2-based coatings, using a double cathode glow discharge process and, secondly, plasma nitridation of the as-deposited coatings. The aim of this latter step is to introduce nitrogen into the coating and promote the formation of amorphous silicon nitride. The resulting coatings were characterized by X-ray diffraction (XRD), X-ray photoelectron spectroscopy (XPS), transmission electron microscopy (TEM) and scanning electron microscopy (SEM). It was found that the nanocomposite coatings were composed of nanocrystalline Mo5Si3 and MoSi2 grains embedded in an amorphous Si3N4 matrix. The mechanical properties and damage resistance of the coatings were evaluated by both Vickers indentation and nanoindentation techniques. Dry sliding wear tests were performed using a ball-on-disc type tribometer, in which the coated samples were tested against a ZrO2 ceramic ball at normal loads of 2.8 and 4.3 N under ambient conditions. Compared with the monolithic MoSi2 nanocrystalline coating, the specific wear rates of the nanocomposite coatings decreased by an order of magnitude. The specific wear rate was further improved by about 20% through the addition of Al, which was attributed to an optimum combination of mechanical properties.

  6. Recent Developments and Challenges Implementing New and Improved Stress Intensity Factor (K) Solutions in NASGRO for Damage Tolerance Analyses

    NASA Technical Reports Server (NTRS)

    Cardinal, Joseph W.; McClung, R. Craig; Lee, Yi-Der; Guo, Yajun; Beek, Joachim M.

    2014-01-01

    Fatigue crack growth analysis software has been available to damage tolerance analysts for many years in either commercial products or via proprietary in-house codes. The NASGRO software has been publicly available since the mid-80s (known as NASA/FLAGRO up to 1999) and since 2000 has been sustained and further developed by a collaborative effort between Southwest Research Institute® (SwRI®), the NASA Johnson Space Center (JSC), and the members of the NASGRO Industrial Consortium. Since the stress intensity factor (K) is the foundation of fracture mechanics and damage tolerance analysis of aircraft structures, a significant focus of development efforts in the past fifteen years has been geared towards enhancing legacy K solutions and developing new and efficient numerical K solutions that can handle the complicated stress gradients computed by today’s analysts using detailed finite element models of fatigue critical locations. This paper provides an overview of K solutions that have been recently implemented or improved for the analysis of geometries such as two unequal through cracks at a hole and two unequal corner cracks at a hole, as well as state-of-the-art weight function models capable of computing K in the presence of univariant and/or bivariant stress gradients and complicated residual stress distributions. Some historical background is provided to review how common K solutions have evolved over the years, including selective examples from the literature and from new research. Challenges and progress in rectifying discrepancies between older legacy solutions and newer models are reviewed as well as approaches and challenges for verification and validation of K solutions. Finally, a summary of current challenges and future research and development needs is presented. A key theme throughout the presentation of this paper will be how members of the aerospace industry have collaborated with software developers to develop a practical analysis tool that is
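
    For orientation, the simplest classical K solution underlying tools of this kind, a center through-crack in a finite-width plate under remote tension with the standard secant width correction, can be sketched in a few lines. The numeric values in the example are arbitrary illustrations, not NASGRO data:

```python
import math

def k_center_crack(sigma, a, W):
    """Mode-I stress intensity factor for a center through-crack of half-length a
    in a plate of width W under remote tension sigma:
        K = sigma * sqrt(pi * a) * beta,  beta = sqrt(sec(pi * a / W))
    (the classical secant finite-width correction)."""
    beta = math.sqrt(1.0 / math.cos(math.pi * a / W))
    return sigma * math.sqrt(math.pi * a) * beta

# Illustrative values: sigma in MPa, lengths in meters -> K in MPa*sqrt(m)
K = k_center_crack(sigma=100.0, a=0.005, W=0.1)
print(f"K = {K:.2f} MPa*sqrt(m)")
```

    The weight-function solutions discussed in the paper generalize this idea to arbitrary (univariant/bivariant) stress gradients across the crack plane, which a single closed-form geometry factor like the one above cannot represent.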

  7. Fatigue and Damage Tolerance Analysis of a Hybrid Composite Tapered Flexbeam

    NASA Technical Reports Server (NTRS)

    Murri, Gretchen B.; Schaff, Jeffrey R.; Dobyns, Al

    2001-01-01

The behavior of nonlinear tapered composite flexbeams under combined axial tension and cyclic bending loading was studied using coupon test specimens and finite element (FE) analyses. The flexbeams used a hybrid material system of graphite/epoxy and glass/epoxy and had internal dropped plies, dropped in an overlapping stepwise pattern. Two material configurations, differing only in the use of glass or graphite plies in the continuous plies near the midplane, were studied. Test specimens were cut from a full-size helicopter tail-rotor flexbeam and were tested in a hydraulic load frame under combined constant axial-tension load and transverse cyclic bending loads. The first delamination damage observed in the specimens occurred in the area around the tip of the outermost ply-drop group in the tapered region of the flexbeam, near the thick end. Delaminations grew slowly and stably, toward the thick end of the flexbeam, at the interfaces above and below the dropped-ply region. A 2D finite element model of the flexbeam was developed. The model was analyzed using a geometrically nonlinear analysis with both the ANSYS and ABAQUS FE codes. The global responses of each analysis agreed well with the test results. The ANSYS model was used to calculate strain energy release rates (G) for delaminations initiating at two different ply-ending locations. The results showed that delaminations were more inclined to grow at the locations where they were observed in the test specimens. Both ANSYS and ABAQUS were used to calculate G values associated with delamination initiating at the observed location but growing in different interfaces, either above or below the ply-ending group toward the thick end, or toward the thin end from the tip of the resin pocket. The different analysis codes generated the same trends and comparable peak values, within 5-11% for each delamination path. Both codes showed that delamination toward the thick region was largely mode II, and toward the thin
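
    Strain energy release rates G are commonly extracted from finite element results with the virtual crack closure technique (VCCT). The abstract does not say which extraction method was used, so the sketch below is a generic one-step VCCT formula with invented nodal values, purely to illustrate what a mode II dominated G partition looks like:

```python
def vcct_g(F, du, b, da):
    """One-step VCCT: energy release rate from the nodal force F at the crack tip,
    the relative displacement du of the just-released node pair behind the tip,
    the element width b, and the element length da:  G = F * du / (2 * b * da)."""
    return F * du / (2.0 * b * da)

# Hypothetical nodal results (forces in N, displacements in m):
b, da = 0.01, 0.0005
G_I  = vcct_g(F=12.0, du=2.0e-6, b=b, da=da)   # normal force x opening displacement
G_II = vcct_g(F=85.0, du=9.0e-6, b=b, da=da)   # shear force x sliding displacement
G_total = G_I + G_II
print(f"G_I = {G_I:.1f} J/m^2, G_II = {G_II:.1f} J/m^2, "
      f"mode II fraction = {G_II / G_total:.2f}")
```

    In a delamination analysis like the one described, the computed G components along each candidate growth path are compared against mixed-mode fracture toughness data to judge where growth is most likely.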

  8. Rad5 Template Switch Pathway of DNA Damage Tolerance Determines Synergism between Cisplatin and NSC109268 in Saccharomyces cerevisiae

    PubMed Central

    Jain, Dilip; Siede, Wolfram

    2013-01-01

    The success of cisplatin (CP) based therapy is often hindered by acquisition of CP resistance. We isolated NSC109268 as a compound altering cellular sensitivity to DNA damaging agents. Previous investigation revealed an enhancement of CP sensitivity by NSC109268 in wild-type Saccharomyces cerevisiae and CP-sensitive and -resistant cancer cell lines that correlated with a slower S phase traversal. Here, we extended these studies to determine the target pathway(s) of NSC109268 in mediating CP sensitization, using yeast as a model. We reasoned that mutants defective in the relevant target of NSC109268 should be hypersensitive to CP and the sensitization effect by NSC109268 should be absent or strongly reduced. A survey of various yeast deletion mutants converged on the Rad5 pathway of DNA damage tolerance by template switching as the likely target pathway of NSC109268 in mediating cellular sensitization to CP. Additionally, cell cycle delays following CP treatment were not synergistically influenced by NSC109268 in the CP hypersensitive rad5Δ mutant. The involvement of the known inhibitory activities of NSC109268 on 20S proteasome and phosphatases 2Cα and 2A was tested. In the CP hypersensitive ptc2Δptc3Δpph3Δ yeast strain, deficient for 2C and 2A-type phosphatases, cellular sensitization to CP by NSC109268 was greatly reduced. It is therefore suggested that NSC109268 affects CP sensitivity by inhibiting the activity of unknown protein(s) whose dephosphorylation is required for the template switch pathway. PMID:24130896

  9. Damage tolerance of pressurized graphite/epoxy tape cylinders under uniaxial and biaxial loading. M.S. Thesis

    NASA Technical Reports Server (NTRS)

    Priest, Stacy Marie

    1993-01-01

The damage tolerance behavior of internally pressurized, axially slit, graphite/epoxy tape cylinders was investigated. Specifically, the effects of axial stress, structural anisotropy, and subcritical damage were considered. In addition, the limitations of a methodology which uses coupon fracture data to predict cylinder failure were explored. This predictive methodology was previously shown to be valid for quasi-isotropic fabric and tape cylinders but invalid for structurally anisotropic (±45/90)s and (±45/0)s cylinders. The effects of axial stress and structural anisotropy were assessed by testing tape cylinders with (90/0/±45)s, (±45/90)s, and (±45/0)s layups in a uniaxial test apparatus, specially designed and built for this work, and comparing the results to previous tests conducted in biaxial loading. Structural anisotropy effects were also investigated by testing cylinders with the quasi-isotropic (0/±45/90)s layup, which is a stacking sequence variation of the previously tested (90/0/±45)s layup with higher D16 and D26 terms but comparable D16/D11 and D26/D11 ratios. All cylinders tested and used for comparison are made from AS4/3501-6 graphite/epoxy tape and have a diameter of 305 mm. Cylinder slit lengths range from 12.7 to 50.8 mm. Failure pressures are lower for the uniaxially loaded cylinders in all cases. The smallest percent decreases in failure pressure are observed for the (±45/90)s cylinders, while the greatest such decreases are observed for the (±45/0)s cylinders. The relative effects of the axial stress on the cylinder failure pressures do not correlate with the degree of structural coupling. The predictive methodology is not applicable for uniaxially loaded (±45/90)s and (±45/0)s cylinders, may be applicable for uniaxially loaded (90/0/±45)s cylinders, and is applicable for the biaxially loaded (90/0/±45)s and (0

  10. Essential Roles of the Smc5/6 Complex in Replication through Natural Pausing Sites and Endogenous DNA Damage Tolerance

    PubMed Central

    Menolfi, Demis; Delamarre, Axel; Lengronne, Armelle; Pasero, Philippe; Branzei, Dana

    2015-01-01

The essential functions of the conserved Smc5/6 complex remain elusive. To uncover its roles in genome maintenance, we established Saccharomyces cerevisiae cell-cycle-regulated alleles that enable restriction of Smc5/6 components to S or G2/M. Unexpectedly, the essential functions of Smc5/6 segregated fully and selectively to G2/M. Genetic screens that became possible with generated alleles identified processes that crucially rely on Smc5/6 specifically in G2/M: metabolism of DNA recombination structures triggered by endogenous replication stress, and replication through natural pausing sites located in late-replicating regions. In the first process, Smc5/6 modulates remodeling of recombination intermediates, cooperating with dissolution activities. In the second, Smc5/6 prevents chromosome fragility and toxic recombination instigated by prolonged pausing and the fork protection complex, Tof1-Csm3. Our results thus dissect Smc5/6 essential roles and reveal that combined defects in DNA damage tolerance and pausing site-replication cause recombination-mediated DNA lesions, which we propose to drive developmental and cancer-prone disorders. PMID:26698660

  11. Nanoscale origins of the damage tolerance of the high-entropy alloy CrMnFeCoNi

    SciTech Connect

    Zhang, ZiJiao; Mao, M. M.; Wang, Jiangwei; Gludovatz, Bernd; Zhang, Ze; Mao, Scott X.; George, Easo P.; Yu, Qian; Ritchie, Robert O.

    2015-12-09

Damage tolerance can be an elusive characteristic of structural materials requiring both high strength and ductility, properties that are often mutually exclusive. High-entropy alloys are of interest in this regard. Specifically, the single-phase CrMnFeCoNi alloy displays tensile strength levels of ~1 GPa, excellent ductility (~60–70%) and exceptional fracture toughness (K_JIc > 200 MPa√m). Here, through the use of in situ straining in an aberration-corrected transmission electron microscope, we report on the salient atomistic to micro-scale mechanisms underlying the origin of these properties. We identify a synergy of multiple deformation mechanisms, rarely achieved in metallic alloys, which generates high strength, work hardening and ductility, including the easy motion of Shockley partials, their interactions to form stacking-fault parallelepipeds, and arrest at planar slip bands of undissociated dislocations. We further show that crack propagation is impeded by twinned, nanoscale bridges that form between the near-tip crack faces and delay fracture by shielding the crack tip.

  12. Nanoscale origins of the damage tolerance of the high-entropy alloy CrMnFeCoNi

    DOE PAGESBeta

    Zhang, ZiJiao; Mao, M. M.; Wang, Jiangwei; Gludovatz, Bernd; Zhang, Ze; Mao, Scott X.; George, Easo P.; Yu, Qian; Ritchie, Robert O.

    2015-12-09

Damage tolerance can be an elusive characteristic of structural materials requiring both high strength and ductility, properties that are often mutually exclusive. High-entropy alloys are of interest in this regard. Specifically, the single-phase CrMnFeCoNi alloy displays tensile strength levels of ~1 GPa, excellent ductility (~60–70%) and exceptional fracture toughness (K_JIc > 200 MPa√m). Here, through the use of in situ straining in an aberration-corrected transmission electron microscope, we report on the salient atomistic to micro-scale mechanisms underlying the origin of these properties. We identify a synergy of multiple deformation mechanisms, rarely achieved in metallic alloys, which generates high strength, work hardening and ductility, including the easy motion of Shockley partials, their interactions to form stacking-fault parallelepipeds, and arrest at planar slip bands of undissociated dislocations. We further show that crack propagation is impeded by twinned, nanoscale bridges that form between the near-tip crack faces and delay fracture by shielding the crack tip.

  13. Real-time immune cell interactions in target tissue during autoimmune-induced damage and graft tolerance.

    PubMed

    Miska, Jason; Abdulreda, Midhat H; Devarajan, Priyadharshini; Lui, Jen Bon; Suzuki, Jun; Pileggi, Antonello; Berggren, Per-Olof; Chen, Zhibin

    2014-03-10

Real-time imaging studies are reshaping immunological paradigms, but a visual framework is lacking for self-antigen-specific T cells at the effector phase in target tissues. To address this issue, we conducted intravital, longitudinal imaging analyses of cellular behavior in nonlymphoid target tissues to illustrate some key aspects of T cell biology. We used mouse models of T cell-mediated damage and protection of pancreatic islet grafts. Both CD4(+) and CD8(+) effector T (Teff) lymphocytes directly engaged target cells. Strikingly, juxtaposed β cells lacking specific antigens were not subject to bystander destruction but grew substantially in days, likely by replication. In target tissue, Foxp3(+) regulatory T (Treg) cells persistently contacted Teff cells with or without involvement of CD11c(+) dendritic cells, an observation consistent with the in vitro "trademark" of Treg function, contact-dependent suppression. This study illustrates tolerance induction by contact-based immune cell interaction in target tissues and highlights potentials of tissue regeneration under antigenic incognito in inflammatory settings.

  14. Nanoscale origins of the damage tolerance of the high-entropy alloy CrMnFeCoNi

    PubMed Central

    Zhang, ZiJiao; Mao, M. M.; Wang, Jiangwei; Gludovatz, Bernd; Zhang, Ze; Mao, Scott X.; George, Easo P.; Yu, Qian; Ritchie, Robert O.

    2015-01-01

Damage tolerance can be an elusive characteristic of structural materials requiring both high strength and ductility, properties that are often mutually exclusive. High-entropy alloys are of interest in this regard. Specifically, the single-phase CrMnFeCoNi alloy displays tensile strength levels of ∼1 GPa, excellent ductility (∼60–70%) and exceptional fracture toughness (K_JIc > 200 MPa√m). Here, through the use of in situ straining in an aberration-corrected transmission electron microscope, we report on the salient atomistic to micro-scale mechanisms underlying the origin of these properties. We identify a synergy of multiple deformation mechanisms, rarely achieved in metallic alloys, which generates high strength, work hardening and ductility, including the easy motion of Shockley partials, their interactions to form stacking-fault parallelepipeds, and arrest at planar slip bands of undissociated dislocations. We further show that crack propagation is impeded by twinned, nanoscale bridges that form between the near-tip crack faces and delay fracture by shielding the crack tip. PMID:26647978

  15. Verification of recursive probabilistic integration (RPI) method for fatigue life management using non-destructive inspections

    NASA Astrophysics Data System (ADS)

    Chen, Tzikang J.; Shiao, Michael

    2016-04-01

This paper verifies a generic and efficient assessment concept for probabilistic fatigue life management. The concept integrates damage tolerance methodology, simulation methods [1, 2], and the probabilistic algorithm RPI (recursive probability integration) [3-9], accounting for maintenance in damage tolerance and risk-based fatigue life management. RPI is an efficient semi-analytical probabilistic method for risk assessment subject to various uncertainties, such as variability in material properties (including crack growth rate), initial flaw size, repair quality, random-process modeling of flight loads for failure analysis, and inspection reliability represented by the probability of detection (POD). In addition, unlike traditional Monte Carlo simulation (MCS), which requires a rerun whenever the maintenance plan changes, RPI can repeatedly reuse a small set of baseline random crack growth histories, excluding maintenance-related parameters, from a single MCS for various maintenance plans. To verify the RPI method, MC simulations on the order of several hundred billion trials were conducted for various flight conditions, material properties, inspection schedules, PODs, and repair/replacement strategies. Because such MC simulations are time-consuming, they were run in parallel on DoD High Performance Computing (HPC) systems using a random number generator specialized for parallel computing. The study shows that the RPI method is several orders of magnitude more efficient than traditional Monte Carlo simulation.
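
    To make concrete the kind of computation RPI is designed to accelerate, the following is a minimal brute-force Monte Carlo sketch (not the RPI algorithm itself): Paris-law crack growth with a scattered initial flaw size, a log-logistic POD curve, and repair to the initial state on detection. Every constant here (Paris coefficients, POD parameters, inspection times, flaw-size distribution) is an assumed illustration, not data from the paper:

```python
import math
import random

def pod(a, a50=2.0, beta=4.0):
    """Illustrative log-logistic probability-of-detection curve (a50, beta assumed)."""
    return 1.0 / (1.0 + (a50 / a) ** beta)

def fails_before(t_end, a0, C, m, dstress, a_crit, insp_times, rng, dt=500):
    """Euler-step a Paris-law crack (center-crack geometry factor of 1); an
    inspection that detects the crack repairs it back to its initial size."""
    a, t = a0, 0
    while t < t_end:
        t += dt
        dK = dstress * math.sqrt(math.pi * a)    # stress intensity factor range
        a += C * dK ** m * dt                    # da/dN = C * dK^m
        if a >= a_crit:
            return True                          # fracture before t_end
        if t in insp_times and rng.random() < pod(a):
            a = a0                               # crack detected and repaired
    return False

rng = random.Random(0)
a0s = [rng.lognormvariate(-3.0, 0.3) for _ in range(2000)]  # initial flaw sizes (mm)

def p_fail(insp_times):
    return sum(fails_before(155_000, a0, 1e-11, 3.0, 100.0, 25.0, insp_times, rng)
               for a0 in a0s) / len(a0s)

p_no_insp = p_fail(set())                    # no maintenance
p_insp = p_fail({130_000, 140_000})          # two scheduled inspections
print(f"P(failure by 155k cycles): {p_no_insp:.3f} without, {p_insp:.3f} with inspections")
```

    Note the expense: every change to `insp_times` forces a full re-simulation here, whereas RPI reuses a single set of baseline crack growth histories across maintenance plans.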

  16. Analysis of the Static and Fatigue Strength of a Damage Tolerant 3D-Reinforced Joining Technology on Composite Single Lap Joints

    NASA Astrophysics Data System (ADS)

    Nogueira, A. C.; Drechsler, K.; Hombergsmeier, E.

    2012-07-01

The increasing usage of carbon fiber reinforced plastics (CFRP) in aerospace, together with the constant drive for fuel efficiency and lightweight design, has imposed new challenges on next-generation structural assemblies and load-transfer-efficient joining methods. To address this issue, an innovative technology, termed Redundant High Efficiency Assembly (RHEA) joints, is introduced as a high-performance lightweight joint that combines efficient load transfer with good damage tolerance. A review of the ongoing research involving the RHEA joint technology and its through-thickness reinforcement concept, along with the results of quasi-static and fatigue tensile investigations of single lap shear specimens, is presented and discussed. Improvements in ultimate static load, maximum joint deformation, damage tolerance, and fatigue life are observed when comparing the performance of the RHEA lap shear joints to co-bonded reference specimens.

  17. Probabilistic fatigue methodology for six nines reliability

    NASA Technical Reports Server (NTRS)

    Everett, R. A., Jr.; Bartlett, F. D., Jr.; Elber, Wolf

    1990-01-01

Fleet readiness and flight safety strongly depend on the degree of reliability that can be designed into rotorcraft flight-critical components. The current U.S. Army fatigue life specification for new rotorcraft is the so-called six nines reliability, or a probability of failure of one in a million. The progress of a round robin established by the American Helicopter Society (AHS) Subcommittee for Fatigue and Damage Tolerance is reviewed to investigate reliability-based fatigue methodology. The participants in this cooperative effort are from the U.S. Army Aviation Systems Command (AVSCOM) and the rotorcraft industry. One phase of the joint activity examined fatigue reliability under uniquely defined conditions for which only one answer was correct. The other phases were set up to learn how the different industry methods of defining fatigue strength affected the mean fatigue life and reliability calculations. Hence, constant amplitude and spectrum fatigue test data were provided so that each participant could perform their standard fatigue life analysis. As a result of this round robin, the probabilistic logic which includes both fatigue strength and spectrum loading variability in developing a consistent reliability analysis was established. In this first study, the reliability analysis was limited to the linear cumulative damage approach. However, it is expected that superior fatigue life prediction methods will ultimately be developed through this open AHS forum. To that end, these preliminary results were useful in identifying some topics for additional study.
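
    As a sketch of the linear cumulative damage approach to which the reliability analysis was limited, the following toy calculation applies Miner's rule over a load spectrum with lognormal scatter on the S-N curve. All constants (S-N parameters, spectrum, scatter) are invented for illustration and are unrelated to the round-robin data:

```python
import random

def cycles_to_failure(S, A, b):
    """Basquin-form S-N curve, N = A * S**(-b); A and b are assumed example constants."""
    return A * S ** (-b)

def miner_damage(spectrum, A, b):
    """Linear cumulative damage (Miner's rule): D = sum of n_i / N_i over the spectrum."""
    return sum(n / cycles_to_failure(S, A, b) for S, n in spectrum)

# Hypothetical one-hour load spectrum: (stress amplitude in MPa, cycles per hour)
spectrum = [(300.0, 10), (200.0, 200), (120.0, 5000)]

rng = random.Random(1)
lives = []
for _ in range(100_000):
    # Fatigue strength scatter: lognormal variation of the S-N intercept (assumed COV)
    A = 1e15 * rng.lognormvariate(0.0, 0.25)
    lives.append(1.0 / miner_damage(spectrum, A, b=4.0))  # hours until D = 1
lives.sort()

median = lives[len(lives) // 2]
q_001 = lives[len(lives) // 1000]   # 0.1% lower quantile of the life distribution
print(f"median life ~ {median:.0f} h, 0.1% quantile ~ {q_001:.0f} h")
```

    A six nines requirement corresponds to the 1e-6 lower quantile of this life distribution, which raw sampling at this size cannot resolve; that tail problem is one reason reliability work of this kind leans on analytical models of strength and load variability rather than brute-force sampling.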

  18. Deterministic and Probabilistic Creep and Creep Rupture Enhancement to CARES/Creep: Multiaxial Creep Life Prediction of Ceramic Structures Using Continuum Damage Mechanics and the Finite Element Method

    NASA Technical Reports Server (NTRS)

    Jadaan, Osama M.; Powers, Lynn M.; Gyekenyesi, John P.

    1998-01-01

High temperature and long duration applications of monolithic ceramics can place their failure mode in the creep rupture regime. A previous model advanced by the authors described a methodology by which the creep rupture life of a loaded component can be predicted. That model was based on the life fraction damage accumulation rule in association with the modified Monkman-Grant creep rupture criterion. However, that model did not take into account the deteriorating state of the material due to creep damage (e.g., cavitation) as time elapsed. In addition, the material creep parameters used in that life prediction methodology were based on uniaxial creep curves displaying primary and secondary creep behavior, with no tertiary regime. The objective of this paper is to present a creep life prediction methodology based on a modified form of the Kachanov-Rabotnov continuum damage mechanics (CDM) theory. In this theory, the uniaxial creep rate is described in terms of stress, temperature, time, and the current state of material damage. This scalar damage state parameter is basically an abstract measure of the current state of material damage due to creep deformation. The damage rate is assumed to vary with stress, temperature, time, and the current state of damage itself. Multiaxial creep and creep rupture formulations of the CDM approach are presented in this paper. Parameter estimation methodologies based on nonlinear regression analysis are also described for both isothermal constant stress states and anisothermal variable stress conditions. This creep life prediction methodology was preliminarily added to the integrated design code CARES/Creep (Ceramics Analysis and Reliability Evaluation of Structures/Creep), which is a postprocessor program to commercially available finite element analysis (FEA) packages.
Two examples, showing comparisons between experimental and predicted creep lives of ceramic specimens, are used to demonstrate the viability of this methodology and
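The Kachanov-Rabotnov formulation summarized above can be made concrete numerically. The following is a minimal sketch, not the CARES/Creep implementation: it integrates the standard uniaxial Kachanov-Rabotnov equations under constant stress with forward-Euler steps, and all material constants are hypothetical placeholders rather than fitted ceramic data.

```python
def kachanov_rabotnov_rupture(stress, A=1e-12, n=3.0, B=1e-12, chi=3.0,
                              phi=3.0, dt=25.0, t_max=1.0e7):
    """Integrate the uniaxial Kachanov-Rabotnov equations under constant
    stress until the damage parameter omega reaches 1 (creep rupture).

        d(eps)/dt   = A * sigma**n   / (1 - omega)**n     (creep rate)
        d(omega)/dt = B * sigma**chi / (1 - omega)**phi   (damage rate)

    Returns (rupture_time, accumulated_strain); rupture_time is None if
    no rupture occurs within t_max.
    """
    t, eps, omega = 0.0, 0.0, 0.0
    while t < t_max:
        denom = 1.0 - omega
        if denom <= 0.0:
            return t, eps  # omega has reached 1: creep rupture
        eps += dt * A * stress**n / denom**n
        omega += dt * B * stress**chi / denom**phi
        t += dt
    return None, eps

# For constant stress the damage ODE has the closed-form rupture time
# t_r = 1 / ((phi + 1) * B * sigma**chi), which the integration approaches.
t_rup, strain = kachanov_rabotnov_rupture(100.0)
```

With the placeholder constants above, the analytic rupture time is 1 / (4 * 1e-12 * 100**3) = 2.5e5 time units, which the Euler integration recovers to within a few percent.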

  19. A Damage Tolerance Comparison of Composite Hat-Stiffened and Honeycomb Sandwich Structure for Launch Vehicle Interstage Applications

    NASA Technical Reports Server (NTRS)

    Nettles, A. T.

    2011-01-01

In this study, a direct comparison of the compression-after-impact (CAI) strength of impact-damaged, hat-stiffened and honeycomb sandwich structure for launch vehicle use was made. The specimens used consisted of small substructure designed to carry a line load of approx. 3,000 lb/in. Damage was inflicted upon the specimens via drop weight impact. Infrared thermography was used to examine the extent of planar damage in the specimens. The specimens were prepared for compression testing to obtain residual compression strength versus damage severity curves. Results show that when weight of the structure is factored in, both types of structure had about the same CAI strength for a given damage level. The main difference was that the hat-stiffened specimens exhibited a multiphase failure whereas the honeycomb sandwich structure failed catastrophically.

  20. The Effects of Foam Thermal Protection System on the Damage Tolerance Characteristics of Composite Sandwich Structures for Launch Vehicles

    NASA Technical Reports Server (NTRS)

    Nettles, A. T.; Hodge, A. J.; Jackson, J. R.

    2011-01-01

For any structure composed of laminated composite materials, impact damage is one of the greatest risks and therefore one of the most widely tested responses. Typically, impact damage testing and analysis assumes that a solid object comes into contact with the bare surface of the laminate (the outer ply). However, most launch vehicle structures will have a thermal protection system (TPS) covering the structure for the majority of its life. Thus, the impact response of the material with the TPS covering is the impact scenario of interest. In this study, laminates representative of the composite interstage structure for the Ares I launch vehicle were impact tested with and without the planned TPS covering, which consists of polyurethane foam. Response variables examined include maximum impact load, damage size as detected by nondestructive evaluation techniques, damage morphology, and compression-after-impact strength. Results show that there is little difference between TPS-covered and bare specimens, except that the residual strength data are higher for TPS-covered specimens.

  1. The role of quasi-plasticity in the extreme contact damage tolerance of the stomatopod dactyl club

    NASA Astrophysics Data System (ADS)

    Amini, Shahrouz; Tadayon, Maryam; Idapalapati, Sridhar; Miserez, Ali

    2015-09-01

    The structure of the stomatopod dactyl club--an ultrafast, hammer-like device used by the animal to shatter hard seashells--offers inspiration for impact-tolerant ceramics. Here, we present the micromechanical principles and related micromechanisms of deformation that impart the club with high impact tolerance. By using depth-sensing nanoindentation with spherical and sharp contact tips in combination with post-indentation residual stress mapping by Raman microspectroscopy, we show that the impact surface region of the dactyl club exhibits a quasi-plastic contact response associated with the interfacial sliding and rotation of fluorapatite nanorods, endowing the club with localized yielding. We also show that the subsurface layers exhibit strain hardening by microchannel densification, which provides additional dissipation of impact energy. Our findings suggest that the club’s macroscopic size is below the critical size above which Hertzian brittle cracks are nucleated.

  2. Learning Probabilistic Logic Models from Probabilistic Examples.

    PubMed

    Chen, Jianzhong; Muggleton, Stephen; Santos, José

    2008-10-01

    We revisit an application developed originally using abductive Inductive Logic Programming (ILP) for modeling inhibition in metabolic networks. The example data was derived from studies of the effects of toxins on rats using Nuclear Magnetic Resonance (NMR) time-trace analysis of their biofluids together with background knowledge representing a subset of the Kyoto Encyclopedia of Genes and Genomes (KEGG). We now apply two Probabilistic ILP (PILP) approaches - abductive Stochastic Logic Programs (SLPs) and PRogramming In Statistical modeling (PRISM) to the application. Both approaches support abductive learning and probability predictions. Abductive SLPs are a PILP framework that provides possible worlds semantics to SLPs through abduction. Instead of learning logic models from non-probabilistic examples as done in ILP, the PILP approach applied in this paper is based on a general technique for introducing probability labels within a standard scientific experimental setting involving control and treated data. Our results demonstrate that the PILP approach provides a way of learning probabilistic logic models from probabilistic examples, and the PILP models learned from probabilistic examples lead to a significant decrease in error accompanied by improved insight from the learned results compared with the PILP models learned from non-probabilistic examples.

  3. The micro-mechanics of strength, durability and damage tolerance in composites: new insights from high resolution computed tomography

    NASA Astrophysics Data System (ADS)

    Spearing, S. Mark; Sinclair, Ian

    2016-07-01

Recent work, led by the authors, on impact damage resistance, particle toughening and tensile fibre failure is reviewed in order to illustrate the use of high-resolution X-ray tomography to observe and quantify damage mechanisms in carbon fibre composite laminates. Using synchrotron and micro-focus X-ray sources, resolutions of less than 1 μm have been routinely achieved. This enables individual broken fibres and the micromechanisms of particle toughening to be observed and quantified. The data for fibre failure, cluster formation and overall tensile strength are compared with model predictions. This allows strategies for future model development to be identified. The overall implications for using such high-resolution 3-D measurements to inform a “data-rich mechanics” approach to materials evaluation and modeling are discussed.

  4. Differential roles of proteins involved in migration of Holliday junctions on recombination and tolerance to DNA damaging agents in Rhizobium etli.

    PubMed

    Martínez-Salazar, Jaime M; Zuñiga-Castillo, Jacobo; Romero, David

    2009-03-01

The recombination genes involved in Holliday junction migration (ruvB, recG, radA) and heteroduplex editing (mutS) were studied in the alpha-proteobacterium Rhizobium etli. The genes were interrupted with a loxPSp interposon and R. etli mutants, either single or in combination, were constructed by marker exchange. Our results show that these systems play a differential role in sensitivity to DNA damaging agents and recombination in R. etli. RuvB appears to be the main system for tolerance toward agents causing single- or double-strand breaks (such as UV light, methyl methanesulphonate and nalidixic acid) while the RecG and RadA systems play minor roles in tolerance to these agents. Using five different recombination assays, we have found that a ruvB null mutant showed a notable reduction in recombination proficiency, while a radA mutant was only weakly affected. A null mutation in recG had the opposite effect, enhancing recombination in most of our assays. This effect was more clearly seen in an assay that measured recombination between divergent sequences (i.e. homeologous), but was unaffected by inactivation of mutS. These data indicate that RecG in R. etli limits intra- and intergenomic plasticity.

  5. A cell wall damage response mediated by a sensor kinase/response regulator pair enables beta-lactam tolerance.

    PubMed

    Dörr, Tobias; Alvarez, Laura; Delgado, Fernanda; Davis, Brigid M; Cava, Felipe; Waldor, Matthew K

    2016-01-12

    The bacterial cell wall is critical for maintenance of cell shape and survival. Following exposure to antibiotics that target enzymes required for cell wall synthesis, bacteria typically lyse. Although several cell envelope stress response systems have been well described, there is little knowledge of systems that modulate cell wall synthesis in response to cell wall damage, particularly in Gram-negative bacteria. Here we describe WigK/WigR, a histidine kinase/response regulator pair that enables Vibrio cholerae, the cholera pathogen, to survive exposure to antibiotics targeting cell wall synthesis in vitro and during infection. Unlike wild-type V. cholerae, mutants lacking wigR fail to recover following exposure to cell-wall-acting antibiotics, and they exhibit a drastically increased cell diameter in the absence of such antibiotics. Conversely, overexpression of wigR leads to cell slimming. Overexpression of activated WigR also results in increased expression of the full set of cell wall synthesis genes and to elevated cell wall content. WigKR-dependent expression of cell wall synthesis genes is induced by various cell-wall-acting antibiotics as well as by overexpression of an endogenous cell wall hydrolase. Thus, WigKR appears to monitor cell wall integrity and to enhance the capacity for increased cell wall production in response to damage. Taken together, these findings implicate WigKR as a regulator of cell wall synthesis that controls cell wall homeostasis in response to antibiotics and likely during normal growth as well.

  6. MELK-T1, a small-molecule inhibitor of protein kinase MELK, decreases DNA-damage tolerance in proliferating cancer cells

    PubMed Central

    Beke, Lijs; Kig, Cenk; Linders, Joannes T. M.; Boens, Shannah; Boeckx, An; van Heerde, Erika; Parade, Marc; De Bondt, An; Van den Wyngaert, Ilse; Bashir, Tarig; Ogata, Souichi; Meerpoel, Lieven; Van Eynde, Aleyde; Johnson, Christopher N.; Beullens, Monique; Brehmer, Dirk; Bollen, Mathieu

    2015-01-01

    Maternal embryonic leucine zipper kinase (MELK), a serine/threonine protein kinase, has oncogenic properties and is overexpressed in many cancer cells. The oncogenic function of MELK is attributed to its capacity to disable critical cell-cycle checkpoints and reduce replication stress. Most functional studies have relied on the use of siRNA/shRNA-mediated gene silencing. In the present study, we have explored the biological function of MELK using MELK-T1, a novel and selective small-molecule inhibitor. Strikingly, MELK-T1 triggered a rapid and proteasome-dependent degradation of the MELK protein. Treatment of MCF-7 (Michigan Cancer Foundation-7) breast adenocarcinoma cells with MELK-T1 induced the accumulation of stalled replication forks and double-strand breaks that culminated in a replicative senescence phenotype. This phenotype correlated with a rapid and long-lasting ataxia telangiectasia-mutated (ATM) activation and phosphorylation of checkpoint kinase 2 (CHK2). Furthermore, MELK-T1 induced a strong phosphorylation of p53 (cellular tumour antigen p53), a prolonged up-regulation of p21 (cyclin-dependent kinase inhibitor 1) and a down-regulation of FOXM1 (Forkhead Box M1) target genes. Our data indicate that MELK is a key stimulator of proliferation by its ability to increase the threshold for DNA-damage tolerance (DDT). Thus, targeting MELK by the inhibition of both its catalytic activity and its protein stability might sensitize tumours to DNA-damaging agents or radiation therapy by lowering the DNA-damage threshold. PMID:26431963

  7. MELK-T1, a small-molecule inhibitor of protein kinase MELK, decreases DNA-damage tolerance in proliferating cancer cells.

    PubMed

    Beke, Lijs; Kig, Cenk; Linders, Joannes T M; Boens, Shannah; Boeckx, An; van Heerde, Erika; Parade, Marc; De Bondt, An; Van den Wyngaert, Ilse; Bashir, Tarig; Ogata, Souichi; Meerpoel, Lieven; Van Eynde, Aleyde; Johnson, Christopher N; Beullens, Monique; Brehmer, Dirk; Bollen, Mathieu

    2015-01-01

    Maternal embryonic leucine zipper kinase (MELK), a serine/threonine protein kinase, has oncogenic properties and is overexpressed in many cancer cells. The oncogenic function of MELK is attributed to its capacity to disable critical cell-cycle checkpoints and reduce replication stress. Most functional studies have relied on the use of siRNA/shRNA-mediated gene silencing. In the present study, we have explored the biological function of MELK using MELK-T1, a novel and selective small-molecule inhibitor. Strikingly, MELK-T1 triggered a rapid and proteasome-dependent degradation of the MELK protein. Treatment of MCF-7 (Michigan Cancer Foundation-7) breast adenocarcinoma cells with MELK-T1 induced the accumulation of stalled replication forks and double-strand breaks that culminated in a replicative senescence phenotype. This phenotype correlated with a rapid and long-lasting ataxia telangiectasia-mutated (ATM) activation and phosphorylation of checkpoint kinase 2 (CHK2). Furthermore, MELK-T1 induced a strong phosphorylation of p53 (cellular tumour antigen p53), a prolonged up-regulation of p21 (cyclin-dependent kinase inhibitor 1) and a down-regulation of FOXM1 (Forkhead Box M1) target genes. Our data indicate that MELK is a key stimulator of proliferation by its ability to increase the threshold for DNA-damage tolerance (DDT). Thus, targeting MELK by the inhibition of both its catalytic activity and its protein stability might sensitize tumours to DNA-damaging agents or radiation therapy by lowering the DNA-damage threshold. PMID:26431963

  8. Identification of β Clamp-DNA Interaction Regions That Impair the Ability of E. coli to Tolerate Specific Classes of DNA Damage

    PubMed Central

    Nanfara, Michael T.; Babu, Vignesh M. P.; Ghazy, Mohamed A.; Sutton, Mark D.

    2016-01-01

The E. coli dnaN-encoded β sliding clamp protein plays a pivotal role in managing the actions on DNA of the 5 bacterial DNA polymerases, proteins involved in mismatch repair, as well as several additional proteins involved in DNA replication. Results of in vitro experiments indicate that the loading of β clamp onto DNA relies on both the DnaX clamp loader complex as well as several discrete sliding clamp-DNA interactions. However, the importance of these DNA interactions to E. coli viability, as well as the ability of the β clamp to support the actions of its numerous partner proteins, have not yet been examined. To determine the contribution of β clamp-DNA interactions to the ability of E. coli to cope with different classes of DNA damage, we used alanine scanning to mutate 22 separate residues mapping to 3 distinct β clamp surfaces known, or near those known, to contact the DNA template, including residues P20-L27 (referred to here as loop I), H148-Y154 (loop II) and 7 different residues lining the central pore of the β clamp through which the DNA template threads. Twenty of these 22 dnaN mutants supported bacterial growth. While none of these 20 conferred sensitivity to hydrogen peroxide or ultraviolet light, 12 were sensitized to NFZ, 5 were sensitized to MMS, 8 displayed modestly altered frequencies of DNA damage-induced mutagenesis, and 2 may be impaired for supporting hda function. Taken together, these results demonstrate that discrete β clamp-DNA interaction regions contribute to the ability of E. coli to tolerate specific classes of DNA damage. PMID:27685804

  9. A cell wall damage response mediated by a sensor kinase/response regulator pair enables beta-lactam tolerance

    PubMed Central

    Dörr, Tobias; Alvarez, Laura; Delgado, Fernanda; Davis, Brigid M.; Cava, Felipe; Waldor, Matthew K.

    2016-01-01

    The bacterial cell wall is critical for maintenance of cell shape and survival. Following exposure to antibiotics that target enzymes required for cell wall synthesis, bacteria typically lyse. Although several cell envelope stress response systems have been well described, there is little knowledge of systems that modulate cell wall synthesis in response to cell wall damage, particularly in Gram-negative bacteria. Here we describe WigK/WigR, a histidine kinase/response regulator pair that enables Vibrio cholerae, the cholera pathogen, to survive exposure to antibiotics targeting cell wall synthesis in vitro and during infection. Unlike wild-type V. cholerae, mutants lacking wigR fail to recover following exposure to cell-wall–acting antibiotics, and they exhibit a drastically increased cell diameter in the absence of such antibiotics. Conversely, overexpression of wigR leads to cell slimming. Overexpression of activated WigR also results in increased expression of the full set of cell wall synthesis genes and to elevated cell wall content. WigKR-dependent expression of cell wall synthesis genes is induced by various cell-wall–acting antibiotics as well as by overexpression of an endogenous cell wall hydrolase. Thus, WigKR appears to monitor cell wall integrity and to enhance the capacity for increased cell wall production in response to damage. Taken together, these findings implicate WigKR as a regulator of cell wall synthesis that controls cell wall homeostasis in response to antibiotics and likely during normal growth as well. PMID:26712007

  10. Probabilistic record linkage.

    PubMed

    Sayers, Adrian; Ben-Shlomo, Yoav; Blom, Ashley W; Steele, Fiona

    2016-06-01

    Studies involving the use of probabilistic record linkage are becoming increasingly common. However, the methods underpinning probabilistic record linkage are not widely taught or understood, and therefore these studies can appear to be a 'black box' research tool. In this article, we aim to describe the process of probabilistic record linkage through a simple exemplar. We first introduce the concept of deterministic linkage and contrast this with probabilistic linkage. We illustrate each step of the process using a simple exemplar and describe the data structure required to perform a probabilistic linkage. We describe the process of calculating and interpreting matched weights and how to convert matched weights into posterior probabilities of a match using Bayes theorem. We conclude this article with a brief discussion of some of the computational demands of record linkage, how you might assess the quality of your linkage algorithm, and how epidemiologists can maximize the value of their record-linked research using robust record linkage methods. PMID:26686842
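The match-weight calculation walked through in the abstract can be illustrated directly. Below is a minimal Fellegi-Sunter-style sketch with hypothetical m- and u-probabilities (not the authors' exemplar data): agreement weights are log2 likelihood ratios per field, and the total weight is converted to a posterior match probability via Bayes' theorem.

```python
import math

def field_weight(agrees, m, u):
    """log2 weight for one field: m = P(agree | true match),
    u = P(agree | non-match). Agreement adds evidence for a match,
    disagreement adds evidence against."""
    return math.log2(m / u) if agrees else math.log2((1 - m) / (1 - u))

def match_posterior(pattern, m_probs, u_probs, prior):
    """Posterior match probability for a field-agreement pattern:
    posterior odds = prior odds * 2**(total matched weight)."""
    total = sum(field_weight(a, m, u)
                for a, m, u in zip(pattern, m_probs, u_probs))
    odds = prior / (1.0 - prior) * 2.0**total
    return odds / (1.0 + odds)

# Hypothetical linkage fields: surname, date of birth, postcode.
m = [0.95, 0.98, 0.90]   # agreement rates among true matches
u = [0.01, 0.001, 0.05]  # chance agreement rates among non-matches
prior = 1e-4             # prior probability that a candidate pair matches

p_all_agree = match_posterior([True, True, True], m, u, prior)
p_all_disagree = match_posterior([False, False, False], m, u, prior)
```

Even with a very small prior, full agreement on three discriminating fields pushes the posterior above 0.99, while full disagreement drives it to near zero; this is the weight-threshold behaviour that linkage algorithms exploit.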

  11. Umbilical cord blood-derived stem cells improve heat tolerance and hypothalamic damage in heat stressed mice.

    PubMed

    Tseng, Ling-Shu; Chen, Sheng-Hsien; Lin, Mao-Tsun; Lin, Ying-Chu

    2014-01-01

Heatstroke is characterized by excessive hyperthermia associated with systemic inflammatory responses, which leads to multiple organ failure, in which brain disorders predominate. This definition can be almost fulfilled by a mouse model of heatstroke used in the present study. Unanesthetized mice were exposed to whole body heating (41.2°C for 1 hour) and then returned to room temperature (26°C) for recovery. Immediately after termination of whole body heating, heated mice displayed excessive hyperthermia (body core temperature ~42.5°C). Four hours after termination of heat stress, heated mice displayed (i) systemic inflammation; (ii) ischemic, hypoxic, and oxidative damage to the hypothalamus; (iii) hypothalamo-pituitary-adrenocortical axis impairment (reflected by plasma levels of both adrenocorticotrophic hormone and corticosterone); (iv) decreased fractional survival; and (v) thermoregulatory deficits (e.g., they became hypothermic when they were exposed to room temperature). These heatstroke reactions can be significantly attenuated by human umbilical cord blood-derived CD34(+) cell therapy. Our data suggest that human umbilical cord blood-derived stem cell therapy may improve outcomes of heatstroke in mice by reducing systemic inflammation as well as hypothalamo-pituitary-adrenocortical axis impairment.

  12. Probabilistic Structural Analysis Program

    NASA Technical Reports Server (NTRS)

    Pai, Shantaram S.; Chamis, Christos C.; Murthy, Pappu L. N.; Stefko, George L.; Riha, David S.; Thacker, Ben H.; Nagpal, Vinod K.; Mital, Subodh K.

    2010-01-01

NASA/NESSUS 6.2c is a general-purpose, probabilistic analysis program that computes probability of failure and probabilistic sensitivity measures of engineered systems. Because NASA/NESSUS uses highly computationally efficient and accurate analysis techniques, probabilistic solutions can be obtained even for extremely large and complex models. Once the probabilistic response is quantified, the results can be used to support risk-informed decisions regarding reliability for safety-critical and one-of-a-kind systems, as well as for maintaining a level of quality while reducing manufacturing costs for larger-quantity products. NASA/NESSUS has been successfully applied to a diverse range of problems in aerospace, gas turbine engines, biomechanics, pipelines, defense, weaponry, and infrastructure. This program combines state-of-the-art probabilistic algorithms with general-purpose structural analysis and life-prediction methods to compute the probabilistic response and reliability of engineered structures. Uncertainties in load, material properties, geometry, boundary conditions, and initial conditions can be simulated. The structural analysis methods include non-linear finite-element methods, heat-transfer analysis, polymer/ceramic matrix composite analysis, monolithic (conventional metallic) materials life-prediction methodologies, boundary element methods, and user-written subroutines. Several probabilistic algorithms are available such as the advanced mean value method and the adaptive importance sampling method. NASA/NESSUS 6.2c is structured in a modular format with 15 elements.
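NESSUS itself is a large production code, but the central quantity it computes, the probability of failure of a limit state under uncertain inputs, can be illustrated with a toy sketch. The resistance/load numbers below are invented, and brute-force Monte Carlo stands in for the more efficient methods (e.g. advanced mean value) that NESSUS actually uses:

```python
import random
from statistics import NormalDist

def monte_carlo_pof(mu_r, sd_r, mu_s, sd_s, n=200_000, seed=0):
    """Estimate P(failure) for the limit state g = R - S < 0, where
    resistance R and load S are independent normal random variables."""
    rng = random.Random(seed)
    failures = sum(
        1 for _ in range(n)
        if rng.gauss(mu_r, sd_r) - rng.gauss(mu_s, sd_s) < 0.0
    )
    return failures / n

# Toy structural margin: resistance ~ N(500, 50), load ~ N(300, 40).
pof_mc = monte_carlo_pof(500.0, 50.0, 300.0, 40.0)

# For two normals the answer is exact, via the reliability index beta:
beta = (500.0 - 300.0) / (50.0**2 + 40.0**2) ** 0.5
pof_exact = NormalDist().cdf(-beta)
```

The exact reliability index here is about 3.12, i.e. a failure probability near 9e-4, which the sampling estimate reproduces; real probabilistic structural codes exist precisely because each "sample" may be a full nonlinear finite-element run, making brute force infeasible.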

  13. Coordinated Changes in Antioxidative Enzymes Protect the Photosynthetic Machinery from Salinity Induced Oxidative Damage and Confer Salt Tolerance in an Extreme Halophyte Salvadora persica L.

    PubMed Central

    Rangani, Jaykumar; Parida, Asish K.; Panda, Ashok; Kumari, Asha

    2016-01-01

    Salinity-induced modulations in growth, photosynthetic pigments, relative water content (RWC), lipid peroxidation, photosynthesis, photosystem II efficiency, and changes in activity of various antioxidative enzymes were studied in the halophyte Salvadora persica treated with various levels of salinity (0, 250, 500, 750, and 1000 mM NaCl) to obtain an insight into the salt tolerance ability of this halophyte. Both fresh and dry biomass as well as leaf area (LA) declined at all levels of salinity whereas salinity caused an increase in leaf succulence. A gradual increase was observed in the Na+ content of leaf with increasing salt concentration up to 750 mM NaCl, but at higher salt concentration (1000 mM NaCl), the Na+ content surprisingly dropped down to the level of 250 mM NaCl. The chlorophyll and carotenoid contents of the leaf remained unaffected by salinity. The photosynthetic rate (PN), stomatal conductance (gs), the transpiration rate (E), quantum yield of PSII (ΦPSII), photochemical quenching (qP), and electron transport rate remained unchanged at low salinity (250 to 500 mM NaCl) whereas, significant reduction in these parameters were observed at high salinity (750 to 1000 mM NaCl). The RWC% and water use efficiency (WUE) of leaf remained unaffected by salinity. The salinity had no effect on maximum quantum efficiency of PS II (Fv/Fm) which indicates that PS II is not perturbed by salinity-induced oxidative damage. Analysis of the isoforms of antioxidative enzymes revealed that the leaves of S. persica have two isoforms each of Mn-SOD and Fe-SOD and one isoform of Cu-Zn SOD, three isoforms of POX, two isoforms of APX and one isoform of CAT. There was differential responses in activity and expression of different isoforms of various antioxidative enzymes. The malondialdehyde (MDA) content (a product of lipid peroxidation) of leaf remained unchanged in S. persica treated with various levels of salinity. Our results suggest that the absence of pigment

  14. Review of seismic probabilistic risk assessment and the use of sensitivity analysis

    SciTech Connect

    Shiu, K.K.; Reed, J.W.; McCann, M.W. Jr.

    1985-01-01

This paper presents results of sensitivity reviews performed to address a range of questions which arise in the context of seismic probabilistic risk assessment (PRA). These questions are the subject of this paper. A seismic PRA involves evaluation of seismic hazard, component fragilities, and system responses. They are combined in an integrated analysis to obtain various risk measures, such as frequency of plant damage states. Calculation of these measures depends on combination of non-linear functions based on a number of parameters and assumptions used in the quantification process. Therefore it is often difficult to examine seismic PRA results and derive useful insights from them if detailed sensitivity studies are absent. This has been exemplified in the process of trying to understand the role of low acceleration earthquakes in overall seismic risk. It is useful to understand, within a probabilistic framework, what uncertainties in the physical properties of the plant can be tolerated, if the risk from a safe shutdown earthquake is to be considered negligible. Seismic event trees and fault trees were developed to model the different systems and plant accident sequences. Hazard curves which represent various sites on the east coast were obtained; alternate structure and equipment fragility data were postulated. Various combinations of hazard and fragility data were analyzed. In addition, system modeling was perturbed to examine the impact upon the final results. Orders of magnitude variation were observed in the plant damage state frequency among the different cases. 7 refs.
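The core seismic PRA quantification step, convolving a site hazard curve with a component fragility to obtain a damage-state frequency, can be sketched numerically. The power-law hazard and lognormal fragility parameters below are invented placeholders, not the east-coast data used in the review:

```python
import math
from statistics import NormalDist

def lognormal_fragility(a, median, beta):
    """P(component failure | peak ground acceleration a), using the
    standard lognormal fragility with median capacity and log-std beta."""
    return NormalDist().cdf(math.log(a / median) / beta)

def damage_frequency(hazard, fragility, a_min=0.05, a_max=3.0, n=2000):
    """Annual damage-state frequency: sum of fragility(a) * |dH(a)| over a
    log-spaced acceleration grid, where H(a) is the annual frequency of
    exceeding acceleration a."""
    grid = [a_min * (a_max / a_min) ** (i / n) for i in range(n + 1)]
    freq = 0.0
    for lo, hi in zip(grid, grid[1:]):
        mid = math.sqrt(lo * hi)                # geometric midpoint of bin
        freq += fragility(mid) * (hazard(lo) - hazard(hi))
    return freq

# Hypothetical power-law hazard curve: H(a) = 1e-4 * a**-2 (a in g).
hazard = lambda a: 1e-4 * a ** -2.0
f_nominal = damage_frequency(hazard, lambda a: lognormal_fragility(a, 0.8, 0.4))
f_weaker = damage_frequency(hazard, lambda a: lognormal_fragility(a, 0.4, 0.4))
```

Perturbing the fragility median (here halving it) shifts the damage frequency substantially, which is exactly the kind of sensitivity the paper argues must be examined explicitly.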

  15. A Markov Chain Approach to Probabilistic Swarm Guidance

    NASA Technical Reports Server (NTRS)

    Acikmese, Behcet; Bayard, David S.

    2012-01-01

This paper introduces a probabilistic guidance approach for the coordination of swarms of autonomous agents. The main idea is to drive the swarm to a prescribed density distribution in a prescribed region of the configuration space. In its simplest form, the probabilistic approach is completely decentralized and does not require communication or collaboration between agents. Agents make statistically independent probabilistic decisions based solely on their own state, that ultimately guides the swarm to the desired density distribution in the configuration space. In addition to being completely decentralized, the probabilistic guidance approach has a novel autonomous self-repair property: Once the desired swarm density distribution is attained, the agents automatically repair any damage to the distribution without collaborating and without any knowledge about the damage.
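The mechanism described above, each agent independently sampling its next bin from a Markov matrix whose stationary distribution is the target density, can be sketched with a generic Metropolis-style construction. This is an illustration under an assumed ring-of-bins geometry, not the authors' specific guidance law:

```python
import random

def metropolis_chain(target):
    """Build a row-stochastic Markov matrix on a ring of bins whose
    stationary distribution is `target`: propose a neighbour uniformly,
    accept with probability min(1, target[j]/target[i]), else stay."""
    n = len(target)
    M = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in ((i - 1) % n, (i + 1) % n):
            M[i][j] = 0.5 * min(1.0, target[j] / target[i])
        M[i][i] = 1.0 - sum(M[i])  # remaining mass: stay in current bin
    return M

def step(bins, M, rng):
    """Each agent independently samples its next bin from its row of M."""
    out = []
    for b in bins:
        r, acc = rng.random(), 0.0
        for j, p in enumerate(M[b]):
            acc += p
            if r < acc:
                out.append(j)
                break
        else:
            out.append(b)  # guard against floating-point round-off
    return out

target = [0.10, 0.20, 0.30, 0.25, 0.15]
M = metropolis_chain(target)
rng = random.Random(42)
agents = [0] * 2000                 # all agents start in bin 0
for _ in range(200):
    agents = step(agents, M, rng)
density = [agents.count(b) / len(agents) for b in range(len(target))]
```

Detailed balance makes `target` exactly stationary, so the empirical swarm density converges to it from any initial condition, which is also why the self-repair property falls out for free: any perturbation of the density decays back under the same transitions.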

  16. Probabilistic protocols in quantum information science: Use and abuse

    NASA Astrophysics Data System (ADS)

    Caves, Carlton

    2014-03-01

    Protocols in quantum information science often succeed with less than unit probability, but nonetheless perform useful tasks because success occurs often enough to make tolerable the overhead from having to perform the protocol several times. Any probabilistic protocol must be analyzed from the perspective of the resources required to make the protocol succeed. I present results from analyses of two probabilistic protocols: (i) nondeterministic (or immaculate) linear amplification, in which an input coherent state is amplified some of the time to a larger-amplitude coherent state, and (ii) probabilistic quantum metrology, in which one attempts to improve estimation of a parameter (or parameters) by post-selecting on a particular outcome. The analysis indicates that there is little to be gained from probabilistic protocols in these two situations.
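The resource bookkeeping described above is easy to make explicit: a protocol succeeding with probability p per attempt needs a geometrically distributed number of repetitions. A small generic sketch (not tied to either protocol analyzed in the talk):

```python
import math

def expected_attempts(p):
    """Mean number of independent runs until the first success
    (geometric distribution)."""
    return 1.0 / p

def attempts_for_confidence(p, confidence=0.99):
    """Smallest k with P(at least one success in k runs) >= confidence,
    from 1 - (1 - p)**k >= confidence."""
    return math.ceil(math.log(1.0 - confidence) / math.log(1.0 - p))

# e.g. a heralded probabilistic operation that succeeds 20% of the time:
mean_runs = expected_attempts(0.2)
runs_99 = attempts_for_confidence(0.2)
```

This overhead of roughly 1/p repetitions (and more for high-confidence completion) is the cost that any claimed advantage of a probabilistic protocol must be weighed against.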

  17. Probabilistic record linkage

    PubMed Central

    Sayers, Adrian; Ben-Shlomo, Yoav; Blom, Ashley W; Steele, Fiona

    2016-01-01

    Studies involving the use of probabilistic record linkage are becoming increasingly common. However, the methods underpinning probabilistic record linkage are not widely taught or understood, and therefore these studies can appear to be a ‘black box’ research tool. In this article, we aim to describe the process of probabilistic record linkage through a simple exemplar. We first introduce the concept of deterministic linkage and contrast this with probabilistic linkage. We illustrate each step of the process using a simple exemplar and describe the data structure required to perform a probabilistic linkage. We describe the process of calculating and interpreting matched weights and how to convert matched weights into posterior probabilities of a match using Bayes theorem. We conclude this article with a brief discussion of some of the computational demands of record linkage, how you might assess the quality of your linkage algorithm, and how epidemiologists can maximize the value of their record-linked research using robust record linkage methods. PMID:26686842

  18. Probabilistic microcell prediction model

    NASA Astrophysics Data System (ADS)

    Kim, Song-Kyoo

    2002-06-01

A microcell is a cell with 1-km or less radius which is suitable for a heavily urbanized area such as a metropolitan city. This paper deals with a microcell prediction model of propagation loss which uses probabilistic techniques. The RSL (Receive Signal Level) is the factor used to evaluate the performance of a microcell, and the LOS (Line-Of-Sight) component and the blockage loss directly affect the RSL. We combine probabilistic methods to obtain these performance factors. The mathematical methods include the CLT (Central Limit Theorem) and the SPC (Statistical Process Control) to get the parameters of the distribution. This probabilistic solution gives us better measurement of performance factors. In addition, it gives the probabilistic optimization of strategies such as the number of cells, cell location, capacity of cells, range of cells and so on. In particular, the probabilistic optimization techniques themselves can be applied to real-world problems such as computer networking, human resources and manufacturing processes.
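The probabilistic treatment of RSL can be illustrated with a toy Monte Carlo model: a hypothetical log-distance path-loss law plus normally distributed (in dB) blockage/shadowing loss, with a CLT-based confidence interval on the estimated coverage probability. All parameter values below are invented for illustration:

```python
import math
import random

def rsl_dbm(distance_m, rng, eirp_dbm=30.0, pl0_db=40.0, exponent=3.5,
            shadow_sigma_db=8.0):
    """One RSL sample: EIRP minus log-distance path loss minus a random
    blockage/shadowing loss (normal in dB)."""
    path_loss = pl0_db + 10.0 * exponent * math.log10(distance_m)
    return eirp_dbm - path_loss - rng.gauss(0.0, shadow_sigma_db)

def coverage(distance_m, threshold_dbm=-100.0, n=20_000, seed=1):
    """Estimate P(RSL >= threshold) at a given range, and a 95% CLT
    confidence half-width for that estimate."""
    rng = random.Random(seed)
    hits = sum(rsl_dbm(distance_m, rng) >= threshold_dbm for _ in range(n))
    p = hits / n
    half_width = 1.96 * math.sqrt(p * (1.0 - p) / n)
    return p, half_width

p_cov, ci = coverage(200.0)
```

With these placeholder parameters, coverage at 200 m lands near 88%, and the CLT half-width shows how the number of samples controls the precision of such performance estimates, the same trade-off a cell-planning optimization would rely on.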

  19. Damage tolerant light absorbing material

    DOEpatents

    Lauf, Robert J.; Hamby, Jr., Clyde; Akerman, M. Alfred; Seals, Roland D.

    1993-01-01

A light absorbing article comprised of a composite of carbon-bonded carbon fibers, prepared by: blending carbon fibers with a carbonizable organic powder to form a mixture; dispersing the mixture into an aqueous slurry; vacuum molding the aqueous slurry to form a green article; drying and curing the green article to form a cured article; and, carbonizing the cured article at a temperature of at least about 1000 °C to form a carbon-bonded carbon fiber light absorbing composite article having a bulk density less than 1 g/cm^3.

  20. Damage tolerant light absorbing material

    DOEpatents

    Lauf, R.J.; Hamby, C. Jr.; Akerman, M.A.; Seals, R.D.

    1993-09-07

A light absorbing article comprised of a composite of carbon-bonded carbon fibers, is prepared by: blending carbon fibers with a carbonizable organic powder to form a mixture; dispersing the mixture into an aqueous slurry; vacuum molding the aqueous slurry to form a green article; drying and curing the green article to form a cured article; and, carbonizing the cured article at a temperature of at least about 1000 °C to form a carbon-bonded carbon fiber light absorbing composite article having a bulk density less than 1 g/cm^3. 9 figures.

  1. Opportunities of probabilistic flood loss models

    NASA Astrophysics Data System (ADS)

    Schröter, Kai; Kreibich, Heidi; Lüdtke, Stefan; Vogel, Kristin; Merz, Bruno

    2016-04-01

    Oftentimes, traditional uni-variate damage models such as depth-damage curves fail to reproduce the variability of observed flood damage. However, reliable flood damage models are a prerequisite for the practical usefulness of the model results. Innovative multi-variate probabilistic modelling approaches are promising to capture and quantify the uncertainty involved and thus to improve the basis for decision making. In this study we compare the predictive capability of two probabilistic modelling approaches, namely Bagging Decision Trees and Bayesian Networks, with traditional stage-damage functions. For model evaluation we use empirical damage data available from computer-aided telephone interviews that were compiled after the floods of 2002, 2005, 2006 and 2013 in the Elbe and Danube catchments in Germany. We carry out a split-sample test by sub-setting the damage records: one sub-set is used to derive the models and the remaining records are used to evaluate their predictive performance. Further, we stratify the sample according to catchments, which allows studying model performance in a spatial-transfer context. Flood damage estimation is carried out on the scale of individual buildings in terms of relative damage. The predictive performance of the models is assessed in terms of systematic deviations (mean bias), precision (mean absolute error) and reliability, represented by the proportion of observations that fall within the 5%-95% predictive interval. The comparison of the uni-variate stage-damage function and the multi-variate model approaches emphasises the importance of quantifying predictive uncertainty. With each additional explanatory variable, the multi-variate model reveals an additional source of uncertainty. However, the predictive performance in terms of bias (MBE), precision (MAE) and reliability (HR) is clearly improved.
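    The three evaluation metrics named above can be sketched in a few lines. The toy observations and predictive samples below are invented; the interval is taken as the min/max of a small predictive sample purely for brevity (a real evaluation would use the 5% and 95% quantiles of a larger ensemble).

```python
# Toy observed vs. predicted relative damages; each prediction carries a
# small predictive sample (e.g. from bagged trees), not a single number.
observed = [0.10, 0.25, 0.05, 0.40, 0.15]
pred_samples = [
    [0.08, 0.12, 0.11], [0.20, 0.30, 0.26], [0.04, 0.07, 0.06],
    [0.35, 0.45, 0.38], [0.60, 0.70, 0.65],   # last prediction is badly off
]

point = [sum(s) / len(s) for s in pred_samples]

# Mean bias error: systematic over-/under-estimation.
mbe = sum(p - o for p, o in zip(point, observed)) / len(observed)
# Mean absolute error: precision of the point predictions.
mae = sum(abs(p - o) for p, o in zip(point, observed)) / len(observed)
# Hit rate: share of observations inside the predictive interval.
hits = sum(min(s) <= o <= max(s) for s, o in zip(pred_samples, observed))
hit_rate = hits / len(observed)
```

The deliberately bad fifth prediction shows how the hit rate exposes overconfident intervals that a point-error metric alone would only partially reveal.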

  2. Probabilistic Composite Design

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.

    1997-01-01

    Probabilistic composite design is described in terms of a computational simulation. This simulation tracks probabilistically the composite design evolution from constituent materials, fabrication process, through composite mechanics and structural components. Comparisons with experimental data are provided to illustrate selection of probabilistic design allowables, test methods/specimen guidelines, and identification of in situ versus pristine strength. For example, results show that: in situ fiber tensile strength is 90% of its pristine strength; flat-wise long-tapered specimens are most suitable for setting ply tensile strength allowables; a composite radome can be designed with a reliability of 0.999999; and laminate fatigue exhibits wide-spread scatter at 90% cyclic-stress to static-strength ratios.
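    A minimal sketch of how a probabilistic design allowable might be derived from constituent scatter, using the 90% in-situ knockdown reported above. The pristine strength distribution and the 99%-exceedance allowable definition are assumptions for illustration, not the paper's procedure.

```python
import random
import statistics

random.seed(1)

# Hypothetical pristine fiber tensile strength scatter (MPa, invented).
PRISTINE_MEAN, PRISTINE_COV = 4000.0, 0.06
IN_SITU_KNOCKDOWN = 0.90   # in-situ strength ~90% of pristine (per abstract)

samples = [IN_SITU_KNOCKDOWN * random.gauss(PRISTINE_MEAN,
                                            PRISTINE_COV * PRISTINE_MEAN)
           for _ in range(50000)]

# A simple probabilistic design allowable: the strength exceeded with
# 99% probability (the 1st percentile of the simulated in-situ scatter).
samples.sort()
allowable_99 = samples[int(0.01 * len(samples))]
mean_in_situ = statistics.fmean(samples)
```

A full simulation such as the one described would propagate fabrication and ply-level uncertainties as well; this isolates only the constituent-strength step.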

  3. Formalizing Probabilistic Safety Claims

    NASA Technical Reports Server (NTRS)

    Herencia-Zapana, Heber; Hagen, George E.; Narkawicz, Anthony J.

    2011-01-01

    A safety claim for a system is a statement that the system, which is subject to hazardous conditions, satisfies a given set of properties. Following work by John Rushby and Bev Littlewood, this paper presents a mathematical framework that can be used to state and formally prove probabilistic safety claims. It also enables hazardous conditions, their uncertainties, and their interactions to be integrated into the safety claim. This framework provides a formal description of the probabilistic composition of an arbitrary number of hazardous conditions and their effects on system behavior. An example is given of a probabilistic safety claim for a conflict detection algorithm for aircraft in a 2D airspace. The motivation for developing this mathematical framework is that it can be used in an automated theorem prover to formally verify safety claims.

  4. Probabilistic, meso-scale flood loss modelling

    NASA Astrophysics Data System (ADS)

    Kreibich, Heidi; Botto, Anna; Schröter, Kai; Merz, Bruno

    2016-04-01

    Flood risk analyses are an important basis for decisions on flood risk management and adaptation. However, such analyses are associated with significant uncertainty, even more so if changes in risk due to global change are expected. Although uncertainty analysis and probabilistic approaches have received increased attention during the last years, they are still not standard practice for flood risk assessments, and even less so for flood loss modelling. State of the art in flood loss modelling is still the use of simple, deterministic approaches like stage-damage functions. Novel probabilistic, multi-variate flood loss models have been developed and validated on the micro-scale using a data-mining approach, namely bagging decision trees (Merz et al. 2013). In this presentation we demonstrate and evaluate the upscaling of the approach to the meso-scale, namely on the basis of land-use units. The model is applied in 19 municipalities which were affected during the 2002 flood by the River Mulde in Saxony, Germany (Botto et al. submitted). The application of bagging decision tree based loss models provides a probability distribution of estimated loss per municipality. Validation is undertaken on the one hand via a comparison with eight deterministic loss models, including stage-damage functions as well as multi-variate models. On the other hand, the results are compared with official loss data provided by the Saxon Relief Bank (SAB). The results show that uncertainties of loss estimation remain high. Thus, the significant advantage of this probabilistic flood loss estimation approach is that it inherently provides quantitative information about the uncertainty of the prediction. References: Merz, B.; Kreibich, H.; Lall, U. (2013): Multi-variate flood damage assessment: a tree-based data-mining approach. NHESS, 13(1), 53-64. Botto A, Kreibich H, Merz B, Schröter K (submitted) Probabilistic, multi-variable flood loss modelling on the meso-scale with BT-FLEMO. Risk Analysis.

  5. Probabilistic composite analysis

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.; Murthy, P. L. N.

    1991-01-01

    Formal procedures are described which are used to computationally simulate the probabilistic behavior of composite structures. The computational simulation starts with the uncertainties associated with all aspects of a composite structure (constituents, fabrication, assembling, etc.) and encompasses all aspects of composite behavior (micromechanics, macromechanics, combined stress failure, laminate theory, structural response, and tailoring) optimization. Typical cases are included to illustrate the formal procedure for computational simulation. The collective results of the sample cases demonstrate that uncertainties in composite behavior and structural response can be probabilistically quantified.

  6. What do we gain with Probabilistic Flood Loss Models?

    NASA Astrophysics Data System (ADS)

    Schroeter, K.; Kreibich, H.; Vogel, K.; Merz, B.; Lüdtke, S.

    2015-12-01

    The reliability of flood loss models is a prerequisite for their practical usefulness. Oftentimes, traditional uni-variate damage models such as depth-damage curves fail to reproduce the variability of observed flood damage. Innovative multi-variate probabilistic modelling approaches are promising to capture and quantify the uncertainty involved and thus to improve the basis for decision making. In this study we compare the predictive capability of two probabilistic modelling approaches, namely Bagging Decision Trees and Bayesian Networks, and traditional stage-damage functions cast in a probabilistic framework. For model evaluation we use empirical damage data available from computer-aided telephone interviews that were compiled after the floods of 2002, 2005, 2006 and 2013 in the Elbe and Danube catchments in Germany. We carry out a split-sample test by sub-setting the damage records: one sub-set is used to derive the models and the remaining records are used to evaluate their predictive performance. Further, we stratify the sample according to catchments, which allows studying model performance in a spatial-transfer context. Flood damage estimation is carried out on the scale of individual buildings in terms of relative damage. The predictive performance of the models is assessed in terms of systematic deviations (mean bias), precision (mean absolute error) and reliability, represented by the proportion of observations that fall within the 5%-95% predictive interval. The reliability of the probabilistic predictions within validation runs decreases only slightly and achieves a very good coverage of observations within the predictive interval. Probabilistic models provide quantitative information about prediction uncertainty, which is crucial to assess the reliability of model predictions and improves the usefulness of model results.

  7. The desert moss Pterygoneurum lamellatum (Pottiaceae) exhibits an inducible ecological strategy of desiccation tolerance: effects of rate of drying on shoot damage and regeneration

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Premise of the study: Bryophytes are regarded as a clade incorporating constitutive desiccation tolerance, especially terrestrial species. Here we test the hypothesis that the opposing ecological strategy of desiccation tolerance, inducibility, is present in a desert moss, and addressed by varying r...

  8. Probabilistic Threshold Criterion

    SciTech Connect

    Gresshoff, M; Hrousis, C A

    2010-03-09

    The Probabilistic Shock Threshold Criterion (PSTC) Project at LLNL develops phenomenological criteria for estimating safety or performance margin on high explosive (HE) initiation in the shock initiation regime, creating tools for safety assessment and design of initiation systems and HE trains in general. Until recently, there has been little foundation for probabilistic assessment of HE initiation scenarios. This work attempts to use probabilistic information that is available from both historic and ongoing tests to develop a basis for such assessment. Current PSTC approaches start with the functional form of the James Initiation Criterion as a backbone, and generalize to include varying areas of initiation and provide a probabilistic response based on test data for 1.8 g/cc (Ultrafine) 1,3,5-triamino-2,4,6-trinitrobenzene (TATB) and LX-17 (92.5% TATB, 7.5% Kel-F 800 binder). Application of the PSTC methodology is presented investigating the safety and performance of a flying plate detonator and the margin of an Ultrafine TATB booster initiating LX-17.

  9. Probabilistic Safety Assessment of Tehran Research Reactor

    SciTech Connect

    Hosseini, Seyed Mohammad Hadi; Nematollahi, Mohammad Reza; Sepanloo, Kamran

    2004-07-01

    Probabilistic Safety Assessment (PSA) application is found to be a practical tool for research reactor safety due to the intense involvement of human interactions in an experimental facility. In this paper the application of Probabilistic Safety Assessment to the Tehran Research Reactor (TRR) is presented. The level 1 PSA application involved: familiarization with the plant, selection of accident initiators, mitigating functions and system definitions, event tree construction and quantification, fault tree construction and quantification, human reliability, component failure database development, and dependent failure analysis. Each of the steps of the analysis given above is discussed with highlights from the selected results. Quantification of the constructed models is done using SAPHIRE software. This study shows that the obtained core damage frequency for the Tehran Research Reactor (8.368E-6 per year) well meets the IAEA criterion for existing nuclear power plants (1E-4). Nevertheless, safety improvement suggestions are offered to reduce the frequency of the most probable accidents. (authors)
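    The level-1 quantification reduces, at its core, to simple arithmetic over event-tree sequences: each accident-sequence frequency is the initiating-event frequency times the failure probabilities of the mitigating functions along that branch, and the core damage frequency is their sum. The sequences and numbers below are invented for illustration (they are not TRR data).

```python
# Minimal level-1 PSA arithmetic with invented example sequences.
sequences = [
    # (initiator frequency /yr, [failure probs of mitigating functions])
    (1e-2, [1e-3, 5e-2]),   # e.g. loss-of-flow branch, two failed functions
    (5e-3, [2e-4]),         # e.g. LOCA with failed emergency cooling
    (1e-1, [1e-4, 3e-2]),   # e.g. loss-of-offsite-power branch
]

def sequence_frequency(init_freq, failure_probs):
    """Frequency of one event-tree sequence: initiator x branch failures."""
    f = init_freq
    for p in failure_probs:
        f *= p
    return f

core_damage_frequency = sum(sequence_frequency(f, ps) for f, ps in sequences)
meets_iaea_criterion = core_damage_frequency < 1e-4   # existing-plant criterion
```

Tools such as SAPHIRE automate exactly this bookkeeping (plus fault-tree logic, cut sets, and dependent failures) over thousands of sequences.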

  10. Probabilistic Models for Solar Particle Events

    NASA Technical Reports Server (NTRS)

    Adams, James H., Jr.; Dietrich, W. F.; Xapsos, M. A.; Welton, A. M.

    2009-01-01

    Probabilistic Models of Solar Particle Events (SPEs) are used in space mission design studies to provide a description of the worst-case radiation environment that the mission must be designed to tolerate. The models determine the worst-case environment using a description of the mission and a user-specified confidence level that the provided environment will not be exceeded. This poster will focus on completing the existing suite of models by developing models for peak flux and event-integrated fluence elemental spectra for the Z>2 elements. It will also discuss methods to take into account uncertainties in the database and the uncertainties resulting from the limited number of solar particle events in the database. These new probabilistic models are based on an extensive survey of SPE measurements of peak and event-integrated elemental differential energy spectra. Attempts are made to fit the measured spectra with eight different published models. The model giving the best fit to each spectrum is chosen and used to represent that spectrum for any energy in the energy range covered by the measurements. The set of all such spectral representations for each element is then used to determine the worst-case spectrum as a function of confidence level. The spectral representation that best fits these worst-case spectra is found and its dependence on confidence level is parameterized. This procedure creates probabilistic models for the peak and event-integrated spectra.
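    The "worst case as a function of confidence level" step can be illustrated as a per-energy quantile over a set of historical event spectra. The synthetic event fluences below (lognormal, decaying with energy) are stand-ins; real models would also fit parametric spectral forms rather than take raw empirical quantiles.

```python
import random

random.seed(2)

# Synthetic event-integrated fluences for one element at fixed energies,
# one row per historical SPE (units and magnitudes are illustrative).
energies_mev = [10, 30, 100]
events = [[random.lognormvariate(8 - 0.02 * e, 1.0) for e in energies_mev]
          for _ in range(40)]

def worst_case_spectrum(confidence):
    """Fluence not exceeded with the given confidence, per energy bin."""
    spectrum = []
    for j in range(len(energies_mev)):
        col = sorted(ev[j] for ev in events)
        k = min(int(confidence * len(col)), len(col) - 1)
        spectrum.append(col[k])
    return spectrum

s90 = worst_case_spectrum(0.90)   # 90% confidence design environment
s50 = worst_case_spectrum(0.50)   # median environment
```

Raising the confidence level monotonically raises the design environment at every energy, which is the behavior mission designers trade against shielding mass.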

  11. Probabilistic authenticated quantum dialogue

    NASA Astrophysics Data System (ADS)

    Hwang, Tzonelih; Luo, Yi-Ping

    2015-12-01

    This work proposes a probabilistic authenticated quantum dialogue (PAQD) based on Bell states with the following notable features. (1) In our proposed scheme, the dialogue is encoded in a probabilistic way, i.e., the same messages can be encoded into different quantum states, whereas in the state-of-the-art authenticated quantum dialogue (AQD), the dialogue is encoded in a deterministic way. (2) The pre-shared secret key between the two communicants can be reused without any security loophole. (3) Each dialogue in the proposed PAQD can be exchanged within only one-step quantum communication and one-step classical communication, whereas in the state-of-the-art AQD protocols, both communicants have to run a QKD protocol for each dialogue, and each dialogue requires multiple quantum as well as classical communication steps. (4) The proposed scheme can nevertheless resist the man-in-the-middle attack, the modification attack, and other well-known attacks.

  12. Geothermal probabilistic cost study

    NASA Astrophysics Data System (ADS)

    Orren, L. H.; Ziman, G. M.; Jones, S. C.; Lee, T. K.; Noll, R.; Wilde, L.; Sadanand, V.

    1981-08-01

    A tool is presented to quantify the risks of geothermal projects, the Geothermal Probabilistic Cost Model (GPCM). The GPCM model was used to evaluate a geothermal reservoir for a binary-cycle electric plant at Heber, California. Three institutional aspects of the geothermal risk which can shift the risk among different agents were analyzed. The leasing of geothermal land, contracting between the producer and the user of the geothermal heat, and insurance against faulty performance were examined.

  13. Geothermal probabilistic cost study

    NASA Technical Reports Server (NTRS)

    Orren, L. H.; Ziman, G. M.; Jones, S. C.; Lee, T. K.; Noll, R.; Wilde, L.; Sadanand, V.

    1981-01-01

    A tool is presented to quantify the risks of geothermal projects, the Geothermal Probabilistic Cost Model (GPCM). The GPCM model was used to evaluate a geothermal reservoir for a binary-cycle electric plant at Heber, California. Three institutional aspects of the geothermal risk which can shift the risk among different agents were analyzed. The leasing of geothermal land, contracting between the producer and the user of the geothermal heat, and insurance against faulty performance were examined.

  14. Geothermal probabilistic cost study

    SciTech Connect

    Orren, L.H.; Ziman, G.M.; Jones, S.C.; Lee, T.K.; Noll, R.; Wilde, L.; Sadanand, V.

    1981-08-01

    A tool is presented to quantify the risks of geothermal projects, the Geothermal Probabilistic Cost Model (GPCM). The GPCM model is used to evaluate a geothermal reservoir for a binary-cycle electric plant at Heber, California. Three institutional aspects of the geothermal risk which can shift the risk among different agents are analyzed. The leasing of geothermal land, contracting between the producer and the user of the geothermal heat, and insurance against faulty performance are examined. (MHR)

  15. Probabilistic river forecast methodology

    NASA Astrophysics Data System (ADS)

    Kelly, Karen Suzanne

    1997-09-01

    The National Weather Service (NWS) operates deterministic conceptual models to predict the hydrologic response of a river basin to precipitation. The output from these models are forecasted hydrographs (time series of the future river stage) at certain locations along a river. In order for the forecasts to be useful for optimal decision making, the uncertainty associated with them must be quantified. A methodology is developed for this purpose that (i) can be implemented with any deterministic hydrologic model, (ii) receives a probabilistic forecast of precipitation as input, (iii) quantifies all sources of uncertainty, (iv) operates in real-time and within computing constraints, and (v) produces probability distributions of future river stages. The Bayesian theory which supports the methodology involves transformation of a distribution of future precipitation into one of future river stage, and statistical characterization of the uncertainty in the hydrologic model. This is accomplished by decomposing total uncertainty into that associated with future precipitation and that associated with the hydrologic transformations. These are processed independently and then integrated into a predictive distribution which constitutes a probabilistic river stage forecast. A variety of models are presented for implementation of the methodology. In the most general model, a probability of exceedance associated with a given future hydrograph is specified. In the simplest model, a probability of exceedance associated with a given future river stage is specified. In conjunction with the Ohio River Forecast Center of the NWS, the simplest model is used to demonstrate the feasibility of producing probabilistic river stage forecasts for a river basin located in headwaters. Previous efforts to quantify uncertainty in river forecasting have only considered selected sources of uncertainty, been specific to a particular hydrologic model, or have not obtained an entire probability
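    The decomposition described above (precipitation uncertainty processed independently from hydrologic-model uncertainty, then integrated) can be sketched with a Monte Carlo mixture. The rainfall distribution, the stand-in stage model, and the error magnitude below are all invented for illustration; the NWS conceptual models are far more elaborate.

```python
import random

random.seed(3)

# (1) Input uncertainty: a probabilistic quantitative precipitation forecast.
def sample_precip_mm():
    return random.choices([0.0, 10.0, 25.0, 50.0],
                          weights=[0.4, 0.3, 0.2, 0.1])[0]

# (2) Deterministic hydrologic model (stand-in): stage rises with rainfall.
def deterministic_stage_ft(precip_mm):
    return 5.0 + 0.15 * precip_mm

# (3) Hydrologic uncertainty: additive model-error term (assumed Gaussian).
def sample_stage_ft():
    p = sample_precip_mm()
    return deterministic_stage_ft(p) + random.gauss(0.0, 0.5)

# Integrate both sources into one predictive stage distribution.
stages = sorted(sample_stage_ft() for _ in range(20000))
p_exceed_11 = sum(s > 11.0 for s in stages) / len(stages)  # flood-stage prob.
median_stage = stages[len(stages) // 2]
```

The output is exactly the kind of probability-of-exceedance statement the simplest model in the methodology delivers (here, roughly the 10% chance contributed by the heavy-rain scenario).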

  16. Probabilistic simple splicing systems

    NASA Astrophysics Data System (ADS)

    Selvarajoo, Mathuri; Heng, Fong Wan; Sarmin, Nor Haniza; Turaev, Sherzod

    2014-06-01

    A splicing system, one of the early theoretical models for DNA computing, was introduced by Head in 1987. Splicing systems are based on the splicing operation which, informally, cuts two strings of DNA molecules at specific recognition sites and attaches the prefix of the first string to the suffix of the second string, and the prefix of the second string to the suffix of the first string, thus yielding new strings. For a specific type of splicing system, namely the simple splicing system, the recognition sites are the same for both strings of DNA molecules. It is known that splicing systems with finite sets of axioms and splicing rules only generate regular languages. Hence, different types of restrictions have been considered for splicing systems in order to increase their computational power. Recently, probabilistic splicing systems have been introduced, where probabilities are initially associated with the axioms, and the probabilities of the generated strings are computed from the probabilities of the initial strings. In this paper, some properties of probabilistic simple splicing systems are investigated. We prove that probabilistic simple splicing systems can also increase the computational power of the splicing languages generated.
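    One probabilistic simple-splicing step can be illustrated as follows. The recognition site, the axioms, and the rule "a generated string gets the product of its parents' probabilities" are all invented for the example; the paper's actual probability scheme may differ.

```python
SITE = "GA"   # single recognition site, shared by both strings (simple splicing)

def splice(x, y, site):
    """Cut x and y after the first occurrence of `site` and cross over."""
    i, j = x.find(site), y.find(site)
    if i < 0 or j < 0:
        return []
    cut_x, cut_y = i + len(site), j + len(site)
    return [x[:cut_x] + y[cut_y:], y[:cut_y] + x[cut_x:]]

# Axioms with initial probabilities (hypothetical).
axioms = {"AAGATT": 0.6, "CCGAGG": 0.4}

# Apply one splicing step to every ordered pair of axioms; assign each
# generated string the product of its parents' probabilities (one common
# convention), keeping the maximum over derivations.
language = dict(axioms)
for x, px in axioms.items():
    for y, py in axioms.items():
        for w in splice(x, y, SITE):
            language[w] = max(language.get(w, 0.0), px * py)
```

Iterating this step to a fixed point would enumerate the (weighted) language; here one step already yields the two crossover strings AAGAGG and CCGATT.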

  17. Tolerating Zero Tolerance?

    ERIC Educational Resources Information Center

    Moore, Brian N.

    2010-01-01

    The concept of zero tolerance dates back to the mid-1990s when New Jersey was creating laws to address nuisance crimes in communities. The main goal of these neighborhood crime policies was to have zero tolerance for petty crime such as graffiti or littering so as to keep more serious crimes from occurring. Next came the war on drugs. In federal…

  18. Probabilistic Tsunami Hazard Analysis

    NASA Astrophysics Data System (ADS)

    Thio, H. K.; Ichinose, G. A.; Somerville, P. G.; Polet, J.

    2006-12-01

    The recent tsunami disaster caused by the 2004 Sumatra-Andaman earthquake has focused our attention on the hazard posed by large earthquakes that occur under water, in particular subduction zone earthquakes, and the tsunamis that they generate. Even though these kinds of events are rare, the very large loss of life and material destruction caused by this earthquake warrant a significant effort towards the mitigation of the tsunami hazard. For ground motion hazard, Probabilistic Seismic Hazard Analysis (PSHA) has become a standard practice in the evaluation and mitigation of seismic hazard to populations, in particular with respect to structures, infrastructure and lifelines. Its ability to condense the complexities and variability of seismic activity into a manageable set of parameters greatly facilitates the design of effective seismic-resistant buildings but also the planning of infrastructure projects. Probabilistic Tsunami Hazard Analysis (PTHA) achieves the same goal for hazards posed by tsunami. There are great advantages to implementing such a method to evaluate the total risk (seismic and tsunami) to coastal communities. The method that we have developed is based on the traditional PSHA and therefore completely consistent with standard seismic practice. Because of the strong dependence of tsunami wave heights on bathymetry, we use a full-waveform tsunami computation in lieu of the attenuation relations that are common in PSHA. By pre-computing and storing the tsunami waveforms at points along the coast generated for sets of subfaults that comprise larger earthquake faults, we can efficiently synthesize tsunami waveforms for any slip distribution on those faults by summing the individual subfault tsunami waveforms (weighted by their slip). This efficiency makes it feasible to use Green's function summation in lieu of attenuation relations to provide very accurate estimates of tsunami height for probabilistic calculations, where one typically computes
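    The Green's function summation described above is linear algebra: the waveform for any slip distribution is the slip-weighted sum of precomputed unit-slip subfault waveforms. The subfault waveforms and slip values below are synthetic placeholders.

```python
# Precomputed unit-slip tsunami waveforms at one coastal point
# (subfault id -> wave height time series, synthetic values).
greens = {
    "sf1": [0.00, 0.05, 0.20, 0.10],
    "sf2": [0.00, 0.02, 0.08, 0.15],
    "sf3": [0.00, 0.00, 0.04, 0.09],
}

def synthesize(slip_m):
    """Waveform for an arbitrary slip distribution over the stored subfaults."""
    n = len(next(iter(greens.values())))
    wave = [0.0] * n
    for sf, slip in slip_m.items():
        for t, amp in enumerate(greens[sf]):
            wave[t] += slip * amp     # linear superposition, weighted by slip
    return wave

scenario = {"sf1": 2.0, "sf2": 1.0, "sf3": 0.5}   # slip in metres (invented)
waveform = synthesize(scenario)
peak_height = max(waveform)
```

Because the expensive propagation step is done once per subfault, thousands of rupture scenarios in the probabilistic calculation cost only these cheap weighted sums.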

  19. Probabilistic Failure Assessment For Fatigue

    NASA Technical Reports Server (NTRS)

    Moore, Nicholas; Ebbeler, Donald; Newlin, Laura; Sutharshana, Sravan; Creager, Matthew

    1995-01-01

    Probabilistic Failure Assessment for Fatigue (PFAFAT) is a package of software that applies probabilistic failure-assessment (PFA) methodology to model high- and low-cycle-fatigue modes of failure of structural components. It consists of nine programs. Three programs perform probabilistic fatigue analysis by means of Monte Carlo simulation. The other six are used for generating random processes, characterizing fatigue-life data pertaining to materials, and processing the outputs of computational simulations. Written in FORTRAN 77.
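    The Monte Carlo fatigue analysis at the heart of such a package can be sketched in a few lines: sample a fatigue life from a scattered S-N model and count how often it falls short of the service requirement. The Basquin-type model and every number below are assumptions for illustration, not PFAFAT's actual models.

```python
import math
import random

random.seed(4)

# Basquin-type S-N model with lognormal scatter: N = A * S**(-m),
# where A is lognormally distributed (all constants invented).
A_MEDIAN, M_EXP, LOG_SD = 1e12, 3.0, 0.4
STRESS_MPA = 300.0
SERVICE_CYCLES = 2e4

def sample_life():
    """Draw one fatigue life (cycles) from the scattered S-N model."""
    a = A_MEDIAN * math.exp(random.gauss(0.0, LOG_SD))
    return a * STRESS_MPA ** (-M_EXP)

trials = 50000
failures = sum(sample_life() < SERVICE_CYCLES for _ in range(trials))
p_failure = failures / trials
```

Here the median life (~3.7e4 cycles) exceeds the 2e4-cycle requirement, yet the scatter still produces a failure probability of several percent, which is exactly the kind of result a deterministic safety-factor check would miss.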

  20. Asteroid Risk Assessment: A Probabilistic Approach.

    PubMed

    Reinhardt, Jason C; Chen, Xi; Liu, Wenhao; Manchev, Petar; Paté-Cornell, M Elisabeth

    2016-02-01

    Following the 2013 Chelyabinsk event, the risks posed by asteroids attracted renewed interest, from both the scientific and policy-making communities. It reminded the world that impacts from near-Earth objects (NEOs), while rare, have the potential to cause great damage to cities and populations. Point estimates of the risk (such as mean numbers of casualties) have been proposed, but because of the low-probability, high-consequence nature of asteroid impacts, these averages provide limited actionable information. While more work is needed to further refine its input distributions (e.g., NEO diameters), the probabilistic model presented in this article allows a more complete evaluation of the risk of NEO impacts because the results are distributions that cover the range of potential casualties. This model is based on a modularized simulation that uses probabilistic inputs to estimate probabilistic risk metrics, including those of rare asteroid impacts. Illustrative results of this analysis are presented for a period of 100 years. As part of this demonstration, we assess the effectiveness of civil defense measures in mitigating the risk of human casualties. We find that they are likely to be beneficial but not a panacea. We also compute the probability-but not the consequences-of an impact with global effects ("cataclysm"). We conclude that there is a continued need for NEO observation, and for analyses of the feasibility and risk-reduction effectiveness of space missions designed to deflect or destroy asteroids that threaten the Earth.
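    The modularized simulation described above (probabilistic inputs in, a casualty distribution out) can be caricatured as follows. Every rate and consequence model below is invented purely to show the structure; the article's input distributions are far more carefully constructed.

```python
import random

random.seed(5)

YEARS = 100
IMPACT_RATE_PER_YEAR = 0.02    # sizeable NEO impacts per year (assumed)

def casualties_for_one_impact():
    # Most impacts strike ocean or unpopulated land; a few strike cities.
    if random.random() < 0.03:                      # urban strike (assumed)
        return int(random.lognormvariate(7, 1.5))   # heavy-tailed consequences
    return random.randint(0, 50)

def simulate_century():
    # Yearly Bernoulli trials approximate a Poisson arrival process.
    n_impacts = sum(random.random() < IMPACT_RATE_PER_YEAR
                    for _ in range(YEARS))
    return sum(casualties_for_one_impact() for _ in range(n_impacts))

totals = sorted(simulate_century() for _ in range(10000))
median_casualties = totals[len(totals) // 2]
p95_casualties = totals[int(0.95 * len(totals))]
```

The point of reporting the whole distribution (median, 95th percentile, tail) rather than the mean is visible immediately: the rare urban strikes dominate the upper quantiles while barely moving the median.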

  2. Probabilistic inspection strategies for minimizing service failures

    NASA Technical Reports Server (NTRS)

    Brot, Abraham

    1994-01-01

    The INSIM computer program is described which simulates the 'limited fatigue life' environment in which aircraft structures generally operate. The use of INSIM to develop inspection strategies which aim to minimize service failures is demonstrated. Damage-tolerance methodology, inspection thresholds and customized inspections are simulated using the probability of failure as the driving parameter.
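    An INSIM-style experiment can be sketched as a Monte Carlo over crack growth with scheduled inspections, using probability of failure as the driving parameter. The growth law, detection threshold, probability of detection, and all intervals below are invented for illustration.

```python
import random

random.seed(6)

# Assumed parameters (illustrative only).
FLIGHTS = 3000            # service life in flights
CRITICAL_MM = 25.0        # crack size at residual-strength failure
DETECTABLE_MM = 5.0       # smallest crack an inspection can find
POD = 0.9                 # probability of detection per inspection

def service_life_fails(inspection_interval):
    crack = 0.5                                        # initial flaw, mm
    for flight in range(1, FLIGHTS + 1):
        crack *= 1.003 * random.uniform(0.999, 1.001)  # noisy exponential growth
        if crack >= CRITICAL_MM:
            return True                                # failure between checks
        if flight % inspection_interval == 0:
            if crack >= DETECTABLE_MM and random.random() < POD:
                crack = 0.5                            # detected and repaired
    return False

def prob_failure(interval, trials=1000):
    return sum(service_life_fails(interval) for _ in range(trials)) / trials

p_frequent = prob_failure(300)    # inspect every 300 flights
p_rare = prob_failure(1500)       # inspect every 1500 flights
```

With these numbers the crack reaches critical size in roughly 1300 flights, so the 1500-flight interval fails every trial, while the 300-flight schedule keeps the failure probability to a few percent; sweeping the interval against a target probability of failure is precisely how an inspection strategy is tuned.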

  3. Probabilistic Seismic Hazard Analysis

    SciTech Connect

    Not Available

    1988-01-01

    The purpose of Probabilistic Seismic Hazard Analysis (PSHA) is to evaluate the hazard of seismic ground motion at a site by considering all possible earthquakes in the area, estimating the associated shaking at the site, and calculating the probabilities of these occurrences. The Panel on Seismic Hazard Analysis is charged with assessment of the capabilities, limitations, and future trends of PSHA in the context of alternatives. The report identifies and discusses key issues of PSHA and is addressed to decision makers with a modest scientific and technical background and to the scientific and technical community. 37 refs., 19 figs.
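    The PSHA computation summarized above (all possible earthquakes, their shaking at the site, and the resulting probabilities) reduces to a hazard-curve integral. The sketch below uses two invented point sources and an invented lognormal ground-motion model; a real analysis integrates over magnitude-distance distributions and epistemic alternatives.

```python
import math

# (annual rate of damaging events, median PGA at the site in g, log std. dev.)
sources = [
    (0.05, 0.10, 0.6),
    (0.01, 0.25, 0.6),
]

def p_exceed_given_event(x, median, beta):
    """P(ground motion > x | event), lognormal attenuation scatter."""
    z = (math.log(x) - math.log(median)) / beta
    return 0.5 * math.erfc(z / math.sqrt(2.0))

def annual_exceedance_rate(x):
    """Hazard curve ordinate: sum over sources of rate x P(exceed | event)."""
    return sum(rate * p_exceed_given_event(x, med, b)
               for rate, med, b in sources)

lam = annual_exceedance_rate(0.2)        # annual rate of exceeding 0.2 g
p50yr = 1.0 - math.exp(-lam * 50.0)      # Poisson: P(at least one in 50 yr)
```

Evaluating `annual_exceedance_rate` over a grid of ground-motion levels traces the hazard curve that design codes then read off at a target probability (e.g. 10% in 50 years).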

  4. Probabilistic Cellular Automata

    PubMed Central

    Agapie, Alexandru; Giuclea, Marius

    2014-01-01

    Cellular automata are binary lattices used for modeling complex dynamical systems. The automaton evolves iteratively from one configuration to another, using some local transition rule based on the number of ones in the neighborhood of each cell. With respect to the number of cells allowed to change per iteration, we speak of either synchronous or asynchronous automata. If randomness is involved to some degree in the transition rule, we speak of probabilistic automata; otherwise they are called deterministic. Whichever type of cellular automaton we are dealing with, the main theoretical challenge remains the same: starting from an arbitrary initial configuration, predict (with highest accuracy) the end configuration. If the automaton is deterministic, the outcome simplifies to one of two configurations, all zeros or all ones. If the automaton is probabilistic, the whole process is modeled by a finite homogeneous Markov chain, and the outcome is the corresponding stationary distribution. Based on our previous results for the asynchronous case—connecting the probability of a configuration in the stationary distribution to its number of zero-one borders—the article offers both numerical and theoretical insight into the long-term behavior of synchronous cellular automata. PMID:24999557
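    A minimal synchronous probabilistic CA in the spirit of the abstract: each cell becomes 1 with probability equal to the fraction of ones in its three-cell neighborhood. This particular rule (chosen here for illustration, not taken from the article) makes the number of ones a martingale, so the chain is absorbed in all-zeros or all-ones with probability equal to the initial density of ones.

```python
import random

random.seed(7)

N = 31   # ring lattice size

def step(cfg):
    """One synchronous update: P(cell -> 1) = ones in neighborhood / 3."""
    new = []
    for i in range(N):
        ones = cfg[i - 1] + cfg[i] + cfg[(i + 1) % N]
        new.append(1 if random.random() < ones / 3.0 else 0)
    return new

def run_to_absorption(cfg, max_steps=50000):
    for _ in range(max_steps):
        if sum(cfg) in (0, N):
            return sum(cfg) // N      # 0 -> all zeros, 1 -> all ones
        cfg = step(cfg)
    return None

# Fraction of runs absorbed in all-ones from random half-full starts.
outcomes = [run_to_absorption([random.randint(0, 1) for _ in range(N)])
            for _ in range(100)]
frac_all_ones = sum(o == 1 for o in outcomes) / len(outcomes)
```

Starting from density ~0.5, roughly half the runs end all-ones, matching the martingale prediction; the empirical distribution over end states is exactly the stationary-distribution question the article studies.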

  5. Quantum probabilistic logic programming

    NASA Astrophysics Data System (ADS)

    Balu, Radhakrishnan

    2015-05-01

    We describe a quantum mechanics based logic programming language that supports Horn clauses, random variables, and covariance matrices to express and solve problems in probabilistic logic. The Horn clauses of the language wrap random variables, including infinite-valued ones, to express probability distributions and statistical correlations, a powerful feature to capture relationships between distributions that are not independent. The expressive power of the language is based on a mechanism to implement statistical ensembles and to solve the underlying SAT instances using quantum mechanical machinery. We exploit the fact that classical random variables have quantum decompositions to build the Horn clauses. We establish the semantics of the language in a rigorous fashion by considering an existing probabilistic logic language called PRISM with classical probability measures defined on the Herbrand base and extending it to the quantum context. In the classical case, H-interpretations form the sample space and probability measures defined on them lead to a consistent definition of probabilities for well-formed formulae. In the quantum counterpart, we define probability amplitudes on H-interpretations, facilitating model generation and verification via quantum mechanical superpositions and entanglements. We cast the well-formed formulae of the language as quantum mechanical observables, thus providing an elegant interpretation for their probabilities. We discuss several examples to combine statistical ensembles and predicates of first order logic to reason with situations involving uncertainty.

  6. Topics in Probabilistic Judgment Aggregation

    ERIC Educational Resources Information Center

    Wang, Guanchun

    2011-01-01

    This dissertation is a compilation of several studies that are united by their relevance to probabilistic judgment aggregation. In the face of complex and uncertain events, panels of judges are frequently consulted to provide probabilistic forecasts, and aggregation of such estimates in groups often yield better results than could have been made…

  7. Passage Retrieval: A Probabilistic Technique.

    ERIC Educational Resources Information Center

    Melucci, Massimo

    1998-01-01

    Presents a probabilistic technique to retrieve passages from texts having a large size or heterogeneous semantic content. Results of experiments comparing the probabilistic technique to one based on a text segmentation algorithm revealed that the passage size affects passage retrieval performance; text organization and query generality may have an…

  8. Probabilistic retinal vessel segmentation

    NASA Astrophysics Data System (ADS)

    Wu, Chang-Hua; Agam, Gady

    2007-03-01

    Optic fundus assessment is widely used for diagnosing vascular and non-vascular pathology. Inspection of the retinal vasculature may reveal hypertension, diabetes, arteriosclerosis, cardiovascular disease and stroke. Due to various imaging conditions retinal images may be degraded. Consequently, the enhancement of such images and vessels in them is an important task with direct clinical applications. We propose a novel technique for vessel enhancement in retinal images that is capable of enhancing vessel junctions in addition to linear vessel segments. This is an extension of vessel filters we have previously developed for vessel enhancement in thoracic CT scans. The proposed approach is based on probabilistic models which can discern vessels and junctions. Evaluation shows the proposed filter is better than several known techniques and is comparable to the state of the art when evaluated on a standard dataset. A ridge-based vessel tracking process is applied on the enhanced image to demonstrate the effectiveness of the enhancement filter.

  9. Probabilistic Fiber Composite Micromechanics

    NASA Technical Reports Server (NTRS)

    Stock, Thomas A.

    1996-01-01

    Probabilistic composite micromechanics methods are developed that simulate expected uncertainties in unidirectional fiber composite properties. These methods are in the form of computational procedures using Monte Carlo simulation. The variables in which uncertainties are accounted for include constituent and void volume ratios, constituent elastic properties and strengths, and fiber misalignment. A graphite/epoxy unidirectional composite (ply) is studied to demonstrate fiber composite material property variations induced by random changes expected at the material micro level. Regression results are presented to show the relative correlation between predictor and response variables in the study. These computational procedures make possible a formal description of anticipated random processes at the intra-ply level, and the related effects of these on composite properties.
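
    The Monte Carlo procedure sketched in this abstract can be illustrated with a toy rule-of-mixtures example (Python; the distributions, means, and coefficients of variation below are illustrative assumptions, not the values used in the study). Uncertain primitive variables are sampled, the micromechanics relation is evaluated per sample, and the scatter in the ply property follows:

```python
import random
import statistics

def sample_ply_modulus(rng):
    """One Monte Carlo realization of the longitudinal ply modulus E11
    (GPa) via the rule of mixtures E11 = Vf*Ef + (1 - Vf)*Em, with
    scatter in the constituent properties and fiber volume ratio.
    Means and spreads are illustrative, not the report's values."""
    Ef = rng.gauss(230.0, 230.0 * 0.05)             # fiber modulus
    Em = rng.gauss(3.5, 3.5 * 0.05)                 # matrix modulus
    Vf = min(max(rng.gauss(0.60, 0.02), 0.0), 1.0)  # fiber volume ratio
    return Vf * Ef + (1.0 - Vf) * Em

rng = random.Random(42)
samples = [sample_ply_modulus(rng) for _ in range(20000)]
mean = statistics.fmean(samples)
cv = statistics.stdev(samples) / mean
print(f"E11 mean ≈ {mean:.1f} GPa, CV ≈ {cv:.1%}")
```

    The simulated coefficient of variation shows how micro-level scatter propagates to the ply property; a regression of samples against the sampled inputs would recover the predictor/response correlations the abstract mentions.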

  10. Probabilistic Mesomechanical Fatigue Model

    NASA Technical Reports Server (NTRS)

    Tryon, Robert G.

    1997-01-01

    A probabilistic mesomechanical fatigue life model is proposed to link microstructural material heterogeneities to the statistical scatter in the macrostructural response. The macrostructure is modeled as an ensemble of microelements. Cracks nucleate within the microelements and grow from the microelements to final fracture. Variations of the microelement properties are defined using statistical parameters. A micromechanical slip band decohesion model is used to determine the crack nucleation life and size. A crack tip opening displacement model is used to determine the small crack growth life and size. The Paris law is used to determine the long crack growth life. The models are combined in a Monte Carlo simulation to determine the statistical distribution of total fatigue life for the macrostructure. The modeled response is compared to trends in experimental observations from the literature.
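
    The life-summation structure (nucleation life plus crack growth life, sampled in a Monte Carlo loop) can be sketched as follows. This is an illustrative toy: a closed-form Paris-law integration and lognormal scatter stand in for the slip-band decohesion and small-crack models, and all parameter values are assumptions:

```python
import math
import random

def paris_life(a_i, a_f, C, m, dstress):
    """Cycles to grow a crack from a_i to a_f under the Paris law
    da/dN = C * (dK)^m with dK = dstress * sqrt(pi * a) (geometry
    factor taken as 1 for this sketch), integrated in closed form."""
    k = C * dstress**m * math.pi**(m / 2.0)
    e = 1.0 - m / 2.0
    return (a_f**e - a_i**e) / (k * e)

def total_life(rng):
    """Nucleation life plus long-crack growth life for one sampled
    microstructure (lognormal nucleation life and initial crack size
    are illustrative stand-ins for the micromechanical models)."""
    n_nucleation = rng.lognormvariate(math.log(5e4), 0.4)
    a_i = rng.lognormvariate(math.log(25e-6), 0.3)   # initial crack, m
    return n_nucleation + paris_life(a_i, 5e-3, C=1e-11, m=3.0,
                                     dstress=200.0)

rng = random.Random(1)
lives = sorted(total_life(rng) for _ in range(5000))
b10 = lives[len(lives) // 10]   # 10%-failure quantile
print(f"median ≈ {lives[2500]:.3g} cycles, B10 ≈ {b10:.3g} cycles")
```

    The sorted sample gives the whole life distribution, so scatter statistics (median, B10, and so on) come directly from the Monte Carlo ensemble, mirroring the abstract's approach.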

  11. Novel probabilistic neuroclassifier

    NASA Astrophysics Data System (ADS)

    Hong, Jiang; Serpen, Gursel

    2003-09-01

    A novel probabilistic potential function neural network classifier algorithm to deal with classes which are multi-modally distributed and formed from sets of disjoint pattern clusters is proposed in this paper. The proposed classifier has a number of desirable properties which distinguish it from other neural network classifiers. A complete description of the algorithm in terms of its architecture and the pseudocode is presented. Simulation analysis of the newly proposed neuro-classifier algorithm on a set of benchmark problems is presented. Benchmark problems tested include IRIS, Sonar, Vowel Recognition, Two-Spiral, Wisconsin Breast Cancer, Cleveland Heart Disease and Thyroid Gland Disease. Simulation results indicate that the proposed neuro-classifier performs consistently better for a subset of problems for which other neural classifiers perform relatively poorly.
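
    A potential-function classifier of the general flavor described here can be sketched in a few lines (an illustrative Parzen-style toy, not the paper's algorithm; the Gaussian kernel and its width are assumptions). Each stored pattern contributes a "potential", and summing potentials per class naturally handles classes formed from disjoint clusters:

```python
import math

def potential(x, center, sigma=0.5):
    """Gaussian potential contributed by one stored pattern."""
    d2 = sum((a - b) ** 2 for a, b in zip(x, center))
    return math.exp(-d2 / (2.0 * sigma ** 2))

def classify(x, train):
    """Assign x to the class with the largest summed potential.
    Because the score is a sum over all stored patterns, a class made
    of several disjoint clusters (multi-modal) is handled naturally."""
    scores = {}
    for center, label in train:
        scores[label] = scores.get(label, 0.0) + potential(x, center)
    return max(scores, key=scores.get)

# Class "A" is deliberately bimodal: two clusters far apart.
train = [((0.0, 0.0), "A"), ((0.1, 0.0), "A"),
         ((5.0, 5.0), "A"), ((5.1, 5.0), "A"),
         ((2.5, 2.5), "B"), ((2.6, 2.4), "B")]
print(classify((5.05, 5.0), train))  # near A's second mode → "A"
print(classify((2.5, 2.6), train))   # near B's cluster → "B"
```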

  12. Probabilistic brains: knowns and unknowns

    PubMed Central

    Pouget, Alexandre; Beck, Jeffrey M; Ma, Wei Ji; Latham, Peter E

    2015-01-01

    There is strong behavioral and physiological evidence that the brain both represents probability distributions and performs probabilistic inference. Computational neuroscientists have started to shed light on how these probabilistic representations and computations might be implemented in neural circuits. One particularly appealing aspect of these theories is their generality: they can be used to model a wide range of tasks, from sensory processing to high-level cognition. To date, however, these theories have only been applied to very simple tasks. Here we discuss the challenges that will emerge as researchers start focusing their efforts on real-life computations, with a focus on probabilistic learning, structural learning and approximate inference. PMID:23955561
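
    A standard textbook illustration of the kind of probabilistic computation discussed here (not drawn from this article) is optimal combination of two independent Gaussian cues, where the posterior precision is the sum of the cue precisions and the posterior mean is the precision-weighted average:

```python
def combine_gaussian_cues(mu1, var1, mu2, var2):
    """Bayes-optimal fusion of two independent Gaussian cues about the
    same quantity: precisions (inverse variances) add, and the mean is
    the precision-weighted average of the cue means."""
    w1, w2 = 1.0 / var1, 1.0 / var2
    var = 1.0 / (w1 + w2)
    mu = var * (w1 * mu1 + w2 * mu2)
    return mu, var

# A reliable cue (variance 1) and a noisy cue (variance 4) about the
# same location: the posterior is pulled toward the reliable cue.
mu, var = combine_gaussian_cues(0.0, 1.0, 4.0, 4.0)
print(mu, var)  # → 0.8 0.8
```

    Behavioral evidence that observers weight cues by their reliability in just this way is one of the lines of support for probabilistic representation that the review surveys.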

  13. Probabilistic Design of Composite Structures

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.

    2006-01-01

    A formal procedure for the probabilistic design evaluation of a composite structure is described. The uncertainties in all aspects of a composite structure (constituent material properties, fabrication variables, structural geometry, and service environments, etc.), which result in the uncertain behavior in the composite structural responses, are included in the evaluation. The probabilistic evaluation consists of: (1) design criteria, (2) modeling of composite structures and uncertainties, (3) simulation methods, and (4) the decision-making process. A sample case is presented to illustrate the formal procedure and to demonstrate that composite structural designs can be probabilistically evaluated with accuracy and efficiency.

  14. Probabilistic Open Set Recognition

    NASA Astrophysics Data System (ADS)

    Jain, Lalit Prithviraj

    Real-world tasks in computer vision, pattern recognition and machine learning often touch upon the open set recognition problem: multi-class recognition with incomplete knowledge of the world and many unknown inputs. An obvious way to approach such problems is to develop a recognition system that thresholds probabilities to reject unknown classes. Traditional rejection techniques are not about the unknown; they are about the uncertain boundary and rejection around that boundary. Thus traditional techniques only represent the "known unknowns". However, a proper open set recognition algorithm is needed to reduce the risk from the "unknown unknowns". This dissertation examines this concept and finds that existing probabilistic multi-class recognition approaches are ineffective for true open set recognition. We hypothesize that the cause is weak ad hoc assumptions combined with the closed-world assumptions made by existing calibration techniques. Intuitively, if we could accurately model just the positive data for any known class without overfitting, we could reject the large set of unknown classes even under this assumption of incomplete class knowledge. For this, we formulate the problem as one of modeling positive training data by invoking statistical extreme value theory (EVT) near the decision boundary of positive data with respect to negative data. We provide a new algorithm called the PI-SVM for estimating the unnormalized posterior probability of class inclusion. This dissertation also introduces a new open set recognition model called Compact Abating Probability (CAP), where the probability of class membership decreases in value (abates) as points move from known data toward open space. We show that CAP models improve open set recognition for multiple algorithms.
Leveraging the CAP formulation, we go on to describe the novel Weibull-calibrated SVM (W-SVM) algorithm, which combines the useful properties of statistical EVT for score calibration with one-class and binary
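
    The abating-probability idea can be sketched as a toy (illustrative Python; the Weibull-shaped decay, its parameters, and the data are assumptions, and this is not the W-SVM itself). Membership probability decays with distance from the known training data, and an input whose best abated probability falls below a threshold is rejected as unknown:

```python
import math

def cap_probability(x, centers, tau=1.0, kappa=2.0):
    """CAP-style score: class-membership probability abates as a point
    moves away from the class's known training data (here, its nearest
    stored exemplar), via a Weibull-shaped decay. tau and kappa are
    illustrative scale/shape parameters."""
    d = min(math.dist(x, c) for c in centers)
    return math.exp(-((d / tau) ** kappa))

def open_set_classify(x, classes, threshold=0.5):
    """Return the best class, or 'unknown' if no class's abated
    probability clears the threshold: open-space inputs are rejected."""
    best = max(classes, key=lambda k: cap_probability(x, classes[k]))
    if cap_probability(x, classes[best]) < threshold:
        return "unknown"
    return best

classes = {"cat": [(0.0, 0.0), (0.5, 0.2)], "dog": [(5.0, 5.0)]}
print(open_set_classify((0.2, 0.1), classes))   # close to cat data
print(open_set_classify((20.0, -7.0), classes)) # far from everything
```

    A closed-world classifier would force the far-away point into "cat" or "dog"; the abating model instead lets its probability decay toward zero in open space.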

  15. Probabilistic Risk Assessment: A Bibliography

    NASA Technical Reports Server (NTRS)

    2000-01-01

    Probabilistic risk analysis is an integration of failure modes and effects analysis (FMEA), fault tree analysis, and other techniques to assess the potential for failure and to find ways to reduce risk. This bibliography references 160 documents in the NASA STI Database that contain the major concepts (probabilistic risk assessment, risk, and probability theory) in the basic index or among the major subject terms. An abstract is included with most citations, followed by the applicable subject terms.

  16. Air exposure behavior of the semiterrestrial crab Neohelice granulata allows tolerance to severe hypoxia but does not prevent oxidative damage due to the hypoxia-reoxygenation cycle.

    PubMed

    de Lima, Tábata Martins; Geihs, Márcio Alberto; Nery, Luiz Eduardo Maia; Maciel, Fábio Everton

    2015-11-01

    The air exposure behavior of the semi-terrestrial crab Neohelice granulata during severe hypoxia was studied. This study also verified whether this behavior mitigates possible oxidative damage, namely lipoperoxidation, caused by hypoxia and reoxygenation cycles. The lethal time for 50% of the crabs subjected to severe hypoxia (0.5 mgO2 · L(-1)) with free access to air was compared to that of crabs subjected to severe hypoxia without access to air. Crabs were placed in aquaria divided into three zones: water (when the animal was fully submersed), land (when the animal was completely emerged) and intermediate (when the animal was in contact with both environments) zones. Then the crabs were held in this condition for 270 min, and the time spent in each zone was recorded. Lipid peroxidation (LPO) damage to the walking leg muscles was determined for the following four experimental conditions: a--normoxic water with free access to air; b--hypoxic water without access to air; c--hypoxic water followed by normoxic water without air access; and d--hypoxic water with free access to air. When exposed to hypoxic water, N. granulata spent significantly more time on land, 135.3 ± 17.7 min, whereas control animals (exposed to normoxic water) spent more time submerged, 187.4 ± 20.2 min. By this behavior, N. granulata was able to maintain a 100% survival rate when exposed to severe hypoxia. However, N. granulata must still return to water after periods of air exposure (~ 14 min), causing a sequence of hypoxia/reoxygenation events. Despite increasing the survival rate, hypoxia with air access does not decrease the lipid peroxidation damage caused by the hypoxia and reoxygenation cycle experienced by these crabs.

  17. Probabilistic theories with purification

    SciTech Connect

    Chiribella, Giulio; D'Ariano, Giacomo Mauro; Perinotti, Paolo

    2010-06-15

    We investigate general probabilistic theories in which every mixed state has a purification, unique up to reversible channels on the purifying system. We show that the purification principle is equivalent to the existence of a reversible realization of every physical process, that is, to the fact that every physical process can be regarded as arising from a reversible interaction of the system with an environment, which is eventually discarded. From the purification principle we also construct an isomorphism between transformations and bipartite states that possesses all structural properties of the Choi-Jamiolkowski isomorphism in quantum theory. Such an isomorphism allows one to prove most of the basic features of quantum theory, like, e.g., existence of pure bipartite states giving perfect correlations in independent experiments, no information without disturbance, no joint discrimination of all pure states, no cloning, teleportation, no programming, no bit commitment, complementarity between correctable channels and deletion channels, characterization of entanglement-breaking channels as measure-and-prepare channels, and others, without resorting to the mathematical framework of Hilbert spaces.

  18. PROBABILISTIC INFORMATION INTEGRATION TECHNOLOGY

    SciTech Connect

    J. BOOKER; M. MEYER; ET AL

    2001-02-01

    The Statistical Sciences Group at Los Alamos has successfully developed a structured, probabilistic, quantitative approach for the evaluation of system performance based on multiple information sources, called Information Integration Technology (IIT). The technology integrates diverse types and sources of data and information (both quantitative and qualitative), and their associated uncertainties, to develop distributions for performance metrics, such as reliability. Applications include predicting complex system performance, where test data are lacking or expensive to obtain, through the integration of expert judgment, historical data, computer/simulation model predictions, and any relevant test/experimental data. The technology is particularly well suited for tracking estimated system performance for systems under change (e.g. development, aging), and can be used at any time during product development, including concept and early design phases, prior to prototyping, testing, or production, and before costly design decisions are made. Techniques from various disciplines (e.g., state-of-the-art expert elicitation, statistical and reliability analysis, design engineering, physics modeling, and knowledge management) are merged and modified to develop formal methods for the data/information integration. The power of this technology, known as PREDICT (Performance and Reliability Evaluation with Diverse Information Combination and Tracking), won a 1999 R and D 100 Award (Meyer, Booker, Bement, Kerscher, 1999). Specifically the PREDICT application is a formal, multidisciplinary process for estimating the performance of a product when test data are sparse or nonexistent. The acronym indicates the purpose of the methodology: to evaluate the performance or reliability of a product/system by combining all available (often diverse) sources of information and then tracking that performance as the product undergoes changes.
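
    A minimal example of combining a prior (e.g., elicited expert judgment) with sparse test data, in the conjugate-Bayes style such integration often uses (illustrative sketch; the Beta-Binomial model and all numbers are assumptions, not the PREDICT methodology itself):

```python
def beta_update(alpha, beta, successes, failures):
    """Conjugate Bayesian update of a Beta(alpha, beta) reliability
    prior with new pass/fail data; the posterior mean is the updated
    reliability estimate."""
    a, b = alpha + successes, beta + failures
    return a, b, a / (a + b)

# Prior encoding expert judgment, roughly '90% reliable' (illustrative):
a, b = 9.0, 1.0
# Sparse test data arrive: 4 passes, 1 failure.
a, b, r = beta_update(a, b, successes=4, failures=1)
print(f"posterior mean reliability ≈ {r:.3f}")  # → 0.867
```

    Repeating the update as each new information source arrives gives the kind of performance tracking through design changes that the abstract describes, with the posterior distribution (not just its mean) carrying the uncertainty.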

  19. Probabilistic Prognosis of Non-Planar Fatigue Crack Growth

    NASA Technical Reports Server (NTRS)

    Leser, Patrick E.; Newman, John A.; Warner, James E.; Leser, William P.; Hochhalter, Jacob D.; Yuan, Fuh-Gwo

    2016-01-01

    Quantifying the uncertainty in model parameters for the purpose of damage prognosis can be accomplished utilizing Bayesian inference and damage diagnosis data from sources such as non-destructive evaluation or structural health monitoring. The number of samples required to solve the Bayesian inverse problem through common sampling techniques (e.g., Markov chain Monte Carlo) renders high-fidelity finite element-based damage growth models unusable due to prohibitive computation times. However, these types of models are often the only option when attempting to model complex damage growth in real-world structures. Here, a recently developed high-fidelity crack growth model is used which, when compared to finite element-based modeling, has demonstrated reductions in computation times of three orders of magnitude through the use of surrogate models and machine learning. The model is flexible in that only the expensive computation of the crack driving forces is replaced by the surrogate models, leaving the remaining parameters accessible for uncertainty quantification. A probabilistic prognosis framework incorporating this model is developed and demonstrated for non-planar crack growth in a modified, edge-notched, aluminum tensile specimen. Predictions of remaining useful life are made over time for five updates of the damage diagnosis data, and prognostic metrics are utilized to evaluate the performance of the prognostic framework. Challenges specific to the probabilistic prognosis of non-planar fatigue crack growth are highlighted and discussed in the context of the experimental results.

  20. Vagueness as Probabilistic Linguistic Knowledge

    NASA Astrophysics Data System (ADS)

    Lassiter, Daniel

    Consideration of the metalinguistic effects of utterances involving vague terms has led Barker [1] to treat vagueness using a modified Stalnakerian model of assertion. I present a sorites-like puzzle for factual beliefs in the standard Stalnakerian model [28] and show that it can be resolved by enriching the model to make use of probabilistic belief spaces. An analogous problem arises for metalinguistic information in Barker's model, and I suggest that a similar enrichment is needed here as well. The result is a probabilistic theory of linguistic representation that retains a classical metalanguage but avoids the undesirable divorce between meaning and use inherent in the epistemic theory [34]. I also show that the probabilistic approach provides a plausible account of the sorites paradox and higher-order vagueness and that it fares well empirically and conceptually in comparison to leading competitors.

  1. Quantifying the risks of winter damage on overwintering crops under future climates: Will low-temperature damage be more likely in warmer climates?

    NASA Astrophysics Data System (ADS)

    Vico, G.; Weih, M.

    2014-12-01

    Autumn-sown crops act as winter cover crop, reducing soil erosion and nutrient leaching, while potentially providing higher yields than spring varieties in many environments. Nevertheless, overwintering crops are exposed for longer periods to the vagaries of weather conditions. Adverse winter conditions, in particular, may negatively affect the final yield, by reducing crop survival or its vigor. The net effect of the projected shifts in climate is unclear. On the one hand, warmer temperatures may reduce the frequency of low temperatures, thereby reducing damage risk. On the other hand, warmer temperatures, by reducing plant acclimation level and the amount and duration of snow cover, may increase the likelihood of damage. Thus, warmer climates may paradoxically result in more extensive low temperature damage and reduced viability for overwintering plants. The net effect of a shift in climate is explored by means of a parsimonious probabilistic model, based on a coupled description of air temperature, snow cover, and crop tolerable temperature. Exploiting an extensive dataset of winter wheat responses to low temperature exposure, the risk of winter damage occurrence is quantified under conditions typical of northern temperate latitudes. The full spectrum of variations expected with climate change is explored, quantifying the joint effects of alterations in temperature averages and their variability as well as shifts in precipitation. The key features affecting winter wheat vulnerability to low temperature damage under future climates are singled out.
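
    The paradox can be made concrete with a toy version of such a coupled model (illustrative Python; the Gaussian daily minimum temperature, the snow-insulation factor, and the tolerable temperatures are assumptions, not the paper's parameterization). The warmer scenario is given less snow cover and a less acclimated, less cold-tolerant crop:

```python
import random

def winter_damage_probability(t_air_mean, t_air_sd, snow_prob,
                              snow_insulation, t_tolerable,
                              n_days=120, n_runs=4000, seed=0):
    """Monte Carlo sketch of the coupled model: daily minimum air
    temperature is Gaussian; snow cover (present on a given day with
    probability snow_prob) attenuates the cold felt at crown level;
    damage occurs if crown temperature ever drops below the crop's
    tolerable temperature. All parameter values are illustrative."""
    rng = random.Random(seed)
    damaged = 0
    for _ in range(n_runs):
        for _ in range(n_days):
            t_air = rng.gauss(t_air_mean, t_air_sd)
            insulated = rng.random() < snow_prob
            t_crop = t_air * (snow_insulation if insulated else 1.0)
            if t_crop < t_tolerable:
                damaged += 1
                break
    return damaged / n_runs

# Cold, snowy winter with a well-acclimated crop (tolerates -18 °C):
cold = winter_damage_probability(-5.0, 5.0, snow_prob=0.8,
                                 snow_insulation=0.3, t_tolerable=-18.0)
# Warmer winter: milder air, but little snow and weaker acclimation:
warm = winter_damage_probability(-2.0, 5.0, snow_prob=0.2,
                                 snow_insulation=0.3, t_tolerable=-15.0)
print(f"cold/snowy: {cold:.2f}, warm/low-snow: {warm:.2f}")
```

    With these illustrative numbers the warmer, less snowy winter produces the higher damage probability, which is exactly the paradox the abstract describes.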

  2. Probabilistic reasoning in data analysis.

    PubMed

    Sirovich, Lawrence

    2011-09-20

    This Teaching Resource provides lecture notes, slides, and a student assignment for a lecture on probabilistic reasoning in the analysis of biological data. General probabilistic frameworks are introduced, and a number of standard probability distributions are described using simple intuitive ideas. Particular attention is focused on random arrivals that are independent of prior history (Markovian events), with an emphasis on waiting times, Poisson processes, and Poisson probability distributions. The use of these various probability distributions is applied to biomedical problems, including several classic experimental studies.
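
    The Markovian-arrivals material lends itself to a compact demonstration (illustrative Python, in the spirit of the lecture rather than taken from it): memoryless exponential waiting times generate a Poisson count, and simulated frequencies match the Poisson probability distribution:

```python
import math
import random

def poisson_pmf(k, lam):
    """P(N = k) for a Poisson-distributed count with mean lam."""
    return math.exp(-lam) * lam**k / math.factorial(k)

rng = random.Random(0)
rate, T = 2.0, 1.0   # 2 arrivals per unit time, observed for T = 1

def count_arrivals():
    """Count arrivals in [0, T] using exponential waiting times."""
    t, n = 0.0, 0
    while True:
        t += rng.expovariate(rate)   # memoryless waiting time
        if t > T:
            return n
        n += 1

counts = [count_arrivals() for _ in range(20000)]
freq0 = counts.count(0) / len(counts)
print(f"simulated P(N=0) ≈ {freq0:.3f}, "
      f"Poisson pmf gives {poisson_pmf(0, rate * T):.3f}")
```

    Here the simulated fraction of intervals with no arrival converges to e^(-2) ≈ 0.135, the Poisson pmf at zero, illustrating the waiting-time/count duality the lecture emphasizes.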

  3. 7 CFR 51.2280 - Tolerances for grade defects.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    … Tolerances for grade defects: total defects; serious damage; very serious damage; shell and foreign material … 7 Agriculture 2 (2010-01-01): Tolerances for grade defects, § 51.2280 … (STANDARDS) United States Standards for Shelled English Walnuts (Juglans Regia), Tolerances for Grade …

  4. 7 CFR 51.2280 - Tolerances for grade defects.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    … Tolerances for grade defects: total defects; serious damage; very serious damage; shell and foreign material … 7 Agriculture 2 (2011-01-01): Tolerances for grade defects, § 51.2280 … (STANDARDS) United States Standards for Shelled English Walnuts (Juglans Regia), Tolerances for Grade …

  5. Generalization in probabilistic RAM nets.

    PubMed

    Clarkson, T G; Guan, Y; Taylor, J G; Gorse, D

    1993-01-01

    The probabilistic RAM (pRAM) is a hardware-realizable neural device which is stochastic in operation and highly nonlinear. Even small nets of pRAMs offer high levels of functionality. The means by which a pRAM network generalizes when trained in noise is shown and the results of this behavior are described.
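
    A minimal pRAM-style neuron can be sketched as follows (illustrative Python; the class layout and random initialization are assumptions, and the hardware realization and training of real pRAMs are not reproduced). Each of the 2^n memory locations stores a firing probability, which is what makes the device stochastic and highly nonlinear:

```python
import random

class PRAM:
    """Probabilistic RAM neuron: a lookup table indexed by the binary
    input vector; each memory location stores a firing *probability*
    rather than a fixed bit, so the output is stochastic."""

    def __init__(self, n_inputs, rng):
        self.rng = rng
        self.memory = [rng.random() for _ in range(2 ** n_inputs)]

    def fire(self, inputs):
        """Address the memory with the binary inputs and fire (output 1)
        with the stored probability."""
        addr = int("".join(str(b) for b in inputs), 2)
        return 1 if self.rng.random() < self.memory[addr] else 0

rng = random.Random(3)
neuron = PRAM(2, rng)
# Make the location addressed by input (1, 0) fire 95% of the time.
neuron.memory[0b10] = 0.95
fires = sum(neuron.fire((1, 0)) for _ in range(10000))
print(fires / 10000)  # ≈ 0.95
```

    Because any probability table can be stored, even a single pRAM realizes functions far beyond a deterministic threshold unit, consistent with the high functionality per device the abstract notes.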

  6. Probabilistic assessment of composite structures

    NASA Technical Reports Server (NTRS)

    Shiao, Michael E.; Abumeri, Galib H.; Chamis, Christos C.

    1993-01-01

    A general computational simulation methodology for an integrated probabilistic assessment of composite structures is discussed and demonstrated using aircraft fuselage (stiffened composite cylindrical shell) structures with rectangular cutouts. The computational simulation was performed for the probabilistic assessment of the structural behavior including buckling loads, vibration frequencies, global displacements, and local stresses. The scatter in the structural response is simulated based on the inherent uncertainties in the primitive (independent random) variables at the fiber/matrix constituent, ply, laminate, and structural scales that describe the composite structures. The effect of uncertainties due to fabrication process variables such as fiber volume ratio, void volume ratio, ply orientation, and ply thickness is also included. The methodology has been embedded in the computer code IPACS (Integrated Probabilistic Assessment of Composite Structures). In addition to the simulated scatter, the IPACS code also calculates the sensitivity of the composite structural behavior to all the primitive variables that influence the structural behavior. This information is useful for assessing reliability and providing guidance for improvement. The results from the probabilistic assessment for the composite structure with rectangular cutouts indicate that the uncertainty in the longitudinal ply stress is mainly caused by the uncertainty in the laminate thickness, and the large overlap of the scatter in the first four buckling loads implies that the buckling mode shape for a specific buckling load can be any of the four modes.

  7. Research on probabilistic information processing

    NASA Technical Reports Server (NTRS)

    Edwards, W.

    1973-01-01

    The work accomplished on probabilistic information processing (PIP) is reported. The research proposals and decision analysis are discussed along with the results of research on MSC setting, multiattribute utilities, and Bayesian research. Abstracts of reports concerning the PIP research are included.

  8. Enhanced probabilistic microcell prediction model

    NASA Astrophysics Data System (ADS)

    Kim, Song-Kyoo

    2005-06-01

    A microcell is a cell with a radius of 1 km or less, suitable not only for heavily urbanized areas such as a metropolitan city but also for in-building areas such as offices and shopping malls. This paper deals with a microcell propagation-loss prediction model focused on the in-building solution, analyzed by probabilistic techniques. The RSL (Receive Signal Level) is the factor used to evaluate the performance of a microcell, and the LOS (Line-Of-Sight) component and the blockage loss directly affect the RSL. A combination of probabilistic methods is applied to obtain these performance factors. The mathematical methods include the CLT (Central Limit Theorem) and SSQC (Six-Sigma Quality Control) to obtain the parameters of the distribution. This probabilistic solution gives a compact measure of the performance factors. In addition, it supports probabilistic optimization of strategies such as the number of cells, cell location, cell capacity, cell range, and so on. Finally, the optimal antenna allocation for a building can be obtained using this model.
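
    The CLT-based part of such a model reduces to Gaussian tail arithmetic: if the many independent dB-scale loss terms (LOS component, blockage, penetration, and so on) make the RSL approximately normal, the outage probability at a receiver threshold is a single CDF evaluation (illustrative Python; the mean, scatter, and threshold are assumed numbers, not the paper's):

```python
import math

def normal_cdf(x, mu, sigma):
    """Gaussian cumulative distribution function via the error function."""
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

def outage_probability(rsl_mean_dbm, rsl_sigma_db, threshold_dbm):
    """P(RSL < threshold) when, by the CLT, the summed dB-scale loss
    terms make the RSL approximately Gaussian."""
    return normal_cdf(threshold_dbm, rsl_mean_dbm, rsl_sigma_db)

# Illustrative numbers: mean RSL -85 dBm, 8 dB scatter, -102 dBm
# receiver sensitivity; a six-sigma style design target would push
# this probability far lower.
p = outage_probability(-85.0, 8.0, -102.0)
print(f"outage probability ≈ {p:.4f}")
```

    Sweeping the mean RSL (i.e., cell placement and antenna allocation) against a fixed outage target is one simple way such a model supports the optimization strategies the abstract lists.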

  9. Designing Probabilistic Tasks for Kindergartners

    ERIC Educational Resources Information Center

    Skoumpourdi, Chrysanthi; Kafoussi, Sonia; Tatsis, Konstantinos

    2009-01-01

    Recent research suggests that children could be engaged in probability tasks at an early age and task characteristics seem to play an important role in the way children perceive an activity. To this direction in the present article we investigate the role of some basic characteristics of probabilistic tasks in their design and implementation. In…

  10. DNA Damage Response

    PubMed Central

    Giglia-Mari, Giuseppina; Zotter, Angelika; Vermeulen, Wim

    2011-01-01

    Structural changes to DNA severely affect its functions, such as replication and transcription, and play a major role in age-related diseases and cancer. A complicated and entangled network of DNA damage response (DDR) mechanisms, including multiple DNA repair pathways, damage tolerance processes, and cell-cycle checkpoints safeguard genomic integrity. Like transcription and replication, DDR is a chromatin-associated process that is generally tightly controlled in time and space. As DNA damage can occur at any time on any genomic location, a specialized spatio-temporal orchestration of this defense apparatus is required. PMID:20980439

  11. A probabilistic Hu-Washizu variational principle

    NASA Technical Reports Server (NTRS)

    Liu, W. K.; Belytschko, T.; Besterfield, G. H.

    1987-01-01

    A Probabilistic Hu-Washizu Variational Principle (PHWVP) for the Probabilistic Finite Element Method (PFEM) is presented. This formulation is developed for both linear and nonlinear elasticity. The PHWVP allows incorporation of the probabilistic distributions for the constitutive law, compatibility condition, equilibrium, domain and boundary conditions into the PFEM. Thus, a complete probabilistic analysis can be performed where all aspects of the problem are treated as random variables and/or fields. The Hu-Washizu variational formulation is available in many conventional finite element codes thereby enabling the straightforward inclusion of the probabilistic features into present codes.

  12. 7 CFR 51.2954 - Tolerances for grade defects.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    … not more than 6 pct which are damaged by mold or insects or seriously damaged by other means, of which not more than 5/6 or 5 pct may be damaged by insects; but no part of any tolerance shall be allowed for walnuts containing live insects. No tolerance to reduce the required 70 pct of “light …

  13. 7 CFR 51.2954 - Tolerances for grade defects.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    … not more than 6 pct which are damaged by mold or insects or seriously damaged by other means, of which not more than 5/6 or 5 pct may be damaged by insects; but no part of any tolerance shall be allowed for walnuts containing live insects. No tolerance to reduce the required 70 pct of “light …

  14. Probabilistic structural analysis of space propulsion system LOX post

    NASA Technical Reports Server (NTRS)

    Newell, J. F.; Rajagopal, K. R.; Ho, H. W.; Cunniff, J. M.

    1990-01-01

    The probabilistic structural analysis program NESSUS (Numerical Evaluation of Stochastic Structures Under Stress; Cruse et al., 1988) is applied to characterize the dynamic loading and response of the Space Shuttle main engine (SSME) LOX post. The design and operation of the SSME are reviewed; the LOX post structure is described; and particular attention is given to the generation of composite load spectra, the finite-element model of the LOX post, and the steps in the NESSUS structural analysis. The results are presented in extensive tables and graphs, and it is shown that NESSUS correctly predicts the structural effects of changes in the temperature loading. The probabilistic approach also facilitates (1) damage assessments for a given failure model (based on gas temperature, heat-shield gap, and material properties) and (2) correlation of the gas temperature with operational parameters such as engine thrust.

  15. Probabilistic fatigue methodology and wind turbine reliability

    SciTech Connect

    Lange, C.H.

    1996-05-01

    Wind turbines subjected to highly irregular loadings due to wind, gravity, and gyroscopic effects are especially vulnerable to fatigue damage. The objective of this study is to develop and illustrate methods for the probabilistic analysis and design of fatigue-sensitive wind turbine components. A computer program (CYCLES) that estimates fatigue reliability of structural and mechanical components has been developed. A FORM/SORM analysis is used to compute failure probabilities and importance factors of the random variables. The limit state equation includes uncertainty in environmental loading, gross structural response, and local fatigue properties. Several techniques are shown to better study fatigue loads data. Common one-parameter models, such as the Rayleigh and exponential models, are shown to produce dramatically different estimates of load distributions and fatigue damage. Improved fits may be achieved with the two-parameter Weibull model. High b values require better modeling of relatively large stress ranges; this is effectively done by matching at least two moments (Weibull) and better by matching still higher moments. For this purpose, a new, four-moment "generalized Weibull" model is introduced. Load and resistance factor design (LRFD) methodology for design against fatigue is proposed and demonstrated using data from two horizontal-axis wind turbines. To estimate fatigue damage, wind turbine blade loads have been represented by their first three statistical moments across a range of wind conditions. Based on the moments μ1…μ3, new "quadratic Weibull" load distribution models are introduced. The fatigue reliability is found to be notably affected by the choice of load distribution model.
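
    The two-moment Weibull matching mentioned above, and its effect on damage through a high fatigue exponent b, can be sketched as follows (illustrative Python; the moment values, b, and the S-N coefficient K are assumptions, and the four-moment and quadratic Weibull models are not reproduced):

```python
import math

def weibull_from_moments(mean, std):
    """Two-parameter Weibull (scale c, shape k) matched to the first
    two moments of the stress-range data, solving
    CV^2 = Gamma(1 + 2/k) / Gamma(1 + 1/k)^2 - 1 for k by bisection."""
    cv2 = (std / mean) ** 2
    lo, hi = 0.1, 50.0
    for _ in range(100):
        k = 0.5 * (lo + hi)
        g1, g2 = math.gamma(1 + 1 / k), math.gamma(1 + 2 / k)
        if g2 / g1**2 - 1 > cv2:
            lo = k          # CV decreases as the shape k grows
        else:
            hi = k
    c = mean / math.gamma(1 + 1 / k)
    return c, k

def expected_damage_per_cycle(c, k, b, K):
    """Miner's-rule damage rate E[S^b] / K for S ~ Weibull(c, k); a high
    fatigue exponent b weights the large stress ranges very strongly."""
    return c**b * math.gamma(1 + b / k) / K

c, k = weibull_from_moments(mean=50.0, std=25.0)
print(f"scale ≈ {c:.1f}, shape ≈ {k:.2f}")
dmg = expected_damage_per_cycle(c, k, b=10.0, K=1e22)
print(f"expected damage per cycle ≈ {dmg:.3g}")
```

    Because E[S^b] = c^b Γ(1 + b/k) for a Weibull stress-range distribution, damage estimates at high b are dominated by the fitted tail; this is why one-parameter fits can be badly off and why matching still higher moments helps.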

  16. Impact damage in composite laminates

    NASA Technical Reports Server (NTRS)

    Grady, Joseph E.

    1988-01-01

    Damage tolerance requirements have become an important consideration in the design and fabrication of composite structural components for modern aircraft. The ability of a component to contain a flaw of a given size without serious loss of its structural integrity is of prime concern. Composite laminates are particularly susceptible to damage caused by transverse impact loading. The ongoing program described is aimed at developing experimental and analytical methods that can be used to assess damage tolerance capabilities in composite structures subjected to impulsive loading. Some significant results of this work and the methodology used to obtain them are outlined.

  17. Environmental probabilistic quantitative assessment methodologies

    USGS Publications Warehouse

    Crovelli, R.A.

    1995-01-01

    In this paper, four petroleum resource assessment methodologies are presented as possible pollution assessment methodologies, even though petroleum as a resource is desirable, whereas pollution is undesirable. A methodology is defined in this paper to consist of a probability model and a probabilistic method, where the method is used to solve the model. The following four basic types of probability models are considered: 1) direct assessment, 2) accumulation size, 3) volumetric yield, and 4) reservoir engineering. Three of the four petroleum resource assessment methodologies were written as microcomputer systems, viz. TRIAGG for direct assessment, APRAS for accumulation size, and FASPU for reservoir engineering. A fourth microcomputer system termed PROBDIST supports the three assessment systems. The three assessment systems have different probability models but the same type of probabilistic method. The advantages of the analytic method are its computational speed and flexibility, which make it ideal for a microcomputer.

  18. Probabilistic Simulation for Nanocomposite Characterization

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.; Coroneos, Rula M.

    2007-01-01

    A unique probabilistic theory is described to predict the properties of nanocomposites. The simulation is based on composite micromechanics with progressive substructuring down to a nanoscale slice of a nanofiber where all the governing equations are formulated. These equations have been programmed in a computer code. That computer code is used to simulate the uniaxial strength properties of a mononanofiber laminate. The results are presented graphically and discussed with respect to their practical significance. These results show smooth distributions.

  19. Probabilistic Cloning and Quantum Computation

    NASA Astrophysics Data System (ADS)

    Gao, Ting; Yan, Feng-Li; Wang, Zhi-Xi

    2004-06-01

    We discuss the usefulness of quantum cloning and present examples of quantum computation tasks for which the cloning offers an advantage which cannot be matched by any approach that does not resort to quantum cloning. In these quantum computations, we need to distribute quantum information contained in the states about which we have some partial information. To perform quantum computations, we use a state-dependent probabilistic quantum cloning procedure to distribute quantum information in the middle of a quantum computation.

  20. Distribution functions of probabilistic automata

    NASA Technical Reports Server (NTRS)

    Vatan, F.

    2001-01-01

    Each probabilistic automaton M over an alphabet A defines a probability measure Prob_M on the set of all finite and infinite words over A. We can identify a k-letter alphabet A with the set {0, 1,..., k-1}, and, hence, we can consider every finite or infinite word w over A as a radix-k expansion of a real number X(w) in the interval [0, 1]. This makes X(w) a random variable, and the distribution function of M is defined as usual: F(x) := Prob_M { w: X(w) < x }. Utilizing the fixed-point semantics (denotational semantics), extended to probabilistic computations, we investigate the distribution functions of probabilistic automata in detail. Automata with continuous distribution functions are characterized. By a new and much easier method, it is shown that the distribution function F(x) is an analytic function if it is a polynomial. Finally, answering a question posed by D. Knuth and A. Yao, we show that a polynomial distribution function F(x) on [0, 1] can be generated by a probabilistic automaton iff all the roots of F'(x) = 0 in this interval, if any, are rational numbers. For this, we define two dynamical systems on the set of polynomial distributions and study attracting fixed points of random composition of these two systems.
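
    The mapping from words to reals can be made concrete with a minimal sketch (a memoryless two-letter automaton, assumed here purely for illustration): each emitted word w is read as the radix-2 expansion of X(w), and the distribution function F(x) is estimated empirically:

```python
import random

def sample_X(p_zero, length, rng):
    """Emit one word of the given length from a memoryless binary automaton
    (digit 0 with probability p_zero) and map it to X(w) in [0, 1]
    via the radix-2 expansion X(w) = sum(w_i * 2**-(i+1))."""
    x = 0.0
    for i in range(length):
        bit = 0 if rng.random() < p_zero else 1
        x += bit * 2.0 ** -(i + 1)
    return x

def empirical_F(p_zero, x, n=20000, length=30, seed=1):
    """Monte Carlo estimate of the distribution function F(x) = Prob{X(w) < x}."""
    rng = random.Random(seed)
    return sum(sample_X(p_zero, length, rng) < x for _ in range(n)) / n
```

    For a fair coin (p_zero = 0.5) the digits are i.i.d. fair bits, so X is uniform on [0, 1] and F(x) ≈ x; biasing the digits produces the non-uniform distribution functions studied in the record.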

  1. Astrocyte-mediated ischemic tolerance.

    PubMed

    Hirayama, Yuri; Ikeda-Matsuo, Yuri; Notomi, Shoji; Enaida, Hiroshi; Kinouchi, Hiroyuki; Koizumi, Schuichi

    2015-03-01

    Preconditioning (PC) using a preceding sublethal ischemic insult is an attractive strategy for protecting neurons by inducing ischemic tolerance in the brain. Although the underlying molecular mechanisms have been extensively studied, almost all studies have focused on neurons. Here, using a middle cerebral artery occlusion model in mice, we show that astrocytes play an essential role in the induction of brain ischemic tolerance. PC caused activation of glial cells without producing any noticeable brain damage. The spatiotemporal pattern of astrocytic, but not microglial, activation correlated well with that of ischemic tolerance. Interestingly, such activation in astrocytes lasted at least 8 weeks. Importantly, inhibiting astrocytes with fluorocitrate abolished the induction of ischemic tolerance. To investigate the underlying mechanisms, we focused on the P2X7 receptor as a key molecule in astrocyte-mediated ischemic tolerance. P2X7 receptors were dramatically upregulated in activated astrocytes. PC-induced ischemic tolerance was abolished in P2X7 receptor knock-out mice. Moreover, our results suggest that hypoxia-inducible factor-1α, a well known mediator of ischemic tolerance, is involved in P2X7 receptor-mediated ischemic tolerance. Unlike previous reports focusing on neuron-based mechanisms, our results show that astrocytes play indispensable roles in inducing ischemic tolerance, and that upregulation of P2X7 receptors in astrocytes is essential. PMID:25740510

  2. Composite Structures Damage Tolerance Analysis Methodologies

    NASA Technical Reports Server (NTRS)

    Chang, James B.; Goyal, Vinay K.; Klug, John C.; Rome, Jacob I.

    2012-01-01

    This report presents the results of a literature review performed as part of the development of composite hardware fracture control guidelines, funded by the NASA Engineering and Safety Center (NESC) under contract NNL04AA09B. The objectives of the overall development tasks are to provide a broad information base and database to the designers, analysts, and test personnel engaged in space flight hardware production.

  3. Probabilistic Computational Methods in Structural Failure Analysis

    NASA Astrophysics Data System (ADS)

    Krejsa, Martin; Kralik, Juraj

    2015-12-01

    Probabilistic methods are used in engineering where a computational model contains random variables. Each random variable in the probabilistic calculations carries uncertainty. Typical sources of uncertainty are material properties, production and/or assembly inaccuracies in the geometry, and the environment where the structure is located. The paper is focused on methods for calculating failure probabilities in structural failure and reliability analysis, with special attention to a newly developed probabilistic method, Direct Optimized Probabilistic Calculation (DOProC), which is highly efficient in terms of calculation time and solution accuracy. The novelty of the proposed method lies in an optimized numerical integration that does not require any simulation technique. The algorithm has been implemented in software applications and has been used several times in probabilistic tasks and probabilistic reliability assessments.
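
    DOProC itself is not reproduced here, but the underlying idea of obtaining a failure probability by direct numerical integration rather than simulation can be sketched for the simplest limit state, R - E < 0, with independent normal resistance R and load effect E (an illustrative assumption):

```python
import math

def normal_pdf(x, mu, sigma):
    """Density of a normal random variable."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def failure_probability(mu_R, sig_R, mu_E, sig_E, n=2000, span=8.0):
    """P_f = P(R - E < 0) for independent normal resistance R and load
    effect E, by direct numerical integration of the load density times
    the probability that R falls below the load (no sampling involved)."""
    lo, hi = mu_E - span * sig_E, mu_E + span * sig_E
    h = (hi - lo) / n
    pf = 0.0
    for i in range(n):
        e = lo + (i + 0.5) * h          # midpoint rule over the load axis
        z = (e - mu_R) / sig_R
        Phi = 0.5 * (1.0 + math.erf(z / math.sqrt(2)))   # P(R < e)
        pf += normal_pdf(e, mu_E, sig_E) * Phi * h
    return pf
```

    For normal variables the exact answer is Φ(-β) with reliability index β = (μ_R - μ_E)/√(σ_R² + σ_E²), which the integration reproduces to high accuracy.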

  4. Probabilistic Aeroelastic Analysis of Turbomachinery Components

    NASA Technical Reports Server (NTRS)

    Reddy, T. S. R.; Mital, S. K.; Stefko, G. L.

    2004-01-01

    A probabilistic approach is described for the aeroelastic analysis of turbomachinery blade rows. Blade rows with subsonic flow, and blade rows with supersonic flow with a subsonic leading edge, are considered. To demonstrate the probabilistic approach, the flutter frequency, damping, and forced response of a blade row representing a compressor geometry are considered. The analysis accounts for uncertainties in structural and aerodynamic design variables. The results are presented in the form of probability density functions (PDFs) and sensitivity factors. For the subsonic flow cascade, comparisons are also made with different probabilistic distributions, probabilistic methods, and Monte Carlo simulation. The results show that the probabilistic approach provides a more realistic and systematic way to assess the effect of uncertainties in design variables on aeroelastic instabilities and response.

  5. Probabilistic models of language processing and acquisition.

    PubMed

    Chater, Nick; Manning, Christopher D

    2006-07-01

    Probabilistic methods are providing new explanatory approaches to fundamental cognitive science questions of how humans structure, process and acquire language. This review examines probabilistic models defined over traditional symbolic structures. Language comprehension and production involve probabilistic inference in such models; and acquisition involves choosing the best model, given innate constraints and linguistic and other input. Probabilistic models can account for the learning and processing of language, while maintaining the sophistication of symbolic models. A recent burgeoning of theoretical developments and online corpus creation has enabled large models to be tested, revealing probabilistic constraints in processing, undermining acquisition arguments based on a perceived poverty of the stimulus, and suggesting fruitful links with probabilistic theories of categorization and ambiguity resolution in perception.

  6. Staged decision making based on probabilistic forecasting

    NASA Astrophysics Data System (ADS)

    Booister, Nikéh; Verkade, Jan; Werner, Micha; Cranston, Michael; Cumiskey, Lydia; Zevenbergen, Chris

    2016-04-01

    Flood forecasting systems reduce, but cannot eliminate, uncertainty about the future. Probabilistic forecasts explicitly show that uncertainty remains. However, compared to deterministic forecasts, a dimension is added ('probability' or 'likelihood'), and this added dimension makes decision making slightly more complicated. One decision-support technique is the cost-loss approach, a risk-based method that defines whether or not to issue a warning or implement mitigation measures. With the cost-loss method, a warning is issued when the ratio of the response costs to the damage reduction is less than or equal to the probability of the possible flood event. This cost-loss method is not widely used, because it is based only on economic values and is relatively static (a yes/no decision without further reasoning). Nevertheless, it has high potential to improve risk-based decision making based on probabilistic flood forecasting, because no other methods are known that deal with probabilities in decision making. The main aim of this research was to explore ways of making decision making based on probabilities with the cost-loss method more applicable in practice. The exploration began by identifying other situations in which decisions are taken based on uncertain forecasts or predictions. These cases spanned a range of degrees of uncertainty: from known uncertainty to deep uncertainty. Based on the types of uncertainties, concepts for dealing with these situations were analysed and applicable concepts were chosen. From this analysis, the concepts of flexibility and robustness appeared to fit the existing method. Instead of taking big decisions with bigger consequences at once, the idea is that actions and decisions are cut up into smaller pieces, and the decision to implement is finally made based on the economic costs of decisions and measures and the reduced effect of flooding.
The more lead-time there is in
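
    The cost-loss rule described above can be sketched in a few lines; here `cost` is the cost of responding to a warning and `loss_avoided` the damage reduction achieved if the flood occurs (names are illustrative):

```python
def should_warn(p_flood, cost, loss_avoided):
    """Cost-loss rule: issue a warning when the forecast probability of the
    flood event is at least the cost/loss ratio C/L, i.e. when the expected
    avoided damage p*L outweighs the response cost C."""
    return p_flood >= cost / loss_avoided

def expected_cost(warn, p_flood, cost, loss_avoided):
    """Expected outcome cost of a warn / no-warn decision: the response
    cost if we warn, otherwise the expected unmitigated loss."""
    return cost if warn else p_flood * loss_avoided
```

    Staging the decision, as proposed above, amounts to applying this rule repeatedly to smaller partial measures as lead time shrinks and the forecast probability sharpens.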

  7. Confronting uncertainty in flood damage predictions

    NASA Astrophysics Data System (ADS)

    Schröter, Kai; Kreibich, Heidi; Vogel, Kristin; Merz, Bruno

    2015-04-01

    Reliable flood damage models are a prerequisite for the practical usefulness of model results. Traditional uni-variate damage models, such as depth-damage curves, often fail to reproduce the variability of observed flood damage. Innovative multi-variate probabilistic modelling approaches are promising for capturing and quantifying the uncertainty involved, and thus for improving the basis for decision making. In this study we compare the predictive capability of two probabilistic modelling approaches, namely Bagging Decision Trees and Bayesian Networks. For model evaluation we use empirical damage data from computer-aided telephone interviews compiled after the floods of 2002, 2005 and 2006 in the Elbe and Danube catchments in Germany. We carry out a split-sample test by sub-setting the damage records: one sub-set is used to derive the models, and the remaining records are used to evaluate predictive performance. Further, we stratify the sample according to catchments, which allows studying model performance in a spatial transfer context. Flood damage estimation is carried out at the scale of individual buildings in terms of relative damage. The predictive performance of the models is assessed in terms of systematic deviations (mean bias), precision (mean absolute error), and reliability, represented by the proportion of observations that fall within the 5%- to 95%-quantile predictive interval. The reliability of the probabilistic predictions within validation runs decreases only slightly and achieves very good coverage of observations within the predictive interval. Probabilistic models provide quantitative information about prediction uncertainty, which is crucial for assessing the reliability of model predictions and improves the usefulness of model results.
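
    The three evaluation measures can be sketched in plain Python; here each building's probabilistic prediction is represented by an ensemble of sampled relative-damage values (an assumed representation for illustration, not the paper's exact implementation):

```python
def quantile(sorted_vals, q):
    """Linear-interpolation sample quantile of a pre-sorted list."""
    pos = q * (len(sorted_vals) - 1)
    i, frac = int(pos), pos - int(pos)
    if i + 1 < len(sorted_vals):
        return sorted_vals[i] * (1 - frac) + sorted_vals[i + 1] * frac
    return sorted_vals[i]

def evaluate(observations, ensembles):
    """Mean bias and mean absolute error of the ensemble means, plus
    reliability as the fraction of observations falling inside the
    5%-95% predictive interval."""
    n = len(observations)
    bias = mae = hits = 0.0
    for y, ens in zip(observations, ensembles):
        mean = sum(ens) / len(ens)
        bias += (mean - y) / n
        mae += abs(mean - y) / n
        s = sorted(ens)
        if quantile(s, 0.05) <= y <= quantile(s, 0.95):
            hits += 1
    return bias, mae, hits / n
```

    A perfectly reliable model would place about 90% of held-out observations inside the 5%-95% interval.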

  8. A Probabilistic Design Method Applied to Smart Composite Structures

    NASA Technical Reports Server (NTRS)

    Shiao, Michael C.; Chamis, Christos C.

    1995-01-01

    A probabilistic design method is described and demonstrated using a smart composite wing. Probabilistic structural design incorporates naturally occurring uncertainties including those in constituent (fiber/matrix) material properties, fabrication variables, structure geometry and control-related parameters. Probabilistic sensitivity factors are computed to identify those parameters that have a great influence on a specific structural reliability. Two performance criteria are used to demonstrate this design methodology. The first criterion requires that the actuated angle at the wing tip be bounded by upper and lower limits at a specified reliability. The second criterion requires that the probability of ply damage due to random impact load be smaller than an assigned value. When the relationship between reliability improvement and the sensitivity factors is assessed, the results show that a reduction in the scatter of the random variable with the largest sensitivity factor (absolute value) provides the lowest failure probability. An increase in the mean of the random variable with a negative sensitivity factor will reduce the failure probability. Therefore, the design can be improved by controlling or selecting distribution parameters associated with random variables. This can be implemented during the manufacturing process to obtain maximum benefit with minimum alterations.

  9. A probabilistic atlas of the cerebellar white matter.

    PubMed

    van Baarsen, K M; Kleinnijenhuis, M; Jbabdi, S; Sotiropoulos, S N; Grotenhuis, J A; van Cappellen van Walsum, A M

    2016-01-01

    Imaging of the cerebellar cortex, deep cerebellar nuclei and their connectivity is gaining attention, due to the important role the cerebellum plays in cognition and motor control. Atlases of the cerebellar cortex and nuclei are used to locate regions of interest in clinical and neuroscience studies. However, the white matter that connects these relay stations is of at least similar functional importance. Damage to these cerebellar white matter tracts may lead to serious language, cognitive and emotional disturbances, although the pathophysiological mechanism behind it is still debated. Differences in white matter integrity between patients and controls might shed light on structure-function correlations. A probabilistic parcellation atlas of the cerebellar white matter would help these studies by facilitating automatic segmentation of the cerebellar peduncles, the localization of lesions and the comparison of white matter integrity between patients and controls. In this work a digital three-dimensional probabilistic atlas of the cerebellar white matter is presented, based on high-quality 3T, 1.25 mm resolution diffusion MRI data from 90 subjects participating in the Human Connectome Project. The white matter tracts were estimated using probabilistic tractography. Results over 90 subjects were symmetrical, and trajectories of the superior, middle and inferior cerebellar peduncles resembled the anatomy known from anatomical studies. This atlas will contribute to a better understanding of cerebellar white matter architecture. It may eventually aid in defining structure-function correlations in patients with cerebellar disorders.

  10. Probabilistic cloning of three nonorthogonal states

    NASA Astrophysics Data System (ADS)

    Zhang, Wen; Rui, Pinshu; Yang, Qun; Zhao, Yan; Zhang, Ziyun

    2015-04-01

    We study the probabilistic cloning of three nonorthogonal states with equal success probabilities. For simplicity, we assume that the three states belong to a special set. Analytical form of the maximal success probability for probabilistic cloning is calculated. With the maximal success probability, we deduce the explicit form of probabilistic quantum cloning machine. In the case of cloning, we get the unambiguous form of the unitary operation. It is demonstrated that the upper bound for probabilistic quantum cloning machine in (Qiu in J Phys A 35:6931, 2002) can be reached only if the three states are equidistant.

  11. Probabilistic Planning with Imperfect Sensing Actions Using Hybrid Probabilistic Logic Programs

    NASA Astrophysics Data System (ADS)

    Saad, Emad

    Effective planning in uncertain environments is important to agents and multi-agent systems. In this paper, we introduce a new logic-based approach to probabilistic contingent planning (probabilistic planning with imperfect sensing actions) by relating probabilistic contingent planning to normal hybrid probabilistic logic programs with probabilistic answer set semantics [24]. We show that any probabilistic contingent planning problem can be encoded as a normal hybrid probabilistic logic program. We formally prove the correctness of our approach. Moreover, we show that the complexity of finding a probabilistic contingent plan in our approach is NP-complete. In addition, we show that any probabilistic contingent planning problem, PP, can be encoded as a classical normal logic program with answer set semantics, whose answer sets correspond to valid trajectories in PP. We show that probabilistic contingent planning problems can be encoded as SAT problems. We present a new high-level probabilistic action description language that allows the representation of sensing actions with probabilistic outcomes.

  12. Probabilistic methods in fire-risk analysis

    SciTech Connect

    Brandyberry, M.D.

    1989-01-01

    The first part of this work outlines a method for assessing the frequency of ignition of a consumer product in a building and shows how the method would be used in an example scenario with upholstered furniture as the product and radiant auxiliary heating devices (electric heaters, wood stoves) as the ignition source. Deterministic thermal models of the heat-transport processes are coupled with parameter uncertainty analysis of the models and with a probabilistic analysis of the events involved in a typical scenario. This leads to a distribution for the frequency of ignition for the product. In the second part, fire-risk analysis as currently used in nuclear plants is outlined, along with a discussion of the relevant uncertainties. The use of the computer code COMPBRN in fire-growth analysis is discussed, along with the use of response-surface methodology to quantify uncertainties in the code's use. Generalized response surfaces are developed for temperature versus time for a cable tray, as well as a surface for the hot-gas-layer temperature and depth for a room of arbitrary geometry within a typical nuclear power plant compartment. These surfaces are then used to simulate the cable tray damage time in a compartment fire experiment.
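
    The coupling of a deterministic thermal model with parameter uncertainty can be sketched as a Monte Carlo loop; the point-source flux model and every distribution below are illustrative assumptions, not values from the study:

```python
import math
import random

def ignition_probability(n=50000, seed=7):
    """Monte Carlo sketch: a radiant heater at a random distance r ignites
    the furniture when the incident flux q = P / (4*pi*r^2) exceeds a
    random critical ignition flux. All distributions are illustrative."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n):
        power = rng.gauss(1500.0, 200.0)       # heater power, W
        r = rng.uniform(0.1, 2.0)              # separation distance, m
        q_crit = rng.gauss(10000.0, 2000.0)    # critical ignition flux, W/m^2
        if power / (4 * math.pi * r * r) > q_crit:
            hits += 1
    return hits / n

def ignition_frequency(p_ignite, exposures_per_year):
    """Annual ignition frequency = scenario rate x conditional probability."""
    return exposures_per_year * p_ignite
```

    The conditional probability from the thermal model is then scaled by the scenario frequency to give the ignition-frequency distribution described above.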

  13. Probabilistic evaluation of uncertainties and risks in aerospace components

    NASA Technical Reports Server (NTRS)

    Shah, A. R.; Shiao, M. C.; Nagpal, V. K.; Chamis, C. C.

    1992-01-01

    This paper summarizes a methodology developed at NASA Lewis Research Center which computationally simulates the structural, material, and load uncertainties associated with Space Shuttle Main Engine (SSME) components. The methodology was applied to evaluate the scatter in static, buckling, dynamic, fatigue, and damage behavior of the SSME turbopump blade. Also calculated are the probability densities of typical critical blade responses, such as effective stress, natural frequency, damage initiation, and most probable damage path. Risk assessments were performed for different failure modes, and the effect of material degradation on the fatigue and damage behaviors of a blade was calculated using a multi-factor interaction equation. Failure probabilities for different fatigue cycles were computed, and the uncertainties associated with damage initiation and damage propagation due to different load cycles were quantified. Evaluations of the effects of mistuned blades on a rotor were made; uncertainties in the excitation frequency were found to significantly amplify the blade responses of a mistuned rotor. The effects of the number of blades on a rotor were studied. The autocorrelation function of displacements and the probability density function of the first passage time for deterministic and random barriers for structures subjected to random processes were also computed. A brief discussion of the future direction of probabilistic structural analysis is included.

  14. Against all odds -- Probabilistic forecasts and decision making

    NASA Astrophysics Data System (ADS)

    Liechti, Katharina; Zappa, Massimiliano

    2015-04-01

    In the city of Zurich (Switzerland), the damage potential due to flooding of the river Sihl is estimated at about 5 billion US dollars. The flood forecasting system used by the administration for decision making has run continuously since 2007. It has a time horizon of up to five days and operates at hourly time steps. The flood forecasting system includes three different model chains: two are driven by the deterministic NWP models COSMO-2 and COSMO-7, and one by the probabilistic NWP COSMO-LEPS. The model chains have been consistent since February 2010, so five full years are available for the evaluation of the system. The system was evaluated continuously and is a very good example of the added value that lies in probabilistic forecasts. The forecasts are available to the decision makers on an online platform. Several graphical representations of the forecasts and forecast history are available to support decision making and to rate the current situation. The communication between forecasters and decision makers is quite close. In short, an ideal situation. However, an event, or rather a non-event, in summer 2014 showed that knowledge of the general superiority of probabilistic forecasts does not necessarily mean that the decisions taken in a specific situation will be based on that probabilistic forecast. Some years of experience allow confidence in the system to grow, both for the forecasters and for the decision makers. Even if, from the theoretical point of view, the handling of crisis situations is well designed, a first event demonstrated that the dialogue with the decision makers still lacks exercise during such situations. We argue that a false alarm is a needed experience to consolidate real-time emergency procedures relying on ensemble predictions. A missed event would probably also serve, but, in our case, we are very happy not to have to report that option.

  15. Probabilistic Simulation for Nanocomposite Fracture

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.

    2010-01-01

    A unique probabilistic theory is described to predict the uniaxial strengths and fracture properties of nanocomposites. The simulation is based on composite micromechanics with progressive substructuring down to a nanoscale slice of a nanofiber where all the governing equations are formulated. These equations have been programmed in a computer code. That computer code is used to simulate uniaxial strengths and fracture of a nanofiber laminate. The results are presented graphically and discussed with respect to their practical significance. These results show smooth distributions from low probability to high.

  16. Probabilistic risk assessment: Number 219

    SciTech Connect

    Bari, R.A.

    1985-11-13

    This report describes a methodology for analyzing the safety of nuclear power plants. A historical overview of plants in the US is provided, and past, present, and future nuclear safety and risk assessment are discussed. A primer on nuclear power plants is provided with a discussion of pressurized water reactors (PWR) and boiling water reactors (BWR) and their operation and containment. Probabilistic Risk Assessment (PRA), utilizing both event-tree and fault-tree analysis, is discussed as a tool in reactor safety, decision making, and communications. (FI)
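
    For independent basic events, the fault-tree arithmetic underlying PRA reduces to simple gate formulas; the top-event example below is generic and illustrative, not taken from the report:

```python
def or_gate(*probs):
    """Probability that at least one independent basic event occurs."""
    p = 1.0
    for q in probs:
        p *= (1.0 - q)
    return 1.0 - p

def and_gate(*probs):
    """Probability that all independent basic events occur."""
    p = 1.0
    for q in probs:
        p *= q
    return p

# Illustrative top event: the undesired state requires an initiating event
# AND failure of either of two redundant safety trains (assumed independent).
p_top = and_gate(1e-2, or_gate(5e-2, 5e-2))
```

    Event trees extend the same arithmetic: each branch multiplies the initiating-event frequency by the conditional success/failure probabilities supplied by fault trees like the one above.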

  17. 7 CFR 51.306 - Tolerances.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ..., CERTIFICATION, AND STANDARDS) United States Standards for Grades of Apples Tolerances § 51.306 Tolerances. In...: (1) U.S. Extra Fancy, U.S. Fancy, U.S. No. 1, and U.S. No. 1 Hail grades: 10 percent of the apples in... 5 percent, shall be allowed for apples which are seriously damaged, including therein not more...

  18. 7 CFR 51.306 - Tolerances.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ..., CERTIFICATION, AND STANDARDS) United States Standards for Grades of Apples Tolerances § 51.306 Tolerances. In...: (1) U.S. Extra Fancy, U.S. Fancy, U.S. No. 1, and U.S. No. 1 Hail grades: 10 percent of the apples in... 5 percent, shall be allowed for apples which are seriously damaged, including therein not more...

  19. Probabilistic Analysis of Rechargeable Batteries in a Photovoltaic Power Supply System

    SciTech Connect

    Barney, P.; Ingersoll, D.; Jungst, R.; O'Gorman, C.; Paez, T.L.; Urbina, A.

    1998-11-24

    We developed a model for the probabilistic behavior of a rechargeable battery acting as the energy storage component in a photovoltaic power supply system. Stochastic and deterministic models are created to simulate the behavior of the system components: the solar resource, the photovoltaic power supply system, the rechargeable battery, and a load. Artificial neural networks are incorporated into the model of the rechargeable battery to simulate damage that occurs during deep discharge cycles. The equations governing system behavior are combined into one set and solved simultaneously in a Monte Carlo framework to evaluate the probabilistic character of measures of battery behavior.

  20. Probabilistic Seismic Hazard assessment in Albania

    NASA Astrophysics Data System (ADS)

    Muco, B.; Kiratzi, A.; Sulstarova, E.; Kociu, S.; Peci, V.; Scordilis, E.

    2002-12-01

    Albania is one of the countries with the highest seismicity in Europe. The history of instrumental monitoring of seismicity in this country started in 1968 with the installation of the first seismographic station in Tirana, and more effectively after the Albanian Seismological Network began operating in 1976. There is rich evidence that over two thousand years Albania has been hit by many disastrous earthquakes; the highest estimated magnitude is 7.2. After the end of the Communist era and the opening of the country, a construction boom started in Albania and continues even now, making the production of accurate seismic hazard maps for mitigating the damage of probable future earthquakes all the more indispensable. Some efforts have already been made in seismic hazard assessment (Sulstarova et al., 1980; Kociu, 2000; Muco et al., 2002). In this approach, the probabilistic technique has been used in a joint work between the Seismological Institute of Tirana, Albania and the Department of Geophysics of the Aristotle University of Thessaloniki, Greece, within the framework of the NATO SfP project "SeisAlbania". The earthquake catalogue adopted was specifically compiled for this seismic hazard analysis and contains 530 events with magnitude M>4.5 from the year 58 up to 2000. We divided the country into 8 seismotectonic zones, giving for each the most representative fault characteristics. The computer code used for the hazard calculation was OHAZ, developed by the Geophysical Survey of Slovenia, and the attenuation models used were Ambraseys et al., 1996; Sabetta and Pugliese, 1996; and Margaris et al., 2001. The hazard maps are obtained for 100, 475, 2375 and 4746-year return periods, for rock site conditions. Analyzing the map of PGA values for a return period of 475 years, five zones with different ranges of PGA values can be separated: 1) the zone with PGA (0.20 - 0.24 g), 1.8 percent of Albanian territory; 2) the zone with PGA (0.16 - 0.20 g), 22.6 percent of Albanian territory; 3) the
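
    The return periods quoted above relate to exceedance probabilities through the usual Poisson occurrence model; for example, a 475-year return period corresponds to roughly a 10% chance of exceedance in a 50-year exposure:

```python
import math

def exceedance_probability(return_period, exposure_years):
    """Poisson model: probability of at least one exceedance of the hazard
    level with the given mean return period during the exposure time."""
    return 1.0 - math.exp(-exposure_years / return_period)

def return_period(p_exceed, exposure_years):
    """Inverse relation: mean return period for a target exceedance
    probability over the exposure time."""
    return -exposure_years / math.log(1.0 - p_exceed)
```

    The same relation converts the 2375- and 4746-year maps into the roughly 2% and 1%-in-50-years hazard levels commonly used in design codes.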

  1. Transplantation tolerance.

    PubMed

    Salisbury, Emma M; Game, David S; Lechler, Robert I

    2014-12-01

    Although transplantation has been a standard medical practice for decades, marked morbidity from the use of immunosuppressive drugs and poor long-term graft survival remain important limitations in the field. Since the first solid organ transplant between the Herrick twins in 1954, transplantation immunology has sought to move away from harmful, broad-spectrum immunosuppressive regimens that carry with them the long-term risk of potentially life-threatening opportunistic infections, cardiovascular disease, and malignancy, as well as graft toxicity and loss, towards tolerogenic strategies that promote long-term graft survival. Reports of "transplant tolerance" in kidney and liver allograft recipients whose immunosuppressive drugs were discontinued for medical or non-compliant reasons, together with results from experimental models of transplantation, provide the proof-of-principle that achieving tolerance in organ transplantation is fundamentally possible. However, translating the reconstitution of immune tolerance into the clinical setting is a daunting challenge fraught with the complexities of multiple interacting mechanisms overlaid on a background of variation in disease. In this article, we explore the basic science underlying mechanisms of tolerance and review the latest clinical advances in the quest for transplantation tolerance. PMID:24213880

  2. 7 CFR 51.2954 - Tolerances for grade defects.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... damaged by mold or insects or seriously damaged by other means, of which not more than 5/6 or 5 pct may be damaged by insects, but no part of any tolerance shall be allowed for walnuts containing live insects No... adhering hulls 15 pct total, by count, including not more than 8 pct which are damaged by mold or...

  3. 7 CFR 51.2954 - Tolerances for grade defects.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... damaged by mold or insects or seriously damaged by other means, of which not more than 5/6 or 5 pct may be damaged by insects, but no part of any tolerance shall be allowed for walnuts containing live insects No... adhering hulls 15 pct total, by count, including not more than 8 pct which are damaged by mold or...

  4. Software for Probabilistic Risk Reduction

    NASA Technical Reports Server (NTRS)

    Hensley, Scott; Michel, Thierry; Madsen, Soren; Chapin, Elaine; Rodriguez, Ernesto

    2004-01-01

    A computer program implements a methodology, denoted probabilistic risk reduction, that is intended to aid in planning the development of complex software and/or hardware systems. This methodology integrates two complementary prior methodologies: (1) that of probabilistic risk assessment and (2) a risk-based planning methodology, implemented in a prior computer program known as Defect Detection and Prevention (DDP), in which multiple requirements and the beneficial effects of risk-mitigation actions are taken into account. The present methodology and the software are able to accommodate both process knowledge (notably of the efficacy of development practices) and product knowledge (notably of the logical structure of a system, the development of which one seeks to plan). Estimates of the costs and benefits of a planned development can be derived. Functional and non-functional aspects of software can be taken into account, and trade-offs made among them. It becomes possible to optimize the planning process by selecting the suite of process steps and design choices that maximizes the expectation of success while remaining within budget.
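    The budget-constrained selection of risk-mitigation actions described above can be sketched as a small exhaustive search. All action names, costs, and risk-reduction values below are hypothetical illustrations, not data from the DDP tool:

    ```python
    from itertools import combinations

    # Hypothetical mitigation actions: (name, cost, expected risk reduction).
    ACTIONS = [("design_reviews", 3.0, 5.0), ("unit_tests", 4.0, 7.0),
               ("redundancy", 6.0, 8.0), ("formal_spec", 5.0, 6.0)]

    def best_plan(actions, budget):
        """Exhaustively select the subset of actions maximizing total risk
        reduction without exceeding the budget (fine for small action sets)."""
        best, best_gain = (), 0.0
        for r in range(len(actions) + 1):
            for subset in combinations(actions, r):
                cost = sum(c for _, c, _ in subset)
                gain = sum(g for _, _, g in subset)
                if cost <= budget and gain > best_gain:
                    best, best_gain = subset, gain
        return [name for name, _, _ in best], best_gain

    plan, gain = best_plan(ACTIONS, budget=10.0)
    print(plan, gain)  # ['unit_tests', 'redundancy'] 15.0
    ```

    Real planners replace the brute-force loop with heuristic or integer-programming search once the action set grows.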

  5. Probabilistic analysis of tsunami hazards

    USGS Publications Warehouse

    Geist, E.L.; Parsons, T.

    2006-01-01

    Determining the likelihood of a disaster is a key component of any comprehensive hazard assessment. This is particularly true for tsunamis, even though most tsunami hazard assessments have in the past relied on scenario or deterministic type models. We discuss probabilistic tsunami hazard analysis (PTHA) from the standpoint of integrating computational methods with empirical analysis of past tsunami runup. PTHA is derived from probabilistic seismic hazard analysis (PSHA), with the main difference being that PTHA must account for far-field sources. The computational methods rely on numerical tsunami propagation models rather than empirical attenuation relationships as in PSHA in determining ground motions. Because a number of source parameters affect local tsunami runup height, PTHA can become complex and computationally intensive. Empirical analysis can function in one of two ways, depending on the length and completeness of the tsunami catalog. For site-specific studies where there is sufficient tsunami runup data available, hazard curves can primarily be derived from empirical analysis, with computational methods used to highlight deficiencies in the tsunami catalog. For region-wide analyses and sites where there are little to no tsunami data, a computationally based method such as Monte Carlo simulation is the primary method to establish tsunami hazards. Two case studies that describe how computational and empirical methods can be integrated are presented for Acapulco, Mexico (site-specific) and the U.S. Pacific Northwest coastline (region-wide analysis).
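    A Monte Carlo scheme of the kind described for region-wide PTHA can be sketched as follows; the source zones, event rates, and lognormal runup model are illustrative assumptions, not values from the study:

    ```python
    import math
    import random

    random.seed(1)

    # Hypothetical near- and far-field sources:
    # (mean annual rate, median runup in m, lognormal sigma of runup)
    ZONES = [(0.01, 2.0, 0.5), (0.002, 6.0, 0.6)]

    def annual_exceedance_rate(threshold, n_years=100_000):
        """Monte Carlo estimate of the annual rate at which local runup
        exceeds `threshold`, aggregated over all source zones."""
        exceedances = 0
        for _ in range(n_years):
            for rate, median, sigma in ZONES:
                if random.random() < rate:  # did this source fire this year?
                    runup = random.lognormvariate(math.log(median), sigma)
                    if runup > threshold:
                        exceedances += 1
        return exceedances / n_years

    # Hazard curve: the exceedance rate falls as the runup threshold rises.
    for h in (1.0, 3.0, 5.0):
        print(f"{h:.1f} m: {annual_exceedance_rate(h):.4f}/yr")
    ```

    In a real PTHA the lognormal draw would be replaced by a numerical tsunami propagation model run for each sampled source.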

  6. Is Probabilistic Evidence a Source of Knowledge?

    ERIC Educational Resources Information Center

    Friedman, Ori; Turri, John

    2015-01-01

    We report a series of experiments examining whether people ascribe knowledge for true beliefs based on probabilistic evidence. Participants were less likely to ascribe knowledge for beliefs based on probabilistic evidence than for beliefs based on perceptual evidence (Experiments 1 and 2A) or testimony providing causal information (Experiment 2B).…

  7. Probabilistic Cue Combination: Less Is More

    ERIC Educational Resources Information Center

    Yurovsky, Daniel; Boyer, Ty W.; Smith, Linda B.; Yu, Chen

    2013-01-01

    Learning about the structure of the world requires learning probabilistic relationships: rules in which cues do not predict outcomes with certainty. However, in some cases, the ability to track probabilistic relationships is a handicap, leading adults to perform non-normatively in prediction tasks. For example, in the "dilution effect,"…

  8. Error Discounting in Probabilistic Category Learning

    ERIC Educational Resources Information Center

    Craig, Stewart; Lewandowsky, Stephan; Little, Daniel R.

    2011-01-01

    The assumption in some current theories of probabilistic categorization is that people gradually attenuate their learning in response to unavoidable error. However, existing evidence for this error discounting is sparse and open to alternative interpretations. We report 2 probabilistic-categorization experiments in which we investigated error…

  9. Is probabilistic evidence a source of knowledge?

    PubMed

    Friedman, Ori; Turri, John

    2015-07-01

    We report a series of experiments examining whether people ascribe knowledge for true beliefs based on probabilistic evidence. Participants were less likely to ascribe knowledge for beliefs based on probabilistic evidence than for beliefs based on perceptual evidence (Experiments 1 and 2A) or testimony providing causal information (Experiment 2B). Denial of knowledge for beliefs based on probabilistic evidence did not arise because participants viewed such beliefs as unjustified, nor because such beliefs leave open the possibility of error. These findings rule out traditional philosophical accounts for why probabilistic evidence does not produce knowledge. The experiments instead suggest that people deny knowledge because they distrust drawing conclusions about an individual based on reasoning about the population to which it belongs, a tendency previously identified by "judgment and decision making" researchers. Consistent with this, participants were more willing to ascribe knowledge for beliefs based on probabilistic evidence that is specific to a particular case (Experiments 3A and 3B).

  10. Neural networks for damage identification

    SciTech Connect

    Paez, T.L.; Klenke, S.E.

    1997-11-01

    Efforts to optimize the design of mechanical systems for preestablished use environments and to extend the durations of use cycles establish a need for in-service health monitoring. Numerous studies have proposed measures of structural response for the identification of structural damage, but few have suggested systematic techniques to guide the decision as to whether or not damage has occurred based on real data. Such techniques are necessary because in field applications the environments in which systems operate and the measurements that characterize system behavior are random. This paper investigates the use of artificial neural networks (ANNs) to identify damage in mechanical systems. Two probabilistic neural networks (PNNs) are developed and used to judge whether or not damage has occurred in a specific mechanical system, based on experimental measurements. The first PNN is a classical type that casts Bayesian decision analysis into an ANN framework; it uses exemplars measured from the undamaged and damaged system to establish whether system response measurements of unknown origin come from the former class (undamaged) or the latter class (damaged). The second PNN establishes the character of the undamaged system in terms of a kernel density estimator of measures of system response; when presented with system response measures of unknown origin, it makes a probabilistic judgment whether or not the data come from the undamaged population. The physical system used to carry out the experiments is an aerospace system component, and the environment used to excite the system is a stationary random vibration. The results of damage identification experiments are presented along with conclusions rating the effectiveness of the approaches.
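    The kernel-density approach of the second PNN can be illustrated with a minimal two-class Parzen classifier; the feature values, bandwidth, and equal class priors are assumptions made for this sketch, not parameters from the paper:

    ```python
    import math

    def parzen_density(x, samples, h=0.5):
        """Gaussian kernel density estimate at x from training samples."""
        norm = len(samples) * h * math.sqrt(2.0 * math.pi)
        return sum(math.exp(-0.5 * ((x - s) / h) ** 2) for s in samples) / norm

    def classify(x, undamaged, damaged):
        """Bayes decision between the two classes (equal priors assumed)."""
        if parzen_density(x, damaged) > parzen_density(x, undamaged):
            return "damaged"
        return "undamaged"

    # Hypothetical scalar response feature (e.g. an RMS vibration level):
    undamaged = [1.00, 1.10, 0.90, 1.05, 0.95]
    damaged = [1.80, 2.00, 1.90, 2.10, 1.70]

    print(classify(1.00, undamaged, damaged))  # undamaged
    print(classify(1.95, undamaged, damaged))  # damaged
    ```

    The paper's second PNN drops the damaged exemplars and instead thresholds the undamaged-class density alone, flagging low-density responses as possible damage.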

  11. Intolerant tolerance.

    PubMed

    Khushf, G

    1994-04-01

    The Hyde Amendment and Roman Catholic attempts to put restrictions on Title X funding have been criticized for being intolerant. However, such criticism fails to appreciate that there are two competing notions of tolerance, one focusing on the limits of state force and accepting pluralism as unavoidable, and the other focusing on the limits of knowledge and advancing pluralism as a good. These two types of tolerance, illustrated in the writings of John Locke and J.S. Mill, each involve an intolerance. In a pluralistic context where the free exercise of religion is respected, John Locke's account of tolerance is preferable. However, it (in a reconstructed form) leads to a minimal state. Positive entitlements to benefits like artificial contraception or nontherapeutic abortions can legitimately be resisted, because an intolerance has already been shown with respect to those that consider the benefit immoral, since their resources have been coopted by taxation to advance an end that is contrary to their own. There is a sliding scale from tolerance (viewed as forbearance) to the affirmation of communal integrity, and this scale maps on to the continuum from negative to positive rights.

  12. 7 CFR 51.2544 - Tolerances.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 6 6 (b) Serious Damage (Minor Insect or Vertebrate Injury, Mold, Rancid, Decay) 3 4 4 4 4 4 (1) Insect Damage, included in (b) 1 2 2 2 2 2 (c) Total Internal Defects 4 8 9 9 9 9 Table III—Tolerances... 1 1 1 (b) Foreign material (No glass, metal or live insects shall be permitted) .25 .25 .25 .25...

  13. Probabilistic cloning of equidistant states

    SciTech Connect

    Jimenez, O.; Roa, Luis; Delgado, A.

    2010-08-15

    We study the probabilistic cloning of equidistant states. These states are such that the inner product between them is a complex constant or its conjugate. Thereby, it is possible to study their cloning in a simple way. In particular, we are interested in the behavior of the cloning probability as a function of the phase of the overlap among the involved states. We show that for certain families of equidistant states Duan and Guo's cloning machine leads to cloning probabilities lower than the optimal unambiguous discrimination probability of equidistant states. We propose an alternative cloning machine whose cloning probability is higher than or equal to the optimal unambiguous discrimination probability for any family of equidistant states. Both machines achieve the same probability for equidistant states whose inner product is a positive real number.

  14. Introducing a probabilistic Budyko framework

    NASA Astrophysics Data System (ADS)

    Greve, P.; Gudmundsson, L.; Orlowsky, B.; Seneviratne, S. I.

    2015-04-01

    Water availability is of importance for a wide range of ecological, climatological, and socioeconomic applications. Over land, the partitioning of precipitation into evapotranspiration and runoff essentially determines the availability of water. At mean annual catchment scales, the widely used Budyko framework provides a simple, deterministic, first-order relationship to estimate this partitioning as a function of the prevailing climatic conditions. Here we extend the framework by introducing a method to specify probabilistic estimates of water availability that account for the nonlinearity of the underlying phase space. The new framework allows us to evaluate the predictability of water availability that is related to varying catchment characteristics and conditional on the underlying climatic conditions. Corresponding results support the practical experience of low predictability of river runoff in transitional climates.
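    For context, the deterministic curve that the probabilistic framework extends can be evaluated directly. The sketch below uses Budyko's classical 1974 functional form; the paper itself may use a different parameterization:

    ```python
    import math

    def budyko_evaporative_fraction(phi):
        """Budyko (1974) curve: mean annual ET/P as a function of the
        aridity index phi = PET/P."""
        return math.sqrt(phi * math.tanh(1.0 / phi) * (1.0 - math.exp(-phi)))

    def runoff_ratio(phi):
        """Long-term catchment water balance: Q/P = 1 - ET/P."""
        return 1.0 - budyko_evaporative_fraction(phi)

    # Energy-limited -> transitional -> water-limited conditions:
    for phi in (0.5, 1.0, 2.0):
        print(phi, round(runoff_ratio(phi), 3))
    ```

    The probabilistic extension replaces this single curve with a distribution of ET/P values conditional on the aridity index.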

  15. Development of probabilistic multimedia multipathway computer codes.

    SciTech Connect

    Yu, C.; LePoire, D.; Gnanapragasam, E.; Arnish, J.; Kamboj, S.; Biwer, B. M.; Cheng, J.-J.; Zielen, A. J.; Chen, S. Y.; Mo, T.; Abu-Eid, R.; Thaggard, M.; Sallo, A., III.; Peterson, H., Jr.; Williams, W. A.; Environmental Assessment; NRC; EM

    2002-01-01

    The deterministic multimedia dose/risk assessment codes RESRAD and RESRAD-BUILD have been widely used for many years for evaluation of sites contaminated with residual radioactive materials. The RESRAD code applies to the cleanup of sites (soils) and the RESRAD-BUILD code applies to the cleanup of buildings and structures. This work describes the procedure used to enhance the deterministic RESRAD and RESRAD-BUILD codes for probabilistic dose analysis. A six-step procedure was used in developing default parameter distributions and the probabilistic analysis modules. These six steps include (1) listing and categorizing parameters; (2) ranking parameters; (3) developing parameter distributions; (4) testing parameter distributions for probabilistic analysis; (5) developing probabilistic software modules; and (6) testing probabilistic modules and integrated codes. The procedures used can be applied to the development of other multimedia probabilistic codes. The probabilistic versions of RESRAD and RESRAD-BUILD codes provide tools for studying the uncertainty in dose assessment caused by uncertain input parameters. The parameter distribution data collected in this work can also be applied to other multimedia assessment tasks and multimedia computer codes.

  16. Imprecise probabilistic estimation of design floods with epistemic uncertainties

    NASA Astrophysics Data System (ADS)

    Qi, Wei; Zhang, Chi; Fu, Guangtao; Zhou, Huicheng

    2016-06-01

    An imprecise probabilistic framework for design flood estimation is proposed on the basis of the Dempster-Shafer theory to handle different epistemic uncertainties from data, probability distribution functions, and probability distribution parameters. These uncertainties are incorporated in cost-benefit analysis to generate the lower and upper bounds of the total cost for flood control, thus presenting improved information for decision making on design floods. Within the total cost bounds, a new robustness criterion is proposed to select a design flood that can tolerate higher levels of uncertainty. A variance decomposition approach is used to quantify individual and interactive impacts of the uncertainty sources on total cost. Results from three case studies, with 127, 104, and 54 year flood data sets, respectively, show that the imprecise probabilistic approach effectively combines aleatory and epistemic uncertainties from the various sources and provides upper and lower bounds of the total cost. Between the total cost and the robustness of design floods, a clear trade-off which is beyond the information that can be provided by the conventional minimum cost criterion is identified. The interactions among data, distributions, and parameters have a much higher contribution than parameters to the estimate of the total cost. It is found that the contributions of the various uncertainty sources and their interactions vary with different flood magnitude, but remain roughly the same with different return periods. This study demonstrates that the proposed methodology can effectively incorporate epistemic uncertainties in cost-benefit analysis of design floods.

  17. Replicating Damaged DNA in Eukaryotes

    PubMed Central

    Chatterjee, Nimrat; Siede, Wolfram

    2013-01-01

    DNA damage is one of many possible perturbations that challenge the mechanisms that preserve genetic stability during the copying of the eukaryotic genome in S phase. This short review provides, in the first part, a general introduction to the topic and an overview of checkpoint responses. In the second part, the mechanisms of error-free tolerance in response to fork-arresting DNA damage will be discussed in some detail. PMID:24296172

  18. Replicating damaged DNA in eukaryotes.

    PubMed

    Chatterjee, Nimrat; Siede, Wolfram

    2013-12-01

    DNA damage is one of many possible perturbations that challenge the mechanisms that preserve genetic stability during the copying of the eukaryotic genome in S phase. This short review provides, in the first part, a general introduction to the topic and an overview of checkpoint responses. In the second part, the mechanisms of error-free tolerance in response to fork-arresting DNA damage will be discussed in some detail.

  19. Probabilistic population projections with migration uncertainty

    PubMed Central

    Azose, Jonathan J.; Ševčíková, Hana; Raftery, Adrian E.

    2016-01-01

    We produce probabilistic projections of population for all countries based on probabilistic projections of fertility, mortality, and migration. We compare our projections to those from the United Nations’ Probabilistic Population Projections, which uses similar methods for fertility and mortality but deterministic migration projections. We find that uncertainty in migration projection is a substantial contributor to uncertainty in population projections for many countries. Prediction intervals for the populations of Northern America and Europe are over 70% wider, whereas prediction intervals for the populations of Africa, Asia, and the world as a whole are nearly unchanged. Out-of-sample validation shows that the model is reasonably well calibrated. PMID:27217571
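    The effect described above, prediction intervals widening once migration is treated stochastically rather than deterministically, can be reproduced in a toy Monte Carlo projection; all rates and variances below are hypothetical, not the UN model's values:

    ```python
    import random

    random.seed(7)

    def project(pop0, years, mig_sd):
        """One trajectory: noisy natural increase plus noisy net migration.
        All rates are hypothetical annual fractions of the population."""
        pop = pop0
        for _ in range(years):
            natural = random.gauss(0.003, 0.001)     # births minus deaths
            migration = random.gauss(0.002, mig_sd)  # net migration rate
            pop *= 1.0 + natural + migration
        return pop

    def interval_width(mig_sd, n=2000):
        """Width of the 95% prediction interval for the final population."""
        finals = sorted(project(100.0, 35, mig_sd) for _ in range(n))
        return finals[int(0.975 * n)] - finals[int(0.025 * n)]

    # Treating migration as deterministic (sd = 0) understates uncertainty:
    print(interval_width(0.0), interval_width(0.004))
    ```

    Because the migration noise compounds multiplicatively over the horizon, even a modest annual standard deviation widens the interval severalfold.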

  20. Probabilistic cognition in two indigenous Mayan groups.

    PubMed

    Fontanari, Laura; Gonzalez, Michel; Vallortigara, Giorgio; Girotto, Vittorio

    2014-12-01

    Is there a sense of chance shared by all individuals, regardless of their schooling or culture? To test whether the ability to make correct probabilistic evaluations depends on educational and cultural guidance, we investigated probabilistic cognition in preliterate and prenumerate Kaqchikel and K'iche', two indigenous Mayan groups, living in remote areas of Guatemala. Although the tested individuals had no formal education, they performed correctly in tasks in which they had to consider prior and posterior information, proportions and combinations of possibilities. Their performance was indistinguishable from that of Mayan school children and Western controls. Our results provide evidence for the universal nature of probabilistic cognition.

  1. Probabilistic machine learning and artificial intelligence.

    PubMed

    Ghahramani, Zoubin

    2015-05-28

    How can a machine learn from experience? Probabilistic modelling provides a framework for understanding what learning is, and has therefore emerged as one of the principal theoretical and practical approaches for designing machines that learn from data acquired through experience. The probabilistic framework, which describes how to represent and manipulate uncertainty about models and predictions, has a central role in scientific data analysis, machine learning, robotics, cognitive science and artificial intelligence. This Review provides an introduction to this framework, and discusses some of the state-of-the-art advances in the field, namely, probabilistic programming, Bayesian optimization, data compression and automatic model discovery.

  2. Probabilistic machine learning and artificial intelligence.

    PubMed

    Ghahramani, Zoubin

    2015-05-28

    How can a machine learn from experience? Probabilistic modelling provides a framework for understanding what learning is, and has therefore emerged as one of the principal theoretical and practical approaches for designing machines that learn from data acquired through experience. The probabilistic framework, which describes how to represent and manipulate uncertainty about models and predictions, has a central role in scientific data analysis, machine learning, robotics, cognitive science and artificial intelligence. This Review provides an introduction to this framework, and discusses some of the state-of-the-art advances in the field, namely, probabilistic programming, Bayesian optimization, data compression and automatic model discovery. PMID:26017444

  3. Probabilistic machine learning and artificial intelligence

    NASA Astrophysics Data System (ADS)

    Ghahramani, Zoubin

    2015-05-01

    How can a machine learn from experience? Probabilistic modelling provides a framework for understanding what learning is, and has therefore emerged as one of the principal theoretical and practical approaches for designing machines that learn from data acquired through experience. The probabilistic framework, which describes how to represent and manipulate uncertainty about models and predictions, has a central role in scientific data analysis, machine learning, robotics, cognitive science and artificial intelligence. This Review provides an introduction to this framework, and discusses some of the state-of-the-art advances in the field, namely, probabilistic programming, Bayesian optimization, data compression and automatic model discovery.

  4. Probabilistic cognition in two indigenous Mayan groups

    PubMed Central

    Fontanari, Laura; Gonzalez, Michel; Vallortigara, Giorgio; Girotto, Vittorio

    2014-01-01

    Is there a sense of chance shared by all individuals, regardless of their schooling or culture? To test whether the ability to make correct probabilistic evaluations depends on educational and cultural guidance, we investigated probabilistic cognition in preliterate and prenumerate Kaqchikel and K’iche’, two indigenous Mayan groups, living in remote areas of Guatemala. Although the tested individuals had no formal education, they performed correctly in tasks in which they had to consider prior and posterior information, proportions and combinations of possibilities. Their performance was indistinguishable from that of Mayan school children and Western controls. Our results provide evidence for the universal nature of probabilistic cognition. PMID:25368160

  5. Probabilistic and Non-probabilistic Synthetic Reliability Model for Space Structures

    NASA Astrophysics Data System (ADS)

    Hong, Dongpao; Hu, Xiao; Zhang, Jing

    2016-07-01

    As an alternative to reliability analysis, the non-probabilistic model is an effective supplement when the interval information exists. We describe the uncertain parameters of the structures with interval variables, and establish a non-probabilistic reliability model of structures. Then, we analyze the relation between the typical interference mode and the reliability according to the structure stress-strength interference model, and propose a new measure of structure non-probabilistic reliability. Furthermore we describe other uncertain parameters with random variables when probabilistic information also exists. For the complex structures including both random variables and interval variables, we propose a probabilistic and non-probabilistic synthetic reliability model. The illustrative example shows that the presented model is feasible for structure reliability analysis and design.

  6. Infectious tolerance.

    PubMed

    Cobbold, S; Waldmann, H

    1998-10-01

    Infectious tolerance can be induced in many ways, does not require a thymus or clonal deletion, and can spread to third-party antigens linked on the same antigen-presenting cell, the process being variously described as linked-, bystander-, or epitope-suppression. We here review the evidence concerning the mechanisms involved and attempt to form a consistent hypothesis: that during tolerance induction in the Th1-mediated autoimmune diseases and transplantation systems there would seem to be a phase of immune deviation towards Th2 cytokines, like IL-4 and IL-10; however, this may lead to an IL-10-induced form of anergy or nonresponsiveness and generation of the recently characterized Th3/T-regulatory-1 CD4+ T cell subset, which is thought to downregulate the antigen-presenting cell, possibly via transforming growth factor beta. PMID:9794831

  7. COMMUNICATING PROBABILISTIC RISK OUTCOMES TO RISK MANAGERS

    EPA Science Inventory

    Increasingly, risk assessors are moving away from simple deterministic assessments to probabilistic approaches that explicitly incorporate ecological variability, measurement imprecision, and lack of knowledge (collectively termed "uncertainty"). While the new methods provide an...

  8. Probabilistic micromechanics for high-temperature composites

    NASA Technical Reports Server (NTRS)

    Reddy, J. N.

    1993-01-01

    The three-year program of research had the following technical objectives: the development of probabilistic methods for micromechanics-based constitutive and failure models, application of the probabilistic methodology in the evaluation of various composite materials and simulation of expected uncertainties in unidirectional fiber composite properties, and influence of the uncertainties in composite properties on the structural response. The first year of research was devoted to the development of probabilistic methodology for micromechanics models. The second year of research focused on the evaluation of the Chamis-Hopkins constitutive model and Aboudi constitutive model using the methodology developed in the first year of research. The third year of research was devoted to the development of probabilistic finite element analysis procedures for laminated composite plate and shell structures.

  9. Do probabilistic forecasts lead to better decisions?

    NASA Astrophysics Data System (ADS)

    Ramos, M. H.; van Andel, S. J.; Pappenberger, F.

    2013-06-01

    The last decade has seen growing research in producing probabilistic hydro-meteorological forecasts and increasing their reliability. This followed the promise that, supplied with information about uncertainty, people would take better risk-based decisions. In recent years, therefore, research and operational developments have also started focusing attention on ways of communicating the probabilistic forecasts to decision-makers. Communicating probabilistic forecasts includes preparing tools and products for visualisation, but also requires understanding how decision-makers perceive and use uncertainty information in real time. At the EGU General Assembly 2012, we conducted a laboratory-style experiment in which several cases of flood forecasts and a choice of actions to take were presented as part of a game to participants, who acted as decision-makers. Answers were collected and analysed. In this paper, we present the results of this exercise and discuss if we indeed make better decisions on the basis of probabilistic forecasts.

  10. Non-unitary probabilistic quantum computing

    NASA Technical Reports Server (NTRS)

    Gingrich, Robert M.; Williams, Colin P.

    2004-01-01

    We present a method for designing quantum circuits that perform non-unitary quantum computations on n-qubit states probabilistically, and give analytic expressions for the success probability and fidelity.

  11. PROBABILISTIC MODELING FOR ADVANCED HUMAN EXPOSURE ASSESSMENT

    EPA Science Inventory

    Human exposures to environmental pollutants widely vary depending on the emission patterns that result in microenvironmental pollutant concentrations, as well as behavioral factors that determine the extent of an individual's contact with these pollutants. Probabilistic human exp...

  12. Probabilistic cloning of three symmetric states

    SciTech Connect

    Jimenez, O.; Bergou, J.; Delgado, A.

    2010-12-15

    We study the probabilistic cloning of three symmetric states. These states are defined by a single complex quantity, the inner product among them. We show that three different probabilistic cloning machines are necessary to optimally clone all possible families of three symmetric states. We also show that the optimal cloning probability of generating M copies out of one original can be cast as the quotient between the success probability of unambiguously discriminating one and M copies of symmetric states.

  13. Probabilistic structural analysis for nuclear thermal propulsion

    NASA Technical Reports Server (NTRS)

    Shah, Ashwin

    1993-01-01

    Viewgraphs of probabilistic structural analysis for nuclear thermal propulsion are presented. The objective of the study was to develop a methodology to certify Space Nuclear Propulsion System (SNPS) Nozzle with assured reliability. Topics covered include: advantage of probabilistic structural analysis; space nuclear propulsion system nozzle uncertainties in the random variables; SNPS nozzle natural frequency; and sensitivity of primitive variable uncertainties SNPS nozzle natural frequency and shell stress.

  14. Probabilistic analysis of deposit liquefaction

    SciTech Connect

    Loh, C.H.; Cheng, C.R.; Wen, Y.K.

    1995-12-31

    This paper presents a procedure to perform the risk analysis for ground failure by liquefaction. Liquefaction is defined as the result of cumulative damage caused by seismic loading. The fatigue life of soil can be determined on the basis of the S-N relationship and Miner's cumulative damage law. The rain-flow method is used to count the number of cycles of stress response of the soil deposit. Finally, the probability of liquefaction is obtained by integrating over all possible ground motions and the fragility curves of liquefaction potential.
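    The fatigue-life calculation described, a linear damage sum over rainflow-counted cycles, can be sketched as follows; the S-N parameters and cycle counts are illustrative, not soil data from the study:

    ```python
    # Hypothetical S-N curve: cycles to failure N_f = A * S**(-b)
    A, B_EXP = 1e12, 3.0

    def cycles_to_failure(stress_range):
        return A * stress_range ** (-B_EXP)

    def miners_damage(counted_cycles):
        """Miner's linear damage sum over rainflow-counted stress ranges;
        failure (here, liquefaction) is predicted when the sum reaches 1."""
        return sum(n / cycles_to_failure(s) for s, n in counted_cycles)

    # (stress range, cycle count) pairs as rainflow counting might return them:
    loading = [(50.0, 2000), (80.0, 500), (120.0, 50)]
    d = miners_damage(loading)
    print(d, "fails" if d >= 1.0 else "survives")
    ```

    In the probabilistic version, the damage sum is evaluated over the distribution of possible ground motions rather than a single load history.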

  15. Probabilistic Choice, Reversibility, Loops, and Miracles

    NASA Astrophysics Data System (ADS)

    Stoddart, Bill; Bell, Pete

    We consider an addition of probabilistic choice to Abrial's Generalised Substitution Language (GSL) in a form that accommodates the backtracking interpretation of non-deterministic choice. Our formulation is introduced as an extension of the Prospective Values formalism we have developed to describe the results from a backtracking search. Significant features are that probabilistic choice is governed by feasibility, and non-termination is strict. The former property allows us to use probabilistic choice to generate search heuristics. In this paper we are particularly interested in iteration. By demonstrating sub-conjunctivity and monotonicity properties of expectations we give the basis for a fixed point semantics of iterative constructs, and we consider the practical proof treatment of probabilistic loops. We discuss loop invariants, loops with probabilistic behaviour, and probabilistic termination in the context of a formalism in which a small probability of non-termination can dominate our calculations, proposing a method of limits to avoid this problem. The formal programming constructs described have been implemented in a reversible virtual machine (RVM).

  16. Symbolic representation of probabilistic worlds.

    PubMed

    Feldman, Jacob

    2012-04-01

    Symbolic representation of environmental variables is a ubiquitous and often debated component of cognitive science. Yet notwithstanding centuries of philosophical discussion, the efficacy, scope, and validity of such representation has rarely been given direct consideration from a mathematical point of view. This paper introduces a quantitative measure of the effectiveness of symbolic representation, and develops formal constraints under which such representation is in fact warranted. The effectiveness of symbolic representation hinges on the probabilistic structure of the environment that is to be represented. For arbitrary probability distributions (i.e., environments), symbolic representation is generally not warranted. But in modal environments, defined here as those that consist of mixtures of component distributions that are narrow ("spiky") relative to their spreads, symbolic representation can be shown to represent the environment with a relatively negligible loss of information. Modal environments support propositional forms, logical relations, and other familiar features of symbolic representation. Hence the assumption that our environment is, in fact, modal is a key tacit assumption underlying the use of symbols in cognitive science. PMID:22270145

  17. Probabilistic computation by neuromine networks.

    PubMed

    Hangartner, R D; Cull, P

    2000-01-01

    In this paper, we address the question: can biologically feasible neural nets compute more than can be computed by deterministic polynomial time algorithms? Since we want to maintain a claim of plausibility and reasonableness, we restrict ourselves to nets that are algorithmically easy to construct, and we rule out infinite precision in parameters and in any analog parts of the computation. Our approach is to consider the recent advances in randomized algorithms and see if such randomized computations can be described by neural nets. We start with a pair of neurons and show that if we connect them with reciprocal inhibition and some tonic input, the steady state will be one neuron ON and one neuron OFF, but which neuron is ON and which is OFF will be chosen at random (perhaps it would be better to say that microscopic noise in the analog computation is turned into a macroscale random bit). We then show that we can build a small network that uses this random-bit process to repeatedly generate random bits. This random bit generator can then be connected to a neural net representing the deterministic part of a randomized algorithm. We therefore demonstrate that these neural nets can carry out probabilistic computation and are thus less limited than classical neural nets.
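The flip-flop construction described above can be imitated with a toy rate model: two units with tonic drive and mutual inhibition, where small noise decides the winner. The dynamics and all constants below are illustrative assumptions, not the authors' equations.

```python
import random

def random_bit(steps=200, rng=random):
    """Two analog units with tonic drive and mutual inhibition (weight
    1.5) plus small noise: the symmetric state is unstable, so the
    network settles with one unit ON and the other OFF, the winner
    chosen by microscopic noise."""
    a = b = 0.4                 # start at the (unstable) symmetric point
    for _ in range(steps):
        da = 0.1 * (1.0 - 1.5 * b - a) + rng.gauss(0, 0.02)
        db = 0.1 * (1.0 - 1.5 * a - b) + rng.gauss(0, 0.02)
        a = min(max(a + da, 0.0), 1.0)   # clip activities to [0, 1]
        b = min(max(b + db, 0.0), 1.0)
    return 1 if a > b else 0

random.seed(42)
bits = [random_bit() for _ in range(200)]
ones = sum(bits)   # a roughly balanced stream of random bits
```

Because the inhibition weight exceeds the self-decay, the symmetric fixed point is unstable along the difference direction, so the noise-seeded difference is amplified into a macroscale ON/OFF decision.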

  18. Probabilistic modeling of children's handwriting

    NASA Astrophysics Data System (ADS)

    Puri, Mukta; Srihari, Sargur N.; Hanson, Lisa

    2013-12-01

    Little work has been done on the analysis of children's handwriting, which can be useful in developing automatic evaluation systems and in quantifying handwriting individuality. We consider the statistical analysis of children's handwriting in early grades. Samples of handwriting of children in Grades 2-4 who were taught the Zaner-Bloser style were considered. The commonly occurring word "and" written in cursive style as well as hand-print were extracted from extended writing. The samples were assigned feature values by human examiners using a truthing tool. The human examiners looked at how the children constructed letter formations in their writing, looking for similarities and differences from the instructions taught in the handwriting copy book. These similarities and differences were measured using a feature space distance measure. Results indicate that the handwriting develops towards more conformity with the class characteristics of the Zaner-Bloser copybook which, with practice, is the expected result. Bayesian networks were learnt from the data to enable answering various probabilistic queries, such as determining which students may continue to produce letter formations as taught during lessons in school, which students will develop different letter formations or variations of those taught, and how many distinct types of letter formations appear.

  19. Representation of probabilistic scientific knowledge.

    PubMed

    Soldatova, Larisa N; Rzhetsky, Andrey; De Grave, Kurt; King, Ross D

    2013-04-15

    The theory of probability is widely used in biomedical research for data analysis and modelling. In previous work the probabilities of the research hypotheses have been recorded as experimental metadata. The ontology HELO is designed to support probabilistic reasoning, and provides semantic descriptors for reporting on research that involves operations with probabilities. HELO explicitly links research statements such as hypotheses, models, laws, conclusions, etc. to the associated probabilities of these statements being true. HELO enables the explicit semantic representation and accurate recording of probabilities in hypotheses, as well as the inference methods used to generate and update those hypotheses. We demonstrate the utility of HELO on three worked examples: changes in the probability of the hypothesis that sirtuins regulate human life span; changes in the probability of hypotheses about gene functions in the S. cerevisiae aromatic amino acid pathway; and the use of active learning in drug design (quantitative structure activity relation learning), where a strategy for the selection of compounds with the highest probability of improving on the best known compound was used. HELO is open source and available at https://github.com/larisa-soldatova/HELO. PMID:23734675

  1. Dynamical systems probabilistic risk assessment.

    SciTech Connect

    Denman, Matthew R.; Ames, Arlo Leroy

    2014-03-01

    Probabilistic Risk Assessment (PRA) is the primary tool used to risk-inform nuclear power regulatory and licensing activities. Risk-informed regulations are intended to reduce inherent conservatism in regulatory metrics (e.g., allowable operating conditions and technical specifications) which are built into the regulatory framework by quantifying both the total risk profile as well as the change in the risk profile caused by an event or action (e.g., in-service inspection procedures or power uprates). Dynamical Systems (DS) analysis has been used to understand unintended time-dependent feedbacks in both industrial and organizational settings. In dynamical systems analysis, feedback loops can be characterized and studied as a function of time to describe the changes to the reliability of plant Structures, Systems and Components (SSCs). While DS has been used in many subject areas, some even within the PRA community, it has not been applied toward creating long-time horizon, dynamic PRAs (with time scales ranging between days and decades depending upon the analysis). Understanding slowly developing dynamic effects, such as wear-out, on SSC reliabilities may be instrumental in ensuring a safely and reliably operating nuclear fleet. Improving the estimation of a plant's continuously changing risk profile will allow for more meaningful risk insights, greater stakeholder confidence in risk insights, and increased operational flexibility.

  2. Probabilistic risk assessment familiarization training

    SciTech Connect

    Phillabaum, J.L.

    1989-01-01

    Philadelphia Electric Company (PECo) created a Nuclear Group Risk and Reliability Assessment Program Plan in order to focus the utilization of probabilistic risk assessment (PRA) in support of Limerick Generating Station and Peach Bottom Atomic Power Station. PECo committed to the U.S. Nuclear Regulatory Commission (NRC) to continue the PRA program prior to the issuance of an operating license for Limerick Unit 1. It is believed that increased use of PRA techniques to support activities at Limerick and Peach Bottom will enhance PECo's overall nuclear excellence. Training for familiarization with PRA is designed to be attended once by all nuclear group personnel so that they understand PRA and its potential effect on their jobs. The training content describes the history of PRA and how it applies to PECo's nuclear activities. Key PRA concepts serve as the foundation for the familiarization training. These key concepts are covered in all classes to facilitate an appreciation of the remaining material, which is tailored to the audience. Some of the concepts covered are comparison of regulatory philosophy to PRA techniques, fundamentals of risk/success, risk equation/risk summation, and fault trees and event trees. Building on the concepts, PRA insights and applications are then described, tailored to the audience.

  3. Probabilistic description of traffic flow

    NASA Astrophysics Data System (ADS)

    Mahnke, R.; Kaupužs, J.; Lubashevsky, I.

    2005-03-01

    A stochastic description of traffic flow, called probabilistic traffic flow theory, is developed. The general master equation is applied to relatively simple models to describe the formation and dissolution of traffic congestions. Our approach is mainly based on spatially homogeneous systems like periodically closed circular rings without on- and off-ramps. We consider a stochastic one-step process of growth or shrinkage of a car cluster (jam). As generalization we discuss the coexistence of several car clusters of different sizes. The basic problem is to find a physically motivated ansatz for the transition rates of the attachment and detachment of individual cars to a car cluster consistent with the empirical observations in real traffic. The emphasis is put on the analogy with first-order phase transitions and nucleation phenomena in physical systems like supersaturated vapour. The results are summarized in the flux-density relation, the so-called fundamental diagram of traffic flow, and compared with empirical data. Different regimes of traffic flow are discussed: free flow, congested mode as stop-and-go regime, and heavy viscous traffic. The traffic breakdown is studied based on the master equation as well as the Fokker-Planck approximation to calculate mean first passage times or escape rates. Generalizations are developed to allow for on-ramp effects. The calculated flux-density relation and characteristic breakdown times coincide with empirical data measured on highways. Finally, a brief summary of the stochastic cellular automata approach is given.
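The one-step master equation for cluster growth and shrinkage can be simulated directly with the Gillespie algorithm. The transition-rate ansatz below (attachment proportional to the free-flow inflow, constant detachment after a mean waiting time) and all constants are illustrative assumptions, not the fitted rates of the paper.

```python
import random

random.seed(1)

N_CARS, ROAD = 60, 100.0   # cars on a closed ring of given length (toy numbers)
V_MAX, D = 30.0, 1.0       # free-flow speed, headway of a car inside the jam
TAU = 1.5                  # mean waiting time before the first car escapes

def w_plus(n):
    """Attachment rate: free-flowing cars reach the jam's tail."""
    free = N_CARS - n
    return V_MAX * free / (ROAD - D * n) if free > 0 else 0.0

def w_minus(n):
    """Detachment rate: the lead car leaves after a mean waiting time."""
    return 1.0 / TAU if n > 0 else 0.0

# Gillespie simulation of the one-step (birth-death) master equation
n, t, samples = 0, 0.0, []
while t < 500.0:
    up, down = w_plus(n), w_minus(n)
    rate = up + down
    t += random.expovariate(rate)            # time to the next event
    n += 1 if random.random() < up / rate else -1
    samples.append(n)

mean_jam = sum(samples) / len(samples)       # typical car-cluster size
```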

  4. MOND using a probabilistic approach

    NASA Astrophysics Data System (ADS)

    Raut, Usha

    2009-05-01

    MOND has been proposed as a viable alternative to the dark matter hypothesis. In the original MOND formulation [1], a modification of Newtonian dynamics was brought about by postulating new equations of particle motion at extremely low accelerations, as a possible explanation for the flat rotation curves of spiral galaxies. In this paper, we attempt a different approach to modify the usual force laws by trying to link gravity with the probabilistic aspects of quantum mechanics [2]. In order to achieve this, one starts by replacing the classical notion of a continuous distance between two elementary particles with a statistical probability function, π. The gravitational force between two elementary particles then can be interpreted in terms of the probability of interaction between them. We attempt to show that such a modified gravitational force would fall off a lot slower than the usual inverse square law predicts, leading to revised MOND equations. In the limit that the statistical aggregate of the probabilities becomes equal to the usual inverse square law force, we recover Newtonian/Einstein gravity. [1] Milgrom, M. 1983, ApJ, 270, 365 [2] Goradia, S. 2002, arXiv.org/pdf/physics/0210040

  5. A novel Bayesian imaging method for probabilistic delamination detection of composite materials

    NASA Astrophysics Data System (ADS)

    Peng, Tishun; Saxena, Abhinav; Goebel, Kai; Xiang, Yibing; Sankararaman, Shankar; Liu, Yongming

    2013-12-01

    A probabilistic framework for location and size determination for delamination in carbon-carbon composites is proposed in this paper. A probability image of the delaminated area using Lamb wave-based damage detection features is constructed with the Bayesian updating technique. First, the algorithm for the probabilistic delamination detection framework using the proposed Bayesian imaging method (BIM) is presented. Next, a fatigue testing setup for carbon-carbon composite coupons is described, and the Lamb wave-based diagnostic signal is interpreted and processed. The obtained signal features are then incorporated in the Bayesian imaging method for delamination size and location detection, as well as for prediction of the corresponding uncertainty bounds. The damage detection results using the proposed methodology are compared with x-ray images for verification and validation. Finally, some conclusions are drawn and suggestions are made for future work based on the study presented in this paper.
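The Bayesian updating step behind such an imaging method can be sketched on a grid: each measured feature updates a posterior over candidate delamination cells. The measurement model below (a damage index decaying exponentially with distance from the delamination, Gaussian noise) and the sensor layout are hypothetical stand-ins for the paper's Lamb-wave features.

```python
import math

# Hypothetical 20x20 grid over the coupon; the prior is uniform.
GRID = [(x, y) for x in range(20) for y in range(20)]
prior = {cell: 1.0 / len(GRID) for cell in GRID}

def likelihood(feature, cell, sensor, noise=0.1):
    """Assumed model: the damage-index feature seen at a sensor decays
    exponentially with distance from the true delamination cell."""
    predicted = math.exp(-math.dist(cell, sensor) / 5.0)
    return math.exp(-(feature - predicted) ** 2 / (2 * noise ** 2))

def bayes_update(post, feature, sensor):
    post = {c: p * likelihood(feature, c, sensor) for c, p in post.items()}
    z = sum(post.values())
    return {c: p / z for c, p in post.items()}     # renormalise

# true delamination at (5, 5); three sensors report noiseless features
post = prior
for sensor in [(0, 0), (10, 0), (0, 10)]:
    feature = math.exp(-math.dist((5, 5), sensor) / 5.0)
    post = bayes_update(post, feature, sensor)

best = max(post, key=post.get)   # MAP estimate of the delamination cell
```

Each update multiplies in one feature's likelihood, so the posterior sharpens around cells consistent with all sensor paths, and its spread gives the uncertainty bounds.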

  6. Detailed probabilistic modelling of cell inactivation by ionizing radiations of different qualities: the model and its applications.

    PubMed

    Kundrát, Pavel

    2009-03-01

    The probabilistic two-stage model of cell killing by ionizing radiation makes it possible to represent both the induction of damage by radiation and its repair by the cell. The model's properties and applications, as well as a possible interpretation of the underlying damage classification, are discussed. Analyses of published survival data for V79 hamster cells irradiated by protons and by He, C, O, and Ne ions are reported, quantifying the variations in radiation quality with increasing charge and linear energy transfer of the ions.

  7. A methodology for post-mainshock probabilistic assessment of building collapse risk

    USGS Publications Warehouse

    Luco, N.; Gerstenberger, M.C.; Uma, S.R.; Ryu, H.; Liel, A.B.; Raghunandan, M.

    2011-01-01

    This paper presents a methodology for post-earthquake probabilistic risk (of damage) assessment that we propose in order to develop a computational tool for automatic or semi-automatic assessment. The methodology utilizes the same so-called risk integral which can be used for pre-earthquake probabilistic assessment. The risk integral couples (i) ground motion hazard information for the location of a structure of interest with (ii) knowledge of the fragility of the structure with respect to potential ground motion intensities. In the proposed post-mainshock methodology, the ground motion hazard component of the risk integral is adapted to account for aftershocks which are deliberately excluded from typical pre-earthquake hazard assessments and which decrease in frequency with the time elapsed since the mainshock. Correspondingly, the structural fragility component is adapted to account for any damage caused by the mainshock, as well as any uncertainty in the extent of this damage. The result of the adapted risk integral is a fully-probabilistic quantification of post-mainshock seismic risk that can inform emergency response mobilization, inspection prioritization, and re-occupancy decisions.
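The risk integral referred to above couples a ground motion hazard curve with a structural fragility function; numerically it is a discrete convolution over intensity measure (IM) bins. A sketch with an assumed power-law hazard curve and a lognormal fragility (all parameter values are illustrative):

```python
import math

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def fragility(im, median=0.6, beta=0.4):
    """Lognormal fragility: P(damage | IM = im)."""
    return norm_cdf(math.log(im / median) / beta)

def hazard(im, k0=1e-4, k=2.5):
    """Assumed power-law hazard: annual rate of exceeding im."""
    return k0 * im ** (-k)

# risk integral: lambda_fail = sum over IM bins of
#   P(damage | im) * (rate of events falling in that bin)
ims = [0.01 * i for i in range(1, 500)]
lam_fail = 0.0
for lo, hi in zip(ims, ims[1:]):
    d_rate = hazard(lo) - hazard(hi)          # rate of events in [lo, hi)
    lam_fail += fragility(0.5 * (lo + hi)) * d_rate
# lam_fail: mean annual frequency of damage
```

In the post-mainshock setting described above, the hazard term would be replaced by a time-decaying aftershock rate and the fragility by its damaged-state counterpart; the convolution itself is unchanged.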

  8. Advanced probabilistic risk analysis using RAVEN and RELAP-7

    SciTech Connect

    Rabiti, Cristian; Alfonsi, Andrea; Mandelli, Diego; Cogliati, Joshua; Kinoshita, Robert

    2014-06-01

    RAVEN, under the support of the Nuclear Energy Advanced Modeling and Simulation (NEAMS) program [1], is advancing its capability to perform statistical analyses of stochastic dynamic systems. This is aligned with its mission to provide the tools needed by the Risk Informed Safety Margin Characterization (RISMC) path-lead [2] under the Department Of Energy (DOE) Light Water Reactor Sustainability program [3]. In particular this task is focused on the synergetic development with the RELAP-7 [4] code to advance the state of the art on the safety analysis of nuclear power plants (NPP). The investigation of the probabilistic evolution of accident scenarios for a complex system such as a nuclear power plant is not a trivial challenge. The complexity of the system to be modeled leads to demanding computational requirements even to simulate one of the many possible evolutions of an accident scenario (tens of CPU/hour). At the same time, the probabilistic analysis requires thousands of runs to investigate outcomes characterized by low probability and severe consequence (tail problem). The milestone reported in June of 2013 [5] described the capability of RAVEN to implement complex control logic and provide an adequate support for the exploration of the probabilistic space using a Monte Carlo sampling strategy. Unfortunately the Monte Carlo approach is ineffective with a problem of this complexity. In the following year of development, the RAVEN code has been extended with more sophisticated sampling strategies (grids, Latin Hypercube, and adaptive sampling). This milestone report illustrates the effectiveness of those methodologies in performing the assessment of the probability of core damage following the onset of a Station Black Out (SBO) situation in a boiling water reactor (BWR). The first part of the report provides an overview of the available probabilistic analysis capabilities, ranging from the different types of distributions available, possible sampling
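Of the sampling strategies mentioned, Latin hypercube sampling is easy to sketch: each dimension is split into n equal strata and exactly one sample falls in each stratum, so n runs cover the marginal ranges far more evenly than n Monte Carlo draws. A minimal implementation (not RAVEN's code):

```python
import random

def latin_hypercube(n, dims, rng=random.Random(0)):
    """n samples in the unit hypercube [0,1]^dims, one per stratum
    in every dimension."""
    cols = []
    for _ in range(dims):
        # one point inside each of the n equal strata, then shuffle
        col = [(i + rng.random()) / n for i in range(n)]
        rng.shuffle(col)
        cols.append(col)
    return list(zip(*cols))

samples = latin_hypercube(10, 2)
# every dimension has exactly one sample in each interval [i/10, (i+1)/10)
```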

  9. Probabilistic Methodology for Estimation of Number and Economic Loss (Cost) of Future Landslides in the San Francisco Bay Region, California

    USGS Publications Warehouse

    Crovelli, Robert A.; Coe, Jeffrey A.

    2008-01-01

    The Probabilistic Landslide Assessment Cost Estimation System (PLACES) presented in this report estimates the number and economic loss (cost) of landslides during a specified future time in individual areas, and then calculates the sum of those estimates. The analytic probabilistic methodology is based upon conditional probability theory and laws of expectation and variance. The probabilistic methodology is expressed in the form of a Microsoft Excel computer spreadsheet program. Using historical records, the PLACES spreadsheet is used to estimate the number of future damaging landslides and total damage, as economic loss, from future landslides caused by rainstorms in 10 counties of the San Francisco Bay region in California. Estimates are made for any future 5-year period of time. The estimated total number of future damaging landslides for the entire 10-county region during any future 5-year period of time is about 330. Santa Cruz County has the highest estimated number of damaging landslides (about 90), whereas Napa, San Francisco, and Solano Counties have the lowest estimated number of damaging landslides (5-6 each). Estimated direct costs from future damaging landslides for the entire 10-county region for any future 5-year period are about US $76 million (year 2000 dollars). San Mateo County has the highest estimated costs ($16.62 million), and Solano County has the lowest estimated costs (about $0.90 million). Estimated direct costs are also subdivided into public and private costs.
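The expectation arithmetic behind such an estimate is straightforward: a historical rate scaled to the forecast horizon, multiplied by a mean cost per event, summed over counties. The records below are illustrative stand-ins (only the Santa Cruz rate is tuned to echo the roughly 90-landslide figure quoted above):

```python
# hypothetical historical records: (county, damaging landslides in the
# record, years of record, mean direct cost per landslide, $ millions)
records = [
    ("Santa Cruz", 180, 10, 0.20),
    ("San Mateo",  120, 10, 0.28),
    ("Napa",        11, 10, 0.15),
]

HORIZON = 5  # forecast period, years

def forecast(county, count, years, mean_cost):
    rate = count / years                  # landslides per year
    n_expected = rate * HORIZON           # E[N] over the horizon
    return n_expected, n_expected * mean_cost

per_county = [forecast(*r) for r in records]
total_n = sum(n for n, _ in per_county)      # expected landslide count
total_cost = sum(c for _, c in per_county)   # expected loss, $ millions
```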

  10. A Probabilistic, Facility-Centric Approach to Lightning Strike Location

    NASA Technical Reports Server (NTRS)

    Huddleston, Lisa L.; Roeder, William P.; Merceret, Francis J.

    2012-01-01

    A new probabilistic facility-centric approach to lightning strike location has been developed. This process uses the bivariate Gaussian distribution of probability density provided by the current lightning location error ellipse for the most likely location of a lightning stroke and integrates it to determine the probability that the stroke is inside any specified radius of any location, even if that location is not centered on or even within the location error ellipse. This technique is adapted from a method of calculating the probability of debris collision with spacecraft. Such a technique is important in spaceport processing activities because it allows engineers to quantify the risk of induced current damage to critical electronics due to nearby lightning strokes. This technique was tested extensively and is now in use by space launch organizations at Kennedy Space Center and Cape Canaveral Air Force Station. Future applications could include forensic meteorology.
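The integral described, a bivariate Gaussian integrated over a disk that need not be centred on the error ellipse, has no simple closed form but is straightforward by Monte Carlo. A sketch with assumed ellipse parameters (all distances in metres, values purely illustrative):

```python
import random, math

random.seed(7)

def p_within(mu, var_x, var_y, rho, center, radius, n=200_000):
    """Monte Carlo estimate of the probability that a stroke drawn from
    the bivariate Gaussian location error (mean mu, variances var_x,
    var_y, correlation rho) falls within `radius` of `center`."""
    sx, sy = math.sqrt(var_x), math.sqrt(var_y)
    hits = 0
    for _ in range(n):
        z1, z2 = random.gauss(0, 1), random.gauss(0, 1)
        x = mu[0] + sx * z1
        y = mu[1] + sy * (rho * z1 + math.sqrt(1 - rho ** 2) * z2)
        if math.dist((x, y), center) <= radius:
            hits += 1
    return hits / n

# error ellipse centred 300 m from the facility, 200 m standard deviations
p = p_within(mu=(300.0, 0.0), var_x=200.0**2, var_y=200.0**2,
             rho=0.0, center=(0.0, 0.0), radius=500.0)
```

The production method would evaluate the same integral analytically or by quadrature; the Monte Carlo version shows the quantity being computed.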

  11. Advanced Seismic Probabilistic Risk Assessment Demonstration Project Plan

    SciTech Connect

    Justin Coleman

    2014-09-01

    Idaho National Laboratory (INL) has an ongoing research and development (R&D) project to remove excess conservatism from seismic probabilistic risk assessment (SPRA) calculations. These risk calculations should focus on providing best estimate results, and associated insights, for evaluation and decision-making. This report presents a plan for improving our current traditional SPRA process using a seismic event recorded at a nuclear power plant site, with known outcomes, to improve the decision-making process. SPRAs are intended to provide best estimates of the various combinations of structural and equipment failures that can lead to a seismically induced core damage event. However, in general this approach has been conservative, and potentially masks other important events (for instance, it was not the seismic motions that caused the Fukushima core melt events, but the tsunami ingress into the facility).

  12. Demonstrate Ames Laboratory capability in Probabilistic Risk Assessment (PRA)

    SciTech Connect

    Bluhm, D.; Greimann, L.; Fanous, F.; Challa, R.; Gupta, S.

    1993-07-01

    In response to the damage which occurred during the Three Mile Island nuclear accident, the Nuclear Regulatory Commission (NRC) has implemented a Probabilistic Risk Assessment (PRA) program to evaluate the safety of nuclear power facilities during events with a low probability of occurrence. The PRA can be defined as a mathematical technique to identify and rank the importance of event sequences that can lead to a severe nuclear accident. Another PRA application is the evaluation of nuclear containment buildings subjected to earthquakes. In order to perform a seismic PRA, the two conditional probabilities of ground motion and of structural failure of the different components given a specific earthquake are first studied. The first of these is termed the probability of exceedance and the second the seismic fragility analysis. The seismic fragility analysis is then related to the ground motion measured in terms of g to obtain a plant-level fragility curve.

  13. Probabilistic Survivability Versus Time Modeling

    NASA Technical Reports Server (NTRS)

    Joyner, James J., Sr.

    2015-01-01

    This technical paper documents Kennedy Space Center's Independent Assessment team work completed on three assessments for the Ground Systems Development and Operations (GSDO) Program to assist the Chief Safety and Mission Assurance Officer (CSO) and GSDO management during key programmatic reviews. The assessments provided the GSDO Program with an analysis of how egress time affects the likelihood of astronaut and worker survival during an emergency. For each assessment, the team developed probability distributions for hazard scenarios to address statistical uncertainty, resulting in survivability plots over time. The first assessment developed a mathematical model of probabilistic survivability versus time to reach a safe location using an ideal Emergency Egress System at Launch Complex 39B (LC-39B); the second used the first model to evaluate and compare various egress systems under consideration at LC-39B. The third used a modified LC-39B model to determine if a specific hazard decreased survivability more rapidly than other events during flight hardware processing in Kennedy's Vehicle Assembly Building (VAB). Based on the composite survivability versus time graphs from the first two assessments, there was a soft knee in the Figure of Merit graphs at eight minutes (ten minutes after egress was ordered). Thus, the graphs illustrated to the decision makers that the final emergency egress design selected should have the capability of transporting the flight crew from the top of LC-39B to a safe location in eight minutes or less. Results for the third assessment were dominated by hazards that were classified as instantaneous in nature (e.g. stacking mishaps) and therefore had no effect on survivability versus time to egress the VAB. VAB emergency scenarios that degraded over time (e.g. fire) produced survivability versus time graphs that were in line with aerospace industry norms.

  14. A probabilistic disease-gene finder for personal genomes.

    PubMed

    Yandell, Mark; Huff, Chad; Hu, Hao; Singleton, Marc; Moore, Barry; Xing, Jinchuan; Jorde, Lynn B; Reese, Martin G

    2011-09-01

    VAAST (the Variant Annotation, Analysis & Search Tool) is a probabilistic search tool for identifying damaged genes and their disease-causing variants in personal genome sequences. VAAST builds on existing amino acid substitution (AAS) and aggregative approaches to variant prioritization, combining elements of both into a single unified likelihood framework that allows users to identify damaged genes and deleterious variants with greater accuracy, and in an easy-to-use fashion. VAAST can score both coding and noncoding variants, evaluating the cumulative impact of both types of variants simultaneously. VAAST can identify rare variants causing rare genetic diseases, and it can also use both rare and common variants to identify genes responsible for common diseases. VAAST thus has a much greater scope of use than any existing methodology. Here we demonstrate its ability to identify damaged genes using small cohorts (n = 3) of unrelated individuals, wherein no two share the same deleterious variants, and for common, multigenic diseases using as few as 150 cases.

  15. Validation of seismic probabilistic risk assessments of nuclear power plants

    SciTech Connect

    Ellingwood, B.

    1994-01-01

    A seismic probabilistic risk assessment (PRA) of a nuclear plant requires identification and information regarding the seismic hazard at the plant site, dominant accident sequences leading to core damage, and structure and equipment fragilities. Uncertainties are associated with each of these ingredients of a PRA. Sources of uncertainty due to seismic hazard and assumptions underlying the component fragility modeling may be significant contributors to uncertainty in estimates of core damage probability. Design and construction errors also may be important in some instances. When these uncertainties are propagated through the PRA, the frequency distribution of core damage probability may span three orders of magnitude or more. This large variability brings into question the credibility of PRA methods and the usefulness of insights to be gained from a PRA. The sensitivity of accident sequence probabilities and high-confidence, low probability of failure (HCLPF) plant fragilities to seismic hazard and fragility modeling assumptions was examined for three nuclear power plants. Mean accident sequence probabilities were found to be relatively insensitive (by a factor of two or less) to: uncertainty in the coefficient of variation (logarithmic standard deviation) describing inherent randomness in component fragility; truncation of lower tail of fragility; uncertainty in random (non-seismic) equipment failures (e.g., diesel generators); correlation between component capacities; and functional form of fragility family. On the other hand, the accident sequence probabilities, expressed in the form of a frequency distribution, are affected significantly by the seismic hazard modeling, including slopes of seismic hazard curves and likelihoods assigned to those curves.
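The HCLPF plant capacity mentioned above is conventionally computed from a lognormal fragility by the separation-of-variables expression HCLPF = Am·exp[-1.645(βR + βU)], i.e. the capacity with 95% confidence of no more than a 5% failure probability. The numerical values below are illustrative:

```python
import math

def hclpf(a_median, beta_r, beta_u):
    """High-Confidence (95%), Low-Probability-of-Failure (5%) capacity
    for a lognormal fragility with median capacity a_median, randomness
    beta_r, and uncertainty beta_u (separation-of-variables form)."""
    return a_median * math.exp(-1.645 * (beta_r + beta_u))

# e.g. median ground-motion capacity 1.2 g, beta_r = 0.25, beta_u = 0.35
cap = hclpf(1.2, 0.25, 0.35)
```

Note how sensitive the result is to the logarithmic standard deviations: the two betas enter the exponent additively, which is one reason fragility modelling assumptions dominate the spread of PRA results discussed above.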

  16. Probabilistic, Seismically-Induced Landslide Hazard Mapping of Western Oregon

    NASA Astrophysics Data System (ADS)

    Olsen, M. J.; Sharifi Mood, M.; Gillins, D. T.; Mahalingam, R.

    2015-12-01

    Earthquake-induced landslides can generate significant damage within urban communities by damaging structures, obstructing lifeline connection routes and utilities, generating various environmental impacts, and possibly resulting in loss of life. Reliable hazard and risk maps are important to assist agencies in efficiently allocating and managing limited resources to prepare for such events. This research presents a new methodology in order to communicate site-specific landslide hazard assessments in a large-scale, regional map. Implementation of the proposed methodology results in seismic-induced landslide hazard maps that depict the probabilities of exceeding landslide displacement thresholds (e.g. 0.1, 0.3, 1.0 and 10 meters). These maps integrate a variety of data sources including: recent landslide inventories, LIDAR and photogrammetric topographic data, geology map, mapped NEHRP site classifications based on available shear wave velocity data in each geologic unit, and USGS probabilistic seismic hazard curves. Soil strength estimates were obtained by evaluating slopes present along landslide scarps and deposits for major geologic units. Code was then developed to integrate these layers to perform a rigid, sliding block analysis to determine the amount and associated probabilities of displacement based on each bin of peak ground acceleration in the seismic hazard curve at each pixel. The methodology was applied to western Oregon, which contains weak, weathered, and often wet soils at steep slopes. Such conditions have a high landslide hazard even without seismic events. A series of landslide hazard maps highlighting the probabilities of exceeding the aforementioned thresholds were generated for the study area. These output maps were then utilized in a performance based design framework enabling them to be analyzed in conjunction with other hazards for fully probabilistic-based hazard evaluation and risk assessment.
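The rigid sliding-block (Newmark) analysis used above can be sketched in a few lines: the block accumulates displacement whenever ground acceleration exceeds the yield acceleration, and keeps sliding until its relative velocity returns to zero. The pulse input below is synthetic, purely to exercise the integration:

```python
def newmark_displacement(accel, dt, a_yield):
    """accel: ground acceleration series (m/s^2), dt: time step (s),
    a_yield: yield acceleration of the slope (m/s^2). Returns the
    permanent sliding displacement in metres."""
    v = d = 0.0
    for a in accel:
        if a > a_yield or v > 0.0:            # sliding starts or continues
            v = max(v + (a - a_yield) * dt, 0.0)  # no backward sliding
            d += v * dt
    return d

# synthetic input: a 1 s, 3 m/s^2 pulse against a 1 m/s^2 yield acceleration
accel = [3.0] * 100 + [0.0] * 300
disp = newmark_displacement(accel, dt=0.01, a_yield=1.0)
```

Repeating this for acceleration histories in each PGA bin of the hazard curve, and weighting by the bin rates, yields the probabilities of exceeding the displacement thresholds mapped in the study.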

  17. Orchid flowers tolerance to gamma-radiation

    NASA Astrophysics Data System (ADS)

    Kikuchi, Olivia Kimiko

    2000-03-01

    Cut flowers are fresh goods that may be treated with fumigants such as methyl bromide to meet the quarantine requirements of importing countries. Irradiation is a non-chemical alternative to the methyl bromide treatment of fresh products. In this research, different cut orchids were irradiated to examine their tolerance to gamma rays. A 200 Gy dose inhibited the Dendrobium phalaenopsis buds from opening, but did not cause visible damage to opened flowers. Doses of 800 and 1000 Gy were damaging because they provoked the flowers to drop from the stem. Cattleya flowers irradiated with 750 Gy did not show any damage and were therefore eligible for the radiation treatment. Cymbidium tolerated up to 300 Gy and above this dose dropped prematurely. On the other hand, Oncidium did not tolerate doses above 150 Gy.

  18. Probabilistic numerics and uncertainty in computations

    PubMed Central

    Hennig, Philipp; Osborne, Michael A.; Girolami, Mark

    2015-01-01

    We deliver a call to arms for probabilistic numerical methods: algorithms for numerical tasks, including linear algebra, integration, optimization and solving differential equations, that return uncertainties in their calculations. Such uncertainties, arising from the loss of precision induced by numerical calculation with limited time or hardware, are important for much contemporary science and industry. Within applications such as climate science and astrophysics, the need to make decisions on the basis of computations with large and complex data have led to a renewed focus on the management of numerical uncertainty. We describe how several seminal classic numerical methods can be interpreted naturally as probabilistic inference. We then show that the probabilistic view suggests new algorithms that can flexibly be adapted to suit application specifics, while delivering improved empirical performance. We provide concrete illustrations of the benefits of probabilistic numeric algorithms on real scientific problems from astrometry and astronomical imaging, while highlighting open problems with these new algorithms. Finally, we describe how probabilistic numerical methods provide a coherent framework for identifying the uncertainty in calculations performed with a combination of numerical algorithms (e.g. both numerical optimizers and differential equation solvers), potentially allowing the diagnosis (and control) of error sources in computations. PMID:26346321
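The simplest instance of a numerical method that returns its own uncertainty is Monte Carlo integration, which reports a standard error alongside the estimate. The paper's probabilistic-inference view goes considerably further, but this toy illustrates the interface: every numerical answer comes with a quantified numerical uncertainty.

```python
import random, math

random.seed(3)

def integrate_with_uncertainty(f, a, b, n=20_000):
    """Estimate the integral of f over [a, b] and report a standard
    error quantifying the numerical uncertainty of the estimate."""
    xs = [random.uniform(a, b) for _ in range(n)]
    ys = [f(x) for x in xs]
    mean = sum(ys) / n
    var = sum((y - mean) ** 2 for y in ys) / (n - 1)
    est = (b - a) * mean
    stderr = (b - a) * math.sqrt(var / n)
    return est, stderr

est, err = integrate_with_uncertainty(math.sin, 0.0, math.pi)
# the true value is 2; err quantifies how far off est may plausibly be
```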

  19. A probabilistic solution of robust H∞ control problem with scaled matrices

    NASA Astrophysics Data System (ADS)

    Xie, R.; Gong, J. Y.

    2016-07-01

    This paper addresses the robust H∞ control problem with scaled matrices. It is difficult to find a global optimal solution for this non-convex optimisation problem. A probabilistic solution, which can achieve globally optimal robust performance within any pre-specified tolerance, is obtained using the proposed method based on a randomised algorithm. In the proposed method, the scaled H∞ control problem is divided into two parts: (1) treating the scaled matrices as random variables, the scaled H∞ control problem is converted into a convex optimisation problem for each fixed sample of the scaled matrix, and an optimal solution corresponding to that fixed sample is obtained; (2) a probabilistic optimal solution is then obtained by applying the randomised algorithm to the finite number N of optimal solutions obtained in part (1). The analysis shows that the worst-case complexity of the proposed method is polynomial.
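    The two-part scheme can be sketched on a toy problem (purely illustrative; the quadratic cost below is a stand-in for the fixed-sample convex H∞ subproblem, not actual controller synthesis):

```python
import random

def solve_fixed_sample(scale):
    """Part (1): for a fixed sample of the scaling variable the
    subproblem is convex.  Here it is the toy cost
    (x - scale)^2 + 0.1 * x^2, minimized in closed form."""
    x = scale / 1.1
    return x, (x - scale) ** 2 + 0.1 * x ** 2

def randomized_design(n_samples=200, seed=1):
    """Part (2): draw N samples of the scaling, solve each fixed-sample
    convex subproblem, and keep the best solution found.  As N grows,
    the result is optimal to within any pre-specified probabilistic
    tolerance."""
    rng = random.Random(seed)
    solutions = [solve_fixed_sample(rng.uniform(0.1, 10.0))
                 for _ in range(n_samples)]
    return min(solutions, key=lambda s: s[1])

x_best, cost_best = randomized_design()
```

    The sample count N is the knob that trades computation for the probabilistic tolerance: more samples tighten the guarantee without ever requiring the non-convex joint problem to be solved directly.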

  20. Fault Tolerant State Machines

    NASA Technical Reports Server (NTRS)

    Burke, Gary R.; Taft, Stephanie

    2004-01-01

    State machines are commonly used to control sequential logic in FPGAs and ASICs. An errant state machine can cause considerable damage to the device it is controlling. For example, in space applications the FPGA might be controlling pyros, which when fired at the wrong time will cause a mission failure. Even a well designed state machine can be subject to random errors as a result of SEUs from the radiation environment in space. There are various ways to encode the states of a state machine, and the type of encoding makes a large difference in the susceptibility of the state machine to radiation. In this paper we compare four methods of state machine encoding to find which gives the best fault tolerance, and we determine the resources needed by each method.
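    The intuition behind comparing encodings can be sketched with a minimum Hamming distance check (illustrative Python, not the paper's FPGA designs): a single-event upset flips one bit, so an encoding whose valid state codes are pairwise at distance ≥ 2 can detect a single upset, and distance ≥ 3 admits correction.

```python
from itertools import combinations

def hamming(a, b):
    """Number of bit positions in which two state codes differ."""
    return bin(a ^ b).count("1")

def min_distance(codes):
    """Minimum pairwise Hamming distance of a state encoding.  A single
    SEU moves the state register by distance 1, so distance >= 2 means
    a corrupted word is never a valid state (detectable), and >= 3
    allows correction."""
    return min(hamming(a, b) for a, b in combinations(codes, 2))

dense_binary = [0b00, 0b01, 0b10, 0b11]           # distance 1: an SEU yields another valid state
one_hot      = [0b0001, 0b0010, 0b0100, 0b1000]   # distance 2: single upsets are detectable
```

    The cost side of the trade-off is visible too: the one-hot encoding detects single upsets but needs twice as many flip-flops for the same four states.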

  1. Future trends in flood risk in Indonesia - A probabilistic approach

    NASA Astrophysics Data System (ADS)

    Muis, Sanne; Guneralp, Burak; Jongman, Brenden; Ward, Philip

    2014-05-01

    Indonesia is one of the 10 most populous countries in the world and is highly vulnerable to (river) flooding. Catastrophic floods occur on a regular basis; total estimated damages were US$0.8 billion in 2010 and US$3 billion in 2013. Large parts of Greater Jakarta, the capital city, are subject to flooding every year. Flood risks (i.e. the product of hazard, exposure and vulnerability) are increasing due to rapid growth in exposure, such as strong population growth and ongoing economic development. The increase in risk may be further amplified by increasing flood hazard, such as greater flood frequency and intensity due to climate change and land subsidence. The implementation of adaptation measures, such as the construction of dykes and strategic urban planning, may counteract these trends. However, despite its importance for adaptation planning, a comprehensive assessment of current and future flood risk in Indonesia is lacking. This contribution addresses this issue and aims to provide insight into how socio-economic trends and climate change projections may shape future flood risks in Indonesia. Flood risks were calculated using an adapted version of the GLOFRIS global flood risk assessment model. Using this approach, we produced probabilistic maps of flood risk (i.e. annual expected damage) at a resolution of 30"x30" (ca. 1 km x 1 km at the equator). To represent flood exposure, we produced probabilistic projections of urban growth in a Monte Carlo fashion based on probability density functions of projected population and GDP values for 2030. To represent flood hazard, inundation maps were computed using the hydrological-hydraulic component of GLOFRIS. These maps show flood inundation extent and depth for several return periods and were produced for several combinations of GCMs and future socioeconomic scenarios. Finally, the implementation of different adaptation strategies was incorporated into the model to explore to what extent adaptation may be able to
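    The exposure side of such a calculation can be sketched as follows (the growth distributions and per-return-period damage fractions are hypothetical, chosen only to illustrate the Monte Carlo propagation):

```python
import random
import statistics

def expected_annual_damage(exposure, damage_fraction):
    """Risk as annual expected damage: sum over return periods T of
    (1/T) * damage, scaled by the exposed asset value."""
    return sum(frac * exposure / t for t, frac in damage_fraction.items())

def project_risk_2030(n_draws=5000, seed=2):
    """Monte Carlo projection of future risk: sample an exposure growth
    factor (population x GDP per capita, both hypothetical lognormals)
    and propagate each draw through the damage model."""
    rng = random.Random(seed)
    damage_fraction = {10: 0.02, 50: 0.10, 100: 0.25}  # illustrative, per return period
    draws = []
    for _ in range(n_draws):
        pop_growth = rng.lognormvariate(0.20, 0.05)   # hypothetical 2010->2030 factor
        gdp_growth = rng.lognormvariate(0.35, 0.10)   # hypothetical
        draws.append(expected_annual_damage(pop_growth * gdp_growth,
                                            damage_fraction))
    return statistics.mean(draws), statistics.quantiles(draws, n=20)

mean_risk, risk_quantiles = project_risk_2030()
```

    Reporting the quantiles rather than a single number is what makes the resulting risk maps probabilistic: each grid cell carries a distribution of annual expected damage, not a point estimate.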

  2. Degradation monitoring using probabilistic inference

    NASA Astrophysics Data System (ADS)

    Alpay, Bulent

    In order to increase safety and improve economy and performance in a nuclear power plant (NPP), the source and extent of component degradations should be identified before failures and breakdowns occur. It is also crucial for the next generation of NPPs, which are designed to have a long core life and high fuel burnup, to have a degradation monitoring system in order to keep the reactor in a safe state, to meet the designed reactor core lifetime and to optimize the scheduled maintenance. Model-based methods are based on determining the inconsistencies between the actual and expected behavior of the plant, and use these inconsistencies for detection and diagnostics of degradations. By defining degradation as a random abrupt change from the nominal to a constant degraded state of a component, we employed nonlinear filtering techniques based on state/parameter estimation. We utilized a Bayesian recursive estimation formulation in the sequential probabilistic inference framework and constructed a hidden Markov model to represent a general physical system. By addressing the problem of a filter's inability to estimate an abrupt change, which is called the oblivious filter problem in nonlinear extensions of Kalman filtering, and the sample impoverishment problem in particle filtering, we developed techniques to modify filtering algorithms by utilizing additional data sources to improve the filter's response to this problem. We utilized a reliability degradation database that can be constructed from plant-specific operational experience and test and maintenance reports to generate proposal densities for probable degradation modes. These are used in a multiple hypothesis testing algorithm. We then test samples drawn from these proposal densities against the particle filtering estimates based on the Bayesian recursive estimation formulation with the Metropolis-Hastings algorithm, a well-known Markov chain Monte Carlo (MCMC) method. This multiple hypothesis testing
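    The core mechanism can be illustrated with a minimal bootstrap particle filter (a simplified sketch, not the dissertation's implementation; the degraded value 5.0 and the per-step jump probability are hypothetical stand-ins for the reliability-database proposal densities):

```python
import math
import random

def particle_filter(observations, n=500, jump_prob=0.02, noise=0.5, seed=3):
    """Track a parameter that may jump abruptly from nominal (0.0) to a
    degraded value (5.0).  The small per-step jump probability acts
    like a degradation-mode proposal density: it keeps a few particles
    in the degraded state, so the filter is not oblivious to the
    abrupt change."""
    rng = random.Random(seed)
    particles = [0.0] * n
    estimates = []
    for z in observations:
        # Propose: with small probability, a particle enters the degraded mode.
        particles = [5.0 if rng.random() < jump_prob else p for p in particles]
        # Weight each particle by the Gaussian likelihood of the observation.
        weights = [math.exp(-0.5 * ((z - p) / noise) ** 2) for p in particles]
        # Resample (multinomial) to concentrate on high-likelihood particles.
        particles = rng.choices(particles, weights=weights, k=n)
        estimates.append(sum(particles) / n)
    return estimates

# Nominal sensor readings for 10 steps, then an abrupt degradation.
obs = [0.1, -0.2, 0.0, 0.3, -0.1, 0.2, 0.0, -0.3, 0.1, 0.0,
       5.1, 4.8, 5.2, 5.0, 4.9, 5.1, 5.0, 4.8, 5.2, 5.0]
tracked = particle_filter(obs)
```

    Without the proposal injection (`jump_prob = 0`), every particle stays at the nominal value and resampling can never recover the degraded state, which is exactly the oblivious-filter/sample-impoverishment failure the text describes.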

  3. Probabilistic Exposure Analysis for Chemical Risk Characterization

    PubMed Central

    Bogen, Kenneth T.; Cullen, Alison C.; Frey, H. Christopher; Price, Paul S.

    2009-01-01

    This paper summarizes the state of the science of probabilistic exposure assessment (PEA) as applied to chemical risk characterization. Current probabilistic risk analysis methods applied to PEA are reviewed. PEA within the context of risk-based decision making is discussed, including probabilistic treatment of related uncertainty, interindividual heterogeneity, and other sources of variability. Key examples of recent experience gained in assessing human exposures to chemicals in the environment, and other applications to chemical risk characterization and assessment, are presented. It is concluded that, although improvements continue to be made, existing methods suffice for effective application of PEA to support quantitative analyses of the risk of chemically induced toxicity that play an increasing role in key decision-making objectives involving health protection, triage, civil justice, and criminal justice. Different types of information required to apply PEA to these different decision contexts are identified, and specific PEA methods are highlighted that are best suited to exposure assessment in these separate contexts. PMID:19223660

  4. Probabilistic Cue Combination: Less is More

    PubMed Central

    Yurovsky, Daniel; Boyer, Ty W.; Smith, Linda B.; Yu, Chen

    2012-01-01

    Learning about the structure of the world requires learning probabilistic relationships: rules in which cues do not predict outcomes with certainty. However, in some cases, the ability to track probabilistic relationships is a handicap, leading adults to perform non-normatively in prediction tasks. For example, in the dilution effect, predictions made from the combination of two cues of different strengths are less accurate than those made from the stronger cue alone. Here we show that dilution is an adult problem; 11-month-old infants combine strong and weak predictors normatively. These results extend and add support for the less is more hypothesis: limited cognitive resources can lead children to represent probabilistic information differently from adults, and this difference in representation can have important downstream consequences for prediction. PMID:23432826

  5. Probabilistic Learning by Rodent Grid Cells

    PubMed Central

    Cheung, Allen

    2016-01-01

    Mounting evidence shows mammalian brains are probabilistic computers, but the specific cells involved remain elusive. Parallel research suggests that grid cells of the mammalian hippocampal formation are fundamental to spatial cognition but their diverse response properties still defy explanation. No plausible model exists that explains stable grids in darkness for twenty minutes or longer, despite this being one of the first results ever published on grid cells. Similarly, no current explanation can tie together grid fragmentation and grid rescaling, which show very different forms of flexibility in grid responses when the environment is varied. Other properties such as attractor dynamics and grid anisotropy seem to be at odds with one another unless additional properties are assumed such as a varying velocity gain. Modelling efforts have largely ignored the breadth of response patterns, while also failing to account for the disastrous effects of sensory noise during spatial learning and recall, especially in darkness. Here, published electrophysiological evidence from a range of experiments is reinterpreted using a novel probabilistic learning model, which shows that grid cell responses are accurately predicted by a probabilistic learning process. Diverse response properties of probabilistic grid cells are statistically indistinguishable from rat grid cells across key manipulations. A simple coherent set of probabilistic computations explains stable grid fields in darkness, partial grid rescaling in resized arenas, low-dimensional attractor grid cell dynamics, and grid fragmentation in hairpin mazes. The same computations also reconcile oscillatory dynamics at the single cell level with attractor dynamics at the cell ensemble level. Additionally, a clear functional role for boundary cells is proposed for spatial learning. 
These findings provide a parsimonious and unified explanation of grid cell function, and implicate grid cells as an accessible neuronal population

  6. Probabilistic modeling of subgrade soil strengths

    NASA Astrophysics Data System (ADS)

    Chou, Y. T.

    1981-09-01

    A concept of spatial average in probabilistic modeling of subgrade soil strength is presented. The advantage of the application of spatial average to pavement engineering is explained. The link between the concept and the overall probability-based pavement design procedure is formulated and explained. In the earlier part of the report, a literature review of the concept and procedure of probabilistic design of pavements, which includes the concepts of variations and reliability, is presented. Finally, an outline of a probability based pavement design procedure for the Corps of Engineers is presented.

  7. Recent Advances in Composite Damage Mechanics

    NASA Technical Reports Server (NTRS)

    Reifsnider, Ken; Case, Scott; Iyengar, Nirmal

    1996-01-01

    The state of the art and recent developments in the field of composite material damage mechanics are reviewed, with emphasis on damage accumulation. The kinetics of damage accumulation are considered with emphasis on the general accumulation of discrete local damage events such as single or multiple fiber fractures or microcrack formation. The issues addressed include: how to define strength in the presence of widely distributed damage, and how to combine mechanical representations in order to predict the damage tolerance and life of engineering components. It is shown that a damage mechanics approach can be related to the thermodynamics of the damage accumulation processes in composite laminates subjected to mechanical loading and environmental conditions over long periods of time.

  8. Transplantation tolerance

    PubMed Central

    Muller, Yannick D; Seebach, Jörg D; Bühler, Leo H; Pascual, Manuel

    2011-01-01

    The major challenge in transplantation medicine remains long-term allograft acceptance, with preserved allograft function under minimal chronic immunosuppression. To safely achieve the goal of sustained donor-specific T and B cell non-responsiveness, research efforts are now focusing on therapies based on cell subsets with regulatory properties. In particular the transfusion of human regulatory T cells (Treg) is currently being evaluated in phase I/II clinical trials for the treatment of graft versus host disease following hematopoietic stem cell transplantation, and is also under consideration for solid organ transplantation. The purpose of this review is to recapitulate current knowledge on naturally occurring as well as induced human Treg, with emphasis on their specific phenotype, suppressive function and how these cells can be manipulated in vitro and/or in vivo for therapeutic purposes in transplantation medicine. We highlight the potential but also possible limitations of Treg-based strategies to promote long-term allograft survival. It is evident that the bench-to-bedside translation of these protocols still requires further understanding of Treg biology. Nevertheless, current data already suggest that Treg therapy alone will not be sufficient and needs to be combined with other immunomodulatory approaches in order to induce allograft tolerance. PMID:21776332

  9. Low Velocity Impact Damage to Carbon/Epoxy Laminates

    NASA Technical Reports Server (NTRS)

    Nettles, Alan T.

    2011-01-01

    Impact damage tends to be more detrimental to a laminate's compression strength than to its tensile strength. Proper use of Non-Destructive Evaluation (NDE) techniques can remove conservatism (weight) from many structures. Test the largest components economically feasible as coupons. If damage tolerance is a design driver, consider different resin systems. Do not use a single knockdown factor to account for damage.

  10. A Probabilistic Model of Melody Perception

    ERIC Educational Resources Information Center

    Temperley, David

    2008-01-01

    This study presents a probabilistic model of melody perception, which infers the key of a melody and also judges the probability of the melody itself. The model uses Bayesian reasoning: For any "surface" pattern and underlying "structure," we can infer the structure maximizing P(structure | surface) based on knowledge of P(surface,…

  11. The Probabilistic Nature of Preferential Choice

    ERIC Educational Resources Information Center

    Rieskamp, Jorg

    2008-01-01

    Previous research has developed a variety of theories explaining when and why people's decisions under risk deviate from the standard economic view of expected utility maximization. These theories are limited in their predictive accuracy in that they do not explain the probabilistic nature of preferential choice, that is, why an individual makes…

  12. Balkanization and Unification of Probabilistic Inferences

    ERIC Educational Resources Information Center

    Yu, Chong-Ho

    2005-01-01

    Many research-related classes in social sciences present probability as a unified approach based upon mathematical axioms, but neglect the diversity of various probability theories and their associated philosophical assumptions. Although currently the dominant statistical and probabilistic approach is the Fisherian tradition, the use of Fisherian…

  13. Dynamic Probabilistic Instability of Composite Structures

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.

    2009-01-01

    A computationally effective method is described to evaluate the non-deterministic dynamic instability (probabilistic dynamic buckling) of thin composite shells. The method is a judicious combination of available computer codes for finite element, composite mechanics and probabilistic structural analysis. The solution method is an incrementally updated Lagrangian formulation. It is illustrated by applying it to a thin composite cylindrical shell subjected to dynamic loads. Both deterministic and probabilistic buckling loads are evaluated to demonstrate the effectiveness of the method. A universal plot is obtained for the specific shell that can be used to approximate buckling loads for different load rates and different probability levels. Results from this plot show that the faster the rate, the higher the buckling load and the shorter the time. The lower the probability, the lower is the buckling load for a specific time. Probabilistic sensitivity results show that the ply thickness, the fiber volume ratio, the fiber longitudinal modulus, the dynamic load and the loading rate are the dominant uncertainties, in that order.

  14. Probabilistic Grammars for Natural Languages. Psychology Series.

    ERIC Educational Resources Information Center

    Suppes, Patrick

    The purpose of this paper is to define the framework within which empirical investigations of probabilistic grammars can take place and to sketch how this attack can be made. The full presentation of empirical results will be left to other papers. In the detailed empirical work, the author has depended on the collaboration of E. Gammon and A.…

  15. Probabilistic classification learning in Tourette syndrome.

    PubMed

    Kéri, Szabolcs; Szlobodnyik, Csaba; Benedek, György; Janka, Zoltán; Gádoros, Júlia

    2002-01-01

    Tourette syndrome (TS) is characterised by stereotyped involuntary movements, called tics. Some evidence suggests that structural and functional abnormalities of the basal ganglia may explain these motor symptoms. In this study, the probabilistic classification learning (PCL) test was used to evaluate basal ganglia functions in 10 children with less severe tics (Yale Global Tic Severity Scale (YGTSS) scores < 30) and in 10 children with more severe symptoms (YGTSS score > 30). In the PCL task, participants are asked to decide whether different combinations of four geometric forms (cues) predict rainy or sunny weather. Each cue is probabilistically related to a weather outcome, and feedback is provided after each decision. After completion of the probabilistic stimulus-response learning procedure, subjects received a transfer test to assess explicit knowledge about the cues. The children with TS exhibited impaired learning in the PCL task in comparison with the 20 healthy control subjects. This impairment was more pronounced in the TS patients with severe symptoms, and there was a significant negative relationship between the final classification performance and the YGTSS scores. The patients showed normal learning in the transfer test. These results suggest that the neostriatal habit learning system, which may play a central role in the acquisition of probabilistic associations, is dysfunctional in TS, especially in the case of more severe motor symptoms. The classification performance and the severity of tics were independent of the explicit knowledge obtained during the test.

  16. Bayesian probabilistic population projections for all countries

    PubMed Central

    Raftery, Adrian E.; Li, Nan; Ševčíková, Hana; Gerland, Patrick; Heilig, Gerhard K.

    2012-01-01

    Projections of countries’ future populations, broken down by age and sex, are widely used for planning and research. They are mostly done deterministically, but there is a widespread need for probabilistic projections. We propose a Bayesian method for probabilistic population projections for all countries. The total fertility rate and female and male life expectancies at birth are projected probabilistically using Bayesian hierarchical models estimated via Markov chain Monte Carlo using United Nations population data for all countries. These are then converted to age-specific rates and combined with a cohort component projection model. This yields probabilistic projections of any population quantity of interest. The method is illustrated for five countries of different demographic stages, continents and sizes. The method is validated by an out of sample experiment in which data from 1950–1990 are used for estimation, and applied to predict 1990–2010. The method appears reasonably accurate and well calibrated for this period. The results suggest that the current United Nations high and low variants greatly underestimate uncertainty about the number of oldest old from about 2050 and that they underestimate uncertainty for high fertility countries and overstate uncertainty for countries that have completed the demographic transition and whose fertility has started to recover towards replacement level, mostly in Europe. The results also indicate that the potential support ratio (persons aged 20–64 per person aged 65+) will almost certainly decline dramatically in most countries over the coming decades. PMID:22908249
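    The projection machinery can be sketched in miniature (a stylized three-age-group, female-only model with hypothetical rates; the actual method projects full age schedules with Bayesian hierarchical models fitted by MCMC):

```python
import random
import statistics

def cohort_component_step(pop, survival, fertility):
    """One step of a stylized cohort-component projection: births enter
    the youngest group, and each group ages forward with its survival
    rate."""
    births = sum(f * p for f, p in zip(fertility, pop))
    aged = [s * p for s, p in zip(survival, pop[:-1])]
    return [births] + aged

def probabilistic_projection(pop, survival, fertility, steps=5, n=1000, seed=5):
    """Propagate fertility uncertainty (a hypothetical ~10% lognormal
    scale per trajectory) through the projection and summarize the
    distribution of total population by its quartiles."""
    rng = random.Random(seed)
    totals = []
    for _ in range(n):
        scale = rng.lognormvariate(0.0, 0.1)
        p = list(pop)
        for _ in range(steps):
            p = cohort_component_step(p, survival, [scale * f for f in fertility])
        totals.append(sum(p))
    return statistics.quantiles(totals, n=4)  # lower quartile, median, upper quartile

low, median, high = probabilistic_projection(
    pop=[100.0, 100.0, 100.0], survival=[0.99, 0.95], fertility=[0.0, 0.5, 0.1])
```

    Because every trajectory runs the same deterministic cohort-component arithmetic under sampled rates, any population quantity of interest inherits a full predictive distribution rather than the fixed high/low variants.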

  17. Pigeons' Discounting of Probabilistic and Delayed Reinforcers

    ERIC Educational Resources Information Center

    Green, Leonard; Myerson, Joel; Calvert, Amanda L.

    2010-01-01

    Pigeons' discounting of probabilistic and delayed food reinforcers was studied using adjusting-amount procedures. In the probability discounting conditions, pigeons chose between an adjusting number of food pellets contingent on a single key peck and a larger, fixed number of pellets contingent on completion of a variable-ratio schedule. In the…

  18. Probabilistic Scale-Space Filtering Program

    NASA Technical Reports Server (NTRS)

    Kulkarni, Deepak; Kutulakos, Kiriakos

    1993-01-01

    Probabilistic Scale-Space Filtering (PSF) computer program implements scale-space technique to describe input signals as collections of nested hills and valleys organized in treelike structure. Helps to construct sparse representations of complicated signals. Calculates probabilities, with extracted features corresponding to physical processes. Written in C language (49 percent) and Common Lisp (51 percent).
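    The scale-space idea itself is easy to sketch (illustrative Python, not the PSF program): smoothing a signal with Gaussians of increasing width makes neighboring hills and valleys merge, which is what produces the nested tree description.

```python
import math

def gaussian_smooth(signal, sigma):
    """Convolve a 1-D signal with a truncated Gaussian kernel,
    clamping indices at the edges."""
    radius = max(1, int(3 * sigma))
    kernel = [math.exp(-0.5 * (i / sigma) ** 2) for i in range(-radius, radius + 1)]
    norm = sum(kernel)
    kernel = [k / norm for k in kernel]
    out = []
    for i in range(len(signal)):
        acc = 0.0
        for j, k in enumerate(kernel):
            idx = min(max(i + j - radius, 0), len(signal) - 1)
            acc += k * signal[idx]
        out.append(acc)
    return out

def count_extrema(signal):
    """Number of interior local maxima and minima (hills and valleys)."""
    return sum(1 for i in range(1, len(signal) - 1)
               if (signal[i] - signal[i - 1]) * (signal[i + 1] - signal[i]) < 0)

# A slow hump plus a fast ripple: many extrema at fine scale, few at coarse.
signal = [math.sin(0.05 * i) + 0.3 * math.sin(0.8 * i) for i in range(200)]
fine = count_extrema(gaussian_smooth(signal, 1.0))
coarse = count_extrema(gaussian_smooth(signal, 10.0))
```

    Tracking which fine-scale extrema survive into each coarser scale is what organizes the features into the nested tree structure the program builds.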

  19. Probabilistic Relational Structures and Their Applications

    ERIC Educational Resources Information Center

    Domotor, Zoltan

    The principal objects of the investigation reported were, first, to study qualitative probability relations on Boolean algebras, and secondly, to describe applications in the theories of probability logic, information, automata, and probabilistic measurement. The main contribution of this work is stated in 10 definitions and 20 theorems. The basic…

  20. Sensor Based Engine Life Calculation: A Probabilistic Perspective

    NASA Technical Reports Server (NTRS)

    Guo, Ten-Huei; Chen, Philip

    2003-01-01

    It is generally known that an engine component will accumulate damage (life usage) during its lifetime of use in a harsh operating environment. The commonly used cycle count for engine component usage monitoring has an inherent range of uncertainty which can be overly costly or potentially less safe from an operational standpoint. With the advance of computer technology, engine operation modeling, and the understanding of damage accumulation physics, it is possible (and desirable) to use the available sensor information to make a more accurate assessment of engine component usage. This paper describes a probabilistic approach to quantify the effects of engine operating parameter uncertainties on the thermomechanical fatigue (TMF) life of a selected engine part. A closed-loop engine simulation with a TMF life model is used to calculate the life consumption of different mission cycles. A Monte Carlo simulation approach is used to generate the statistical life usage profile for different operating assumptions. The probabilities of failure of different operating conditions are compared to illustrate the importance of the engine component life calculation using sensor information. The results of this study clearly show that a sensor-based life cycle calculation can greatly reduce the risk of component failure as well as extend on-wing component life by avoiding unnecessary maintenance actions.
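    A toy version of the Monte Carlo life-usage calculation described above (the cubic life law and the uncertainty magnitudes are hypothetical, chosen only to illustrate the propagation of sensor uncertainty into accumulated damage):

```python
import random
import statistics

def cycles_to_failure(temp_ratio):
    """Hypothetical TMF life law: running hotter than nominal
    (temp_ratio > 1) shortens component life sharply."""
    return 10_000 / temp_ratio ** 3

def mission_damage(rng):
    """Fraction of life consumed by one mission, with the peak
    temperature ratio drawn from a sensor-informed distribution
    (hypothetical Gaussian around nominal)."""
    temp_ratio = max(rng.gauss(1.0, 0.05), 0.8)  # guard the model's valid range
    return 1.0 / cycles_to_failure(temp_ratio)

def life_usage_profile(n_missions=500, n_trials=500, seed=4):
    """Monte Carlo distribution of accumulated life usage: the analogue
    of the paper's statistical usage profile for a mission mix."""
    rng = random.Random(seed)
    usage = [sum(mission_damage(rng) for _ in range(n_missions))
             for _ in range(n_trials)]
    return statistics.mean(usage), max(usage)

mean_usage, worst_usage = life_usage_profile()
```

    Comparing the spread of this profile against a flat cycle count shows why sensor-based accounting matters: missions flown hot consume life far faster than the nominal cycle assumption admits.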

  1. Advanced neutron source reactor probabilistic flow blockage assessment

    SciTech Connect

    Ramsey, C.T.

    1995-08-01

    The Phase I Level I Probabilistic Risk Assessment (PRA) of the conceptual design of the Advanced Neutron Source (ANS) Reactor identified core flow blockage as the most likely internal event leading to fuel damage. The flow blockage event frequency used in the original ANS PRA was based primarily on the flow blockage work done for the High Flux Isotope Reactor (HFIR) PRA. This report examines potential flow blockage scenarios and calculates an estimate of the likelihood of debris-induced fuel damage. The bulk of the report is based specifically on the conceptual design of ANS with a 93%-enriched, two-element core; insights to the impact of the proposed three-element core are examined in Sect. 5. In addition to providing a probability (uncertainty) distribution for the likelihood of core flow blockage, this ongoing effort will serve to indicate potential areas of concern to be focused on in the preliminary design for elimination or mitigation. It will also serve as a loose-parts management tool.

  2. Advanced Test Reactor probabilistic risk assessment methodology and results summary

    SciTech Connect

    Eide, S.A.; Atkinson, S.A.; Thatcher, T.A.

    1992-01-01

    The Advanced Test Reactor (ATR) probabilistic risk assessment (PRA) Level 1 report documents a comprehensive and state-of-the-art study to establish and reduce the risk associated with operation of the ATR, expressed as a mean frequency of fuel damage. The ATR Level 1 PRA effort is unique and outstanding because of its consistent and state-of-the-art treatment of all facets of the risk study, its comprehensive and cost-effective risk reduction effort while the risk baseline was being established, and its thorough and comprehensive documentation. The PRA includes many improvements to the state-of-the-art, including the following: establishment of a comprehensive generic data base for component failures, treatment of initiating event frequencies given significant plant improvements in recent years, performance of efficient identification and screening of fire and flood events using code-assisted vital area analysis, identification and treatment of significant seismic-fire-flood-wind interactions, and modeling of large loss-of-coolant accidents (LOCAs) and experiment loop ruptures leading to direct damage of the ATR core. 18 refs.

  3. The coevolutionary implications of host tolerance.

    PubMed

    Best, Alex; White, Andy; Boots, Mike

    2014-05-01

    Host tolerance to infectious disease, whereby hosts do not directly "fight" parasites but instead ameliorate the damage caused, is an important defense mechanism in both plants and animals. Because tolerance to parasite virulence may lead to higher prevalence of disease in a population, evolutionary theory tells us that while the spread of resistance genes will result in negative frequency dependence and the potential for diversification, the evolution of tolerance is instead likely to result in fixation. However, our understanding of the broader implications of tolerance is limited by a lack of fully coevolutionary theory. Here we examine the coevolution of tolerance across a comprehensive range of classic coevolutionary host-parasite frameworks, including equivalents of gene-for-gene and matching-allele and evolutionary invasion models. Our models show that the coevolution of host tolerance and parasite virulence does not lead to the generation and maintenance of diversity through either static polymorphisms or through "Red Queen" cycles. Coevolution of tolerance may however lead to multiple stable states leading to sudden shifts in parasite impacts on host health. More broadly, we emphasize that tolerance may change host-parasite interactions from antagonistic to a form of "apparent commensalism," but may also lead to the evolution of parasites that are highly virulent in nontolerant hosts.

  4. 7 CFR 51.307 - Application of tolerances.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... STANDARDS) United States Standards for Grades of Apples Application of Tolerances § 51.307 Application of... least one apple which is seriously damaged by insects or affected by decay or internal breakdown may be... than 3 times the tolerance specified, except that at least three defective apples may be permitted...

  5. 7 CFR 51.307 - Application of tolerances.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... STANDARDS) United States Standards for Grades of Apples Application of Tolerances § 51.307 Application of... least one apple which is seriously damaged by insects or affected by decay or internal breakdown may be... than 3 times the tolerance specified, except that at least three defective apples may be permitted...

  6. 7 CFR 51.307 - Application of tolerances.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... STANDARDS) United States Standards for Grades of Apples Application of Tolerances § 51.307 Application of... least one apple which is seriously damaged by insects or affected by decay or internal breakdown may be... than 3 times the tolerance specified, except that at least three defective apples may be permitted...

  7. 7 CFR 51.2648 - Tolerances.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ..., CERTIFICATION, AND STANDARDS) United States Standards for Grades for Sweet Cherries 1 Tolerances § 51.2648... 2 —(1) U.S. No. 1. 8 percent for cherries which fail to meet the requirements for this grade... damage, including in this latter amount not more than one-half of 1 percent for cherries which...

  8. Can crops tolerate acid rain

    SciTech Connect

    Kaplan, J.K.

    1989-11-01

    This brief article describes work by scientists at the ARS Air Quality-Plant Growth and Development Laboratory in Raleigh, North Carolina, that indicates little damage to crops as a result of acid rain. In studies with simulated acid rain and 216 exposed varieties of 18 crops, there were no significant injuries nor was there reduced growth in most species. Results of chronic and acute exposures were correlated in sensitive tomato and soybean plants and in tolerant winter wheat and lettuce plants. These results suggest that 1-hour exposures could be used in the future to screen varieties for sensitivity to acid rain.

  9. Probabilistic assessment of smart composite structures

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.; Shiao, Michael C.

    1994-01-01

    A composite wing with spars and bulkheads is used to demonstrate the effectiveness of probabilistic assessment of smart composite structures to control uncertainties in distortions and stresses. Results show that a smart composite wing can be controlled to minimize distortions and to have specified stress levels in the presence of defects. Structural responses such as changes in angle of attack, vertical displacements, and stress in the control and controlled plies are probabilistically assessed to quantify their respective uncertainties. Sensitivity factors are evaluated to identify those parameters that have the greatest influence on a specific structural response. Results show that smart composite structures can be configured to control both distortions and ply stresses to satisfy specified design requirements.

  10. Exact and Approximate Probabilistic Symbolic Execution

    NASA Technical Reports Server (NTRS)

    Luckow, Kasper; Pasareanu, Corina S.; Dwyer, Matthew B.; Filieri, Antonio; Visser, Willem

    2014-01-01

Probabilistic software analysis seeks to quantify the likelihood of reaching a target event under uncertain environments. Recent approaches compute probabilities of execution paths using symbolic execution, but do not support nondeterminism. Nondeterminism arises naturally when no suitable probabilistic model can capture a program behavior, e.g., for multithreading or distributed systems. In this work, we propose a technique, based on symbolic execution, to synthesize schedulers that resolve nondeterminism to maximize the probability of reaching a target event. To scale to large systems, we also introduce approximate algorithms to search for good schedulers, speeding up established random sampling and reinforcement learning approaches through the quantification of path probabilities based on symbolic execution. We implemented the techniques in Symbolic PathFinder and evaluated them on nondeterministic Java programs. We show that our algorithms significantly improve upon a state-of-the-art statistical model checking algorithm, originally developed for Markov Decision Processes.
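The core idea of quantifying path probabilities can be illustrated with a toy sketch (plain Python with a small uniform input domain standing in for the probabilistic environment; Symbolic PathFinder itself uses symbolic constraint solving and model counting rather than exhaustive enumeration):

```python
# Toy path-probability calculation: over a finite uniform input domain,
# count the inputs whose execution path reaches the target branch.

def program(x):
    # A simple branching program; "target" marks the event of interest.
    if x % 3 == 0:
        if x > 50:
            return "target"
        return "low"
    return "other"

domain = range(100)  # uniform prior over inputs 0..99
hits = sum(1 for x in domain if program(x) == "target")
prob = hits / len(domain)
print(prob)  # fraction of inputs reaching the target branch
```

Symbolic execution generalizes this by characterizing each path with a constraint (here, `x % 3 == 0 and x > 50`) and counting its solutions instead of running every input.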

  11. Cortical Correspondence with Probabilistic Fiber Connectivity

    PubMed Central

    Oguz, Ipek; Niethammer, Marc; Cates, Josh; Whitaker, Ross; Fletcher, Thomas; Vachet, Clement; Styner, Martin

    2009-01-01

This paper presents a novel method of optimizing point-based correspondence among populations of human cortical surfaces by combining structural cues with probabilistic connectivity maps. The proposed method establishes a tradeoff between an even sampling of the cortical surfaces (a low surface entropy) and the similarity of corresponding points across the population (a low ensemble entropy). The similarity metric, however, is not constrained to spatial proximity alone: it uses local sulcal depth measurements as well as probabilistic connectivity maps, computed from DWI scans via a stochastic tractography algorithm, to enhance the correspondence definition. We propose a novel method for projecting this fiber connectivity information on the cortical surface, using a surface evolution technique. Our cortical correspondence method does not require a spherical parameterization. Experimental results are presented, showing improved correspondence quality demonstrated by a cortical thickness analysis, as compared to correspondence methods using spatial metrics as the sole correspondence criterion. PMID:19694301

  12. Modelling default and likelihood reasoning as probabilistic

    NASA Technical Reports Server (NTRS)

    Buntine, Wray

    1990-01-01

    A probabilistic analysis of plausible reasoning about defaults and about likelihood is presented. 'Likely' and 'by default' are in fact treated as duals in the same sense as 'possibility' and 'necessity'. To model these four forms probabilistically, a logic QDP and its quantitative counterpart DP are derived that allow qualitative and corresponding quantitative reasoning. Consistency and consequence results for subsets of the logics are given that require at most a quadratic number of satisfiability tests in the underlying propositional logic. The quantitative logic shows how to track the propagation error inherent in these reasoning forms. The methodology and sound framework of the system highlights their approximate nature, the dualities, and the need for complementary reasoning about relevance.

  13. Probabilistic structural analysis computer code (NESSUS)

    NASA Technical Reports Server (NTRS)

    Shiao, Michael C.

    1988-01-01

Probabilistic structural analysis has been developed to analyze the effects of fluctuating loads, variable material properties, and uncertain analytical models, especially for high performance structures such as SSME turbopump blades. The computer code NESSUS (Numerical Evaluation of Stochastic Structures Under Stress) was developed to serve as a primary computational tool for characterizing, by statistical description, the probabilistic structural response due to stochastic environments. The code consists of three major modules: NESSUS/PRE, NESSUS/FEM, and NESSUS/FPI. NESSUS/PRE is a preprocessor that decomposes the spatially correlated random variables into a set of uncorrelated random variables using a modal analysis method. NESSUS/FEM is a finite element module that provides structural sensitivities to all the random variables considered. NESSUS/FPI is a Fast Probability Integration module by which a cumulative distribution function or a probability density function is calculated.
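The decorrelation step performed by a preprocessor like NESSUS/PRE can be sketched in a few lines (a generic eigendecomposition illustration, not the actual NESSUS implementation): the eigenvectors of the covariance matrix act as modes, and independent samples of the modal amplitudes reproduce the original correlation structure.

```python
# Decompose spatially correlated random variables into uncorrelated ones
# via the eigenvectors ("modes") of their covariance matrix.
import numpy as np

cov = np.array([[4.0, 2.0, 1.0],
                [2.0, 3.0, 0.5],
                [1.0, 0.5, 2.0]])       # correlated physical variables

eigvals, eigvecs = np.linalg.eigh(cov)  # modal decomposition

rng = np.random.default_rng(0)
# Sample independent modal amplitudes, then map back to physical space.
z = rng.standard_normal((100_000, 3)) * np.sqrt(eigvals)
x = z @ eigvecs.T                       # samples with covariance ~ cov

print(np.round(np.cov(x.T), 1))         # close to the original covariance
```

Working in the uncorrelated modal variables lets downstream modules (sensitivity analysis, fast probability integration) treat each random variable independently.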

  14. Probabilistic Methods for Structural Reliability and Risk

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.

    2010-01-01

A probabilistic method is used to evaluate the structural reliability and risk of select metallic and composite structures. The method is multiscale and multifunctional, and is based at the most elemental level. A multifactor interaction model is used to describe the material properties, which are subsequently evaluated probabilistically. The metallic structure is a two-rotor aircraft engine, while the composite structures consist of laminated plies (multiscale), with the properties of each ply providing the multifunctional representation. The structural component is modeled with finite elements. The solution for structural responses is obtained by an updated simulation scheme. The results show that the risk for the two-rotor engine is about 0.0001, and that for the composite built-up structure is also about 0.0001.

  15. Probabilistic Analysis of Gas Turbine Field Performance

    NASA Technical Reports Server (NTRS)

    Gorla, Rama S. R.; Pai, Shantaram S.; Rusick, Jeffrey J.

    2002-01-01

    A gas turbine thermodynamic cycle was computationally simulated and probabilistically evaluated in view of the several uncertainties in the performance parameters, which are indices of gas turbine health. Cumulative distribution functions and sensitivity factors were computed for the overall thermal efficiency and net specific power output due to the thermodynamic random variables. These results can be used to quickly identify the most critical design variables in order to optimize the design, enhance performance, increase system availability and make it cost effective. The analysis leads to the selection of the appropriate measurements to be used in the gas turbine health determination and to the identification of both the most critical measurements and parameters. Probabilistic analysis aims at unifying and improving the control and health monitoring of gas turbine aero-engines by increasing the quality and quantity of information available about the engine's health and performance.
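A minimal Monte Carlo version of this kind of evaluation can be sketched as follows (a toy Brayton-cycle efficiency model with assumed normal scatter on the health parameters; the actual study used NASA's probabilistic tools and real engine parameters): sample the uncertain inputs, build the distribution of thermal efficiency, and rank input sensitivities by correlation with the output.

```python
# Probabilistic evaluation of a toy gas-turbine cycle: CDF of thermal
# efficiency plus crude sensitivity factors for the uncertain inputs.
import numpy as np

rng = np.random.default_rng(1)
n = 50_000
eta_c = rng.normal(0.85, 0.02, n)    # compressor efficiency (uncertain)
eta_t = rng.normal(0.90, 0.01, n)    # turbine efficiency (uncertain)
t_ratio = rng.normal(4.0, 0.10, n)   # turbine-inlet / ambient temperature ratio

pr, g = 10.0, 1.4                    # pressure ratio, specific-heat ratio
tau = pr ** ((g - 1.0) / g)
# Brayton-cycle efficiency with component losses (nondimensional form):
w_net = eta_t * t_ratio * (1 - 1 / tau) - (tau - 1) / eta_c
q_in = t_ratio - 1 - (tau - 1) / eta_c
eta_th = w_net / q_in

print("mean thermal efficiency:", round(float(eta_th.mean()), 3))
# Sensitivity factors: correlation of each random input with the output.
for name, v in [("eta_c", eta_c), ("eta_t", eta_t), ("t_ratio", t_ratio)]:
    print(name, round(float(np.corrcoef(v, eta_th)[0, 1]), 2))
```

The ranked correlations identify which health parameter most strongly drives efficiency scatter, which is the kind of information used to select the most critical measurements.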

  16. Aligned composite structures for mitigation of impact damage and resistance to wear in dynamic environments

    DOEpatents

    Mulligan, Anthony C.; Rigali, Mark J.; Sutaria, Manish P.; Popovich, Dragan; Halloran, Joseph P.; Fulcher, Michael L.; Cook, Randy C.

    2005-12-13

    Fibrous monolith composites having architectures that provide increased flaw insensitivity, improved hardness, wear resistance and damage tolerance and methods of manufacture thereof are provided for use in dynamic environments to mitigate impact damage and increase wear resistance.

  17. Aligned composite structures for mitigation of impact damage and resistance to wear in dynamic environments

    DOEpatents

    Mulligan, Anthony C.; Rigali, Mark J.; Sutaria, Manish P.; Popovich, Dragan; Halloran, Joseph P.; Fulcher, Michael L.; Cook, Randy C.

    2009-04-14

    Fibrous monolith composites having architectures that provide increased flaw insensitivity, improved hardness, wear resistance and damage tolerance and methods of manufacture thereof are provided for use in dynamic environments to mitigate impact damage and increase wear resistance.

  18. Aligned composite structures for mitigation of impact damage and resistance to wear in dynamic environments

    DOEpatents

    Rigali, Mark J.; Sutaria, Manish P.; Mulligan, Anthony C.; Popovich, Dragan

    2004-03-23

    Fibrous monolith composites having architectures that provide increased flaw insensitivity, improved hardness, wear resistance and damage tolerance and methods of manufacture thereof are provided for use in dynamic environments to mitigate impact damage and increase wear resistance.

  19. Impact damage of a graphite/PEEK

    SciTech Connect

    Demuts, E.

    1994-12-31

Low-velocity non-penetrating impact has been applied to graphite polyetheretherketone (AS4/APC-2) laminates in accordance with the USAF guidelines for designing damage tolerant primary structures. The extent of delaminations and dent depths for two lay-ups and five thicknesses at room temperature and ambient moisture conditions have been determined. Based on these findings, as well as those presented elsewhere, it may be concluded that the "softer" lay-up (40/50/10), up to about 75-ply thickness, is more damage tolerant than the "harder" lay-up (60/30/10), because within this thickness range the "softer" lay-up displays smaller dent depths, smaller delaminated areas, and higher post-impact compressive strength (PICS). For laminates thicker than 75 plies, the relative situation in delamination extent and PICS is reversed, i.e., the "harder" lay-up is more damage tolerant than the "softer" one. The test data obtained in this experimental investigation provide the amount of initial damage to be assumed for a damage tolerant design of USAF primary structures made of AS4/APC-2 graphite/PEEK. In addition, these data may serve to validate the predictive capability of appropriate analytic models.

  20. Initial guidelines for probabilistic seismic hazard analysis

    SciTech Connect

    Budnitz, R.J.

    1994-10-01

    In the late 1980s, the methodology for performing probabilistic seismic hazard analysis (PSHA) was exercised extensively for eastern-U.S. nuclear power plant sites by the Electric Power Research Institute (EPRI) and Lawrence Livermore National Laboratory (LLNL) under NRC sponsorship. Unfortunately, the seismic-hazard-curve results of these two studies differed substantially for many of the eastern reactor sites, which has motivated all concerned to revisit the approaches taken. This project is that revisitation.

  1. Dynamic competitive probabilistic principal components analysis.

    PubMed

López-Rubio, Ezequiel; Ortiz-de-Lazcano-Lobato, Juan Miguel

    2009-04-01

    We present a new neural model which extends the classical competitive learning (CL) by performing a Probabilistic Principal Components Analysis (PPCA) at each neuron. The model also has the ability to learn the number of basis vectors required to represent the principal directions of each cluster, so it overcomes a drawback of most local PCA models, where the dimensionality of a cluster must be fixed a priori. Experimental results are presented to show the performance of the network with multispectral image data.
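The probabilistic PCA ingredient can be illustrated with scikit-learn, whose `PCA` implements the Tipping-Bishop PPCA likelihood (the competitive/neural aspects of the paper's model are not reproduced here; the data and dimensions are illustrative). Held-out log-likelihood is one way to let the data indicate the number of basis vectors, echoing the paper's point that local PCA models should not fix the dimensionality a priori.

```python
# Probabilistic PCA log-likelihood via scikit-learn's PCA.score().
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
# Data concentrated near a 2-D plane embedded in 5-D space, plus noise.
latent = rng.standard_normal((500, 2))
W = rng.standard_normal((2, 5))
X = latent @ W + 0.1 * rng.standard_normal((500, 5))

X_train, X_test = X[:400], X[400:]
# Average held-out log-likelihood under the PPCA model, per choice of rank.
scores = {k: PCA(n_components=k).fit(X_train).score(X_test)
          for k in (1, 2, 4)}
print(scores)  # the rank-1 model should fit clearly worse than rank-2
```

A mixture of such PPCA units, one per cluster with its own learned rank, is roughly the structure the paper's competitive model builds.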

  2. Multiscale/Multifunctional Probabilistic Composite Fatigue

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.

    2010-01-01

A multilevel (multiscale/multifunctional) evaluation is demonstrated by applying it to three different sample problems. These problems include the probabilistic evaluation of a space shuttle main engine blade, an engine rotor and an aircraft wing. The results demonstrate that the blade will fail at the highest probability path, the engine two-stage rotor will fail by fracture at the rim and the aircraft wing will fail at 10^9 fatigue cycles with a probability of 0.9967.

  3. Incorporating psychological influences in probabilistic cost analysis

    SciTech Connect

    Kujawski, Edouard; Alvaro, Mariana; Edwards, William

    2004-01-08

Today's typical probabilistic cost analysis assumes an "ideal" project that is devoid of the human and organizational considerations that heavily influence the success and cost of real-world projects. In the real world, "Money Allocated Is Money Spent" (MAIMS principle); cost underruns are rarely available to protect against cost overruns, while task overruns are passed on to the total project cost. Realistic cost estimates therefore require a modified probabilistic cost analysis that simultaneously models the cost management strategy, including budget allocation. Psychological influences such as overconfidence in assessing uncertainties and dependencies among cost elements and risks are other important considerations that are generally not addressed. It should then be no surprise that actual project costs often exceed the initial estimates and that projects are delivered late and/or with a reduced scope. This paper presents a practical probabilistic cost analysis model that incorporates recent findings in human behavior and judgment under uncertainty, dependencies among cost elements, the MAIMS principle, and project management practices. Uncertain cost elements are elicited from experts using the direct fractile assessment method and fitted with three-parameter Weibull distributions. The full correlation matrix is specified in terms of two parameters that characterize correlations among cost elements in the same and in different subsystems. The analysis is readily implemented using standard Monte Carlo simulation tools such as @Risk and Crystal Ball. The analysis of a representative design and engineering project substantiates that today's typical probabilistic cost analysis is likely to severely underestimate project cost for probability of success values of importance to contractors and procuring activities. The proposed approach provides a framework for developing a viable cost management strategy for allocating baseline budgets and contingencies. Given the
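The Monte Carlo machinery described above can be sketched generically (this is not the authors' exact model; the marginal parameters and the single correlation value are illustrative): three-parameter Weibull marginals coupled through a Gaussian copula, summed into a total-cost distribution from which budget percentiles are read off.

```python
# Correlated Monte Carlo cost roll-up: three-parameter Weibull cost
# elements linked by a Gaussian copula with one correlation parameter.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
n = 100_000

# Three-parameter Weibull marginals: shape c, location (minimum cost), scale.
marginals = [stats.weibull_min(c=1.5, loc=10.0, scale=5.0),
             stats.weibull_min(c=2.0, loc=20.0, scale=8.0),
             stats.weibull_min(c=1.2, loc=5.0,  scale=3.0)]

rho = 0.6                                   # correlation among cost elements
corr = np.full((3, 3), rho)
np.fill_diagonal(corr, 1.0)
L = np.linalg.cholesky(corr)
z = rng.standard_normal((n, 3)) @ L.T       # correlated standard normals
u = stats.norm.cdf(z)                       # Gaussian copula: uniform margins
costs = np.column_stack([m.ppf(u[:, i]) for i, m in enumerate(marginals)])

total = costs.sum(axis=1)
p80 = np.percentile(total, 80)              # e.g. an 80%-confidence budget
print(round(float(total.mean()), 1), round(float(p80), 1))
```

The gap between the mean and a high percentile of the total is what a contingency allocation has to cover; ignoring the correlations shrinks that gap artificially.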

  4. Probabilistic seismic vulnerability and risk assessment of stone masonry structures

    NASA Astrophysics Data System (ADS)

    Abo El Ezz, Ahmad

Earthquakes represent major natural hazards that regularly impact the built environment in seismic prone areas worldwide and cause considerable social and economic losses. The high losses incurred following the past destructive earthquakes promoted the need for assessment of the seismic vulnerability and risk of the existing buildings. Many historic buildings in the old urban centers in Eastern Canada such as Old Quebec City are built of stone masonry and represent immeasurable architectural and cultural heritage. These buildings were built to resist gravity loads only and generally offer poor resistance to lateral seismic loads. Seismic vulnerability assessment of stone masonry buildings is therefore the first necessary step in developing seismic retrofitting and pre-disaster mitigation plans. The objective of this study is to develop a set of probability-based analytical tools for efficient seismic vulnerability and uncertainty analysis of stone masonry buildings. A simplified probabilistic analytical methodology for vulnerability modelling of stone masonry buildings with systematic treatment of uncertainties throughout the modelling process is developed in the first part of this study. Building capacity curves are developed using a simplified mechanical model. A displacement based procedure is used to develop damage state fragility functions in terms of spectral displacement response based on drift thresholds of stone masonry walls. A simplified probabilistic seismic demand analysis is proposed to capture the combined uncertainty in capacity and demand on fragility functions. In the second part, a robust analytical procedure for the development of seismic hazard compatible fragility and vulnerability functions is proposed. The results are given by sets of seismic hazard compatible vulnerability functions in terms of structure-independent intensity measure (e.g. spectral acceleration) that can be used for seismic risk analysis. The procedure is very efficient for
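Damage-state fragility functions of the kind described above are commonly expressed as lognormal CDFs of the intensity measure (a standard analytical form in seismic vulnerability work; the median and dispersion values below are illustrative, not taken from this study):

```python
# Lognormal fragility curve: probability of reaching a damage state
# given an intensity measure (e.g. spectral displacement).
import math

def fragility(im, median, beta):
    """P(damage state reached | intensity measure im), lognormal model."""
    return 0.5 * (1.0 + math.erf(math.log(im / median) / (beta * math.sqrt(2))))

# Spectral displacement (cm) sweep for, say, a "moderate damage" state
# with median capacity 1.0 cm and dispersion beta = 0.6:
for sd in (0.5, 1.0, 2.0, 4.0):
    print(sd, round(fragility(sd, median=1.0, beta=0.6), 3))
```

At the median intensity the curve passes through 0.5 by construction; the dispersion `beta` is where the combined capacity-and-demand uncertainty discussed in the abstract enters.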

  5. Probabilistic design of advanced composite structure

    NASA Technical Reports Server (NTRS)

    Gray, P. M.; Riskalla, M. G.

    1992-01-01

Advanced composite technology offers potential for sizable improvements in many areas: weight savings, maintainability, durability, and reliability. However, there are a number of inhibitors to these improvements. One of the biggest inhibitors is the imposition of traditional metallic approaches to the design of composite structure. This is especially detrimental in composites because new materials technology demands new design approaches. Of particular importance are the decisions made regarding structural criteria. Significant changes cannot be implemented without careful consideration and exploration. The new approach is to implement changes on a controlled, verifiable basis. Probabilistic design is the methodology and the process to accomplish this. Its foundation is to base design criteria and objectives on reliability targets instead of arbitrary factors carried over from metallic structural history. The background of probabilistic design is discussed, and the results are presented of a side-by-side comparison of generic aircraft structure designed the 'old' way and the 'new' way. Activities are also defined that need to be undertaken to evolve available approaches to probabilistic design, followed by a summary and recommendations.

  6. From deterministic dynamics to probabilistic descriptions

    PubMed Central

    Misra, B.; Prigogine, I.; Courbage, M.

    1979-01-01

    The present work is devoted to the following question: What is the relationship between the deterministic laws of dynamics and probabilistic description of physical processes? It is generally accepted that probabilistic processes can arise from deterministic dynamics only through a process of “coarse graining” or “contraction of description” that inevitably involves a loss of information. In this work we present an alternative point of view toward the relationship between deterministic dynamics and probabilistic descriptions. Speaking in general terms, we demonstrate the possibility of obtaining (stochastic) Markov processes from deterministic dynamics simply through a “change of representation” that involves no loss of information provided the dynamical system under consideration has a suitably high degree of instability of motion. The fundamental implications of this finding for statistical mechanics and other areas of physics are discussed. From a mathematical point of view, the theory we present is a theory of invertible, positivity-preserving, and necessarily nonunitary similarity transformations that convert the unitary groups associated with deterministic dynamics to contraction semigroups associated with stochastic Markov processes. We explicitly construct such similarity transformations for the so-called Bernoulli systems. This construction illustrates also the construction of the so-called Lyapounov variables and the operator of “internal time,” which play an important role in our approach to the problem of irreversibility. The theory we present can also be viewed as a theory of entropy-increasing evolutions and their relationship to deterministic dynamics. PMID:16592691

  7. Amplification uncertainty relation for probabilistic amplifiers

    NASA Astrophysics Data System (ADS)

    Namiki, Ryo

    2015-09-01

    Traditionally, quantum amplification limit refers to the property of inevitable noise addition on canonical variables when the field amplitude of an unknown state is linearly transformed through a quantum channel. Recent theoretical studies have determined amplification limits for cases of probabilistic quantum channels or general quantum operations by specifying a set of input states or a state ensemble. However, it remains open how much excess noise on canonical variables is unavoidable and whether there exists a fundamental trade-off relation between the canonical pair in a general amplification process. In this paper we present an uncertainty-product form of amplification limits for general quantum operations by assuming an input ensemble of Gaussian-distributed coherent states. It can be derived as a straightforward consequence of canonical uncertainty relations and retrieves basic properties of the traditional amplification limit. In addition, our amplification limit turns out to give a physical limitation on probabilistic reduction of an Einstein-Podolsky-Rosen uncertainty. In this regard, we find a condition that probabilistic amplifiers can be regarded as local filtering operations to distill entanglement. This condition establishes a clear benchmark to verify an advantage of non-Gaussian operations beyond Gaussian operations with a feasible input set of coherent states and standard homodyne measurements.

  8. Integrating Sequence Evolution into Probabilistic Orthology Analysis.

    PubMed

    Ullah, Ikram; Sjöstrand, Joel; Andersson, Peter; Sennblad, Bengt; Lagergren, Jens

    2015-11-01

Orthology analysis, that is, finding out whether a pair of homologous genes are orthologs - stemming from a speciation - or paralogs - stemming from a gene duplication - is of central importance in computational biology, genome annotation, and phylogenetic inference. In particular, an orthologous relationship makes functional equivalence of the two genes highly likely. A major approach to orthology analysis is to reconcile a gene tree to the corresponding species tree (most commonly using the most parsimonious reconciliation, MPR). However, most such phylogenetic orthology methods infer the gene tree without considering the constraints implied by the species tree and, perhaps even more importantly, only allow the gene sequences to influence the orthology analysis through the a priori reconstructed gene tree. We propose a sound, comprehensive Bayesian Markov chain Monte Carlo-based method, DLRSOrthology, to compute orthology probabilities. It efficiently sums over the possible gene trees and jointly takes into account the current gene tree, all possible reconciliations to the species tree, and the, typically strong, signal conveyed by the sequences. We compare our method with PrIME-GEM, a probabilistic orthology approach built on a probabilistic duplication-loss model, and MrBayesMPR, a probabilistic orthology approach that is based on conventional Bayesian inference coupled with MPR. We find that DLRSOrthology outperforms these competing approaches on synthetic data as well as on biological data sets and is robust to incomplete taxon sampling artifacts. PMID:26130236

  9. Probabilistic drought classification using gamma mixture models

    NASA Astrophysics Data System (ADS)

    Mallya, Ganeshchandra; Tripathi, Shivam; Govindaraju, Rao S.

    2015-07-01

    Drought severity is commonly reported using drought classes obtained by assigning pre-defined thresholds on drought indices. Current drought classification methods ignore modeling uncertainties and provide discrete drought classification. However, the users of drought classification are often interested in knowing inherent uncertainties in classification so that they can make informed decisions. Recent studies have used hidden Markov models (HMM) for quantifying uncertainties in drought classification. The HMM method conceptualizes drought classes as distinct hydrological states that are not observed (hidden) but affect observed hydrological variables. The number of drought classes or hidden states in the model is pre-specified, which can sometimes result in model over-specification problem. This study proposes an alternate method for probabilistic drought classification where the number of states in the model is determined by the data. The proposed method adapts Standard Precipitation Index (SPI) methodology of drought classification by employing gamma mixture model (Gamma-MM) in a Bayesian framework. The method alleviates the problem of choosing a suitable distribution for fitting data in SPI analysis, quantifies modeling uncertainties, and propagates them for probabilistic drought classification. The method is tested on rainfall data over India. Comparison of the results with standard SPI show important differences particularly when SPI assumptions on data distribution are violated. Further, the new method is simpler and more parsimonious than HMM based drought classification method and can be a viable alternative for probabilistic drought classification.
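The standard SPI construction that the paper generalizes can be sketched directly (a plain single-gamma fit, not the paper's Bayesian gamma-mixture; the synthetic data are illustrative): fit a gamma distribution to precipitation totals, then map the fitted CDF onto standard-normal quantiles, from which the usual drought classes are read off by thresholds.

```python
# Standardized Precipitation Index (SPI): gamma fit + normal-quantile map.
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
precip = rng.gamma(shape=2.0, scale=40.0, size=600)  # synthetic monthly totals

# Fit a two-parameter gamma (location pinned at zero), then transform.
a, loc, scale = stats.gamma.fit(precip, floc=0.0)
spi = stats.norm.ppf(stats.gamma.cdf(precip, a, loc=loc, scale=scale))

# Conventional hard thresholds put e.g. "moderate drought" at SPI <= -1.
print(round(float(spi.mean()), 2), round(float((spi <= -1).mean()), 3))
```

The paper's point is visible here: the hard threshold at -1 yields a discrete class with no uncertainty attached, and the single-gamma assumption may be violated, both of which the gamma-mixture formulation addresses.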

  10. Stochastic damage evolution in textile laminates

    NASA Technical Reports Server (NTRS)

    Dzenis, Yuris A.; Bogdanovich, Alexander E.; Pastore, Christopher M.

    1993-01-01

A probabilistic model utilizing random material characteristics to predict damage evolution in textile laminates is presented. The model is based on a division of each ply into two sublaminas consisting of cells. The probability of cell failure is calculated using stochastic function theory and a maximal strain failure criterion. Three modes of failure, i.e., fiber breakage, matrix failure in the transverse direction, and matrix or interface shear cracking, are taken into account. Computed failure probabilities are utilized in reducing cell stiffness based on the mesovolume concept. A numerical algorithm is developed to predict the damage evolution and deformation history of textile laminates. The effect of scatter of fiber orientation on cell properties is discussed. Weave influence on damage accumulation is illustrated with an example of a Kevlar/epoxy laminate.
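The cell-failure-probability step can be sketched minimally (assuming normal scatter on the strain-to-failure; the authors use a stochastic-function formulation, and the numbers here are illustrative): under a maximal-strain criterion, a cell fails when the applied strain exceeds its random strain limit.

```python
# P(cell failure) under a maximal-strain criterion with a normally
# distributed strain-to-failure limit.
from math import erf, sqrt

def cell_failure_prob(applied, mean_limit, std_limit):
    """P(applied strain > random strain limit), normal strength model."""
    z = (applied - mean_limit) / std_limit
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

# Fiber-direction example: 1.2% applied strain vs. a 1.5% +/- 0.15% limit.
p = cell_failure_prob(0.012, 0.015, 0.0015)
print(round(p, 4))
```

In the full model this probability would be evaluated per cell and per failure mode, and cells with high failure probability have their stiffness reduced before the next load step.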

  11. Right Hemisphere Brain Damage

    MedlinePlus

What is right hemisphere brain damage? Right hemisphere brain damage (RHD) is damage ...

  12. Increased size of cotton root system does not impart tolerance to Meloidogyne incognita

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Plant tolerance or intolerance to parasitic nematodes represent a spectrum describing the degree of damage inflicted by the nematode on the host plant. Tolerance is typically measured in terms of yield suppression. Instances of plant tolerance to nematodes have been documented in some crops, inclu...

  13. 7 CFR 51.1215 - Application of tolerances to individual packages.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... Grades of Peaches Application of Tolerances § 51.1215 Application of tolerances to individual packages... any lot shall have not more than double the tolerance specified, except that at least one peach which... percentage of defects: Provided, That not more than one peach which is seriously damaged by insects...

  14. 7 CFR 51.1215 - Application of tolerances to individual packages.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... Grades of Peaches Application of Tolerances § 51.1215 Application of tolerances to individual packages... any lot shall have not more than double the tolerance specified, except that at least one peach which... percentage of defects: Provided, That not more than one peach which is seriously damaged by insects...

  15. NASA workshop on impact damage to composites

    NASA Technical Reports Server (NTRS)

    Poe, C. C., Jr.

    1991-01-01

    A compilation of slides presented at the NASA Workshop on Impact Damage to Composites held on March 19 and 20, 1991, at the Langley Research Center, Hampton, Virginia is given. The objective of the workshop was to review technology for evaluating impact damage tolerance of composite structures and identify deficiencies. Research, development, design methods, and design criteria were addressed. Actions to eliminate technology deficiencies were developed. A list of those actions and a list of attendees are also included.

  16. Lactose tolerance tests

    MedlinePlus

Two common methods include the lactose tolerance blood test and the hydrogen breath test. The hydrogen breath test is the preferred method. It measures the amount of hydrogen in the air you breathe out. ...

  17. Conclusions from a probabilistic safety analysis for FRJ-2 (DIDO) and realization of risk minimization measures

    SciTech Connect

    Wolters, J.; Nabbi, R.

    1997-12-01

Feed and bleed cooling of the FRJ-2 research reactor can reduce the risk of core damage considerably, as a probabilistic safety analysis has revealed. The question whether water circulation via the core would be maintained when the water in the tank has reached saturation point has been answered positively by an investigation with the thermohydraulic code CATHENA. A siphon with a water column, with the special feature of self-acting restoration of the column after depressurization, performed well during tests and will be installed as the relief equipment required to blow off the steam produced by the residual heat of the core during bleed cooling. 4 refs., 9 tabs.

  18. A framework for probabilistic pluvial flood nowcasting for urban areas

    NASA Astrophysics Data System (ADS)

    Ntegeka, Victor; Murla, Damian; Wang, Lipen; Foresti, Loris; Reyniers, Maarten; Delobbe, Laurent; Van Herk, Kristine; Van Ootegem, Luc; Willems, Patrick

    2016-04-01

Pluvial flood nowcasting is gaining ground not least because of the advancements in rainfall forecasting schemes. Short-term forecasts and applications have benefited from the availability of such forecasts with high resolution in space (~1km) and time (~5min). In this regard, it is vital to evaluate the potential of nowcasting products for urban inundation applications. One of the most advanced Quantitative Precipitation Forecasting (QPF) techniques is the Short-Term Ensemble Prediction System, which was originally co-developed by the UK Met Office and the Australian Bureau of Meteorology. The scheme was further tuned to better estimate extreme and moderate events for the Belgian area (STEPS-BE). Against this backdrop, a probabilistic framework has been developed that consists of: (1) rainfall nowcasts; (2) a sewer hydraulic model; (3) flood damage estimation; and (4) urban inundation risk mapping. STEPS-BE forecasts are provided at high resolution (1km/5min) with 20 ensemble members and a lead time of up to 2 hours, using a composite of four C-band radars as input. Forecast verification was performed over the cities of Leuven and Ghent, and biases were found to be small. The hydraulic model consists of the 1D sewer network and an innovative 'nested' 2D surface model to model 2D urban surface inundations at high resolution. The surface components are categorized into three groups, each modelled using triangular meshes at different resolutions: streets (3.75 - 15 m2), high flood hazard areas (12.5 - 50 m2) and low flood hazard areas (75 - 300 m2). Functions describing urban flood damage and social consequences were empirically derived from questionnaires sent to people in the region who were recently affected by sewer floods. Probabilistic urban flood risk maps were prepared based on spatial interpolation techniques of flood inundation. The method has been implemented and tested for the villages Oostakker and Sint-Amandsberg, which are part of the

  19. Probabilistic structural analysis algorithm development for computational efficiency

    NASA Technical Reports Server (NTRS)

    Wu, Y.-T.

    1991-01-01

The PSAM (Probabilistic Structural Analysis Methods) program is developing a probabilistic structural risk assessment capability for the SSME components. An advanced probabilistic structural analysis software system, NESSUS (Numerical Evaluation of Stochastic Structures Under Stress), is being developed as part of the PSAM effort to accurately simulate stochastic structures operating under severe random loading conditions. One of the challenges in developing the NESSUS system is the development of probabilistic algorithms that provide both efficiency and accuracy. The main probability algorithms developed and implemented in the NESSUS system are efficient but approximate in nature. In the last six years, the algorithms have been improved significantly.
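The efficiency/accuracy trade-off these algorithms target can be illustrated with a generic reliability example (not NESSUS itself; the load and resistance values are illustrative): for a linear limit state with normal variables, the failure probability is available exactly through the reliability index, while crude Monte Carlo needs many samples to match it.

```python
# Exact vs. Monte Carlo failure probability for a linear limit state
# g = R - S with independent normal resistance R and load S.
import numpy as np
from scipy import stats

mu_R, sd_R = 600.0, 40.0     # resistance
mu_S, sd_S = 450.0, 50.0     # load
beta = (mu_R - mu_S) / np.hypot(sd_R, sd_S)   # reliability index
pf_exact = float(stats.norm.cdf(-beta))

rng = np.random.default_rng(3)
n = 200_000
g = rng.normal(mu_R, sd_R, n) - rng.normal(mu_S, sd_S, n)
pf_mc = float((g < 0).mean())
print(round(pf_exact, 5), round(pf_mc, 5))
```

Fast probability integration methods exploit exactly this kind of structure, locating the most probable failure point and approximating the limit state there, so that small failure probabilities are obtained without the sample counts crude Monte Carlo requires.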

  20. Applications of probabilistic peak-shaving technique in generation planning

    SciTech Connect

    Malik, A.S.; Cory, B.J.; Wijayatunga, P.D.C.

    1999-11-01

    This paper presents two novel applications of the probabilistic peak-shaving technique in generation planning: first, to simulate multiple limited-energy units efficiently and accurately in the probabilistic equivalent load duration curve method; and second, to efficiently simulate candidate plants whose different configurations are tested to find the least-cost generation expansion planning solution. The applications of the technique are demonstrated with the help of two hand-calculation examples. An efficient algorithm is also presented to simulate multiple limited-energy units probabilistically, for different hydrological conditions, in a hydro-thermal generation mix within a probabilistic production costing framework.

  1. Probabilistic alternatives to Bayesianism: the case of explanationism

    PubMed Central

    Douven, Igor; Schupbach, Jonah N.

    2015-01-01

    There has been a probabilistic turn in contemporary cognitive science. Far and away, most of the work in this vein is Bayesian, at least in name. Coinciding with this development, philosophers have increasingly promoted Bayesianism as the best normative account of how humans ought to reason. In this paper, we make a push for exploring the probabilistic terrain outside of Bayesianism. Non-Bayesian, but still probabilistic, theories provide plausible competitors both to descriptive and normative Bayesian accounts. We argue for this general idea via recent work on explanationist models of updating, which are fundamentally probabilistic but assign a substantial, non-Bayesian role to explanatory considerations. PMID:25964769

  2. Probabilistic consequence model of accidental or intentional chemical releases.

    SciTech Connect

    Chang, Y.-S.; Samsa, M. E.; Folga, S. M.; Hartmann, H. M.

    2008-06-02

    In this work, general methodologies for evaluating the impacts of large-scale toxic chemical releases are proposed. The potential numbers of injuries and fatalities, the numbers of hospital beds, and the geographical areas rendered unusable during and some time after the occurrence and passage of a toxic plume are estimated on a probabilistic basis. To arrive at these estimates, historical accidental release data, maximum stored volumes, and meteorological data were used as inputs into the SLAB accidental chemical release model. Toxic gas footprints from the model were overlaid onto detailed population and hospital distribution data for a given region to estimate potential impacts. Output results are in the form of a generic statistical distribution of injuries and fatalities associated with specific toxic chemicals and regions of the United States. In addition, indoor hazards were estimated, so the model can provide contingency plans for either shelter-in-place or evacuation when an accident occurs. The stochastic distributions of injuries and fatalities are being used in a U.S. Department of Homeland Security-sponsored decision support system as source terms for a Monte Carlo simulation that evaluates potential measures for mitigating terrorist threats. This information can also be used to support the formulation of evacuation plans and to estimate damage and cleanup costs.

  3. Probabilistic modelling of rainfall induced landslide hazard assessment

    NASA Astrophysics Data System (ADS)

    Kawagoe, S.; Kazama, S.; Sarukkalige, P. R.

    2010-01-01

    To evaluate the frequency and distribution of landslide hazards over Japan, this study uses a probabilistic model based on multiple logistic regression analysis. The study considers several important physical parameters, such as hydraulic, geographical and geological parameters, which are considered to be influential in the occurrence of landslides. Sensitivity analysis confirmed that the hydrological parameter (hydraulic gradient) is the most influential factor in the occurrence of landslides. Therefore, the hydraulic gradient is used as the main hydraulic parameter: a dynamic factor that includes the effect of heavy rainfall and its return period. Using the constructed spatial datasets, a multiple logistic regression model is applied and landslide susceptibility maps are produced showing the spatial-temporal distribution of landslide hazard susceptibility over Japan. To represent the susceptibility on different temporal scales, extreme precipitation for 5-year, 30-year and 100-year return periods is used for the evaluation. The results show that the highest landslide hazard susceptibility exists in the mountain ranges on the western side of Japan (Japan Sea side), including the Hida, Kiso, Iide and Asahi mountain ranges, the south side of the Chugoku mountains, the south side of the Kyushu mountains, the Dewa mountains and the Hokuriku region. The landslide hazard susceptibility maps developed in this study will assist authorities, policy makers and decision makers responsible for infrastructural planning and development, as they can identify landslide-susceptible areas and thus decrease landslide damage through proper preparation.
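    The core of a multiple-logistic-regression susceptibility model can be sketched in a few lines. The predictors and coefficients below are hypothetical; in the study they would be fitted to a landslide inventory, with the hydraulic gradient dominating:

```python
import numpy as np

rng = np.random.default_rng(1)
# Hypothetical predictors per grid cell: hydraulic gradient, slope, geology index
X = rng.normal(size=(500, 3))

# Illustrative coefficients (not fitted values from the paper)
beta0 = -2.0
beta = np.array([1.5, 0.8, 0.4])   # hydraulic gradient weighted most heavily

def susceptibility(X):
    """P(landslide occurrence | predictors) from the logistic model."""
    z = beta0 + X @ beta
    return 1.0 / (1.0 + np.exp(-z))

p = susceptibility(X)
```

    Mapping `p` over the grid for extreme precipitation at different return periods yields the spatial-temporal susceptibility maps the abstract describes.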

  4. Probabilistic modelling of rainfall induced landslide hazard assessment

    NASA Astrophysics Data System (ADS)

    Kawagoe, S.; Kazama, S.; Sarukkalige, P. R.

    2010-06-01

    To evaluate the frequency and distribution of landslide hazards over Japan, this study uses a probabilistic model based on multiple logistic regression analysis. The study considers several important physical parameters, such as hydraulic, geographical and geological parameters, which are considered to be influential in the occurrence of landslides. Sensitivity analysis confirmed that the hydrological parameter (hydraulic gradient) is the most influential factor in the occurrence of landslides. Therefore, the hydraulic gradient is used as the main hydraulic parameter: a dynamic factor that includes the effect of heavy rainfall and its return period. Using the constructed spatial datasets, a multiple logistic regression model is applied and landslide hazard probability maps are produced showing the spatial-temporal distribution of landslide hazard probability over Japan. To represent the landslide hazard on different temporal scales, extreme precipitation for 5-year, 30-year and 100-year return periods is used for the evaluation. The results show that the highest landslide hazard probability exists in the mountain ranges on the western side of Japan (Japan Sea side), including the Hida, Kiso, Iide and Asahi mountain ranges, the south side of the Chugoku mountains, the south side of the Kyushu mountains, the Dewa mountains and the Hokuriku region. The landslide hazard probability maps developed in this study will assist authorities, policy makers and decision makers responsible for infrastructural planning and development, as they can identify landslide-susceptible areas and thus decrease landslide damage through proper preparation.

  5. Probabilistic Physics-Based Risk Tools Used to Analyze the International Space Station Electrical Power System Output

    NASA Technical Reports Server (NTRS)

    Patel, Bhogila M.; Hoge, Peter A.; Nagpal, Vinod K.; Hojnicki, Jeffrey S.; Rusick, Jeffrey J.

    2004-01-01

    This paper describes the methods employed to apply probabilistic modeling techniques to the International Space Station (ISS) power system. These techniques were used to quantify the probabilistic variation in the power output, also called the response variable, due to variations (uncertainties) associated with knowledge of the influencing factors, called the random variables. These uncertainties can be due to unknown environmental conditions, variation in the performance of electrical power system components, or sensor tolerances. Uncertainties in these variables cause corresponding variations in the power output, but the magnitude of that effect varies with the ISS operating conditions, e.g., whether or not the solar panels are actively tracking the sun. Therefore, it is important to quantify the influence of these uncertainties on the power output in order to optimize the power available for experiments.

  6. A probabilistic tsunami hazard assessment for Indonesia

    NASA Astrophysics Data System (ADS)

    Horspool, N.; Pranantyo, I.; Griffin, J.; Latief, H.; Natawidjaja, D. H.; Kongko, W.; Cipta, A.; Bustaman, B.; Anugrah, S. D.; Thio, H. K.

    2014-11-01

    Probabilistic hazard assessments are a fundamental tool for assessing the threats posed by hazards to communities and are important for underpinning evidence-based decision-making regarding risk mitigation activities. Indonesia has been the focus of intense tsunami risk mitigation efforts following the 2004 Indian Ocean tsunami, but these have been largely concentrated on the Sunda Arc, with little attention to other tsunami-prone areas of the country such as eastern Indonesia. We present the first nationally consistent probabilistic tsunami hazard assessment (PTHA) for Indonesia. This assessment produces time-independent forecasts of tsunami hazards at the coast using data from tsunamis generated by local, regional and distant earthquake sources. The methodology is based on the established Monte Carlo approach to probabilistic seismic hazard assessment (PSHA) and has been adapted to tsunamis. We account for sources of epistemic and aleatory uncertainty in the analysis through the use of logic trees and sampling of probability density functions. For short return periods (100 years) the highest tsunami hazard is along the west coast of Sumatra, the south coast of Java and the north coast of Papua. For longer return periods (500-2500 years), the tsunami hazard is highest along the Sunda Arc, reflecting the larger maximum magnitudes there. The annual probability of experiencing a tsunami with a height of > 0.5 m at the coast is greater than 10% for Sumatra, Java, the Sunda islands (Bali, Lombok, Flores, Sumba) and north Papua. The annual probability of experiencing a tsunami with a height of > 3.0 m, which would cause significant inundation and fatalities, is 1-10% in Sumatra, Java, Bali, Lombok and north Papua, and 0.1-1% for north Sulawesi, Seram and Flores. The results of this national-scale hazard assessment provide evidence for disaster managers to prioritise regions for risk mitigation activities and/or more detailed hazard or risk assessment.
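    The Monte Carlo logic behind such annual exceedance probabilities can be sketched with a synthetic event catalogue. The occurrence rate and height distribution below are illustrative placeholders for the study's earthquake-source sampling and tsunami propagation modelling:

```python
import numpy as np

rng = np.random.default_rng(2)
years = 100_000                 # length of the synthetic event catalogue
rate = 0.05                     # mean tsunamigenic events per year (illustrative)
n_events = rng.poisson(rate * years)
# Stand-in for source sampling + propagation: lognormal coastal heights [m]
heights = rng.lognormal(mean=-0.5, sigma=1.0, size=n_events)

def annual_exceedance(h):
    """Annual probability of at least one tsunami exceeding height h,
    assuming Poissonian occurrence."""
    lam = float((heights > h).sum()) / years    # annual rate of exceedances
    return 1.0 - np.exp(-lam)

p_05 = annual_exceedance(0.5)   # cf. the > 0.5 m probabilities in the abstract
p_30 = annual_exceedance(3.0)
```

    Evaluating this curve at many heights gives the hazard curve for one coastal site; logic-tree branches would be sampled to capture epistemic uncertainty.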

  7. Probabilistic Climate Scenario Information for Risk Assessment

    NASA Astrophysics Data System (ADS)

    Dairaku, K.; Ueno, G.; Takayabu, I.

    2014-12-01

    Climate information and services for Impacts, Adaptation and Vulnerability (IAV) assessments are of great concern. In order to develop probabilistic regional climate information that represents the uncertainty in climate scenario experiments in Japan, we compared physics ensemble experiments using the 60 km global atmospheric model of the Meteorological Research Institute (MRI-AGCM) with multi-model ensemble experiments using global atmosphere-ocean coupled models (CMIP3) under the SRES A1b scenario. The MRI-AGCM shows relatively good skill, particularly in the tropics, for temperature and geopotential height. Variability in the surface air temperature of the physics ensemble experiments with MRI-AGCM was within the range of one standard deviation of the CMIP3 models in the Asia region. On the other hand, the variability of precipitation was relatively well represented compared with the variation of the CMIP3 models. Models that show similar reproducibility of the present climate can show different future climate changes, and we could not find clear relationships between the present climate and future climate change in temperature and precipitation. We develop a new method to produce probabilistic information on climate change scenarios by weighting model ensemble experiments based on a regression model (Krishnamurti et al., Science, 1999). The method is easily applicable to other regions and other physical quantities, and also to downscaling to finer scales, depending on the availability of observational datasets. The prototype of probabilistic information for Japan represents the quantified structural uncertainties of multi-model ensemble experiments of climate change scenarios. Acknowledgments: This study was supported by the SOUSEI Program, funded by the Ministry of Education, Culture, Sports, Science and Technology, Government of Japan.
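    Regression-based weighting of ensemble members in the spirit of Krishnamurti et al. (1999) can be sketched as a least-squares fit over a training period. The observations and model forecasts below are synthetic stand-ins:

```python
import numpy as np

rng = np.random.default_rng(3)
obs = rng.normal(size=200)                        # training-period observations
# Hypothetical model forecasts: each a biased, noisy version of the observations
models = np.stack([obs + rng.normal(0.5, s, size=200) for s in (0.3, 0.6, 1.0)])

# Superensemble-style weights: least squares on the models plus a bias term
A = np.vstack([models, np.ones(200)]).T
coef, *_ = np.linalg.lstsq(A, obs, rcond=None)
weighted = A @ coef

rmse_weighted = float(np.sqrt(np.mean((weighted - obs) ** 2)))
rmse_equal = float(np.sqrt(np.mean((models.mean(axis=0) - obs) ** 2)))
```

    The fitted weights down-weight poorly performing members, so the weighted combination beats the equally weighted ensemble mean on the training data.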

  8. Review of the probabilistic failure analysis methodology and other probabilistic approaches for application in aerospace structural design

    NASA Technical Reports Server (NTRS)

    Townsend, J.; Meyers, C.; Ortega, R.; Peck, J.; Rheinfurth, M.; Weinstock, B.

    1993-01-01

    Probabilistic structural analyses and design methods are steadily gaining acceptance within the aerospace industry. The safety factor approach to design has long been the industry standard, and it is believed by many to be overly conservative and thus costly. A probabilistic approach to design may offer substantial cost savings. This report summarizes several probabilistic approaches: the probabilistic failure analysis (PFA) methodology developed by the Jet Propulsion Laboratory, fast probability integration (FPI) methods, the NESSUS finite element code, and response surface methods. Example problems are provided to help identify the advantages and disadvantages of each method.

  9. Subcortical structure segmentation using probabilistic atlas priors

    NASA Astrophysics Data System (ADS)

    Gouttard, Sylvain; Styner, Martin; Joshi, Sarang; Smith, Rachel G.; Cody Hazlett, Heather; Gerig, Guido

    2007-03-01

    The segmentation of the subcortical structures of the brain is required for many forms of quantitative neuroanatomic analysis. The volumetric and shape parameters of structures such as the lateral ventricles, putamen, caudate, hippocampus, pallidus and amygdala are employed to characterize a disease or its evolution. This paper presents a fully automatic segmentation of these structures via non-rigid registration of a probabilistic atlas prior, together with a comprehensive validation. Our approach is based on an unbiased diffeomorphic atlas with probabilistic spatial priors built from a training set of MR images with corresponding manual segmentations. The atlas building computes an average image along with transformation fields mapping each training case to the average image. These transformation fields are applied to the manually segmented structures of each case in order to obtain a probabilistic map on the atlas. When applying the atlas for automatic structural segmentation, an MR image is first intensity-inhomogeneity corrected, skull stripped and intensity calibrated to the atlas. Then the atlas image is registered to the image using an affine followed by a deformable registration matching the gray-level intensity. Finally, the registration transformation is applied to the probabilistic map of each structure, which is then thresholded at 0.5 probability. Using manual segmentations for comparison, measures of volumetric differences show high correlation with our results. Furthermore, the Dice coefficient, which quantifies the volumetric overlap, is higher than 62% for all structures and is close to 80% for the basal ganglia. The intraclass correlation coefficient computed on these same datasets shows a good inter-method correlation of the volumetric measurements. Using a dataset of a single patient scanned 10 times on 5 different scanners, reliability is shown with a coefficient of variation of less than 2 percent over the whole dataset. Overall, these validation
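    The thresholding and overlap steps are simple to sketch. The probability map below is random noise standing in for a registered atlas prior, and the "manual" segmentation is a synthetic stand-in:

```python
import numpy as np

def dice(a, b):
    """Dice coefficient: volumetric overlap of two binary segmentations."""
    a = np.asarray(a, dtype=bool)
    b = np.asarray(b, dtype=bool)
    denom = a.sum() + b.sum()
    return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0

# The registered probabilistic map is thresholded at 0.5 to get the final label
rng = np.random.default_rng(4)
prob_map = rng.random((32, 32, 32))
auto_seg = prob_map > 0.5
manual_seg = prob_map > 0.45        # stand-in for a manual segmentation
d = dice(auto_seg, manual_seg)
```
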

  10. Analytic gain in probabilistic decompression sickness models.

    PubMed

    Howle, Laurens E

    2013-11-01

    Decompression sickness (DCS) is a disease known to be related to inert gas bubble formation originating from gases dissolved in body tissues. Probabilistic DCS models, which employ survival and hazard functions, are optimized by fitting model parameters to experimental dive data. In the work reported here, I develop methods to find the survival function gain parameter analytically, thus removing it from the fitting process. I show that the number of iterations required for model optimization is significantly reduced. The analytic gain method substantially improves the condition number of the Hessian matrix which reduces the model confidence intervals by more than an order of magnitude. PMID:24209920

  11. Probabilistic Algorithm for Sampler Siting (PASS)

    2007-05-29

    PASS (Probabilistic Approach to Sampler Siting) optimizes the placement of samplers in buildings. The program exhaustively checks every sampler network that can be formed, evaluating each against user-supplied simulations of the possible release scenarios. The program identifies the networks that maximize the probability of detecting a release from among the suite of user-supplied scenarios. The user may specify how many networks to report, in order to provide a number of choices in cases where many networks have very similar behavior.
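    The exhaustive search over candidate sampler networks can be sketched as follows; the location names and detection table are hypothetical stand-ins for the user-supplied release simulations:

```python
from itertools import combinations

# detect[s][r] = True if a sampler at location s detects release scenario r
# (hypothetical results of user-supplied release simulations)
detect = {
    "lobby":  [True,  False, True,  False],
    "atrium": [False, True,  False, False],
    "duct_3": [True,  True,  False, True],
    "office": [False, False, True,  True],
}
n_scenarios = 4

def best_networks(k, top=3):
    """Exhaustively score every k-sampler network by detection probability
    (here: the fraction of equally likely scenarios detected)."""
    scored = []
    for combo in combinations(sorted(detect), k):
        hits = sum(any(detect[s][r] for s in combo) for r in range(n_scenarios))
        scored.append((hits / n_scenarios, combo))
    scored.sort(key=lambda t: -t[0])
    return scored[:top]

nets = best_networks(2)
```

    Returning the top few networks, as PASS does, lets the user choose among near-equivalent designs.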

  12. Probabilistic computer model of optimal runway turnoffs

    NASA Technical Reports Server (NTRS)

    Schoen, M. L.; Preston, O. W.; Summers, L. G.; Nelson, B. A.; Vanderlinden, L.; Mcreynolds, M. C.

    1985-01-01

    Landing delays are currently a problem at major air carrier airports, and many forecasters agree that airport congestion will get worse by the end of the century. It is anticipated that some types of delays can be reduced by an efficient optimal runway exit system allowing the increased approach volumes necessary at congested airports. A computerized Probabilistic Runway Turnoff Model, which locates exits and defines path geometry for a selected maximum occupancy time appropriate to each TERPS aircraft category, is defined. The model includes an algorithm for lateral ride comfort limits.

  13. Automatic probabilistic knowledge acquisition from data

    NASA Technical Reports Server (NTRS)

    Gevarter, W. B.

    1986-01-01

    A computer program for extracting significant correlations of attributes from masses of data is outlined. This information can then be used to develop a knowledge base for a probabilistic expert system. The method determines the best estimate of joint probabilities of attributes from data put into contingency table form. A major output from the program is a general formula for calculating any probability relation associated with the data. These probability relations can be utilized to form IF-THEN rules with associated probability, useful for expert systems.
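    The estimation of joint probabilities from a contingency table and their use in a probability-weighted IF-THEN rule can be sketched directly. The counts below are hypothetical:

```python
import numpy as np

# Contingency table of counts for two binary attributes (hypothetical data):
# rows index A in {0, 1}, columns index B in {0, 1}
counts = np.array([[30, 10],
                   [5, 55]], dtype=float)

joint = counts / counts.sum()                   # best estimate of P(A, B)
p_b_given_a = joint[1, 1] / joint[1, :].sum()   # P(B = 1 | A = 1)

# The conditional probability backs an IF-THEN rule for the expert system:
rule = ("IF A THEN B", p_b_given_a)
```

    Any probability relation over the attributes (marginals, conditionals) follows from the joint table in the same way.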

  14. A periodic probabilistic photonic cluster state generator

    NASA Astrophysics Data System (ADS)

    Fanto, Michael L.; Smith, A. Matthew; Alsing, Paul M.; Tison, Christopher C.; Preble, Stefan F.; Lott, Gordon E.; Osman, Joseph M.; Szep, Attila; Kim, Richard S.

    2014-10-01

    The research detailed in this paper describes a Periodic Cluster State Generator (PCSG) consisting of a monolithic integrated waveguide device that employs four wave mixing, an array of probabilistic photon guns, single mode sequential entanglers and an array of controllable entangling gates between modes to create arbitrary cluster states. Utilizing the PCSG one is able to produce a cluster state with nearest neighbor entanglement in the form of a linear or square lattice. Cluster state resources of this type have been proven to be able to perform universal quantum computation.

  15. Probabilistic remote state preparation by W states

    NASA Astrophysics Data System (ADS)

    Liu, Jin-Ming; Wang, Yu-Zhu

    2004-02-01

    In this paper we consider a scheme for probabilistic remote state preparation of a general qubit by using W states. The scheme consists of the sender, Alice and two remote receivers Bob and Carol. Alice performs a projective measurement on her qubit in the basis spanned by the state she wants to prepare and its orthocomplement. This allows either Bob or Carol to reconstruct the state with finite success probability. It is shown that for some special ensembles of qubits, the remote state preparation scheme requires only two classical bits, unlike the case in the scheme of quantum teleportation where three classical bits are needed.

  16. Analytic gain in probabilistic decompression sickness models.

    PubMed

    Howle, Laurens E

    2013-11-01

    Decompression sickness (DCS) is a disease known to be related to inert gas bubble formation originating from gases dissolved in body tissues. Probabilistic DCS models, which employ survival and hazard functions, are optimized by fitting model parameters to experimental dive data. In the work reported here, I develop methods to find the survival function gain parameter analytically, thus removing it from the fitting process. I show that the number of iterations required for model optimization is significantly reduced. The analytic gain method substantially improves the condition number of the Hessian matrix which reduces the model confidence intervals by more than an order of magnitude.

  17. Probabilistic analysis of fires in nuclear plants

    SciTech Connect

    Unione, A.; Teichmann, T.

    1985-01-01

    The aim of this paper is to describe a multilevel (i.e., staged) probabilistic analysis of fire risks in nuclear plants (as part of a general PRA) which maximizes the benefits of the FRA (fire risk assessment) in a cost-effective way. The approach uses several stages of screening, physical modeling of clearly dominant risk contributors, searches for direct (e.g., equipment dependences) and secondary (e.g., fire-induced internal flooding) interactions, and relies on lessons learned and available data from surrogate FRAs. The general methodology is outlined. 6 figs., 10 tabs.

  18. Ensemble postprocessing for probabilistic quantitative precipitation forecasts

    NASA Astrophysics Data System (ADS)

    Bentzien, S.; Friederichs, P.

    2012-12-01

    Precipitation is one of the most difficult weather variables to predict in hydrometeorological applications. In order to assess the uncertainty inherent in deterministic numerical weather prediction (NWP), meteorological services around the globe develop ensemble prediction systems (EPS) based on high-resolution NWP systems. With non-hydrostatic model dynamics and without parameterization of deep moist convection, high-resolution NWP models are able to describe convective processes in more detail and provide more realistic mesoscale structures. However, precipitation forecasts are still affected by displacement errors, systematic biases and fast error growth on small scales. Probabilistic guidance can be achieved from an ensemble setup which accounts for model error and uncertainty of initial and boundary conditions. The German Meteorological Service (Deutscher Wetterdienst, DWD) provides such an ensemble system based on the German-focused limited-area model COSMO-DE. With a horizontal grid-spacing of 2.8 km, COSMO-DE is the convection-permitting high-resolution part of the operational model chain at DWD. The COSMO-DE-EPS consists of 20 realizations of COSMO-DE, driven by initial and boundary conditions derived from 4 global models and 5 perturbations of model physics. Ensemble systems like COSMO-DE-EPS are often limited with respect to ensemble size due to the immense computational costs. As a consequence, they can be biased and exhibit insufficient ensemble spread, and probabilistic forecasts may not be well calibrated. In this study, probabilistic quantitative precipitation forecasts are derived from COSMO-DE-EPS and evaluated at more than 1000 rain gauges located all over Germany. COSMO-DE-EPS is a frequently updated ensemble system, initialized 8 times a day. We use the time-lagged approach to inexpensively increase ensemble spread, which results in more reliable forecasts especially for extreme precipitation events. Moreover, we will show that statistical
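    The time-lagged pooling idea can be sketched with synthetic ensembles: members from earlier runs valid at the same forecast time are pooled with the latest run before exceedance probabilities are computed. All values below are synthetic, not COSMO-DE-EPS output:

```python
import numpy as np

rng = np.random.default_rng(5)
# 20-member ensemble from the latest run, plus two earlier (time-lagged) runs
# valid at the same forecast time: pooling cheaply triples the ensemble size
latest = rng.gamma(2.0, 2.0, size=(20, 100))            # precip at 100 gauges [mm]
lagged = [rng.gamma(2.0, 2.2, size=(20, 100)) for _ in range(2)]
pooled = np.concatenate([latest] + lagged, axis=0)      # 60 members

def prob_exceed(members, threshold_mm):
    """Probabilistic QPF: fraction of members exceeding the threshold."""
    return (members > threshold_mm).mean(axis=0)

p_light = prob_exceed(pooled, 1.0)
p_heavy = prob_exceed(pooled, 10.0)
```

    The enlarged pooled ensemble gives smoother exceedance probabilities, which is especially valuable for rare, heavy-precipitation thresholds.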

  19. The probabilistic cell: implementation of a probabilistic inference by the biochemical mechanisms of phototransduction.

    PubMed

    Houillon, Audrey; Bessière, Pierre; Droulez, Jacques

    2010-09-01

    When we perceive the external world, our brain has to deal with the incompleteness and uncertainty associated with sensory inputs, memory and prior knowledge. In theoretical neuroscience, probabilistic approaches have recently received growing interest, as they account for the ability to reason with incomplete knowledge and to efficiently describe perceptive and behavioral tasks. How can the probability distributions that need to be estimated in these models be represented and processed in the brain, in particular at the single-cell level? We consider the basic function carried out by photoreceptor cells, which consists in detecting the presence or absence of light. We give a system-level understanding of the process of phototransduction based on a Bayesian formalism: we show that the process of phototransduction is equivalent to a temporal probabilistic inference in a Hidden Markov Model (HMM) for estimating the presence or absence of light. Thus, the biochemical mechanisms of phototransduction underlie the estimation of the current state probability distribution of the presence of light. A classical descriptive model describes the interactions between the different molecular messengers, ions, enzymes and channel proteins occurring within the photoreceptor by a set of nonlinear coupled differential equations. In contrast, the probabilistic HMM model is described by a discrete recurrence equation. It appears that the binary HMM has a general solution in the case of constant input, which allows a detailed analysis of the dynamics of the system. The biochemical system and the HMM behave similarly under steady-state conditions. Consequently, a formal equivalence can be found between the biochemical system and the HMM. Numerical simulations further extend the results to the dynamic case and to noisy input.
All in all, we have derived a probabilistic model equivalent to a classical descriptive model of phototransduction, which has the additional advantage of assigning a
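    The discrete recurrence of a binary HMM filter, the computation the abstract equates with phototransduction, can be sketched directly. The transition and emission probabilities below are illustrative, not values from the paper:

```python
import numpy as np

# Hidden state: light absent (0) or present (1)
T = np.array([[0.95, 0.05],      # rows: previous state, cols: next state
              [0.05, 0.95]])
E = np.array([0.1, 0.8])         # P(photoresponse observed | state)

def hmm_filter(obs, prior=(0.5, 0.5)):
    """Recursive Bayesian (forward) estimate of P(light present | y_1..y_t)."""
    belief = np.array(prior, dtype=float)
    trace = []
    for y in obs:
        belief = T.T @ belief                    # predict one step ahead
        like = E if y else 1.0 - E               # observation likelihood
        belief = like * belief                   # Bayes update
        belief /= belief.sum()                   # normalize
        trace.append(float(belief[1]))
    return trace

p_light = hmm_filter([1, 1, 1, 0, 1, 1])
```

    Under constant input the recurrence converges to a fixed point, mirroring the steady-state behavior the abstract reports for the biochemical system.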

  20. Probabilistic Seismic Hazard Assessment for Northeast India Region

    NASA Astrophysics Data System (ADS)

    Das, Ranjit; Sharma, M. L.; Wason, H. R.

    2016-08-01

    Northeast India, bounded by latitudes 20°-30°N and longitudes 87°-98°E, is one of the most seismically active areas in the world. This region has experienced several moderate-to-large-sized earthquakes, including the 12 June 1897 Shillong earthquake (Mw 8.1) and the 15 August 1950 Assam earthquake (Mw 8.7), which caused loss of human lives and significant damage to buildings, highlighting the importance of seismic hazard assessment for the region. Probabilistic seismic hazard assessment of the region has been carried out using a unified moment magnitude catalog prepared by an improved General Orthogonal Regression methodology (Geophys J Int, 190:1091-1096, 2012; Probabilistic seismic hazard assessment of Northeast India region, Ph.D. Thesis, Department of Earthquake Engineering, IIT Roorkee, Roorkee, 2013) with events compiled from various databases (ISC, NEIC, GCMT, IMD) and other available catalogs. The study area has been subdivided into nine seismogenic source zones to account for local variation in tectonics and seismicity characteristics. The seismicity parameters are estimated for each of these source zones; they are the input variables for the seismic hazard estimation of a region. The seismic hazard analysis of the study region has been performed by dividing the area into grids of size 0.1° × 0.1°. Peak ground acceleration (PGA) and spectral acceleration (Sa) values (for periods of 0.2 and 1 s) have been evaluated at bedrock level corresponding to probabilities of exceedance (PE) of 50, 20, 10, 2 and 0.5% in 50 years. These exceedance values correspond to return periods of 100, 225, 475, 2475, and 10,000 years, respectively. The seismic hazard maps have been prepared at the bedrock level, and it is observed that the seismic hazard estimates show significant local variation, in contrast to the uniform hazard value suggested by the Indian standard seismic code [Indian standard, criteria for earthquake-resistant design of structures, fifth edition, Part
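    The quoted return periods follow from the probability of exceedance over the exposure time under a Poisson occurrence assumption; the 10% and 2% cases reproduce the familiar 475- and 2475-year values, while the others are conventional roundings:

```python
import math

def return_period(pe, t_years=50.0):
    """Return period T for probability of exceedance pe over t_years,
    assuming Poissonian occurrence: pe = 1 - exp(-t/T)."""
    return -t_years / math.log(1.0 - pe)

t_475 = return_period(0.10)     # 10% in 50 years -> ~475 years
t_2475 = return_period(0.02)    # 2% in 50 years  -> ~2475 years
```
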

  1. Modelling structured data with Probabilistic Graphical Models

    NASA Astrophysics Data System (ADS)

    Forbes, F.

    2016-05-01

    Most clustering and classification methods are based on the assumption that the objects to be clustered are independent. However, in more and more modern applications, data are structured in a way that makes this assumption unrealistic and potentially misleading. A typical example that can be viewed as a clustering task is image segmentation, where the objects are the pixels on a regular grid and depend on neighbouring pixels on this grid. Also, when data are geographically located, it is of interest to cluster data with an underlying dependence structure accounting for some spatial localisation. These spatial interactions can be naturally encoded via a graph, not necessarily as regular as a grid. Data sets can then be modelled via Markov random fields and mixture models (e.g. the so-called MRF and Hidden MRF). More generally, probabilistic graphical models are tools that can be used to represent and manipulate data in a structured way while modeling uncertainty. This chapter introduces the basic concepts. The two main classes of probabilistic graphical models are considered: Bayesian networks and Markov networks. The key concept of conditional independence and its link to Markov properties is presented. The main problems that can be solved with such tools are described. Some illustrations, associated with practical work, are given.

  2. Spatial planning using probabilistic flood maps

    NASA Astrophysics Data System (ADS)

    Alfonso, Leonardo; Mukolwe, Micah; Di Baldassarre, Giuliano

    2015-04-01

    Probabilistic flood maps account for uncertainty in flood inundation modelling and convey a degree of certainty in the outputs. Major sources of uncertainty include input data, topographic data, model structure, observation data and parametric uncertainty. Decision makers prefer less ambiguous information from modellers; this implies that uncertainty is suppressed to yield binary flood maps. However, suppressing information may lead to surprises or misleading decisions. Including uncertain information in the decision-making process is therefore desirable and more transparent. To this end, we utilise Prospect theory and information from a probabilistic flood map to evaluate potential decisions. Consequences related to the decisions were evaluated using flood risk analysis. Prospect theory explains how choices are made given options for which the probabilities of occurrence are known, and it accounts for decision makers' characteristics such as loss aversion and risk seeking. Our results show that decision making is most pronounced when there are high gains and losses, implying higher payoffs and penalties and therefore a higher gamble. The methodology may thus be appropriately considered when making decisions based on uncertain information.
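    The prospect-theory evaluation can be sketched with the standard Kahneman-Tversky functional forms; the parameter values are commonly cited estimates, and the flood-proofing numbers are hypothetical, not taken from this study:

```python
def value(x, alpha=0.88, beta=0.88, lam=2.25):
    """Kahneman-Tversky value function: concave for gains, convex and
    steeper (loss aversion, lam > 1) for losses."""
    return x ** alpha if x >= 0 else -lam * (-x) ** beta

def weight(p, gamma=0.61):
    """Probability weighting: overweights small probabilities."""
    return p ** gamma / (p ** gamma + (1.0 - p) ** gamma) ** (1.0 / gamma)

def prospect(outcomes):
    """Prospect-theory valuation of (probability, outcome) pairs."""
    return sum(weight(p) * value(x) for p, x in outcomes if p > 0.0)

# Hypothetical flood-proofing choice: a certain cost of 10, versus risking a
# loss of 100 with probability 0.05 read from a probabilistic flood map
v_protect = prospect([(1.0, -10.0)])
v_gamble = prospect([(0.05, -100.0)])
```

    Comparing the two prospect values, rather than raw expected losses, captures the loss aversion and probability distortion the abstract invokes.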

  3. Probabilistic deployment for multiple sensor systems

    NASA Astrophysics Data System (ADS)

    Qian, Ming; Ferrari, Silvia

    2005-05-01

    The performance of many multi-sensor systems can be significantly improved by using a priori environmental information and sensor data to plan the movements of sensor platforms that are later deployed with the purpose of improving the quality of the final detection and classification results. However, existing path planning algorithms and ad-hoc data processing (e.g., fusion) techniques do not allow for the systematic treatment of multiple and heterogeneous sensors and their platforms. This paper presents a method that combines Bayesian network inference with probabilistic roadmap (PRM) planners to utilize the information obtained by different sensors and their level of uncertainty. The uncertainty of prior sensed information is represented by entropy values obtained from the Bayesian network (BN) models of the respective sensor measurement processes. The PRM algorithm is modified to utilize the entropy distribution in optimizing the path of posterior sensor platforms that have the following objectives: (1) improve the quality of the sensed information, i.e., through fusion, (2) minimize the distance traveled by the platforms, and (3) avoid obstacles. This so-called Probabilistic Deployment (PD) method is applied to a demining system comprised of ground-penetrating radars (GPR), electromagnetic (EMI), and infrared sensors (IR) installed on ground platforms, to detect and classify buried mines. Numerical simulations show that PD is more efficient than path planning techniques that do not utilize a priori information, such as complete coverage, random coverage method, or PRM methods that do not utilize Bayesian inference.
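    The entropy values that steer the modified PRM planner can be sketched as Shannon entropies of per-cell posteriors; the posteriors below are hypothetical stand-ins for the Bayesian-network outputs:

```python
import math

def entropy_bits(posterior):
    """Shannon entropy of a discrete posterior; high entropy marks cells
    where the prior sensed information is most uncertain."""
    return -sum(p * math.log2(p) for p in posterior if p > 0.0)

# Hypothetical BN posteriors P(mine present | prior sensor data) per cell
posteriors = [(0.5, 0.5), (0.9, 0.1), (0.99, 0.01), (0.6, 0.4)]
uncertainty = [entropy_bits(p) for p in posteriors]

# A PD/PRM planner would bias posterior sensor paths toward high-entropy cells
visit_order = sorted(range(len(posteriors)), key=lambda i: -uncertainty[i])
```

    Cells with near-certain posteriors contribute little entropy, so the planner spends travel budget where fused sensing can still reduce uncertainty.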

  4. Probabilistic Dynamic Buckling of Smart Composite Shells

    NASA Technical Reports Server (NTRS)

    Abumeri, Galib H.; Chamis, Christos C.

    2003-01-01

    A computational simulation method is presented to evaluate the deterministic and nondeterministic dynamic buckling of smart composite shells. The combined use of composite mechanics, finite element computer codes, and probabilistic analysis enables the effective assessment of the dynamic buckling load of smart composite shells. A universal plot is generated to estimate the dynamic buckling load of composite shells at various load rates and probabilities. The shell structure is also evaluated with smart fibers embedded in the plies right below the outer plies. The results show that, on average, the use of smart fibers improved the shell buckling resistance by about 10 percent at different probabilities and delayed the buckling occurrence time. The probabilistic sensitivity results indicate that uncertainties in the fiber volume ratio and ply thickness have major effects on the buckling load, while uncertainties in the electric field strength and smart material volume fraction have moderate effects. For the specific shell considered in this evaluation, the use of smart composite material is not recommended because the shell buckling resistance can be improved by simply re-arranging the orientation of the outer plies, as shown in the dynamic buckling analysis results presented in this report.

  5. Probabilistic Analysis of a Composite Crew Module

    NASA Technical Reports Server (NTRS)

    Mason, Brian H.; Krishnamurthy, Thiagarajan

    2011-01-01

    An approach for conducting reliability-based analysis (RBA) of a Composite Crew Module (CCM) is presented. The goal is to identify and quantify the benefits of probabilistic design methods for the CCM and future space vehicles. The coarse finite element model from a previous NASA Engineering and Safety Center (NESC) project is used as the baseline deterministic analysis model to evaluate the performance of the CCM using a strength-based failure index. The first step in the probabilistic analysis process is the determination of the uncertainty distributions for key parameters in the model. Analytical data from water landing simulations are used to develop an uncertainty distribution, but such data were unavailable for other load cases. The uncertainty distributions for the other load scale factors and the strength allowables are generated based on assumed coefficients of variation. Probability of first-ply failure is estimated using three methods: the first order reliability method (FORM), Monte Carlo simulation, and conditional sampling. Results for the three methods were consistent. The reliability is shown to be driven by first ply failure in one region of the CCM at the high altitude abort load set. The final predicted probability of failure is on the order of 10^-11 due to the conservative nature of the factors of safety on the deterministic loads.
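
A minimal sketch of the Monte Carlo step, assuming (as the abstract suggests) normally distributed load and strength scale factors with assumed coefficients of variation. The nominal failure index and CoV values below are illustrative, not the CCM numbers.

```python
import random

random.seed(0)

def failure_probability(n=100_000, fi_nominal=0.5, cov_load=0.10, cov_strength=0.08):
    """Monte Carlo estimate of P(first-ply failure). Failure occurs when the
    strength-based failure index, scaled by random load and strength factors,
    reaches 1. The nominal index and CoVs are illustrative assumptions."""
    failures = 0
    for _ in range(n):
        load = random.gauss(1.0, cov_load)          # load scale factor
        strength = random.gauss(1.0, cov_strength)  # strength allowable factor
        if fi_nominal * load / strength >= 1.0:
            failures += 1
    return failures / n

pf = failure_probability()
```

For probabilities as small as 10^-11, plain Monte Carlo needs prohibitively many samples, which is why FORM and conditional sampling are used alongside it in the study.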

  6. Variational approach to probabilistic finite elements

    NASA Technical Reports Server (NTRS)

    Belytschko, T.; Liu, W. K.; Mani, A.; Besterfield, G.

    1991-01-01

    Probabilistic finite element methods (PFEM), synthesizing the power of finite element methods with second-moment techniques, are formulated for various classes of problems in structural and solid mechanics. Time-invariant random materials, geometric properties and loads are incorporated in terms of their fundamental statistics, viz. second moments. Analogous to the discretization of the displacement field in finite element methods, the random fields are also discretized. Preserving the conceptual simplicity, the response moments are calculated with minimal computations. By incorporating certain computational techniques, these methods are shown to be capable of handling large systems with many sources of uncertainties. By construction, these methods are applicable when the scale of randomness is not very large and when the probability density functions have decaying tails. The accuracy and efficiency of these methods, along with their limitations, are demonstrated by various applications. Results obtained are compared with those of Monte Carlo simulation and it is shown that good accuracy can be obtained for both linear and nonlinear problems. The methods are amenable to implementation in deterministic FEM based computer codes.
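
The second-moment idea underlying PFEM can be illustrated on a scalar response: propagate input means and variances through first-order sensitivities. The bar-elongation response u = P*L/(E*A) and all numbers below are illustrative, not from the paper.

```python
# First-order second-moment (FOSM) propagation of input statistics through
# sensitivities. Response: axial bar elongation u = P*L/(E*A).
# All numbers are illustrative.

def u(P, E, L=2.0, A=1e-3):
    """Elongation of a bar of length L, area A, under load P, modulus E."""
    return P * L / (E * A)

P_mean, P_std = 1000.0, 100.0    # load [N], 10% CoV
E_mean, E_std = 200e9, 10e9      # Young's modulus [Pa], 5% CoV

# Sensitivities by forward finite differences
h = 1e-6
dudP = (u(P_mean * (1 + h), E_mean) - u(P_mean, E_mean)) / (P_mean * h)
dudE = (u(P_mean, E_mean * (1 + h)) - u(P_mean, E_mean)) / (E_mean * h)

# Second-moment response statistics, assuming independent inputs
u_mean = u(P_mean, E_mean)
u_var = dudP ** 2 * P_std ** 2 + dudE ** 2 * E_std ** 2
```

In PFEM the same propagation is carried out through the discretized random fields and the full finite element operator rather than a scalar formula.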

  7. A probabilistic bridge safety evaluation against floods.

    PubMed

    Liao, Kuo-Wei; Muto, Yasunori; Chen, Wei-Lun; Wu, Bang-Ho

    2016-01-01

    To further capture the influences of uncertain factors on river bridge safety evaluation, a probabilistic approach is adopted. Because this is a systematic and nonlinear problem, MPP-based reliability analyses are not suitable. A sampling approach such as a Monte Carlo simulation (MCS) or importance sampling is often adopted. To enhance the efficiency of the sampling approach, this study utilizes Bayesian least squares support vector machines to construct a response surface followed by an MCS, providing a more precise safety index. Although there are several factors impacting the flood-resistant reliability of a bridge, previous experiences and studies show that the reliability of the bridge itself plays a key role. Thus, the goal of this study is to analyze the system reliability of a selected bridge that includes five limit states. The random variables considered here include the water surface elevation, water velocity, local scour depth, soil property and wind load. Because the first three variables are deeply affected by river hydraulics, a probabilistic HEC-RAS-based simulation is performed to capture the uncertainties in those random variables. The accuracy and variation of our solutions are confirmed by a direct MCS to ensure the applicability of the proposed approach. The results of a numerical example indicate that the proposed approach can efficiently provide an accurate bridge safety evaluation and maintain satisfactory variation. PMID:27386269
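
The surrogate-plus-MCS workflow can be sketched as follows. For brevity, an ordinary least-squares quadratic response surface stands in for the Bayesian least squares support vector machine, and the limit state g is a toy stand-in for the HEC-RAS-driven scour analysis; all distributions are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy limit state standing in for the expensive hydraulic analysis:
# g < 0 means failure (e.g. scour depth d exceeds what velocity v allows).
def g_true(v, d):
    return 3.0 - 0.5 * v ** 2 - d

# 1) Small design of experiments to train a quadratic response surface
#    (ordinary least squares standing in for Bayesian LS-SVM).
v_t = rng.uniform(0.0, 3.0, 40)
d_t = rng.uniform(0.0, 2.0, 40)

def basis(v, d):
    return np.column_stack([np.ones_like(v), v, d, v ** 2, d ** 2, v * d])

coef, *_ = np.linalg.lstsq(basis(v_t, d_t), g_true(v_t, d_t), rcond=None)

def g_hat(v, d):
    return basis(np.atleast_1d(v), np.atleast_1d(d)) @ coef

# 2) Cheap Monte Carlo simulation on the surrogate
v_s = rng.normal(1.5, 0.4, 200_000)
d_s = rng.normal(1.0, 0.3, 200_000)
pf = float(np.mean(g_hat(v_s, d_s) < 0.0))
```

The expensive model is evaluated only at the 40 training points; the 200,000 Monte Carlo samples run entirely on the surrogate.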

  9. Probabilistic stellar rotation periods with Gaussian processes

    NASA Astrophysics Data System (ADS)

    Angus, Ruth; Aigrain, Suzanne; Foreman-Mackey, Daniel

    2015-08-01

    Stellar rotation has many applications in the field of exoplanets. High-precision photometry from space-based missions like Kepler and K2 allows us to measure stellar rotation periods directly from light curves. Stellar variability produced by rotation is usually not sinusoidal or perfectly periodic, so sine-fitting periodograms are not well suited to rotation period measurement. Autocorrelation functions are often used to extract periodic information from light curves, however uncertainties on rotation periods measured by autocorrelation are difficult to define. A ‘by eye’ check, or a set of heuristic criteria, is used to validate measurements, and rotation periods are only reported for stars that pass this vetting process. A probabilistic rotation period measurement method with a suitable generative model bypasses the need for a validation stage and can produce realistic uncertainties. The physics driving the production of variability in stellar light curves is still poorly understood and difficult to model. We therefore use an effective model for stellar variability: a Gaussian process with a quasi-periodic covariance function. By injecting synthetic signals into Kepler light curves, we show that the GP model is well suited to quasi-periodic, non-sinusoidal signals, is capable of modelling noise and physical signals simultaneously, and provides probabilistic rotation period measurements with realistic uncertainties.
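
A quasi-periodic covariance function of the kind used here is typically an exponential-sine-squared periodic term damped by a squared-exponential envelope, so the signal can evolve between rotation cycles. The hyperparameter values below are illustrative, not fits to Kepler data.

```python
import math

def quasi_periodic_kernel(t1, t2, A=1.0, l_e=20.0, gamma=1.0, period=10.0):
    """Quasi-periodic covariance: a periodic exp-sine-squared term multiplied
    by a squared-exponential envelope with evolution timescale l_e.
    A is the amplitude, gamma the periodic 'roughness', period the rotation
    period. All hyperparameter values are illustrative."""
    dt = t1 - t2
    periodic = math.exp(-gamma * math.sin(math.pi * dt / period) ** 2)
    envelope = math.exp(-dt ** 2 / (2 * l_e ** 2))
    return A * periodic * envelope
```

Points separated by one full period correlate more strongly than points half a period apart, which is what lets the GP recover a rotation period from non-sinusoidal variability.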

  10. Probabilistic Dynamic Buckling of Smart Composite Shells

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.; Abumeri, Galib H.

    2007-01-01

    A computational simulation method is presented to evaluate the deterministic and nondeterministic dynamic buckling of smart composite shells. The combined use of intraply hybrid composite mechanics, finite element computer codes, and probabilistic analysis enables the effective assessment of the dynamic buckling load of smart composite shells. A universal plot is generated to estimate the dynamic buckling load of composite shells at various load rates and probabilities. The shell structure is also evaluated with smart fibers embedded in the plies right next to the outer plies. The results show that, on average, the use of smart fibers improved the shell buckling resistance by about 10% at different probabilities and delayed the buckling occurrence time. The probabilistic sensitivity results indicate that uncertainties in the fiber volume ratio and ply thickness have major effects on the buckling load, while uncertainties in the electric field strength and smart material volume fraction have moderate effects. For the specific shell considered in this evaluation, the use of smart composite material is not recommended because the shell buckling resistance can be improved by simply re-arranging the orientation of the outer plies, as shown in the dynamic buckling analysis results presented in this report.

  11. Probabilistic Fatigue Life Analysis of High Density Electronics Packaging

    NASA Technical Reports Server (NTRS)

    Moore, N. R.; Kolawa, E. A.; Sutharshana, S.; Newlin, L. E.; Creager, M.

    1996-01-01

    The fatigue of thin film metal interconnections in high density electronics packaging subjected to thermal cycling has been evaluated using a probabilistic fracture mechanics methodology. This probabilistic methodology includes characterization of thin film stress using an experimentally calibrated finite element model and simulation of flaw growth in the thin films using a stochastic crack growth model.
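
A stochastic crack growth simulation of this general kind can be sketched with a Paris-law model and random scatter in the growth coefficient. The crack geometry, material constants, and scatter model below are illustrative assumptions, not the calibrated thin-film values from the study.

```python
import math
import random

random.seed(42)

def cycles_to_failure(a0=1e-4, a_crit=1e-2, C=1e-11, m=3.0, dsigma=100.0):
    """Closed-form Paris-law integration of da/dN = C*(dK)^m with
    dK = dsigma*sqrt(pi*a), valid for m != 2. Units and constants are
    illustrative (dsigma in MPa, crack length a in m)."""
    k = C * (dsigma * math.sqrt(math.pi)) ** m
    e = 1.0 - m / 2.0
    return (a_crit ** e - a0 ** e) / (k * e)

def prob_failure_before(n_cycles, n_samples=20_000, cov=0.5):
    """Monte Carlo over lognormal scatter in the Paris coefficient C,
    a simple stand-in for a stochastic crack growth model."""
    fails = 0
    for _ in range(n_samples):
        C = 1e-11 * math.exp(random.gauss(0.0, cov))
        if cycles_to_failure(C=C) <= n_cycles:
            fails += 1
    return fails / n_samples
```

Sampling the growth-law parameters turns a single deterministic life prediction into a distribution of lives, from which a probability of failure at a given cycle count can be read off.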

  12. Perception of Speech Reflects Optimal Use of Probabilistic Speech Cues

    ERIC Educational Resources Information Center

    Clayards, Meghan; Tanenhaus, Michael K.; Aslin, Richard N.; Jacobs, Robert A.

    2008-01-01

    Listeners are exquisitely sensitive to fine-grained acoustic detail within phonetic categories for sounds and words. Here we show that this sensitivity is optimal given the probabilistic nature of speech cues. We manipulated the probability distribution of one probabilistic cue, voice onset time (VOT), which differentiates word initial labial…

  13. The Role of Language in Building Probabilistic Thinking

    ERIC Educational Resources Information Center

    Nacarato, Adair Mendes; Grando, Regina Célia

    2014-01-01

    This paper is based on research that investigated the development of probabilistic language and thinking by students 10-12 years old. The focus was on the adequate use of probabilistic terms in social practice. A series of tasks was developed for the investigation and completed by the students working in groups. The discussions were video recorded…

  14. Conducting field trials for frost tolerance breeding in cereals.

    PubMed

    Cattivelli, Luigi

    2014-01-01

    Cereal species can be damaged by frost either during winter or at flowering stage. Frost tolerance per se is only a part of the mechanisms that allow the plants to survive during winter; winterhardiness also considers other biotic or physical stresses that challenge the plants during the winter season limiting their survival rate. While frost tolerance can also be tested in controlled environments, winterhardiness can be determined only with field evaluations. Post-heading frost damage occurs from radiation frost events in spring during the reproductive stages. A reliable evaluation of winterhardiness or of post-heading frost damage should be carried out with field trials replicated across years and locations to overcome the irregular occurrence of natural conditions which satisfactorily differentiate genotypes. The evaluation of post-heading frost damage requires a specific attention to plant phenology. The extent of frost damage is usually determined with a visual score at the end of the winter.

  15. A framework for probabilistic atlas-based organ segmentation

    NASA Astrophysics Data System (ADS)

    Dong, Chunhua; Chen, Yen-Wei; Foruzan, Amir Hossein; Han, Xian-Hua; Tateyama, Tomoko; Wu, Xing

    2016-03-01

    Probabilistic atlases based on human anatomical structure have been widely used for organ segmentation. The challenge is how to register the probabilistic atlas to the patient volume. Moreover, a conventional probabilistic atlas built from a single reference may be biased toward that specific patient study. Hence, we propose a template matching framework based on an iterative probabilistic atlas for organ segmentation. First, we find a bounding box for the organ based on human anatomical localization. Then, the probabilistic atlas is used as a template to find the organ in this bounding box using template matching. Comparing our method with conventional and recently developed atlas-based methods, our results show an improvement in the segmentation accuracy for multiple organs (p < 0.00001).
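
The template-matching step can be illustrated with a brute-force normalized cross-correlation search in 2-D (the paper works on 3-D volumes). The toy image, atlas, and noise level below are hypothetical.

```python
import numpy as np

def ncc(a, b):
    """Zero-normalized cross-correlation between two equal-shaped patches."""
    a = a - a.mean()
    b = b - b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return float((a * b).sum() / denom) if denom > 0 else 0.0

def match_template(volume, template):
    """Exhaustive template match inside a bounding box (2-D for brevity);
    the template plays the role of the probabilistic atlas.
    Returns the best offset and its score."""
    th, tw = template.shape
    best, best_pos = -2.0, (0, 0)
    for i in range(volume.shape[0] - th + 1):
        for j in range(volume.shape[1] - tw + 1):
            s = ncc(volume[i:i + th, j:j + tw], template)
            if s > best:
                best, best_pos = s, (i, j)
    return best_pos, best

# Toy check: embed an 'organ' probability map in a larger noisy image
rng = np.random.default_rng(0)
vol = rng.normal(0.0, 0.1, (32, 32))
atlas = np.zeros((8, 8))
atlas[2:6, 2:6] = 1.0
vol[10:18, 5:13] += atlas
pos, score = match_template(vol, atlas)
```

In the iterative scheme of the paper, the atlas is then refined with the newly segmented case and the match repeated, reducing the single-reference bias.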

  16. Probabilistic Seismic Hazard Disaggregation Analysis for the South of Portugal

    NASA Astrophysics Data System (ADS)

    Rodrigues, I.; Sousa, M.; Teves-Costa, P.

    2010-12-01

    Probabilistic seismic hazard disaggregation analysis was performed and seismic scenarios were identified for Southern Mainland Portugal. This region’s seismicity is characterized by small and moderate magnitude events and by the sporadic occurrence of large earthquakes (e.g. the 1755 Lisbon earthquake). Thus, the Portuguese Civil Protection Agency (ANPC) sponsored a collaborative research project for the study of the seismic and tsunami risks in the Algarve (project ERSTA). In the framework of this project, a series of new developments were obtained, namely the revision of the seismic catalogue (IM, 2008), the delineation of new seismogenic zones affecting the Algarve region, which reflects the growing knowledge of this region's seismotectonic context, the derivation of new spectral attenuation laws (Carvalho and Campos Costa, 2008) and the revision of the probabilistic seismic hazard (Sousa et al. 2008). Seismic hazard was disaggregated considering different spaces of random variables, namely, bivariate conditional hazard distributions of X-Y (seismic source latitude and longitude) and multivariate 4D conditional hazard distributions of M-(X-Y)-ɛ (ɛ - deviation of ground motion to the median value predicted by an attenuation model). These procedures were performed for the peak ground acceleration (PGA) and for the 5% damped 1.0 and 2.5 Hz spectral acceleration levels of three return periods: 95, 475 and 975 years. The seismic scenarios controlling the hazard of a given ground motion level, were identified as the modal values of the 4D disaggregation analysis for each of the 84 parishes of the Algarve region. Those scenarios, based on a probabilistic analysis, are meant to be used in the emergency planning as a complement to the historical scenarios that severely affected this region. Seismic scenarios share a few number of geographical locations for all return periods. Moreover, seismic hazard of most Algarve’s parishes is dominated by the seismicity located

  17. Damaged Skylab

    NASA Technical Reports Server (NTRS)

    1973-01-01

    The Saturn V vehicle, carrying the unmanned orbital workshop for the Skylab-1 mission, lifted off successfully and all systems performed normally. Sixty-three seconds into the flight, engineers in the operations support and control center saw an unexpected telemetry indication signalling that one solar array and the micrometeoroid shield had been damaged during the launch. The micrometeoroid shield, a thin protective cylinder surrounding the workshop to protect it from tiny space particles and the sun's scorching heat, had ripped loose from its position around the workshop, tearing off one solar wing and jamming the other. With the heat shield gone and the station still unoccupied, sunlight beat mercilessly on the lab's sensitive skin. Internal temperatures soared, rendering the station uninhabitable and threatening foods, medicines, films, and experiments. This image, taken during a fly-around inspection by the Skylab-2 crew, shows the crippled Skylab in orbit. The crew found their home in space to be in serious shape: the heat shield gone, one solar wing gone, and the other jammed. The Marshall Space Flight Center (MSFC) developed, tested, rehearsed, and approved three repair options: a parasol sunshade and a twin-pole sunshade to restore the temperature inside the workshop, and a set of metal cutting tools to free the jammed solar panel.

  18. Tolerance to deer herbivory and resistance to insect herbivores in the common evening primrose (Oenothera biennis).

    PubMed

    Puentes, A; Johnson, M T J

    2016-01-01

    The evolution of plant defence in response to herbivory will depend on the fitness effects of damage, availability of genetic variation and potential ecological and genetic constraints on defence. Here, we examine the potential for evolution of tolerance to deer herbivory in Oenothera biennis while simultaneously considering resistance to natural insect herbivores. We examined (i) the effects of deer damage on fitness, (ii) the presence of genetic variation in tolerance and resistance, (iii) selection on tolerance, (iv) genetic correlations with resistance that could constrain evolution of tolerance and (v) plant traits that might predict defence. In a field experiment, we simulated deer damage occurring early and late in the season, recorded arthropod abundances, flowering phenology and measured growth rate and lifetime reproduction. Our study showed that deer herbivory has a negative effect on fitness, with effects being more pronounced for late-season damage. Selection acted to increase tolerance to deer damage, yet there was low and nonsignificant genetic variation in this trait. In contrast, there was substantial genetic variation in resistance to insect herbivores. Resistance was genetically uncorrelated with tolerance, whereas positive genetic correlations in resistance to insect herbivores suggest there exists diffuse selection on resistance traits. In addition, growth rate and flowering time did not predict variation in tolerance, but flowering phenology was genetically correlated with resistance. Our results suggest that deer damage has the potential to exert selection because browsing reduces plant fitness, but limited standing genetic variation in tolerance is expected to constrain adaptive evolution in O. biennis. PMID:26395768

  20. Acid tolerance in amphibians

    SciTech Connect

    Pierce, B.A.

    1985-04-01

    Studies of amphibian acid tolerance provide information about the potential effects of acid deposition on amphibian communities. Amphibians as a group appear to be relatively acid tolerant, with many species suffering increased mortality only below pH 4. However, amphibians exhibit much intraspecific variation in acid tolerance, and some species are sensitive to even low levels of acidity. Furthermore, nonlethal effects, including depression of growth rates and increases in developmental abnormalities, can occur at higher pH.

  1. Sulfur tolerant anode materials

    SciTech Connect

    Not Available

    1988-05-01

    The goal of this program is the development of a molten carbonate fuel cell (MCFC) anode which is more tolerant of sulfur contaminants in the fuel than the current state-of-the-art nickel-based anode structures. This program addresses two different but related aspects of the sulfur contamination problem. The primary aspect is concerned with the development of a sulfur tolerant electrocatalyst for the fuel oxidation reaction. A secondary issue is the development of a sulfur tolerant water-gas-shift reaction catalyst and an investigation of potential steam reforming catalysts which also have some sulfur tolerant capabilities. These two aspects are being addressed as two separate tasks.

  2. Sulfur tolerant anode materials

    SciTech Connect

    Not Available

    1988-02-01

    The goal of this program is the development of a molten carbonate fuel cell (MCFC) anode which is more tolerant of sulfur contaminants in the fuel than the current state-of-the-art nickel-based anode structures. This program addresses two different but related aspects of the sulfur contamination problem. The primary aspect is concerned with the development of a sulfur tolerant electrocatalyst for the fuel oxidation reaction. A secondary issue is the development of a sulfur tolerant water-gas-shift reaction catalyst and an investigation of potential steam reforming catalysts which also have some sulfur tolerant capabilities. These two aspects are being addressed as two separate tasks.

  3. Sulfur tolerant anode materials

    SciTech Connect

    Not Available

    1987-02-01

    The goal of this program is the development of a molten carbonate fuel cell (MCFC) anode which is more tolerant of sulfur contaminants in the fuel than the current state-of-the-art nickel-based anode structures. This program addresses two different but related aspects of the sulfur contamination problem. The primary aspect is concerned with the development of a sulfur tolerant electrocatalyst for the fuel oxidation reaction. A secondary issue is the development of a sulfur tolerant water-gas-shift reaction catalyst and an investigation of potential steam reforming catalysts which also have some sulfur tolerant capabilities. These two aspects are being addressed as two separate tasks.

  4. Estimating ice encasement tolerance of herbage plants.

    PubMed

    Gudleifsson, Bjarni E; Bjarnadottir, Brynhildur

    2014-01-01

    One of the key stresses acting on herbage plants during winter is ice encasement, when plants are enclosed in compact ice and shift from aerobic to anaerobic respiration. The cause of cell death is related to the accumulation of metabolites to toxic levels during winter, and perhaps also to production of reactive oxygen species (ROS) when plants escape from long-lasting ice cover. The process of ice encasement damage has been studied by sampling studies, indirect measurements of ice tolerance, field tests, and provocation methods that artificially increase ice stress in the field. Here we describe a laboratory method to measure ice encasement tolerance. This is the most common and effective way to measure ice encasement tolerance of large amounts of plant material. Plants are raised from seeds (or taken from the field), cold acclimated, usually at +2 °C under short day conditions, in a greenhouse or growth chamber (or in the field during fall). Plants are submerged in cold water in beakers and frozen encased in ice, usually at -2 °C. Plants are kept enclosed in ice at this temperature. Samples are taken at intervals, depending on species and tolerance of plant material, and gently transferred to regrowth conditions. Damage is then evaluated after a suitable time of regeneration.

  5. Multivariate postprocessing techniques for probabilistic hydrological forecasting

    NASA Astrophysics Data System (ADS)

    Hemri, Stephan; Lisniak, Dmytro; Klein, Bastian

    2016-04-01

    Hydrologic ensemble forecasts driven by atmospheric ensemble prediction systems need statistical postprocessing in order to account for systematic errors in terms of both mean and spread. Runoff is an inherently multivariate process with typical events lasting from hours in case of floods to weeks or even months in case of droughts. This calls for multivariate postprocessing techniques that yield well-calibrated forecasts in univariate terms and ensure a realistic temporal dependence structure at the same time. To this end, the univariate ensemble model output statistics (EMOS; Gneiting et al., 2005) postprocessing method is combined with two different copula approaches that ensure multivariate calibration throughout the entire forecast horizon. These approaches comprise ensemble copula coupling (ECC; Schefzik et al., 2013), which preserves the dependence structure of the raw ensemble, and a Gaussian copula approach (GCA; Pinson and Girard, 2012), which estimates the temporal correlations from training observations. Both methods are tested in a case study covering three subcatchments of the river Rhine that represent different sizes and hydrological regimes: the Upper Rhine up to the gauge Maxau, the river Moselle up to the gauge Trier, and the river Lahn up to the gauge Kalkofen. The results indicate that both ECC and GCA are suitable for modelling the temporal dependences of probabilistic hydrologic forecasts (Hemri et al., 2015). References: Gneiting, T., A. E. Raftery, A. H. Westveld, and T. Goldman (2005), Calibrated probabilistic forecasting using ensemble model output statistics and minimum CRPS estimation, Monthly Weather Review, 133(5), 1098-1118, DOI: 10.1175/MWR2904.1. Hemri, S., D. Lisniak, and B. Klein (2015), Multivariate postprocessing techniques for probabilistic hydrological forecasting, Water Resources Research, 51(9), 7436-7451, DOI: 10.1002/2014WR016473. Pinson, P., and R. Girard (2012), Evaluating the quality of scenarios of short-term wind power
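
The ECC reordering step is simple to state in code: at each lead time, the calibrated EMOS quantiles are rearranged into the rank order of the raw ensemble, so the postprocessed members inherit the raw ensemble's temporal dependence. This is a generic sketch of ECC, not the authors' implementation.

```python
import numpy as np

def ecc_reorder(raw_ensemble, calibrated_samples):
    """Ensemble copula coupling: at each lead time, rearrange the calibrated
    samples (e.g. EMOS quantiles) into the rank order of the raw ensemble.
    Both arrays have shape (n_members, n_leadtimes)."""
    out = np.empty_like(calibrated_samples)
    for t in range(raw_ensemble.shape[1]):
        # rank of each raw member at lead time t (double-argsort trick)
        ranks = np.argsort(np.argsort(raw_ensemble[:, t]))
        out[:, t] = np.sort(calibrated_samples[:, t])[ranks]
    return out

# Toy example: 3 members, 2 lead times
raw = np.array([[3.0, 1.0], [1.0, 2.0], [2.0, 3.0]])        # raw ensemble
cal = np.array([[10.0, 40.0], [20.0, 50.0], [30.0, 60.0]])  # calibrated quantiles
reordered = ecc_reorder(raw, cal)
```

Each column of the output keeps the calibrated values (univariate calibration is preserved) while member trajectories across lead times follow the raw ensemble's structure.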

  6. Probabilistic Flash Flood Forecasting using Stormscale Ensembles

    NASA Astrophysics Data System (ADS)

    Hardy, J.; Gourley, J. J.; Kain, J. S.; Clark, A.; Novak, D.; Hong, Y.

    2013-12-01

    Flash flooding is one of the most costly and deadly natural hazards in the US and across the globe. The loss of life and property from flash floods could be mitigated with better guidance from hydrological models, but these models have limitations. For example, they are commonly initialized using rainfall estimates derived from weather radars, but the time interval between observations of heavy rainfall and a flash flood can be on the order of minutes, particularly for small basins in urban settings. Increasing the lead time for these events is critical for protecting life and property. Therefore, this study advances the use of quantitative precipitation forecasts (QPFs) from a stormscale NWP ensemble system into a distributed hydrological model setting to yield basin-specific, probabilistic flash flood forecasts (PFFFs). Rainfall error characteristics of the individual members are first diagnosed and quantified in terms of structure, amplitude, and location (SAL; Wernli et al., 2008). Amplitude and structure errors are readily correctable due to their diurnal nature, and the fine scales represented by the CAPS QPF members are consistent with radar-observed rainfall, mainly showing larger errors with afternoon convection. To account for the spatial uncertainty of the QPFs, we use an elliptic smoother, as in Marsh et al. (2012), to produce probabilistic QPFs (PQPFs). The elliptic smoother takes into consideration underdispersion, which is notoriously associated with stormscale ensembles, and thus is good for targeting the approximate regions that may receive heavy rainfall. However, stormscale details contained in individual members are still needed to yield reasonable flash flood simulations. Therefore, on a case study basis, QPFs from individual members are then run through the hydrological model with their predicted structure and corrected amplitudes, but the locations of individual rainfall elements are perturbed within the PQPF elliptical regions using Monte

  7. Augmenting Probabilistic Risk Assessment with Malevolent Initiators

    SciTech Connect

    Curtis Smith; David Schwieder

    2011-11-01

    As commonly practiced, the use of probabilistic risk assessment (PRA) in nuclear power plants only considers accident initiators such as natural hazards, equipment failures, and human error. Malevolent initiators are ignored in PRA, but are considered the domain of physical security, which uses vulnerability assessment based on an officially specified threat (design basis threat). This paper explores the implications of augmenting and extending existing PRA models by considering new and modified scenarios resulting from malevolent initiators. Teaming the augmented PRA models with conventional vulnerability assessments can cost-effectively enhance security of a nuclear power plant. This methodology is useful for operating plants, as well as in the design of new plants. For the methodology, we have proposed an approach that builds on and extends the practice of PRA for nuclear power plants for security-related issues. Rather than only considering 'random' failures, we demonstrated a framework that is able to represent and model malevolent initiating events and associated plant impacts.

  8. Probabilistic objective functions for sensor management

    NASA Astrophysics Data System (ADS)

    Mahler, Ronald P. S.; Zajic, Tim R.

    2004-08-01

This paper continues the investigation of a foundational and yet potentially practical basis for control-theoretic sensor management, using a comprehensive, intuitive, system-level Bayesian paradigm based on finite-set statistics (FISST). In this paper we report our most recent progress, focusing on multistep look-ahead -- i.e., allocation of sensor resources throughout an entire future time-window. We determine future sensor states in the time-window using a "probabilistically natural" sensor management objective function, the posterior expected number of targets (PENT). This objective function is constructed using a new "maxi-PIMS" optimization strategy that hedges against unknowable future observation-collections. PENT is used in conjunction with approximate multitarget filters: the probability hypothesis density (PHD) filter or the multi-hypothesis correlator (MHC) filter.

  9. A probabilistic analysis of silicon cost

    NASA Technical Reports Server (NTRS)

    Reiter, L. J.

    1983-01-01

Silicon material costs represent both a cost driver and an area where improvement can be made in the manufacture of photovoltaic modules. The cost of three processes for the production of low-cost silicon, being developed under the U.S. Department of Energy's (DOE) National Photovoltaic Program, is analyzed. The approach is based on probabilistic inputs and makes use of two models developed at the Jet Propulsion Laboratory: SIMRAND (SIMulation of Research ANd Development) and IPEG (Improved Price Estimating Guidelines). The approach, assumptions, and limitations are detailed, along with a verification of the cost analysis methodology. Results, presented in the form of cumulative probability distributions for silicon cost, indicate that there is a 55% chance of reaching the DOE target of $16/kg for silicon material. This is a technically achievable cost based on expert forecasts of the results of ongoing research and development, and does not imply any market price for a given year.
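A SIMRAND-style analysis can be imitated in a few lines: sample uncertain cost drivers, collect the resulting cost distribution, and read off the probability of meeting the target. The triangular distributions and dollar figures below are placeholders, not the actual JPL inputs.

```python
import random

rng = random.Random(42)

def sampled_cost():
    """One Monte Carlo draw of silicon cost ($/kg) from illustrative
    triangular distributions over cost drivers (placeholder numbers,
    not the SIMRAND expert-forecast inputs)."""
    feedstock = rng.triangular(6.0, 14.0, 9.0)   # (low, high, mode)
    energy = rng.triangular(2.0, 8.0, 4.0)
    overhead = rng.triangular(1.0, 5.0, 2.5)
    return feedstock + energy + overhead

costs = sorted(sampled_cost() for _ in range(10000))
target = 16.0
# Empirical CDF evaluated at the target: P(cost <= $16/kg).
p_meet_target = sum(c <= target for c in costs) / len(costs)
```

Plotting `costs` against rank/N gives exactly the kind of cumulative probability distribution the report presents.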

  10. Retinal blood vessels extraction using probabilistic modelling.

    PubMed

    Kaba, Djibril; Wang, Chuang; Li, Yongmin; Salazar-Gonzalez, Ana; Liu, Xiaohui; Serag, Ahmed

    2014-01-01

The analysis of retinal blood vessels plays an important role in detecting and treating retinal diseases. In this paper, we present an automated method to segment the blood vessels in fundus retinal images. The proposed method could be used to support a non-intrusive diagnosis in modern ophthalmology for the early detection of retinal diseases, treatment evaluation, or clinical study. The method combines bias correction and adaptive histogram equalisation to enhance the appearance of the blood vessels. The blood vessels are then extracted using probabilistic modelling optimised by the expectation maximisation algorithm. The method is evaluated on fundus retinal images from the STARE and DRIVE datasets, and the experimental results are compared with some recently published methods of retinal blood vessel segmentation. The results show that our method achieves the best overall performance and is comparable to the performance of human experts.
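The vessel/background separation step can be illustrated with a toy one-dimensional two-component Gaussian mixture fitted by expectation maximisation on pixel intensities. This is a sketch of the general technique only, not the paper's actual model, which operates on enhanced fundus images.

```python
import math
import random

def em_two_gaussians(data, iters=50):
    """Fit a two-component 1-D Gaussian mixture by EM; a toy stand-in
    for a vessel/background intensity model."""
    mu = [min(data), max(data)]     # initialise means at the extremes
    var = [1.0, 1.0]
    w = [0.5, 0.5]
    for _ in range(iters):
        # E-step: posterior responsibility of each component per point.
        resp = []
        for x in data:
            p = [w[k] / math.sqrt(2 * math.pi * var[k])
                 * math.exp(-(x - mu[k]) ** 2 / (2 * var[k]))
                 for k in range(2)]
            s = p[0] + p[1]
            resp.append([p[0] / s, p[1] / s])
        # M-step: re-estimate weights, means, and variances.
        for k in range(2):
            nk = sum(r[k] for r in resp)
            w[k] = nk / len(data)
            mu[k] = sum(r[k] * x for r, x in zip(resp, data)) / nk
            var[k] = max(sum(r[k] * (x - mu[k]) ** 2
                             for r, x in zip(resp, data)) / nk, 1e-6)
    return mu, var, w

# Synthetic "background" (dark) and "vessel" (bright) intensities.
rng = random.Random(0)
data = ([rng.gauss(0.2, 0.05) for _ in range(200)]
        + [rng.gauss(0.8, 0.05) for _ in range(200)])
mu, var, w = em_two_gaussians(data)
```

Classifying each pixel by its larger responsibility then yields the binary vessel map.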

  11. Social inequalities in probabilistic labor markets

    NASA Astrophysics Data System (ADS)

    Inoue, Jun-Ichi; Chen, He

    2015-03-01

We discuss social inequalities in labor markets for university graduates in Japan by using the Gini and k-indices. Feature vectors which specify the abilities of candidates (students) are built into the probabilistic labor market model. We systematically examine what kinds of selection processes (strategies) by companies, according to the weighted feature vector of each candidate, could induce what types of inequality in the number of informal acceptances, leading to a large mismatch between students and companies. This work was financially supported by Grant-in-Aid for Scientific Research (C) of the Japan Society for the Promotion of Science (JSPS) No. 2533027803 and Grant-in-Aid for Scientific Research on Innovative Area No. 2512001313.
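The Gini index used in the study has a compact mean-absolute-difference form; a minimal sketch applied to hypothetical counts of informal acceptances:

```python
def gini(xs):
    """Gini coefficient via the mean absolute difference:
    0 = perfect equality, (n-1)/n = all mass on one unit."""
    n = len(xs)
    mean = sum(xs) / n
    mad = sum(abs(a - b) for a in xs for b in xs) / (n * n)
    return mad / (2.0 * mean)

# Hypothetical informal-acceptance counts across four students.
equal = gini([3, 3, 3, 3])      # everyone equally matched -> 0.0
unequal = gini([0, 0, 0, 12])   # all offers to one student -> 0.75
```

Applied to the model's acceptance counts, a rising Gini value signals exactly the concentration of offers, and hence the student-company mismatch, that the paper investigates.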

  12. Performing Probabilistic Risk Assessment Through RAVEN

    SciTech Connect

    A. Alfonsi; C. Rabiti; D. Mandelli; J. Cogliati; R. Kinoshita

    2013-06-01

The Reactor Analysis and Virtual control ENviroment (RAVEN) code is a software tool that acts as the control logic driver and post-processing engine for the newly developed Thermal-Hydraulic code RELAP-7. RAVEN is now a multi-purpose Probabilistic Risk Assessment (PRA) software framework that provides several functionalities: (1) deriving and actuating the control logic required to simulate the plant control system and operator actions (guided procedures), allowing on-line monitoring/controlling in the phase space; (2) performing both Monte Carlo sampling of randomly distributed events and Dynamic Event Tree based analysis; and (3) facilitating input/output handling through a Graphical User Interface (GUI) and a post-processing data mining module.

  13. Self-insight in probabilistic category learning.

    PubMed

    Kemény, Ferenc; Lukács, Ágnes

    2013-01-01

The Weather Prediction (WP) task is one of the most extensively used Probabilistic Category Learning tasks. Although it has usually been treated as an implicit task, its implicit nature has been questioned, with a focus on structural knowledge of the acquired information. The goal of the current studies is to test whether participants acquire explicit knowledge on the WP task. Experiment 1 addresses this question directly with the help of a subjective measure of self-insight in two groups: an experimental group facing the WP task and a control group with a task lacking predictive structure. Participants in the experimental group produced more explicit reports than the control group, and their performance was higher only on trials with explicit knowledge. Experiment 2 provided further evidence against the implicitness of the task by showing that decreasing stimulus presentation times extends the learning process but does not result in more implicit processing.
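The probabilistic structure of a WP-style task can be made concrete with a small simulation: each cue pattern predicts the outcome only probabilistically, so even an optimal responder tops out well below 100% accuracy. The cue patterns and probabilities below are illustrative, not the original stimulus set.

```python
import random

# Toy WP task: four one-hot cue patterns, each predicting "rain" with
# a fixed probability (illustrative values).
rng = random.Random(1)
pattern_rain_prob = {(1, 0, 0, 0): 0.8, (0, 1, 0, 0): 0.6,
                     (0, 0, 1, 0): 0.4, (0, 0, 0, 1): 0.2}
patterns = list(pattern_rain_prob)

hits = 0
n = 5000
for _ in range(n):
    pattern = patterns[rng.randrange(len(patterns))]
    rains = rng.random() < pattern_rain_prob[pattern]
    prediction = pattern_rain_prob[pattern] >= 0.5   # optimal responder
    hits += prediction == rains
accuracy = hits / n   # caps near 0.7: the task is irreducibly noisy
```

Because feedback is noisy on every trial, above-chance but imperfect accuracy is compatible with either implicit or explicit knowledge, which is why subjective measures like those in Experiment 1 are needed.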

  14. Probabilistic risk assessment of disassembly procedures

    SciTech Connect

O'Brien, D.A.; Bement, T.R.; Letellier, B.C.

    1993-11-01

The purpose of this report is to describe the use of Probabilistic Risk (Safety) Assessment (PRA or PSA) at a Department of Energy (DOE) facility. PRA is a methodology for (i) identifying combinations of events that, if they occur, lead to accidents, (ii) estimating the frequency of occurrence of each combination of events, and (iii) estimating the consequences of each accident. Specifically, the study focused on evaluating the risks associated with disassembling a hazardous assembly. The PRA for the operation included a detailed evaluation only for those potential accident sequences which could lead to significant off-site consequences and affect public health. The overall purpose of this study was to investigate the feasibility of establishing a risk-consequence goal for DOE operations.

  15. A Probabilistic Tsunami Hazard Assessment for Indonesia

    NASA Astrophysics Data System (ADS)

    Horspool, N.; Pranantyo, I.; Griffin, J.; Latief, H.; Natawidjaja, D.; Kongko, W.; Cipta, A.; Koetapangwa, B.; Anugrah, S.; Thio, H. K.

    2012-12-01

We present the first national probabilistic tsunami hazard assessment (PTHA) for Indonesia. This assessment considers tsunamis generated by near-field earthquake sources around Indonesia, as well as regional and far-field sources, to define the tsunami hazard at the coastline. The PTHA methodology is based on the established stochastic event-based approach to probabilistic seismic hazard assessment (PSHA) and has been adapted for tsunami. The earthquake source information is primarily based on the recent Indonesian National Seismic Hazard Map and included a consensus workshop with Indonesia's leading tsunami and earthquake scientists to finalize the seismic source models and logic trees to include epistemic uncertainty. Results are presented in the form of tsunami hazard maps showing the expected tsunami height at the coast for a given return period, and also as tsunami probability maps showing the probability of exceeding tsunami heights of 0.5 m and 3.0 m at the coast. These heights define the thresholds for different warning levels in the Indonesian Tsunami Early Warning System (Ina-TEWS). The results show that for short return periods (100 years) the highest tsunami hazard is along the west coast of Sumatra and the islands of Nias and Mentawai. For longer return periods (>500 years), the tsunami hazard in Eastern Indonesia (north Papua, north Sulawesi) is nearly as high as that along the Sunda Arc. A sensitivity analysis of input parameters is conducted by sampling branches of the logic tree using a Monte Carlo approach to constrain the relative importance of each input parameter. The results from this assessment can be used to underpin evidence-based decision making by disaster managers to prioritize tsunami mitigation, such as developing detailed inundation simulations for evacuation planning.
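Return periods translate into exposure-window probabilities under the Poisson occurrence assumption that underlies most probabilistic hazard work; a minimal sketch:

```python
import math

def exceedance_prob(return_period_years, exposure_years):
    """Probability of at least one exceedance within an exposure
    window, assuming Poissonian event occurrence."""
    rate = 1.0 / return_period_years
    return 1.0 - math.exp(-rate * exposure_years)

# A "500-year" tsunami height still has about a 9.5% chance of being
# exceeded at least once over a 50-year planning horizon.
p50 = exceedance_prob(500, 50)
```

This conversion is what lets hazard maps indexed by return period be read as probabilities relevant to the design life of coastal infrastructure.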

  16. Efficient Probabilistic Diagnostics for Electrical Power Systems

    NASA Technical Reports Server (NTRS)

    Mengshoel, Ole J.; Chavira, Mark; Cascio, Keith; Poll, Scott; Darwiche, Adnan; Uckun, Serdar

    2008-01-01

We consider in this work the probabilistic approach to model-based diagnosis when applied to electrical power systems (EPSs). Our probabilistic approach is formally well-founded, as it is based on Bayesian networks and arithmetic circuits. We investigate the diagnostic task known as fault isolation, and pay special attention to meeting two of the main challenges, model development and real-time reasoning, often associated with real-world application of model-based diagnosis technologies. To address the challenge of model development, we develop a systematic approach to representing electrical power systems as Bayesian networks, supported by an easy-to-use specification language. To address the real-time reasoning challenge, we compile Bayesian networks into arithmetic circuits. Arithmetic circuit evaluation supports real-time diagnosis by being predictable and fast. In essence, we introduce a high-level EPS specification language from which Bayesian networks that can diagnose multiple simultaneous failures are auto-generated, and we illustrate the feasibility of using arithmetic circuits, compiled from Bayesian networks, for real-time diagnosis on real-world EPSs of interest to NASA. The experimental system is a real-world EPS, namely the Advanced Diagnostic and Prognostic Testbed (ADAPT) located at the NASA Ames Research Center. In experiments with the ADAPT Bayesian network, which currently contains 503 discrete nodes and 579 edges, we find high diagnostic accuracy in scenarios where one to three faults, both in components and sensors, were inserted. The time taken to compute the most probable explanation using arithmetic circuits has a small mean of 0.2625 milliseconds and standard deviation of 0.2028 milliseconds. In experiments with data from ADAPT we also show that arithmetic circuit evaluation substantially outperforms joint tree propagation and variable elimination, two alternative algorithms for diagnosis using Bayesian network inference.
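At its core, probabilistic fault isolation is Bayesian updating; a two-variable sketch (with invented numbers, far simpler than the 503-node ADAPT network) shows how a rare fault can still dominate once evidence arrives:

```python
# Bayes-rule fault isolation: posterior probability of a component
# fault given a low sensor reading. All numbers are illustrative.
p_fault = 0.01              # prior probability the component is faulty
p_low_given_fault = 0.95    # sensor reads low when the fault is present
p_low_given_ok = 0.02       # false-alarm rate of the sensor

# Marginal probability of the evidence, then the posterior.
p_low = p_low_given_fault * p_fault + p_low_given_ok * (1 - p_fault)
p_fault_given_low = p_low_given_fault * p_fault / p_low
```

A compiled arithmetic circuit performs exactly this kind of sum-product arithmetic, but over hundreds of variables and with a fixed, predictable evaluation cost, which is what makes it attractive for real-time use.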

  17. Probabilistic Analysis of Ground-Holding Strategies

    NASA Technical Reports Server (NTRS)

    Sheel, Minakshi

    1997-01-01

The Ground-Holding Policy Problem (GHPP) has become a matter of great interest in recent years because of the high cost incurred by aircraft suffering delays. Ground-holding keeps a flight on the ground at the departure airport if it is known that it will be unable to land at the arrival airport. The GHPP consists of determining how many flights should be held on the ground before take-off, and for how long, in order to minimize the cost of delays. When the uncertainty associated with airport landing capacity is considered, the GHPP becomes complicated. A decision support system that incorporates this uncertainty, solves the GHPP quickly, and gives good results would be of great help to air traffic management. The purpose of this thesis is to modify and analyze a probabilistic ground-holding algorithm by applying it to two common cases of capacity reduction. A graphical user interface was developed and a sensitivity analysis was performed on the algorithm in order to see how it might be implemented in practice. The sensitivity analysis showed that the algorithm was very sensitive to the number of probabilistic capacity scenarios used and to the cost ratio of air delay to ground delay. The algorithm was not particularly sensitive to the number of periods into which the time horizon was divided. In terms of cost savings, a ground-holding policy was most beneficial when demand greatly exceeded airport capacity. When compared to other air traffic flow strategies, the ground-holding algorithm performed best and was the most consistent under various situations. The algorithm can solve large problems quickly and efficiently on a personal computer.
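The trade-off the algorithm optimizes can be shown with a single flight and two capacity scenarios. The cost ratio and delay numbers below are illustrative only, with air delay costed at twice ground delay, a typical GHPP assumption.

```python
# Cost per minute of delay (illustrative; the air/ground ratio is the
# sensitive parameter identified in the thesis).
GROUND_COST = 1.0
AIR_COST = 2.0

# Capacity scenarios: (probability, airborne delay in minutes that the
# flight incurs if it departs on time under that scenario).
scenarios = [(0.3, 0.0), (0.7, 30.0)]

def expected_cost(ground_hold_minutes):
    """Expected total delay cost: certain ground hold plus the
    probability-weighted residual airborne delay."""
    cost = GROUND_COST * ground_hold_minutes
    for prob, air_delay in scenarios:
        cost += prob * AIR_COST * max(air_delay - ground_hold_minutes, 0.0)
    return cost

# Search candidate holds in 5-minute steps for the cheapest policy.
best = min(range(0, 31, 5), key=expected_cost)
```

With a 70% chance of a 30-minute airborne delay, a full 30-minute ground hold is cheapest here; flip the scenario probabilities and holding becomes uneconomical, which is exactly the sensitivity to capacity scenarios and cost ratio reported above.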

  18. Entanglement and thermodynamics in general probabilistic theories

    NASA Astrophysics Data System (ADS)

    Chiribella, Giulio; Scandolo, Carlo Maria

    2015-10-01

    Entanglement is one of the most striking features of quantum mechanics, and yet it is not specifically quantum. More specific to quantum mechanics is the connection between entanglement and thermodynamics, which leads to an identification between entropies and measures of pure state entanglement. Here we search for the roots of this connection, investigating the relation between entanglement and thermodynamics in the framework of general probabilistic theories. We first address the question whether an entangled state can be transformed into another by means of local operations and classical communication. Under two operational requirements, we prove a general version of the Lo-Popescu theorem, which lies at the foundations of the theory of pure-state entanglement. We then consider a resource theory of purity where free operations are random reversible transformations, modelling the scenario where an agent has limited control over the dynamics of a closed system. Our key result is a duality between the resource theory of entanglement and the resource theory of purity, valid for every physical theory where all processes arise from pure states and reversible interactions at the fundamental level. As an application of the main result, we establish a one-to-one correspondence between entropies and measures of pure bipartite entanglement. The correspondence is then used to define entanglement measures in the general probabilistic framework. Finally, we show a duality between the task of information erasure and the task of entanglement generation, whereby the existence of entropy sinks (systems that can absorb arbitrary amounts of information) becomes equivalent to the existence of entanglement sources (correlated systems from which arbitrary amounts of entanglement can be extracted).

  19. Probabilistic properties of the Curve Number

    NASA Astrophysics Data System (ADS)

    Rutkowska, Agnieszka; Banasik, Kazimierz; Kohnova, Silvia; Karabova, Beata

    2013-04-01

The determination of the Curve Number (CN) is fundamental for the hydrological rainfall-runoff SCS-CN method, which assesses the runoff volume in small catchments. The CN depends on the geomorphologic and physiographic properties of the catchment, and traditionally it is assumed to be constant for each catchment. Many practitioners and researchers observe, however, that the parameter is characterized by variability, which sometimes causes inconsistency in river discharge predictions using the SCS-CN model. Hence probabilistic and statistical methods are advisable for investigating the CN as a random variable, and for complementing and improving the deterministic model. The results presented here determine the probabilistic properties of the CNs for various Slovakian and Polish catchments using statistical methods. The detailed study concerns the description of empirical distributions (characteristics, QQ-plots and coefficients of goodness of fit, histograms), testing of statistical hypotheses about some theoretical distributions (Kolmogorov-Smirnov, Anderson-Darling, Cramer-von Mises, χ2, Shapiro-Wilk), construction of confidence intervals, and comparisons among catchments. The relationship between confidence intervals and the ARC soil classification will also be examined. The comparison between the border values of the confidence intervals and the ARC I and ARC III conditions is crucial for further modeling. The response of the catchment to storm rainfall depth when the CN varies is also of special interest. ACKNOWLEDGMENTS: The investigation described in this contribution was initiated by the first Author's research visit to the Technical University of Bratislava in 2012 within a STSM of the COST Action ES0901. Data used here have been provided by research project no. N N305 396238 funded by the PL-Ministry of Science and Higher Education. The support provided by these organizations is gratefully acknowledged.
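The deterministic backbone that the CN randomness feeds into is the SCS-CN runoff equation; a short sketch shows how CN variability (here, two hypothetical ends of a confidence interval) propagates into runoff depth:

```python
def scs_runoff(p_mm, cn):
    """SCS-CN direct runoff depth (mm) for storm rainfall p_mm, using
    the standard initial abstraction Ia = 0.2 S and metric units:
    S = 25400 / CN - 254."""
    s = 25400.0 / cn - 254.0          # potential maximum retention (mm)
    ia = 0.2 * s                      # initial abstraction (mm)
    if p_mm <= ia:
        return 0.0
    return (p_mm - ia) ** 2 / (p_mm - ia + s)

# Same 80 mm storm, CN drawn from two hypothetical confidence-interval
# bounds: the runoff depth roughly triples across the interval.
q_low = scs_runoff(80.0, 65.0)    # ~14.6 mm
q_high = scs_runoff(80.0, 85.0)   # ~43.6 mm
```

The strong nonlinearity of Q in CN is why treating CN as a random variable, rather than a constant, changes the predicted discharge distribution so noticeably.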

  20. Probabilistic Evaluation of Advanced Ceramic Matrix Composite Structures

    NASA Technical Reports Server (NTRS)

    Abumeri, Galib H.; Chamis, Christos C.

    2003-01-01

The objective of this report is to summarize the deterministic and probabilistic structural evaluation results for two structures made with advanced ceramic matrix composites (CMC): an internally pressurized tube and a uniformly loaded flange. The deterministic structural evaluation includes stress, displacement, and buckling analyses. It is carried out using the finite element code MHOST, developed for the 3-D inelastic analysis of structures made with advanced materials. The probabilistic evaluation is performed using IPACS, the integrated probabilistic assessment of composite structures computer code. The effects of uncertainties in primitive variables related to the material, fabrication process, and loadings on the material properties and structural response behavior are quantified. The primitive variables considered are: thermo-mechanical properties of fiber and matrix, fiber and void volume ratios, use temperature, and pressure. The probabilistic structural analysis and probabilistic strength results are used by IPACS to perform reliability and risk evaluation of the two structures. The results show that the sensitivity information obtained for the two composite structures from the computational simulation can be used to alter the design process to meet desired service requirements. In addition to the detailed probabilistic analysis of the two structures, the following were performed specifically for the CMC tube: (1) prediction of the failure load and the buckling load, (2) coupled non-deterministic multi-disciplinary structural analysis, and (3) demonstration that probabilistic sensitivities can be used to select a reduced set of design variables for optimization.
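The reliability step can be caricatured as stress-strength interference estimated by Monte Carlo: failure occurs when the randomly drawn applied stress exceeds the randomly drawn strength. The normal distributions and MPa values below are invented for illustration and are not the IPACS property inputs.

```python
import random

rng = random.Random(7)

def failure_probability(n=20000):
    """Crude Monte Carlo reliability estimate: count draws where the
    random applied stress exceeds the random material strength
    (illustrative normal distributions, placeholder parameters)."""
    failures = 0
    for _ in range(n):
        strength = rng.gauss(300.0, 20.0)   # MPa
        stress = rng.gauss(220.0, 25.0)     # MPa
        failures += stress > strength
    return failures / n

pf = failure_probability()
```

Perturbing one input distribution at a time and re-running gives the kind of probabilistic sensitivities that the report uses to shortlist design variables for optimization.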